Replies: 6 comments 10 replies
-
How much memory do you have? Memory usage correlates with the following formula
-
I have ~125GiB of RAM with ~12GiB free.
-
Please remove it from the general section; currently max_file_size is calculated from system.parts.
That means 100 parallel tables and 100 parallel parts inside each table = 10,000 streams, each of which will allocate a buffer, which for your use case is enough.
This could also affect CPU usage (clickhouse-server will struggle for CPU while clickhouse-backup runs).
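The stream arithmetic above can be sketched as a quick back-of-the-envelope calculation. This is a hypothetical model (one stream and one buffer per concurrently processed part, with an illustrative 16 MiB buffer size), not clickhouse-backup's exact accounting:

```python
# Hypothetical model: one upload stream (and one buffer) per
# concurrently processed part across all concurrently processed tables.

def concurrent_streams(parallel_tables: int, parallel_parts_per_table: int) -> int:
    """Upper bound on simultaneously open upload streams."""
    return parallel_tables * parallel_parts_per_table

def total_buffer_bytes(streams: int, buffer_size: int) -> int:
    """Memory held by buffers if every stream allocates one."""
    return streams * buffer_size

streams = concurrent_streams(100, 100)
print(streams)                                  # 10000
print(total_buffer_bytes(streams, 16 * 2**20))  # total bytes for 10000 x 16 MiB buffers
```

With numbers like these, even modest per-stream buffers multiply into far more memory than the ~12 GiB you have free, which is why reducing concurrency matters.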
-
Did you apply the recommendations from #571 (comment)?
-
Which datacenter is your source located in? Are you sure you have enough bandwidth from the Backblaze side to your datacenter? OK, we need more context. Could you share the result of

```sql
SELECT database, table, count(), quantiles(0.5, 0.99)(bytes_on_disk)
FROM system.parts
WHERE database != 'system'
GROUP BY database, table
```

and your current config parameters?
-
Try the following settings:

```yaml
general:
  upload_concurrency: 5
s3:
  concurrency: 5
  max_parts_count: 5000
```
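As a rough guide to how these knobs interact, here is a hedged estimate. It assumes each of `upload_concurrency × s3.concurrency` streams holds one multipart-chunk buffer of `max_file_size / max_parts_count` bytes, with S3's 5 MiB minimum part size; this is a simplified model, and the 100 GiB `max_file_size` is an illustrative value, not a recommendation:

```python
# Simplified model of peak upload memory, not clickhouse-backup's
# exact accounting.

GiB = 2**30

def chunk_size(max_file_size: int, max_parts_count: int,
               min_chunk: int = 5 * 2**20) -> int:
    # S3 multipart uploads require parts of at least 5 MiB.
    return max(min_chunk, max_file_size // max_parts_count)

def peak_memory(upload_concurrency: int, s3_concurrency: int,
                max_file_size: int, max_parts_count: int) -> int:
    # One chunk buffer per concurrent stream.
    return (upload_concurrency * s3_concurrency
            * chunk_size(max_file_size, max_parts_count))

# With the suggested settings and a hypothetical 100 GiB max_file_size:
mem = peak_memory(5, 5, 100 * GiB, 5000)
print(round(mem / GiB, 2))  # 0.5
```

Under this model, the suggested settings keep buffer memory to roughly half a GiB, comfortably within ~12 GiB of free RAM.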
-
I'm benchmarking backup uploads to S3, but I can't figure out how the settings affect memory usage.
I try to tune the settings to optimize upload time and memory usage, but most of the time I get an OOM error.
Can you explain how memory usage can be calculated from the settings?