Deploying Spark fails with insufficient max JVM memory #981
Yes, this is a known issue, but we prefer not to increase the default memory significantly; to see the reasoning, please take a look at the earlier discussion. Another option is to limit the number of threads dynamically, i.e., even if ...
Dynamically setting the number of cores is a good idea when the configured memory is insufficient and when the ...
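To make the suggestion above concrete, here is a minimal sketch of capping the worker count based on the heap the JVM was actually given. It is not the project's implementation; the class name and the per-worker heap budget (`PER_WORKER_BYTES`) are illustrative assumptions.

```java
// Sketch: reduce the requested worker count when the configured -Xmx cannot support it.
public final class WorkerCountGuard {

  // Assumed rough heap budget per pipeline worker; purely illustrative.
  private static final long PER_WORKER_BYTES = 2L * 1024 * 1024 * 1024; // ~2 GiB

  private WorkerCountGuard() {}

  /** Returns the requested worker count, reduced if the configured max heap cannot support it. */
  public static int effectiveWorkers(int requestedWorkers) {
    long maxHeap = Runtime.getRuntime().maxMemory(); // reflects -Xmx
    int affordable = (int) Math.max(1L, maxHeap / PER_WORKER_BYTES);
    int effective = Math.min(requestedWorkers, affordable);
    if (effective < requestedWorkers) {
      System.err.printf(
          "Max heap %d MiB only supports %d workers; reducing from %d.%n",
          maxHeap / (1024 * 1024), effective, requestedWorkers);
    }
    return effective;
  }

  public static void main(String[] args) {
    int cores = Runtime.getRuntime().availableProcessors();
    System.out.println("Using " + effectiveWorkers(cores) + " worker threads.");
  }
}
```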
I was testing a simple developer setup for pipelines:
This finished with some errors, which I ignored: https://paste.googleplex.com/5983969268465664

docker-compose -f docker/compose-controller-spark-sql-single.yaml up --force-recreate

This then failed with an error due to insufficient JVM max memory: https://paste.googleplex.com/4827520718864384
Fixed by editing docker/.env:
JAVA_OPTS=-Xms10g -Xmx10g
Should we make this change permanent?
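If the default heap is kept small, another way to make an insufficient -Xmx obvious at startup (rather than failing mid-run) is a simple heap check. The sketch below is an illustration only; the class name and the minimum threshold (`MIN_HEAP_BYTES`) are assumptions, not project code.

```java
// Sketch: fail fast with a clear message when the configured max heap is too small.
public final class HeapCheck {

  // Assumed minimum heap; the real requirement depends on data volume and thread count.
  private static final long MIN_HEAP_BYTES = 8L * 1024 * 1024 * 1024; // ~8 GiB

  private HeapCheck() {}

  public static void verify() {
    long maxHeap = Runtime.getRuntime().maxMemory();
    if (maxHeap < MIN_HEAP_BYTES) {
      throw new IllegalStateException(String.format(
          "Configured max heap is %d MiB, but at least %d MiB is recommended; "
              + "raise JAVA_OPTS (e.g. -Xmx) in docker/.env or reduce parallelism.",
          maxHeap / (1024 * 1024), MIN_HEAP_BYTES / (1024 * 1024)));
    }
  }

  public static void main(String[] args) {
    verify();
    System.out.println("Heap check passed.");
  }
}
```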