This repository was archived by the owner on Jan 9, 2020. It is now read-only.

Conversation

coderanger
This allows setting things like HADOOP_CONF_DIR in the more traditional Spark way.

What changes were proposed in this pull request?

Adds `source "${SPARK_HOME}/bin/load-spark-env.sh"` to the command in each non-spark-class container.
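The effect of sourcing `load-spark-env.sh` before the container command can be sketched as follows. This is a minimal illustration, not the PR's actual change: the throwaway `SPARK_HOME` and the stub contents of `load-spark-env.sh` are invented for the demo; a real Spark image already ships `${SPARK_HOME}/bin/load-spark-env.sh`, which loads `conf/spark-env.sh`.

```shell
#!/usr/bin/env bash
set -eu

# Stand-in SPARK_HOME with a stub load-spark-env.sh, purely for
# illustration of the sourcing pattern.
SPARK_HOME="$(mktemp -d)"
mkdir -p "${SPARK_HOME}/bin"
cat > "${SPARK_HOME}/bin/load-spark-env.sh" <<'EOF'
# In a real image this script loads conf/spark-env.sh, where users
# traditionally export variables such as HADOOP_CONF_DIR.
export HADOOP_CONF_DIR="/etc/hadoop/conf"
EOF

# The pattern prepended to the container command: source the env
# script so its exports are visible to the process that follows.
. "${SPARK_HOME}/bin/load-spark-env.sh"

echo "HADOOP_CONF_DIR=${HADOOP_CONF_DIR}"
```

Because the script is sourced (not executed in a subshell), its `export`ed variables remain set in the shell that then launches the actual Spark process.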

How was this patch tested?

Manual testing with my local development environment.

@foxish (Member) commented Jan 24, 2018

@coderanger, it would be great if you could help rebase this entire fork on top of the Spark upstream effort; then we'd be in a better position to use this PR, since the Dockerfiles etc. are now very different.

2 participants