Monday, July 21, 2014

Run Spark Shell locally

If you want to run Spark against the local file system, here is a simple way:
HADOOP_CONF_DIR=. MASTER=local spark-shell
If you don't set HADOOP_CONF_DIR, Spark uses /etc/hadoop/conf, which may point to a cluster running in pseudo-distributed mode. When HADOOP_CONF_DIR points to a directory without any Hadoop configuration, the file system defaults to the local one. The same trick works for spark-submit.
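
To check that paths really resolve against the local file system, you can read an ordinary local file once the shell starts (a quick sketch; /tmp/test.txt is just a placeholder for any small text file you have on disk):

scala> val lines = sc.textFile("/tmp/test.txt")  // resolves locally since no HDFS config is loaded
scala> lines.count()  // runs a job and returns the number of lines

If HADOOP_CONF_DIR still pointed at a real cluster configuration, the same path would be resolved against HDFS and the read would fail unless the file existed there.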
