
I am using PySpark from Django, connecting to a Spark master node with a SparkSession to execute jobs on the cluster.
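For context, the connection looks roughly like this (the master URL and app name here are placeholders for my real values):

    from pyspark.sql import SparkSession

    # Connect to the existing cluster's master node; the URL is a placeholder.
    spark = (
        SparkSession.builder
        .master("spark://spark-master:7077")
        .appName("django-job")
        .getOrCreate()
    )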

My question is: do I need a full install of Spark on my local machine? All the documentation has me install Spark and then add the PySpark libraries to the Python path. I don't believe I need all ~500 MB of that just to connect to an existing cluster, and I'm trying to lighten my Docker containers.

Thanks for the help.

1 Answer


Although I have not tested it, as of Spark 2.1 PySpark is available from PyPI (for installation via pip) precisely for cases such as yours. From the docs:

The Python packaging for Spark is not intended to replace all of the other use cases. This Python packaged version of Spark is suitable for interacting with an existing cluster (be it Spark standalone, YARN, or Mesos) - but does not contain the tools required to setup your own standalone Spark cluster. You can download the full version of Spark from the Apache Spark downloads page.

NOTE: If you are using this with a Spark standalone cluster you must ensure that the version (including minor version) matches or you may experience odd errors
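As an untested sketch (the master URL is a placeholder, and the pinned version should be whatever your cluster actually runs), the pip-installed package alone should be enough to reach an existing cluster:

    # Shell: pip install pyspark==2.1.0   (pin to the cluster's exact version)
    from pyspark.sql import SparkSession

    # The PyPI package ships only the client pieces needed to talk to an
    # existing standalone/YARN/Mesos cluster, not the cluster-setup tooling.
    spark = (
        SparkSession.builder
        .master("spark://master-host:7077")
        .appName("pip-installed-client")
        .getOrCreate()
    )
    spark.range(10).count()  # trivial job to confirm the connection works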

