I recently installed PySpark. It installed correctly, but when I run the following simple program in Python, I get an error.
>>> from pyspark import SparkContext
>>> sc = SparkContext()
>>> data = range(1, 1000)
>>> rdd = sc.parallelize(data)
>>> rdd.collect()
The error appears when the last line runs; its key lines seem to be:
[Stage 0:> (0 + 0) / 4]18/01/15 14:36:32 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/pyspark/python/lib/pyspark.zip/pyspark/worker.py", line 123, in main
("%d.%d" % sys.version_info[:2], version))
Exception: Python in worker has different version 2.7 than that in driver 3.5, PySpark cannot run with different minor versions.Please check environment variables PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON are correctly set.
I have the following variables in my .bashrc:
export SPARK_HOME=/opt/spark
export PYTHONPATH=$SPARK_HOME/python3
I am using Python 3.
Originally posted by Akash Kumar; translated under the CC BY-SA 4.0 license.
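The error message points at the cause: the executors launch the system default python (2.7 here) while the driver runs 3.5, so PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON must both point at the same interpreter. A minimal sketch, assuming python3 is on the PATH of both the driver and the worker machines, sets them before the SparkContext is created:

import os

# Point the driver and the workers at the same interpreter.
# Assumption: "python3" is on PATH wherever Spark launches tasks.
os.environ["PYSPARK_PYTHON"] = "python3"
os.environ["PYSPARK_DRIVER_PYTHON"] = "python3"

from pyspark import SparkContext

sc = SparkContext()
print(sc.parallelize(range(1, 1000)).collect()[:5])  # [1, 2, 3, 4, 5]

Exporting the same two variables from .bashrc (next to SPARK_HOME) achieves the same thing. Note also that Spark's bundled Python sources live under $SPARK_HOME/python, not $SPARK_HOME/python3.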
By the way, if you use PyCharm, you can also add PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON to the environment variables of each run/debug configuration.
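To confirm the configuration took effect, a small check (a sketch using only standard PySpark API) can compare the interpreter version the driver sees with the one a worker task sees:

import sys
from pyspark import SparkContext

sc = SparkContext.getOrCreate()
driver_ver = "%d.%d" % sys.version_info[:2]
# The lambda runs on a worker, so it reports the executor's interpreter.
worker_ver = sc.parallelize([0], 1).map(
    lambda _: "%d.%d" % __import__("sys").version_info[:2]
).collect()[0]
print(driver_ver, worker_ver)  # both should print the same version

If the two values differ, the run/debug configuration is still launching mismatched interpreters.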