In standalone mode, submitting pi.py from the Spark examples fails with the error below:
:/usr/lib/spark# ./bin/spark-submit --master spark://master:7077 --executor-memory 512m examples/src/main/python/pi.py
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000d5550000, 715849728, 0) failed; error='Cannot allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 715849728 bytes for committing reserved memory.
# An error report file with more information is saved as:
# /usr/lib/spark/hs_err_pid16414.log
No matter how I change spark.executor.memory, it has no effect. My setup is three machines, each with 1 GB of RAM. Is the problem that my machines simply don't have enough memory? Hoping for a quick answer.
Uh, 1 GB... Run `free` and check; your memory is most likely already exhausted.
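A sketch of how to check and work around this, assuming the failed ~683 MB mapping comes from the driver JVM's default heap reservation (which `spark.executor.memory` does not control; the `--driver-memory` value of 512m below is an illustrative guess, not a verified fix):

```shell
# First confirm how much memory is actually free on the 1 GB machine:
free -m

# spark.executor.memory only sizes the executors; the JVM that crashed here
# is the one spark-submit launches locally. Capping its heap explicitly with
# --driver-memory may let it start within 1 GB of RAM:
./bin/spark-submit --master spark://master:7077 \
    --driver-memory 512m \
    --executor-memory 512m \
    examples/src/main/python/pi.py
```

If `free` shows almost nothing available even before submitting, the reply above is right: the box is simply too small, and no Spark memory setting will help.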