Spark install: start-all fails

As the title says: after installing Spark, running start-all.sh fails with an error. The log is below:

Spark Command: /usr/java/jdk1.8.0_45/bin/java -cp /usr/local/spark-1.6.2/conf/:/usr/local/spark-1.6.2/lib/spark-assembly-1.6.2-hadoop2.2.0.jar -Xms1g -Xmx1g org.apache.spark.deploy.master.Master --ip localhost --port 7077 --webui-port 8080
========================================
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
        at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 7 more
2 answers

I hit the same problem. Adding the variable below to the config file at least got past this error:
vi ~/spark/conf/spark-env.sh

export SPARK_DIST_CLASSPATH=$HADOOP_HOME/share/hadoop/common/hadoop-common-2.6.0-cdh5.9.3.jar:$HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.6.0-cdh5.9.3.jar:$HADOOP_HOME/share/hadoop/common/lib/commons-cli-1.2.jar
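Instead of listing individual jars (which breaks whenever the Hadoop version changes, e.g. on a CDH upgrade), the Spark documentation for Hadoop-free builds suggests deriving the classpath from the local Hadoop install. A minimal sketch for spark-env.sh, assuming the `hadoop` command is on your PATH:

```shell
# spark-env.sh — let Hadoop report its own classpath instead of
# hard-coding jar versions. `hadoop classpath` prints every jar the
# local Hadoop installation needs, so the list tracks upgrades.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

After editing spark-env.sh, stop and restart the cluster (sbin/stop-all.sh, then sbin/start-all.sh) so the Master picks up the new classpath.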
