Hadoop: error when uploading a file to HDFS

The error message first:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/Users/laiyinan/OneDrive/%e6%96%87%e6%a1%a3/%e4%b9%a6%e7%b1%8d/%e5%a4%a7%e6%95%b0%e6%8d%ae/%e8%be%be%e5%86%85BIG/%e5%a4%a7%e6%95%b0%e6%8d%ae/%e5%a4%a7%e6%95%b0%e6%8d%ae%e7%ac%ac%e4%ba%8c%e9%98%b6%e6%ae%b5/02-Hadoop/Day02/Code/HDFS/lib/hadoop-auth-2.7.1.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release

java.lang.ExceptionInInitializerError
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2806)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2802)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2668)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:371)
    at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:160)
    at org.apache.hadoop.fs.FileSystem$1.run(FileSystem.java:157)
    at java.base/java.security.AccessController.doPrivileged(Native Method)
    at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:157)
    at cn.tedu.hdfs.HDFSDemo.put(HDFSDemo.java:53)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
    at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
    at org.junit.runner.JUnitCore.run(JUnitCore.java:160)
    at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69)
    at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
    at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:221)
    at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)
Caused by: java.lang.StringIndexOutOfBoundsException: begin 0, end 3, length 2
    at java.base/java.lang.String.checkBoundsBeginEnd(String.java:3319)
    at java.base/java.lang.String.substring(String.java:1874)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:50)
    ... 34 more


Process finished with exit code 255

And the code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.junit.Test;

import java.io.FileInputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.URI;
import java.net.URISyntaxException;

@Test
public void put() throws IOException, URISyntaxException, InterruptedException {
    Configuration conf = new Configuration();
    // Configuration set in code takes precedence over the XML configuration files
    conf.set("dfs.replication", "1");
    FileSystem fs = FileSystem.get(new URI("hdfs://127.0.0.1:9011"), conf, "root");

    // Create the specified file on HDFS
    OutputStream out = fs.create(new Path("/user/hadoop/a.log"));
    FileInputStream in = new FileInputStream("/usr/local/Cellar/hadoop/logs/hadoop-laiyinan-datanode-laiyinandeMacBook-Pro.local.log");
    IOUtils.copyBytes(in, out, conf);
    in.close();
    out.close();
}

Hadoop is known to be running normally on port 9011.
The same problem occurs with both JDK 1.8 and JDK 11.

How can this be fixed?

1 answer

The error is caused by the JDK version.

JDK 8u251 throws this error.
JDK 8u291 does not, and the code runs normally.
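
The stack trace points at the static initializer of org.apache.hadoop.util.Shell. My guess (an assumption based on the "begin 0, end 3, length 2" message, not verified line by line against the Hadoop 2.7.1 source) is that Shell parses the java.version system property with substring(0, 3), which fails on any JDK that reports a short version string such as "11". A minimal sketch of that failing pattern:

// Sketch only: reproduces the StringIndexOutOfBoundsException from the trace,
// assuming Shell's static init parses java.version with substring(0, 3).
public class VersionParseSketch {
    public static void main(String[] args) {
        String version = System.getProperty("java.version"); // e.g. "1.8.0_291" or "11"
        // On a JDK whose java.version is only two characters long, substring(0, 3)
        // throws: StringIndexOutOfBoundsException: begin 0, end 3, length 2
        boolean isJava7OrAbove = version.substring(0, 3).compareTo("1.7") >= 0;
        System.out.println("java.version = " + version + ", isJava7OrAbove = " + isJava7OrAbove);
    }
}

Running this under the JDK your IDE actually uses for the test shows which version string it reports, and whether the substring call survives.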
