When reading a file from Hadoop inside a Spring controller, the Hadoop service logs a batch of errors:
ERROR datanode.DataNode: BlockSender.sendChunks() exception:
java.io.IOException: An established connection was aborted by the software in your host machine.
at sun.nio.ch.SocketDispatcher.write0(Native Method)
at sun.nio.ch.SocketDispatcher.write(Unknown Source)
at sun.nio.ch.IOUtil.writeFromNativeBuffer(Unknown Source)
at sun.nio.ch.IOUtil.write(Unknown Source)
at sun.nio.ch.SocketChannelImpl.write(Unknown Source)
at sun.nio.ch.FileChannelImpl.transferToTrustedChannel(Unknown Source)
at sun.nio.ch.FileChannelImpl.transferTo(Unknown Source)
at org.apache.hadoop.net.SocketOutputStream.transferToFully(SocketOutputStream.java:223)
at org.apache.hadoop.hdfs.server.datanode.FileIoProvider.transferToSocketFully(FileIoProvider.java:280)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendPacket(BlockSender.java:619)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.doSendBlock(BlockSender.java:803)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.sendBlock(BlockSender.java:750)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:607)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:152)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:104)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:290)
at java.lang.Thread.run(Unknown Source)
Bean configuration:
@Bean
public FileSystem fileSystem() throws URISyntaxException, IOException {
    // Connect to the local HDFS NameNode
    URI uri = new URI("hdfs://127.0.0.1:9000");
    Configuration configuration = new Configuration();
    return FileSystem.get(uri, configuration);
}
Controller code:
// Open /1.mp4, read up to 20 bytes, then close the stream
FSDataInputStream fsDataInputStream = fileSystem.open(new Path("/1.mp4"));
byte[] a = new byte[20];
fsDataInputStream.read(a);
fsDataInputStream.close();
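As a side note on the snippet above: `InputStream.read(byte[])` is not guaranteed to fill the buffer in one call; it may return fewer bytes than requested, so the return value should not be ignored. A minimal, HDFS-free sketch of a read loop that handles this (names like `readFully` are illustrative, not from the original code):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadFullyDemo {
    // read(byte[]) may return fewer bytes than requested, so loop
    // until the buffer is full or the stream signals end-of-stream.
    static int readFully(InputStream in, byte[] buf) throws IOException {
        int total = 0;
        while (total < buf.length) {
            int n = in.read(buf, total, buf.length - total);
            if (n < 0) {
                break; // end of stream reached before the buffer was full
            }
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[20];
        byte[] buf = new byte[20];
        int n = readFully(new ByteArrayInputStream(data), buf);
        System.out.println(n); // prints 20
    }
}
```

The same loop applies to `FSDataInputStream`, which extends `InputStream` and inherits this contract.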
The controller code itself never throws. I tried stepping through with breakpoints: when I step slowly, no error appears; when I step quickly (skipping over the Spring framework code), the error shows up.
I also wrote a standalone main method that does the same read; no matter how I read and close the stream there, no error occurs. Is something misconfigured on the Spring side?