When I submit a PyFlink job to YARN in application mode, Flink reports that it cannot find the Python script. I launch the job with the following command:
./flink run-application -t yarn-application \
-Dyarn.application.name=flinkcdctestpython \
-Dyarn.provided.lib.dirs="hdfs://nameservice1/pyflink/flink-dist-181" \
-pyarch hdfs://nameservice1/pyflink/pyflink181.zip \
-pyclientexec pyflink181.zip/pyflink181/bin/python \
-pyexec pyflink181.zip/pyflink181/bin/python \
-py hdfs://nameservice1/pyflink/wc2.py
It fails with: 2024-05-24 16:38:02,030 INFO org.apache.flink.client.python.PythonDriver [] - pyflink181.zip/pyflink181/bin/python: can't open file 'hdfs://nameservice1/pyflink/wc2.py': [Errno 2] No such file or directory
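My reading of that log line (possibly wrong) is that PythonDriver hands the hdfs:// URL straight to the Python interpreter as the script path, so the interpreter tries to open it as a local file, i.e. roughly:
# reconstructed from the log line above; plain CPython cannot open an hdfs:// URL as a local file
pyflink181.zip/pyflink181/bin/python hdfs://nameservice1/pyflink/wc2.py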
Submitting a Java job, by contrast, works fine with the following command:
./flink run-application -t yarn-application \
-Djobmanager.memory.process.size=1024m \
-Dtaskmanager.memory.process.size=1024m \
-Dyarn.application.name=flinkcdctest \
-Dyarn.provided.lib.dirs="hdfs://nameservice1/pyflink/flink-dist-181" \
hdfs://nameservice1/pyflink/StateMachineExample.jar
That job starts successfully and never complains about a missing jar; only the Python script variant fails. Since the jar can be found, the HDFS configuration itself seems to be fine.
Is there something I might have missed in my configuration?
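One variant I am considering, modeled on the YARN application-mode example in the PyFlink docs, is to ship local files with yarn.ship-files and point -pyfs/-pym at them instead of passing an hdfs:// path to -py. This is only a sketch, /path/to/shipfiles is a placeholder, and I have not verified it in my environment:
# sketch only: assumes wc2.py and pyflink181.zip are copied into a local /path/to/shipfiles directory
./flink run-application -t yarn-application \
-Dyarn.application.name=flinkcdctestpython \
-Dyarn.provided.lib.dirs="hdfs://nameservice1/pyflink/flink-dist-181" \
-Dyarn.ship-files=/path/to/shipfiles \
-pyarch shipfiles/pyflink181.zip \
-pyclientexec pyflink181.zip/pyflink181/bin/python \
-pyexec pyflink181.zip/pyflink181/bin/python \
-pyfs shipfiles \
-pym wc2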