"pipeline.cached-files\":\"name:krb5.conf,path:/Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/krb5.conf,executable:false;name:hive.keytab,path:/Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/hive.keytab,executable:false\"
Full FlinkX 1.12 submit parameters (yarn-per-job mode):

-mode yarn-per-job \
-jobType sync \
-job /Users/wtz/work_place/project_place/flinkx_1.12/flinkx/flinkx-examples/json/stream/stream.json \
-flinkxDistDir /Users/wtz/work_place/project_place/flinkx_1.12/flinkx/flinkx-dist \
-flinkConfDir /Users/wtz/work_place/conf_place/flink/kudu1 \
-hadoopConfDir /Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/conf \
-flinkLibDir /Users/wtz/work_place/jar_place/flink-1.12.5/lib \
-confProp {\"yarn.application.queue\":\"default\",\"pipeline.cached-files\":\"name:krb5.conf,path:/Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/krb5.conf,executable:false;name:hive.keytab,path:/Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/hive.keytab,executable:false\",\"flinkx.dirty-data.jdbc.table\":\"flinkx_dirty_data_tiezhu\",\"flinkx.dirty-data.jdbc.url\":\"jdbc:mysql://172.16.100.186:3306/test\",\"flinkx.dirty-data.output-type\":\"log\",\"flinkx.dirty-data.log.print-interval\":1,\"flinkx.dirty-data.jdbc.password\":\"Abc123456\",\"flinkx.dirty-data.max-collect-failed-rows\":4,\"flinkx.dirty-data.jdbc.username\":\"test\",\"flinkx.dirty-data.max-rows\":2}
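The -confProp JSON above, pretty-printed for readability (same keys and values, whitespace only):

{
  "yarn.application.queue": "default",
  "pipeline.cached-files": "name:krb5.conf,path:/Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/krb5.conf,executable:false;name:hive.keytab,path:/Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/hive.keytab,executable:false",
  "flinkx.dirty-data.output-type": "log",
  "flinkx.dirty-data.log.print-interval": 1,
  "flinkx.dirty-data.max-rows": 2,
  "flinkx.dirty-data.max-collect-failed-rows": 4,
  "flinkx.dirty-data.jdbc.url": "jdbc:mysql://172.16.100.186:3306/test",
  "flinkx.dirty-data.jdbc.table": "flinkx_dirty_data_tiezhu",
  "flinkx.dirty-data.jdbc.username": "test",
  "flinkx.dirty-data.jdbc.password": "Abc123456"
}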
JVM options for the submitting client:
-Djava.security.krb5.conf=/Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/krb5.conf
Environment variable:

HADOOP_USER_NAME=hive
Kerberos settings in flink-conf.yaml:

security.kerberos.login.use-ticket-cache: true
security.kerberos.login.keytab: /Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/hive.keytab
security.kerberos.login.principal: hive/cdp01@CDP7DTSTACK.COM
With these in place, the job can be submitted.
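Before submitting, the keytab and principal can be sanity-checked with the standard MIT Kerberos client tools (not part of the FlinkX launcher):

# Obtain a TGT from the keytab; failure here means the keytab/principal pair is wrong
kinit -kt /Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/hive.keytab hive/cdp01@CDP7DTSTACK.COM
# Confirm the cached ticket's principal and expiry
klist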
FlinkX 1.10 instead takes the Kerberos credentials as dedicated launcher flags:

-krb5conf /Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/krb5.conf \
-keytab /Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/hive.keytab \
-principal hive/cdp01@CDP7DTSTACK.COM
Full FlinkX 1.10 submit parameters:

-mode yarnPer \
-job /Users/wtz/work_place/job_place/json/1.10/binlog-stream.json \
-pluginRoot /Users/wtz/work_place/project_place/flinkx_1.10/flinkx/syncplugins \
-flinkLibJar /Users/wtz/work_place/jar_place/flink-1.10.1/lib \
-flinkconf /Users/wtz/work_place/conf_place/flink/kudu1 \
-yarnconf /Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/conf \
-pluginLoadMode shipfile \
-queue default \
-confProp "{\"flink.checkpoint.interval\":30000}" \
-krb5conf /Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/krb5.conf \
-keytab /Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/hive.keytab \
-principal hive/cdp01@CDP7DTSTACK.COM
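Putting the pieces together, a sketch of one full client invocation combining the HADOOP_USER_NAME variable, the JVM krb5 option, and the launcher flags. The entry-point class is the one visible in the stack trace in the issue log below; the exact class name and classpath for a 1.10 build are assumptions to adapt:

# Sketch only: classpath and entry-point class are assumptions for a local build
export HADOOP_USER_NAME=hive
java -Djava.security.krb5.conf=/Users/wtz/work_place/conf_place/hadoop2/cdp_liantong/auth/kerberos/krb5.conf \
     -cp flinkx-clients.jar \
     com.dtstack.flinkx.client.Launcher \
     -mode yarnPer -job /Users/wtz/work_place/job_place/json/1.10/binlog-stream.json \
     -krb5conf ... -keytab ... -principal ...   # remaining flags as listed above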
Issue log
2022-01-11 17:56:02,685 - 10821 WARN [main] org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2022-01-11 17:56:04,475 - 12611 WARN [Thread-6] org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer: DataStreamer Exception
java.lang.NullPointerException
    at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:133)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:490)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:299)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:242)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:211)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:183)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1437)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
Exception in thread "main" org.apache.flink.client.deployment.ClusterDeploymentException: Could not deploy Yarn job cluster.
    at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:491)
    at com.dtstack.flinkx.client.yarn.YarnPerJobClusterClientHelper.submit(YarnPerJobClusterClientHelper.java:97)
    at com.dtstack.flinkx.client.Launcher.main(Launcher.java:126)
Caused by: java.io.IOException: DataStreamer Exception:
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:668)
Caused by: java.lang.NullPointerException
    at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:133)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:490)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:299)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:242)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:211)
    at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:183)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1437)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1385)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:554)
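Reading the trace: the NPE fires inside the CryptoInputStream constructor while SaslDataTransferClient is setting up encrypted data-transfer streams, immediately after the warning that libhadoop cannot be loaded. That combination typically points to the cluster having dfs.encrypt.data.transfer enabled while the client cannot instantiate a usable crypto codec. One commonly suggested client-side mitigation (an assumption to verify against this cluster's actual encryption settings) is to pin the negotiated cipher suite in the client's hdfs-site.xml under the hadoopConfDir used above:

<!-- hdfs-site.xml on the submitting machine; suggested mitigation only,
     verify against the cluster's dfs.encrypt.data.transfer configuration -->
<property>
  <name>dfs.encrypt.data.transfer.cipher.suites</name>
  <value>AES/CTR/NoPadding</value>
</property>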