Search before asking
I had searched in the issues and found no similar issues.
What happened
I want to pass Flink parameters when starting a SeaTunnel Flink job. I have tried two approaches: passing them on the command line at startup and setting them in the env {} block of the job configuration file, but neither worked.
Method 1: pass the Kerberos options as -D arguments on the command line when starting the job:
/apache-seatunnel-2.3.8/bin/start-seatunnel-flink-13-connector-v2.sh --config /apache-seatunnel-2.3.8/config/hive2ob/hive2ob_partition.txt -m yarn-cluster -Dsecurity.kerberos.login.keytab="/home/dis_poc.keytab" -Dsecurity.kerberos.login.principal="[email protected]" -Dsecurity.kerberos.krb5-conf.path="/etc/krb5.conf"
Execute SeaTunnel Flink Job: ${FLINK_HOME}/bin/flink run -m yarn-cluster -D security.kerberos.login.keytab=/home/dis_poc.keytab -D security.kerberos.login.principal=[email protected] -D security.kerberos.krb5-conf.path=/etc/krb5.conf -c org.apache.seatunnel.core.starter.flink.SeaTunnelFlink /home/q/dis/apache-seatunnel-2.3.8/starter/seatunnel-flink-13-starter.jar --config /home/q/dis/apache-seatunnel-2.3.8/config/hive2ob/hive2ob.txt --name SeaTunnel
Setting HADOOP_CONF_DIR=/etc/hadoop/conf because no HADOOP_CONF_DIR or HADOOP_CLASSPATH was set.
Setting HBASE_CONF_DIR=/etc/hbase/conf because no HBASE_CONF_DIR was set.
2024-11-27 09:56:42,867 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli [] - Found Yarn properties file under /tmp/.yarn-properties-root.
2024-11-27 09:56:42,867 INFO org.apache.flink.yarn.cli.FlinkYarnSessionCli [] - Found Yarn properties file under /tmp/.yarn-properties-root.
2024-11-27 09:56:45,022 WARN org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory [] - The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2024-11-27 09:56:46,101 WARN org.apache.flink.yarn.configuration.YarnLogConfigUtil [] - The configuration directory ('/home/q/dis/flink-1.13.6/conf') already contains a LOG4J config file.If you want to use logback, then please delete or rename the log configuration file.
2024-11-27 09:56:46,257 INFO org.apache.flink.yarn.YarnClusterDescriptor [] - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
------------------------------------------------------------
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Flink job executed failed
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:372)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
Caused by: org.apache.seatunnel.core.starter.exception.CommandExecuteException: Flink job executed failed
at org.apache.seatunnel.core.starter.flink.command.FlinkTaskExecuteCommand.execute(FlinkTaskExecuteCommand.java:63)
at org.apache.seatunnel.core.starter.SeaTunnel.run(SeaTunnel.java:40)
at org.apache.seatunnel.core.starter.flink.SeaTunnelFlink.main(SeaTunnelFlink.java:34)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
... 11 more
Caused by: org.apache.seatunnel.core.starter.exception.TaskExecuteException: Execute Flink job error
at org.apache.seatunnel.core.starter.flink.execution.FlinkExecution.execute(FlinkExecution.java:143)
at org.apache.seatunnel.core.starter.flink.command.FlinkTaskExecuteCommand.execute(FlinkTaskExecuteCommand.java:61)
... 18 more
Caused by: org.apache.flink.client.deployment.ClusterDeploymentException: Could not deploy Yarn job cluster.
at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:481)
at org.apache.flink.client.deployment.executors.AbstractJobClusterExecutor.execute(AbstractJobClusterExecutor.java:81)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.executeAsync(StreamExecutionEnvironment.java:1956)
at org.apache.flink.client.program.StreamContextEnvironment.executeAsync(StreamContextEnvironment.java:137)
at org.apache.flink.client.program.StreamContextEnvironment.execute(StreamContextEnvironment.java:76)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1833)
at org.apache.seatunnel.core.starter.flink.execution.FlinkExecution.execute(FlinkExecution.java:131)
... 19 more
Caused by: java.lang.RuntimeException: Hadoop security with Kerberos is enabled but the login user does not have Kerberos credentials or delegation tokens!
at org.apache.flink.yarn.YarnClusterDescriptor.deployInternal(YarnClusterDescriptor.java:528)
at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:474)
... 25 more
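For reference, these are the same Kerberos keys that Flink normally reads from flink-conf.yaml; a minimal sketch using the values from the commands above (shown only for comparison, not taken from my cluster configuration):

# conf/flink-conf.yaml (sketch)
security.kerberos.login.keytab: /home/dis_poc.keytab
security.kerberos.login.principal: [email protected]
security.kerberos.krb5-conf.path: /etc/krb5.conf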
Method 2: set the options in the env {} block of the SeaTunnel job config file:
env {
  execution.parallelism = 1
  job.mode = "STREAMING"
  checkpoint.interval = 900000
  flink.security.kerberos.login.keytab = "/home/dis_poc.keytab"
  flink.security.kerberos.login.principal = "[email protected]"
  flink.security.kerberos.krb5-conf.path = "/etc/krb5.conf"
  security.kerberos.login.keytab = "/home/dis_poc.keytab"
  security.kerberos.login.principal = "[email protected]"
  security.kerberos.krb5-conf.path = "/etc/krb5.conf"
}
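My expectation for Method 2 is that options in env {} carrying the "flink." prefix are forwarded into the Flink Configuration before the job is submitted. A rough illustration of that expectation (this is not SeaTunnel's actual code; the class and method names below are my own):

import java.util.Map;
import org.apache.flink.configuration.Configuration;

// Illustration only: how I expect "flink."-prefixed env {} options to be forwarded.
public class FlinkEnvOptionForwardingSketch {
    static Configuration toFlinkConfiguration(Map<String, String> envOptions) {
        Configuration flinkConf = new Configuration();
        for (Map.Entry<String, String> e : envOptions.entrySet()) {
            if (e.getKey().startsWith("flink.")) {
                // e.g. "flink.security.kerberos.login.keytab" -> "security.kerberos.login.keytab"
                flinkConf.setString(e.getKey().substring("flink.".length()), e.getValue());
            }
        }
        return flinkConf;
    }
}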
SeaTunnel Version
2.3.8
SeaTunnel Config
Running Command
Error Exception
Zeta or Flink or Spark Version
Flink 1.13.6
Java or Scala Version
1.8
Screenshots
No response
Are you willing to submit PR?
Code of Conduct