[example] paimon action sync database from mysql #4038

Open
2 of 3 tasks
jalousiex opened this issue Sep 6, 2024 · 3 comments
Labels
bug Something isn't working

Comments


jalousiex commented Sep 6, 2024

Search before asking

  • I had searched in the issues and found no similar issues.

Java Version

java version "1.8.0_311"
Java(TM) SE Runtime Environment (build 1.8.0_311-b11)
Java HotSpot(TM) 64-Bit Server VM (build 25.311-b11, mixed mode)

Scala Version

2.12.x

StreamPark Version

apache-streampark_2.12-2.1.4-incubating-bin.tar

Flink Version

flink 1.18 standalone session 1 master, 3 workers

dinky-app-1.18-1.1.0-jar-with-dependencies.jar
dinky-client-1.18-1.1.0.jar
dinky-client-base-1.1.0.jar
dinky-common-1.1.0.jar
flink-cep-1.18.1.jar
flink-connector-files-1.18.1.jar
flink-connector-jdbc-3.2.0-1.18.jar
flink-csv-1.18.1.jar
flink-dist-1.18.1.jar
flink-doris-connector-1.18-1.6.2.jar
flink-json-1.18.1.jar
flink-scala_2.12-1.18.1.jar
flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
flink-sql-connector-mysql-cdc-3.0.1.jar
flink-sql-connector-mysql-cdc-3.1.1.jar
flink-table-api-java-uber-1.18.1.jar
flink-table-planner_2.12-1.18.1.jar
flink-table-runtime-1.18.1.jar
log4j-1.2-api-2.17.1.jar
log4j-api-2.17.1.jar
log4j-core-2.17.1.jar
log4j-slf4j-impl-2.17.1.jar
mysql-connector-java-8.0.27.jar
ojdbc8-23.2.0.0.jar
paimon-flink-1.18-0.8.2.jar

deploy mode

None

What happened

Following Paimon's guide, I first used flink run to test a Paimon action job, which succeeded.

One issue here: with only the mysql cdc 3.1.1 jar in flink/lib, the job failed with a class-not-found error; adding the mysql cdc 3.0.1 jar resolved it:
java.lang.NoClassDefFoundError: com/ververica/cdc/debezium/DebeziumDeserializationSchema
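This is consistent with the package relocation in newer CDC releases: the com.ververica.cdc.* classes were renamed to org.apache.flink.cdc.* after the project was donated to Flink, so a 3.1.x jar alone no longer provides the old class name that paimon-flink-action references. A quick way to confirm which jar in flink/lib actually ships the class (a sketch; paths follow this setup, adjust for your environment):

```shell
# Environment-specific sketch: list the CDC connector jars on this cluster
# and check each one for the class the action jar is asking for.
for jar in /opt/flink-1.18.1/lib/flink-sql-connector-mysql-cdc-*.jar; do
  echo "== $jar"
  unzip -l "$jar" | grep 'DebeziumDeserializationSchema.class' || echo "   (class not found in this jar)"
done
```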

root@gp_mdw:/opt/flink-1.18.1# bin/flink run \
> -Dexecution.checkpointing.interval=10s \
> -Dexecution.checkpointing.num-retained=5 \
> -Dstate.checkpoints.num-retained=10 \
> -Dpipeline.name='sync-db-mysql-to-paimon-s3' \
> ../fjars/paimon-flink-action-0.8.2.jar \
> mysql_sync_database \
> --mysql-conf hostname=172.31.4.149 \
> --mysql-conf port=3306 \
> --mysql-conf username=testuser \
> --mysql-conf password=***** \
> --mysql-conf database-name=testflink \
> --warehouse s3://flink/paimon/ \
> --catalog_conf s3.endpoint=https://ossapi-tst \
> --catalog_conf s3.access-key=*****\
> --catalog_conf s3.secret-key=*****\
> --catalog_conf s3.path.style.access=true \
> --database ods \
> --including_tables='o.*|product.?|shipments' \
> --table_prefix my_ \
> --table_suffix _001 \
> --table_conf source.checkpoint-align.enabled=true \
> --table_conf sink.parallelism=1
2024-09-06 15:53:15,106 WARN  org.apache.hadoop.metrics2.impl.MetricsConfig                [] - Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
2024-09-06 15:53:15,130 INFO  org.apache.hadoop.metrics2.impl.MetricsSystemImpl            [] - Scheduled Metric snapshot period at 10 second(s).
2024-09-06 15:53:15,130 INFO  org.apache.hadoop.metrics2.impl.MetricsSystemImpl            [] - s3a-file-system metrics system started
2024-09-06 15:53:19,443 WARN  org.apache.hadoop.fs.s3a.S3ABlockOutputStream                [] - Application invoked the Syncable API against stream writing to paimon/ods.db/my_orders_001/schema/.schema-3.a32f8201-c8fd-4e07-a5a1-f1e33d7f023e.tmp. This is unsupported
Job has been submitted with JobID f51a72da1d85e971cdd683424a5bdaf0

Then I used StreamPark to submit a Custom Code job:

Upload local job jar: paimon-flink-action-0.8.2.jar
Dependency Upload Jar: none (the cdc jars are already in flink/lib)
Dynamic Properties

-Dexecution.checkpointing.interval=10s 
-Dexecution.checkpointing.num-retained=5 
-Dstate.checkpoints.num-retained=10 

Program Args

mysql_sync_database 
--mysql-conf hostname=172.31.4.149 
--mysql-conf port=3306 
--mysql-conf username=testuser 
--mysql-conf password=*****
--mysql-conf database-name=testflink 
--warehouse s3://flink/paimon/ 
--catalog_conf s3.endpoint=https://ossapi-tst 
--catalog_conf s3.access-key=*****
--catalog_conf s3.secret-key=*****
--catalog_conf s3.path.style.access=true 
--database default 
--including_tables=o.*|product.?|shipments 
--table_prefix my_ 
--table_suffix _001 
--table_conf source.checkpoint-align.enabled=true 
--table_conf changelog-producer=input 
--table_conf sink.parallelism=1

Error Exception

This one is strange: I struggled with mysql cdc 3.1.1 for a long time, and the problem went away after adding the mysql cdc 3.0.1 jar?

java.util.concurrent.CompletionException: java.lang.reflect.InvocationTargetException
	at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273)
	at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1592)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.streampark.flink.client.FlinkClient$.$anonfun$proxy$1(FlinkClient.scala:87)
	at org.apache.streampark.flink.proxy.FlinkShimsProxy$.$anonfun$proxy$1(FlinkShimsProxy.scala:60)
	at org.apache.streampark.common.util.ClassLoaderUtils$.runAsClassLoader(ClassLoaderUtils.scala:38)
	at org.apache.streampark.flink.proxy.FlinkShimsProxy$.proxy(FlinkShimsProxy.scala:60)
	at org.apache.streampark.flink.client.FlinkClient$.proxy(FlinkClient.scala:82)
	at org.apache.streampark.flink.client.FlinkClient$.submit(FlinkClient.scala:53)
	at org.apache.streampark.flink.client.FlinkClient.submit(FlinkClient.scala)
	at org.apache.streampark.console.core.service.impl.ApplicationServiceImpl.lambda$start$8(ApplicationServiceImpl.java:1655)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
	... 3 more
Caused by: java.lang.RuntimeException: 

[flink-submit] Both JobGraph submit plan and Rest API submit plan all failed!
JobGraph Submit plan failed detail:
------------------------------------------------------------------
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: org.apache.paimon.fs.UnsupportedSchemeException: Could not find a file io implementation for scheme 's3' in the classpath.  FlinkFileIOLoader also cannot access this path. Hadoop FileSystem also cannot access this path 's3://flink/paimon'.
	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:372)
	at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
	at org.apache.flink.client.program.PackagedProgramUtils.getPipelineFromProgram(PackagedProgramUtils.java:158)
	at org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:82)
	at org.apache.streampark.flink.client.trait.FlinkClientTrait.getJobGraph(FlinkClientTrait.scala:252)
	at org.apache.streampark.flink.client.trait.FlinkClientTrait.getJobGraph$(FlinkClientTrait.scala:231)
	at org.apache.streampark.flink.client.impl.RemoteClient$.jobGraphSubmit(RemoteClient.scala:139)
	at org.apache.streampark.flink.client.impl.RemoteClient$.$anonfun$doSubmit$1(RemoteClient.scala:48)
	at org.apache.streampark.flink.client.trait.FlinkClientTrait.$anonfun$trySubmit$1(FlinkClientTrait.scala:207)
	at scala.util.Try$.apply(Try.scala:209)
	at org.apache.streampark.flink.client.trait.FlinkClientTrait.trySubmit(FlinkClientTrait.scala:205)
	at org.apache.streampark.flink.client.trait.FlinkClientTrait.trySubmit$(FlinkClientTrait.scala:201)
	at org.apache.streampark.flink.client.impl.RemoteClient$.doSubmit(RemoteClient.scala:49)
	at org.apache.streampark.flink.client.trait.FlinkClientTrait.submit(FlinkClientTrait.scala:123)
	at org.apache.streampark.flink.client.trait.FlinkClientTrait.submit$(FlinkClientTrait.scala:60)
	at org.apache.streampark.flink.client.impl.RemoteClient$.submit(RemoteClient.scala:34)
	at org.apache.streampark.flink.client.FlinkClientHandler$.submit(FlinkClientHandler.scala:40)
	at org.apache.streampark.flink.client.FlinkClientHandler.submit(FlinkClientHandler.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.streampark.flink.client.FlinkClient$.$anonfun$proxy$1(FlinkClient.scala:87)
	at org.apache.streampark.flink.proxy.FlinkShimsProxy$.$anonfun$proxy$1(FlinkShimsProxy.scala:60)
	at org.apache.streampark.common.util.ClassLoaderUtils$.runAsClassLoader(ClassLoaderUtils.scala:38)
	at org.apache.streampark.flink.proxy.FlinkShimsProxy$.proxy(FlinkShimsProxy.scala:60)
	at org.apache.streampark.flink.client.FlinkClient$.proxy(FlinkClient.scala:82)
	at org.apache.streampark.flink.client.FlinkClient$.submit(FlinkClient.scala:53)
	at org.apache.streampark.flink.client.FlinkClient.submit(FlinkClient.scala)
	at org.apache.streampark.console.core.service.impl.ApplicationServiceImpl.lambda$start$8(ApplicationServiceImpl.java:1655)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.UncheckedIOException: org.apache.paimon.fs.UnsupportedSchemeException: Could not find a file io implementation for scheme 's3' in the classpath.  FlinkFileIOLoader also cannot access this path. Hadoop FileSystem also cannot access this path 's3://flink/paimon'.
	at org.apache.paimon.catalog.CatalogFactory.createCatalog(CatalogFactory.java:92)
	at org.apache.paimon.catalog.CatalogFactory.createCatalog(CatalogFactory.java:66)
	at org.apache.paimon.flink.FlinkCatalogFactory.createPaimonCatalog(FlinkCatalogFactory.java:80)
	at org.apache.paimon.flink.action.ActionBase.initPaimonCatalog(ActionBase.java:71)
	at org.apache.paimon.flink.action.ActionBase.<init>(ActionBase.java:58)
	at org.apache.paimon.flink.action.cdc.SynchronizationActionBase.<init>(SynchronizationActionBase.java:77)
	at org.apache.paimon.flink.action.cdc.SyncDatabaseActionBase.<init>(SyncDatabaseActionBase.java:61)
	at org.apache.paimon.flink.action.cdc.mysql.MySqlSyncDatabaseAction.<init>(MySqlSyncDatabaseAction.java:108)
	at org.apache.paimon.flink.action.cdc.mysql.MySqlSyncDatabaseActionFactory.createAction(MySqlSyncDatabaseActionFactory.java:52)
	at org.apache.paimon.flink.action.cdc.mysql.MySqlSyncDatabaseActionFactory.createAction(MySqlSyncDatabaseActionFactory.java:31)
	at org.apache.paimon.flink.action.cdc.SynchronizationActionFactoryBase.create(SynchronizationActionFactoryBase.java:45)
	at org.apache.paimon.flink.action.cdc.SyncDatabaseActionFactoryBase.create(SyncDatabaseActionFactoryBase.java:44)
	at org.apache.paimon.flink.action.ActionFactory.createAction(ActionFactory.java:82)
	at org.apache.paimon.flink.action.FlinkActions.main(FlinkActions.java:38)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
	... 33 more
Caused by: org.apache.paimon.fs.UnsupportedSchemeException: Could not find a file io implementation for scheme 's3' in the classpath.  FlinkFileIOLoader also cannot access this path. Hadoop FileSystem also cannot access this path 's3://flink/paimon'.
	at org.apache.paimon.fs.FileIO.get(FileIO.java:420)
	at org.apache.paimon.catalog.CatalogFactory.createCatalog(CatalogFactory.java:89)
	... 51 more
	Suppressed: java.nio.file.AccessDeniedException: s3://flink/paimon: org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials provided by DynamicTemporaryAWSCredentialsProvider TemporaryAWSCredentialsProvider SimpleAWSCredentialsProvider EnvironmentVariableCredentialsProvider IAMInstanceCredentialsProvider : com.amazonaws.SdkClientException: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY))
		at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:212)
		at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:175)
		at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3799)
		at org.apache.hadoop.fs.s3a.S3AFileSystem.innerGetFileStatus(S3AFileSystem.java:3688)
		at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$exists$34(S3AFileSystem.java:4703)
		at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.lambda$trackDurationOfOperation$5(IOStatisticsBinding.java:499)
		at org.apache.hadoop.fs.statistics.impl.IOStatisticsBinding.trackDuration(IOStatisticsBinding.java:444)
		at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2337)
		at org.apache.hadoop.fs.s3a.S3AFileSystem.trackDurationAndSpan(S3AFileSystem.java:2356)
		at org.apache.hadoop.fs.s3a.S3AFileSystem.exists(S3AFileSystem.java:4701)
		at org.apache.flink.fs.s3hadoop.common.HadoopFileSystem.exists(HadoopFileSystem.java:165)
		at org.apache.paimon.flink.FlinkFileIO.exists(FlinkFileIO.java:100)
		at org.apache.paimon.fs.FileIOUtils.checkAccess(FileIOUtils.java:37)
		at org.apache.paimon.fs.FileIO.get(FileIO.java:388)
		... 52 more
	Caused by: org.apache.hadoop.fs.s3a.auth.NoAuthWithAWSException: No AWS Credentials provided by DynamicTemporaryAWSCredentialsProvider TemporaryAWSCredentialsProvider SimpleAWSCredentialsProvider EnvironmentVariableCredentialsProvider IAMInstanceCredentialsProvider : com.amazonaws.SdkClientException: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY))
		at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:216)
		at com.amazonaws.http.AmazonHttpClient$RequestExecutor.getCredentialsFromContext(AmazonHttpClient.java:1269)
		at com.amazonaws.http.AmazonHttpClient$RequestExecutor.runBeforeRequestHandlers(AmazonHttpClient.java:845)
		at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:794)
		at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:781)
		at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:755)
		at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:715)
		at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:697)
		at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:561)
		at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:541)
		at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5456)
		at com.amazonaws.services.s3.AmazonS3Client.getBucketRegionViaHeadRequest(AmazonS3Client.java:6432)
		at com.amazonaws.services.s3.AmazonS3Client.fetchRegionFromCache(AmazonS3Client.java:6404)
		at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5441)
		at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:5403)
		at com.amazonaws.services.s3.AmazonS3Client.getObjectMetadata(AmazonS3Client.java:1372)
		at org.apache.hadoop.fs.s3a.S3AFileSystem.lambda$getObjectMetadata$10(S3AFileSystem.java:2545)
		at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:414)
		at org.apache.hadoop.fs.s3a.Invoker.retryUntranslated(Invoker.java:377)
		at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2533)
		at org.apache.hadoop.fs.s3a.S3AFileSystem.getObjectMetadata(S3AFileSystem.java:2513)
		at org.apache.hadoop.fs.s3a.S3AFileSystem.s3GetFileStatus(S3AFileSystem.java:3776)
		... 63 more
	Caused by: com.amazonaws.SdkClientException: Unable to load AWS credentials from environment variables (AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY))
		at com.amazonaws.auth.EnvironmentVariableCredentialsProvider.getCredentials(EnvironmentVariableCredentialsProvider.java:49)
		at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:177)
		... 84 more
	Suppressed: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "s3"
		at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3443)
		at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3466)
		at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:174)
		at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3574)
		at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3521)
		at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:540)
		at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
		at org.apache.paimon.fs.hadoop.HadoopFileIO.createFileSystem(HadoopFileIO.java:175)
		at org.apache.paimon.fs.hadoop.HadoopFileIO.getFileSystem(HadoopFileIO.java:168)
		at org.apache.paimon.fs.hadoop.HadoopFileIO.getFileSystem(HadoopFileIO.java:145)
		at org.apache.paimon.fs.hadoop.HadoopFileIO.exists(HadoopFileIO.java:110)
		at org.apache.paimon.fs.FileIOUtils.checkAccess(FileIOUtils.java:37)
		at org.apache.paimon.fs.FileIO.get(FileIO.java:397)
		... 52 more
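The root cause above is that the client-side classpath (where StreamPark builds the JobGraph) cannot resolve the s3 scheme, and once it falls back to Hadoop it finds no AWS credentials. A sketch of the usual ways to make s3 resolvable on a standalone setup (jar versions and paths below follow this cluster but are assumptions; verify against your environment):

```shell
# Environment-specific sketch; adjust paths and versions before running.
FLINK_HOME=/opt/flink-1.18.1

# Option 1: Paimon's own S3 filesystem support, loaded from flink/lib.
cp paimon-s3-0.8.2.jar "$FLINK_HOME/lib/"

# Option 2: Flink's S3 filesystem plugin (must live in its own plugins subdirectory).
mkdir -p "$FLINK_HOME/plugins/s3-fs-hadoop"
cp "$FLINK_HOME/opt/flink-s3-fs-hadoop-1.18.1.jar" "$FLINK_HOME/plugins/s3-fs-hadoop/"

# Endpoint and credentials in flink-conf.yaml, so the submitting client
# (not only the running cluster) can reach S3 during plan compilation.
cat >> "$FLINK_HOME/conf/flink-conf.yaml" <<'EOF'
s3.endpoint: https://ossapi-tst
s3.access-key: <access-key>
s3.secret-key: <secret-key>
s3.path.style.access: true
EOF
```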

Screenshots

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

@jalousiex jalousiex added the bug Something isn't working label Sep 6, 2024
@jalousiex jalousiex changed the title [Bug] paimon action sync database failed while flink run worked [example] paimon action sync database from mysql Sep 6, 2024
Mrart (Contributor) commented Sep 8, 2024

Use the mysql cdc 3.2.0 package and then sink to Paimon; the error shows that the Paimon catalog does not support s3 for now.

Mrart (Contributor) commented Sep 8, 2024

java.lang.NoClassDefFoundError: com/ververica/cdc/debezium/DebeziumDeserializationSchema — this is a mysql cdc bug; switch to version 3.2.0.

jalousiex (Author) commented:

Use the mysql cdc 3.2.0 package and then sink to Paimon; the error shows that the Paimon catalog does not support s3 for now.

Actually, 0.8.2 does support it: the job did run successfully. The log shows that the JobGraph submission failed but the REST API submission succeeded.
As for s3, Flink behaves quite differently depending on the wrapper around it; native Flink handles it well. A Dinky submission failed with a similar error, yet StreamPark ultimately got the job running.
Also, strangely, the dynamic properties did not take effect. Below is the full log of running the custom job after restarting StreamPark.
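One way to check whether the dynamic properties actually reached the job is Flink's REST API, which exposes the effective job and checkpoint configuration (a sketch; the address matches this cluster, but the job id is a placeholder):

```shell
# Placeholders: substitute the JobManager address and the submitted job id.
# Effective execution configuration of the job:
curl -s http://172.31.4.220:8081/jobs/<job-id>/config

# Effective checkpointing settings (interval, retention, ...):
curl -s http://172.31.4.220:8081/jobs/<job-id>/checkpoints/config
```

Comparing these against the values passed as -D dynamic properties shows whether they were overridden by flink-conf.yaml defaults.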

2024-09-11 14:52:37 | INFO  | XNIO-1 task-2 | io.undertow.servlet:389] Initializing Spring DispatcherServlet 'dispatcherServlet'
2024-09-11 14:52:37 | INFO  | XNIO-1 task-2 | org.springframework.web.servlet.DispatcherServlet:525] Initializing Servlet 'dispatcherServlet'
2024-09-11 14:52:37 | INFO  | XNIO-1 task-2 | org.springframework.web.servlet.DispatcherServlet:547] Completed initialization in 3 ms
2024-09-11 14:52:37 | INFO  | XNIO-1 task-1 | org.apache.streampark.console.base.handler.GlobalExceptionHandler:61] Permission denied: Unauthorized
2024-09-11 14:52:37 | INFO  | XNIO-1 task-2 | org.apache.streampark.console.base.handler.GlobalExceptionHandler:54] Unauthenticated: This subject is anonymous - it does not have any identifying principals and authorization operations require an identity to check against.  A Subject instance will acquire these identifying principals automatically after a successful login is performed be executing org.apache.shiro.subject.Subject.login(AuthenticationToken) or when 'Remember Me' functionality is enabled by the SecurityManager.  This exception can also occur when a previously logged-in Subject has logged out which makes it anonymous again.  Because an identity is currently not known due to any of these conditions, authorization is denied.
2024-09-11 14:52:37 | INFO  | XNIO-1 task-4 | org.apache.streampark.console.base.handler.GlobalExceptionHandler:54] Unauthenticated: This subject is anonymous - it does not have any identifying principals and authorization operations require an identity to check against.  A Subject instance will acquire these identifying principals automatically after a successful login is performed be executing org.apache.shiro.subject.Subject.login(AuthenticationToken) or when 'Remember Me' functionality is enabled by the SecurityManager.  This exception can also occur when a previously logged-in Subject has logged out which makes it anonymous again.  Because an identity is currently not known due to any of these conditions, authorization is denied.
14:54:07.624 [streampark-flink-app-bootstrap-0] DEBUG org.apache.streampark.common.util.CommandUtils - [StreamPark] Command execute:
java -classpath /opt/flink-1.18.1/lib/flink-dist-1.18.1.jar org.apache.flink.client.cli.CliFrontend --version
14:54:08.065 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.common.conf.FlinkVersion - [StreamPark] SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Version: 1.18.1, Commit ID: a8c8b1c

14:54:08.072 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] add flink shims urls classloader,flink version:
----------------------------------------- flink version -----------------------------------
     flinkHome    : /opt/flink-1.18.1
     distJarName  : flink-dist-1.18.1.jar
     flinkVersion : 1.18.1
     majorVersion : 1.18
     scalaVersion : 2.12
     shimsVersion : streampark-flink-shims_flink-1.18
-------------------------------------------------------------------------------------------

14:54:08.082 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] include streampark lib: streampark-flink-client-core_2.12-2.1.4.jar
14:54:08.083 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] include flink shims jar lib: streampark-flink-shims_flink-1.18_2.12-2.1.4.jar
14:54:08.083 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] include streampark lib: streampark-common_2.12-2.1.4.jar
14:54:08.084 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] include jar lib: streampark-shaded-jackson-1.0.0.jar
14:54:08.084 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] include streampark lib: streampark-flink-shims-base_2.12-2.1.4.jar
14:54:08.084 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] include streampark lib: streampark-flink-client-api_2.12-2.1.4.jar
14:54:08.084 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] include streampark lib: streampark-flink-proxy_2.12-2.1.4.jar
14:54:08.084 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] include streampark lib: streampark-flink-packer_2.12-2.1.4.jar
14:54:08.085 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] include streampark lib: streampark-flink-kubernetes_2.12-2.1.4.jar
14:54:08.085 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.proxy.FlinkShimsProxy - [StreamPark] include streampark lib: streampark-flink-core_2.12-2.1.4.jar
14:54:08.926 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark]
--------------------------------------- flink job start ---------------------------------------
    userFlinkHome    : /opt/flink-1.18.1
    flinkVersion     : 1.18.1
    appName          : mysql-to-paimon-s3
    devMode          : CUSTOM_CODE
    execMode         : REMOTE
    k8sNamespace     : null
    flinkExposedType : null
    clusterId        : null
    applicationType  : Apache Flink
    savePoint        : null
    properties       : rest.address -> 172.31.4.220 rest.port -> 8081 execution.checkpointing.interval -> 10s state.checkpoints.num-retained -> 10 execution.checkpointing.num-retained -> 5 classloader.resolve-order -> parent-first
    args             : mysql_sync_database
--mysql-conf hostname=172.31.4.149
--mysql-conf port=3306
--mysql-conf username=testuser
--mysql-conf password=***
--mysql-conf database-name=testflink
--warehouse s3://flink/paimon/
--catalog_conf s3.endpoint=https://ossapi-tst
--catalog_conf s3.access-key=ak_flink
--catalog_conf s3.secret-key=***
--catalog_conf s3.path.style.access=true
--database ods
--including_tables 'o.*|product.?|shipments'
--table_prefix my_
--table_suffix _001
--table_conf source.checkpoint-align.enabled=true
--table_conf changelog-producer=input
--table_conf sink.parallelism=1
    appConf          : json://{"$internal.application.main":"org.apache.paimon.flink.action.FlinkActions"}
    flinkBuildResult : { workspacePath: /opt/streampark_ws/workspace/100001, shadedJarPath: /opt/streampark_ws/workspace/100001/streampark-flinkjob_mysql-to-paimon-s3.jar, pass: true }
-------------------------------------------------------------------------------------------

2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: state.checkpoints.num-retained, 20
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: fs.allowed-fallback-filesystems, s3
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: jobmanager.execution.failover-strategy, region
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: jobmanager.rpc.address, 172.31.4.220
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.num-retained, 10
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: state.savepoints.dir, s3://flink/savepoints/
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.aligned-checkpoint-timeout, 60s
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: jobmanager.bind-host, 0.0.0.0
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: s3.secret-key, ******
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: s3.endpoint, https://ossapi-tst.hengrui.com
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.checkpoints-after-tasks-finish.enabled, true
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: parallelism.default, 1
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: taskmanager.numberOfTaskSlots, 4
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: env.java.opts.all, --add-exports=java.base/sun.net.util=ALL-UNNAMED --add-exports=java.rmi/sun.rmi.registry=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED --add-exports=java.security.jgss/sun.security.krb5=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.text=ALL-UNNAMED --add-opens=java.base/java.time=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.locks=ALL-UNNAMED
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: pekko.ask.timeout, 30s
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: taskmanager.memory.process.size, 3072m
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.mode, EXACTLY_ONCE
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: taskmanager.bind-host, 0.0.0.0
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: classloader.resolve-order, parent-first
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: web.timeout, 50000
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: jobmanager.memory.process.size, 2048m
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: jobmanager.rpc.port, 6123
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: rest.bind-address, 0.0.0.0
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.interval, 1min
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.timeout, 3min
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: s3.access-key, ak_flink
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: taskmanager.memory.managed.size, 0
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.externalized-checkpoint-retention, DELETE_ON_CANCELLATION
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: rest.address, 172.31.4.220
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.max-concurrent-checkpoints, 2
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: state.checkpoints.dir, s3://flink/checkpoints/
2024-09-11 14:54:08 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: s3.path.style.access, true
14:54:09.280 [streampark-flink-app-bootstrap-0] WARN org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] param:$internal.application.main is error,skip it.
14:54:09.321 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] submit application dynamicProperties:  rest.address :172.31.4.220
14:54:09.322 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] submit application dynamicProperties:  rest.port :8081
14:54:09.322 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] submit application dynamicProperties:  execution.checkpointing.interval :10s
14:54:09.322 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] submit application dynamicProperties:  state.checkpoints.num-retained :10
14:54:09.322 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] submit application dynamicProperties:  execution.checkpointing.num-retained :5
14:54:09.322 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] submit application dynamicProperties:  classloader.resolve-order :parent-first
14:54:09.332 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - cliArgs: -t remote -Drest.address=172.31.4.220 -Drest.port=8081 -Dexecution.checkpointing.interval=10s -Dstate.checkpoints.num-retained=10 -Dexecution.checkpointing.num-retained=5 -Dclassloader.resolve-order=parent-first
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: state.checkpoints.num-retained, 20
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: fs.allowed-fallback-filesystems, s3
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: jobmanager.execution.failover-strategy, region
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: jobmanager.rpc.address, 172.31.4.220
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.num-retained, 10
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: state.savepoints.dir, s3://flink/savepoints/
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.aligned-checkpoint-timeout, 60s
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: jobmanager.bind-host, 0.0.0.0
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: s3.secret-key, ******
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: s3.endpoint, https://ossapi-tst.hengrui.com
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.checkpoints-after-tasks-finish.enabled, true
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: parallelism.default, 1
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: taskmanager.numberOfTaskSlots, 4
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: env.java.opts.all, --add-exports=java.base/sun.net.util=ALL-UNNAMED --add-exports=java.rmi/sun.rmi.registry=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED --add-exports=java.security.jgss/sun.security.krb5=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.text=ALL-UNNAMED --add-opens=java.base/java.time=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.locks=ALL-UNNAMED
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: pekko.ask.timeout, 30s
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: taskmanager.memory.process.size, 3072m
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.mode, EXACTLY_ONCE
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: taskmanager.bind-host, 0.0.0.0
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: classloader.resolve-order, parent-first
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: web.timeout, 50000
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: jobmanager.memory.process.size, 2048m
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: jobmanager.rpc.port, 6123
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: rest.bind-address, 0.0.0.0
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.interval, 1min
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.timeout, 3min
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: s3.access-key, ak_flink
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: taskmanager.memory.managed.size, 0
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.externalized-checkpoint-retention, DELETE_ON_CANCELLATION
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: rest.address, 172.31.4.220
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: execution.checkpointing.max-concurrent-checkpoints, 2
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: state.checkpoints.dir, s3://flink/checkpoints/
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.flink.configuration.GlobalConfiguration:160] Loading configuration property: s3.path.style.access, true
14:54:09.344 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] Custom commandline: [org.apache.flink.client.cli.GenericCLI@53ad1d67, org.apache.flink.yarn.cli.FlinkYarnSessionCli@430e4fa6, org.apache.flink.client.cli.DefaultCLI@23c80632]
14:54:09.345 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] Checking custom commandline org.apache.flink.client.cli.GenericCLI@53ad1d67, isActive: true
14:54:09.428 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] [flink-submit] Submit job with JobGraph Plan.
2024-09-11 14:54:09 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.paimon.flink.action.ActionFactory:73] mysql_sync_database job args: --mysql-conf hostname=172.31.4.149 --mysql-conf port=3306 --mysql-conf username=testuser --mysql-conf password=testuser --mysql-conf database-name=testflink --warehouse s3://flink/paimon/ --catalog_conf s3.endpoint=https://ossapi-tst --catalog_conf s3.access-key=ak_flink --catalog_conf s3.secret-key=ak_flink --catalog_conf s3.path.style.access=true --database ods --including_tables o.*|product.?|shipments --table_prefix my_ --table_suffix _001 --table_conf source.checkpoint-align.enabled=true --table_conf changelog-producer=input --table_conf sink.parallelism=1
2024-09-11 14:54:09 | WARN  | streampark-flink-app-bootstrap-0 | org.apache.paimon.utils.HadoopUtils:125] Could not find Hadoop configuration via any of the supported methods
2024-09-11 14:54:10 | WARN  | streampark-flink-app-bootstrap-0 | org.apache.hadoop.util.NativeCodeLoader:60] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14:54:10.664 [streampark-flink-app-bootstrap-0] ERROR org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] REMOTE mode submit by jobGraph fail.org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: org.apache.paimon.fs.UnsupportedSchemeException: Could not find a file io implementation for scheme 's3' in the classpath.  FlinkFileIOLoader also cannot access this path. Hadoop FileSystem also cannot access this path 's3://flink/paimon'.
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:372)
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
        at org.apache.flink.client.program.PackagedProgramUtils.getPipelineFromProgram(PackagedProgramUtils.java:158)
        at org.apache.flink.client.program.PackagedProgramUtils.createJobGraph(PackagedProgramUtils.java:82)
        at org.apache.streampark.flink.client.trait.FlinkClientTrait.getJobGraph(FlinkClientTrait.scala:252)
        at org.apache.streampark.flink.client.trait.FlinkClientTrait.getJobGraph$(FlinkClientTrait.scala:231)
        at org.apache.streampark.flink.client.impl.RemoteClient$.jobGraphSubmit(RemoteClient.scala:139)
        at org.apache.streampark.flink.client.impl.RemoteClient$.$anonfun$doSubmit$1(RemoteClient.scala:48)
        at org.apache.streampark.flink.client.trait.FlinkClientTrait.$anonfun$trySubmit$1(FlinkClientTrait.scala:207)
        at scala.util.Try$.apply(Try.scala:209)
        at org.apache.streampark.flink.client.trait.FlinkClientTrait.trySubmit(FlinkClientTrait.scala:205)
        at org.apache.streampark.flink.client.trait.FlinkClientTrait.trySubmit$(FlinkClientTrait.scala:201)
        at org.apache.streampark.flink.client.impl.RemoteClient$.doSubmit(RemoteClient.scala:49)
        at org.apache.streampark.flink.client.trait.FlinkClientTrait.submit(FlinkClientTrait.scala:123)
        at org.apache.streampark.flink.client.trait.FlinkClientTrait.submit$(FlinkClientTrait.scala:60)
        at org.apache.streampark.flink.client.impl.RemoteClient$.submit(RemoteClient.scala:34)
        at org.apache.streampark.flink.client.FlinkClientHandler$.submit(FlinkClientHandler.scala:40)
        at org.apache.streampark.flink.client.FlinkClientHandler.submit(FlinkClientHandler.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.streampark.flink.client.FlinkClient$.$anonfun$proxy$1(FlinkClient.scala:87)
        at org.apache.streampark.flink.proxy.FlinkShimsProxy$.$anonfun$proxy$1(FlinkShimsProxy.scala:60)
        at org.apache.streampark.common.util.ClassLoaderUtils$.runAsClassLoader(ClassLoaderUtils.scala:38)
        at org.apache.streampark.flink.proxy.FlinkShimsProxy$.proxy(FlinkShimsProxy.scala:60)
        at org.apache.streampark.flink.client.FlinkClient$.proxy(FlinkClient.scala:82)
        at org.apache.streampark.flink.client.FlinkClient$.submit(FlinkClient.scala:53)
        at org.apache.streampark.flink.client.FlinkClient.submit(FlinkClient.scala)
        at org.apache.streampark.console.core.service.impl.ApplicationServiceImpl.lambda$start$8(ApplicationServiceImpl.java:1655)
        at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1590)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.UncheckedIOException: org.apache.paimon.fs.UnsupportedSchemeException: Could not find a file io implementation for scheme 's3' in the classpath.  FlinkFileIOLoader also cannot access this path. Hadoop FileSystem also cannot access this path 's3://flink/paimon'.
        at org.apache.paimon.catalog.CatalogFactory.createCatalog(CatalogFactory.java:92)
        at org.apache.paimon.catalog.CatalogFactory.createCatalog(CatalogFactory.java:66)
        at org.apache.paimon.flink.FlinkCatalogFactory.createPaimonCatalog(FlinkCatalogFactory.java:80)
        at org.apache.paimon.flink.action.ActionBase.initPaimonCatalog(ActionBase.java:71)
        at org.apache.paimon.flink.action.ActionBase.<init>(ActionBase.java:58)
        at org.apache.paimon.flink.action.cdc.SynchronizationActionBase.<init>(SynchronizationActionBase.java:77)
        at org.apache.paimon.flink.action.cdc.SyncDatabaseActionBase.<init>(SyncDatabaseActionBase.java:61)
        at org.apache.paimon.flink.action.cdc.mysql.MySqlSyncDatabaseAction.<init>(MySqlSyncDatabaseAction.java:108)
        at org.apache.paimon.flink.action.cdc.mysql.MySqlSyncDatabaseActionFactory.createAction(MySqlSyncDatabaseActionFactory.java:52)
        at org.apache.paimon.flink.action.cdc.mysql.MySqlSyncDatabaseActionFactory.createAction(MySqlSyncDatabaseActionFactory.java:31)
        at org.apache.paimon.flink.action.cdc.SynchronizationActionFactoryBase.create(SynchronizationActionFactoryBase.java:45)
        at org.apache.paimon.flink.action.cdc.SyncDatabaseActionFactoryBase.create(SyncDatabaseActionFactoryBase.java:44)
        at org.apache.paimon.flink.action.ActionFactory.createAction(ActionFactory.java:82)
        at org.apache.paimon.flink.action.FlinkActions.main(FlinkActions.java:38)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
        ... 33 more
Caused by: org.apache.paimon.fs.UnsupportedSchemeException: Could not find a file io implementation for scheme 's3' in the classpath.  FlinkFileIOLoader also cannot access this path. Hadoop FileSystem also cannot access this path 's3://flink/paimon'.
        at org.apache.paimon.fs.FileIO.get(FileIO.java:420)
        at org.apache.paimon.catalog.CatalogFactory.createCatalog(CatalogFactory.java:89)
        ... 51 more
        Suppressed: org.apache.flink.core.fs.UnsupportedFileSystemSchemeException: Could not find a file system implementation for scheme 's3'. The scheme is directly supported by Flink through the following plugin(s): flink-s3-fs-hadoop, flink-s3-fs-presto. Please ensure that each plugin resides within its own subfolder within the plugins directory. See https://nightlies.apache.org/flink/flink-docs-stable/docs/deployment/filesystems/plugins/ for more information. If you want to use a Hadoop file system for that scheme, please add the scheme to the configuration fs.allowed-fallback-filesystems. For a full list of supported file systems, please see https://nightlies.apache.org/flink/flink-docs-stable/ops/filesystems/.
                at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:515)
                at org.apache.flink.core.fs.FileSystem.get(FileSystem.java:409)
                at org.apache.flink.core.fs.Path.getFileSystem(Path.java:274)
                at org.apache.paimon.flink.FlinkFileIO.getFileSystem(FlinkFileIO.java:127)
                at org.apache.paimon.flink.FlinkFileIO.exists(FlinkFileIO.java:100)
                at org.apache.paimon.fs.FileIOUtils.checkAccess(FileIOUtils.java:37)
                at org.apache.paimon.fs.FileIO.get(FileIO.java:388)
                ... 52 more
        Suppressed: org.apache.hadoop.fs.UnsupportedFileSystemException: No FileSystem for scheme "s3"
                at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3443)
                at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3466)
                at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:174)
                at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3574)
                at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3521)
                at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:540)
                at org.apache.hadoop.fs.Path.getFileSystem(Path.java:365)
                at org.apache.paimon.fs.hadoop.HadoopFileIO.createFileSystem(HadoopFileIO.java:175)
                at org.apache.paimon.fs.hadoop.HadoopFileIO.getFileSystem(HadoopFileIO.java:168)
                at org.apache.paimon.fs.hadoop.HadoopFileIO.getFileSystem(HadoopFileIO.java:145)
                at org.apache.paimon.fs.hadoop.HadoopFileIO.exists(HadoopFileIO.java:110)
                at org.apache.paimon.fs.FileIOUtils.checkAccess(FileIOUtils.java:37)
                at org.apache.paimon.fs.FileIO.get(FileIO.java:397)
                ... 52 more
14:54:14.550 [streampark-flink-app-bootstrap-0] INFO org.apache.streampark.flink.client.impl.RemoteClient - [StreamPark] REMOTE mode submit by restApi, WebInterfaceURL http://172.31.4.220:8081, jobId: 8f0fb2ab846651cf78dd908a5a0bb610
2024-09-11 14:54:14 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.streampark.console.core.task.FlinkAppHttpWatcher:635] [StreamPark][FlinkAppHttpWatcher] setOptioning
2024-09-11 14:54:14 | INFO  | streampark-flink-app-bootstrap-0 | org.apache.streampark.console.core.task.FlinkAppHttpWatcher:646] [StreamPark][FlinkAppHttpWatcher] add app to tracking,appId:100001
^C
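The suppressed `UnsupportedFileSystemSchemeException` above suggests the fix: Flink only resolves the `s3://` scheme through the `flink-s3-fs-hadoop` or `flink-s3-fs-presto` plugin, and each plugin jar must sit in its own subfolder under `plugins/`, not in `lib/`. A minimal sketch of the required layout (run here against a scratch directory so it is copy-safe; the `FLINK_HOME` path and jar version are assumptions for a real standalone 1.18 install, where the jar ships under `opt/`):

```shell
# Sketch only: FLINK_HOME and the jar version are placeholders; in a real
# install, point FLINK_HOME at the actual Flink 1.18 directory and skip the
# mktemp/touch lines, which only simulate the shipped opt/ jar here.
FLINK_HOME="${FLINK_HOME:-$(mktemp -d)/flink-1.18.1}"
mkdir -p "$FLINK_HOME/opt"
touch "$FLINK_HOME/opt/flink-s3-fs-hadoop-1.18.1.jar"

# Flink discovers filesystem implementations as plugins: each jar gets its
# own subfolder under plugins/ (placing it in lib/ does NOT register it).
mkdir -p "$FLINK_HOME/plugins/s3-fs-hadoop"
cp "$FLINK_HOME/opt/flink-s3-fs-hadoop-1.18.1.jar" \
   "$FLINK_HOME/plugins/s3-fs-hadoop/"
```

After copying the jar, the standalone cluster needs a restart on every node (master and all three workers) so the plugin classloader picks it up. Alternatively, as the exception text notes, `fs.allowed-fallback-filesystems: s3` in `flink-conf.yaml` lets a Hadoop `FileSystem` handle the scheme, but that requires the Hadoop S3A classes and credentials to be on the classpath.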
