S3 source connector: Reached the end of stream with xx bytes left to read #1255
The parquet file generated by the S3 sink connector seems valid to us. See below for a few validations:

Sink connector config that I used:
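The reporter's actual validation output was attached as a collapsed block and is not reproduced here. As a minimal illustrative sketch (an assumption, not the reporter's method), a parquet file can be structurally sanity-checked from the standard library alone: a valid file starts and ends with the `PAR1` magic, and the footer length recorded just before the trailing magic must fit inside the file.

```python
import struct

def check_parquet_file(path):
    """Light structural check on a parquet file: verify the leading and
    trailing b'PAR1' magic and that the recorded footer length fits in
    the file. This does not decode the data pages."""
    with open(path, "rb") as f:
        f.seek(0, 2)
        size = f.tell()
        if size < 12:  # 4 (magic) + 4 (footer length) + 4 (magic)
            return False
        f.seek(0)
        head = f.read(4)
        f.seek(size - 8)
        (footer_len,) = struct.unpack("<I", f.read(4))
        tail = f.read(4)
    return head == b"PAR1" and tail == b"PAR1" and footer_len <= size - 12
```

If a check like this (or a full read with a parquet library) passes, the truncation seen below is more likely on the source connector's read path than in the file itself.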
What version of the Stream Reactor are you reporting this issue for?
Built from the master branch on May 26, 2024 by myself.

Are you running the correct version of Kafka/Confluent for the Stream Reactor release?
Running a Kafka cluster (AWS MSK) under version 2.8.2.tiered.
Do you have a supported version of the data source/sink, i.e. Cassandra 3.0.9?
Have you read the docs?
Yes
What is the expected behaviour?
Restore the backup files into a Kafka topic. It should restore all the messages into the Kafka topic without any errors.
What was observed?
java.io.EOFException: Reached the end of stream with 8388608 bytes left to read
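For context: 8388608 bytes is exactly 8 MiB. The exception message is the classic signature of a fixed-size "read fully" loop that hits end-of-stream before filling its buffer. A minimal sketch of that failure mode (illustrative Python, not the connector's actual Java read path; the buffer size is taken from the report):

```python
import io

def read_fully(stream, n):
    """Read exactly n bytes, mimicking a fixed-size readFully: raise if the
    stream ends before the requested count is satisfied."""
    buf = bytearray()
    while len(buf) < n:
        chunk = stream.read(n - len(buf))
        if not chunk:
            raise EOFError(
                f"Reached the end of stream with {n - len(buf)} bytes left to read"
            )
        buf.extend(chunk)
    return bytes(buf)

stream = io.BytesIO(b"\x00" * 1024)  # stream shorter than the requested read
try:
    read_fully(stream, 8 * 1024 * 1024)  # 8388608 bytes, as in the report
except EOFError as e:
    print(e)  # Reached the end of stream with 8387584 bytes left to read
```

In other words, the reader expected 8 MiB more data than the underlying S3 stream delivered, which points at either a truncated object or the stream being cut short mid-read.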
What is your Connect cluster configuration (connect-avro-distributed.properties)?
What is your connector properties configuration (my-connector.properties)?
Please provide full log files (redact any sensitive information)