Reputation: 23
Use Case: I want to restore data from S3 back into a Kafka topic (note: I have already written the data from Kafka to S3 partition-wise).
I am using the Lenses Docker image to run the Lenses Box:
docker run -e ADV_HOST=127.0.0.1 -e EULA="https://licenses.lenses.io/d/?id=my-id" --rm -p 3030:3030 -p 9092:9092 lensesio/box:latest
Once the box is up, I create a connector with the following configuration:
connector.class=io.lenses.streamreactor.connect.aws.s3.source.S3SourceConnector
tasks.max=1
topics=test_topic
connect.s3.reader.buffer.size=8192
connect.s3.aws.access.key=AWS_ACCESS_KEY
connect.s3.signing.region=us-east-1
connect.s3.bucket.name=bucket-name
connect.s3.kcql=insert into test_topic select * from bucket-name:jsonTest STOREAS `json` LIMIT 5000
connect.s3.format=json
connect.s3.poll.interval=10
name=s3-source-connector
connect.s3.aws.region=us-east-1
connect.s3.aws.secret.key=AWS_SECRET_KEY
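For reference, the same configuration can also be expressed as the JSON body that the Kafka Connect REST API (`POST /connectors`) expects. This is only a sketch: the Connect endpoint and port in the comment are assumptions and depend on how the box exposes Kafka Connect; only a subset of the properties above is shown.

```python
import json

# Sketch: build the JSON request body for the Kafka Connect REST API.
# The endpoint URL in the comment below is an assumption; the Lenses box
# may expose the Connect REST API on a different port (or only internally).
connector_request = {
    "name": "s3-source-connector",
    "config": {
        "connector.class": "io.lenses.streamreactor.connect.aws.s3.source.S3SourceConnector",
        "tasks.max": "1",
        "topics": "test_topic",
        "connect.s3.kcql": "insert into test_topic select * from bucket-name:jsonTest STOREAS `json` LIMIT 5000",
        "connect.s3.aws.region": "us-east-1",
        "connect.s3.bucket.name": "bucket-name",
    },
}

body = json.dumps(connector_request, indent=2)
print(body)

# The body could then be submitted, e.g.:
#   curl -X POST -H "Content-Type: application/json" \
#        -d @connector.json http://127.0.0.1:8083/connectors
```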
The connector is created and is in the RUNNING state, but somehow no data is being read from S3 and written into the Kafka topic.
I am attaching a sample record from the JSON file I am working with:
{"id": 1, "created": "2016-05-06 13:53:00", "product": "OP-DAX-P-20150201-95.7", "price": 94.2, "qty":100}
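For completeness, here is a quick check that the sample record above parses as valid JSON with the expected fields, which rules out a malformed payload as the cause (a minimal sketch, assuming Python is available locally):

```python
import json

# Sanity check: parse the sample record and verify the expected fields exist.
sample = '{"id": 1, "created": "2016-05-06 13:53:00", "product": "OP-DAX-P-20150201-95.7", "price": 94.2, "qty":100}'

record = json.loads(sample)  # raises json.JSONDecodeError if malformed
expected_fields = {"id", "created", "product", "price", "qty"}
missing = expected_fields - record.keys()
print("missing fields:", missing or "none")
```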
I have checked the connector logs and found no issues there (no exceptions or errors), so I don't understand why this is happening, as I have no experience with this connector.
Upvotes: 0
Views: 47