Reputation: 3
I have been using Kafka Connect with the Confluent Platform, following the guide below.
But it doesn't update the sink file anymore; changes in the source file are no longer written to the Kafka topic.
I have already deleted all the tmp files, but nothing changed.
Thanks in advance
Upvotes: 0
Views: 4009
Reputation: 3572
I am appending data to the test.txt file, but the new data is not being sent to the topic, nor to the sink file. New data reaches the topic and the sink file only after stopping and restarting the connector.
Upvotes: 0
Reputation: 987
I faced the same problem before, but correcting the paths of the input and output files in the properties files, as below, worked for me. It then streamed from the input file (test.txt) to the output file (test.sink.txt).

Source connector properties:

name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=/home/mypath/kafka/test.txt
topic=connect-test

Sink connector properties:

name=local-file-sink
connector.class=FileStreamSink
tasks.max=1
file=/home/mypath/kafka/test.sink.txt
topics=connect-test
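One thing to keep in mind with those paths: FileStreamSource tracks a byte offset into the source file, so the file should only ever be appended to; rewriting it in place can stall the connector. A minimal sketch of generating input the way the connector expects (the scratch path below is an example, not the connector's configured file):

```shell
# Append-only writes are what FileStreamSource expects; never truncate
# or rewrite the tracked file in place.
SRC=/tmp/filestream-demo-input.txt   # example scratch path, not /home/mypath/kafka/test.txt
rm -f "$SRC"
echo "first line"  >> "$SRC"
echo "second line" >> "$SRC"
wc -l < "$SRC"   # two lines appended

# With a standalone worker running, e.g.
#   bin/connect-standalone.sh config/connect-standalone.properties \
#     source.properties sink.properties
# each appended line should show up in the connect-test topic.
```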
Upvotes: 0
Reputation: 11
If the FileStreamSource/Sink stopped working after previously working fine, and you have already restarted ZooKeeper, the Kafka server, and the connector with no luck, then the problem is with the connect.offsets file in the Kafka directory. You should delete it and create a new empty one.
Upvotes: 1
Reputation: 5947
To OP: I had this about 5 minutes ago, but after restarting the connector it was fine; both test.sink.txt and the consumer received the new lines. So in a nutshell, just restart your connector.
Upvotes: 0
Reputation: 2313
Start up a new file source connector with a new location for storing the offsets. This connector is meant as a demo and really doesn't handle anything except a simple file that only receives appends. You shouldn't be doing anything with this connector other than a simple demo; have a look at the connector hub if you need something for production.
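That advice amounts to a one-line change in the standalone worker config. The property name is real; the new path below is an illustrative placeholder:

```
# connect-standalone.properties: point standalone offset storage at a fresh
# file so the file source starts again from offset 0 (example path)
offset.storage.file.filename=/tmp/connect-file-demo.offsets
```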
Upvotes: 1