Reputation: 103
I am working with Confluent Kafka and ZooKeeper in Docker. I can successfully submit a JSON file to a Kafka topic and then consume it, as follows:
curl -X POST \
-H "Content-Type: application/json" \
--data '{"name": "quickstart-file-source", "config {"connector.class":"org.apache.kafka.connect.file.FileStreamSourceConnector", "tasks.max":"1", "topic":"quickstart-data", "file": "/tmp/quickstart/input.json"}}' \
http://localhost:28081/connectors
The curl command above posts only one JSON file and executes successfully, but I need to post multiple JSON files. Is there any way to do that?
Here is my Kafka Connect container:
docker run -d \
--name=kafka-connect-avro \
--net=host \
-e CONNECT_BOOTSTRAP_SERVERS=localhost:29091 \
-e CONNECT_REST_PORT=28081 \
-e CONNECT_GROUP_ID="quickstart-avro" \
-e CONNECT_CONFIG_STORAGE_TOPIC="quickstart-avro-config" \
-e CONNECT_OFFSET_STORAGE_TOPIC="quickstart-avro-offsets" \
-e CONNECT_STATUS_STORAGE_TOPIC="quickstart-avro-status" \
-e CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR=1 \
-e CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR=1 \
-e CONNECT_STATUS_STORAGE_REPLICATION_FACTOR=1 \
-e CONNECT_KEY_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
-e CONNECT_VALUE_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
-e CONNECT_INTERNAL_KEY_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
-e CONNECT_INTERNAL_VALUE_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
-e CONNECT_REST_ADVERTISED_HOST_NAME="localhost" \
-e CONNECT_LOG4J_ROOT_LOGLEVEL=DEBUG \
-v /tmp/quickstart/file:/tmp/quickstart \
confluentinc/cp-kafka-connect:latest
Upvotes: 1
Views: 1848
Reputation: 191723
You could create individual JSON files, one per connector, in the current directory and post them separately in a loop
e.g.
$ ls *.json # list your connectors
payload1.json
payload2.json
And then loop over them
for f in *.json; do
  curl -X POST -H "Content-Type: application/json" \
    --data @"${f}" http://localhost:28081/connectors
done
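Note that Kafka Connect rejects a POST that reuses an existing connector name, so each payload file should carry a unique "name" (and, for the FileStreamSourceConnector, point at its own input file). A minimal sketch of what payload1.json might contain, reusing the config from your question (the "-1" suffix and the input1.json path are just placeholders):
{
  "name": "quickstart-file-source-1",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "topic": "quickstart-data",
    "file": "/tmp/quickstart/input1.json"
  }
}
payload2.json would follow the same shape with a different name and file.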
Upvotes: 1