Reputation: 41
I'm trying to capture some changes in a SQL Server database using Kafka in standalone mode. In this task, the Confluent platform and Docker are not options.
CDC is already enabled on the source database. On the Kafka side, I'm having issues with the Debezium connector. I'll describe more details below:
--- zookeeper.properties:
dataDir=C:/kafka_2.13-2.7.0/data
clientPort=2181
maxClientCnxns=0
admin.enableServer=false
--- server.properties:
broker.id=0
listeners=PLAINTEXT://localhost:9092
num.network.threads=3
num.io.threads=8
socket.send.buffer.bytes=102400
socket.receive.buffer.bytes=102400
socket.request.max.bytes=104857600
log.dirs=C:/kafka_2.13-2.7.0/kafka-logs
num.partitions=1
num.recovery.threads.per.data.dir=1
offsets.topic.replication.factor=1
transaction.state.log.replication.factor=1
transaction.state.log.min.isr=1
log.retention.hours=168
log.segment.bytes=1073741824
log.retention.check.interval.ms=30000
zookeeper.connect=localhost:2181
zookeeper.connection.timeout.ms=18000
group.initial.rebalance.delay.ms=0
auto.create.topics.enable=true
delete.topic.enable=true
--- sqlserver-source-connector.properties:
name=sqlserver-connector
connector.class=io.debezium.connector.sqlserver.SqlServerConnector
database.hostname=myhostname
database.port=1433
database.user=sa
database.password=mySuperSecretPassord
database.dbname=myDbName
database.server.name=myhostname
table.include.list=dbo.employes
database.history.kafka.bootstrap.servers=myhostname:9092
database.history.kafka.topic=dbhistory.fullfillment
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
offset.storage.file.filename=C:/kafka_2.13-2.7.0/connect.offsets
bootstrap.servers=localhost:9092
--- file-employes-sink.properties:
name=file-employes-sink
connector.class=FileStreamSink
tasks.max=1
file=file-employes-sink.txt
topics=myhostname.dbo.EMPLOYES
When I run
.\connect-standalone.bat ..\..\config\sqlserver-source-connector.properties ..\..\config\file-employes-sink.properties
and then check which connectors were created via http://localhost:8083/connectors, here's what I get:
["file-employes-sink"]
So only my sink connector was created, right? What about my source connector? Shouldn't it be listed here too?
Upvotes: 0
Views: 1643
Reputation: 191743
You need to provide connect-standalone.properties as the first parameter to connect-standalone to configure the worker. You then provide your two connector property files after it to start both connectors.
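For example, something like this (assuming connect-standalone.properties sits in the same config directory as your connector files — Kafka ships a default one under config/):

```
.\connect-standalone.bat ^
  ..\..\config\connect-standalone.properties ^
  ..\..\config\sqlserver-source-connector.properties ^
  ..\..\config\file-employes-sink.properties
```

Note that bootstrap.servers, the key/value converters, and offset.storage.file.filename are worker-level settings, so they belong in connect-standalone.properties rather than in the connector file. A minimal sketch, based on the values from your question:

```
bootstrap.servers=localhost:9092
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
offset.storage.file.filename=C:/kafka_2.13-2.7.0/connect.offsets
```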
You could also use connect-distributed with its properties file; then you'd HTTP POST the two connector configs to the worker's REST API.
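In distributed mode that would look roughly like this (assuming the default REST port 8083; the name and config values are taken from your standalone source connector file, and you'd send a second POST for the sink):

```
curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '{
  "name": "sqlserver-connector",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "database.hostname": "myhostname",
    "database.port": "1433",
    "database.user": "sa",
    "database.password": "mySuperSecretPassord",
    "database.dbname": "myDbName",
    "database.server.name": "myhostname",
    "table.include.list": "dbo.employes",
    "database.history.kafka.bootstrap.servers": "myhostname:9092",
    "database.history.kafka.topic": "dbhistory.fullfillment"
  }
}'
```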
While Confluent Platform doesn't officially support Windows, it would still work the same way, and so would Docker.
Upvotes: 1