Reputation: 1639
SQL Server offers bulk insert functionality: the BULK INSERT statement reads from a file (e.g. a CSV) and inserts the rows into a table.
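For reference, a minimal sketch of such a statement (table name, file path, and options here are assumptions, not from my actual setup):

```sql
-- Hypothetical example: load a CSV file into a table.
BULK INSERT dbo.Events
FROM 'C:\data\events.csv'
WITH (
    FORMAT = 'CSV',        -- CSV format support requires SQL Server 2017+
    FIRSTROW = 2,          -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```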
My understanding is that this has clear drawbacks when working with Kafka:
My question is about how to overcome the above drawbacks; something about this whole process looks wrong. What worries me most is the second drawback, writing to disk. Would I be able to write a file to memory and then execute BULK INSERT over it?
Upvotes: 1
Views: 1985
Reputation: 191671
Sure, it's "possible", but ideally you wouldn't use this BULK INSERT
method from a CSV.
Instead, you can use the Kafka Connect JDBC sink connector, which runs as a Kafka consumer, buffers records in memory rather than in a file, and then issues regular INSERT INTO table VALUES ... queries.
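A minimal JDBC sink configuration might look like the following; the connector name, topic, and connection details are assumptions you would replace with your own:

```properties
name=mssql-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=events
connection.url=jdbc:sqlserver://localhost:1433;databaseName=mydb
connection.user=kafka
connection.password=secret
insert.mode=insert
auto.create=true
batch.size=3000
```

The connector batches records internally (batch.size controls how many are sent per INSERT round trip), so no intermediate file ever touches disk.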
If you only want to be able to query Kafka data with SQL functions, then you don't need to load the data into a relational database at all - you can use ksqlDB or PrestoDB, for example.
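For instance, a rough sketch of querying a topic directly with ksqlDB (the topic name and schema here are made up for illustration):

```sql
-- Declare a stream over an existing Kafka topic.
CREATE STREAM events (id INT, amount DOUBLE)
  WITH (KAFKA_TOPIC='events', VALUE_FORMAT='JSON');

-- Continuously aggregate, no database required.
SELECT id, SUM(amount) AS total
  FROM events
  GROUP BY id
  EMIT CHANGES;
```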
Upvotes: 1