BdEngineer

Reputation: 3199

Will streaming work for a transactional data use case?

I am using spark-sql-2.4.1v with Kafka and Cassandra. I have a scenario where I receive different transactional data, which might consist of update records... I need to update records received earlier with the values of added fields.

Can this be achieved using Spark Streaming and Kafka with Cassandra?

If so, how should I proceed? Any clue, please. If not, what else do I need to add to my tech stack?

Thanks.

Upvotes: 0

Views: 40

Answers (1)

Alex Ott

Reputation: 87244

Just write the data via the Spark Cassandra Connector, as described in the documentation (for RDDs, for DataFrames). In Cassandra a write is an upsert, so this operation will update existing rows or insert new ones. Depending on the selected API, you may need to configure the connector to append data to the table instead of completely overwriting it each time.
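A minimal sketch of what this could look like with Structured Streaming in Spark 2.4, where there is no built-in Cassandra streaming sink, so each micro-batch is written with `foreachBatch` and the DataFrame API. The host, topic, keyspace, table, and CSV payload format are all assumptions for illustration, not from the original question:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object KafkaToCassandraUpsert {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-to-cassandra-upsert")
      .config("spark.cassandra.connection.host", "127.0.0.1") // assumed host
      .getOrCreate()
    import spark.implicits._

    // Read transactional records from Kafka as a stream
    val kafkaDf = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // assumed brokers
      .option("subscribe", "transactions")                 // assumed topic
      .load()

    // Assume the Kafka value is a CSV "id,amount" payload, for illustration only
    val parsed = kafkaDf
      .selectExpr("CAST(value AS STRING) AS csv")
      .select(
        split($"csv", ",")(0).as("id"),
        split($"csv", ",")(1).cast("double").as("amount"))

    // Write each micro-batch via the connector. Because Cassandra writes are
    // upserts, rows whose primary key already exists are updated in place and
    // new keys are inserted -- no separate update path is needed.
    val query = parsed.writeStream
      .foreachBatch { (batch: DataFrame, _: Long) =>
        batch.write
          .format("org.apache.spark.sql.cassandra")
          .option("keyspace", "ks")          // assumed keyspace
          .option("table", "transactions")   // assumed table
          .mode("append")                    // append, not overwrite
          .save()
      }
      .start()

    query.awaitTermination()
  }
}
```

Note that `.mode("append")` is what keeps the connector from truncating the table; combined with Cassandra's upsert semantics, re-sent records with the same primary key simply overwrite the earlier field values.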

Upvotes: 1
