Reputation: 11
I want to map OPC UA data into a Kafka topic.
Upvotes: 1
Views: 6647
Reputation: 21
You can use the OPC Router with the Kafka Plug-In to stream OPC UA or OPC Classic data to Kafka.
The OPC Router is listed as an official solution on Confluent Hub.
Upvotes: 0
Reputation: 127
You can do that with https://github.com/vogler75/automation-gateway/. It connects to 1-n OPC UA servers, lets you access the OPC UA tags via GraphQL or MQTT, and lets you add a logger to write the OPC UA values to Apache Kafka.
Upvotes: 0
Reputation: 301
I worked on this problem once upon a time, as a POC. Here are my thoughts on it.
Kafka has two APIs for communicating with external systems: the Kafka Client API and Kafka Connect. The Client API is simple and powerful, but you have to implement fault tolerance and work distribution across workers yourself if you need them. Kafka Connect is a framework designed to solve exactly those problems and make developers' lives easier.
OPC servers don't work as a cluster; they work as redundant pairs. You can't distribute the load across instances of an OPC server product.
So, do we really need a Kafka Connect cluster to send data from OPC to Kafka? Kafka Connect is designed to run as a cluster of microservice-like workers so the load can be distributed. But an OPC server runs on a single machine, and if the data fits on a single machine you don't need a separate cluster to ship it. You would just be adding an extra component to manage that isn't really necessary in this case.
My solution was simple: I went with the Kafka Client API. The agent runs on the same machine where the OPC server is installed, which really simplifies the problem.
I decided the agent should work in-process, because my goal was to send data to Kafka as fast as possible, and I needed a solution that was fast and lightweight. Inter-process communication adds overhead, and our agent was going to run on the same machine as the OPC server anyway. OPC messages are small and the program is lightweight, so I didn't want to pay that overhead and decided to go in-process. To achieve that, I first looked at OPC client APIs.
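The in-process agent described above can be sketched as a simple poll-and-forward loop. Note that `TagReader` and `Publisher` below are hypothetical placeholder interfaces, not real Milo or Kafka client types; in a real agent they would wrap the OPC client's read call and a Kafka producer's send:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for an OPC UA read; a real agent would wrap
// e.g. Milo's readValue() or the OPC Foundation C# stack here.
interface TagReader {
    Map<String, Object> readTags(List<String> nodeIds);
}

// Hypothetical stand-in for a Kafka producer send.
interface Publisher {
    void publish(String topic, String key, Object value);
}

// The agent polls tags and forwards each value in-process: no IPC hop
// between the OPC read and the Kafka publish.
class OpcToKafkaAgent {
    private final TagReader reader;
    private final Publisher publisher;
    private final String topic;

    OpcToKafkaAgent(TagReader reader, Publisher publisher, String topic) {
        this.reader = reader;
        this.publisher = publisher;
        this.topic = topic;
    }

    // One poll cycle: read all tags, publish each value keyed by node id,
    // and return the number of values published.
    int pollOnce(List<String> nodeIds) {
        Map<String, Object> values = reader.readTags(nodeIds);
        for (Map.Entry<String, Object> e : values.entrySet()) {
            publisher.publish(topic, e.getKey(), e.getValue());
        }
        return values.size();
    }
}
```

Because both interfaces are pluggable, the same loop works whether the read side is the C# stack or Milo, which is the choice discussed next.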
I saw that there is an official client API provided by the OPC Foundation, written in C#. You can find it here. I also knew that Java has the most mature Kafka client API, so I looked for an OPC client API written in Java (again, because I wanted the solution to run in-process). I found Eclipse Milo, but at the time it hadn't been proven in production, so I decided to go with the official standard API written in C#.
So, I worked with the OPC UA standard library and the Kafka C# client library. The integration was very simple, but you need to think about some problems: your program may crash for some reason, and if you can't tolerate data loss, you need to persist data in an embedded DB such as LiteDB (chosen because it's already written in C# and works in-process with our program).
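The data-loss guard described above amounts to a small write-ahead buffer: records go into a local store first and are removed only once the broker acknowledges them. This is a minimal sketch of that ordering (in Java rather than C#, with the `send` predicate as a hypothetical stand-in for the Kafka producer and an in-memory deque standing in for an embedded DB like LiteDB):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.Predicate;

// Buffers records locally before sending, so a crash between the OPC read
// and the broker acknowledgement does not lose data. A real agent would
// back this deque with an embedded database (the answer uses LiteDB in C#).
class PersistentForwarder {
    private final Deque<String> pending = new ArrayDeque<>();
    private final Predicate<String> send; // returns true once the broker acks

    PersistentForwarder(Predicate<String> send) {
        this.send = send;
    }

    void enqueue(String record) {
        pending.addLast(record); // persist first...
    }

    // ...then drain: a record is removed only after a successful send,
    // so failed sends stay buffered for the next flush attempt.
    int flush() {
        int delivered = 0;
        while (!pending.isEmpty()) {
            String head = pending.peekFirst();
            if (!send.test(head)) break; // broker unreachable: keep buffering
            pending.removeFirst();
            delivered++;
        }
        return delivered;
    }

    int backlog() {
        return pending.size();
    }
}
```

The key design point is the removal-after-ack ordering: at worst a record is sent twice after a crash, never lost.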
To achieve HA and eliminate the SPOF in this solution, you can install three OPC servers on three machines, put this agent software on each of them, and then you're good to go to production with a highly available, fault-tolerant solution.
PS: We were going to work with KepServerEX, which we chose for its vendor support and wide driver coverage, and it runs only on Windows machines.
Upvotes: 0
Reputation: 87
OPC is your server, and its nodes hold your data. Once you have the data, it's your choice what to do with it. You can write a handler class or service to work with the data fetched from the OPC server: a basic class with your favorite method to put the fetched data into Kafka, Redis, or anything else. Basically, you need Kafka's Java client included in your project along with Eclipse Milo; read the data with Milo's node.readValue().get() and put it on Kafka with the Kafka Java client.
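A minimal shape for such a handler class, with the fetch and the sink as pluggable functions so the same handler can target Kafka, Redis, or anything else. The functional types here are illustrative placeholders, not Milo or Kafka client types:

```java
import java.util.function.BiConsumer;
import java.util.function.Supplier;

// A handler that decouples "fetch from OPC" from "put somewhere":
// in a real project the fetch would wrap Milo's node.readValue().get(),
// and the sink would wrap a KafkaProducer send, a Redis write, etc.
class TagHandler {
    private final String nodeId;
    private final Supplier<Object> fetch;          // e.g. a Milo read
    private final BiConsumer<String, Object> sink; // e.g. a Kafka produce

    TagHandler(String nodeId, Supplier<Object> fetch, BiConsumer<String, Object> sink) {
        this.nodeId = nodeId;
        this.fetch = fetch;
        this.sink = sink;
    }

    // Fetch the current value and hand it to the sink, keyed by node id.
    Object handle() {
        Object value = fetch.get();
        sink.accept(nodeId, value);
        return value;
    }
}
```

Swapping the sink lambda is all it takes to redirect the same fetched data to a different destination.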
Upvotes: 1