Gilad

Reputation: 588

Micro-services architecture, need advice

We are working on a system that is supposed to 'run' jobs on distributed systems.

When jobs are accepted they need to go through a pipeline before they can be executed on the end system.

We've decided to go with a micro-services architecture, but there's one thing that bothers me, and I'm not sure what the best practice would be.

When a job is accepted, it will first be persisted to a database; then each micro-service in the pipeline will do some additional work to prepare the job for execution.

I want the persisted data to be updated at each such station in the pipeline to reflect the actual state of the job, or its status in the pipeline.

In addition, while a job is being executed on the end system, its status should also get updated.

What would be the best practice for updating the database (the job's status) at each station:

  1. Each such station (micro-service) in the pipeline accesses the database directly and updates the job's status

  2. There is another micro-service that exposes the data over REST and serves as a DAL; each micro-service in the pipeline updates the job's status through this service (see the sketch after this list)

  3. Other?....
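
For illustration, here is a minimal sketch of what I mean by option 2, assuming a hypothetical Flask-based DAL service with an in-memory dict standing in for the real database; the routes and field names are made up for the example:

    # Hypothetical DAL micro-service (option 2): pipeline stations update
    # the job's status over REST instead of touching the database directly.
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    jobs = {}  # in-memory stand-in for the real database

    @app.route("/jobs/<job_id>/status", methods=["PUT"])
    def update_status(job_id):
        # Each station in the pipeline PUTs the job's new status here.
        body = request.get_json()
        jobs.setdefault(job_id, {})["status"] = body["status"]
        return jsonify(jobs[job_id])

    @app.route("/jobs/<job_id>", methods=["GET"])
    def get_job(job_id):
        # Anything monitoring the pipeline can read the current state.
        return jsonify(jobs.get(job_id, {}))

    if __name__ == "__main__":
        app.run(port=5000)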

Help/advice would be highly appreciated.

Thanks a lot!

Upvotes: 0

Views: 402

Answers (5)

techagrammer

Reputation: 1306

If you would like to code the workflow:

Microservice A, which accepts the job and handles the commands that update it; Microservice B, which provides the read model for the job.

Based on JobCreatedEvents, use some messaging queue to process the job through the queue pipeline, updating the JobStatus at every node in the pipeline.

I am assuming you know about queues and consumers.
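
A minimal sketch of that flow, using Python's built-in queue as a stand-in for a real message broker; the event and status names are hypothetical:

    # Queue-driven pipeline sketch; queue.Queue stands in for a real broker
    # (RabbitMQ, Kafka, ...), and job_store stands in for the read model.
    import queue

    job_store = {}            # microservice B's read model
    pipeline = queue.Queue()  # the messaging queue

    def publish(event):
        pipeline.put(event)

    def prepare_stage(event):
        # Each node in the pipeline does its work, records the new
        # JobStatus, and emits the event that triggers the next node.
        job_store[event["job_id"]] = "PREPARED"
        publish({"job_id": event["job_id"], "type": "JobPrepared"})

    # Microservice A accepts the job and emits a JobCreatedEvent.
    job_store["job-1"] = "CREATED"
    publish({"job_id": "job-1", "type": "JobCreated"})

    event = pipeline.get()
    if event["type"] == "JobCreated":
        prepare_stage(event)

    print(job_store)  # {'job-1': 'PREPARED'}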

I'm new to Camunda (a workflow engine) myself; it might be usable here, but I'm not completely sure.

Upvotes: 1

ElasticCode

Reputation: 7875

Also consider the Saga pattern:

A Saga is a sequence of local transactions where each transaction updates data within a single service. The first transaction is initiated by an external request corresponding to the system operation, and then each subsequent step is triggered by the completion of the previous one.


http://microservices.io/patterns/data/saga.html
https://dzone.com/articles/saga-pattern-how-to-implement-business-transaction
https://medium.com/@tomasz_96685/saga-pattern-and-microservices-architecture-d4b46071afcf
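
A minimal sketch of a choreography-style saga along those lines; the step and event names are hypothetical, not taken from the linked articles:

    # Each local transaction returns the completion event that triggers
    # the next step, as in a choreography-based saga.
    def reserve_resources(job):
        job["status"] = "RESOURCES_RESERVED"  # local transaction, service 1
        return "ResourcesReserved"

    def schedule_execution(job):
        job["status"] = "SCHEDULED"           # local transaction, service 2
        return "ExecutionScheduled"

    NEXT_STEP = {
        "JobAccepted": reserve_resources,
        "ResourcesReserved": schedule_execution,
    }

    job = {"id": "job-1", "status": "ACCEPTED"}
    event = "JobAccepted"  # the external request that starts the saga
    while event in NEXT_STEP:
        event = NEXT_STEP[event](job)

    print(job["status"])  # SCHEDULED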

Upvotes: 1

Sean Farmar

Reputation: 2293

To add to what was said by @Anunay and @Mohamed Abdul Jawad

I'd consider writing the state from the units of work in your pipeline to a view (a table or cache, insert-only). You can use messaging, or simply insert a row into that view, and have the readers of the state pick up the correct state based on some logic (a date, a state, or a composite key). As this view is not really owned by any domain service, it can be made available to any readers (read-only) to consume.
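
A minimal sketch of that insert-only view, using sqlite3 as a stand-in; the table and column names are hypothetical:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE job_status_view (
        job_id TEXT,
        status TEXT,
        recorded_at TEXT DEFAULT CURRENT_TIMESTAMP)""")

    def record_status(job_id, status):
        # Units of work only ever insert; nothing is updated in place.
        db.execute("INSERT INTO job_status_view (job_id, status) VALUES (?, ?)",
                   (job_id, status))

    record_status("job-1", "ACCEPTED")
    record_status("job-1", "PREPARED")

    # Readers pick the current state with their own logic; here, latest row.
    row = db.execute("""SELECT status FROM job_status_view
                        WHERE job_id = ? ORDER BY rowid DESC LIMIT 1""",
                     ("job-1",)).fetchone()
    print(row[0])  # PREPARED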

Upvotes: 1

Binary

Reputation: 112

Accessing a shared database from multiple microservices is strongly discouraged, as it violates a basic rule of microservices architecture.

A microservice must be autonomous and own its logic and data.

Also, to achieve a good microservice design, you should loosely couple your microservices.

Upvotes: 0

Anunay

Reputation: 1893

Multiple microservices accessing the same database is not recommended. Here you have a case where each service needs to be triggered, then it updates the data, and then it somehow has to call the next service.

You really need a mechanism to orchestrate the services. A workflow engine might fit the bill.

I would, however, suggest an event-driven system; I might be going beyond what's needed, with only limited knowledge of your data. Have one service that gives you basic CRUD on the data, and other services that hold the logic to change it (at this point I would ask why you want different services to change the state; if it's a business requirement, that's fine). Once the data is written, just publish an event to which services can subscribe and react (see the sketch below).

This will allow you to easily add more states to your pipeline in the future. You will need a service to manage the event queue.

As far as recording the job's state is concerned, that can be done easily by logging the events.
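
A minimal sketch of that event-driven setup, with an in-memory publish/subscribe mechanism; the service split and event names are hypothetical:

    from collections import defaultdict

    subscribers = defaultdict(list)
    event_log = []  # logging the events doubles as the job's state history

    def subscribe(event_type, handler):
        subscribers[event_type].append(handler)

    def publish(event_type, payload):
        event_log.append((event_type, payload))
        for handler in subscribers[event_type]:
            handler(payload)

    # The CRUD service writes the data, then publishes an event.
    jobs = {}

    def create_job(job_id):
        jobs[job_id] = {"status": "ACCEPTED"}
        publish("JobAccepted", {"job_id": job_id})

    # Other services subscribe and react; adding a new pipeline state is
    # just another subscriber.
    def on_job_accepted(payload):
        jobs[payload["job_id"]]["status"] = "VALIDATED"
        publish("JobValidated", payload)

    subscribe("JobAccepted", on_job_accepted)
    create_job("job-1")
    print(jobs["job-1"]["status"])  # VALIDATED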

If you opt for the workflow route, you could use Amazon SWF or Camunda; there are really quite a few options out there. If you go the event route, you need to look into event-driven systems in microservices.

Upvotes: 0
