Reputation: 133
We have a .NET Core console app that serves the role of a Saga/Process manager.
This Saga app communicates with other microservices via Azure Service Bus, using MassTransit (MassTransit.Azure.ServiceBus) as the messaging abstraction.
The app contains a state machine (MassTransit/Automatonymous) that handles events triggered by Service Bus messages.
In the current scenario the initial Saga event is triggered from an Azure Function App by publishing a message via MassTransit:
busControl.Publish(createSearchPageLinkEvent);
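For context, here is a sketch of what the published contract might look like - the property names mirror the state-machine code further down, but the class shape and the Function-side call are assumptions, not the actual code:

// Assumed event contract; the properties mirror what the saga copies into its state below.
public class CreateSearchPageLinkEvent
{
    public Guid CorrelationId { get; set; }
    public string PropertyType { get; set; }
    public string SideName { get; set; }
    public string TransactionType { get; set; }
    public string Url { get; set; }
}

// Publishing from the Function (Publish returns a Task, so it should be awaited):
var createSearchPageLinkEvent = new CreateSearchPageLinkEvent
{
    CorrelationId = Guid.NewGuid(),
    Url = "https://example.com/search?page=1" // illustrative value
};
await busControl.Publish(createSearchPageLinkEvent);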
NOW, when:
a) the Saga app is run as-is (no containerization) - everything works fine and the event is handled correctly.
b) the Saga app is put into a Docker container locally (using docker-compose in VS2017) - an exception occurs. The published message does seem to reach the Saga app, but the following exception is thrown immediately (excerpt):
Exception received on receiver: sb://***.servicebus.windows.net/link_provider_saga during RenewLock, Microsoft.Azure.ServiceBus.MessageLockLostException: The lock supplied is invalid. Either the lock expired, or the message has already been removed from the queue
Here is the message handling code in the state machine (Automatonymous) that never gets reached when dockerized:
Initially(
    When(CreateSearchPageLinkEvent)
        .Then(context =>
        {
            // Exception occurs before we get here
            _log.Information($"{context.Instance.CorrelationId} CreateSearchPageLinkEvent for ");
            context.Instance.PropertyType = context.Data.PropertyType;
            context.Instance.SideName = context.Data.SideName;
            context.Instance.TransactionType = context.Data.TransactionType;
            context.Instance.Url = context.Data.Url;
        })
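For reference, here is a minimal sketch of how such a saga endpoint is typically wired up with MassTransit 5.x on Azure Service Bus - the queue name comes from the exception above, while the state machine and state type names are assumptions, not the actual code:

var stateMachine = new LinkProviderStateMachine();                    // assumed name
var repository = new InMemorySagaRepository<LinkProviderSagaState>(); // assumed name

var busControl = Bus.Factory.CreateUsingAzureServiceBus(cfg =>
{
    var host = cfg.Host(new Uri("sb://***.servicebus.windows.net/"), h =>
    {
        h.SharedAccessSignature(s =>
        {
            s.KeyName = "RootManageSharedAccessKey";
            s.SharedAccessKey = "<key>";
            s.TokenTimeToLive = TimeSpan.FromDays(1);
        });
    });

    // Queue name taken from the exception message above.
    cfg.ReceiveEndpoint(host, "link_provider_saga", e =>
    {
        e.StateMachineSaga(stateMachine, repository);
    });
});

await busControl.StartAsync();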
Here is the docker-compose config:
version: '3.4'
services:
  saga.azure:
    image: ${DOCKER_REGISTRY-}sagaazure
    build:
      context: .
      dockerfile: AcquireLinkTaskTracking.Azure\Dockerfile
    ports:
      - "443:443"
      - "5671:5671"
      - "5672:5672"
      - "9350-9354:9350-9354"
Here is the app's Dockerfile:
FROM microsoft/dotnet:2.1-runtime-nanoserver-1803 AS base
WORKDIR /app
FROM microsoft/dotnet:2.1-sdk-nanoserver-1803 AS build
WORKDIR /src
RUN dotnet restore AcquireLinkTaskTracking.Azure/Saga.Azure.csproj
# (...) lots of dependency copying here
COPY . .
WORKDIR /src/AcquireLinkTaskTracking.Azure
RUN dotnet build Saga.Azure.csproj -c Debug -o /app
FROM build AS publish
RUN dotnet publish Saga.Azure.csproj -c Debug -o /app
FROM base AS final
WORKDIR /app
COPY --from=publish /app .
ENTRYPOINT ["dotnet", "Saga.Azure.dll"]
Why would Docker cause a communication issue? My hunches are:
a) incorrect port mapping/publishing - however, the triggering microservice obviously reaches the container somehow
b) the TLS protocol/certificates (used by Azure Service Bus) are not set up correctly (not a trivial thing to do)
PS: the dockerized Saga app is launched locally from VS2017 via "Start Debugging" with docker-compose
PS2: using EXPOSE for port 80 in the Dockerfile did not solve the issue
Upvotes: 2
Views: 966
Reputation: 133
OK, so this was a simple issue:
We simply used the wrong base image - the ASP.NET Core runtime image was needed. The diff to our Dockerfile looks like this:
-FROM microsoft/dotnet:2.1-runtime-nanoserver-1803 AS base
+FROM microsoft/dotnet:2.1-aspnetcore-runtime-nanoserver-sac2016 AS base
-FROM microsoft/dotnet:2.1-sdk-nanoserver-1803 AS build
+FROM microsoft/dotnet:2.1-sdk-nanoserver-sac2016 AS build
you live, you learn.
Upvotes: 0
Reputation: 26057
Why would Docker cause a communication issue?
A networking issue, I suspect. Intermittent errors will occur, and as a user you have to retry - and it's not just Docker where you can run into this problem.
RenewLock is a client-initiated operation and is not guaranteed to succeed. As such, a failure to renew the message lock should be handled by a retry mechanism; you'd need to confirm with MassTransit whether that's implemented. If not, your code will continue processing the message assuming the lock was extended when it wasn't, and when completion of the incoming message is attempted, you'll get the MessageLockLostException.
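If the lock renewal failures do need to be mitigated in code, one option is to tune the endpoint and add redelivery - a hedged sketch, assuming the MassTransit Azure Service Bus endpoint configurator in the version in use exposes these settings (values are illustrative):

cfg.ReceiveEndpoint(host, "link_provider_saga", e =>
{
    e.LockDuration = TimeSpan.FromMinutes(5);           // more time before the broker lock expires
    e.MaxAutoRenewDuration = TimeSpan.FromMinutes(10);  // cap on client-side lock renewal
    e.UseMessageRetry(r => r.Interval(3, TimeSpan.FromSeconds(5))); // retry on transient failures
    e.StateMachineSaga(stateMachine, repository);
});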
Upvotes: 0