Reputation:
I'm going to create an application that will act as a kind of task manager. For stability reasons I will not use threads but separate processes instead. I have to deal with several third-party libraries and/or COM servers that are not always stable and can sometimes crash badly. These crashes must (of course) not take down the task manager itself.
The problem with using processes is how to communicate with them. Each process must, for example, report back a status of what it is doing every x seconds.
I was thinking of using TCP with a separate port per process, but is this the best way to do this?
Upvotes: 2
Views: 176
Reputation: 39695
I would go with WCF and named pipes as well, but if you're up for it you could use signaling and shared memory (memory-mapped files). It's way faster, but it might not be worth the code complexity.
Take a look at this blog post for some sample code.
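A minimal sketch of that idea, assuming a named MemoryMappedFile plus a named EventWaitHandle for the signal (the "TaskManager.*" names are made up, and each method would run in its own process):

```csharp
using System;
using System.IO.MemoryMappedFiles;
using System.Text;
using System.Threading;

static class SharedStatus
{
    // Worker process: write the current status into shared memory and signal the manager.
    public static void WriteStatus(string status)
    {
        using (var mmf = MemoryMappedFile.CreateOrOpen("TaskManager.Status", 1024))
        using (var ready = new EventWaitHandle(false, EventResetMode.AutoReset, "TaskManager.StatusReady"))
        using (var view = mmf.CreateViewAccessor())
        {
            byte[] payload = Encoding.UTF8.GetBytes(status);
            view.Write(0, payload.Length);                  // length prefix at offset 0
            view.WriteArray(4, payload, 0, payload.Length); // status text after it
            ready.Set();                                    // wake the task manager
        }
    }

    // Task manager process: block until a worker signals, then read the status back.
    public static string ReadStatus()
    {
        using (var mmf = MemoryMappedFile.CreateOrOpen("TaskManager.Status", 1024))
        using (var ready = new EventWaitHandle(false, EventResetMode.AutoReset, "TaskManager.StatusReady"))
        using (var view = mmf.CreateViewAccessor())
        {
            ready.WaitOne();
            int length = view.ReadInt32(0);
            var payload = new byte[length];
            view.ReadArray(4, payload, 0, length);
            return Encoding.UTF8.GetString(payload);
        }
    }
}
```

A real implementation would need one map per worker (or a mutex around the shared region) so that status updates from different processes don't overwrite each other.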
Upvotes: 0
Reputation: 21178
I think you should look at Instrumentation:
http://msdn.microsoft.com/en-us/magazine/cc300488.aspx
Any other approach, such as talking over ports or WCF, adds layers that can make deducing the root cause of problems harder; it also pales in comparison to Instrumentation performance-wise. WMI is built around high-performance monitoring. In addition, this is the best approach operationally, since it plays into admin tools for monitoring the health of processes.
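Roughly, with the System.Management.Instrumentation model, each worker publishes a status object into WMI and the task manager queries it like any other WMI class. The namespace and class names below are invented, and the assembly's schema still has to be registered once (e.g. via installutil with a DefaultManagementProjectInstaller):

```csharp
using System.Management.Instrumentation;

[assembly: Instrumented(@"root\TaskManager")]   // invented WMI namespace for the example

// Published as a WMI instance that any WMI client (including the task manager) can query.
[InstrumentationClass(InstrumentationType.Instance)]
public class WorkerStatus
{
    public string Name;
    public string State;
    public int PercentComplete;
}

public static class StatusPublisher
{
    static WorkerStatus _current;

    public static void Report(string state, int percent)
    {
        var next = new WorkerStatus { Name = "worker1", State = state, PercentComplete = percent };
        Instrumentation.Publish(next);           // expose the new snapshot through WMI
        if (_current != null)
            Instrumentation.Revoke(_current);    // retire the previous snapshot
        _current = next;
    }
}
```

On the task manager side, a ManagementObjectSearcher against root\TaskManager with "SELECT * FROM WorkerStatus" then returns one row per published status.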
Upvotes: 0
Reputation: 273264
You could use WCF (with the NetNamedPipeBinding binding).
And maybe consider running your tasks in separate AppDomains instead of separate processes.
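For the WCF route, the task manager side could look roughly like this; the contract, service and net.pipe address are placeholders invented for the example:

```csharp
using System;
using System.ServiceModel;

[ServiceContract]
public interface IStatusService
{
    [OperationContract]
    void ReportStatus(string worker, string status);
}

public class StatusService : IStatusService
{
    public void ReportStatus(string worker, string status)
    {
        Console.WriteLine("{0}: {1}", worker, status);
    }
}

class TaskManagerHost
{
    static void Main()
    {
        using (var host = new ServiceHost(typeof(StatusService), new Uri("net.pipe://localhost/taskmanager")))
        {
            host.AddServiceEndpoint(typeof(IStatusService), new NetNamedPipeBinding(), "status");
            host.Open();
            Console.WriteLine("Task manager listening, press Enter to exit.");
            Console.ReadLine();
        }
    }
}
```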
Upvotes: 2
Reputation: 52320
Using pipes would be a good option. Look at the System.IO.Pipes namespace.
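A bare-bones sketch with those classes, assuming one pipe per worker and a made-up pipe name:

```csharp
using System;
using System.IO;
using System.IO.Pipes;

static class PipeStatus
{
    // Task manager side: accept one worker and print every status line it sends.
    public static void Listen()
    {
        using (var server = new NamedPipeServerStream("taskmanager.worker1", PipeDirection.In))
        {
            server.WaitForConnection();
            using (var reader = new StreamReader(server))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                    Console.WriteLine("worker1: " + line);
            }
        }
    }

    // Worker side: connect and report a status line (e.g. from a timer every x seconds).
    public static void Report(string status)
    {
        using (var client = new NamedPipeClientStream(".", "taskmanager.worker1", PipeDirection.Out))
        {
            client.Connect(5000);   // wait up to 5 seconds for the task manager
            using (var writer = new StreamWriter(client) { AutoFlush = true })
            {
                writer.WriteLine(status);
            }
        }
    }
}
```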
Upvotes: 1
Reputation: 38112
Named pipes would probably be more efficient. Take a look at WCF:
Expose a WCF Service through a Named Pipes binding
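On the worker side, calling such a service over the named pipe binding might look roughly like this; the IStatusService contract and the net.pipe address are invented and would have to match whatever the hosting process exposes:

```csharp
using System;
using System.ServiceModel;

[ServiceContract]
public interface IStatusService
{
    [OperationContract]
    void ReportStatus(string worker, string status);
}

class WorkerReporter
{
    static void Main()
    {
        var factory = new ChannelFactory<IStatusService>(
            new NetNamedPipeBinding(),
            new EndpointAddress("net.pipe://localhost/taskmanager/status"));

        IStatusService proxy = factory.CreateChannel();
        proxy.ReportStatus("worker 1", "running");   // e.g. called every x seconds from a timer

        ((IClientChannel)proxy).Close();
        factory.Close();
    }
}
```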
Upvotes: 6