Reputation: 5689
I have a memory stream that I'm writing to in a background task. I have a separate task that is reading from the stream.
The reader misses items that were written to the stream which implies that the reader and writer are both mutating the stream.
Is that explanation plausible? If so, do I need to synchronise these activities?
I have a service that returns a stream, in-process, to another part of the code. If this is the case, it would seem that streams are not an appropriate choice.
Upvotes: 0
Views: 128
Reputation: 353
You can't use the same MemoryStream instance to perform both reading and writing. You need to create new instances each time you want to read from or write to memory.
And yes, with this strategy you would need to synchronize read and write access to prevent them from happening at the same time on different threads.
You may be able to do something as simple as using the lock keyword around both the reading code and the writing code, after you have created the MemoryStream instances.
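A minimal sketch of that lock-based approach, assuming a single reader that keeps its own read cursor (MemoryStream has only one Position, so the cursor must be tracked separately; the names here are illustrative, not from the question):

```csharp
using System;
using System.IO;
using System.Text;

class LockedStreamDemo
{
    private static readonly object _gate = new object();
    private static readonly MemoryStream _stream = new MemoryStream();
    private static long _readPosition; // MemoryStream has a single Position, so track the read cursor ourselves

    static void Write(byte[] data)
    {
        lock (_gate)
        {
            _stream.Seek(0, SeekOrigin.End); // writes always append
            _stream.Write(data, 0, data.Length);
        }
    }

    static int Read(byte[] buffer)
    {
        lock (_gate)
        {
            _stream.Position = _readPosition;  // restore the read cursor
            int read = _stream.Read(buffer, 0, buffer.Length);
            _readPosition = _stream.Position;  // remember where reading stopped
            return read;
        }
    }

    static void Main()
    {
        Write(Encoding.UTF8.GetBytes("hello "));
        Write(Encoding.UTF8.GetBytes("world"));
        var buffer = new byte[64];
        int n = Read(buffer);
        Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, n)); // prints "hello world"
    }
}
```

Note that the lock must cover the seek and the read/write as one unit; locking each Stream call individually would still let the two cursors interleave.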
Upvotes: 0
Reputation: 929
You can implement the synchronization with a BlockingCollection<>
, which is built for the producer/consumer pattern and is safe for concurrent access. A BlockingCollection<>
wraps a thread-safe collection such as ConcurrentBag<>
or ConcurrentQueue<>
.
Sample code for a file-processing store:
public class FileDescriptionAudDbStore
{
    // Lazy<T> makes the one-time initialization thread-safe; the original
    // null check could create two BlockingCollection instances under a race.
    private static readonly Lazy<BlockingCollection<FileDescription>> _dbStore =
        new Lazy<BlockingCollection<FileDescription>>(
            () => new BlockingCollection<FileDescription>(new ConcurrentBag<FileDescription>()));

    public BlockingCollection<FileDescription> DbStore => _dbStore.Value;
}
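A self-contained sketch of how such a store is typically consumed; since FileDescription is the author's own type, this demo substitutes string, and the bounded capacity is an illustrative value:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ProducerConsumerDemo
{
    static void Main()
    {
        // Bounded capacity makes Add block when the backlog grows too large
        var queue = new BlockingCollection<string>(boundedCapacity: 100);

        // Producer: adds items on a background task, then signals completion
        var producer = Task.Run(() =>
        {
            for (int i = 0; i < 5; i++)
                queue.Add($"item {i}");
            queue.CompleteAdding(); // lets the consumer's loop terminate
        });

        // Consumer: blocks until an item is available; exits once adding is complete
        int consumed = 0;
        foreach (var item in queue.GetConsumingEnumerable())
        {
            Console.WriteLine(item);
            consumed++;
        }

        producer.Wait();
        Console.WriteLine($"consumed {consumed} items"); // prints "consumed 5 items"
    }
}
```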
Upvotes: -3
Reputation: 1063854
A MemoryStream only supports a single access at any time; it does not support concurrent read and write, concurrent reads, or concurrent writes. You could synchronize, but to be honest that sounds like you're using MemoryStream in a very unusual way - you would need to synchronize over entire chunks of operations, not just discrete reads/writes. In particular, note that MemoryStream only has a single position; it doesn't have a separate read position and write position, so you would need to exercise great care to make sure that the position is correct for the current operation.
I wonder whether Pipe might be what you're actually after, since it is both async-first and supports separate read/write operations; Pipe is basically a streaming buffer where a producer can push data in (with back-pressure if the backlog becomes too large), while a consumer can pull data out, all while using pooled memory.
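A minimal sketch of that Pipe approach, assuming the System.IO.Pipelines package is available (it ships with modern .NET; older frameworks need the NuGet package):

```csharp
using System;
using System.IO.Pipelines;
using System.Text;
using System.Threading.Tasks;

class PipeDemo
{
    static async Task Main()
    {
        var pipe = new Pipe();

        // Producer: writes into pooled memory; each WriteAsync flushes,
        // which applies back-pressure when the backlog grows too large
        var producer = Task.Run(async () =>
        {
            await pipe.Writer.WriteAsync(Encoding.UTF8.GetBytes("hello "));
            await pipe.Writer.WriteAsync(Encoding.UTF8.GetBytes("world"));
            await pipe.Writer.CompleteAsync(); // signals end-of-stream to the reader
        });

        // Consumer: pulls whatever data is available, with its own read position
        var builder = new StringBuilder();
        while (true)
        {
            ReadResult result = await pipe.Reader.ReadAsync();
            foreach (ReadOnlyMemory<byte> segment in result.Buffer)
                builder.Append(Encoding.UTF8.GetString(segment.Span));
            pipe.Reader.AdvanceTo(result.Buffer.End); // mark everything as consumed
            if (result.IsCompleted) break;
        }
        await pipe.Reader.CompleteAsync();
        await producer;

        Console.WriteLine(builder.ToString()); // prints "hello world"
    }
}
```

Unlike MemoryStream, the reader and writer here each maintain their own cursor, so no external lock is needed.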
Upvotes: 9