Reputation: 1371
I have an application that downloads image streams from security cameras (one thread per camera). I issue a GET request to the camera, and the camera responds by sending an endless stream of JPEG images.
If the cameras are sending data to the program faster than the program can process it, what is the behavior of the application? Right now I notice that the computer's used memory climbs to 95% and stays there, but this used memory isn't attached to any particular process. Is this because the socket buffers keep expanding up to a certain point and then just start dropping packets once they cannot expand further?
I'm using .NET sockets, if that matters.
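For reference, here is a minimal sketch of the kind of per-camera receive loop I'm describing; the host, port, and request path are placeholders, and the MJPEG frame parsing is omitted:

```csharp
// Hypothetical per-camera reader: host, port, and path are placeholders.
using System;
using System.Net.Sockets;
using System.Text;

class CameraReader
{
    public static void ReadStream(string host, int port, string path)
    {
        using var client = new TcpClient(host, port);
        using NetworkStream stream = client.GetStream();

        // Issue the GET request; the camera replies with an endless JPEG stream.
        byte[] request = Encoding.ASCII.GetBytes(
            $"GET {path} HTTP/1.1\r\nHost: {host}\r\n\r\n");
        stream.Write(request, 0, request.Length);

        var buffer = new byte[64 * 1024];
        int read;
        while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Split out and process JPEG frames from buffer[0..read] here.
            // If this loop falls behind, unread data backs up outside the
            // process, in the kernel's socket receive buffer.
        }
    }
}
```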
Upvotes: 1
Views: 129
Reputation: 84149
If by "stream" you mean TCP then in-kernel socket receive buffer would fill up and OS network stack would apply TCP flow control to slow down the sender.
If, on the other hand, you are working with UDP, then at some point your receiver will start dropping packets, and unless you have some sequencing in the application-level protocol, you won't know about it.
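If you do control the sender, a simple application-level sequence number lets the receiver at least notice the gaps. A sketch, assuming each datagram starts with a 4-byte big-endian counter (that framing is an assumption, not any standard camera protocol):

```csharp
// Sketch only: assumes each datagram begins with a 4-byte big-endian
// sequence counter added by the sender; gaps indicate dropped datagrams.
using System;
using System.Net;
using System.Net.Sockets;

class UdpDropDetector
{
    public static void Listen(int port)
    {
        using var udp = new UdpClient(port);
        var remote = new IPEndPoint(IPAddress.Any, 0);
        uint expected = 0;
        bool first = true;

        while (true)
        {
            byte[] datagram = udp.Receive(ref remote);
            if (datagram.Length < 4) continue;

            uint seq = (uint)((datagram[0] << 24) | (datagram[1] << 16) |
                              (datagram[2] << 8) | datagram[3]);
            if (!first && seq != expected)
                Console.WriteLine($"Lost {seq - expected} datagram(s)");
            first = false;
            expected = seq + 1;
        }
    }
}
```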
Upvotes: 3