Reputation: 1276
I'm creating a Linux device driver that creates a character device. The data it returns on reads is logically divided into 16-byte units.
I was planning to implement this division by returning however many whole units fit into the read buffer, but I'm not sure what to do if the read buffer is too small (<16 bytes).
What should I do here? Or is there a better way to achieve the division I'm trying to represent?
Upvotes: 5
Views: 1176
Reputation: 780974
You could behave like a datagram socket: each read returns just a single datagram. If the read buffer is smaller, the excess is discarded; it's the caller's responsibility to provide enough space for a whole datagram (typically, the application protocol specifies the maximum datagram size).
The documentation of your device should specify that it works in 16-byte units, so there's no reason for a caller to provide a buffer smaller than that. Any data lost to the discarding described above can then be considered a bug in the calling application.
However, it would also be reasonable to return more than 16 bytes at a time if the caller asks for it; that suggests the application intends to split the data into units itself. This could perform better, since it minimizes system calls. But if the buffer isn't a multiple of 16, you could discard the remainder of the last unit. Just make sure this is documented, so callers know to make their buffers a multiple of 16.
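For concreteness, here is a minimal sketch of such a read() handler. The names `mydev_read` and `mydev_fill_units` are hypothetical and the data source is stubbed out; the sketch rounds the request down to a whole number of 16-byte units, falls back to datagram-style behaviour (one unit, excess discarded) when the buffer is smaller than a unit, and copies the result to user space:

```c
#include <linux/fs.h>
#include <linux/module.h>
#include <linux/string.h>
#include <linux/uaccess.h>

#define MYDEV_UNIT 16                           /* size of one logical unit */

/*
 * Hypothetical data source: fills @kbuf with up to @max bytes in whole
 * MYDEV_UNIT-sized units and returns the number of bytes produced.
 * Here it just emits zeroed units as a stand-in for the real device path.
 */
static ssize_t mydev_fill_units(char *kbuf, size_t max)
{
	size_t n = max / MYDEV_UNIT * MYDEV_UNIT;

	memset(kbuf, 0, n);
	return n;
}

static ssize_t mydev_read(struct file *file, char __user *buf,
			  size_t count, loff_t *ppos)
{
	char kbuf[16 * MYDEV_UNIT];             /* small staging buffer, size arbitrary */
	size_t want = count / MYDEV_UNIT * MYDEV_UNIT;  /* round down to whole units */
	ssize_t got;

	if (want == 0)
		want = MYDEV_UNIT;              /* datagram-style: still consume one unit */
	if (want > sizeof(kbuf))
		want = sizeof(kbuf);

	got = mydev_fill_units(kbuf, want);
	if (got <= 0)
		return got;                     /* error or no data available */

	if ((size_t)got > count)
		got = count;                    /* short buffer: rest of the unit is dropped */

	if (copy_to_user(buf, kbuf, got))
		return -EFAULT;

	return got;
}

static const struct file_operations mydev_fops = {
	.owner = THIS_MODULE,
	.read  = mydev_read,
};
```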
If you're worried about generic applications like `cat`, I don't think you need to be: I would expect them to use very large input buffers, simply for performance reasons.
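On the caller's side, the expectation is then simply to read in multiples of 16 bytes. A short illustration, assuming a device node at the hypothetical path /dev/mydev:

```c
#include <fcntl.h>
#include <unistd.h>

int main(void)
{
	char buf[64 * 16];                  /* a multiple of the 16-byte unit */
	int fd = open("/dev/mydev", O_RDONLY);
	ssize_t n;

	if (fd < 0)
		return 1;

	while ((n = read(fd, buf, sizeof(buf))) > 0) {
		/* n is a multiple of 16: process n / 16 complete units here */
	}

	close(fd);
	return 0;
}
```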
Upvotes: 6