foo_l

Reputation: 601

best allocation of memory

I have 200 groups, each with 100 devices, i.e. 20,000 devices in total.

Now, when each device registers with the server, the server assigns a group id to the device (all 100 devices in a group share the same group id). At a later stage the server sends multicast data with the group id so that the data is received by all the devices having that group id.

The problem is that I need to allocate a single chunk of memory (say 25 bytes) for each group to store the data, so that all the devices in that group will use that chunk for their processing. My idea is to allocate one big chunk (say 25 * 200 = 5000 bytes) and assign each group a 25-byte block (grp0 points to the start address, grp1 points to start+25, and so on), as sketched below.
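Roughly, this is the layout I have in mind (GROUP_COUNT, BLOCK_SIZE and group_block are just names I made up for the example):

    #include <stdint.h>
    #include <stdlib.h>

    #define GROUP_COUNT 200   /* number of groups            */
    #define BLOCK_SIZE  25    /* bytes reserved per group    */

    /* One allocation for all groups; group i uses bytes
     * [i * BLOCK_SIZE, (i + 1) * BLOCK_SIZE). */
    static uint8_t *pool;

    /* Pointer to the 25-byte block belonging to a group id. */
    static uint8_t *group_block(int grp_id)
    {
        return pool + (size_t)grp_id * BLOCK_SIZE;
    }

    int main(void)
    {
        pool = malloc((size_t)GROUP_COUNT * BLOCK_SIZE);
        if (pool == NULL)
            return 1;

        uint8_t *grp0 = group_block(0);   /* start address     */
        uint8_t *grp1 = group_block(1);   /* start + 25 bytes  */
        (void)grp0; (void)grp1;

        free(pool);
        return 0;
    }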

Is this the best way? Any other ideas?

Upvotes: 0

Views: 112

Answers (2)

thedayofcondor

Reputation: 3876

For your example, I would use an array.

Provided the number of your clients does not change, allocating a single block is the most efficient way:

  • You make a single malloc call instead of 200
  • You avoid the bookkeeping overhead the allocator adds for every separately allocated block
  • Your data is kept in one piece, which makes it friendlier to the processor cache than 200 small blocks placed god-knows-where

That said, the difference for a single group is probably negligible, but multiplied across 200 groups it can give you a performance boost (it really depends on how you use the data structure).

If, instead, the structure is dynamic (for example, your clients connect and disconnect, so there are not always exactly 100), you should use a linked list, which allocates memory as it is needed (so you end up with many separate memory blocks).
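A rough sketch of that dynamic alternative, assuming one node is allocated per group on demand (struct group_node and get_group are illustrative names, not a specific API):

    #include <stdlib.h>
    #include <string.h>

    #define BLOCK_SIZE 25

    /* One node per active group, allocated only when the group appears. */
    struct group_node {
        int id;                          /* group id                    */
        unsigned char data[BLOCK_SIZE];  /* per-group processing block  */
        struct group_node *next;
    };

    /* Find a group's node, creating it if it does not exist yet. */
    static struct group_node *get_group(struct group_node **head, int id)
    {
        for (struct group_node *n = *head; n != NULL; n = n->next)
            if (n->id == id)
                return n;

        struct group_node *n = malloc(sizeof *n);
        if (n == NULL)
            return NULL;
        n->id = id;
        memset(n->data, 0, sizeof n->data);
        n->next = *head;
        *head = n;
        return n;
    }

Note that get_group does a linear scan, which is fine for 200 groups; with many more groups you would want to index by group id instead.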

Upvotes: 1

bellati

Reputation: 82

As stated by ArjunShankar, accessing a device within a group takes O(1) time. That is good, as long as you don't have to do too much processing to find a specific device (assuming you have to find one at all). If you plan to process them simultaneously and the number grows large (or your available memory is limited), you should take a look at techniques such as paging data to disk.

Upvotes: 0
