abend

Reputation: 483

What is the best approach when caching info?

My app sends emails. I currently:

  1. get a list of customers from the DB (objects)
  2. get a list of their email types from the DB (ditto)
  3. get a list of email recipients/unique data for the email from the DB (again)
  4. use the data above to generate mailmessages
  5. loop through the mailmessages and send them out while logging the smtp status

Now this behavior is fine when you're sending off 500 emails, but what is the impact if it's 10,000+ emails? I imagine at some point the number of objects I am storing until I get to step 5 is considerable. How can I measure it to know I am approaching capacity? I figure I can at least time the whole scenario to understand how long it takes, as a clue to when it is becoming a drag on the system.
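For the measurement question, a minimal sketch of one way to do it (Python standing in for whatever stack the app actually uses; `build_messages` is a hypothetical stand-in for steps 1-4): wrap the build phase in a wall-clock timer plus a memory tracer so you can see how both grow as the recipient count grows.

```python
import time
import tracemalloc

def build_messages(customers):
    # hypothetical stand-in for steps 1-4: one message object per customer
    return [{"to": c, "body": f"Hello {c}"} for c in customers]

customers = [f"user{i}@example.com" for i in range(10_000)]

tracemalloc.start()
start = time.perf_counter()
messages = build_messages(customers)
elapsed = time.perf_counter() - start
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"built {len(messages)} messages in {elapsed:.3f}s, peak {peak / 1e6:.1f} MB")
```

Running this at 500, 5,000, and 10,000 recipients gives you concrete numbers to extrapolate from, rather than guessing when the object graph becomes a problem.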

Would it be better to run this scenario on a per-customer basis? That seems less efficient, hitting the DB potentially hundreds of times instead of 3 or so. I know the logging will be one-off hits back to the DB.
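There is a middle ground between one giant query and one query per customer: fetch in fixed-size batches. A sketch of the idea (Python, with a plain list standing in for the customer table):

```python
def batched(items, size):
    # yield fixed-size slices of items; the last slice may be shorter
    for i in range(0, len(items), size):
        yield items[i:i + size]

customer_ids = list(range(10_000))
queries = 0
for batch in batched(customer_ids, 500):
    queries += 1  # one DB round-trip per batch instead of per customer
print(queries)
```

With a batch size of 500, 10,000 customers cost 20 round-trips instead of 10,000, and only one batch's worth of objects is held in memory at a time.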

I am looking for an approach, not a code resolution. I got in trouble last time I didn't specify that.

Upvotes: 0

Views: 79

Answers (1)

Kjartan

Reputation: 19151

I guess this depends on several things, such as how big/powerful your system is (database capacity, processing/memory, and more?), and how important it is to send out these mails quickly, among other things.

An idea might be to use a temporary DB table to store the info from steps 1-4. You could populate it in batches (as Blogobeard mentioned), or all at once, depending on efficiency. The actual mailing job could then also be split into batches, and when a mail is sent, that customer's info would be deleted from the temp table.
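A rough sketch of that temp-table approach, with `sqlite3` standing in for the real database and the SMTP send stubbed out (table and column names here are made up for illustration): stage everything in an outbox table, send in batches, and clear each batch once it has gone out.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TEMP TABLE outbox (id INTEGER PRIMARY KEY, email TEXT, body TEXT)"
)
# steps 1-4: stage one row per recipient
conn.executemany(
    "INSERT INTO outbox (email, body) VALUES (?, ?)",
    [(f"user{i}@example.com", "Hello") for i in range(1000)],
)

BATCH = 250
while True:
    rows = conn.execute(
        "SELECT id, email FROM outbox LIMIT ?", (BATCH,)
    ).fetchall()
    if not rows:
        break
    for row_id, email in rows:
        pass  # send via SMTP here and log the status
    # a sent batch is removed so a crash/restart resumes where it left off
    conn.execute(
        f"DELETE FROM outbox WHERE id IN ({','.join('?' * len(rows))})",
        [r[0] for r in rows],
    )

remaining = conn.execute("SELECT COUNT(*) FROM outbox").fetchone()[0]
print(remaining)
```

A nice side effect of deleting rows as they are sent is that the outbox doubles as a restart point: whatever is still in the table simply has not been sent yet.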

There are probably several ways to fine-tune this, and it's probably easier to give better advice once you've tried something and have some specific results to act on.

Upvotes: 1
