Reputation: 1429
I have an ASP.NET Core 2.2 app hosted as an Azure App Service. It sporadically receives large bursts of traffic, and much of the load goes to verifying postal/zip codes. The queries against the postal code table are all SELECTs; there are no updates or inserts. These queries are a performance bottleneck, and the table itself is about 700 MB. I've been looking for a way to hold this particular table in memory from startup, possibly as a separate DbContext. Is it possible to fill this secondary DbContext with data from a main context connected to the SQL database at startup? Is this a bad idea?
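Roughly what I have in mind, as a sketch (PostalCode, MainContext, and LookupContext are placeholder names, and this assumes the EF Core InMemory provider):

```
// Sketch only -- PostalCode, MainContext, and LookupContext are placeholder names.
// Startup.ConfigureServices: register a second context on the EF Core InMemory provider.
services.AddDbContext<LookupContext>(options =>
    options.UseInMemoryDatabase("postal-codes"));

// Startup.Configure (or on app start): copy the table from SQL into memory once.
using (var scope = app.ApplicationServices.CreateScope())
{
    var sql    = scope.ServiceProvider.GetRequiredService<MainContext>();
    var lookup = scope.ServiceProvider.GetRequiredService<LookupContext>();

    lookup.PostalCodes.AddRange(sql.PostalCodes.AsNoTracking());
    lookup.SaveChanges();
}
```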
Upvotes: 3
Views: 632
Reputation: 4012
Putting the whole table in memory is not a good idea: you may already be running on shared resources in Azure, so holding ~700 MB of lookup data in the process will hurt overall performance. I'm not sure about your constraints, but I can suggest the following.
Scale Up
If you can, scale up the database instance; a higher tier will certainly help.
Query Optimization
Work on query optimization if you haven't already: make sure the postal code column is indexed and look at best practices for read-heavy lookup queries.
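For example, with EF Core a read-only existence check benefits from an index on the code column, AsNoTracking, and an AnyAsync/EXISTS-style query. A sketch, where PostalCode, Code, and AppDbContext are assumed names:

```
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public static class PostalCodeQueries
{
    // In OnModelCreating, make sure the lookup column is indexed:
    // modelBuilder.Entity<PostalCode>().HasIndex(p => p.Code);

    public static Task<bool> IsValidPostalCodeAsync(AppDbContext db, string code)
        => db.PostalCodes
             .AsNoTracking()                 // read-only: skip the change tracker
             .AnyAsync(p => p.Code == code); // translates to an EXISTS query that can use the index
}
```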
Read Replica
If your table is part of a transactional database, you can create a read replica. Read replicas are optimized for read queries.
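With Azure SQL this can be as simple as a second, read-only DbContext whose connection string carries ApplicationIntent=ReadOnly, so reads are routed to a readable secondary. A sketch; ReadOnlyContext and the connection string values are illustrative:

```
// Startup.ConfigureServices -- ReadOnlyContext is an assumed name.
services.AddDbContext<ReadOnlyContext>(options =>
    options.UseSqlServer(
        // ApplicationIntent=ReadOnly routes the connection to a readable secondary.
        "Server=tcp:myserver.database.windows.net,1433;Database=mydb;" +
        "ApplicationIntent=ReadOnly;User ID=...;Password=..."));
```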
Cache
You can consider a distributed cache like Redis. Depending on your table, e.g. if it is lookup data, let the cache build up on demand using the cache-aside pattern.
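A cache-aside sketch using ASP.NET Core's IDistributedCache abstraction (the Redis registration comes from the Microsoft.Extensions.Caching.Redis package; PostalCodeValidator and the key format are assumptions):

```
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

// ConfigureServices:
// services.AddDistributedRedisCache(o => o.Configuration = "your-redis:6379");

public class PostalCodeValidator
{
    private readonly IDistributedCache _cache;
    public PostalCodeValidator(IDistributedCache cache) => _cache = cache;

    public async Task<bool> IsValidAsync(string code)
    {
        var key = "postal:" + code;

        // 1. Try the cache first.
        var cached = await _cache.GetStringAsync(key);
        if (cached != null) return cached == "1";

        // 2. Miss: fall back to the existing SQL SELECT.
        var valid = await IsValidInDbAsync(code);

        // 3. Populate the cache on demand (cache-aside).
        await _cache.SetStringAsync(key, valid ? "1" : "0",
            new DistributedCacheEntryOptions { SlidingExpiration = TimeSpan.FromHours(12) });

        return valid;
    }

    // Placeholder for the existing SQL lookup.
    private Task<bool> IsValidInDbAsync(string code) => Task.FromResult(true);
}
```

Since the table never changes, entries can have long expirations and never need explicit invalidation; under burst traffic only the first lookup of each code hits SQL.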
NoSQL/Data Warehouse
RDBMS performance is subject to many factors, and serving data this large from a transactional database will remain a challenge. If your business case allows, you can use a NoSQL data store or a data warehouse solution instead.
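As one illustration (Cosmos DB is just an example of such a store, not something from your setup): if documents were keyed by postal code, validation becomes a single point read.

```
using System.Net;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

public static class PostalCodeStore
{
    // Assumes documents use the postal code as both id and partition key.
    public static async Task<bool> ExistsAsync(Container container, string code)
    {
        try
        {
            await container.ReadItemAsync<object>(code, new PartitionKey(code));
            return true;  // point read succeeded -> code exists
        }
        catch (CosmosException ex) when (ex.StatusCode == HttpStatusCode.NotFound)
        {
            return false; // no document with that id -> invalid code
        }
    }
}
```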
Upvotes: 1