PFDRm

Reputation: 11

Performance of Reading and Writing to Database: Can it be faster?

Basically, this code I've created uses Super CSV to read in a CSV file; about half of the rows need a database call to update a value. At a high level, the code reads the CSV file row by row, and if certain checks pass, it queries the database to update the value.

I was told to do some basic load testing. I've never used any load-testing tools before, but after using the Google Chrome developer tools to get the upload times and the record counts, I collected the following data.

[Screenshot: table of upload times per number of records]

How accurate are the Google Chrome developer tools when using them to measure time? 2.68 seconds seems like a long time to read 18 records and update 8. Also, this growth looks roughly like O(n log n), right? Do you think it would be faster to do a read-only pass first and then send one batch of updates to the database afterwards?
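For illustration, a minimal sketch of the read-first-then-batch idea might look like the following. The names are hypothetical and a `HashMap` stands in for the real database table; with JDBC, the second phase would map to `PreparedStatement.addBatch()`/`executeBatch()`.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BatchUpdateSketch {

    // Phase 1: read every CSV row first, keeping only the ids that pass the checks.
    static List<Integer> collectUpdates(List<Integer> csvIds, Map<Integer, String> table) {
        List<Integer> pending = new ArrayList<>();
        for (int id : csvIds) {
            if (table.containsKey(id)) {   // the "certain checks" from the question
                pending.add(id);
            }
        }
        return pending;
    }

    // Phase 2: apply all collected updates in one pass instead of one statement per row.
    static void applyBatch(List<Integer> pending, Map<Integer, String> table) {
        for (int id : pending) {
            table.put(id, "updated");
        }
    }

    public static void main(String[] args) {
        Map<Integer, String> table = new HashMap<>();  // stand-in for the real table
        table.put(1, "old");
        table.put(2, "old");
        table.put(3, "old");

        List<Integer> csvIds = List.of(1, 2, 99);      // 99 fails the existence check
        applyBatch(collectUpdates(csvIds, table), table);
        System.out.println(table);   // rows 1 and 2 updated, row 3 untouched
    }
}
```

Separating the read pass from the write pass also makes it easy to measure each phase on its own, which helps answer the "where does the time go" question.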

Any time I query the database for the existence of certain values, I use COUNT and DISTINCT. Is there a more efficient way?
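For context, a COUNT(DISTINCT ...) existence check typically forces the database to examine every matching row, whereas an EXISTS query can stop at the first hit. The SQL in the comment below is a hypothetical example; the Java code just demonstrates the short-circuiting difference in memory.

```java
import java.util.List;

public class ExistenceCheckSketch {

    // COUNT-style check: counts every match, even after the answer is already known.
    static boolean existsByCount(List<Integer> values, int target) {
        return values.stream().filter(v -> v == target).count() > 0;
    }

    // EXISTS-style check: stops at the first match, analogous to the hypothetical
    //   SELECT EXISTS (SELECT 1 FROM my_table WHERE col = ?)
    static boolean existsShortCircuit(List<Integer> values, int target) {
        return values.stream().anyMatch(v -> v == target);
    }

    public static void main(String[] args) {
        List<Integer> values = List.of(5, 3, 8, 3);
        System.out.println(existsByCount(values, 3));       // true
        System.out.println(existsShortCircuit(values, 3));  // true
        System.out.println(existsShortCircuit(values, 7));  // false
    }
}
```

Whether the database actually short-circuits depends on the engine and the query plan, which is why inspecting the plan (as the answer below suggests) is the reliable way to check.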

Any help would be greatly appreciated! I'm here to learn all I can.

Upvotes: 0

Views: 383

Answers (1)

Dmitri T

Reputation: 168042

The Chrome Developer Tools are quite accurate, but they include extra time you might not really be interested in, such as connection time and the time to render the page.

When it comes to load testing, I think you should be simulating multiple users rather than just adding more rows; however, you can combine both approaches and compare the outcomes. A list of free and open-source load testing tools can be found, for example, in the Open Source Load Testing Tools: Which One Should You Use? article.
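To make the "multiple users" idea concrete, here is a minimal sketch that fires the same request from several threads at once and counts successes. `handleUpload` is a hypothetical stand-in for the real upload endpoint; a real load test tool would hit your server instead.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ConcurrentLoadSketch {

    // Hypothetical stand-in for one CSV upload request against the server.
    static boolean handleUpload(int userId) {
        try {
            Thread.sleep(10); // simulated server-side work
            return true;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    // Run the same request from `users` threads concurrently and count successes.
    static int runConcurrentUsers(int users) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(users);
        List<Future<Boolean>> results = new ArrayList<>();
        for (int u = 0; u < users; u++) {
            final int userId = u;
            results.add(pool.submit(() -> handleUpload(userId)));
        }
        int successes = 0;
        for (Future<Boolean> f : results) {
            if (f.get()) {
                successes++;
            }
        }
        pool.shutdown();
        return successes;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runConcurrentUsers(5)); // 5 simulated concurrent users
    }
}
```

Timing how long the batch of concurrent requests takes, as you scale the user count, is the basic shape of what dedicated load testing tools automate for you.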

With regard to a "more efficient way": we cannot say anything without seeing your code. I can only suggest using a profiler for the programming language you're working in and inspecting the SQL query plan.


Upvotes: 1
