Gaurav

Reputation: 2103

sqlite3 for multiple processes? How do other processes get affected after a huge data update?

I am trying to use sqlite3 with node.js as a reader and one C process as a writer.

From what I have read at http://www.sqlite.org/faq.html#q5, I understand that I can use sqlite3 from multiple processes: many reader processes and one writer process. It is also mentioned there that SQLite is a replacement for fopen() (and not for Oracle).
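The writer side currently does, roughly, the following (a simplified sketch; the file name data.db, the table samples, and the dummy row are placeholders for my real schema): it replaces the hourly batch inside one transaction and calls sqlite3_busy_timeout() so it waits instead of failing with SQLITE_BUSY when a reader briefly holds a lock.

    #include <stdio.h>
    #include <sqlite3.h>

    int main(void) {
        sqlite3 *db;
        char *err = NULL;

        if (sqlite3_open("data.db", &db) != SQLITE_OK) {
            fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
            return 1;
        }
        /* Wait up to 5 seconds instead of returning SQLITE_BUSY immediately
           when a reader holds a lock. */
        sqlite3_busy_timeout(db, 5000);

        /* Replace the hourly batch inside a single transaction so readers see
           either the old data or the new data, never a half-written state. */
        const char *sql =
            "BEGIN IMMEDIATE;"
            "CREATE TABLE IF NOT EXISTS samples(id INTEGER PRIMARY KEY, payload TEXT);"
            "DELETE FROM samples;"
            "INSERT INTO samples(payload) VALUES('placeholder row');"
            "COMMIT;";
        if (sqlite3_exec(db, sql, NULL, NULL, &err) != SQLITE_OK) {
            fprintf(stderr, "write failed: %s\n", err);
            sqlite3_free(err);
        }

        sqlite3_close(db);
        return 0;
    }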

What I want to know is what happens when a process updates the database:
1. Does it inform the other processes that the DB has been updated?
2. If my writer process loads 100 MB of data and deletes 100 MB of data every hour, is sqlite3 a good choice?
3. Regarding #2, do the other processes need to reload the file or their memory cache? Does SQLite even use a caching mechanism? Do the other processes know which data has been updated?
4. Would it help if I merged the reader and writer processes? (For that I would have to pull info from the writer server over a socket, so one socket connection is added.)

PS: Maybe this question is really about how the information is stored in the database file: how is an update or deletion of data handled on disk?

Upvotes: 2

Views: 1219

Answers (1)

MK.

Reputation: 34587

  1. No.
  2. 100 MB per hour is probably borderline. I would consider a real database like PostgreSQL.
  3. Other processes will need to query the data to see what changed (see the sketch after this list). If you have multiple writing processes, SQLite might not be a good choice.
  4. If in the future you will have processes running on different machines, SQLite is definitely not a good choice.
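To illustrate point 3: one way for a reader process to notice that another process has committed something is to poll PRAGMA data_version, whose value changes when a different connection modifies the database file (available in newer SQLite releases). This is only a sketch; the file name, poll interval, and what you re-query afterwards are up to you.

    #include <stdio.h>
    #include <unistd.h>
    #include <sqlite3.h>

    /* Returns the current value of PRAGMA data_version, or -1 on error.
       The value changes whenever another connection commits to the file. */
    static int get_data_version(sqlite3 *db) {
        sqlite3_stmt *stmt;
        int v = -1;
        if (sqlite3_prepare_v2(db, "PRAGMA data_version;", -1, &stmt, NULL) == SQLITE_OK) {
            if (sqlite3_step(stmt) == SQLITE_ROW)
                v = sqlite3_column_int(stmt, 0);
            sqlite3_finalize(stmt);
        }
        return v;
    }

    int main(void) {
        sqlite3 *db;
        if (sqlite3_open("data.db", &db) != SQLITE_OK)
            return 1;
        sqlite3_busy_timeout(db, 5000);

        int last = get_data_version(db);
        for (;;) {
            sleep(1);                              /* arbitrary poll interval */
            int now = get_data_version(db);
            if (now != last) {
                /* The writer committed something: re-run your queries here. */
                printf("database changed by another process\n");
                last = now;
            }
        }
        /* Not reached; sqlite3_close(db) would go here in a real program. */
    }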

Upvotes: 1
