Calpau

Reputation: 919

Does the memory usage of SQLite remain static regardless of DB size?

I have a 700MB SQLite3 database that I'm reading from and writing to with a simple Python program. I'm trying to gauge the program's memory usage as it operates on the database. I've used these methods: memory_profiler, psutil, and top/htop.

The first two support the conclusion that the process never uses more than 20MB at any given time: I can start with an empty database and fill it with 700MB of data, and usage stays under 20MB.

Memory profiler's figure never went above 15.805MiB:

Line #    Mem usage    Increment   Line Contents
================================================
   ...
   229   13.227 MiB    0.000 MiB       @profile
   230                                 def loop(self):
   231                                     """Loop to record DB entries"""
   234   15.805 MiB    2.578 MiB           for ev in range(self.numEvents):
   ...

psutil said peak usage was 16.22265625MB.
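(The question doesn't include the measurement script itself; the following is only a minimal sketch of how a peak reading like this could be taken with psutil, sampling the process's resident set size while rows are written to a hypothetical events.db with a made-up events table.)

import os
import sqlite3

import psutil

# Hypothetical database and table, purely for illustration.
conn = sqlite3.connect("events.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS events (id INTEGER PRIMARY KEY, payload TEXT)"
)

proc = psutil.Process(os.getpid())
peak_rss = 0

for i in range(1_000_000):
    conn.execute("INSERT INTO events (payload) VALUES (?)", ("x" * 100,))
    if i % 10_000 == 0:
        conn.commit()
        # Track the largest resident set size observed so far.
        peak_rss = max(peak_rss, proc.memory_info().rss)

conn.commit()
conn.close()

print("peak RSS: %.8f MB" % (peak_rss / (1024 * 1024)))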

Now top/htop is a little weirder. Both said that the Python process's memory usage never went above 20MB, but in the "used" figure I could also clearly see free memory steadily decreasing as the database filled up:

Mem:   4047636k total,   529600k used,  3518036k free,    83636k buffers

My questions:

Regarding the last point, my ultimate objective is to use a rather large SQLite database of unknown size on an embedded system with limited RAM, and I would like to know whether it's true that memory usage stays more or less constant regardless of the size of the database.

Upvotes: 1

Views: 1329

Answers (1)

Colonel Thirty Two

Reputation: 26569

SQLite's memory usage doesn't depend on the size of the database; SQLite can handle terabyte-sized databases just fine, and it only loads the parts of the database that it needs (plus a small, configurable-sized cache).
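That cache is the main knob. As a rough sketch (the database filename here is hypothetical), it can be capped per connection from Python with PRAGMA cache_size, where a negative value is a limit in KiB and a positive value is a number of pages:

import sqlite3

conn = sqlite3.connect("events.db")  # hypothetical database file

# Limit this connection's page cache to roughly 2 MiB.
# Negative values mean "at most this many KiB"; positive values
# mean "this many pages" (page size is commonly 4096 bytes).
conn.execute("PRAGMA cache_size = -2048")

# Confirm the setting by reading the pragma back.
print(conn.execute("PRAGMA cache_size").fetchone())

conn.close()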

SQLite should be fine on embedded systems; that's what it was originally designed for.

Upvotes: 3
