Reputation: 117
After looking at ways to go about this, I'm a bit overwhelmed, having never done this before, so here I am. I'm looking into ways to store the data for a personal project. I've never worked with this much data before; I usually just store it in a .txt or .csv file, but I'm curious whether there is a better way to go about it. At the moment I'm looking into json, as I feel the structure could prove useful in the future (not that important).
In terms of the data, just think of a dictionary: idx | string | string | string | string ...
Would it be worth the investment to store all the information in a DBMS if I just want to grab it by the index to display? I'm more curious about why you'd take one route over the other.
Upvotes: 1
Views: 1237
Reputation: 17
I'm working at ObjectBox and am new to C++ myself, so I can say that it's quite easy to use even if you aren't familiar with any data storage solutions. ObjectBox is a free-to-use NoSQL database.
You define the schema (a separate file that lists the properties of your object, like the word itself and its definition) and ObjectBox generates the binding code automatically. Storing data in your app is then just a matter of using a native C++ object, like in the example below:
// Requires the ObjectBox C++ header ("objectbox.hpp") and the binding code
// generated from the schema file shown below.
int main() {
    obx::Store store(create_obx_model());
    obx::Box<Word> box(store);

    // Create an object
    obx_id id = box.put({.word = "entry 1", .definition = "definition 1"});

    // Read it back from the database
    std::unique_ptr<Word> entry = box.get(id);
    return 0;
}
Here is what the schema file looks like:
table Word {
    id: ulong;
    word: string;
    definition: string;
}
Upvotes: 1
Reputation: 25613
There are thousands of already implemented standard solutions.
As already mentioned by others: connect to a database with a library of your choice. But this is typically the slowest method, because in every library I know everything is converted to text to be transferred to the database core and sometimes parsed back into native data there. This costs a lot of time and memory.
Use your own data structures and a common serializer.
Use a library that describes your data structures and the I/O facilities for them, like protobuf. There are a lot of implementations; a minimal sketch follows below.
Or simply do it handcrafted... but my advice is: take an already implemented solution, as it saves you time and, above all, keeps you from reinventing the wheel and its bugs.
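To make the protobuf route concrete, here is a minimal sketch; the message and file names (Word, words.proto, words.bin) are made up for illustration, and protoc must first generate words.pb.h / words.pb.cc from the schema:
// words.proto (hypothetical), compiled with: protoc --cpp_out=. words.proto
//   syntax = "proto3";
//   message Word {
//     uint64 id = 1;
//     string word = 2;
//     string definition = 3;
//   }
#include <fstream>
#include "words.pb.h"  // generated by protoc

int main() {
    // Fill a native-looking object and write it to disk in binary form
    Word entry;
    entry.set_id(1);
    entry.set_word("entry 1");
    entry.set_definition("definition 1");
    {
        std::ofstream out("words.bin", std::ios::binary);
        entry.SerializeToOstream(&out);
    }

    // Read it back later
    Word loaded;
    std::ifstream in("words.bin", std::ios::binary);
    loaded.ParseFromIstream(&in);
    return loaded.definition() == entry.definition() ? 0 : 1;
}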
Upvotes: 1
Reputation: 32953
With lightweight options like SQLite the investment per se will be pretty limited (in complexity as well as extra footprint), and you will be able to offload a whole bunch of concerns to the library: memory management, data retrieval by criterion, etc. With JSON you will have to keep the entire dataset in RAM (which doesn't scale, and on each start and stop you will have to load and save it), and you will have to figure out querying all by yourself.
There are other options as well but in my opinion SQLite is very mature, very well known and pretty solid, and if all you need is pure ISAM style access, the SQL you will need to understand is minimal.
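To show how little SQL that actually is, here is a minimal sketch of the store-and-fetch-by-index workflow using the SQLite C API; the file, table, and column names (words.db, words, idx, definition) are made up for illustration:
#include <sqlite3.h>
#include <cstdio>

int main() {
    sqlite3* db = nullptr;
    if (sqlite3_open("words.db", &db) != SQLITE_OK) {
        std::fprintf(stderr, "open failed: %s\n", sqlite3_errmsg(db));
        return 1;
    }

    // Create the table and insert one row (error handling omitted for brevity)
    sqlite3_exec(db,
                 "CREATE TABLE IF NOT EXISTS words (idx INTEGER PRIMARY KEY, definition TEXT);"
                 "INSERT OR REPLACE INTO words VALUES (1, 'definition 1');",
                 nullptr, nullptr, nullptr);

    // Fetch a single row by its index with a prepared statement
    sqlite3_stmt* stmt = nullptr;
    sqlite3_prepare_v2(db, "SELECT definition FROM words WHERE idx = ?;", -1, &stmt, nullptr);
    sqlite3_bind_int(stmt, 1, 1);
    if (sqlite3_step(stmt) == SQLITE_ROW) {
        std::printf("%s\n", reinterpret_cast<const char*>(sqlite3_column_text(stmt, 0)));
    }
    sqlite3_finalize(stmt);
    sqlite3_close(db);
    return 0;
}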
Upvotes: 2