Reputation: 93
Can anyone guide me with my problem? I am building an application for the banking sector using fuzzy logic, and I have to import a table with 100 million rows daily. I am using MySQL for this application, and it is processing the load slowly. Is there another database server that can handle my data with faster access?
Upvotes: 5
Views: 3997
Reputation: 221165
We load roughly half that many rows a day into our RDBMS (Oracle), and it would not occur to me to implement such a thing without access to DBA knowledge of my RDBMS. We fine-tune this system several times a month and still encounter new issues all the time. This is such a non-trivial task that the only valid answer is:
Don't play around, have your managers get a DBA who knows their business!
Note: Our system has been in place for 10 years now. It wasn't built in a day...
Upvotes: 6
Reputation: 5144
Well, it seems your business's main logic does not depend on importing those 100 million rows into a database; otherwise you wouldn't be stuck doing this by yourself, right? (Correct me if I'm wrong.)
Are you sure you need to import those rows into a database when the main business doesn't require it? What kind of questions are you going to ask of the data? Couldn't you keep the log files on a bunch of servers and query them with, e.g., Hadoop? Or could you aggregate the information contained in the log files and store only a condensed version?
I'm also asking because it sounds like you're planning at least moderately sophisticated analysis on the data, and the trouble with this amount of data won't stop once you have it in a DB.
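To illustrate the condensed-version idea, here is a minimal Python sketch. The file name and line layout (timestamp,account_id,amount) are made-up assumptions for the example, not anything from your setup:

    import csv
    from collections import defaultdict

    def aggregate_log(path):
        # Collapse raw transaction lines into one (day, account) summary each.
        # Assumes hypothetical CSV lines: ISO timestamp, account id, amount.
        totals = defaultdict(lambda: [0, 0.0])   # (day, account) -> [count, sum]
        with open(path, newline="") as f:
            for ts, account, amount in csv.reader(f):
                day = ts[:10]                    # keep only the date part
                bucket = totals[(day, account)]
                bucket[0] += 1
                bucket[1] += float(amount)
        return totals                            # far fewer rows than the raw log

    for (day, account), (count, total) in aggregate_log("transactions.log").items():
        print(day, account, count, total)

Loading only these summary rows instead of every raw transaction can cut the daily insert volume by orders of magnitude, depending on how many transactions each account produces per day.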
Upvotes: 0
Reputation: 9148
100 million rows daily?
You have to be realistic. I doubt any single instance of any database out there can handle this kind of throughput efficiently. You should probably look at clustering options and other optimisation techniques, such as splitting the data across two different DBs (sharding).
MySQL Enterprise has a bunch of built-in features that could ease and monitor the clustering process, but I think the MySQL Community edition supports it too.
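Just to illustrate the sharding idea (the hosts, credentials, and table names below are placeholders, not a recommendation for any specific setup), rows can be routed to one of several MySQL instances by hashing a key, e.g. in Python with mysql-connector:

    import mysql.connector  # pip install mysql-connector-python

    # Hypothetical shard connections; hosts and credentials are placeholders.
    SHARDS = [
        mysql.connector.connect(host="db0.example.com", user="app",
                                password="...", database="bank"),
        mysql.connector.connect(host="db1.example.com", user="app",
                                password="...", database="bank"),
    ]

    def shard_for(account_id):
        # Deterministic routing: the same account always lands on the same server.
        return SHARDS[account_id % len(SHARDS)]

    def insert_row(account_id, amount):
        conn = shard_for(account_id)
        cur = conn.cursor()
        cur.execute("INSERT INTO transactions (account_id, amount) VALUES (%s, %s)",
                    (account_id, amount))
        conn.commit()

The key point is that the routing function must be deterministic, so reads for a given account go to the same shard that received its writes.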
Good luck!
Upvotes: 3
Reputation: 2666
How are you doing it? One huge transaction?
Perhaps try making small transactions in chunks of 100 or 1000 rows.
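For example, a chunked load with one commit per batch, sketched in Python with mysql-connector (the host, credentials, and transactions table are placeholders):

    import mysql.connector

    BATCH = 1000  # commit every 1000 rows instead of one huge transaction

    conn = mysql.connector.connect(host="localhost", user="app",
                                   password="secret", database="bank")
    cur = conn.cursor()
    sql = "INSERT INTO transactions (account_id, amount) VALUES (%s, %s)"

    def load(rows):
        # rows: iterable of (account_id, amount) tuples
        batch = []
        for row in rows:
            batch.append(row)
            if len(batch) == BATCH:
                cur.executemany(sql, batch)
                conn.commit()      # small transaction keeps the logs manageable
                batch.clear()
        if batch:                  # flush the final partial chunk
            cur.executemany(sql, batch)
            conn.commit()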
Is there an index on that table? Drop the index before starting the import (if that is possible given unique constraints etc.) and rebuild it after the import.
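A rough sketch of that pattern, continuing the connection and load() from the chunked example above, with assumed table and index names (transactions, idx_account):

    # Drop the secondary index so the bulk load doesn't maintain it row by row.
    cur.execute("ALTER TABLE transactions DROP INDEX idx_account")

    load(todays_rows)  # hypothetical source of the day's rows

    # Rebuild the index once over the finished table; usually much faster
    # than updating it for every inserted row.
    cur.execute("ALTER TABLE transactions ADD INDEX idx_account (account_id)")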
Another option would perhaps be an in-memory database.
Upvotes: 0