Reputation: 1
I am thinking of a big project which is going to put a lot of load on server resources and bandwidth. It should work with a huge MySQL database of different webpages (maybe about 20 million records; I hope MySQL can handle that many). But I think I will need to split the requests to the database between several servers, because a single server won't be able to handle such a load. I wonder how big projects like Google or Archive.org do this? I thought of a primitive method, which is as follows:
I feel like this is a noob method, but I would like to know your opinion, and maybe how you would implement this in a more optimized and clever way.
I will be happy for any tips.
Thanks, Dennis.
Upvotes: 0
Views: 145
Reputation: 127262
(maybe about 20 million records; I hope MySQL can handle that many)
20M records is next to nothing for a database. It's all about how you use the database, how you write your SQL, your data model, and so on. Don't worry about 20M records; that's not going to be a problem at all.
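To make that concrete, here is a minimal sketch of what "it's about the data model and the SQL" can look like. The table and column names (pages, url_hash, fetched_at) are made up for illustration; the point is that with a proper index, a lookup is an index seek and costs roughly the same whether the table holds 20 thousand or 20 million rows.

    -- Hypothetical table for ~20M crawled pages; names are illustrative only.
    CREATE TABLE pages (
        id         BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
        url        VARCHAR(2048)   NOT NULL,
        url_hash   BINARY(20)      NOT NULL,   -- SHA-1 of the URL, keeps the index small
        fetched_at DATETIME        NOT NULL,
        body       MEDIUMTEXT,
        PRIMARY KEY (id),
        UNIQUE KEY idx_url_hash (url_hash),    -- lookup by URL becomes a single index seek
        KEY idx_fetched_at (fetched_at)        -- range scans over fetch time
    ) ENGINE=InnoDB;

    -- Stays fast at 20M rows because it hits the unique index:
    SELECT id, fetched_at
    FROM pages
    WHERE url_hash = UNHEX(SHA1('https://example.org/'));

Only if a schema like that, properly indexed, still can't keep up would you need to think about splitting the load across servers (e.g. read replicas or sharding); at 20M rows that is usually premature.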
Upvotes: 1