Hello, I am working on a website for a client that is basically a biological database; some of the records have pictures, and I am not allowed to give out more detail than that. The data will be stored in MySQL.
The client says there are about 500 million items of data, each holding about 10 small text fields. There is no logical split in the data, so I can't simply divide it across multiple servers with one server holding type A and so on. There are also multiple tables that link together.

To be honest, I want to do this myself, although I have never built a website that spans multiple servers, or even pushed a single server to its limit. Are there any resources, links, or helpful comments on how to do this? How much can one server handle? Is there a way to make multiple Linux servers act like one database, since that should be faster? I am planning to use multiple Linux servers; how many would I need?

Thanks!
Steve
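P.S. To make the question concrete, here is roughly what I imagine by "multiple servers acting like one database": the application hashes each item's key and routes the query to one of N MySQL servers. This is only a sketch of application-level hash sharding; the host names and the idea of a per-item key are placeholders I made up, not anything from the real schema.

# Rough sketch of application-level hash sharding across several MySQL servers.
# The host list and the "specimen" keys below are hypothetical examples.
import zlib

SHARD_HOSTS = [
    "db1.example.com",
    "db2.example.com",
    "db3.example.com",
]

def shard_for(item_key):
    """Pick the MySQL server responsible for this item.

    A stable hash of the key modulo the number of servers spreads the
    ~500 million items roughly evenly, even without any logical split.
    (zlib.crc32 is used because it gives the same result in every process,
    unlike Python's built-in hash().)
    """
    bucket = zlib.crc32(item_key.encode("utf-8")) % len(SHARD_HOSTS)
    return SHARD_HOSTS[bucket]

if __name__ == "__main__":
    # The same key always routes to the same server, so lookups stay cheap.
    for key in ("specimen-000001", "specimen-123456", "specimen-999999"):
        print(key, "->", shard_for(key))

Something like this in the application layer would at least keep the routing deterministic. Whether that is a sensible approach, or whether replication or some clustered MySQL setup would handle it better, is exactly what I am asking.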