Viji V Nair wrote:
There are catches in the SAN controllers also. SAN vendors won't give
out much information regarding their internal controller design. They
will say they have 4 external 4G ports; you should also check how many
internal ports they have and how the controllers are operating.
On Tue, Jan 26, 2010 at 11:11 PM, Greg Smith wrote:
> Viji V Nair wrote:
>
>> A 15k rpm SAS drive will give you a throughput of 12MB and 120 IOPS. Now
>> you can calculate the number of disks, specifically spindles, for getting
>> your desired throughput and IOPs
>>
>
> I think you mean 120MB/s
Viji V Nair wrote:
A 15k rpm SAS drive will give you a throughput of 12MB and 120 IOPS.
Now you can calculate the number of disks, specifically spindles, for
getting your desired throughput and IOPs
I think you mean 120MB/s for that first part. Regardless, presuming you
can provision a data
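Applying the corrected figure, the spindle count you need is whichever of the
two ratios is larger: target throughput over per-drive throughput, or target
IOPS over per-drive IOPS. A minimal worked example in SQL, assuming the
120MB/s and 120 IOPS per-drive rules of thumb from this thread; the 800MB/s
and 10,000 IOPS targets are made up for illustration:

    -- spindles needed = ceil(max(throughput ratio, IOPS ratio))
    SELECT ceil(greatest(800.0   / 120,     -- 800MB/s target / 120MB/s per drive
                         10000.0 / 120))    -- 10,000 IOPS target / 120 IOPS per drive
           AS spindles_needed;              -- => 84

The IOPS target dominates here, which is the common case for random-I/O
database workloads; sequential throughput is rarely the binding constraint
once more than a handful of drives are involved.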
On Tue, Jan 26, 2010 at 5:15 PM, Matthew Wakeling wrote:
> On Mon, 25 Jan 2010, nair rajiv wrote:
>
>> I am working on a project that will take out structured content from
>> wikipedia and put it in our database...
>>
>> there is a table which will approximately have 5 crore entries after data
>> harvesting.
>>
On Mon, 25 Jan 2010, nair rajiv wrote:
I am working on a project that will take out structured content from
wikipedia and put it in our database...
there is a table which will approximately have 5 crore entries after data
harvesting.
Have you asked the Wikimedia Foundation if they mind you con
On Mon, 25 Jan 2010, Viji V Nair wrote:
I think this won't help that much if you have a single machine. Partition the
table and keep the data on different nodes. Have a look at tools like
pgpool-II.
So partitioning. You have three choices:
1. Use a single table
2. Partition the table on the
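The preview is cut off above, but option 2 is partitioning the table. On the
PostgreSQL releases current when this thread was written, partitioning meant
inheritance plus CHECK constraints, so constraint exclusion lets the planner
skip children that cannot match. A minimal sketch, with a hypothetical
entries table range-partitioned on an integer id (all names are illustrative):

    -- parent table; with inserts routed to children it stays empty
    CREATE TABLE entries (id bigint NOT NULL, title text, body text);

    -- one child per id range, each with a CHECK the planner can use
    CREATE TABLE entries_0 (CHECK (id >= 0        AND id < 10000000)) INHERITS (entries);
    CREATE TABLE entries_1 (CHECK (id >= 10000000 AND id < 20000000)) INHERITS (entries);

    -- with constraint exclusion on, this scans only entries_1 (plus the empty parent)
    SET constraint_exclusion = on;
    SELECT * FROM entries WHERE id = 12345678;

Note that this scheme leaves insert routing (a trigger or rule on the parent)
and per-child indexes up to you.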
On Tue, Jan 26, 2010 at 9:18 AM, nair rajiv wrote:
>
>
> On Tue, Jan 26, 2010 at 6:19 AM, Andres Freund wrote:
>
>> On Tuesday 26 January 2010 01:39:48 nair rajiv wrote:
>> > On Tue, Jan 26, 2010 at 1:01 AM, Craig James
>> wrote:
>> > I am working on a project that will take out structured content
>> > from wikipedia and put it in our database.
On Tue, Jan 26, 2010 at 6:19 AM, Andres Freund wrote:
> On Tuesday 26 January 2010 01:39:48 nair rajiv wrote:
> > On Tue, Jan 26, 2010 at 1:01 AM, Craig James
> wrote:
> > I am working on a project that will take out structured content
> > from wikipedia
> > and put it in our database.
On Tuesday 26 January 2010 01:39:48 nair rajiv wrote:
> On Tue, Jan 26, 2010 at 1:01 AM, Craig James
wrote:
> I am working on a project that will take out structured content
> from wikipedia
> and put it in our database. Before putting the data into the database I
> wrote a script to
> find out the number of rows every table would be having after the data is in
On Tue, Jan 26, 2010 at 1:01 AM, Craig James wrote:
> Kevin Grittner wrote:
>
>> nair rajiv wrote:
>>
>>
>>> I found there is a table which will approximately have 5 crore
>>> entries after data harvesting.
>>> Is it advisable to keep so much data in one table ?
>>>
>> That's 50,000,000 rows, right?
Kevin Grittner wrote:
nair rajiv wrote:
I found there is a table which will approximately have 5 crore
entries after data harvesting.
Is it advisable to keep so much data in one table ?
That's 50,000,000 rows, right?
You should remember that words like lac and crore are not English words,
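For anyone unfamiliar with the unit: a crore is 10^7, so 5 crore =
5 x 10,000,000 = 50,000,000, which is where the figure above comes from.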
nair rajiv wrote:
> I found there is a table which will approximately have 5 crore
> entries after data harvesting.
> Is it advisable to keep so much data in one table ?
That's 50,000,000 rows, right? At this site, you're looking at a
non-partitioned table with more than seven times that if y
On Mon, Jan 25, 2010 at 10:53 PM, nair rajiv wrote:
> Hello,
>
> I am working on a project that will take out structured content
> from wikipedia
> and put it in our database. Before putting the data into the database I
> wrote a script to
> find out the number of rows every table would
Hello,
I am working on a project that will take out structured content
from wikipedia
and put it in our database. Before putting the data into the database I
wrote a script to
find out the number of rows every table would be having after the data is in
and I found
there is a table which will approximately have 5 crore entries after data
harvesting.
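The counting script itself never appears in the thread. As a hypothetical
sanity check after the load, the statistics collector's per-table estimates
give approximate row counts without a full COUNT(*) over a 50-million-row
table (standard PostgreSQL catalog view; run ANALYZE first so the numbers
are fresh):

    -- approximate live-row counts per table, largest first
    SELECT relname, n_live_tup
    FROM pg_stat_user_tables
    ORDER BY n_live_tup DESC;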