Hello. I had the same question. What if I need to store 4-6 TB and run
queries? I can't find any clue in the documentation.
On 11.07.2015 at 03:28, "Mohammed Guller" <moham...@glassbeam.com> wrote:

>  Hi Ravi,
>
> First, neither Spark nor Spark SQL is a database. Both are compute
> engines, which need to be paired with a storage system. Second, they are
> designed for processing large distributed datasets. If you have only
> 100,000 records or even a million records, you don't need Spark. An RDBMS
> will perform much better for that volume of data.
>
>
>
> Mohammed
>
>
>
> *From:* Ravisankar Mani [mailto:rrav...@gmail.com]
> *Sent:* Friday, July 10, 2015 3:50 AM
> *To:* user@spark.apache.org
> *Subject:* Spark performance
>
>
>
> Hi everyone,
>
> I am planning to move from MS SQL Server to Spark. I am using around
> 50,000 to 100,000 (1 lakh) records.
>
>  Spark's performance is slow compared to MS SQL Server.
>
>
>
> Which is the better place (Spark or SQL) to store and retrieve around
> 50,000 to 100,000 records?
>
> regards,
>
> Ravi
>
>
>
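Mohammed's point that a single-node RDBMS easily handles this volume can be illustrated with a quick sketch (SQLite stands in here for MS SQL Server; timings are indicative only, not a benchmark of either system):

```python
# Illustration (not a benchmark): ~100,000 rows are trivial for a
# conventional RDBMS running on one machine, with no cluster needed.
# SQLite is used as a stand-in; MS SQL Server would behave similarly
# at this scale.
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany(
    "INSERT INTO records (val) VALUES (?)",
    (("row%d" % i,) for i in range(100_000)),
)
conn.commit()

start = time.perf_counter()
count, = conn.execute("SELECT COUNT(*) FROM records").fetchone()
elapsed = time.perf_counter() - start
print(count, elapsed)  # 100000 rows, counted in a small fraction of a second
conn.close()
```

By contrast, Spark's scheduling and JVM startup overhead alone will dwarf the actual query time at this scale, which is why it looks slow next to MS SQL Server on small tables.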
