From: Mich Talebzadeh
Date: Monday, 12 October 2020 at 11:23 PM
To: Santosh74
Cc: "user @spark"
Subject: Re: Spark as computing engine vs spark cluster
Hi Santosh,
Generally speaking, there are two ways of making a process faster:
1. Do more intelligent work by creating indexes, cubes, etc., thus reducing
the processing time
2. Throw hardware and memory at it, using something like a Spark
multi-node cluster with a fully managed cloud service
Spark is a computation engine that runs on a set of distributed nodes. You
must "bring your own" hardware, although of course there are hosted
solutions available.
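To illustrate the "bring your own hardware" point: the same Spark application can be pointed either at the local machine or at a cluster you provision yourself, purely through configuration. A minimal sketch of a standalone-mode setup (the host name, port, and resource sizes below are assumptions for illustration, not from this thread):

```properties
# conf/spark-defaults.conf -- a sketch; host/port and sizes are hypothetical

# Spark as a pure compute engine on one machine (no cluster needed):
# spark.master          local[*]

# The same engine pointed at a standalone cluster you provisioned:
spark.master            spark://master-host:7077
spark.executor.memory   4g
spark.executor.cores    2
```

Hosted offerings manage the cluster side for you, but the engine itself is the same either way.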
On Sat, Oct 10, 2020 at 9:24 AM Santosh74 wrote:
> Is Spark a compute engine only, or is it also a cluster that comes with a
> set of hardware/nodes? What exactly is a Spark cluster?
Is Spark a compute engine only, or is it also a cluster that comes with a
set of hardware/nodes? What exactly is a Spark cluster?
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/