Hello,
Is spark.driver.memory set per job, or is it shared across jobs? Should one do
load testing before setting it?
Thanks & regards
Arko
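For context: spark.driver.memory sizes the driver JVM of a single Spark
application, so all jobs submitted from that application share one driver
heap, while separate applications each get their own driver. A minimal sketch
of setting it per application (the 4g value and app name are illustrative; in
client mode the driver JVM is already running, so it is normally passed as
--driver-memory to spark-submit instead):

import org.apache.spark.sql.SparkSession

// Illustrative only: roughly equivalent to `spark-submit --driver-memory 4g ...`;
// setting it here takes effect mainly in cluster mode.
val spark = SparkSession.builder()
  .appName("driver-memory-example")
  .config("spark.driver.memory", "4g")   // per-application driver heap size
  .getOrCreate()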
On Sun, Mar 24, 2019 at 3:09 PM Pat Ferrel wrote:
>
> 2 Slaves, one of which is also Master.
>
> Node 1 & 2 are slaves. Node 1 is where I run start-all.sh.
>
> ...back as JValue (using something like parse())
>
> Thanks,
> Muthu
>
> On Wed, Sep 19, 2018 at 6:35 AM Arko Provo Mukherjee <
> arkoprovomukher...@gmail.com> wrote:
Hello Spark Gurus,
I am running into an issue with Encoding and wanted your help.
I have a case class with a JObject in it. Ex:
case class SomeClass(a: String, b: JObject)
I also have an encoder for this case class:
val encoder = Encoders.product[SomeClass]
Now I am creating a DataFrame ...
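For anyone hitting the same issue: Spark has no built-in encoder for json4s's
JObject, so Encoders.product on this case class typically fails. A minimal
sketch of the workaround hinted at in the fragment above, assuming json4s
(which Spark ships with): keep the JSON as a String inside the case class and
parse it back to a JValue only when needed. The class and field names here are
placeholders.

import org.apache.spark.sql.SparkSession
import org.json4s._
import org.json4s.jackson.JsonMethods.parse

// Hypothetical workaround: store the JSON payload as a String so the
// product encoder can be derived, then re-parse it to a JValue on demand.
case class SomeClassStr(a: String, b: String)

val spark = SparkSession.builder().appName("encoder-sketch").master("local[*]").getOrCreate()
import spark.implicits._

val ds = Seq(SomeClassStr("x", """{"k": 1}""")).toDS()  // encoder derived automatically
val back: JValue = parse(ds.head().b)                   // back to a JValue when needed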
> ...parameter (the IP address will be resolved from the supplied HOSTNAME)
>
> best,
> --Jakob
>
> On Mon, Feb 22, 2016 at 5:09 PM, Arko Provo Mukherjee
> wrote:
Hello,
I am running Spark on Windows.
I start up master as follows:
.\spark-class.cmd org.apache.spark.deploy.master.Master
I see that the SparkMaster doesn't start on 127.0.0.1 but starts on my
"actual" IP. This is troublesome for me, as I use the address in my code and
need to change it every time I restart.
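One way around this, as a sketch (assuming the standalone master started via
spark-class as above; 127.0.0.1 is just an example value): pass the bind host
explicitly instead of letting Spark resolve the machine's hostname.

.\spark-class.cmd org.apache.spark.deploy.master.Master --host 127.0.0.1

Workers and applications can then connect to spark://127.0.0.1:7077, so the
URL embedded in the code no longer changes with the machine's address. Setting
the SPARK_LOCAL_IP environment variable before starting Spark is another
commonly used route.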
9 PM, Ted Yu wrote:
> Cycling old bits:
>
> http://search-hadoop.com/m/q3RTtHrxMj2abwOk2
>
> On Fri, Feb 19, 2016 at 6:40 PM, Arko Provo Mukherjee
> wrote:
>>
>> Hi,
>>
>> Thanks for your response. Is there a similar link for Windows? I am
>> not sure
Thanks & regards
Arko
On Fri, Feb 19, 2016 at 6:35 PM, Ted Yu wrote:
> Please see https://spark.apache.org/docs/latest/spark-standalone.html
>
> On Fri, Feb 19, 2016 at 6:27 PM, Arko Provo Mukherjee
> wrote:
>>
>> Hi,
>>
>> Thanks for your response, that
...job via the program?
Thanks & regards
Arko
On Fri, Feb 19, 2016 at 6:01 PM, Holden Karau wrote:
> How are you trying to launch your application? Do you have the Spark jars on
> your class path?
>
>
> On Friday, February 19, 2016, Arko Provo Mukherjee
> wrote:
Hello,
I am trying to submit a spark job via a program.
When I run it, I receive the following error:
Exception in thread "Thread-1" java.lang.NoClassDefFoundError:
org/apache/spark/launcher/SparkLauncher
at Spark.SparkConnector.run(MySpark.scala:33)
at java.lang.Thread.run(Thread
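Following up on Holden's question above, this error usually just means the
spark-launcher classes are not on the classpath of the launching program. A
minimal sketch, assuming sbt; the version, paths, and class names below are
placeholders rather than anything from the original build:

// build.sbt -- illustrative dependency; match the version to your Spark installation
libraryDependencies += "org.apache.spark" %% "spark-launcher" % "1.6.0"

// A hypothetical launcher, roughly what MySpark.scala appears to attempt:
import org.apache.spark.launcher.SparkLauncher

val process = new SparkLauncher()
  .setSparkHome("C:\\spark")                        // local Spark installation
  .setAppResource("target\\scala-2.10\\myapp.jar")  // the jar to submit
  .setMainClass("com.example.Main")
  .setMaster("spark://127.0.0.1:7077")
  .launch()                                         // returns a java.lang.Process
process.waitFor()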
Hello,
I am trying to use sbt assembly to generate a fat JAR.
Here is my \project\assembly.sbt file:
resolvers += Resolver.url("bintray-sbt-plugins",
  url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.9")
However, ...
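Since the message is cut off here the actual error isn't visible, but for
reference, a sketch of a typical fat-jar setup for a Spark project; the plugin
version, Spark version, and merge strategy are illustrative, not taken from
the original build:

// project/assembly.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

// build.sbt -- mark Spark as provided and resolve META-INF collisions,
// two common stumbling blocks when assembling a Spark fat jar.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}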
>> ...reference that is
>> automatically added when a C# project is created in Visual Studio. As
>> Silvio pointed out below, it is a .NET assembly and not really used by
>> SparkCLR.
>>
>>
>>
>> From: Silvio Fiorito [mailto:silvio.fior...@granturing.com]
=+NET+on+Apache+Spark+
>
> On Tue, Feb 9, 2016 at 11:43 AM, Arko Provo Mukherjee <
> arkoprovomukher...@gmail.com> wrote:
Hello,
I want to use Spark (preferably Spark SQL) from C#. Does anyone have any
pointers for that?
Thanks & regards
Arko
Hello Spark Gurus,
I am trying to learn Spark. I am especially interested in GraphX.
Since Spark can be used in a streaming context as well, I wanted to know
whether it is possible to use the Spark Toolkits like GraphX or MLlib
in the streaming context?
Apologies if this is a stupid question, but I am ...
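It is possible: MLlib ships streaming-aware algorithms that consume DStreams
directly, and other Spark libraries can be applied inside each micro-batch. A
minimal sketch using StreamingKMeans, assuming the DStream-based streaming
API; the socket source on localhost:9999, the k=3 setting, and the
two-dimensional features are placeholders only:

import org.apache.spark.SparkConf
import org.apache.spark.mllib.clustering.StreamingKMeans
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("streaming-kmeans-sketch").setMaster("local[2]")
val ssc  = new StreamingContext(conf, Seconds(5))

// Placeholder source: whitespace-separated feature vectors arriving on a socket.
val trainingData = ssc.socketTextStream("localhost", 9999)
  .map(line => Vectors.dense(line.split(' ').map(_.toDouble)))

val model = new StreamingKMeans()
  .setK(3)
  .setDecayFactor(1.0)
  .setRandomCenters(2, 0.0, 42L)

model.trainOn(trainingData)            // the model is updated as each batch arrives
model.predictOn(trainingData).print()  // cluster assignments per batch

ssc.start()
ssc.awaitTermination()

GraphX has no streaming-specific API, but graphs can be built or updated from
each batch inside foreachRDD in much the same way.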