Since I got no feedback, I'll try asking differently:
Can anyone point me to any resources on how to run the project's tests?
Where can I find a good Docker image that would serve as a YARN cluster for
submitting jobs?
Thanks,
Shmuel
On Sun, Sep 16, 2018 at 10:09 PM Shmuel Bl
the tests on a Docker Linux container, with the
Spark build mounted from the host PC. Has anyone done this? Do you have a
recommended Docker image to work with?
4. Are there any special considerations I should think of?
Thanks,
Shmuel
--
Shmuel Blitz
Big Data Developer
Email: shmuel.bl...@simi
Singaraju <gautam.singar...@gmail.com> wrote:
> Hi all,
>
> Any suggestions on optimizing ML Pipeline inference in a webapp in a
> multi-tenant, low-latency mode?
>
> Suggestions would be appreciated.
>
> Thank you!
>
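Not an authoritative answer, but one pattern that helps with per-request latency is loading the fitted PipelineModel once at application startup and sharing it across tenants, rather than loading it (or building a session) per request. A minimal sketch in Scala; the model path, column names, and session settings are all assumptions:

```scala
import org.apache.spark.ml.PipelineModel
import org.apache.spark.sql.SparkSession

object InferenceService {
  // One SparkSession for the whole webapp; creating a session per
  // request would dominate the latency budget.
  lazy val spark: SparkSession = SparkSession.builder()
    .master("local[*]")
    .appName("ml-inference")
    .getOrCreate()

  // Load the fitted pipeline once; transform() can then be called
  // concurrently from request-handler threads.
  lazy val model: PipelineModel =
    PipelineModel.load("/models/my-pipeline") // hypothetical path

  def score(features: Seq[(Double, Double)]): Array[Double] = {
    import spark.implicits._
    val df = features.toDF("f1", "f2") // column names are assumptions
    model.transform(df).select("prediction").as[Double].collect()
  }
}
```

For hard latency SLAs, many teams avoid a per-request DataFrame round-trip entirely (for example by exporting the model to an external scoring format), since Spark's per-query overhead is typically tens of milliseconds.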
achieve this?
> >
> > Thank you in advance.
>
> --
> Apostolos N. Papadopoulos, Associate Professor
> Department of Informatics
> Aristotle University of Thessaloniki
> Thessaloniki, GREECE
> tel: ++0030312310991918
> email: papad...@csd.auth.gr
> twitter: @papadopoulos_ap
xpressions.GenericRowWithSchema cannot be
>> cast to scala.Tuple2
>>
>> Even the schema says that key is of type struct of (string, string).
>>
>> Any idea why this is happening?
>>
>>
>> Thanks
>>
>> Nikhil
>>
>>
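Hard to be sure without seeing the code, but this error usually means a struct-typed value was cast directly to a Scala tuple. At runtime a struct column is materialized as a Row (GenericRowWithSchema), even when its schema reads as (string, string), so it has to be unpacked field by field. A hedged sketch; `df` and the column names are assumptions:

```scala
import org.apache.spark.sql.Row

// df schema assumed: key struct<_1: string, _2: string>, value string
val pairs = df.rdd.map { row =>
  // row.getAs[(String, String)]("key") fails: the struct arrives as a
  // Row, not a Tuple2, hence the ClassCastException.
  val key = row.getAs[Row]("key")
  ((key.getString(0), key.getString(1)), row.getAs[String]("value"))
}
```

Alternatively, `df.as[((String, String), String)]` lets Spark's encoders perform the tuple conversion for you.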
easy to understand.
>
> --
> 1427357...@qq.com
>
>
> *From:* Shmuel Blitz
> *Date:* 2018-03-26 15:31
> *To:* 1427357...@qq.com
> *CC:* spark-users ; dev
> *Subject:* Re: the issue about the + in column,can we support the string
> please?
> Hi,
>
> you can get the sam
> s"(${ctx.javaType(dataType)})($eval1 $symbol $eval2)")
> case CalendarIntervalType =>
>   defineCodeGen(ctx, ev, (eval1, eval2) => s"$eval1.add($eval2)")
> case _ =>
>   defineCodeGen(ctx, ev, (eval1, eval2) => s"$eval1 $symbol $eval2")
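As the quoted codegen suggests, `+` on columns resolves to the arithmetic Add expression, so string operands get cast to numbers (yielding null for non-numeric values). For string columns the supported route is `concat`; a minimal sketch, assuming a SparkSession named `spark` is in scope:

```scala
import org.apache.spark.sql.functions.concat
import spark.implicits._

val df = Seq(("foo", "bar")).toDF("a", "b")

// "+" would attempt numeric addition here; concat joins the strings.
df.select(concat($"a", $"b").as("ab")).show()
// ab = "foobar"
```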
available for the job. 200 is the default value for
> spark.sql.shuffle.partitions. Alternatively, you could try increasing the
> value of spark.sql.shuffle.partitions to at least 750.
>
> thanks,
> rohitk
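For reference, this setting can be applied at several levels; a session-level conf overrides the cluster default in spark-defaults.conf. The 750 here is just the value floated above, not a general recommendation:

```scala
// Per session / per job (takes effect for subsequent shuffles):
spark.conf.set("spark.sql.shuffle.partitions", "750")

// Cluster-wide default, in conf/spark-defaults.conf:
//   spark.sql.shuffle.partitions   750

// Per submission:
//   spark-submit --conf spark.sql.shuffle.partitions=750 ...
```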
>
> On Sun, Mar 25, 2018 at 1:25 PM, Shmuel Blitz wrote:
>
Were you able to run on a cluster level or for a specific job?
>
> Did you configure it in spark-defaults.conf?
>
> On Sun, 25 Mar 2018 at 10:34 Shmuel Blitz
> wrote:
>
>> Just to let you know, I have managed to run SparkLens on our cluster.
>>
>> I switched
1.6.
>>
>> I tested it and it looks to be working, and now I'm testing the branch more
>> widely. Please use the branch for Spark 1.6.
>>
>> On Fri, Mar 23, 2018 at 12:43 AM, Shmuel Blitz <
>> shmuel.bl...@similarweb.com> wrote:
>>
>&
Hi Rohit,
Thanks for sharing this great tool.
I tried running a spark job with the tool, but it failed with an
*IncompatibleClassChangeError
*Exception.
I have opened an issue on GitHub
(https://github.com/qubole/sparklens/issues/1).
Shmuel
On Thu, Mar 22, 2018 at 5:05 PM, Shmuel Blitz
wrote
; applications and can be a useful guide on the path towards tuning
>>>>> applications for lower runtime or cost.
>>>>>
>>>>> Please clone from here: https://github.com/qubole/sparklens
>>>>> Old blogpost: https://www.qubole.com/blog/introduc
>>>>> \":\"string\",\"nullable\":true,\"metadata\":{\"name\":\"statecode\",\"scale\":0}},{\"name\":\"Socialid\",\"type\":\"string\",\"nullable\":true,\"meta
>>>>> \":6}},{\"name\":\"longitude\",\"type\":\"decimal(6,6)\",\"nullable\":true,\"metadata\":{\"name\":\"longitude\",\"scale\":6}},{\"name\":\"line\"
> >> thanks,
> >> Rohit Karlupia
> >>
> >>
> >
>
The other side of this problem is that in the source cluster, Spark reads
the table values without any problem and without the extra TBLPROPERTIES.
Why is this happening? How can it be fixed?
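Hard to diagnose without the DDL, but one way to narrow it down is to diff the table metadata between the two clusters from Spark itself; the table name below is a placeholder:

```scala
// Show the stored table properties on each cluster and compare them.
spark.sql("SHOW TBLPROPERTIES my_db.my_table").show(truncate = false)

// Full definition, including serde, schema, and properties:
spark.sql("SHOW CREATE TABLE my_db.my_table").show(truncate = false)
```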