Re: Convert a line of String into column

2019-10-05 Thread Dhaval Modi
Hi, first convert "lines" to a DataFrame. You will get one column with the original string in one row. After this, use a string split on this column to convert it to an Array of String. Then you can use the explode function to turn each element of the array into its own row. On Wed 2 Oct, 2019, 03:18 , wrote:

Re: Stopping a Spark Streaming Context gracefully

2018-07-15 Thread Dhaval Modi
+1 Regards, Dhaval Modi dhavalmod...@gmail.com On 8 November 2017 at 00:06, Bryan Jeffrey wrote: > Hello. > > I am running Spark 2.1, Scala 2.11. We're running several Spark streaming > jobs. In some cases we restart these jobs on an occasional basis. We have > code

Re: Properly stop applications or jobs within the application

2018-07-15 Thread Dhaval Modi
@sagar - YARN kill is not a reliable process for Spark streaming. Regards, Dhaval Modi dhavalmod...@gmail.com On 8 March 2018 at 17:18, bsikander wrote: > I am running in Spark standalone mode. No YARN. > > Anyway, yarn application -kill is a manual process. I do not want that. I

Re: Stopping StreamingContext

2018-07-15 Thread Dhaval Modi
+1 Regards, Dhaval Modi dhavalmod...@gmail.com On 29 March 2018 at 19:57, Sidney Feiner wrote: > Hey, > > I have a Spark Streaming application processing some events. > > Sometimes, I want to stop the application if I get a specific event. I > collect the executor's re

How to stop streaming jobs

2018-07-15 Thread Dhaval Modi
What are the other ways to stop these jobs? Regards, Dhaval Modi dhavalmod...@gmail.com
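Not quoted in the truncated message above, but the mechanism usually suggested for this question: let Spark hook JVM shutdown and stop the StreamingContext gracefully on SIGTERM via `spark.streaming.stopGracefullyOnShutdown`. A config sketch (the other submit arguments are elided):

```shell
spark-submit \
  --conf spark.streaming.stopGracefullyOnShutdown=true \
  ...
```

With this set, a plain `kill <driver-pid>` (SIGTERM) lets in-flight batches complete instead of dropping them.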

Re: Advice on multiple streaming job

2018-05-07 Thread Dhaval Modi
JSON messages need to be flattened and stored in Hive. For these hundreds of topics, we currently have hundreds of jobs running independently, each using a different UI port. Regards, Dhaval Modi dhavalmod...@gmail.com On 7 May 2018 at 13:53, Gerard Maas wrote: > Dhaval, > > Whi

Re: Advice on multiple streaming job

2018-05-06 Thread Dhaval Modi
r >> other mechanisms (mesos, kubernetes, yarn). The CNI will allow you to always >> bind on the same ports because each container will have its own IP. Some >> other solutions like mesos and marathon can work without CNI, with host IP >> binding, but will manage the ports fo

Re: Advice on multiple streaming job

2018-05-06 Thread Dhaval Modi
't any > conflict. > > On Sat, 5 May 2018 at 17:10, Dhaval Modi wrote: > >> Hi All, >> >> Need advice on executing multiple streaming jobs. >> >> Problem:- We have hundreds of streaming jobs. Every streaming job uses a new >> port. Also, S
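The UI-port conflict described in this thread comes from each SparkContext binding its web UI to `spark.ui.port` (default 4040); on a conflict, Spark retries successive ports up to `spark.port.maxRetries` times. A config sketch (values illustrative):

```shell
spark-submit \
  --conf spark.ui.port=4040 \
  --conf spark.port.maxRetries=200 \
  ...

# Or, when running hundreds of jobs, disable the per-job UI entirely:
#   --conf spark.ui.enabled=false
```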

Advice on multiple streaming job

2018-05-05 Thread Dhaval Modi
this situation? Or am I missing anything? Thanking you in advance. Regards, Dhaval Modi dhavalmod...@gmail.com

Re: How to set NameSpace while storing from Spark to HBase using saveAsNewAPIHadoopDataSet

2016-12-19 Thread Dhaval Modi
Replace with ":" Regards, Dhaval Modi On 19 December 2016 at 13:10, Rabin Banerjee wrote: > HI All, > > I am trying to save data from Spark into HBase using the saveHadoopDataSet > API. Please refer to the below code. Code is working fine. But the table is > gett
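The truncated reply above suggests a ":" separator: in HBase, a table in a namespace is addressed as `<namespace>:<table>` in the table name handed to TableOutputFormat. A hedged sketch of the job configuration (the ZooKeeper host and table/namespace names are illustrative; the remaining output-format settings from the original code are elided):

```python
# Hadoop job configuration for rdd.saveAsNewAPIHadoopDataset(conf).
conf = {
    "hbase.zookeeper.quorum": "zk-host",  # illustrative
    # The namespace is encoded in the table name, separated by ":".
    "hbase.mapred.outputtable": "my_namespace:my_table",
    # ... remaining TableOutputFormat settings as in the thread's code ...
}
```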

Re: This simple UDF is not working!

2016-03-25 Thread Dhaval Modi
olumnName", toDate(src(s"$columnName"), lit(s"dd/MM/"))) Regards, Dhaval Modi dhavalmod...@gmail.com On 26 March 2016 at 04:34, Mich Talebzadeh wrote: > Hi Ted, > > I decided to take a short cut here. I created the map leaving date as it > is (p(1)) below

Re: What is the most efficient and scalable way to get all the recommendation results from ALS model ?

2016-03-20 Thread Dhaval Modi
+1 On Mar 21, 2016 09:52, "Hiroyuki Yamada" wrote: > Could anyone give me some advices or recommendations or usual ways to do > this ? > > I am trying to get all (probably top 100) product recommendations for each > user from a model (MatrixFactorizationModel), > but I haven't figured out yet to

Re: Spark SQL drops the HIVE table in "overwrite" mode while writing into table

2016-03-06 Thread Dhaval Modi
are trying to overwrite an external table or > temporary table created in hivecontext? > > > Regards, > Gourav Sengupta > > On Sat, Mar 5, 2016 at 3:01 PM, Dhaval Modi > wrote: > >> Hi Team, >> >> I am facing an issue while writing a dataframe back to

Spark SQL drops the HIVE table in "overwrite" mode while writing into table

2016-03-05 Thread Dhaval Modi
/tgt_table> at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122) at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114) Regards, Dhaval Modi dhavalmod...@gmail.com