Hi,
First convert "lines" to a DataFrame. You will get one column with the original
string in each row.
After this, use a string split on that column to convert it to an array of strings.
Then you can use the explode function to turn each element of the array into its
own row.
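The steps above can be sketched as follows; the column name "value" and the comma delimiter are assumptions for illustration:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{explode, split}

object SplitExplodeSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("split-explode")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Step 1: one original string per row, in a single column.
    val lines = Seq("a,b,c", "d,e").toDF("value")

    // Step 2: split each string into an array of strings.
    val arrays = lines.select(split($"value", ",").as("items"))

    // Step 3: explode yields one row per array element.
    val rows = arrays.select(explode($"items").as("item"))
    rows.show()

    spark.stop()
  }
}
```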
On Wed 2 Oct, 2019, 03:18 , wrote:
+1
Regards,
Dhaval Modi
dhavalmod...@gmail.com
On 8 November 2017 at 00:06, Bryan Jeffrey wrote:
> Hello.
>
> I am running Spark 2.1, Scala 2.11. We're running several Spark streaming
> jobs. In some cases we restart these jobs on an occasional basis. We have
> code
@sagar - "yarn application -kill" is not a reliable way to stop a Spark Streaming job.
Regards,
Dhaval Modi
dhavalmod...@gmail.com
On 8 March 2018 at 17:18, bsikander wrote:
> I am running in Spark standalone mode. No YARN.
>
> Anyway, yarn application -kill is a manual process. I do not want that. I
+1
Regards,
Dhaval Modi
dhavalmod...@gmail.com
On 29 March 2018 at 19:57, Sidney Feiner wrote:
> Hey,
>
> I have a Spark Streaming application processing some events.
>
> Sometimes, I want to stop the application if I get a specific event. I
> collect the executor's re
What are the other ways to stop these jobs?
Regards,
Dhaval Modi
dhavalmod...@gmail.com
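For the "other ways to stop these jobs" question above, one commonly used alternative to an external kill is a graceful shutdown from inside the application. A minimal sketch, assuming Spark Streaming (DStreams); the app name and local master are placeholders:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object GracefulStopSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("graceful-stop")
      .setMaster("local[2]")
      // Drain in-flight batches when the JVM receives a normal shutdown
      // signal, instead of killing receivers mid-batch.
      .set("spark.streaming.stopGracefullyOnShutdown", "true")

    val ssc = new StreamingContext(conf, Seconds(10))

    // ... define sources and the streaming computation here ...

    ssc.start()
    ssc.awaitTermination()
    // An alternative pattern: watch for an external marker (e.g. an HDFS
    // file) from a monitor thread and call
    //   ssc.stop(stopSparkContext = true, stopGracefully = true)
    // when it appears.
  }
}
```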
JSON messages need to be flattened and stored in Hive.
For these hundreds of topics, we currently have hundreds of jobs running
independently, each using a different UI port.
Regards,
Dhaval Modi
dhavalmod...@gmail.com
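As a sketch of the flattening step described above: nested JSON fields can be promoted to top-level columns before writing to Hive. The payload shape, field names, and table name below are hypothetical, and the Dataset[String] overload of json() assumes Spark 2.2+:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object FlattenJsonSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("flatten-json")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical nested payloads, one JSON document per Kafka message.
    val raw = Seq("""{"id":1,"user":{"name":"a","city":"x"}}""").toDS()
    val parsed = spark.read.json(raw) // schema is inferred, nested struct kept

    // Promote nested fields to top-level columns so Hive sees a flat schema.
    val flat = parsed.select(
      col("id"),
      col("user.name").as("user_name"),
      col("user.city").as("user_city"))

    // With Hive support enabled on the session:
    // flat.write.mode("overwrite").saveAsTable("db.flat_events")
    flat.show()
    spark.stop()
  }
}
```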
On 7 May 2018 at 13:53, Gerard Maas wrote:
> Dhaval,
>
> Whi
r
>> other mechanisms (Mesos, Kubernetes, YARN). The CNI will allow us to always
>> bind on the same ports because each container will have its own IP. Some
>> other solution like mesos and marathon can work without CNI , with host IP
>> binding, but will manage the ports fo
't any
> conflict.
>
> On Sat, 5 May 2018 at 17:10, Dhaval Modi wrote:
>
>> Hi All,
>>
>> Need advice on executing multiple streaming jobs.
>>
>> Problem: we have hundreds of streaming jobs, and every streaming job uses a
>> new port. Also, S
this situation? or Am I missing any thing?
Thanking you in advance.
Regards,
Dhaval Modi
dhavalmod...@gmail.com
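On the port problem above: each Spark application's UI defaults to port 4040 and probes upward on conflict, so with hundreds of jobs on one host it can be easier to let each job bind an ephemeral port. A configuration sketch (the app name is a placeholder):

```scala
import org.apache.spark.SparkConf

// Setting spark.ui.port to 0 asks Spark to bind any free ephemeral port,
// so hundreds of concurrent jobs do not collide on 4040, 4041, ...
// spark.port.maxRetries gives extra headroom where fixed ports are used.
val conf = new SparkConf()
  .setAppName("stream-job")
  .set("spark.ui.port", "0")
  .set("spark.port.maxRetries", "100")
```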
Replace with ":"
Regards,
Dhaval Modi
On 19 December 2016 at 13:10, Rabin Banerjee
wrote:
> HI All,
>
> I am trying to save data from Spark into HBase using the saveAsHadoopDataset
> API. Please refer to the code below. The code is working fine, but the table is
> gett
olumnName", toDate(src(s"$columnName"),
lit(s"dd/MM/")))
Regards,
Dhaval Modi
dhavalmod...@gmail.com
On 26 March 2016 at 04:34, Mich Talebzadeh
wrote:
> Hi Ted,
>
> I decided to take a short cut here. I created the map leaving date as it
> is (p(1)) below
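The truncated toDate(...) call above appears to be a user-defined helper, and its "dd/MM/" format string was cut off. With the built-in functions, parsing a day/month/year string column can be sketched like this; the column names and the full "dd/MM/yyyy" pattern are assumptions:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, to_date, unix_timestamp}

object DateParseSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("date-parse")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical source column holding dates as "dd/MM/yyyy" strings.
    val src = Seq("05/03/2016", "26/03/2016").toDF("event_date_str")

    // Works on Spark 2.x: parse to a timestamp with an explicit pattern,
    // then narrow to a date. (Spark 2.2+ also offers to_date(col, fmt).)
    val parsed = src.withColumn("event_date",
      to_date(unix_timestamp(col("event_date_str"), "dd/MM/yyyy")
        .cast("timestamp")))
    parsed.show()
    spark.stop()
  }
}
```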
+1
On Mar 21, 2016 09:52, "Hiroyuki Yamada" wrote:
> Could anyone give me some advice, recommendations, or usual ways to do
> this?
>
> I am trying to get all (probably top 100) product recommendations for each
> user from a model (MatrixFactorizationModel),
> but I haven't figured out yet to
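On the top-100 question above: MatrixFactorizationModel has a batch API, recommendProductsForUsers(num), that returns the top num products for every user in one call rather than looping recommendProducts(user, n) per user. A small sketch with synthetic ratings; the data and hyperparameters are placeholders (use 100 instead of 2 for top-100):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.recommendation.{ALS, Rating}

object TopNRecsSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("top-n-recs").setMaster("local[*]"))

    // Tiny synthetic ratings just to have a model; real data replaces this.
    val ratings = sc.parallelize(Seq(
      Rating(1, 10, 5.0), Rating(1, 11, 3.0),
      Rating(2, 10, 4.0), Rating(2, 12, 1.0)))
    val model = ALS.train(ratings, /* rank = */ 5, /* iterations = */ 10)

    // Batch API: top-N products for every user at once.
    val topN = model.recommendProductsForUsers(2) // RDD[(userId, Array[Rating])]
    topN.collect().foreach { case (user, recs) =>
      println(s"user $user -> ${recs.map(_.product).mkString(",")}")
    }
    sc.stop()
  }
}
```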
are trying to overwrite an external table or
> temporary table created in HiveContext?
>
>
> Regards,
> Gourav Sengupta
>
> On Sat, Mar 5, 2016 at 3:01 PM, Dhaval Modi
> wrote:
>
>> Hi Team,
>>
>> I am facing an issue while writing a DataFrame back to
/tgt_table>*
at
org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1122)
at
org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
++++++
Regards,
Dhaval Modi
dhavalmod...@gmail.com
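The write failure above is consistent with a common pitfall: Spark will not overwrite a Hive table that the same job is also reading from, since the underlying files would be deleted mid-scan. One workaround, sketched with hypothetical table and column names, is to stage the result first:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object OverwriteViaStagingSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("overwrite-via-staging")
      .enableHiveSupport()
      .getOrCreate()

    // Reading and overwriting the same table in one plan fails; so first
    // materialize the transformed data into a staging table...
    val transformed = spark.table("db.tgt_table")
      .filter("event_date >= '2016-01-01'")
    transformed.write.mode(SaveMode.Overwrite).saveAsTable("db.tgt_table_stg")

    // ...then overwrite the target from the staging copy in a separate step.
    spark.table("db.tgt_table_stg")
      .write.mode(SaveMode.Overwrite).saveAsTable("db.tgt_table")

    spark.stop()
  }
}
```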