For a similar problem where we wanted to preserve and track null entries,
we load the CSV as a DataSet[Array[Object]] and then transform it into a
DataSet[Row] using a custom RowSerializer
(https://gist.github.com/Shiti/d0572c089cc08654019c) which handles null.
The Table API (which supports null) can
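The per-record transform described above can be sketched in plain Java. This is only an illustration: the Row class below is a minimal stand-in for Flink's actual Row type, and in a real job this logic would live inside a MapFunction applied to the DataSet[Array[Object]].

```java
public class RowSketch {
    // Minimal stand-in for Flink's Row type (illustration only).
    static final class Row {
        private final Object[] fields;
        Row(int arity) { this.fields = new Object[arity]; }
        void setField(int i, Object v) { fields[i] = v; }
        Object productElement(int i) { return fields[i]; }
    }

    // Empty CSV cells typically arrive as empty strings; map them to null
    // so downstream code can track missing values explicitly.
    static Row toRow(Object[] record) {
        Row row = new Row(record.length);
        for (int i = 0; i < record.length; i++) {
            Object v = record[i];
            row.setField(i, "".equals(v) ? null : v);
        }
        return row;
    }

    public static void main(String[] args) {
        Row r = toRow(new Object[]{"Alice", "", 42});
        System.out.println(r.productElement(1) == null);
    }
}
```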
./bin/flink run /path/to/jar arguments
or
./bin/flink run -c MainClass /path/to/jar arguments
On Fri, Oct 23, 2015 at 5:50 PM, Stefano Bortoli
wrote:
What I normally do is to
java -cp MYUBERJAR.jar my.package.mainclass
does it make sense?
2015-10-23 17:22 GMT+02:00 Flavio Pompermaier :
could you write me the command please? I'm not in the office right now...
On 23 Oct 2015 17:10, "Maximilian Michels" wrote:
Could you try submitting the job from the command-line and see if it works?
Thanks,
Max
On Fri, Oct 23, 2015 at 4:42 PM, Flavio Pompermaier
wrote:
0.10-snapshot
On 23 Oct 2015 16:09, "Maximilian Michels" wrote:
Hi Guido,
This depends on your use case, but you may read those values as type String
and treat them accordingly.
Cheers,
Max
On Fri, Oct 23, 2015 at 1:59 PM, Guido wrote:
Hi Flavio,
Which version of Flink are you using?
Cheers,
Max
On Fri, Oct 23, 2015 at 2:45 PM, Flavio Pompermaier
wrote:
Hi Philip,
How about making the empty field of type String? Then you can read the CSV
into a DataSet and treat the empty string as a null value. Not very nice,
but it works as a workaround. As of now, Flink deliberately doesn't support null values.
Regards,
Max
On Thu, Oct 22, 2015 at 4:30 PM, Philip Lee wrote:
Hi to all,
I'm trying to run a job from the web interface but I get this error:
java.lang.RuntimeException: java.io.FileNotFoundException: JAR entry
core-site.xml not found in /tmp/webclient-jobs/EntitonsJsonizer.jar
at
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:
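The trace above means Hadoop's Configuration tried to load core-site.xml as an entry inside the submitted jar and did not find it. One way to check what a jar actually contains, sketched with only the JDK's java.util.zip; the temporary jar built below is just a stand-in for the real EntitonsJsonizer.jar:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class JarEntryCheck {
    // Returns true if the jar (a zip archive) contains the named entry,
    // which is essentially what Hadoop's Configuration looks up.
    static boolean containsEntry(File jar, String name) throws IOException {
        try (ZipFile zf = new ZipFile(jar)) {
            return zf.getEntry(name) != null;
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a throwaway jar containing core-site.xml, then verify it.
        File jar = File.createTempFile("demo", ".jar");
        try (ZipOutputStream out = new ZipOutputStream(new FileOutputStream(jar))) {
            out.putNextEntry(new ZipEntry("core-site.xml"));
            out.write("<configuration/>".getBytes("UTF-8"));
            out.closeEntry();
        }
        System.out.println(containsEntry(jar, "core-site.xml"));
        jar.delete();
    }
}
```

If the entry is missing from your uber-jar, the fix is usually on the build side: make sure the resource is packaged, or keep it out of the jar and on the classpath instead.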
Hello,
I would like to ask if there are any particular ways to read or treat null
values (e.g. Name, Lastname,, Age..) in a dataset using readCsvFile, without
being forced to ignore them.
Thanks for your time.
Guido
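The workaround Max suggests earlier in the thread (read every column as String, then map empty cells to null) can be sketched as follows. The parse helper is hypothetical; in a real job this logic would sit in a MapFunction applied after readCsvFile:

```java
public class NullCsv {
    // Split one CSV line and turn empty cells into nulls.
    // The limit -1 keeps trailing empty fields instead of dropping them.
    static String[] parse(String line) {
        String[] cells = line.split(",", -1);
        for (int i = 0; i < cells.length; i++) {
            if (cells[i].isEmpty()) {
                cells[i] = null;
            }
        }
        return cells;
    }

    public static void main(String[] args) {
        // The third field is empty, matching the "Name, Lastname,, Age" example.
        String[] r = parse("Name,Lastname,,Age");
        System.out.println(r[2] == null);
    }
}
```

Note this naive split does not handle quoted fields containing commas; it only illustrates the empty-string-to-null mapping.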
Hi Niels,
Thank you for your question. Flink relies entirely on the Kerberos
support of Hadoop. So your question could also be rephrased to "Does
Hadoop support long-term authentication using Kerberos?". And the
answer is: Yes!
While Hadoop uses Kerberos tickets to authenticate users with service
Hi Paul,
the key-based state should now be fixed in the current 0.10-SNAPSHOT builds if
you want to continue playing around with it.
Cheers,
Aljoscha
> On 21 Oct 2015, at 19:40, Aljoscha Krettek wrote:
>
> Hi Paul,
> good to hear that the windowing works for you.
>
> With the key based state I