>
> Hi All,
>
> Note:- (This is a full time position)
>
> I copied Geoff on this email chain; he is actively looking for a Spark/Scala
> Developer for a full-time position with DataBricks, 200k+
> (Salary/Bonus/Stocks/H1B/GC process).
>
> If you know anyone, or any of your friends who are good at Spark/Scala e
> Hi Flink Experts,
>
I am trying to read an S3 file from IntelliJ using Flink, and I am coming
> across an AWS auth error. Can someone help? All the details are below.
>
> I have AWS credentials in my home folder (.aws/credentials).
>
My Intellij Environment Variables:-
> ENABLE_BUILT_IN_PLUGINS=flink-s3-fs
Here is my Intellij question.
https://stackoverflow.com/questions/66536868/flink-aws-s3-access-issue-intellij-idea?noredirect=1#comment117626682_66536868
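For context, a minimal sketch of the sort of local run being attempted (not my exact code; the bucket and key below are placeholders, flink-s3-fs-hadoop is assumed to be on the IDE classpath as an ordinary dependency, and the credentials could instead come from the default AWS chain):

import org.apache.flink.configuration.Configuration
import org.apache.flink.core.fs.FileSystem
import org.apache.flink.streaming.api.scala._

object S3ReadFromIde {
  def main(args: Array[String]): Unit = {
    // Register the S3 filesystem before the job starts; flink-s3-fs-hadoop
    // understands s3.access-key / s3.secret-key (or falls back to the usual
    // AWS credential sources).
    val conf = new Configuration()
    conf.setString("s3.access-key", "<ACCESS_KEY>")
    conf.setString("s3.secret-key", "<SECRET_KEY>")
    FileSystem.initialize(conf)

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.readTextFile("s3://my-bucket/some/prefix/file.csv").print()
    env.execute("read-s3-from-intellij")
  }
}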
On Mon, Mar 8, 2021 at 11:22 AM sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote:
>
> Hi Flink Experts,
>>
my stack overflow question.
https://stackoverflow.com/questions/66536868/flink-aws-s3-access-issue-intellij-idea?noredirect=1#comment117626682_66536868
On Tue, Mar 9, 2021 at 11:28 AM sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote:
> Here is my Intellij question.
Sri
On Tue, Mar 9, 2021 at 11:30 AM sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote:
> my stack overflow question.
>
>
> https://stackoverflow.com/questions/66536868/flink-aws-s3-access-issue-intellij-idea?noredirect=1#comment117626682_66536868
>
> On Tue, Mar
The workaround was
> to build my own version of the processor API and include the missing part.
>
> Med venlig hilsen / Best regards
> Lasse Nedergaard
>
>
> On 10 Mar 2021, at 17:33, sri hari kali charan Tummala wrote:
>
>
> Flink,
>
> I am able to access Kin
Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see
ConfigConstants.ENV_FLINK_LIB_DIR. Will this work?
Thanks
Sri
On Wed, Mar 10, 2021 at 1:23 PM sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote:
> I am not getting what you both are talking about
Let's close this issue, guys; please answer my questions. I am using Flink
1.8.1.
Thanks
Sri
On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote:
> Also, I don't see ConfigConstants.ENV_FLINK_PLUGINS_DIR; I only see
> ConfigConstants.
>
> On 3/12/2021 4:33 AM, sri hari kali charan Tummala wrote:
>
> Let's close this issue, guys; please answer my questions. I am using Flink
> 1.8.1.
>
> Thanks
> Sri
>
> On Wed, 10 Mar 2021 at 13:25, sri hari kali charan Tummala <
> ka
If anyone has working Flink 1.8.1 code that reads S3 from IntelliJ in a
public GitHub repo, please pass it on; that would be a huge help.
Thanks
Sri
On Fri, 12 Mar 2021 at 08:08, sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote:
> Which I already did in my pin still its no
directory that the file resides in.
>
> On 3/12/2021 5:10 PM, sri hari kali charan Tummala wrote:
>
> If anyone has working Flink 1.8.1 code that reads S3 from IntelliJ in a
> public GitHub repo, please pass it on; that would be a huge help.
>
>
> Thanks
> Sri
>
> On Fri, 12
Hi Flink Experts,
How do I achieve at-least-once semantics with FlinkDynamoDBStreamsConsumer +
DynamoDB Streams? Do Flink checkpointing or savepoints do the job?
My scenario:
The Flink application uses FlinkDynamoDBStreamsConsumer, which reads the latest
changes from DynamoDB Streams, but if my software fail
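To make the scenario concrete, this is roughly the setup (the region, checkpoint interval and stream ARN below are placeholders); my understanding is that enabling checkpointing is what lets the consumer resume from the stored shard positions instead of LATEST after a restart:

import java.util.Properties

import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kinesis.FlinkDynamoDBStreamsConsumer
import org.apache.flink.streaming.connectors.kinesis.config.{AWSConfigConstants, ConsumerConfigConstants}

object DynamoStreamsJob {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    // Shard sequence numbers are stored in checkpoints, so on recovery the
    // consumer continues from where it left off rather than from LATEST.
    env.enableCheckpointing(60000L)

    val props = new Properties()
    props.setProperty(AWSConfigConstants.AWS_REGION, "us-east-1")
    props.setProperty(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "LATEST")

    val source = new FlinkDynamoDBStreamsConsumer[String](
      "arn:aws:dynamodb:...:table/my-table/stream/...", // placeholder stream ARN
      new SimpleStringSchema(), props)

    env.addSource(source).print()
    env.execute("dynamodb-streams-consumer")
  }
}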
Ok, no problem.
On Wed, Aug 21, 2019 at 12:22 AM Pei HE wrote:
> Thanks Kali for the information. However, it doesn't work for me, because
> I need features in Flink 1.7.x or later and use managed Amazon MSK.
> --
> Pei
>
>
>
> On Tue, Aug 20, 2019 at 7:17 PM
AWS already has an auto-scaling Flink cluster; it's called Kinesis Data
Analytics. Just add your Flink JAR to Kinesis SQL Analytics, that's all; AWS
will auto-provision a Flink cluster and do the admin part for you.
On Saturday, September 28, 2019, David Anderson wrote:
> I believe there can be advanta
check this.
https://github.com/kali786516/FlinkStreamAndSql/blob/b8bcbadaa3cb6bfdae891f10ad1205e256adbc1e/src/main/scala/com/aws/examples/dynamodb/dynStreams/FlinkDynamoDBStreams.scala#L42
https://github.com/kali786516/FlinkStreamAndSql/blob/b8bcbadaa3cb6bfdae891f10ad1205e256adbc1e/src/main/scala
Hi All,
I have a question: has anyone compared the performance of a Flink batch job
writing to S3 vs. Spark writing to S3?
--
Thanks & Regards
Sri Tummala
ing on Java and using the same set of libraries.
> Of course, if one system has a very specific optimization for your use
> case, that could be much faster.
>
>
> On Mon, Feb 24, 2020 at 11:26 PM sri hari kali charan Tummala <
> kali.tumm...@gmail.com> wrote:
>
>> Hi A
Ok, thanks for the clarification.
On Wed, Feb 26, 2020 at 9:22 AM Arvid Heise wrote:
> Exactly. We use the hadoop-fs as an indirection on top of that, but Spark
> probably does the same.
>
> On Wed, Feb 26, 2020 at 3:52 PM sri hari kali charan Tummala <
> kali.tumm..
Sorry for being lazy; I should have gone through the Flink source code.
On Wed, Feb 26, 2020 at 9:35 AM sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote:
> Ok, thanks for the clarification.
>
> On Wed, Feb 26, 2020 at 9:22 AM Arvid Heise wrote:
>
>> Exactly. We
Yes Scala is the best.
On Fri, Jan 28, 2022, 9:57 AM Nicolás Ferrario
wrote:
> Hi Seb.
>
> In my team we are migrating things to Kotlin because we find it much
> easier to deal with. It's like the best of both worlds, but you do give up
> on Flink Scala serializers, since the only way to get Kot
Hi All,
I am trying to read data from a Kinesis stream, apply an SQL
transformation (distinct), and then write to a CSV sink, which is
failing due to this issue (org.apache.flink.table.api.TableException:
AppendStreamTableSink requires that Table has only insert changes.). Full
code is he
Hi ,
I am trying to write a Flink table to a streaming sink; it fails at casting Java
to Scala or Scala to Java. It fails at the step below; can anyone help me out
with this error?
val sink2:SinkFunction[Row] = StreamingFileSink.forRowFormat(new
Path("/Users/kalit_000/Downloads/FlinkStreamAndSql/sr
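A minimal sketch of how the row-format sink can be built without any Java/Scala casting (the output directory is a placeholder; SimpleStringEncoder simply writes Row.toString per record):

import org.apache.flink.api.common.serialization.SimpleStringEncoder
import org.apache.flink.core.fs.Path
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink
import org.apache.flink.types.Row

// StreamingFileSink is itself a SinkFunction[Row], so it can be passed to addSink.
val sink2: StreamingFileSink[Row] = StreamingFileSink
  .forRowFormat(new Path("/tmp/flink-rows"), new SimpleStringEncoder[Row]("UTF-8"))
  .build()

// someRowStream.addSink(sink2)   // someRowStream: DataStream[Row]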
the exception you mentioned occurs.
>
> However, you can use `toAppendStream` method to change the retractable
> stream to an append only stream. For example,
> `tEnv.sqlQuery(query).distinct().toAppendStream[Row]` and then you can get
> an append only stream. You can then add csv sink to this stream.
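A rough sketch of that suggestion in Scala (environment setup, table name, and output path are placeholders; on 1.8.x the table environment would be created with TableEnvironment.getTableEnvironment(env) rather than StreamTableEnvironment.create, if I recall correctly):

import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.scala._
import org.apache.flink.types.Row

val env  = StreamExecutionEnvironment.getExecutionEnvironment
val tEnv = StreamTableEnvironment.create(env) // 1.9+; 1.8.x: TableEnvironment.getTableEnvironment(env)

// "kinesis_source" stands in for whatever table is registered on tEnv.
val appendOnly: DataStream[Row] =
  tEnv.sqlQuery("SELECT col1, col2 FROM kinesis_source")
    .distinct()
    .toAppendStream[Row]

// Attach the file sink to the DataStream instead of registering an
// AppendStreamTableSink on the retractable table.
appendOnly.map(_.toString).writeAsText("/tmp/distinct-out") // placeholder path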
sink is an append only sink (it's hard to update what has
> been written in the middle of a file), the exception you mentioned occurs.
>
> However, you can use `toAppendStream` method to change the retractable
> stream to an append only stream. For example,
> `tEnv.sqlQuery(query).d
uery(query).distinct().toAppendStream[Row]` and then you can get
> an append only stream. You can then add csv sink to this stream.
>
> sri hari kali charan Tummala wrote on Tue, Jul 16, 2019 at 3:32 AM:
> Hi All,
>
> I am trying to read data from kinesis stream and applying SQL
executing the
> erroneous step? Please print them here so that we can investigate the
> problem.
>
> sri hari kali charan Tummala wrote on Tue, Jul 16, 2019 at 4:49 AM:
>
>> Hi ,
>>
>> I am trying to write flink table to streaming Sink it fails at casting
>> Java to Scala or
>
>
> Hi All,
>
> I am trying to convert SQL query result values to distinct and write to
> CSV, which is failing with the below error.
>
> *Exception in thread "main" org.apache.flink.table.api.TableException:
> Only tables that originate from Scala DataStreams can be converted to Scala
> DataStreams
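If it helps narrow this down: that particular exception usually shows up when the table environment and the conversion come from different (Java vs Scala) APIs. A minimal all-Scala sketch, with a made-up registered table name:

import org.apache.flink.streaming.api.scala._   // Scala DataStream API
import org.apache.flink.table.api.scala._       // Scala bridge: toAppendStream / toRetractStream
import org.apache.flink.types.Row

val env  = StreamExecutionEnvironment.getExecutionEnvironment
val tEnv = StreamTableEnvironment.create(env)   // the Scala StreamTableEnvironment, not the Java one

// "my_source" is assumed to be registered on this same tEnv.
val result = tEnv.sqlQuery("SELECT DISTINCT col1 FROM my_source")
val stream: DataStream[(Boolean, Row)] = result.toRetractStream[Row]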
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>dynamodb-streams-kinesis-adapter</artifactId>
    <version>1.4.0</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk</artifactId>
    <version>1.11.579</version>
</dependency>
On Tue, Jul 16, 2019 at 11:00 AM sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote
windows for question 1 or question 2 or both ?
Thanks
Sri
On Tue, Jul 16, 2019 at 12:25 PM taher koitawala wrote:
> Looks like you need a window
>
> On Tue, Jul 16, 2019, 9:24 PM sri hari kali charan Tummala <
> kali.tumm...@gmail.com> wrote:
>
>> Hi All,
Hi All,
I am trying to convert a Tuple2[Boolean, Row] to Row using a map function; I am
getting this error asking me for InferedR. What is InferedR in Flink?
val mymapFunction: MapFunction[tuple.Tuple2[Boolean, Row],AnyVal] =
new MapFunction[tuple.Tuple2[Boolean, Row],AnyVal]() {
overrid
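In case it helps, a sketch of the same mapping written against the Scala retract stream, with the result type pinned to Row and its TypeInformation given explicitly (the field types below are made up); InferedR appears to be the result-type parameter that map() could not derive for AnyVal:

import org.apache.flink.api.common.typeinfo.{TypeInformation, Types}
import org.apache.flink.api.java.typeutils.RowTypeInfo
import org.apache.flink.streaming.api.scala._
import org.apache.flink.types.Row

def keepInserts(retracts: DataStream[(Boolean, Row)]): DataStream[Row] = {
  // Tell Flink what the Row actually contains; Row field types cannot be inferred.
  implicit val rowInfo: TypeInformation[Row] = new RowTypeInfo(Types.STRING, Types.INT)
  retracts
    .filter(_._1)   // true = insert, false = retraction
    .map(_._2)      // picks up rowInfo as the result TypeInformation
}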
ronment. Maybe there is
> something wrong with the source data?
>
> Best, Hequn
>
> On Wed, Jul 17, 2019 at 12:53 AM sri hari kali charan Tummala <
> kali.tumm...@gmail.com> wrote:
>
>> windows for question 1 or question 2 or both ?
>>
>> Thanks
>> Sr
ds.writeAsCsv("/Users/kalit_000/Downloads/FlinkStreamAndSql/src/main/resources/csvOut8",
FileSystem.WriteMode.NO_OVERWRITE, "\n", "|")
AndSql/src/main/resources/csvOut125",
FileSystem.WriteMode.OVERWRITE)
On Wed, Jul 17, 2019 at 6:47 PM sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote:
> Amazing, all issues resolved in one go, thanks Cheng. One issue though: I
> can't write map(_._2) to CSV, looks
Try a CTE (common table expression) if it is supported, or a SQL subquery.
On Fri, Jul 26, 2019 at 1:00 PM Fanbin Bu wrote:
> How about moving the query db filter to the outer select?
>
> On Fri, Jul 26, 2019 at 9:31 AM Tony Wei wrote:
>
>> Hi,
>>
>> If I have multiple where conditions in my SQL, is it possibl
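For illustration, the subquery shape suggested above would look roughly like this (table and column names are invented); a WITH/CTE version would factor the inner query out the same way, if the SQL dialect in use accepts it:

// tEnv: a StreamTableEnvironment with a "transactions" table registered (assumed).
// One filter stays in the inner query, the other moves to the outer select.
val result = tEnv.sqlQuery(
  """
    |SELECT user_id, amount
    |FROM (
    |  SELECT user_id, amount
    |  FROM transactions
    |  WHERE amount > 100
    |) t
    |WHERE user_id <> 'test'
  """.stripMargin)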
Hi Folks,
Is anyone hiring for Flink or Scala Akka contract corp-to-corp positions?
I am on the market looking for work in the Scala Spark, Flink Scala, or Scala
Akka world.
Thanks
Sri
API is not in good shape. Would you like to
> start from there?
>
> Best regards,
> Jing
>
> On Fri, Jun 3, 2022 at 4:29 PM sri hari kali charan Tummala <
> kali.tumm...@gmail.com> wrote:
>
>> Hi Folks,
>>
>> Is anyone hiring for Flink or Scala Akka
Hi Jing,
Please add me kali.tumm...@gmail.com.
Thanks
Sri
On Sat, Jun 4, 2022 at 4:47 PM Jing Ge wrote:
> Hi Santhosh,
>
> just invited you. Please check your email. Looking forward to knowing your
> story! Thanks!
>
> To anyone else who wants to join, please send an email to
> user@flink.apac
Hi Flink Community,
I want to go through the Flink source code in my free time. Is there a document
that I can go through that explains where to start? Other than the Javadoc, is
there anything else to start my reverse engineering?
Thanks & Regards
Sri Tummala
s.apache.org/flink/flink-docs-master/
>
> Best regards,
> Jing
>
> On Mon, Jun 6, 2022 at 3:39 AM sri hari kali charan Tummala <
> kali.tumm...@gmail.com> wrote:
>
>> Hi Flink Community,
>>
>> I want to go through flink source code in my free time is ther
Hi Flink Community,
Can someone point me to good config-driven Flink data movement tool
GitHub repos? Imagine I build my ETL DAG connecting source -->
transformations --> target just using a config file.
Below are a few Spark examples:
https://github.com/mvrpl/big-shipper
https://github.com/Bi
to fit your requirements.
>
> Best,
> Austin
>
> [1]: https://seatunnel.apache.org/
>
> On Tue, Jun 7, 2022 at 2:19 PM sri hari kali charan Tummala <
> kali.tumm...@gmail.com> wrote:
>
>> Hi Flink Community,
>>
>> can someone point me to a good config-d
e-operator
>
> On Tue, Jun 7, 2022 at 3:11 PM sri hari kali charan Tummala <
> kali.tumm...@gmail.com> wrote:
>
>> Thanks, but that looks like a Spark tool; is there something similar in Flink?
>>
>> Thanks
>> Sri
>>
>> On Tue, Jun 7, 2022 at 12:0
> https://github.com/datakaveri/iudx-adaptor-framework
>
> On Tue, 7 Jun 2022 at 23:49, sri hari kali charan Tummala <
> kali.tumm...@gmail.com> wrote:
>
>> Hi Flink Community,
>>
>> can someone point me to a good config-driven flink data moveme
working on such a thing.
>> It's in early stages and needs a lot more work.
>> I'm open to collaborating.
>> https://github.com/datakaveri/iudx-adaptor-framework
>>
>> On Tue, 7 Jun 2022 at 23:49, sri hari kali charan Tummala <
>> kali.tumm...@gmail.com
This looks interesting: the NASA Akka project https://github.com/NASARace/race
On Mon, Jun 20, 2022 at 7:06 PM sri hari kali charan Tummala <
kali.tumm...@gmail.com> wrote:
> found one more flink tool
> https://www.splunk.com/en_us/products/stream-processing.html
>
> On Wed, Jun 1
Hi Flink Users/ Spark Users,
Is anyone hiring for contract corp-to-corp Big Data Spark Scala or Flink Scala
roles?
Thanks
Sri
Hi All,
Is anyone looking for a Spark Scala contract role inside the USA? A company
called Maxonic has an open Spark Scala contract position (100% remote)
inside the USA; if anyone is interested, please send your CV to
kali.tumm...@gmail.com.
Thanks & Regards
Sri Tummala
Hi Community,
I got laid off at Apple in Feb 2023, which forced me to move out of the USA due
to an immigration problem (H1B). I was a Big Data, Spark, Scala, Python, and
Flink consultant with over 12+ years of experience.
I still haven't landed a job in India since then; I need referrals in
India in produc
Hi Folks,
I am currently seeking full-time positions in Flink Scala in India or the
USA (non-consulting), specifically at the Principal or Staff level.
I require an H1B transfer and assistance with relocation from India; my
I-140 is approved.
Thanks & Regards
Sri Tummal
Hi Flink Community,
I'm a Hands on Apache Flink Software Engineer looking for job opportunities
in India or the UK (with Tier 2 sponsorship). If anyone knows of openings
or can point me in the right direction, please let me know.
Thanks,
Sri Tummala
Hi Flink Community,
I'm a hands-on Apache Flink/Spark Software Engineer looking for job
opportunities in India, the USA (with H1B transfer), the UK, Singapore, the UAE,
or Australia. If anyone is hiring, please let me know.
Thanks,
Sri T
Hi Flink Community,
I'm a Hands on Apache Flink/Spark Software Engineer looking for job
opportunities in India, USA (with h1b transfer ) or the UK (with Tier 2
sponsorship). If anyone knows of openings or can point me in the right
direction, please let me know.
Thanks,
Sri Tummala