Hi Jiang,
I was facing the very same issue. The solution is to write the data to a file
and use an Oracle external table to do the insert.
Hope this helps.
Dalin
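Dalin's tip has two steps: dump the Spark result to a flat file in a directory the database host can read, then map that file with an Oracle external table so the load is a single set-based INSERT ... SELECT. A minimal sketch below (the directory object, file name, and column list are hypothetical; on the Spark side you would use df.write.csv(...) instead of the csv module):

```python
import csv
import os
import tempfile

# Step 1: dump the result set to a flat file (stand-in for df.write.csv).
rows = [(1, "alice"), (2, "bob")]  # pretend query result
stage_dir = tempfile.mkdtemp()
stage_file = os.path.join(stage_dir, "stage.csv")
with open(stage_file, "w", newline="") as f:
    csv.writer(f).writerows(rows)

# Step 2: one-time DDL on the Oracle side. The external table reads the
# file in place; DATA_DIR must be an Oracle DIRECTORY object pointing at
# stage_dir. (Hypothetical names throughout.)
ddl = """
CREATE TABLE stage_ext (
  id   NUMBER,
  name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('stage.csv')
);
"""

# The actual load is then a plain set-based statement:
dml = "INSERT INTO target_table SELECT * FROM stage_ext"
```

Because the external table is defined once, each new batch only needs to overwrite the staged file and rerun the INSERT, which avoids row-by-row JDBC inserts.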
On Thu, Apr 18, 2019 at 11:43 AM Jörn Franke wrote:
> What is the size of the data? How much time does it need on HDFS and how
> much on Orac
Use unix time and write it to Oracle as a NUMBER column, then create a
virtual column in the Oracle database for the unix time, like "oracle_time
generated always as (to_date('1970010108','YYYYMMDDHH24') + (1/24/60/60) * unixtime)".
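This virtual-column trick works because Oracle date arithmetic is in days, so (1/24/60/60) * unixtime adds exactly unixtime seconds to the epoch anchor; note the format mask has to cover the whole literal, i.e. 'YYYYMMDDHH24' for '1970010108', and the trailing '08' bakes in a UTC+8 offset (an assumption here; adjust or drop it for your timezone). A small Python sketch of the same arithmetic:

```python
from datetime import datetime, timedelta

# Mirrors: to_date('1970010108','YYYYMMDDHH24'), i.e. the epoch shifted
# to 08:00 local time (assumed UTC+8 offset).
epoch_local = datetime(1970, 1, 1, 8, 0, 0)

def oracle_time(unixtime: int) -> datetime:
    # Mirrors: epoch + (1/24/60/60) * unixtime -- Oracle adds days,
    # so one second is 1/86400 of a day.
    return epoch_local + timedelta(days=unixtime / 86400.0)

# 1546300800 is 2019-01-01 00:00:00 UTC, which lands at 08:00 local
# under the +8 anchor.
print(oracle_time(1546300800))
```

Storing the raw NUMBER and deriving the DATE in a virtual column keeps the Spark JDBC write simple (just a numeric column) while still giving Oracle queries a proper date to filter and index on.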
> On Mar 20, 2018, at 11:08 PM, Gurusamy Thirupathy wrote:
>
>
Hi Panagiotis,
Wondering whether you solved the problem, because I ran into the same issue
today. I'd appreciate it very much if you could paste the code snippet if
it's working. Thanks.
> On Apr 6, 2018, at 7:40 AM, Aakash Basu wrote:
>
> Hi Panagiotis,
>
> I did that, but it still prints the result of the first
I need it cached to improve throughput; I just hope it can be refreshed once
a day rather than on every batch.
> On Nov 13, 2017, at 4:49 PM, Burak Yavuz wrote:
>
> I think if you don't cache the jdbc table, then it should auto-refresh.
>
> On Mon, Nov 13, 2017 at 1:2
Hi,
I'm using Structured Streaming (Spark 2.2) to receive Kafka messages, and it
works great. The thing is, I need to join the Kafka messages with a
relatively static table stored in a MySQL database (let's call it metadata
here).
So is it possible to reload the metadata table after some time interval (like
daily
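The interval-based refresh asked about in this thread can be sketched independently of Spark: keep the metadata cached for throughput, but reload it once its age exceeds a TTL (e.g. one day) instead of on every micro-batch. A minimal plain-Python sketch, where RefreshableTable and load_metadata are hypothetical names; in Spark the reload step would be unpersist() the old DataFrame, re-read the JDBC table, and persist() the new one:

```python
import time

class RefreshableTable:
    """Cache a loaded table and transparently reload it after a TTL."""

    def __init__(self, loader, ttl_seconds):
        self._loader = loader
        self._ttl = ttl_seconds
        self._cached = None
        self._loaded_at = float("-inf")  # force a load on first access

    def get(self):
        # Reload only when the cached copy is older than the TTL; every
        # other call is served from the cache (no database round trip).
        if time.monotonic() - self._loaded_at > self._ttl:
            self._cached = self._loader()
            self._loaded_at = time.monotonic()
        return self._cached

calls = []

def load_metadata():
    # Hypothetical stand-in for spark.read.jdbc(...) against MySQL.
    calls.append(1)                      # count real loads
    return {"id": 1, "name": "static"}   # pretend metadata row

table = RefreshableTable(load_metadata, ttl_seconds=86400)
for _ in range(3):                      # three "micro-batches"
    metadata = table.get()
```

This matches Burak's point from the other direction: an uncached JDBC table is re-read every batch, while this wrapper pins one copy and only pays the reload cost once per TTL window.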