I'm not sure we want to do that. If you "SELECT foo AS bar", then the
column name is foo but the column label is bar. We probably want to return
the latter.
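For illustration, here is a minimal Scala/JDBC sketch of the distinction.
The connection and the table name are placeholders, and the exact values
returned depend on the driver:

import java.sql.Connection

def printNameVsLabel(conn: Connection): Unit = {
  val stmt = conn.createStatement()
  val rs = stmt.executeQuery("SELECT foo AS bar FROM some_table")
  val md = rs.getMetaData
  // For "SELECT foo AS bar" a spec-compliant driver is expected to return:
  //   getColumnName(1)  -> "foo"  (the underlying column)
  //   getColumnLabel(1) -> "bar"  (the alias the query asked for)
  println(s"name  = ${md.getColumnName(1)}")
  println(s"label = ${md.getColumnLabel(1)}")
  rs.close()
  stmt.close()
}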
On Fri, Dec 17, 2021 at 9:07 AM Gary Liu wrote:
In the Spark SQL JDBC module, getColumnLabel is used to get column names
from the remote database, but some databases, such as SAS, return the
column description there instead. Should getColumnName be used instead?
This is from SAS technical support:
In the documentation,
https://docs.oracle.com/javase/7/do
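Not a fix in Spark itself, but one possible client-side workaround sketch:
read via JDBC as usual and then rename the columns to the names you expect.
The URL, table, and column names below are placeholders.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("rename-jdbc-columns").getOrCreate()

val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:sas://host:port")   // placeholder URL
  .option("dbtable", "mylib.mytable")      // placeholder table
  .load()

// If the driver reports descriptions as labels, override them explicitly.
val expectedNames = Seq("id", "name", "amount")  // placeholder names
val renamed = df.toDF(expectedNames: _*)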
*Challenge*
Insert data from a Spark DataFrame when one or more columns in the Oracle
table rely on derived columns that depend on data in one or more DataFrame
columns.
Standard JDBC from Spark to Oracle does a batch insert of the DataFrame into
Oracle, *so it cannot handle these derived columns*. Refer
Hi Maciej Bryński,
Did you happen to finish that or is there a way to do it?
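For the derived-column challenge above, one common workaround sketch (not an
answer from this thread, and assuming the derivation can be expressed in
SQL): batch-insert the DataFrame into a staging table, then let Oracle
compute the derived columns with a follow-up INSERT ... SELECT over plain
JDBC. The DataFrame df, all table and column names, and the connection
details are placeholders.

import java.sql.DriverManager
import java.util.Properties

val props = new Properties()
props.setProperty("user", "scott")       // placeholder credentials
props.setProperty("password", "tiger")
val url = "jdbc:oracle:thin:@//host:1521/service"  // placeholder URL

// 1. Batch-insert the raw DataFrame columns into a staging table.
df.write.mode("append").jdbc(url, "STAGING_TABLE", props)

// 2. Move rows into the target table, computing the derived column in SQL.
val conn = DriverManager.getConnection(url, props)
try {
  conn.createStatement().executeUpdate(
    """INSERT INTO TARGET_TABLE (col_a, col_b, derived_col)
      |SELECT col_a, col_b, col_a * col_b   -- placeholder derivation
      |FROM STAGING_TABLE""".stripMargin)
} finally {
  conn.close()
}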
2016-07-22 23:05 GMT+02:00 Ramon Rosa da Silva:
> Hi Folks,
>
> What do you think about allowing an update SaveMode via
> DataFrame.write.mode("update")?
>
> Right now Spark only has JDBC insert.
I'm working on a patch that creates a new mode, 'upsert'.
In MySQL it will use the 'REPLACE INTO' command.
M.
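Since the patch isn't linked here, a rough sketch of how a REPLACE INTO
upsert can be done manually today, per partition with plain JDBC (this is
not the proposed patch; the table, columns, and connection string are
placeholders):

import java.sql.DriverManager

df.rdd.foreachPartition { rows =>
  val conn = DriverManager.getConnection(
    "jdbc:mysql://host:3306/db", "user", "pass")   // placeholder connection
  // REPLACE INTO inserts the row, or overwrites it when the key already
  // exists, which gives upsert-like behavior in MySQL.
  val ps = conn.prepareStatement(
    "REPLACE INTO my_table (id, name, amount) VALUES (?, ?, ?)")
  try {
    rows.foreach { r =>
      ps.setLong(1, r.getLong(0))
      ps.setString(2, r.getString(1))
      ps.setDouble(3, r.getDouble(2))
      ps.addBatch()
    }
    ps.executeBatch()
  } finally {
    ps.close()
    conn.close()
  }
}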