Many thanks Ayan.
I tried that as well, as follows:
val broadcastValue = "123456789"  // I assume this will be sent as a constant for the batch

val df = spark.read.
  format("com.databricks.spark.xml").
  option("rootTag", "hierarchy").
  option("rowTag",
Hi Mitch

Add it in the DF first:

from pyspark.sql.functions import lit
df = df.withColumn('broadcastId', lit(broadcastValue))

Then you will be able to access the column in the temp view.
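(In Scala, which the rest of the thread uses, the same idea would look roughly like the sketch below; the view name "tmp" comes from the original post, the rest is illustrative.)

// Sketch: tag every row with the constant, then the column is visible via the temp view
import org.apache.spark.sql.functions.lit

val dfWithId = df.withColumn("broadcastId", lit(broadcastValue))
dfWithId.createOrReplaceTempView("tmp")
spark.sql("SELECT broadcastId FROM tmp LIMIT 5").show()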
Re: partitioning, DataFrame.write also supports a partitionBy clause, and you can use it along with saveAsTable.
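(A rough sketch of that write path; the write mode and the lit() values for the partition columns are assumptions for illustration, and the michtest database is assumed to exist.)

// Sketch only: write the DataFrame as a partitioned Parquet table in one step
import org.apache.spark.sql.functions.lit

df.withColumn("broadcastId", lit(broadcastValue)).
  withColumn("brand", lit("dummy")).
  write.
  mode("overwrite").
  format("parquet").
  partitionBy("broadcastId", "brand").
  saveAsTable("michtest.BroadcastStaging")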
Thanks Zhang,

That is not working. I need to pass the value of the variable broadcastValue into the SQL; as written, Spark cannot interpret it.
> scala> sqltext = """
>      | INSERT INTO TABLE michtest.BroadcastStaging PARTITION (broadcastId = broadcastValue, brand = "dummy")
>      | SELECT
>      |          ocis_party_id A
> scala> spark.sql($sqltext)
> <console>:41: error: not found: value $sqltext
>        spark.sql($sqltext)
                   ^
                   +-- this should be Scala, not shell, syntax
Try this:
scala> spark.sql(sqltext)
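(And if the goal is to substitute the value of broadcastValue into the SQL text itself, a Scala string interpolator is one option; the sketch below is illustrative only, with the column list shortened and the temp view name taken from earlier in the thread.)

// Sketch: s"""...""" interpolates the Scala variable into the SQL string
val sqltext = s"""
  INSERT INTO TABLE michtest.BroadcastStaging PARTITION (broadcastId = "$broadcastValue", brand = "dummy")
  SELECT ocis_party_id
  FROM tmp
"""
spark.sql(sqltext)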
--
Cheers,
-z
On Thu, 16 Apr 2020 08:49:40 +0100
Mich Talebzadeh wrote:
I have a variable to be passed to a partition column, as shown below:
val broadcastValue = "123456789"  // I assume this will be sent as a constant for the batch
// Create a DF on top of XML
df.createOrReplaceTempView("tmp")
// Need to create and populate target Parquet table michtest.BroadcastStaging
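(One hedged sketch of what that step could look like; the non-partition column and its type are assumptions based on the SELECT earlier in the thread, not the actual DDL, and Hive support is assumed to be enabled on the SparkSession.)

// Sketch only: a partitioned Parquet table that the INSERT above could target
spark.sql(
  """
  CREATE TABLE IF NOT EXISTS michtest.BroadcastStaging (
    ocis_party_id STRING
  )
  PARTITIONED BY (broadcastId STRING, brand STRING)
  STORED AS PARQUET
  """)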