A nice feature of Spark Structured Streaming is that it can join a static DataFrame with a streaming DataFrame. To cite an example: users is a static DataFrame read from a database, and transactionStream comes from a stream. By joining the two, we can get the spending of each country accu…
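A minimal sketch of that kind of stream-static join, assuming a JDBC users table with columns (userId, country) and a Kafka stream carrying (userId, amount); the connection details, topic, and column names below are placeholders:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder.appName("StreamStaticJoin").getOrCreate()

// Static side: users read once from a database (placeholder JDBC settings).
val users = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://db-host/app")
  .option("dbtable", "users")
  .load()

// Streaming side: transactions arriving on a Kafka topic (placeholder broker/topic).
val transactionStream = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "broker:9092")
  .option("subscribe", "transactions")
  .load()
  .selectExpr("CAST(key AS STRING) AS userId", "CAST(value AS STRING) AS amount")
  .withColumn("amount", col("amount").cast("double"))

// Stream-static join, then a running total of spending per country.
val spendingByCountry = transactionStream
  .join(users, "userId")
  .groupBy("country")
  .agg(sum("amount").as("totalSpending"))

spendingByCountry.writeStream
  .outputMode("complete")   // emit the full aggregated table on each trigger
  .format("console")
  .start()
  .awaitTermination()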
Hi Spark Users,
I am trying to execute a bash script from my Spark app. I can run the command below without issues from the spark-shell; however, when I use it in the Spark app and submit it with spark-submit, the container is not able to find the directories.
val result = "export LD_LIBRARY_PATH=/ binaries/
Are local paths not exposed in containers?
Thanks,
Nasrulla
From: Nasrulla Khan Haris
Sent: Thursday, July 23, 2020 6:13 PM
To: user@spark.apache.org
Subject: Unable to run bash script when using spark-submit in cluster mode.
Importance: High
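For the command itself, a minimal sketch of running a shell command from Scala via scala.sys.process (the paths and script name are placeholders). Note that export is a shell builtin, so the command has to go through bash -c rather than being executed directly; and in cluster mode, the script and libraries must actually exist on every worker node, or be shipped with spark-submit --files:

import scala.sys.process._

// `export` is a shell builtin, not a binary, so run the whole command
// through bash -c instead of executing it directly.
val cmd = Seq("bash", "-c", "export LD_LIBRARY_PATH=/opt/binaries && /opt/binaries/run.sh")
val exitCode = cmd.!    // blocks and returns the shell's exit code
println(s"script exited with $exitCode")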
Hi folks,
Been trying to debug this issue:
https://gist.github.com/nssalian/203e20432c2ed237717be28642b1871a
*Context:*
*The application (PySpark):*
1. Read a Hive table from the Metastore (running Hive 1.2.2)
2. Print the schema of the DataFrame read.
3. Do a show() on the captured df. The above err…
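For reference, those three steps look roughly like this (sketched in Scala here, though the original app is PySpark; the app and table names are placeholders):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder
  .appName("HiveReadRepro")   // hypothetical app name
  .enableHiveSupport()        // required to resolve tables from the Hive Metastore
  .getOrCreate()

// 1. Read a Hive table from the Metastore (placeholder name).
val df = spark.table("some_db.some_table")
// 2. printSchema() only needs metadata, so it usually succeeds.
df.printSchema()
// 3. show() launches actual tasks; this is typically where such failures surface.
df.show()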
A potential reason might be that you are getting a ClassNotFoundException when you run on the cluster (due to a missing jar in your uber jar) and you are possibly silently swallowing exceptions in your code (see the sketch after the list below).
1- you can check if there are any failed tasks
2- you can check if there are any failed ex…
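To make the "silently swallowing exceptions" point concrete, a hypothetical sketch (assuming a SparkSession named spark and a placeholder table name); a catch-all like this turns a ClassNotFoundException on the cluster into a job that simply produces no output:

try {
  val df = spark.table("some_db.some_table")
  df.show()
} catch {
  case _: Throwable => // swallowed: the app "succeeds" but prints nothing
}
// Better: let the exception propagate, or log it and rethrow, so the
// failure is visible in the driver log and as failed tasks in the Spark UI.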