I agree with you that it's not the recommended approach. But I just want to
understand which change caused this change in behavior. If you can point me
to some Jira in which this change was made, that would be greatly
appreciated.
Regards,
Shrikant
On Mon, 2 Jan 2023 at 9:46 PM, Sean Owen wrote:
Not true, you've never been able to use the SparkSession inside a Spark
task. You aren't actually using it, if the application worked in Spark 2.x.
Now, you need to avoid accidentally serializing it, which was the right
thing to do even in Spark 2.x. Just move the session inside main() instead
of making it a member.
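A minimal sketch of the fix being suggested, assuming a driver object named TestMain (the name and the toy job are illustrative, not from the original application):

```scala
import org.apache.spark.sql.SparkSession

object TestMain {
  // Safe to reference from executors: loading the TestMain class
  // no longer triggers any SparkSession creation.
  def double(n: Int): Int = n * 2

  def main(args: Array[String]): Unit = {
    // The session is a local variable, not a member, so nothing
    // Spark-related runs when an executor initializes TestMain.
    val spark = SparkSession.builder()
      .appName("TestMain")
      .getOrCreate()
    import spark.implicits._

    Seq(1, 2, 3).toDS().map(n => double(n)).show()

    spark.stop()
  }
}
```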
If that were the case and a deserialized session would not work, the
application would not have worked.
As per the logs and debug prints, in Spark 2.3 the main object is not
getting deserialized on the executor; otherwise it would have failed there as well.
On Mon, 2 Jan 2023 at 9:15 PM, Sean Owen wrote:
It silently allowed the object to serialize, though the
serialized/deserialized session would not work. Now it explicitly fails.
On Mon, Jan 2, 2023 at 9:43 AM Shrikant Prasad wrote:
That's right. But the serialization would be happening in Spark 2.3 also,
so why don't we see this error there?
On Mon, 2 Jan 2023 at 9:09 PM, Sean Owen wrote:
Oh, it's because you are defining "spark" within your driver object, and
then it's getting serialized because you are trying to use TestMain methods
in your program.
This was never correct, but now it's an explicit error in Spark 3. The
session should not be a member variable.
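For illustration, a hedged sketch of the shape of the problem (the object name and the toy job are assumptions, not the original code): with the session as a member, the object's initializer runs again when an executor loads the class, which matches the ExceptionInInitializerError reported later in this thread.

```scala
import org.apache.spark.sql.SparkSession

object TestMain {
  // Anti-pattern: the session is a member of the driver object,
  // so it is built whenever the TestMain class is initialized.
  val spark: SparkSession = SparkSession.builder()
    .appName("TestMain")
    .getOrCreate()

  def double(n: Int): Int = n * 2

  def main(args: Array[String]): Unit = {
    import spark.implicits._
    // The closure calls TestMain.double, so each executor has to
    // initialize the TestMain object; that re-runs the session
    // builder on the executor, where it fails.
    Seq(1, 2, 3).toDS().map(n => double(n)).show()
  }
}
```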
On Mon, Jan 2, 2023, Shrikant Prasad wrote:
Please see these logs. The error is thrown in executor:
23/01/02 15:14:44 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.ExceptionInInitializerError
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(
It's not running on the executor; that's not the issue. See your stack
trace, where it clearly happens in the driver.
On Mon, Jan 2, 2023 at 8:58 AM Shrikant Prasad wrote:
Even if I set the master as yarn, it will not have access to the rest of
the Spark confs. It will need spark.yarn.app.id.
The main issue is: if it works as-is in Spark 2.3, why does it not work
in Spark 3, i.e. why is the session getting created on the executor?
Another thing we tried is removing the df
So call .setMaster("yarn"), per the error.
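In cluster deployments the master and the YARN-specific confs (such as spark.yarn.app.id) are usually injected by spark-submit rather than hardcoded in the application; a sketch, where the class name and jar are placeholders:

```shell
# Let spark-submit supply the master and YARN confs instead of
# calling setMaster() in code.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class TestMain \
  app.jar
```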
On Mon, Jan 2, 2023 at 8:20 AM Shrikant Prasad wrote:
We are running it in cluster deploy mode with yarn.
Regards,
Shrikant
On Mon, 2 Jan 2023 at 6:15 PM, Stelios Philippou wrote:
Can we see your Spark configuration parameters?
The master URL refers to where you want to run this, e.g. in Java:
new SparkConf().setMaster("local[*]")
On Mon, 2 Jan 2023 at 14:38, Shrikant Prasad wrote:
> Hi,
>
> I am trying to migrate a Spark application from Spark 2.3 to 3.0.