Hi, Steve.
Sure, you can suggest that, but I'm wondering how the suggested namespaces
would satisfy the existing visibility rules. Could you give us some specific
examples?
> Can I suggest some common prefix for third-party classes put into the
> spark package tree, just to make clear that the
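For context, the "existing visibility rules" here are Scala's qualified
access modifiers: Spark marks internal APIs private[spark], which makes them
visible to any code compiled under the org.apache.spark package tree,
sub-packages included. A minimal sketch of why that is, with hypothetical
names (InternalUtil, AcmeExtension) standing in for real classes:

  // Hypothetical stand-in for a Spark-internal API. private[spark] members
  // are accessible from anywhere inside the org.apache.spark package tree.
  package org.apache.spark {
    private[spark] object InternalUtil {
      def internalHelper(): String = "internal"
    }
  }

  // A third-party class that declares itself under the spark tree (here a
  // hypothetical vendor sub-package) therefore compiles against that API.
  package org.apache.spark.thirdparty.acme {
    import org.apache.spark.InternalUtil

    object AcmeExtension {
      def use(): String = InternalUtil.internalHelper()
    }
  }

Placement anywhere under org.apache.spark grants the same access, so a
distinctive common prefix would only signal where the code comes from; it
would not change what the compiler allows.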
Hi Ximo, sorry for the delay, I was busy with other stuff. I will raise a PR
this week; let me ping you for review to leverage your help. Thanks.
Cheng Su
On Sep 21, 2020, at 8:16 AM, XIMO GUANTER GONZALBEZ wrote:
Hi Cheng,
I think there still isn’t a PR for this, right? Do you need any help? I am very
interested in this feature getting into master, so I am happy to help or even
move this feature forward based on the PR you opened, but I don’t want to
“steal” the feature from you if you’re interested in
I've just been stack-trace-chasing the 404-in-task-commit code:
https://issues.apache.org/jira/browse/HADOOP-17216
And although it's got an org.apache.spark. prefix, it's
actually org.apache.spark.sql.delta, which lives on GitHub, so the
code and issue tracker live elsewhere.
I understand why they'
Hi All,
This is regarding an improvement issue, SPARK-30985
(https://github.com/apache/spark/pull/27735). Has this caught anyone's
attention yet?
Basically, SPARK_CONF_DIR hosts all the user-specific configuration files,
e.g.
1. spark-defaults.conf - containing all the Spark properties (a minimal
sketch follows below).
2
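For context, a minimal sketch of what a spark-defaults.conf placed under
SPARK_CONF_DIR might look like; the property names are standard Spark
settings, but the values below are purely illustrative:

  # spark-defaults.conf -- illustrative values only
  spark.master                  yarn
  spark.executor.memory         4g
  spark.serializer              org.apache.spark.serializer.KryoSerializer
  spark.eventLog.enabled        true
  spark.eventLog.dir            hdfs:///var/log/spark-events

Each line is a property name followed by whitespace and a value;
spark-submit reads this file from SPARK_CONF_DIR unless --properties-file
points it elsewhere.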