Github user massie commented on a diff in the pull request:
https://github.com/apache/spark/pull/7403#discussion_r38663395
--- Diff: core/src/main/scala/org/apache/spark/Dependency.scala ---
@@ -76,6 +78,15 @@ class ShuffleDependency[K, V, C](
override def rdd: RDD[Product2[K, V]] =
_rdd.asInstanceOf[RDD[Product2[K, V]]]
+  /**
+   * The key, value and combiner classes are serialized so that shuffle manager
+   * implementation can use the information to build
+   */
+  val keyClassName: String = reflect.classTag[K].runtimeClass.getName
+  val valueClassName: String = reflect.classTag[V].runtimeClass.getName
+  // Note: It's possible that the combiner class tag is null, if the combineByKey
+  // methods in PairRDDFunctions are used instead of combineByKeyWithClassTag.
+  val combinerClassName: Option[String] =
+    Option(reflect.classTag[C]).map(_.runtimeClass.getName)
--- End diff --
@andrewor14 Let me know which approach you prefer -- (a) keeping the class
names public or (b) changing the `registerShuffle` arguments.
The first approach has the advantage that the data types are available
everywhere the `ShuffleDependency` is used. The second approach doesn't require
that the class names be `public` -- they wouldn't exist at all (and would
instead be passed to `registerShuffle`).
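For reference, the derivation in the diff can be sketched outside Spark as a
minimal, self-contained Scala snippet. The helper `classNames` is hypothetical
(not Spark API); the default-null combiner tag mirrors the null class tag that
`PairRDDFunctions.combineByKey` passes, which is why the combiner name is an
`Option`:

```scala
import scala.reflect.ClassTag

// Hypothetical helper mirroring the three vals in the diff: the key and
// value tags are always present, while the combiner tag may be null
// (as when combineByKey is used instead of combineByKeyWithClassTag).
def classNames[K, V, C](implicit
    kTag: ClassTag[K],
    vTag: ClassTag[V],
    cTag: ClassTag[C] = null): (String, String, Option[String]) =
  (kTag.runtimeClass.getName,
   vTag.runtimeClass.getName,
   Option(cTag).map(_.runtimeClass.getName)) // None when the tag is null
```

For example, `classNames[Int, String, Long]` yields
`("int", "java.lang.String", Some("long"))`, while passing `null` for the
combiner tag yields `None` in the third position.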
---