I faced a similar issue with "wholeTextFiles" due to a version
compatibility problem; Spark 1.0 built against Hadoop 2.4.1 worked for me.
Did you try another function such as "textFile" to check whether the issue
is specific to "wholeTextFiles"?
Spark needs to be re-compiled for different Hadoop versions. However, you can
One way to do that is to use RDD.toDebugString to inspect the dependency
graph; it also gives a good idea of the stages.
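For example (a minimal sketch; the input path and transformations are just
placeholders to produce a multi-stage lineage):

import org.apache.spark.SparkContext._  // pair-RDD functions on older Spark versions

val counts = sc.textFile("hdfs:///path/to/dir")  // placeholder path
  .flatMap(_.split("\\s+"))
  .map(w => (w, 1))
  .reduceByKey(_ + _)

// toDebugString prints the RDD lineage; the indentation marks shuffle
// boundaries, which correspond to stage boundaries in the DAG.
println(counts.toDebugString)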
On Mon, Aug 4, 2014 at 8:55 PM, rpandya wrote:
> Is there a way to visualize the task dependency graph of an application,
> during or after its execution? The list of sta