[SparkSQL] MissingRequirementError when creating DataFrame from RDD (new error in 1.4)

2015-06-18 Thread Adam Lewandowski
Since upgrading to Spark 1.4, I'm getting a scala.reflect.internal.MissingRequirementError when creating a DataFrame from an RDD. The error references a case class in the application (the RDD's type parameter), which has been verified to be present. Items of note: 1) This is running on AWS EMR (YARN…
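
For context, here is a minimal sketch of the operation the post describes, assuming a hypothetical case class Record (the real class name is not in the excerpt). Schema inference for a case class goes through Scala reflection, which is where a MissingRequirementError typically surfaces when the class is not visible to the reflection classloader.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // Hypothetical case class standing in for the application's class;
    // defined at the top level so Spark SQL's reflection can find it.
    case class Record(id: Long, name: String)

    object DataFrameFromRdd {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("df-from-rdd"))
        val sqlContext = new SQLContext(sc)
        import sqlContext.implicits._

        val rdd = sc.parallelize(Seq(Record(1L, "a"), Record(2L, "b")))
        // Both forms derive the schema from Record via Scala reflection.
        val df = rdd.toDF()
        val df2 = sqlContext.createDataFrame(rdd)
        df.printSchema()
      }
    }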

Re: AWS SDK HttpClient version conflict (spark.files.userClassPathFirst not working)

2015-03-15 Thread Adam Lewandowski
…changed in Spark 1.3.0 (released recently). The relevant configuration flag changed names to 'spark.executor.userClassPathFirst', and it does work the way I expected it to (unlike in v1.2.0). On Thu, Mar 12, 2015 at 2:50 PM, Adam Lewandowski <adam.lewandow...@gmail.com> wrote: > I…
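
For reference, a minimal sketch of setting the renamed flag, equivalent to passing --conf spark.executor.userClassPathFirst=true to spark-submit (the application name below is illustrative):

    import org.apache.spark.SparkConf

    // Give the application's jars precedence over Spark's bundled dependencies
    // on executors; in 1.3.0 this key supersedes the 1.2-era
    // "spark.files.userClassPathFirst" discussed in the original post.
    val conf = new SparkConf()
      .setAppName("aws-sdk-app")
      .set("spark.executor.userClassPathFirst", "true")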

AWS SDK HttpClient version conflict (spark.files.userClassPathFirst not working)

2015-03-12 Thread Adam Lewandowski
…with identical results. Has anyone else seen this issue, had any success with the "spark.files.userClassPathFirst" flag, or been able to use the AWS SDK? I was going to submit this as a Spark JIRA issue, but thought I would check here first. Thanks, Adam Lewandowski
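
Not from the thread, but a workaround commonly used when the userClassPathFirst flag does not help is to shade the conflicting HttpClient packages inside the application jar so the AWS SDK resolves its own copy. A sketch using sbt-assembly shade rules (the relocation prefix is an assumption):

    // build.sbt fragment (assumes the sbt-assembly plugin with shading support
    // is enabled): relocate the HttpClient classes bundled with the application
    // so they cannot clash with the older httpclient on Spark's classpath.
    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("org.apache.http.**" -> "shaded.org.apache.http.@1").inAll
    )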