Yeah, sorry, that error message is not very intuitive. There is already a
JIRA open to make it better: SPARK-2059
<https://issues.apache.org/jira/browse/SPARK-2059>

Also, a bug regarding attributes that contain "_" has been fixed in master,
so if you are running 1.0 you might try upgrading.
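
In case a minimal reproduction helps, here is a sketch against the Spark 1.0
Scala API (the case class fields, sample rows, and app name are made up to
match the query in the quoted message; adjust them to your actual data):

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.SQLContext

  // Hypothetical row type covering only the columns the query touches.
  case class History(o_custkey: Int, c_address: String)

  object SelfJoinRepro {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("SelfJoinRepro"))
      val sqlContext = new SQLContext(sc)
      import sqlContext.createSchemaRDD  // implicit RDD[History] -> SchemaRDD (1.0)

      // Two rows sharing a key but differing in address, so the join matches.
      val history = sc.parallelize(Seq(History(1, "addr-a"), History(1, "addr-b")))
      history.registerAsTable("history")  // register so SQL can reference it by name

      sqlContext.sql(
        "SELECT * FROM history a JOIN history b ON a.o_custkey = b.o_custkey " +
        "WHERE a.c_address <> b.c_address").collect().foreach(println)

      sc.stop()
    }
  }

If that still throws the UnresolvedAttribute error on master, it is worth
reporting on the JIRA above.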


On Wed, Jun 18, 2014 at 4:05 AM, Tobias Pfeiffer <t...@preferred.jp> wrote:

> The error message *means* that there is no column called c_address.
> However, maybe it's a bug with Spark SQL not understanding the
> a.c_address syntax. Can you double-check the column name is correct?
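>
> A quick way to check is to run the column through a trivial single-table
> query first (assuming the table is registered as "history" on a SQLContext
> named sqlContext; adjust the names to your setup):
>
>   // If these fail too, the column name itself is wrong; if they work,
>   // the problem is specific to the aliased self-join.
>   sqlContext.sql("SELECT c_address FROM history").take(1)
>   sqlContext.sql("SELECT a.c_address FROM history a").take(1)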
>
> Thanks
> Tobias
>
> On Wed, Jun 18, 2014 at 5:02 AM, Zuhair Khayyat
> <zuhair.khay...@gmail.com> wrote:
> > Dear all,
> >
> > I am trying to run the following query on Spark SQL, using some custom
> > TPC-H tables, with a standalone Spark cluster configuration:
> >
> > SELECT * FROM history a JOIN history b ON a.o_custkey = b.o_custkey
> > WHERE a.c_address <> b.c_address;
> >
> > Unfortunately I get the following error during execution:
> >
> > java.lang.reflect.InvocationTargetException
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >         at java.lang.reflect.Method.invoke(Method.java:606)
> >         at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:40)
> >         at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
> > Caused by: org.apache.spark.SparkException: Job aborted due to stage
> > failure: Task 0.0:2 failed 4 times, most recent failure: Exception failure
> > in TID 12 on host kw2260.kaust.edu.sa:
> > org.apache.spark.sql.catalyst.errors.package$TreeNodeException: No function
> > to evaluate expression. type: UnresolvedAttribute, tree: 'a.c_address
> >         org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute.eval(unresolved.scala:59)
> >         org.apache.spark.sql.catalyst.expressions.Equals.eval(predicates.scala:147)
> >         org.apache.spark.sql.catalyst.expressions.Not.eval(predicates.scala:74)
> >         org.apache.spark.sql.catalyst.expressions.And.eval(predicates.scala:100)
> >
> > Is this a bug or am I doing something wrong?
> >
> >
> > Regards,
> >
> > Zuhair Khayyat
>
