Hi Steve,
  I am running on the CDH 5.0.0 VM (which is CentOS 6.5). Given the
difference in OS and Hadoop distro between us, my results are not likely to
be of direct help to you. But in any case I will let you know (likely
offline).
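
For reference, the sequence I am running on that VM is the same
package-then-test pair shown further down the thread; picking -Phadoop-2.3
is just my assumption for matching the Hadoop version that ships with
CDH 5.0.0:

    mvn -Pyarn -Phadoop-2.3 -Phive -DskipTests clean package
    mvn -Pyarn -Phadoop-2.3 -Phive test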


2014-07-27 20:02 GMT-07:00 Steve Nunez <snu...@hortonworks.com>:

> Whilst we're on this topic, I'd be interested to see if you get Hive
> failures. I'm trying to build on a Mac using HDP and seem to be getting
> failures related to Parquet. I'll know for sure once I get in tomorrow and
> confirm with engineering, but this is likely because the version of Hive
> is 0.12.0, and Parquet is only supported in Hive 0.13 (HDP is 0.13).
>
> Any idea on what it would take to bump the Hive version up to the latest?
>
> Regards,
>         - SteveN
>
>
>
> On 7/27/14, 19:39, "Stephen Boesch" <java...@gmail.com> wrote:
>
> > OK, I'll do it after confirming all the tests run.
> >
> >
> >2014-07-27 19:36 GMT-07:00 Reynold Xin <r...@databricks.com>:
> >
> >> Would you like to submit a pull request? All doc source code is in the
> >> docs folder. Cheers.
> >>
> >>
> >>
> >> On Sun, Jul 27, 2014 at 7:35 PM, Stephen Boesch <java...@gmail.com>
> >>wrote:
> >>
> >> > Hi Reynold,
> >> >   Thanks for responding here. Yes, I had looked at the Building with
> >> > Maven page in the past. I had not noticed that the "package" step must
> >> > happen *before* the tests. I had assumed it was a corequisite, as seen
> >> > in my command line.
> >> >
> >> > So the following sequence appears to work fine (so far so good, well
> >> > past the point where the prior attempts failed):
> >> >
> >> >
> >> > mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive clean package
> >> > mvn -Pyarn -Phadoop-2.3 -Phive test
> >> >
> >> > As for documentation, yes, adding another sentence to that same
> >> > "Building with Maven" page would likely be helpful to future generations.
> >> >
> >> >
> >> > 2014-07-27 19:10 GMT-07:00 Reynold Xin <r...@databricks.com>:
> >> >
> >> > > To run through all the tests, you'd need to create the assembly jar
> >> > > first.
> >> > >
> >> > >
> >> > > I've seen this asked a few times. Maybe we should make it more
> >> > > obvious.
> >> > >
> >> > >
> >> > >
> >> > > http://spark.apache.org/docs/latest/building-with-maven.html
> >> > >
> >> > > Spark Tests in Maven
> >> > >
> >> > > Tests are run by default via the ScalaTest Maven plugin
> >> > > <http://www.scalatest.org/user_guide/using_the_scalatest_maven_plugin>.
> >> > > Some of the tests require Spark to be packaged first, so always run mvn
> >> > > package with -DskipTests the first time. You can then run the tests with
> >> > > mvn -Dhadoop.version=... test.
> >> > >
> >> > > The ScalaTest plugin also supports running only a specific test suite
> >> > > as follows:
> >> > >
> >> > > mvn -Dhadoop.version=... -DwildcardSuites=org.apache.spark.repl.ReplSuite test
> >> > >
> >> > >
> >> > >
> >> > >
> >> > >
> >> > > On Sun, Jul 27, 2014 at 7:07 PM, Stephen Boesch <java...@gmail.com>
> >> > wrote:
> >> > >
> >> > > > I have pulled the latest from GitHub this afternoon. There are
> >> > > > many, many errors:
> >> > > >
> >> > > > <source_home>/assembly/target/scala-2.10: No such file or directory
> >> > > >
> >> > > > This causes many tests to fail.
> >> > > >
> >> > > > Here is the command line I am running:
> >> > > >
> >> > > >     mvn -Pyarn -Phadoop-2.3 -Phive package test
> >> > > >
> >> > >
> >> >
> >>
>
>
>
