I think flink-spargel is missing the guava dependency.
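
If so, declaring Guava explicitly in flink-spargel's pom.xml should fix the
Travis compile error. Roughly something like this (just a sketch; the version
property is a guess at what the root pom defines):

    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <!-- version assumed to come from a property in the root pom -->
        <version>${guava.version}</version>
    </dependency>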

On Thu, May 14, 2015 at 8:18 AM, Aljoscha Krettek <aljos...@apache.org>
wrote:

> @Robert, this seems like a problem with the Shading?
>
> On Thu, May 14, 2015 at 5:41 AM, Lokesh Rajaram
> <rajaram.lok...@gmail.com> wrote:
> > Thanks Aljoscha. I was able to make the change as recommended and run the
> > entire test suite successfully on my local machine.
> > However Travis build is failing for pull request:
> > https://github.com/apache/flink/pull/673.
> >
> > It's a compilation failure:
> >
> > [ERROR] Failed to execute goal
> > org.apache.maven.plugins:maven-compiler-plugin:3.1:compile
> > (default-compile) on project flink-spargel: Compilation failure:
> > Compilation failure:
> > [ERROR]
> > /home/travis/build/apache/flink/flink-staging/flink-spargel/src/main/java/org/apache/flink/spargel/java/VertexCentricIteration.java:[42,30]
> > package com.google.common.base does not exist
> >
> > I can definitely see the package imported in the class, and it compiles
> > and passes all tests locally.
> > Am I missing anything here?
> >
> > Thanks,
> > Lokesh
> >
> > On Mon, May 11, 2015 at 1:25 AM, Aljoscha Krettek <aljos...@apache.org>
> > wrote:
> >
> >> I think you can replace Validate.notNull(p) with require(p != null,
> >> "p is null") (or something like this).
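> >>
> >> For example, using the join-function check from the Scala API as an
> >> illustration (a sketch, not compiled):
> >>
> >>     // before: the Commons Validate style null check
> >>     Validate.notNull(fun, "Join function must not be null.")
> >>
> >>     // after: Scala's Predef.require, which also throws
> >>     // IllegalArgumentException when the check fails
> >>     require(fun != null, "Join function must not be null.")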
> >>
> >> On Mon, May 11, 2015 at 12:27 AM, Lokesh Rajaram
> >> <rajaram.lok...@gmail.com> wrote:
> >> > 1. I think I can use require for replacing Validate.isTrue
> >> > 2. What about Validate.notNull? If require is used it would throw
> >> > IllegalArgumentException; if assume or assert is used it would throw
> >> > AssertionError, which is not compatible with the current implementation.
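> >> >
> >> > A quick sketch of the difference (input is just a hypothetical parameter):
> >> >
> >> >     // Predef.require throws IllegalArgumentException, matching the
> >> >     // behaviour of the current Validate/Preconditions checks
> >> >     require(input != null, "Input must not be null.")
> >> >
> >> >     // Predef.assert and Predef.assume throw AssertionError and can be
> >> >     // elided by the compiler, so they would change behaviour
> >> >     assert(input != null, "Input must not be null.")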
> >> >
> >> > Please let me know if my understanding is correct. Also, let me know
> >> > your thoughts.
> >> >
> >> > Thanks,
> >> > Lokesh
> >> >
> >> > On Sun, May 10, 2015 at 1:04 AM, Aljoscha Krettek <aljos...@apache.org>
> >> > wrote:
> >> >
> >> >> I would propose using the methods as Chiwan suggested. If everyone
> >> >> agrees I can change the Jira issue.
> >> >>
> >> >> On Sun, May 10, 2015 at 6:47 AM, Lokesh Rajaram
> >> >> <rajaram.lok...@gmail.com> wrote:
> >> >> > Thank you for the reference links. Which approach should I take:
> >> >> > casting, or using the Scala methods?
> >> >> > If it's the latter option, will the JIRA ticket FLINK-1711
> >> >> > <https://issues.apache.org/jira/browse/FLINK-1711> be updated to
> >> >> > reflect it?
> >> >> >
> >> >> > Thanks,
> >> >> > Lokesh
> >> >> >
> >> >> > On Sat, May 9, 2015 at 8:16 PM, Chiwan Park <chiwanp...@icloud.com>
> >> >> > wrote:
> >> >> >
> >> >> >> Hi. There are some problems using Guava's check methods in Scala.
> >> >> >> (https://groups.google.com/forum/#!topic/guava-discuss/juwovq26R3k)
> >> >> >> You can solve this error simply by casting the last argument to
> >> >> >> java.lang.Object. But I think we'd better use the `require`, `assume`,
> >> >> >> and `assert` methods provided by Scala.
> >> >> >> (http://daily-scala.blogspot.kr/2010/03/assert-require-assume.html)
> >> >> >> Because this change affects a lot of other code, we should discuss
> >> >> >> changing from Guava's methods to Scala's methods.
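> >> >> >>
> >> >> >> For reference, the casting workaround would look roughly like this
> >> >> >> (untested sketch, based on the call from the compiler error below):
> >> >> >>
> >> >> >>     // ascribe the message to java.lang.Object so only the
> >> >> >>     // checkNotNull(T, Object) overload can apply
> >> >> >>     Preconditions.checkNotNull(fun,
> >> >> >>       "Join function must not be null.": java.lang.Object)
> >> >> >>
> >> >> >>     // or use Scala's built-in check instead of Guava here
> >> >> >>     require(fun != null, "Join function must not be null.")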
> >> >> >>
> >> >> >> Regards.
> >> >> >> Chiwan Park (Sent with iPhone)
> >> >> >>
> >> >> >>
> >> >> >>
> >> >> >> > On May 10, 2015, at 11:49 AM, Lokesh Rajaram <rajaram.lok...@gmail.com>
> >> >> >> > wrote:
> >> >> >> >
> >> >> >> > Hello All,
> >> >> >> >
> >> >> >> > I am new to the Flink community and am very excited about the
> >> >> >> > project and the work you all have been doing. Kudos!!
> >> >> >> >
> >> >> >> > I was looking to pick up a starter task. Robert recommended picking
> >> >> >> > up https://issues.apache.org/jira/browse/FLINK-1711. Thanks, Robert,
> >> >> >> > for your guidance.
> >> >> >> >
> >> >> >> > Sorry for a dumb question. I am done with the code changes, but my
> >> >> >> > "mvn verify" is failing only for the Scala module, as follows:
> >> >> >> >
> >> >> >> > flink/flink-scala/src/main/scala/org/apache/flink/api/scala/joinDataSet.scala:77:
> >> >> >> > error: ambiguous reference to overloaded definition,
> >> >> >> > [ERROR] both method checkNotNull in object Preconditions of type
> >> >> >> > [T](x$1: T, x$2: String, x$3: <repeated...>[Object])T
> >> >> >> > [ERROR] and  method checkNotNull in object Preconditions of type
> >> >> >> > [T](x$1: T, x$2: Any)T
> >> >> >> > [ERROR] match argument types ((L, R) => O,String)
> >> >> >> > [ERROR]     Preconditions.checkNotNull(fun, "Join function must not be null.")
> >> >> >> >
> >> >> >> > I see the same error for all of the Scala classes I changed. Any
> >> >> >> > pointers here will be very helpful for me to proceed further. Please
> >> >> >> > let me know if you need more information.
> >> >> >> >
> >> >> >> > Thanks in advance for your help and support.
> >> >> >> >
> >> >> >> > Thanks,
> >> >> >> > Lokesh
> >> >> >>
> >> >> >>
> >> >>
> >>
>
