Thank you so much, Wencong Liu, that fixed it!!!



Sent with Proton Mail secure email.

------- Original Message -------
On Thursday, May 11th, 2023 at 11:11 PM, Wencong Liu <liuwencle...@163.com> 
wrote:


> 
> 
> Hi Brandon Wright,
> 
> 
> I think you could try the following actions in the IntelliJ IDE:
> First, execute the command "mvn clean install -Dfast -DskipTests=true 
> -Dscala-2.12" in the terminal.
> Second, go to "File -> Invalidate Caches", select all options, and restart 
> the IDE.
> 
> Finally, click "maven reload" in the Maven plugin and wait until the 
> reloading process has finished.
> If it does not work after these actions, you could try repeating them a few 
> times.
> 
> 
> Best,
> 
> 
> Wencong Liu
> 
> At 2023-05-12 07:16:09, "Brandon Wright" brandonwrig...@proton.me.INVALID 
> wrote:
> 
> > I cloned the Flink git repository (master branch), configured a Java 8 JDK, 
> > and I can build the Flink project successfully from the command line with:
> > 
> > mvn clean package -DskipTests
> > 
> > However, when I load the project into IntelliJ and try to compile it and 
> > run the Scala tests in the IDE, I get a lot of compilation errors in the 
> > existing Scala code, like:
> > 
> > ./flink/flink-scala/src/test/scala/org/apache/flink/api/scala/DeltaIterationSanityCheckTest.scala:33:41
> > could not find implicit value for evidence parameter of type 
> > org.apache.flink.api.common.typeinfo.TypeInformation[(Int, String)]
> > val solutionInput = env.fromElements((1, "1"))
> > 
> > and
> > 
> > ./flink/flink-table/flink-table-api-scala/src/test/scala/org/apache/flink/table/types/extraction/DataTypeExtractorScalaTest.scala:39:7
> > overloaded method value assertThatThrownBy with alternatives:
> >   (x$1: org.assertj.core.api.ThrowableAssert.ThrowingCallable,x$2: 
> > String,x$3: Object*)org.assertj.core.api.AbstractThrowableAssert[_, _ <: 
> > Throwable] <and>
> >   (x$1: 
> > org.assertj.core.api.ThrowableAssert.ThrowingCallable)org.assertj.core.api.AbstractThrowableAssert[_, 
> > _ <: Throwable]
> > cannot be applied to (() => Unit)
> > assertThatThrownBy(() => runExtraction(testSpec))
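> > 
> > For illustration, a minimal standalone sketch (not code from the Flink 
> > repo) of the kind of Scala that triggers both errors, assuming the 
> > flink-scala and AssertJ test dependencies are on the classpath:
> > 
> > import org.apache.flink.api.scala._  // brings the implicit TypeInformation derivation into scope
> > import org.assertj.core.api.Assertions.assertThatThrownBy
> > 
> > object IdeCompileSketch {
> >   def main(args: Array[String]): Unit = {
> >     // fromElements needs an implicit TypeInformation[(Int, String)],
> >     // which the wildcard import above derives via a macro.
> >     val env = ExecutionEnvironment.getExecutionEnvironment
> >     val solutionInput = env.fromElements((1, "1"))
> >     solutionInput.print()
> > 
> >     // In Scala 2.12 the function literal converts to AssertJ's
> >     // ThrowingCallable (a Java SAM interface), so this compiles on the
> >     // command line with the Maven build.
> >     assertThatThrownBy(() => throw new IllegalStateException("boom"))
> >       .isInstanceOf(classOf[IllegalStateException])
> >   }
> > }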
> > 
> > Clearly, the same code compiles when using the Maven build via the command 
> > line, so this must be some kind of environment/config issue. I'd like to 
> > get the code building within IntelliJ so I can use the debugger and step 
> > through unit tests. I don't want to make source changes quite yet; I'd 
> > like to just step through the code as it is.
> > 
> > My first guess is the IntelliJ IDE is using the wrong version of the Scala 
> > compiler. In IntelliJ, in "Project Structure" -> "Platform Settings" -> 
> > "Global Libraries", I have "scala-sdk-2.12.7" configured and nothing else. 
> > I believe that's the specific version of Scala that the Flink code is 
> > intended to compile with. I've checked all the project settings and 
> > preferences and I don't see any other places I can configure or even verify 
> > which version of Scala is being used.
> > 
> > Additional points:
> > 
> > - I can run/debug Java unit tests via the IntelliJ IDE, but not Scala unit 
> > tests.
> > - If I do "Build" -> "Rebuild Project", I get Scala compilation errors as 
> > mentioned above, but no Java errors. The Java code seems to compile 
> > successfully.
> > - I'm using the current version of IntelliJ 2023.1.1 Ultimate with the 
> > Scala plugin installed.
> > - I've read and followed the instructions on 
> > https://nightlies.apache.org/flink/flink-docs-master/docs/flinkdev/ide_setup/.
> >  These docs don't mention specifying the version of the Scala compiler at 
> > all.
> > - This is a clean repo on "master" branch with absolutely zero changes.
> > - In IntelliJ, in "Project Structure" -> "Project Settings" -> "Project", 
> > I've chosen a Java 8 JDK, which I presume is the best choice for building 
> > Flink code today.
> > 
> > Thanks for any help!
