Hi Esa,

which Scala version do you use? Flink supports Scala 2.11 (Scala 2.10 support was dropped with Flink 1.4.0).

Fabian
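(As a side note, one way to keep the Scala version consistent is to pin it in the build file together with Flink artifacts built for that version. The sbt sketch below is only illustrative and not from this thread; the Flink version and module list have to match what the project actually uses.)

// build.sbt -- illustrative sketch, assuming sbt and a Flink 1.4.x project
scalaVersion := "2.11.12"

val flinkVersion = "1.4.0"  // assumed; use the version the project is actually on

// %% appends the Scala binary version, so these resolve to the _2.11 artifacts
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala"           % flinkVersion,
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion,
  "org.apache.flink" %% "flink-table"           % flinkVersion,
  "org.apache.flink" %% "flink-cep-scala"       % flinkVersion
)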
2018-02-22 9:28 GMT+01:00 Esa Heikkinen <esa.heikki...@student.tut.fi>:

> It should be ok. This is the list of all my imports. The first part of it
> is highlighted weaker. I don't know why.
>
> import org.apache.flink.streaming.api.windowing.time.Time
> import org.apache.flink.api.java.utils.ParameterTool
> import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
> import org.apache.flink.streaming.api.windowing.time.Time
> import org.apache.flink.cep.scala.{CEP, PatternStream}
> import org.apache.flink.cep.scala.pattern.Pattern
> import org.apache.flink.cep.{PatternFlatSelectFunction, PatternFlatTimeoutFunction}
> import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator
> import org.apache.flink.streaming.api.functions.source.ParallelSourceFunction
> import org.apache.flink.streaming.api.functions.source.SourceFunction.SourceContext
> import org.apache.flink.util.Collector
> import org.apache.flink.streaming.api.scala._
> import org.apache.flink.api.scala._
> import org.apache.flink.table.api.scala._
> import org.apache.flink.table.api.scala.StreamTableEnvironment
> import org.apache.flink.table.api.java.StreamTableEnvironment
>
> import org.apache.flink.types.Row
> import org.apache.flink.streaming.api.TimeCharacteristic
> import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
> import org.apache.flink.table.api.TableEnvironment
> import org.apache.flink.table.sources.CsvTableSource
> import org.apache.flink.api.common.typeinfo.Types
>
> BR Esa
>
> From: Xingcan Cui [mailto:xingc...@gmail.com]
> Sent: Thursday, February 22, 2018 10:09 AM
> To: Esa Heikkinen <esa.heikki...@student.tut.fi>
> Cc: user@flink.apache.org
> Subject: Re: Problems to use toAppendStream
>
> Hi Esa,
>
> just a reminder: don't miss the dot and underscore.
>
> Best,
> Xingcan
>
> On 22 Feb 2018, at 3:59 PM, Esa Heikkinen <esa.heikki...@student.tut.fi> wrote:
>
> Hi
>
> Actually I also have the line “import org.apache.flink.streaming.api.scala”
> in my code, but this line seems to be highlighted weaker in the IntelliJ IDEA
> editor window. What does this mean?
>
> But the same errors are still generated.
>
> Esa
>
> From: Fabian Hueske [mailto:fhue...@gmail.com]
> Sent: Wednesday, February 21, 2018 9:41 PM
> To: Esa Heikkinen <esa.heikki...@student.tut.fi>
> Cc: user@flink.apache.org
> Subject: Re: Problems to use toAppendStream
>
> Hi Esa,
>
> whenever you observe the error "could not find implicit value for evidence
> parameter of type X" in a streaming program, you need to add the following
> import:
>
> import org.apache.flink.streaming.api.scala._
>
> Best, Fabian
>
> 2018-02-21 19:49 GMT+01:00 Esa Heikkinen <heikk...@student.tut.fi>:
>
> Hi
>
> I have tried to solve the errors below for a long time, without success.
> Could you give some hint how to solve them?
> Errors in compiling:
> ------------------
> Error:(56, 46) could not find implicit value for evidence parameter of type
> org.apache.flink.api.common.typeinfo.TypeInformation[org.apache.flink.types.Row]
>     val stream = tableEnv.toAppendStream[Row](tableTest)
>
> Error:(56, 46) not enough arguments for method toAppendStream: (implicit
> evidence$3: org.apache.flink.api.common.typeinfo.TypeInformation[org.apache.flink.types.Row])
> org.apache.flink.streaming.api.scala.DataStream[org.apache.flink.types.Row].
> Unspecified value parameter evidence$3.
>     val stream = tableEnv.toAppendStream[Row](tableTest)
>
> Code:
> -----------------
> import org.apache.flink.types.Row
> import org.apache.flink.streaming.api.TimeCharacteristic
> import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
> import org.apache.flink.table.api.TableEnvironment
> import org.apache.flink.table.sources.CsvTableSource
> import org.apache.flink.api.common.typeinfo.Types
>
> object CepTest2 {
>
>   def main(args: Array[String]) {
>
>     println("Start ...")
>
>     val env = StreamExecutionEnvironment.getExecutionEnvironment
>     env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
>
>     //val tableEnv = StreamTableEnvironment.getTableEnvironment(env)
>     val tableEnv = TableEnvironment.getTableEnvironment(env)
>
>     val csvtable = CsvTableSource
>       .builder
>       .path("/home/esa/Log_EX1_gen_track_5.csv")
>       .ignoreFirstLine
>       .fieldDelimiter(",")
>       .field("time", Types.INT)
>       .field("id", Types.STRING)
>       .field("sources", Types.STRING)
>       .field("targets", Types.STRING)
>       .field("attr", Types.STRING)
>       .field("data", Types.STRING)
>       .build
>
>     tableEnv.registerTableSource("test", csvtable)
>
>     val tableTest = tableEnv.scan("test").where("id='5'").select("id,sources,targets")
>
>     val stream = tableEnv.toAppendStream[Row](tableTest)
>
>     stream.print
>     env.execute()
>   }
> }
> --------------------
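For reference, below is a minimal sketch of the program above with the wildcard import Fabian recommends added; that import is what brings the implicit TypeInformation[Row] evidence into scope for toAppendStream. The CSV path, schema, and query are taken unchanged from the original mail, and whether the project then builds cleanly also depends on the Scala/Flink dependency setup discussed at the top of the thread.

import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.streaming.api.TimeCharacteristic
// The wildcard import provides the implicit TypeInformation evidence
// (via createTypeInformation) that toAppendStream[Row] asks for.
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.sources.CsvTableSource
import org.apache.flink.types.Row

object CepTest2 {

  def main(args: Array[String]): Unit = {

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)

    // Scala StreamTableEnvironment derived from the Scala StreamExecutionEnvironment
    val tableEnv = TableEnvironment.getTableEnvironment(env)

    // CSV source with the same path and schema as in the original program
    val csvtable = CsvTableSource
      .builder
      .path("/home/esa/Log_EX1_gen_track_5.csv")
      .ignoreFirstLine
      .fieldDelimiter(",")
      .field("time", Types.INT)
      .field("id", Types.STRING)
      .field("sources", Types.STRING)
      .field("targets", Types.STRING)
      .field("attr", Types.STRING)
      .field("data", Types.STRING)
      .build

    tableEnv.registerTableSource("test", csvtable)

    val tableTest = tableEnv.scan("test").where("id='5'").select("id,sources,targets")

    // With the scala._ import in scope, the implicit evidence parameter is resolved
    val stream: DataStream[Row] = tableEnv.toAppendStream[Row](tableTest)

    stream.print()
    env.execute()
  }
}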