Hi Esa and Fabian,

sorry for my inaccurate conclusion before, but I think the reason is clear now: 
org.apache.flink.streaming.api.scala._ and org.apache.flink.api.scala._ should 
not be imported at the same time, because their implicits conflict with each 
other. Just remove one of the two.
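
For example, here is a minimal sketch of your program with only the streaming 
wildcard import kept (based on the code quoted further below; I haven't run it 
against your CSV, but the import section is the relevant part):

import org.apache.flink.streaming.api.TimeCharacteristic
// Keep only ONE of the two conflicting wildcard imports. The streaming one
// already provides the implicit TypeInformation that toAppendStream needs.
import org.apache.flink.streaming.api.scala._
import org.apache.flink.table.api.TableEnvironment
import org.apache.flink.table.api.scala._
import org.apache.flink.table.sources.CsvTableSource
import org.apache.flink.api.common.typeinfo.Types
import org.apache.flink.types.Row

object CepTest2 {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
    val tableEnv = TableEnvironment.getTableEnvironment(env)

    val csvtable = CsvTableSource.builder
      .path("/home/esa/Log_EX1_gen_track_5.csv")
      .ignoreFirstLine
      .fieldDelimiter(",")
      .field("time", Types.INT)
      .field("id", Types.STRING)
      .field("sources", Types.STRING)
      .field("targets", Types.STRING)
      .field("attr", Types.STRING)
      .field("data", Types.STRING)
      .build

    tableEnv.registerTableSource("test", csvtable)
    val tableTest = tableEnv.scan("test").where("id='5'").select("id,sources,targets")

    // Compiles now: the implicit TypeInformation[Row] is resolved unambiguously
    // from org.apache.flink.streaming.api.scala._
    val stream: DataStream[Row] = tableEnv.toAppendStream[Row](tableTest)
    stream.print()
    env.execute()
  }
}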

Best,
Xingcan

> On 22 Feb 2018, at 5:20 PM, Xingcan Cui <xingc...@gmail.com> wrote:
> 
> Hi Fabian and Esa,
> 
> I ran the code myself and also noticed the strange behavior. It seems that it 
> only works when I explicitly import the function, i.e., 
> org.apache.flink.streaming.api.scala.asScalaStream. In other words, the 
> underscore (wildcard) import seems to have no effect. I also checked other 
> package objects (e.g., org.apache.flink.table.api.scala._) and they behave the 
> same way.
> 
> @Esa, you can temporarily solve the problem by importing 
> org.apache.flink.streaming.api.scala.asScalaStream in your code and we'll 
> continue working on this issue.
> 
> Best,
> Xingcan
> 
>> On 22 Feb 2018, at 4:47 PM, Esa Heikkinen <esa.heikki...@student.tut.fi> wrote:
>> 
>> Hi
>>  
>> How do I check the versions?
>>  
>> In pom.xml there are these lines:
>>  
>> <properties>
>>   <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
>>   <flink.version>1.4.0</flink.version>
>>   <slf4j.version>1.7.7</slf4j.version>
>>   <log4j.version>1.2.17</log4j.version>
>>   <scala.binary.version>2.11</scala.binary.version>
>>   <scala.version>2.11.11</scala.version>
>> </properties>
>>  
>> BR Esa
>>  
>> From: Fabian Hueske [mailto:fhue...@gmail.com] 
>> Sent: Thursday, February 22, 2018 10:35 AM
>> To: Esa Heikkinen <esa.heikki...@student.tut.fi>
>> Cc: Xingcan Cui <xingc...@gmail.com>; user@flink.apache.org
>> Subject: Re: Problems to use toAppendStream
>>  
>> Hi Esa,
>> 
>> which Scala version do you use?
>> Flink supports Scala 2.11 (and Scala 2.10 support was dropped with Flink 
>> 1.4.0).
>> 
>> Fabian
>>  
>> 2018-02-22 9:28 GMT+01:00 Esa Heikkinen <esa.heikki...@student.tut.fi>:
>>  
>> 
>> It should be OK. This is the list of all my imports. The first part of it is 
>> highlighted more faintly (greyed out); I don't know why.
>>  
>> import org.apache.flink.streaming.api.windowing.time.Time
>> import org.apache.flink.api.java.utils.ParameterTool
>> import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
>> import org.apache.flink.streaming.api.windowing.time.Time
>> import org.apache.flink.cep.scala.{CEP, PatternStream}
>> import org.apache.flink.cep.scala.pattern.Pattern
>> import org.apache.flink.cep.{PatternFlatSelectFunction, PatternFlatTimeoutFunction}
>> import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator
>> import org.apache.flink.streaming.api.functions.source.ParallelSourceFunction
>> import org.apache.flink.streaming.api.functions.source.SourceFunction.SourceContext
>> import org.apache.flink.util.Collector
>> import org.apache.flink.streaming.api.scala._
>> import org.apache.flink.api.scala._
>> import org.apache.flink.table.api.scala._
>> import org.apache.flink.table.api.scala.StreamTableEnvironment
>> import org.apache.flink.table.api.java.StreamTableEnvironment
>>  
>>  
>> import org.apache.flink.types.Row
>> import org.apache.flink.streaming.api.TimeCharacteristic
>> import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
>> import org.apache.flink.table.api.TableEnvironment
>> import org.apache.flink.table.sources.CsvTableSource
>> import org.apache.flink.api.common.typeinfo.Types
>>  
>> BR Esa
>>  
>> From: Xingcan Cui [mailto:xingc...@gmail.com] 
>> Sent: Thursday, February 22, 2018 10:09 AM
>> 
>> To: Esa Heikkinen <esa.heikki...@student.tut.fi>
>> Cc: user@flink.apache.org
>> Subject: Re: Problems to use toAppendStream
>>  
>> Hi Esa,
>>  
>> just a reminder: don't miss the dot and the underscore.
>>  
>> Best,
>> Xingcan
>>  
>> 
>> On 22 Feb 2018, at 3:59 PM, Esa Heikkinen <esa.heikki...@student.tut.fi> wrote:
>>  
>> Hi
>>  
>> Actually I also have the line “import org.apache.flink.streaming.api.scala” in 
>> my code, but it seems to be highlighted more faintly in the IntelliJ IDEA 
>> editor window. What does that mean?
>>  
>> The same errors are still generated.
>>  
>> Esa
>>  
>> From: Fabian Hueske [mailto:fhue...@gmail.com] 
>> Sent: Wednesday, February 21, 2018 9:41 PM
>> To: Esa Heikkinen <esa.heikki...@student.tut.fi>
>> Cc: user@flink.apache.org
>> Subject: Re: Problems to use toAppendStream
>>  
>> Hi Esa,
>> 
>> whenever you observe the error "could not find implicit value for evidence 
>> parameter of type X" in a streaming program, you need to add the following 
>> import:
>> 
>> import org.apache.flink.streaming.api.scala._
>> 
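>> The package object behind that wildcard import provides the implicit 
>> TypeInformation factory (createTypeInformation), which is what fills in the 
>> evidence parameter. A tiny sketch of that mechanism, assuming Flink 1.4's 
>> Scala API (for illustration only):
>> 
>> import org.apache.flink.api.common.typeinfo.TypeInformation
>> import org.apache.flink.streaming.api.scala._
>> import org.apache.flink.types.Row
>> 
>> object TypeInfoCheck {
>>   // The implicit createTypeInformation from the package object is picked up here:
>>   val rowInfo: TypeInformation[Row] = implicitly[TypeInformation[Row]]
>> }
>> 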
>> Best, Fabian
>>  
>> 2018-02-21 19:49 GMT+01:00 Esa Heikkinen <heikk...@student.tut.fi>:
>>  
>> Hi
>>  
>>  
>> I have been trying to solve the errors below for a long time, but without 
>> success so far. Could you give me a hint on how to solve them?
>>  
>> Errors in compiling:
>> ------------------
>> Error:(56, 46) could not find implicit value for evidence parameter of type 
>> org.apache.flink.api.common.typeinfo.TypeInformation[org.apache.flink.types.Row]
>>     val stream = tableEnv.toAppendStream[Row](tableTest)
>>  
>> Error:(56, 46) not enough arguments for method toAppendStream: (implicit 
>> evidence$3: 
>> org.apache.flink.api.common.typeinfo.TypeInformation[org.apache.flink.types.Row])org.apache.flink.streaming.api.scala.DataStream[org.apache.flink.types.Row].
>> Unspecified value parameter evidence$3.
>>     val stream = tableEnv.toAppendStream[Row](tableTest)
>>  
>> Code:
>> -----------------
>> import org.apache.flink.types.Row
>> import org.apache.flink.streaming.api.TimeCharacteristic
>> import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
>> import org.apache.flink.table.api.TableEnvironment
>> import org.apache.flink.table.sources.CsvTableSource
>> import org.apache.flink.api.common.typeinfo.Types
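>> // No Scala API wildcard import (e.g. org.apache.flink.streaming.api.scala._)
>> // appears in this snippet, so the implicit TypeInformation[Row] needed by
>> // toAppendStream below cannot be found.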
>>  
>> object CepTest2 {
>>  
>>   def main(args: Array[String]) {
>>  
>>     println("Start ...")
>>  
>>     val env = StreamExecutionEnvironment.getExecutionEnvironment
>>     env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
>>  
>>     //val tableEnv = StreamTableEnvironment.getTableEnvironment(env)
>>     val tableEnv = TableEnvironment.getTableEnvironment(env)
>>  
>>     val csvtable = CsvTableSource
>>       .builder
>>       .path("/home/esa/Log_EX1_gen_track_5.csv")
>>       .ignoreFirstLine
>>       .fieldDelimiter(",")
>>       .field("time", Types.INT)
>>       .field("id", Types.STRING)
>>       .field("sources", Types.STRING)
>>       .field("targets", Types.STRING)
>>       .field("attr", Types.STRING)
>>       .field("data", Types.STRING)
>>       .build
>>  
>>     tableEnv.registerTableSource("test", csvtable)
>>  
>>     val tableTest = tableEnv.scan("test").where("id='5'").select("id,sources,targets")
>>  
>>     val stream = tableEnv.toAppendStream[Row](tableTest)
>>  
>>     stream.print
>>     env.execute()
>>   }
>> }
>> --------------------
> 
