Google Cloud Dataflow provides a distributed dataset called PCollection,
and syntactic sugar based on PCollection is provided in the form of
"apply". Note that "apply" is different from the Spark API "map", which passes
each element of the source through a function func. I wonder whether Spark
can support t
On Wed, Apr 29, 2015 at 2:06 PM, Olivier Girardot
> wrote:
>
>> I guess you can use cast(id as String) instead of just id in your where
>> clause?
>>
>> On Wed, Apr 29, 2015 at 12:13, lonely Feb wrote:
>>
>> > Hi all, we are transferring our Hive jobs into Spark SQL, but
Hi all, we are transferring our Hive jobs into Spark SQL, but we found a
little difference between Hive and Spark SQL: our SQL has a statement like:
select A from B where id regexp '^12345$'
In Hive it works fine, but in Spark SQL we got a:
java.lang.ClassCastException: java.lang.Long cannot be cas
Hi all, I tried to transfer some Hive jobs into spark-sql. When I ran a SQL
job with a Python UDF I got an exception:
java.lang.ArrayIndexOutOfBoundsException: 9
at
org.apache.spark.sql.catalyst.expressions.GenericRow.apply(Row.scala:142)
at
org.apache.spark.sql.catalyst.expressions.B
Can anyone help? Thanks a lot!
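Without the full trace it is hard to say what is wrong here, but an ArrayIndexOutOfBoundsException from GenericRow.apply usually means some code asked the row for a field index beyond the row's arity (index 9 of a narrower row). A plain-Python sketch of that failure mode, with a defensive variant (both function names are hypothetical, not Spark internals):

```python
# Hypothetical sketch of the failure mode, not Spark internals.
row = ["a", "b", "c"]  # a row with only 3 fields

def get_field(row, i):
    """Access field i; raises like GenericRow.apply on a bad index."""
    return row[i]

try:
    get_field(row, 9)  # same shape as ArrayIndexOutOfBoundsException: 9
except IndexError:
    out_of_bounds = True

# Defensive variant: check the index against the row arity first.
def get_field_checked(row, i, default=None):
    return row[i] if 0 <= i < len(row) else default

value = get_field_checked(row, 9, default="missing")  # "missing"
```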
2015-03-16 11:45 GMT+08:00 lonely Feb :
> yes
>
> 2015-03-16 11:43 GMT+08:00 Mridul Muralidharan :
>
>> Cross region as in different data centers ?
>>
>> - Mridul
>>
>> On Sun, Mar 15, 2015 at 8:08 PM, lonely Feb wrote:
yes
2015-03-16 11:43 GMT+08:00 Mridul Muralidharan :
> Cross region as in different data centers ?
>
> - Mridul
>
> On Sun, Mar 15, 2015 at 8:08 PM, lonely Feb wrote:
> > Hi all, I ran into a problem where torrent broadcast hangs in my
> > spark cluster (1.2,
eout)
>
> or simply
> import scala.concurrent.duration._
> Await.result(result.future, 10 seconds)
>
>
>
> On Sun, Mar 15, 2015 at 8:08 PM, lonely Feb wrote:
>
>> Hi all, I ran into a problem where torrent broadcast hangs in my
>> spark cluster
Hi all, I ran into a problem where torrent broadcast hangs in my
spark cluster (1.2, standalone), particularly serious when the driver and
executors are in different regions. When I read the broadcast code I found
a synchronous block read here:
def fetchBlockSync(host: String, port: Int, execId: Str
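The suggestion earlier in the thread (bounding the wait with `Await.result` and a timeout) can be sketched in plain Python with `concurrent.futures`; `fetch_block` here is a hypothetical stand-in for a blocking fetch like fetchBlockSync, not the real Spark call:

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError

def fetch_block(host, port, exec_id):
    """Hypothetical stand-in for a blocking block fetch that is slow
    when driver and executors are in different regions."""
    time.sleep(0.5)  # simulate a fetch that does not finish in time
    return b"block-data"

# Bound the blocking call with a timeout instead of waiting forever,
# the same idea as Await.result(result.future, 10 seconds).
with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(fetch_block, "driver-host", 7077, "exec-1")
    try:
        data = future.result(timeout=0.1)
    except TimeoutError:
        data = None  # fail fast (or retry) instead of hanging
```

With the timeout the caller gets control back and can retry or surface an error, rather than blocking indefinitely on a cross-region fetch.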