Re: GraphX question about graph traversal

2014-08-20 Thread Cesar Arevalo
code I need to modify. I'll let you know how it goes. -Cesar On Wed, Aug 20, 2014 at 2:14 PM, Ankur Dave wrote: > At 2014-08-20 10:34:50 -0700, Cesar Arevalo > wrote: > > I would like to get the type B vertices that are connected through type A > > vertices where the
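
The thread does not show the actual graph schema, so the following is only a rough sketch of the traversal being asked about: vertex attributes are assumed to be plain "A"/"B" type tags, and the B-A-B pattern is found by joining triplets on the shared A vertex (written for spark-shell, where sc is already in scope):

    import org.apache.spark.SparkContext._   // pair-RDD implicits on Spark 1.x
    import org.apache.spark.graphx._

    // Hypothetical toy graph: vertices tagged "A" or "B"
    val vertices = sc.parallelize(Seq(
      (1L, "B"), (2L, "A"), (3L, "B"), (4L, "A"), (5L, "B")))
    val edges = sc.parallelize(Seq(
      Edge(1L, 2L, 1), Edge(2L, 3L, 1), Edge(3L, 4L, 1), Edge(4L, 5L, 1)))
    val graph = Graph(vertices, edges)

    // B -> A edges keyed by the A vertex, and A -> B edges keyed the same way
    val bToA = graph.triplets
      .filter(t => t.srcAttr == "B" && t.dstAttr == "A")
      .map(t => (t.dstId, t.srcId))
    val aToB = graph.triplets
      .filter(t => t.srcAttr == "A" && t.dstAttr == "B")
      .map(t => (t.srcId, t.dstId))

    // Join on the shared A vertex: pairs of B vertices connected through it
    val bPairsThroughA = bToA.join(aToB)
      .map { case (_, (b1, b2)) => (b1, b2) }
      .filter { case (b1, b2) => b1 != b2 }

    bPairsThroughA.collect().foreach(println)   // (1,3) and (3,5) for this toy graph

Edge direction matters here; an undirected reading of the graph would also need the reversed triplets.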

Re: GraphX question about graph traversal

2014-08-20 Thread Cesar Arevalo
uestion-about-graph-traversal-tp12491p12494.html > Sent from the Apache Spark User List mailing list archive at Nabble.com. > > - > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org > For additional comma

GraphX question about graph traversal

2014-08-20 Thread Cesar Arevalo
I was thinking of using the pregel API, so I will continue looking into that. Anyway, I look forward to a response. Best, -- Cesar Arevalo Software Engineer ❘ Zephyr Health 450 Mission Street, Suite #201 ❘ San Francisco, CA 94105 m: +1 415-571-7687 ❘ s: arevalocesar | t: @zephyrhealth <https://twitter.
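
For reference, a Pregel-based version of the same idea could look roughly like the sketch below, under the same assumed "A"/"B" vertex tags as in the reply above: two supersteps, with B vertices pushing their ids to A neighbours and A vertices forwarding what they collected on to their B neighbours.

    import org.apache.spark.graphx._

    // Same hypothetical toy graph as above, with the vertex state extended
    // to (type tag, set of B ids seen so far)
    val vertices = sc.parallelize(Seq(
      (1L, "B"), (2L, "A"), (3L, "B"), (4L, "A"), (5L, "B")))
    val edges = sc.parallelize(Seq(
      Edge(1L, 2L, 1), Edge(2L, 3L, 1), Edge(3L, 4L, 1), Edge(4L, 5L, 1)))
    val graph = Graph(vertices, edges)
      .mapVertices((id, tag) => (tag, Set.empty[VertexId]))

    val result = graph.pregel(Set.empty[VertexId], 2)(
      // vprog: fold incoming B ids into the vertex state
      (id, attr, msg) => (attr._1, attr._2 ++ msg),
      // sendMsg: B pushes its own id to A; A forwards what it has to B
      t => (t.srcAttr._1, t.dstAttr._1) match {
        case ("B", "A") => Iterator((t.dstId, t.srcAttr._2 + t.srcId))
        case ("A", "B") => Iterator((t.dstId, t.srcAttr._2))
        case _          => Iterator.empty
      },
      // mergeMsg: union messages arriving at the same vertex
      (a, b) => a ++ b)

    // B vertices now carry the B ids reachable through an A vertex
    result.vertices.filter(_._2._1 == "B").collect().foreach(println)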

Re: NullPointerException when connecting from Spark to a Hive table backed by HBase

2014-08-19 Thread Cesar Arevalo
com/NullPointerException-when-connecting-from-Spark-to-a-Hive-table-backed-by-HBase-tp12284p12331.html >> Sent from the Apache Spark User List mailing list archive at Nabble.com. >> >> - >> To unsubscribe, e-mail

Re: NullPointerException when connecting from Spark to a Hive table backed by HBase

2014-08-18 Thread Cesar Arevalo
ve-exec/hive-exec-0.12.0.jar > > Thanks > Best Regards > > > On Mon, Aug 18, 2014 at 12:45 PM, Cesar Arevalo > wrote: > >> Nope, it is NOT null. Check this out: >> >> scala> hiveContext == null >> res2: Boolean = false
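
Assuming the jar suggestion quoted above, one quick way to check from the spark-shell whether those classes are actually visible to the driver is simply to try loading them (the class names below are the standard Hive ones; note that HBaseStorageHandler lives in hive-hbase-handler rather than hive-exec):

    // Throws ClassNotFoundException if the corresponding jar is missing
    Class.forName("org.apache.hadoop.hive.ql.exec.Utilities")           // hive-exec
    Class.forName("org.apache.hadoop.hive.hbase.HBaseStorageHandler")   // hive-hbase-handler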

Re: NullPointerException when connecting from Spark to a Hive table backed by HBase

2014-08-18 Thread Cesar Arevalo
, Aug 18, 2014 at 12:00 AM, Akhil Das wrote: > Looks like your hiveContext is null. Have a look at this documentation. > <https://spark.apache.org/docs/latest/sql-programming-guide.html#hive-tables> > > Thanks > Best Regards > > > On Mon, Aug 18, 2014 at 12:09
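
The Hive-tables section of the guide linked above boils down to something like the following on Spark 1.0.x (where the method was still called hql; later releases renamed it to sql):

    import org.apache.spark.sql.hive.HiveContext

    val hiveContext = new HiveContext(sc)   // sc is the existing SparkContext
    hiveContext.hql("SHOW TABLES").collect().foreach(println)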

NullPointerException when connecting from Spark to a Hive table backed by HBase

2014-08-17 Thread Cesar Arevalo
ar:/opt/spark-poc/lib_managed/bundles/com.jolbox/bonecp/bonecp-0.7.1.RELEASE.jar:/opt/spark-poc/sbt/ivy/cache/com.datastax.cassandra/cassandra-driver-core/bundles/cassandra-driver-core-2.0.4.jar:/opt/spark-poc/lib_managed/jars/org.json/json/json-20090211.jar Can anybody help me? Best, --
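
For context, reading an HBase-backed Hive table from Spark usually means a table defined through the Hive HBase storage handler plus a plain HiveQL query from the HiveContext. The sketch below uses made-up table and column names with the standard storage-handler DDL, and assumes the hive-exec and hive-hbase-handler jars discussed in the replies above are on the classpath:

    import org.apache.spark.sql.hive.HiveContext

    val hiveContext = new HiveContext(sc)

    // Hypothetical HBase-backed table; the mapping string follows the
    // standard hbase.columns.mapping format (":key" plus family:qualifier)
    hiveContext.hql("""
      CREATE EXTERNAL TABLE IF NOT EXISTS my_hbase_table (key STRING, value STRING)
      STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
      WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf:value")
      TBLPROPERTIES ("hbase.table.name" = "my_hbase_table")
    """)

    hiveContext.hql("SELECT key, value FROM my_hbase_table LIMIT 10")
      .collect().foreach(println)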

Re: Broadcast variable in Spark Java application

2014-07-07 Thread Cesar Arevalo
Hi Praveen: It may be easier for other people to help you if you provide more details about what you are doing. It may be worthwhile to also mention which spark version you are using. And if you can share the code which doesn't work for you, that may also give others more clues as to what you a
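
Since the question was about broadcast variables, the minimal pattern looks like this (shown in Scala for brevity; the Java API is the same idea via JavaSparkContext.broadcast, and the lookup map here is made up for illustration):

    // Ship a read-only lookup table to every executor once, instead of
    // serializing it with each task
    val lookup = Map("a" -> 1, "b" -> 2)
    val lookupBc = sc.broadcast(lookup)

    val data = sc.parallelize(Seq("a", "b", "a", "c"))
    val mapped = data.map(x => lookupBc.value.getOrElse(x, 0)).collect()
    // mapped: Array(1, 2, 1, 0)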

Re: Spark 1.0 failed on HDP 2.0 with absurd exception

2014-07-05 Thread Cesar Arevalo
rkers NUM Number of workers to start (Default: 2) > --worker-cores NUM Number of cores for the workers (Default: 1) > --worker-memory MEM Memory per Worker (e.g. 1000M, 2G) (Default: 1G) > > Seems like the old Spark notation, any ideas? > > Thank you, > Konst

Re: Spark Streaming on top of Cassandra?

2014-07-04 Thread Cesar Arevalo
Hi Zarzyk: If I were you, just to start, I would look at the following: https://groups.google.com/forum/#!topic/spark-users/htQQA3KidEQ http://www.slideshare.net/planetcassandra/south-bay-cassandrealtime-analytics-using-cassandra-spark-and-shark-at-ooyala http://spark-summit.org/2014/talk/using-s
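
One concrete option from around that time is the DataStax spark-cassandra-connector (not necessarily what the linked talks use). Purely as an illustration, with the keyspace, table, host and column names all made up and the connector dependency assumed to be on the classpath, a streaming word count written out to Cassandra could look roughly like this:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.StreamingContext._   // pair-DStream implicits on Spark 1.0.x
    import com.datastax.spark.connector._
    import com.datastax.spark.connector.streaming._

    val conf = new SparkConf()
      .setAppName("streaming-to-cassandra")
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Hypothetical: count words from a socket and write them to a
    // pre-created table test.words (word text PRIMARY KEY, count int)
    ssc.socketTextStream("localhost", 9999)
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
      .saveToCassandra("test", "words", SomeColumns("word", "count"))

    ssc.start()
    ssc.awaitTermination()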

Anybody changed their mind about going to the Spark Summit 2014

2014-06-27 Thread Cesar Arevalo
Hi All: I was wondering if anybody had bought a ticket for the upcoming Spark Summit 2014 this coming week and had changed their mind about going. If so, let me know: since it has sold out and I can't buy a ticket anymore, I would be interested in buying it. Best, -- Cesar Arevalo Software Eng