Hi,
I'm getting the same error while manually setting up Spark cluster.
Has there been any update about this error?
Rgds
Niranda
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Invalid-Class-Exception-tp6859p13346.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
Hi,
I had the same issue in my Java code while trying to connect to a
locally hosted Spark server (started via sbin/start-all.sh) from an IDE
(IntelliJ).
I packaged my app into a jar and submitted it with bin/spark-submit, and it worked!
Hope this helps
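For reference, the build-and-submit step looks roughly like this (the build tool, class name, master URL, and jar path below are placeholders, not the exact ones from my setup):

```shell
# Build the application jar (e.g. with Maven), then hand it to spark-submit.
mvn package

./bin/spark-submit \
  --class com.example.AvroSparkTest \
  --master spark://your-host:7077 \
  target/avro-spark-test-1.0.jar
```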
Rgds
Hi,
I'm planning to use the Spark SQL JDBC datasource provider against various
RDBMS databases.
What are the databases currently supported by the Spark JDBC relation provider?
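As far as I know, the JDBC relation provider (Spark 1.3+) is not tied to a fixed list of databases: any database with a JDBC driver on the driver/executor classpath can be read, with vendor-specific type-mapping handled specially for MySQL and PostgreSQL. A registration sketch, with placeholder connection details:

```sql
-- Register a JDBC-backed temporary table (Spark 1.3+ syntax).
-- The url and dbtable values are placeholders.
CREATE TEMPORARY TABLE jdbc_people
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:postgresql://localhost:5432/testdb",
  dbtable "public.people"
)
```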
rgds
--
Niranda
@n1r44 <https://twitter.com/N1R44>
https://pythagoreanscript.wordpress.com/
Hi,
I would like to know if there is an update on this?
rgds
On Mon, Jan 12, 2015 at 10:44 AM, Niranda Perera
wrote:
> Hi,
>
> I found out that SparkSQL supports only a relatively small subset of SQL
> dialect currently.
>
> I would like to know the roadmap for the coming releases.
y optimizer / execution engine instead of the
> catalyst optimizer that is shipped with Spark.
>
> On Thu, Jan 22, 2015 at 3:12 AM, Niranda Perera
> wrote:
>
>> Hi,
>>
>> would like to know if there is an update on this?
>>
>> rgds
>>
>> On Mo
--
Niranda
[1]
https://github.com/wso2-dev/carbon-analytics/tree/master/components/xanalytics
--
*Niranda Perera*
Software Engineer, WSO2 Inc.
Mobile: +94-71-554-8430
Twitter: @n1r44 <https://twitter.com/N1R44>
and there is an example library for reading Avro data
> <https://github.com/databricks/spark-avro>.
>
> On Thu, Nov 27, 2014 at 10:31 PM, Niranda Perera wrote:
>
>> Hi,
>>
>> I am evaluating Spark for an analytic component where we do batch
>> processing o
public static void main(String[] args) throws Exception {
    SparkConf sparkConf = new SparkConf()
            .setMaster("local[2]")
            .setAppName("avro-spark-test")
            .setSparkHome("/home/niranda/software/spark-1.2.0-bin-hadoop1");
    JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
Hi,
I have this simple spark app.
public class AvroSparkTest {
    public static void main(String[] args) throws Exception {
        SparkConf sparkConf = new SparkConf()
                .setMaster("spark://niranda-ThinkPad-T540p:7077") // ("local[2]")
                .setAppName("avro-spark-test");
NoSuchMethodError:
com.google.common.hash.HashFunction.hashInt" error occurs,
which is understandable because hashInt is not available before Guava 12.
So, I'm wondering why this occurs?
Cheers
--
Niranda Perera
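A common workaround for this kind of Guava clash, sketched for Maven (the hadoop-client artifact and the versions shown are illustrative; adjust to whichever module actually pulls in Guava 11): exclude the transitive Guava and depend on a newer one directly.

```xml
<!-- Exclude the old Guava that Hadoop pulls in transitively... -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>1.2.1</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- ...then pin a Guava that has hashInt (12+). -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>14.0.1</version>
</dependency>
```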
SparkConf sparkConf = new SparkConf()
        .setMaster("spark://niranda-ThinkPad-T540p:7077")
        // ("local[2]")
        .setAppName("avro-spark-test");
JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
JavaSQLContext sqlContext = new JavaSQLContext(sparkContext);
app you send
> to spark-submit.
>
> On Tue, Jan 6, 2015 at 10:15 AM, Niranda Perera
> wrote:
>
>> Hi Sean,
>>
>> My mistake, Guava 11 dependency came from the hadoop-commons indeed.
>>
>> I'm running the following simple app in spark 1.2.0 standalone
Hi,
Are INSERT statements supported in Spark? If so, can you please give me an
example?
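For what it's worth, the HiveQL dialect (via HiveContext) accepts HiveQL-style inserts; a sketch with placeholder table names, assuming both tables are already registered:

```sql
-- Append rows selected from one table into another.
INSERT INTO TABLE target_table
SELECT id, name FROM source_table;

-- Replace the target table's contents instead of appending.
INSERT OVERWRITE TABLE target_table
SELECT id, name FROM source_table;
```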
Rgds
--
Niranda
Hi,
I found out that SparkSQL supports only a relatively small subset of SQL
dialect currently.
I would like to know the roadmap for the coming releases.
And, are you focusing more on popularizing the 'Hive on Spark' SQL dialect
or the Spark SQL dialect?
Rgds
--
Niranda