How to implement top() and filter() on object List for JavaRDD

2015-07-07 Thread Hafsa Asif
Hi, I have an object list of Users and I want to implement the top() and filter() methods on it. Let me explain the whole scenario: 1. I have a User object list named "usersList". I populate it while reading a record set. User user = new User(); user.setUserName(rec
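In Spark's Java API, JavaRDD.filter takes a function returning a Boolean, and JavaRDD.top(n, comparator) takes a Comparator (which Spark also needs to be serializable). As a minimal local sketch of the same filter/top logic with no Spark dependency — the User class here is a hypothetical stand-in for the poster's class:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public class UserFilterSketch {
    // Hypothetical stand-in for the poster's User class.
    static class User {
        private final String userName;
        User(String userName) { this.userName = userName; }
        String getUserName() { return userName; }
    }

    // Same predicate shape you would pass to JavaRDD.filter(...).
    static List<User> filterByPrefix(List<User> users, String prefix) {
        return users.stream()
                .filter(u -> u.getUserName().startsWith(prefix))
                .collect(Collectors.toList());
    }

    // Same ordering JavaRDD.top(n, comparator) uses: largest elements first.
    static List<String> topNames(List<User> users, int n) {
        return users.stream()
                .sorted(Comparator.comparing(User::getUserName).reversed())
                .limit(n)
                .map(User::getUserName)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<User> users = new ArrayList<>();
        users.add(new User("alice"));
        users.add(new User("bob"));
        users.add(new User("anna"));
        System.out.println(topNames(users, 2)); // [bob, anna]
    }
}
```

Once the predicate and comparator work on a plain List like this, moving them onto the RDD is mechanical: parallelize the list and pass the same logic to filter() and top().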

How to solve ThreadException in Apache Spark standalone Java Application

2015-07-07 Thread Hafsa Asif
Hi, I run the following simple Java Spark standalone app with the Maven command "exec:java -Dexec.mainClass=SimpleApp": public class SimpleApp { public static void main(String[] args) { System.out.println("Reading and Connecting with Spark."); try { String logFile =

Re: How to implement top() and filter() on object List for JavaRDD

2015-07-07 Thread Hafsa Asif
Thank you for your quick response. But I tried this and got the error shown in the attached pic error.jpg -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-implement-top-and-filter-on-o

Re: How to implement top() and filter() on object List for JavaRDD

2015-07-07 Thread Hafsa Asif
I have also tried this code snippet, thinking it might at least compile: Function1 FILTER_USER = new AbstractFunction1() { public Object apply(User user){ return user; } }; FILTER_USER is fine but cannot be applied to the following two options but no

Re: How to solve ThreadException in Apache Spark standalone Java Application

2015-07-07 Thread Hafsa Asif
I also tried sc.stop(). Sorry, I did not include that in my question, but I am still getting the thread exception. I should also mention that I am working on a VM. 15/07/07 06:00:32 ERROR ActorSystemImpl: Uncaught error from thread [sparkDriver-akka.actor.default-dispatcher-5] java.lang.Interrupte

Re: How to implement top() and filter() on object List for JavaRDD

2015-07-07 Thread Hafsa Asif
Rusty, I am very thankful for your help. Actually, I am having difficulty with objects. My plan is this: I have a list of User objects. After parallelizing it through the Spark context, I apply a comparator on user.getUserName(). As the usernames are sorted, their related User object a
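The shape Spark expects for this is a named Comparator<User> class that is also Serializable (the default anonymous comparator usually fails serialization when shipped to executors). A minimal local sketch of sorting by userName and getting the full User objects back, with a hypothetical User stand-in and no Spark dependency:

```java
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class UserComparatorSketch {
    // Hypothetical stand-in for the poster's User class; Serializable so
    // Spark could ship the objects to executors.
    static class User implements Serializable {
        private static final long serialVersionUID = 1L;
        private final String userName;
        User(String userName) { this.userName = userName; }
        String getUserName() { return userName; }
    }

    // JavaRDD.top(n, comparator) needs the comparator itself to be
    // serializable, hence a named class implementing both interfaces.
    static class ByUserName implements Comparator<User>, Serializable {
        private static final long serialVersionUID = 1L;
        @Override
        public int compare(User a, User b) {
            return a.getUserName().compareTo(b.getUserName());
        }
    }

    // Local equivalent of top(n, new ByUserName()): the full User objects
    // come back, ordered by userName, largest first.
    static List<User> topByName(List<User> users, int n) {
        List<User> copy = new ArrayList<>(users);
        copy.sort(new ByUserName().reversed());
        return copy.subList(0, Math.min(n, copy.size()));
    }

    public static void main(String[] args) {
        List<User> users = new ArrayList<>();
        users.add(new User("carol"));
        users.add(new User("alice"));
        System.out.println(topByName(users, 1).get(0).getUserName()); // carol
    }
}
```

The point is that the comparator ranks whole User objects by one field, so top() returns the related objects, not just the names.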

Re: How to implement top() and filter() on object List for JavaRDD

2015-07-07 Thread Hafsa Asif
-07 16:54 GMT+02:00 Hafsa Asif : > Rusty, > > I am very thankful for your help. Actually, I am facing difficulty in > objects. My plan is that, I have an object list containing list of User > objects. After parallelizing it through spark context, I apply comparator > on user

Re: How to implement top() and filter() on object List for JavaRDD

2015-07-07 Thread Hafsa Asif
@Override >>> public String call(PPP arg0) throws Exception { >>> String userName = usr1.getUserName().toUpperCase(); >>> return userName ; >>> } >>> >>>
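The quoted call() has a bug worth noting: it uppercases a captured usr1 rather than its own argument (arg0), so every element would map to the same name. A corrected sketch of the mapping logic, using plain java.util.function.Function in place of Spark's Function interface, with a hypothetical User stand-in:

```java
import java.util.function.Function;

public class UpperCaseNameSketch {
    // Hypothetical stand-in for the poster's User class.
    static class User {
        private final String userName;
        User(String userName) { this.userName = userName; }
        String getUserName() { return userName; }
    }

    // Corrected: operate on the function's own argument,
    // not on an object captured from the enclosing scope.
    static final Function<User, String> TO_UPPER_NAME =
            user -> user.getUserName().toUpperCase();

    public static void main(String[] args) {
        System.out.println(TO_UPPER_NAME.apply(new User("hafsa"))); // HAFSA
    }
}
```

The same one-line body works inside Spark's call(User arg0) method; the only change is using arg0 instead of the captured object.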

Re: How to solve ThreadException in Apache Spark standalone Java Application

2015-07-14 Thread Hafsa Asif
I am still waiting for an answer. I want to know how to properly shut down everything Spark-related in a Java standalone app. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-solve-ThreadException-in-Apache-Spark-standalone-Java-Application-tp23675p23
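The usual answer is to call sc.stop() in a finally block so the context shuts down even when the job throws; otherwise the driver's Akka actor system can be torn down abruptly, which is one common source of the InterruptedException seen above (on later 1.x releases JavaSparkContext also implements Closeable, so try-with-resources works too, though check your version). A minimal sketch of the pattern with a hypothetical stand-in for the context, no Spark dependency:

```java
public class ShutdownSketch {
    // Hypothetical stand-in for JavaSparkContext; only tracks that stop() ran.
    static class FakeContext {
        boolean stopped = false;
        void stop() { stopped = true; }
    }

    // The pattern: whatever the job does, stop() runs in finally, so
    // shutdown is orderly instead of the threads being interrupted.
    static boolean stopRunsEvenOnFailure() {
        FakeContext sc = new FakeContext();
        try {
            try {
                throw new RuntimeException("job failed");
            } finally {
                sc.stop();
            }
        } catch (RuntimeException expected) {
            // the job's failure still propagates; stop() has already run
        }
        return sc.stopped;
    }

    public static void main(String[] args) {
        System.out.println("stop() ran: " + stopRunsEvenOnFailure()); // stop() ran: true
    }
}
```

With the real JavaSparkContext the structure is the same: create the context, do all work inside the try, and put sc.stop() in the finally.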

Re: Spark Intro

2015-07-14 Thread Hafsa Asif
Hi, I was in the same situation, as we were using MySQL. Let me give some clarifications: 1. Spark provides a great methodology for big data analysis. So, if you want to make your system more analytical and want ready-made analytical methods to analyze your data, then it is a very good option.

Re: Create RDD from output of unix command

2015-07-14 Thread Hafsa Asif
Your question is very interesting. What I suggest is: copy your output into a text file, read the text file in your code, and create an RDD from it. Just consider the wordcount example from Spark; I love this example with the Java client. Well, Spark is an analytical engine and its slogan is to analyze big big data s
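You can also skip the intermediate file and capture the command's stdout directly as a list of lines, which is exactly what sc.parallelize(lines) needs to build a JavaRDD<String>. A self-contained sketch (assumes a Unix-like system where the echo command exists; no Spark dependency):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class CommandToLines {
    // Run a command and collect its stdout lines. The resulting list is
    // what you would hand to sc.parallelize(lines), one element per line.
    static List<String> run(String... command) {
        try {
            Process p = new ProcessBuilder(command)
                    .redirectErrorStream(true)   // merge stderr into stdout
                    .start();
            List<String> lines = new ArrayList<>();
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) {
                    lines.add(line);
                }
            }
            p.waitFor();
            return lines;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(run("echo", "hello")); // [hello]
    }
}
```

For large outputs the text-file route is still safer, since parallelize() holds the whole list in driver memory while sc.textFile() streams from disk.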

Re: Spark application with a RESTful API

2015-07-14 Thread Hafsa Asif
I have almost the same case. I will tell you what I am actually doing; if it matches your requirement, I will be glad to help you. 1. My database is Aerospike; I get data from it. 2. I have written a standalone Spark app (it does not run in standalone mode, but with a simple java command or Maven c

Issue while applying filters/conditions in DataFrame in Spark

2016-03-22 Thread Hafsa Asif
Hello everyone, I am trying to take advantage of DataFrames (to perform SQL-based operations like WHERE clauses, joins, etc.) as mentioned in https://spark.apache.org/docs/1.5.1/api/java/org/apache/spark/sql/DataFrame.html. I am using Aerospike and the Spark (1.4.1) Java Client in the Spring Framewor

Re: Issue while applying filters/conditions in DataFrame in Spark

2016-03-22 Thread Hafsa Asif
Yes, I know it is because of a NullPointerException, but I cannot understand why. The complete stack trace is: [2016-03-22 13:40:14.894] boot - 10493 WARN [main] --- AnnotationConfigApplicationContext: Exception encountered during context initialization - cancelling refresh attempt: org.springframe

Re: Issue while applying filters/conditions in DataFrame in Spark

2016-03-22 Thread Hafsa Asif
Even if I use this query, it still gives a NullPointerException: "SELECT clientId FROM activePush" -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Issue-wihle-applying-filters-conditions-in-DataFrame-in-Spark-tp26560p26562.html Sent from the Apache Spark Us
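A NullPointerException on even a trivial query usually means the problem is upstream of the SQL: either a bean that failed to initialize (the Spring context warning in the stack trace points that way) or null values in the data being turned into the DataFrame. One cheap check is to guard the input values before they reach createDataFrame / the registered table. A minimal local sketch of that null guard (plain Java, no Spark dependency):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Objects;
import java.util.stream.Collectors;

public class NullGuardSketch {
    // Drop null values before they reach createDataFrame / SQL,
    // where they would otherwise surface later as a NullPointerException.
    static List<String> validClientIds(List<String> clientIds) {
        return clientIds.stream()
                .filter(Objects::nonNull)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> ids = Arrays.asList("c1", null, "c2");
        System.out.println(validClientIds(ids)); // [c1, c2]
    }
}
```

If the guarded data is clean and the NPE persists, the SQLContext or the Spring-managed bean holding it is the more likely culprit.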

Re: Issue while applying filters/conditions in DataFrame in Spark

2016-03-22 Thread Hafsa Asif
springBootVersion = '1.2.8.RELEASE' springDIVersion = '0.5.4.RELEASE' thriftGradleVersion = '0.3.1' Other Gradle configs: compile "org.apache.thrift:libthrift:0.9.3" compile 'org.slf4j:slf4j-api:1.7.14' compile 'org.apache.kafka:kafka_2.11:0.9.0.0' compile 'org.apach

Serialization issue with Spark

2016-03-22 Thread Hafsa Asif
Hello, I am facing a serialization issue in Spark (1.4.1 - Java Client) with the Spring Framework. It is known that Spark needs serialization, requiring every such class to implement java.io.Serializable. But in the documentation link: http://spark.apache.org/docs/latest/tuning.html
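To be precise, Spark only serializes objects that cross the driver/executor boundary, including anything captured by the functions passed to transformations; with the default Java serializer those classes must implement java.io.Serializable (the tuning guide's alternative is Kryo). A quick local round-trip check, useful for finding the offending field before Spark reports it, sketched with a hypothetical User class:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationCheck {
    // Hypothetical stand-in for a class Spark would need to ship.
    static class User implements Serializable {
        private static final long serialVersionUID = 1L;
        final String userName;
        User(String userName) { this.userName = userName; }
    }

    // Serialize and deserialize an object, as Spark's Java serializer would.
    @SuppressWarnings("unchecked")
    static <T extends Serializable> T roundTrip(T obj) {
        try {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
                out.writeObject(obj);   // NotSerializableException surfaces here
            }
            try (ObjectInputStream in = new ObjectInputStream(
                    new ByteArrayInputStream(bytes.toByteArray()))) {
                return (T) in.readObject();
            }
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        User copy = roundTrip(new User("hafsa"));
        System.out.println(copy.userName); // hafsa
    }
}
```

With Spring, the common trap is a Spark function capturing a whole service bean; marking such fields transient, or constructing the needed values inside the function, avoids dragging non-serializable beans into the closure.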

Re: Serialization issue with Spark

2016-03-23 Thread Hafsa Asif
Can anyone please help me with this issue? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Serialization-issue-with-Spark-tp26565p26579.html Sent from the Apache Spark User List mailing list archive at Nabble.com. -

Convert Simple Kafka Consumer to standalone Spark JavaStream Consumer

2015-07-21 Thread Hafsa Asif
Hi, I have a simple high-level Kafka consumer like: package matchinguu.kafka.consumer; import kafka.consumer.Consumer; import kafka.consumer.ConsumerConfig; import kafka.consumer.ConsumerIterator; import kafka.consumer.KafkaStream; import kafka.javaapi.consumer.ConsumerConnector; import java.u
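The structural change from a high-level Kafka consumer to Spark Streaming is an inversion of control: instead of a while-loop pulling from a ConsumerIterator, you hand processing logic to the stream, which applies it per micro-batch (in Spark 1.x, KafkaUtils.createStream gives a DStream and the loop body moves into foreachRDD). A local sketch of that inversion using a BlockingQueue as a stand-in for the stream, with no Kafka or Spark dependency:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Consumer;

public class StreamInversionSketch {
    // Instead of pulling messages in a while-loop (ConsumerIterator style),
    // hand the processing logic in as a callback applied to each message.
    static void drainBatch(BlockingQueue<String> queue, Consumer<String> handler) {
        List<String> batch = new ArrayList<>();
        queue.drainTo(batch);        // one "micro-batch", like a DStream interval
        batch.forEach(handler);      // what the foreachRDD body would do
    }

    public static void main(String[] args) {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        queue.add("m1");
        queue.add("m2");
        drainBatch(queue, msg -> System.out.println("processed " + msg));
    }
}
```

Porting the consumer above means moving whatever the ConsumerIterator loop did per message into a handler of this shape, then passing it to the DStream instead of running the loop yourself.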