You can create a JavaRDD as normal and then call .rdd() on it to get the underlying RDD.
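For example, a minimal sketch in Scala (from Java the same call is
javaRdd.rdd(); the data and variable names below are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.api.java.JavaRDD
    import org.apache.spark.rdd.RDD

    val sc = new SparkContext(new SparkConf().setAppName("rdd-example").setMaster("local[*]"))

    // Build the JavaRDD however you normally do...
    val javaRdd: JavaRDD[String] = JavaRDD.fromRDD(sc.parallelize(Seq("a", "b", "c")))

    // ...then unwrap it to get the plain Scala RDD that the connector expects.
    val plainRdd: RDD[String] = javaRdd.rdd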
Thanks
Best Regards
On Mon, Sep 28, 2015 at 9:01 PM, Rohith P
wrote:
> Hi all,
> I am trying to work with the spark-redis connector (redislabs), which
> requires all transactions between Redis and Spark to be in RDDs. Th
Hi,
I am trying to use internal UDFs that we have added as permanent functions
to Hive from within a Spark SQL query (using HiveContext), but I encounter
NoSuchObjectException, i.e. the function could not be found.
However, if I execute the 'show functions' command in Spark SQL, the permanent
function
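The call that fails looks roughly like this (a sketch; the function and table
names are placeholders):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    val sc = new SparkContext(new SparkConf().setAppName("udf-test"))
    val hiveContext = new HiveContext(sc)

    // my_permanent_udf was registered in Hive with CREATE FUNCTION ... USING JAR ...
    // 'show functions' lists it, but invoking it fails with NoSuchObjectException.
    hiveContext.sql("SELECT my_permanent_udf(col1) FROM some_table").show()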
See
http://stackoverflow.com/questions/26516865/is-it-possible-to-run-hadoop-jobs-like-the-wordcount-sample-in-the-local-mode,
https://issues.apache.org/jira/browse/SPARK-6961 and finally
https://issues.apache.org/jira/browse/HADOOP-10775. The easy solution is to
download a Windows Hadoop distribut
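Once winutils.exe is in place, you also need to point Spark/Hadoop at it before
creating the context; something like this (the path is just an example, and the
directory must contain bin\winutils.exe):

    // Equivalent to setting the HADOOP_HOME environment variable.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")

    import org.apache.spark.{SparkConf, SparkContext}
    val sc = new SparkContext(new SparkConf().setAppName("windows-test").setMaster("local[*]"))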
not sure, so I downloaded release 1.4.1 again with the 'Hadoop 2.6 and later'
option from http://spark.apache.org/downloads.html, assuming the versions are
consistent, and ran the following on Windows 10:
c:\spark-1.4.1-bin-hadoop2.6>bin\run-example HdfsTest
still got a similar exception below: (I heard ther
A very basic piece of support for this in DStream is DStream.transform(), which
takes an arbitrary RDD => RDD function. That function can choose to do a
different computation from batch to batch. That may be of help to you.
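Something like this (a minimal sketch; the socket source and the filter are just
placeholders for your actual input and per-batch computation):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("transform-example").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(10))
    val lines = ssc.socketTextStream("localhost", 9999)

    // transform applies an arbitrary RDD => RDD function to every batch,
    // and that function is free to behave differently from batch to batch.
    val filtered = lines.transform { rdd =>
      rdd.filter(_.contains("ERROR"))
    }
    filtered.print()

    ssc.start()
    ssc.awaitTermination()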
On Tue, Sep 29, 2015 at 12:06 PM, Archit Thakur
wrote:
> Hi,
>
> We are using spark
Dear Spark developers,
I have created a simple Spark application for spark-submit. It calls a machine
learning algorithm from Spark MLlib that is executed in a number of iterations
that correspond to the same number of tasks in Spark. It seems that Spark
creates an executor for each task and then
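The application is roughly of this shape (a sketch; KMeans is used here only as
a stand-in for the iterative MLlib call, and the input path is a placeholder):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.mllib.clustering.KMeans
    import org.apache.spark.mllib.linalg.Vectors

    val sc = new SparkContext(new SparkConf().setAppName("mllib-iterations"))
    val data = sc.textFile("data.txt")
      .map(line => Vectors.dense(line.split(' ').map(_.toDouble)))

    // The iterative MLlib call; each iteration launches Spark jobs/tasks.
    val model = KMeans.train(data, 10, 20)   // k = 10 clusters, 20 iterations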
Hi,
We are using Spark Streaming as our processing engine, and as part of the
output we want to push the data to a UI. Now there would be multiple users
accessing the system with their different filters on. Based on the filters
and other inputs we want to either run a SQL query on the DStream or do a
custo
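To make the 'SQL query on DStream' part concrete, the per-batch pattern we have
in mind is roughly this (a sketch; the event class, source and query are
placeholders):

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SQLContext
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    case class Event(userId: String, value: Int)

    val conf = new SparkConf().setAppName("sql-on-dstream").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(10))
    val events = ssc.socketTextStream("localhost", 9999)
      .map(_.split(","))
      .map(a => Event(a(0), a(1).toInt))

    events.foreachRDD { rdd =>
      val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
      import sqlContext.implicits._
      rdd.toDF().registerTempTable("events")
      // Run whatever filter/aggregation the user's UI settings ask for on this batch.
      sqlContext.sql("SELECT userId, count(*) AS total FROM events GROUP BY userId").show()
    }

    ssc.start()
    ssc.awaitTermination()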
Dear All,
I am trying to understand how exactly the Spark network module works. Looking
at the Netty package, I would like to intercept every server response for a
block fetch. As I understand it, the place responsible for sending remote
blocks is "TransportRequestHandler.processFetchRequest". I'm tryin