bcc dev@ and add user@
This is more of a user@ list question than a dev@ list question. You can
do something like this:
object MySimpleApp {
  // Define some idempotent way to load resources, e.g. with a flag or lazy val.
  def loadResources(): Unit = ...

  def main(args: Array[String]): Unit = {
    ...
    sc.parallelize(
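A concrete sketch of the "flag or lazy val" idea (the object and library
names here are placeholders, not from the original mail): a lazy val is
initialized at most once per JVM, so each executor loads the native
library exactly once regardless of how many tasks run on it.

object NativeLib {
  // Evaluated at most once per JVM. Every executor that touches this
  // val loads the shared library a single time; later uses are no-ops.
  lazy val loaded: Unit = System.loadLibrary("mylib") // expects libmylib.so on java.library.path

  def ensureLoaded(): Unit = loaded
}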
Hi,
I am trying to invoke a C library from Spark using the JNI interface
(here is the sample application code):
class SimpleApp {
  // --- Native methods
  @native def foo(Top: String): String
}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simp
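For completeness, a hedged sketch of how such a binding can be driven from
an RDD, combining it with the lazy-val loading suggested above (the library
name "simpleapp" and the sample input are assumptions, not from the
original code):

import org.apache.spark.{SparkConf, SparkContext}

object SimpleAppDriver {
  // Loaded lazily, at most once per JVM (driver or executor).
  lazy val nativeLoaded: Unit = System.loadLibrary("simpleapp")

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SimpleApp"))
    val out = sc.parallelize(Seq("a", "b", "c"), numSlices = 2)
      .mapPartitions { iter =>
        nativeLoaded // forces the library load on the executor JVM
        val app = new SimpleApp
        iter.map(app.foo)
      }
      .collect()
    out.foreach(println)
    sc.stop()
  }
}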
Thanks to all, I solved the problem.
I'm sorry if the question was off-topic; next time I'll post to
Stack Overflow.
Thanks a lot
2016-11-25 17:19 GMT+01:00 marco rocchi:
> Thanks for the help.
> I've created my SSH tunnel at port 4040 and set Firefox's SOCKS proxy
> to localhost:4040.
> N
Marco,
Depending on your configuration, maybe what you're looking for is:
localhost:4040
Check this StackOverflow answer:
http://stackoverflow.com/questions/31460079/spark-ui-on-aws-emr
or similar questions. This is not a Spark-specific issue.
Please check StackOverflow or post to the user mailing list.
Thank you.
On 11/25/2016 02:02 PM, Takeshi Yamamuro wrote:
> Hi,
>
> It seems we forgot to pass `parts: Array[Partition]` into `JDBCRelation`.
> This was removed in this commit:
> https://github.com/apache/spark/commit/b3130c7b6a1ab4975023f08c3ab02ee8d2c7e995#diff-f70bda59304588cc3abfa3a9840653f4L237
Thanks for the help.
I've created my SSH tunnel at port 4040 and set Firefox's SOCKS proxy to
localhost:4040.
Now when I run a job I can see the INFO message: "SparkUI activated at
http://192.168.1.204:4040". But if I open the browser and type localhost
or http://192.168.1.204:4040, the webU
I believe https://github.com/apache/spark/pull/15975 fixes this regression.
I am sorry for the trouble.
2016-11-25 22:23 GMT+09:00 Sean Owen:
> See https://github.com/apache/spark/pull/15499#discussion_r89008564 in
> particular. Hyukjin / Xiao, do we need to undo part of this change?
>
>
> On Fr
See https://github.com/apache/spark/pull/15499#discussion_r89008564 in
particular. Hyukjin / Xiao, do we need to undo part of this change?
On Fri, Nov 25, 2016 at 1:02 PM Takeshi Yamamuro wrote:
> Hi,
>
> It seems we forgot to pass `parts: Array[Partition]` into `JDBCRelation`.
> This was removed in
Hi,
It seems we forgot to pass `parts: Array[Partition]` into `JDBCRelation`.
This was removed in this commit:
https://github.com/apache/spark/commit/b3130c7b6a1ab4975023f08c3ab02ee8d2c7e995#diff-f70bda59304588cc3abfa3a9840653f4L237
// maropu
On Fri, Nov 25, 2016 at 9:50 PM, Maciej Szymkiewicz wrote:
Hi,
I've been reviewing my notes to https://git.io/v1UVC using Spark built
from 51b1c1551d3a7147403b9e821fcc7c8f57b4824c and it looks like JDBC
ignores both:
* (columnName, lowerBound, upperBound, numPartitions)
* predicates
and loads everything into a single partition. Can anyone confirm th
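For reference, these are the two read paths being discussed; under the
regression both reportedly come back as a single partition. The URL, table
name, and bounds below are placeholders:

import java.util.Properties
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("jdbc-partitions").getOrCreate()
val url = "jdbc:postgresql://localhost/db" // placeholder connection string
val props = new Properties()

// Range partitioning: should produce numPartitions (here 10) partitions
// by splitting [lowerBound, upperBound) on the given column.
val byColumn = spark.read.jdbc(url, "some_table", "id", 0L, 1000L, 10, props)

// Explicit predicates: should produce one partition per predicate.
val byPredicates =
  spark.read.jdbc(url, "some_table", Array("id < 500", "id >= 500"), props)

// Quick check for the reported behavior:
println(byColumn.rdd.getNumPartitions)     // expected 10
println(byPredicates.rdd.getNumPartitions) // expected 2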
This is more of a question for the Spark user's list, but if you look at
FoxyProxy and SSH tunnels it'll get you going.
These instructions from AWS for accessing EMR are a good start:
http://docs.aws.amazon.com/ElasticMapReduce/latest/DeveloperGuide/emr-ssh-tunnel.html
http://docs.aws.amazon.com
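As a concrete illustration of the tunnel those docs describe, a dynamic
(SOCKS) forward looks roughly like this; the key path, user, and host are
placeholders, and 8157 is simply the port the AWS guide uses:

ssh -i ~/mykey.pem -N -D 8157 hadoop@ec2-xx-xx-xx-xx.compute-1.amazonaws.com

With that running, point the browser's SOCKS proxy (e.g. via FoxyProxy) at
localhost:8157 and open the Spark UI on port 4040 of the master node.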
SparkPi is just an example, so its performance doesn't really matter.
Simpler is better.
Kryo could be an issue but that would be a change in Kryo.
On Fri, Nov 25, 2016 at 7:30 AM Prasun Ratn wrote:
> Hi,
>
> I am seeing perf degradation in the SparkPi example on a single-node
> setup (using lo
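For context, SparkPi boils down to a Monte Carlo estimate of pi; the sketch
below is a paraphrase of the shape of that example, not a verbatim copy,
which also shows why serialization (and hence Kryo) has little to do in it:

import org.apache.spark.{SparkConf, SparkContext}
import scala.math.random

object PiSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("PiSketch"))
    val slices = if (args.nonEmpty) args(0).toInt else 2
    val n = 100000 * slices
    // Sample random points in the unit square and count those that fall
    // inside the unit circle; only Ints cross the wire between tasks.
    val count = sc.parallelize(1 to n, slices).map { _ =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * count / n}")
    sc.stop()
  }
}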