So, can I increase the number of threads by setting it manually in the Spark
code?
On Sat, Feb 7, 2015 at 6:52 PM, Sean Owen wrote:
If you look at the threads, the other 30 are almost surely not Spark
worker threads. They're the JVM finalizer, GC threads, Jetty
listeners, etc. Nothing wrong with this. Your OS has hundreds of
threads running right now, most of which are idle, and up to 4 of which
can be executing at any moment. In a one-machine cluster, the same is true.
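You can see this for yourself on any JVM, Spark or not. A minimal standalone sketch (plain Java, no Spark involved; the class name is made up for illustration) that lists the live threads and how many are daemons:

```java
import java.util.Map;

public class ThreadCount {
    public static void main(String[] args) {
        // Snapshot of every live thread in this JVM.
        Map<Thread, StackTraceElement[]> all = Thread.getAllStackTraces();
        long daemons = all.keySet().stream().filter(Thread::isDaemon).count();

        System.out.println("Available cores: " + Runtime.getRuntime().availableProcessors());
        System.out.println("Live threads:    " + all.size() + " (daemon: " + daemons + ")");

        // Names like "Finalizer", "Reference Handler", "Signal Dispatcher"
        // are JVM housekeeping threads, not application worker threads.
        all.keySet().stream().map(Thread::getName).sorted()
           .forEach(n -> System.out.println("  " + n));
    }
}
```

Even this trivial program reports several threads despite doing no work itself, which is the same effect YourKit is showing for a Spark JVM, just at a larger scale.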
You have 4 CPU cores and 34 threads (system-wide you likely have many more,
by the way).
Think of it as having 4 espresso machines and 34 baristas. Does the fact
that you have only 4 espresso machines mean you can only have 4 baristas? Of
course not; there's plenty of work other than making espresso.
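The analogy is easy to demonstrate: a JVM will happily run far more threads than cores, because most of them spend their time waiting rather than computing. A minimal sketch (plain Java; the 34 is chosen only to mirror the numbers above):

```java
import java.util.concurrent.CountDownLatch;

public class BaristaDemo {
    public static void main(String[] args) throws InterruptedException {
        int cores = Runtime.getRuntime().availableProcessors();
        int baristas = 34;                          // far more threads than cores
        CountDownLatch done = new CountDownLatch(baristas);

        for (int i = 0; i < baristas; i++) {
            new Thread(() -> {
                try {
                    Thread.sleep(100);              // mostly idle: waiting, not computing
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                done.countDown();
            }).start();
        }

        done.await();                               // all 34 complete, regardless of core count
        System.out.println(baristas + " threads finished on " + cores + " cores");
    }
}
```

The OS scheduler time-slices the runnable threads over the available cores; only up to `cores` of them execute at any instant, exactly as described above.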
Hi,
I am using the YourKit tool to profile Spark jobs that run in my single-node
Spark cluster.
When I look at the YourKit UI performance charts, the thread count always
remains at:
All threads: 34
Daemon threads: 32
Here are my questions:
1. My system can run only 4 threads simultaneously, so how can 34 threads be
running at once?