mode)?
On Thu, Oct 6, 2016 at 8:14 PM, Vadim Semenov
wrote:
> It may be related to the CDH version of Spark you're using.
> When I use the REST API I get the YARN application id there
>
> Try opening http://localhost:4040/api/v1/applications/0/stages
>
> On Thu, Oct 6, 2016
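For reference, the REST endpoints suggested above can also be queried from the
command line; a minimal sketch, with <app-id> standing in for an id returned
by /applications:

curl http://localhost:4040/api/v1/applications
curl http://localhost:4040/api/v1/applications/<app-id>/stages
curl http://localhost:4040/api/v1/applications/<app-id>/executors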
Hi,
When I start Spark v1.6 (cdh5.8.0) in YARN client mode I see that port 4040
is available, but the UI shows nothing and the API returns incomplete information.
I started Spark application like this:
spark-submit --master yarn-client --class
org.apache.spark.examples.SparkPi
/usr/lib/spark/examp
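A sketch of the full submit command, assuming the CDH examples jar sits at
roughly this path (the exact jar name varies between builds):

spark-submit --master yarn-client \
  --class org.apache.spark.examples.SparkPi \
  /usr/lib/spark/examples/lib/spark-examples.jar 10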
Hi,
When I start Spark v1.6 (cdh5.8.0) in YARN cluster mode I don't see the API (
http://localhost:4040/api/v1/applications is unavailable) on port 4040.
I started Spark application like this:
spark-submit --master yarn-cluster --class
org.apache.spark.examples.SparkPi
/usr/lib/spark/examples
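In yarn-cluster mode the driver, and with it the UI and REST API, runs inside
the ApplicationMaster container rather than on the submitting host, so
localhost:4040 is not expected to answer there. A sketch of where to look
instead, assuming default ports:

# running app, via the YARN ResourceManager web proxy (default port 8088)
curl http://<rm-host>:8088/proxy/<yarn-application-id>/api/v1/applications
# finished app, via the Spark History Server (default port 18080)
curl http://<history-server-host>:18080/api/v1/applications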
lications
> on Standalone and Yarn.
>
> On Fri, Sep 16, 2016 at 10:32 PM, Vladimir Tretyakov <
> vladimir.tretya...@sematext.com> wrote:
>
>> Hello.
>>
>> Found that there is also a Spark metrics Sink, MetricsServlet,
>> which is enabled by default:
>>
other than Standalone?
Why are there two ways to get this information, the REST API and this Sink?
Best regards, Vladimir.
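For context, the MetricsServlet sink mentioned above is configured through
conf/metrics.properties; a sketch of the defaults as I understand them from
the stock template (it serves JSON under the web UI):

*.sink.servlet.class=org.apache.spark.metrics.sink.MetricsServlet
*.sink.servlet.path=/metrics/json
master.sink.servlet.path=/metrics/master/json
applications.sink.servlet.path=/metrics/applications/json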
On Mon, Sep 12, 2016 at 3:53 PM, Vladimir Tretyakov <
vladimir.tretya...@sematext.com> wrote:
> Hello Saisai Shao, Jacek Laskowski, thx for the information.
>
> We
i/
>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sun, Sep 11, 2016 at 11:18 AM, Vladimir Tretyakov
>> wrote:
>> > Hello Jacek, thx a lot, it works.
>> >
>> >
> That's correct. One app, one web UI. Open 4041 and you'll see the other
> app.
>
> Jacek
>
> On 9 Sep 2016 11:53 a.m., "Vladimir Tretyakov" <
> vladimir.tretya...@sematext.com> wrote:
>
>> Hello again.
>>
>> I am trying to play with Spark v
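Spark probes successive ports when 4040 is already bound, which is why the
second application shows up on 4041; the UI port can also be pinned per
application. A small sketch:

# second concurrent app falls back to 4041, 4042, ... automatically,
# or pin it explicitly:
spark-submit --conf spark.ui.port=4050 ...
curl http://localhost:4050/api/v1/applications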
onitor/lib/spm-monitor-spark.jar=myValue-*2*
:spark-executor:default
...
...
...
Can I do something like that in Spark for executors? If not, maybe it can be
done in the future? It would be useful.
Thx, best regards, Vladimir Tretyakov.
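For executors this can be passed through spark.executor.extraJavaOptions (and
spark.driver.extraJavaOptions for the driver); a sketch, with the agent jar
path and its arguments as placeholders for whatever the monitor expects:

spark-submit --master yarn-client \
  --conf "spark.executor.extraJavaOptions=-javaagent:/path/to/spm-monitor-spark.jar=<agent-args>:spark-executor:default" \
  ...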
/driver's working directory, so they
> can load the file.
>
> - Kousuke
>
>
>
> (2014/09/12 2:16), Vladimir Tretyakov wrote:
>
> Hi, Kousuke,
>
> Can you please explain in a bit more detail what you mean? I am new to
> Spark; I looked at
> https://spark.apache
ity?
Thx for the answer.
On Thu, Sep 11, 2014 at 5:55 PM, Kousuke Saruta
wrote:
> Hi Vladimir
>
> How about using the --files option with spark-submit?
>
> - Kousuke
>
>
> (2014/09/11 23:43), Vladimir Tretyakov wrote:
>
> Hi again, yeah, I've tried to use ” spark
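A sketch of the --files route suggested above: ship metrics.properties with
the job and point spark.metrics.conf at the copy that lands in each
container's working directory (paths are placeholders):

spark-submit --master yarn-cluster \
  --files /local/path/to/metrics.properties \
  --conf spark.metrics.conf=metrics.properties \
  --class org.apache.spark.examples.SparkPi \
  /path/to/spark-examples.jar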
> should find this file in their local FS because this file is loaded locally.
>
>
>
> Besides, I think this might be a kind of workaround; a better solution would be
> to fix this in some other way.
>
>
>
> Thanks
>
> Jerry
>
>
>
> *From:* Vladimir Trety
the yarn container, so metrics
> system cannot load the right sinks.
>
>
>
> Thanks
>
> Jerry
>
>
>
> *From:* Vladimir Tretyakov [mailto:vladimir.tretya...@sematext.com]
> *Sent:* Thursday, September 11, 2014 7:30 PM
> *To:* user@spark.apache.org
> *Su
Hello, we at Sematext (https://apps.sematext.com/) are writing a
monitoring tool for Spark and we came across one question:
How do we enable JMX metrics for a YARN deployment?
We put "*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink"
in the file $SPARK_HOME/conf/metrics.properties but it doesn't work.
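For completeness, a sketch of the JmxSink setup being described: the sink line
in metrics.properties (on YARN, also ship the file with --files as discussed
above) plus the usual JVM flags if the executor JMX needs to be reachable
remotely; the flags below are illustrative and disable auth/SSL, so testing
only:

# $SPARK_HOME/conf/metrics.properties
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink

# optional: expose executor JMX remotely (port 0 = pick a free port)
--conf "spark.executor.extraJavaOptions=-Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=0 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false"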