Hi,
I am trying to calculate the CPU utilization of an executor (JVM-level CPU
usage) using the event log. Can someone please help me with this?
1) Which columns/properties to select
2) The correct formula to derive CPU usage
Has anyone done anything similar to this?
We have many pipelines and those are
-
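
For what it's worth, one way to get this from the event log: it is JSON
lines, and each SparkListenerTaskEnd event carries "Executor CPU Time"
(nanoseconds) and "Executor Run Time" (milliseconds) under "Task Metrics".
Summing both per executor and taking the ratio (after converting ns to ms)
gives the fraction of task time the JVM spent on CPU. A minimal sketch,
assuming a Spark 3.x uncompressed event log at a made-up path, run from
spark-shell:

  // Sketch only: reads the JSON-lines event log with the field names
  // Spark 3.x writes; the path below is a placeholder.
  import org.apache.spark.sql.functions._

  val events = spark.read.json("/tmp/app-eventlog")  // hypothetical path

  val perExecutor = events
    .filter($"Event" === "SparkListenerTaskEnd")
    .select(
      $"`Task Info`.`Executor ID`".as("executorId"),
      $"`Task Metrics`.`Executor CPU Time`".as("cpuNs"),  // nanoseconds
      $"`Task Metrics`.`Executor Run Time`".as("runMs"))  // milliseconds
    .groupBy($"executorId")
    .agg(sum($"cpuNs").as("cpuNs"), sum($"runMs").as("runMs"))
    // convert ns -> ms so the units match, then take the ratio
    .withColumn("cpuUtilization", ($"cpuNs" / 1e6) / $"runMs")

  perExecutor.show()

Note this is CPU time relative to task run time, not host-level utilization;
for the latter you would need OS-level metrics.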
> https://mallikarjuna_g.gitbooks.io/spark/content/spark-SparkListener.html ?
>
> Cheers,
> Sonal
> https://github.com/zinggAI/zingg
>
>
>
> On Thu, Jan 20, 2022 at 10:49 AM Prasad Bhalerao <
> prasadbhalerao1...@gmail.com> wrote:
>
>> Hello,
>>
Hello,
Is there any way we can profile Spark applications that will show the number
of invocations of Spark APIs and their execution times, etc., just the way
JProfiler shows all the details?
Thanks,
Prasad
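
Not at the per-method granularity JProfiler gives, but the SparkListener
route linked above does get you invocation counts and timings at the
task/stage level. A minimal sketch (the class and field names here are my
own, not a library API):

  import org.apache.spark.scheduler._
  import java.util.concurrent.atomic.AtomicLong

  // Counts finished tasks, accumulates executor run time, and prints a
  // wall-clock duration per completed stage.
  class TimingListener extends SparkListener {
    private val taskCount = new AtomicLong(0)
    private val runTimeMs = new AtomicLong(0)

    override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
      taskCount.incrementAndGet()
      // taskMetrics can be null for failed tasks
      Option(taskEnd.taskMetrics).foreach(m => runTimeMs.addAndGet(m.executorRunTime))
    }

    override def onStageCompleted(event: SparkListenerStageCompleted): Unit = {
      val si = event.stageInfo
      val took = for {
        s <- si.submissionTime
        c <- si.completionTime
      } yield c - s
      println(s"Stage ${si.stageId} (${si.name}): ${took.getOrElse(-1L)} ms, " +
        s"$taskCount tasks so far, $runTimeMs ms total executor run time")
    }
  }

Register it with spark.sparkContext.addSparkListener(new TimingListener), or
via the spark.extraListeners configuration property.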
> First take a one-time dump of the tables into a format Spark reads well
> (e.g. Parquet).
>
> Then you use the change stream after the dump and join it on the snapshot
> - similarly to what your database is doing.
> After that you can build the aggregates and reports from that table.
>
> - T
>
> On 4 Apr 2019, at 22.35, Prasad Bhalerao wrote:
>
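
Concretely, the snapshot-plus-change-stream merge described above could look
like the following; the paths, the key column (id), and the ordering column
(updated_at) are assumptions about the table layout:

  import org.apache.spark.sql.expressions.Window
  import org.apache.spark.sql.functions._

  val snapshot = spark.read.parquet("/data/oracle_snapshot")   // one-time dump
  val changes  = spark.read.parquet("/data/change_stream")     // CDC rows since the dump

  // Keep only the newest version of each row across snapshot + changes.
  val latest = snapshot.unionByName(changes)
    .withColumn("rn",
      row_number().over(Window.partitionBy($"id").orderBy($"updated_at".desc)))
    .filter($"rn" === 1)
    .drop("rn")

  // `latest` now holds one current row per key; build aggregates and
  // reports from it instead of hitting Oracle.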
Jason Nerothin wrote:
> Hi Prasad,
>
> Could you create an Oracle-side view that captures only the relevant
> records, and then use the Spark JDBC connector to load the view into Spark?
>
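
A sketch of what that suggestion could look like; the view name, connection
URL, and partitioning bounds below are placeholders, not known values:

  // Read the Oracle-side view through Spark's JDBC source.
  val relevant = spark.read
    .format("jdbc")
    .option("url", "jdbc:oracle:thin:@//dbhost:1521/REPORTS")
    .option("dbtable", "REPORTING_VIEW")          // the Oracle-side view
    .option("user", "report_user")
    .option("password", sys.env("ORACLE_PASSWORD"))
    // split the read across executors on a numeric column exposed by the view
    .option("partitionColumn", "ID")
    .option("lowerBound", "1")
    .option("upperBound", "1500000000")
    .option("numPartitions", "32")
    .option("fetchsize", "10000")
    .load()

The partitionColumn/lowerBound/upperBound options turn the load into parallel
range queries, which matters at this row count; a single-connection read of
1.5 billion rows would bottleneck on one executor.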
> On Thu, Apr 4, 2019 at 1:48 PM Prasad Bhalerao <
> prasadbhalerao1...@gmail.com>
Hi,
I am exploring Spark for my reporting application.
My use case is as follows...
I have 4-5 Oracle tables which contain more than 1.5 billion rows. These
tables are updated very frequently every day. I don't have the option of
changing the database technology, so this data is going to remain in Oracle
only.