Hi,

As some of you may be aware, I have been an active member of the Spark
community for years. More recently, I have been working on the practical
aspects of using Spark on Kubernetes for business solutions.

As a Google advantage partner, I have been involved in creating GCP
Dataproc clusters specifically geared towards Spark, with Kafka and
ZooKeeper running on Docker, for real-time trading systems.

More recently, I have focused on using Spark on Kubernetes, AKA Spark on
k8s, predominantly on Google Kubernetes Engine (GKE)
<https://cloud.google.com/kubernetes-engine>. In a nutshell, Spark can run
on clusters managed by Kubernetes. This feature makes use of the native
Kubernetes scheduler that has been added to Spark. We also discussed
naming conventions for Spark Docker images in these forums, covering the
Java, Scala, OS and Spark versions.
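
For anyone who would like to try such images before they are formally
documented, here is a minimal sketch of submitting a job to a GKE cluster
through Spark's SparkLauncher API (the programmatic equivalent of
spark-submit). The API server address, GCR project and image tag below are
placeholders for illustration, not the actual images under discussion; the
tag merely follows the Spark/Scala/Java/OS naming convention mentioned
above.

import org.apache.spark.launcher.SparkLauncher

object SubmitToK8s {
  def main(args: Array[String]): Unit = {
    // Requires SPARK_HOME to point at a local Spark distribution and
    // kubectl credentials for the target GKE cluster.
    // The k8s:// prefix on the master URL selects the Kubernetes scheduler.
    val app = new SparkLauncher()
      .setMaster("k8s://https://<gke-api-server>:443")  // from `kubectl cluster-info`
      .setDeployMode("cluster")                         // driver runs in its own pod
      .setMainClass("org.apache.spark.examples.SparkPi")
      // local:// means the jar already lives inside the container image
      .setAppResource("local:///opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar")
      // hypothetical tag: Spark 3.1.1, Scala 2.12, Java 8 JRE, Debian buster
      .setConf("spark.kubernetes.container.image",
               "eu.gcr.io/<myproject>/spark:3.1.1-scala_2.12-8-jre-slim-buster")
      .setConf("spark.executor.instances", "2")
      .launch()                                         // returns a java.lang.Process

    app.waitFor()                                       // block until spark-submit exits
  }
}

The same submission can of course be done directly from the command line
with spark-submit; SparkLauncher simply wraps it.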

So I have now created these Docker images, ready to be tested and used.
Coming to the point: how can I add these to the Spark community dev pages,
with self-contained documentation? In some projects, such as Kafka, this
can be done by making a request to be added to the contributors list of
Apache Kafka. However, I am not sure how to approach it in the Spark
forums. I would appreciate any advice.

Mich

*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.
