Hi,

I want to start a Spark standalone cluster programmatically in Java.

I have been looking at these classes:
- org.apache.spark.deploy.master.Master
- org.apache.spark.deploy.worker.Worker

I successfully started a master with this simple main method:

import org.apache.spark.SparkConf;
import org.apache.spark.deploy.master.Master;

public static void main(String[] args) {
    SparkConf conf = new SparkConf();
    // args: host, port, web UI port, conf
    Master.startSystemAndActor("localhost", 4500, 8080, conf);
}


But I'm finding it hard to take a similar approach for the worker.

Can anyone give an example of how to pass a value to the workerNumber
parameter of the Worker.startSystemAndActor method from Java? A rough sketch
of what I have been trying is below.
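
For reference, this is roughly what I have been trying, going by the
Worker.startSystemAndActor signature I see in the Spark source I'm building
against. The parameter order, the scala.Option construction for workerNumber,
and the host/port/cores/memory/work-dir values are all my own guesses and
placeholders, so they may well be wrong:

import org.apache.spark.SparkConf;
import org.apache.spark.deploy.worker.Worker;
import scala.Option;

public static void main(String[] args) {
    SparkConf conf = new SparkConf();
    // master started earlier on localhost:4500
    String[] masterUrls = new String[] {"spark://localhost:4500"};

    // workerNumber is a Scala Option[Int]; from Java I am guessing it has to
    // be built with scala.Option.apply (or Option.empty() to let Spark pick)
    Option<Object> workerNumber = Option.apply((Object) 1);

    // host, port, web UI port, cores, memory (MB), master URLs, work dir,
    // worker number, conf -- order taken from the Scala signature I'm reading
    Worker.startSystemAndActor("localhost", 4501, 8081, 2, 1024,
            masterUrls, "/tmp/spark-work", workerNumber, conf);
}

Is this the right way to build the Option from Java, or is there a cleaner
approach?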

Cheers
-- 
Niranda
