Re: can't get jobs to run on cluster (enough memory and cpus are available on worker)

2014-07-21 Thread Matt Work Coarr
e "launch_cluster" in ec2/spark_ec2.py, where the >> ports seem to be configured. >> >> >> On Thu, Jul 17, 2014 at 1:29 PM, Matt Work Coarr >> wrote: >> > Thanks Marcelo! This is a huge help!! >> > >> > Looking at the executor lo

Re: can't get jobs to run on cluster (enough memory and cpus are available on worker)

2014-07-17 Thread Matt Work Coarr
Thanks Marcelo! This is a huge help!! Looking at the executor logs (in a vanilla spark install, I'm finding them in $SPARK_HOME/work/*)... It launches the executor, but it looks like the CoarseGrainedExecutorBackend is having trouble talking to the driver (exactly what you said!!!). Do you know
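
A common cause of the executor failing to talk back to the driver (my hedged note, not something stated in the thread) is that the driver advertises a hostname or port the workers cannot reach. One way to pin this down in PySpark is to set spark.driver.host and spark.driver.port explicitly; the master URL, address, and port below are placeholder assumptions.

    # Minimal sketch, assuming a standalone master at spark://master-host:7077.
    # Pinning spark.driver.host/port to an address the workers can reach helps
    # rule out "executor can't call back to the driver" problems.
    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setAppName("connectivity-check")
            .setMaster("spark://master-host:7077")   # assumed master URL
            .set("spark.driver.host", "10.0.0.5")    # address reachable from the workers (assumption)
            .set("spark.driver.port", "51000"))      # fixed port so it can be opened in the security group

    sc = SparkContext(conf=conf)
    print(sc.parallelize(range(100)).count())        # trivial job to confirm executors register
    sc.stop()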

Re: can't get jobs to run on cluster (enough memory and cpus are available on worker)

2014-07-16 Thread Matt Work Coarr
> Have you looked at the slave machine to see if the process has actually launched? If it has, have you tried peeking into its log file? (That error is printed whenever the executors fail to report back to the driver. Insufficient resources to launch the executor is the mos...
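
To actually peek at those executor logs on the slave, a small helper like the one below (a hypothetical convenience script, not something from the thread) tails each executor's stderr under the standalone worker's work directory.

    import os
    from pathlib import Path

    # Hypothetical helper: tail each executor's stderr under the standalone
    # worker's work directory ($SPARK_HOME/work/app-<id>/<executor-id>/stderr),
    # which is where a vanilla install writes executor logs.
    work_dir = Path(os.environ.get("SPARK_HOME", "/opt/spark")) / "work"
    for stderr_log in sorted(work_dir.glob("app-*/*/stderr")):
        print("=== %s ===" % stderr_log)
        for line in stderr_log.read_text(errors="replace").splitlines()[-20:]:
            print(line)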

can't get jobs to run on cluster (enough memory and cpus are available on worker)

2014-07-15 Thread Matt Work Coarr
Hello Spark folks, I have a simple Spark cluster setup but I can't get jobs to run on it. I am using the standalone mode. One master, one slave. Both machines have 32GB RAM and 8 cores. The slave is set up with one worker that has 8 cores and 24GB memory allocated. My application requires 2 cores...
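
For an application that only needs a couple of cores on a standalone cluster, the relevant knobs are spark.cores.max and spark.executor.memory. A minimal sketch follows; the master URL and the sizes are placeholder assumptions, not values from the thread.

    # Minimal sketch: cap the application at 2 cores and 2g per executor on a
    # standalone cluster. Master URL and sizes are placeholder assumptions.
    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setAppName("small-job")
            .setMaster("spark://master-host:7077")   # assumed standalone master URL
            .set("spark.cores.max", "2")             # total cores the app may take
            .set("spark.executor.memory", "2g"))     # per-executor heap

    sc = SparkContext(conf=conf)
    print(sc.parallelize(range(1000)).sum())
    sc.stop()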

Re: creating new ami image for spark ec2 commands

2014-06-06 Thread Matt Work Coarr
Thanks Akhil! I'll give that a try!

Re: creating new ami image for spark ec2 commands

2014-06-06 Thread Matt Work Coarr
gt; "c3.4xlarge": "pvm", > "c3.8xlarge": "pvm" > } > if opts.instance_type in instance_types: > instance_type = instance_types[opts.instance_type] > else: > instance_type = "pvm" > print >> stderr

creating new ami image for spark ec2 commands

2014-06-05 Thread Matt Work Coarr
How would I go about creating a new AMI image that I can use with the spark ec2 commands? I can't seem to find any documentation. I'm looking for a list of steps that I'd need to perform to make an Amazon Linux image ready to be used by the spark ec2 tools. I've been reading through the spark 1.0

spark ec2 commandline tool error "VPC security groups may not be used for a non-VPC launch"

2014-05-19 Thread Matt Work Coarr
Hi, I'm attempting to run "spark-ec2 launch" on AWS. My AWS instances would be in our EC2 VPC (which seems to be causing a problem). The two security groups MyClusterName-master and MyClusterName-slaves have already been set up with the same ports open as the security group that spark-ec2 tries to
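
For background (my sketch, not from the thread): with boto, an EC2-Classic launch references security groups by name, while a VPC launch must pass security group IDs plus a subnet. The spark-ec2 script of that era does the former, which is the usual trigger for this error when the pre-created groups live in a VPC. Roughly, with placeholder AMI, key, subnet, and group values:

    # Sketch of the two launch styles in boto (group names vs. group ids).
    # The AMI id, instance type, key, subnet, and group ids are placeholders.
    from boto.ec2 import connect_to_region

    conn = connect_to_region("us-east-1")

    # EC2-Classic style (what the 1.0-era spark-ec2 does): groups by NAME.
    # Referencing groups that were created inside a VPC this way can fail with
    # "VPC security groups may not be used for a non-VPC launch".
    conn.run_instances("ami-12345678", instance_type="m1.large",
                       key_name="my-key",
                       security_groups=["MyClusterName-master"])

    # VPC style: security groups by ID, plus an explicit subnet in the VPC.
    conn.run_instances("ami-12345678", instance_type="m1.large",
                       key_name="my-key",
                       subnet_id="subnet-abcdef01",
                       security_group_ids=["sg-0123abcd"])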