Hi Sudipta,

I would also like to suggest asking this question on the Cloudera mailing
list, since you have HDFS, MapReduce, and YARN requirements. Spark can work
with HDFS and YARN, but it is more like a client to those clusters.
Cloudera provides services and can answer your question more clearly. I'm
not affiliated with Cloudera, but they seem to be the one vendor that is
very active in the Spark project and also provides a Hadoop distribution.
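
In case a concrete picture helps with what I mean by "client": you submit
a job to an existing YARN cluster and read/write HDFS, roughly like this
(the jar name, class, and HDFS paths below are made up for illustration):

  # submit a Spark job to an existing YARN cluster; the jar, class, and
  # HDFS paths here are hypothetical
  spark-submit --master yarn-cluster --class com.example.WordCount \
    wordcount.jar hdfs:///user/data/input hdfs:///user/data/output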

HTH,

Jerry

btw, who is Paco Nathan?

On Thu, Jan 22, 2015 at 10:03 AM, Babu, Prashanth <
prashanth.b...@nttdata.com> wrote:

>  Sudipta,
>
>
>
> Use the Docker image [1] and play around with Hadoop and Spark in the VM
> for a while.
>
> Decide on your use case(s) and then you can move ahead for installing on a
> cluster, etc.
>
> This Docker image has all you want [HDFS + MapReduce + Spark + YARN].
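>
> For a quick start, pulling and running it looks something like this (the
> image tag and port numbers here are from memory; please verify against
> the README at [1]):
>
>   docker pull sequenceiq/spark:1.2.0
>   docker run -it -p 8088:8088 -p 8042:8042 -h sandbox \
>     sequenceiq/spark:1.2.0 bash
>
> Inside the container you can then try the shell against the bundled YARN:
>
>   spark-shell --master yarn-client --driver-memory 1g --executor-memory 1g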
>
>
>
> All the best!
>
>
>
> [1]: https://github.com/sequenceiq/docker-spark
>
>
>
> *From:* Sudipta Banerjee [mailto:asudipta.baner...@gmail.com]
> *Sent:* 22 January 2015 14:51
> *To:* Marco Shaw
> *Cc:* user@spark.apache.org
> *Subject:* Re: Spark Team - Paco Nathan said that your team can help
>
>
>
> Hi Marco,
>
> Thanks for the confirmation. Please let me know what further details you
> need to answer a very specific question: WHAT IS THE MINIMUM HARDWARE
> CONFIGURATION REQUIRED TO BUILD HDFS + MAPREDUCE + SPARK + YARN on a
> system? Please let me know if you need any further information, and if
> you don't know, please drive across with the $10,000 to Sir Paco Nathan
> and get me the answer.
>
> Thanks and Regards,
>
> Sudipta
>
>
>
> On Thu, Jan 22, 2015 at 5:33 PM, Marco Shaw <marco.s...@gmail.com> wrote:
>
> Hi,
>
> Let me reword your request so you understand how (too) generic your
> question is....
>
> "Hi, I have $10,000, please find me some means of transportation so I can
> get to work."
>
> Please provide (a lot) more details. If you can't, consider using one of
> the pre-built express VMs from Cloudera, Hortonworks, or MapR, for
> example.
>
> Marco
>
>
>
>
> > On Jan 22, 2015, at 7:36 AM, Sudipta Banerjee <
> asudipta.baner...@gmail.com> wrote:
> >
> >
> >
> > Hi Apache Spark team,
> >
> > What are the system requirements for installing Hadoop and Apache Spark?
> > I have attached a screenshot of GParted.
> >
> >
> > Thanks and regards,
> > Sudipta
> >
> >
> >
> >
> > --
> > Sudipta Banerjee
> > Consultant, Business Analytics and Cloud Based Architecture
> > Call me +919019578099
>
> > <Screenshot - Wednesday 21 January 2015 - 10:55:29 IST.png>
> >
>
>
>
>
>
> --
>
> Sudipta Banerjee
>
> Consultant, Business Analytics and Cloud Based Architecture
>
> Call me +919019578099
>
>
