at java.lang.Thread.run(Thread.java:750)
Could someone help me with how to proceed further?
--
Thanks and Regards
Ranga Reddy
--
Bangalore, Karnataka, India
Mobile: +91-9986183183 | Email: rangareddy.av...@gmail.com
Thanks Ted. Will do.
On Wed, Mar 18, 2015 at 2:27 PM, Ted Yu wrote:
> Ranga:
> Please apply the patch from:
> https://github.com/apache/spark/pull/4867
>
> And rebuild Spark - the build would use Tachyon-0.6.1
>
> Cheers
>
> On Wed, Mar 18, 2015 at 2:23 PM, Ranga wrote:
Hi Haoyuan
No. I assumed that Spark-1.3.0 was already built with Tachyon-0.6.0. If
not, I can rebuild and try. Could you let me know how to rebuild with 0.6.0?
Thanks for your help.
- Ranga
On Wed, Mar 18, 2015 at 12:59 PM, Haoyuan Li wrote:
> Did you recompile it with Tachyon 0.6.0?
Failed 10 attempts to
create tachyon dir in
/tmp_spark_tachyon/spark-e3538a20-5e42-48a4-ad67-4b97aded90e4/
Thanks for any other pointers.
- Ranga
On Wed, Mar 18, 2015 at 9:53 AM, Ranga wrote:
> Thanks for the information. Will rebuild with 0.6.0 till the patch is
> merged.
>
> On Tue, Mar 17, 2015 at 7:24 PM, Ted Yu wrote:
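For anyone hitting the same "create tachyon dir" error, here is a minimal sketch of the off-heap setup as I understand it for Spark 1.3. The property names and the tachyon://localhost:19998 master URL are illustrative defaults, not values taken from this thread:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

// Hypothetical configuration: point Spark at the Tachyon master and base dir.
// Both values below are the documented defaults, shown here for illustration.
val conf = new SparkConf()
  .setAppName("TachyonOffHeapSketch")
  .set("spark.tachyonStore.url", "tachyon://localhost:19998")
  .set("spark.tachyonStore.baseDir", "/tmp_spark_tachyon")

val sc = new SparkContext(conf)

// Persist off-heap so the blocks land in Tachyon rather than the JVM heap.
val data = sc.parallelize(1 to 1000000)
data.persist(StorageLevel.OFF_HEAP)
println(data.count())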
Thanks for the information. Will rebuild with 0.6.0 till the patch is
merged.
On Tue, Mar 17, 2015 at 7:24 PM, Ted Yu wrote:
> Ranga:
> Take a look at https://github.com/apache/spark/pull/4867
>
> Cheers
>
> On Tue, Mar 17, 2015 at 6:08 PM, fightf...@163.com wrote:
>
Has Tachyon been used in a
production environment by anybody in this group?
Appreciate your help with this.
- Ranga
> On Mon, Dec 29, 2014 at 6:45 PM, sranga wrote:
> > Hi
> >
> > Could Spark-SQL be used from within a custom actor that acts as a
> > receiver
> > for a streaming application? If yes, what is the recommended way of
> > passing the SparkContext to the actor?
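The usual pattern here (a sketch under my own assumptions, not something confirmed in this thread) is to keep Spark SQL out of the receiver entirely and run it from foreachRDD on the driver, where the SparkContext already lives. The socket source and the Event case class below are stand-ins for the custom actor receiver:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical record type; the real stream's schema would go here.
case class Event(name: String, value: Int)

val conf = new SparkConf().setAppName("StreamingSqlSketch")
val ssc = new StreamingContext(conf, Seconds(10))
val sqlContext = new SQLContext(ssc.sparkContext)
import sqlContext.createSchemaRDD  // implicit RDD -> SchemaRDD (Spark 1.x)

// Hypothetical socket source; the custom actor receiver would plug in here.
val lines = ssc.socketTextStream("localhost", 9999)

lines.foreachRDD { rdd =>
  // This closure runs on the driver, so SparkContext and SQLContext are usable.
  val events = rdd.map(_.split(",")).map(a => Event(a(0), a(1).trim.toInt))
  events.registerTempTable("events")
  sqlContext.sql("SELECT name, SUM(value) FROM events GROUP BY name")
    .collect().foreach(println)
}

ssc.start()
ssc.awaitTermination()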
help your use-case though.
You could also increase the spark.storage.memoryFraction if that is an
option.
- Ranga
On Wed, Dec 10, 2014 at 10:23 PM, Aaron Davidson wrote:
> The ContextCleaner uncaches RDDs that have gone out of scope on the
> driver. So it's possible that the given RDD is no longer cached.
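For reference, a minimal sketch of raising the storage fraction; the 0.7 value is arbitrary (the Spark 1.x default is 0.6) and the input path is illustrative:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

// Give the block manager a larger slice of executor heap for cached RDDs.
// 0.7 is an example value; the Spark 1.x default is 0.6.
val conf = new SparkConf()
  .setAppName("MemoryFractionSketch")
  .set("spark.storage.memoryFraction", "0.7")

val sc = new SparkContext(conf)

// Persisting explicitly keeps a strong reference on the driver, so the
// ContextCleaner will not uncache the RDD while it is still reachable.
val cached = sc.textFile("/path/to/input").persist(StorageLevel.MEMORY_ONLY)
println(cached.count())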
> instance metadata to
> obtain the temporary credentials.
> --
>
> Maybe you can use AWS SDK in your application to provide AWS credentials?
>
> https://github.com/seratch/AWScala
>
>
> On Oct 14, 2014, at 11:10 AM, Ranga wrote:
>
> One related question. Could I specify
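A sketch of one way to pick up the IAM-role credentials from instance metadata with the AWS Java SDK and hand them to the s3n connector. This is my own assumption of how it could be wired, not something verified in this thread, and s3n of this era has no property for a session token, so it may still fall short with temporary (STS) credentials:

import com.amazonaws.auth.InstanceProfileCredentialsProvider
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("IamRoleS3Sketch"))

// Fetch the temporary credentials the IAM role exposes via instance metadata.
val creds = new InstanceProfileCredentialsProvider().getCredentials

// Hand them to s3n. Caveat: s3n has no setting for the session token, so this
// sketch may not be sufficient when the role returns STS credentials.
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", creds.getAWSAccessKeyId)
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", creds.getAWSSecretKey)

val rdd = sc.textFile("s3n://some-bucket/some/path")  // bucket is hypothetical
println(rdd.count())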
One related question. Could I specify the "
com.amazonaws.services.s3.AmazonS3Client" implementation for the "
fs.s3.impl" parameter? Let me try that and update this thread with my
findings.
On Tue, Oct 14, 2014 at 10:48 AM, Ranga wrote:
> Thanks for the input.
> Yes
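For what it's worth, fs.s3.impl (and fs.s3n.impl) must name an org.apache.hadoop.fs.FileSystem subclass, and AmazonS3Client is an SDK client rather than a FileSystem, so that particular substitution would likely be rejected. A sketch of the stock setting, assuming sc is the running SparkContext:

// fs.s3n.impl must point at a Hadoop FileSystem implementation;
// NativeS3FileSystem is the stock one shipped with Hadoop 1.x/2.x.
sc.hadoopConfiguration.set("fs.s3n.impl",
  "org.apache.hadoop.fs.s3native.NativeS3FileSystem")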
> > - If Spark is not able to use the IAMRole credentials, I may have to
> > generate a static key-id/secret. This may or may not be possible in the
> > environment I am in (from a policy perspective)
> >
> > - Ranga
> >
> > On Tue, Oct 14, 2014 at 4:21 AM, Rafal Kwasny wrote:
- If Spark is not able to use the IAMRole credentials, I may have to
generate a static key-id/secret. This may or may not be possible in the
environment I am in (from a policy perspective)
- Ranga
On Tue, Oct 14, 2014 at 4:21 AM, Rafal Kwasny wrote:
> Hi,
> keep in mind that you're going to have a bad time if your secret key
> contains a "/" character.
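The "/" issue bites when the credentials are embedded in the s3n URI itself, since the parser treats the slash as a path separator. Supplying them through the Hadoop configuration avoids URI parsing entirely; a sketch, assuming the standard s3n property names and an existing SparkContext sc:

// Problematic when the secret contains '/':
//   s3n://ACCESSKEY:SECRET/WITH/SLASH@bucket/path
// Setting the keys in the configuration sidesteps URI parsing altogether.
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", sys.env("AWS_ACCESS_KEY_ID"))
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", sys.env("AWS_SECRET_ACCESS_KEY"))
val rdd = sc.textFile("s3n://some-bucket/data")  // bucket name is illustrative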
Hi Daniil
Could you provide some more details on how the cluster should be
launched/configured? The EC2 instance that I am dealing with uses the
concept of IAMRoles. I don't have any "keyfile" to specify to the spark-ec2
script.
Thanks for your help.
- Ranga
On Mon, Oct 13,
Is there a way to specify a request header during the
.textFile call?
- Ranga
On Mon, Oct 13, 2014 at 11:03 AM, Ranga wrote:
> Hi
>
> I am trying to access files/buckets in S3 and encountering a permissions
> issue. The buckets are configured to authenticate using an IAMRole p
well.
Any help is appreciated.
- Ranga
With a workaround in place, I am able to proceed
with this for now.
- Ranga
On Wed, Oct 8, 2014 at 9:18 PM, Ranga wrote:
> This is a bit strange. When I print the schema for the RDD, it reflects
> the correct data type for each column. But doing any kind of mathematical
> calculation seems to result in an exception.
int)
...
from table
Any other pointers? Thanks for the help.
- Ranga
On Wed, Oct 8, 2014 at 5:20 PM, Ranga wrote:
> Sorry. It's 1.1.0.
> After digging a bit more into this, it seems like the OpenCSV Deserializer
> converts all the columns to a String type. This may be throwing the
> exception.
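Since the CSV deserializer hands every column back as a string, an explicit cast inside the aggregation is the direct fix. A self-contained sketch of that approach; the table and column names are stand-ins, and HiveContext is used because its CAST support on 1.1 is the safe bet:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Hypothetical three string columns, standing in for the OpenCSV output.
case class Rec(c1: String, c2: String, c3: String)

val sc = new SparkContext(new SparkConf().setAppName("CastSketch"))
val sqlContext = new HiveContext(sc)
import sqlContext.createSchemaRDD  // implicit RDD -> SchemaRDD (Spark 1.x)

sc.parallelize(Seq(Rec("a", "x", "1"), Rec("a", "x", "2"), Rec("b", "y", "3")))
  .registerTempTable("records")

// Cast the string column explicitly before aggregating.
val out = sqlContext.sql(
  "SELECT c1, c2, SUM(CAST(c3 AS INT)) FROM records GROUP BY c1, c2")
out.collect().foreach(println)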
On Wed, Oct 8, 2014 at 5:11 PM, Michael Armbrust wrote:
> Which version of Spark are you running?
>
> On Wed, Oct 8, 2014 at 4:18 PM, Ranga wrote:
>
>> Thanks Michael. Should the cast be done in the source RDD or while doing
>> the SUM?
>> To give a better picture, here is the code sequence:
rom sourceRDD group by c1, c2)
// This query throws the exception when I collect the results
I tried adding the cast to the aggRdd query above and that didn't help.
- Ranga
On Wed, Oct 8, 2014 at 3:52 PM, Michael Armbrust
wrote:
> Using SUM on a string should automatically cast the column.
registerAsTable
function is called? Are there other approaches that I should be looking at?
Thanks for your help.
- Ranga
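The other option raised above, casting in the source RDD rather than in the SUM, means typing the columns before the table is ever registered, so SUM never sees strings. A sketch using Spark 1.1 idioms; the case class, path, and column names are invented for illustration:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Typed record: c3 is an Int from the start.
case class Row3(c1: String, c2: String, c3: Int)

val sc = new SparkContext(new SparkConf().setAppName("TypedSchemaSketch"))
val sqlContext = new SQLContext(sc)
import sqlContext.createSchemaRDD

// Parse each CSV line into typed fields before registering the table.
val source = sc.textFile("/path/to/data.csv")  // path is illustrative
  .map(_.split(","))
  .map(a => Row3(a(0), a(1), a(2).trim.toInt))

// registerTempTable is the 1.1 name; registerAsTable is the older, deprecated one.
source.registerTempTable("sourceRDD")
val agg = sqlContext.sql("SELECT c1, c2, SUM(c3) FROM sourceRDD GROUP BY c1, c2")
agg.collect().foreach(println)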