Prasanna,
AFAIK Spark does not handle folders that lack partition column names in them,
and there is no way to get Spark to do it.
I think the reason for this is that Parquet file hierarchies carry this
information in their directory names, and historically Spark deals mostly with
those.
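For what it's worth, partition discovery only kicks in when the directory
names carry the column names -- roughly like this (a sketch assuming a Spark
2.x session named spark; all paths are placeholders):

// Layout Spark understands: key=value directory names, e.g.
//   /data/events/year=2016/month=11/part-...
val df = spark.read.parquet("/data/events")  // "year" and "month" become columns

// With bare folders like /data/events/2016/11/ nothing is inferred; one
// workaround is to read each folder and add the column by hand:
import org.apache.spark.sql.functions.lit
val nov = spark.read.parquet("/data/events/2016/11").withColumn("month", lit(11))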
On Mon, Nov 28, 2016 at 9:48 AM, Prasanna Santhan wrote:
I have HDFS servers running locally, and "hadoop dfs -ls /" works fine.
From spark-shell I do this:
val lines = sc.textFile("hdfs:///test")
and I get this error message.
java.io.IOException: Failed on local exception: java.io.EOFException; Host
Details : local host is: "localhost.locald
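This particular error often points to a Hadoop client/server version mismatch
between the Spark build and the HDFS cluster. One quick sanity check is to use
a fully qualified URI (the NameNode address below is an assumption -- use
whatever fs.defaultFS says):

val lines = sc.textFile("hdfs://localhost:9000/test")  // fully qualified URI
lines.count()  // force the read so any connection failure surfaces immediately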
I am getting the following error when trying to build Spark. I tried various
sizes for -Xmx and other memory-related arguments on the java command line,
but the assembly command still fails.
$ sbt/sbt assembly
...
[info] Compiling 298 Scala sources and 17 Java sources to
/vagrant/spark-0.9.
: killed
Date: Sat, 22 Mar 2014 20:03:28 +0800
A lot of memory is needed to build Spark; I think you should make -Xmx larger,
2g for example.
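Something along these lines, for example (a sketch -- whether the launcher
honors JAVA_OPTS depends on the sbt/sbt script in your tree; if -Xmx is
hard-coded there, edit the script instead). Note that ": killed" usually means
the OS OOM killer stopped the process, so a memory-starved Vagrant VM may also
need more RAM overall:

$ export JAVA_OPTS="-Xmx2g -XX:MaxPermSize=512m"
$ sbt/sbt assembly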
From: Bharath Bhushan
Sent: 2014/3/22 12:50
To: user@spark.apache.org
Subject: unable to build spark - sbt/sbt: line 50: killed
I am getting the following error
I am facing a weird failure where "sbt/sbt assembly" shows a lot of SSL
certificate errors for repo.maven.apache.org. Is anyone else facing the same
problems? Any idea why this is happening? Yesterday I was able to successfully
run it.
Loading https://repo.maven.apache.org shows an invalid cert
I don’t see the errors anymore. Thanks Aaron.
On 24-Mar-2014, at 12:52 am, Aaron Davidson wrote:
> These errors should be fixed on master with Sean's PR:
> https://github.com/apache/spark/pull/209
>
> The orbit errors are quite possibly due to using https instead of http,
> whether or not the
Creating simple.sbt and src/ in $SPARK_HOME allows me to run a standalone Scala
program in the downloaded Spark code tree. For example, my directory layout is:
$ ls spark-0.9.0-incubating-bin-hadoop2
…
simple.sbt
src
…
$ tree src
src
`-- main
    `-- scala
        `-- SimpleApp.scala
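For reference, the two files are along the lines of the quick-start example
(a sketch; the version numbers are assumed to match a 0.9.0 tree, and the
input file is a placeholder):

// simple.sbt
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"

// src/main/scala/SimpleApp.scala
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    val sc = new SparkContext("local", "Simple App")
    val logData = sc.textFile("README.md").cache()  // any text file works
    println("Lines with 'a': " + logData.filter(_.contains("a")).count())
    sc.stop()
  }
}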
-- Bharath
On
Is there a way to see the resource usage of each spark-shell command -- say
time taken and memory used?
I checked the WebUI of spark-shell and of the master and I don’t see any such
breakdown. I see the time taken in the INFO logs but nothing about memory
usage. It would also be nice to track the
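A crude workaround for the time part is to wrap each action yourself (a
sketch; per-command memory is not exposed this way -- the executor pages of
the web UI are the closest thing):

def timed[T](body: => T): T = {
  val start = System.nanoTime()
  val result = body
  println("took " + (System.nanoTime() - start) / 1e9 + " s")
  result
}

val n = timed { sc.textFile("hdfs:///test").count() }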
> Ph: +1 (760) 203 3257
> http://www.sigmoidanalytics.com
> @mayur_rustagi
> On Tue, Mar 25, 2014 at 6:04 AM, Bharath Bhushan
> wrote:
> Is there a way to see the resource usage of each spark-shell command -- say
> time taken and memory used?
> I checked the WebUI of spark-shell and of the master and I don't see any
> such breakdown.
I am facing different kinds of java.lang.ClassNotFoundException when trying to
run spark on mesos. One error has to do with
org.apache.spark.executor.MesosExecutorBackend. Another has to do with
org.apache.spark.serializer.JavaSerializer. I see other people complaining
about similar issues.
I
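One common culprit is that the Mesos executors are not running the same Spark
build as the driver. In the 0.9-era setup that means pointing them at a
matching distribution, roughly like this (the URI and master below are
placeholders):

System.setProperty("spark.executor.uri",
  "hdfs://localhost:9000/dist/spark-0.9.0-incubating-bin-hadoop2.tgz")
val sc = new SparkContext("mesos://localhost:5050", "My App")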
Cheers,
> Tim
>
> - Original Message -
>> From: "Bharath Bhushan"
>> To: user@spark.apache.org
>> Sent: Monday, March 31, 2014 8:16:19 AM
>> Subject: java.lang.ClassNotFoundException - spark on mesos
>>
>> I am facing different kinds of java.lang.ClassNotFoundException when trying
>> to run spark on mesos.
On 31-Mar-2014, at 11:30 pm, Tim St Clair wrote:
> It sounds like the protobuf issue.
>
> So FWIW, You might want to try updating the 0.9.0 w/pom mods for mesos &
> protobuf.
>
> mesos 0.17.0 & protobuf 2.5
>
> Cheers,
> Tim
>
> - Original Message -
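For concreteness, the pom mods Tim describes would be a property bump along
these lines (property names assumed from the 0.9 build -- verify against your
tree):

<!-- in the top-level pom.xml <properties> section -->
<mesos.version>0.17.0</mesos.version>
<protobuf.version>2.5.0</protobuf.version>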
I was saying that the protobuf version issue is not fixed. I could not find
any reference to the problem or the fix.
Regarding SPARK-1052, I could pull the fix into my 0.9.0 tree (from the tarball
on the website), and I see the fix in the latest git.
Thanks
On 01-Apr-2014, at 3:28 am, deric wrote:
Another problem I noticed is that the current 1.0.0 git tree still gives me the
ClassNotFoundException. I see that SPARK-1052 is already fixed there. I
then modified the pom.xml for mesos and protobuf and that still gave the
ClassNotFoundException. I also tried modifying pom.xml only for mes
Thanks
On 01/04/14 11:04 am, Bharath Bhushan wrote:
Another problem I noticed is that the current 1.0.0 git tree still gives me the
ClassNotFoundException. I see that SPARK-1052 is already fixed there. I
then modified the pom.xml for mesos and protobuf and that still gave the
ClassNotFoundException.
Ian,
I also faced a similar issue and the discussion is ongoing here:
http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-ClassNotFoundException-spark-on-mesos-td3510.html
Are you facing a ClassNotFoundException too?
On 02/04/14 2:21 am, Ian Ferreira wrote:
From what I can tell I ne
Does Spark in general assure exactly-once semantics? What happens to those
guarantees in the presence of updateStateByKey operations -- are they also
assured to be exactly-once?
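For reference, the operation in question looks roughly like this (a sketch
with illustrative host/port and paths; stateful operations require a
checkpoint directory):

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._

val ssc = new StreamingContext(sc, Seconds(10))
ssc.checkpoint("hdfs:///checkpoints")  // required for updateStateByKey

val counts = ssc.socketTextStream("localhost", 9999)
  .flatMap(_.split(" "))
  .map((_, 1))
  .updateStateByKey[Int] { (newValues: Seq[Int], running: Option[Int]) =>
    Some(running.getOrElse(0) + newValues.sum)
  }
counts.print()
ssc.start()
ssc.awaitTermination()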
Thanks
manku.timma at outlook dot com