Hi,

Good progress!

Can you remove the metastore_db directory and start ./bin/pyspark again? I
don't think starting from ~ is necessary.
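[Editorial aside: the reason the launch directory matters at all is that the embedded Derby metastore resolves the relative name metastore_db against whatever directory the shell was started from. A minimal sketch of that path resolution; the directories below are illustrative, not taken from the thread:]

```python
import posixpath

def metastore_path(launch_dir, db_name="metastore_db"):
    # Sketch: Derby joins the relative database name onto the working
    # directory of the JVM, i.e. wherever pyspark was launched from.
    return posixpath.normpath(posixpath.join(launch_dir, db_name))

# Launched from ~ vs. from inside the Spark bin/ directory:
print(metastore_path("/home/ubuntu"))                                # /home/ubuntu/metastore_db
print(metastore_path("/home/ubuntu/spark-2.1.0-bin-hadoop2.7/bin"))  # ends in bin/metastore_db
```

So every distinct launch directory gets its own metastore_db, which is why removing the stale one and relaunching from a single, writable location makes the error go away.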

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Wed, Apr 26, 2017 at 8:10 PM, Afshin, Bardia
<bardia.afs...@capitalone.com> wrote:
> Kicking off the process from the ~ directory makes the message go away. I
> guess metastore_db is created relative to the directory from which the
> command is executed.
>
> FIX: kick off from ~ directory
>
> ./spark-2.1.0-bin-hadoop2.7/bin/pyspark
>
>
>
> From: "Afshin, Bardia" <bardia.afs...@capitalone.com>
> Date: Wednesday, April 26, 2017 at 9:47 AM
> To: Jacek Laskowski <ja...@japila.pl>
> Cc: "user@spark.apache.org" <user@spark.apache.org>
>
>
> Subject: Re: weird error message
>
>
>
> Thanks for the hint, but I don't think that's it. I thought it was a
> permission issue, that it cannot read or write to ~/metastore_db, but the
> directory is definitely there:
>
>
>
> drwxrwx---  5 ubuntu ubuntu 4.0K Apr 25 23:27 metastore_db
>
>
>
>
>
> Just re-ran the command (./bin/pyspark) from within the Spark root folder
> and hit the same issue.
>
>
>
> Caused by: ERROR XBM0H: Directory /home/ubuntu/spark-2.1.0-bin-hadoop2.7/metastore_db cannot be created.
>         at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
>         at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
>         at org.apache.derby.impl.services.monitor.StorageFactoryService$10.run(Unknown Source)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
>         at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
>         at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
>         at org.apache.derby.impl.services.monitor.FileMonitor.createPersistentService(Unknown Source)
>         at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
>         at org.apache.derby.impl.jdbc.EmbedConnection$5.run(Unknown Source)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at org.apache.derby.impl.jdbc.EmbedConnection.createPersistentService(Unknown Source)
>         ... 105 more
>
> Traceback (most recent call last):
>   File "/home/ubuntu/spark-2.1.0-bin-hadoop2.7/python/pyspark/shell.py", line 43, in <module>
>     spark = SparkSession.builder\
>   File "/home/ubuntu/spark-2.1.0-bin-hadoop2.7/python/pyspark/sql/session.py", line 179, in getOrCreate
>     session._jsparkSession.sessionState().conf().setConfString(key, value)
>   File "/home/ubuntu/spark-2.1.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
>   File "/home/ubuntu/spark-2.1.0-bin-hadoop2.7/python/pyspark/sql/utils.py", line 79, in deco
>     raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
> pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':"
>>>>
>
> ubuntu@:~/spark-2.1.0-bin-hadoop2.7$ ps aux | grep spark
> ubuntu     2796  0.0  0.0  10460   932 pts/0    S+   16:44   0:00 grep --color=auto spark
>
>
>
> From: Jacek Laskowski <ja...@japila.pl>
> Date: Wednesday, April 26, 2017 at 12:51 AM
> To: "Afshin, Bardia" <bardia.afs...@capitalone.com>
> Cc: user <user@spark.apache.org>
> Subject: Re: weird error message
>
>
>
> Hi,
>
>
>
> You've got two Spark sessions up and running; since Spark SQL uses a
> Derby-managed Hive metastore that only one process can open at a time,
> that's the issue.
>
>
>
> Please don't start spark-submit from inside bin/. Rather, run
> bin/spark-submit from the Spark root.
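[Editorial aside: if a fixed launch directory is awkward, one untested alternative is to pin the metastore to an absolute location so the working directory stops mattering. javax.jdo.option.ConnectionURL is the standard Hive setting for the metastore JDBC URL, but verify it against your Spark version; the path below is illustrative:]

```python
# Hypothetical workaround sketch: point Derby at an absolute, writable path
# instead of the cwd-relative default "metastore_db".
metastore_dir = "/home/ubuntu/metastore_db"  # illustrative location
connection_url = "jdbc:derby:;databaseName=%s;create=true" % metastore_dir

# With pyspark available, this would be applied roughly as:
# from pyspark.sql import SparkSession
# spark = (SparkSession.builder
#          .config("javax.jdo.option.ConnectionURL", connection_url)
#          .enableHiveSupport()
#          .getOrCreate())
print(connection_url)
```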
>
>
>
> Jacek
>
>
>
>
>
> On 26 Apr 2017 1:57 a.m., "Afshin, Bardia" <bardia.afs...@capitalone.com>
> wrote:
>
> I’m having issues when I fire up pyspark on a fresh install.
>
> When I submit the same process via spark-submit it works.
>
>
>
> Here’s a dump of the trace:
>
>             at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:109)
>             at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>             at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>             at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>             at java.lang.reflect.Method.invoke(Method.java:497)
>             at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>             at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>             at py4j.Gateway.invoke(Gateway.java:280)
>             at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>             at py4j.commands.CallCommand.execute(CallCommand.java:79)
>             at py4j.GatewayConnection.run(GatewayConnection.java:214)
>             at java.lang.Thread.run(Thread.java:745)
> Caused by: java.sql.SQLException: Failed to create database 'metastore_db', see the next exception for details.
>             at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
>             at org.apache.derby.impl.jdbc.SQLExceptionFactory.getSQLException(Unknown Source)
>             at org.apache.derby.impl.jdbc.Util.seeNextException(Unknown Source)
>             at org.apache.derby.impl.jdbc.EmbedConnection.createDatabase(Unknown Source)
>             at org.apache.derby.impl.jdbc.EmbedConnection.<init>(Unknown Source)
>             at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
>             at org.apache.derby.jdbc.InternalDriver$1.run(Unknown Source)
>             at java.security.AccessController.doPrivileged(Native Method)
>             at org.apache.derby.jdbc.InternalDriver.getNewEmbedConnection(Unknown Source)
>             at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
>             at org.apache.derby.jdbc.InternalDriver.connect(Unknown Source)
>             at org.apache.derby.jdbc.AutoloadedDriver.connect(Unknown Source)
>             at java.sql.DriverManager.getConnection(DriverManager.java:664)
>             at java.sql.DriverManager.getConnection(DriverManager.java:208)
>             at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
>             at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
>             ... 92 more
> Caused by: ERROR XJ041: Failed to create database 'metastore_db', see the next exception for details.
>             at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
>             at org.apache.derby.impl.jdbc.SQLExceptionFactory.wrapArgsForTransportAcrossDRDA(Unknown Source)
>             ... 108 more
> Caused by: ERROR XBM0H: Directory /home/ubuntu/spark-2.1.0-bin-hadoop2.7/bin/metastore_db cannot be created.
>             at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
>             at org.apache.derby.iapi.error.StandardException.newException(Unknown Source)
>             at org.apache.derby.impl.services.monitor.StorageFactoryService$10.run(Unknown Source)
>             at java.security.AccessController.doPrivileged(Native Method)
>             at org.apache.derby.impl.services.monitor.StorageFactoryService.createServiceRoot(Unknown Source)
>             at org.apache.derby.impl.services.monitor.BaseMonitor.bootService(Unknown Source)
>             at org.apache.derby.impl.services.monitor.BaseMonitor.createPersistentService(Unknown Source)
>             at org.apache.derby.impl.services.monitor.FileMonitor.createPersistentService(Unknown Source)
>             at org.apache.derby.iapi.services.monitor.Monitor.createPersistentService(Unknown Source)
>             at org.apache.derby.impl.jdbc.EmbedConnection$5.run(Unknown Source)
>             at java.security.AccessController.doPrivileged(Native Method)
>             at org.apache.derby.impl.jdbc.EmbedConnection.createPersistentService(Unknown Source)
>             ... 105 more
>
> Traceback (most recent call last):
>   File "/home/ubuntu/spark-2.1.0-bin-hadoop2.7/python/pyspark/shell.py", line 43, in <module>
>     spark = SparkSession.builder\
>   File "/home/ubuntu/spark-2.1.0-bin-hadoop2.7/python/pyspark/sql/session.py", line 179, in getOrCreate
>     session._jsparkSession.sessionState().conf().setConfString(key, value)
>   File "/home/ubuntu/spark-2.1.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
>   File "/home/ubuntu/spark-2.1.0-bin-hadoop2.7/python/pyspark/sql/utils.py", line 79, in deco
>     raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
> pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':"
>
>
>
> ________________________________
>
> The information contained in this e-mail is confidential and/or proprietary
> to Capital One and/or its affiliates and may only be used solely in
> performance of work or services for Capital One. The information transmitted
> herewith is intended only for use by the individual or entity to which it is
> addressed. If the reader of this message is not the intended recipient, you
> are hereby notified that any review, retransmission, dissemination,
> distribution, copying or other use of, or taking of any action in reliance
> upon this information is strictly prohibited. If you have received this
> communication in error, please contact the sender and delete the material
> from your computer.
>

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
