Re: Network-related environmental problem when running JDBCSuite

2015-10-15 Thread Richard Hillegas
dress-getlocalhost-throws-unknownhostexception Thanks, -Rick Richard Hillegas/San Francisco/IBM@IBMUS wrote on 10/15/2015 11:15:29 AM: > From: Richard Hillegas/San Francisco/IBM@IBMUS > To: Dev > Date: 10/15/2015 11:16 AM > Subject: Re: Network-related environmental problem when
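
For reference, a minimal Scala sketch of the JVM-level failure the link describes; nothing here is Spark-specific, and the object name is mine:

    import java.net.{InetAddress, UnknownHostException}

    // On a machine whose hostname does not resolve (a common laptop/VPN
    // situation), InetAddress.getLocalHost throws, and Spark's startup
    // surfaces the failure.
    object LocalHostCheck {
      def main(args: Array[String]): Unit = {
        try {
          val addr = InetAddress.getLocalHost
          println(s"local host resolves: ${addr.getHostName} -> ${addr.getHostAddress}")
        } catch {
          case e: UnknownHostException =>
            println(s"hostname does not resolve: ${e.getMessage}")
        }
      }
    }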

Re: Network-related environmental problem when running JDBCSuite

2015-10-15 Thread Richard Hillegas
-with-two-workers export SPARK_LOCAL_IP=127.0.0.1 Then I got errors related to booting the metastore_db. So I deleted that directory. After that I was able to run spark-shell again. Now let's see if this hack fixes the tests... Thanks, Rick Hillegas Richard Hillegas/San Francisco/IBM@
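
A hedged sanity check for the workaround above, runnable as a plain Scala program (the object name is mine):

    import java.net.InetAddress

    // Confirms the override is visible to the JVM. Loopback always resolves,
    // which is why pinning SPARK_LOCAL_IP=127.0.0.1 sidesteps the
    // UnknownHostException from InetAddress.getLocalHost.
    object VerifyLocalIp {
      def main(args: Array[String]): Unit = {
        sys.env.get("SPARK_LOCAL_IP") match {
          case Some(ip) =>
            println(s"SPARK_LOCAL_IP=$ip -> " + InetAddress.getByName(ip).getHostAddress)
          case None =>
            println("SPARK_LOCAL_IP unset; falling back to InetAddress.getLocalHost")
        }
      }
    }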

Re: Network-related environmental problem when running JDBCSuite

2015-10-15 Thread Richard Hillegas
parkSubmit$.doRunMain$1(SparkSubmit.scala:180) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) :10: error: not found: value

Network-related environmental problem when running JDBCSuite

2015-10-15 Thread Richard Hillegas
I am seeing what look like environmental errors when I try to run a test on a clean local branch which has been sync'd to the head of the development trunk. I would appreciate advice about how to debug or hack around this problem. For the record, the test ran cleanly last week. This is the experi

Re: unsubscribe

2015-09-30 Thread Richard Hillegas
Hi Sukesh, To unsubscribe from the dev list, please send a message to dev-unsubscr...@spark.apache.org. To unsubscribe from the user list, please send a message to user-unsubscr...@spark.apache.org. Please see: http://spark.apache.org/community.html#mailing-lists. Thanks, -Rick sukesh kumar wrote

Re: [Discuss] NOTICE file for transitive "NOTICE"s

2015-09-28 Thread Richard Hillegas
Thanks, Sean! Sean Owen wrote on 09/25/2015 06:35:46 AM: > From: Sean Owen > To: Reynold Xin , Richard Hillegas/San > Francisco/IBM@IBMUS > Cc: "dev@spark.apache.org" > Date: 09/25/2015 07:21 PM > Subject: Re: [Discuss] NOTICE file for transitive "NOTICE"

Re: [Discuss] NOTICE file for transitive "NOTICE"s

2015-09-24 Thread Richard Hillegas
Sean Owen wrote on 09/24/2015 12:40:12 PM: > From: Sean Owen > To: Richard Hillegas/San Francisco/IBM@IBMUS > Cc: "dev@spark.apache.org" > Date: 09/24/2015 12:40 PM > Subject: Re: [Discuss] NOTICE file for transitive "NOTICE"s > > Yes, the issue of wh

Re: [Discuss] NOTICE file for transitive "NOTICE"s

2015-09-24 Thread Richard Hillegas
-howto.html#permissive-deps Thanks, -Rick Sean Owen wrote on 09/24/2015 12:07:01 PM: > From: Sean Owen > To: Richard Hillegas/San Francisco/IBM@IBMUS > Cc: "dev@spark.apache.org" > Date: 09/24/2015 12:08 PM > Subject: Re: [Discuss] NOTICE file for transitive "NO

Re: [Discuss] NOTICE file for transitive "NOTICE"s

2015-09-24 Thread Richard Hillegas
NG NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE." Thanks, -Rick Reynold Xin wrote on 09/24/2015 10:55:53 AM: > From: Reynold Xin > To: Sean Owen > Cc: Richard Hillegas/San Francisco/IBM@IBMUS,

Re: [VOTE] Release Apache Spark 1.5.1 (RC1)

2015-09-24 Thread Richard Hillegas
> > entirely possible there's still a mistake somewhere in here (possibly > > a new dependency, etc). Please point it out if you see such a thing. > > > > But so far what you describe is "working as intended", as far as I > > know, according to Apa

Re: [VOTE] Release Apache Spark 1.5.1 (RC1)

2015-09-24 Thread Richard Hillegas
-1 (non-binding) I was able to build Spark cleanly from the source distribution using the command in README.md: build/mvn -DskipTests clean package However, while I was waiting for the build to complete, I started going through the NOTICE file. I was confused about where to find licenses fo

Re: column identifiers in Spark SQL

2015-09-22 Thread Richard Hillegas
ysisException: cannot resolve 'c\"d' given input columns A, b, c"d; line 1 pos 7 sqlContext.sql("""select `c\"d` from test_data""").show Thanks, -Rick Michael Armbrust wrote on 09/22/2015 01:16:12 PM: > From: Michael Armbrust > To: R

Re: Derby version in Spark

2015-09-22 Thread Richard Hillegas
Thanks, Ted. I'll follow up with the Hive folks. Cheers, -Rick Ted Yu wrote on 09/22/2015 03:41:12 PM: > From: Ted Yu > To: Richard Hillegas/San Francisco/IBM@IBMUS > Cc: Dev > Date: 09/22/2015 03:41 PM > Subject: Re: Derby version in Spark > > I cloned

Re: Derby version in Spark

2015-09-22 Thread Richard Hillegas
3.jar jersey-guice-1.9.jar parquet-encoding-1.7.0.jar Ted Yu wrote on 09/22/2015 01:32:39 PM: > From: Ted Yu > To: Richard Hillegas/San Francisco/IBM@IBMUS > Cc: Dev > Date: 09/22/2015 01:33 PM > Subject: Re: Derby version in Spark > > Which Spark release are you

Derby version in Spark

2015-09-22 Thread Richard Hillegas
I see that lib_managed/jars holds these old Derby versions: lib_managed/jars/derby-10.10.1.1.jar lib_managed/jars/derby-10.10.2.0.jar The Derby 10.10 release family supports some ancient JVMs: Java SE 5 and Java ME CDC/Foundation Profile 1.1. It's hard to imagine anyone running Spark on the
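
When duplicate versions sit side by side, it can help to ask the JVM which jar actually won. A hedged one-liner for the Scala REPL (EmbeddedDriver is Derby's standard embedded driver class; the rest assumes some derby jar is on the classpath, as in lib_managed/jars):

    // prints the jar that org.apache.derby.jdbc.EmbeddedDriver was loaded from
    val source = Class.forName("org.apache.derby.jdbc.EmbeddedDriver")
      .getProtectionDomain.getCodeSource
    println(Option(source).map(_.getLocation.toString).getOrElse("unknown"))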

Re: column identifiers in Spark SQL

2015-09-22 Thread Richard Hillegas
" from test_data""").show And embedded quotes inside quoted identifiers are swallowed up: // this now returns rows consisting of the string literal "cd" sqlContext.sql("""select "c""d" from test_data""").show Than

column identifiers in Spark SQL

2015-09-22 Thread Richard Hillegas
I am puzzled by the behavior of column identifiers in Spark SQL. I don't find any guidance in the "Spark SQL and DataFrame Guide" at http://spark.apache.org/docs/latest/sql-programming-guide.html. I am seeing odd behavior related to case-sensitivity and to delimited (quoted) identifiers. Conside
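
On the case-sensitivity half of the question, a hedged spark-shell sketch (Spark 1.5; spark.sql.caseSensitive is the relevant switch, and the table here is a made-up stand-in):

    // sc and sqlContext are predefined in spark-shell
    import sqlContext.implicits._

    sc.parallelize(Seq((1, 2))).toDF("A", "b").registerTempTable("test_data")

    // with case-insensitive resolution, lower-case a matches column A
    sqlContext.setConf("spark.sql.caseSensitive", "false")
    sqlContext.sql("select a, B from test_data").show()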

Re: Unsubscribe

2015-09-21 Thread Richard Hillegas
To unsubscribe from the dev list, please send a message to dev-unsubscr...@spark.apache.org as described here: http://spark.apache.org/community.html#mailing-lists. Thanks, -Rick Dulaj Viduranga wrote on 09/21/2015 10:15:58 AM: > From: Dulaj Viduranga > To: dev@spark.apache.org > Date: 09/21/