Just for clarification, this is the full stack trace:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/08/03 18:18:44 INFO SparkContext: Running Spark version 2.0.0
16/08/03 18:18:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/08/03 18:18:44 INFO SecurityManager: Changing view acls to: marchifl
16/08/03 18:18:44 INFO SecurityManager: Changing modify acls to: marchifl
16/08/03 18:18:44 INFO SecurityManager: Changing view acls groups to:
16/08/03 18:18:44 INFO SecurityManager: Changing modify acls groups to:
16/08/03 18:18:44 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(marchifl); groups with view permissions: Set(); users with modify permissions: Set(marchifl); groups with modify permissions: Set()
16/08/03 18:18:45 INFO Utils: Successfully started service 'sparkDriver' on port 49997.
16/08/03 18:18:45 INFO SparkEnv: Registering MapOutputTracker
16/08/03 18:18:45 INFO SparkEnv: Registering BlockManagerMaster
16/08/03 18:18:45 INFO DiskBlockManager: Created local directory at C:\Users\marchifl\AppData\Local\Temp\blockmgr-6935118d-e820-49c9-8625-9a4be2d59104
16/08/03 18:18:45 INFO MemoryStore: MemoryStore started with capacity 1458.6 MB
16/08/03 18:18:45 INFO SparkEnv: Registering OutputCommitCoordinator
16/08/03 18:18:46 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/08/03 18:18:46 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.36.10.192:4040
16/08/03 18:18:46 INFO Executor: Starting executor ID driver on host localhost
16/08/03 18:18:46 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50031.
16/08/03 18:18:46 INFO NettyBlockTransferService: Server created on 10.36.10.192:50031
16/08/03 18:18:46 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.36.10.192, 50031)
16/08/03 18:18:46 INFO BlockManagerMasterEndpoint: Registering block manager 10.36.10.192:50031 with 1458.6 MB RAM, BlockManagerId(driver, 10.36.10.192, 50031)
16/08/03 18:18:46 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.36.10.192, 50031)
16/08/03 18:18:46 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
16/08/03 18:18:46 INFO SharedState: Warehouse path is 'file:C:\Users\marchifl\scalaWorkspace\SparkStreamingApp2/spark-warehouse'.
16/08/03 18:18:48 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 59.6 KB, free 1458.5 MB)
16/08/03 18:18:48 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 14.3 KB, free 1458.5 MB)
16/08/03 18:18:48 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.36.10.192:50031 (size: 14.3 KB, free: 1458.6 MB)
16/08/03 18:18:48 INFO SparkContext: Created broadcast 0 from textFile at MLUtils.scala:99
16/08/03 18:18:48 INFO FileInputFormat: Total input paths to process : 1
16/08/03 18:18:48 INFO SparkContext: Starting job: reduce at MLUtils.scala:92
16/08/03 18:18:48 INFO DAGScheduler: Got job 0 (reduce at MLUtils.scala:92) with 6 output partitions
16/08/03 18:18:48 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at MLUtils.scala:92)
16/08/03 18:18:48 INFO DAGScheduler: Parents of final stage: List()
16/08/03 18:18:48 INFO DAGScheduler: Missing parents: List()
16/08/03 18:18:48 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[5] at map at MLUtils.scala:90), which has no missing parents
16/08/03 18:18:48 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 4.0 KB, free 1458.5 MB)
16/08/03 18:18:48 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.2 KB, free 1458.5 MB)
16/08/03 18:18:48 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.36.10.192:50031 (size: 2.2 KB, free: 1458.6 MB)
16/08/03 18:18:48 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1012
16/08/03 18:18:48 INFO DAGScheduler: Submitting 6 missing tasks from ResultStage 0 (MapPartitionsRDD[5] at map at MLUtils.scala:90)
16/08/03 18:18:48 INFO TaskSchedulerImpl: Adding task set 0.0 with 6 tasks
16/08/03 18:18:48 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, partition 0, PROCESS_LOCAL, 5373 bytes)
16/08/03 18:18:48 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, partition 1, PROCESS_LOCAL, 5373 bytes)
16/08/03 18:18:48 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, localhost, partition 2, PROCESS_LOCAL, 5373 bytes)
16/08/03 18:18:48 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, localhost, partition 3, PROCESS_LOCAL, 5373 bytes)
16/08/03 18:18:48 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, localhost, partition 4, PROCESS_LOCAL, 5373 bytes)
16/08/03 18:18:48 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, localhost, partition 5, PROCESS_LOCAL, 5373 bytes)
16/08/03 18:18:48 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
16/08/03 18:18:48 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
16/08/03 18:18:48 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
16/08/03 18:18:48 INFO Executor: Running task 3.0 in stage 0.0 (TID 3)
16/08/03 18:18:48 INFO Executor: Running task 4.0 in stage 0.0 (TID 4)
16/08/03 18:18:48 INFO Executor: Running task 5.0 in stage 0.0 (TID 5)
16/08/03 18:18:49 INFO HadoopRDD: Input split: file:/C:/data/sample_libsvm_data.txt:0+17456
16/08/03 18:18:49 INFO HadoopRDD: Input split: file:/C:/data/sample_libsvm_data.txt:34912+17456
16/08/03 18:18:49 INFO HadoopRDD: Input split: file:/C:/data/sample_libsvm_data.txt:52368+17456
16/08/03 18:18:49 INFO HadoopRDD: Input split: file:/C:/data/sample_libsvm_data.txt:69824+17456
16/08/03 18:18:49 INFO HadoopRDD: Input split: file:/C:/data/sample_libsvm_data.txt:87280+17456
16/08/03 18:18:49 INFO HadoopRDD: Input split: file:/C:/data/sample_libsvm_data.txt:17456+17456
16/08/03 18:18:49 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
16/08/03 18:18:49 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
16/08/03 18:18:49 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
16/08/03 18:18:49 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
16/08/03 18:18:49 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
16/08/03 18:18:49 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
16/08/03 18:18:49 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1032 bytes result sent to driver
16/08/03 18:18:49 INFO Executor: Finished task 5.0 in stage 0.0 (TID 5). 1119 bytes result sent to driver
16/08/03 18:18:49 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1119 bytes result sent to driver
16/08/03 18:18:49 INFO Executor: Finished task 3.0 in stage 0.0 (TID 3). 1032 bytes result sent to driver
16/08/03 18:18:49 INFO Executor: Finished task 4.0 in stage 0.0 (TID 4). 1032 bytes result sent to driver
16/08/03 18:18:49 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 208 ms on localhost (1/6)
16/08/03 18:18:49 INFO Executor: Finished task 2.0 in stage 0.0 (TID 2). 1032 bytes result sent to driver
16/08/03 18:18:49 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 212 ms on localhost (2/6)
16/08/03 18:18:49 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 214 ms on localhost (3/6)
16/08/03 18:18:49 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 219 ms on localhost (4/6)
16/08/03 18:18:49 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 265 ms on localhost (5/6)
16/08/03 18:18:49 INFO DAGScheduler: ResultStage 0 (reduce at MLUtils.scala:92) finished in 0.285 s
16/08/03 18:18:49 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 225 ms on localhost (6/6)
16/08/03 18:18:49 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/08/03 18:18:49 INFO DAGScheduler: Job 0 finished: reduce at MLUtils.scala:92, took 0.432523 s
Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: file:C:/Users/marchifl/scalaWorkspace/SparkStreamingApp2/spark-warehouse
    at org.apache.hadoop.fs.Path.initialize(Path.java:206)
    at org.apache.hadoop.fs.Path.<init>(Path.java:172)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.makeQualifiedPath(SessionCatalog.scala:114)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:145)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(SessionCatalog.scala:89)
    at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:95)
    at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:95)
    at org.apache.spark.sql.internal.SessionState$$anon$1.<init>(SessionState.scala:112)
    at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:112)
    at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:111)
    at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
    at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:382)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:143)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:132)
    at com.spark.lab.JavaRandomForestRegressorExample.main(JavaRandomForestRegressorExample.java:29)
Caused by: java.net.URISyntaxException: Relative path in absolute URI: file:C:/Users/marchifl/scalaWorkspace/SparkStreamingApp2/spark-warehouse
    at java.net.URI.checkPath(Unknown Source)
    at java.net.URI.<init>(Unknown Source)
    at org.apache.hadoop.fs.Path.initialize(Path.java:203)
    ... 15 more
16/08/03 18:18:49 INFO SparkContext: Invoking stop() from shutdown hook
16/08/03 18:18:49 INFO SparkUI: Stopped Spark web UI at http://10.36.10.192:4040
16/08/03 18:18:49 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/08/03 18:18:49 INFO MemoryStore: MemoryStore cleared
16/08/03 18:18:49 INFO BlockManager: BlockManager stopped
16/08/03 18:18:49 INFO BlockManagerMaster: BlockManagerMaster stopped
16/08/03 18:18:49 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/08/03 18:18:49 INFO SparkContext: Successfully stopped SparkContext
16/08/03 18:18:49 INFO ShutdownHookManager: Shutdown hook called
16/08/03 18:18:49 INFO ShutdownHookManager: Deleting directory C:\Users\marchifl\AppData\Local\Temp\spark-68f13d67-3601-4f28-802a-dcb5c3cf8694
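For what it's worth, the "Caused by" frame points at java.net.URI.checkPath, which rejects any URI that has a scheme but a path not starting with "/". That is exactly what Hadoop's Path produces from the Windows-style warehouse location in the SharedState line above (file:C:\Users\...\spark-warehouse). A minimal pure-Java sketch of that behaviour, independent of Spark (class name is mine, just for illustration):

```java
import java.net.URI;
import java.net.URISyntaxException;

public class WarehouseUriDemo {
    public static void main(String[] args) {
        // Hadoop's Path.initialize assembles a URI from (scheme, authority, path).
        // With scheme "file" and a path lacking a leading slash, URI's multi-arg
        // constructor fails checkPath with the same message as in the trace.
        try {
            new URI("file", null,
                    "C:/Users/marchifl/scalaWorkspace/SparkStreamingApp2/spark-warehouse",
                    null);
        } catch (URISyntaxException e) {
            System.out.println(e.getMessage());
            // "Relative path in absolute URI: file:C:/Users/..."
        }

        // With a leading slash before the drive letter the path is absolute,
        // so the same constructor succeeds.
        try {
            URI ok = new URI("file", null,
                    "/C:/Users/marchifl/scalaWorkspace/SparkStreamingApp2/spark-warehouse",
                    null);
            System.out.println(ok);
        } catch (URISyntaxException e) {
            System.out.println("unexpected: " + e.getMessage());
        }
    }
}
```

So the problem seems to be the default spark.sql.warehouse.dir that Spark 2.0.0 derives from the working directory on Windows, not the job itself. I believe setting it explicitly to a well-formed URI on the builder, e.g. .config("spark.sql.warehouse.dir", "file:///C:/tmp/spark-warehouse"), should work around it, though I haven't verified that on 2.0.0 myself.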
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/java-net-URISyntaxException-Relative-path-in-absolute-URI-tp27466p27468.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.