HyukjinKwon commented on PR #45776:
URL: https://github.com/apache/spark/pull/45776#issuecomment-2033671882

   In addition, the errors I get are:
   
   ```
   scala> spark.read.json().show()
   24/04/03 15:37:24 WARN DataSource: All paths were ignored:

   org.apache.spark.sql.AnalysisException: [UNABLE_TO_INFER_SCHEMA] Unable to infer schema for JSON. It must be specified manually. SQLSTATE: 42KD9
     at org.apache.spark.sql.errors.QueryCompilationErrors$.dataSchemaNotSpecifiedError(QueryCompilationErrors.scala:1581)
     at org.apache.spark.sql.execution.datasources.DataSource.$anonfun$getOrInferFileFormatSchema$12(DataSource.scala:212)
     at scala.Option.getOrElse(Option.scala:201)
     at org.apache.spark.sql.execution.datasources.DataSource.getOrInferFileFormatSchema(DataSource.scala:212)
     at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:409)
     at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:232)
     at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:214)
     at scala.Option.getOrElse(Option.scala:201)
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:214)
     at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:365)
     ... 42 elided

   scala> spark.read.csv().show()
   24/04/03 15:37:27 WARN DataSource: All paths were ignored:

   org.apache.spark.sql.AnalysisException: [UNABLE_TO_INFER_SCHEMA] Unable to infer schema for CSV. It must be specified manually. SQLSTATE: 42KD9
     at org.apache.spark.sql.errors.QueryCompilationErrors$.dataSchemaNotSpecifiedError(QueryCompilationErrors.scala:1581)
     at org.apache.spark.sql.execution.datasources.DataSource.$anonfun$getOrInferFileFormatSchema$12(DataSource.scala:212)
     at scala.Option.getOrElse(Option.scala:201)
     at org.apache.spark.sql.execution.datasources.DataSource.getOrInferFileFormatSchema(DataSource.scala:212)
     at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:409)
     at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:232)
     at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:214)
     at scala.Option.getOrElse(Option.scala:201)
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:214)
     at org.apache.spark.sql.DataFrameReader.csv(DataFrameReader.scala:541)
     ... 42 elided

   scala> spark.read.orc().show()
   24/04/03 15:37:29 WARN DataSource: All paths were ignored:

   org.apache.spark.sql.AnalysisException: [UNABLE_TO_INFER_SCHEMA] Unable to infer schema for ORC. It must be specified manually. SQLSTATE: 42KD9
     at org.apache.spark.sql.errors.QueryCompilationErrors$.dataSchemaNotSpecifiedError(QueryCompilationErrors.scala:1581)
     at org.apache.spark.sql.execution.datasources.DataSource.$anonfun$getOrInferFileFormatSchema$12(DataSource.scala:212)
     at scala.Option.getOrElse(Option.scala:201)
     at org.apache.spark.sql.execution.datasources.DataSource.getOrInferFileFormatSchema(DataSource.scala:212)
     at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:409)
     at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:232)
     at org.apache.spark.sql.DataFrameReader.$anonfun$load$2(DataFrameReader.scala:214)
     at scala.Option.getOrElse(Option.scala:201)
     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:214)
     at org.apache.spark.sql.DataFrameReader.orc(DataFrameReader.scala:663)
     ... 42 elided
   ```
   
   When I specify the schema explicitly, it works.
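   For reference, a minimal sketch of what "specify the schema" means here, run in the same spark-shell session (the field name `value` is illustrative, not from the original report):

   ```scala
   import org.apache.spark.sql.types.{StructField, StructType, StringType}

   // Provide a schema up front so Spark skips inference entirely.
   val schema = StructType(Seq(StructField("value", StringType)))

   // With no paths and an explicit schema, the reader no longer throws
   // UNABLE_TO_INFER_SCHEMA; it produces an empty DataFrame instead.
   spark.read.schema(schema).json().show()
   ```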


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

Reply via email to