You can use backticks to quote the column names.
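
For example (a minimal sketch; the table name "logs" is hypothetical, and the backtick-doubling escape is my assumption about Spark SQL's rules):

```python
# Sketch: quoting a column name that contains a space with backticks,
# as suggested above. The table name "logs" is made up for illustration.
def backtick_quote(name):
    """Wrap an identifier in backticks, doubling any embedded backticks."""
    return "`" + name.replace("`", "``") + "`"

query = "SELECT {} FROM logs".format(backtick_quote("Executor Info"))
print(query)  # SELECT `Executor Info` FROM logs
# The resulting string would then be passed to sqlContext.sql(query).
```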

Cheng

On 6/3/15 2:49 AM, David Mitchell wrote:

I am having the same problem reading JSON. There does not seem to be a way of selecting a field whose name contains a space, such as "Executor Info" from the Spark logs.

I suggest that we open a JIRA ticket to address this issue.

On Jun 2, 2015 10:08 AM, "ayan guha" <guha.a...@gmail.com> wrote:

    I would think the easiest way would be to create a view in the DB
    with column names that contain no spaces.

    In fact, you can pass a SQL query in place of a real table name.

    From documentation: "The JDBC table that should be read. Note that
    anything that is valid in a `FROM` clause of a SQL query can be
    used. For example, instead of a full table you could also use a
    subquery in parentheses."
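
    A minimal sketch of how that could look against SQL Server (the
    table name dbo.Logs and the alias are made up for illustration):

```python
# Sketch: passing a parenthesized subquery as the "dbtable" value so the
# column with a space is renamed before Spark SQL ever sees it.
# The table dbo.Logs and the alias executor_info are hypothetical.
dbtable = ("(SELECT [Executor Info] AS executor_info "
           "FROM dbo.Logs) AS t")

# In Spark 1.3 this would be passed roughly as:
#   df = sqlContext.load(source="jdbc", url=jdbc_url, dbtable=dbtable)
print(dbtable)
```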

    Kindly let the community know if this works.

    On Tue, Jun 2, 2015 at 6:43 PM, Sachin Goyal
    <sachin.go...@jabong.com> wrote:

        Hi,

        We are using Spark SQL (1.3.1) to load data from Microsoft SQL
        Server using JDBC (as described in
        https://spark.apache.org/docs/latest/sql-programming-guide.html#jdbc-to-other-databases).


        It is working fine except when a column name contains a space
        (we can't modify the schemas to remove the spaces, as it is a
        legacy database).

        Sqoop handles such scenarios by enclosing column names in
        '[ ]', the method recommended for Microsoft SQL Server
        (https://github.com/apache/sqoop/blob/trunk/src/java/org/apache/sqoop/manager/SQLServerManager.java,
        line 319).
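
        A rough sketch of that bracket-quoting style (the doubling of
        an embedded ']' follows the usual T-SQL escaping rule; the
        helper name is mine, not taken from the Sqoop source):

```python
# Sketch: SQL Server-style bracket quoting, similar in spirit to what
# Sqoop's SQLServerManager does. An embedded "]" is doubled, per the
# usual T-SQL rule for delimited identifiers.
def bracket_quote(name):
    return "[" + name.replace("]", "]]") + "]"

print(bracket_quote("Executor Info"))  # [Executor Info]
```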

        Is there a way to handle this in Spark SQL?

        Thanks,
        sachin




--
Best Regards,
Ayan Guha

