>>> Can anyone explain to me what exactly is needed
>>> to support a new data type in SparkSQL's Parquet storage engine?
>>>
>>> Thanks.
>>>
>>> Alex
>>>
>>> On Mon, Dec 29, 2014 at 10:20 PM, Wang, Daoyuan
>>> wrote:
>>>
>>>
>>> … going to be compatible long term.
>>>
>>>
>>>
>>> Michael
>>>
>>>
>>>
>>> On Mon, Dec 29, 2014 at 8:13 AM, Alessandro Baretta <
>>> alexbare...@gmail.com> wrote:
>>>
>>> Daoyuan,
>>>
>>
>> … nanoseconds now. Since passing too many flags is ugly, now I need the whole
>> SQLContext, so that we can put more flags there.
>>
>>
>>
>> Thanks,
>>
>> Daoyuan
>>
>>
>>
>> *From:* Michael Armbrust [mailto:mich...@databricks.com]
>> *Sent:* Monday, December 29, 2014 10:43 AM
> *To:* Alessandro Baretta
> *Cc:* Wang, Daoyuan; dev@spark.apache.org
> *Subject:* Re: Unsupported Catalyst types in Parquet
>
>
>
> Yeah, I saw those. The problem is that #3822 truncates timestamps that
> include nanoseconds.
>
>
>
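Michael's point about truncation can be seen with plain `java.sql.Timestamp`, which carries a separate nanosecond field that an epoch-millisecond encoding cannot represent. A minimal sketch (JDK only, no Spark involved):

```scala
import java.sql.Timestamp

// A timestamp with sub-millisecond precision: 123 ms plus a further 456789 ns.
val ts = new Timestamp(0L) // epoch
ts.setNanos(123456789)

// Storing only epoch milliseconds (e.g. in a single INT64 column) keeps the
// millisecond part and drops everything below it.
val millis = ts.getTime // 123
val restored = new Timestamp(millis)

println(ts.getNanos)       // 123456789
println(restored.getNanos) // 123000000 -- the sub-millisecond digits are gone
```

This is the lossiness being discussed: any storage path that round-trips through milliseconds silently discards the nanosecond remainder.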
> On Mon, Dec 29, 2014 at 5:14 PM, Alessandro Baretta <alexbare...@gmail.com>
> wrote:
>>>
>>>> Hi Alex,
>>>>
>>>> I'll create JIRA SPARK-4985 for date type support in parquet, and
>>>> SPARK-4987 for timestamp type support. For decimal type, I think we only
>>>> support decimals that fit in a …
>>
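Daoyuan's constraint — supporting only decimals whose unscaled value fits in a 64-bit long, so the value can live in a single Parquet integer column — can be checked directly. A minimal sketch; `fitsInLong` is a hypothetical helper for illustration, not a Spark API:

```scala
// A decimal fits a single signed 64-bit long iff the bit length of its
// unscaled value is at most 63 (bitLength excludes the sign bit).
def fitsInLong(d: BigDecimal): Boolean =
  d.underlying.unscaledValue.bitLength <= 63

println(fitsInLong(BigDecimal("123456789.123456789")))    // true  (18 digits)
println(fitsInLong(BigDecimal("12345678901234567890.1"))) // false (21 digits)
```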
>>> -----Original Message-----
>>> From: Alessandro Baretta [mailto:alexbare...@gmail.com]
>>> Sent: Saturday, December 27, 2014 2:47 PM
>>> To: dev@spark.apache.org; Michael Armbrust
>>> Subject: Unsupported Catalyst types in Parquet
>>>
>>> Michael,
>>>
>>> I'm having trouble storing my SchemaRDDs in Parquet format with SparkSQL,
>>> due to my RDDs having DateType and DecimalType fields. What would it take
>>> to add Parquet support for these Catalyst types? Are there any other
>>> Catalyst types for which there is no Parquet support?
>>>
>>> Alex
Michael,
I'm having trouble storing my SchemaRDDs in Parquet format with SparkSQL,
due to my RDDs having DateType and DecimalType fields. What would it
take to add Parquet support for these Catalyst types? Are there any other
Catalyst types for which there is no Parquet support?
Alex
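For the DateType side of the question, Parquet's DATE logical type annotates an INT32 holding the number of days since the Unix epoch, so the natural encoding is a day-count round trip. A minimal sketch with hypothetical helper names (`toDays`/`fromDays`), not the actual SPARK-4985 implementation:

```scala
import java.time.LocalDate

// Encode a date as days since the Unix epoch (the representation Parquet's
// INT32 DATE logical type expects), then decode it back.
def toDays(d: LocalDate): Int = d.toEpochDay.toInt
def fromDays(days: Int): LocalDate = LocalDate.ofEpochDay(days.toLong)

val d = LocalDate.of(2014, 12, 29)
println(toDays(d))               // 16433
assert(fromDays(toDays(d)) == d) // lossless round trip, unlike timestamps
```

Unlike the timestamp case above, this encoding is lossless: a calendar date carries no sub-day precision to truncate.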