The code for this runs on http://spark-prs.appspot.com (see
https://github.com/databricks/spark-pr-dashboard/blob/1e799c9e510fa8cdc9a6c084a777436bebeabe10/sparkprs/controllers/tasks.py#L137).
I checked the AppEngine logs and it looks like we're getting error
responses, possibly due to a credentials issue.
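When a sync job like this starts failing, one useful first step is to distinguish credential failures from transient ones by the GitHub API's HTTP status codes. The helper below is a hypothetical sketch for illustration, not code from the spark-pr-dashboard project:

```python
# Hypothetical sketch (not the dashboard's actual code): classify an HTTP
# status code returned by the GitHub API so a sync job can tell credential
# problems apart from transient server failures.

def classify_github_status(status_code: int) -> str:
    """Map a GitHub API HTTP status to a coarse failure category."""
    if status_code in (401, 403):
        # 401 typically means a bad or expired token; 403 can mean a
        # revoked token or an exhausted rate limit.
        return "credential-or-rate-limit"
    if 500 <= status_code < 600:
        return "server-error"
    if 200 <= status_code < 300:
        return "ok"
    return "other"

print(classify_github_status(401))  # credential-or-rate-limit
print(classify_github_status(200))  # ok
```

Logging this category alongside the raw response body would make it obvious from the AppEngine logs whether a token needs rotating.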
Thank you so much, Josh!
On Thu, Apr 25, 2019 at 3:04 PM, Josh Rosen wrote:
Can anyone take a look at this one? The number of OPEN-status JIRAs is
increasing rapidly (from around 2,400 to 2,600).
On Fri, Apr 19, 2019 at 8:05 PM, Hyukjin Kwon wrote:
> Hi all,
>
> Looks like 'spark/dev/github_jira_sync.py' is not running correctly somewhere.
> Usually the JIRA's status should be updated to "IN PROGRESS".
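The core of what a PR-to-JIRA sync script does is pull issue ids out of pull request titles and transition the matching issues. The snippet below is a simplified, hypothetical illustration of that id-extraction step (it is not the actual `github_jira_sync.py` code, and the PR title is made up):

```python
import re

# Hypothetical sketch of the id-extraction step in a PR-to-JIRA sync job:
# Spark PR titles conventionally start with "[SPARK-XXXXX]", and the sync
# script needs those ids to transition the matching JIRA issues.

JIRA_ID_PATTERN = re.compile(r"SPARK-\d+")

def jira_ids_from_title(pr_title: str) -> list:
    """Return all SPARK-XXXX ids mentioned in a pull request title."""
    return JIRA_ID_PATTERN.findall(pr_title)

# Made-up example title, for illustration only.
print(jira_ids_from_title("[SPARK-12345][SQL] Fix an example bug"))
# ['SPARK-12345']
```

If the script stops running (e.g. because of the credential issue discussed above), no transitions happen, which would explain OPEN issues piling up.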
Michael,
I have listed the use cases above; should we proceed with a design doc?
Best,
Stavros
On Mon, Mar 18, 2019, 12:21 PM, Stavros Kontopoulos <
stavros.kontopou...@lightbend.com> wrote:
> Not really, if we agree that we want this, I can put together a design
> document and take it
Did you re-create your df when you updated the timezone conf?
On Wed, Apr 24, 2019 at 9:18 PM Shubham Chaurasia wrote:
Writing:
scala> df.write.orc("")
To inspect the contents, I used orc-tools-X.Y.Z-uber.jar
(https://orc.apache.org/docs/java-tools.html).
On Wed, Apr 24, 2019 at 6:24 PM Wenchen Fan wrote:
How did you read/write the timestamp value from/to ORC file?
On Wed, Apr 24, 2019 at 6:30 PM Shubham Chaurasia wrote:
Hi All,
Consider the following (Spark v2.4.0):
Basically I change values of `spark.sql.session.timeZone` and perform an
ORC write. Here are 3 samples:
1)
scala> spark.conf.set("spark.sql.session.timeZone", "Asia/Kolkata")
scala> val df = sc.parallelize(Seq("2019-04-23 09:15:04.0")).toDF("ts").w
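The behavior being probed here, that the same naive timestamp string resolves to different instants depending on the session time zone it is interpreted in, can be sketched outside Spark with plain Python. This is only an analogy for the semantics of `spark.sql.session.timeZone`, not Spark's actual code path:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Plain-Python analogy (not Spark's actual code path): the same naive
# wall-clock string resolves to different instants depending on which
# time zone it is interpreted in.

naive = datetime.strptime("2019-04-23 09:15:04", "%Y-%m-%d %H:%M:%S")

as_kolkata = naive.replace(tzinfo=ZoneInfo("Asia/Kolkata"))
as_utc = naive.replace(tzinfo=ZoneInfo("UTC"))

# Asia/Kolkata is UTC+05:30, so "09:15:04" IST is an instant 5.5 hours
# earlier than "09:15:04" UTC.
diff_seconds = (as_utc - as_kolkata).total_seconds()
print(diff_seconds)  # 19800.0
```

So if the stored instant changes between the samples, that by itself is consistent with the string being parsed under each session time zone; whether ORC then renders it back correctly is a separate question.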