This vote is cancelled in favor of rc5.
On Thu, Jul 14, 2016 at 11:59 AM, Reynold Xin wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 2.0.0. The vote is open until Sunday, July 17, 2016 at 12:00 PDT and passes
> if a majority of at least 3 +1 PMC votes are cast.
Please vote on releasing the following candidate as Apache Spark version
2.0.0. The vote is open until Friday, July 22, 2016 at 20:00 PDT and passes
if a majority of at least 3 +1 PMC votes are cast.
[ ] +1 Release this package as Apache Spark 2.0.0
[ ] -1 Do not release this package because ...
Hi All,
I have been trying to access tables from schemas other than the default
one, to pull data into a DataFrame.
I was successful doing this with the default schema in the Hive database,
but when I try any other schema/database in Hive, I get the error
below. (I have also not seen any examples r…
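For illustration, a non-default schema is usually reached by qualifying the
table name with the database; a minimal sketch, assuming Hive support is
enabled and using a hypothetical database "mydb" and table "mytable":

    // Either qualify the name in SQL...
    val df = sqlContext.sql("SELECT * FROM mydb.mytable")
    // ...or pass the qualified name to table():
    val df2 = sqlContext.table("mydb.mytable")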
Ah in that case: 0
On Tue, Jul 19, 2016 at 3:26 PM, Jonathan Kelly
wrote:
> The docs link from Reynold's initial email is apparently no longer valid.
> He posted an updated link a little later in this same thread.
>
>
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc4-docs-updated/
The docs link from Reynold's initial email is apparently no longer valid.
He posted an updated link a little later in this same thread.
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc4-docs-updated/
On Tue, Jul 19, 2016 at 3:19 PM Holden Karau wrote:
> -1 : The docs don't seem to be fully built…
I am trying to find the root cause of a recent Spark application failure in
production. While the Spark application is running, I can check the NodeManager's
yarn.nodemanager.log-dir property to get the Spark executor container logs.
The container has logs for both the running Spark applications.
Here i…
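For reference, once YARN log aggregation is enabled, the standard YARN CLI can
also fetch the container logs after the application finishes; the application
id below is only a placeholder:

    yarn logs -applicationId application_1468000000000_0001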
-1 : The docs don't seem to be fully built (e.g.
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc4-docs/streaming-programming-guide.html
is a zero byte file currently) - although if this is a transient apache
issue no worries.
On Thu, Jul 14, 2016 at 11:59 AM, Reynold Xin wrote:
+0
Our internal test suites seem mostly happy, except for SPARK-16632.
Since there's a somewhat easy workaround, I don't think it's a blocker
for 2.0.0.
On Thu, Jul 14, 2016 at 11:59 AM, Reynold Xin wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 2.0.0. The vote is open until Sunday, July 17, 2016 at 12:00 PDT and passes
> if a majority of at least 3 +1 PMC votes are cast.
Hi Reynold,
So far we've been able to transition everything to `SparkSession`. I was just
following up on behalf of Maciej.
Michael
> On Jul 19, 2016, at 11:02 AM, Reynold Xin wrote:
>
> dropping user list
>
> Yup I just took a look -- you are right.
>
> What's the reason you'd need a HiveContext?
This line: "build/sbt clean assembly"
should also be changed, right?
On Tue, Jul 19, 2016 at 1:18 AM, Sean Owen wrote:
> If the change is just to replace "sbt assembly/assembly" with "sbt
> package", done. LMK if there are more edits.
>
> On Mon, Jul 18, 2016 at 10:00 PM, Michael Gummelt
> wrote:
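For clarity, the two commands under discussion, assuming the Spark 2.0 build
in which the assembly step was removed:

    # old docs:
    build/sbt clean assembly
    # suggested replacement:
    build/sbt clean package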
dropping user list
Yup I just took a look -- you are right.
What's the reason you'd need a HiveContext? The only method that
HiveContext has and SQLContext does not have is refreshTable. Given this is
meant for helping code transition, it might be easier to just use
SQLContext and change the places that use refreshTable.
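A minimal sketch of the transition being described, assuming a Spark 2.0
SparkSession named spark and a hypothetical table "t":

    // Spark 1.x code that needed a HiveContext:
    // hiveContext.refreshTable("t")
    // Spark 2.0 equivalent via the session catalog:
    spark.catalog.refreshTable("t")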
Sorry Reynold, I want to triple check this with you. I'm looking at the
`SparkSession.sqlContext` field in the latest 2.0 branch, and it appears
that the val is set specifically to an instance of the `SQLContext` class.
A cast to `HiveContext` will fail. Maybe there's a misunderstanding here…
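A quick way to see this, assuming a 2.0 session named spark:

    spark.sqlContext.getClass.getName
    // => "org.apache.spark.sql.SQLContext", so the following throws
    // a ClassCastException:
    // spark.sqlContext.asInstanceOf[org.apache.spark.sql.hive.HiveContext]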
Yes. But in order to access methods available only in HiveContext, a cast
by the user is required.
On Tuesday, July 19, 2016, Maciej Bryński wrote:
> @Reynold Xin,
> How will this work with Hive support?
> Does SparkSession.sqlContext return a HiveContext?
>
> 2016-07-19 0:26 GMT+02:00 Reynold Xin:
> > …
@Reynold Xin,
How will this work with Hive support?
Does SparkSession.sqlContext return a HiveContext?
2016-07-19 0:26 GMT+02:00 Reynold Xin:
> Good idea.
>
> https://github.com/apache/spark/pull/14252
>
>
>
> On Mon, Jul 18, 2016 at 12:16 PM, Michael Armbrust
> wrote:
>>
>> + dev, reynold
>>
>> Yeah…
Luciano,
AFAIK the spark-package-tool also makes it easy to upload packages to
the spark-packages website. You are of course free to include any Maven
coordinate in the --packages parameter.
--jakob
On Fri, Jul 15, 2016 at 1:42 PM, Ismaël Mejía wrote:
> Thanks for the info Burak, I will check the rep…
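A usage sketch of the --packages flag Jakob mentions; the coordinate below is
only a placeholder for the usual groupId:artifactId:version form:

    ./bin/spark-shell --packages com.example:some-package_2.11:1.0.0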
Can I point out this guy? https://issues.apache.org/jira/browse/SPARK-15705
I managed to find a workaround, but this is still IMO a pretty significant
bug.
Are there any 'work in progress' release notes for 2.0.0 yet? I don't see
anything in the rc docs like "what's new" or "migration guide"?
On Thu, 9 Jun 2016 at 10:06 Sean Owen wrote:
> Available but mostly as JIRA output:
> https://spark.apache.org/news/spark-2.0.0-preview.html
>
> On Thu, Jun 9…
@Sean Owen,
As we're not planning to implement Datasets in Python, do you plan to
revert this JIRA?
https://issues.apache.org/jira/browse/SPARK-13594
2016-07-19 10:07 GMT+02:00 Sean Owen:
> I think unfortunately at least this one is gonna block:
> https://issues.apache.org/jira/browse/SPARK-16620
I just found that MutableAggregationBuffer.update converts the data on every
update, which is terrible when I use something like a Map or an Array.
It is hard to implement a collect_set UDAF, which becomes O(n^2) under these
conversion semantics.
Any advice?
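A sketch of the pattern being described (illustrative only, not the reporter's
exact code): a collect_set-style UDAF in which every buffer assignment converts
the whole collection back into Catalyst's internal format, so n updates cost
O(n^2) overall.

    import org.apache.spark.sql.Row
    import org.apache.spark.sql.expressions.{MutableAggregationBuffer, UserDefinedAggregateFunction}
    import org.apache.spark.sql.types._

    class CollectSetUDAF extends UserDefinedAggregateFunction {
      def inputSchema: StructType = new StructType().add("value", StringType)
      def bufferSchema: StructType = new StructType().add("items", ArrayType(StringType))
      def dataType: DataType = ArrayType(StringType)
      def deterministic: Boolean = true
      def initialize(buffer: MutableAggregationBuffer): Unit =
        buffer(0) = Seq.empty[String]
      def update(buffer: MutableAggregationBuffer, input: Row): Unit =
        // Each assignment re-converts the whole Seq, hence the O(n^2) cost.
        buffer(0) = (buffer.getSeq[String](0) :+ input.getString(0)).distinct
      def merge(b1: MutableAggregationBuffer, b2: Row): Unit =
        b1(0) = (b1.getSeq[String](0) ++ b2.getSeq[String](0)).distinct
      def evaluate(buffer: Row): Any = buffer.getSeq[String](0)
    }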
If the change is just to replace "sbt assembly/assembly" with "sbt
package", done. LMK if there are more edits.
On Mon, Jul 18, 2016 at 10:00 PM, Michael Gummelt
wrote:
> I just flailed on this a bit before finding this email. Can someone please
> update
> https://cwiki.apache.org/confluence/dis
I think unfortunately at least this one is gonna block:
https://issues.apache.org/jira/browse/SPARK-16620
Good news is that just about anything else that's at all a blocker has
been resolved and there are only about 6 issues of any kind at all
targeted for 2.0. It seems very close.
On Thu, Jul 14, 2016 at 11:59 AM, Reynold Xin wrote: