[Streaming] Infinite delay when stopping the context

2016-04-14 Thread Sergio Ramírez
Hello: I use the stop method in my streaming programs to finish the execution of my experiments. However, despite getting these messages: 16/04/14 12:03:39 INFO JobGenerator: Stopping JobGenerator immediately 16/04/14 12:03:39 INFO RecurringTimer: Stopped timer for JobGenerator aft
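The poster's code is not shown; the following is a minimal sketch of the relevant StreamingContext API. The `stopGracefully` flag is often the culprit in shutdown hangs: a graceful stop waits for queued batches to complete, which can block indefinitely if a batch never finishes. All names besides the Spark API itself are illustrative.

```scala
// Minimal sketch (not the poster's code) of stopping a StreamingContext.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object GracefulStopSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("stop-sketch")
    val ssc  = new StreamingContext(conf, Seconds(1))
    // ... define input streams and transformations here ...
    ssc.start()
    // stopSparkContext = true also shuts down the underlying SparkContext;
    // stopGracefully = false forces an immediate stop instead of waiting
    // for in-flight and queued batches to drain.
    ssc.stop(stopSparkContext = true, stopGracefully = false)
  }
}
```

If `stopGracefully = true` is used, a slow or stuck batch keeps `stop` blocked even after the "Stopping JobGenerator" log line appears.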

Modify text in spark-packages

2016-02-23 Thread Sergio Ramírez
Hello, I am having trouble modifying the description of some of my packages on spark-packages.com; I haven't been able to change anything. I've written to the e-mail address in charge of managing this page, but I got no answer. Any clue? Thanks --

Re: Unchecked contribution (JIRA and PR)

2015-11-26 Thread Sergio Ramírez
On Wed, Nov 4, 2015 at 7:23 AM, Sergio Ramírez <sramire...@ugr.es> wrote: OK, for me, time is not a problem. I was just worried because there was no movement on those issues. I think they are good contributions. For example, I have found no complex discretization a

Re: Unchecked contribution (JIRA and PR)

2015-11-04 Thread Sergio Ramírez
o include a package. On Tue, Nov 3, 2015 at 2:49 AM, Sergio Ramírez <sramire...@ugr.es> wrote: Hello all: I developed two packages for MLlib in March. These have also been uploaded to the spark-packages repository. Associated with these packages, I created t

Unchecked contribution (JIRA and PR)

2015-11-03 Thread Sergio Ramírez
https://github.com/apache/spark/pull/5170 https://issues.apache.org/jira/browse/SPARK-6531 https://issues.apache.org/jira/browse/SPARK-6509 These remain unassigned in JIRA and unverified on GitHub. Could anyone explain why they are still in this state? Is it normal? Thanks! Sergio R. -- Sergio Ra

Fixed number of partitions in RangePartitioner

2015-07-22 Thread Sergio Ramírez
Hi all: I am developing an algorithm that needs to group elements with the same key together as much as possible, while always using a fixed number of partitions. To do that, the algorithm sorts the elements by key. The problem is that the number of distinct keys influences the number of
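The behaviour the poster describes can be illustrated without Spark. A pure-Scala sketch of range partitioning (a simplified model of what `RangePartitioner` does, not its actual implementation): bounds are derived from the distinct keys seen, so with k distinct keys at most k + 1 ranges can receive data, and the remaining requested partitions stay empty. All names here are hypothetical.

```scala
// Pure-Scala sketch of range partitioning: a key's partition is the number
// of sorted range bounds strictly below it. With only k distinct keys,
// at most k + 1 partitions can be non-empty, regardless of how many
// partitions were requested.
object RangeSketch {
  def partitionOf(key: Int, bounds: Seq[Int]): Int =
    bounds.count(_ < key)

  // Count how many elements land in each of numPartitions partitions.
  def partitionCounts(keys: Seq[Int], bounds: Seq[Int], numPartitions: Int): Seq[Int] = {
    val counts = Array.fill(numPartitions)(0)
    keys.foreach(k => counts(partitionOf(k, bounds)) += 1)
    counts.toSeq
  }
}
```

For example, requesting 4 partitions over keys drawn from {1, 2} yields bounds like Seq(1), and only the first two partitions ever receive elements; the other two stay empty.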

Spark hangs without notification (broadcasting)

2015-06-15 Thread Sergio Ramírez
Hi everyone: I am having several problems with an algorithm I am developing for MLlib. It uses large broadcast variables over many iterations, with Breeze vectors as RDDs. The problem is that in some stages the Spark program freezes without notification. I have tried to reduce the use of
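One common cause of this pattern, worth noting as a hypothesis since the poster's code is not shown: broadcasting a large variable on every iteration without releasing the previous one accumulates blocks on the executors. A hedged sketch of per-iteration broadcast cleanup (`sc`, `data`, and the update step are all illustrative, not from the original post):

```scala
// Hypothetical sketch: release each broadcast before creating the next one,
// so large per-iteration broadcasts do not pile up in executor memory.
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD
import breeze.linalg.DenseVector

def iterate(sc: SparkContext, data: RDD[DenseVector[Double]], iters: Int): DenseVector[Double] = {
  var model = DenseVector.zeros[Double](100)
  for (_ <- 1 to iters) {
    val bcModel = sc.broadcast(model)          // ship the current model once per iteration
    model = data.map(v => v + bcModel.value)   // hypothetical update step
                .reduce(_ + _)
    bcModel.unpersist(blocking = true)         // eagerly free executor copies
    // bcModel.destroy() removes the broadcast entirely once it is
    // guaranteed not to be referenced again.
  }
  model
}
```

Without the `unpersist` call, each iteration's broadcast stays cached until garbage collection catches up, which can stall stages silently on memory-constrained executors.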