Thanks so much! I was just looking for it today on spark-packages - you've
read my mind :)

On Wed, Sep 9, 2015 at 5:53 PM, Burak Yavuz <brk...@gmail.com> wrote:

> By the way, I published
> http://spark-packages.org/package/brkyvz/lazy-linalg that contains many
> of the arithmetic operations for use in Scala. I really would appreciate
> any feedback!
>
> On Tue, Aug 25, 2015 at 11:06 AM, Kristina Rogale Plazonic <
> kpl...@gmail.com> wrote:
>
>> YES PLEASE!
>>
>> :)))))))
>>
>> On Tue, Aug 25, 2015 at 1:57 PM, Burak Yavuz <brk...@gmail.com> wrote:
>>
>>> Hmm. I have a lot of code for local linear algebra operations using
>>> Spark's Matrix and Vector representations,
>>> written for https://issues.apache.org/jira/browse/SPARK-6442.
>>>
>>> I can make a Spark package with that code if people are interested.
>>>
>>> Best,
>>> Burak
>>>
>>> On Tue, Aug 25, 2015 at 10:54 AM, Kristina Rogale Plazonic <
>>> kpl...@gmail.com> wrote:
>>>
>>>> However I do think it's easier than it seems to write the implicits;
>>>>> it doesn't involve new classes or anything. Yes it's pretty much just
>>>>> what you wrote. There is a class "Vector" in Spark. This declaration
>>>>> can be in an object; you don't implement your own class. (Also you can
>>>>> use "toBreeze" to get Breeze vectors.)
>>>>
>>>>
>>>> The implicit conversion via the implicit def is applied to the first
>>>> vector in the sum, but not to the second one (see below).
>>>>
>>>> At this point I give up, because I have spent way too much time on
>>>> this.  I am so disappointed.  So many times I've heard "Spark makes
>>>> simple things easy and complicated things possible". Well, here is the
>>>> simplest thing you can imagine in linear algebra, but heck, it is not
>>>> easy or intuitive.  It was easier to run a DeepLearning algo (from
>>>> another library) than to add two vectors.
>>>>
>>>> If anybody has a workaround other than implementing your own
>>>> add/subtract/scalarMultiply, PLEASE let me know.
>>>>
>>>> Here is the code and error from (freshly started) spark-shell:
>>>>
>>>> scala> import breeze.linalg.{DenseVector => BDV, SparseVector => BSV,
>>>> Vector => BV}
>>>> import breeze.linalg.{DenseVector=>BDV, SparseVector=>BSV, Vector=>BV}
>>>>
>>>> scala> import org.apache.spark.mllib.linalg.Vectors
>>>> import org.apache.spark.mllib.linalg.Vectors
>>>>
>>>> scala> val v1 = Vectors.dense(1.0, 2.0, 3.0)
>>>> v1: org.apache.spark.mllib.linalg.Vector = [1.0,2.0,3.0]
>>>>
>>>> scala> import org.apache.spark.mllib.linalg.{Vector =>SparkVector}
>>>> import org.apache.spark.mllib.linalg.{Vector=>SparkVector}
>>>>
>>>> scala> object MyUtils {
>>>>      |   implicit def toBreeze(v:SparkVector) = BV(v.toArray)
>>>>      | }
>>>> warning: there were 1 feature warning(s); re-run with -feature for
>>>> details
>>>> defined module MyUtils
>>>>
>>>> scala> import MyUtils._
>>>> import MyUtils._
>>>>
>>>> scala> v1:BV[Double]
>>>> res2: breeze.linalg.Vector[Double] = DenseVector(1.0, 2.0, 3.0)
>>>>
>>>> scala> v1 + v1
>>>> <console>:30: error: could not find implicit value for parameter op:
>>>> breeze.linalg.operators.OpAdd.Impl2[breeze.linalg.Vector[Double],org.apache.spark.mllib.linalg.Vector,That]
>>>>               v1 + v1
>>>>                  ^
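>>>>
>>>> For what it's worth, one possible workaround (just an untested sketch;
>>>> the VectorOps/RichSparkVector names are made up here, and toArray
>>>> densifies sparse vectors) is to skip the implicit def entirely and wrap
>>>> Spark's Vector in an implicit class, so that both operands get
>>>> converted to Breeze and the result is wrapped back into a Spark vector:
>>>>
>>>> import breeze.linalg.{DenseVector => BDV}
>>>> import org.apache.spark.mllib.linalg.{Vector => SparkVector, Vectors}
>>>>
>>>> object VectorOps {
>>>>   // Convert both operands, so Breeze only ever sees its own vectors
>>>>   implicit class RichSparkVector(v: SparkVector) {
>>>>     def +(other: SparkVector): SparkVector =
>>>>       Vectors.dense((BDV(v.toArray) + BDV(other.toArray)).toArray)
>>>>     def -(other: SparkVector): SparkVector =
>>>>       Vectors.dense((BDV(v.toArray) - BDV(other.toArray)).toArray)
>>>>     def *(scalar: Double): SparkVector =
>>>>       Vectors.dense((BDV(v.toArray) * scalar).toArray)
>>>>   }
>>>> }
>>>>
>>>> import VectorOps._   // after this, v1 + v1 returns a Spark vector
>>>>
>>>> Alternatively, calling the conversion explicitly on both sides, e.g.
>>>> toBreeze(v1) + toBreeze(v1), should also compile, because Breeze then
>>>> sees two of its own vectors.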
>>>>
>>>>
>>>>
>>>
>>>
>>
>
