Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/4972#issuecomment-78625350
  
    I haven't thought it through at close range but I had meant something like 
...
    
    ```scala
    def sizeOf(xs: AnyRef): Int = xs match {
      case x: Array[Int]     => x.length * 4
      case x: Array[Double]  => x.length * 8
      case x: Array[Long]    => x.length * 8
      ...
    }
    ```
    
    ... but if the point is merely to avoid native code, how about just using the 'equivalent' of the `Arrays` methods from Scala? They look like non-native code and do about the same thing as the original optimized version you pasted.
    
    I think you're taking it a step further by trying to assess the component type just once. That is probably a further win, but is it enough to matter and to be worth the complexity?
    
    Hm, I don't think `@specialized` can be used here, but I wish it could.
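
    For illustration only, here is a minimal, compilable version of the sketch above, with the elided cases filled in for the remaining primitive element types (the object name `SizeOfSketch` and the fallback behavior are assumptions for this example; JVM object-header and alignment overhead are deliberately ignored):

    ```scala
    object SizeOfSketch {
      // Match once on the array's component type, then multiply the
      // length by that type's per-element width in bytes.
      def sizeOf(xs: AnyRef): Int = xs match {
        case x: Array[Int]     => x.length * 4
        case x: Array[Double]  => x.length * 8
        case x: Array[Long]    => x.length * 8
        case x: Array[Float]   => x.length * 4
        case x: Array[Char]    => x.length * 2
        case x: Array[Short]   => x.length * 2
        case x: Array[Byte]    => x.length * 1
        case x: Array[Boolean] => x.length * 1
        case _ =>
          throw new IllegalArgumentException("not a primitive array")
      }
    }
    ```

    This kind of match works because JVM arrays are reified: an `Array[Int]` really is an `int[]` at runtime, so the type patterns are not defeated by erasure the way matches on, say, `List[Int]` would be.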

