#4, but with MemoryOnly (more Scala-like)

http://docs.scala-lang.org/style/naming-conventions.html

Constants, Values, Variables and Methods

Constant names should be in upper camel case. That is, if the member is
final, immutable and it belongs to a package object or an object, it may be
considered a constant (similar to Java’s static final members):


   object Container {
       val MyConstant = ...
   }
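
Concretely, applying that rule to option 4 would look something like the
sketch below (names are hypothetical; StorageKind and its field are
illustrative, not actual Spark API):

    // Hypothetical sketch: option 4 (vals on a companion object) using the
    // style guide's upper-camel-case constant naming.
    class StorageKind private (val useMemory: Boolean)

    object StorageKind {
      val MemoryOnly = new StorageKind(useMemory = true)
      val DiskOnly   = new StorageKind(useMemory = false)
    }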


2015-03-04 17:11 GMT-08:00 Xiangrui Meng <men...@gmail.com>:

> Hi all,
>
> There are many places where we use enum-like types in Spark, but in
> different ways. Every approach has both pros and cons. I wonder
> whether there should be an “official” approach for enum-like types in
> Spark.
>
> 1. Scala’s Enumeration (e.g., SchedulingMode, WorkerState, etc)
>
> * All types show up as Enumeration.Value in Java.
>
> http://spark.apache.org/docs/latest/api/java/org/apache/spark/scheduler/SchedulingMode.html
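
(For reference, a minimal sketch of the Enumeration pattern, loosely
following SchedulingMode; the type alias lets signatures say
SchedulingMode instead of Enumeration.Value:)

    object SchedulingMode extends Enumeration {
      // Alias so method signatures can use SchedulingMode rather than Value.
      type SchedulingMode = Value
      val FAIR, FIFO, NONE = Value
    }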
>
> 2. Java’s Enum (e.g., SaveMode, IOMode)
>
> * Implementation must be in a Java file.
> * Values don't show up in the ScalaDoc:
>
> http://spark.apache.org/docs/latest/api/scala/#org.apache.spark.network.util.IOMode
>
> 3. Static fields in Java (e.g., TripletFields)
>
> * Implementation must be in a Java file.
> * Doesn’t need “()” in Java code.
> * Values don't show up in the ScalaDoc:
>
> http://spark.apache.org/docs/latest/api/scala/#org.apache.spark.graphx.TripletFields
>
> 4. Objects in Scala. (e.g., StorageLevel)
>
> * Needs “()” in Java code.
> * Values show up in both ScalaDoc and JavaDoc:
>
> http://spark.apache.org/docs/latest/api/scala/#org.apache.spark.storage.StorageLevel$
>
> http://spark.apache.org/docs/latest/api/java/org/apache/spark/storage/StorageLevel.html
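
(Again for reference, a minimal sketch of this pattern, loosely modeled on
StorageLevel; Level and its fields are illustrative, not the real API:)

    // Enum-like values as vals on a companion object. From Java these
    // surface as methods, hence the trailing "()" noted above, e.g.
    // Level.MEMORY_ONLY().
    class Level private (val useDisk: Boolean, val useMemory: Boolean)

    object Level {
      val MEMORY_ONLY = new Level(useDisk = false, useMemory = true)
      val DISK_ONLY   = new Level(useDisk = true, useMemory = false)
    }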
>
> It would be great if we had an “official” approach for this, as well
> as a naming convention for enum-like values (“MEMORY_ONLY” or
> “MemoryOnly”). Personally, I like 4) with “MEMORY_ONLY”. Any thoughts?
>
> Best,
> Xiangrui
>
