Some of it is intentionally undocumented, as far as I know: an experimental option that may change, a legacy setting, or a safety-valve flag. Certainly anything that's marked an internal conf. (That does raise the question of who it's for, if you have to read the source to find it.)
I don't know if we need to overhaul the conf system, but there may indeed be some confs that could legitimately be documented. I don't know which.

On Tue, Jan 14, 2020 at 7:32 PM Nicholas Chammas <nicholas.cham...@gmail.com> wrote:
>
> I filed SPARK-30510 thinking that we had forgotten to document an option, but
> it turns out that there's a whole bunch of stuff under SQLConf.scala that has
> no public documentation under http://spark.apache.org/docs.
>
> Would it be appropriate to somehow automatically generate a documentation
> page from SQLConf.scala, as Hyukjin suggested on that ticket?
>
> Another thought that comes to mind is moving the config definitions out of
> Scala and into a data format like YAML or JSON, and then sourcing that both
> for SQLConf as well as for whatever documentation page we want to generate.
> What do you think of that idea?
>
> Nick
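For what it's worth, a doc generator along the lines Hyukjin suggested could be fairly small. Here's a rough sketch of the idea, using a made-up ConfEntry case class as a stand-in for Spark's actual ConfigEntry type (the real field names and accessors differ): walk the defined confs, skip anything flagged internal, and render a markdown table for the docs site.

```scala
// ConfEntry is a hypothetical stand-in for Spark's ConfigEntry;
// the field names here are assumptions for illustration only.
case class ConfEntry(key: String, default: String, doc: String, internal: Boolean)

// Render the public (non-internal) confs as a markdown table,
// sorted by key so the generated page is stable across builds.
def renderMarkdown(confs: Seq[ConfEntry]): String = {
  val header = Seq(
    "| Property Name | Default | Meaning |",
    "|---|---|---|")
  val rows = confs
    .filterNot(_.internal)   // intentionally undocumented confs stay out
    .sortBy(_.key)
    .map(c => s"| ${c.key} | ${c.default} | ${c.doc} |")
  (header ++ rows).mkString("\n")
}
```

The same filter-and-render pass would work whether the entries stay in Scala or move to YAML/JSON; the data format question is really about where the source of truth lives, not how the page gets built.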