Hi everyone,

I'd like to start a discussion about adding API stability annotations to 
PyFlink, similar to those currently used in the Java APIs (@PublicEvolving, 
@Experimental, etc.).

These annotations clearly indicate the stability status of each API component, 
which is very helpful for understanding which parts of the API are safe to rely 
on in production and which might change in future releases.

The Python APIs lack this kind of explicit stability indication. It is possible 
to add a `deprecated` directive to a class/function docstring to produce a 
deprecation notice in the API documentation, but this is not widely used.
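
For reference, the existing docstring-based approach looks roughly like this, 
using Sphinx's `deprecated` directive (the version number and replacement method 
mirror the example further below and are purely illustrative):

```python
class Table(object):

    def get_schema(self) -> 'TableSchema':
        """
        Returns the schema of this table.

        .. deprecated:: 2.1.0
            Use :func:`Table.get_resolved_schema` instead.
        """
        ...
```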

I would like to port the API stability annotations from the Java project into 
Python equivalents, implemented as decorators on Python classes/functions, so 
that the stability annotations are aligned between the Java and Python APIs. 
These decorators would enrich the docstrings of the decorated classes/functions 
to produce consistent sections in the API documentation describing the stability 
of the class/function as PublicEvolving, Experimental, Deprecated, etc.

Example usage:
```python
@PublicEvolving()
class Table(object):
    ...

    @Deprecated(since="2.1.0",
                detail="Use :func:`Table.get_resolved_schema` instead.")
    def get_schema(self) -> TableSchema:
        ...
```

This would produce a notice in the API documentation that the Table class' 
interface is public but evolving, and that `get_schema` has been deprecated 
since 2.1.0 in favour of `get_resolved_schema`.
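
As a rough sketch of how these decorators could work internally (the names and 
the exact wording of the emitted section are placeholders, not a final design), 
each one essentially just appends a stability section to the wrapped object's 
docstring so that Sphinx renders it:

```python
from typing import TypeVar

T = TypeVar("T")

_EVOLVING_NOTE = ("\n\n.. note::\n"
                  "    This API is public but evolving; it may change "
                  "between minor releases.\n")


def PublicEvolving():
    """Sketch: mark a class or function as public but evolving."""
    def wrap(target: T) -> T:
        # Append a stability section to the docstring so it shows up in
        # the generated API documentation.
        target.__doc__ = (target.__doc__ or "") + _EVOLVING_NOTE
        return target
    return wrap
```

Experimental and Internal would work the same way with different text, and 
Deprecated would additionally take the `since`/`detail` arguments shown above.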

The only real wrinkle is that Java's @Deprecated annotation causes the compiler 
to emit a warning wherever a deprecated element is used, and Python has no 
equivalent compile step. The decorator could instead emit a DeprecationWarning 
at runtime when a deprecated function is called, but I'm not entirely sure how 
pleasant that would be for users.
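
If a runtime warning is wanted, the Deprecated decorator could be implemented 
roughly like this (again just a sketch, not a final design), using the standard 
warnings module on top of the docstring enrichment:

```python
import functools
import warnings


def Deprecated(since: str, detail: str):
    """Sketch: document a deprecation and warn when the function is called."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"{func.__qualname__} is deprecated since {since}. {detail}",
                DeprecationWarning,
                stacklevel=2,
            )
            return func(*args, **kwargs)

        # Still append the documentation section, so the runtime warning
        # is purely additive on top of the docs change.
        wrapper.__doc__ = (func.__doc__ or "") + (
            f"\n\n.. deprecated:: {since}\n    {detail}\n")
        return wrapper
    return decorator
```

One mitigating factor is that CPython ignores DeprecationWarning by default 
unless the call happens directly in __main__ (and test runners such as pytest 
typically re-enable it), so in practice the warning would mostly surface to 
developers rather than to every end user.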

Thanks for reading! I'd welcome any opinions or feedback on whether this would 
be useful to developers currently working with the Python APIs.

Thanks,
Mika
