Hi all, Databricks offers session-state saving for their serverless notebooks: you can shut down the cluster, start it back up later, and continue as if it had never been shut down. You can read about it here: <https://www.databricks.com/blog/seamlessly-resume-sessions-serverless-notebooks>.
Technically, this means they serialize the session state and load it back onto a cluster when needed. I don't know of any way to do that in OSS Spark, nor of any API that could support it. Can this be achieved with OSS Spark today? If not, could it be achieved with reasonable effort? I'd appreciate your thoughts on this.
Nimrod
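For what it's worth, the plain-Python side of the state seems like the easy part. Here is a minimal sketch of what I have in mind (my own guess at an approach, not an existing Spark API); the hard part is the Spark-side state, since `SparkSession` objects and DataFrames are not picklable and would have to be persisted separately, e.g. as tables or checkpoints:

```python
import pickle

def save_session(namespace: dict, path: str) -> None:
    """Snapshot the picklable, user-defined variables in a session.

    Spark objects (SparkSession, DataFrames) cannot be pickled and are
    silently skipped here; they would need to be written out as tables
    or checkpoints instead.
    """
    state = {}
    for name, value in namespace.items():
        if name.startswith("_"):
            continue  # skip internal/private names
        try:
            pickle.dumps(value)  # probe whether the value is picklable
        except Exception:
            continue  # unpicklable (sockets, Spark handles, ...) - skip
        state[name] = value
    with open(path, "wb") as f:
        pickle.dump(state, f)

def load_session(namespace: dict, path: str) -> None:
    """Restore previously saved variables into the given namespace."""
    with open(path, "rb") as f:
        namespace.update(pickle.load(f))
```

One would call `save_session(globals(), "/some/shared/path")` before shutdown and `load_session(globals(), ...)` on the new cluster, but as noted, this covers only plain Python objects, not the driver/executor state that makes Databricks' feature interesting.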
