I'd like to change the default interpreter to pyspark in the spark interpreter group, but it seems the default interpreter is defined in interpreter-setting.json, which is packaged inside the spark interpreter jar, so I cannot modify it. Is there any other way to change the default interpreter?
--
Best Regards

Jeff Zhang