There seems to be some thread confusion here. @Deepak, I'll reply to you in the original thread ("Interested in applying as a technical writer").
Sorry, Abdul!

On Wed, Jun 10, 2020 at 1:59 AM Deepak Vohra <dvohr...@yahoo.com> wrote:

> Thanks Marta,
>
> One of the lacking features I noticed is that several example programs
> are missing for Java and Scala, as in the Data Sources section of the
> Apache Flink 1.10 Documentation: Flink DataSet API Programming Guide
> <https://ci.apache.org/projects/flink/flink-docs-release-1.10/dev/batch/>.
> I could develop the example programs, among other additions.
>
> regards,
> Deepak
>
>
> On Saturday, June 6, 2020, 01:16:14 a.m. PDT, Marta Paes Moreira <
> ma...@ververica.com> wrote:
>
> Hi, Abdul.
>
> Thank you for reaching out and for your interest in Google Season of
> Docs (GSoD)!
>
> We really appreciate that you took the time to walk us through your
> experience, as well as your previous contributions to open source. Even
> though we are accepting applications from people with any kind of
> background, being familiar with Flink and similar projects is definitely
> an advantage — so we highly encourage you to apply! You can learn more
> about the GSoD application process in [1,2].
>
> Let us know if you have any questions, we're happy to help!
>
> Thanks,
>
> Marta
>
> [1] https://developers.google.com/season-of-docs/docs/timeline
> [2] https://developers.google.com/season-of-docs/docs/tech-writer-application-hints
>
> On Thu, Jun 4, 2020 at 7:46 PM abdul basit <abp...@gmail.com> wrote:
>
> > Hello Aljoscha Krettek and Seth Wiesman,
> >
> > I am Abdul Basit, an enthusiastic Data Science and Engineering
> > professional. I am currently doing my industrial PhD, focusing on
> > developing predictive maintenance solutions with the help of Big Data,
> > Cloud and AI tools. My PhD is in collaboration with EIT Digital, City
> > University of London and the Bosch Group.
> >
> > I have previously completed two master's degrees, in Data Engineering
> > and Machine Learning, and have 3 years of industrial experience in
> > these areas. My aim and interest is to bridge the gap between
> > researchers and industry by helping researchers adopt modern tools in
> > the domain of Data and AI. For example, introducing them to tools for
> > building data preparation pipelines with Apache Airflow, Apache Spark,
> > Kafka and others.
> >
> > I have contributed <https://github.com/apache/airflow/pull/6007> to
> > Apache Airflow by implementing an AWS Glue plugin for developing
> > better data and ML pipelines. While working on Data Engineering tasks
> > as part of my projects, I became interested in Apache Flink, and after
> > reading about its use cases I really started to appreciate it and did
> > some hands-on work by replacing Spark Streaming with Flink.
> >
> > Having worked on a few Big Data processing frameworks, I believe that
> > Flink has a lot of potential that has yet to be appreciated by the Big
> > Data community. The documentation and guides available with the
> > official project play a crucial role in increasing adoption of the
> > project. So, I started to look for ways to contribute to this awesome
> > project and fortunately found the documentation improvement proposals
> > "FLIP-60: Restructure the Table API & SQL documentation" and "FLIP
> > 12369: Extend the Table API & SQL Documentation", which are part of
> > Google Season of Docs.
> >
> > I will be very happy to contribute to either of these two tasks of
> > restructuring and extending the Table API & SQL documentation. I have
> > read the detailed descriptions of these projects and they align
> > perfectly with my experience. My previous exposure to Apache Spark and
> > Kafka will help me adapt to the task more efficiently. I look forward
> > to your opinion on my profile before the official application to
> > Google Season of Docs. I have attached my CV for reference.
> >
> > --
> > Thanks
> >
> > Regards,
> > Abdul Basit
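For reference, the kind of example program Deepak mentions for the Data Sources section of the DataSet API guide could look roughly like the sketch below. This is only an illustrative Scala sketch, not code taken from the Flink documentation; the object name and file paths are placeholders.

    // Illustrative sketch only: object name and input paths are placeholders.
    import org.apache.flink.api.scala._

    object DataSourcesExample {
      def main(args: Array[String]): Unit = {
        // Entry point of the DataSet (batch) API.
        val env = ExecutionEnvironment.getExecutionEnvironment

        // Create a DataSet from in-memory elements.
        val words: DataSet[String] = env.fromElements("flink", "dataset", "sources")

        // Read a plain text file line by line (placeholder path).
        val lines: DataSet[String] = env.readTextFile("file:///path/to/input.txt")

        // Read a CSV file into tuples (placeholder path and schema).
        val csv: DataSet[(Int, String, Double)] =
          env.readCsvFile[(Int, String, Double)]("file:///path/to/data.csv")

        // A simple sink to trigger execution.
        words.print()
      }
    }

A matching Java version would follow the same pattern, and running it locally would only need the usual Flink batch dependencies on the classpath plus real input paths in place of the placeholders.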