What is the best-practice way to get HTTP(S) POST requests into a Kafka topic (Kafka v2.0.0, installed on an HDP cluster)? I have never used Kafka before and would like to know how this should be done.

Basically, we have a public URL that is going to receive requests from a specific external service via event hooks (https://developers.acuityscheduling.com/docs/webhooks), and I want to get these requests into a Kafka topic. I've seen the Confluent REST Proxy (https://docs.confluent.io/3.0.0/kafka-rest/docs/intro.html), but am a bit confused (again, I have never used Kafka before).

Will there need to be an always-on producer that receives these event hooks and produces them into a topic? What is the best-practice way to handle the common fault-tolerance concerns that should be built into a Kafka producer for this kind of live event feed? Is there really no way to just automatically dump the requests into the topic, so I can avoid having to keep such a simple forwarding producer alive (and thus permanently losing whatever data arrives during its downtime)?
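For context, this is roughly what I imagine such an always-on forwarding producer would look like: a minimal sketch, assuming kafka-python, a broker at localhost:9092, a JSON webhook body, and a made-up topic name and port (all of these are my assumptions, not anything from the Acuity docs):

```python
# Hypothetical minimal webhook receiver that forwards POST bodies into Kafka.
# Assumptions: kafka-python installed, broker at localhost:9092, JSON body
# with an "id" field (adjust parsing if the hooks send form-encoded data).
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

TOPIC = "acuity-webhooks"  # hypothetical topic name


def to_record(body: bytes) -> tuple[bytes, bytes]:
    """Parse the webhook body and key the record by its 'id' (if present)."""
    payload = json.loads(body or b"{}")
    key = str(payload.get("id", "")).encode()
    return key, json.dumps(payload).encode()


class HookHandler(BaseHTTPRequestHandler):
    producer = None  # set in main()

    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        key, value = to_record(body)
        # send() is asynchronous; the producer batches and retries per its config
        self.producer.send(TOPIC, key=key, value=value)
        self.send_response(200)
        self.end_headers()


def main():
    from kafka import KafkaProducer  # deferred import: needs kafka-python
    HookHandler.producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        acks="all",   # wait for in-sync replicas to ack, for durability
        retries=5,    # retry transient broker errors before giving up
    )
    HTTPServer(("0.0.0.0", 8080), HookHandler).serve_forever()


if __name__ == "__main__":
    main()
```

Is this the right general shape, or is there an existing component that does this forwarding for me?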
Thank you