Here, the basic table structure for all the customers will be similar. I will distinguish them by creating different DBs, and the name of each DB will be the same as the name of the topic created in Kafka. While inserting or updating a table in the DB, I will use the topic name to create the connection.
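A minimal sketch of that routing on the MongoDB side (assuming the Storm 0.9.x `backtype.storm` packages, the MongoDB 3.x Java driver, and hypothetical tuple fields `customer` and `payload`; the customer value doubles as the Kafka topic name and the database name):

import java.util.Map;

import backtype.storm.task.TopologyContext;
import backtype.storm.topology.BasicOutputCollector;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.base.BaseBasicBolt;
import backtype.storm.tuple.Tuple;

import com.mongodb.MongoClient;
import org.bson.Document;

// Routes each record to the MongoDB database named after its customer.
public class CustomerRoutingBolt extends BaseBasicBolt {

    private transient MongoClient mongo;

    @Override
    public void prepare(Map stormConf, TopologyContext context) {
        // One client per worker; it is not serializable, so build it here.
        mongo = new MongoClient("localhost", 27017);
    }

    @Override
    public void execute(Tuple tuple, BasicOutputCollector collector) {
        // "customer" and "payload" are hypothetical field names; the
        // customer value is also the database (and Kafka topic) name.
        String customer = tuple.getStringByField("customer");
        String payload = tuple.getStringByField("payload");

        mongo.getDatabase(customer)       // database named after the customer
             .getCollection("events")     // assumed collection name
             .insertOne(Document.parse(payload));
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // Terminal bolt: nothing is emitted downstream.
    }
}

The same lookup-by-customer pattern would apply to the Cassandra writer, with the keyspace chosen from the customer field instead of the database.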
On Fri, Aug 7, 2015 at 12:11 PM, Kishore Senji <kse...@gmail.com> wrote:

> >> I will add a topic name in my Kafka. So, is it possible to make Storm
> >> read data from that Kafka topic when the cluster is running?
>
> I assume that you are referring to making the Spout read the data from the
> new topic that is created at runtime (and not really adding a new Spout,
> thereby changing the topology, as that is not possible). If so, this is not
> possible with the KafkaSpout that ships with Storm. You would have to
> extend it or create your own Spout which can do that.
>
> But why are you creating a new topic for every customer and expecting to
> read from each one as and when it is created? Assume for a moment that
> this were possible with the KafkaSpout that ships with Storm: your
> downstream bolt which stores data into MongoDB still has to know which
> message belongs to which customer to store the record appropriately.
>
> Why wouldn't you have only one topic and identify the customer as part of
> the message?
>
>
> On Thu, Aug 6, 2015 at 10:24 PM, Ritesh Sinha <
> kumarriteshranjansi...@gmail.com> wrote:
>
>> I have a topology which runs in the following way:
>> It reads data from a Kafka topic and passes it to Storm. Storm processes
>> the data and stores it into two different DBs (MongoDB & Cassandra).
>>
>> Here, the Kafka topic is named after the customer, and the database name
>> in MongoDB and Cassandra is the same as the name of the Kafka topic.
>>
>> Now, suppose I have submitted the topology, it is running, and I get a
>> new customer.
>>
>> I will add a topic name in my Kafka. So, is it possible to make Storm
>> read data from that Kafka topic when the cluster is running?
>>
>> Thanks
>
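For reference, Kishore's single-topic suggestion amounts to something like the sketch below: a producer using the plain Kafka Java client that writes every customer's records to one shared topic, with the customer id carried in the message itself. The topic name `customer-events`, the customer id, and the message shape are all assumptions, not anything from the thread:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SharedTopicProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);

        // One shared topic; the customer id travels inside the message and
        // as the key, so one customer's records stay ordered per partition.
        String customer = "acme";  // hypothetical customer id
        String message = "{\"customer\":\"" + customer + "\",\"amount\":42}";
        producer.send(new ProducerRecord<>("customer-events", customer, message));
        producer.close();
    }
}

With this layout, onboarding a new customer needs no new topic and no topology restart: the bolts just read the customer out of the message and pick the database from it, as in the routing sketch above.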