I know these methods, but I need to create events using the timestamps in
the data tuples, meaning each tuple is emitted as a new event according to
its timestamp in the CSV file. This would be useful for simulating the data
rate over time, just like real sensor data.
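Something along these lines might work: a minimal sketch (not an official Spark API for this, just a custom receiver) that replays a CSV file and sleeps for the gap between consecutive timestamps before emitting each row. The file path, the timestamp column index, and the assumption that timestamps are epoch milliseconds are all illustrative.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.receiver.Receiver

import scala.io.Source

// Custom receiver that replays a CSV file, pacing rows by their timestamp column
// so the resulting DStream approximates the original sensor data rate.
class CsvReplayReceiver(path: String, tsColumn: Int)
  extends Receiver[String](StorageLevel.MEMORY_ONLY) {

  override def onStart(): Unit = {
    new Thread("csv-replay") {
      override def run(): Unit = replay()
    }.start()
  }

  override def onStop(): Unit = {}  // replay loop checks isStopped()

  private def replay(): Unit = {
    val lines = Source.fromFile(path).getLines()
    var previousTs: Option[Long] = None
    for (line <- lines if !isStopped()) {
      val ts = line.split(",")(tsColumn).trim.toLong        // assumed: epoch millis
      previousTs.foreach(prev => Thread.sleep(math.max(0L, ts - prev)))
      previousTs = Some(ts)
      store(line)                                            // emit the tuple as an event
    }
  }
}

object CsvReplayApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("csv-replay").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(1))
    // "sensor.csv" and tsColumn = 0 are placeholders for the real file and schema.
    val events = ssc.receiverStream(new CsvReplayReceiver("sensor.csv", tsColumn = 0))
    events.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```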
On Fri, May 1, 2015 at 2:52 PM, Juan Rodrí
Hi,
Maybe you could use streamingContext.fileStream, as in the example from
https://spark.apache.org/docs/latest/streaming-programming-guide.html#input-dstreams-and-receivers;
with it you can read "from files on any file system compatible with the HDFS API
(that is, HDFS, S3, NFS, etc.)". You could split
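As a rough sketch of that suggestion, assuming CSV files are dropped into a watched directory (the path and column layout below are illustrative), textFileStream turns each new file into a batch of lines that can then be split on commas:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object CsvDirectoryStream {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("csv-dir-stream").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Any file moved into this directory (HDFS, S3, NFS, ...) is picked up in the next batch.
    val lines = ssc.textFileStream("hdfs:///data/incoming")
    val fields = lines.map(_.split(","))                  // split each CSV row into columns
    fields.map(cols => (cols(0), cols.length)).print()    // e.g. first column and field count

    ssc.start()
    ssc.awaitTermination()
  }
}
```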