Handling exceptions and control flow in Spark DataFrame write()

2016-12-14 Thread bhayat
Hello, I am writing my RDD into Parquet format, but as I understand it, the write() method is still experimental, and I do not know how to deal with possible exceptions. For example: schemaXXX.write().mode(saveMode).parquet(parquetPathInHdfs); In this example I do not know how I will handle exceptions.
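One common answer on this list is to wrap the write call in a try/catch: the parquet() call is an eager action, so failures (an existing path with SaveMode.ErrorIfExists, HDFS errors, lost executors) surface as exceptions at the call site. A minimal sketch, assuming a Dataset<Row> named schemaXXX and the saveMode / parquetPathInHdfs variables from the question (the ParquetWriteSketch class name is invented for illustration):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;

public class ParquetWriteSketch {

    /**
     * Attempts the Parquet write and reports success/failure to the
     * caller instead of letting the exception propagate.
     */
    static boolean writeParquet(Dataset<Row> schemaXXX,
                                SaveMode saveMode,
                                String parquetPathInHdfs) {
        try {
            // Eager action: the job runs here, so write failures
            // (existing path, HDFS problems, task failures) are
            // thrown from this call and can be caught normally.
            schemaXXX.write().mode(saveMode).parquet(parquetPathInHdfs);
            return true;
        } catch (Exception e) {
            // Spark surfaces analysis and runtime failures as
            // unchecked exceptions; log and let the caller decide
            // whether to retry or abort.
            System.err.println("Parquet write failed: " + e.getMessage());
            return false;
        }
    }
}
```

Whether to retry, clean up the partial output directory, or fail the job is application policy; the point is only that the exception is catchable at the action.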

Stream compressed data from KafkaUtils.createDirectStream

2016-11-03 Thread bhayat
Hello, I wonder whether I can stream compressed data using KafkaUtils.createDirectStream(...) or not. This is the code that I currently use: JavaPairInputDStream messages = KafkaUtils.createStream(javaStreamingContext, zookeeperConfiguration, groupName, topicMap, StorageLevel.MEMORY_A
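The usual answer here is that Kafka compression is transparent to consumers: the producer sets compression.type (gzip, snappy, or lz4), and the consumer library decompresses message batches automatically, so createDirectStream needs no special handling. A sketch against the Spark 1.x spark-streaming-kafka (Kafka 0.8) direct API, with hypothetical broker and topic names:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import kafka.serializer.StringDecoder;
import org.apache.spark.streaming.api.java.JavaPairInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

public class DirectStreamSketch {

    static JavaPairInputDStream<String, String> createStream(
            JavaStreamingContext javaStreamingContext) {
        // Direct stream talks to brokers, not ZooKeeper.
        Map<String, String> kafkaParams = new HashMap<>();
        kafkaParams.put("metadata.broker.list", "broker1:9092"); // hypothetical

        Set<String> topics = new HashSet<>();
        topics.add("myTopic"); // hypothetical

        // Compression is a producer-side setting (compression.type);
        // the consumer decompresses automatically, so the decoded
        // values arrive here as plain strings.
        return KafkaUtils.createDirectStream(
                javaStreamingContext,
                String.class, String.class,
                StringDecoder.class, StringDecoder.class,
                kafkaParams, topics);
    }
}
```

Note that the snippet in the question calls createStream (the receiver-based API, which takes a ZooKeeper quorum and a StorageLevel); the direct API above takes broker addresses and topic names instead.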