Hello,
I'm using Spark Streaming to process Kafka messages, and I want to use a properties file as the input and broadcast the properties:
import java.io.FileInputStream
import java.util.Properties

// Load settings from the file passed as the first argument,
// then broadcast them so the executors can read them.
val props = new Properties()
props.load(new FileInputStream(args(0)))
val sc = initSparkContext()
val propsBC = sc.broadcast(props)
println(s"propsBC: ${propsBC.value}")
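In case the surrounding wiring is useful, here is a minimal end-to-end sketch of that pattern, assuming the Spark 1.x receiver-based spark-streaming-kafka API. The ZooKeeper quorum, group id, topic name, and the message.prefix key are placeholders, and an explicit StreamingContext stands in for initSparkContext():

import java.io.FileInputStream
import java.util.Properties

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object BroadcastPropsExample {
  def main(args: Array[String]): Unit = {
    // Load job settings from the properties file given on the command line.
    val props = new Properties()
    props.load(new FileInputStream(args(0)))

    val ssc = new StreamingContext(
      new SparkConf().setAppName("BroadcastPropsExample"), Seconds(5))

    // Broadcast once; executors read propsBC.value instead of having the
    // Properties object serialized into every task closure.
    val propsBC = ssc.sparkContext.broadcast(props)

    // Placeholder Kafka settings; adjust to your cluster.
    val stream = KafkaUtils.createStream(
      ssc, "zkhost:2181", "my-group", Map("my-topic" -> 1))

    stream.map { case (_, value) =>
      // Read a broadcast property on the executor side.
      val prefix = propsBC.value.getProperty("message.prefix", "")
      prefix + value
    }.print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Note that java.util.Properties is Serializable, so it can be broadcast directly; printing propsBC.value on the driver is a quick sanity check that the broadcast holds what you loaded.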
Hi,
I am new to Spark/Streaming and tried to run a modified
FlumeEventCount.scala example to display all events by adding this call:
stream.map(e => "Event:header:" + e.event.get(0).toString + "body: " +
new String(e.event.getBody.array)).print()
The spark-submit job runs fine with --master local.
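For reference, a minimal self-contained version of the modified example might look like the sketch below, assuming the Spark 1.x push-based spark-streaming-flume API; the object name FlumeEventPrint and the host/port arguments are illustrative, not part of the original example:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

object FlumeEventPrint {
  def main(args: Array[String]): Unit = {
    val Array(host, port) = args

    val ssc = new StreamingContext(
      new SparkConf().setAppName("FlumeEventPrint"), Seconds(2))

    // Push-based receiver: the Flume Avro sink must point at host:port.
    val stream = FlumeUtils.createStream(ssc, host, port.toInt)

    // Print each event's headers and body instead of just counting events.
    stream.map(e =>
      "Event: header: " + e.event.getHeaders.toString +
      ", body: " + new String(e.event.getBody.array)).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Keep in mind that print() only shows the first ten elements of each batch, and a receiver-based stream occupies one core, so a local master usually needs at least two threads (local[2] or local[*]) for the map/print stage to actually run.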