I see. Are you able to connect to the Databricks Spark cluster with a Scala
REPL?

The same URL should work with the R Spark client. I'm not sure how you're
meant to specify the Kafka credentials, but perhaps this will help you
connect R to Spark.
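
Something like this might be a starting point (an untested sketch; I'm
assuming you're running sparklyr from a Databricks notebook, and the
Confluent endpoint plus the <API_KEY>/<API_SECRET> values below are just
placeholders you'd swap for your own. Spark passes any read option prefixed
with "kafka." through to the underlying Kafka consumer, which is how the
SASL settings get in):

library(sparklyr)

# Inside a Databricks notebook, attach to the cluster's own Spark
# rather than master = "local"
sc <- spark_connect(method = "databricks")

# Confluent Cloud typically uses SASL_SSL with the PLAIN mechanism;
# the API key/secret go into the JAAS config string
jaas <- paste0(
  "org.apache.kafka.common.security.plain.PlainLoginModule required ",
  "username=\"<API_KEY>\" password=\"<API_SECRET>\";"
)

read_options <- list(
  kafka.bootstrap.servers = "<your-cluster>.confluent.cloud:9092",
  kafka.security.protocol = "SASL_SSL",
  kafka.sasl.mechanism    = "PLAIN",
  kafka.sasl.jaas.config  = jaas,
  subscribe               = "topic1"
)

stream <- stream_read_kafka(sc, options = read_options)

If that connects, stream_write_kafka takes the same kind of option list for
the producer side.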

On Thu, Jul 29, 2021, 6:35 AM Paulo Ramos <paulo.ra...@3pillarglobal.com>
wrote:

> Yes
>
> That piece of code is not working for me; when I run it, I do not get
> anything back.
>
> I am trying to connect through Databricks, so I think I have to change
> the "local" value in master?
>
> Also, how do I add the Kafka username and password to the read options?
>
> Note: Kafka is hosted in Confluent.
>
> Best Regards
> Paulo
>
> On Wed, Jul 28, 2021 at 8:52 PM Blake Miller <blak3mil...@gmail.com>
> wrote:
>
>> Paulo,
>>
>> Could you clarify what your question is? I wasn't able to understand it
>> from your email.
>>
>> I understand that you want to read data from Kafka in R, and you shared
>> some R code. What is the issue that you're having with this code?
>>
>> Regards,
>> Blake
>>
>> On Tue, Jul 27, 2021 at 2:42 PM Paulo Ramos
>> <paulo.ra...@3pillarglobal.com.invalid> wrote:
>>
>> > Hi All,
>> >
>> > I have a technical question regarding reading streaming messages from Kafka
>> > using R.
>> >
>> > This is what I am currently using:
>> >
>> > if (FALSE) {
>> >   library(sparklyr)
>> >   sc <- spark_connect(master = "local", version = "2.3", packages = "kafka")
>> >   read_options <- list(kafka.bootstrap.servers = "localhost:9092",
>> >                        subscribe = "topic1")
>> >   write_options <- list(kafka.bootstrap.servers = "localhost:9092",
>> >                         topic = "topic2")
>> >   stream <- stream_read_kafka(sc, options = read_options) %>%
>> >     stream_write_kafka(options = write_options)
>> >   stream_stop(stream)
>> > }
>> >
>> > But I need to add
>> > sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule
>> > required username="" password="" to connect to my Kafka topic. Also, what do
>> > I have to put in master = "local" if I am using Databricks? This is for the
>> > consumer part.
>> >
>> > Any help regarding this issue would be greatly appreciated.
>> >
>> > Best Regards
>> >
>> > --
>> > Paulo Ramos
>> > 3Pillar Global
>> > Office:
>> > North America | Asia Pacific | Europe | Latin America
>> > www.3PillarGlobal.com
>> >
>>
>
>
> --
> Paulo Ramos
> 3Pillar Global
> Office:
> North America | Asia Pacific | Europe | Latin America
> www.3PillarGlobal.com
>
>
>
