Hello Spark community

I am trying to connect a Spark worker to a Spark Connect server.

Following the documentation, I am able to get the Spark Connect server
running in the simple single-node way.
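
For reference, I can reach that single-node server from an ordinary client,
e.g. something like the following (localhost and 15002 are just my local
setup; 15002 is the default Connect port):

$SPARK_HOME/bin/pyspark --remote "sc://localhost:15002"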


Am I correct to assume that the Spark Connect server can work with Spark
workers? If so, how do I connect a Spark worker to a Spark Connect server?
I have a standalone Spark setup, and I am used to using the scripts that start
worker daemons and register them with the master. I tried connecting a worker
to the Connect server the same way I would connect it to a master.
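
For context, this is the standalone flow I am used to (the host name below is
just a placeholder; 7077 is the default master RPC port):

# start the standalone master (listens for workers on 7077 by default)
$SPARK_HOME/sbin/start-master.sh

# start a worker daemon and register it with that master
$SPARK_HOME/sbin/start-worker.sh spark://<master-host>:7077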


First, I start the Connect server:

$SPARK_HOME/sbin/start-connect-server.sh \
  --packages org.apache.spark:spark-connect_2.12:3.5.4


Then I try to connect a worker:

$SPARK_HOME/sbin/spark-daemon.sh start \
  org.apache.spark.deploy.worker.Worker 1 spark://nxxcxx:15002

However, I get an error:

25/01/23 11:57:19 INFO Worker: Connecting to master cxxnxx:15002...
25/01/23 11:57:19 INFO TransportClientFactory: Successfully created connection to cxxnxx/192.xxx.x.xxx:15002 after 38 ms (0 ms spent in bootstraps)
25/01/23 11:57:20 WARN TransportChannelHandler: Exception in connection from cxxxnxx/192.xxx.x.xx:15002
java.lang.IllegalArgumentException: Too large frame: 19808389169144




-- 
Andrew Petersen, PhD
Advanced Computing, Office of Information Technology
2620 Hillsborough Street
datascience.oit.ncsu.edu
