hey Dodgy Bob, Linux & C programmers, and conscientious non-objectors,

I have a great idea I want to share with you.
In Linux I am familiar with wc {wc = word count} (Linux users don't like
long-winded typing).
The wc flags are:
       -c, --bytes
              print the byte counts
       -m, --chars
              print the character counts
       -l, --lines
              print the newline counts


zahid@192:~/Downloads> wc -w /etc/hostname
55 /etc/hostname

The first programme I was tasked to write in C was to replicate the
Linux wc utility. I called it wordcount.c, invoked as wordcount -c -l -m
or wordcount -c -l /etc.

Anyway, on this page https://spark.apache.org/examples.html
there are examples of word count in Scala, Python and Java.
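
Roughly, the Java one on that page boils down to the following (a
from-memory sketch against the Spark 2.x Java RDD API, so the class name
and details are mine rather than the exact code from the page):

    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Tuple2;

    public class WordCount {
        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("Word Count").setMaster("local");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Split each line on whitespace, pair every word with 1, then sum.
            JavaRDD<String> lines = sc.textFile(args[0]);
            JavaPairRDD<String, Integer> counts = lines
                    .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
                    .mapToPair(word -> new Tuple2<>(word, 1))
                    .reduceByKey(Integer::sum);

            counts.collect().forEach(t -> System.out.println(t._1() + "\t" + t._2()));
            sc.stop();
        }
    }

A C version could follow exactly the same shape: read, split, count.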

I kinda feel left out, because I know a little C and a little Linux.
I think it would be a great idea, for the sake of "familiarity for the
client" (the application developer), to add a C example as well.
I was thinking of raising a JIRA, but I thought I would consult with
fellow developers first. :)

Please be kind.

Backbutton.co.uk
¯\_(ツ)_/¯
♡۶Java♡۶RMI ♡۶
Make Use Method {MUM}
makeuse.org
<http://www.backbutton.co.uk>


On Mon, 9 Mar 2020 at 08:57, Zahid Rahman <zahidr1...@gmail.com> wrote:

> Hey Floyd,
>
> I just realised something:
> You need to practice using the adduser command to create users (or in
> your case useradd, because that's less painful for you) instead of
> working as root.
> Trust me, it is good for you.
> Then you will realise that this bit of code, new SparkConf(), is reading
> from /etc/hostname and not the /etc/hosts file for the IP address.
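>
> To see what the JVM (and therefore new SparkConf()) actually resolves,
> you can run a little probe like this (my own sketch, nothing
> Spark-specific): the JVM asks the OS for the local hostname, which is
> typically seeded from /etc/hostname at boot, and then resolves it
> through /etc/hosts or DNS.
>
>     import java.net.InetAddress;
>
>     public class HostCheck {
>         public static void main(String[] args) throws Exception {
>             // Ask the JVM for the local host; this uses the machine's
>             // hostname and the system resolver (/etc/hosts, then DNS).
>             InetAddress addr = InetAddress.getLocalHost();
>             System.out.println(addr.getHostName() + " -> " + addr.getHostAddress());
>         }
>     }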
>
> Backbutton.co.uk
> ¯\_(ツ)_/¯
> ♡۶Java♡۶RMI ♡۶
> Make Use Method {MUM}
> makeuse.org
> <http://www.backbutton.co.uk>
>
>
> On Wed, 4 Mar 2020 at 21:14, Andrew Melo <andrew.m...@gmail.com> wrote:
>
>> Hello Zahid,
>>
>> On Wed, Mar 4, 2020 at 1:47 PM Zahid Rahman <zahidr1...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I found the problem: on my Linux operating system, /etc/hostname was
>>> blank.
>>>
>>> *STEP 1*
>>> I searched Google for the error message and found an answer suggesting
>>> I should add this line to /etc/hostname:
>>>
>>> 127.0.0.1  [hostname] localhost
>>>
>>
>> I believe you've confused /etc/hostname and /etc/hosts --
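>> /etc/hostname normally holds just the machine's name (a single word),
>> while /etc/hosts holds the IP-address-to-name mappings.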
>>
>>
>>>
>>> I did that, but there was still an error; this time the Spark log on
>>> standard output was concatenating the text content of /etc/hostname,
>>> like so: 127.0.0.1[hostname]localhost.
>>>
>>> *STEP 2*
>>> My second attempt was to change /etc/hostname to 127.0.0.1.
>>> This time I was getting a warning with information about "using
>>> loopback" rather than an error.
>>>
>>> *STEP 3*
>>> I wasn't happy with that, so I then changed /etc/hostname to the
>>> address shown below, and the warning message disappeared. My guess is
>>> that the error is caused by the act of creating the Spark session, in
>>> the SparkConf() API.
>>>
>>>      SparkConf sparkConf = new SparkConf()
>>>              .setAppName("Simple Application")
>>>              .setMaster("local")
>>>              .set("spark.executor.memory","2g");
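>>>
>>> As the error text below also suggests, an alternative (an illustrative
>>> sketch, I have not tried it here) is to set spark.driver.bindAddress
>>> explicitly, so the driver does not depend on hostname resolution at all:
>>>
>>>      SparkConf sparkConf = new SparkConf()
>>>              .setAppName("Simple Application")
>>>              .setMaster("local")
>>>              // bind the driver to loopback explicitly instead of
>>>              // relying on whatever /etc/hostname resolves to
>>>              .set("spark.driver.bindAddress", "127.0.0.1")
>>>              .set("spark.executor.memory", "2g");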
>>>
>>> $ cat /etc/hostname
>>> # hosts         This file describes a number of hostname-to-address
>>> #               mappings for the TCP/IP subsystem.  It is mostly
>>> #               used at boot time, when no name servers are running.
>>> #               On small systems, this file can be used instead of a
>>> #               "named" name server.
>>> # Syntax:
>>> #
>>> # IP-Address  Full-Qualified-Hostname  Short-Hostname
>>> #
>>>
>>> 192.168.0.42
>>>
>>> zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
>>> mvn exec:java -Dexec.mainClass=com.javacodegeek.examples.SparkExampleRDD
>>> -Dexec.args="input.txt"
>>> [INFO] Scanning for projects...
>>> [WARNING]
>>> [WARNING] Some problems were encountered while building the effective
>>> model for javacodegeek:examples:jar:1.0-SNAPSHOT
>>> [WARNING] 'build.plugins.plugin.version' for
>>> org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 12,
>>> column 21
>>> [WARNING]
>>> [WARNING] It is highly recommended to fix these problems because they
>>> threaten the stability of your build.
>>> [WARNING]
>>> [WARNING] For this reason, future Maven versions might no longer support
>>> building such malformed projects.
>>> [WARNING]
>>> [INFO]
>>> [INFO] -----------------------< javacodegeek:examples
>>> >------------------------
>>> [INFO] Building examples 1.0-SNAPSHOT
>>> [INFO] --------------------------------[ jar
>>> ]---------------------------------
>>> [INFO]
>>> [INFO] --- exec-maven-plugin:1.6.0:java (default-cli) @ examples ---
>>> WARNING: An illegal reflective access operation has occurred
>>> WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform
>>> (file:/home/zahid/.m2/repository/org/apache/spark/spark-unsafe_2.12/2.4.5/spark-unsafe_2.12-2.4.5.jar)
>>> to method java.nio.Bits.unaligned()
>>> WARNING: Please consider reporting this to the maintainers of
>>> org.apache.spark.unsafe.Platform
>>> WARNING: Use --illegal-access=warn to enable warnings of further illegal
>>> reflective access operations
>>> WARNING: All illegal access operations will be denied in a future release
>>> Using Spark's default log4j profile:
>>> org/apache/spark/log4j-defaults.properties
>>> 20/02/29 17:20:40 INFO SparkContext: Running Spark version 2.4.5
>>> 20/02/29 17:20:40 WARN NativeCodeLoader: Unable to load native-hadoop
>>> library for your platform... using builtin-java classes where applicable
>>> 20/02/29 17:20:41 INFO SparkContext: Submitted application: Word Count
>>> 20/02/29 17:20:41 INFO SecurityManager: Changing view acls to: zahid
>>> 20/02/29 17:20:41 INFO SecurityManager: Changing modify acls to: zahid
>>> 20/02/29 17:20:41 INFO SecurityManager: Changing view acls groups to:
>>> 20/02/29 17:20:41 INFO SecurityManager: Changing modify acls groups to:
>>> 20/02/29 17:20:41 INFO SecurityManager: SecurityManager: authentication
>>> disabled; ui acls disabled; users  with view permissions: Set(zahid);
>>> groups with view permissions: Set(); users  with modify permissions:
>>> Set(zahid); groups with modify permissions: Set()
>>> 20/02/29 17:20:41 WARN Utils: Service 'sparkDriver' could not bind on a
>>> random free port. You may check whether configuring an appropriate binding
>>> address.
>>> [the three WARN lines above repeat 16 times in total]
>>> 20/02/29 17:20:41 ERROR SparkContext: Error initializing SparkContext.
>>> java.net.BindException: Cannot assign requested address: Service
>>> 'sparkDriver' failed after 16 retries (on a random free port)! Consider
>>> explicitly setting the appropriate binding address for the service
>>> 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
>>> correct binding address.
>>>         at java.base/sun.nio.ch.Net.bind0(Native Method)
>>>         at java.base/sun.nio.ch.Net.bind(Net.java:469)
>>>         at java.base/sun.nio.ch.Net.bind(Net.java:458)
>>>         at
>>> java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
>>>         at
>>> io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:132)
>>>         at
>>> io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:551)
>>>         at
>>> io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1346)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:503)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:488)
>>>         at
>>> io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:985)
>>>         at
>>> io.netty.channel.AbstractChannel.bind(AbstractChannel.java:247)
>>>         at
>>> io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:344)
>>>         at
>>> io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
>>>         at
>>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:510)
>>>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:518)
>>>         at
>>> io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
>>>         at
>>> io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
>>>         at
>>> io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
>>>         at java.base/java.lang.Thread.run(Thread.java:830)
>>> 20/02/29 17:20:41 INFO SparkContext: Successfully stopped SparkContext
>>> [WARNING]
>>> java.net.BindException: Cannot assign requested address: Service
>>> 'sparkDriver' failed after 16 retries (on a random free port)! Consider
>>> explicitly setting the appropriate binding address for the service
>>> 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
>>> correct binding address.
>>>     at sun.nio.ch.Net.bind0 (Native Method)
>>>     at sun.nio.ch.Net.bind (Net.java:469)
>>>     at sun.nio.ch.Net.bind (Net.java:458)
>>>     at sun.nio.ch.ServerSocketChannelImpl.bind
>>> (ServerSocketChannelImpl.java:220)
>>>     at io.netty.channel.socket.nio.NioServerSocketChannel.doBind
>>> (NioServerSocketChannel.java:132)
>>>     at io.netty.channel.AbstractChannel$AbstractUnsafe.bind
>>> (AbstractChannel.java:551)
>>>     at io.netty.channel.DefaultChannelPipeline$HeadContext.bind
>>> (DefaultChannelPipeline.java:1346)
>>>     at io.netty.channel.AbstractChannelHandlerContext.invokeBind
>>> (AbstractChannelHandlerContext.java:503)
>>>     at io.netty.channel.AbstractChannelHandlerContext.bind
>>> (AbstractChannelHandlerContext.java:488)
>>>     at io.netty.channel.DefaultChannelPipeline.bind
>>> (DefaultChannelPipeline.java:985)
>>>     at io.netty.channel.AbstractChannel.bind (AbstractChannel.java:247)
>>>     at io.netty.bootstrap.AbstractBootstrap$2.run
>>> (AbstractBootstrap.java:344)
>>>     at io.netty.util.concurrent.AbstractEventExecutor.safeExecute
>>> (AbstractEventExecutor.java:163)
>>>     at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks
>>> (SingleThreadEventExecutor.java:510)
>>>     at io.netty.channel.nio.NioEventLoop.run (NioEventLoop.java:518)
>>>     at io.netty.util.concurrent.SingleThreadEventExecutor$6.run
>>> (SingleThreadEventExecutor.java:1044)
>>>     at io.netty.util.internal.ThreadExecutorMap$2.run
>>> (ThreadExecutorMap.java:74)
>>>     at io.netty.util.concurrent.FastThreadLocalRunnable.run
>>> (FastThreadLocalRunnable.java:30)
>>>     at java.lang.Thread.run (Thread.java:830)
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] BUILD FAILURE
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] Total time:  3.977 s
>>> [INFO] Finished at: 2020-02-29T17:20:43Z
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [ERROR] Failed to execute goal
>>> org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project
>>> examples: An exception occured while executing the Java class. Cannot
>>> assign requested address: Service 'sparkDriver' failed after 16 retries (on
>>> a random free port)! Consider explicitly setting the appropriate binding
>>> address for the service 'sparkDriver' (for example spark.driver.bindAddress
>>> for SparkDriver) to the correct binding address. -> [Help 1]
>>> [ERROR]
>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>>> -e switch.
>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>> [ERROR]
>>> [ERROR] For more information about the errors and possible solutions,
>>> please read the following articles:
>>> [ERROR] [Help 1]
>>> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>>> zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
>>> mvn exec:java -Dexec.mainClass=com.javacodegeek.examples.SparkExampleRDD
>>> -Dexec.args="input.txt" > log.out
>>> WARNING: An illegal reflective access operation has occurred
>>> WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform
>>> (file:/home/zahid/.m2/repository/org/apache/spark/spark-unsafe_2.12/2.4.5/spark-unsafe_2.12-2.4.5.jar)
>>> to method java.nio.Bits.unaligned()
>>> WARNING: Please consider reporting this to the maintainers of
>>> org.apache.spark.unsafe.Platform
>>> WARNING: Use --illegal-access=warn to enable warnings of further illegal
>>> reflective access operations
>>> WARNING: All illegal access operations will be denied in a future release
>>> Using Spark's default log4j profile:
>>> org/apache/spark/log4j-defaults.properties
>>> 20/02/29 17:21:12 INFO SparkContext: Running Spark version 2.4.5
>>> 20/02/29 17:21:12 WARN NativeCodeLoader: Unable to load native-hadoop
>>> library for your platform... using builtin-java classes where applicable
>>> 20/02/29 17:21:12 INFO SparkContext: Submitted application: Word Count
>>> 20/02/29 17:21:12 INFO SecurityManager: Changing view acls to: zahid
>>> 20/02/29 17:21:12 INFO SecurityManager: Changing modify acls to: zahid
>>> 20/02/29 17:21:12 INFO SecurityManager: Changing view acls groups to:
>>> 20/02/29 17:21:12 INFO SecurityManager: Changing modify acls groups to:
>>> 20/02/29 17:21:12 INFO SecurityManager: SecurityManager: authentication
>>> disabled; ui acls disabled; users  with view permissions: Set(zahid);
>>> groups with view permissions: Set(); users  with modify permissions:
>>> Set(zahid); groups with modify permissions: Set()
>>> 20/02/29 17:21:12 WARN Utils: Service 'sparkDriver' could not bind on a
>>> random free port. You may check whether configuring an appropriate binding
>>> address.
>>> [the three WARN lines above repeat 16 times in total]
>>> 20/02/29 17:21:13 ERROR SparkContext: Error initializing SparkContext.
>>> java.net.BindException: Cannot assign requested address: Service
>>> 'sparkDriver' failed after 16 retries (on a random free port)! Consider
>>> explicitly setting the appropriate binding address for the service
>>> 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
>>> correct binding address.
>>>         at java.base/sun.nio.ch.Net.bind0(Native Method)
>>>         at java.base/sun.nio.ch.Net.bind(Net.java:469)
>>>         at java.base/sun.nio.ch.Net.bind(Net.java:458)
>>>         at
>>> java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
>>>         at
>>> io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:132)
>>>         at
>>> io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:551)
>>>         at
>>> io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1346)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:503)
>>>         at
>>> io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:488)
>>>         at
>>> io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:985)
>>>         at
>>> io.netty.channel.AbstractChannel.bind(AbstractChannel.java:247)
>>>         at
>>> io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:344)
>>>         at
>>> io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
>>>         at
>>> io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:510)
>>>         at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:518)
>>>         at
>>> io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
>>>         at
>>> io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
>>>         at
>>> io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
>>>         at java.base/java.lang.Thread.run(Thread.java:830)
>>> 20/02/29 17:21:13 INFO SparkContext: Successfully stopped SparkContext
>>> zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
>>> ls
>>> input.txt  log.out  pom.xml  src  target
>>> zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
>>> cat log.out
>>> [INFO] Scanning for projects...
>>> [WARNING]
>>> [WARNING] Some problems were encountered while building the effective
>>> model for javacodegeek:examples:jar:1.0-SNAPSHOT
>>> [WARNING] 'build.plugins.plugin.version' for
>>> org.apache.maven.plugins:maven-compiler-plugin is missing. @ line 12,
>>> column 21
>>> [WARNING]
>>> [WARNING] It is highly recommended to fix these problems because they
>>> threaten the stability of your build.
>>> [WARNING]
>>> [WARNING] For this reason, future Maven versions might no longer support
>>> building such malformed projects.
>>> [WARNING]
>>> [INFO]
>>> [INFO] -----------------------< javacodegeek:examples
>>> >------------------------
>>> [INFO] Building examples 1.0-SNAPSHOT
>>> [INFO] --------------------------------[ jar
>>> ]---------------------------------
>>> [INFO]
>>> [INFO] --- exec-maven-plugin:1.6.0:java (default-cli) @ examples ---
>>> [WARNING]
>>> java.net.BindException: Cannot assign requested address: Service
>>> 'sparkDriver' failed after 16 retries (on a random free port)! Consider
>>> explicitly setting the appropriate binding address for the service
>>> 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the
>>> correct binding address.
>>>     at sun.nio.ch.Net.bind0 (Native Method)
>>>     at sun.nio.ch.Net.bind (Net.java:469)
>>>     at sun.nio.ch.Net.bind (Net.java:458)
>>>     at sun.nio.ch.ServerSocketChannelImpl.bind
>>> (ServerSocketChannelImpl.java:220)
>>>     at io.netty.channel.socket.nio.NioServerSocketChannel.doBind
>>> (NioServerSocketChannel.java:132)
>>>     at io.netty.channel.AbstractChannel$AbstractUnsafe.bind
>>> (AbstractChannel.java:551)
>>>     at io.netty.channel.DefaultChannelPipeline$HeadContext.bind
>>> (DefaultChannelPipeline.java:1346)
>>>     at io.netty.channel.AbstractChannelHandlerContext.invokeBind
>>> (AbstractChannelHandlerContext.java:503)
>>>     at io.netty.channel.AbstractChannelHandlerContext.bind
>>> (AbstractChannelHandlerContext.java:488)
>>>     at io.netty.channel.DefaultChannelPipeline.bind
>>> (DefaultChannelPipeline.java:985)
>>>     at io.netty.channel.AbstractChannel.bind (AbstractChannel.java:247)
>>>     at io.netty.bootstrap.AbstractBootstrap$2.run
>>> (AbstractBootstrap.java:344)
>>>     at io.netty.util.concurrent.AbstractEventExecutor.safeExecute
>>> (AbstractEventExecutor.java:163)
>>>     at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks
>>> (SingleThreadEventExecutor.java:510)
>>>     at io.netty.channel.nio.NioEventLoop.run (NioEventLoop.java:518)
>>>     at io.netty.util.concurrent.SingleThreadEventExecutor$6.run
>>> (SingleThreadEventExecutor.java:1044)
>>>     at io.netty.util.internal.ThreadExecutorMap$2.run
>>> (ThreadExecutorMap.java:74)
>>>     at io.netty.util.concurrent.FastThreadLocalRunnable.run
>>> (FastThreadLocalRunnable.java:30)
>>>     at java.lang.Thread.run (Thread.java:830)
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] BUILD FAILURE
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [INFO] Total time:  4.949 s
>>> [INFO] Finished at: 2020-02-29T17:21:16Z
>>> [INFO]
>>> ------------------------------------------------------------------------
>>> [ERROR] Failed to execute goal
>>> org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project
>>> examples: An exception occured while executing the Java class. Cannot
>>> assign requested address: Service 'sparkDriver' failed after 16 retries (on
>>> a random free port)! Consider explicitly setting the appropriate binding
>>> address for the service 'sparkDriver' (for example spark.driver.bindAddress
>>> for SparkDriver) to the correct binding address. -> [Help 1]
>>> [ERROR]
>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>>> -e switch.
>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>> [ERROR]
>>> [ERROR] For more information about the errors and possible solutions,
>>> please read the following articles:
>>> [ERROR] [Help 1]
>>> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>>> zahid@localhost:~/Downloads/apachespark/Apache-Spark-Example/Java-Code-Geek>
>>>
>>>
>>>
>>> Backbutton.co.uk
>>> ¯\_(ツ)_/¯
>>> ♡۶Java♡۶RMI ♡۶
>>> Make Use Method {MUM}
>>> makeuse.org
>>> <http://www.backbutton.co.uk>
>>>
>>
