I am trying to pass lambda expressions to Spark JavaRDD methods.
Having used lambda expressions in general Java code, I was hoping for
similar behaviour and coding patterns, but am running into confusing
compile errors.
The use case is a lambda expression that has a number of statements,
returning a boolean from various points in the logic.
I have tried writing it inline as well as assigning it to a Function
variable, with no luck.
Here is an example:
Function<String, Boolean> checkHeaders2 = x -> {
    if (x.startsWith("npi") || x.startsWith("CPT"))
        return new Boolean(false);
    else
        new Boolean(true);
};
This code gets an error stating that method must return a Boolean.
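(As a sketch of what the compiler is likely complaining about: in a block-bodied lambda every code path must end in a return or a throw, and the else branch above creates a Boolean but never returns it. The snippet below uses a stand-in interface mirroring Spark's org.apache.spark.api.java.function.Function so it compiles without the Spark jar on the classpath; the fix itself is just the added "return".)

```java
import java.io.Serializable;

public class CheckHeadersSketch {
    // Stand-in mirroring Spark's org.apache.spark.api.java.function.Function,
    // so this sketch is self-contained and compiles without Spark.
    @FunctionalInterface
    interface Function<T1, R> extends Serializable {
        R call(T1 v1) throws Exception;
    }

    // Every path through the block body now returns a Boolean, which is
    // what the "method must return a Boolean" error is asking for.
    static final Function<String, Boolean> checkHeaders2 = x -> {
        if (x.startsWith("npi") || x.startsWith("CPT"))
            return Boolean.FALSE; // cached boxed value, preferred over new Boolean(false)
        else
            return Boolean.TRUE;  // the "return" missing from the original else branch
    };

    public static void main(String[] args) throws Exception {
        System.out.println(checkHeaders2.call("npi123")); // header row -> false
        System.out.println(checkHeaders2.call("12345"));  // data row -> true
    }
}
```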
I know that the lambda expression can be shortened to a single return
statement, but in plain (non-Spark) Java 8 with a Predicate functional
interface, this pattern would compile and be usable.
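(For comparison, here is a sketch of the shortened single-statement form mentioned above, written against java.util.function.Predicate; an expression-bodied lambda has no block, so no explicit return statement is needed.)

```java
import java.util.function.Predicate;

public class PredicateSketch {
    // Same header check as a single-expression lambda: true for data rows,
    // false for rows starting with "npi" or "CPT".
    static final Predicate<String> isDataRow =
        x -> !(x.startsWith("npi") || x.startsWith("CPT"));

    public static void main(String[] args) {
        System.out.println(isDataRow.test("npi123")); // false
        System.out.println(isDataRow.test("12345"));  // true
    }
}
```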
What am I missing, and how do I use the Spark Function interface to define
lambda expressions made up of multiple Java statements?
Thanks
rd
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Using-Java-Function-API-with-Java-8-tp25794.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.