Ok, that rules out a whole class of errors. Let's continue the diagnostic:
- How are you submitting the application to Spark?
- Which version of Spark are you using in your build tool? (See the
build.sbt sketch after this list.)
- Could you have stale Ivy or Maven caches that resolve to some
locally-built version of Spark?
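
As a rough sketch of what I mean for the build-tool question -- the
version numbers here are only placeholders, substitute whatever your
cluster actually runs:

    // build.sbt (minimal sketch, not your actual build)
    scalaVersion := "2.11.8"

    // Marking Spark as "provided" keeps sbt from bundling Spark classes
    // into the application jar, so the Spark version that spark-submit
    // puts on the classpath is the one actually used at runtime.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0" % "provided"

The key point is that the Spark version declared in the build matches the
distribution you submit to, and that the Scala binary version (2.10 vs
2.11) matches as well.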

On Wed, Oct 5, 2016 at 3:35 PM, kant kodali <kanth...@gmail.com> wrote:

> I am running locally, so they are all on one host
>
>
>
> On Wed, Oct 5, 2016 3:12 PM, Jakob Odersky ja...@odersky.com wrote:
>
>> Are all spark and scala versions the same? By "all" I mean the master,
>> worker and driver instances.
>>
>
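
And to double-check the runtime versions discussed above, a quick way
from spark-shell (assuming the usual sc is in scope):

    scala> sc.version
    scala> scala.util.Properties.versionString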
