Maybe it’s the reverse: the package is built to run on the latest R but is not 
compatible with slightly older releases (3.5.2 dates from Dec 2018).
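
A quick way to confirm the mismatch on the failing machine (a minimal sketch; it assumes SparkR is installed under $SPARK_HOME/R/lib as in the binary distribution, so adjust lib.loc if it lives elsewhere):

# Running R version vs. the R version the SparkR package was installed under
getRversion()
packageDescription("SparkR",
                   lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))$Built
# If "Built" reports R 3.6.x while the session runs 3.5.x, the package's
# namespace metadata is newer than the running R expects, which matches
# the rbind() error below.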

________________________________
From: Jeff Zhang <zjf...@gmail.com>
Sent: Thursday, December 26, 2019 5:36:50 PM
To: Felix Cheung <felixcheun...@hotmail.com>
Cc: user.spark <user@spark.apache.org>
Subject: Re: Fail to use SparkR of 3.0 preview 2

I use R 3.5.2

Felix Cheung <felixcheun...@hotmail.com> wrote on Friday, December 27, 2019 at 4:32 AM:
It looks like a change in the method signature in R base packages.

Which version of R are you running?
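
(To check from the same session that fails to load SparkR — a minimal sketch:)

R.version.string   # prints the running R version, e.g. "R version 3.5.2 (2018-12-20)"
sessionInfo()      # fuller report: platform, locale, and attached packages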

________________________________
From: Jeff Zhang <zjf...@gmail.com>
Sent: Thursday, December 26, 2019 12:46:12 AM
To: user.spark <user@spark.apache.org>
Subject: Fail to use SparkR of 3.0 preview 2

I tried the SparkR package from Spark 3.0 preview 2, but hit the following issue.

Error in rbind(info, getNamespaceInfo(env, "S3methods")) :
  number of columns of matrices must match (see arg 2)
Error: package or namespace load failed for ‘SparkR’ in rbind(info, 
getNamespaceInfo(env, "S3methods")):
 number of columns of matrices must match (see arg 2)
During startup - Warning messages:
1: package ‘SparkR’ was built under R version 3.6.2
2: package ‘SparkR’ in options("defaultPackages") was not found

Does anyone know what might be wrong? Thanks
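
(For context, the exact load sequence isn't shown above; a typical SparkR load from a downloaded distribution looks like the sketch below, with a hypothetical SPARK_HOME path — the library() call is where the error surfaces.)

# Illustrative sketch only: loading SparkR from a Spark binary distribution
Sys.setenv(SPARK_HOME = "/path/to/spark-3.0.0-preview2-bin-hadoop2.7")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)                      # fails here with the rbind()/getNamespaceInfo() error
sparkR.session(master = "local[*]")  # never reached when the namespace fails to load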



--
Best Regards

Jeff Zhang


--
Best Regards

Jeff Zhang
