Does either of the following options work for you?

./bin/spark-shell --conf spark.jars.ivy=${HOME}/.ivy2
./bin/spark-shell --conf spark.jars.ivy=/Users/yourname/.ivy2

I think the issue is that `~` is not expanded by the shell in that position and is 
passed through literally to the Ivy library.
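
A quick way to confirm what the shell actually passes is to echo the argument first 
(a minimal check, assuming bash or zsh; the tilde is not at the start of the word, 
so it is not expanded, while ${HOME} always yields an absolute path):

echo --conf spark.jars.ivy=~/.ivy2          # tilde is passed through as-is
echo --conf spark.jars.ivy=${HOME}/.ivy2    # expands to an absolute path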

Thanks,
Cheng Pan



> On Apr 29, 2025, at 02:41, Jacek Laskowski <ja...@japila.pl> wrote:
> 
> Hi Wenchen,
> 
> Looks like it didn't work in 3.5 either.
> 
> ❯ ./bin/spark-shell --version
> 25/04/28 20:37:48 WARN Utils: Your hostname, Jaceks-Mac-mini.local resolves 
> to a loopback address: 127.0.0.1; using 192.168.68.100 instead (on interface 
> en1)
> 25/04/28 20:37:48 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to 
> another address
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 3.5.5
>       /_/
> 
> Using Scala version 2.12.18, OpenJDK 64-Bit Server VM, 17.0.15
> Branch HEAD
> Compiled by user ubuntu on 2025-02-23T20:30:46Z
> Revision 7c29c664cdc9321205a98a14858aaf8daaa19db2
> Url https://github.com/apache/spark
> Type --help for more information.
> 
> ❯ ./bin/spark-shell --conf spark.jars.ivy=~/.ivy2
> 25/04/28 20:37:54 WARN Utils: Your hostname, Jaceks-Mac-mini.local resolves 
> to a loopback address: 127.0.0.1; using 192.168.68.100 instead (on interface 
> en1)
> 25/04/28 20:37:54 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to 
> another address
> Exception in thread "main" java.lang.IllegalArgumentException: basedir must 
> be absolute: ~/.ivy2/local
> at org.apache.ivy.util.Checks.checkAbsolute(Checks.java:48)
> at 
> org.apache.ivy.plugins.repository.file.FileRepository.setBaseDir(FileRepository.java:137)
> at 
> org.apache.ivy.plugins.repository.file.FileRepository.<init>(FileRepository.java:44)
> at 
> org.apache.spark.deploy.SparkSubmitUtils$.createRepoResolvers(SparkSubmit.scala:1274)
> at 
> org.apache.spark.deploy.SparkSubmitUtils$.buildIvySettings(SparkSubmit.scala:1381)
> at 
> org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:182)
> at 
> org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:339)
> at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:969)
> at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:199)
> at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:222)
> at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
> at 
> org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1125)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1134)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 
> We should change the wording in the migration guide and fix it (at least 
> report it as an issue). I can do both if allowed :wink:
> 
> Pozdrawiam,
> Jacek Laskowski
> ----
> "The Internals Of" Online Books <https://books.japila.pl/>
> Follow me on Bluesky <https://bsky.app/profile/books.japila.pl>
> 
> 
> On Mon, Apr 28, 2025 at 5:04 AM Wenchen Fan <cloud0...@gmail.com> wrote:
>> Hi Jacek,
>> 
>> Thanks for reporting the issue! Did you hit the same problem when you set 
>> the `spark.jars.ivy` config with Spark 3.5? If this config never worked with 
>> a relative path, we should change the wording in the migration guide.
>> 
>> Thanks,
>> Wenchen
>> 
>> On Sun, Apr 27, 2025 at 10:27 PM Jacek Laskowski <ja...@japila.pl> wrote:
>>> Hi,
>>> 
>>> I found in docs/core-migration-guide.md:
>>> 
>>> - Since Spark 4.0, Spark uses `~/.ivy2.5.2` as Ivy user directory by 
>>> default to isolate the existing systems from Apache Ivy's incompatibility. 
>>> To restore the legacy behavior, you can set `spark.jars.ivy` to `~/.ivy2`.
>>> 
>>> With that, I used spark.jars.ivy to run spark-shell with the ~/.ivy2 directory 
>>> for local dependencies.
>>> 
>>> bin/spark-sql --conf spark.jars.ivy=~/.ivy2
>>> 
>>> This ended up with the following issue:
>>> 
>>> Exception in thread "main" java.lang.IllegalArgumentException: basedir must 
>>> be absolute: ~/.ivy2/local
>>> at org.apache.ivy.util.Checks.checkAbsolute(Checks.java:48)
>>> at 
>>> org.apache.ivy.plugins.repository.file.FileRepository.setBaseDir(FileRepository.java:137)
>>> at 
>>> org.apache.ivy.plugins.repository.file.FileRepository.<init>(FileRepository.java:44)
>>> at 
>>> org.apache.spark.util.MavenUtils$.createRepoResolvers(MavenUtils.scala:159)
>>> at org.apache.spark.util.MavenUtils$.buildIvySettings(MavenUtils.scala:287)
>>> at 
>>> org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:118)
>>> at 
>>> org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:341)
>>> at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:961)
>>> at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:204)
>>> at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:227)
>>> at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:96)
>>> at 
>>> org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1132)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1141)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> 
>>> A workaround is to use an absolute path.
>>> 
>>> Is this a known issue? Should I report it against rc4? Please guide. Thanks!
>>> 
>>> Pozdrawiam,
>>> Jacek Laskowski
>>> ----
>>> "The Internals Of" Online Books <https://books.japila.pl/>
>>> Follow me on Bluesky <https://bsky.app/profile/books.japila.pl>
>>> 
