Hi Burak,

Thank you for the tip.
Unfortunately, it does not work. It throws:

java.net.MalformedURLException: unknown protocol: s3n
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1003)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

It looks like the relevant logic is in createRepoResolvers, which does not
currently support s3 repositories. I will file a JIRA ticket for this.
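For what it's worth, the failure can be reproduced outside Spark entirely: java.net.URL only accepts schemes that have a registered stream handler, so any s3n:// string fails at parse time before Spark's resolver code even runs. A minimal sketch (the class name S3nUrlCheck is mine, just for illustration):

```java
import java.net.MalformedURLException;
import java.net.URL;

public class S3nUrlCheck {
    // Returns the message produced when the JVM tries to parse an s3n:// URL.
    static String check() {
        try {
            // java.net.URL only knows the handlers shipped with the JDK
            // (http, https, file, jar, ...); "s3n" is not among them.
            new URL("s3n://bucket/path/to/repo");
            return "s3n handler registered";
        } catch (MalformedURLException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(check());
    }
}
```

So even if --repositories passed the address through correctly, Ivy's URL-based resolvers would choke on the scheme; supporting s3 would need a dedicated resolver, much like the sbt plugin provides.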

Best Regards,

Jerry

On Sat, Oct 3, 2015 at 12:50 PM, Burak Yavuz <brk...@gmail.com> wrote:

> Hi Jerry,
>
> The --packages feature doesn't support private repositories right now.
> However, in the case of s3, it might work. Could you please try using
> the --repositories flag and provide the address:
> `$ spark-submit --packages my:awesome:package --repositories
> s3n://$aws_ak:$aws_sak@bucket/path/to/repo`
>
> If that doesn't work, could you please file a JIRA?
>
> Best,
> Burak
>
>
> On Thu, Oct 1, 2015 at 8:58 PM, Jerry Lam <chiling...@gmail.com> wrote:
>
>> Hi spark users and developers,
>>
>> I'm trying to use spark-submit --packages against a private s3 repository.
>> With sbt, I use fm-sbt-s3-resolver with the proper AWS S3 credentials. I
>> wonder how I can add this resolver to spark-submit so that --packages
>> can resolve dependencies from the private repo?
>>
>> Thank you!
>>
>> Jerry
>>
>
>
