I know it's not enabled by default when the binary artifacts are built, but
I'm not exactly sure why it isn't published separately at all. It's almost a
dependencies-only pom artifact, but there are two source files. Steve, do
you have an angle on that?
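
For reference, if it were published, the coordinates one would expect
(going by the module name in the Spark build; the Scala suffix and
version here are just placeholders, not confirmed artifacts on Central)
would look something like:

```xml
<!-- Hypothetical coordinates based on the spark-hadoop-cloud module name;
     adjust the Scala binary version and Spark version to match your build. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hadoop-cloud_2.12</artifactId>
  <version>3.1.1</version>
</dependency>
```

Until something like that is actually on Maven Central, building the
module from the Spark source (as Erik did) or pulling in hadoop-aws and
its matching hadoop-common version directly are the workarounds.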

On Mon, May 31, 2021 at 5:37 AM Erik Torres <etserr...@gmail.com> wrote:

> Hi,
>
> I'm following this documentation
> <https://spark.apache.org/docs/latest/cloud-integration.html#installation> to
> configure my Spark-based application to interact with Amazon S3. However, I
> cannot find the spark-hadoop-cloud module in Maven central for the
> non-commercial distribution of Apache Spark. From the documentation I would
> expect that I can get this module as a Maven dependency in my project.
> However, I ended up building the spark-hadoop-cloud module from the Spark's
> code <https://github.com/apache/spark>.
>
> Is this the expected way to set up the integration with Amazon S3? I think
> I'm missing something here.
>
> Thanks in advance!
>
> Erik
>