Hi Austin,
In the end I added the following target override for Scala:
```
maven_install(
    artifacts = [
        # testing
        maven.artifact(
            group = "com.google.truth",
            artifact = "truth",
            version = "1.0.1",
        ),
    ] + flink_artifacts(
```
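A minimal sketch of what a complete `maven_install` with a Scala target override might look like with `rules_jvm_external` – the Flink coordinates/versions, the `flink_artifacts()` helper, and the `override_targets` keys and labels below are assumptions, not taken from the thread:
```
load("@rules_jvm_external//:defs.bzl", "maven_install")
load("@rules_jvm_external//:specs.bzl", "maven")

maven_install(
    artifacts = [
        # testing
        maven.artifact(
            group = "com.google.truth",
            artifact = "truth",
            version = "1.0.1",
        ),
        # Flink artifacts (versions assumed; in the thread these come from a
        # flink_artifacts() helper defined in the repo)
        "org.apache.flink:flink-clients_2.12:1.12.2",
        "org.apache.flink:flink-streaming-scala_2.12:1.12.2",
    ],
    override_targets = {
        # Hypothetical labels: point these at whatever Scala library targets
        # your rules_scala setup already provides, so the Scala artifacts are
        # not fetched (and bundled) a second time.
        "org.scala-lang:scala-library": "//third_party/scala:scala_library",
        "org.scala-lang:scala-reflect": "//third_party/scala:scala_reflect",
    },
    repositories = ["https://repo1.maven.org/maven2"],
)
```
Whether this alone keeps Scala classes out of the deploy jar also depends on the `deploy_env` setup discussed further down the page.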
That would be awesome Austin, thanks again for your help on that. In the
meantime, I also filed an issue in the `rules_scala` repo:
https://github.com/bazelbuild/rules_scala/issues/1268.
I know @Aaron Levin is using `rules_scala` for
building Flink apps, perhaps he can help us out here (and I hope he doesn't
mind the ping).
On Wed, May 12, 2021 at 4:13 PM Austin Cawley-Edwards <
austin.caw...@gmail.com> wrote:
Yikes, I see what you mean. I also cannot get `neverlink` or adding the
org.scala-lang artifacts to the deploy_env to remove them from the uber jar.
I'm not super familiar with sbt/Scala, but do you know how exactly the
assembly `includeScala` option works? Is it just a flag that is passed to scalac?
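For context, `deploy_env` is the `java_binary` attribute that subtracts a "provided" environment from the implicit `<name>_deploy.jar`. A minimal sketch of that mechanism, with assumed target names, versions, and `@maven` labels – as this message notes, the same trick was not enough to strip the Scala classes here:
```
# Never executed: this target only models what the Flink runtime already
# provides (including the Scala library), so it can be subtracted from the
# deploy jar.
java_binary(
    name = "flink_provided_env",
    main_class = "unused.Main",  # placeholder; this target is never run
    runtime_deps = [
        "@maven//:org_apache_flink_flink_clients_2_12",
        "@maven//:org_apache_flink_flink_streaming_scala_2_12",
        "@maven//:org_scala_lang_scala_library",
    ],
)

java_binary(
    name = "word_count",
    main_class = "org.example.WordCount",
    runtime_deps = [":word_count_lib"],  # hypothetical library target with the app code
    # Everything reachable from :flink_provided_env is excluded from
    # word_count_deploy.jar.
    deploy_env = [":flink_provided_env"],
)
```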
Hi Austin,
Yep, removing Flink dependencies is working well as you pointed out.
The problem now is that I would also need to remove the Scala library... by
inspecting the jar you will see a lot of Scala-related classes. If you take
a look at the end of the build.sbt file, I have
```
// exclude Scala from the uber jar (typical sbt-assembly setting; exact line assumed)
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
```
Hi Salva,
I think you're almost there. The confusion is definitely not helped by the
ADDONS/PROVIDED_ADDONS thing – I think I tried to get too fancy with that
in the linked thread.
I think the only thing you have to do differently is to adjust the target
you are building/deploying – instead of
`//
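In comparable setups, the adjustment being described is usually to build and submit the binary's implicit `<name>_deploy.jar` output (the uber jar) rather than the plain binary target; the target name below is assumed:
```
# local run, full runtime classpath:
bazel run //:word_count

# self-contained jar to submit to the Flink cluster; with a deploy_env
# configured, the "provided" dependencies are stripped from this jar:
bazel build //:word_count_deploy.jar
```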
Hi Austin,
I followed your instructions and gave `rules_jvm_external` a try.
Overall, I think I've made some progress, but I'm not quite there yet. I have
followed the link [1] given by Matthias, making the necessary changes to my
repo:
https://github.com/salvalcantara/bazel-flink-scala
In particular,
Great! Feel free to post back if you run into anything else or come up with
a nice template – I agree it would be a nice thing for the community to
have.
Best,
Austin
On Tue, May 4, 2021 at 12:37 AM Salva Alcántara wrote:
Hey Austin,
There was no special reason for vendoring using `bazel-deps`, really. I just
took another project as a reference for mine and that project was already
using `bazel-deps`. I am going to give `rules_jvm_external` a try, and
hopefully I can make it work!
Regards,
Salva
Hey Salva,
This appears to be a bug in the `bazel-deps` tool, caused by mixing Scala
and Java dependencies. The tool seems to use the same target name for both,
and thus produces duplicate targets (one for Scala and one for Java).
If you look at the dict lines that are reported as conflicting, you
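To make the conflict concrete, an illustrative sketch of the kind of duplicate that ends up in the generated dependency dictionary follows. This is not the actual `bazel-deps` output format, and the plain-Java jar label is an assumption; only the Scala one appears in the excerpt further down:
```
# Both the Scala and the Java coordinate resolve to the same generated target
# name, so the generated file ends up defining it twice:
scala_entry = {
    "name": "//vendor/org/apache/flink:flink_clients",
    "lang": "scala:2.12.11",
    "jars": ["//external:jar/org/apache/flink/flink_clients_2_12"],
}
java_entry = {
    "name": "//vendor/org/apache/flink:flink_clients",  # same name -> conflict
    "lang": "java",
    "jars": ["//external:jar/org/apache/flink/flink_clients"],
}
```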
Hi Matthias,
Thanks a lot for your reply. I am already aware of that reference, but it's
not exactly what I need. What I'd like to have is the typical word count
(hello world) app migrated from sbt to Bazel, in order to use it as a
template for my Flink/Scala apps.
[Garbled excerpt of the generated `bazel-deps` dictionary, truncated in the archive. Recoverable pieces: an entry for //vendor/org/apache/flink:flink_clients (lang scala:2.12.11, kind import, visibility //visibility:public, jars //external:jar/org/apache/flink/flink_clients_2_12), and a deps list containing .../code/findbugs:jsr305, //vendor/org/apache/flink:flink_streaming_java_2_12, :flink_core, :flink_java, :flink_runtime_2_12, and :flink_optimizer_2_12.]