Hi KhajaAsmath Mohammed,
Please check the configuration of "spark.speculation.interval", and just pass
"30" to it (a bare number, without the "s" suffix).
'''
// TaskSchedulerImpl.start() in Spark, where speculation is scheduled:
override def start(): Unit = {
  backend.start()
  if (!isLocal && conf.get(SPECULATION_ENABLED)) {
    logInfo("Starting speculative execution thread")
    speculationScheduler.scheduleWithFixedDelay(
      () => Utils.tryOrStopSparkContext(sc) { checkSpeculatableTasks() },
      SPECULATION_INTERVAL_MS, SPECULATION_INTERVAL_MS, TimeUnit.MILLISECONDS)
  }
}
'''
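A minimal sketch of that suggestion (note: Spark's own time-typed configs do
accept values like "30s", so the bare number is only a workaround for whatever
is parsing this value as a plain Long):
'''
import org.apache.spark.SparkConf

// Pass the interval as a bare number rather than "30s".
val conf = new SparkConf()
  .set("spark.speculation", "true")
  .set("spark.speculation.interval", "30") // no "s" suffix
'''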
Something is passing this invalid 30s value, yes. Hard to say which
property it is. I'd check if your cluster config sets anything with the
value 30s - whatever is reading this property is not expecting it.
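A quick way to do that check from a spark-shell (assuming an active
SparkContext `sc`) is to scan both the Spark conf and the Hadoop conf for any
property set to the suspicious "30s":
'''
import scala.collection.JavaConverters._

sc.getConf.getAll
  .filter { case (_, v) => v == "30s" }
  .foreach { case (k, v) => println(s"spark conf: $k = $v") }

sc.hadoopConfiguration.iterator().asScala
  .filter(_.getValue == "30s")
  .foreach(e => println(s"hadoop conf: ${e.getKey} = ${e.getValue}"))
'''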
On Mon, Apr 12, 2021, 2:25 PM KhajaAsmath Mohammed
wrote:
> Hi Sean,
>
> Do you think anything in the DFS client could cause this?
If the question is about the Scala / SBT ecosystem, then some of the SBT
plugins will be moved to JFrog (https://scala.jfrog.io/), hopefully by the
end of this month.
Check out the following links:
- https://twitter.com/eed3si9n/status/1381627420927782916?s=20
- https://twitter.com/SethTis
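Once a plugin has moved, a project may need an extra resolver; a hypothetical
sketch (the exact repository path on scala.jfrog.io is an assumption, not
confirmed here):
'''
// project/plugins.sbt: hypothetical resolver entry for after the move
resolvers += "scala-jfrog" at "https://scala.jfrog.io/artifactory/sbt-plugin-releases"
'''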
Hi Sean,
Do you think anything in the DFS client could cause this?
java.lang.NumberFormatException: For input string: "30s"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Long.parseLong(Long.java:589)
        at java.lang.Long.parseL
I am using the Spark HBase connector provided by Hortonworks. I was able to
run without issues in my local environment, but hit this issue on EMR.
Thanks,
Asmath
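As a side note, the parse failure in that trace is easy to reproduce in
isolation; a minimal sketch with Hadoop's Configuration (the property name
below is made up for illustration):
'''
import java.util.concurrent.TimeUnit
import org.apache.hadoop.conf.Configuration

val hc = new Configuration()
hc.set("example.timeout", "30s")

// A plain Long read fails exactly like the trace above:
// hc.getLong("example.timeout", 30L) // NumberFormatException: For input string: "30s"

// A duration-aware read understands the "s" suffix:
val seconds = hc.getTimeDuration("example.timeout", 30L, TimeUnit.SECONDS) // 30
'''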
> On Apr 12, 2021, at 2:15 PM, Sean Owen wrote:
>
>
> Somewhere you're passing a property that expects a number, but giving it "30s".
>
Somewhere you're passing a property that expects a number, but giving it
"30s". Is it a time property somewhere that really just wants milliseconds or
something? But most (all?) time properties in Spark should accept that type
of input anyway. It really depends on what property has a problem and what is
setting it.
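For what it's worth, Spark's time-typed configs go through duration-aware
parsers; a quick check using the JavaUtils helper from spark-network-common:
'''
import org.apache.spark.network.util.JavaUtils

// Suffixed durations parse fine here, so "30s" only breaks something
// that reads the raw string with Long.parseLong.
val ms  = JavaUtils.timeStringAsMs("30s")  // 30000
val sec = JavaUtils.timeStringAsSec("30s") // 30
'''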
Hi,
I am getting a weird error when running a Spark job in an EMR cluster. The
same program runs fine on my local machine. Is there anything I need to do to
resolve this?
21/04/12 18:48:45 ERROR SparkContext: Error initializing SparkContext.
java.lang.NumberFormatException: For input string: "30s"
I tried
Not all the Spark packages on https://spark-packages.org/ are eligible for
Maven Central. We are looking for a replacement for Bintray for
spark-packages.org.
Bo Zhang is actively working on this. Bo, can you share your ideas with the
community?
Cheers,
Xiao
On Mon, Apr 12, 2021 at 9:28 AM Sea
Spark itself is primarily distributed via Maven Central, so I don't think
it will be affected?
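For example, a standard build pulls Spark from Central with no extra
resolver; a minimal build.sbt line (version chosen for illustration, current
at the time of this thread):
'''
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.1"
'''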
On Mon, Apr 12, 2021 at 11:22 AM Florian CASTELAIN <
florian.castel...@redlab.io> wrote:
> Hello.
>
>
>
> Bintray will shut down on the first of May.
>
>
>
> I just saw that packages are hosted on Bintray (whic
Hello.
Bintray will shut down on the first of May.
I just saw that packages are hosted on Bintray (which is actually down for
maintenance).
What will happen after the first of May? Is there any maintenance to do in
projects to still be able to download Spark dependencies?
Regards!
Hi,
looks like you have answered some questions which I generally ask. Another
thing: can you please let me know the environment? Is it AWS, GCP, Azure,
Databricks, HDP, etc.?
Regards,
Gourav
On Sun, Apr 11, 2021 at 8:39 AM András Kolbert
wrote:
> Hi,
>
> Sure!
>
> Application:
> - Spark versio