[ https://issues.apache.org/jira/browse/SPARK-51408?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yang Jie resolved SPARK-51408.
------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 50173
[https://github.com/apache/spark/pull/50173]

> AmIpFilterSuite#testProxyUpdate fails in some networks
> ------------------------------------------------------
>
>                 Key: SPARK-51408
>                 URL: https://issues.apache.org/jira/browse/SPARK-51408
>             Project: Spark
>          Issue Type: Bug
>          Components: Tests, YARN
>    Affects Versions: 4.0.0
>            Reporter: Chris Nauroth
>            Assignee: Chris Nauroth
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
> While verifying Spark 4.0.0 RC2, I consistently saw the YARN test
> {{AmIpFilterSuite#testProxyUpdate}} failing in my environment. The test is
> written to eventually expect a {{ServletException}} from
> {{getProxyAddresses}} after 5 seconds of retries, but I never received this
> exception.
>
> This test and the corresponding {{AmIpFilter}} were introduced in
> SPARK-48238 as a fork of the Hadoop implementation to resolve a dependency
> conflict. However, it seems this test had a small bug in the way it was
> adapted into the Spark codebase. The {{AmIpFilter#getProxyAddresses()}}
> logic may either return an empty set or throw a {{ServletException}} if it
> can't find any valid configured proxies. The Hadoop test's assertion allows
> for either of these conditions, but the Spark test's assertion strictly
> requires an exception to be thrown.
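For illustration only, a minimal sketch of an assertion in that spirit, which
treats either outcome as "no valid proxies found." It is not the code from PR
50173. It assumes a JUnit 4 test placed in the same package as the forked
{{AmIpFilter}} (so its protected {{getProxyAddresses()}} is visible), assumes
the jakarta.servlet API (adjust if the fork uses javax.servlet), and uses a
hypothetical helper {{newFilterWithUnresolvableProxies()}} standing in for the
real test setup:

{code:java}
import static org.junit.Assert.assertTrue;

// Assumption: the forked filter exposes ServletException from the
// jakarta.servlet API; swap to javax.servlet if that is not the case.
import jakarta.servlet.ServletException;
import org.junit.Test;

public class ProxyUpdateAssertionSketch {

  @Test
  public void proxyUpdateAcceptsEmptySetOrException() throws Exception {
    // Hypothetical helper: builds an AmIpFilter whose configured proxy hosts
    // cannot be resolved, mirroring the situation testProxyUpdate sets up.
    AmIpFilter filter = newFilterWithUnresolvableProxies();

    boolean noValidProxies = false;
    long deadline = System.currentTimeMillis() + 5_000L; // ~5 seconds of retries
    while (!noValidProxies && System.currentTimeMillis() < deadline) {
      try {
        // Outcome 1: getProxyAddresses() may simply return an empty set.
        noValidProxies = filter.getProxyAddresses().isEmpty();
      } catch (ServletException e) {
        // Outcome 2: it may instead throw when no configured proxy is valid.
        noValidProxies = true;
      }
      if (!noValidProxies) {
        Thread.sleep(500L); // retry interval is illustrative
      }
    }
    assertTrue("expected an empty proxy set or a ServletException", noValidProxies);
  }

  // Placeholder for the real test setup; not part of the Spark code base.
  private AmIpFilter newFilterWithUnresolvableProxies() {
    throw new UnsupportedOperationException("setup omitted in this sketch");
  }
}
{code}

The actual fix may be structured differently; the point is only that the
assertion accepts an empty result and a {{ServletException}} as equally valid
signals that no proxy could be resolved.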