Hello,

We are using Apache Flink version 1.10.1. During our security scans, the following
issues were reported by our scan tool.
Please let us know your comments on these dependency vulnerabilities.

Thanks,
Suchithra

-----Original Message-----
From: m...@gsuite.cloud.apache.org <m...@gsuite.cloud.apache.org> On Behalf Of 
Apache Security Team
Sent: Thursday, August 6, 2020 1:08 PM
To: V N, Suchithra (Nokia - IN/Bangalore) <suchithra....@nokia.com>
Cc: Jash, Shaswata (Nokia - IN/Bangalore) <shaswata.j...@nokia.com>; Prabhala, 
Anuradha (Nokia - IN/Bangalore) <anuradha.prabh...@nokia.com>; Badagandi, 
Srinivas B. (Nokia - IN/Bangalore) <srinivas.b.badaga...@nokia.com>
Subject: Re: Security vulnerabilities with Apache Flink 1.10.1 version

Hi,

Outdated dependencies are not always security issues.  A project is only
affected if a dependency is used in such a way that the affected underlying
code is reached and the vulnerability is actually exposed.
We typically get reports from scanning tools that look at dependencies out of
context, without regard to how they are actually used in the project.  As such
we reject these reports and suggest you either a) show how the product is
affected by the dependency vulnerabilities, or b) simply file this as a normal
bug report with that project.  Since dependency vulnerabilities are quite
public, there is no need to use this private reporting mechanism for them.
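
As a first step towards (a), a rough, hypothetical sketch (not an official
tool) is to check whether a flagged class is even loadable in the running
process and which jar it comes from; presence alone still does not show that
the vulnerable code path is reachable:

    public class ClasspathCheck {
        public static void main(String[] args) throws Exception {
            // Fully qualified name of the class flagged by the scanner,
            // e.g. log4j 1.x's SocketServer.
            String name = args.length > 0 ? args[0]
                    : "org.apache.log4j.net.SocketServer";
            try {
                Class<?> c = Class.forName(name);
                java.security.CodeSource src =
                        c.getProtectionDomain().getCodeSource();
                System.out.println(name + " loaded from "
                        + (src != null ? src.getLocation() : "<bootstrap class path>"));
            } catch (ClassNotFoundException e) {
                System.out.println(name + " is not on the class path");
            }
        }
    }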

Regards, Mark

On Thu, Aug 6, 2020 at 6:04 AM V N, Suchithra (Nokia - IN/Bangalore) 
<suchithra....@nokia.com> wrote:
>
> Hello,
>
>
>
> We are using Apache Flink version 1.10.1. During our security scans, the following
> issues were reported by our scan tool.
>
>
>
> 1. Package: log4j-1.2.17
>
> Severity: CRITICAL
>
> Fix version: 2.8.2
>
>
>
> Description:
>
> Apache Log4j contains a flaw that is triggered as the SocketServer class 
> accepts log data from untrusted network traffic, which it then insecurely 
> deserializes. This may allow a remote attacker to potentially execute 
> arbitrary code.
>
>
>
> Path:
>
> /opt/flink/lib/log4j-1.2.17.jar
>
> /opt/flink/bin/bash-java-utils.jar:log4j
>
>
>
> References:
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2019-17571
>
> https://seclists.org/oss-sec/2019/q4/167
>
> https://logging.apache.org/log4j/1.2/
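>
> As a rough, hypothetical illustration (not something Flink itself runs), the
> code path behind CVE-2019-17571 is only exercised when a log4j 1.x socket
> server is actually started to accept serialized LoggingEvent objects from the
> network, along these lines:
>
>     import org.apache.log4j.net.SimpleSocketServer;
>
>     public class Log4jSocketServerExposure {
>         public static void main(String[] args) {
>             // Listens on TCP 4712 and deserializes whatever LoggingEvent
>             // objects arrive -- the insecure deserialization the scan flags.
>             // "log4j-server.properties" is a placeholder config file name.
>             SimpleSocketServer.main(
>                     new String[] {"4712", "log4j-server.properties"});
>         }
>     }
>
> We have not observed Flink starting such a server; the finding is based on the
> jar being bundled.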
>
>
>
> 2. Package: guava-14.0.1
>
> Severity: HIGH
>
> Fix version: 25.0, 24.1.1
>
>
>
> Description:
>
> Google Guava contains a flaw in the 
> CompoundOrdering_CustomFieldSerializer::deserialize() function in 
> com/google/common/collect/CompoundOrdering_CustomFieldSerializer.java that is 
> triggered when deserializing Java objects. With specially crafted serialized 
> data, a context-dependent attacker can exhaust available memory, resulting in
> a denial of service.
>
>
>
> References:
>
> https://github.com/google/guava/wiki/CVE-2018-10237
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2018-10237
>
>
>
> 3. Package: guava-14.0.1, 18.0
>
> Severity: HIGH
>
> Fix version: 25.0, 24.1.1
>
>
>
> Description:
>
> Google Guava contains a flaw in the AtomicDoubleArray::readObject() function 
> in com/google/common/util/concurrent/AtomicDoubleArray.java that is triggered 
> when deserializing Java objects. With specially crafted serialized data, a 
> context-dependent attacker can cause a process linked against the library to
> exhaust available memory.
>
>
>
> References:
>
> https://github.com/google/guava/wiki/CVE-2018-10237
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2018-10237
>
>
>
> 4. Package: guava-19.0
>
> Severity: HIGH
>
> Fix version: 25.0, 24.1.1, 23.6.1
>
>
>
> Description:
>
> Google Guava contains a flaw in the 
> CompoundOrdering_CustomFieldSerializer::deserialize() function in 
> com/google/common/collect/CompoundOrdering_CustomFieldSerializer.java that is 
> triggered when deserializing Java objects. With specially crafted serialized 
> data, a context-dependent attacker can exhaust available memory, resulting in
> a denial of service.
>
>
>
> References:
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2018-10237
>
> https://github.com/google/guava/wiki/CVE-2018-10237
>
>
>
> Path:
>
> /opt/flink/lib/flink-table-blink_2.11-1.10.1.jar:guava
>
> /opt/flink/lib/flink-table_2.11-1.10.1.jar:guava
>
> /opt/flink/lib/flink-dist_2.11-1.10.1.jar:guava
>
> /opt/flink/examples/streaming/Twitter.jar:guava
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:guava
>
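> These Guava findings (CVE-2018-10237) only come into play if the application
> deserializes attacker-controlled data with Java serialization in the first
> place. Where such deserialization cannot be avoided, a JDK serialization
> filter (Java 9+) can bound the memory an object graph may consume; the sketch
> below is a hypothetical illustration of that mitigation, not Flink code:
>
>     import java.io.ByteArrayInputStream;
>     import java.io.ObjectInputFilter;
>     import java.io.ObjectInputStream;
>
>     public class FilteredDeserialization {
>         public static Object readFiltered(byte[] untrusted) throws Exception {
>             try (ObjectInputStream in =
>                     new ObjectInputStream(new ByteArrayInputStream(untrusted))) {
>                 // Reject deep graphs and huge arrays, which is how
>                 // CVE-2018-10237 exhausts memory; a real deployment would
>                 // also whitelist the expected classes.
>                 in.setObjectInputFilter(ObjectInputFilter.Config.createFilter(
>                         "maxdepth=20;maxarray=100000;maxbytes=1048576"));
>                 return in.readObject();
>             }
>         }
>     }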
>
>
> 5. Package: commons_io-2.4
>
> Severity: HIGH
>
> Fix version: 2.5
>
>
>
> Description:
>
> Apache Commons IO contains a flaw that is due to the program failing to 
> restrict which class can be serialized. This may allow a remote attacker to 
> execute arbitrary Java code via deserialization methods.
>
>
>
> Path:
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:commons-io
>
> /opt/flink/lib/flink-dist_2.11-1.10.1.jar:commons-io
>
>
>
> References:
>
> https://issues.apache.org/jira/browse/IO-487
>
> http://commons.apache.org/proper/commons-io/
>
>
>
> 6. Package: commons_beanutils-1.9.3
>
> Severity: HIGH
>
> Fix version: 1.9.3
>
>
>
> Description:
>
> Commons BeanUtils contains a flaw that is due to it failing to restrict the 
> setting of Class Loader attributes via the class attribute of an ActionForm 
> object. This may allow a remote attacker to manipulate the loaded Classloader 
> and execute arbitrary code.
>
>
>
> Path:
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:commons-beanutils
>
>
>
> References:
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2014-0114
>
> https://seclists.org/bugtraq/2014/Apr/177
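>
> This finding (CVE-2014-0114) applies only where BeanUtils is used to populate
> beans from untrusted input. For code that does so, releases since 1.9.2 ship a
> bean introspector that suppresses the "class" property; a hypothetical usage
> sketch (not Flink or Hadoop code):
>
>     import org.apache.commons.beanutils.BeanUtilsBean;
>     import org.apache.commons.beanutils.SuppressPropertiesBeanIntrospector;
>
>     public class SafeBeanUtils {
>         // A BeanUtilsBean that refuses to resolve the "class" property,
>         // so untrusted property names cannot reach the ClassLoader.
>         public static BeanUtilsBean newInstance() {
>             BeanUtilsBean bub = new BeanUtilsBean();
>             bub.getPropertyUtils().addBeanIntrospector(
>                     SuppressPropertiesBeanIntrospector.SUPPRESS_CLASS);
>             return bub;
>         }
>     }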
>
>
>
> 7. Package: hadoop-2.6.5
>
> Severity: HIGH
>
> Fix version: 3.1.1, 3.0.3, 2.9.2, 2.8.5
>
>
>
> Description:
>
> Apache Hadoop contains a flaw that allows traversing outside of a restricted 
> path. The issue is due to the FileUtil::unpackEntries() function in
> fs/FileUtil.java not properly sanitizing user input, specifically path 
> traversal style attacks (e.g. '../') supplied via filenames. With a specially 
> crafted ZIP archive, a context-dependent attacker can write to arbitrary 
> files.
>
>
>
> References:
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2018-8009
>
> https://github.com/snyk/zip-slip-vulnerability
>
>
>
> 8. Package: hadoop-2.6.5
>
> Severity: HIGH
>
> Fix version: 3.1.1, 3.0.3, 2.9.2, 2.8.5
>
>
>
> Description:
>
> Apache Hadoop contains a flaw that allows traversing outside of a restricted 
> path. The issue is due to the FileUtil::unZip() function in fs/FileUtil.java
> not properly sanitizing user input, specifically path traversal style attacks 
> (e.g. '../') supplied via filenames. With a specially crafted ZIP archive, a 
> context-dependent attacker can write to arbitrary files.
>
>
>
> References:
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2018-8009
>
> https://github.com/snyk/zip-slip-vulnerability
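>
> Findings 7 and 8 are instances of the "zip slip" pattern. The usual defensive
> check, roughly what the fixed Hadoop releases add, is to resolve each archive
> entry against the extraction directory and refuse anything that escapes it;
> a hypothetical sketch:
>
>     import java.io.File;
>     import java.io.IOException;
>
>     public class ZipSlipGuard {
>         // Resolve an archive entry name against the extraction directory and
>         // reject names (e.g. "../../etc/passwd") that escape it.
>         public static File resolve(File targetDir, String entryName)
>                 throws IOException {
>             File dest = new File(targetDir, entryName).getCanonicalFile();
>             String root = targetDir.getCanonicalPath() + File.separator;
>             if (!dest.getPath().startsWith(root)) {
>                 throw new IOException("Archive entry escapes target dir: " + entryName);
>             }
>             return dest;
>         }
>     }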
>
>
>
> 9. Package: hadoop-2.6.5
>
> Severity: HIGH
>
> Fix version: 3.1.1, 2.9.2, 2.8.5
>
>
>
> Description:
>
> Apache Hadoop contains an unspecified flaw which may allow a local attacker 
> with yarn privileges to gain elevated root privileges and execute arbitrary 
> commands. No further details have been provided.
>
>
>
> References:
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2018-8029
>
> https://seclists.org/oss-sec/2019/q2/132
>
>
>
> 10. Package: hadoop-2.6.5
>
> Severity: HIGH
>
> Fix version: 2.7.7
>
>
>
> Description:
>
> Apache Hadoop contains an unspecified flaw that may allow a local attacker 
> who is able to escalate their privileges to yarn user to gain root 
> privileges. No further details have been provided.
>
>
>
> References:
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2018-11766
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2016-6811
>
>
>
> 11. Package: hadoop-2.6.5
>
> Severity: HIGH
>
> Fix version: 2.7.0
>
>
>
> Description:
>
> Apache Hadoop contains a flaw in a servlet on the DataNode related to the 
> HDFS namespace browsing functionality. The issue is triggered as a query 
> parameter is not properly validated when handling NameNode information. This 
> may allow a remote attacker to have an unspecified impact.
>
>
>
> References:
>
> https://cve.mitre.org/cgi-bin/cvename.cgi?name=2017-3162
>
> https://seclists.org/oss-sec/2017/q2/126
>
>
>
> 12. Package: hadoop-2.6.5
>
> Severity: HIGH
>
>
>
> Description:
>
> In Apache Hadoop 3.1.0 to 3.1.1, 3.0.0-alpha1 to 3.0.3, 2.9.0 to 2.9.1, and 
> 2.0.0-alpha to 2.8.4, the user/group information can be corrupted across 
> storing in fsimage and reading back from fsimage.
>
>
>
> References:
>
> https://nvd.nist.gov/vuln/detail/CVE-2018-11768
>
>
>
> Paths:
>
> Hadoop paths
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:hadoop-yarn-common
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:hadoop-yarn-client
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:hadoop-yarn-api
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:hadoop-mapreduce-client-core
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:hadoop-hdfs
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:hadoop-common
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:hadoop-auth
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:hadoop-annotations
>
>
>
> 13. Package: protobuf-2.5.0, 2.6.1
>
> Severity: HIGH
>
>
>
> Description:
>
> protobuf allows remote authenticated attackers to cause a heap-based buffer 
> overflow.
>
>
>
> Path:
>
> /opt/flink/lib/flink-dist_2.11-1.10.1.jar:protobuf-java
>
> /opt/flink/lib/flink-shaded-hadoop-2-uber-2.6.5-7.0.jar:protobuf-java
>
>
>
> References:
>
> https://nvd.nist.gov/vuln/detail/CVE-2015-5237
>
>
>
> Please let us know your comments on these issues and your plans to fix them.
>
>
>
> Regards,
>
> Suchithra
>
>
