Please Cc me in replies.

On Sun, Nov 12, 2023 at 12:10:21PM -0300, Santiago Ruano Rincón wrote:
> Following the email sent by Ilu to debian-project (Message-ID:
> <4b93ed08-f148-4c7f-b172-f967f7de7...@gmx.net>), and as we have
> discussed during the MiniDebConf UY 2023 with other Debian Members, I
> would like to call for a vote about issuing a Debian public statement
> regarding the EU Cyber Resilience Act (CRA) and the Product Liability
> Directive (PLD). The CRA is in the final stage in the legislative
> process in the EU Parliament, and we think it will impact negatively
> the Debian Project, users, developers, companies that rely on Debian,
> and the FLOSS community as a whole. Even if the CRA will be probably
> adopted before the time the vote ends (if it takes place), we think it
> is important to take a public stand about it.
In the process of reading background material, I came to understand why
you see this matter as important. The proposed resolution has aspects
that I find questionable, though.

> b. Knowing whether software is commercial or not isn't feasible,
> neither in Debian nor in most free software projects - we don't track
> people's employment status or history, nor do we check who finances
> upstream projects.

As far as I understand it, it never is a question whether a particular
piece of software is commercial or not. It can be both at the same
time. What is a question is how someone interacts with said software.
If a contribution is compensated, then that activity fairly obviously
is commercial, and the regulation is rather explicit about such
activity coming with responsibility for the aspect that has been
changed. A redistribution may also be a commercial activity. This can
be read from e.g.

    (10) ... a commercial activity might be characterized not only by
    charging a price for a product, but also by charging a price for
    technical support services, ...

So much of the time, the product made available in a commercial
capacity is not a complete piece of software, but a change made to the
software. It is very unclear how the regulation can be applied to
patches. A possible interpretation is that when sending a patch, the
relevant entity assumes responsibility for the entire software, which
also is unrealistic. Does this interpretation make sense to you? If
not, why?

An interesting side aspect here is that SaaS is explicitly exempted
from the regulation.

    (9) ... It does not regulate services, such as
    Software-as-a-Service (SaaS), ... Directive ... (NIS2) applies to
    cloud computing services and cloud service models, such as SaaS.
    ...

Therefore a possible effect of the CRA is pushing software out of the
market by never making it available and only providing services using
the software, to avoid being covered.

> c. If upstream projects stop developing for fear of being in the
> scope of CRA and its financial consequences, system security will
> actually get worse instead of better.

Given the above, I do not think that focusing on upstream projects is a
good idea. How about changing that to:

    c. Paid developers and companies may stop contributing to upstream
    projects for fear of being in the scope of CRA and ...

> d. Having to get legal advice before giving a present to society
> will discourage many developers, especially those without a company or
> other organisation supporting them.

Given the above, this makes less sense to me. To me, there is a clear
intention of not covering non-commercial contributions. However, many
of us get paid for contributions, so telling apart which contribution
is a commercial activity and which is not is a difficult affair,
resulting in said discouragement.

> 2. Debian is well known for its security track record through practices
> of responsible disclosure and coordination with upstream developers and
> other Free Software projects. We aim to live up to the commitment made
> in the Social Contract: "We will not hide problems." (3)
> a. The Free Software community has developed a fine-tuned, well
> working system of responsible disclosure in case of security issues
> which will be overturned by the mandatory reporting to European
> authorities within 24 hours (Art. 11 CRA).

I think this misses an important detail. The relevant article requires
a vulnerability to be actively exploited. Therefore, most of the
vulnerabilities that we deal with are not covered.
On the flip side, this renders the obligation useless, as any
non-conforming vendor will simply claim that their vulnerability was
not actively exploited.

> c. Security issue tracking and remediation is intentionally
> decentralized and distributed. The reporting of security issues to
> ENISA and the intended propagation to other authorities and national
> administrations would collect all software vulnerabilities in one place,
> greatly increasing the risk of leaking information about vulnerabilities
> to threat actors, representing a threat for all the users around the
> world, including European citizens.

Given the "actively exploited" requirement, there isn't much to leak,
right?

> d. Activists use Debian (e.g. through derivatives such as Tails),
> among other reasons, to protect themselves from authoritarian
> governments; handing threat actors exploits they can use for oppression
> is against what Debian stands for.

When a vulnerability is actively exploited, chances are that
authoritarian governments are involved. This is a weak argument in my
view, and I suggest skipping it for brevity.

> 3. While proprietary software is developed behind closed doors, Free
> Software development is done in the open, transparent for everyone. To
> keep even with proprietary software the open development process needs
> to be entirely exempt from CRA requirements, just as the development of
> software in private is. A "making available on the market" can only be
> considered after development is finished and the software is released.

I think this is partially covered by Article 4:

    3. Member States shall not prevent the making available of
    unfinished software which does not comply with this Regulation
    provided that the software is only made available for a limited
    period required for testing purposes and that a visible sign
    clearly indicates that it does not comply with this Regulation and
    will not be available on the market for purposes other than
    testing.

So in principle, we could attach that warning sign to all software and
argue that 100 years still is limited, at which point there no longer
is any commercial activity about it. I suspect it doesn't work like
this, but we need to be more precise about why this item is
insufficient.

So while I see value in such a statement in general, the current
wording makes me believe that proponents of the CRA may easily dismiss
many of the arguments brought forward.

Helmut