Perhaps one's *intentions* are good, but when you're making decisions for 
someone else, it is often necessary to take a step back and look at the bigger 
picture of what you're doing. 

I do not consider myself to be immune to the effects of this thought process. I 
have kids. They remind me constantly that I'm just as subject to the "I know 
better" mindset as anyone else. 

But you said the best thing to do is "keep an open mind." What Luca has 
literally said multiple times is "I've already made my decision and I'm only 
looking for comments that support the technical aspects of that decision."

That squashes discussion. Only one person has made such comments, and that 
person was not me. 

On a more procedural note...

I have repeatedly voiced my opinion that automatically deleting files is a bad 
idea and I stand by that opinion for reasons already stated. 

However, recognizing that I don't always get my way, I've thought about how one 
might "do it anyway" while still addressing my underlying concerns and here's 
what I came up with:

The two biggest underlying issues are a) applications that don't clean 
up after themselves and b) a mismatch between how the system behaves and what 
users expect. 

To address a) in a coherent way, if I had infinite resources, I would create a 
package similar to popularity-contest that is obvious and optional and reports 
back what commonly appears in various scratch spaces. That is how I would 
gather a wide range of data on what packages don't behave well. D-i can allow 
the admin to opt in or not on the same screen as the one that asks about 
popularity-contest. (Those who opt in or out will likely do so for the same 
reasons, after all.)

As for b) the underlying problem is the change in *expected behavior* of the 
system. The real problem has nothing to do with whether or not it's technically 
a good idea. It's a shift in expectation with potentially disastrous 
consequences. Deleted files are often just gone. 

So to mitigate that, I would 1) only implement it on new installs (we'll come 
back to this), 2) mention it at least twice in the various d-i screens, and...

...3) I would put a file in any auto-cleaned space named "1-AUTOCLEAN.txt" that 
contains some verbiage explaining that things in this directory will be wiped 
based on rules set in (wherever). 

This is how files like /etc/resolv.conf read when they are controlled by other 
processes. They just have text that tells you, "Don't change this directly. 
Your changes will be overwritten. Make your changes in (canonical place)."
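To make the idea concrete, here is a rough sketch of what such a marker file 
might say. The filename and wording are my own suggestion, nothing that exists 
today; the man page reference assumes the systemd-tmpfiles mechanism is what 
ends up doing the cleaning:

```
This directory is cleaned automatically.

Files here that have not been used recently may be deleted without
warning, according to the rules in /usr/lib/tmpfiles.d/ and any
overrides in /etc/tmpfiles.d/. Do not store anything here that you
cannot afford to lose.

To change or disable this behavior, see tmpfiles.d(5).
```

The point, as with resolv.conf, is that anyone who looks in the directory 
finds the warning before they find out the hard way.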

Last but not least, I would go ahead and deploy the packages that automatically 
clean tmp spaces even to existing systems, but their default configuration 
would be disabled. The only thing that would enable them would be a) 
debian-installer (optionally, possibly as default), or b) admins who have heard 
about this and decide it's a good idea. 
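As a sketch of how the "shipped but disabled" default could work, assuming 
the systemd-tmpfiles mechanism: the package could install an override in 
/etc/tmpfiles.d/tmp.conf that sets the age field to "-" (never clean), which 
takes precedence over the vendor copy in /usr/lib/tmpfiles.d/tmp.conf because 
it shares the same filename. Enabling cleanup would then just mean removing 
(or editing) that one file, whether by d-i or by the admin:

```
# /etc/tmpfiles.d/tmp.conf -- shipped by default to disable auto-cleanup.
# A file in /etc/tmpfiles.d/ with the same name overrides the vendor
# copy in /usr/lib/tmpfiles.d/tmp.conf. An age of "-" means "never clean".
q /tmp 1777 root root -
q /var/tmp 1777 root root -
```

Delete this file (or change "-" back to an age like 10d or 30d) and the 
upstream cleanup behavior takes effect on the next boot.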

Some will opt in, some won't. But the packages and their default configs could 
be pushed out safely. No one (even me) would have a reasonable complaint about 
such an arrangement. 

That would allow the expectation to shift over time while significantly 
reducing the number of surprised users who get their data deleted. 

In the end, this is all for people to use. So to me, it's much more an issue of 
making sure the people know what they're using and how to use it than anything 
else. If it were just a technical issue, this would be a much shorter 
conversation. ;)

--J

Sent from my mobile device.

________________________________
From: "Barak A. Pearlmutter" <ba...@cs.nuim.ie>
Sent: Tuesday, May 7, 2024 07:18
To: r...@neoquasar.org
Cc: Luca Boccassi; debian-devel@lists.debian.org
Subject: Re: Re: Make /tmp/ a tmpfs and cleanup /var/tmp/ on a timer by default 
[was: Re: systemd: tmpfiles.d not cleaning /var/tmp by default]

Rhys, I think you're being unfair. We have a *technical* disagreement 
here. But our hearts are all in the same place: Luca, myself, and all 
the other DDs discussing this, all want what's best for our users, we 
all want to build the best OS possible, and are all discussing the 
issue in good faith. 

There is an unavoidable tension, and we're hashing it out. Upstream 
has fielded a default behaviour which requires adjustment of a variety 
of other programs and workflows. Basically, anything that stores stuff 
in /tmp or /var/tmp needs to be made might-be-deleted-aware. There are 
mechanisms for dealing with this, but they're pretty complicated, and 
differ wildly for different file lifetimes etc. Other distributions 
have adopted that default, and rather than using exposed mechanisms 
for avoiding unexpected deletion, are just telling people not to count 
on files in /var/tmp/ surviving a reboot if the computer is shut down 
for more than a month, or whatever. What should Debian do? You can make 
arguments both ways, and we are. Generally we follow upstream unless 
there's a compelling reason not to. You can suggest various strategies 
for making things reliable despite following upstream. You can discuss 
why maybe upstream should not be followed in this case. This is 
precisely the kind of discussion that leads to good decisions, with 
everyone keeping an open mind and sharing information and ideas. 
