> Is seconds granular enough? The only reason one would ever want fractions of a second would be some sort of unit testing with very low delays.
In any normal environment the maximum is going to be tens, if not hundreds or thousands, of seconds. Note also that the delay and interval (i.e. not just the max interval) are currently exported in seconds, so having more granularity for max_seconds alone would be pointless. I have been considering whether I could make proc_dointvec_jiffies accept floating-point input (and produce floating-point output), although that seems harder and is probably out of scope for this change.