The branch main has been updated by gonzo:

URL: 
https://cgit.FreeBSD.org/src/commit/?id=37cd6c20dbcf251e38d6dfb9d3e02022941f6fc7

commit 37cd6c20dbcf251e38d6dfb9d3e02022941f6fc7
Author:     Oleksandr Tymoshenko <go...@freebsd.org>
AuthorDate: 2021-03-04 07:23:31 +0000
Commit:     Oleksandr Tymoshenko <go...@freebsd.org>
CommitDate: 2021-03-04 07:23:31 +0000

    cron: consume blanks in system crontabs before options
    
    On system crontabs, multiple blanks are not being consumed after reading the
    username. This change adds blank consumption before parsing any -[qn]
    options.
    Without this change, an entry like:
    
      * * * * * username  -n true  # Two spaces between username and option.
    
    will fail, as the shell will try to execute (' -n true'), while an entry
    like:
    
      * * * * * username -n true   # One space between username and option.
    
    works as expected (executes 'true').
    
    For user crontabs, this is not an issue as the preceding (day of week
    or @shortcut) processing consumes any leading whitespace.
    
    PR:             253699
    Submitted by:   Eric A. Borisch <ebori...@gmail.com>
    MFC after:      1 week
---
 usr.sbin/cron/lib/entry.c | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/usr.sbin/cron/lib/entry.c b/usr.sbin/cron/lib/entry.c
index 66ead885bea8..2693c9c8d07a 100644
--- a/usr.sbin/cron/lib/entry.c
+++ b/usr.sbin/cron/lib/entry.c
@@ -315,6 +315,9 @@ load_entry(file, error_func, pw, envp)
                        goto eof;
                }
 
+               /* need to have consumed blanks when checking options below */
+               Skip_Blanks(ch, file)
+               unget_char(ch, file);
 #ifdef LOGIN_CAP
                if ((s = strrchr(username, '/')) != NULL) {
                        *s = '\0';
_______________________________________________
dev-commits-src-main@freebsd.org mailing list
https://lists.freebsd.org/mailman/listinfo/dev-commits-src-main