--- Begin Message ---
Package: release.debian.org
Control: affects -1 + src:curl
X-Debbugs-Cc: c...@packages.debian.org
User: release.debian....@packages.debian.org
Usertags: pu
Tags: bookworm
Severity: normal
[ Reason ]
I would like to backport wcurl to stable. wcurl is a script which we ship as
part of the curl package in unstable and testing.
wcurl is a command-line tool which lets you download URLs without having to
remember any parameters.
https://samueloph.dev/blog/announcing-wcurl-a-curl-wrapper-to-download-files/
https://curl.se/wcurl/
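As a quick illustration (example.com is just a placeholder, taken from the
examples in the attached manpage), downloading a file is as simple as:

  wcurl example.com/filename.txt

With the curl shipped in bookworm (7.88.1), that call roughly expands to the
following invocation (verifiable with --dry-run; the exact flags depend on the
installed curl version):

  curl --fail --globoff --location --proto-default https --remote-time \
    --retry 10 --retry-max-time 10 --no-clobber \
    --output filename.txt example.com/filename.txt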
[ Impact ]
Without this update, users need to install wcurl from bookworm-backports or
wait until the next stable release.
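For reference, pulling it from backports would look something like this
(assuming bookworm-backports is enabled; the backported curl package is
assumed to ship wcurl, as in unstable/testing):

  apt install -t bookworm-backports curl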
[ Tests ]
wcurl has unit tests and is already shipped and tested on other operating systems.
[ Risks ]
wcurl is a POSIX-compliant shell script with 318 lines of code, including
comments.
There is a risk of a file conflict if the user has already installed something
at /usr/bin/wcurl, but in that case it is very likely they installed wcurl
itself manually.
[ Checklist ]
[x] *all* changes are documented in the d/changelog
[x] I reviewed all changes and I approve them
[x] attach debdiff against the package in (old)stable
[x] the issue is verified as fixed in unstable
[ Changes ]
Install wcurl and its manpage (wcurl.1) in the curl package.
[ Other info ]
This is not a bugfix, so I will understand if the release team rejects it. I
wanted to at least file the request, as I believe it is a good change for
stable users.
--
Samuel Henrique <samueloph>
diff -Nru curl-7.88.1/debian/changelog curl-7.88.1/debian/changelog
--- curl-7.88.1/debian/changelog 2024-09-17 21:29:24.000000000 +0200
+++ curl-7.88.1/debian/changelog 2024-12-29 16:49:11.000000000 +0100
@@ -1,3 +1,9 @@
+curl (7.88.1-10+deb12u9) bookworm; urgency=medium
+
+ * Install wcurl in the curl package
+
+ -- Samuel Henrique <samuel...@debian.org> Sun, 29 Dec 2024 16:49:11 +0100
+
curl (7.88.1-10+deb12u8) bookworm; urgency=medium
* Team upload.
diff -Nru curl-7.88.1/debian/curl.install curl-7.88.1/debian/curl.install
--- curl-7.88.1/debian/curl.install 2024-09-17 21:29:24.000000000 +0200
+++ curl-7.88.1/debian/curl.install 2024-12-29 16:49:11.000000000 +0100
@@ -1,3 +1,4 @@
#!/usr/bin/dh-exec
usr/bin/curl
<!cross> usr/share/zsh/*
+../wcurl/wcurl usr/bin
diff -Nru curl-7.88.1/debian/curl.manpages curl-7.88.1/debian/curl.manpages
--- curl-7.88.1/debian/curl.manpages 2024-09-17 21:29:24.000000000 +0200
+++ curl-7.88.1/debian/curl.manpages 2024-12-29 16:49:11.000000000 +0100
@@ -1 +1,2 @@
docs/curl.1
+debian/wcurl/wcurl.1
diff -Nru curl-7.88.1/debian/wcurl/wcurl curl-7.88.1/debian/wcurl/wcurl
--- curl-7.88.1/debian/wcurl/wcurl 1970-01-01 01:00:00.000000000 +0100
+++ curl-7.88.1/debian/wcurl/wcurl 2024-12-29 16:49:11.000000000 +0100
@@ -0,0 +1,318 @@
+#!/bin/sh
+
+# wcurl - a simple wrapper around curl to easily download files.
+#
+# Requires curl >= 7.46.0 (2015)
+#
+# Copyright (C) Samuel Henrique <samuel...@debian.org>, Sergio Durigan
+# Junior <sergi...@debian.org> and many contributors, see the AUTHORS
+# file.
+#
+# Permission to use, copy, modify, and distribute this software for any purpose
+# with or without fee is hereby granted, provided that the above copyright
+# notice and this permission notice appear in all copies.
+#
+# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT OF THIRD PARTY RIGHTS. IN
+# NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
+# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
+# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE
+# OR OTHER DEALINGS IN THE SOFTWARE.
+#
+# Except as contained in this notice, the name of a copyright holder shall not be
+# used in advertising or otherwise to promote the sale, use or other dealings in
+# this Software without prior written authorization of the copyright holder.
+#
+# SPDX-License-Identifier: curl
+
+# Stop on errors and on usage of unset variables.
+set -eu
+
+VERSION="2024.12.08"
+
+PROGRAM_NAME="$(basename "$0")"
+readonly PROGRAM_NAME
+
+# Display the version.
+print_version()
+{
+ cat << _EOF_
+${VERSION}
+_EOF_
+}
+
+# Display the program usage.
+usage()
+{
+ cat << _EOF_
+${PROGRAM_NAME} -- a simple wrapper around curl to easily download files.
+
+Usage: ${PROGRAM_NAME} <URL>...
+ ${PROGRAM_NAME} [--curl-options <CURL_OPTIONS>]... [--no-decode-filename] [-o|-O|--output <PATH>] [--dry-run] [--] <URL>...
+ ${PROGRAM_NAME} [--curl-options=<CURL_OPTIONS>]... [--no-decode-filename] [--output=<PATH>] [--dry-run] [--] <URL>...
+ ${PROGRAM_NAME} -h|--help
+ ${PROGRAM_NAME} -V|--version
+
+Options:
+
+ --curl-options <CURL_OPTIONS>: Specify extra options to be passed when invoking curl. May be
+ specified more than once.
+
+ -o, -O, --output <PATH>: Use the provided output path instead of getting it from the URL. If
+ multiple URLs are provided, all files will have the same name with a
+ number appended to the end (curl >= 7.83.0). If this option is provided
+ multiple times, only the last value is considered.
+
+ --no-decode-filename: Don't percent-decode the output filename, even if the percent-encoding in
+ the URL was done by wcurl, e.g.: The URL contained whitespaces.
+
+ --dry-run: Don't actually execute curl, just print what would be invoked.
+
+ -V, --version: Print version information.
+
+ -h, --help: Print this usage message.
+
+ <URL>: The URL to be downloaded. May be specified more than once.
+
+ <CURL_OPTIONS>: Any option supported by curl can be set here. This is not used by wcurl; it's
+ instead forwarded to the curl invocation.
+
+ <URL>: The URL to be downloaded. May be specified more than once.
+_EOF_
+}
+
+# Display an error message and bail out.
+error()
+{
+ printf "%s\n" "$*" > /dev/stderr
+ exit 1
+}
+
+# Extra curl options provided by the user.
+# This will be set per-URL for every URL provided.
+# Some options are global, but we are erring on the side of needlessly setting
+# them multiple times instead of causing issues with parameters that need to
+# be set per-URL.
+CURL_OPTIONS=""
+
+# The URLs to be downloaded.
+URLS=""
+
+# Will be set to the percent-decoded filename parsed from the URL, unless
+# --output or --no-decode-filename are used.
+OUTPUT_PATH=""
+HAS_USER_SET_OUTPUT="false"
+
+# The parameters that will be passed per-URL to curl.
+readonly PER_URL_PARAMETERS="\
+ --fail \
+ --globoff \
+ --location \
+ --proto-default https \
+ --remote-time \
+ --retry 10 \
+ --retry-max-time 10 "
+
+# Whether to invoke curl or not.
+DRY_RUN="false"
+
+# Sanitize parameters.
+sanitize()
+{
+ if [ -z "${URLS}" ]; then
+ error "You must provide at least one URL to download."
+ fi
+
+ readonly CURL_OPTIONS URLS DRY_RUN HAS_USER_SET_OUTPUT
+}
+
+# Indicate via exit code whether the string given in the first parameter
+# consists solely of characters from the string given in the second parameter.
+# In other words, it returns 0 if the first parameter only contains characters
+# from the second parameter, e.g.: Are $1 characters a subset of $2 characters?
+is_subset_of()
+{
+ case "${1}" in
+ *[!${2}]*|'') return 1;;
+ esac
+}
+
+# Print the given string percent-decoded.
+percent_decode()
+{
+ # Encodings of control characters (00-1F) are passed through without decoding.
+ # Iterate on the input character-by-character, decoding it.
+ printf "%s\n" "${1}" | fold -w1 | while IFS= read -r decode_out; do
+ # If character is a "%", read the next character as decode_hex1.
+ if [ "${decode_out}" = % ] && IFS= read -r decode_hex1; then
+ decode_out="${decode_out}${decode_hex1}"
+ # If there's one more character, read it as decode_hex2.
+ if IFS= read -r decode_hex2; then
+ decode_out="${decode_out}${decode_hex2}"
+ # Skip decoding if this is a control character (00-1F).
+ # Skip decoding if DECODE_FILENAME is not "true".
+ if is_subset_of "${decode_hex1}" "23456789abcdefABCDEF" && \
+ is_subset_of "${decode_hex2}" "0123456789abcdefABCDEF" && \
+ [ "${DECODE_FILENAME}" = "true" ]; then
+ # Use printf to decode it into octal and then decode it to the final format.
+ decode_out="$(printf "%b" "\\$(printf %o "0x${decode_hex1}${decode_hex2}")")"
+ fi
+ fi
+ fi
+ printf %s "${decode_out}"
+ done
+}
+
+# Print the percent-decoded filename portion of the given URL.
+get_url_filename()
+{
+ # Remove protocol and query string if present.
+ hostname_and_path="$(printf %s "${1}" | sed -e 's,^[^/]*//,,' -e 's,?.*$,,')"
+ # If what remains contains a slash, there's a path; return it percent-decoded.
+ case "${hostname_and_path}" in
+ # sed to remove everything preceding the last '/', e.g.: "example/something" becomes "something"
+ */*) percent_decode "$(printf %s "${hostname_and_path}" | sed -e 's,^.*/,,')";;
+ esac
+ # No slash means there was just a hostname and no path; return empty string.
+}
+
+# Execute curl with the list of URLs provided by the user.
+exec_curl()
+{
+ CMD="curl "
+
+ # Store version to check if it supports --no-clobber and --parallel.
+ curl_version=$($CMD --version | cut -f2 -d' ' | head -n1)
+ curl_version_major=$(echo "$curl_version" | cut -f1 -d.)
+ curl_version_minor=$(echo "$curl_version" | cut -f2 -d.)
+
+ CURL_HAS_NO_CLOBBER=""
+ CURL_HAS_PARALLEL=""
+ # --no-clobber is only supported since 7.83.0.
+ # --parallel is only supported since 7.66.0.
+ if [ "${curl_version_major}" -ge 8 ]; then
+ CURL_HAS_NO_CLOBBER="--no-clobber"
+ CURL_HAS_PARALLEL="--parallel"
+ elif [ "${curl_version_major}" -eq 7 ];then
+ if [ "${curl_version_minor}" -ge 83 ]; then
+ CURL_HAS_NO_CLOBBER="--no-clobber"
+ fi
+ if [ "${curl_version_minor}" -ge 66 ]; then
+ CURL_HAS_PARALLEL="--parallel"
+ fi
+ fi
+
+ # Detecting whether we need --parallel. It's easier to rely on
+ # the shell's argument parsing.
+ # shellcheck disable=SC2086
+ set -- $URLS
+
+ if [ "$#" -gt 1 ]; then
+ CURL_PARALLEL="$CURL_HAS_PARALLEL"
+ else
+ CURL_PARALLEL=""
+ fi
+
+ # Start assembling the command.
+ #
+ # We use 'set --' here (again) because (a) we don't have arrays on
+ # POSIX shell, and (b) we need better control over the way we
+ # split arguments.
+ #
+ # shellcheck disable=SC2086
+ set -- ${CMD} ${CURL_PARALLEL}
+
+ NEXT_PARAMETER=""
+ for url in ${URLS}; do
+ # If the user did not provide an output path, define one.
+ if [ "${HAS_USER_SET_OUTPUT}" = "false" ]; then
+ OUTPUT_PATH="$(get_url_filename "${url}")"
+ # If we could not get a path from the URL, use the default: index.html.
+ [ -z "${OUTPUT_PATH}" ] && OUTPUT_PATH=index.html
+ fi
+ # shellcheck disable=SC2086
+ set -- "$@" ${NEXT_PARAMETER} ${PER_URL_PARAMETERS}
${CURL_HAS_NO_CLOBBER} ${CURL_OPTIONS} --output "${OUTPUT_PATH}" "${url}"
+ NEXT_PARAMETER="--next"
+ done
+
+ if [ "${DRY_RUN}" = "false" ]; then
+ exec "$@"
+ else
+ printf "%s\n" "$@"
+ fi
+}
+
+# Default to decoding the output filename
+DECODE_FILENAME="true"
+
+# Use "${1-}" in order to avoid errors because of 'set -u'.
+while [ -n "${1-}" ]; do
+ case "${1}" in
+ --curl-options=*)
+ opt=$(printf "%s\n" "${1}" | sed 's/^--curl-options=//')
+ CURL_OPTIONS="${CURL_OPTIONS} ${opt}"
+ ;;
+
+ --curl-options)
+ shift
+ CURL_OPTIONS="${CURL_OPTIONS} ${1}"
+ ;;
+
+ --dry-run)
+ DRY_RUN="true"
+ ;;
+
+ --output=*)
+ opt=$(printf "%s\n" "${1}" | sed 's/^--output=//')
+ HAS_USER_SET_OUTPUT="true"
+ OUTPUT_PATH="${opt}"
+ ;;
+
+ -o|-O|--output)
+ shift
+ HAS_USER_SET_OUTPUT="true"
+ OUTPUT_PATH="${1}"
+ ;;
+
+ --no-decode-filename)
+ DECODE_FILENAME="false"
+ ;;
+
+ -h|--help)
+ usage
+ exit 0
+ ;;
+
+ -V|--version)
+ print_version
+ exit 0
+ ;;
+
+ --)
+ # This is the start of the list of URLs.
+ shift
+ for url in "$@"; do
+ # Encode whitespaces into %20, since wget supports those URLs.
+ newurl=$(printf "%s\n" "${url}" | sed 's/ /%20/g')
+ URLS="${URLS} ${newurl}"
+ done
+ break
+ ;;
+
+ -*)
+ error "Unknown option: '$1'."
+ ;;
+
+ *)
+ # This must be a URL.
+ # Encode whitespaces into %20, since wget supports those URLs.
+ newurl=$(printf "%s\n" "${1}" | sed 's/ /%20/g')
+ URLS="${URLS} ${newurl}"
+ ;;
+ esac
+ shift
+done
+
+sanitize
+exec_curl
diff -Nru curl-7.88.1/debian/wcurl/wcurl.1 curl-7.88.1/debian/wcurl/wcurl.1
--- curl-7.88.1/debian/wcurl/wcurl.1 1970-01-01 01:00:00.000000000 +0100
+++ curl-7.88.1/debian/wcurl/wcurl.1 2024-12-29 16:49:11.000000000 +0100
@@ -0,0 +1,127 @@
+.\" **************************************************************************
+.\" * _ _ ____ _
+.\" * Project ___| | | | _ \| |
+.\" * / __| | | | |_) | |
+.\" * | (__| |_| | _ <| |___
+.\" * \___|\___/|_| \_\_____|
+.\" *
+.\" * Copyright (C) Samuel Henrique <samuel...@debian.org>, et al.
+.\" *
+.\" * This software is licensed as described in the file COPYING, which
+.\" * you should have received as part of this distribution. The terms
+.\" * are also available at https://curl.se/docs/copyright.html.
+.\" *
+.\" * You may opt to use, copy, modify, merge, publish, distribute and/or sell
+.\" * copies of the Software, and permit persons to whom the Software is
+.\" * furnished to do so, under the terms of the COPYING file.
+.\" *
+.\" * This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
+.\" * KIND, either express or implied.
+.\" *
+.\" * SPDX-License-Identifier: curl
+.\" *
+.\" **************************************************************************
+.TH wcurl "1" "2024.12.08" "wcurl" "User Commands"
+.SH NAME
+.B wcurl
+- a simple wrapper around curl to easily download files.
+.SH SYNOPSIS
+.nf
+\fBwcurl \fI<URL>\fP...\fR
+\fBwcurl [\-\-curl\-options \fI<CURL_OPTIONS>\fP]... [\-\-dry\-run] [\-\-no\-decode\-filename] [\-o|\-O|\-\-output <PATH>] [\-\-] \fI<URL>\fP...\fR
+\fBwcurl [\-\-curl\-options=\fI<CURL_OPTIONS>\fP]... [\-\-dry\-run] [\-\-no\-decode\-filename] [\-\-output=<PATH>] [\-\-] \fI<URL>\fP...\fR
+\fBwcurl \-V|\-\-version\fR
+\fBwcurl \-h|\-\-help\fR
+.fi
+.SH DESCRIPTION
+\fBwcurl\fR is a simple curl wrapper which lets you use curl to download files
+without having to remember any parameters.
+.PP
+Simply call \fBwcurl\fR with a list of URLs you want to download and \fBwcurl\fR will pick
+sane defaults.
+.PP
+If you need anything more complex, you can provide any of curl's supported
+parameters via the \fB\-\-curl\-options\fR option. Just beware that you likely
+should be using curl directly if your use case is not covered.
+.PP
+.TP
+By default, \fBwcurl\fR will:
+.br
+\[bu] Percent-encode whitespaces in URLs;
+.br
+\[bu] Download multiple URLs in parallel if the installed curl's version is >= 7.66.0;
+.br
+\[bu] Follow redirects;
+.br
+\[bu] Automatically choose a filename as output;
+.br
+\[bu] Avoid overwriting files if the installed curl's version is >= 7.83.0 (--no-clobber);
+.br
+\[bu] Perform retries;
+.br
+\[bu] Set the downloaded file timestamp to the value provided by the server, if available;
+.br
+\[bu] Default to the protocol used as https if the URL doesn't contain any;
+.br
+\[bu] Disable \fBcurl\fR's URL globbing parser so \fB{}\fR and \fB[]\fR characters in URLs are not treated specially.
+.br
+\[bu] Percent-decode the resulting filename.
+.br
+\[bu] Use "index.html" as default filename if there's none in the URL.
+.SH OPTIONS
+.TP
+\fB\-\-curl\-options, \-\-curl\-options=\fI<CURL_OPTIONS>\fR...\fR
+Specify extra options to be passed when invoking curl. May be specified more than once.
+.TP
+\fB\-o, \-O, \-\-output=\fI<PATH>\fR...\fR
+Use the provided output path instead of getting it from the URL. If multiple
+URLs are provided, all files will have the same name with a number appended to
+the end (curl >= 7.83.0). If this option is provided multiple times, only the
+last value is considered.
+.TP
+\fB\-\-dry\-run\fR
+Don't actually execute curl, just print what would be invoked.
+.TP
+\fB\-V, \-\-version\fR
+Print version information.
+.TP
+\fB\-h, \-\-help\fR
+Print help message.
+.SH CURL_OPTIONS
+Any option supported by curl can be set here.
+This is not used by \fBwcurl\fR; it's instead forwarded to the curl invocation.
+.SH URL
+Anything which is not a parameter will be considered an URL.
+\fBwcurl\fR will percent-encode whitespaces and pass that to curl, which will perform the
+parsing of the URL.
+.SH EXAMPLES
+Download a single file:
+.br
+\fBwcurl example.com/filename.txt\fR
+.PP
+Download two files in parallel:
+.br
+\fBwcurl example.com/filename1.txt example.com/filename2.txt\fR
+.PP
+Download a file passing the \fI\-\-progress\-bar\fR and \fI\-\-http2\fR flags to curl:
+.br
+\fBwcurl \-\-curl\-options="\-\-progress\-bar \-\-http2" example.com/filename.txt\fR
+.PP
+Resume from an interrupted download (if more options are used, this needs to be the last one in the list):
+.br
+\fBwcurl \-\-curl\-options="\-\-continue-at \-" example.com/filename.txt\fR
+.SH AUTHORS
+Samuel Henrique <samuel...@debian.org>
+.br
+Sergio Durigan Junior <sergi...@debian.org>
+.br
+and many contributors, see the AUTHORS file.
+.SH REPORTING BUGS
+If you experience any problems with \fBwcurl\fR that you do not experience with curl,
+submit an issue on Github:
+.br
+https://github.com/curl/wcurl
+.SH COPYRIGHT
+\fBwcurl\fR is licensed under the curl license
+.SH SEE ALSO
+.BR curl (1)
--- End Message ---