You can upload two text files and it will produce a unified diff (if
that's what you want). Or a side-by-side comparison. YMMV, but I doubt
you've tried.
Who knows what Mick Crane is trying to do, though. Create a web page? A
script? A script to create a web page?
I don't know if this was mentioned before.
I am using "pdfarranger" to combine PDF files. It is a nice,
easy-to-use graphical tool.
Maybe this is what you are looking for?
Best
Hans
On Thursday, October 02, 2025 12:27:47 PM Greg wrote:
> Right. I tried to use diff for word and syntactical changes, but the
> verbose output and the diff by line seemed to make the app impracticable
> for literary (in the broadest sense!) use.
I did mention wdiff -- you should try that on text --
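In case wdiff isn't to hand, a crude approximation of a word diff can be had from plain diff by splitting input into one word per line. A sketch with made-up files (wdiff itself gives much nicer [-old-] {+new+} output):

```shell
# Two throwaway drafts differing by one word:
printf 'the quick brown fox\n' > draft1.txt
printf 'the quick red fox\n'   > draft2.txt
# Split on whitespace so diff compares word-by-word rather than
# line-by-line.  diff exits 1 when files differ, hence the || true.
diff <(tr -s ' ' '\n' < draft1.txt) <(tr -s ' ' '\n' < draft2.txt) || true
```

This only flags which words changed, not where on the line; wdiff is the proper tool when installed.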
On Thu, Oct 02, 2025 at 07:02:55PM +0100, mick.crane wrote:
[...]
> I asked ChatGPT [...]
> So I wondered, is it line feed, encoding, the patch protocols, some issue
> with the html.
Or ChatGPT bullshitting its way through a syntactically wrong diff.
mick.crane wrote:
> On 2025-10-02 12:24, john doe wrote:
> > On 10/2/25 1:13 PM, mick.crane wrote:
> > > Please bear in mind I don't know what I'm doing.
> > > I only ever used diff once before.
> >
> >
> >
> > > I was trying to get ChatGPT to post a patch file rather than the
> > > whole thing
mick writes:
> So I wondered, is it line feed, encoding, the patch protocols, some
> issue with the html.
ChatGPT isn't going to run diff to generate a patch file. It is going to
produce a block of text that looks like one.
--
John Hasler
j...@sugarbit.com
Elmwood, WI USA
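Which suggests a sanity check before trusting any LLM-produced patch: let patch(1) test-apply it without touching anything. A small self-contained demo with invented filenames:

```shell
# A file and a known-good unified diff against it:
printf 'alpha\nbeta\n'  > config.txt
printf 'alpha\ngamma\n' > config.new
diff -u config.txt config.new > change.patch || true  # diff exits 1 on differences
# --dry-run reports whether the patch would apply cleanly,
# without modifying config.txt:
patch --dry-run config.txt < change.patch
grep -q beta config.txt && echo "target untouched"
```

A hand-typed or machine-hallucinated "diff" will usually fail this dry run with a malformed-patch or hunk-rejection message.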
On 2025-10-02, rhkra...@gmail.com wrote:
>
>* Somewhat OT, but for the sake of completeness I'd like to also
> mention word diffs, which (1) are often more useful for human readers of
> text (as opposed to programs), as (2) they show (or try to show)
> differences by individual words.
On Thu, Oct 02, 2025 at 12:13:11 +0100, mick.crane wrote:
> Does anybody know how the system/syntax for diff files for Bookworm can be
> explained?
If you mean how diff(1) and patch(1) are used, it's not specific to
Debian or Bookworm.
diff has three different output modes: legacy, context, and unified.
> Does anybody know how the system/syntax for diff files for Bookworm can be
> explained?
> At my end the patch partly worked and then bailed before 208c222,231
> Maybe it's something about code blocks in a browser.
I'd recommend using "diff -u" (i.e. the "unified" format).
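For the curious, here is the same one-line edit rendered in each of diff's output formats — a throwaway demo with invented filenames:

```shell
printf 'one\ntwo\nthree\n' > old.txt
printf 'one\n2\nthree\n'   > new.txt
# diff exits 1 when files differ, hence || true in each case:
diff    old.txt new.txt || true   # legacy/ed-style: "2c2" hunks
diff -c old.txt new.txt || true   # context format: "***" / "---" blocks
diff -u old.txt new.txt || true   # unified format: "@@" hunks, what patch(1) handles best
```

The unified output is also the most compact to paste into mail, which is why it's the usual recommendation.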
Does anybody know how the system/syntax for diff files for Bookworm can
be explained?
I don't get what you are asking.
At my end the patch partly worked and then bailed before 208c222,231
Maybe it's something about code blocks in a browser.
What patch is that?
--
John Doe
Please bear in mind I don't know what I'm doing.
I only ever used diff once before.
I was trying to get ChatGPT to post a patch file rather than the whole
thing but it's been unsuccessful.
Does anybody know how the system/syntax for diff files for Bookworm can
be explained?
On Sat, Sep 20, 2025 at 6:52 PM Tom Browder wrote:
> I just tried out the selection on my host and I remember one thing which
> turned me off: for a *nix program to show a default file name with spaces
> seems wrong.
>
> However, I will give it another try tomorrow.
I tried it, and it works!
Maybe the simplest option, but it works for me when I want to
concatenate PDFs:
gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -sOutputFile=$OUTPUTFILE.pdf
$FILE_IN1.pdf $FILE_IN3.pdf
Toni Mas
Message from Tom Browder on Sat, 20 Sep 2025 at 16:48:
>
> I have a new HP all-in-one printer I can use to scan to PDF.
pdfunite [options] PDF-sourcefile1..PDF-sourcefilen PDF-destfile
EXAMPLE
pdfunite sample1.pdf sample2.pdf sample.pdf
$ dpkg -S /usr/bin/pdfunite
poppler-utils: /usr/bin/pdfunite
A decade ago exiftool was unable to add metadata to files created by
pdfunite. There was no such issue if files were merged differently.
On Sat, Sep 20, 2025 at 7:48 AM Tom Browder wrote:
> The PDF Tool kit (pdftk) can be used if you're smart enough to decipher the
> man page.
$ man pdfunite | col -b | expand | sed -ne '/SYN/{N;p};/EXA/{N;p;q}'
SYNOPSIS
pdfunite [options] PDF-sourcefile1..PDF-sourcefilen PDF-destfile
EXAMPLE
       pdfunite sample1.pdf sample2.pdf sample.pdf
> I used to use xsane almost exclusively for scanning to PDF, but it
> only does single pages.
It doesn't. In the window where you can click to "Scan", at the
top-right under the menu you have a button that says "Save".
If you click on it, you'll see several choices, one of them called
"multipage".
naps2 is a very flexible utility that
allows pages to be rotated, or re-arranged by dragging and dropping.
When it comes to selecting individual pages or combining multiple
PDF files, I use pdftk. The syntax is a little odd, but now that
I've learned it I find it can do whatever I want.
On Sat, Sep 20, 2025 at 09:56 Greg wrote:
> On 2025-09-20, Tom Browder wrote:
>
…
> have a new HP all-in-one printer I can use to scan to PDF. Unfortunately
…
> Really? I use simple-scan, which produces multi-page scans on my
…
> I remember I did try simple-scan early on. And I will try it
On Saturday 20 September 2025 10:48:01 am Tom Browder wrote:
> I have a new HP all-in-one printer I can use to scan to PDF. Unfortunately
> I haven't found any Debian app that can produce multiple-page PDF scans.
I use Simple Scan for this.
I have a new HP all-in-one printer I can use to scan to PDF. Unfortunately
I haven't found any Debian app that can produce multiple-page PDF scans.
The only app I have any scanning success with is XSane (its docs are not
good, nor is its
user interface).
The PDF Tool kit (pdftk) can be used if you're smart enough to decipher
the man page.
On Fri, Aug 22, 2025 at 2:45 PM Ken Mankoff wrote:
> I cannot set ulimit -Hn above the current value of 32768. But for now, things
> seem to be working better.
That's probably set via PAM or the like for a default max hard limit
for most users.
One could reconfigure that, if needed/desired.
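For reference, on a typical Debian system that default comes from pam_limits; a sketch of raising it, with an invented drop-in file name and example values (not a recommendation):

```
# /etc/security/limits.d/90-nofile.conf  (read by pam_limits(8))
#<domain>  <type>  <item>   <value>
*          soft    nofile   8192
*          hard    nofile   65536
```

Note that systemd-managed sessions may instead honor DefaultLimitNOFILE= in /etc/systemd/system.conf or user.conf, so the PAM file alone may not be enough.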
Hi,
On Sat, Aug 23, 2025 at 09:26:16AM +0200, Michel Verdier wrote:
> Good mailers have different reply modes. "reply" use only From, "wide
> reply" use also Cc and adresses from To (I think you used this one), and
> "reply to list" which is the good one to use.
On 2025-08-22, Ken Mankoff wrote:
> I note your signature says "(please don't CC me when replying to the list)"
> but when I reply to your email, you are in the To field, and the list is in
> the CC field. I'm not sure what to do with this. I decided to remove you from
> the To field. I hope that's OK.
Hi Jan,
On 2025-08-19 at 04:55 -07, Jan Claeys wrote...
> On Fri, 2025-08-15 at 20:55 -0700, Ken Mankoff wrote:
> It sounds like both of these messages are related to 'inotify' (and/or
> one of the other kernel APIs for watching for file changes like
> 'fanotify').
>
> [...]
fs.file-max=9223372036854775807
> If you're seeing the message "Too many open files", you're *probably*
> hitting EMFILE (the per-process limit) rather than ENFILE (system-wide),
> but it would be helpful to see that definitively, e.g. with strace.
I have about a dozen kalarm files
~/.config/akonadi/agent_config_akonadi_kalarm_* and
~/.config/session/kalarm_*.
None are newer than 2022. Are these left over from when kalarm was
using akonadi? Can I safely delete them?
On Fri, 2025-08-15 at 20:55 -0700, Ken Mankoff wrote:
> I'd like to report a bug but don't know what package, and reportbug
> says I should email this list. I'm running Debian Trixie KDE Wayland,
> and repeatedly seeing "Too many open files".
EMFILE The per-process limit on the number of open file descriptors has
       been reached (see the description of RLIMIT_NOFILE in
       getrlimit(2)).
ENFILE The system-wide limit on the total number of open files has been
       reached.
And their canonical error messages are:
EMFILE: "Too many open files"
ENFILE: "Too many open files in system"
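To see which limit applies on a given box (assuming Linux with /proc mounted):

```shell
# EMFILE is per-process; ENFILE is system-wide.
ulimit -Sn                    # per-process soft limit on open fds
ulimit -Hn                    # per-process hard limit
cat /proc/sys/fs/file-max     # system-wide table size (ENFILE territory)
ls /proc/$$/fd | wc -l        # fds this shell has open right now
```

If the soft limit is generous (like the 32768 above) and the error still appears, something is leaking descriptors rather than legitimately needing more.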
One can set the limit quite low,
then go to open a fair number of files, just to explicitly trigger such
an error, capturing it with strace and showing (at least some of)
those results (it may not show it all, as that capture may get quite long).
So ...
$ (n=5; strace -fv -eall -s2048 -o strace.out bash -c 'echo $(uli
Hi Michael,
On 2025-08-18 at 21:55 -07, Michael Paoli wrote...
> Did you accidentally repost same again, or did you not see the earlier
> reply posting?
Sorry (again!). I incorrectly re-posted, then replied to your text (but not
directly to your email because I had not yet subscribed).
Did you accidentally repost same again, or did you not see the earlier
reply posting?
Have a look at:
https://lists.debian.org/debian-user/2025/08/msg00571.html
The list isn't write-only. :-)
< Subject: Re: Too many open files
< From: Michael Paoli <michael.pa...@berkeley.edu>
on Wayland, Qt,
etc.) and shows it occurring more than the few times I see it from some
terminal commands that I run. But nothing is crashing because of it as far as I
can tell.
> $ dolphin .
> kf.solid.backends.fstab: Failed to acquire watch file descriptor:
> Too many open files
Hello,
I'd like to report a bug but don't know what package, and reportbug says I
should email this list. I'm running Debian Trixie KDE Wayland, and repeatedly
seeing "Too many open files".
Two examples:
$ dolphin . # dolphin window opens, but this is printed:
kf.solid.backends.fstab: Failed to acquire watch file descriptor: Too
many open files
Thanks, and useful looking reporting,
though not (yet) a bug report.
Well, let's see ...
from your
> $ ulimit -a
> open files (-n) 32768
That should typically be way more than ample,
even excessive - but that would be another
issue (though possibly related).
On 2025-08-10 10:10:38 +0200, Nicolas George wrote:
> Vincent Lefevre (HE12025-08-10):
> > Yes, an example:
> >
> > qaa% ls -l PROGRAMME-FFC-2024.pdf
> > -rw-r--r-- 1 vinc17 vinc17 40991563 2024-11-11 02:18:14
> > PROGRAMME-FFC-2024.pdf
> > qaa% xz -k PROGRAMME-FFC-2024.pdf
> > qaa% xz -lv PROGRAMME-FFC-2024.pdf.xz
Vincent Lefevre (HE12025-08-10):
> Big files have several blocks. See my example. And that's with
> the default options.
What you showed is not default options, especially not for archives.
Nobody does “tar -cf out.tar … ; xz out.tar”, only “tar -c … | xz >
out.tar.xz”, or tar's -J option, which does the same.
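Whoever wants to settle the one-block-vs-many question for a given file can just ask xz. A small demo (sizes deliberately tiny, and --block-size forced explicitly — a default single-threaded xz typically produces one block):

```shell
head -c 4M /dev/zero > big.bin                    # throwaway input
xz -k -c big.bin > one-block.xz                   # default: typically a single block
xz -k -c --block-size=1MiB big.bin > blocks.xz    # force a block boundary every 1 MiB
xz -l one-block.xz blocks.xz                      # the "Blocks" column shows the difference
```

Multiple independent blocks are what would make skipping/random access possible in principle; a single block forces decompressing from the start.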
The blocks of compressed data may have part of one large file, or maybe
thousands of files. The compression formats generally don't know or care
when it comes to tar archives or the like, so there's not really much
info they could provide to tar about what's within ... for the most part
all they really know is the data, and perhaps also bits like how long it
is.
On 2025-08-09 23:47:26 +0200, Nicolas George wrote:
> But as I said, it does not do that, and that would be useless
> because xz files do not have blocks in practice.
Big files have several blocks. See my example. And that's with
the default options.
--
Vincent Lefèvre - Web: <https://www.vinc17.net/>
at block, 100K of compressed input, I am now
7M into that file I do not need to extract.
But as I said, it does not do that, and that would be useless because xz
files do not have blocks in practice.
Regards,
--
Nicolas George
> … "xz -d" will uncompress the whole file
> while this may not be necessary:
>
> tar tf file.tar.xz
>
> is sufficient. This may allow one to skip xz blocks if the archive
> contains big files. That said, I don't know whether GNU tar has
> such an optimization.
I rather doubt any tar implementation has such an optimization.
Vincent Lefevre (HE12025-08-09):
> xz compresses into several blocks by default (of course, this is
> visible only on very big files, where it really matters).
Before posting, I have checked on files of varied sizes, including a 14G
one. What did you test?
Regards,
--
Nicolas George
On 2025-08-09 12:46:35 +0200, Nicolas George wrote:
> Moreover, such an optimization would be mostly useless, as most xz files
> out there are made of a single block with sizes not specified.
xz compresses into several blocks by default (of course, this is
visible only on very big files, wh
> This may allow one to skip xz blocks if the archive
> contains big files. That said, I don't know whether GNU tar has
> such an optimization.
GNU tar does not have any such optimization, which can be observed easily
with:
~ $ ldd =tar
linux-vdso.so.1 (0x147694d94000)
tar tf file.tar.xz
is sufficient. This may allow one to skip xz blocks if the archive
contains big files. That said, I don't know whether GNU tar has
such an optimization.
--
Vincent Lefèvre - Web: <https://www.vinc17.net/>
100% accessible validated (X)HTML - Blog: <https://www.vinc17.net/blog/>
On Thu, Aug 7, 2025 at 5:01 AM Richard Owlett wrote:
> My questions:
>1. Can individual files or directories be extracted from XYZ.tar.xz ?
Yes.
$ cd $(mktemp -d)
$ >f
$ mkdir d
$ >d/f
$ tar -cf - [fd] | xz -9 > tar.xz
$ rm -rf [fd]
$ ls -A
tar.xz
$
So, we now have our tar.xz.
On 8/8/25 9:08 PM, Vincent Lefevre wrote:
On 2025-08-07 09:33:34 -0400, Greg Wooledge wrote:
On Thu, Aug 07, 2025 at 07:00:31 -0500, Richard Owlett wrote:
My questions:
1. Can individual files or directories be extracted from XYZ.tar.xz ?
Yes.
2. Is there a compressed format that makes the above convenient?
The archive is compressed as a whole. So the reading program has to
decompress the archive from its start at least up to the end of the
last desired file.
Regardless of compression, tar is not very efficient when it comes to
picking files from the archive. If the storage medium is slow, then it
may take quite some time.
Richard Owlett composed on 2025-08-07 13:37 (UTC-0500):
> /ABC/qrz/help is a directory *ONLY* on another person's machine.
> I had downloaded XYZ.tar.xz from a repository.
> I wish to decompress it to obtain a local copy of only the files in
> /ABC/qrz/help for my local machine.
On Thu, Aug 07, 2025 at 10:01:01 -0500, Richard Owlett wrote:
> XYZ.tar.xz is only file in current directory.
> I have launched mc from the command line.
> How do I extract from XYZ.tar.xz all files that originated in directory
> /ABC/qrz/help to the current directory?
I wouldn't.
Just unpack the tarfile and then ignore or delete the stuff you
aren't interested in.
What are you actually trying to do?
--
John Hasler
j...@sugarbit.com
Elmwood, WI USA
On Thu, Aug 07, 2025 at 07:41:33AM -0500, Richard Owlett wrote:
> On 8/7/25 7:00 AM, Richard Owlett wrote:
> > I'm too acclimated to decompressing with single mouse-click ;/
> > Where would I find a good introduction to tar archives?
> >
> > 1. Can individual files or directories be extracted from XYZ.tar.xz ?
Richard Owlett composed on 2025-08-07 07:00 (UTC-0500):
>1. Can individual files or directories be extracted from XYZ.tar.xz ?
1: open mc (if not already installed: sudo apt install mc)
2: navigate to the location of XYZ.tar.xz
3:
4: navigate to the file(s) you wish extracted & select them
On Thu, Aug 07, 2025 at 07:00:31 -0500, Richard Owlett wrote:
> My questions:
> 1. Can individual files or directories be extracted from XYZ.tar.xz ?
Yes.
> 2. Is there a compressed format that makes above convenient?
If you plan to do this repeatedly, the .zip format is a superior choice.
Hi,
Richard Owlett wrote
> 1. Can individual files or directories be extracted from XYZ.tar.xz ?
Look up exact path of your file, like with:
tar tJf XYZ.tar.xz | less
and the search capabilities of "less".
Or by:
tar tJf XYZ.tar.xz | '...some.search.expression...'
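To round out the listing advice, here is a sketch of the whole find-then-extract cycle, with an invented archive mirroring the /ABC/qrz/help layout (tar stores such paths without the leading slash):

```shell
# Build a sample archive to stand in for the downloaded XYZ.tar.xz:
mkdir -p ABC/qrz/help && echo demo > ABC/qrz/help/readme
tar -cJf XYZ.tar.xz ABC && rm -r ABC
# 1. List, to learn the exact member names:
tar -tJf XYZ.tar.xz | grep 'qrz/help'
# 2. Extract only that directory (member name must match the listing):
tar -xJf XYZ.tar.xz ABC/qrz/help
cat ABC/qrz/help/readme
```

Note that extraction still decompresses the stream from the beginning; naming members only limits what gets written to disk.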
On 8/7/25 7:00 AM, Richard Owlett wrote:
I'm too acclimated to decompressing with single mouse-click ;/
Where would I find a good introduction to tar archives?
[SNIP]
My questions:
1. Can individual files or directories be extracted from XYZ.tar.xz ?
Better prepared web search answers identified the directory containing
files of interest.
Though analog oriented, I have decades of customer service, QA/QC, and
engineering support.
I accepted the implied challenge to "Put up or ..." <*GRIN*>
I have two primary goals:
1. Convert the content of the current
itself is quite safe from being overwritten.
But MS-Windows or the boot firmware might decide to add files to
the EFI partition which is advertised by the partition table in the
System Area of the ISO filesystem.
I reported the above on this list sometime in the past (few years?), but am
unable to locate the thread.
On 8/2/25 20:44, David Christensen wrote:
5. Verify the computed SHA256 checksum appears in the downloaded
SHA512SUMS file:
Sorry for the error -- that should be:
5. Verify the computed SHA512 checksum appears in the downloaded
SHA512SUMS file:
David
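One way to do the check above against written media is to hash only as many bytes as the image is long, since the device is usually bigger than the image. A sketch with invented names, using a plain file to stand in for the device:

```shell
img=debian.iso; dev=fake-device                    # illustrative names
head -c 1M /dev/urandom > "$img"                   # stand-in "image"
cp "$img" "$dev"; head -c 2M /dev/zero >> "$dev"   # "device" with trailing padding
sha512sum "$img" > SHA512SUMS                      # stand-in sums file
# Hash exactly the image's length from the device, then look the
# digest up in SHA512SUMS:
sum=$(head -c "$(stat -c %s "$img")" "$dev" | sha512sum | awk '{print $1}')
grep -q "$sum" SHA512SUMS && echo "checksum matches"
```

On real media, $dev would be something like /dev/sdX and the trailing bytes are whatever was on the stick before, which is why hashing the whole device never matches.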
On Sun, Aug 03, 2025 at 06:42:13AM -0500, Nate Bargmann wrote:
[...]
> Meanwhile "dd" has always worked for me. I'll have to remember Tomas'
> recommendation for "oflag=sync" for the next time I write an image,
> though that might be a while.
I usually just remember "there was a flag for that...".
On Sun Aug 3, 2025 at 6:52 AM BST, tomas wrote:
I always recommend to add "oflag=sync" to dd itself: this way it
syncs as it goes and you don't have to wait for a (potentially
long) time for sync to "come back".
Together with "status=progress" you get a visual feedback on how
things are going.
On Sun, Aug 03, 2025 at 11:58:59AM +0200, Nicolas George wrote:
[...]
> As Tomas pointed, with dd specifically you can use oflag=sync to have it
> sync explicitly between each block, to get a better progress estimate.
> Be sure to use a large block size or you will ruin performance.
Oh, yes.
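Putting that advice together, the invocation looks roughly like this (the device path is a placeholder; the runnable part writes to a plain file instead):

```shell
# The real-world shape, NOT run here (sdX is a placeholder):
#   dd if=debian.iso of=/dev/sdX bs=4M oflag=sync status=progress
# Demonstrated against a regular file: large-ish blocks, per-block
# sync, and a live progress line on stderr.
dd if=/dev/zero of=out.img bs=1M count=4 oflag=sync status=progress
ls -l out.img
```

With oflag=sync each block is flushed as it is written, so the progress display reflects bytes actually on the medium rather than bytes parked in the page cache.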
Titus Newswanger composed on 2025-08-02 23:50 (UTC-0500):
> I've been meaning to learn how to sha512sum after it is written to disk.
> Now I've got it. Here are my results:
> on the older usb disk that worked, sha512sum matched
> on the new faulty disk, after writing with dd, sha512sum did not match.
Titus Newswanger (HE12025-08-02):
> Strangely, it did not display status updates like it used to until after it
> completed.
The rest has been explained, but not that. Let me.
First dd wrote all the data extremely fast. We can assume the input file
was recently downloaded and still in memory cache.
Hi,
a late comment to the failure messages about the USB stick in the
system log:
> [40799.447110] sd 7:0:0:0: [sdf] Attached SCSI removable disk
> [41167.136643] usb 2-1.2: reset high-speed USB device number 27 using
> xhci_hcd
The reset message is not caused by unplugging of the USB device.
On Sat, Aug 02, 2025 at 10:49:44PM -0500, Titus Newswanger wrote:
> On 8/2/25 20:53, Nate Bargmann wrote:
> > The command you're probably thinking of is sync.
I always recommend to add "oflag=sync" to dd itself: this way it
syncs as it goes and you don't have to wait for a (potentially
long) time for sync to "come back".
On 8/2/25 22:44, David Christensen wrote:
5. Verify the computed SHA256 checksum appears in the downloaded
SHA512SUMS file:
I've been meaning to learn how to sha512sum after it is written to disk.
Now I've got it. Here are my results:
on the older usb disk that worked, sha512sum matched
on the new faulty disk, after writing with dd, sha512sum did not match.
On 8/2/25 20:53, Nate Bargmann wrote:
The command you're probably thinking of is sync.
Thanks, I added sync to my bash notes. I had looked at man sync earlier
today and thought that's not it...
However it turned out to be a hardware issue. This was my first time
using that new flash drive.
The command you're probably thinking of is sync.
- Nate
--
"The optimist proclaims that we live in the best of all
possible worlds. The pessimist fears this is true."
Web: https://www.n0nb.us
Projects: https://github.com/N0NB
GPG fingerprint: 82D6 4F6B 0E67 CD41 F689 BBA6 FB2C 5130 D55A 8819
When I mounted the finished boot media, the directory contents looked
normal, but the text files - readme, changelog, etc. - had garbled contents.
Attempted boot: grub came up with the usual options, I selected
graphical install (the default), then kernel panic, with something like
"initrd cont…"
> I used to use pdfjoin to join pdfs (though there was a bug where some
> pages would be oriented wrongly) and I needed to join some pdfs
> recently. But there were so many dependencies for pdfjoin that I decided
> to try pdfunite (that somebody had recently mentioned here), which I
> already had installed.
On 2025-07-11, Nicolas George wrote:
> hw (HE12025-07-11):
>> (S)FTP is still in use like for cameras, scanners (printers) and phones.
>
> Do you have a few examples of brand and models of cameras and phones
> that use FTP?
Some high-end cameras use it. Phones, not so much.