Ah, so it is a bug that happens when "honor nodump flag=yes" is set on Linux.
I couldn't reproduce it because FreeBSD implements nodump differently.
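My guess at the mechanism (I have not traced this in the Bacula source, so treat it as an assumption): on Linux the nodump flag is read via an ioctl, which means the file daemon has to open() each file first, and a plain blocking open() on a FIFO with no writer hangs until a writer appears. Opening with O_NONBLOCK avoids that, which this small Python sketch demonstrates:

```python
import os
import threading
import time

fifo = "/tmp/nodump_demo_fifo"  # hypothetical demo path
if os.path.exists(fifo):
    os.remove(fifo)
os.mkfifo(fifo)

# A non-blocking open returns immediately even though no writer exists.
fd = os.open(fifo, os.O_RDONLY | os.O_NONBLOCK)
os.close(fd)

# A plain blocking open(O_RDONLY) hangs until a writer shows up --
# the same symptom the stuck bacula-fd exhibits.
opened = threading.Event()

def blocking_open():
    f = os.open(fifo, os.O_RDONLY)  # blocks: no writer yet
    opened.set()
    os.close(f)

t = threading.Thread(target=blocking_open, daemon=True)
t.start()
time.sleep(0.5)
print("still blocked:", not opened.is_set())  # True while no writer exists

# The "echo >pipe" trick from this thread: opening the write end
# rendezvouses with the blocked reader and releases it.
w = os.open(fifo, os.O_WRONLY)
os.close(w)
t.join(timeout=5)
print("released:", opened.is_set())
os.remove(fifo)
```

This matches the observed behavior: the job sits until something writes to (or merely opens) the pipe, then completes normally.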

You could work around it either by not using that option or by having another
options clause that sets "honor nodump flag=no" for the specific places where
you have pipes or sockets.
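For the second workaround, the FileSet could look something like this. This is an untested sketch; the path is only an example, and it relies on Bacula applying the first Options clause whose patterns match a given file:

```conf
FileSet {
  Name = "debugfs2-nodump-workaround"
  Include {
    # First matching Options clause wins: files under the directory
    # containing the pipes skip the nodump check.
    Options {
      wild = "/tmp/debug/*"      # example path with pipes/sockets
      honor nodump flag = no
      signature = MD5
    }
    # Everything else keeps nodump handling.
    Options {
      signature = MD5
      honor nodump flag = yes
    }
    File = "/tmp/debug"
  }
}
```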

__Martin


>>>>> On Thu, 15 Feb 2024 09:14:18 -0500, Peter Sjoberg said:
> 
> Sorry for the late reply; I had to let my normal backups finish up first.
> 
> I did ask google how to run gdb and managed to get something out that I
> have now attached.
> No symbols, since I'm running the distro version instead of the github one.
> I did see that a new version, 13.0.4, has been released. I'm running some more
> backups now (copy to external drive), but after that I will update and
> see if anything changes.
> 
> /ps
> 
> 
> On 2024-02-13 10:55, Martin Simmons wrote:
> > It works for me on FreeBSD with Bacula 15 from git.
> >
> > Can you attach gdb to the bacula-fd while it is running and issue the gdb
> > command:
> >
> > thread apply all bt
> >
> > Also, try running bacula-fd with -d 150 -dt -v -fP, which will make it print
> > the debug info to the terminal.  Level 150 should show what it is doing for
> > the fifo.
> >
> > __Martin
> >
> >
> >>>>>> On Tue, 13 Feb 2024 09:07:43 -0500, Peter Sjoberg said:
> >> On 2024-02-13 02:49, Eric Bollengier wrote:
> >>> Hello Peter,
> >>>
> >>> Without the ReadFifo directive, it's unlikely to cause a problem,
> >> Unlikely maybe, but that is the problem, and I can even reproduce it!
> >> My setup is based on Ubuntu 22.04 LTS (I was trying Debian, but align is
> >> broken there) using the community repo
> >>
> >>     deb https://www.bacula.org/packages/<redacted>/debs/13.0.3 jammy main
> >>
> >>> and the file daemon output is pretty clear, we are not at this file.
> >> The file daemon output shows the last file that worked, not the file it is
> >> trying to back up.
> >>
> >> To reproduce I did
> >>
> >> *1 - create a fileset that backs up just /tmp/debug*
> >>
> >> FileSet {
> >>       Name = "debugfs2"
> >>       Ignore FileSet Changes = yes
> >>       Include {
> >>         Options {
> >>           signature=MD5
> >>           honor nodump flag=yes
> >>           noatime=yes
> >>           keepatime = no
> >>           sparse=yes
> >>           exclude = yes
> >>           wild = *~
> >>           wild = *.tmp
> >>           }
> >>         File = "/tmp/debug"
> >>         }
> >>       }
> >>
> >> *2 - create a pipe ("mkfifo random_pipe") and a plain file ("date
> >> >a_red_herring") in /tmp/debug*
> >>
> >> peters@quark:/tmp/debug$ find /tmp/debug/ -ls
> >>     11017      0 drwxr-xr-x   2 peters   peters         80 Feb 13 08:47 /tmp/debug/
> >>     11031      4 -rw-r--r--   1 peters   peters         32 Feb 13 08:47 /tmp/debug/a_red_herring
> >>     11029      0 prw-r--r--   1 peters   peters          0 Feb 13 08:46 /tmp/debug/random_pipe
> >> peters@quark:/tmp/debug$
> >>
> >> *3 - start a backup*
> >>
> >> root@quark:~# echo run BackupQ_quark FileSet="debugfs2" Level=Full yes|bconsole
> >>
> >> *4 - confirm it hangs*
> >>
> >> root@quark:~# echo stat client=quark-fd|bconsole #CLIENTSTAT
> >> Connecting to Director quark:9101
> >> 1000 OK: 10002 techwiz-dir Version: 13.0.3 (02 May 2023)
> >> Enter a period to cancel a command.
> >> stat client=quark-fd
> >> Connecting to Client quark-fd at quark:9102
> >>
> >> quark-fd Version: 13.0.3 (02 May 2023)  x86_64-pc-linux-gnu-bacula-enterprise ubuntu 22.04
> >> Daemon started 12-Feb-24 23:55. Jobs: run=5 running=1.
> >>    Heap: heap=856,064 smbytes=603,907 max_bytes=1,219,047 bufs=178 max_bufs=429
> >>    Sizes: boffset_t=8 size_t=8 debug=0 trace=0 mode=0,0 bwlimit=0kB/s
> >>    Crypto: fips=N/A crypto=OpenSSL 3.0.2 15 Mar 2022
> >>    Plugin: bpipe-fd.so(2)
> >>
> >> Running Jobs:
> >> JobId 315 Job BackupQ_quark.2024-02-13_08.48.07_46 is running.
> >>       Full Backup Job started: 13-Feb-24 08:48
> >>       Files=1 Bytes=40 AveBytes/sec=0 LastBytes/sec=2 Errors=0
> >>       Bwlimit=0 ReadBytes=32
> >>       Files: Examined=1 Backed up=1
> >>       Processing file: /tmp/debug/a_red_herring
> >>       SDReadSeqNo=8 fd=5 SDtls=1
> >> Director connected using TLS at: 13-Feb-24 08:55
> >> ====
> >>
> >> *5 - release the job by sending something to the pipe*
> >>
> >> root@quark:~# echo >/tmp/debug/random_pipe
> >>
> >> *6 - confirm the job finished*
> >>
> >> root@quark:~# echo 'llist jobid=315'|bconsole
> >> Connecting to Director quark:9101
> >> 1000 OK: 10002 techwiz-dir Version: 13.0.3 (02 May 2023)
> >> Enter a period to cancel a command.
> >> llist jobid=315
> >> Automatically selected Catalog: MyCatalog
> >> Using Catalog "MyCatalog"
> >>              JobId: 315
> >>                Job: BackupQ_quark.2024-02-13_08.48.07_46
> >>               Name: BackupQ_quark
> >>        PurgedFiles: 0
> >>               Type: B
> >>              Level: F
> >>           ClientId: 32
> >>         ClientName: quark-fd
> >>          JobStatus: T
> >>          SchedTime: 2024-02-13 08:48:07
> >>          StartTime: 2024-02-13 08:48:09
> >>            EndTime: 2024-02-13 08:56:40
> >>        RealEndTime: 2024-02-13 08:56:40
> >>           JobTDate: 1,707,832,600
> >>       VolSessionId: 66
> >>     VolSessionTime: 1,707,779,559
> >>           JobFiles: 3
> >>           JobBytes: 40
> >>          ReadBytes: 32
> >>          JobErrors: 0
> >>    JobMissingFiles: 0
> >>             PoolId: 2
> >>           PoolName: File-Full
> >>         PriorJobId: 0
> >>           PriorJob:
> >>          FileSetId: 7
> >>            FileSet: debugfs2
> >>           HasCache: 0
> >>            Comment:
> >>           Reviewed: 0
> >>
> >> You have messages.
> >> root@quark:~#
> >>
> >> /ps
> >>
> >>> The problem can be somewhere else, and a good start is a "status dir"
> >>> and "status storage".
> >>>
> >>> Best Regards,
> >>> Eric
> >>>
> >>> On 2/13/24 06:23, Peter Sjoberg wrote:
> >>>> Actually, I think I found the root cause - a pipe!
> >>>>
> >>>> The file listed in the client status is not the problem, but close to
> >>>> it is a pipe (maybe the next file), and that is what is causing the
> >>>> issue in all cases.
> >>>> I stripped down the directory to just one file and it still fails
> >>>>
> >>>> root@defiant1:/home/debug# find .zoom -ls
> >>>>   23855106      4 drwx------   4 sys      adm          4096 Feb 13 00:11 .zoom
> >>>>   23855107      4 drwxrwxr-x   2 ba       ba           4096 Feb 13 00:11 .zoom/data
> >>>>   23855116      0 prw-r--r--   1 ba       ba              0 Mar 12  2021 .zoom/data/com.zoom.ipc.confapp__res
> >>>>   23855110      4 drwxrwxr-x   2 sys      adm          4096 May  5  2020 .zoom/reports
> >>>> root@defiant1:/home/debug#
> >>>>
> >>>>
> >>>> and if I send something to the pipe, the job finishes OK, and restoring
> >>>> the job did include the pipe.
> >>>>
> >>>> /ps
> >>>>
> >>>> On 2024-02-12 23:36, Peter Sjoberg wrote:
> >>>>> In short - no and no, no special files.
> >>>>> Also, while I haven't waited forever, I have left it for several
> >>>>> hours, so it's not like it's just some housecleaning left.
> >>>>> It does happen on different servers; the sample below is from my laptop.
> >>>>>
> >>>>> =============== My fileset:
> >>>>> FileSet {
> >>>>>       Name = "debugfs2"
> >>>>>       Ignore FileSet Changes = yes
> >>>>>       Include {
> >>>>>         Options {
> >>>>>           signature=MD5
> >>>>>           honor nodump flag=yes
> >>>>>           noatime=yes
> >>>>>           keepatime = no
> >>>>>           sparse=yes
> >>>>>           exclude = yes
> >>>>>           wild = *~
> >>>>>           wild = *.tmp
> >>>>>           }
> >>>>>         File = "/home/ba/.zoom"
> >>>>>         }
> >>>>>       }
> >>>>>
> >>>>> ============= run command
> >>>>> echo run BackupQ_defiant1 FileSet="debugfs2" Level=Full yes|bconsole
> >>>>>
> >>>>>
> >>>>> ============= client status when hung
> >>>>> root@quark:~# echo stat client=defiant1-fd|bconsole #CLIENTSTAT
> >>>>> Connecting to Director quark:9101
> >>>>> 1000 OK: 10002 techwiz-dir Version: 13.0.3 (02 May 2023)
> >>>>> Enter a period to cancel a command.
> >>>>> stat client=defiant1-fd
> >>>>> Connecting to Client defiant1-fd at defiant1:9102
> >>>>>
> >>>>> defiant1-fd Version: 13.0.3 (02 May 2023)
> >>>>> x86_64-pc-linux-gnu-bacula-enterprise ubuntu 22.04
> >>>>> Daemon started 12-Feb-24 23:22. Jobs: run=2 running=1.
> >>>>>   Heap: heap=856,064 smbytes=606,583 max_bytes=794,675 bufs=188 max_bufs=203
> >>>>>   Sizes: boffset_t=8 size_t=8 debug=0 trace=0 mode=0,0 bwlimit=0kB/s
> >>>>>   Crypto: fips=N/A crypto=OpenSSL 3.0.2 15 Mar 2022
> >>>>>   Plugin: bpipe-fd.so(2)
> >>>>>
> >>>>> Running Jobs:
> >>>>> JobId 255 Job BackupQ_defiant1.2024-02-12_23.29.48_18 is running.
> >>>>>      Full Backup Job started: 12-Feb-24 23:29
> >>>>>      Files=2 Bytes=0 AveBytes/sec=0 LastBytes/sec=0 Errors=0
> >>>>>      Bwlimit=0 ReadBytes=0
> >>>>>      Files: Examined=2 Backed up=2
> >>>>>      Processing file: /home/ba/.zoom/logs
> >>>>>      SDReadSeqNo=8 fd=5 SDtls=1
> >>>>> Director connected using TLS at: 12-Feb-24 23:30
> >>>>> ====
> >>>>>
> >>>>>
> >>>>> ================ Content of that directory
> >>>>>
> >>>>> root@defiant1:~# find /home/ba/.zoom -ls
> >>>>>    15728688      4 drwx------   7 ba       ba           4096 Apr 28  2021 /home/ba/.zoom
> >>>>>    16130303      4 drwx------   2 ba       ba           4096 Apr 28  2021 /home/ba/.zoom/screenCapture
> >>>>>    16130301      4 drwxrwxr-x   2 ba       ba           4096 Feb 12 18:47 /home/ba/.zoom/logs
> >>>>>    16023538      4 drwxrwxr-x   4 ba       ba           4096 Apr 28  2021 /home/ba/.zoom/data
> >>>>>    16023540      0 prw-r--r--   1 ba       ba              0 May  5  2020 /home/ba/.zoom/data/com.zoom.ipc.assistantapp__res
> >>>>>    16023541      0 prw-r--r--   1 ba       ba              0 Mar 12  2021 /home/ba/.zoom/data/com.zoom.ipc.confapp__req
> >>>>>    16023539      0 prw-r--r--   1 ba       ba              0 Apr 28  2021 /home/ba/.zoom/data/com.zoom.ipc.assistantapp__req
> >>>>>    16130305      4 drwx------   2 ba       ba           4096 Mar 23  2021 /home/ba/.zoom/data/VirtualBkgnd_Custom
> >>>>>    16131475   1564 -rw-------   1 ba       ba        1597940 Mar 23  2021 /home/ba/.zoom/data/VirtualBkgnd_Custom/{ff6d8a57-d810-4dd2-bf1b-8366c063728f}
> >>>>>    16023542      0 prw-r--r--   1 ba       ba              0 Mar 12  2021 /home/ba/.zoom/data/com.zoom.ipc.confapp__res
> >>>>>    16023545     52 -rw-------   1 ba       ba          53248 Apr 28  2021 /home/ba/.zoom/data/zoomus.enc.db
> >>>>>    16130304      4 drwx------   2 ba       ba           4096 Mar 23  2021 /home/ba/.zoom/data/ConfAvatar
> >>>>>    16131472     36 -rw-------   1 ba       ba          36397 Mar 23  2021 /home/ba/.zoom/data/ConfAvatar/conf_avatar_6c72761c1ad5cc6f485dce3966cbb705_100
> >>>>>    16131473      4 -rw-------   1 ba       ba           1020 Mar 23  2021 /home/ba/.zoom/data/ConfAvatar/conf_avatar_9e6b3f01c5d33a2052c2681a42b4e659_100
> >>>>>    16131474      4 -rw-------   1 ba       ba           1020 Mar 23  2021 /home/ba/.zoom/data/ConfAvatar/conf_avatar_e977cbed2632f5b11882e92e31f32516_100
> >>>>>    16023544      8 -rw-------   1 ba       ba           5120 Mar 23  2021 /home/ba/.zoom/data/zoommeeting.enc.db
> >>>>>    16130302      4 drwxrwxr-x   2 ba       ba           4096 May  5  2020 /home/ba/.zoom/reports
> >>>>>    16130300      4 drwx------   2 ba       ba           4096 Apr 28  2021 /home/ba/.zoom/im
> >>>>> root@defiant1:~#
> >>>>>
> >>>>> /ps
> >>>>>
> >>>>>
> >>>>> On 2024-02-12 20:03, Gary R. Schmidt wrote:
> >>>>>> On 13/02/2024 11:08, Phil Stracchino wrote:
> >>>>>>> On 2/12/24 18:35, Peter Sjoberg wrote:
> >>>>>>>> Hi all
> >>>>>>>>
> >>>>>>>> I have a strange and (on my system) reproducible problem.
> >>>>>>>> When I do backup of some directories then bacula-fd just hangs
> >>>>>>>> and never completes.
> >>>>>>>> The directories in question are not very strange, and backup of
> >>>>>>>> them works fine with older versions of -fd
> >>>>>>>
> >>>>>>> Silly question:  Do the problem directories contain named pipes or
> >>>>>>> sockets?
> >>>>>> Another possibly silly question: Are there any soft links that may
> >>>>>> cause a loop?
> >>>>>>
> >>>>>>     Cheers,
> >>>>>>         Gary    B-)
> >>>>>>
> >>>>>>
> >>>>>> _______________________________________________
> >>>>>> Bacula-users mailing list
> >>>>>> Bacula-users@lists.sourceforge.net
> >>>>>> https://lists.sourceforge.net/lists/listinfo/bacula-users
> >>>>>
> >>>>>
> >>>>>
> >>>>
> >>>>
> >>>>
> 
> 


