A bug sneaked into that release. Please update to the newest version.
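On Ubuntu (the Build OS shown in the logs below), updating to the newest packages is typically a plain package upgrade followed by a daemon restart. A minimal sketch, assuming the Bareos apt repository is already configured on the host:

```shell
# Refresh package lists and pull the latest Bareos packages
# (assumes the Bareos apt repository is already set up).
sudo apt-get update
sudo apt-get install --only-upgrade bareos bareos-database-postgresql

# Restart the daemons so the fixed binaries are actually in use.
sudo systemctl restart bareos-director bareos-storage bareos-filedaemon
```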

On 28.03.24 at 09:26, 'Alexander P' via bareos-users wrote:
Hello everyone,
Yesterday I updated from Bareos 22.1.2 to 23.0.3.
I also updated the database as described here:
https://docs.bareos.org/introductionandtutorial/updatingbareos.html
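For reference, on PostgreSQL the catalog schema migration is done with the scripts shipped in the Bareos packages. A sketch using the default package paths (on Debian/Ubuntu the dbconfig integration may run this step automatically; adjust paths if your installation is customized):

```shell
# Run the catalog schema migration as the postgres superuser
# (paths are the package defaults from the Bareos scripts directory).
su - postgres -c /usr/lib/bareos/scripts/update_bareos_tables
su - postgres -c /usr/lib/bareos/scripts/grant_bareos_privileges
```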

Now I have a problem with copy jobs from one storage to another: after exactly one minute I get an email saying the job failed.
But when I look at the two storages, I can see that they are still copying.

There is no message or error anywhere else (syslog/PostgreSQL/...), which is what makes this so difficult to track down.
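To dig further than the failure email, the full Director job log and the live storage-daemon view can be pulled via bconsole. A sketch, using the JobId and the read storage name from the output below:

```shell
# Show every message the Director recorded for the failing copy job.
echo "list joblog jobid=418789" | bconsole

# Ask the storage daemon what it thinks the job is currently doing.
echo "status storage=OVH_File-Autochanger" | bconsole
```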



Logfile Bareos Director
##############
28-Mar 09:13 bareos-dir.mgm.domain.tld JobId 418788: Bareos bareos-dir.mgm.domain.tld 23.0.3~pre47.36e516c0b (19Mar24):
  Build OS:               Ubuntu 20.04.5 LTS
  Current JobId:          418788
  Current Job:  Copy_OVH_Full_To_Muc.2024-03-28_09.13.55_57
  Catalog:                "MyCatalog" (From Default catalog)
  Start time:             28-Mar-2024 09:13:57
  End time:               28-Mar-2024 09:13:58
  Elapsed time:           1 sec
  Priority:               100
  Bareos binary info:     Bareos community build (UNSUPPORTED): Get professional support from https://www.bareos.com
  Job triggered by:       User
  Termination:            Copying OK

28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418789: Start Copying JobId 418789, Job=Copy_OVH_Full_To_Muc.2024-03-28_09.13.57_01
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418789: Connected Storage daemon at bareos-sd01.mgm.domain.tld:9103, encryption: TLS_CHACHA20_POLY1305_SHA256 TLSv1.3
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418789:  Encryption: TLS_CHACHA20_POLY1305_SHA256 TLSv1.3
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: Connected Storage daemon at bareos-sd01.mgm.muc01.fti.int:9103, encryption: TLS_CHACHA20_POLY1305_SHA256 TLSv1.3
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790:  Encryption: TLS_CHACHA20_POLY1305_SHA256 TLSv1.3
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418789: Using Device "Disk_Drive000" to read.
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: There are no more Jobs associated with Volume "MUC_Disk-Tape-3779". Marking it purged.
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: All records pruned from Volume "MUC_Disk-Tape-3779"; marking it "Purged"
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: Recycled volume "MUC_Disk-Tape-3779"
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: Using Device "Disk_Drive000" to write.
28-Mar 09:14 bareos-sd01.mgm.domain.tld JobId 418789: Connected Storage daemon at bareos-sd01.mgm.muc01.fti.int:9103, encryption: TLS_CHACHA20_POLY1305_SHA256 TLSv1.3
28-Mar 09:14 bareos-sd01.mgm.domain.tld JobId 418789: Ready to read from volume "OVH_Disk-Tape-3110" on device "Disk_Drive000" (/var/lib/bareos/storage_ovh/Disk_Drive000).
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: 3307 Issuing autochanger "unload slot 5106, drive 0" command.
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: 3304 Issuing autochanger "load slot 3779, drive 0" command.
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: 3305 Autochanger "load slot 3779, drive 0", status is OK.
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: Recycled volume "MUC_Disk-Tape-3779" on device "Disk_Drive000" (/var/lib/bareos/storage_muc/Disk_Drive000), all previous data lost.
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: Max Volume jobs=1 exceeded. Marking Volume "MUC_Disk-Tape-3779" as Used.
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: autoxflate-sd: MUC_File-Autochanger OUT:[SD->inflate=yes->deflate=no->DEV] IN:[DEV->inflate=yes->deflate=no->SD]
28-Mar 09:14 bareos-sd01.mgm.domain.tld JobId 418789: Forward spacing Volume "OVH_Disk-Tape-3110" to file:block 0:257.
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: Spooling data ...
28-Mar 09:15 bareos-dir.mgm.domain.tld JobId 418789: Error: Bareos bareos-dir.mgm.domain.tld 23.0.3~pre47.36e516c0b (19Mar24):
  Build OS:               Ubuntu 20.04.5 LTS
  Prev Backup JobId:      418773
  Prev Backup Job:  fai-salt.mgm.domain.tld_LinuxAll_Job.2024-03-28_07.58.55_32
  New Backup JobId:       418790
  Current JobId:          418789
  Current Job:  Copy_OVH_Full_To_Muc.2024-03-28_09.13.57_01
  Backup Level:           Incremental
  Client:                 fai-salt.mgm.domain.tld
  FileSet:                "LinuxAll"
  Read Pool:              "OVH_Full" (From Job resource)
  Read Storage:           "OVH_File-Autochanger" (From Pool resource)
  Write Pool:             "MUC_Copy_Full" (From Job Pool's NextPool resource)
  Write Storage:          "MUC_File-Autochanger" (From Storage from Pool's NextPool resource)
  Next Pool:              "MUC_Copy_Full" (From Job Pool's NextPool resource)
  Catalog:                "MyCatalog" (From Default catalog)
  Start time:             28-Mar-2024 09:14:00
  End time:               28-Mar-2024 09:15:00
  Elapsed time:           1 min
  Priority:               100
  SD Files Written:       0
  SD Bytes Written:       0 (0 B)
  Rate:                   0.0 KB/s
  Volume name(s):         MUC_Disk-Tape-3779
  Volume Session Id:      73
  Volume Session Time:    1711600421
  Last Volume Bytes:      254 (254 B)
  SD Errors:              0
  SD termination status:  Running
  Bareos binary info:     Bareos community build (UNSUPPORTED): Get professional support from https://www.bareos.com
  Job triggered by:       User
  Termination:            *** Copying Error ***



Email Message
##############
28-Mar 09:13 bareos-dir.mgm.domain.tld JobId 418789: Copying using JobId=418773 Job=fai-salt.mgm.domain.tld_LinuxAll_Job.2024-03-28_07.58.55_32
28-Mar 09:13 bareos-dir.mgm.domain.tld JobId 418789: Bootstrap records written to /var/lib/bareos/bareos-dir.mgm.domain.tld.restore.2.bsr
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418789: Start Copying JobId 418789, Job=Copy_OVH_Full_To_Muc.2024-03-28_09.13.57_01
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418789: Connected Storage daemon at bareos-sd01.mgm.domain.tld:9103, encryption: TLS_CHACHA20_POLY1305_SHA256 TLSv1.3
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418789:  Encryption: TLS_CHACHA20_POLY1305_SHA256 TLSv1.3
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: Connected Storage daemon at bareos-sd01.mgm.muc01.fti.int:9103, encryption: TLS_CHACHA20_POLY1305_SHA256 TLSv1.3
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790:  Encryption: TLS_CHACHA20_POLY1305_SHA256 TLSv1.3
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418789: Using Device "Disk_Drive000" to read.
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: There are no more Jobs associated with Volume "MUC_Disk-Tape-3779". Marking it purged.
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: All records pruned from Volume "MUC_Disk-Tape-3779"; marking it "Purged"
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: Recycled volume "MUC_Disk-Tape-3779"
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: Using Device "Disk_Drive000" to write.
28-Mar 09:14 bareos-sd01.mgm.domain.tld JobId 418789: Connected Storage daemon at bareos-sd01.mgm.muc01.fti.int:9103, encryption: TLS_CHACHA20_POLY1305_SHA256 TLSv1.3
28-Mar 09:14 bareos-sd01.mgm.domain.tld JobId 418789: Ready to read from volume "OVH_Disk-Tape-3110" on device "Disk_Drive000" (/var/lib/bareos/storage_ovh/Disk_Drive000).
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: 3307 Issuing autochanger "unload slot 5106, drive 0" command.
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: 3304 Issuing autochanger "load slot 3779, drive 0" command.
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: 3305 Autochanger "load slot 3779, drive 0", status is OK.
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: Recycled volume "MUC_Disk-Tape-3779" on device "Disk_Drive000" (/var/lib/bareos/storage_muc/Disk_Drive000), all previous data lost.
28-Mar 09:14 bareos-dir.mgm.domain.tld JobId 418790: Max Volume jobs=1 exceeded. Marking Volume "MUC_Disk-Tape-3779" as Used.
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: autoxflate-sd: MUC_File-Autochanger OUT:[SD->inflate=yes->deflate=no->DEV] IN:[DEV->inflate=yes->deflate=no->SD]
28-Mar 09:14 bareos-sd01.mgm.domain.tld JobId 418789: Forward spacing Volume "OVH_Disk-Tape-3110" to file:block 0:257.
28-Mar 09:14 bareos-sd01.mgm.muc01.fti.int JobId 418790: Spooling data ...
28-Mar 09:15 bareos-dir.mgm.domain.tld JobId 418789: Error: Bareos bareos-dir.mgm.domain.tld 23.0.3~pre47.36e516c0b (19Mar24):
  Build OS:               Ubuntu 20.04.5 LTS
  Prev Backup JobId:      418773
  Prev Backup Job:  fai-salt.mgm.domain.tld_LinuxAll_Job.2024-03-28_07.58.55_32
  New Backup JobId:       418790
  Current JobId:          418789
  Current Job:  Copy_OVH_Full_To_Muc.2024-03-28_09.13.57_01
  Backup Level:           Incremental
  Client:                 fai-salt.mgm.domain.tld
  FileSet:                "LinuxAll"
  Read Pool:              "OVH_Full" (From Job resource)
  Read Storage:           "OVH_File-Autochanger" (From Pool resource)
  Write Pool:             "MUC_Copy_Full" (From Job Pool's NextPool resource)
  Write Storage:          "MUC_File-Autochanger" (From Storage from Pool's NextPool resource)
  Next Pool:              "MUC_Copy_Full" (From Job Pool's NextPool resource)
  Catalog:                "MyCatalog" (From Default catalog)
  Start time:             28-Mar-2024 09:14:00
  End time:               28-Mar-2024 09:15:00
  Elapsed time:           1 min
  Priority:               100
  SD Files Written:       0
  SD Bytes Written:       0 (0 B)
  Rate:                   0.0 KB/s
  Volume name(s):         MUC_Disk-Tape-3779
  Volume Session Id:      73
  Volume Session Time:    1711600421
  Last Volume Bytes:      254 (254 B)
  SD Errors:              0
  SD termination status:  Running
  Bareos binary info:     Bareos community build (UNSUPPORTED): Get professional support from https://www.bareos.com
  Job triggered by:       User
  Termination:            *** Copying Error ***


The jobs are still running:

Source Storage
##############
JobId=418789 Level=Incremental Type=Copy Name=Copy_OVH_Full_To_Muc Status=Running
Reading: Volume="OVH_Disk-Tape-3110"
    pool="OVH_Full" device="Disk_Drive000" (/var/lib/bareos/storage_ovh/Disk_Drive000)
Writing: Volume="OVH_Disk-Tape-3110"
    pool="OVH_Full" device="Disk_Drive000" (/var/lib/bareos/storage_ovh/Disk_Drive000)
    spooling=0 despooling=0 despool_wait=0
    Files=173,960 Bytes=7,699,083,169 AveBytes/sec=4 LastBytes/sec=23,363,584
    FDSocket closed


Destination Storage
##############
JobId=418790 Level=Full Type=Backup Name=fai-salt.mgm.domain.tld_LinuxAll_Job Status=Running
Writing: Volume="MUC_Disk-Tape-3779"
    pool="MUC_Copy_Full" device="Disk_Drive000" (/var/lib/bareos/storage_muc/Disk_Drive000)
    spooling=1 despooling=0 despool_wait=0
    Files=174,615 Bytes=8,043,094,315 AveBytes/sec=17,447,059 LastBytes/sec=16,173,213
    FDSocket closed


--
You received this message because you are subscribed to the Google Groups "bareos-users" group. To unsubscribe from this group and stop receiving emails from it, send an email to bareos-users+unsubscr...@googlegroups.com. To view this discussion on the web visit https://groups.google.com/d/msgid/bareos-users/94600364-080e-4bfa-bd58-cac433f27295n%40googlegroups.com.

--
 Sebastian Sura                  sebastian.s...@bareos.com
 Bareos GmbH & Co. KG            Phone: +49 221 630693-0
 https://www.bareos.com
 Sitz der Gesellschaft: Köln | Amtsgericht Köln: HRA 29646
 Komplementär: Bareos Verwaltungs-GmbH
 Geschäftsführer: Stephan Dühr, Jörg Steffens, Philipp Storz
