Hi Martin,

Thank you for your suggestion.  It actually led me to figure out the root cause.

When I run "status dir", I couldn't find jobid 10504.  That was petty odd.  Its 
at  this time that I tried to recall what I had done 2 days before and 
remembered I had restarted the director but didn't restart the storage.  I 
always assumed the storage kills all the current jobs when the director is 
restarted.

Anyway, after recalling that, I decided to restart both the director and the
two storage daemons to see what would happen, and indeed backups have been
running fine since then.
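
For the record, the restarts were nothing exotic; roughly the following,
assuming a systemd setup (unit names may differ on other distros):

    # on the director host
    systemctl restart bacula-dir

    # on each of the two storage hosts
    systemctl restart bacula-sd

Once the storage daemons came back up, the stale reservations on the drive
were gone.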

Regards,
William

________________________________________
From: Martin Simmons <mar...@lispworks.com>
Sent: July 17, 2019 8:49:07 AM
To: William Muriithi
Cc: bacula-users@lists.sourceforge.net
Subject: Re: [Bacula-users] BLOCKED waiting to create a volume on labelled volume

What does "status dir" show about running jobs?  In particular, is jobid 10504
still running?
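
In bconsole that is simply:

    *status dir

and the running jobs appear under the "Running Jobs" section of the output.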

Does the console log show which job is asking for the new volume?

Also, have you run the update slots command since changing tapes?  It seems
strange that both 000302L6 and 000131L6 are shown as being in slot 1.
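
If not, something along these lines in bconsole should resync the catalog with
the autochanger (storage name taken from your status output):

    *update slots storage=SuperLoader3 drive=0

That re-reads the barcodes and corrects the Slot and InChanger columns.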

__Martin



>>>>> On Tue, 16 Jul 2019 15:18:17 +0000, William Muriithi said:
>
> Hello,
>
> I had a working Bacula setup until I recently ran out of tapes.  After that, I
> changed the file and job retention periods, and pruning was able to identify
> some usable tapes.  When a tape is pruned, I have it moved to the Scratch pool.
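>
> For reference, the relevant bits of the Pool resource look roughly like this
> (quoting from memory, so the exact values are approximate):
>
>     Pool {
>       Name = eng_father
>       Pool Type = Backup
>       Recycle = yes
>       RecyclePool = Scratch        # recycled tapes go back to Scratch
>       Volume Retention = 40 days   # 3,456,000 seconds, as in the listing below
>     }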
>
> I thought that was all I needed to do, but the system hasn't managed to run a
> successful backup since then.  It's stuck in a blocked status.  Previously this
> only happened when I had an unlabelled volume, but that isn't the case this
> time.  I can't really make sense of what's happening here.  For example, it
> seems to be attempting to write to a full tape.  Any chance you may have come
> across a similar problem?
>
> This is the status of the storage system:
>
> Daemon started 04-Jun-19 18:02. Jobs: run=1441, running=5.
>  Heap: heap=135,168 smbytes=1,361,115 max_bytes=1,459,564 bufs=305 max_bufs=2,659
>  Sizes: boffset_t=8 size_t=8 int32_t=4 int64_t=8 mode=0,0
>
> Running Jobs:
> Writing: Full Backup job deuterium-sparrow-job JobId=10504 Volume="000152L6"
>     pool="eng_father" device="ULTRIUM-HH7" (/dev/nst0)
>     spooling=0 despooling=0 despool_wait=1
>     Files=5,687,567 Bytes=3,291,363,396,950 AveBytes/sec=138,469 LastBytes/sec=0
>     FDReadSeqNo=97,415,374 in_msg=81503899 out_msg=5 fd=12
> Writing: Full Backup job eng-backup01-niobium-job JobId=10643 Volume="000152L6"
>     pool="eng_father" device="ULTRIUM-HH7" (/dev/nst0)
>     spooling=0 despooling=0 despool_wait=0
>     Files=8,695,889 Bytes=397,583,989,107 AveBytes/sec=13,892 LastBytes/sec=0
>     FDReadSeqNo=70,935,857 in_msg=49043683 out_msg=5 fd=7
> Writing: Differential Backup job eng-backup02-repos-job JobId=10843 Volume=""
>     pool="eng_father" device="ULTRIUM-HH7" (/dev/nst0)
>     spooling=0 despooling=1 despool_wait=0
>     Files=7,571 Bytes=563,263,802,969 AveBytes/sec=21,262 LastBytes/sec=0
>     FDReadSeqNo=8,649,492 in_msg=8630895 out_msg=5 fd=16
> Writing: Incremental Backup job earth-ftp-data-job JobId=10865 Volume="000301L6"
>     pool="eng_father" device="ULTRIUM-HH7" (/dev/nst0)
>     spooling=0 despooling=0 despool_wait=0
>     Files=0 Bytes=0 AveBytes/sec=0 LastBytes/sec=0
>     FDReadSeqNo=6 in_msg=6 out_msg=4 fd=19
> Writing: Incremental Backup job eng-backup01-fisheye-job JobId=10952 Volume="000312L6"
>     pool="eng_father" device="ULTRIUM-HH7" (/dev/nst0)
>     spooling=0 despooling=0 despool_wait=0
>     Files=0 Bytes=0 AveBytes/sec=0 LastBytes/sec=0
>     FDReadSeqNo=6 in_msg=6 out_msg=4 fd=23
> Writing: Differential Backup job eng-backup02-earth-job JobId=11005 Volume="000312L6"
>     pool="eng_father" device="ULTRIUM-HH7" (/dev/nst0)
>     spooling=0 despooling=0 despool_wait=0
>     Files=0 Bytes=0 AveBytes/sec=0 LastBytes/sec=0
>     FDReadSeqNo=6 in_msg=6 out_msg=4 fd=25
> ====
>
> Jobs waiting to reserve a drive:
> ====
>
> Terminated Jobs:
>  JobId  Level    Files      Bytes   Status   Finished        Name
> ===================================================================
>  10835  Incr          0         0   OK       12-Jul-19 23:05 eng-backup02-earth-job
>  10836  Incr          0         0   OK       12-Jul-19 23:05 eng-backup02-hydrogen-job
>  10837  Incr          0         0   OK       12-Jul-19 23:05 eng-backup02-jupiter-job
>  10838  Incr          0         0   OK       12-Jul-19 23:05 eng-backup02-lithium-job
>  10781  Diff         98    3.909 G  OK       12-Jul-19 23:19 earth-ftp-data-job
>  10839  Incr     52,112    7.373 G  OK       12-Jul-19 23:22 eng-backup02-copper-job
>  10840  Incr         10    230.1 M  OK       13-Jul-19 01:31 eng-backup02-tellurium-job
>  10841  Incr        385    247.9 M  OK       13-Jul-19 01:31 eng-backup02-nitrogen-job
>  10842  Incr          0         0   OK       13-Jul-19 01:31 eng-backup02-bacula-job
>  10951  Incr          0         0   Cancel   15-Jul-19 23:05 earth-ftp-data-job
> ====
>
> Device status:
> Autochanger "SuperLoader3" with devices:
>    "ULTRIUM-HH7" (/dev/nst0)
>
> Device "ULTRIUM-HH7" (/dev/nst0) is waiting for:
>     Volume:      000312L6
>     Pool:        eng_father
>     Media type:  LTO-6
>     Device is BLOCKED waiting to create a volume for:
>        Pool:        eng_father
>        Media type:  LTO-6
>     Slot 15 is loaded in drive 0.
>     Total Bytes Read=64,512 Blocks Read=1 Bytes/block=64,512
>     Positioned at File=0 Block=0
> ==
> ====
>
> Used Volume status:
> 000312L6 on device "ULTRIUM-HH7" (/dev/nst0)
>     Reader=0 writers=3 reserves=3 volinuse=0
> ====
>
> Data spooling: 2 active jobs, 53,481,749,471 bytes; 753 total jobs, 3,138,735,672,227 max bytes/job.
> Attr spooling: 3 active jobs, 0 bytes; 1435 total jobs, 14,058,244,089 max bytes.
>
> Pool: eng_father
> +---------+------------+-----------+---------+--------------------+----------+--------------+---------+------+-----------+-----------+---------------------+
> | MediaId | VolumeName | VolStatus | Enabled | VolBytes           | VolFiles | VolRetention | Recycle | Slot | InChanger | MediaType | LastWritten         |
> +---------+------------+-----------+---------+--------------------+----------+--------------+---------+------+-----------+-----------+---------------------+
> |      27 | 000301L6   | Recycle   |       1 |                  1 |        0 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-03 03:01:14 |
> |      28 | 000312L6   | Recycle   |       1 |                  1 |        0 |    3,456,000 |       1 |   15 |         1 | LTO-6     | 2019-04-05 14:29:43 |
> |      34 | 000305L6   | Full      |       1 |  2,517,495,644,160 |    2,518 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-13 21:26:04 |
> |      43 | 000109L6   | Full      |       1 |  5,127,510,076,416 |    5,128 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-15 04:26:37 |
> ........
> |      96 | 000152L6   | Full      |       1 |  6,488,584,768,512 |    6,491 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-07-13 06:11:32 |
> |     115 | 000180L6   | Full      |       1 |  5,498,305,892,352 |    5,499 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-07-07 07:30:29 |
> |     116 | 000161L6   | Full      |       1 |  4,360,401,561,600 |    4,361 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-07-08 07:46:35 |
> +---------+------------+-----------+---------+--------------------+----------+--------------+---------+------+-----------+-----------+---------------------+
>
> Pool: Scratch
> +---------+------------+-----------+---------+-------------------+----------+--------------+---------+------+-----------+-----------+---------------------+
> | MediaId | VolumeName | VolStatus | Enabled | VolBytes          | VolFiles | VolRetention | Recycle | Slot | InChanger | MediaType | LastWritten         |
> +---------+------------+-----------+---------+-------------------+----------+--------------+---------+------+-----------+-----------+---------------------+
> |      29 | 000302L6   | Purged    |       1 | 3,072,768,168,960 |    3,102 |    3,456,000 |       1 |    1 |         1 | LTO-6     | 2019-04-07 01:02:14 |
> |      30 | 000300L6   | Purged    |       1 | 6,732,455,353,344 |    6,783 |    3,456,000 |       1 |    8 |         1 | LTO-6     | 2019-04-09 23:39:28 |
> |      31 | 000308L6   | Purged    |       1 | 5,182,668,352,512 |    5,184 |    3,456,000 |       1 |    9 |         1 | LTO-6     | 2019-04-11 00:42:11 |
> |      32 | 000314L6   | Purged    |       1 | 2,517,496,547,328 |    2,518 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-11 13:16:00 |
> |      33 | 000311L6   | Purged    |       1 | 6,517,582,912,512 |    6,519 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-13 01:09:06 |
> |      36 | 000319L6   | Purged    |       1 | 4,007,779,614,720 |    4,026 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-09 16:23:47 |
> |      37 | 000317L6   | Purged    |       1 | 2,608,719,289,344 |    2,609 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-10 10:07:04 |
> |      39 | 000114L6   | Purged    |       1 | 2,627,848,903,680 |    2,635 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-13 13:57:21 |
> |      40 | 000129L6   | Purged    |       1 | 2,836,434,585,600 |    2,837 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-14 01:00:15 |
> |      41 | 000113L6   | Purged    |       1 | 4,554,468,108,288 |    4,562 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-15 08:51:50 |
> |      42 | 000112L6   | Purged    |       1 | 2,551,475,856,384 |    2,552 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-15 19:50:17 |
> |      46 | 000173L6   | Purged    |       1 | 2,590,008,357,888 |    2,592 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-18 02:45:04 |
> |      48 | 000171L6   | Purged    |       1 | 4,706,576,437,248 |    4,744 |    3,456,000 |       1 |    0 |         0 | LTO-6     | 2019-04-24 12:57:32 |
> |     117 | 000131L6   | Append    |       1 |            64,512 |        0 |   31,536,000 |       1 |    1 |         1 | LTO-6     | 0000-00-00 00:00:00 |
> |     118 | 000135L6   | Append    |       1 |            64,512 |        0 |   31,536,000 |       1 |    3 |         1 | LTO-6     | 0000-00-00 00:00:00 |
> |     119 | 000132L6   | Append    |       1 |            64,512 |        0 |   31,536,000 |       1 |    4 |         1 | LTO-6     | 0000-00-00 00:00:00 |
> |     120 | 000134L6   | Append    |       1 |            64,512 |        0 |   31,536,000 |       1 |    5 |         1 | LTO-6     | 0000-00-00 00:00:00 |
> |     121 | 000133L6   | Append    |       1 |            64,512 |        0 |   31,536,000 |       1 |    8 |         1 | LTO-6     | 0000-00-00 00:00:00 |
> +---------+------------+-----------+---------+-------------------+----------+--------------+---------+------+-----------+-----------+---------------------+
>
> Regards,
> William
>


_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
