Re: [Bacula-users] Move content of a volume to another volume

2017-01-05 Thread Uwe Schuerkamp
On Thu, Jan 05, 2017 at 06:36:10PM +0100, Lukas Hejtmanek wrote:
> On Thu, Jan 05, 2017 at 05:14:38PM +0100, Uwe Schuerkamp wrote:
> > Hm, if you have an error on the tape, how are you going to recover the
> > data off of it? Or are you saying that you have a tape volume in
> > status "Error" within your bacula setup? …

Re: [Bacula-users] BLOCKED waiting for media

2017-01-05 Thread Ian Douglas
On Thursday 05 January 2017 23:25:59 Ian Douglas wrote:
> The messages do not say WHICH server, I presume Director? But it is
> running

I restarted the Director, then reran the job; it picked up a 2-file discrepancy between the database and the tape, which it corrected, and now it seems to be running okay.

Re: [Bacula-users] BLOCKED waiting for media

2017-01-05 Thread Ian Douglas
On Wednesday 04 January 2017 07:15:42 Josh Fisher wrote:
> > Now the job that WAS running (and all ones after it) are Error.
>
> Yes. So you cannot trust the job that was running. I would probably
> restart all daemons and erase that tape and start over.

Thanks. Was hoping for a more elegant way …

Re: [Bacula-users] Move content of a volume to another volume

2017-01-05 Thread Lukas Hejtmanek
On Thu, Jan 05, 2017 at 05:14:38PM +0100, Uwe Schuerkamp wrote:
> Hm, if you have an error on the tape, how are you going to recover the
> data off of it? Or are you saying that you have a tape volume in
> status "Error" within your bacula setup?

The tape has a write error, so I suppose it is still …

Re: [Bacula-users] Move content of a volume to another volume

2017-01-05 Thread Uwe Schuerkamp
On Thu, Jan 05, 2017 at 03:19:00PM +0100, Lukas Hejtmanek wrote:
> Hello,
>
> is there a way in bacula to move all the data from one volume to another
> volume in the same pool? I tried a migrate job, but it seems to be possible
> to migrate only from one pool to another.
>
> I just need to move off the data from one tape to replace the tape because
> of a tape error. …

Re: [Bacula-users] mysqldump not working due to missing password

2017-01-05 Thread Martin Simmons
I think the job will fail if the script returns a non-zero exit status. The problem is that shell backquote doesn't return an exit status, but you could do something like

  if test -z "$databases"; then exit 1; else for db ... done; fi

__Martin

> On Thu, 5 Jan 2017 13:55:41 +0100, Stefan Kiehne said: …
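Spelled out, that check might look like the following pre-backup script. This is only a sketch layered on Stefan's original directive: the credentials-file path, the dump directory, and the grep filter are assumptions, not details from the thread.

  #!/bin/sh
  # Sketch of a ClientRunBeforeJob script: dump every database, and fail
  # the Bacula job when the database list cannot be retrieved.
  # /etc/bacula/mysql-backup.cnf and /var/backups/mysql are assumed paths.
  databases=`mysql --defaults-extra-file=/etc/bacula/mysql-backup.cnf \
      -e "SHOW DATABASES;" | grep -Ev "(Database|information_schema|performance_schema)"`
  if test -z "$databases"; then
      exit 1    # an empty list means the mysql call failed; non-zero exit fails the job
  else
      for db in $databases; do
          mysqldump --defaults-extra-file=/etc/bacula/mysql-backup.cnf "$db" \
              > "/var/backups/mysql/$db.sql" || exit 1
      done
  fi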

[Bacula-users] Move content of a volume to another volume

2017-01-05 Thread Lukas Hejtmanek
Hello,

is there a way in bacula to move all the data from one volume to another volume in the same pool? I tried a migrate job, but it seems to be possible to migrate only from one pool to another.

I just need to move off the data from one tape to replace the tape because of a tape error. But I …
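For reference, Bacula migration really is scoped by pool: the destination is taken from the source pool's Next Pool directive, so source and destination cannot be the same pool. Below is a minimal sketch of a volume-scoped migration job in bacula-dir.conf; every resource and volume name here is made up for illustration.

  Job {
    Name = "migrate-bad-tape"          # illustrative name
    Type = Migrate
    Pool = TapePool                    # source pool; its Next Pool picks the destination
    Selection Type = Volume
    Selection Pattern = "BadTape001"   # hypothetical volume to drain
    Client = backup-fd                 # Job resources still require these,
    FileSet = "Full Set"               # even though migration ignores them
    Messages = Standard
  }

  Pool {
    Name = TapePool
    Pool Type = Backup
    Storage = LTO4
    Next Pool = SparePool              # must be a different pool than the source
  }

Once the jobs have been migrated into SparePool, the faulty tape can be purged and replaced; getting the data back into the original pool would take a second migration in the other direction.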

[Bacula-users] Issue with Run Script interpreter

2017-01-05 Thread c . monty
Hi,

I want this command to be executed:

  su tstadm -c \"/usr/sap/TST/HDB00/exe/hdbsql -t -U BKPOPERATOR 'BACKUP DATA INCREMENTAL USING FILE ('/usr/sap/TST/HDB00/backup/data/backup-incr-TST-bareos-schedule');'\"

When I define a related Run Script

  ClientRunBeforeJob = "su tstadm -c \"/usr/sap/TST/H…
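Nested quoting like this is a common failure point for run scripts. One way to sidestep it entirely (a sketch, not something confirmed by the thread) is to move the command into a small wrapper script, so the directive itself needs only one level of quoting. The quoting of the SQL string below is a guess at what the original command was aiming for.

  #!/bin/sh
  # Hypothetical wrapper, e.g. /usr/local/bin/hana-incr-backup.sh.
  # The statement wraps the file name in single quotes inside a
  # double-quoted SQL string; adjust if hdbsql expects otherwise.
  exec su tstadm -c "/usr/sap/TST/HDB00/exe/hdbsql -t -U BKPOPERATOR \"BACKUP DATA INCREMENTAL USING FILE ('/usr/sap/TST/HDB00/backup/data/backup-incr-TST-bareos-schedule')\""

The job resource then shrinks to:

  ClientRunBeforeJob = "/usr/local/bin/hana-incr-backup.sh"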

Re: [Bacula-users] mysqldump not working due to missing password

2017-01-05 Thread Stefan Kiehne
You're right, I messed up by not passing --defaults-extra-file in the mysql command, so it would fail trying to get the list of databases. I messed around in a VM and came to the same conclusion. I changed the config now; I'll see tomorrow how it went. Thanks for the tips. Still, is there a way …
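For context, --defaults-extra-file lets mysql and mysqldump read credentials from a file instead of putting a password into the job definition. A minimal sketch; the path and user are assumptions:

  # /etc/bacula/mysql-backup.cnf -- hypothetical path; keep it mode 600
  [client]
  user     = root
  password = s3cret

  # --defaults-extra-file must come before any other option:
  mysql --defaults-extra-file=/etc/bacula/mysql-backup.cnf -e "SHOW DATABASES;"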

Re: [Bacula-users] mysqldump not working due to missing password

2017-01-05 Thread Martin Simmons
> On Thu, 5 Jan 2017 08:47:08 +0100, Stefan Kiehne said:
>
> Hi guys,
>
> I have a ClientRunBeforeJob directive that makes a dump of each database
> (using mysqldump) before the backup.
>
> ClientRunBeforeJob = "sh -c 'databases=`mysql -u root -e \"SHOW DATABASES;\"
> | grep -Ev \"(Database…

[Bacula-users] File Attributes Database Issues

2017-01-05 Thread Can Şirin
Hi everyone, I am using Bacula 7.2.0 on RHEL7. The Dir, SD and FD daemons run on the same server. My backup size is almost 20 TB with millions of small files, and I am writing the data to LTO4 tapes. I run 8 jobs concurrently on 8 different LTO4 drives. While the jobs are running, I had no problem …
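With millions of small files per job, the pressure point is usually the stream of file-attribute inserts into the catalog. Bacula's Spool Attributes directive batches those inserts to the end of the job instead of sending them while the tapes are writing; a sketch of where it sits, with illustrative resource names:

  Job {
    Name = "big-smallfile-backup"   # illustrative
    Type = Backup
    Client = backup-fd
    FileSet = "SmallFiles"
    Storage = LTO4
    Pool = TapePool
    Spool Attributes = yes          # batch attribute inserts at job end rather
                                    # than streaming them to the catalog mid-job
    Messages = Standard
  }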