> From: "James Harper" <james.har...@bendigoit.com.au>
>> On 14. apr. 2011, at 13.57, James Harper wrote:
>>
>>> It's not. MSSQL is "MicroSoft SQL", so guess which platforms it runs
>>> on :)
>>>
>>> I'm not sure if a Windows equivalent of 'touch' could work while the
>>> database files are open, and even if it did, I'd be reluctant to tinker
>>> with it.
>>
>> I have a large, dedicated MS SQL server with a couple dozen databases,
>> ranging from a few MB to several GB. I dump them with ClientRunBeforeJob
>> scripts and back up the dumps. It's really no different from dumping
>> MySQL databases on the fly. I also have management tasks that dump
>> transaction logs at regular intervals throughout the day, and I back
>> those up at night.
>>
>> Check out the SQLCMD.EXE program installed with SQL Server. If you want,
>> I could post some example scripts?
>>
>> Or did I misunderstand something?
>
> I don't have space on this particular server to dump out 40 GB of database
> (the database itself is about 80 MB, but there are lots of FILESTREAM
> images that would be included in the backup stream).
>
> I have found that all I need to do is run a 'CHECKPOINT' to ensure all
> data is written out to the database. Can you post an example Job with a
> ClientRunBeforeJob? I just get errors every time I try it.
>
> Thanks
>
> James

If you run the SQL Server services under a domain account, you should be able
to script a backup to a Windows share and then back that up elsewhere.

That said, if you're running a database server that close to the limit (less
than 40 GB free, really?), you're asking for trouble, or at least making a lot
more work for yourself. Or did you really mean that you don't have the time or
the I/O headroom to dump a file that big? In that case you can back up the
database itself only occasionally, but back up the transaction logs
frequently.

My system has < 100 GB of databases, but I keep 800 GB of database backup
files on a separate drive inside the same server.
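Since James asked for a concrete example: here is a rough sketch of what a
Job with a ClientRunBeforeJob might look like. The resource names, instance
name, and paths are all made up for illustration; adjust them to your setup.
A common cause of the errors James mentions is quoting of the Windows path in
the director config, so the script path below is wrapped in escaped quotes:

```
# bacula-dir.conf -- hypothetical Job resource; names/paths are examples.
# Forward slashes work in Windows paths here and avoid escaping headaches.
Job {
  Name = "mssql-backup"
  JobDefs = "DefaultJob"
  Client = "sqlserver-fd"
  FileSet = "MSSQL-Files"
  # Runs on the client before the backup starts; note the escaped quotes
  # around the full path, which matter if the path ever contains spaces.
  ClientRunBeforeJob = "\"C:/scripts/mssql-before.bat\""
}
```

And a minimal batch script it could call, using sqlcmd with Windows
authentication (again, server and database names are assumptions):

```
@echo off
rem C:\scripts\mssql-before.bat -- hypothetical pre-backup script.
rem -E = trusted (Windows) auth, -S = server\instance, -Q = run query and exit.

rem Flush dirty pages to the database files, per James's approach:
sqlcmd -E -S localhost -Q "CHECKPOINT"

rem Or, if you do have room for a dump that Bacula then picks up:
rem sqlcmd -E -S localhost -Q "BACKUP DATABASE [MyDb] TO DISK='D:\dumps\MyDb.bak' WITH INIT"

rem A non-zero exit status makes Bacula abort the job.
exit /b %ERRORLEVEL%
```

Note that backing up the open .mdf/.ldf files after a CHECKPOINT is exactly
the kind of approach I warn against below; the dump-to-file variant is the
safer one.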
With SATA drives being so cheap, it makes sense to keep your SQL Server
backup files on the server for as long as possible, because restores from
tape are a lot more work.

If you're running SQL 2005 or later, you can set up a database maintenance
plan that a) backs up all "user" databases automatically, so you won't forget
to add the backup script when you add a database, and b) expires the backup
files automatically based on whatever number of days you want.

I generally have three maintenance plans: one that does the system database
backups with a short retention period, one that does full backups of the user
databases, and one that does transaction log backups of the user databases.
The backup files can even be compressed in SQL 2008. Bacula backs up those
database backup files quite well.

If you try to back up closed systems like Microsoft products in ways they
specifically warn against, don't expect it to go very well. Even if it works
today, it might break tomorrow after a SQL Server or Windows update is
installed, and you probably won't know it's not working until you have to do
a restore from the backup that didn't get your data like you thought it did.

Bob

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users