On Mon, Mar 20, 2006 at 10:34:06AM +0100, Marco Fretz wrote:
> hello
>
> i need some feedback on the following situation:
>
> i want to write a backup script in bash. a script that reads a
> file / database that contains the backup jobs (remote server, remote
> user, remote dir, ...).
>
> i think that file would contain about 10-30 rows. additionally i want
> to do some logging into a file or database, and i need some files / rows
> to define the excludes for each remote dir.
>
> now my question: should i use sqlite or a textfile and awk for this?
Whatever you feel most comfortable with [1], but databases are not necessarily easy to bring back up in the face of a disaster, and your dataset is so small that they won't give you a meaningful performance benefit.

Another pointer: it's not too hard to do without bash-specific features, and portable scripts are much more useful. Try to stay sh-compatible unless there's a very good reason to use shell-specific features.

Joachim

[1] As in, awk in particular can be abused as a general-purpose programming language, but it will be rather painful.
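For illustration, here is a minimal sh-compatible sketch of that approach: a plain whitespace-separated jobs file, one rsync run per job, and logging to a text file. The file names (jobs.txt, backup.log, /srv/backup), the field layout, and the rsync options are assumptions made for the example, not something from the thread.

    #!/bin/sh
    # Minimal job-driven backup: one job per line in $JOBS,
    # whitespace-separated fields: <host> <user> <remote_dir> <exclude_file>.
    # All names here (jobs.txt, backup.log, /srv/backup) are illustrative only.

    JOBS=${JOBS:-jobs.txt}
    LOG=${LOG:-backup.log}
    DEST=${DEST:-/srv/backup}

    log() {
        # timestamped plain-text logging
        printf '%s %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$*" >> "$LOG"
    }

    # skip comments and blank lines, then run one rsync per job
    grep -v '^[[:space:]]*#' "$JOBS" | while read -r host user dir excludes; do
        [ -z "$host" ] && continue
        mkdir -p "$DEST/$host"
        log "starting $user@$host:$dir"
        if rsync -a --delete --exclude-from="$excludes" \
                "$user@$host:$dir/" "$DEST/$host/" >>"$LOG" 2>&1
        then
            log "finished $user@$host:$dir"
        else
            log "FAILED $user@$host:$dir"
        fi
    done

A matching jobs file would have one job per line, e.g. "host1 backup /etc /etc/backup/excludes-host1", with lines starting with # treated as comments.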