As a footnote: I have fixed the problem by converting all the non-UTF8
filenames to UTF8. I discovered a neat little Linux program called
"convmv" which does this automatically.
Steve
On 5 Jan 2008, at 16:03, Dan Langille wrote:
>
> I have confirmed a bug: the job silently fails without reporting the
> following error, which is logged in /var/log/messages:
>
> ERROR: invalid byte sequence for encoding "UTF8": 0x9f
> HINT: This error can also happen if the byte sequence does not match the
> encoding expected by the server, which is controlled by "client_encoding".
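Since the HINT points at client_encoding, a quick sanity check along these lines (the DSN, database name and credentials below are placeholders, and it needs DBD::Pg) shows what the server and the client are actually set to:

use strict;
use warnings;
use DBI;

# Placeholder connection details: use the catalog settings from bacula-dir.conf.
my $dbh = DBI->connect('dbi:Pg:dbname=bacula', 'bacula', '',
                       { RaiseError => 1 });

my ($server) = $dbh->selectrow_array(q{SELECT current_setting('server_encoding')});
my ($client) = $dbh->selectrow_array(q{SELECT current_setting('client_encoding')});
print "server_encoding=$server, client_encoding=$client\n";

$dbh->disconnect;

The short script below just recreates a couple of filenames like the offending ones: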
ch( "2004-05-05\ Erg\212nzung\ 0001.jpeg" );
touch( "2004-05-05\ Erg\212nzung\ 0002.jpeg" );
sub touch {
my $filename = shift;
open FILE, ">$filename";
close FILE;
}
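To find out in advance which names a job is going to trip over, something like this (again with a placeholder path) reports every filename under a directory that is not valid UTF-8:

use strict;
use warnings;
use Encode ();
use File::Find;

# Placeholder path: use the real backup root here.
my $root = '/path/to/share';

find(sub {
    my $copy = $_;   # copy the basename: decode() with a CHECK may modify its argument
    eval { Encode::decode('UTF-8', $copy, Encode::FB_CROAK); 1 }
        or print "not valid UTF-8: $File::Find::name\n";
}, $root);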
Regards
Steve
Hi Eric
Thanks for the tip. The files that are causing me grief are old files
which Bacula/PostgreSQL used to handle OK. My client is UTF8 these
days, but these files are remnants which were originally created as
MacRoman. I must confess I can't remember definitively what encoding I
was using.
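If it helps to narrow that down, a rough check like this (just a guess at the candidate encodings, using Perl's Encode) shows what the 0x9f byte from the error message would have meant under each of them:

use strict;
use warnings;
use Encode qw(decode);

# Map the 0x9f byte from the Postgres error to a code point under a few
# candidate legacy encodings.
for my $enc ('MacRoman', 'cp1252', 'iso-8859-1') {
    printf "%-12s 0x9f => U+%04X\n", $enc, ord(decode($enc, "\x9f"));
}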
I have been using Bacula for over two years quite happily on an old
Red Hat 9 server. The last version of Bacula that I used was a hand-
compiled 2.0.0 with PostgreSQL 7.3.9.
This server is the data storage for my Mac OS X and Windows clients,
which it serves with Netatalk and Samba. So any