It looks to me like S3 support is missing from your cloud driver plugin. If
the plugin was built without the S3 driver, the SD can end up calling through
a null driver pointer, which would match the crash in
cloud_dev::get_cloud_volumes_list in your traceback.

What is the PluginDirectory in your bacula-sd.conf?

Find the bacula-sd-cloud-driver-9.6.3.so in that directory and post the output
of:

objdump -t /...path.../...to.../bacula-sd-cloud-driver-9.6.3.so | grep _driver
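For example, assuming a default install under /opt/bacula (adjust the paths
to match your system):

grep -i 'Plugin *Directory' /opt/bacula/etc/bacula-sd.conf
objdump -t /opt/bacula/plugins/bacula-sd-cloud-driver-9.6.3.so | grep _driver

If the plugin was built with S3 support, the grep should show an S3-related
_driver symbol; if nothing S3-related appears, that would confirm the driver
is missing.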

__Martin


>>>>> On Thu, 14 May 2020 06:24:53 +0000, Rick Tuk said:
> 
> LS,
> 
> I read Martin Simmons' reply to Phillip Dale’s message.
> The traceback I found did not have much information in it, so I installed gdb 
> and changed the btraceback script to run gdb as root.
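> For reference, the change amounts to running the gdb line in the btraceback
> script under sudo, roughly like this (the script path is from my install and
> may differ):
> 
>   sudo gdb -quiet -batch -x /opt/bacula/scripts/btraceback.gdb $1 $2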
> 
> Opening bconsole and trying to list all volumes in the cloud triggers the 
> same SD crash each time.
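> In bconsole that was a command along the lines of (the storage name here is 
> just an example from my config):
> 
>   cloud list storage=Inc
> 
> The resulting traceback: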
> 
> [New LWP 5942]
> [New LWP 6084]
> [Thread debugging using libthread_db enabled]
> Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
> 0x00007ff1e39ca03f in select () from /lib/x86_64-linux-gnu/libc.so.6
> $1 = "14-May-2020 08:18:11\000\000\000\000\000\000\000\000\000"
> $2 = 0x56298dfb0ee0 <my_name> "soteria.svc.mostwanted.io-sd"
> $3 = 0x56298ee4d0e8 "bacula-sd"
> $4 = 0x56298ee4d128 "/opt/bacula/bin/bacula-sd"
> $5 = 0x0
> $6 = '\000' <repeats 49 times>
> $7 = 0x7ff1e42ad55b "9.6.3 (09 March 2020)"
> $8 = 0x7ff1e42ad53a "x86_64-pc-linux-gnu"
> $9 = 0x7ff1e42ad533 "ubuntu"
> $10 = 0x7ff1e42ad555 "18.04"
> $11 = "soteria", '\000' <repeats 42 times>
> $12 = 0x7ff1e42ad54e "ubuntu 18.04"
> Environment variable "TestName" not defined.
> #0  0x00007ff1e39ca03f in select () from /lib/x86_64-linux-gnu/libc.so.6
> #1  0x00007ff1e4269618 in bnet_thread_server (addrs=<optimized out>, 
> max_clients=41, client_wq=0x56298dfb1020 <dird_workq>, 
> handle_client_request=0x56298dd99ee0 <handle_connection_request(void*)>) at 
> bnet_server.c:166
> #2  0x000056298dd9126a in main (argc=<optimized out>, argv=<optimized out>) 
> at stored.c:326
> 
> Thread 3 (Thread 0x7ff1e1e6f700 (LWP 6084)):
> #0  0x00007ff1e403f23a in waitpid () from 
> /lib/x86_64-linux-gnu/libpthread.so.0
> #1  0x00007ff1e429453e in signal_handler (sig=11) at signal.c:233
> #2  <signal handler called>
> #3  0x00007ff1e0c64787 in cloud_dev::get_cloud_volumes_list (this=<optimized 
> out>, dcr=0x7ff1dc00a138, volumes=0x7ff1e1e6ec50, err=@0x7ff1e1e6ec48: 
> 0x7ff1dc001330 "") at cloud_dev.h:110
> #4  0x000056298dd95829 in cloud_list_cmd (jcr=<optimized out>) at dircmd.c:815
> #5  0x000056298dd9a394 in handle_connection_request (arg=0x56298ee57428) at 
> dircmd.c:242
> #6  0x00007ff1e429f518 in workq_server (arg=0x56298dfb1020 <dird_workq>) at 
> workq.c:372
> #7  0x00007ff1e40346db in start_thread () from 
> /lib/x86_64-linux-gnu/libpthread.so.0
> #8  0x00007ff1e39d488f in clone () from /lib/x86_64-linux-gnu/libc.so.6
> 
> Thread 2 (Thread 0x7ff1e166e700 (LWP 5942)):
> #0  0x00007ff1e403af85 in pthread_cond_timedwait@@GLIBC_2.3.2 () from 
> /lib/x86_64-linux-gnu/libpthread.so.0
> #1  0x00007ff1e429eb56 in watchdog_thread (arg=<optimized out>) at 
> watchdog.c:299
> #2  0x00007ff1e40346db in start_thread () from 
> /lib/x86_64-linux-gnu/libpthread.so.0
> #3  0x00007ff1e39d488f in clone () from /lib/x86_64-linux-gnu/libc.so.6
> 
> Thread 1 (Thread 0x7ff1e4b59300 (LWP 5938)):
> #0  0x00007ff1e39ca03f in select () from /lib/x86_64-linux-gnu/libc.so.6
> #1  0x00007ff1e4269618 in bnet_thread_server (addrs=<optimized out>, 
> max_clients=41, client_wq=0x56298dfb1020 <dird_workq>, 
> handle_client_request=0x56298dd99ee0 <handle_connection_request(void*)>) at 
> bnet_server.c:166
> #2  0x000056298dd9126a in main (argc=<optimized out>, argv=<optimized out>) 
> at stored.c:326
> #0  0x00007ff1e39ca03f in select () from /lib/x86_64-linux-gnu/libc.so.6
> No symbol table info available.
> #1  0x00007ff1e4269618 in bnet_thread_server (addrs=<optimized out>, 
> max_clients=41, client_wq=0x56298dfb1020 <dird_workq>, 
> handle_client_request=0x56298dd99ee0 <handle_connection_request(void*)>) at 
> bnet_server.c:166
> 166   bnet_server.c: No such file or directory.
> maxfd = 7
> sockset = {fds_bits = {128, 0 <repeats 15 times>}}
> newsockfd = <optimized out>
> stat = <optimized out>
> clilen = 16
> clientaddr = {ss_family = 2, __ss_padding = 
> "\262\354\n`\bd\000\000\000\000\000\000\000\000\000\352٠\375\177\000\000ݲ\233\343\361\177\000\000\060\346٠\375\177\000\000\036\260\224\344\361\177\000\000\220\245\265\344\361\177\000\000\060\346٠\375\177\000\000\240\361\377\177\003\000\000\000\307c\224\344\361\177\000\000`\346٠\375\177\000\000\030\352٠\375\177\000\000X\232W\342\361\177\000\000X\232!\000\000\000\000\000\000\200W\342\361\177\000",
>  __ss_align = 1}
> tlog = <optimized out>
> turnon = 1
> request = {fd = 8, user = '\000' <repeats 127 times>, daemon = 
> "soteria.svc.mostwanted.io-sd", '\000' <repeats 99 times>, pid = 
> "5938\000\000\000\000\000", client = {{name = '\000' <repeats 127 times>, 
> addr = '\000' <repeats 127 times>, sin = 0x7ff1e38b27e0, unit = 0x0, request 
> = 0x7ffda0d9e5a0}}, server = {{name = '\000' <repeats 127 times>, addr = 
> '\000' <repeats 127 times>, sin = 0x7ff1e38b2760, unit = 0x0, request = 
> 0x7ffda0d9e5a0}}, sink = 0x0, hostname = 0x7ff1e36aeb30 <sock_hostname>, 
> hostaddr = 0x7ff1e36aeae0 <sock_hostaddr>, cleanup = 0x0, config = 0x0}
> addr = <optimized out>
> fd_ptr = 0x0
> buf = "10.96.8.100", '\000' <repeats 116 times>
> sockfds = {<SMARTALLOC> = {<No data fields>}, head = 0x7ffda0d9e3d0, tail = 
> 0x7ffda0d9e3d0, loffset = 0, num_items = 1}
> allbuf = 
> "hM\266\344\361\177\000\000\210\227\224\344\361\177\000\000\020J\266\344\361\177\000\000\020J\266\344\361\177\000\000\000\000\000\000\000\000\000\000\320L%\344\361\177\000\000\033p\351\003\000\000\000\000$iL\344\361\177\000\000\300EL\344\361\177\000\000X\311$\344\361\177\000\000\020J\266\344\361\177\000\000\000\000\000\000\361\177\000\000p\353٠\375\177\000\000\003\000\000\000\361\177\000\000`\353٠\375\177\000\000\000\000\000\000\375\177\000\000ج\265\344\361\177\000\000\000\000\000\000\000\000\000\000\020\000\000\000\000\000\000\000\001\000\000\000\223T`\275ج\265\344\361\177\000\000\301\006\\\372\000\000\000\000\330\301\344\216)V\000\000\326}'\344\361\177\000\000\330\301\344\216)V\000\000\300\313*\344"...
> #2  0x000056298dd9126a in main (argc=<optimized out>, argv=<optimized out>) 
> at stored.c:326
> 326   stored.c: No such file or directory.
> ch = <optimized out>
> no_signals = <optimized out>
> thid = 140676853856000
> uid = 0x0
> gid = 0x0
> #0  0x0000000000000000 in ?? ()
> No symbol table info available.
> #0  0x0000000000000000 in ?? ()
> No symbol table info available.
> #0  0x0000000000000000 in ?? ()
> No symbol table info available.
> #0  0x0000000000000000 in ?? ()
> No symbol table info available.
> #0  0x0000000000000000 in ?? ()
> No symbol table info available.
> Attempt to dump current JCRs. njcrs=1
> threadid=0x7ff1e1e6f700 JobId=0 JobStatus=C jcr=0x7ff1dc0008f8 name=*System*
>       use_count=1 killable=1
>       JobType=I JobLevel=
>       sched_time=14-May-2020 08:18 start_time=01-Jan-1970 01:00
>       end_time=01-Jan-1970 01:00 wait_time=01-Jan-1970 01:00
>       db=(nil) db_batch=(nil) batch_started=0
> dcr=*None*
> List plugins. Hook count=0
> 
> Met vriendelijke groet / With kind regards,
> Rick Tuk 
> 
> > On May 7, 2020, at 1:32 PM, Rick Tuk <r...@mostwanted.io> wrote:
> > 
> > LS,
> > 
> > I am trying to get Bacula 9.6.3 up and running on Ubuntu 18.04, using the 
> > bacula-cloud-storage package to store the backups on a Ceph cluster via its 
> > S3 interface.
> > All services are running, but when I manually run a backup job (in this 
> > case a backup of the same host), the job fails with: Fatal error: job.c:3011 
> > Comm error with SD. bad response to Append Data. ERR=No data available
> > When this happens, the SD daemon crashes with: Bacula 
> > interrupted by signal 11: Segmentation violation
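> > If more detail would help, I can raise the SD debug level before the next 
> > run with something like this in bconsole:
> > 
> >   setdebug level=200 storage=Full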
> > 
> > Configs related to this setup that might be relevant:
> > 
> > bacula-dir:
> > 
> > Storage {
> >    Name = Full
> >    Address = soteria.local.domain
> >    SD Port = 9103
> >    Password = "removed-for-security"
> >    Device = Full
> >    Media Type = CloudType
> > }
> > 
> > Storage {
> >    Name = Diff
> >    Address = soteria.local.domain
> >    SD Port = 9103
> >    Password = "removed-for-security"
> >    Device = Diff
> >    Media Type = CloudType
> > }
> > 
> > Storage {
> >    Name = Inc
> >    Address = soteria.local.domain
> >    SD Port = 9103
> >    Password = "removed-for-security"
> >    Device = Diff
> >    Media Type = CloudType
> > }
> > 
> > Pool {
> >    Name = Daily
> >    Pool Type = Backup
> >    Recycle = yes
> >    AutoPrune = yes
> >    Storage = Inc
> >    File Retention = 1 months
> >    Job Retention = 1 months
> >    Volume Retention = 1 months
> >    Maximum Volume Bytes = 10G
> >    Label Format = daily-
> > }
> > 
> > bacula-sd:
> > 
> > Cloud {
> >    Name = Ceph-S3
> >    Driver = "S3"
> >    HostName = "s3.local.domain"
> >    BucketName = "bacula"
> >    AccessKey = "removed-for-security"
> >    SecretKey = "removed-for-security"
> >    Protocol = HTTPS
> >    UriStyle = Path
> >    Truncate Cache = No
> >    Upload = EachPart
> > }
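> > (For completeness: the endpoint and bucket can also be sanity-checked 
> > outside Bacula with a generic S3 client, e.g.
> >   aws s3 ls s3://bacula --endpoint-url https://s3.local.domain
> > if there is any doubt about the S3 side itself.)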
> > 
> > Device {
> >    Name = Full
> >    Cloud = Ceph-S3
> >    Archive Device = /bacula/backup/full
> >    Device Type = Cloud
> >    Media Type = CloudType
> >    Maximum Part Size = 10 MB
> >    Label Media = yes
> >    Random Access = yes
> >    Automatic Mount = yes
> >    Removable Media = no
> >    Always Open = no
> > }
> > 
> > Device {
> >    Name = Diff
> >    Cloud = Ceph-S3
> >    Archive Device = /bacula/backup/diff
> >    Device Type = Cloud
> >    Media Type = CloudType
> >    Maximum Part Size = 10 MB
> >    Label Media = yes
> >    Random Access = yes
> >    Automatic Mount = yes
> >    Removable Media = no
> >    Always Open = no
> > }
> > 
> > Device {
> >    Name = Inc
> >    Cloud = Ceph-S3
> >    Archive Device = /bacula/backup/inc
> >    Device Type = Cloud
> >    Media Type = CloudType
> >    Maximum Part Size = 10 MB
> >    Label Media = yes
> >    Random Access = yes
> >    Automatic Mount = yes
> >    Removable Media = no
> >    Always Open = no
> > }
> > 
> > If any additional information is required, please let me know; I’m really 
> > hoping to get this working soon.
> > 
> > Met vriendelijke groet / With kind regards,
> > 
> > Rick
> 
> 


_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
