If you have any metal, a cron job doing an rsync against EC2 may work well. Hell, you could do that with a cheap laptop running Linux with a large hard drive, as long as it's plugged in and doesn't sleep. Enterprise? No. Works? Certainly.
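Something like this would be the whole setup (untested sketch; the host name, Solr data path, and backup directory are placeholders for your own values):

    # crontab entry on the local machine: pull the Solr data dir down from the EC2 host nightly at 2 AM
    0 2 * * * rsync -a --delete ec2-user@your-solr-host:/var/solr/data/ /backups/solr/

A wrapper script that takes a snapshot first, or pauses indexing during the copy, is safer, since rsyncing a live index can catch it in the middle of a segment merge.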
> On Aug 5, 2022, at 12:31 PM, Thomas Woodard <twood...@eline.com> wrote:
>
> Actually, soft links won't work either, because the snapshots aren't in a
> subdirectory of data, and each one has a different name.
>
> Cron on EC2 is a bit of a pain, but yes, that does seem like the best
> solution available.
>
>> On Fri, Aug 5, 2022 at 11:15 AM Dave <hastings.recurs...@gmail.com> wrote:
>>
>> Can't you just make a cron job that runs an sh file that does a cp -rf on
>> the data folder with a time stamp? The indexes are drop-in when needed.
>>
>>> On Aug 5, 2022, at 12:07 PM, Thomas Woodard <twood...@eline.com> wrote:
>>>
>>> That is exactly what I was afraid of. Not being able to configure where
>>> automated backups go seems like a pretty major oversight, though. Is
>>> anyone aware of a solution other than creating a bunch of soft links?
>>>
>>>> On Fri, Aug 5, 2022 at 8:52 AM Shawn Heisey <apa...@elyograg.org> wrote:
>>>>
>>>>> On 8/5/22 07:42, Shawn Heisey wrote:
>>>>> I've confirmed that it isn't a path security issue, by verifying that
>>>>> all paths are allowed:
>>>>> 2022-08-05 12:29:03.873 INFO (main) [ ] o.a.s.c.CoreContainer Allowing
>>>>> use of paths: [_ALL_]
>>>>
>>>> I missed this part of your email until after I had already sent my other
>>>> reply. Apologies for the oversight.
>>>>
>>>> I think the problem is likely that location must be a URL parameter, not
>>>> configured in solrconfig.xml. The code looks like it supports this
>>>> conclusion.
>>>>
>>>> Thanks,
>>>> Shawn
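For the record, the workaround that falls out of Shawn's reply is to pass location on the backup request itself rather than putting it in solrconfig.xml. A rough example against the replication handler (the core name, port, and backup path are placeholders, and this assumes standalone Solr rather than SolrCloud):

    # trigger a named backup into a directory outside the data dir
    curl "http://localhost:8983/solr/mycore/replication?command=backup&location=/mnt/solr-backups&name=mycore-$(date +%Y%m%d)"

    # check progress/result of the backup
    curl "http://localhost:8983/solr/mycore/replication?command=details"

The location still has to be within Solr's allowed paths, which the [_ALL_] log line quoted above shows is already the case here.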