Hi Michael,
That's really interesting. I assume that you're using ICU indexing?
You could update "phrases-icu.xml" and "words-icu.xml" to strip out hyphens.
You would need to re-index all your records afterwards though.
I haven't actually tested that particular change, but just taking a little
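For anyone wanting to try this, here is the kind of change I mean, sketched only and untested against a real index. I believe the yaz-icu chain format supports a transliterate element using ICU transliteration rule syntax; the surrounding chain elements below are placeholders for whatever your words-icu.xml / phrases-icu.xml already contain:

```xml
<!-- Sketch only: an extra step that removes hyphens before
     tokenization, so "Sinti-Swing" indexes as "SintiSwing".
     Keep your existing tokenize/transform/casemap steps;
     remember to re-index afterwards. -->
<icu_chain locale="en">
  <transliterate rule="'-' > '' ;"/>
  <!-- ... your existing chain steps here ... -->
</icu_chain>
```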
Hi Michael,
I'm glad that I got you a bit further on your journey. It's a shame about
having to use the CHR indexing. You can find more information here at
https://software.indexdata.com/zebra/doc/character-map-files.html.
After reading through that, I'm thinking perhaps that CHR indexing can
Hi Michael,
Your experience suggests to me that you're using Zebra 2.0.59 (which is the
package available in the official Debian repositories). There is a bug in that
version where hyphens trigger an incorrect truncation, so that
"Sinti-Swing" becomes "Sinti". If you use the Indexdata De
Hi Katrin,
I actually did create a transliterate rule which was able to convert the input
from "Sinti-Swing" to "SintiSwing Sinti Swing" which created "sintiswing",
"sinti", and "swing" as tokens. However, I think the ICU chain file gets used
for both indexing and searching, so while it would
Hi Alvaro,
I'm sure the folk at Theke would be able to provide a solid recommendation
(https://theke.io/).
I mostly support English libraries, and they use CHR indexing (i.e. charmap
word-phrase-utf.chr). However, the libraries I support with French and
Arabic content use ICU (i.e. icuchain words-icu.xml).
Hi all,
Fred King's email about using Koha in a Protected Health network reminded me
of a question I wanted to pose to the wider world. Has anyone commissioned an
external independent security audit of Koha? That is to say, a security
audit from someone other than a Koha support vendor?
If s
Hi Alvaro,
I think that you misunderstand the netstat output there.
"tcp6 0 0 :::80 :::* LISTEN 1113/apache2" means that
it's using a TCP socket that supports both IPv4 and IPv6.
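To illustrate with the quoted line itself (a pure-shell sketch): the fourth whitespace-separated field of that netstat row is the local address, and `:::80` is the IPv6 wildcard address `::` on port 80. On Linux, with the default `net.ipv6.bindv6only=0`, such a socket also accepts IPv4 clients as IPv4-mapped addresses (`::ffff:a.b.c.d`):

```shell
# Parse the quoted netstat row: field 4 is the local address.
line="tcp6 0 0 :::80 :::* LISTEN 1113/apache2"
addr=$(echo "$line" | awk '{print $4}')
echo "$addr"    # :::80 = IPv6 wildcard (::) on port 80

# On the real host you could confirm dual-stack behaviour with:
#   sysctl net.ipv6.bindv6only   (0 = IPv4-mapped connections accepted)
#   curl -4 http://localhost/    (force an IPv4 connection)
```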
Heinz:
1. Remove the entries from /etc/hosts
2. Do "nslookup something.dnshome.de"
3. Con
Hi George,
I notice that you didn't answer Katrin's questions. Answering those questions
is important; without that information, we can't help you effectively.
Now, it sounds to me like you've installed Koha from either a "git clone" or a
tarball, and that you're running multiple Koha instances from mult
Hi George,
I’m glad that my email helped!
It does feel strange, but that’s the way SysV scripts work. Think about it like
a shebang at the top of a script. While it starts with a #, it is still parsed
by computers as more than just a comment.
David Cook
Systems Librarian
Prosentient Systems
Recently, I've been re-thinking the scalability of Koha, so it's interesting to
hear about this from you, Mengu, as the person who I think runs the biggest
Koha installation in the world.
Originally, I thought putting images in the database was a bad idea, but it is
an easy way to provide that
Hi Michal,
I would say that the plugin system (like many plugin systems) is risky. As
Jonathan indicates, plugins are not reviewed by the Koha Community, so we can
make no guarantees regarding safety/security of individual plugins. Since the
plugins are third-party code, they could contain anyt
You would need to clarify what you mean by "docs management system".
If you're referring to a system as described by
https://en.wikipedia.org/wiki/Document_management_system, then I would say no.
Koha isn't a records management system nor is it a digital library.
Koha is an integrated library
Thanks, Mason!
I think Mason has now released a newer version, 3.23+dfsg-1+koha1, which was
needed for newer Ubuntu releases that ship a libjson-validator-perl newer
than 3.06.
So the fix now should just be:
apt-get update
apt-get install libjson-validator-perl
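If it helps, you can sanity-check which version APT will pick before upgrading (apt-cache policy), and Debian-style version strings compare the way you'd hope. A quick check with GNU sort -V, as a generic sketch rather than anything Koha-specific:

```shell
# Before upgrading, see what APT would install:
#   apt-cache policy libjson-validator-perl
# Debian-style versions sort as expected; the +koha1 build of 3.23
# is newer than the stock 3.06:
newest=$(printf '%s\n' "3.06" "3.23+dfsg-1+koha1" | sort -V | tail -n 1)
echo "$newest"    # 3.23+dfsg-1+koha1
```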
David Cook
System
The issue is that the coverflow plugin doesn't have a "tool" or "report"
method, so it won't appear in the Tools or Report modules by default.
You'll notice that /cgi-bin/koha/tools/tools-home.pl takes you to
/cgi-bin/koha/plugins/plugins-home.pl?method=tool, and
/cgi-bin/koha/reports/reports-
Hi Lari,
It's a good question. I have the Chromium-based Edge, and it's what I use for
testing. I've been referring clients away from IE 11 and to Edge (where they
can't use Chrome/Firefox), and I haven't heard any complaints about Edge yet.
David Cook
Systems Librarian
Prosentient Systems
72/3
Hi Vinod,
I haven't played with the HTML5 Media in Koha, but I've been curious about it,
so I decided to take a look after reading your email.
Looking at the code, it appears that you can embed other videos and not just
Youtube videos. You'll need to use the system preferences HTML5MediaEnabled
Hi Vinod,
Koha, at least as of 19.11, has the capacity to handle non-YouTube media. I
won’t give you remote support here, but perhaps someone else can or perhaps you
can engage a support provider.
David Cook
Systems Librarian
Prosentient Systems
72/330 Wattle St
Ultimo, NSW 2007
Aus
I have a fair number of libraries using EDS (in Australia), and EBSCO doesn't
use Z39.50 for any of them.
EBSCO harvests the records using OAI-PMH and then they use the ILS-DI API for
the RTA (Real Time Availability). I think Wolters Kluwer's Ovid does the same
thing.
Joel, if "find @attr 1=1
I am concerned about adding backend scripts to the wiki. There's no reliable
way to ensure those scripts would be correct, and it would be trivial for
someone to inject malicious code into the scripts and have unsuspecting users
run things which could damage/compromise their backend systems by c
Sounds great, Martin!
David Cook
Software Engineer
Prosentient Systems
72/330 Wattle St
Ultimo, NSW 2007
Australia
Office: 02 9212 0899
Online: 02 8005 0595
From: Koha-devel On Behalf Of
Renvoize, Martin
Sent: Tuesday, 1 December 2020 10:14 PM
To: Koha Devel ; Koha
Subject: [K
Hi Peter,
Without rewriting the Koha code, it's impossible to authority control two
subfields in a field. There are a few reasons why.
First, there is only one "9" subfield available for the linkage to the
authority record.
Second, Koha is hard-coded to authority control only the "a" subfield. I
Hi Tasha,
It's great to hear that you're trying to install Koha! However, if I were you,
I wouldn't bother with that old wiki page, as it's probably going to confuse
more than help.
Typically, the Koha community supports Koha installations on Debian and Ubuntu
operating systems, but there are
Hi Barry,
I think that it is reasonable to expect some minimum requirements to be listed,
but it is easier said than done, since Koha is a large multi-component system.
Personally, I tend to run many Koha instances on high powered servers with
external database servers, so it's rare for me to b
Hi Barry,
I think that all sounds reasonable. If you haven’t done so already, would you
consider requesting a wiki account via
https://wiki.koha-community.org/wiki/Special:RequestAccount, so that you can
add those notes as you’ve suggested? It should be low effort (in comparison to
all the
Hi Tom,
I can't really speak to Elasticsearch with Koha, as I still just use Zebra. I
imagine Ere in Finland or someone from Bywater in the USA might be the best to
comment on Elasticsearch.
However, I'm curious what OS you are using. Debian/Ubuntu is typically the best
option in terms of comm
That KohaRest driver looks interesting!
I haven't dug into it too deeply, but how do you handle the user/patron
authentication? Do you have the user log into VuFind and at that point you use
the backend confidential client user to validate it via your plugin API
endpoint? Have you thought about
Hi Charles,
I'm not 100% sure what you're asking here. Are you asking to find all records
where there is a 245 title that isn't Romanized?
You could try something like this:
SELECT *
FROM biblio
WHERE title <> CONVERT(title USING latin1);
I've tried that out on one of my multilingual librar
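A rough command-line analogue of that SQL, if you want to test the idea on strings outside the database: a UTF-8 string that fails to round-trip through Latin-1 contains characters outside the Romanized range, and iconv exits non-zero on it. The sample strings below are made up for illustration:

```shell
# Same idea as the SQL CONVERT() trick, done with iconv:
# a string that converts cleanly to Latin-1 is (roughly) Romanized.
romanized="The Cat in the Hat"
other="Кошка"   # made-up Cyrillic example
printf '%s' "$romanized" | iconv -f UTF-8 -t LATIN1 >/dev/null 2>&1 \
  && echo "latin1-safe"
printf '%s' "$other" | iconv -f UTF-8 -t LATIN1 >/dev/null 2>&1 \
  || echo "not latin1"
```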
Hi Michal,
At the moment, Koha only provides MARC bibliographic records via OAI-PMH, but I
am intrigued by the prospect of providing authority records.
I think we'd probably have to provide a new endpoint for that though, as I
don't think we could make the current endpoint backwards compatible
Hi Charles,
Koha uses the ever-popular "do-nothing" approach.
In theory, this is problematic and certainly bothers me, but in practice I
think it's fairly rare that two staff members are working on the same record at
the same time. Locking and lock timeouts come with a lot of complexity, an
Hi Charles,
No worries. I wish you well with your cloud migration.
David Cook
Software Engineer
Prosentient Systems
Suite 7.03
6a Glen St
Milsons Point NSW 2061
Australia
Office: 02 9212 0899
Online: 02 8005 0595
From: Charles Kelley
Sent: Friday, 9 July 2021 11:33 AM
To: d
Hi Lizelle,
Thanks for reaching out.
That's a big question, and there are a lot of different answers, depending on
your situation.
One option is to get your database vendor to supply you with MARC records that
you can import into your Koha.
Depending on who provides your database content, t
Hey Russel,
If you have to use CentOS 7, you might consider using Docker to run Koha in a
Debian or Ubuntu container, since Docker should be available in CentOS 7. While
running Koha in Docker isn't officially supported by the community, it is a
potential solution. You would need to be skilled wi
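Purely as an unofficial sketch (untested, not supported by the community, and the repository line and key URL should be checked against the current install instructions on the Koha wiki before use), a Debian container for Koha on a CentOS 7 Docker host could start from something like:

```dockerfile
# Unofficial sketch: Koha's Debian packages inside a container,
# for hosts stuck on CentOS 7. Verify the repo URL and signing key
# against the current community install instructions first.
FROM debian:buster
RUN apt-get update && apt-get install -y wget gnupg
RUN echo "deb http://debian.koha-community.org/koha stable main" \
      > /etc/apt/sources.list.d/koha.list \
 && wget -qO- https://debian.koha-community.org/koha/gpg.asc | apt-key add - \
 && apt-get update && apt-get install -y koha-common
# You would still need to handle MySQL/MariaDB, Apache ports, and
# koha-create, which is where the "skilled with Docker" part comes in.
```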
Hi Mike,
Great to see another Australian library using Koha!
Regarding performance, I'd say the answer is "it depends".
It looks like you're running Debian Buster (based on your Apache response), so
I'm guessing you used the Debian packages to install Koha? I thought that new
instances create
It's 11am April 2nd in Australia and... yep. Yep. Yep. Fridolin is a wizard.
David Cook
Systems Librarian
Prosentient Systems
72/330 Wattle St
Ultimo, NSW 2007
Australia
Office: 02 9212 0899
Direct: 02 8005 0595
-Original Message-
From: koha-devel-boun...@lists.koha-community.org
On B
Hi Mason,
Can you tell us what version of Zebra you're running? And what is your exact
query?
According to https://packages.debian.org/stretch/idzebra-2.0, you're
probably running Zebra 2.0.59, unless you're pulling packages from
Indexdata's APT repository.
I discovered an ICU bug in Zebra 2.0.
Hi again, Mason,
I just remembered a little trick that you might find useful.
Try the following:
echo "PZ 7 .W663 1984" | yaz-icu -x -c /path/to/phrases-icu.xml
echo "PZ 7 .W663 1984" | yaz-icu -x -c /path/to/words-icu.xml
That should show you how the string is normalized and tokenized for inde