A little harsh, are we? I have read the WHOLE documentation; it's a bit long,
so confusion might arise, and I am not familiar with Postgres AT ALL, so the
confusion grows.
Perhaps I am an idiot and you don't like helping idiots, or perhaps it's
something else? Which one is it?
If you don't want to help me,

Mohamed,

please try to read the docs and think a bit first.
On Mon, 2 Feb 2009, Mohamed wrote:
Hehe, ok..
I don't know either, but I took some lines from Al-Jazeera:
http://aljazeera.net/portal

I just made the change you said, created it successfully, and tried this:

select ts_lexize('ayaspell', 'استشهد فلسطيني وأصيب ثلاثة في غارة إسرائيلية جديدة');

but I got nothing... :(
Is there a way
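For reference, ts_lexize returns an array of lexemes when the dictionary recognizes its input and NULL when it does not, so an empty result here most likely means "no match" rather than an error. A known-good sanity check against a built-in dictionary (this example comes from the PostgreSQL documentation):

```sql
-- Sanity check with a built-in snowball dictionary: a recognized word
-- yields an array of lexemes, an unrecognized one yields NULL.
SELECT ts_lexize('english_stem', 'stars');
-- Expected: {star}
```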
Mohamed,

comment out the FLAG long line in ar.affix, so it reads:

#FLAG long

and creation of the ispell dictionary will work.
This is a temporary solution; Teodor is working on fixing affix
auto-recognition.
I can't say anything about testing, since somebody should provide a
first test case. I don't know how to type Arabic :)

Oleg
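The dictionary creation Oleg refers to would look roughly like this; the base name "ar" is an assumption on my part, and the files ar.dict, ar.affix, and (optionally) ar.stop must already sit in PostgreSQL's $SHAREDIR/tsearch_data directory:

```sql
-- Sketch only: "ar" is an assumed base name for the dictionary files.
-- PostgreSQL resolves DictFile / AffFile / StopWords against
-- $SHAREDIR/tsearch_data, appending the .dict / .affix / .stop extensions.
CREATE TEXT SEARCH DICTIONARY ayaspell (
    TEMPLATE  = ispell,
    DictFile  = ar,
    AffFile   = ar,
    StopWords = ar
);
```

With the FLAG long line commented out in ar.affix, this statement should succeed, and ts_lexize('ayaspell', ...) can then be used to test the result.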
Oleg, as I mentioned earlier, I have a different .affix file that I got
from Andrew, along with the stop file, and I get no errors creating the
dictionary with that one, but I get nothing out of ts_lexize.
The size of that one is 406,219 bytes,
and the size of the hunspell one (the first) is 406,229 bytes.
Ok, thank you Oleg.
I have another dictionary package which is a conversion to hunspell as well:
http://wiki.services.openoffice.org/wiki/Dictionaries#Arabic_.28North_Africa_and_Middle_East.29
(Conversion of Buckwalter's Arabic morphological analyser) 2006-02-08
And running that one gives me this error
Mohamed,

We are looking into the problem.

Oleg
On Mon, 2 Feb 2009, Mohamed wrote:
No, I don't. But ts_lexize doesn't return anything, so I figured there must
be an error somewhere.
I think we are using the same dictionary, plus I am using the stopwords
file and a different affix file, because using the hunspell (ayaspell) .aff
file gives me this error:

ERROR: wrong affix file form
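For context, and consistent with Oleg's FLAG long workaround elsewhere in this thread (my reading, not something the thread states outright): hunspell affix files can declare two-character affix flags with a FLAG long directive, a construct the older ispell affix format does not have, so a parser expecting plain ispell syntax can reject such a file outright. An illustrative hunspell-style fragment of that shape:

```
FLAG long
SFX Aa Y 1
SFX Aa 0 s .
```

Whether the rest of the file then parses cleanly is a separate question; Oleg describes commenting the line out only as a temporary measure until affix auto-recognition is fixed.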
Hi Mohamed.
I don't know where you got the dictionary; I unsuccessfully tried the
OpenOffice one myself (the Ayaspell one), and I had no Arabic
stopwords file.
Renaming the file is supposed to be enough (I did it successfully for a
Thai dictionary): the ".aff" file becoming the ".
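The renaming described above can be sketched as follows; the file names and the optional ar.stop stopword file are assumptions, and pg_config must be on the PATH:

```shell
# Sketch: install a hunspell-format dictionary pair for PostgreSQL text
# search by renaming to the extensions PostgreSQL expects (.affix/.dict)
# and copying the files into $SHAREDIR/tsearch_data.
SHAREDIR=$(pg_config --sharedir)
cp ar.aff  "$SHAREDIR/tsearch_data/ar.affix"
cp ar.dic  "$SHAREDIR/tsearch_data/ar.dict"
cp ar.stop "$SHAREDIR/tsearch_data/ar.stop"   # optional stopword list
```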