Hello all!

On Sat, Nov 16, 2019 at 11:06, Amirouche Boubekki <amirouche.boube...@gmail.com> wrote:
> I restarted working on my personal search engine.
>
> After two weeks of work, 41 files changed, 1845 insertions(+), 441
> deletions(-) and 97 commits, I tagged a v0.2.0 in the repository at:
>
>   https://git.sr.ht/~amz3/guile-babelia
>
> The babelia index and babelia search subcommands were removed. Instead,
> one has to run `make web` to spawn a server and then hit
> /api/search?query=foobar to make a search. To index stuff, one can POST
> a file like test.scm to /api/index, or rely on the babelia crawler
> subcommands. The crawler is still a work in progress. Do not expect the
> index to be compatible with future releases.
>
> The last iteration, gotofish, was not too bad, even if it has bitrotted.
> Based on my research and practical experiments, it seems very clear that
> there is no way around the use of map-reduce, which might be known as
> n-par-for-each [3].
>
> [3] https://www.gnu.org/software/guile/manual/html_node/Parallel-Forms.html#index-n_002dpar_002dfor_002deach
>
> I made a prototype similar to that n-par-for-each, except that it works
> with guile-fibers, is asynchronous, and works with a shared pool of
> threads instead of spawning N threads for each incoming query like
> gotofish does.

Actually, what I need is n-for-each-par-map, where the map happens in
parallel. The implementation can be found in the babelia/pool.scm file [4].

[4] https://git.sr.ht/~amz3/guile-babelia/tree/v0.2.0/babelia/pool.scm

The installation process is still a little bit awkward, because one needs
to change the path to the wiredtiger-3.2.0-0 shared library in the source.
Add my channel [5] and do `make init` to get started.

[5] https://git.sr.ht/~amz3/guix-amz3-channel

Happy hacking!
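P.S. For anyone who wants to try the HTTP API mentioned above, a rough
sketch of the requests involved. The endpoint paths come from the post
itself; the host, port, and the exact shape of the POST body are my
assumptions, so check the babelia source if they do not work as-is:

```shell
# Start the server from the babelia source tree, as described above:
make web

# Search: the post says to hit /api/search?query=foobar.
# localhost:8080 is an assumption -- adjust to wherever `make web` listens.
curl 'http://localhost:8080/api/search?query=foobar'

# Index a file by POSTing it to /api/index. Sending the file as the raw
# request body is a guess; the server may expect a different encoding.
curl -X POST --data-binary @test.scm 'http://localhost:8080/api/index'
```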
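P.P.S. For readers unfamiliar with the parallel forms referenced above,
here is a minimal sketch of n-par-for-each and n-for-each-par-map as
documented in the Guile manual (from the (ice-9 threads) module). This is
not Babelia's pool implementation, just the stock Guile procedures the
post compares itself to:

```scheme
(use-modules (ice-9 threads))

;; n-par-for-each: apply a procedure to each element for its side
;; effects, using at most 2 threads at a time; results are discarded.
(n-par-for-each 2
                (lambda (x) (display x) (newline))
                '(1 2 3 4))

;; n-for-each-par-map: the "map happens in parallel" variant the post
;; asks for. The third argument (here 1+) is mapped over the list in
;; parallel across 2 threads, while the second argument consumes each
;; result serially.
(n-for-each-par-map 2
                    (lambda (result) (display result) (newline))
                    1+
                    '(1 2 3 4))
```

The appeal of the n-* variants over plain par-map/par-for-each is the
bounded thread count, which is what a shared worker pool generalizes.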