* Christopher Howard <christop...@librehacker.com> [2024-12-10 20:02]:
> Jean Louis <bugs@gnu.support> writes:
> 
> > Sure. But every game, every piece of software, and Emacs itself is
> > artificial intelligence. It is an extended mind. But now the term AI is
> > used in marketing to make it more easily accessible to common people.

> It seems to me that some important distinctions are being blurred
> throughout this thread. I am seeing the term AI used to refer to three
> things:

> (1) generally, any kind of computation or problem solving that involves
> computer programming;
> (2) computation that involves inferences and rules (e.g., a Prolog program);
> (3) using LLMs, i.e., "the use of large neural networks for language modeling"
> (Wikipedia definition).

You are right. I cannot personally reserve the term AI for LLMs alone just
because that usage is getting popular.

As Basile Starynkevitch explained, there are systems like CLIPS,
RefPerSys, Prolog, and so on; there are many ways a computer can
represent AI. LLMs are not the only AI, and treating them as such is a
degradation of all the previous work on which those LLMs were built.

To me, an LLM represents an enhanced workflow taught to the computer to
recognize needs and provide results.

We have been recognizing those human needs all over the place, in Emacs
and in other software. The user moves the arrow and tries to shoot the
spaceship, but the spaceship can see him, react, and fight back against
the user. Every game is a type of artificial intelligence.

> Activities (1) and (2) are things that I can do on my own computer, maybe 
> even without having to leave Elisp or the running, single Emacs thread. 

That is right, and we all already use many such things.

But we do not integrate enough!

Integration, if that is the right word, means enhancing the human workflow to
minimize effort and provide optimum results. That is what I mean.

Programmers are not necessarily scientists, so they tend to think in terms
of typing. But it is possible to control a light with brainwaves through a
special headset, or to type on a computer with eye movements.

Makers of LLMs now provide "trained" models that can write text and
translate text more accurately than common translation tools.

> For activity (3), even if I can do it without the help of a remote
> compute cluster, it is going to require a large model database, plus
> intense computing resources, like a separate computer, or an expensive
> GPU requiring proprietary drivers.

Here is an example that works without a GPU:
https://github.com/Mozilla-Ocho/llamafile/

There are other examples on the same page.
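Once you have downloaded a llamafile from that page, made it executable
and started it, it serves a local, OpenAI-compatible HTTP endpoint
(http://localhost:8080 by default, if I recall the README correctly).
Here is a rough, untested Elisp sketch of asking it a question from
Emacs; the function name is mine, not from any package, and error
handling and text decoding of the response are left out:

(require 'url)
(require 'url-http)   ; for `url-http-end-of-headers'
(require 'json)

(defun my-local-llm-ask (prompt)
  "Send PROMPT to a llamafile listening on localhost:8080, return its reply."
  (let* ((url-request-method "POST")
         (url-request-extra-headers '(("Content-Type" . "application/json")))
         (url-request-data
          (encode-coding-string
           (json-encode
            `((model . "local")
              (messages . [((role . "user") (content . ,prompt))])))
           'utf-8))
         (buffer (url-retrieve-synchronously
                  "http://localhost:8080/v1/chat/completions")))
    (with-current-buffer buffer
      (goto-char url-http-end-of-headers)   ; skip the HTTP headers
      (let* ((reply (json-read))
             (choice (aref (cdr (assq 'choices reply)) 0))
             (msg (cdr (assq 'message choice))))
        (cdr (assq 'content msg))))))

;; For example: (my-local-llm-ask "Translate to French: good morning")

That is the kind of integration I mean: nothing more than the stock
url.el and json.el that already ship with Emacs.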

> I'm open minded to integrations of (3), if they can be done
> cost-effectively, if they are truly useful, and if I don't have to
> give up my computing freedoms, but that has to be proven to me. And I
> don't want that approach confused with (1) and (2).

As usual, you have the computing cost: electricity and wear on the
computer.

It seems that those files are free software under the Apache 2.0
license, but I did not inspect everything; some models may be free,
some not, so choose what is free.

To avoid confusion, we should simply say LLMs when that is the
subject.

-- 
Jean Louis

---
via emacs-tangents mailing list 
(https://lists.gnu.org/mailman/listinfo/emacs-tangents)
