The Cyc project was the only time I have seen an advertiser put
"knowledge of metaphysics" in a want-ad for employment. Topics like
actuality, appearance, causality, etc. have always seemed crucial to
me.

On 2/14/19, Jim Bromer <[email protected]> wrote:
> I should have stated my feelings a little better. I should have said
> that comments like those seem a little stale because we have been
> making them for years and none of us have the time to thoroughly
> research and review the claims and actual achievements that have been
> made.
> I do think that we need to develop more robust learning systems. I am
> trying to write a short page on my thoughts about something like this.
> One of the problems is that if we try to partition learning
> acquisition and retrieval algorithms on a more abstract level, the
> partitions are immediately brittle, because the distinctions become
> intangible at that higher level of abstraction, where it has to be
> assumed that further detail would be added to make them feasible. But
> if we had a higher abstraction to start with, it would make testing
> them out much easier.
> Jim Bromer
>
> On Thu, Feb 14, 2019 at 11:30 AM Linas Vepstas <[email protected]>
> wrote:
>>
>>
>>
>> On Thu, Feb 14, 2019 at 7:47 AM Jim Bromer <[email protected]> wrote:
>>>
>>> These comments by Linas seem a little stale. Have you ever made
>>> inquiries about your recollections about what IBM said Watson could
>>> do? Have you looked into the current progress of Watson with medical
>>> knowledge?
>>
>>
>> Indeed, perhaps they are stale. I had assumed that if there had been
>> progress, I would have heard about it with the same ferocity as the
>> original round of advertisements.
>>
>> I did not mean to imply that expert systems are wrong, or are bad. They
>> seem quite viable, now that there is more RAM, disk, and compute power.
>> Algorithms can dig deeper, explore more branches, and find and resolve
>> more inconsistencies. The Jeopardy tournament clearly proved that.
>> Historically, the systems were incapable of "learning from experience",
>> but tighter coupling to experiential systems might solve that. Whether a
>> suitcase fits in a trophy, or the other way around, depends on one's
>> experience with trophies and suitcases; to have experience with either,
>> you have to be a human (with a need to move items, and an interest in
>> trophy-granting activities (and a suitcase in a closet, and experience
>> with places that have closets or attics ...))  Common sense is
>> experiential; something you develop over decades of interacting with a
>> reactive environment.
>>
>> My critique of Doug Lenat is that perhaps he underestimated the magnitude
>> of the task, and pinned too much hope on "microtheories" to resolve
>> deductive inconsistencies.
>>
>> There is an open question: is it better to curate an existing
>> knowledge-base, slowly expanding it, or is it better to create a more
>> robust learning system that will be able to rapidly acquire the needed
>> knowledge once it's turned on?  I'm betting on the latter, not the
>> former.
>>
>> --linas
>>
>> --
>> cassette tapes - analog TV - film cameras - you

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T9ccd0aac7d42f57b-M41e1fd6cae7a69195c67ea34
Delivery options: https://agi.topicbox.com/groups/agi/subscription