Before Doug Lenat there was Borland Turbo Prolog for the IBM PC-386, which had examples like this in Chapter One of the book that came with it. It promised that you could build an AI, right on the cover, next to the human head built out of bricks.
And it worked, as long as you had fewer than a few hundred assertions and learned how to use "cut". Professional programmers, those who actually got paid real money to write software, created hundreds of large "expert systems" that offered suggestions like "if you drill through 25 feet of sandstone followed by 10 feet of shale, and the drilling mud pressure spiked, then you hit a pocket of gas" and "if aspirin was prescribed and no x-ray was performed, then the reimbursement should not exceed $10". These were brittle precisely because they did not have any common sense. On the other hand, the very nature of "common sense" in petroleum exploration and medical insurance is itself a rather dubious concept ...

For a while, IBM had press releases claiming that Watson would pass the medical board exams, "real soon now", and then be put to work processing the extremely tedious insurance forms that humans are so terrible at doing themselves. Absolutely brilliant idea, never happened ...

-- Linas

On Wed, Feb 13, 2019 at 1:46 PM Matt Mahoney <[email protected]> wrote:

> Doug Lenat (creator of Cyc) posed this problem.
>
> The police arrested the demonstrators because they feared violence.
> The police arrested the demonstrators because they advocated violence.
>
> What does "they" refer to?
>
> Lenat hoped to build a database of common sense rules, a "sea of
> assertions", to answer questions like this. He had dozens of people
> encoding rules like "brooms have handles" in a kind of first order
> logic. He hoped that Cyc would run on every computer to solve the
> "brittleness bottleneck" of software.
>
> But he had no idea how many rules he would need. A thousand? A
> million? A billion? No idea.
>
> That was in 1984.
>
> Do you know?
>
> On Wed, Feb 6, 2019, 8:58 PM Stefan Reich via AGI <[email protected]> wrote:
>
>> Hi all!
>>
>> Inspired by Matt Mahoney's example, I made a little program. This is
>> his original example:
>>
>> I ate pizza with a fork.
>> I ate pizza with pepperoni.
>> I ate pizza with Bob.
>>
>> I used these as training examples, then made some new examples for
>> testing.
>>
>> My program is a logic engine that takes the following rules:
>>
>> // Some reasonings that everybody will understand.
>> // Sorry for the curly braces, we have to help out the parser a tiny bit.
>> // First, the 3 different cases of what "pizza with..." can mean.
>>
>> I ate pizza with pepperoni.
>> => {I ate pizza} and {the pizza had pepperoni on it}.
>>
>> I ate pizza with Bob.
>> => {I ate pizza} and {Bob was with me}.
>>
>> I ate pizza with a fork.
>> => I used a fork to eat pizza.
>>
>> // Now some more easy rules.
>>
>> I used a fork to eat pizza.
>> => I used a fork.
>>
>> I used a fork.
>> => A fork is a tool.
>>
>> The pizza had pepperoni on it.
>> => Pepperoni is edible.
>>
>> Bob was with me.
>> => Bob is a person.
>>
>> // Some VERY basic mathematical logic
>>
>> $A and $B.
>> => $A.
>>
>> $A and $B.
>> => $B.
>>
>> // Tell the machine what is not plausible
>>
>> Mom is edible. => fail
>> Mom is a tool. => fail
>> anchovies are a tool. => fail
>> anchovies are a person. => fail
>> ducks are a tool. => fail
>> ducks are a person. => fail
>> my hands are edible. => fail
>> my hands are a person. => fail
>>
>> The logic engine performs some analogy reasoning using the rules
>> stated above. Note that most of the rules don't distinguish between
>> variable and non-variable parts; this is usually inferred
>> automatically.
>>
>> That's it! Now we give the program the following ambiguous inputs:
>>
>> I ate pizza with mom.
>> I ate pizza with anchovies.
>> I ate pizza with ducks.
>> I ate pizza with my hands.
>>
>> ...and it comes up with these clarifications:
>>
>> I ate pizza with anchovies. => I ate pizza and the pizza had
>> anchovies on it.
>> I ate pizza with ducks. => I ate pizza and the pizza had ducks on it.
>> I ate pizza with mom. => I ate pizza and mom was with me.
>> I ate pizza with my hands.
>> => I used my hands to eat pizza.
>>
>> So nice! It's all correct. Sure, nobody would say "pizza with ducks",
>> but that is the one option that remained after the program eliminated
>> the other two (ducks as a tool or ducks as a person), so it's a very
>> reasonable interpretation.
>>
>> If you remove the line that says ducks are not persons, the program
>> will correctly add the interpretation "I ate pizza and ducks were
>> with me".
>>
>> Full program in my fancy language: <http://tinybrain.de/1021251>
>>
>> In addition to the results, the program also shows some lines of
>> reasoning, e.g. failed lines:
>>
>> Interpretation: I ate pizza and the pizza had mom on it.
>> => I ate pizza.
>> => the pizza had mom on it.
>> => mom is edible.
>> => fail
>>
>> and successful lines:
>>
>> Interpretation: I used my hands to eat pizza.
>> => I used my hands.
>> => my hands are a tool.
>>
>> Why do I make a specific program for this puzzle? Well, it's not
>> really a special-purpose program. It's a general-purpose logic engine
>> that supports problems of a certain complexity, the way a child
>> understands everything as long as it's not too complicated. The plan
>> is to make increasingly more capable logic engines until we can solve
>> everything.
>>
>> The program itself is <100 lines <http://tinybrain.de/1021251>, as
>> you can see, but of course it uses some powerful library functions.
>>
>> Any questions? :)
>>
>> Stefan
>>
>> --
>> Stefan Reich
>> BotCompany.de // Java-based operating systems
>
> *Artificial General Intelligence List <https://agi.topicbox.com/latest>*
> / AGI / see discussions <https://agi.topicbox.com/groups/agi> +
> participants <https://agi.topicbox.com/groups/agi/members> + delivery
> options <https://agi.topicbox.com/groups/agi/subscription>
> Permalink <https://agi.topicbox.com/groups/agi/T9ccd0aac7d42f57b-M9ced1a989eb5efce0a3e5a06>

--
cassette tapes - analog TV - film cameras - you

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: https://agi.topicbox.com/groups/agi/T9ccd0aac7d42f57b-Mcfc451033aa8b70532d5d557
Delivery options: https://agi.topicbox.com/groups/agi/subscription
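[Editor's note: the elimination trick in Stefan's program can be sketched in a few lines of Python. This is a hypothetical mini-version for illustration only, not his actual engine; the names READINGS, IMPLAUSIBLE, and interpret are invented here. Each reading of "pizza with X" commits X to a category, and any reading whose category claim matches a "=> fail" rule is discarded.]

```python
# Sketch (not Stefan's engine): each reading of "I ate pizza with X"
# forces X into a category; readings with implausible categories fail.

# The three readings and the category each one implies for X.
READINGS = [
    ("I ate pizza and the pizza had {x} on it", "edible"),
    ("I ate pizza and {x} was with me",         "person"),
    ("I used {x} to eat pizza",                 "tool"),
]

# The "=> fail" rules: (thing, category) pairs the engine rejects.
IMPLAUSIBLE = {
    ("mom", "edible"),       ("mom", "tool"),
    ("anchovies", "tool"),   ("anchovies", "person"),
    ("ducks", "tool"),       ("ducks", "person"),
    ("my hands", "edible"),  ("my hands", "person"),
}

def interpret(x):
    """Return every reading of 'I ate pizza with <x>' that survives."""
    return [template.format(x=x)
            for template, category in READINGS
            if (x, category) not in IMPLAUSIBLE]

for x in ["mom", "anchovies", "ducks", "my hands"]:
    # e.g. -> I ate pizza with ducks. => I ate pizza and the pizza had ducks on it
    print(f"I ate pizza with {x}. =>", "; ".join(interpret(x)))
```

As in the thread, removing the ("ducks", "person") pair makes a second reading of "pizza with ducks" survive, "I ate pizza and ducks were with me".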
