CYC-O | WIRED
ROAM_REFS: https://www.wired.com/1994/04/cyc-o/
Doug Lenat's quixotic quest to create an artificial intelligence with common sense.
A decade ago Doug Lenat devised two of the most successful programs in artificial intelligence - AM and Eurisko. Now quietly working at the MCC consortium in Austin, Texas, Principal Scientist Lenat is directing development of his latest brainchild, CYC (as in enCYClopedic), a machine intelligence that has common sense.
WIRED: Early work in artificial intelligence pushed the thinking machine only so far. What happened?
Lenat: In about 1983, I noticed almost all of the subfields in artificial intelligence hitting the same brick wall. If a program lacked common sense, as all programs up through today do, it ended up extremely brittle. Ask an expert system, like Stanford University's MYCIN, to diagnose a patient, and it will tell you what type of meningitis the patient is most likely to have. It does a good job, but tell it about your rusted-out car, and it'll tell you what kind of meningitis your car is most likely to have. It doesn't know machines don't get diseases. These programs have the appearance of intelligence without actual understanding.
So will computers now be able to really understand?
Common sense is the missing ingredient to getting programs to keep growing, the way humans do. Learning occurs at the fringe of what you know already. If you don't know much, there's not much you can learn. All these programs can learn is the two or three things they were devised to learn. Natural language understanding, speech recognition, expert systems - why have none succeeded in a big way? They are missing CYC.
Will CYC ever be conscious?
I think it's conscious now. If someone acts sane, I assume they are. CYC has models of emotions, seeking behavior, a list of goals for itself - like finding out about the world, or preserving its own integrity, its self, if you will. It knows it is a computer program. It understands the flow of time. It can distinguish reality, conversations it has had, from unreality, or fictional stories it has been told. It acts conscious. From my point of view, that means it is. CYC could be said to have almost any attribute of consciousness, other than the actual feeling of emotion or sensation. Unlike programs that get rebooted and start over, CYC hasn't been in the same state for seven years. Dozens of people work on it, and it remembers from day to day, minute to minute.
Initially, you thought 1 million rules would be enough to capture common sense, then 2 million. Now, 4 million. Why has this changed?
We drastically underestimated how ambiguous humans are.
If I said I was following a ball of yarn into a labyrinth, would CYC know I felt like Theseus approaching the Minotaur?
In 1997, yes. Today, no. CYC is finally becoming smart, but we need to correct the ignorance problem. There are 20 to 40 million things, common knowledge, that have to be put in over the next decade: information found in textbooks, encyclopedias, almanacs, in the newspaper, in literature, and so on.