
Elementary, Watson: The Rise of the Anthropomorphic Machine | Big Think


3m read
·Nov 4, 2024


So I've been asked periodically for a couple of decades whether I think artificial intelligence is possible. And I taught the artificial intelligence course at Columbia University. I've always been fascinated by the concept of intelligence. It's a subjective word. I've always been very skeptical. And I am only now newly a believer.

Now, this is subjective. This is sort of an aesthetic thing, but my opinion is that IBM's Watson computer's ability to answer questions qualifies as intelligence. I spent six years in graduate school working on two things. One is machine learning, and that's the core of prediction—learning from data how to predict. That's also known as predictive modeling.

And the other is natural language processing or computational linguistics. Working with human language, because that really ties into the way we think and what we're capable of doing, and does turn out to be extremely hard for computers to do. Now, playing the TV quiz show Jeopardy means you're answering questions—quiz show questions.

The questions on that game show are really complex grammatically. And it turns out that in order to answer them, Watson looks at huge amounts of text—for example, a snapshot of all the English-language Wikipedia articles. And it has to process text not only to understand the question it's trying to answer but to retrieve the answers themselves.

Now at the core of this, it turns out it's using predictive modeling. Now, it's not predicting the future, but it's predicting the answer to the question, you know. It's the same in that it's inferring an unknown even though someone else may already know the answer, so there's no sort of future thing. But will this turn out to be the answer to the question?

The core technology is the same. In both cases, it's learning from examples. In the case of Watson playing the TV show Jeopardy, it takes hundreds of thousands of previous Jeopardy questions from the TV show, having gone on for decades, and learns from them. And what it's learning to do is predict, is this candidate answer to this question likely to be the correct answer?
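The framing described above—learn from past question/answer pairs whether a candidate answer is likely correct—is a binary classification problem. Here is a minimal sketch in Python, using a hand-rolled logistic regression. The two features (a word-overlap score and a source-reliability score) and all the training numbers are invented for illustration; Watson's real system learns from hundreds of evidence features, not two.

```python
import math

# Toy training set: each example is (features, label), where the features
# describe one (question, candidate-answer) pair and the label records
# whether that candidate turned out to be the correct answer.
# Feature 0: word-overlap score; feature 1: source-reliability score.
# All values here are invented for illustration.
TRAIN = [
    ([0.9, 0.8], 1), ([0.2, 0.7], 0), ([0.8, 0.9], 1), ([0.1, 0.3], 0),
    ([0.7, 0.6], 1), ([0.3, 0.2], 0), ([0.85, 0.7], 1), ([0.15, 0.5], 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=2000, lr=0.5):
    """Fit logistic-regression weights by plain gradient descent."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(w[0] * x[0] + w[1] * x[1] + b)  # predicted P(correct)
            err = p - y
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

w, b = train(TRAIN)
# A high-overlap, well-sourced candidate should score as likely correct.
print(sigmoid(w[0] * 0.9 + w[1] * 0.8 + b) > 0.5)  # → True
```

The key point of the sketch: nothing about the model knows "Jeopardy"—it only learns, from labeled examples, how evidence features map to the probability that a candidate answer is right.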

So, it's gonna come up with a whole bunch of candidate answers—hundreds of candidate answers—for the one question at hand at any given point in time. And then, amongst all these candidate answers, it's going to score each one. How likely is it to be the right answer?

And, of course, the one that gets the highest score—the highest vote of confidence—that's ultimately the one answer it's gonna give. It's correct, I believe, about 90 or 92 percent of the time that it actually buzzes in to intentionally answer the question.
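The answer-time pipeline just described—generate candidates, score each one, take the highest-scoring candidate, and buzz in only if its confidence clears a threshold—can be sketched in a few lines. Everything here is a stand-in: the candidate table, the word-overlap scorer, and the 0.3 threshold are invented for illustration, not Watson's actual components.

```python
import re

# Toy "retrieved evidence": each candidate answer paired with one
# supporting passage, standing in for text pulled from sources like
# Wikipedia. All data invented for illustration.
CANDIDATES = {
    "Sherlock Holmes": "Sherlock Holmes is a fictional detective created by Arthur Conan Doyle",
    "Hercule Poirot": "Hercule Poirot is a fictional Belgian detective created by Agatha Christie",
    "Miss Marple": "Miss Marple is a fictional amateur detective in Agatha Christie's novels",
}

def words(text):
    return set(re.findall(r"[a-z]+", text.lower()))

def score(question, passage):
    # Stand-in confidence model: fraction of question words found in the
    # candidate's supporting passage. The real scorer is a learned model
    # combining many evidence features.
    q = words(question)
    return len(q & words(passage)) / len(q)

def answer(question, threshold=0.3):
    # Score every candidate, then "buzz in" only if the best candidate's
    # score clears the confidence threshold; otherwise stay silent.
    scored = {c: score(question, p) for c, p in CANDIDATES.items()}
    best = max(scored, key=scored.get)
    return best if scored[best] >= threshold else None

print(answer("this fictional detective was created by arthur conan doyle"))
# → Sherlock Holmes
```

The threshold is what makes the "buzzes in" behavior possible: when no candidate is convincing enough, the system abstains rather than guess, which is how the high accuracy on attempted questions is achieved.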

You can go on YouTube and watch the episode where they aired the competition between IBM's computer Watson and the two all-time human champions of Jeopardy. And it just rattles off one answer after another. And it doesn't matter how many years you've been looking at this—in fact, maybe the more years you've studied the ability or inability of computers to work with human language, the more impressive it is.

It's just rattling off one answer after another. I never thought that, in my lifetime, I would have cause to experience that the way I did, which was, "Wow, that's anthropomorphic. This computer seems like a person in that very specific skill set. That's incredible. I'm gonna call that intelligent."
