The Computer and Turing: Crash Course History of Science #36

The history of computer science is heckin’
cool. It features the upending of basic questions,
like—what is information? It has biographical oomph—only this story’s
war-hero scientist, Alan Turing, was punished, not celebrated, after helping the Allies win
World War Two. And the history of computer science raises
profound questions about technology and society, like—how do we know that our big complex
beautiful brains aren’t really just big complex… computing machines? And, if we can one day build machines that
think as fast as humans, will we have to grant them human rights? [INTRO MUSIC PLAYS] Questions about thinking machines are relatively
recent in history. But all kinds of doing machines are not, and
some of this doing involves solving mathematical problems and other high-level functions. Some time before 60 BCE, the Greeks constructed
an analog computer now called the Antikythera mechanism. Using many gears, the mechanism may have been
used to predict eclipses or other astronomical events. But the mechanism appears to have been a one-off. So historians often give credit for the first
mechanical computer to the Artuqid-Turkmen engineer, Al-Jazarī, who died in CE 1206. We met him way back in episode seven when
dude built a robotic musical band. And a robot toilet helper! And Al-Jazarī built an astronomical clock
that showed the signs of the zodiac and could be reprogrammed to compensate for changing
lengths of the day. Then, in 1642, French mathematician Blaise
Pascal invented a mechanical adding machine that used a collection of rotating numbered
wheels, similar to a car’s odometer. Our friend from episode seventeen, German
mathematician Gottfried Leibniz, built commercial mechanical calculators in the late 1600s. And in 1801, in the early days of the Industrial
Revolution, French merchant Joseph Marie Jacquard incorporated the punch card into a textile
loom to control patterns—arguably the first industrial use of computing! But devices like calculators and looms are
pretty far from the computers we rely on today. So then the question becomes, like… what is a computer? Well, that word has changed a lot over the
years. In fact, up until the 1950s, a “computer”
was a person who computes—usually a woman. The basic idea today is that a “computer”
is a machine that can be programmed to perform logical tasks—like math problems—automatically. For many historians, the dream of a somewhat
recognizable modern computer that can be programmed to perform all sorts of calculations without
continuous human number-punching, dates back to 1837. That’s when British mathematician Charles
Babbage fully conceived a digital, programmable—but mechanical—computer called the Analytical
Engine. This was a general purpose information processor:
it wasn’t just for a single task, but for solving general logic problems. Sadly, the Analytical Engine was never completed. Babbage started working on it, but never finished
due to cost overruns and fights with his machinist. But we have his notes and those of his chronicler,
British mathematician Ada Lovelace, who wrote the first algorithm intended for processing
using a computer—basically, the first computer program!—in 1843. Fun fact, Lovelace was the daughter of Romantic
poet Lord Byron! Another early computer was actually made and
put into use in the United States. A young mathematician–inventor named Herman
Hollerith combined the old technology of punch cards with the new technology of electrical
circuits to produce a sorting and tabulating machine. With his machine, the 1890 census was finished
in weeks instead of years. Hollerith went on to found the Tabulating
Machine Company. And it’s still in business today—as the
International Business Machines Corporation, or IBM. But neither Babbage and Lovelace’s way-ahead-of-their-time
designs nor Hollerith’s super-sorter established computing as a science. Some important developments happened in the
years before World War Two. For example, starting in the late 1920s, influential
American engineer Vannevar Bush created an analog computer called a differential analyzer,
which could solve calculus problems with as many as eighteen independent variables. But the war shoved computer science into the
scientific limelight. In the 1930s, British mathematician, linguist,
cryptographer, philosopher, and all-around smarty pants Alan Turing laid the foundation
for a mathematical science of computing. ThoughtBubble, introduce us: Turing proposed the aptly named Turing machine—a
thought experiment to figure out the limitations of mechanical computation. A Turing machine can theoretically perform
an algorithm, or programmed operation. It’s a universal computer. Turing couldn’t make an abstract perfect
computer, but he could lay out how the logic of writing and reading programs should work,
and how a relatively simple device could, given enough memory, accomplish any logical operation.
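To make that concrete, here is a minimal sketch of a single-tape Turing machine in Python. Both the little simulator and the binary-increment rule table are our own illustrative example, not anything from Turing’s papers:

    # A bare-bones Turing machine: a tape, a read/write head, and a rule table.
    # rules maps (state, symbol) -> (symbol_to_write, move_direction, next_state).

    def run_turing_machine(tape, rules, state="start", head=0, blank="_"):
        cells = dict(enumerate(tape))             # sparse tape: position -> symbol
        while state != "halt":
            symbol = cells.get(head, blank)
            write, move, state = rules[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells[i] for i in sorted(cells)).strip(blank)

    # Example rule table: add 1 to a binary number, head starting on the rightmost bit.
    increment_rules = {
        ("start", "1"): ("0", "L", "start"),      # carry: flip 1 to 0 and keep moving left
        ("start", "0"): ("1", "L", "halt"),       # no carry: flip 0 to 1 and stop
        ("start", "_"): ("1", "L", "halt"),       # ran off the left edge: write a leading 1
    }

    print(run_turing_machine("1011", increment_rules, head=3))  # prints "1100"

The whole “machine” is nothing but that lookup table plus a head that reads and writes one symbol at a time, yet tables like this can, in principle, express any computation a modern computer can.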
During the war, Turing went to work in the
super-secret “Ultra” program at Bletchley Park, which was an estate for British codebreakers. Turing wasn’t the only computer innovator
at Bletchley. For one thing, eight thousand women worked
there! Also, an engineer named Tommy Flowers designed
some for-the-time hyper-advanced computers called the Colossus series, which also helped
the Allies a lot. And were kept secret until the 1970s! But Turing’s job, leading Ultra Hut Number
Eight, was to decipher encrypted messages about German naval movements. The Germans used a device called an Enigma
machine to create supposedly unbreakable ciphers, or ways of encoding messages so that only
someone with the same cipher settings could read the message. But Turing broke through, using an electromechanical
machine he designed called the bombe, building on an earlier Polish machine, the bomba. These wartime machines weren’t super fast
or sophisticated. They were smart ways of automating a lot of
dumb tasks. Thanks, ThoughtBubble! After the war, Turing kept working on computers. His 1948 essay “Intelligent Machinery”
gave more details on the Turing machine. Then, in 1950, he published “Computing Machinery
and Intelligence” in the journal Mind. Go read it, it holds up! Basically, this article became a foundational
text in artificial intelligence, or AI. Turing famously stated that the appearance
of intelligence is proof of it. Turing arrived at this idea by thinking about
a limit case: consider a computer that appears truly intelligent, like a human. How would we know whether or not it is intelligent? Turing proposed a game to test the computer:
talk to it like a person! This game is called the Turing Test and is
still used as a challenge in AI: a human asks questions to both a computer
and another human, through a terminal, and tries to guess which is which from their responses. The Turing Test was based on an old party
game, in which you did the same thing via written notes, and tried to guess which of two unknown people was a man and which a woman.
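To show the shape of that setup, here is a tiny sketch in Python. The canned-answer bot, the two sample questions, and the one-round session are all made-up stand-ins, not part of Turing’s paper:

    import random

    def bot_reply(question):
        # A deliberately simple stand-in for the machine being judged.
        canned = {
            "how are you?": "Fine, thanks. And you?",
            "what is 2+2?": "4, obviously.",
        }
        return canned.get(question.lower().strip(), "Interesting question. Tell me more.")

    def human_reply(question):
        return input("(human, please answer) " + question + " > ")

    def turing_test(questions):
        # The judge questions two hidden respondents, A and B, then guesses which is the machine.
        players = [bot_reply, human_reply]
        random.shuffle(players)                   # hide which terminal is the machine
        respondents = dict(zip("AB", players))
        for question in questions:
            for label, respond in respondents.items():
                print(label + ": " + respond(question))
        guess = input("Which respondent is the machine, A or B? ").strip().upper()
        print("You got it." if respondents.get(guess) is bot_reply else "The machine fooled you.")

    turing_test(["How are you?", "What is 2+2?"])

A real test would run a much longer, free-form conversation; the point is only that the judge sees nothing but text coming from each terminal.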
Turing’s earlier theoretical work also underpins the Church–Turing
Hypothesis, named for Turing and the American logician Alonzo Church: computation power is computation power. It doesn’t matter if that power comes from
electrical circuitry or a human brain, or how fast the individual parts of the machine
are. So any machine of sufficient power should
be able to do any computation that a brain can do. So… a sufficiently complex machine would
be as intelligent as a brain—or more. The only limit to computational power is memory. But in real life, no computer—whether brain
or series of electrical circuits—has infinite memory. Even more ahead of his time, in his 1950 paper,
Turing suggested that—instead of trying to straight-up build a computer as intelligent
as an adult human—it would be smarter to build a child’s mind and then teach it how
to learn on its own. BAM, machine learning! So what recognition did Turing get for all
of his hard work? In 1952, in the course of a police investigation
of a burglary at his home, the officials became aware that he was in
a relationship with another man, and the British government pressed charges. Turing was convicted of “gross indecency”
and sentenced to take libido-lowering hormones. He died in 1954, possibly of suicide by cyanide-poisoned
apple, possibly by inhalation of cyanide while working. Either way, one of the greatest minds to ever
live died at age forty-one. He was not pardoned until 2013. But before Turing died, he met with some important
folks in the United States… Hungarian-American mathematician John von Neumann met Turing in the 1930s
and worked on foundational aspects of computer science and AI. Von Neumann proposed the idea of storing a
computer program in the computer’s memory, right alongside the data. So instructions could be loaded and changed like data,
instead of having to be wired permanently into a given machine.
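As a rough illustration of that stored-program idea (our toy example, not von Neumann’s actual design), here is a machine whose instructions sit in the same memory as its data, so changing the program just means writing different values into memory:

    # A toy stored-program machine: program and data share one memory list.
    # Each instruction is an (opcode, operand) pair; data cells are plain numbers.

    def run(memory):
        acc, pc = 0, 0                     # accumulator and program counter
        while True:
            op, arg = memory[pc]
            pc += 1
            if op == "LOAD":
                acc = memory[arg]
            elif op == "ADD":
                acc += memory[arg]
            elif op == "STORE":
                memory[arg] = acc
            elif op == "HALT":
                return memory

    # Cells 0-3 hold the program, cells 4-6 hold the data: compute 4 + 5 into cell 6.
    memory = [
        ("LOAD", 4),
        ("ADD", 5),
        ("STORE", 6),
        ("HALT", None),
        4, 5, 0,
    ]
    print(run(memory)[6])                  # prints 9

Because the program is just more memory, you could overwrite cells 0 through 3 with different instructions and the same hardware would do a completely different job.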
Turing also met the American mathematician Claude Shannon during the 1930s, sharing his ideas about the Turing Machine. Shannon, who put the word “bit” into wide circulation and
founded digital circuit design theory while still a graduate student at MIT, also conducted some Turing-like codebreaking
during World War Two. But he’s most well known for publishing
a series of papers after the war that founded information theory, which examines how information
is encoded, stored, and communicated. We could do a whole episode on information
theory, but some of the effects of Shannon’s work were to help transition computers, televisions,
and other systems of moving around information from analog to digital. And information theory led to the Internet!
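To give a taste of what the theory measures, here is a quick illustrative sketch of Shannon entropy, the average number of bits of information carried by each symbol of a message. The two example strings are made up:

    import math

    def shannon_entropy(message):
        # Average information per symbol, in bits.
        counts = {}
        for symbol in message:
            counts[symbol] = counts.get(symbol, 0) + 1
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    print(shannon_entropy("0101010101"))   # 1.0: each symbol is worth a fair coin flip
    print(shannon_entropy("0000000001"))   # about 0.47: mostly predictable, so less information

Predictable messages carry fewer bits per symbol, which is part of why they compress so well.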
And over at Harvard, American physicist Howard Aiken worked with the military and IBM to design and build a computer, the Harvard Mark
I, in 1944. This device was used by von Neumann to run
a program to help design the atomic bomb. One of the other first programmers of the
Mark I was American computer scientist and rear admiral Grace Hopper, who invented one of the first compiler tools
to translate programming languages into machine code. She then worked on machine-independent programming
languages, developing FLOW-MATIC, which became the basis for the early business programming language COBOL. Computers after World War Two quickly became
bigger, faster, and more complex—like the U.S. Army-sponsored Electronic Numerical Integrator
and Computer, or ENIAC, completed in 1946, which filled up a large room, and UNIVAC in
1951, which was commercially mass-produced. These general-purpose computers were based
on the principles laid out by theorists like Turing, von Neumann, and Shannon, and they
used the languages developed by programmers like Hopper. These computers were built using a digital
code—binary, with values of only “one” or “zero”. And real-world computing really took off after
the invention in 1947 of the solid-state transistor by John Bardeen, Walter Brattain, and William Shockley at Bell Laboratories, and the arrival of
room-filling “mainframe” computers for businesses. In a later episode, we’ll get back to
computers—and introduce one of our very best friends in the history of technology,
the Internet. But for now, let’s remember that, up until
the 1950s, a computer was a person, usually a woman, who was a number cruncher—that is, someone who computes, often using a machine. One of those “computers” who became an
engineer who used a computer was African-American rocket scientist Annie Easley. In the era of Jim Crow laws, Easley left Alabama
and went to work for NASA in Ohio. She developed computer code for NASA missions
for decades. Next time: humans finally get to play
golf on the moon. It’s the birth of air and space travel! Crash Course History of Science is filmed
in the Dr. Cheryl C. Kinney studio in Missoula, Montana and it’s made with the help of all
these nice people, and our animation team is Thought Cafe. Crash Course is a Complexly production. If you wanna keep imagining the world complexly
with us, you can check out some of our other channels like Scishow, Eons, and Sexplanations. And, if you’d like to keep Crash Course
free for everybody, forever, you can support the series at Patreon, a crowdfunding platform that allows you to
support the content you love. Thank you to all of our patrons for making
Crash Course possible with their continued support.

100 thoughts to “The Computer and Turing: Crash Course History of Science #36”

  1. Well this wasn't a really history of computers, but I'm going to namedrop Konrad Zuse and among other things his Z3 from 1941 – the first working programmable, fully automatic digital computer.

  2. I'm really getting annoyed by these liberal tendencies that are starting to show up everywhere.

    Yes. I respect Turing, as a student of computer science I should.

    No. I disagree with his lifestyle choice.

    Why do I have to like homosexuality to like the man? There's no need to be condescending to those of us that disagree.

  3. Alan Turing became the Sweden lady who in fact created the 1st of the matrix and now we are in 5G in a 1 for 1 skynet simulation as Everyone's avatar can be quantified to the point in which predictive data analytics has enough information gathered that each person's behavior within a very small percentage point in essence creating an AI that is capable of precog
    Ignition protocols

  4. "We could do an entire episode on information theory" – yes please :-). In fact an entire short series just on the history of computing would be excellent. Meanwhile, I'm looking forward to Ted Nelson, Doug Engelbart, Vint Cerf and Tim B-L when we come back to this later.

  5. You mention the Church-Turing Hypothesis, but not Alonzo Church. The Church-Turing Hypothesis was sort-of about brains and computers but it was more about the question of whether the class of problems solvable by a Turing Machine were the same class of problems solvable by Church's Lambda Calculus and if either of those systems were limited, was there another class of system that was more powerful than them.

  6. Today, February 11th, it's the International Day of Women and Girls in Science! Let's celebrate the wonderful women that made possible the device on which I'm typing this message. (Great video as always!)

  7. <text id="comment"> I’m an information technology student </text>
    if (comment.like == false) {
        comment.foreColor = red;
    } else {
        comment.text = "thank you";
    }

  8. I love how Crash Course gives credit to individuals of oppressed groups that made vital contributions to science, but are widely overlooked because we are still doing a lot of learning and growing in other ways.

  9. No Hank, the Antikythera Mechanism may be the only example of its kind found, but it certainly was not the only one made. Take a look at the channel Clickspring, where Chris is reconstructing the mechanism and is using many of the techniques which were used to construct it, based on detailed examination. It’s a superb exercise in both craftsmanship and experimental archaeology. These techniques were well developed, not a lucky single shot in the dark. So there had to be a developed industry making articles using the same technology. Check it out, it’s beautiful YT production. And his voice could almost turn me!

  10. Computer Science is one of the few scientific disciplines where we didn't willfully waste half of the available human brain power by discouraging women from participating and look at what we achieved so quickly. Sure it was a bit of an accident that a job that seemed lesser and didn't involve getting dirty turned out to become one of the most important jobs in the world (computer programmer).

  11. Wow. And I'd expect Americans to magnify the role of American John von Neumann when it came to the early dev't of the computer.

  12. It makes me angry to hear that Turing was granted "pardon", he did nothing wrong, it was the english government that should apologize.

  13. The genius mind who pioneered modern computers and helped win the war against the Nazis was rewarded by having his achievements kept secret after the war and his life and mind destroyed by drugs to “treat” his homosexuality, while under house arrest for being gay.

    People are still subjected to attempts at gay conversion or repression today.
    So much intellectual energy and potential wasted by pointless homophobic distractions.

  14. This isn't really related, but does anyone else sometimes just drag a program window around their desktop for a minute and wonder about all of the many many people and moving pieces that had to happen in order to just let you do that?
    . . . I need to get out more.

  15. I went through this and recognized Turing and whenever he said Turing Test instantly I remembered there's a game named after it!

  16. After catching up all your episodes, here in time for the next one. First time watching within a day of airing.

    Love you guys..

  17. more facts and less politics this time. you guys are getting better, but a few of your facts were not quite right and there was still some politics in what you chose to focus on, but all in all a good job.

  18. so muricans refuse to call peeps by their proper names, butchering the pronunciation on purpose. Welcome to the world of Joes…

  19. In a job interview…

    Ada Lovelace : I am the world's first programmer
    Charles Babbage : and what programming language you know ?
    Ada Lovelace : … your machine's 😶

  20. currently finding everything AI super interesting but never really knew the history behind it so this is super fascinating! It’s sad though that one of the greatest minds behind this was so mistreated just because of his orientation. Imagine how much more potential growth we could've had if Turing hadn't been so mistreated 🙁

  21. I am grateful you did not repeat the lie about Turing being the first to break the Enigma code.
    You did a good job, covering the topic briefly but thoroughly and with good study of the topic 🙂 It is rare in the modern world to get such good topic analysis and present it so accessibly to all kinds of viewers 🙂
    Keep up the good job!

  22. So what have we learned – human advancement has been stifled because we can't have gays or black people being smart because they are black, gay or god forbid both.
    I'm really surprised we made it out of the trees some days. 😒

  23. Oh yes, only English speakers thought about computers, that seems logical. There was no guy in Germany, called Zuse, inventing the first computer and thinking of the first programming language. You're right.

  24. 20 episodes later and you still can't pronounce Leibniz. But you're getting there at least you didn't pronounce him Liibniz this time.

  25. This is the best (most important AND fun imo!!) crash course series since world history or philosophy. One of the coolest outputs of YouTube

  26. Fun fact, during the war there was a spy who wanted to join the British, but got rejected, so he pretended to join the Germans. He sent back to Germans false reports, publicly available information, and reports that would've been useful if only a few hours earlier. The British eventually took him in, and fed him false intelligence for him to use. He even faked a huge network of spies, inventing hundreds of spies on paper. The Germans then sent him money to maintain his network which went straight to the UK treasury. In fact, they even sent him an Enigma machine, which must've helped a lot in cracking the code.

  27. Can someone make or point me in the direction of a video that explains how a computer works? I want it in real layman's terms. If I was stuck on an island with all the elements and the means to mine and manipulate them, how do I make Windows 10?

  28. Alan Turing was voted this month (Feb 2019) by the British public via BBC show “Icons” as “The Greatest Icon of the 20th Century”.

  29. 4A. Conclusions:

    But… they are.

    Edit: wait, why are you not indexing your conclusions properly? There are multiple 4As!

  30. The greens are up to their old propaganda tricks again, it seems. Remember, Alan Turing WAS GAY. He committed A CRIME. He deserved what he got.

  31. 9:45 Well, what is said is not wrong, but it doesn't clarify a common mistake… Hopper didn't invent Cobol, she created Flow-matic, but she took part in a group sponsored by the USDoD with the aim to create a language for business, and that would be Cobol.

  32. "……based on a Polish computer." next subject. WTF? I'm a proud descendant of Brits, and I find this little trail-off hilariously lame; almost Monty-Python-esque (and the silly, over-rated"Turing Test" is an incredibly poor metric)

  33. I'm a tad bothered by the rushed commentary in this video describing Turing's abuse by the British government because of his homosexuality. I know this is a primarily science-covering video, but come on. So many of the great minds who made enormous contributions to our world in the name of science were not straight, and Turing is, tragically, the brightest example of homophobia within the academic world we have in a History of Science course. Two sentences just didn't feel right. Otherwise, Hank, I am LOVING this series, this coming from an English Literature and Foreign Languages student who has actively shunned most STEM topics. Crash Course has consistently been successful in loosening my disdain toward the STEM fields, and I am a better man for it. So thank you SO MUCH for educating me. <3

  34. CORRECTION!!!: John Von Neumann (possibly the smartest man to have ever lived) was not a physicist. He made contributions to physics but he was through and through a mathematician!!! :-p

  35. It should be noted that the Metropolis Algorithm, and in general, Monte Carlo simulations are more and more important in literally every single field of humanity these days. Wall Street (among others) employs rapidly growing fleets of MC competent employees for economic simulations and predictions, physics, agriculture, genetics, probabilistic risk assessment, any field in the world where statistics is useful and important (aka everywhere).

  36. for a nerd like me, it's nice to see how many people in the comments don't know a bit about computer history 🙂 imho you forgot some facts and you didn't talk about some of the pioneers (Konrad Zuse for example). Eventually … congrats for this video, really well done: you got a new subscriber from Italy

  37. Dear Crash Course-Team, thanks for shining a light on so many important historical figures, male and female. It makes all the difference. Trust me. It is so inspiring to me to see how people worked together to create new knowledge and drive progress forward. Thank you.

  38. To those asking why Konrad Zuse isn't even mentioned among so many other names: Just culture derived from WW2 propaganda. Zuse indeed created the very first digital and programmable computer alone in his parents' living room in 1938. The problem is that it was bombed twice by the Allies in nazi Berlin, and he didn't care much about who employed him and for what, as long as he could carry on with his inventions. Alan Turing (a more convenient hero) met him in 1947 in Goettingen, Germany, so things start to make sense then.
