The Internet and Computing: Crash Course History of Science #43

We’ve talked a
lot about advances in biotech. But none of those could have happened without
advances in computing. It’s time to get back to data and explore
the unlikely birth, strange life, and potential futures of the Internet. The theme of the history of computing is that
what we mean by “computing” keeps changing. With the invention of the transistor in 1947,
the computer started to shrink! And speed up! And change meaning yet again, becoming a ubiquitous
dimension of contemporary life—not to mention a totally normal thing to yell at. Hey Google… can you roll the intro? [long pause] Google: I’m not sure. [Intro Music Plays] In 1965, Electronics Magazine asked computer
scientist Gordon Moore to do something scientists are generally taught not to do: predict the
future. Moore guessed that, roughly every year, the
number of electronic switches that people could squeeze onto one computer chip would
double. This meant computer chips would continue to
become faster, more powerful, and cheaper—at an absolutely amazing rate. Which might have sounded suspiciously awesome
to readers. But Moore’s prediction came true! Although it took eighteen months for each
doubling, and, arguably, this was a self-fulfilling prophecy, since engineers actively worked
towards it. Moore went on to serve as CEO as Intel and
is now worth billions. His prediction is called “Moore’s law.” Think about what this means for manufacturers:
they keep competing to invent hot new machines that make their old ones obsolete. The same applies to methods of data storage.
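To get a feel for how quickly that doubling compounds, here is a rough back-of-the-envelope sketch in Python; the starting count is the oft-cited figure for Intel's first microprocessor, and everything else is purely illustrative.

    # Rough Moore's-law arithmetic: one doubling every eighteen months.
    transistors = 2_300                  # roughly Intel's 4004 chip, 1971
    for _ in range(32):                  # 32 doublings, about 48 years at 18 months each
        transistors *= 2
    print(f"After 48 years: about {transistors:,} transistors per chip")
    # Roughly 9.9 trillion. Even at a slower two-year doubling, the count
    # still reaches tens of billions by the 2010s: exponential growth piles up fast.

Whatever the exact constants, the shape of that curve is why each generation of chips makes the previous one look quaint.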
Today, engineers face big questions about the physical limit of Moore’s law. Even with new tricks here and there, just
how small and fast can conventional chips get? Currently, teams at different chip manufacturers
are working to create transistors at the nanometer scale. IBM made a whole computer that’s only one
millimeter by one millimeter wide and is about as fast as a computer from 1990. As computers became smaller and cheaper, they
moved from military bases to businesses in the 1960s and to schools and homes by the
late 1970s and 1980s. And computers changed these spaces. People got used to using them for certain
tasks. But computers were pretty intimidating. Manufacturers had to make them work better
with people. So in 1970 the Xerox Corporation founded the
Palo Alto Research Center—known as Xerox PARC. Here, researchers invented many features of
modern computing. In 1973, they came up with the Xerox Alto,
the first personal computer… But Xerox didn’t think there was a market
for computers in the home yet. Other Xerox PARC inventions include laser
printing, the important networking standard called Ethernet, and even the graphical user
interface or GUI—which included folders, icons, and windows. But Xerox didn’t capitalize on these inventions. You probably know who did. In the 1970s, two nerds who dropped out of
college started selling computers you were meant to use at home, for fun and—you know,
to do… stuff, whatever you wanted. In retrospect, that was the genius of the
Apple II, released in 1977. Along with decades of shrewd engineering and
business moves, fun made video game designer and meditation enthusiast Steve Jobs and engineer
Steve Wozniak into mega-billionaires. They had a commitment to computing for play,
not always just work. And they weren’t alone. In 1981, IBM started marketing the PC powered
by the DOS operating system, which they licensed from Microsoft—founded by Harvard dropout
Bill Gates in 1975. By 1989, Microsoft’s revenues reached one
billion dollars. You can find out more about college dropouts-turned-billionaires
elsewhere. For our purposes, note that some of the inventors
who influenced the future of computing were traditional corporate engineers like Gordon
Moore. But increasingly, they were people like the
Steves who didn’t focus on discoveries in computer science, but on design and marketing:
how to create new kinds of interactions with, and on, computers. Compare this to the birth of social media
in the early 2000s. So new social spaces emerged on computers. And connecting computers together allowed
for new communities to form—from Second Life to 4chan. For that, we have to once again thank U.S.
military research. Thought Bubble, plug us in. Back in the late 1950s, the U.S. was really
worried about Soviet technologies. So in 1958, the Secretary of Defense authorized a new initiative called the Advanced Research Projects Agency, or ARPA (later renamed DARPA). DARPA set about solving a glaring problem:
what happened if Soviet attacks cut U.S. telephone lines? How could information be moved around quickly,
even after a nuclear strike? A faster computer wouldn’t help if it was
blown to bits. What was needed was a network. So in part to defend against information loss
during a war—and in part to make researchers’ lives easier—DARPA funded the first true
network of computers, the Advanced Research Projects Agency Network, better known as ARPANET. People give different dates for the birthday
of the Internet, but two stand out. On September 2nd, 1969, ARPANET went online. It used the then-new technology of packet
switching, or sending data in small, independent, broken-up parts that can each find their own
fastest routes and be reassembled later. This is still the basis of our networks today!
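To make the packet idea concrete, here is a toy sketch in Python: no real networking, just the chop-up, scramble, and reassemble logic described above (the message text is made up).

    import random

    # Toy packet switching: split a message into numbered packets, let them
    # arrive in any order (as if each took its own route), then reassemble.
    message = "LO AND BEHOLD, THE ARPANET IS ONLINE"
    packets = [(seq, message[i:i + 8])
               for seq, i in enumerate(range(0, len(message), 8))]

    random.shuffle(packets)              # packets arrive out of order
    reassembled = "".join(chunk for _, chunk in sorted(packets))
    assert reassembled == message        # sequence numbers put it back together

Real protocols also have to cope with packets that go missing or arrive twice, but the sequence-number bookkeeping is the same basic idea.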
At first, ARPANET only linked a few universities. But it grew as researchers found that linking computers was useful for all sorts of reasons, nukes aside! And then, on January 1st, 1983, several computer
networks including ARPANET were joined together using a standard way of requesting and sharing
information: TCP/IP. This remains the backbone of the Internet
today. Meanwhile, French engineers created their
own computer network, connected through telephone lines, Minitel, back in 1978—five
years before TCP/IP! Minitel was retired in 2012. And the Soviets developed their own versions
of ARPANET. But after 1991, these joined the TCP/IP-driven
Internet, and the virtual world became both larger and smaller. The Internet in the 1980s was literally that:
a network interconnecting computers. It didn’t look like a new space yet. For that, we can thank British computer scientist
Sir Tim Berners-Lee, who invented the World Wide Web in 1990. Berners-Lee pulled together existing ideas,
like hypertext and the internet, and built the first web browser to create the beginnings
of the functional and useful web we know today. The Web had profound effects. It brought the Internet to millions of people—and
brought them into it, making them feel like they had a home “online,” a virtual place
to represent themselves, meet strangers all over the world, and troll educational video
shows! The Web also democratized the tools of knowledge
making. From World War Two until 1990, building computers
and using them to do work was largely the domain of elites. A short time later, we can trade software
on GitHub, freely share 3D printing templates on Thingiverse, and benefit from the collective
wisdom of Wikipedia. It’s as if the Internet now contains not
one but several Libraries of Alexandria. They’ve radically changed how we learn and
make knowledge. Just as scientific journals had once been
invented as printed objects, since 1990, they’ve moved online—though often behind steep paywalls. In fact, Russian philosopher Vladimir Odoevsky predicted way back in 1837—in The Year 4338—that our houses
would be connected by “magnetic telegraphs.” But this came true only one hundred and fifty
years later—not two and a half millennia! So what will happen in another hundred and
fifty years? Well, computing seems to be changing unpredictably. Not only because computers are still getting
faster, but because of at least three more fundamental shifts. One, scientists are experimenting with quantum computers, which work in a different way than “classical,” binary ones. Instead of bits that must be either zero or one, quantum bits, or qubits, can exist in a combination of both states at once. This is called superposition, and it has the
potential to make the computers of the future much faster than today’s. This could lead to major shifts in cryptography:
the current method of protecting our credit cards works because classical computers aren’t
strong enough to factor very large numbers quickly. But a quantum computer should be able to do this kind of math easily.
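To see why factoring is the sticking point, here is a deliberately naive Python sketch; the two primes are tiny stand-ins for the enormous ones used in real keys.

    import math

    def naive_factor(n):
        # Trial division: fine for small n, hopeless for numbers
        # hundreds of digits long.
        for p in range(2, math.isqrt(n) + 1):
            if n % p == 0:
                return p
        return None  # n is prime

    n = 104_729 * 1_299_709      # a small "RSA-style" product of two primes
    print(naive_factor(n))       # 104729, found almost instantly at this scale
    # Every extra digit multiplies the search space. For the numbers that
    # protect real credit cards, no known classical method finishes in a
    # reasonable time, while Shor's algorithm on a large, error-corrected
    # quantum computer could, in principle, factor them efficiently.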
To date, however, quantum computers are not yet finished technologies that engineers can improve, but epistemic objects: things that
scientists are still working to understand. So will quantum computing change everything? Or mostly remain a weird footnote to classical
computing? I don’t know… we’ll find out! Fundamental shift two: some researchers across
computing, history, and epistemology—the branch of philosophy that asks, what counts
as knowledge?—wonder if really really large amounts of data, called Big Data, will change
how we do science. One of the main jobs of being a scientist
has been to just collect data. But if Internet-enabled sensors of all kinds
are always transmitting back to databases, then maybe the work of science will shift
away from data collection, and even away from analysis—AI can crunch numbers—and into
asking questions about patterns that emerge from data, seemingly on their own. So instead of saying, I wonder if X is true
about the natural or social world, and then going out to observe or test, the scientist
of the future might wait for a computer to tell her, X seems true about the world, are
you interested in knowing more? This vision for using Big Data has been called
“hypothesis-free science,” and it would qualify as a new paradigm. But will it replace hypothesis-driven science?
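As a caricature of what that might look like, here is a toy Python sketch (statistics.correlation needs Python 3.10 or newer): it scans a pile of made-up sensor readings for any pair that happens to look correlated, with no hypothesis chosen in advance.

    import itertools, random, statistics

    random.seed(0)
    # Twenty fake "sensor feeds" of pure noise, 100 readings each.
    data = {f"sensor_{i}": [random.gauss(0, 1) for _ in range(100)]
            for i in range(20)}

    for a, b in itertools.combinations(data, 2):
        r = statistics.correlation(data[a], data[b])   # Pearson's r
        if abs(r) > 0.2:                               # arbitrary threshold
            print(f"{a} and {b} look related (r = {r:.2f}). Interested in knowing more?")

Because the data here are pure noise, anything the script flags is a coincidence, which is one reason that question is far from settled.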
Even if AI is mostly “weak,” meaning not like a human brain—but only, say, a sensor system that knows what temperature it is in
your house and how to adjust to the temp you want—once it’s very common, it could challenge
long-held assumptions about what thought is. In fact, many people have already entrusted
cognitive responsibilities such as knowing what time it is to AI scripts on computers
in their phones, watches, cars, and homes. Will human cognition feel different if we
keep giving AI more and more human stuff to take care of? How will society change? I don’t know… we’ll find out!!! And these are only some of the anxieties of
our hyper-connected world! We could do a whole episode on blockchain,
a list of time-stamped records which are linked using cryptography and (theoretically) resistant
to fraud, and the new social technologies it enables: like cryptocurrency, kinds of
money not backed by sovereign nations but by groups of co-invested strangers on the
Internet. Will blockchain change money, and fundamentally, trust in strangers? Or is it just another shift in cryptography? A fad? I don’t know… we’ll find out!
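To unpack that phrase, here is a minimal, illustrative sketch in Python of records linked by hashes; a real blockchain adds consensus rules, proof-of-work, and much more.

    import hashlib, json, time

    def make_block(records, prev_hash):
        # Each block stores its records, a timestamp, and the hash of the
        # previous block; its own hash then covers all three.
        block = {"timestamp": time.time(), "records": records, "prev_hash": prev_hash}
        serialized = json.dumps(block, sort_keys=True).encode()
        block["hash"] = hashlib.sha256(serialized).hexdigest()
        return block

    genesis = make_block(["first entry"], prev_hash="0" * 64)
    second = make_block(["alice pays bob 5"], prev_hash=genesis["hash"])
    # Tampering with genesis["records"] would change its hash, which would no
    # longer match second["prev_hash"]; that mismatch is what makes the chain
    # (theoretically) resistant to fraud.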
Let’s head back to the physical world to look at the cost of these developments. One feature they have in common is that they require
ever greater amounts of electricity and rare-earth metals. And older computers become e-waste, toxic
trash recycled by impoverished people at a cost to their own bodies. Even as computers become so small they’re
invisible, so common they feel like part of our own brains, and so fast that they may
fundamentally change critical social structures like banking and buying animal hoodies on
Etsy… they also contribute to dangerous shifts of natural resources. Next time—we’ll wrap up our story of the
life sciences by asking questions about the future of medicine and the human brain that
remain unanswered as of early 2019. History isn’t finished! Crash Course History of Science is filmed
in the Cheryl C. Kinney Studio in Missoula, MT and it’s made with the help of all these
nice people. And our animation team is Thought Cafe. Crash Course is a Complexly Production. If you want to keep imagining the world complexly
with us you can check out some of our other channels like Animal Wonders, The Art Assignment, and
SciShow Psych. And if you would like to keep Crash Course free forever for everyone, you can support the series on Patreon, a crowdfunding platform
that allows you to support the content you love. Thank you to all our patrons for making Crash
Course possible with your continued support.

100 thoughts on “The Internet and Computing: Crash Course History of Science #43”

  1. just because more data is available… still, it is not going to guess a solution…ffs. I have simulated these complex negotiations in real time in a simulation you already know of. talk about not knowing…

  2. Crazy how much life changes within a matter of years. I'm so excited and yet terrified to see the future play out

  3. We went from mainframe to personal pc and client/server. We are at cloud computing. How far are we from moving to rather dumb consoles that access servers or even mainframes again for everything requiring any real resources?

    We are already almost dependent on our smartphone. What if the smartphone docks/syncs to the TV/VR/your eyes and is the access client for all of the above? Its only job is to manage the data coming in from servers doing all the real computing and presenting it to us through another screen. You take it to work, and it immediately connects to the work servers and you start your work day. It is the key to your car, your rail pass, and your flight ticket. Your bank needs your card/code and your phone to be present to authenticate you. It is your one device, everything else is just so you interact with it better. As long as you have it, you have the new gaming console, a gaming pc, a video editing machine, a smart home, a personal doctor, a financial advisor, an accountant, a secretary, and a chauffeur.

    We are already partway there. I have four two-factor security apps and a password manager, one of the two-factor apps is only for work and I would be unable to do any work besides email or bare bones documentation without it. My dad unlocks and starts his car with his smartphone, and it is how he gets data from his insulin pump. My phone is my best tool to move through an airport or the subway with any speed.

    I've heard younger friends/coworkers talking about losing their phone like being disconnected from reality. What if it could be even more than it is now?

  4. I heard "next time we'll wrap up…" and thought "but this series has been sooooo short…" Then I look down and see "#43" and think "wow… that happened fast! I wonder what we will get next??"

  5. I love that they point out that one of the key things about science is that "We don't know, we'll find out."

  6. Have I given over responsibility for knowing what time it is to AI? Not in the least. I still rely on my own ability to grade contextual clues about the time. AI clocks hold little more authority in my view than do wristwatches (a little, but not that much). The reason is that AI – and wristwatches – may be many orders of magnitude better than me at measuring time, but they don't care what time it is. I do.

  7. The scientific article paywalls are confusing. Like in physics the same paper will be behind a paywall on a journal's website and totally free on the arxiv.

  8. Humans are getting dumber though. And technology is part of the problem. Read “Is Google Making Us Stupid”, one of my favorite articles, which goes through the history of technology.

  9. In summary… a whole bunch of smart people did a whole bunch of smart things, and now i can learn about it while sitting on the toilet

  10. So excited about you doing an episode on blockchain and cryptos. I've been interested in this topic for years, and there are so many misconceptions about it. It's great that you'll make an educational video on this matter.

  11. Where's the chemistry at tho? I don't remember the quintessential to modern civilization's development thing that's got us away from being peasants called the periodic table of elements being mentioned.

  12. Hey, can someone tell me where I can get one of them "magnetic telegraphs"? Home Depot doesn't seem to have any, neither does Staples.

  13. I so wanna code on a quantum computer… People always talk about their cryptography capabilities, but I really wonder how one would write a program on it and run simulations…

  14. When this series "ends," meaning you have summarized all past discoveries, I do hope you continue to check in and update us on the historic STEM advancements that still happen each year.

  15. @ 0:25 talking about the invention of transistors while showing a board with all sorts of components, but almost no transistors whatsoever (I see resistors and capacitors for the most part).

  16. Do I get useful information from watching videos on YouTube?.. I don't know… We'll find out…

  17. Your point about quantum computers and cryptography is a bit misleading. The difference between classical and quantum computers in that context is not just one of power and speed. There is some stuff that quantum computers can do that simply has no equivalent in classical computing.

  18. Hypothesis-free science is terrifying. If you have enough data and look for enough connections, you will find connections, purely by chance.

  19. One drawback of the blinding rate of advancement in computer science is that, since everyone relies on computers now, we are constantly making large chunks of our collective skillsets become obsolete and be in constant need of replacement. In the professions, for example, this often means there is a divide between young people with advanced computer skills (they haven't been around long enough to fall behind yet) but not enough experience to apply them properly, and older people with rich experience but inadequate computer skills (and often managerial responsibilities that keep them from effectively passing on knowledge to the next generation).

    The chaos of a constantly-changing environment undermines the theoretically possible efficiencies of that environment.

  20. In the book Outliers by Malcolm Gladwell: Bill Gates lived a bus ride from the University of state Michigan, which was the first university to own computers. He went to a private school that had money to spend on computers. Bill spent hours writing code for accounting.

  21. PLEASE DO MORE ANATOMY AND PHYSIOLOGY VIDEOS 💓💓💓💓 any biology really
    sincerely a nursing student

  22. Xerox also brought together existing technologies.
    Stanford Research Institute (SRI) which developed "windows, hypertext, graphics, efficient navigation and command input, video conferencing, the computer mouse, word processing, dynamic file linking, revision control, and a collaborative real-time editor (collaborative work)" {Wikipedia}. Douglas Engelbart's "Mother of All Demos" (available on youTube) demonstrated this technology. Several people who worked at SRI left to work for… Xerox PARC.
    Ethernet was "inspired" by ALOHAnet developed for DARPA, which Robert Metcalfe had studied and made part of his doctoral thesis at Harvard while working at… Xerox PARC.
    Laser Printers were a modified version of the electrophotography printer which the inventor Chester Carlson sold to the Haloid Company, which later became… Xerox.

  23. Lots of fact-checking skipped here. E.g. nuclear motivation for ARPANET is controversial and probably not true. I wish SciShow had a way to know when they get it wrong. ☹️

  24. 4:36, connecting the U.S. government's investment in information exchange to 4chan (which used to idolize Ron Paul to large degree) isn't necessarily ironic. The spirit of 1776 lives on in the internet and innovative nerds.

  25. Not only do I love the phrase itself, I love how Hank's eyes light up and how he says it in a different, dramatic way every time.

    The whispered "I don't know" was the best!

  26. Funnily enough, the Library of Alexandria is now trying to archive the internet (or at least as much of it as they can).

  27. saddened that there was no mention of how many enslaved people help produce technology today. more slaves now than any time in history, worlds grown 5 billion ppl in 60 yrs tho

  28. next time you binge watch cat videos on youtube for 8 hours straight blame this guy.
    Me: who watches cat videos anymore let alone who watched them for 8 hours
    Hank: ^slowly raises hand^
    Me: wow ok then

  29. The internet makes playing on Minecraft servers possible. I have videos about Minecraft servers. Thanks internet, for giving me Minecraft servers, screen recording software, video editing software, and a YouTube channel to upload my videos onto!

    (Get the hints? They are leading to WATCH MY VIDEOS!)

    Also I love the internet and crashcourse

  30. “The internet is for porn! The internet is for porn!
    Why do you think the net was born?
    Porn! Porn! Porn!” – Avenue Q
