  • We've talked a lot about advances in biotech.

  • But none of those could have happened without advances in computing.

  • It's time to get back to data and explore the unlikely birth, strange life, and potential

  • futures of the Internet.

  • The theme of the history of computing is that what we mean by “computing” keeps changing.

  • With the invention of the transistor in 1947, the computer started to shrink!

  • And speed up!

  • And change meaning yet again, becoming a ubiquitous dimension of contemporary life, not to mention

  • a totally normal thing to yell at.

  • Hey Google... can you roll the intro?

  • [long pause]

  • Google: I'm not sure.

  • [Intro Music Plays]

  • In 1965, Electronics Magazine asked computer scientist Gordon Moore to do something scientists

  • are generally taught not to do: predict the future.

  • Moore guessed that, roughly every year, the number of electronic switches that people

  • could squeeze onto one computer chip would double.

  • This meant computer chips would continue to become faster, more powerful, and cheaper, at

  • an absolutely amazing rate.

  • Which might have sounded suspiciously awesome to readers.

  • But Moore's prediction came true!

  • Although it took eighteen months for each doubling, and, arguably, this was a self-fulfilling

  • prophecy, since engineers actively worked towards it.

  • Moore went on to serve as CEO of Intel and is now worth billions.

  • His prediction is called “Moore's law.”
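
As a back-of-the-envelope illustration of what that doubling implies, here is a minimal Python sketch. The 1971 Intel 4004 baseline of roughly 2,300 transistors is a real data point, but the smooth 18-month doubling curve is a simplification of a much messier historical trend.

```python
# Moore's law as simple exponential growth (a simplification: the real
# trend was lumpier, and the doubling period itself drifted over time).

def transistors_per_chip(year, base_year=1971, base_count=2300, doubling_years=1.5):
    """Estimate transistors per chip, doubling every 18 months."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1980, 1990, 2000, 2010, 2020):
    print(f"{year}: ~{transistors_per_chip(year):,.0f} transistors")
```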

  • Think about what this means for manufacturers: they keep competing to invent hot new machines

  • that make their old ones obsolete.

  • The same applies to methods of data storage. Today, engineers face big questions about

  • the physical limit of Moore's law.

  • Even with new tricks here and there, just how small and fast can conventional chips

  • get?

  • Currently, teams at different chip manufacturers are working to create transistors at the nanometer

  • scale.

  • IBM made a whole computer that's only one millimeter by one millimeter and is about

  • as fast as a computer from 1990.

  • As computers became smaller and cheaper, they moved from military bases to businesses in

  • the 1960s and to schools and homes by the late 1970s and 1980s.

  • And computers changed these spaces.

  • People got used to using them for certain tasks.

  • But computers were pretty intimidating.

  • Manufacturers had to make them work better with people.

  • So in 1970 the Xerox Corporation founded the Palo Alto Research Center, known as Xerox

  • PARC.

  • Here, researchers invented many features of modern computing.

  • In 1973, they came up with the Xerox Alto, the first personal computer.

  • But Xerox didn't think there was a market for computers in the home yet.

  • Other Xerox PARC inventions include laser printing, the important networking standard

  • called Ethernet, and even the graphical user interface or GUI, which included folders,

  • icons, and windows.

  • But Xerox didn't capitalize on these inventions.

  • You probably know who did.

  • In the 1970s, two nerds who dropped out of college started selling computers you were

  • meant to use at home, for fun and, you know, to do “stuff,” whatever you wanted.

  • In retrospect, that was the genius of the Apple II, released in 1977.

  • Along with decades of shrewd engineering and business moves, fun made video game designer

  • and meditation enthusiast Steve Jobs and engineer Steve Wozniak into mega-billionaires.

  • They had a commitment to computing for play, not always just work.

  • And they weren't alone.

  • In 1981, IBM started marketing the PC, powered by the DOS operating system, which they licensed

  • from Microsoft, founded by Harvard dropout Bill Gates in 1975.

  • By 1989, Microsoft's revenues reached one billion dollars.

  • You can find out more about college dropouts-turned-billionaires elsewhere.

  • For our purposes, note that some of the inventors who influenced the future of computing were

  • traditional corporate engineers like Gordon Moore.

  • But increasingly, they were people like the Steves who didn't focus on discoveries in

  • computer science, but on design and marketing: how to create new kinds of interactions with,

  • and on, computers.

  • Compare this to the birth of social media in the early 2000s.

  • So new social spaces emerged on computers.

  • And connecting computers together allowed for new communities to form, from Second

  • Life to 4chan.

  • For that, we have to once again thank U.S. military research.

  • Thought Bubble, plug us in.

  • Back in the late 1950s, the U.S. was really worried about Soviet technologies.

  • So in 1958, the Secretary of Defense authorized a new initiative, the Advanced Research

  • Projects Agency, later renamed the Defense Advanced Research Projects Agency, or DARPA.

  • DARPA set about solving a glaring problem: what happened if Soviet attacks cut U.S. telephone

  • lines?

  • How could information be moved around quickly, even after a nuclear strike?

  • A faster computer wouldn't help if it was blown to bits.

  • What was needed was a network.

  • So in part to defend against information loss during a war, and in part to make researchers'

  • lives easier, DARPA funded the first true network of computers, the Advanced Research

  • Projects Agency Network, better known as ARPANET.

  • People give different dates for the birthday of the Internet, but two stand out.

  • On September 2nd, 1969, ARPANET went online.

  • It used the then-new technology of packet switching, or sending data in small, independent,

  • broken-up parts that can each find their own fastest routes and be reassembled later.

  • This is still the basis of our networks today!
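
Here is a toy Python model of that idea, not real networking code: shuffling the list stands in for packets taking different routes, and the sequence numbers are what real protocols also use to put the pieces back together.

```python
import random

# Toy packet switching: split a message into small numbered packets,
# scramble their arrival order, and reassemble by sequence number.

def to_packets(message, size=8):
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    return "".join(chunk for _, chunk in sorted(packets))

packets = to_packets("ARPANET went online on September 2nd, 1969.")
random.shuffle(packets)     # packets take different routes, arrive out of order
print(reassemble(packets))  # ...but the receiver restores the original message
```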

  • At first, ARPANET only linked a few universities.

  • But it grew as researchers found that linking computers was useful for all sorts of reasons,

  • nukes aside!

  • And then, on January 1st, 1983, several computer networks including ARPANET were joined together

  • using a standard way of requesting and sharing information: TCP/IP.

  • This remains the backbone of the Internet today.
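
As a small modern illustration of what a shared protocol buys you, the Python sketch below opens a raw TCP connection and speaks HTTP over it. Here example.com is just a convenient public test host, and real applications would normally use a higher-level library rather than raw sockets.

```python
import socket

# Open a TCP connection and send a minimal HTTP request by hand, just to
# show that "the Internet" is ultimately machines agreeing on protocols.

with socket.create_connection(("example.com", 80), timeout=5) as sock:
    sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.decode(errors="replace")[:300])  # start of the server's reply
```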

  • Meanwhile, French engineers created their own computer network, Minitel, connected through

  • telephone lines back in 1978, five years before TCP/IP!

  • Minitel was retired in 2012.

  • And the Soviets developed their own versions of ARPANET.

  • But after 1991, these joined the TCP/IP-driven Internet, and the virtual world became both

  • larger and smaller.

  • The Internet in the 1980s was literally that: a network interconnecting computers.

  • It didn't look like a new space yet.

  • For that, we can thank British computer scientist Sir Tim Berners-Lee, who invented the World

  • Wide Web in 1990.

  • Berners-Lee pulled together existing ideas, like hypertext and the internet, and built

  • the first web browser to create the beginnings of the functional and useful web we know today.

  • The Web had profound effects.

  • It brought the Internet to millions of people, and brought them into it, making them feel like

  • they had a home “online,” a virtual place to represent themselves, meet strangers all

  • over the world, and troll educational video shows!

  • The Web also democratized the tools of knowledge making.

  • From World War Two until 1990, building computers and using them to do work was largely the

  • domain of elites.

  • A short time later, we can trade software on GitHub, freely share 3D printing templates

  • on Thingiverse, and benefit from the collective wisdom of Wikipedia.

  • It's as if the Internet now contains not one but several Libraries of Alexandria.

  • They've radically changed how we learn and make knowledge.

  • Scientific journals, which were once invented as printed objects, have since 1990

  • moved online, though often behind steep paywalls.

  • In fact, Russian philosopher Vladimir Odoevsky predicted way back in

  • 1837, in his novel The Year 4338, that our houses would be connected by “magnetic telegraphs.”

  • But this came true only one hundred and fifty years later, not two and a half millennia!

  • So what will happen in another hundred and fifty years?

  • Well, computing seems to be changing unpredictably.

  • Not only because computers are still getting faster, but because of at least three more

  • fundamental shifts.

  • One, scientists are experimenting with quantum computers, which work in a different way than

  • “classical,” binary ones.

  • In a quantum computer, bits aren't limited to being zero or one: they can exist in a combination of both states at once.

  • This is called superposition, and it has the potential to make the computers of the future

  • much faster than today's.

  • This could lead to major shifts in cryptography: the current method of protecting our credit

  • cards works because classical computers aren't strong enough to factor very large numbers

  • quickly.

  • But a quantum computer should be able to do this kind of math easily.
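
To see the asymmetry, here is a minimal Python sketch of trial division, the naive classical approach. Real RSA moduli are hundreds of digits long, far beyond what this loop, or any known classical algorithm, can factor in reasonable time, while Shor's algorithm on a large quantum computer could.

```python
# Naive classical factoring: check divisors up to the square root of n.
# The work grows so fast with the size of n that factoring the ~600-digit
# numbers behind real RSA keys is classically infeasible.

def trial_division(n):
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(15))                 # [3, 5] -- instant
print(trial_division(1000003 * 1000033))  # two 7-digit primes: ~a million steps
```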

  • To date, however, quantum computers are not yet finished technologies that engineers can

  • improve, but epistemic objects: things that scientists are still working to understand.

  • So will quantum computing change everything?

  • Or mostly remain a weird footnote to classical computing?

  • I don't knowwe'll find out!

  • Fundamental shift two: some researchers across computing, history, and epistemology, the

  • branch of philosophy that asks what counts as knowledge, wonder if really, really large

  • amounts of data, called Big Data, will change how we do science.

  • One of the main jobs of being a scientist has been to just collect data.

  • But if Internet-enabled sensors of all kinds are always transmitting back to databases,

  • then maybe the work of science will shift away from data collection, and even away from

  • analysis (AI can crunch numbers) and into asking questions about patterns that emerge

  • from data, seemingly on their own.

  • So instead of saying, I wonder if X is true about the natural or social world, and then

  • going out to observe or test, the scientist of the future might wait for a computer to

  • tell her, X seems true about the world, are you interested in knowing more?

  • This vision for using Big Data has been called “hypothesis-free science,” and it would

  • qualify as a new paradigm.
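
A cartoon of that workflow might look like the Python sketch below; the readings and the two-standard-deviation threshold are invented stand-ins for a real sensor pipeline.

```python
import statistics

# "Hypothesis-free" in miniature: nobody asked about reading 4 in
# advance; the program surfaces it as a pattern worth a hypothesis.

readings = [20.1, 19.8, 20.3, 20.0, 27.9, 20.2, 19.9, 20.1]

mean = statistics.mean(readings)
spread = statistics.stdev(readings)

for i, value in enumerate(readings):
    if abs(value - mean) > 2 * spread:
        print(f"Reading {i} ({value}) looks unusual -- interested in knowing more?")
```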

  • But will it replace hypothesis-driven science?

  • Even if AI is mostly “weak,” meaning not like a human brain, but only, say, a sensor

  • system that knows what temperature it is in your house and how to adjust to the temp you

  • want, once it's very common, it could challenge long-held assumptions about what thought is.
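
That kind of “weak” AI can be as simple as the Python sketch below; the function name and thresholds are invented for illustration, not taken from any real thermostat.

```python
# A thermostat-style "weak AI": one sensed value, one decision rule.
# Thresholds and names here are illustrative assumptions.

def thermostat(current_temp, target_temp, tolerance=0.5):
    if current_temp < target_temp - tolerance:
        return "heat"
    if current_temp > target_temp + tolerance:
        return "cool"
    return "idle"

for reading in (17.0, 20.0, 23.5):
    print(f"{reading} -> {thermostat(reading, target_temp=20.0)}")
```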

  • In fact, many people have already entrusted cognitive responsibilities such as knowing

  • what time it is to AI scripts on computers in their phones, watches, cars, and homes.

  • Will human cognition feel different if we keep giving AI more and more human stuff to

  • take care of?

  • How will society change?

  • I don't knowwe'll find out!!!

  • And these are only some of the anxieties of our hyper-connected world!

  • We could do a whole episode on blockchain, a list of time-stamped records which are linked

  • using cryptography and (theoretically) resistant to fraud, and the new social technologies

  • it enables: like cryptocurrency, kinds of money not backed by sovereign nations but

  • by groups of co-invested strangers on the Internet.
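
For the curious, here is a minimal Python sketch of that structure: time-stamped records, each linked to the previous one by a cryptographic hash. It leaves out everything that makes real blockchains decentralized, such as mining, consensus, and peer-to-peer networking.

```python
import hashlib, json, time

# Each block stores a timestamp, some data, and the previous block's
# hash, so tampering with any record invalidates every block after it.

def make_block(data, prev_hash):
    body = {"time": time.time(), "data": data, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("Alice pays Bob 5", chain[-1]["hash"]))
chain.append(make_block("Bob pays Carol 2", chain[-1]["hash"]))

# Every block must point at the previous block's actual hash.
print("chain valid:",
      all(b["prev"] == a["hash"] for a, b in zip(chain, chain[1:])))
```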

  • Will blockchain change money, and fundamentally, trust in strangers?

  • Or is it just another shift in cryptography?

  • A fad?

  • I don't know... we'll find out!

  • Let's head back to the physical world to look at the cost of these developments.

  • One feature they have in common is that they require ever greater amounts of electricity and rare-earth

  • metals.

  • And older computers become e-waste: toxic trash often recycled by impoverished people

  • at great cost to their own bodies.

  • Even as computers become so small they're invisible, so common they feel like part of

  • our own brains, and so fast that they may fundamentally change critical social structures

  • like banking and buying animal hoodies on Etsy, they also contribute to dangerous

  • shifts of natural resources.

  • Next time, we'll wrap up our story of the life sciences by asking questions about the

  • future of medicine and the human brain that remain unanswered as of early 2019.

  • History isn't finished!

  • Crash Course History of Science is filmed in the Cheryl C. Kinney Studio in Missoula,

  • MT and it's made with the help of all these nice people.

  • And our animation team is Thought Cafe.

  • Crash Course is a Complexly Production.

  • If you want to keep imagining the world complexly with us you can check out some of our other

  • channels

  • like Animal Wonders, The Art Assignment, and SciShow Psych.

  • And if you would like to keep Crash Course free forever for everyone, you can support

  • the series on Patreon, a crowd funding platform that allows you to support the content you

  • love.

  • Thank you to all our patrons for making Crash Course possible with your continued support.
