
THE COMPUTER BOOK

FROM THE ABACUS TO ARTIFICIAL INTELLIGENCE, 250 MILESTONES IN THE HISTORY OF COMPUTER SCIENCE

Simson L. Garfinkel and Rachel H. Grunspan

STERLING and the distinctive Sterling logo are registered trademarks of Sterling Publishing Co., Inc.

Text © 2018 Techzpah LLC

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means (including electronic, mechanical, photocopying, recording, or otherwise) without prior written permission from the publisher.

All trademarks are the property of their respective owners, are used for editorial purposes only, and the publisher makes no claim of ownership and shall acquire no right, title or interest in such trademarks by virtue of this publication.

ISBN 978-1-4549-2622-1

For information about custom editions, special sales, and premium and corporate purchases, please contact Sterling Special Sales at 800-805-5489 or [email protected].

sterlingpublishing.com

Photo Credits - see page 528

Contents

Introduction

Acknowledgments

c. 2500 BCE Sumerian Abacus

c. 700 BCE Scytale

c. 150 BCE Antikythera Mechanism

c. 60 Programmable Robot

c. 850 On Deciphering Cryptographic Messages

c. 1470 Cipher Disk

1613 First Recorded Use of the Word Computer

1621 Slide Rule

1703 Binary Arithmetic

1758 Human Computers Predict Halley’s Comet

1770 The “Mechanical Turk”

1792 Optical Telegraph

1801 The Jacquard Loom

1822 The Difference Engine

1836 Electrical Telegraph

1843 Ada Lovelace Writes a Computer Program

1843 Fax Machine Patented

1843 Edgar Allan Poe’s “The Gold-Bug”

1851 Thomas Arithmometer

1854 Boolean Algebra

1864 First Electromagnetic Spam Message

1874 Baudot Code

1874 Semiconductor Diode

1890 Tabulating the US Census

1891 Strowger Step-by-Step Switch

1914 Floating-Point Numbers

1917 Vernam Cipher

1920 Rossum’s Universal Robots

1927 Metropolis

1927 First LED

1928 Electronic Speech Synthesis

1931 Differential Analyzer

1936 Church-Turing Thesis

1941 Z3 Computer

1942 Atanasoff-Berry Computer

1942 Isaac Asimov’s Three Laws of Robotics

1943 ENIAC

1943 Colossus

1944 Delay Line Memory

1944 Binary-Coded Decimal

1945 “As We May Think”

1945 EDVAC First Draft Report

1946 Trackball

1946 Williams Tube

1947 Actual Bug Found

1947 Silicon Transistor

1948 The Bit

1948 Curta Calculator

1948 Manchester SSEM

1949 Whirlwind

1950 Error-Correcting Codes

1951 The Turing Test

1951 Magnetic Tape Used for Computers

1951 Core Memory

1951 Microprogramming

1952 Computer Speech Recognition

1953 First Transistorized Computer

1955 Artificial Intelligence Coined

1955 Computer Proves Mathematical Theorem

1956 First Disk Storage Unit

1956 The Byte

1956 Robby the Robot

1957 FORTRAN

1957 First Digital Image

1958 The Bell 101 Modem

1958 SAGE Computer Operational

1959 IBM 1401

1959 PDP-1

1959 Quicksort

1959 Airline Reservation System

1960 COBOL Computer Language

1960 Recommended Standard 232

1961 ANITA Electronic Calculator

1961 Unimate: First Mass-Produced Robot

1961 Time-Sharing

1962 Spacewar!

1962 Virtual Memory

1962 Digital Long Distance

1963 Sketchpad

1963 ASCII

1964 RAND Tablet

1964 Teletype Model 33 ASR

1964 IBM System/360

1964 BASIC Computer Language

1965 First Liquid-Crystal Display

1965 Fiber Optics

1965 DENDRAL

1965 ELIZA

1965 Touchscreen

1966 Star Trek Premieres

1966 Dynamic RAM

1967 Object-Oriented Programming

1967 First Cash Machine

1967 Head-Mounted Display

1967 Programming for Children

1967 The Mouse

1968 Carterfone Decision

1968 Software Engineering

1968 HAL 9000 Computer

1968 First Spacecraft Guided by Computer

1968 Cyberspace Coined—and Re-Coined

1968 Mother of All Demos

1968 Dot Matrix Printer

1968 Interface Message Processor (IMP)

1969 ARPANET/Internet

1969 Digital Imaging

1969 Network Working Group Request for Comments: 1

1969 Utility Computing

1969 Perceptrons

1969 UNIX

1970 Fair Credit Reporting Act

1970 Relational Database

1970 Floppy Disk

1971 Laser Printer

1971 NP-Completeness

1971 @Mail

1971 First Microprocessor

1971 First Wireless Network

1972 C Programming Language

1972 Cray Research

1972 Game of Life

1972 HP-35 Calculator

1972 Pong

1973 First Cell Phone Call

1973 Xerox Alto

1974 Data Encryption Standard

1974 First Personal Computer

1975 Adventure

1975 The Shockwave Rider

1975 AI Medical Diagnosis

1975 BYTE Magazine

1975 Homebrew Computer Club

1975 The Mythical Man-Month

1976 Public Key Cryptography

1976 Tandem NonStop

1976 Dr. Dobb’s Journal

1977 RSA Encryption

1977 Apple II

1978 First Internet Spam Message

1978 Minitel

1979 Secret Sharing

1979 VisiCalc

1980 Sinclair ZX80

1980 Flash Memory

1980 RISC

1980 Commercially Available Ethernet

1980 Usenet

1981 IBM PC

1981 Simple Mail Transfer Protocol

1981 Japan’s Fifth Generation Computer Systems

1982 AutoCAD

1982 First Commercial UNIX Workstation

1982 PostScript

1982 Microsoft and the Clones

1982 First CGI Sequence in Feature Film

1982 National Geographic Moves the Pyramids

1982 Secure Multi-Party Computation

1982 TRON

1982 Home Computer Named Machine of the Year

1983 The Qubit

1983 WarGames

1983 3-D Printing

1983 Computerization of the Local Telephone Network

1983 First Laptop

1983 MIDI Computer Music Interface

1983 Microsoft Word

1983 Nintendo Entertainment System

1983 Domain Name System

1983 IPv4 Flag Day

1984 Text-to-Speech

1984 Macintosh

1984 VPL Research, Inc.

1984 Quantum Cryptography

1984 Telebit Modems Break 9600 bps

1984 Verilog

1985 Connection Machine

1985 First Computer-Generated TV Host

1985 Zero-Knowledge Proofs

1985 FCC Approves Unlicensed Spread Spectrum

1985 NSFNET

1985 Desktop Publishing

1985 Field-Programmable Gate Array

1985 GNU Manifesto

1985 AFIS Stops a Serial Killer

1986 Software Bug Fatalities

1986 Pixar

1987 Digital Video Editing

1987 GIF

1988 MPEG

1988 CD-ROM

1988 Morris Worm

1989 World Wide Web

1989 SimCity

1989 ISP Provides Internet Access to the Public

1990 GPS Is Operational

1990 Digital Money

1991 Pretty Good Privacy (PGP)

1991 Computers at Risk

1991 Linux Kernel

1992 Boston Dynamics Founded

1992 JPEG

1992 First Mass-Market Web Browser

1992 Unicode

1993 Apple Newton

1994 First Banner Ad

1994 RSA-129 Cracked

1995 DVD

1995 E-Commerce

1995 AltaVista Web Search Engine

1995 Gartner Hype Cycle

1996 Universal Serial Bus (USB)

1997 Computer Is World Chess Champion

1997 PalmPilot

1997 E Ink

1998 Diamond Rio MP3 Player

1998 Google

1999 Collaborative Software Development

1999 Blog Is Coined

1999 Napster

2000 USB Flash Drive

2001 Wikipedia

2001 iTunes

2001 Advanced Encryption Standard

2001 Quantum Computer Factors “15”

2002 Home-Cleaning Robot

2003 CAPTCHA

2004 Product Tracking

2004 Facebook

2004 First International Meeting on Synthetic Biology

2005 Video Game Enables Research into Real-World Pandemics

2006 Hadoop Makes Big Data Possible

2006 Differential Privacy

2007 iPhone

2008 Bitcoin

2010 Air Force Builds Supercomputer with Gaming Consoles

2010 Cyber Weapons

2011 Smart Homes

2011 Watson Wins Jeopardy!

2011 World IPv6 Day

2011 Social Media Enables the Arab Spring

2012 DNA Data Storage

2013 Algorithm Influences Prison Sentence

2013 Subscription Software

2014 Data Breaches

2014 Over-the-Air Vehicle Software Updates

2015 Google Releases TensorFlow

2016 Augmented Reality Goes Mainstream

2016 Computer Beats Master at Go

~2050 Artificial General Intelligence (AGI)

~9999 The Limits of Computation?

Notes and Further Reading

Photo Credits

Introduction

The evolution of the computer likely began with the human desire to comprehend and manipulate the environment. The earliest humans recognized the phenomenon of quantity and used their fingers to count and act upon material items in their world. Simple methods such as these eventually gave way to the creation of proxy devices such as the abacus, which enabled action on higher quantities of items, and wax tablets, on which pressed symbols enabled information storage. Continued progress depended on harnessing and controlling the power of the natural world—steam, electricity, light, and finally the amazing potential of the quantum world. Over time, our new devices increased our ability to save and find what we now call data, to communicate over distances, and to create information products assembled from countless billions of elements, all transformed into a uniform digital format.

These functions are the essence of computation: the ability to augment and amplify what we can do with our minds, extending our impact to levels of superhuman reach and capacity.

These superhuman capabilities that most of us now take for granted were a long time coming, and it is only in recent years that access to them has been democratized and scaled globally. A hundred years ago, the instantaneous communication afforded by telegraph and long-distance telephony was available only to governments, large corporations, and wealthy individuals. Today, the ability to send international, instantaneous messages such as email is essentially free to the majority of the world's population.

In this book, we recount a series of connected stories of how this change happened, selecting what we see as the seminal events in the history of computing. The development of computing is in large part the story of technology, both because no invention happens in isolation, and because technology and computing are inextricably linked; fundamental technologies have allowed people to create complex computing devices, which in turn have driven the creation of increasingly sophisticated technologies.

The same sort of feedback loop has accelerated other related areas, such as the mathematics of cryptography and the development of high-speed communications systems. For example, the development of public key cryptography in the 1970s provided the mathematical basis for sending credit card numbers securely over the internet in the 1990s. This incentivized many companies to invest money to build websites and e-commerce systems, which in turn provided the financial capital for laying high-speed fiber optic networks and researching the technology necessary to build increasingly faster microprocessors.
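To make the mathematics concrete, here is a toy sketch of the public-key idea in Python, using textbook RSA with deliberately tiny primes of our own choosing; real systems use keys hundreds of digits long plus padding schemes, so this is an illustration of the principle, not a secure implementation.

    # Toy, textbook RSA with tiny primes: illustrative only, never secure.
    p, q = 61, 53                # two small secret primes (real keys are enormous)
    n = p * q                    # public modulus, 3233
    phi = (p - 1) * (q - 1)      # 3120
    e = 17                       # public exponent, coprime with phi
    d = pow(e, -1, phi)          # private exponent; requires Python 3.8+

    message = 1234               # stand-in for a card number; must be < n
    ciphertext = pow(message, e, n)     # anyone may encrypt with public (e, n)
    recovered = pow(ciphertext, d, n)   # only the holder of d can decrypt

    assert recovered == message

The asymmetry is the point: the encryption key can be published openly, which is what let 1990s web browsers protect card numbers sent to merchants they had never contacted before.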

In this collection of essays, we see the history of computing as a series of overlapping technology waves, including:

Human computation. More than people who were simply facile at math, the earliest "computers" were humans who performed repeated calculations for days, weeks, or months at a time. The first human computers successfully plotted the trajectory of Halley's Comet. After this demonstration, teams were put to work producing tables for navigation and the computation of logarithms, with the goal of improving the accuracy of warships and artillery.

Mechanical calculation. Starting in the 17th century with the invention of the slide rule, computation was increasingly realized with the help of mechanical aids. This era is characterized by mechanisms such as Oughtred's slide rule and mechanical adding machines such as Charles Babbage's difference engine and the arithmometer.
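The principle behind Oughtred's instrument is easy to state: because log(ab) = log a + log b, multiplication reduces to adding two lengths on logarithmic scales. A few lines of Python (our modern illustration, obviously not period hardware) capture the same trick:

    import math

    def slide_rule_multiply(a, b):
        """Multiply by adding logarithms, as a slide rule does mechanically."""
        return 10 ** (math.log10(a) + math.log10(b))

    print(slide_rule_multiply(2, 8))     # ~16, up to floating-point rounding
    print(slide_rule_multiply(3.5, 12))  # ~42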

Connected with mechanical computation is mechanical data storage. In the 18th century, engineers working on a variety of different systems hit upon the idea of using holes in cards and tape to represent repeating patterns of information that could be stored and automatically acted upon. The Jacquard loom used holes on stiff cards to enable automated looms to weave complex, repeating patterns. Herman Hollerith managed the scale and complexity of processing population information for the 1890 US Census on smaller punch cards, and Émile Baudot created a device that let human operators punch holes in a roll of paper to represent characters as a way of making more efficient use of long-distance telegraph lines. Boole's algebra lets us interpret these representations of information (holes and spaces) as binary—1s and 0s—fundamentally altering how information is processed and stored.
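As a modern illustration of that insight (ours, not the book's), a punched row can be read as a binary number, and Boole's operators then act on it directly:

    row = "1001011"            # one row of a card: '1' = hole, '0' = no hole
    value = int(row, 2)        # the same row interpreted as a number (75)

    mask = 0b1111000           # select only the leftmost four columns
    print(value & mask)        # Boolean AND -> 72
    print(value | mask)        # Boolean OR  -> 123
    print(value ^ mask)        # Boolean XOR -> 51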

With the capture and control of electricity came electric communication and computation. Charles Wheatstone in England and Samuel Morse in the US both built systems that could send digital information down a wire for many miles. By the end of the 19th century, engineers had joined together millions of miles of wires with relays, switches, and sounders, as well as the newly invented speakers and microphones, to create vast international telegraph and telephone communications networks. In the 1930s, scientists in England, Germany, and the US realized that the same electrical relays that powered the telegraph and telephone networks could also be used to calculate mathematical quantities. Meanwhile, magnetic recording technology was developed for storing and playing back sound—technology that would soon be repurposed for storing additional types of information.

Electronic computation. In 1906, scientists discovered that a beam of electrons traveling through a vacuum could be switched by applying a slight voltage to a metal mesh, and the vacuum tube was born. In the 1940s, scientists tried using tubes in their calculators and discovered that they ran a thousand times faster than relays. Replacing relays with tubes allowed the creation of computers that were a thousand times faster than the previous generation.

Solid-state computing. Semiconductors—materials that can change their electrical properties—were discovered in the 19th century, but it wasn't until the middle of the 20th century that scientists at Bell Laboratories discovered and then perfected a semiconductor electronic switch—the transistor. Faster still than tubes, semiconductors use dramatically less power and can be made smaller than the eye can see. They are also incredibly rugged. The first transistorized computers appeared in 1953; within a decade, transistors had replaced tubes everywhere, except for the computer's screen. That wouldn't happen until the widespread deployment of flat-panel screens in the 2000s.

Parallel computing. Year after year, transistors shrank in size and got faster, and so did computers . . . until they didn't. The year was 2005, roughly, when the semiconductor industry's tricks for making each generation of microprocessors run faster than the previous pretty much petered out. Fortunately, the industry had one more trick up its sleeve: parallel computing, or splitting up a problem into many small parts and solving them more or less independently, all at the same time. Although the computing industry had experimented with parallel computing for years (ENIAC was actually a parallel machine, way back in 1943), massively parallel computers weren't commercially available until the 1980s and didn't become commonplace until the 2000s, when scientists started using graphics processing units (GPUs) to solve problems in artificial intelligence (AI).
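The split-and-combine pattern itself fits in a few lines of Python's standard library; this sketch (our own, with arbitrary numbers) sums a million squares across four worker processes, the same shape of computation that GPUs perform across thousands of cores:

    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(chunk):
        """Solve one small, independent piece of the problem."""
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        data = range(1_000_000)
        chunks = [data[i::4] for i in range(4)]       # split into 4 independent parts
        with ProcessPoolExecutor(max_workers=4) as pool:
            partials = pool.map(partial_sum, chunks)  # solve them at the same time
        print(sum(partials))                          # combine the results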

Artificial intelligence. Whereas the previous technology waves always had at their hearts the purpose of supplementing or amplifying human intellect or abilities, the aim of artificial intelligence is to independently extend cognition, evolve a new concept of intelligence, and algorithmically optimize any digitized ecosystem and its constituent parts. Thus, it is fitting that this wave be last in the
