
Date Posted: 03:37:53 01/18/10 Mon
Author: KEY DATES
Subject: IN THE HISTORY AND FUTURE OF INFORMATION PROCESSING

KEY DATES
IN THE HISTORY AND FUTURE
OF INFORMATION PROCESSING
from Computers and Information Systems, 1994-1995 Edition, by Sarah E. Hutchinson and Stacey C. Sawyer © 1994 RICHARD D. IRWIN, INC.
YEAR EVENT
Less than 100,000 years ago Homo sapiens begin using intelligence to further goals.
More than 5,000 years ago The abacus, which resembles the arithmetic unit of a modern computer, is developed in the Orient.
3000-700 B.C. Water clocks are built in China in 3000 B.C., in Egypt approx. 1500 B.C., and in Assyria 700 B.C.
2500 B.C. Egyptians invent the idea of thinking machines: citizens turn for advice to oracles, which are statues with priests hidden inside.
427 B.C. In the Phaedo and later works, Plato expresses ideas, several millennia before the advent of the computer, that are relevant to modern dilemmas regarding human thought and its relation to the mechanics of the machine.
approx. 420 B.C. Archytas of Tarentum, a friend of Plato, constructs a wooden pigeon whose movements are controlled by a jet of steam or compressed air.
approx. 415 B.C. Theaetetus, a member of Plato's Academy, creates solid geometry.
387 B.C. Plato founds the Academy for the pursuit of science and philosophy in a grove on the outskirts of Athens. It results in the fertile development of mathematical theory.
293 B.C. Euclid, also a member of Plato's Academy, is the expositor of plane geometry. He writes the Elements, a basic mathematics textbook for the next 2,000 years.
c. 200 B.C. In China artisans develop elaborate automata, including an entire mechanical orchestra.
725 A Chinese engineer and a Buddhist monk build the first true mechanical clock, a water-driven device with an escapement that causes the clock to tick.
1540, 1772 The technology of clock and watch making results in the production of more elaborate automata during the European Renaissance. Gianello Toriano's mandolin-playing lady (1540) and P. Jacquet-Droz's child (1772) are famous examples.
1617 John Napier invents Napier's Bones, of significance to the future development of calculating engines.
1642 Blaise Pascal perfects the Pascaline, a machine that will add and subtract. It is the world's first automatic calculating machine.
1694 Gottfried Wilhelm Leibniz, an inventor of calculus, perfects the Leibniz Computer, a machine that multiplies by performing repetitive additions, an algorithm still used in modern computers.
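The repeated-addition algorithm this entry describes can be sketched in a few lines of Python; this is a modern illustration of the principle, not of Leibniz's actual mechanism:

```python
def multiply_by_repeated_addition(a: int, b: int) -> int:
    """Multiply two non-negative integers using only addition,
    the same strategy the Leibniz machine performed mechanically."""
    total = 0
    for _ in range(b):  # add a to itself b times
        total += a
    return total

print(multiply_by_repeated_addition(6, 7))  # 42
```

Hardware multipliers today use faster shift-and-add variants, but the core idea, reducing multiplication to repeated addition, is the same.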
1726 Jonathan Swift, in Gulliver's Travels, describes a machine that will automatically write books.
1805 Joseph-Marie Jacquard devises a method for automating weaving with a series of punched cards. This invention will be used many years later in the development of early computers.
1821 Charles Babbage is awarded the first gold medal by the British Astronomical Society for his paper "Observations on the Application of Machinery to the Computation of Mathematical Tables."
1821 Michael Faraday, widely recognized as the father of electricity, reports his discovery of electromagnetic rotation and builds the first two motors powered by electricity.
1822 Charles Babbage develops the Difference Engine, but its technical complexities exhaust his financial resources and organizational skills. He eventually abandons it to concentrate his efforts on a general-purpose computer.
1829 The first electromagnetically driven clock is constructed.
1832 Charles Babbage develops the principle of the Analytical Engine, which is the world's first computer and can be programmed to solve a wide variety of logical and computational problems.
1835 Joseph Henry invents the electrical relay, a means of transmitting electrical impulses over long distances. The relay serves as the basis for the telegraph.
1837 Samuel Finley Breese Morse patents his more practical version of the telegraph, which sends letters in codes consisting of dots and dashes.
1843 Ada Lovelace, Lord Byron's only legitimate child and the world's first computer programmer, publishes her own notes with her translation of L. P. Menabrea's paper on Babbage's Analytical Engine.
1846 Alexander Bain uses punched paper tape to send telegraph messages, greatly improving the speed of transmission.
1847 George Boole publishes his first ideas on symbolic logic. He will develop these ideas into his theory of binary logic and arithmetic--theory that is still the basis of modern computation.
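Boole's binary logic is still visible in every programming language. A minimal Python sketch shows how his two-valued operations combine into a one-bit "half adder," the building block of binary arithmetic (the circuit terminology here is modern, not Boole's):

```python
def half_adder(a, b):
    """Add two bits: returns (sum_bit, carry_bit).
    XOR (^) produces the sum; AND (&) produces the carry."""
    return (a ^ b, a & b)

# Enumerate the full truth table.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```

Chaining such adders bit by bit yields multi-bit addition, which is why Boole's logic underlies all modern computation.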
1854 An electric telegraph is installed between Paris and London.
1855 William Thomson develops a successful theory concerning the transmission of electrical signals through submarine cables.
1861 San Francisco and New York are connected by a telegraph line.
1864 Ducos du Hauron develops a primitive motion-picture device in France.
1866 Cyrus West Field lays a telegraph cable across the Atlantic Ocean.
1876 Alexander Graham Bell's telephone receives U.S. Patent 174,465, the most lucrative patent ever granted.
1879 G. Frege, one of the founders of modern symbolic language, proposes a notational system for mechanical reasoning. This work is a forerunner to the predicate calculus, which will be used for knowledge representation in artificial intelligence.
1885 Boston is connected to New York by telephone.
1886 Alexander Graham Bell, with a modified version of Thomas Alva Edison's phonograph, uses wax discs for recording sound.
1888 William S. Burroughs patents an adding machine. This machine is modified four years later to include subtraction and printing. It is the world's first dependable key-driven calculator and will soon win widespread acceptance.
1888 Heinrich Hertz experiments with the transmission of what are now known as radio waves.
1888 The first commercial roll-film camera is introduced.
1890 Herman Hollerith, incorporating ideas from Jacquard's loom and Babbage's Analytical Engine, patents an electromechanical information machine that uses punched cards. It wins the 1890 U.S. Census competition, with the result that electricity is used for the first time in a major data processing project.
1894 Guglielmo Marconi builds his first radio equipment, which rings a bell from 30 feet away.
1896 A sound film is first shown before a paying audience in Berlin.
1896 Herman Hollerith forms the Tabulating Machine Company, which will become IBM.
1897 Alexander Popov, a Russian, uses an antenna to transmit radio waves, and Guglielmo Marconi, an Italian, receives the first patent ever granted for radio. Marconi helps organize a company to market his system.
1899 The first recording of sound occurs magnetically on wire and on a thin metal strip.
1900 Herman Hollerith introduces an automatic card feed into his information machine to process the 1900 census data.
1900 The entire civilized world is connected by telegraph, and in the United States there are more than 1.4 million telephones, 8,000 registered automobiles, and 24 million electric light bulbs. Edison's promise of "electric bulbs so cheap that only the rich will be able to afford candles" is thus realized. In addition, the Gramophone Company is advertising a choice of five thousand recordings.
1901 Marconi, in Newfoundland, receives the first transatlantic telegraphic radio transmission.
1904 John Ambrose Fleming files a patent for the first vacuum tube, a diode.
1906 Reginald Aubrey Fessenden invents AM radio and transmits by radio waves to wireless operators on U.S. ships off the Atlantic Coast. The transmission includes a Christmas carol, a violin trill, and for the first time the sound of a human voice.
1907 Lee De Forest and R. von Lieben invent the amplifier vacuum tube, known as a triode, which greatly improves radio.
1911 Herman Hollerith's Tabulating Machine Company acquires several other companies and changes its name to Computing-Tabulating-Recording Company (CTR). In 1914 Thomas J. Watson is appointed president.
1913 Henry Ford introduces the first true assembly-line method of automated production.
1913 A. Meissner invents a radio transmitter with vacuum tubes. Radio-transmitter triode modulation is introduced the following year, and in 1915 the radio-tube oscillator is introduced.
1921 Czech dramatist Karel Capek popularizes the term robot, a word he coined in 1917 to describe the mechanical people in his science-fiction drama R.U.R. (Rossum's Universal Robots). His intelligent machines, intended as servants for their human creators, end up taking over the world and destroying all mankind.
1923 Vladimir Kosma Zworykin, the father of television, gives the first demonstration of an electronic television-camera tube, using a mechanical transmitting device. He develops the iconoscope, an early type of television system, the following year.
1924 Thomas J. Watson becomes the chief executive officer of CTR and renames the company International Business Machines (IBM). IBM will become the leader of the modern industry and one of the largest industrial corporations in the world.
1925 Vannevar Bush and his co-workers develop the first analog computer, a machine designed to solve differential equations.
1927 The era of talking motion pictures is introduced by The Jazz Singer, starring Al Jolson.
1928 John von Neumann presents the minimax theorem, which will be widely used in game-playing programs.
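The minimax rule von Neumann formalized can be sketched in a few lines of Python; the tiny game tree below is an illustrative toy, not his formulation:

```python
def minimax(node, maximizing):
    """Score a game tree. The maximizer picks the largest child value,
    the minimizer the smallest; leaves are plain integer scores."""
    if isinstance(node, int):  # leaf: a terminal position's score
        return node
    values = [minimax(child, not maximizing) for child in node]
    return max(values) if maximizing else min(values)

# A two-ply tree: the maximizer moves first, the minimizer replies.
tree = [[3, 5], [2, 9]]
print(minimax(tree, True))  # 3: min(3,5)=3, min(2,9)=2, max(3,2)=3
```

Game-playing programs from the earliest chess experiments onward search trees exactly like this, usually with depth limits and pruning added.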
1928 Philo T. Farnsworth demonstrates the world's first all-electronic television, and Vladimir Zworykin receives a patent for a color television system.
1929 FM radio is introduced.
1930 Vannevar Bush's analog computer, the Differential Analyzer, is built at MIT. It will be used to calculate artillery trajectories during World War II.
1932 RCA demonstrates a television receiver with a cathode-ray picture tube. In 1933 Zworykin produces a cathode-ray tube, called the iconoscope, that makes high-quality television almost a reality.
1937 Building on the work of Bertrand Russell and Charles Babbage, Alan Turing publishes "On Computable Numbers," his now-celebrated paper introducing the Turing machine, a theoretical model of a computer.
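A Turing machine is simple enough to simulate in a few lines. The sketch below is a minimal illustration: the rule-table format, the "halt" state, and the "_" blank symbol are conventions chosen here, not Turing's own notation:

```python
def run_turing_machine(rules, tape, state="start", head=0, max_steps=100):
    """Minimal Turing machine. `rules` maps (state, symbol) to
    (new_state, symbol_to_write, head_move); "halt" stops the run."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")  # unvisited cells hold the blank "_"
        state, cells[head], move = rules[(state, symbol)]
        head += move
    return "".join(cells[i] for i in sorted(cells))

# A machine that inverts a binary string, halting at the first blank.
invert = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}
print(run_turing_machine(invert, "10110"))  # "01001_" (trailing blank was visited)
```

Despite its simplicity, this model captures everything any digital computer can compute, which is exactly why the 1937 paper remains foundational.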
1937 The Church-Turing thesis, independently developed by Alonzo Church and Alan Turing, states that all problems solvable by a human being are reducible to a set of algorithms, or, more simply, that machine intelligence and human intelligence are essentially equivalent.
1940 John V. Atanasoff and Clifford Berry build an electronic computer known as ABC. This is the first electronic computer, but it is not programmable.
1940 The 10,000-person British computer war effort known as Ultra creates Robinson, the world's first operational computer. It is based on electromechanical relays and is powerful enough to decode messages from Enigma, the Nazis' first-generation enciphering machine.
1941 Konrad Zuse, a German, completes the world's first fully programmable digital computer, the Z-3, and hires Arnold Fast, a blind mathematician, to program it. Fast becomes the world's first programmer of an operational programmable computer.
1943 The Ultra team builds Colossus, a computer that uses electronic tubes 100 to 1000 times faster than the relays used by Robinson. It cracks increasingly complex German codes and contributes to the Allies' winning of World War II.
1944 Howard Aiken completes the first American programmable computer, the Mark I. It uses punched paper tape for programming and vacuum tubes to calculate problems.
1945 Konrad Zuse develops Plankalkul, the first high-level language.
1946 John Tukey first uses the term bit for binary digit, the basic unit of data for computers.
1946 John von Neumann publishes the first modern paper on the stored-program concept and starts computer research at the Institute for Advanced Study in Princeton.
1946 John Presper Eckert and John W. Mauchly develop ENIAC, the world's first fully electronic, general-purpose (programmable) digital computer. It is almost 1,000 times faster than the Mark I and is used for calculating ballistic-firing tables for the Army.
1946 Television enters American life even more rapidly than radio did in the 1920s. The percentage of American homes having sets jumps from 0.02% in 1946 to 72% in 1956 and more than 90% by 1983.
1947 William Bradford Shockley, Walter Houser Brattain, and John Bardeen invent the transistor, a minute device that functions like a vacuum tube but switches current on and off at much faster speeds. It launches a revolution in microelectronics, bringing down the cost of computers and leading to the development of minicomputers and powerful new mainframe computers.
1949 Maurice Wilkes, influenced by Eckert and Mauchly, builds EDSAC, the world's first stored-program computer. Eckert and Mauchly's new U.S. company brings out BINAC, the first American stored-program computer, soon after.
1950 The U.S. census is first handled by a programmable computer, UNIVAC, developed by Eckert and Mauchly. It is the first commercially marketed computer.
1950 Alan Turing's "Computing Machinery and Intelligence" describes the Turing test, a means for determining whether a machine is intelligent.
1950 Commercial color television begins in the U.S.; transcontinental black-and-white television is inaugurated the following year.
1950 Claude Elwood Shannon writes a proposal for a chess program.
1951 EDVAC, Eckert and Mauchly's first computer that implements the stored-program concept, is completed at the Moore School at the University of Pennsylvania.
1952 The CBS television network uses UNIVAC to correctly predict the election of Dwight D. Eisenhower as president of the United States.
1952 The pocket-size transistor radio is introduced.
1952 The 701, IBM's first production-line electronic digital computer, is designed by Nathaniel Rochester and marketed for scientific use.
1955 IBM introduces its first transistor calculator with 2,200 transistors instead of the 1,200 vacuum tubes that would otherwise be required.
1955 The first design is created for a robot-like machine for industrial use in the U.S.
1955 Allen Newell, J. C. Shaw, and Herbert Simon develop IPL-II, the first AI language.
1955 The beginning space program and the military in the U.S., recognizing the need for computers powerful enough to steer rockets to the moon and missiles through the stratosphere, fund major research projects.
1956 The first transatlantic telephone cable begins to operate.
1956 FORTRAN, the first scientific computer programming language, is invented by John Backus and a team at IBM.
1956 MANIAC I, the first computer program to beat a human being in a chess game, is developed by Stanislaw Ulam.
1956 Artificial intelligence is named at a computer conference at Dartmouth College.
1958 Jack St. Clair Kilby invents the first integrated circuit.
1958 John McCarthy introduces LISP, an early (and still widely used) AI language.
1958-1959 Jack Kilby and Robert Noyce independently develop the chip, which leads to much cheaper and smaller computers.
1959 Dartmouth's Thomas Kurtz and John Kemeny find an alternative to batch processing: timesharing.
1959 Grace Murray Hopper, one of the first programmers of the Mark I, develops COBOL, a computer language designed for business use.
1959 About 6,000 computers are in operation in the United States.
1960 A U.S. company markets the world's first industrial robots.
1962 The first department of computer science offering a Ph.D. is established at Purdue University.
1962 D. Murphy and Richard Greenblatt develop the TECO text editor, one of the first word processing systems, for use on the PDP-1 computer at MIT.
1962 AI researchers of the 1960s, noting the similarity between human and computer languages, adopt the goal of parsing natural-language sentences. Susumu Kuno's parsing system reveals the great extent of syntactic and semantic ambiguity in the English language. Kuno's system is tested on the sentence "Time flies like an arrow."
1963 John McCarthy founds the Artificial Intelligence Laboratory at Stanford University.
1964 IBM solidifies its leadership of the computer industry with the introduction of its 360 series.
1964 Daniel Bobrow completes his doctoral work on Student, a natural-language program that can solve high-school-level word problems in algebra.
1964 Gordon Moore, one of the founders of Fairchild Semiconductor Corporation, predicts that integrated circuits will double in complexity each year. His statement will become known as Moore's law and will prove true for decades to come.
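Moore's prediction is simple compound doubling, which a short Python sketch makes concrete. The 64-component starting figure is an assumption for illustration, not a number from the text:

```python
def moores_law(start_count, years, doubling_period=1.0):
    """Components per chip after `years`, doubling every
    `doubling_period` years (Moore's original rate was annual)."""
    return int(start_count * 2 ** (years / doubling_period))

# Annual doubling from an assumed 64 components in 1964:
print(moores_law(64, 10))        # 65536 by 1974
# The later, slower formulation: doubling every two years.
print(moores_law(64, 10, 2.0))   # 2048
```

The exponential form explains why even a modest change in the doubling period compounds into orders-of-magnitude differences over decades.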
1964 Marshall McLuhan's Understanding Media foresees electronic media, especially television, as creating a "global village" in which "the medium is the message."
1965 Raj Reddy founds the Robotics Institute at Carnegie-Mellon University. The institute becomes a leading research center for AI.
1965 The DENDRAL project begins at Stanford University, headed by Bruce Buchanan, Edward Feigenbaum, and Nobel laureate Joshua Lederberg. Its purpose is to experiment on knowledge as the primary means of producing problem-solving behavior. The first expert system, DENDRAL, embodies extensive knowledge of molecular-structure analysis. Follow-up work, carried out through the early 1970s, produces Meta-DENDRAL, a learning program that automatically devises new rules for DENDRAL.
Mid-1960s Computers are beginning to be widely used in the criminal justice system.
Mid-1960s Scientific and professional knowledge is beginning to be codified in a machine-readable form.
1967 Seymour Papert and his associates at MIT begin working on LOGO, an education-oriented programming language that will be widely used by children.
1967 The software business is born when IBM announces it will no longer sell software and hardware in a single unit.
1968 The film 2001: A Space Odyssey, by Arthur C. Clarke and Stanley Kubrick, presents HAL, a computer that can see, speak, hear, and think like its human colleagues aboard a spaceship.
1968 The Intel Corp. is founded. Intel will grow to become the dominant manufacturer of microprocessors in the U.S. computer industry.
1970 The floppy disk is introduced for storing data in computers.
1970 Harry Pople and Jack Myers of the University of Pittsburgh begin work on Internist, a system that aids physicians in the diagnosis of a wide range of human diseases.
1971 Kenneth Colby, Sylvia Weber, and F. D. Hilf present a report on PARRY, a program simulating a paranoid person, in a paper entitled "Artificial Paranoia." The program is so convincing that clinical psychiatrists cannot distinguish its behavior from that of a human paranoid person.
1971 The first microprocessor is introduced in the U.S.
1971 The first pocket calculator is introduced. It can add, subtract, multiply, and divide.
1979 Daniel Bricklin and Software Arts, Inc. release the first electronic spreadsheet for PCs, VisiCalc. The program helps launch the personal computing era by showing the convenience with which information can be handled on a desktop.
1973 Alain Colmerauer presents an outline of PROLOG, a logic-programming language. The language will become enormously popular and will be adopted for use in the Japanese Fifth-Generation Program.
1974 The first computer-controlled industrial robot is developed.
1974 Edward Shortliffe completes his doctoral dissertation on MYCIN, an expert system designed to help medical practitioners prescribe an appropriate antibiotic by determining the precise identity of a blood infection. Work to augment this program with other important systems, notably TEIRESIAS and EMYCIN, will continue through the early 1980s. TEIRESIAS will be developed in 1976 by Randall Davis to serve as a powerful information-structuring tool for knowledge engineers. EMYCIN, by William van Melle, will represent the skeletal structure of inferences.
1974 The SUMEX-AIM computer communications network is established to promote the development of applications of artificial intelligence to medicine.
1975 Benoit Mandelbrot writes "Les objets fractals: Forme, hasard et dimension," his first long essay on fractal geometry, a branch of mathematics that he developed. Fractal forms will be widely used to model chaotic phenomena in nature and to generate realistic computer images of naturally occurring objects.
1975 Medicine is becoming an important area of applications for AI research. Four major medical expert systems have been developed by now: PIP, CASNET, MYCIN, and Internist.
1975 The Defense Advanced Research Projects Agency launches its Image Understanding Program to stimulate research in the area of machine vision.
1975 More than 5,000 microcomputers are sold in the U.S., and the first personal computer, with 256 bytes of memory, is introduced.
1970s The role of knowledge in intelligent behavior is now a major focus of AI research. Bruce Buchanan and Edward Feigenbaum of Stanford University pioneer knowledge engineering.
1976 Kurzweil Computer Products introduces the Kurzweil Reading Machine, which reads aloud any printed text that is presented to it. Based on omnifont character-recognition technology, it is intended to be a sensory aid for the blind.
1976-1977 Lynn Conway and Carver Mead collaborate on a collection of principles for VLSI design. Their classic textbook Introduction to VLSI Design is published in 1980. VLSI circuits will form the basis of the fourth generation of computers.
1977 Steven Jobs and Stephen Wozniak design and build the Apple computer.
1977 Voyagers 1 and 2 are launched and radio back billions of bytes of computerized data about new discoveries as they explore the outer planets of our solar system.
1977 The Apple II, the first personal computer to be sold in assembled form, is successfully marketed.
1978 Total computer units in the United States exceed a half million.
1979 In a landmark study published in the Journal of the American Medical Association by nine researchers, the performance of MYCIN is compared with that of doctors on 10 test cases of meningitis. MYCIN does at least as well as the medical experts. The potential of expert systems in medicine becomes widely recognized.
1979 Ada, a computer language developed for use by the armed forces, is named for Ada Lovelace.
1979 Pac-Man and other early computerized video games appear.
1979 Hayes markets its first modem, which sets the industry standard for modems in years to come.
Early 1980s Second-generation robots arrive with the ability to precisely effect movements with five or six degrees of freedom. They are used for industrial welding and spray painting.
Early 1980s The MYCIN project produces NeoMYCIN and ONCOCIN, expert systems that incorporate hierarchical knowledge bases. They are more flexible than MYCIN.
1981 Desktop publishing takes root when Xerox brings out its Star Computer. However, it will not become popular until Apple's LaserWriter comes on the market in 1985. Desktop publishing provides writers and artists an inexpensive and efficient way to compose and print large documents.
1981 IBM introduces its Personal Computer (PC).
1982 Compact-disk players are marketed for the first time.
1982 A million-dollar advertising campaign introduces Mitch Kapor's Lotus 1-2-3, an enormously popular spreadsheet program.
1982 With over 100,000 associations between symptoms and diseases covering 70% of all the knowledge in the field, CADUCEUS, an improvement on the Internist expert system, is developed for internal medicine by Harry Pople and Jack Myers at the University of Pittsburgh. Tested against cases from the New England Journal of Medicine, it proves more accurate than humans in a wide range of categories.
1983 Six million personal computers are sold in the U.S.
1984 Apple Computer, Inc. introduces the Macintosh.
1984 RACTER, created by William Chamberlain, is the first computer program to author a book.
1984 Waseda University in Tokyo completes Wabot-2, a 200-pound robot that reads sheet music through its camera eye and plays the organ with its ten fingers and two feet.
1984 Optical disks for the storage of computer data are introduced, and IBM brings out a mega-RAM memory chip with four times the memory of earlier chips.
1984 Hewlett-Packard brings high-quality printing to PCs with its LaserJet laser printer.
1985 The MIT Media Laboratory creates the first three-dimensional holographic image to be generated entirely by computer.
1985 Aldus Corp. introduces PageMaker for the Macintosh, the first desktop publishing software.
Mid 1980s Third-generation robots arrive with limited intelligence and some vision and tactile senses.
1986 Dallas police use a robot to break into an apartment. The fugitive runs out in fright and surrenders.
1986 Electronic keyboards account for 55.2% of the American musical keyboard market, up from 9.5% in 1980. This trend is expected to continue until the market is almost all electronic.
1986 Technology for optical character recognition represents a $100 million industry that is expected to grow to several hundred million by 1990.
1986 New medical imaging systems are creating a mini-revolution. Doctors can now make accurate judgments based on views of areas inside our bodies and brains.
1986 Using image processing and pattern recognition, Lillian Schwartz comes up with an answer to a 500-year-old question: Who was the Mona Lisa? Her conclusion: Leonardo da Vinci himself.
1986 Russell Anderson's doctoral work at the University of Pennsylvania is a robotic ping-pong player that wins against human beings.
1986 The best computer chess players are now competing successfully at the senior master level, with HiTech, the leading chess machine, analyzing 200,000 board positions per second.
1987 Computerized trading helps push NYSE stocks to their greatest single-day loss.
1987 Current speech systems can provide any one of the following: a large vocabulary, continuous speech recognition, or speaker independence.
1987 Japan develops the Automated Fingerprint Identification System (AFIS), which enables U.S. law enforcement agencies to rapidly track and identify suspects.
1987 There are now 1,900 working expert systems, 1,200 more than last year. The most popular area of application is finance, followed by manufacturing control and fault diagnosis.
1988 Computer memory today costs only one hundred-millionth (10⁻⁸) of what it did in 1950.
1988 The population of industrial robots has increased from a few hundred in 1970 to several hundred thousand, most of them in Japan.
1988 In the U.S. 4,700,000 microcomputers, 120,000 minicomputers, and 11,500 mainframes are sold in this year.
1988 W. Daniel Hillis's Connection Machine is capable of 65,536 computations at the same time.
1988 Warsaw Pact forces are at least a decade behind NATO forces in artificial intelligence and other computer technologies.
1989 Computational power per unit of cost has roughly doubled every 18 to 24 months for the past 40 years.
1989 The trend from analog to digital will continue to revolutionize a growing number of industries.
Late 1980s The core avionics of a typical fighter aircraft uses 200,000 lines of software. The figure is expected to grow to about 1 million in the 1990s. The U.S. military as a whole uses about 100 million lines of software (and is expected to use 200 million by 1993). Software quality becomes an urgent issue that planners are beginning to address.
Late 1980s The computer is being recognized as a powerful tool for artistic expression.
Early 1990s A profound change in military strategy arrives. The more developed nations increasingly rely on "smart weapons," which incorporate electronic copilots, pattern recognition techniques, and advanced technologies for tracking, identification, and destruction.
Early 1990s Continuous speech systems can handle large vocabularies for specific tasks.
Early 1990s Computer processors operate at speeds of 100 MIPS (million instructions per second).
1990s Significant progress is made toward an intelligent assistant, a decision-support system capable of a wide variety of administrative and information-gathering tasks. The system can, for example, prepare a feasibility report on a project proposal after accessing several databases and "talking" to human experts.
1990s Reliable person identification, using pattern-recognition techniques applied to visual and speech patterns, replaces locks and keys in many instances.
Late 1990s An increasing number of documents never exist on paper because they incorporate information in the form of audio and video pieces.
Late 1990s Media technology is capable of producing computer-generated personalities, intelligent image systems with some human characteristics.
1999 The several-hundred-billion-dollar computer and information-processing market is largely intelligent by 1990 standards.
2000 Three-dimensional chips and smaller component geometries contribute to a multi-thousandfold improvement in computer power (compared to that of a decade earlier).
2000 Chips with over a billion components appear.
2000 The world chess champion is a computer.
Early 2000s Translating telephones allow two people across the globe to speak to each other even if they do not speak the same language.
Early 2000s Speech-to-text machines translate speech into a visual display for the deaf.
Early 2000s Exoskeletal robotic prosthetic aids enable paraplegic persons to walk and climb stairs.
Early 2000s Telephones are answered by an intelligent telephone-answering machine that converses with the calling party to determine the nature and priority of the call.
Early 2000s The cybernetic chauffeur, installed in one's car, communicates with other cars and sensors on roads. In this way it successfully drives and navigates from one point to another.
Early 21st century Computers dominate the educational environment. Courseware is intelligent enough to understand and correct the inaccuracies in the conceptual model of a student. Media technology allows students to interact with simulations of the very systems and personalities they are studying.
Early 21st century The entire production sector of society is operated by a small number of technicians and professionals. Individual customization of products is common.
Early 21st century Drugs are designed and tested on human biochemical simulators.
Early 21st century Seeing machines for the blind provide both reading and navigation functions.
2010 A personal computer has the ability to answer a large variety of queries, because it will know where to find knowledge. Communications technologies allow it to access many sources of knowledge by wireless communication.
2020-2050 A phone call, which includes highly realistic three-dimensional holographic moving images, is like visiting with the person called.
2020-2070 A computer passes the Turing test, which indicates human-level intelligence.
