Hardware History Timeline by Dante

  • Nintendo

    Nintendo started out as a small Japanese business, founded by Fusajiro Yamauchi on September 23, 1889,[2] as Nintendo Koppai. Based in Kyoto, Japan, the business produced and marketed Hanafuda cards. The name "Nintendo" is commonly assumed to mean "leave luck to heaven", but there are no historical records to validate this assumption.
  • The Atanasoff–Berry Computer (ABC)

    The Atanasoff–Berry Computer (ABC) was one of the first electronic digital computing devices. Conceived in 1937, the machine was not programmable, being designed only to solve systems of linear equations. It was successfully tested in 1942. However, its intermediate result storage mechanism, a paper card writer/reader, was unreliable, and when inventor John Vincent Atanasoff left Iowa State College for World War II assignments, work on the machine was discontinued.
  • First Generation (1940-1956) Vacuum Tubes

    The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
    First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time.
  • E-Book

    The first e-book may be the Index Thomisticus, a heavily annotated electronic index to the works of Thomas Aquinas, prepared by Roberto Busa beginning in the late 1940s. However, it is sometimes omitted from e-book histories, perhaps because the digitized text was (at least initially) a means of developing an index and concordance, rather than a published edition in its own right.[3]
    Some years earlier, the idea of the e-reader had come to Bob Brown after watching his first "talkie" (a movie with sound). In 1930, he published The Readies, a book playing off the idea of the "talkie" and describing a machine for reading text.
  • Video game console

    Although the first video games appeared in the 1950s,[4] they were played on vector displays connected to massive computers, not analog televisions. Ralph H. Baer conceived the idea of a home video game in 1951. In the 1960s he created a working video game console at Sanders Associates, but struggled for years to find a television manufacturer willing to produce the console.
  • Digital camera

    Digital camera technology is directly related to and evolved from the same technology that recorded television images. In 1951, the first video tape recorder (VTR) captured live images from television cameras by converting the information into electrical impulses and saving the information onto magnetic tape. Bing Crosby Laboratories (the research team funded by Crosby and headed by engineer John Mullin) created the first early VTR, and by 1956 VTR technology was perfected (the VR-1000, developed by Charles Ginsburg and the Ampex Corporation) and in common use by the television industry.
  • Second Generation (1956-1963) Transistors

    Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat, which could damage the computer, it was a vast improvement.
  • DEC Digital Equipment Corporation

    Digital Equipment Corporation, also known as DEC[1] and using the trademark Digital, was a major American company in the computer industry from the 1960s to the 1990s. It was a leading vendor of computer systems, including computers, software, and peripherals, and its PDP and successor VAX products were the most successful of all minicomputers in terms of sales.
    From 1957 until 1992 its headquarters were located in a former wool mill in Maynard, Massachusetts, since renamed Clock Tower Place.
  • Minicomputer

    The term "minicomputer" evolved in the 1960s to describe the smaller computers that became possible with the use of transistors and core memory technologies, minimal instruction sets, and less expensive peripherals such as the ubiquitous Teletype Model 33 ASR. They usually took up one or a few 19-inch rack cabinets, compared with the large mainframes that could fill a room.
    The first minicomputer was created in the USSR in 1958–1962. The computer, designated UM-1NKh, was produced in Leningrad.
  • Third Generation (1964-1971) Integrated Circuits

    The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
    Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications.
  • Personal computer

    The personal computer (PC), also called the microcomputer, was designed for use by one person and was first developed for businesses in the early 1970s. Digital Equipment Corporation made the PDP-8 for scientific laboratories. Steve Wozniak (1950– ) and Steve Jobs (1955– ), college dropouts who founded Apple Computer in 1976, are credited with inventing the first computer for home use. Working out of a garage, they spent six months developing the prototype (initial model) for the Apple I.
  • Fourth Generation (1971-Present) Microprocessors

    The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip.
  • Smartphones

    Devices that combined telephony and computing were conceptualized as early as 1973, and were offered for sale beginning in 1994. The term "smartphone", however, did not appear until 1997, when Ericsson described its GS 88 "Penelope" concept as a Smart Phone. The distinction between smartphones and feature phones can be vague, and there is no official definition of what constitutes the difference between them.
  • Microcomputer

    1975: Ed Roberts, the "father of the microcomputer", designed the first microcomputer, the Altair 8800, which was produced by Micro Instrumentation and Telemetry Systems (MITS). The same year, two young hackers, William Gates and Paul Allen, approached MITS and promised to deliver a BASIC interpreter. They delivered it, and from that sale Microsoft was born.
  • Kodak digital camera

    In 1975 Kodak engineer Steven Sasson invented the first digital still camera, which used a Fairchild 100 x 100 pixel CCD.[1][2] By 1986 Kodak had developed a sensor with 1.4 million pixels.[3]
    A number of other inventions followed to increase usability, including improvements in sensor technology, the first raw image format (DCR), and usable host software. The original Kodak DCS was launched in 1991 and was based on a stock Nikon F3 SLR with digital components.
  • Apple Inc.

    Apple was established on April 1, 1976, by Steve Jobs, Steve Wozniak and Ronald Wayne[1] to sell the Apple I personal computer kit, a computer single-handedly designed by Wozniak. The kits were hand-built by Wozniak[24][25] and first shown to the public at the Homebrew Computer Club.
  • Laptops

    Designed in 1979 by a Briton, William Moggridge, for Grid Systems Corporation, the Grid Compass was one-fifth the weight of any model of equivalent performance and was used by NASA on the Space Shuttle program in the early 1980s. It was a laptop computer with 340 KB of bubble memory, a die-cast magnesium case, and a folding electroluminescent graphics display screen.
  • IBM Personal Computer

    The IBM Personal Computer, commonly known as the IBM PC, is the original version and progenitor of the IBM PC compatible hardware platform. It is IBM model number 5150, and was introduced on August 12, 1981. It was created by a team of engineers and designers under the direction of Don Estridge of the IBM Entry Systems Division in Boca Raton, Florida.
  • Webcams

    The first webcam, developed in 1991, was pointed at the Trojan Room coffee pot in the Cambridge University Computer Science Department. The camera was finally switched off on August 22, 2001. The final image captured by the camera can still be viewed at its homepage.[2][3] The oldest webcam still operating is FogCam at San Francisco State University, which has been running continuously since 1994.
  • Fifth Generation (Present and Beyond) Artificial Intelligence

    Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.