History of computers: A brief timeline

The history of computers began with primitive designs in the early 19th century and went on to change the world during the 20th century.


The history of computers goes back more than 200 years. First theorized by mathematicians and entrepreneurs, mechanical calculating machines were designed and built during the 19th century to solve increasingly complex number-crunching challenges. As technology advanced in the early 20th century, computers became larger and more powerful.

Today, computers are almost unrecognizable from 19th-century designs such as Charles Babbage's Analytical Engine — or even from the huge computers of the 20th century that occupied whole rooms, such as the Electronic Numerical Integrator and Computer.

Here's a brief history of computers, from their primitive number-crunching origins to the powerful modern-day machines that surf the Internet, run games and stream multimedia. 

19th century

1801: Joseph Marie Jacquard, a French merchant and inventor, invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.

1821: English mathematician Charles Babbage conceives of a steam-driven calculating machine that would be able to compute tables of numbers. Funded by the British government, the project, called the "Difference Engine," fails due to the lack of technology at the time, according to the University of Minnesota.

1843: Ada Lovelace, an English mathematician and the daughter of poet Lord Byron, writes the world's first computer program. According to Anna Siffert, a professor of theoretical mathematics at the University of Münster in Germany, Lovelace writes the first program while translating a paper on Babbage's Analytical Engine from French into English. "She also provides her own comments on the text. Her annotations, simply called 'notes,' turn out to be three times as long as the actual transcript," Siffert wrote in an article for The Max Planck Society. "Lovelace also adds a step-by-step description for computation of Bernoulli numbers with Babbage's machine — basically an algorithm — which, in effect, makes her the world's first computer programmer." Bernoulli numbers are a sequence of rational numbers often used in computation.
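Lovelace's program targeted hardware that was never built, so any runnable version is necessarily a modern recreation. As a rough illustration of what her algorithm computed — not her Note G program itself — here is a short Python sketch that generates Bernoulli numbers from the standard recurrence:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0..B_n as exact fractions, using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for every m > 0."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        B.append(-sum(comb(m + 1, k) * B[k] for k in range(m)) / (m + 1))
    return B

print(bernoulli(8))  # B_0..B_8: 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30
```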

Babbage's Analytical Engine

1853: Swedish inventor Per Georg Scheutz and his son Edvard design the world's first printing calculator. The machine is significant for being the first to "compute tabular differences and print the results," according to Uta C. Merzbach's book, "Georg Scheutz and the First Printing Calculator" (Smithsonian Institution Press, 1977).

1890: Herman Hollerith designs a punch-card system to help calculate the 1890 U.S. Census. The machine saves the government several years of calculations and the U.S. taxpayer approximately $5 million, according to Columbia University. Hollerith later establishes a company that will eventually become International Business Machines Corporation (IBM).

Early 20th century

1931: At the Massachusetts Institute of Technology (MIT), Vannevar Bush invents and builds the Differential Analyzer, the first large-scale automatic general-purpose mechanical analog computer, according to Stanford University.

1936: Alan Turing, a British scientist and mathematician, presents the principle of a universal machine, later called the Turing machine, in a paper called "On Computable Numbers…," according to Chris Bernhardt's book "Turing's Vision" (The MIT Press, 2017). Turing machines are capable of computing anything that is computable. The central concept of the modern computer is based on his ideas. Turing is later involved in the development of the Turing-Welchman Bombe, an electro-mechanical device designed to decipher Nazi codes during World War II, according to the UK's National Museum of Computing.
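Turing's "universal machine" is an abstraction: a read/write head stepping along a tape under a finite table of rules. A minimal Python sketch — an illustrative toy, not Turing's original formalism — shows how little machinery the model needs; this example increments a binary number:

```python
def run_turing_machine(rules, tape, state, head=0, blank='_'):
    """Run a one-tape Turing machine until no rule applies.
    rules maps (state, symbol) -> (new_state, written_symbol, move),
    with move being -1 (left), 0 (stay) or +1 (right)."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    while (state, cells.get(head, blank)) in rules:
        state, symbol, move = rules[(state, cells.get(head, blank))]
        cells[head] = symbol
        head += move
    return ''.join(cells[i] for i in sorted(cells))

# Binary increment: start on the rightmost bit in state 'carry'.
rules = {
    ('carry', '1'): ('carry', '0', -1),  # 1 + carry -> 0, carry moves left
    ('carry', '0'): ('halt', '1', 0),    # absorb the carry and halt
    ('carry', '_'): ('halt', '1', 0),    # ran off the left edge: new digit
}
print(run_turing_machine(rules, '1011', 'carry', head=3))  # prints 1100
```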

1937: John Vincent Atanasoff, a professor of physics and mathematics at Iowa State University, submits a grant proposal to build the first electric-only computer, without using gears, cams, belts or shafts.

The original garage where Bill Hewlett and Dave Packard started their business

1939: David Packard and Bill Hewlett found the Hewlett Packard Company in Palo Alto, California. The pair decide the name of their new company by the toss of a coin, and Hewlett-Packard's first headquarters are in Packard's garage, according to MIT.

1941: German inventor and engineer Konrad Zuse completes his Z3 machine, the world's earliest digital computer, according to Gerard O'Regan's book "A Brief History of Computing" (Springer, 2021). The machine was destroyed during a bombing raid on Berlin during World War II. Zuse fled the German capital after the defeat of Nazi Germany and later released the world's first commercial digital computer, the Z4, in 1950, according to O'Regan.

1941: Atanasoff and his graduate student, Clifford Berry, design the first digital electronic computer in the U.S., called the Atanasoff-Berry Computer (ABC). This marks the first time a computer is able to store information in its main memory; the machine is capable of performing one operation every 15 seconds, according to the book "Birthing the Computer" (Cambridge Scholars Publishing, 2016).

1945: Two professors at the University of Pennsylvania, John Mauchly and J. Presper Eckert, design and build the Electronic Numerical Integrator and Computer (ENIAC). The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology" (Greenwood Press, 2003).

Computer technicians operating the ENIAC

1946: Mauchly and Eckert leave the University of Pennsylvania and receive funding from the Census Bureau to build the UNIVAC, the first commercial computer for business and government applications.

1947: William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discover how to make an electric switch with solid materials and without the need for a vacuum.

1949: A team at the University of Cambridge develops the Electronic Delay Storage Automatic Calculator (EDSAC), "the first practical stored-program computer," according to O'Regan. "EDSAC ran its first program in May 1949 when it calculated a table of squares and a list of prime numbers," O'Regan wrote. In November 1949, scientists with the Council for Scientific and Industrial Research (CSIR), now called CSIRO, build Australia's first digital computer, called the Council for Scientific and Industrial Research Automatic Computer (CSIRAC). CSIRAC is the first digital computer in the world to play music, according to O'Regan.
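Those first EDSAC runs are easy to reproduce today. A small Python sketch — a modern recreation, not EDSAC code — computes the same two outputs, using a simple sieve for the primes:

```python
def squares(n):
    """Table of squares 1^2 .. n^2, as in EDSAC's first program."""
    return [k * k for k in range(1, n + 1)]

def primes_up_to(n):
    """List the primes <= n with the sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            for multiple in range(p * p, n + 1, p):
                sieve[multiple] = False
    return [p for p, is_prime in enumerate(sieve) if is_prime]

print(squares(10))       # [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]
print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```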

Late 20th century

1953: Grace Hopper develops the first computer language, which eventually becomes known as COBOL, an acronym for COmmon Business-Oriented Language, according to the National Museum of American History. Hopper is later dubbed the "First Lady of Software" in her posthumous Presidential Medal of Freedom citation. The same year, Thomas Johnson Watson Jr., son of IBM CEO Thomas Johnson Watson Sr., conceives the IBM 701 EDPM to help the United Nations keep tabs on Korea during the war.

1954: John Backus and his team of programmers at IBM publish a paper describing their newly created FORTRAN programming language, an acronym for FORmula TRANslation, according to MIT.

1958: Jack Kilby and Robert Noyce unveil the integrated circuit, known as the computer chip. Kilby is later awarded the Nobel Prize in Physics for his work.

1968: Douglas Engelbart reveals a prototype of the modern computer at the Fall Joint Computer Conference in San Francisco. His presentation, called "A Research Center for Augmenting Human Intellect," includes a live demonstration of his computer, including a mouse and a graphical user interface (GUI), according to the Doug Engelbart Institute. This marks the evolution of the computer from a specialized machine for academics into a technology more accessible to the general public.

The first computer mouse, invented in 1963 by Douglas C. Engelbart

1969: Ken Thompson, Dennis Ritchie and a group of other developers at Bell Labs produce UNIX, an operating system that made "large-scale networking of diverse computing systems — and the internet — practical," according to Bell Labs. The team behind UNIX continued to develop the operating system using the C programming language, which they also created.

1970: The newly formed Intel unveils the Intel 1103, the first dynamic random-access memory (DRAM) chip.

1971: A team of IBM engineers led by Alan Shugart invents the "floppy disk," enabling data to be shared among different computers.

1972: Ralph Baer, a German-American engineer, releases the Magnavox Odyssey, the world's first home game console, in September 1972, according to the Computer Museum of America. Months later, entrepreneur Nolan Bushnell and engineer Al Alcorn of Atari release Pong, the world's first commercially successful video game.

1973: Robert Metcalfe, a member of the research staff for Xerox, develops Ethernet for connecting multiple computers and other hardware.

1975: The cover of the January issue of "Popular Electronics" highlights the Altair 8800 as the "world's first minicomputer kit to rival commercial models." After seeing the magazine issue, two "computer geeks," Paul Allen and Bill Gates, offer to write software for the Altair using the new BASIC language. On April 4, after the success of this first endeavor, the two childhood friends form their own software company, Microsoft.

1976: Steve Jobs and Steve Wozniak co-found Apple Computer on April Fools' Day. They unveil the Apple I, the first computer with a single-circuit board and ROM (read-only memory), according to MIT.

Apple I computer 1976

1977: The Commodore Personal Electronic Transactor (PET) is released onto the home computer market, featuring an MOS Technology 8-bit 6502 microprocessor, which controls the screen, keyboard and cassette player. The PET is especially successful in the education market, according to O'Regan.

1977: Radio Shack begins its initial production run of 3,000 TRS-80 Model 1 computers — disparagingly known as the "Trash 80" — priced at $599, according to the National Museum of American History. Within a year, the company takes 250,000 orders for the computer, according to the book "How TRS-80 Enthusiasts Helped Spark the PC Revolution" (The Seeker Books, 2007).

1977: The first West Coast Computer Faire is held in San Francisco. Jobs and Wozniak present the Apple II at the Faire; the computer includes color graphics and features an audio cassette drive for storage.

1978: VisiCalc, the first computerized spreadsheet program, is developed; it is released for the Apple II the following year.

1979: MicroPro International, founded by software engineer Seymour Rubenstein, releases WordStar, the world's first commercially successful word processor. WordStar is programmed by Rob Barnaby, and includes 137,000 lines of code, according to Matthew G. Kirschenbaum's book "Track Changes: A Literary History of Word Processing" (Harvard University Press, 2016).

1981: "Acorn," IBM's first personal computer, is released onto the market at a price point of $1,565, according to IBM. Acorn uses the MS-DOS operating system from Windows. Optional features include a display, printer, two diskette drives, extra memory, a game adapter and more.

A worker using an Acorn computer by IBM, 1981

1983: The Apple Lisa, standing for "Local Integrated Software Architecture" but also the name of Steve Jobs' daughter, according to the National Museum of American History (NMAH), is the first personal computer to feature a GUI. The machine also includes a drop-down menu and icons. Also this year, the Gavilan SC is released: the first portable computer with a flip-form design and the first to be marketed as a "laptop."

1984: The Apple Macintosh is announced to the world during a Super Bowl advertisement. The Macintosh is launched with a retail price of $2,500, according to the NMAH.

1985: As a response to the Apple Lisa's GUI, Microsoft releases Windows in November 1985, the Guardian reported. Meanwhile, Commodore announces the Amiga 1000.

1989: Tim Berners-Lee, a British researcher at the European Organization for Nuclear Research (CERN), submits his proposal for what would become the World Wide Web. His paper details his ideas for Hypertext Markup Language (HTML), the building blocks of the Web.

1993: The Pentium microprocessor advances the use of graphics and music on PCs.

1996: Sergey Brin and Larry Page develop the Google search engine at Stanford University.

1997: Microsoft invests $150 million in Apple, which at the time is struggling financially. This investment ends an ongoing court case in which Apple accused Microsoft of copying its operating system.

1999: Wi-Fi, the abbreviated term for "wireless fidelity," is developed, initially covering a distance of up to 300 feet (91 meters), Wired reported.

21st century

2001: Mac OS X, later renamed OS X and then simply macOS, is released by Apple as the successor to its standard Mac Operating System. OS X goes through 16 different versions, each carrying "10" in its title, and the first nine iterations are nicknamed after big cats, with the first codenamed "Cheetah," TechRadar reported.

2003: AMD's Athlon 64, the first 64-bit processor for personal computers, is released to customers. 

2004: The Mozilla Corporation launches Mozilla Firefox 1.0. The Web browser is one of the first major challengers to Microsoft's Internet Explorer. During its first five years, Firefox exceeds a billion downloads, according to the Web Design Museum.

2005: Google buys Android, a Linux-based mobile phone operating system.

2006: The MacBook Pro from Apple hits the shelves. The Pro is the company's first Intel-based, dual-core mobile computer. 

2009: Microsoft launches Windows 7 on July 22. The new operating system features the ability to pin applications to the taskbar, minimize every other window by shaking one, easy-to-access jump lists, easier previews of tiles and more, TechRadar reported.

Apple CEO Steve Jobs holds the iPad during the launch of Apple's new tablet computing device in San Francisco

2010: The iPad, Apple's flagship handheld tablet, is unveiled.

2011: Google releases the Chromebook, which runs on Google Chrome OS.

2015: Apple releases the Apple Watch. Microsoft releases Windows 10.

2016: The first reprogrammable quantum computer is created. "Until now, there hasn't been any quantum-computing platform that had the capability to program new algorithms into their system. They're usually each tailored to attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.

2017: The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic structure as well as variables such as shape, size, or even color. This richness provides a vast design space for exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based, digital architectures."

2019: A team at Google becomes the first to demonstrate quantum supremacy, creating a quantum computer that can outperform the most powerful classical computer, albeit on a very specific problem with no practical real-world application. The team describes the computer, dubbed "Sycamore," in a paper published that same year in the journal Nature. Achieving quantum advantage, in which a quantum computer solves a problem with real-world applications faster than the most powerful classical computer, is still a ways off.

2022: Frontier, the first exascale supercomputer and the world's fastest, goes online at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee. Built by Hewlett Packard Enterprise (HPE) at a cost of $600 million, Frontier uses nearly 10,000 AMD EPYC 7453 64-core CPUs alongside nearly 40,000 AMD Radeon Instinct MI250X GPUs. The machine ushers in the era of exascale computing, which refers to systems that can perform more than one exaFLOP, or a quintillion (10^18) floating-point operations per second. Frontier is currently the only machine capable of such performance, and it is being used as a tool to aid scientific discovery.

What is the first computer in history?

Charles Babbage's Difference Engine, designed in the 1820s, is considered the first "mechanical" computer in history, according to the Science Museum in the U.K. Designed to be powered by steam and operated with a hand crank, the machine calculated a series of values and printed the results in a table.
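The Difference Engine worked by the method of finite differences: because the n-th differences of a degree-n polynomial are constant, every table entry can be produced with additions alone, no multiplication required. Here is a short Python sketch of the idea — an illustration of the method, not a simulation of Babbage's mechanism:

```python
def difference_engine(initial_differences, steps):
    """Tabulate a polynomial from its value and finite differences at 0,
    using nothing but repeated addition, as a difference engine does."""
    diffs = list(initial_differences)  # [p(0), 1st diff, 2nd diff, ...]
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        for i in range(len(diffs) - 1):  # each column absorbs the next
            diffs[i] += diffs[i + 1]
    return table

# p(x) = 2x^2 + 3x + 1: p(0)=1, first difference 5, second difference 4.
print(difference_engine([1, 5, 4], 6))  # [1, 6, 15, 28, 45, 66]
```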

What are the five generations of computing?

The "five generations of computing" is a framework for assessing the entire history of computing and the key technological advancements throughout it. 

The first generation, spanning the 1940s to the 1950s, covered vacuum tube-based machines. The second, between the 1950s and the 1960s, incorporated transistor-based computing. In the 1960s and 1970s, the third generation gave rise to integrated circuit-based computing. We are now between the fourth and fifth generations of computing, which are microprocessor-based and AI-based computing, respectively.

What is the most powerful computer in the world?

As of November 2023, the most powerful computer in the world is the Frontier supercomputer. The machine, which can reach a performance level of up to 1.102 exaFLOPS, ushered in the age of exascale computing in 2022 when it went online at Tennessee's Oak Ridge Leadership Computing Facility (OLCF).

There is, however, a potentially more powerful supercomputer waiting in the wings: the Aurora supercomputer, housed at the Argonne National Laboratory (ANL) outside of Chicago. Aurora went online in November 2023. For now it trails Frontier, with a performance level of just 585.34 petaFLOPS (roughly half that of Frontier), but installation is not yet complete. When work is finished, the supercomputer is expected to exceed 2 exaFLOPS.
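The prefixes do the heavy lifting here: one petaFLOPS is 10^15 floating-point operations per second and one exaFLOPS is 10^18. A quick Python check, using only the figures quoted above, confirms the "roughly half" comparison:

```python
PETA, EXA = 10**15, 10**18           # FLOPS per petaFLOPS / exaFLOPS

frontier = 1.102 * EXA               # Frontier's peak, from the text
aurora = 585.34 * PETA               # Aurora's level as of November 2023

print(f"Frontier: {frontier / PETA:,.0f} petaFLOPS")    # 1,102 petaFLOPS
print(f"Aurora is {aurora / frontier:.0%} of Frontier")  # about 53%
```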

What was the first killer app?

Killer apps are widely understood to be those so essential that they become core to the technology they run on. There have been many through the years, from Word for Windows in 1989 to iTunes in 2001 to social media apps like WhatsApp in more recent years.

Several pieces of software may stake a claim to being the first killer app, but there is broad consensus that VisiCalc, a spreadsheet program created by Dan Bricklin and Bob Frankston and originally released for the Apple II in 1979, holds that title. Steve Jobs even credited the app with propelling the Apple II to the success it became, according to co-creator Dan Bricklin.

Additional resources

  • Fortune: A Look Back At 40 Years of Apple
  • The New Yorker: The First Windows
  • "A Brief History of Computing" by Gerard O'Regan (Springer, 2021)


Timothy Williamson

Timothy is Editor in Chief of print and digital magazines All About History and History of War. He has previously worked on sister magazine All About Space, as well as photography and creative brands including Digital Photographer and 3D Artist. He has also written for How It Works magazine and several history bookazines, and has a degree in English Literature from Bath Spa University.



Biography: The ABC of computing

By John Gilbey

Nature 468, 760–761 (8 December 2010)


An engaging biography of John Atanasoff reveals the obscure origins of the computer, explains John Gilbey.

The Man Who Invented the Computer: The Biography of John Atanasoff, Digital Pioneer

by Jane Smiley

Who invented the digital computer? Depending on your definition, mathematical pioneers such as John von Neumann or Alan Turing might spring to mind, but its origin lies with US physicist John Atanasoff. Although few people could name him today, this rewarding biography by Pulitzer prizewinning author Jane Smiley may change that.

Atanasoff embodies the American Dream. The son of a Bulgarian immigrant who had fled to the United States as a child in the late 1880s, he grew up on the family farm in Florida. Through mastering the slide rule, helping his father with house electrical wiring and driving the family's Model T Ford at age 11, he developed a passion for engineering and mathematics.

After graduating from the University of Florida in Gainesville in 1925, with the highest grade average it had ever recorded, Atanasoff joined a master's programme at what is now Iowa State University in Ames. He turned down an offer to move to Harvard University and gained a PhD in physics at the University of Wisconsin-Madison. He returned to Iowa State — again declining an offer from Harvard — as an assistant professor.


In The Man Who Invented the Computer , Smiley describes how Atanasoff developed an interest in mechanical calculators and modified an IBM tabulator to suit his own needs. But to meet his wider scientific aspirations — in particular, to solve simultaneous linear equations quickly — he realized that he would have to build a calculator himself. His struggle to design it concluded with an episode of pure cinema. Atanasoff, “unhappy to an extreme degree”, jumped in his car and drove more than 300 kilometres to the shore of the Mississippi River. Sitting in a roadside tavern with a glass of bourbon and soda, the solution fell into place. He began to make notes on a paper napkin.

Crucially, Iowa State had an excellent college of engineering. In 1939, Atanasoff teamed up with recent graduate Clifford Berry to develop the system that became known as the Atanasoff–Berry Computer (ABC). Built on a shoestring budget, the simple 'breadboard' prototype that emerged contained significant innovations. These included the use of vacuum tubes as the computing mechanism and operating memory; binary and logical calculation; serial computation; and the use of capacitors as storage memory. By the summer of 1940, Smiley tells us, a second, more-developed prototype was running and Atanasoff and Berry had written a 35-page manuscript describing it.


Other people were working on similar devices. In the United Kingdom and at Princeton University in New Jersey, Turing was investigating practical outlets for the concepts in his 1936 paper 'On Computable Numbers'. In London, British engineer Tommy Flowers was using vacuum tubes as electronic switches for telephone exchanges in the General Post Office. In Germany, Konrad Zuse was working on a floating-point calculator — albeit based on electromechanical technology — that would have a 64-word storage capacity by 1941. Smiley weaves these stories into the narrative effectively, giving a broad sense of the rich ecology of thought that burgeoned during this crucial period of technological and logical development.

The Second World War changed everything. Atanasoff left Iowa State to work in the Naval Ordnance Laboratory in Washington DC. His prototype computer remained unpatented in the basement of the physics department until the machine was broken up in 1948. The exigencies of war meant that substantial resources were made available for key computing projects such as the vast Electronic Numerical Integrator and Computer (ENIAC) machine at the University of Pennsylvania in Philadelphia, the launch of which Atanasoff attended in 1946. But Atanasoff moved on, and in 1951 went into business for himself. His Ordnance Engineering Corporation was sold for a healthy profit five years later.

Atanasoff was brought back into the picture by the untimely death of Berry in an apparent suicide in 1963. Concerned, Atanasoff travelled to New York to investigate. The family considered that murder was a possibility — Berry's father had been shot decades earlier by a disgruntled ex-employee — but it was never proven.

In 1973, Atanasoff again found himself in the spotlight after his work was cited in the conclusions of a patent dispute between computing-industry giants Honeywell and Sperry Rand about the early development of the digital computer. Smiley quotes Judge Earl Larson's acknowledgement that “between 1937 and 1942, Atanasoff ... developed and built an automatic electronic digital computer for solving large systems of simultaneous linear algebraic equations”.

Judge Larson further noted that John Mauchly, one of the ENIAC developers who had visited Atanasoff in Iowa, had inspected the Atanasoff–Berry Computer and had read the manuscript describing it. Mauchly derived from this, the judge said, “'the invention of the automatic electronic digital computer' claimed in the ENIAC patent” — indicating Atanasoff's key contribution, albeit unwitting, to the later project.

Belatedly, and largely through the advocacy of friends and writers, Atanasoff gained recognition. Owing to his father's origins, he received early plaudits in Bulgaria, where in 1970 he was granted the Order of Cyril and Methodius, First Class. In 1990 he was awarded the National Medal of Technology by President George H. W. Bush for his invention of the electronic digital computer and for contributions to the development of a technically trained US workforce. Atanasoff died in 1995.

The Man Who Invented the Computer is a vivid telling of the early story of the computing industry. By focusing on Atanasoff, Smiley blends obscure threads with those that are better known. The result would, without embellishment, make an exceptional feature film.


What is a computer?

A computer is a programmable device that stores, retrieves, and processes data. The term "computer" was originally given to humans (human computers) who performed numerical calculations using mechanical calculators, such as the abacus and slide rule. The term was later given to mechanical devices as they began replacing human computers. Today's computers are electronic devices that accept data (input), process that data, produce output, and store (storage) the results (IPOS).
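The IPOS cycle is easy to see in even a trivial program. Here is a minimal Python sketch in which each stage is one line (the file name is just an illustrative choice):

```python
def ipos_demo(storage_path="results.txt"):
    raw = input("Enter numbers separated by spaces: ")  # input
    total = sum(float(token) for token in raw.split())  # process
    print(f"Sum: {total}")                              # output
    with open(storage_path, "a") as f:                  # storage
        f.write(f"{raw} -> {total}\n")

ipos_demo()
```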


Computer overview

Below is a picture of a computer with each of the main components. You can see the desktop computer, flat-panel display, speakers, keyboard, and mouse in the picture below. We've also labeled each of the input devices and output devices.

Desktop computer

You can find further information about other types of computers and get a breakdown of the components that make up a desktop computer later on this page.

History of the computer

The first digital computer and what most people think of as a computer was called the ENIAC (Electronic Numerical Integrator and Computer). It was built during World War II (1943-1946) and was designed to help automate the calculations being done by human computers. By doing these calculations on a computer, they could achieve results much faster and with fewer errors.

Early computers like the ENIAC used vacuum tubes and were large (sometimes room-sized), and they were found only in businesses, universities, and governments. Later, computers began utilizing transistors and smaller, cheaper parts that allowed ordinary people to own computers.


Today, computers make jobs that used to be complicated much simpler. For example, you can write a letter in a word processor, edit it anytime, spell check it, print copies, and send it to someone across the world in seconds. These activities once would have taken someone days, if not months, and they are only a small fraction of what computers can do.


Today's desktop computers have some or all of the components (hardware) and peripherals below. As technology advances, older technologies, such as the floppy disk drive and Zip drive (shown below), are no longer required or included.

Example of front of computer case

  • Case or chassis
  • Optical drive: Blu-ray, CD-ROM, CD-R, CD-RW, or DVD
  • CPU (central processing unit)
  • Floppy disk drive
  • RAM (random-access memory)
  • Monitor, LCD (liquid-crystal display), or another display device
  • Motherboard
  • Network card
  • Power supply

A computer does not require all the components mentioned above. However, a computer cannot function without having at the very minimum the parts listed below.

  • Processor - Component that executes instructions from the software and hardware.
  • Memory - Temporary primary storage for data traveling between the storage and the CPU.
  • Motherboard (with onboard video) - Component that connects all components.
  • Storage device (e.g., hard drive) - Slower secondary storage that permanently stores data.

However, if you had a computer with only the minimum parts above, you would be unable to communicate with it until you connected at least one input device (e.g., a keyboard). Also, you would need at least one output device (e.g., a monitor) to see what is happening.

Once a computer is set up, running, and connected to a network, you could disconnect the keyboard and monitor and connect remotely. Most servers and computers in data centers are used and controlled remotely.


All computers have different types of connections. An example of the back of a personal computer, with brief descriptions of each connection, is found on our computer connections page.


Types of computers

Computers can be classified as one of three types: a general-purpose computer, a special-purpose computer, or a specialized computer.

A general-purpose computer is what most people think of when thinking about a computer and is what this page covers.

A special-purpose computer is embedded in almost all electronic devices and is the most widely used type of computer. It is designed for a specific task and is found in ATMs, cars, microwaves, TVs, VCRs (video cassette recorders), and other home electronics. See our special-purpose computer page for further information and examples.

A specialized computer is like a general-purpose computer but is designed to perform only one or a few different tasks. See our specialized computer page for further information and examples of these computers.

When discussing a computer or a "PC," you're usually referring to a desktop computer found in a home or office. However, the lines between these categories are blurring. Below are different examples of what's considered a computer today.

Desktop computer, laptop, hybrid computer, tablet, and smartphone

The picture above shows several types of computers and computing devices and is an example of their differences. Below is a complete list of general-purpose computers of past and present.

Some computers could use many different classifications. For example, a desktop computer could also be classified as a gaming computer and a personal computer.

  • Custom-built PC
  • Desktop computer
  • Diskless workstation and Thin client
  • Gaming computer
  • Hybrid computer
  • Laptop, portable, notebook computer
  • Microcomputer
  • Nanocomputer
  • PDA (personal digital assistant)
  • Personal computer
  • Prebuilt computer
  • Quantum computer
  • Stick computer
  • Supercomputer

Today, there are two main types of computers: the PC (IBM compatible) and the Apple Mac. Many companies make and build PCs, and if you get all the necessary parts, you can even build a custom PC. With Apple computers, however, only Apple designs and makes them. See our computer companies page for a listing of companies (OEMs) that make and build computers.



Definition of biography

: a usually written history of a person's life

Did you know?

So You've Been Asked to Submit a Biography

In a library, the word biography refers both to a kind of book and to a section where books of that kind are found. Each biography tells the story of a real person's life. A biography may be about someone who lived long ago, recently, or even someone who is still living, though in the last case it must necessarily be incomplete. The term autobiography refers to a biography written by the person it's about. Autobiographies are of course also necessarily incomplete.

Sometimes biographies are significantly shorter than a book—something anyone who's been asked to submit a biography for, say, a conference or a community newsletter will be glad to know. Often the word in these contexts is shortened to bio , a term that can be both a synonym of biography and a term for what is actually a biographical sketch: a brief description of a person's life. These kinds of biographies—bios—vary, but many times they are only a few sentences long. Looking at bios that have been used in the same context can be a useful guide in determining what to put in your own.


Word History

Etymology: Late Greek biographia, from Greek bi- + -graphia -graphy

First Known Use: 1665, in the meaning defined at sense 2


Michael Dell

Who Is Michael Dell?

Michael Dell showed an early interest in technology and gadgets. At the age of 15, he purchased an early Apple computer in order to take it apart to see how it worked. In college, he started building computers and selling them directly to people, focusing on strong customer support and cheaper prices. His company, Dell Computer, went on to become the world's largest PC maker.

Born on February 23, 1965, in Houston, Texas, Dell showed an early interest in technology and business, even as his mother, a stockbroker, and his father, an orthodontist, pushed him to consider medicine.

A hard worker, Dell landed a job washing dishes at a Chinese restaurant at age 12 so that he could put away money for his stamp collection. A few years later, he harnessed his ability to sift through data to find new customers for newspaper subscriptions for the Houston Post, which earned the high school student $18,000 in a single year.

Intrigued by the expanding world of computers and gadgetry, Dell purchased an early Apple computer at the age of 15 for the strict purpose of taking it apart to see how it worked.

Dell Computer and Company

It was in college that Dell found the niche that would become his business. The PC world was still young, and Dell realized that no company had tried selling directly to customers. Bypassing the middleman and the markups, Dell tapped his savings account for $1,000 and started building and selling computers for people he knew at college. His emphasis wasn't just on good machines but on strong customer support and cheaper prices. Soon he had accounts outside of school, and it wasn't long before Dell dropped out and focused all his efforts on the business.

The numbers proved staggering. In 1984, Dell's first full year in business, he had $6 million in sales. By 2000, Dell was a billionaire, and his company had offices in 34 countries and more than 35,000 employees. The following year, Dell Computer surpassed Compaq Computer as the world's largest PC maker.

Overall, Dell's first 20 years in business made his company one of the most successful on the planet, surpassing such titans as Wal-Mart and General Electric. Dell's story is so compelling that, in 1999, he published a best-selling book about his success, Direct from Dell: Strategies That Revolutionized the Industry.

Philanthropy

Intensely private and notoriously shy, Dell has come out of his shell over the years, say those who know him, thanks to his wife Susan, a Dallas native whom he married in 1989. The couple has four children.

Together, the Dells have shown a willingness to spread their wealth. In 1999, the couple started the Michael and Susan Dell Foundation, a large private charity that has doled out millions to causes and people like the tsunami victims in southern Asia. In 2006, the foundation donated $50 million to the University of Texas.

"A bunch of guys sitting around trying to decide what we want to have done with our money after we're dead, that's not a very good idea," he once said, expaling his early entry into philanthropy. "Forget all that. We're going to do this while we're still here and get it right."

In 2004 Dell stepped down as CEO of the company, but he remained chairman of the board. He served on the Foundation Board of the World Economic Forum and the executive committee of the International Business Council. He also was on the U.S. President's Council of Advisors on Science and Technology and sat on the governing board of the Indian School of Business in Hyderabad.

Controversy

In recent years, however, not everything has gone right for Dell or his company. Poorly built computers resulted in the company taking a $300 million charge to fix the faulty machines, a huge issue that cost Dell its perch atop the industry. In an effort to correct things, Dell returned as CEO in 2007, but the results have been mixed.

Poor products continued to plague the company, and despite Dell Computer's efforts to play down the issue, documents later revealed that employees were well aware of the issues affecting millions of its computers.

In July 2010, Dell made headlines when he agreed to pay more than $100 million in penalties to settle charges of accounting fraud filed by the Securities and Exchange Commission. According to the charges, Dell Computer inflated its earnings statements by counting rebates from the chipmaker Intel that were issued to encourage the company not to use chips from Advanced Micro Devices in its computers and servers. By padding its statements, investigators claimed, Dell Computer had misled investors about its actual earnings.

In a move to help rebuild the company he founded, Dell announced in February 2013 that he would take his business private again. He reached an agreement with Silver Lake Partners, a private equity firm that specializes in technology, and computer software giant Microsoft to launch a buyout of all outstanding shares of Dell. The deal has been valued at between $23 billion and more than $24 billion, making it one of the biggest buyouts in recent history.

According to a Reuters news report, Dell believes "this transaction will open an exciting new chapter for Dell, our customers and team members." Many analysts share some of Dell's enthusiasm but still think the company faces serious challenges. Dell has seen its share of the PC market drop in recent years amid increased competition from tablet and smartphone makers.

QUICK FACTS

  • Name: Michael Dell
  • Birth Year: 1965
  • Birth date: February 23, 1965
  • Birth State: Texas
  • Birth City: Houston
  • Birth Country: United States
  • Gender: Male
  • Best Known For: Michael Dell helped launch the personal computer revolution in the 1980s with the creation of the Dell Computer Corporation, now known as Dell Inc.
  • Industries: Business and Industry; Internet/Computing
  • Astrological Sign: Pisces
  • Schools: University of Texas at Austin

QUOTES

  • "It's through curiosity and looking at opportunities in new ways that we've always mapped our path at Dell. There's always an opportunity to make a difference."

Meaning of biography in English

biography (noun): the life story of a person written by someone else.

  • This biography offers a few glimpses of his life before he became famous.
  • Her biography revealed that she was not as rich as everyone thought.
  • The biography was a bit of a rush job.
  • The biography is an attempt to uncover the inner man.
  • The biography is woven from the many accounts which exist of things she did.

COMMENTS

  1. Computer

    A computer is a machine that can be programmed to automatically carry out sequences of arithmetic or logical operations ( computation ). Modern digital electronic computers can perform generic sets of operations known as programs. These programs enable computers to perform a wide range of tasks. The term computer system may refer to a nominally ...

  2. Computer

    computer, device for processing, storing, and displaying information. Computer once meant a person who did computations, but now the term almost universally refers to automated electronic machinery. The first section of this article focuses on modern digital electronic computers and their design, constituent parts, and applications.

  3. Charles Babbage

    Charles Babbage (born December 26, 1791, London, England—died October 18, 1871, London) was an English mathematician and inventor who is credited with having conceived the first automatic digital computer. Charles Babbage. In 1812 Babbage helped found the Analytical Society, whose object was to introduce developments from the European ...

  4. History of computers: A brief timeline

    The machine is the first "automatic, general-purpose, electronic, decimal, digital computer," according to Edwin D. Reilly's book "Milestones in Computer Science and Information Technology ...

  5. Biography: The ABC of computing

    An engaging biography of John Atanasoff reveals the obscure origins of the computer, explains John Gilbey. The Man Who Invented the Computer: The Biography of John Atanasoff, Digital Pioneer Jane ...

  6. What is a Computer?

    Computer. A computer is a programmable device that stores, retrieves, and processes data. The term "computer" was originally given to humans ( human computers) who performed numerical calculations using mechanical calculators, such as the abacus and slide rule. The term was later given to mechanical devices as they began replacing human computers.

  7. History of Computers

    According to Merriam-Webster Dictionary, the computer definition is ''a programmable usually electronic device that can store, retrieve, and process data.'' Today, many people use computers at ...

  8. Computer

    Computer. A computer is a machine that uses electronics to input, process, store, and output data. Data is information such as numbers, words, and lists. Input of data means to read information from a keyboard, a storage device like a hard drive, or a sensor. The computer processes or changes the data by following the instructions in software ...

  9. Digital computer

    digital computer, any of a class of devices capable of solving problems by processing information in discrete form.It operates on data, including magnitudes, letters, and symbols, that are expressed in binary code—i.e., using only the two digits 0 and 1. By counting, comparing, and manipulating these digits or their combinations according to a set of instructions held in its memory, a ...

  10. Laptop

    Clockwise from top left: A 2021 MacBook Pro by Apple Inc.; a 2019 Microsoft Surface Pro 7 with detachable hinge (left) and a 2018 Dell XPS 15 9570 with 360 degree hinge (right); a 2014 ThinkPad Helix by Lenovo with detachable screen; and a 2014 Acer Chromebook 11. A laptop computer or notebook computer, also known as a laptop or notebook, is a small, portable personal computer (PC).

  11. What is a Computer? Everything You Need To Know

    Computer: A computer is a machine or device that performs processes, calculations and operations based on instructions provided by a software or hardware program. It is designed to execute applications and provides a variety of solutions by combining integrated hardware and software components.

  12. Computer Definition & Meaning

    How to use computer in a sentence. one that computes; specifically : a programmable usually electronic device that can store, retrieve, and process data… See the full definition

  13. Computer hardware

    PDP-11 CPU board. Computer hardware includes the physical parts of a computer, such as the central processing unit (CPU), random access memory (RAM), motherboard, computer data storage, graphics card, sound card, and computer case.It includes external devices such as a monitor, mouse, keyboard, and speakers.. By contrast, software is the set of instructions that can be stored and run by hardware.

  14. Biography Definition & Meaning

    biography: [noun] a usually written history of a person's life.

  15. Computer science

    computer science, the study of computers and computing, including their theoretical and algorithmic foundations, hardware and software, and their uses for processing information. The discipline of computer science includes the study of algorithms and data structures, computer and network design, modeling data and information processes, and ...

  16. computer

    Generally, a computer is any device that can perform numerical calculations—even an adding machine, an abacus, or a slide rule. Currently, however, the term usually refers to an electronic device that can perform automatically a series of tasks according to a precise set of instructions. The set of instructions is called a program, and the ...

  17. Michael Dell

    Birth date: February 23, 1965. Birth State: Texas. Birth City: Houston. Birth Country: United States. Gender: Male. Best Known For: Michael Dell helped launch the personal computer revolution in ...

  18. Personal computer (PC)

    personal computer (PC), a digital computer designed for use by only one person at a time. A typical personal computer assemblage consists of a central processing unit (CPU), which contains the computer's arithmetic, logic, and control circuitry on an integrated circuit; two types of computer memory, main memory, such as digital random-access memory (RAM), and auxiliary memory, such as ...

  19. Computer network

    A computer network is a set of computers sharing resources located on or provided by network nodes.Computers use common communication protocols over digital interconnections to communicate with each other. These interconnections are made up of telecommunication network technologies based on physically wired, optical, and wireless radio-frequency methods that may be arranged in a variety of ...

  20. BIOGRAPHY

    BIOGRAPHY definition: 1. the life story of a person written by someone else: 2. the life story of a person written by…. Learn more.

  21. Computer memory

    Computer memory stores information, such as data and programs, for immediate use in the computer. [2] The term memory is often synonymous with the terms RAM, main memory or primary storage. Archaic synonyms for main memory include core (for magnetic core memory) and store. [3]

  22. BIOGRAPHY

    BIOGRAPHY meaning: 1. the life story of a person written by someone else: 2. the life story of a person written by…. Learn more.

  23. Server (computing)

    In computing, a server is a piece of computer hardware or software ( computer program) that provides functionality for other programs or devices, called "clients". This architecture is called the client-server model. Servers can provide various functionalities, often called "services", such as sharing data or resources among multiple clients ...