
15 Types Of Computers (Analog To Quantum)


As a computer science student, I use computers for nearly the whole day, often several different types. For example, I work from my desktop computer (with lovely dual monitors), my Chromebook, and sometimes even my phone. However, I recently learned of supercomputers while studying the history of computers, which piqued my curiosity. So I spent some time learning about what a computer is, as well as the different types of computers. And let me tell you, there are quite a few!

What Is A Computer?

A computer is a device that takes in some form of input data, processes it, then produces logical output. Computers used to be mechanical machines. However, in recent history, they’ve transformed into electrical devices. The earliest computers were simply calculators designed to assist in scientific computation. However, computers have since evolved to process data at incredible rates, even storing data and program instructions in their internal memory.

Within the last 60 years, computers have gone from taking up entire rooms and costing millions of dollars to being the size of a credit card and costing a mere $35. I’m referring to the first supercomputer, the CDC 6600, and the Raspberry Pi, respectively. Not only is the Raspberry Pi nearly a million times less expensive and many times smaller, but it’s also more than ten times faster.

“What a computer is to me is, it’s the most remarkable tool that we’ve ever come up with. It’s the equivalent to a bicycle for our minds.” -Steve Jobs

How Do Computers Work?

If you’re unsure how computers work, they probably seem like magic to you. That’s how computers seemed to me before I lifted the veil and discovered their inner workings. There are four basic functions of computers that define how they work:

  • Input – Refers to information fed into the computer
  • Memory – Refers to data and algorithms stored within the computer
  • Processing – Refers to the act of processing input data
  • Output – Refers to the processed data coming out of the computer
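
To make those four functions concrete, here’s a minimal sketch in Python. It’s purely illustrative (the numbers and function names are made up for this example), but it walks through the same input, memory, processing, and output steps.

```python
# A tiny illustration of the four basic functions of a computer.

def average(numbers):                 # Processing: the algorithm applied to the input
    return sum(numbers) / len(numbers)

readings = [72, 75, 71, 78]           # Input: data fed into the program
history = []                          # Memory: a place to store data and results

result = average(readings)            # Processing happens here
history.append(result)                # Memory: the result is kept for later use

print(f"Average reading: {result}")   # Output: the processed data comes back out
```

A real computer does the same kind of thing billions of times per second, in hardware rather than in a short script.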

Types of Computers

There used to be only a few different types of computers, but today there are at least 15 types of computers in the world. These include analog computers, digital computers, hybrid computers, PCs, tablets, mainframes, servers, supercomputers, minicomputers, quantum computers, smartphones, smartwatches, and more. Additionally, with the continuously decreasing size of computers over time, a growing number of appliances are coming online, a trend referred to as the Internet of Things (IoT).

Analog Computers

Analog computers have been around for at least 2,000 years, dating back to the Antikythera Mechanism (pictured below). Analog computers peaked in popularity around the 1950s, and their use culminated when they flew inside the Saturn V rocket and assisted in the Apollo Moon landings.

The inventions of the transistor, the integrated circuit, and the microprocessor led to much faster, smaller, and less expensive digital computers. Analog computers haven’t disappeared since then, but they’ve become far less popular, with very few still in use today.

The Antikythera Mechanism, an ancient analog computer

What Is An Analog Computer?

Analog computers are a type of computer that uses continuously changing mechanisms and displays output data in an analog fashion. For example, an analog watch has many complex gears that turn precisely and continuously, and it displays the time with rotating hands. A digital watch, on the other hand, has electronic components that compute the time and display it in a fixed, digital fashion.

Features of Analog Computers

  • Mechanical parts such as gears and levers
  • Continuously changing mechanisms
  • Hydraulic components such as pipes and valves
  • Electrical components such as resistors and capacitors

Digital Computers

Nearly every type of computer in the world today is classified as digital. This includes all of our personal computers and wearables, supercomputers and minicomputers, and IoT devices. Digital computers process information in a different way than analog computers. Rather than processing continuously changing data, digital computers process the simplest language in the world: binary.

A computer processor

What Is Binary?

Binary, a base-2 number system, is referred to as ‘machine language’ because it’s the language that computers understand. It contains only two digits: 0 and 1. These two simple digits map directly onto the two states an electrical circuit can take: ‘off’ and ‘on.’ The beauty of modern digital computers is that they can process enormous streams of these binary digits in a very short time.
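
As a quick illustration (a minimal Python sketch, not part of the original article), the decimal number 13 is written in binary as 1101, because 8 + 4 + 0 + 1 = 13:

```python
n = 13
print(bin(n))          # '0b1101' -- 13 written with only 0s and 1s
print(int("1101", 2))  # 13 -- converting the binary string back to decimal
```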

What Is A Digital Computer?

Digital computers process digital data, often in binary format. Technically, the abacus, invented more than 4,000 years ago, was the first digital computer. However, we typically think of digital computers as the modern electronic computing powerhouses of today. Typically, digital computers consist of input devices such as a keyboard and mouse and output devices such as screens and speakers. The ‘brain’ of a digital computer is its CPU, or Central Processing Unit.

Main Components Of Digital Computers

  • Motherboard: Circuit board connecting all components
  • Processor: (CPU) Central Processing Unit
  • Video Card: (GPU) Graphics Processing Unit
  • Memory: (RAM) Random Access Memory
  • Storage: (SSD) Solid-State Drive or (HDD) Hard Disk Drive
  • Power Supply Unit: Converts incoming power for the computer’s components
  • Input/Output Devices: Keyboard, mouse, screen, speakers

Hybrid Computers

The history of hybrid computers dates back to the 1960s. In fact, the first hybrid computer, the HYCOMP 250, was created in 1961. Other hybrid computers came about in the 1960s, such as the HYDAC 2400 in 1963, but they never quite became mainstream devices. They were still being made even in the 1980s, when Marconi Space and Defense Systems Limited came out with its Starglow hybrid computer. Since around that time, hybrid computers have dwindled in popularity.

The HYDAC 2400 hybrid computer

What Is A Hybrid Computer?

Hybrid computers combine aspects of both digital and analog computers. Essentially, you get the high speed of analog computers combined with the accuracy of digital computers. The digital components of a hybrid computer often act as controllers.

Quantum Computers

Quantum computers are a mysterious new type of computer, separate from digital and analog computers. However, they do take principles from digital computers, borrowing the binary system and extending it with qubits. The first quantum computer, a 2-qubit device, was created fairly recently, in 1998, by three leading quantum computer scientists. However, the field has since made tremendous progress.

Just two years later, in 2000, a functioning 4-qubit quantum computer was created by David Wineland and others at the U.S. National Institute of Standards and Technology (NIST). Only a week later, a 7-qubit quantum computer was completed by another group of researchers, who used trans-crotonic acid in the development of the device.

In the last 20 years, quantum computers have made… (dare I say it?) quantum leaps. Some of the leaders in the space today include IBM, Google, and the world’s first quantum computing company, D-Wave Systems Inc. In 2015, D-Wave broke the 1,000-qubit barrier when it developed a 1,000-qubit quantum annealing processor chip. This processor opened up a world of possibilities.

The D-Wave TwoX 1,000-qubit quantum annealing processor chip

What Is A Qubit?

Qubit is short for ‘quantum bit.’ A classic bit is a ‘0’ or a ‘1’ and is the basis of all digital computers. A qubit, however, can hold the state ‘0’, the state ‘1’, or a blend of both simultaneously. This seemingly magical phenomenon of two states occurring simultaneously stems from quantum theory and is commonly referred to as superposition.
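
In the standard notation of quantum mechanics, a qubit’s state is written as a weighted combination of the two classical states:

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$$

Here |α|² and |β|² give the probabilities of reading a 0 or a 1 when the qubit is finally measured.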

What Is A Quantum Computer?

Quantum computers are a type of computer that utilizes concepts from quantum physics such as superposition. Superposition is possible because, unlike digital computers that rely on bits, quantum computers use qubits, or quantum bits. Because of the quantum state of these bits, quantum computers can perform certain computations at unprecedented speeds and are expected to attain quantum supremacy, leaving digital computers in the dust.

Mainframe Computers

Some of the first digital computers were large mainframe computers. They’re known as huge computers, dubbed “big iron” because of their bulky origins. The first mainframe, the Harvard Mark I, goes back to the 1940s. It cost $200,000 to develop and was as large as a room, weighing five tons!

Mainframe computers took off in the 1960s and 1970s. However, demand began to shrink in the 1980s; by 1984, sales of personal computers had surpassed those of mainframe computers. This was shortly after the release of the Apple II and IBM’s PC, the IBM Model 5150.

Although mainframe computers have dwindled in popularity, they’re still very widely used today and will continue to be relevant in the future. To this day, roughly 70% of Fortune 500 businesses use mainframes in some regard. Additionally, innovations are still being made in mainframes. The IBM z13 (shown below) was created in 2015 and the Rockhopper (shown next to the z13) was created in late 2018.


What Are Mainframe Computers?

Mainframe computers are also known simply as ‘mainframes’ or as ‘big iron.’ They earned those names because they were extremely large and powerful computers. The main function of mainframes is to process extremely large amounts of data very quickly. Although their popularity has dwindled in recent times, they’re still very useful today, especially in enterprise applications.

Server Computers

Servers have played a major role in computing ever since IBM launched the first list server, running on an IBM VM machine, in 1981. There was also the first web server, created in 1991, which launched the World Wide Web. In more recent years, physical servers have waned while virtual cloud servers have quickly become the market leader, hosting most of today’s web pages and applications.

Over the years, many types of server computers have come into existence. In fact, there are several types of servers in use today in addition to list servers, web servers, and virtual cloud servers. Here is a brief list of some of the different types of servers.

Types of Servers

  • Application Server
  • Cloud Server
  • Communication Server
  • Computing Server
  • Database Server
  • File Server
  • Game Server
  • Mail Server
  • Media Server
  • Proxy Server
  • Virtual Server


What Is A Server Computer?

There are several types of server computers, also known simply as ‘servers.’ A server is a computer (or program) that provides services to other computers, referred to as ‘clients.’ Perhaps the most popular servers are web servers, which deliver web pages over the internet to client computers such as a PC (Personal Computer).
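
As a rough sketch of the client/server idea, here is a toy web server using Python’s standard library. It isn’t how production web servers are built, but it shows the division of labor: the server waits for requests, and any browser acting as a client can ask for the page.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The server's whole job: answer a client's request
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello from the server!")

if __name__ == "__main__":
    # Point a browser (the client) at http://localhost:8000 to see the reply
    HTTPServer(("localhost", 8000), HelloHandler).serve_forever()
```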

Supercomputers

Supercomputers are actually some of the earliest digital computers, with the first being the CDC 6600, which was built back in 1964. It was a highly sought-after computer for scientists who needed to run complex computations. The same team that developed the CDC 6600 went on to invent several other supercomputers into the 1970s, including the Cray-1, followed by the liquid-cooled Cray-2 in the 1980s.

Through the 1990s and early 2000s, supercomputers continued advancing until, in 2008, the IBM Roadrunner broke the petaFLOPS barrier. A petaFLOPS is a measure of computing speed: one thousand million million (10^15) floating-point operations per second. In other words, it’s fast. Yet, as unthinkably fast as the IBM Roadrunner was, it pales in comparison to the latest supercomputers and became obsolete just 5 years after it was made.

The Fugaku supercomputer, the successor to the 2011 Fujitsu K computer (shown below), was operational as of June 2020. The amazing thing about the Fugaku supercomputer is that it reaches speeds of up to 415 petaFLOPS. That’s more than three times faster than the next fastest supercomputer, the IBM Summit, which runs at up to 122 petaFLOPS. It won’t be long now until supercomputers reach exaFLOPS (1,000 petaFLOPS) territory.
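
To put those figures in perspective, here’s a quick back-of-the-envelope check in Python, using only the numbers quoted above:

```python
PETA = 10 ** 15                  # 1 petaFLOPS = 10^15 floating-point operations per second

fugaku = 415 * PETA              # Fugaku: ~415 petaFLOPS
summit = 122 * PETA              # IBM Summit: ~122 petaFLOPS

print(fugaku / summit)           # ~3.4, i.e. "more than three times faster"
print(fugaku / (1000 * PETA))    # ~0.4 -- Fugaku is still well short of 1 exaFLOPS
```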


What Is A Supercomputer?

Supercomputers are the fastest digital computers on the planet, rivaled only by quantum computers. Many will make the claim that supercomputers are also similar to mainframes because of their size and structure, but mainframes don’t come close in terms of processing speed. Supercomputers are often used for scientific work.

Minicomputers

Minicomputers, also referred to as ‘minis,’ first appeared in the 1960s, with the first mini being the DEC PDP-8 (shown below). By today’s standards, these computers were anything but mini. However, when compared to the previous generation of computers in the 1950s, which used vacuum tubes and occupied entire rooms, they were indeed very small.

What made minicomputers possible were the inventions of the transistor in 1947 and the integrated circuit in 1958. These new inventions replaced vacuum tubes, making computers smaller and cheaper. The DEC PDP-8 weighed 250 pounds and cost $20,000, which made it smaller and cheaper than most computers available at the time.

Through the 1960s and 1970s, computers continued to make consistent strides, as described by Moore’s Law. With the inventions of the personal computer and the laptop, demand for minicomputers quickly dwindled. The decline began in the 1980s and accelerated in the 1990s as newer computers built around microprocessors spelled the end for minicomputers.

The DEC PDP-8 minicomputer

What Is A Minicomputer?

A minicomputer, or mini, is a computer that’s smaller and less powerful than a mainframe yet larger and more powerful than a personal computer. According to a 1970 article in The New York Times, minicomputers by definition must also cost less than $25,000. Unlike personal computers, which are very much general-purpose, minicomputers were often designed for a specific function.

Personal Computer

Many people claim that the first-ever personal computer, the Kenbak-1, was created in 1971 by John Blankenbaker. This first Personal Computer, or PC, cost a reasonable $750 and had a whopping 256 bytes of RAM. The concept caught on, and in 1977 the Apple II was released, becoming the first mass-produced personal computer.

Flash forward to today and PCs have reached a whole new level. Today, PCs come in several shapes and sizes. There are desktops, laptops, tablets, smartphones, and even wearable computers such as smartwatches. Personal computers in all of their variety have empowered people like never before. We’re all more connected than ever and have boundless opportunities.

Anything we want to learn is just a click away, including coding. Anyone can learn to code and launch their own product or website, just like this one. The personal computer, especially those on the market today, is the single most empowering invention in modern history.

A personal computer

What Is A Personal Computer?

A Personal Computer, more commonly known as a PC, is a computer intended for personal use. PCs are general-purpose and highly capable devices of varying types. Desktop computers, laptops, tablets, smartphones, and smartwatches are all classified as Personal Computers.

Desktop Computer

The first-ever desktop computer was the Programma 101 (shown below), invented by Pier Giorgio Perotto in 1964. However, it’s not at all like the desktop computers of today. The Programma 101, also known as the P101, didn’t have a monitor for an output device, nor did it have a mouse for an input device.

For input, it had a small keyboard consisting of numbers, a few letters, and a few arithmetic operators. The output was printed onto a small roll of paper. One of the amazing things about this initial desktop computer is that it was capable of playing a simple mathematical dice game, which became the first game ever to run on a desktop.

Truth be told, the Programma 101 was more of a calculator than a computer by today’s standards. However, it was remarkable at the time, and it served to push computers forward into the modern era of computing. It can be likened to the grandfather of the Apple II, which is the grandfather of modern desktop computers.

The lineage of desktop computers runs deeper than you might have expected. Desktop computers saw a massive boost in popularity during the 1980s when desktops became cheaper and more practical for the average person. However, since around the mid-2000s, the laptop computer has overtaken the desktop in popularity.

First desktop computer, Programma 101

What Is A Desktop Computer?

A desktop computer, also referred to simply as a desktop, is a type of personal computer that is intended to sit atop a desk. Modern desktops have a monitor, a keyboard, and a mouse as input and output devices. Desktops differ from laptops in that laptops are more mobile and compact and can sit atop the user’s lap.

Laptop Computers

Laptop computers have been around since the early 1980s. However, they really took off during the 1990s. In fact, one laptop in particular ended the 1990s with a lot of flash, style, and performance. Apple has paved the way for the best new personal computers since the 1980s, and the Apple iBook (shown below) is no exception.

The iBook laptop dazzled with its looks and its groundbreaking wireless technology. It was the first of its kind to offer built-in wireless networking, using Apple’s AirPort. Suddenly, there was a compact, sleek laptop computer that could wirelessly surf the web and send emails.

Today, as advanced as laptops are, they’re decreasing in popularity as a newer, smaller computer has taken over. I’m referring, of course, to smartphones. However, even though laptops are no longer the most popular computers, they’re still the first choice for many who need more capability than today’s smartphones can offer.

The Apple iBook laptop

What Is A Laptop Computer?

A laptop computer, or simply a laptop, is a portable personal computer that can rest on the user’s lap or a desk. Laptops are more portable than desktops, yet offer very similar performance, making them extremely popular with students and enterprises. Modern laptops are all Wi-Fi enabled, adding to their portability.

Smartphones

The first smartphone was the Simon Personal Communicator, created in 1994. In the nineties, you were the coolest person alive if you had one of these bricks. However, fast forward to 2007 and there’s a brand new hot product sweeping the market: the iPhone.

Smartphones were already able to access the internet in the early 2000s. However, the iPhone greatly improved the experience. Also, while other smartphones at the time had built-in apps, the iPhone had an app store. The truly groundbreaking thing about the iPhone’s approach to applications is that Apple opened its App Store to third-party developers.

Suddenly, a whole new industry was created, mobile applications, along with a whole new class of developers. Before long, we had social media apps and awesome games like Angry Birds; don’t even get me started on Flappy Bird! With most people having a smartphone in their pocket, it has quickly become the most popular personal computer in the world and remains so.

iPhone

What Is A Smartphone?

A smartphone is a mobile phone with nearly all the same functionality as a desktop or laptop computer. Unlike older generations of mobile phones, smartphones also have large touchscreens that function as both input and output. Not only can smartphones connect to the internet wirelessly, but they can also access a wide range of applications that provide additional functionality.

Tablet Computers

Tablet computers are still pretty new relative to the other types of computers on this list. The prototype tablets were various PDAs (Personal Digital Assistants), such as Apple’s Newton MessagePad in 1993. However, Microsoft coined the phrase “tablet computer” when it released arguably the first true tablet in the year 2000: the Microsoft Tablet PC.

Only 10 years later, in 2010, the legendary Steve Jobs presented the Apple iPad (shown below) and once again stunned the crowd. Unlike other tablets, Apple’s iPad had access to the App Store and all of the applications within it. However, it was also simply an amazing new personal computer with the look, feel, and performance you would expect from any Apple product.

Just a year later, Apple launched the iPad 2 with even more features, including a front-facing camera for FaceTime video calls. It was also thinner and more powerful. Other competitors have released fantastic tablets, such as Amazon’s Kindle Fire tablet, of which I personally own one. These two tablets are on completely opposite ends of the pricing spectrum, yet both offer a lot of value.

The Apple iPad

What Is A Tablet Computer?

A tablet computer, also known simply as a tablet, is a flat, mobile computer with a touchscreen display. Tablets are very similar to smartphones, but they’re typically larger and faster, and they usually lack the ability to make phone calls.

Wearable Computers

The newest and smallest PCs don’t reside in our pockets. Rather, they live on our wrists and even on our faces. I’m referring to smartwatches and smart glasses. The two leaders in the space are, to no one’s surprise, Apple and Google. Google released its smart glasses, the Google Glass (shown below), on a limited basis in 2013, and Apple released the Apple Watch in 2015.

Wearable computers haven’t been around long, but they’re certainly here to stay, and they will continue to evolve. In fact, the next generation of personal computers will likely reside in us, rather than on us. Elon Musk’s company Neuralink is developing computers that will interface directly with the human brain. Prototypes are already functioning inside pigs, and human trials are reportedly right around the corner.

Google Glass smart glasses

What Are Wearable Computers?

A wearable computer, also known as a wearable, is a type of computer that is worn on the body. Smartwatches such as the Apple Watch and smart glasses such as the Google Glass are two prime examples of wearable computers. Wearables are changing the way people interface with computers.

IoT Devices

The Internet of Things (IoT) refers to the growing number of items with embedded computers and internet access. The term IoT was coined by Kevin Ashton in 1999, during the internet boom. The following year, LG announced its first smart fridge, with a large digital touchscreen display on the front.

IoT devices continued to grow in popularity at an extraordinary rate through the 2000s and in 2008, the number of “things” online surpassed the number of people on the internet. In 2009, Google began testing self-driving cars that recorded and relayed sensory data via the internet.

You probably remember the Nest thermostat (shown below), which took the internet by storm in 2011. What was once an ugly, albeit small, appliance was now shiny, cool, and cost-effective, since it saved money on heating bills. But the wave was just rolling in the world of IoT, and many tech companies rode that wave brilliantly.

The Nest thermostat

What Is An IoT Device?

An IoT device is anything that has an embedded computer and sensors and sends its sensor data via the internet. There are many types of IoT-enabled devices, including thermostats, refrigerators, doorbell cameras, drones, light bulbs, light switches, smoke detectors, air purifiers, and the list goes on.
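
At its simplest, an IoT device runs a loop of “read a sensor, send the reading.” Here’s a minimal sketch in Python; the read_temperature() function and the endpoint URL are hypothetical stand-ins, not any real product’s API:

```python
import json
import random
import urllib.request

def read_temperature():
    # Stand-in for reading a real hardware temperature sensor
    return round(20 + random.random() * 5, 1)

reading = {"device": "thermostat-01", "temperature_c": read_temperature()}

request = urllib.request.Request(
    "https://example.com/iot/readings",           # hypothetical collection endpoint
    data=json.dumps(reading).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request)  # would transmit the reading if the endpoint existed
print(reading)
```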

Final Thoughts

In conclusion, the world of computing has come a long way since the invention of the first analog computer. Today, we have a vast range of computing devices at our disposal, ranging from personal computers and smartphones to supercomputers and quantum computers.

Each type of computer has its unique features, advantages, and limitations, making them suitable for various applications. Whether you need a powerful machine to run complex simulations, a portable device to stay connected on the go, or a wearable gadget to track your fitness goals, there’s a computer out there for you.

With the rise of IoT devices, we can expect computing technology to become even more integrated into our daily lives in the future, making our lives more convenient, efficient, and connected.

Tim Statler

Tim Statler is a Computer Science student at Governors State University and the creator of Comp Sci Central. He lives in Crete, IL with his wife, Stefanie, and their cats, Beyoncé and Monte. When he's not studying or writing for Comp Sci Central, he's probably just hanging out or making some delicious food.



Unit 6. Basic computer terminologies

Topic B: Types of computers


Classification of Computers by Size

  • Supercomputers
  • Mainframe computers
  • Minicomputers
  • Personal computers (PCs) or microcomputers

A supercomputer.

Supercomputer – a powerful computer that can process large amounts of data and do a great amount of computation very quickly.

Supercomputers are used for areas related to:

  • Engineering

Supercomputers are useful for applications involving very large databases or that require a great amount of computation.

Supercomputers are used for complex tasks, such as:

  • Weather forecasting
  • Climate research
  • Scientific simulation
  • Oil and gas exploration
  • Quantum mechanics
  • Cryptanalysis

A mainframe computer.

Mainframe computer – a high-performance computer used for large information processing jobs.

Mainframe computers are primarily used in:

  • Institutions
  • Health care
  • Large businesses
  • Financial institutions
  • Stock brokerage firms
  • Insurance agencies

Mainframe computers are useful for tasks related to:

  • Census taking
  • Industry and consumer statistics
  • Enterprise resource planning
  • Transaction processing
  • e-business and e-commerce

A minicomputer.

Minicomputer – a mid-range computer that is intermediate in size, power, speed, storage capacity, etc., between a mainframe and a microcomputer.

Minicomputers are used by small organizations.

“Minicomputer” is a term that is no longer used much. In recent years, minicomputers are often referred to as small or midsize servers (a server is a central computer that provides information to other computers).

Personal computers

A personal computer.

Personal computer (PC) – a small computer designed for use by a single user at a time.

A PC or microcomputer uses a single chip (microprocessor) for its central processing unit (CPU).

“Microcomputer” is now primarily used to mean a PC, but it can refer to any kind of small computer, such as a desktop computer, laptop computer, tablet, smartphone, or wearable.

Types of personal computers

Desktop computer – a personal computer that is designed to stay at one location and fits on or under a desk. It typically has a monitor, keyboard, mouse, and a tower (system unit).

A desktop computer.

Laptop computer (or notebook) – A portable personal computer that is small enough to rest on the user’s lap and can be powered by a battery. It includes a flip down screen and a keyboard with a touchpad.

A laptop computer.

Tablet – A wireless touchscreen PC that is slightly smaller and weighs less than the average laptop.

A tablet computer.

Smartphone – A mobile phone that performs many of the functions of a personal computer.

A smartphone.


Key Concepts of Computer Studies Copyright © 2020 by Meizhong Wang is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.

Bell Laboratories scientist George Stibitz uses relays for a demonstration adder


“Model K” Adder

Called the “Model K” Adder because he built it on his “Kitchen” table, this simple demonstration circuit provides proof of concept for applying Boolean logic to the design of computers, resulting in construction of the relay-based Model I Complex Calculator in 1939. That same year in Germany, engineer Konrad Zuse built his Z2 computer, also using telephone company relays.

Hewlett-Packard is founded


Hewlett and Packard in their garage workshop

David Packard and Bill Hewlett found their company in a Palo Alto, California garage. Their first product, the HP 200A Audio Oscillator, rapidly became a popular piece of test equipment for engineers. Walt Disney Pictures ordered eight of the 200B model to test recording equipment and speaker systems for the 12 specially equipped theatres that showed the movie “Fantasia” in 1940.

The Complex Number Calculator (CNC) is completed


Operator at Complex Number Calculator (CNC)

In 1939, Bell Telephone Laboratories completes this calculator, designed by scientist George Stibitz. In 1940, Stibitz demonstrated the CNC at an American Mathematical Society conference held at Dartmouth College. Stibitz stunned the group by performing calculations remotely on the CNC (located in New York City) using a Teletype terminal connected to New York over special telephone lines. This is likely the first example of remote access computing.

Konrad Zuse finishes the Z3 Computer


The Zuse Z3 Computer

The Z3, an early computer built by German engineer Konrad Zuse working in complete isolation from developments elsewhere, uses 2,300 relays, performs floating point binary arithmetic, and has a 22-bit word length. The Z3 was used for aerodynamic calculations but was destroyed in a bombing raid on Berlin in late 1943. Zuse later supervised a reconstruction of the Z3 in the 1960s, which is currently on display at the Deutsches Museum in Munich.

The first Bombe is completed


Bombe replica, Bletchley Park, UK

Built as an electro-mechanical means of decrypting Nazi ENIGMA-based military communications during World War II, the British Bombe is conceived of by computer pioneer Alan Turing and Harold Keen of the British Tabulating Machine Company. Hundreds of allied bombes were built in order to determine the daily rotor start positions of Enigma cipher machines, which in turn allowed the Allies to decrypt German messages. The basic idea for bombes came from Polish code-breaker Marian Rejewski's 1938 "Bomba."

The Atanasoff-Berry Computer (ABC) is completed


The Atanasoff-Berry Computer

After successfully demonstrating a proof-of-concept prototype in 1939, Professor John Vincent Atanasoff receives funds to build a full-scale machine at Iowa State College (now University). The machine was designed and built by Atanasoff and graduate student Clifford Berry between 1939 and 1942. The ABC was at the center of a patent dispute related to the invention of the computer, which was resolved in 1973 when it was shown that ENIAC co-designer John Mauchly had seen the ABC shortly after it became functional.

The legal result was a landmark: Atanasoff was declared the originator of several basic computer ideas, but the computer as a concept was declared un-patentable and thus freely open to all. A full-scale working replica of the ABC was completed in 1997, proving that the ABC machine functioned as Atanasoff had claimed. The replica is currently on display at the Computer History Museum.

Bell Labs Relay Interpolator is completed


George Stibitz circa 1940

The US Army asked Bell Laboratories to design a machine to assist in testing its M-9 gun director, a type of analog computer that aims large guns to their targets. Mathematician George Stibitz recommends using a relay-based calculator for the project. The result was the Relay Interpolator, later called the Bell Labs Model II. The Relay Interpolator used 440 relays, and since it was programmable by paper tape, was used for other applications following the war.

Curt Herzstark designs Curta calculator


Curta Model 1 calculator

Curt Herzstark was an Austrian engineer who worked in his family’s manufacturing business until he was arrested by the Nazis in 1943. While imprisoned at Buchenwald concentration camp for the rest of World War II, he refines his pre-war design of a calculator featuring a modified version of Leibniz’s “stepped drum” design. After the war, Herzstark’s Curta made history as the smallest all-mechanical, four-function calculator ever built.

First Colossus operational at Bletchley Park


The Colossus at work at Bletchley Park

Designed by British engineer Tommy Flowers, the Colossus is designed to break the complex Lorenz ciphers used by the Nazis during World War II. A total of ten Colossi were delivered, each using as many as 2,500 vacuum tubes. A series of pulleys transported continuous rolls of punched paper tape containing possible solutions to a particular code. Colossus reduced the time to break Lorenz messages from weeks to hours. Most historians believe that the use of Colossus machines significantly shortened the war by providing evidence of enemy intentions and beliefs. The machine’s existence was not made public until the 1970s.

Harvard Mark 1 is completed


Conceived by Harvard physics professor Howard Aiken, and designed and built by IBM, the Harvard Mark 1 is a room-sized, relay-based calculator. The machine had a fifty-foot-long camshaft running the length of the machine that synchronized its thousands of component parts, and it used 3,500 relays. The Mark 1 produced mathematical tables but was soon superseded by electronic stored-program computers.

John von Neumann writes First Draft of a Report on the EDVAC


John von Neumann

In a widely circulated paper, mathematician John von Neumann outlines the architecture of a stored-program computer, including electronic storage of programming information and data -- which eliminates the need for more clumsy methods of programming such as plugboards, punched cards and paper tape. Hungarian-born von Neumann demonstrated prodigious expertise in hydrodynamics, ballistics, meteorology, game theory, statistics, and the use of mechanical devices for computation. After the war, he concentrated on the development of Princeton's Institute for Advanced Study computer.

Moore School lectures take place


The Moore School Building at the University of Pennsylvania

An inspiring summer school on computing at the University of Pennsylvania's Moore School of Electrical Engineering stimulates construction of stored-program computers at universities and research institutions in the US, France, the UK, and Germany. Among the lecturers were early computer designers like John von Neumann, Howard Aiken, J. Presper Eckert and John Mauchly, as well as mathematicians including Derrick Lehmer, George Stibitz, and Douglas Hartree. Students included future computing pioneers such as Maurice Wilkes, Claude Shannon, David Rees, and Jay Forrester. This free, public set of lectures inspired the EDSAC, BINAC, and, later, IAS machine clones like the AVIDAC.

Project Whirlwind begins


Whirlwind installation at MIT

During World War II, the US Navy approaches the Massachusetts Institute of Technology (MIT) about building a flight simulator to train bomber crews. Under the leadership of MIT's Gordon Brown and Jay Forrester, the team first built a small analog simulator, but found it inaccurate and inflexible. News of the groundbreaking electronic ENIAC computer that same year inspired the group to change course and attempt a digital solution, whereby flight variables could be rapidly programmed in software. Completed in 1951, Whirlwind remains one of the most important computer projects in the history of computing. Foremost among its developments was Forrester’s perfection of magnetic core memory, which became the dominant form of high-speed random access memory for computers until the mid-1970s.

Public unveiling of ENIAC


Started in 1943, the ENIAC computing system was built by John Mauchly and J. Presper Eckert at the Moore School of Electrical Engineering of the University of Pennsylvania. Because of its electronic, as opposed to electromechanical, technology, it is over 1,000 times faster than any previous computer. ENIAC used panel-to-panel wiring and switches for programming, occupied more than 1,000 square feet, used about 18,000 vacuum tubes and weighed 30 tons. It was believed that ENIAC had done more calculation over the ten years it was in operation than all of humanity had until that time.

First Computer Program to Run on a Computer


Kilburn (left) and Williams in front of 'Baby'

University of Manchester researchers Frederic Williams, Tom Kilburn, and Geoff Toothill develop the Small-Scale Experimental Machine (SSEM), better known as the Manchester "Baby." The Baby was built to test a new memory technology developed by Williams and Kilburn -- soon known as the Williams Tube – which was the first high-speed electronic random access memory for computers. Their first program, consisting of seventeen instructions and written by Kilburn, ran on June 21st, 1948. This was the first program in history to run on a digital, electronic, stored-program computer.

SSEC goes on display


IBM Selective Sequence Electronic Calculator (SSEC)

The Selective Sequence Electronic Calculator (SSEC) project, led by IBM engineer Wallace Eckert, uses both relays and vacuum tubes to process scientific data at a rate of 50 multiplications of 14-digit by 14-digit numbers per second. Before its decommissioning in 1952, the SSEC produced the moon position tables used in early planning of the 1969 Apollo XII moon landing. These tables were later confirmed by using more modern computers for the actual flights. The SSEC was one of the last of the generation of 'super calculators' to be built using electromechanical technology.

CSIRAC runs first program


While many early digital computers were based on similar designs, such as the IAS and its copies, others are unique designs, like the CSIRAC. Built in Sydney, Australia by the Council of Scientific and Industrial Research for use in its Radiophysics Laboratory in Sydney, CSIRAC was designed by British-born Trevor Pearcey and used unusual 12-hole paper tape. It was transferred to the Department of Physics at the University of Melbourne in 1955 and remained in service until 1964.

EDSAC completed


The first practical stored-program computer to provide a regular computing service, EDSAC is built at Cambridge University using vacuum tubes and mercury delay lines for memory. The EDSAC project was led by Cambridge professor and director of the Cambridge Computation Laboratory, Maurice Wilkes. Wilkes' ideas grew out of the Moore School lectures he had attended three years earlier. One major advance in programming was Wilkes' use of a library of short programs, called “subroutines,” stored on punched paper tapes and used for performing common repetitive calculations within a larger program.

MADDIDA developed


MADDIDA (Magnetic Drum Digital Differential Analyzer) prototype

MADDIDA is a digital drum-based differential analyzer. This type of computer is useful in performing many of the mathematical equations scientists and engineers encounter in their work. It was originally created for a nuclear missile design project in 1949 by a team led by Fred Steele. It used 53 vacuum tubes and hundreds of germanium diodes, with a magnetic drum for memory. Tracks on the drum did the mathematical integration. MADDIDA was flown across the country for a demonstration to John von Neumann, who was impressed. Northrop was initially reluctant to make MADDIDA a commercial product, but by the end of 1952, six had sold.

Manchester Mark I completed


Manchester Mark I

Built by a team led by engineers Frederick Williams and Tom Kilburn, the Mark I serves as the prototype for Ferranti’s first computer – the Ferranti Mark 1. The Manchester Mark I used more than 1,300 vacuum tubes and occupied an area the size of a medium room. Its “Williams-Kilburn tube” memory system was later adopted by several other early computer systems around the world.

ERA 1101 introduced


One of the first commercially produced computers, the ERA 1101 counted the US Navy as its first customer. The 1101, designed by ERA but built by Remington-Rand, was intended for high-speed computing and stored 1 million bits on its magnetic drum, one of the earliest magnetic storage devices and a technology which ERA had done much to perfect in its own laboratories. Many of the 1101’s basic architectural details were used again in later Remington-Rand computers until the 1960s.

NPL Pilot ACE completed


Based on ideas from Alan Turing, Britain's Pilot ACE computer is constructed at the National Physical Laboratory. "We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus," Turing said at a symposium on large-scale digital calculating machinery in 1947 in Cambridge, Massachusetts. The design packed 800 vacuum tubes into a relatively compact 12 square feet.

Plans to build the Simon 1 relay logic machine are published


Simon featured on the November 1950 Scientific American cover

The hobbyist magazine Radio Electronics publishes Edmund Berkeley's design for the Simon 1 relay computer from 1950 to 1951. The Simon 1 used relay logic and cost about $600 to build. In his book Giant Brains, Berkeley noted - “We shall now consider how we can design a very simple machine that will think. Let us call it Simon, because of its predecessor, Simple Simon... Simon is so simple and so small in fact that it could be built to fill up less space than a grocery-store box; about four cubic feet.”

SEAC and SWAC completed


The Standards Eastern Automatic Computer (SEAC) is among the first stored program computers completed in the United States. It was built in Washington DC as a test-bed for evaluating components and systems as well as for setting computer standards. It was also one of the first computers to use all-diode logic, a technology more reliable than vacuum tubes. The world's first scanned image was made on SEAC by engineer Russell Kirsch in 1957.

The NBS also built the Standards Western Automatic Computer (SWAC) at the Institute for Numerical Analysis on the UCLA campus. Rather than testing components like the SEAC, the SWAC was built using already-developed technology. SWAC was used to solve problems in numerical analysis, including developing climate models and discovering five previously unknown Mersenne prime numbers.

Ferranti Mark I sold


Ferranti Mark 1

The title of “first commercially available general-purpose computer” probably goes to Britain’s Ferranti Mark I for its sale of its first Mark I computer to Manchester University. The Mark 1 was a refinement of the experimental Manchester “Baby” and Manchester Mark 1 computers, also at Manchester University. A British government contract spurred its initial development but a change in government led to loss of funding and the second and only other Mark I was sold at a major loss to the University of Toronto, where it was re-christened FERUT.

First Univac 1 delivered to US Census Bureau


Univac 1 installation

The Univac 1 is the first commercial computer to attract widespread public attention. Although manufactured by Remington Rand, the machine was often mistakenly referred to as “the IBM Univac." Univac computers were used in many different applications but utilities, insurance companies and the US military were major customers. One biblical scholar even used a Univac 1 to compile a concordance to the King James version of the Bible. Created by Presper Eckert and John Mauchly -- designers of the earlier ENIAC computer -- the Univac 1 used 5,200 vacuum tubes and weighed 29,000 pounds. Remington Rand eventually sold 46 Univac 1s at more than $1 million each.

J. Lyons & Company introduce LEO-1


Modeled after the Cambridge University EDSAC computer, the LEO is built at the behest of the president of Lyons Tea Co. to solve the problem of production scheduling and delivery of cakes to the hundreds of Lyons tea shops around England. After the success of the first LEO, Lyons went into business manufacturing computers to meet the growing need for data processing systems in business. The LEO was England’s first commercial computer and was performing useful work before any other commercial computer system in the world.

IAS computer operational


MANIAC at Los Alamos

The Institute for Advanced Study (IAS) computer is a multi-year research project conducted under the overall supervision of world-famous mathematician John von Neumann. The notion of storing both data and instructions in memory became known as the ‘stored program concept’ to distinguish it from earlier methods of instructing a computer. The IAS computer was designed for scientific calculations and it performed essential work for the US atomic weapons program. Over the next few years, the basic design of the IAS machine was copied in at least 17 places and given similar-sounding names, for example, the MANIAC at Los Alamos Scientific Laboratory; the ILLIAC at the University of Illinois; the Johnniac at The Rand Corporation; and the SILLIAC in Australia.

Grimsdale and Webb build early transistorized computer


Manchester transistorized computer

Working under Tom Kilburn at England’s Manchester University, Richard Grimsdale and Douglas Webb demonstrate a prototype transistorized computer, the "Manchester TC", on November 16, 1953. The 48-bit machine used 92 point-contact transistors and 550 diodes.

IBM ships its Model 701 Electronic Data Processing Machine


Cuthbert Hurd (standing) and Thomas Watson, Sr. at IBM 701 console

During three years of production, IBM sells 19 701s to research laboratories, aircraft companies, and the federal government. Also known inside IBM as the “Defense Calculator," the 701 rented for $15,000 a month. Programmer Arthur Samuel used the 701 to write the first computer program designed to play checkers. The 701 introduction also marked the beginning of IBM’s entry into the large-scale computer market, a market it came to dominate in later decades.

RAND Corporation completes Johnniac computer


RAND Corporation’s Johnniac

The Johnniac computer is one of 17 computers that followed the basic design of Princeton's Institute for Advanced Study (IAS) computer. It was named after John von Neumann, a world famous mathematician and computer pioneer of the day. Johnniac was used for scientific and engineering calculations. It was also repeatedly expanded and improved throughout its 13-year lifespan. Many innovative programs were created for Johnniac, including the time-sharing system JOSS that allowed many users to simultaneously access the machine.

IBM 650 magnetic drum calculator introduced


IBM establishes the 650 as its first mass-produced computer, with the company selling 450 in just one year. Spinning at 12,500 rpm, the 650's magnetic data-storage drum allowed much faster access to stored information than other drum-based machines. The Model 650 was also highly popular in universities, where a generation of students first learned programming.

English Electric DEUCE introduced


English Electric DEUCE

A commercial version of Alan Turing's Pilot ACE, called DEUCE—the Digital Electronic Universal Computing Engine -- is used mostly for science and engineering problems and a few commercial applications. Over 30 were completed, including one delivered to Australia.

Direct keyboard input to computers


Joe Thompson at Whirlwind console, ca. 1951

At MIT, researchers begin experimenting with direct keyboard input to computers, a precursor to today's normal mode of operation. Typically, computer users of the time fed their programs into a computer using punched cards or paper tape. Doug Ross wrote a memo advocating direct access in February. Ross contended that a Flexowriter -- an electrically-controlled typewriter -- connected to an MIT computer could function as a keyboard input device due to its low cost and flexibility. An experiment conducted five months later on the MIT Whirlwind computer confirmed how useful and convenient a keyboard input device could be.

Librascope LGP-30 introduced


Physicist Stan Frankel, intrigued by small, general-purpose computers, developed the MINAC at Caltech. The Librascope division of defense contractor General Precision buys Frankel’s design, renaming it the LGP-30 in 1956. Used for science and engineering as well as simple data processing, the LGP-30 was a “bargain” at less than $50,000 and an early example of a ‘personal computer,’ that is, a computer made for a single user.

MIT researchers build the TX-0


TX-0 at MIT

The TX-0 (“Transistor eXperimental - 0”) is the first general-purpose programmable computer built with transistors. For easy replacement, designers placed each transistor circuit inside a "bottle," similar to a vacuum tube. Constructed at MIT's Lincoln Laboratory, the TX-0 moved to the MIT Research Laboratory of Electronics, where it hosted some early imaginative tests of programming, including writing a Western movie shown on television, 3-D tic-tac-toe, and a maze in which a mouse found martinis and became increasingly inebriated.

Digital Equipment Corporation (DEC) founded


The Maynard mill

DEC is founded initially to make electronic modules for test, measurement, prototyping and control markets. Its founders were Ken and Stan Olsen, and Harlan Anderson. Headquartered in Maynard, Massachusetts, Digital Equipment Corporation took over 8,680 square feet of leased space in a nineteenth-century mill that once produced blankets and uniforms for soldiers who fought in the Civil War. General Georges Doriot and his pioneering venture capital firm, American Research and Development, invested $70,000 for 70% of DEC’s stock to launch the company in 1957. The mill is still in use as an office park (Clock Tower Place) today.

RCA introduces its Model 501 transistorized computer


RCA 501 brochure cover

The 501 is built on a 'building block' concept which allows it to be highly flexible for many different uses and could simultaneously control up to 63 tape drives—very useful for large databases of information. For many business users, quick access to this huge storage capability outweighed its relatively slow processing speed. Customers included US military as well as industry.

SAGE system goes online


SAGE Operator Station

The first large-scale computer communications network, SAGE connects 23 hardened computer sites in the US and Canada. Its task was to detect incoming Soviet bombers and direct interceptor aircraft to destroy them. Operators directed actions by touching a light gun to the SAGE airspace display. The air defense system used two AN/FSQ-7 computers, each of which used a full megawatt of power to drive its 55,000 vacuum tubes, 175,000 diodes and 13,000 transistors.

DEC PDP-1 introduced


Ed Fredkin at DEC PDP-1

The typical PDP-1 computer system, which sells for about $120,000, includes a cathode ray tube graphic display, paper tape input/output, needs no air conditioning and requires only one operator; all of which become standards for minicomputers. Its large scope intrigued early hackers at MIT, who wrote the first computerized video game, SpaceWar!, as well as programs to play music. More than 50 PDP-1s were sold.

NEAC 2203 goes online


NEAC 2203 transistorized computer

An early transistorized computer, the NEAC (Nippon Electric Automatic Computer) includes a CPU, console, paper tape reader and punch, printer and magnetic tape units. It was sold exclusively in Japan, but could process alphabetic and Japanese kana characters. Only about thirty NEACs were sold. It managed Japan's first on-line, real-time reservation system for Kinki Nippon Railways in 1960. The last one was decommissioned in 1979.

IBM 7030 (“Stretch”) completed


IBM Stretch

IBM's 7000 series of mainframe computers are the company's first to use transistors. At the top of the line was the Model 7030, also known as "Stretch." Nine of the computers, which featured dozens of advanced design innovations, were sold, mainly to national laboratories and major scientific users. A special version, known as HARVEST, was developed for the US National Security Agency (NSA). The knowledge and technologies developed for the Stretch project played a major role in the design, management, and manufacture of the later IBM System/360--the most successful computer family in IBM history.

IBM Introduces 1400 series


The 1401 mainframe, the first in the series, replaces earlier vacuum tube technology with smaller, more reliable transistors. Demand called for more than 12,000 of the 1401 computers, and the machine's success made a strong case for using general-purpose computers rather than specialized systems. By the mid-1960s, nearly half of all computers in the world were IBM 1401s.

Minuteman I missile guidance computer developed

Minuteman Guidance computer

Minuteman missiles use transistorized computers to continuously calculate their position in flight. The computer had to be rugged and fast, with advanced circuit design and reliable packaging able to withstand the forces of a missile launch. The military’s high standards for its transistors pushed manufacturers to improve quality control. When the Minuteman I was decommissioned, some universities received these computers for use by students.

Naval Tactical Data System introduced

Naval Tactical Data System (NTDS)

The US Navy Tactical Data System uses computers to integrate and display shipboard radar, sonar and communications data. This real-time information system began operating in the early 1960s. In October 1961, the Navy tested the NTDS on the USS Oriskany carrier and the USS King and USS Mahan frigates. After being successfully used for decades, NTDS was phased out in favor of the newer AEGIS system in the 1980s.

MIT LINC introduced

Wesley Clark with LINC

The LINC is an early and important example of a 'personal computer,' that is, a computer designed for only one user. It was designed by MIT Lincoln Laboratory engineer Wesley Clark. Under the auspices of a National Institutes of Health (NIH) grant, biomedical research faculty from around the United States came to a workshop at MIT to build their own LINCs and then brought them back to their home institutions, where they would be used for research. Digital Equipment Corporation (DEC) supplied the components, and 50 original LINCs were made. The LINC was later commercialized by DEC and sold as the LINC-8.

The Atlas Computer debuts

Chilton Atlas installation

A joint project of England’s Manchester University, Ferranti Computers, and Plessey, Atlas comes online nine years after Manchester’s computer lab begins exploring transistor technology. Atlas was the fastest computer in the world at the time and introduced the concept of “virtual memory,” that is, using a disk or drum as an extension of main memory. System control was provided through the Atlas Supervisor, which some consider to be the first true operating system.
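
To make the "virtual memory" idea concrete, here is a minimal sketch (in C++, purely illustrative and not the Atlas design): the program addresses one large virtual space, and a page table records whether each page is already in fast main memory or must first be brought in from the drum or disk.

```cpp
// Toy page-table translation: illustrates the virtual-memory concept Atlas
// pioneered. The page size, counts, and lack of eviction are assumptions made
// to keep the sketch short, not properties of the Atlas hardware.
#include <cstddef>
#include <iostream>
#include <vector>

constexpr std::size_t kPageSize = 512;       // words per page (arbitrary here)
constexpr std::size_t kMainMemoryPages = 4;  // small, fast "core" memory
constexpr std::size_t kVirtualPages = 64;    // much larger virtual space

struct PageTableEntry {
    bool in_memory = false;  // is this page currently in main memory?
    std::size_t frame = 0;   // which main-memory frame holds it, if any
};

int main() {
    std::vector<PageTableEntry> page_table(kVirtualPages);
    std::size_t next_frame = 0;  // naive placement; a real system evicts pages

    auto translate = [&](std::size_t virtual_address) {
        std::size_t page = virtual_address / kPageSize;
        std::size_t offset = virtual_address % kPageSize;
        if (!page_table[page].in_memory) {
            // "Page fault": pretend to copy this page in from the drum/disk.
            page_table[page].in_memory = true;
            page_table[page].frame = next_frame++ % kMainMemoryPages;
            std::cout << "page fault: loaded virtual page " << page << "\n";
        }
        return page_table[page].frame * kPageSize + offset;
    };

    std::cout << "virtual 5000 -> physical " << translate(5000) << "\n";
    std::cout << "virtual 5010 -> physical " << translate(5010) << "\n";  // same page, no fault
    return 0;
}
```

The Atlas Supervisor automated exactly this kind of bookkeeping, which is why programmers could treat the drum as if it were an extension of core memory.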

CDC 6600 supercomputer introduced

The Control Data Corporation (CDC) 6600 performs up to 3 million instructions per second—three times faster than its closest competitor, the IBM 7030 supercomputer. The 6600 retained the distinction of being the fastest computer in the world until surpassed by its successor, the CDC 7600, in 1968. Part of the speed came from the computer's design, which used 10 small computers, known as peripheral processing units, to offload the workload from the central processor.

Digital Equipment Corporation introduces the PDP-8

PDP-8 advertisement

The Canadian Chalk River Nuclear Lab needed a special device to monitor a reactor. Instead of designing a custom controller, two young engineers from Digital Equipment Corporation (DEC) -- Gordon Bell and Edson de Castro -- do something unusual: they develop a small, general purpose computer and program it to do the job. A later version of that machine became the PDP-8, the first commercially successful minicomputer. The PDP-8 sold for $18,000, one-fifth the price of a small IBM System/360 mainframe. Because of its speed, small size, and reasonable cost, the PDP-8 was sold by the thousands to manufacturing plants, small businesses, and scientific laboratories around the world.

IBM announces System/360

IBM 360 Model 40

System/360 is a major event in the history of computing. On April 7, IBM announced five models of System/360, spanning a 50-to-1 performance range. At the same press conference, IBM also announced 40 completely new peripherals for the new family. System/360 was aimed at both business and scientific customers and all models could run the same software, largely without modification. IBM’s initial investment of $5 billion was quickly returned as orders for the system climbed to 1,000 per month within two years. At the time IBM released the System/360, the company had just made the transition from discrete transistors to integrated circuits, and its major source of revenue began to move from punched card equipment to electronic computer systems.

SABRE comes on-line

Airline reservation agents working with SABRE

SABRE is a joint project between American Airlines and IBM. Operational by 1964, it was not the first computerized reservation system, but it was well publicized and became very influential. Running on dual IBM 7090 mainframe computer systems, SABRE was inspired by IBM’s earlier work on the SAGE air-defense system. Eventually, SABRE expanded, even making airline reservations available via on-line services such as CompuServe, Genie, and America Online.

Teletype introduces its ASR-33 Teletype

Student using ASR-33

At a cost to computer makers of roughly $700, the ASR-33 Teletype is originally designed as a low cost terminal for the Western Union communications network. Throughout the 1960s and ‘70s, the ASR-33 was a popular and inexpensive choice of input and output device for minicomputers and many of the first generation of microcomputers.

3C DDP-116 introduced

DDP-116 General Purpose Computer

Designed by engineer Gardner Hendrie for Computer Control Corporation (CCC), the DDP-116 is announced at the 1965 Spring Joint Computer Conference. It was the world's first commercial 16-bit minicomputer and 172 systems were sold. The basic computer cost $28,500.

Olivetti Programma 101 is released

Olivetti Programma 101

Announced the previous year at the New York World's Fair, the Programma 101 goes on sale. This printing programmable calculator was made from discrete transistors and an acoustic delay-line memory. The Programma 101 could do addition, subtraction, multiplication, and division, as well as calculate square roots. 40,000 were sold, including 10 to NASA for use on the Apollo space project.

HP introduces the HP 2116A

HP 2116A system

The 2116A is HP’s first computer. It was developed as a versatile instrument controller for HP's growing family of programmable test and measurement products. It interfaced with a wide number of standard laboratory instruments, allowing customers to computerize their instrument systems. The 2116A also marked HP's first use of integrated circuits in a commercial product.

ILLIAC IV project begins

A large parallel processing computer, the ILLIAC IV does not operate until 1972. It was eventually housed at NASA's Ames Research Center in Mountain View, California. The most ambitious massively parallel computer at the time, the ILLIAC IV was plagued with design and production problems. Once finally completed, it achieved a computational speed of 200 million instructions per second and 1 billion bits per second of I/O transfer via a unique combination of its parallel architecture and the overlapping or "pipelining" structure of its 64 processing elements.

RCA announces its Spectra series of computers

Image from RCA Spectra-70 brochure

The Spectra series is the first line of large commercial computers to use integrated circuits, and RCA highlights the IC's advantage over IBM's custom SLT modules. Spectra systems were marketed on the basis of their compatibility with the IBM System/360 series of computers, since they implemented the IBM 360 instruction set and could run most IBM software with little or no modification.

Apollo Guidance Computer (AGC) makes its debut

DSKY interface for the Apollo Guidance Computer

Designed by scientists and engineers at MIT’s Instrumentation Laboratory, the Apollo Guidance Computer (AGC) is the culmination of years of work to reduce the size of the Apollo spacecraft computer from the size of seven refrigerators side-by-side to a compact unit weighing only 70 lbs. and taking up a volume of less than 1 cubic foot. The AGC’s first flight was on Apollo 7. A year later, it steered Apollo 11 to the lunar surface. Astronauts communicated with the computer by punching two-digit codes into the display and keyboard unit (DSKY). The AGC was one of the earliest uses of integrated circuits, and used core memory, as well as read-only magnetic rope memory. The astronauts were responsible for entering more than 10,000 commands into the AGC for each trip between Earth and the Moon.

Data General Corporation introduces the Nova Minicomputer

Edson deCastro with a Data General Nova

Started by a group of engineers that left Digital Equipment Corporation (DEC), Data General designs the Nova minicomputer. It had 32 KB of memory and sold for $8,000. Ed de Castro, its main designer and co-founder of Data General, had earlier led the team that created the DEC PDP-8. The Nova line of computers continued through the 1970s, and influenced later systems like the Xerox Alto and Apple 1.

Amdahl Corporation introduces the Amdahl 470

Gene Amdahl with 470V/6 model

Gene Amdahl, father of the IBM System/360, starts his own company, Amdahl Corporation, to compete with IBM in mainframe computer systems. The 470V/6 was the company’s first product and ran the same software as IBM System/370 computers but cost less and was smaller and faster.

First Kenbak-1 is sold

One of the earliest personal computers, the Kenbak-1 is advertised for $750 in Scientific American magazine. Designed by John V. Blankenbaker using standard medium- and small-scale integrated circuits, the Kenbak-1 relied on switches for input and lights for output from its 256-byte memory. In 1973, after selling only 40 machines, Kenbak Corporation closed its doors.

Hewlett-Packard introduces the HP-35

HP-35 handheld calculator

Initially designed for internal use by HP employees, the HP-35 grew out of a challenge co-founder Bill Hewlett issued to his engineers in 1971: fit all of the features of their desktop scientific calculator into a package small enough for his shirt pocket. They did. Marketed as “a fast, extremely accurate electronic slide rule” with a solid-state memory similar to that of a computer, the HP-35 distinguished itself from its competitors by its ability to perform a broad variety of logarithmic and trigonometric functions, to store more intermediate solutions for later use, and to accept and display entries in a form similar to standard scientific notation. The HP-35 helped HP become one of the most dominant companies in the handheld calculator market for more than two decades.

Intel introduces the first microprocessor

Advertisement for Intel's 4004

The first advertisement for a microprocessor, the Intel 4004, appears in Electronic News. Developed for Busicom, a Japanese calculator maker, the 4004 had 2250 transistors and could perform up to 90,000 operations per second in four-bit chunks. Federico Faggin led the design and Ted Hoff led the architecture.

Laser printer invented at Xerox PARC

Dover laser printer

Xerox PARC physicist Gary Starkweather realizes in 1967 that exposing a copy machine’s light-sensitive drum to a paper original isn’t the only way to create an image. A computer could “write” it with a laser instead. Xerox wasn’t interested. So in 1971, Starkweather transferred to Xerox Palo Alto Research Center (PARC), away from corporate oversight. Within a year, he had built the world’s first laser printer, launching a new era in computer printing, generating billions of dollars in revenue for Xerox. The laser printer was used with PARC’s Alto computer, and was commercialized as the Xerox 9700.

IBM SCAMP is developed

Dr. Paul Friedl with SCAMP prototype

Under the direction of engineer Dr. Paul Friedl, the Special Computer APL Machine Portable (SCAMP) personal computer prototype is developed at IBM's Los Gatos and Palo Alto, California laboratories. IBM’s first personal computer, the system was designed to run the APL programming language in a compact, briefcase-like enclosure which comprised a keyboard, CRT display, and cassette tape storage. Friedl used the SCAMP prototype to gain approval within IBM to promote and develop IBM’s 5100 family of computers, including the most successful, the 5150, also known as the IBM Personal Computer (PC), introduced in 1981. From concept to finished system, SCAMP took only six months to develop.

Micral is released

Based on the Intel 8008 microprocessor, the Micral is one of the earliest commercial, non-kit personal computers. Designer Thi Truong developed the computer while Philippe Kahn wrote the software. Truong, founder and president of the French company R2E, created the Micral as a replacement for minicomputers in situations that did not require high performance, such as process control and highway toll collection. Selling for $1,750, the Micral never penetrated the U.S. market. In 1979, Truong sold R2E to Bull.

The TV Typewriter plans are published

TV Typewriter

Designed by Don Lancaster, the TV Typewriter is an easy-to-build kit that can display alphanumeric information on an ordinary television set. It used $120 worth of electronics components, as outlined in the September 1973 issue of hobbyist magazine Radio Electronics. The original design included two memory boards and could generate and store 512 characters as 16 lines of 32 characters. A cassette tape interface provided supplementary storage for text. The TV Typewriter was used by many small television stations well into the 1990s.

Wang Laboratories releases the Wang 2200

Wang was a successful calculator manufacturer, then a successful word processor company. The 1973 Wang 2200 makes it a successful computer company, too. Wang sold the 2200 primarily through Value Added Resellers, who added special software to solve specific customer problems. The 2200 used a built-in CRT, cassette tape for storage, and ran the programming language BASIC. The PC era ended Wang’s success, and it filed for bankruptcy in 1992.

Scelbi advertises its 8H computer

The first commercially advertised US computer based on a microprocessor (the Intel 8008), the Scelbi has 4 KB of internal memory and a cassette tape interface, as well as Teletype and oscilloscope interfaces. Scelbi aimed the 8H, available both in kit form and fully assembled, at scientific, electronic, and biological applications. In 1975, Scelbi introduced the 8B version with 16 KB of memory for the business market. The company sold about 200 machines, losing $500 per unit.

The Mark-8 appears in the pages of Radio-Electronics

Mark-8 featured on Radio-Electronics July 1974 cover

The Mark-8 “Do-It-Yourself” kit is designed by graduate student John Titus and uses the Intel 8008 microprocessor. The kit was the cover story of hobbyist magazine Radio-Electronics in July 1974 – six months before the MITS Altair 8800 appeared in rival Popular Electronics magazine. Plans for the Mark-8 cost $5 and the blank circuit boards were available for $50.

Xerox PARC Alto introduced

The Alto is a groundbreaking computer with wide influence on the computer industry. It was based on a graphical user interface using windows, icons, and a mouse, and worked together with other Altos over a local area network. It could also share files and print out documents on an advanced Xerox laser printer. Applications were also highly innovative: a WYSIWYG word processor known as “Bravo,” a paint program, a graphics editor, and email, for example. Apple’s inspiration for the Lisa and Macintosh computers came from the Xerox Alto.

MITS Altair 8800 kit appears in Popular Electronics

Altair 8800

For its January issue, hobbyist magazine Popular Electronics runs a cover story of a new computer kit – the Altair 8800. Within weeks of its appearance, customers inundated its maker, MITS, with orders. Bill Gates and Paul Allen licensed their BASIC programming language interpreter to MITS as the main language for the Altair. MITS co-founder Ed Roberts invented the Altair 8800 — which sold for $297, or $395 with a case — and coined the term “personal computer”. The machine came with 256 bytes of memory (expandable to 64 KB) and an open 100-line bus structure that evolved into the “S-100” standard widely used in hobbyist and personal computers of this era. In 1977, MITS was sold to Pertec, which continued producing Altairs in 1978.

MOS 6502 is introduced

MOS 6502 ad from IEEE Computer, Sept. 1975

Chuck Peddle leads a small team of former Motorola employees to build a low-cost microprocessor. The MOS 6502 was introduced at a conference in San Francisco at a cost of $25, far less than comparable processors from Intel and Motorola, leading some attendees to believe that the company was perpetrating a hoax. The chip quickly became popular with designers of early personal computers like the Apple II and Commodore PET, as well as game consoles like the Nintendo Entertainment System. The 6502 and its progeny are still used today, usually in embedded applications.

Southwest Technical Products introduces the SWTPC 6800

Southwest Technical Products 6800

Southwest Technical Products is founded by Daniel Meyer as DEMCO in the 1960s to provide a source for kit versions of projects published in electronics hobbyist magazines. SWTPC introduces many computer kits based on the Motorola 6800, and later, the 6809. Of the dozens of different SWTP kits available, the 6800 proved the most popular.

Tandem Computers releases the Tandem-16

Dual-processor Tandem 16 system

Tailored for online transaction processing, the Tandem-16 is one of the first commercial fault-tolerant computers. The banking industry rushed to adopt the machine, built to run during repair or expansion. The Tandem-16 eventually led to the “Non-Stop” series of systems, which were used for early ATMs and to monitor stock trades.

VDM prototype built

The Video Display Module (VDM)

The Video Display Module (VDM) marks the first implementation of a memory-mapped alphanumeric video display for personal computers. Introduced at the Altair Convention in Albuquerque in March 1976, the visual display module enabled the use of personal computers for interactive games.
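
A memory-mapped display means the screen contents simply live in ordinary RAM that the video circuitry scans continuously, so drawing text is just storing bytes. The sketch below (C++, illustrative only; the 16 x 32 character grid and function names are assumptions, not the VDM's actual specifications) shows why that made fast, interactive programs practical.

```cpp
// Toy memory-mapped character display: the program writes characters into a
// plain array standing in for video RAM, and refresh() plays the role of the
// hardware that scans that memory out to the screen.
#include <array>
#include <cstring>
#include <iostream>
#include <string>

constexpr int kCols = 32;  // characters per line (small, as in that era)
constexpr int kRows = 16;

std::array<char, kRows * kCols> video_memory{};  // stands in for the mapped RAM

void write_text(int row, int col, const std::string& text) {
    // Storing bytes at row * kCols + col is the whole "interface" -- no I/O calls.
    std::memcpy(&video_memory[row * kCols + col], text.data(), text.size());
}

void refresh() {  // what the display hardware effectively does continuously
    for (int r = 0; r < kRows; ++r) {
        for (int c = 0; c < kCols; ++c) {
            char ch = video_memory[r * kCols + c];
            std::cout << (ch ? ch : ' ');
        }
        std::cout << '\n';
    }
}

int main() {
    write_text(0, 0, "HELLO, PLAYER ONE");
    write_text(2, 4, "SCORE: 0100");
    refresh();
    return 0;
}
```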

Cray-1 supercomputer introduced

Cray I 'Self-portrait'

The fastest machine of its day, the Cray-1 owes part of its speed to its shape, a "C," which reduces the length of wires and thus the time signals need to travel across them. High packaging density of integrated circuits and a novel Freon cooling system also contributed to its speed. Each Cray-1 took a full year to assemble and test and cost about $10 million. Typical applications included US national defense work, including the design and simulation of nuclear weapons, and weather forecasting.

Intel 8080 and Zilog Z-80

Zilog Z-80 microprocessor

Intel and Zilog introduced new microprocessors. Five times faster than its predecessor, the 8008, the Intel 8080 could address four times as many bytes for a total of 64 kilobytes. The Zilog Z-80 could run any program written for the 8080 and included twice as many built-in machine instructions.

Steve Wozniak completes the Apple-1

Designed by Sunnyvale, California native Steve Wozniak, and marketed by his friend Steve Jobs, the Apple-1 is a single-board computer for hobbyists. With an order for 50 assembled systems from Mountain View, California computer store The Byte Shop in hand, the pair started a new company, naming it Apple Computer, Inc. In all, about 200 of the boards were sold before Apple announced the follow-on Apple II a year later as a ready-to-use computer for consumers, a model which sold in the millions for nearly two decades.

Apple II introduced

Sold complete with a main logic board, switching power supply, keyboard, case, manual, game paddles, and cassette tape containing the game Breakout, the Apple II finds popularity far beyond the hobbyist community which made up Apple’s user community until then. When connected to a color television set, the Apple II produced brilliant color graphics for the time. Millions of Apple IIs were sold between 1977 and 1993, making it one of the longest-lived lines of personal computers. Apple gave away thousands of Apple IIs to schools, giving a new generation their first access to personal computers.

Tandy Radio Shack introduces its TRS-80

Performing far better than the company's projection of 3,000 units for the first year, Tandy Radio Shack's first desktop computer — the TRS-80 — sells 10,000 units in the first month after its release. The TRS-80 was priced at $599.95, included a Z80 microprocessor, video display, 4 KB of memory, a built-in BASIC programming language interpreter, cassette storage, and easy-to-understand manuals that assumed no prior knowledge on the part of the user. The TRS-80 proved popular with schools, as well as for home use. The TRS-80 line of computers later included color, portable, and handheld versions before being discontinued in the early 1990s.

The Commodore PET (Personal Electronic Transactor) introduced

Commodore PET

The first of several personal computers released in 1977, the PET comes fully assembled with either 4 or 8 KB of memory, a built-in cassette tape drive, and a membrane keyboard. The PET was popular with schools and for use as a home computer. It used a MOS Technologies 6502 microprocessor running at 1 MHz. After the success of the PET, Commodore remained a major player in the personal computer market into the 1990s.

The DEC VAX introduced

DEC VAX 11/780

Beginning with the VAX-11/780, the Digital Equipment Corporation (DEC) VAX family of computers rivals much more expensive mainframe computers in performance and features the ability to address over 4 GB of virtual memory, hundreds of times the capacity of most minicomputers. Called a “complex instruction set computer,” VAX systems were backward compatible and so preserved the investment owners of previous DEC computers had in software. The success of the VAX family of computers transformed DEC into the second-largest computer company in the world, as VAX systems became the de facto standard computing system for industry, the sciences, engineering, and research.

Atari introduces its Model 400 and 800 computers

Early Atari 400/800 advertisement

Shortly after delivery of the Atari VCS game console, Atari designs two microcomputers with game capabilities: the Model 400 and Model 800. The 400 served primarily as a game console, while the 800 was more of a home computer. Both faced strong competition from the Apple II, Commodore PET, and TRS-80 computers. Atari's 8-bit computers were influential in the arts, especially in the emerging DemoScene culture of the 1980s and '90s.

Motorola introduces the 68000 microprocessor

Die shot of Motorola 68000

The Motorola 68000 microprocessor exhibited a processing speed far greater than its contemporaries. This high-performance processor found its place in powerful workstations intended for the graphics-intensive programs common in engineering.

Texas Instruments TI 99/4 is released

Texas Instruments TI 99/4 microcomputer

Based around the Texas Instruments TMS 9900 microprocessor running at 3 MHz, the TI 99/4 has one of the fastest CPUs available in a home computer. The TI99/4 had a wide variety of expansion boards, with an especially popular speech synthesis system that could also be used with TI's Speak & Spell educational game. The TI 99/4 sold well and led to a series of TI follow-on machines.

Commodore introduces the VIC-20

Commodore VIC-20

Commodore releases the VIC-20 home computer as the successor to the Commodore PET personal computer. Intended to be a less expensive alternative to the PET, the VIC-20 was highly successful, becoming the first computer to sell more than a million units. Commodore even used Star Trek television star William Shatner in advertisements.

The Sinclair ZX80 introduced

Sinclair ZX80

This very small home computer is available in the UK as a kit for £79 or pre-assembled for £99. Inside was a Z80 microprocessor and a built-in BASIC language interpreter. Output was displayed on the user’s home TV screen through use of an adapter. About 50,000 were sold in Britain, primarily to hobbyists, and initially there was a long waiting list for the system.

The Computer Programme debuts on the BBC

Title card- BBC’s The Computer Programme

The British Broadcasting Corporation’s Computer Literacy Project hoped “to introduce interested adults to the world of computers.” Acorn produces a popular computer, the BBC Microcomputer System, so viewers at home could follow along on their own home computers as they watched the program. The machine was expandable, with ports for cassette storage, serial interface and rudimentary networking. A large amount of software was created for the “BBC Micro,” including educational, productivity, and game programs.

Apollo Computer unveils its first workstation, its DN100

Apollo DN100

The DN100 combines a Motorola 68000 microprocessor, a high-resolution display, and built-in networking - the three basic features of all workstations. Apollo and its main competitor, Sun Microsystems, optimized their machines to run the computer-intensive graphics programs common in engineering and scientific applications. Apollo was a leading innovator in the workstation field for more than a decade, and was acquired by Hewlett-Packard in 1989.

IBM introduces its Personal Computer (PC)

IBM's brand recognition, along with a massive marketing campaign, ignites the fast growth of the personal computer market with the announcement of its own personal computer (PC). The first IBM PC, formally known as the IBM Model 5150, was based on a 4.77 MHz Intel 8088 microprocessor and used Microsoft's MS-DOS operating system. The IBM PC revolutionized business computing by becoming the first PC to gain widespread adoption by industry. The IBM PC was widely copied (“cloned”) and led to the creation of a vast “ecosystem” of software, peripherals, and other commodities for use with the platform.

Osborne 1 introduced

Weighing 24 pounds and costing $1,795, the Osborne 1 is the first mass-produced portable computer. Its price was especially attractive as the computer included very useful productivity software worth about $1,500 alone. It featured a 5-inch display, 64 KB of memory, a modem, and two 5.25-inch floppy disk drives.

Commodore introduces the Commodore 64

Commodore 64 system

The C64, as it is better known, sells for $595, comes with 64 KB of RAM and features impressive graphics. Thousands of software titles were released over the lifespan of the C64 and by the time it was discontinued in 1993, it had sold more than 22 million units. It is recognized by the 2006 Guinness Book of World Records as the greatest selling single computer of all time.

Franklin releases Apple II “clones”

Franklin Ace 100 microcomputer

Created almost five years after the original Apple II, Franklin's Ace 1000 main logic board is nearly identical to that in the Apple II+ computer, and other models were later cloned as well. Franklin was able to undercut Apple's pricing even while offering some features not available on the original. Initially, Franklin won a court victory allowing them to continue cloning the machines, but in 1988, Apple won a copyright lawsuit against Franklin, forcing them to stop making Apple II “clones.”

Sun Microsystems is founded

Sun-1 workstation

When Xerox PARC loaned the Stanford Engineering Department an entire Alto Ethernet network with laser printer, graduate student Andy Bechtolsheim re-designed it into a prototype that he then attached to Stanford’s computer network. Sun Microsystems grows out of this prototype. The roots of the company’s name came from the acronym for Stanford University Network (SUN). The company was incorporated by three 26-year-old Stanford alumni: Bechtolsheim, Vinod Khosla and Scott McNealy. The trio soon attracted UC Berkeley UNIX guru Bill Joy, who led software development. Sun helped cement the model of a workstation having an Ethernet interface as well as high-resolution graphics and the UNIX operating system.

Apple introduces the Lisa computer

Lisa is the first commercial personal computer with a graphical user interface (GUI). It was thus an important milestone in computing, as Microsoft Windows and the Apple Macintosh would soon adopt the GUI as their user interface, making it the new paradigm for personal computing. The Lisa ran on a Motorola 68000 microprocessor and came equipped with 1 MB of RAM, a 12-inch black-and-white monitor, dual 5.25-inch floppy disk drives and a 5 MB “Profile” hard drive. Lisa itself, and especially its GUI, were inspired by earlier work at the Xerox Palo Alto Research Center.

Compaq Computer Corporation introduces the Compaq Portable

Compaq Portable

Advertised as the first 100% IBM PC-compatible computer, the Compaq Portable can run the same software as the IBM PC. With the success of the clone, Compaq recorded first-year sales of $111 million, the most ever by an American business in a single year. The success of the Portable inspired many other early IBM-compatible computers. Compaq licensed the MS-DOS operating system from Microsoft and legally reverse-engineered IBM’s BIOS software. Compaq's success launched a market for IBM-compatible computers that by 1996 had achieved an 83-percent share of the personal computer market.

Apple Computer launches the Macintosh

Apple Macintosh

Apple introduces the Macintosh with a television commercial during the 1984 Super Bowl, which plays on the theme of totalitarianism in George Orwell's book 1984. The ad featured the destruction of “Big Brother” – a veiled reference to IBM – through the power of personal computing found in a Macintosh. The Macintosh was the first successful mouse-driven computer with a graphical user interface and was based on the Motorola 68000 microprocessor. Its price was $2,500. Applications that came as part of the package included MacPaint, which made use of the mouse, and MacWrite, which demonstrated WYSIWYG (What You See Is What You Get) word processing.

IBM releases its PC Jr. and PC/AT

The PC Jr. is marketed as a home computer but is too expensive and limited in performance to compete with many of the other machines in that market. Its “chiclet” keyboard was also criticized for poor ergonomics. While the PC Jr. sold poorly, the PC/AT sold in the millions. It offered increased performance and storage capacity over the original IBM PC and sold for about $4,000. It also included more memory and accommodated high-density 1.2-megabyte 5 1/4-inch floppy disks.

PC's Limited is founded

PC’s Limited founder Michael Dell

In 1984, Michael Dell creates PC's Limited while still a student at the University of Texas at Austin. The dorm-room-headquartered company sold IBM PC-compatible computers built from stock components. Dell dropped out of school to focus on his business and in 1985, the company produced the first computer of its own design, the Turbo PC, which sold for $795. By the early 1990s, Dell became one of the leading computer retailers.

The Amiga 1000 is released

Music composition on the Amiga 1000

Commodore’s Amiga 1000 is announced with a major event at New York's Lincoln Center featuring celebrities like Andy Warhol and Debbie Harry of the musical group Blondie. The Amiga sold for $1,295 (without monitor) and had audio and video capabilities beyond those found in most other personal computers. It developed a very loyal following while add-on components allowed it to be upgraded easily. The inside of the Amiga case is engraved with the signatures of the Amiga designers, including Jay Miner as well as the paw print of his dog Mitchy.

Compaq introduces the Deskpro 386 system

Promotional shot of the Compaq Deskpro 386s

Compaq beats IBM to the market when it announces the Deskpro 386, the first computer on the market to use Intel's new 80386 chip, a 32-bit microprocessor with 275,000 transistors on each chip. At 4 million operations per second and 4 kilobytes of memory, the 80386 gave PCs as much speed and power as older mainframes and minicomputers.

The 386 chip brought with it the introduction of a 32-bit architecture, a significant improvement over the 16-bit architecture of previous microprocessors. It had two operating modes, one that mirrored the segmented memory of older x86 chips, allowing full backward compatibility, and one that took full advantage of its more advanced technology. The new chip made graphical operating environments for IBM PC and PC-compatible computers practical. The architecture that allowed Windows and IBM OS/2 has remained in subsequent chips.

IBM releases the first commercial RISC-based workstation

Reduced instruction set computers (RISC) grow out of the observation that the simplest 20 percent of a computer's instruction set does 80 percent of the work. The IBM PC-RT had 1 MB of RAM, a 1.2-megabyte floppy disk drive, and a 40 MB hard drive. It performed 2 million instructions per second, but other RISC-based computers worked significantly faster.

The Connection Machine is unveiled

Connection Machine CM-1

Daniel Hillis of Thinking Machines Corporation moves artificial intelligence a step forward when he develops the controversial concept of massive parallelism in the Connection Machine CM-1. The machine used up to 65,536 one-bit processors and could complete several billion operations per second. Each processor had its own small memory linked with others through a flexible network that users altered by reprogramming rather than rewiring. The machine's system of connections and switches let processors broadcast information and requests for help to other processors in a simulation of brain-like associative recall. Using this system, the machine could work faster than any other at the time on a problem that could be parceled out among the many processors.
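
The general idea behind massive parallelism is to parcel one problem out to many processors and then combine their partial results. The sketch below (ordinary C++ threads, illustrative only and not the CM-1's actual data-parallel programming model) sums a large array by splitting it among several workers.

```cpp
// Toy data-parallel sum: each worker handles one slice of the data, then the
// partial results are combined -- the same divide-and-combine pattern the
// Connection Machine applied across tens of thousands of tiny processors.
#include <cstddef>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t kWorkers = 8;        // the CM-1 had up to 65,536 processors
    std::vector<int> data(1'000'000, 1);   // a problem that splits cleanly
    std::vector<long long> partial(kWorkers, 0);

    std::vector<std::thread> workers;
    for (std::size_t w = 0; w < kWorkers; ++w) {
        workers.emplace_back([&, w] {
            std::size_t chunk = data.size() / kWorkers;
            auto begin = data.begin() + w * chunk;
            auto end = (w + 1 == kWorkers) ? data.end() : begin + chunk;
            partial[w] = std::accumulate(begin, end, 0LL);  // sum this worker's slice
        });
    }
    for (auto& t : workers) t.join();

    long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
    std::cout << "total = " << total << "\n";  // prints 1000000
    return 0;
}
```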

Acorn Archimedes is released

Acorn Archimedes microcomputer

Acorn's ARM RISC microprocessor is first used in the company's Archimedes computer system. One of Britain's leading computer companies, Acorn continued the Archimedes line, which grew to nearly twenty different models, into the 1990s. Acorn spun off ARM as its own company to license microprocessor designs, which in turn has transformed mobile computing with ARM’s low power, high-performance processors and systems-on-chip (SoC).

IBM introduces its Personal System/2 (PS/2) machines

The PS/2 line includes the first IBM system to use Intel's 80386 chip, and the company ships more than 1 million units by the end of the first year. IBM released a new operating system, OS/2, at the same time, allowing the use of a mouse with IBM PCs for the first time. Many credit the PS/2 for making the 3.5-inch floppy disk drive and video graphics array (VGA) standard for IBM computers. The system was IBM's response to losing control of the PC market with the rise of widespread copying of the original IBM PC design by “clone” makers.

Apple co-founder Steve Jobs unveils the NeXT Cube

Steve Jobs, forced out of Apple in 1985, founds a new company – NeXT. The computer he created, an all-black cube, was an important innovation. The NeXT had three Motorola microprocessors and 8 MB of RAM. Its base price was $6,500. Some of its other innovations were the inclusion of a magneto-optical (MO) disk drive, a digital signal processor and the NeXTSTEP programming environment (later released as OPENSTEP). This object-oriented multitasking operating system was groundbreaking in its ability to foster rapid development of software applications. OPENSTEP was used as one of the foundations for the new Mac OS operating system soon after NeXT was acquired by Apple in 1996.

Laser 128 is released

Laser 128 Apple II clone

VTech, founded in Hong Kong, had been a manufacturer of Pong-like games and educational toys when they introduce the Laser 128 computer. Instead of simply copying the basic input output system (BIOS) of the Apple II as Franklin Computer had done, they reverse-engineered the system and sold it for US $479, a much lower price than the comparable Apple II. While Apple sued to remove the Laser 128 from the market, they were unsuccessful and the Laser remained one of the very few Apple “clones” for sale.

Intel introduces the 80486 microprocessor

Intel 80486 promotional photo

Intel released the 80486 microprocessor and the i860 RISC/coprocessor chip, each of which contained more than 1 million transistors. The i860 RISC microprocessor had a 32-bit integer arithmetic and logic unit (the part of the CPU that performs operations such as addition and subtraction), a 64-bit floating-point unit, and a clock rate of 33 MHz.

The 486 chips remained similar in structure to their predecessors, the 386 chips. What set the 486 apart was its optimized instruction set, with an on-chip unified instruction and data cache and an optional on-chip floating-point unit. Combined with an enhanced bus interface unit, the microprocessor doubled the performance of the 386 without increasing the clock rate.

Macintosh Portable is introduced

Macintosh Portable

Apple had initially included a handle in their Macintosh computers to encourage users to take their Macs on the go, though not until five years after the initial introduction does Apple introduce a true portable computer. The Macintosh Portable was heavy, weighing sixteen pounds, and expensive (US$6,500). Sales were weaker than projected, despite being widely praised by the press for its active matrix display, removable trackball, and high performance. The line was discontinued less than two years later.

Intel's Touchstone Delta supercomputer system comes online

Intel Touchstone Delta supercomputer

Reaching 32 gigaflops (32 billion floating point operations per second), Intel’s Touchstone Delta has 512 processors operating independently, arranged in a two-dimensional communications “mesh.” Caltech researchers used this supercomputer prototype for projects such as real-time processing of satellite images, and for simulating molecular models in AIDS research. It would serve as the model for several other significant multi-processor systems that would be among the fastest in the world.

Babbage's Difference Engine #2 is completed

The Difference Engine #2 at the Science Museum, London

Based on Charles Babbage's second design for a mechanical calculating engine, a team at the Science Museum in London sets out to prove that the design would have worked as planned. Led by curator Doron Swade, the team built Babbage's machine in six years, using techniques that would have been available to Babbage at the time, proving that Babbage's design was accurate and that it could have been built in his day.

PowerBook series of laptops is introduced

PowerBook 100 laptop computer

Apple's Macintosh Portable meets with little success in the marketplace and leads to a complete redesign of Apple's line of portable computers. All three PowerBooks introduced featured a built-in trackball, internal floppy drive, and palm rests, which would eventually become typical of 1990s laptop design. The PowerBook 100 was the entry-level machine, while the PowerBook 140 was more powerful and had a larger memory. The PowerBook 170 was the high-end model, featuring an active matrix display, faster processor, as well as a floating point unit. The PowerBook line of computers was discontinued in 2006.

DEC announces Alpha chip architecture

DEC Alpha chip die-shot

Designed to replace the 32-bit VAX architecture, the Alpha is a 64-bit reduced instruction set computer (RISC) microprocessor. It was widely used in DEC's workstations and servers, as well as several supercomputers like the Chinese Sunway Blue Light system and the Swiss Gigabooster. The Alpha processor designs were eventually acquired by Compaq, which, along with Intel, phased out the Alpha architecture in favor of the HP/Intel Itanium microprocessor.

Intel Paragon is operational

Intel Paragon system

Based on the Touchstone Delta computer Intel had built at Caltech, the Paragon is a parallel supercomputer that uses 2,048 (later increased to more than four thousand) Intel i860 processors. More than one hundred Paragons were installed over the lifetime of the system, each costing as much as five million dollars. The Paragon at Caltech was named the fastest supercomputer in the world in 1992. Paragon systems were used in many scientific areas, including atmospheric and oceanic flow studies, and energy research.

Apple ships the first Newton

The Apple Newton Personal Digital Assistant

Apple enters the handheld computer market with the Newton. Dubbed a “Personal Digital Assistant” by Apple President John Sculley in 1992, the Newton featured many of the features that would define handheld computers in the following decades. The handwriting recognition software was much maligned for inaccuracy. The Newton line never performed as well as hoped and was discontinued in 1998.

Intel's Pentium microprocessor is released

HP Netserver LM, one of the first to use Intel's Pentium

The Pentium is the fifth generation of the ‘x86’ line of microprocessors from Intel, the basis for the IBM PC and its clones. The Pentium introduced several advances that made programs run faster such as the ability to execute several instructions at the same time and support for graphics and music.

RISC PC is released

Acorn RISC PC

Replacing their Archimedes computer, the RISC PC from UK's Acorn Computers uses the ARMv3 RISC microprocessor. Though it used a proprietary operating system, RISC OS, the RISC PC could run PC-compatible software using the Acorn PC Card. The RISC PC was used widely in UK broadcast television and in music production.

BeBox is released

BeBox computer

Be, founded by former Apple executive Jean Louis Gassée and a number of former Apple, NeXT and SUN employees, releases their only product – the BeBox. Using dual PowerPC 603 CPUs, and featuring a large variety of peripheral ports, the first devices were used for software development. While it did not sell well, the operating system, Be OS, retained a loyal following even after Be stopped producing hardware in 1997 after less than 2,000 machines were produced.

IBM releases the ThinkPad 701C

IBM ThinkPad 701C

Officially known as the Track Write, the automatically expanding full-sized keyboard used by the ThinkPad 701 is designed by inventor John Karidis. The keyboard was composed of three roughly triangular interlocking pieces, which formed a full-sized keyboard when the laptop was opened -- resulting in a keyboard significantly wider than the case. This keyboard design was dubbed “the Butterfly.” The need for such a design was lessened as laptop screens grew wider.

Palm Pilot is introduced

Ed Colligan, Donna Dubinsky, and Jeff Hawkins

Palm Inc., founded by Ed Colligan, Donna Dubinsky, and Jeff Hawkins, originally created software for the Casio Zoomer personal data assistant. The first generation of Palm-produced devices, the Palm 1000 and 5000, are based around a Motorola microprocessor running at 16 MHz and use a special gestural input language called “Graffiti,” which is quick to learn and fast. A Palm could be connected to a PC or Mac using a serial port to synchronize – “sync” – both computer and Palm. The company called it a ‘connected organizer’ rather than a PDA to emphasize this ability.

Sony Vaio series is begun

Sony Vaio laptop

Sony had manufactured and sold computers in Japan, but the VAIO signals their entry into the global computer market. The first VAIO, a desktop computer, featured an additional 3D interface on top of the Windows 95 operating system as a way of attracting new users. The VAIO line of computers would be best known for laptops designed with communications and audio-video capabilities at the forefront, including innovative designs that incorporated TV and radio tuners, web cameras, and handwriting recognition. The line was discontinued in 2014.

ASCI Red is operational

ASCI Red supercomputers

The Accelerated Strategic Computing Initiative (ASCI) needed a supercomputer to help with the maintenance of the US nuclear arsenal following the ban on underground nuclear testing. The ASCI Red, based on the design of the Intel Paragon, was built by IBM and delivered to Sandia National Laboratories. Until the year 2000, it was the world's fastest supercomputer, able to achieve a peak performance of 1.3 teraflops (about 1.3 trillion calculations per second).

Linux-based Supercomputing

Linux Supercomputer

The first supercomputer using the Linux operating system, consumer off-the-shelf parts, and a high-speed, low-latency interconnection network was developed by David A. Bader while at the University of New Mexico. From this successful prototype design, Bader led the development of “RoadRunner,” the first Linux supercomputer for open use by the national science and engineering community via the National Science Foundation's National Technology Grid. RoadRunner was put into production use in April 1999. Within a decade this design became the predominant architecture for all major supercomputers in the world.

The iMac, a range of all-in-one Macintosh desktop computers, is launched

iMac poster

Apple makes a splash with its Bondi Blue iMac, which sells for about $1,300. Customers got a machine with a 233-MHz G3 processor, 4GB hard drive, 32MB of RAM, a CD-ROM drive, and a 15" monitor. The machine was noted for its ease-of-use and included a 'manual' that contained only a few pictures and less than 20 words. As Apple’s first new product under the leadership of a returning Steve Jobs, many consider this the most significant step in Apple's return from near-bankruptcy in the middle 1990s.

First camera phone introduced

Sharp-built J-Phone J-SH04

Japan's SoftBank introduces the first camera phone, the J-Phone J-SH04, a Sharp-manufactured digital phone with an integrated camera. The camera had a maximum resolution of 0.11 megapixels, the display supported 256 colors, and photos could be shared wirelessly. The J-Phone line would quickly expand, releasing a flip-phone version just a month later. Cameras would become a significant part of most phones within a year, and several countries have even passed laws regulating their use.

Earth Simulator is world's fastest supercomputer

Earth Simulator Supercomputer

Developed by the Japanese government to create global climate models, the Earth Simulator is a massively parallel, vector-based system that costs nearly 60 billion yen (roughly $600 million at the time). A consortium of aerospace, energy, and marine science agencies undertook the project, and the system was built by NEC around their SX-6 architecture. To protect it from earthquakes, the building housing it was built using a seismic isolation system that used rubber supports. The Earth Simulator was listed as the fastest supercomputer in the world from 2002 to 2004.

Handspring Treo is released

Colligan, Dubinsky, Hawkins (left to right)

Leaving Palm Inc., Ed Colligan, Donna Dubinsky, and Jeff Hawkins found Handspring. After retiring their initial Visor series of PDAs, Handspring introduced the Treo line of smartphones, designed with built-in keyboards, cameras, and the Palm operating system. The Treo sold well, and the line continued until Handspring was purchased by Palm in 2003.

PowerMac G5 is released

PowerMac G5 tower computer

With a distinctive anodized aluminum case, and hailed as the first true 64-bit personal computer, the Apple G5 is the most powerful Macintosh ever released to that point. While larger than the previous G4 towers, the G5 had comparatively limited space for expansion. Virginia Tech used more than a thousand PowerMac G5s to create the System X cluster supercomputer, rated #3 that November on the TOP500 list of the world's fastest computers.

Arduino is introduced

Arduino starter kit

Harkening back to the hobbyist era of personal computing in the 1970s, Arduino begins as a project of the Interaction Design Institute, Ivrea, Italy. Each credit card-sized Arduino board consisted of an inexpensive microcontroller and signal connectors which made Arduinos ideal for use in any application connecting to or monitoring the outside world. The Arduino used a Java-based integrated development environment and users could access a library of programs, called “Wiring,” that allowed for simplified programming. Arduino soon became the main computer platform of the worldwide “Maker” movement.
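
Arduino sketches are written in a C/C++ dialect built around two entry points, setup() and loop(). The classic “blink” example below assumes the standard Arduino core (which supplies pinMode(), digitalWrite(), delay(), and LED_BUILTIN) and shows how little code is needed to drive a pin connected to the outside world.

```cpp
// Minimal Arduino-style blink sketch (requires the Arduino core/IDE to build;
// it is not a stand-alone C++ program).
void setup() {
    pinMode(LED_BUILTIN, OUTPUT);     // configure the on-board LED pin as an output
}

void loop() {                         // the core calls loop() over and over
    digitalWrite(LED_BUILTIN, HIGH);  // LED on
    delay(1000);                      // wait one second
    digitalWrite(LED_BUILTIN, LOW);   // LED off
    delay(1000);
}
```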

Lenovo acquires IBM's PC business

IBM and Lenovo logos

Nearly a quarter century after IBM launched their PC in 1981, they had become merely another player in a crowded marketplace. Lenovo, China's largest manufacturer of PCs, purchased IBM's personal computer business in 2005, largely to gain access to IBM's ThinkPad line of computers and sales force. Lenovo became the largest manufacturer of PCs in the world with the acquisition, later also acquiring IBM's server line of computers.

NASA Ames Research Center supercomputer Columbia

Columbia Supercomputer system made up of SGI Altix

Named in honor of the space shuttle which broke-up on re-entry, the Columbia supercomputer is an important part of NASA's return to manned spaceflight after the 2003 disaster. Columbia was used in space vehicle analysis, including studying the Columbia disaster, but also in astrophysics, weather and ocean modeling. At its introduction, it was listed as the second fastest supercomputer in the world and this single system increased NASA's supercomputing capacity 10-fold. The system was kept at NASA Ames Research Center until 2013, when it was removed to make way for two new supercomputers.

One Laptop Per Child initiative begins

OLPC XO laptop computer

At the 2006 World Economic Forum in Davos, Switzerland, the United Nations Development Program (UNDP) announces it will create a program to deliver technology and resources to targeted schools in the least developed countries. The project became the One Laptop per Child Consortium (OLPC) founded by Nicholas Negroponte, the founder of MIT's Media Lab. The first offering to the public required the buyer to purchase one to be given to a child in the developing world as a condition of acquiring a machine for themselves. By 2011, over 2.4 million laptops had been shipped.

The Amazon Kindle is released

Amazon Kindle

Many companies have attempted to release electronic reading systems dating back to the early 1990s. Online retailer Amazon released the Kindle, one of the first to gain a large following among consumers. The first Kindle featured wireless access to content via Amazon.com, along with an SD card slot allowing increased storage. The first release proved so popular that there was a long delay in delivering systems. Follow-on versions of the Kindle added further audio-video capabilities.

The Apple iPhone is released

Apple iPhone

Apple launches the iPhone - a combination of web browser, music player and cell phone - which could download new functionality in the form of "apps" (applications) from the online Apple store. The touchscreen-enabled smartphone also had built-in GPS navigation, high-definition camera, texting, calendar, voice dictation, and weather reports.

The MacBook Air is released

Steve Jobs introducing MacBook Air

Apple introduces their first ultraportable notebook – a light, thin laptop with a high-capacity battery. The Air incorporated many of the technologies that had been associated with Apple's MacBook line of laptops, including an integrated camera and Wi-Fi capabilities. To reduce its size, the traditional hard drive was replaced with a solid-state disk, making the Air the first mass-market computer to do so.

IBM's Roadrunner supercomputer is completed

Computer-enhanced image of IBM’s Roadrunner

The Roadrunner is the first computer to reach a sustained performance of 1 petaflop (one thousand trillion floating point operations per second). It used two different microprocessors: an IBM PowerXCell 8i and an AMD Opteron. It was used to model the decay of the US nuclear arsenal, analyze financial data, and render 3D medical images in real-time. The PowerXCell 8i was an enhanced version of the Cell chip used as the main processor in the Sony PlayStation 3 game console.

Jaguar Supercomputer at Oak Ridge upgraded

Originally a Cray XT3 system, the Jaguar is a massively parallel supercomputer at Oak Ridge National Laboratory, a US science and energy research facility. The system cost more than $100 million to create and ran a variation of the Linux operating system with up to 10 petabytes of storage. The Jaguar was used to study climate science, seismology, and astrophysics applications. It was the fastest computer in the world from November 2009 to June 2010.

Apple Retina Display

Introduction of the iPhone 4 with retina display

Since the release of the Macintosh in 1984, Apple has placed emphasis on high-resolution graphics and display technologies. In 2012, Apple introduced the Retina display for the MacBook Pro laptop and iPad tablet. With a screen resolution of up to 400 pixels-per-inch (PPI), Retina displays approached the limit of pixel visibility to the human eye. The display also used In Plane Switching (IPS) technology, which allowed for a wider viewing angle and improved color accuracy. The Retina display became standard on most of the iPad, iPhone, MacBook, and Apple Watch product lines.
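
Pixel density follows directly from a panel's resolution and diagonal size: PPI = sqrt(width² + height²) / diagonal. The short worked example below (C++) uses the 2880 x 1800, 15.4-inch figures of the 15-inch Retina MacBook Pro purely as an illustration of the arithmetic.

```cpp
// Worked pixel-density calculation: diagonal resolution in pixels divided by
// diagonal size in inches.
#include <cmath>
#include <iostream>

int main() {
    double width_px = 2880.0, height_px = 1800.0, diagonal_in = 15.4;
    double ppi = std::sqrt(width_px * width_px + height_px * height_px) / diagonal_in;
    std::cout << "pixel density: " << ppi << " PPI\n";  // roughly 220 PPI
    return 0;
}
```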

China's Tianhe supercomputers are operational

Tianhe-1A Supercomputer

With a peak speed of over a petaflop (one thousand trillion calculations per second), the Tianhe 1 (translation: Milky Way 1) is developed by the Chinese National University of Defense Technology using Intel Xeon processors combined with AMD graphic processing units (GPUs). The upgraded and faster Tianhe-1A used Intel Xeon CPUs as well, but switched to nVidia's Tesla GPUs and added more than 2,000 FeiTeng (SPARC-based) processors. The machines were used by the Chinese Academy of Sciences to run massive solar energy simulations, as well as some of the most complex molecular studies ever undertaken.

The Apple iPad is released

Steve Jobs introducing the iPad

The iPad combines many of the popular capabilities of the iPhone, such as built-in high-definition camera, access to the iTunes Store, and audio-video capabilities, but with a nine-inch screen and without the phone. Apps, games, and accessories helped spur the popularity of the iPad and led to its adoption in thousands of different applications from movie making, creating art, making music, inventory control and point-of-sale systems, to name but a few.

IBM Sequoia is delivered to Lawrence Livermore Labs

Built by IBM using its Blue Gene/Q supercomputer architecture, the Sequoia system was the world's fastest supercomputer in 2012. Despite using 98,304 PowerPC chips, Sequoia's relatively low power usage made it unusually efficient. Scientific and defense applications included studies of human electrophysiology, nuclear weapon simulation, human genome mapping, and global climate change.

Nest Learning Thermostat is Introduced


Nest Learning Thermostat

The Nest Learning Thermostat is an early product made for the emerging "Internet of Things," which envisages a world in which common everyday devices have network connectivity and can exchange information or be controlled. The Nest allowed remote access to a user's home thermostat from a smartphone or tablet and could also send monthly power-consumption reports to help save on energy bills. The Nest would remember what temperature users preferred by "training" itself: it monitored daily use patterns for a few days and then adopted that pattern as its new way of controlling home temperature.

Raspberry Pi, a credit-card-size single board computer, is released as a tool to promote science education


Raspberry Pi computer

Conceived in the UK by the Raspberry Pi Foundation, this credit-card-sized computer features ease of use and simplicity, making it highly popular with students and hobbyists. In October 2013, the one millionth Raspberry Pi was shipped. Only one month later, another one million Raspberry Pis were delivered. The Pi weighed only 45 grams and initially sold for only $25 to $35 US.

University of Michigan Micro Mote is Completed

The University of Michigan Micro Mote (M3) is the smallest computer in the world at the time of its completion. Three types of the M3 were available – two that measured either temperature or pressure and one that could take images. The motes were powered by a tiny battery and could gain light energy through a photocell, which was enough to feed the infinitesimally small amount of energy a mote consumes (1 picowatt). Motes are also known as "smart dust," since the intention is that their tiny size and low cost make them inexpensive enough to "sprinkle" into the real world as sensors. An ecologist, for example, could sprinkle thousands of motes from the air onto a field and measure soil and air temperature, moisture, and sunlight, giving them accurate real-time data about the environment.

Apple Watch


Apple Store’s display of newly introduced Apple Watches

Building a computer into the watch form factor has been attempted many times, but the release of the Apple Watch led to a new level of excitement. Incorporating a version of Apple's iOS operating system, as well as sensors for environmental and health monitoring, the Apple Watch was designed to be incorporated into the Apple ecosystem, with compatibility with iPhones and MacBooks. Almost a million units were ordered on the day of release. The Watch was received with great enthusiasm, but critics took issue with the somewhat limited battery life and high price.


Pre-Requisite: Basics of Computer

A computer is a device that transforms data into meaningful information. It processes the input according to the set of instructions provided to it by the user and gives the desired output. Computers come in various types, and they can be categorized in two ways: on the basis of size and on the basis of data-handling capability.


Types of Computer

There are two bases on which we can define the types of computers: size and data-handling capability. We will discuss each type of computer in detail, but first, here are the main types.

  • Supercomputer
  • Mainframe computer
  • Minicomputer
  • Workstation computer
  • Personal computer (PC)
  • Server computer
  • Analog computer
  • Digital computer
  • Hybrid computer
  • Tablets and smartphones

Now, we are going to discuss each of them in detail.

Supercomputer

When we talk about speed, supercomputers are the first computers that come to mind. They are the biggest and fastest computers (in terms of speed of processing data). Supercomputers are designed to process huge amounts of data, executing trillions of instructions in a single second thanks to thousands of interconnected processors. They are mainly used in scientific and engineering applications such as weather forecasting, scientific simulations, and nuclear energy research. The first supercomputer was developed by Seymour Cray in 1976.

Supercomputers

Characteristics of Supercomputers

  • Supercomputers are the fastest computers, and they are also very expensive.
  • They can perform up to ten trillion individual calculations per second (see the sketch below for a sense of what that rate means).
  • They are used in the stock market and by big organizations for managing online currencies such as Bitcoin.
  • They are used in scientific research for analyzing data obtained from exploring the solar system, satellites, and similar sources.
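To put that figure in perspective, a little back-of-the-envelope arithmetic shows how long a machine running at ten trillion calculations per second would take to finish a large job. The workload size here is a made-up illustrative number, not one taken from the article:

```python
ops_per_second = 10e12   # ten trillion calculations per second, as stated above
workload = 1e18          # hypothetical job needing a quintillion calculations

seconds = workload / ops_per_second
print(seconds, "seconds")          # 100000.0 seconds
print(seconds / 3600, "hours")     # about 27.8 hours
```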

Mainframe Computer

Mainframe computers are designed in such a way that they can support hundreds or thousands of users at the same time. They also support multiple programs simultaneously, so they can execute different processes in parallel. All these features make the mainframe computer ideal for big organizations such as banks and telecom companies, which process high volumes of data in general.

Characteristics of Mainframe Computers

  • It is also an expensive computer.
  • It has high storage capacity and great performance.
  • It can process a huge amount of data (like the data involved in the banking sector) very quickly.
  • It runs smoothly for a long time and has a long life.

Minicomputer

A minicomputer is a medium-sized multiprocessing computer. This type of computer has two or more processors and supports 4 to 200 users at one time. Minicomputers are used in places like institutes or departments for work such as billing, accounting, and inventory management. A minicomputer is smaller than a mainframe computer but larger than a microcomputer.

Characteristics of Minicomputer

  • It is light in weight.
  • Because of its low weight, it is easy to carry anywhere.
  • It is less expensive than a mainframe computer.
  • It is fast.

Workstation Computer

A workstation computer is designed for technical or scientific applications. It consists of a fast microprocessor, a large amount of RAM, and a high-speed graphics adapter. It is a single-user computer. It is generally used to perform a specific task with great accuracy.

Characteristics of Workstation Computer

  • It is expensive or high in cost.
  • They are exclusively made for complex work purposes.
  • It provides large storage capacity, better graphics, and a more powerful CPU when compared to a PC.
  • It is also used to handle animation, data analysis, CAD, audio and video creation, and editing.

Personal Computer (PC)

A personal computer is also known as a microcomputer. It is basically a general-purpose computer designed for individual use. It consists of a microprocessor as the central processing unit (CPU), memory, an input unit, and an output unit. This kind of computer is suitable for personal work such as writing an assignment, watching a movie, or doing office work. Laptops and desktop computers are examples.

Personal Computer

Characteristics of Personal Computer (PC)

  • Only a limited range of software can be used on it.
  • It is the smallest in size.
  • It is designed for personal use.
  • It is easy to use.

Server Computer

Server computers are central computers that hold collections of data and programs. Electronic data and applications are stored on and shared from the server. A server computer does not solve one big problem the way a supercomputer does; instead, it solves many smaller, similar ones. Wikipedia's servers are an example: when a user requests a page, the server finds what the user is looking for and sends it to the user.

Analog Computer

Analog computers are particularly designed to process analog data. Continuous data that changes continuously and cannot have discrete values is called analog data. So, an analog computer is used where we don't need exact values, or need only approximate values, such as speed, temperature, or pressure. It can directly accept data from the measuring device without first converting it into numbers and codes. It measures continuous changes in a physical quantity and gives its output as a reading on a dial or scale. Speedometers and mercury thermometers are examples.

Digital Computer

Digital computers are designed in such a way that they can easily perform calculations and logical operations at high speed. A digital computer takes raw data as input and processes it with programs stored in its memory to produce the final output. It only understands binary input, 0 and 1, so raw input data is converted to 0s and 1s by the computer and then processed to produce the result. All modern computers, like laptops, desktops, and smartphones, are digital computers.
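A quick way to see this binary representation is to ask an ordinary (digital) computer to show the bits behind a number or a piece of text. A minimal sketch in Python; the specific values are just examples:

```python
n = 42
bits = bin(n)                 # '0b101010' - the number written in 0s and 1s
print(bits, int(bits, 2))     # converting back recovers 42

text = "Hi"
# Each character is stored as a byte, i.e. eight binary digits.
print([format(b, "08b") for b in text.encode("ascii")])  # ['01001000', '01101001']
```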

Hybrid Computer

As the name suggests, a hybrid is made by combining two different things. Similarly, a hybrid computer is a combination of an analog and a digital computer. Hybrid computers are fast like analog computers and have memory and accuracy like digital computers, so they can process both continuous and discrete data. When a hybrid computer accepts analog signals as input, it converts them into digital form before processing them. It is widely used in specialized applications where both analog and digital data need to be processed. A processor used in petrol pumps that converts measurements of fuel flow into quantity and price is an example of a hybrid computer.
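The petrol-pump example boils down to sampling a continuous flow signal, digitizing it, and then doing ordinary digital arithmetic on the samples. A toy sketch of that idea, with made-up sample values and a made-up price per litre:

```python
# Hypothetical analog flow readings (litres per second), sampled once per second.
flow_samples = [0.42, 0.45, 0.44, 0.43]
PRICE_PER_LITRE = 1.75  # assumed illustrative price

litres = sum(flow_samples)               # digital integration of the analog signal
cost = round(litres * PRICE_PER_LITRE, 2)
print(f"{litres:.2f} litres dispensed, total cost {cost}")
```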

Tablets and Smartphones

Tablets and smartphones are types of computers that are pocket-friendly and easy to carry because they are handheld. They are one of the best uses of modern technology. These devices have strong hardware capabilities, extensive operating systems, and good multimedia functionality. Smartphones and tablets contain a number of sensors and also support wireless communication protocols.

Tablet and Smartphones

We generally classify computers on the basis of size, functionality, and data handling capabilities. For more, you can refer to Classification of Computers .

1. Which computer can deal with analog data?

(A) Analogue Computer

(B) Digital Computer

(C) both a and b

(D) None of the above

The correct option is A, i.e., Analogue computer. An analogue computer is particularly designed to process analogue data. Continuous data that changes continuously and cannot have discrete values is called analogue data.

2. __________ is also known as a Microcomputer.

(A) Supercomputer

(B) Minicomputer

(C) Workstation

(D) Personal computer

Solution: 

The correct option is D, i.e., Personal computer.

3. Which type of computer has two or more processors and supports 4 to 200 users at one time?

(A) Minicomputer 

(B) Personal computer

(C) Analogue computer

(D) All of the above

The correct option is A, i.e., Minicomputer. A minicomputer is a medium-sized multiprocessing computer; it has two or more processors and supports 4 to 200 users at one time.

4. All modern computers, like laptops, desktops including smartphones, are ______________computers.

(A) Hybrid 

(B) Analogue

(C) Digital

(D) Supercomputer

The correct option is C, i.e., digital.


Types of computers


This is a lesson in the course Introduction to Computers, which is part of The School of Computer Science.

This is a resource to learn about different types of computers that exist.

Supercomputer


Supercomputers are the fastest and the most expensive computers. These huge computers are used to solve very complex science and engineering problems. Supercomputers get their processing power by taking advantage of parallel processing; they use lots of CPUs at the same time on one problem. A typical supercomputer can do up to ten trillion individual calculations every second.
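Parallel processing simply means splitting one big job across many processors and combining the partial results. A minimal sketch of the idea on an ordinary multi-core machine, using Python's standard multiprocessing module; the workload (a sum of squares) is chosen only for illustration:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 8_000_000, 4                       # split one job across 4 processes
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_sum, chunks))  # combine the partial results
    print(total)
```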

Quantum computer

Quantum computers are an emerging class of machines that the industry expects to complement, and for some problems eventually outperform, supercomputers.


  • Performance is measured in "qubits" rather than classical bits.

Mainframe


Mainframe (colloquially, "big iron") computers are similar to supercomputers in many respects; the main difference between them is that a supercomputer uses all its raw power to focus on very few tasks, while a mainframe performs thousands or millions of operations concurrently. Due to this nature, mainframes are often employed by large organizations for bulk data processing, such as censuses, industry and consumer statistics, enterprise resource planning, and transaction processing.

Server Computer


A server is a central computer that contains collections of data and programs. Also called a network server, this system allows all connected users to share and store electronic data and applications. Two important types of servers are file servers and application servers.

Servers are a step below supercomputers because they don't focus on trying to solve one very complex problem but try to solve many similar smaller ones. An example of servers would be the computers that Wikipedia stores its encyclopedia on. Those computers have to go and find the page you're looking for and send it to you. In itself, that's not a big task, but it becomes a job for a server when the computers have to find lots of pages for a lot of people and send them to the right places. Some servers, like the ones Google uses for Google Documents, have applications on them rather than just files.
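The pattern described here, many clients asking one machine for files over a network, is easy to demonstrate with Python's built-in http.server module. This is a toy file server under simple assumptions, not how Wikipedia or Google actually run their infrastructure:

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Serve the files in the current directory to any client that asks for them.
# Each incoming request is looked up on disk and sent back, much like the
# page-fetching behaviour described above.
server = HTTPServer(("0.0.0.0", 8000), SimpleHTTPRequestHandler)
print("Serving on http://localhost:8000 ...")
server.serve_forever()
```

Visiting http://localhost:8000 in a browser then plays the role of the client requesting a page.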

Workstation Computer


Workstations are high-end, expensive computers that are made for more complex procedures and are intended for one user at a time. Some of these complex procedures include science, math, and engineering calculations, and workstations are useful for computer design and manufacturing. Workstations are sometimes improperly named for marketing reasons. Real workstations are not usually sold at retail, but this is starting to change; Apple's Mac Pro would be considered a workstation.

The movie Toy Story was made on a set of Sun (SPARC) workstations. [1]

Personal Computer or PC

PC is an abbreviation for Personal Computer; it is also known as a microcomputer. Its physical characteristics and low cost are appealing and useful for its users. The capabilities of a personal computer have changed greatly since the introduction of electronic computers. By the early 1970s, people in academic or research institutions had the opportunity for single-person use of a computer system in interactive mode for extended durations, although these systems would still have been too expensive to be owned by a single individual. The introduction of the microprocessor, a single chip with all the circuitry that formerly occupied large cabinets, led to the proliferation of personal computers after about 1975. Early personal computers, generally called microcomputers, were often sold in kit form and in limited volumes and were of interest mostly to hobbyists and technicians. By the late 1970s, mass-market pre-assembled computers allowed a wider range of people to use computers, focusing more on software applications and less on the development of the processor hardware. Throughout the 1970s and 1980s, home computers were developed for household use, offering some personal productivity, programming, and games, while somewhat larger and more expensive systems (although still low-cost compared with minicomputers and mainframes) were aimed at office and small business use.

Today a personal computer is an all-around device that can be used as a productivity tool, a media server and a gaming machine. The modular construction of the personal computer allows components to be easily swapped out when broken or upgrading.

Microcontroller


Microcontrollers are mini-computers that enable the user to store data and execute simple commands and tasks. Many such systems are known as embedded systems . The computer in your car, for example, is an embedded system. A common microcontroller that one might come across is called Arduino .
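Many microcontrollers like the Arduino are programmed in C or C++, but boards such as the Raspberry Pi Pico can also run MicroPython. A minimal MicroPython-style blink sketch is shown below; it assumes a board whose onboard LED sits on GPIO 25 (true of the original Pico) and it will not run on a desktop Python interpreter:

```python
# MicroPython sketch (runs on the microcontroller, not on a PC).
from machine import Pin   # hardware access module provided by MicroPython
import time

led = Pin(25, Pin.OUT)     # assumption: onboard LED wired to GPIO 25

while True:
    led.value(1)           # LED on
    time.sleep(0.5)
    led.value(0)           # LED off
    time.sleep(0.5)
```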

Smartphone


A smartphone is a mobile device that combines cellular and mobile computing functions into one unit. They are distinguished from feature phones by their stronger hardware capabilities and extensive mobile operating systems, which facilitate wider software, internet (including web browsing[1] over mobile broadband), and multimedia functionality (including music, video, cameras, and gaming), alongside core phone functions such as voice calls and text messaging. Smartphones typically contain a number of metal–oxide–semiconductor (MOS) integrated circuit (IC) chips, include various sensors that can be leveraged by their software (such as a magnetometer, proximity sensors, barometer, gyroscope, or accelerometer), and support wireless communications protocols (such as Bluetooth, Wi-Fi, or satellite navigation). [1]

References

  • ↑ Wikipedia: Smartphone


10 Types of Computers, From Wearables to Supercomputers


It's impossible to imagine life without a computer nowadays. From work to entertainment, these machines have become an integral part of our daily lives. But did you know there are various types of computers , each designed for specific tasks and purposes?

While the term "computer" can apply to virtually any device that has a microprocessor , most people envision a device that receives input through a mouse or keyboard , processes it and displays the result on a screen . The hardware and software within computers have evolved at a circuit-snapping pace in the past few decades — the bulky, desk-crushing machines from the early '80s look nothing like the featherweight touchscreen tablets or laptop computers of today.

Modern computers are not just faster; they're more interconnected, thanks to the internet and various web technologies. The days of dial-up modems and text-based systems are long gone. Today, computers use WiFi and broadband connections to deliver everything from live news streams to high-definition movies and intricate video games.

There are a lot of terms used to describe different types of computers. Most of these words imply the size, expected use or capability of the computer. Let's get started with the most obvious one.


10. The All-Powerful Personal Computer

1994 computer

The term "personal computer" (PC) describes a computer designed specifically for individual use. While Apple's iMac falls under the category of a PC, most associate the term with computers that run on the Windows operating system . These PCs, initially dubbed as microcomputers, were a compact version of the massive computer systems that businesses employed.

In 1981, iconic tech maker IBM unveiled its first PC, which relied on Microsoft's now-legendary operating system — MS-DOS (Microsoft Disk Operating System). Not to be left behind, Apple, in 1983, introduced the Lisa, marking one of the first instances of a PC equipped with a graphical user interface (GUI) [sources: Alfred , Cabell ]. This meant that for the first time, users could interact with on-screen icons rather than a bland text interface.

Over the years, advancements in hardware components like the central processing unit (CPU) and random access memory (RAM) have skyrocketed. These leaps in technology allowed for an exponential increase in computing power. For instance, in 1986, Compaq introduced a 32-bit CPU in its 386 machines, and in 1993, Intel unveiled its first Pentium processor [sources: PCWorld , Tom's Hardware ].

Modern personal computers have evolved to include touchscreens, a myriad of built-in connectivity options like Bluetooth and WiFi, and ever-evolving operating systems. The physical form of these machines, from desktop computers to portable laptops, has also seen significant transformations. Today, PCs are more than just tools for data processing or playing games; they're integral to countless aspects of daily life, from scientific research to weather forecasting.

9. Desktops

desktop computer

Until the middle of the 1980s, consumers had one choice for a PC — and it was the desktop format. These knee-knocking boxes (called "towers") were big enough to gouge your shins. Equipped with large CRT (cathode ray tube) monitors, they crowded your home workspace or the office. The expectation with desktop systems was that you would set the computer up in a permanent location.

Most desktops initially offered more power, storage and versatility for less cost than their portable brethren, which was what made them the go-to computer in the 1990s, when laptops were still thousands of dollars [source: Britannica ].

These days, desktops are much, much cheaper than they were 20 years ago, and you can have one for just a few hundred dollars. That's a far cry from the thousands of dollars they cost in the '80s. In fact, one of Hewlett-Packard's first business PCs, the 300, cost $95,000 in 1972 [source: Comen ].

As smartphones and laptops continue their domination of the world, and their prices have put them in reach of most consumers, desktops are going the way of the dinosaur. In 2017, worldwide desktop sales dropped below 100 million, far fewer than the 161.6 million laptops that flew off shelves that same year [source: Moore-Colyer].

But don't cry for the desktop. This PC format is giving way to products that are just as powerful, with the tremendous added benefit of portability. And hardcore gamers still value desktops.

8. Laptops

laptop computer

Once upon a time, if you wanted to use a PC, you had to use a desktop. Engineers simply couldn't condense the sophisticated systems in a PC into a portable box. In the mid-1980s, though, many big computer manufacturers made a push to popularize laptop computers .

Laptops are portable computers that integrate the display , keyboard, a pointing device or trackball, processor, memory and hard drive all in a battery-operated package slightly larger than an average hardcover book.

The first true commercial laptop, though, was a far cry from the svelte devices crowding retail shops today. The Osborne 1, released in 1981, sold for around $1,800, had 64 kb of memory — and weighed about 24 pounds (10 kilograms). As it toned your biceps, the Osborne 1 also gave your eyes a workout, as the screen was just 5 inches (12 centimeters) [source: Computing History ].

Fortunately, manufacturers quickly improved upon the look and feel of laptops. Just two years later, Radio Shack's TRS-80 Model 100 packed its components into a 4-pound (1.8-kilogram) frame, but it lacked power.

By the end of the decade, NEC's UltraLite smashed barriers by cramming real computing efficiency into the first true notebook (i.e. very light laptop) style, which weighed just 5 pounds (2.2 kilograms). The race to ultra-portability was officially on [source: Bellis ]. However, laptops didn't overtake PCs in sales until 2005 [source: Arthur ].

7. Netbooks and Tablets

iPad drawing

Netbooks are ultra-portable computers that are even smaller than traditional laptops . The extreme cost-effectiveness of netbooks (roughly $200) means they're cheaper than almost any brand-new laptop you'll find at retail outlets. However, netbooks' internal components are less powerful than those in regular laptops [source: Krynin ].

Netbooks first appeared in 2007, primarily as a means for accessing the internet and web-based applications, from email, to music and movie streaming, to web surfing. They're incredibly compact, but as a result, their specifications list often resembles a very stripped-down laptop. They have small displays (as small as 6-7 inches or 15-18 centimeters), little storage capacity (perhaps maxing out at 64GB), and sometimes skimp on or altogether skip data ports (like USB or HDMI) that traditional laptops wield. A lot of netbooks come from small manufacturers, as the big guns can't be bothered with the low profit margins of these cheaper machines [source: Lenovo ].

Because they have relatively sluggish processors and little memory, netbooks can't do the heavy lifting for graphics applications or hardcore games. Instead, they're best for the task that gives them their name: web surfing [source: Krynin ].

Tablets have largely replaced the niche netbooks occupied. Tablets are thin, flat devices that look like larger versions of smartphones. They were first manufactured in 2000 by Lenovo , but popularized by Apple in 2010 with the release of its iPad [source: Bort ].

Tablets can do pretty much all the functions that laptops do, but don't have the internal fans that PCs have. So they have to rely on lower-performing processors that won't use as much heat or battery power. They also have less storage capacity than traditional PCs. Older tablets used the same operating systems as mobile phones, but newer tablets use a full operating system such as Microsoft Windows 10 [source: Lenovo ].

Tablets are more portable than PCs, have a longer battery life yet can also do smartphone-like activities such as taking photos, playing games and drawing with a stylus. For those who like the keyboard functionality of a laptop, some tablets come with a keyboard (attached or detachable), allowing you to combine the best of both worlds.

6. Handheld Computers


Early computers of the 20th century famously required entire rooms. These days, you can carry much more processing power right in your pants pocket. Handheld computers like smartphones and PDAs are one of our era's iconic devices [source: Arthur ].

Debuting in the 1990s, personal digital assistants (PDAs) were tightly integrated computers that often used flash memory instead of a hard drive for storage. These computers usually didn't have keyboards but relied on touchscreen technology for user input. PDAs were typically smaller than a paperback novel, very lightweight with a reasonable battery life. For a time, they were the go-to devices for calendars, email and simple messaging functions [source: Britannica ]. Remember the Palm Pilot and the BlackBerry?

But as the smartphone revolution began, PDAs lost their luster. Smartphones like the iPhone and Samsung Galaxy blend calling features and PDA functionality along with full-blown computer capabilities that get more jaw-dropping by the day. They feature touch-screen interfaces, high-speed processors, many gigabytes of memory, complete connectivity options (including Bluetooth, WiFi and more), dual-lens cameras, high-quality audio systems, and other features that would startle electronics engineers from half a century ago.

Although smartphones have existed in some fashion since 2000, it was the heavily hyped debut of the iPhone in 2007 that brought the device to the masses. The look, feel and functionality of that iPhone set the template for all the other smartphones that have followed [source: Nguyen ].

5. Workstation

CAD screen

A workstation is simply a desktop computer that has a more powerful processor, additional memory, high-end graphics adapters and enhanced capabilities for performing a special group of tasks, such as 3D graphics or game development [source: Intel ].

Workstations, like regular desktop computers, are intended for individual users. But they differ from desktops in that they are much, much speedier. Typically, it's businesses like engineering firms or multimedia companies that buy these workhorse PCs for their employees [source: TechTarget ].

The power of a workstation doesn't come cheap. Whereas small businesses can easily find normal desktops for just a few hundred dollars, workstations might cost three times as much. Basic workstations easily go for $1,500 and double in price in a hurry [source: Benton ].

But while cheap desktops are built with equally cheap (read: sometimes unreliable) components, workstations are quality machines meant for serious business. They may be left on overnight to crunch numbers or render animations. Therefore, these computers sport redundant hard drives for data safety, as well as faster CPUs and large-capacity solid-state drives.

All of those factors point to a machine that's made more for profit instead of basic word processing or random games of Minesweeper [source: Benton ].

4. Servers

CERN server

A server is a computer that has been optimized to provide services to other computers over a network. Servers usually have powerful processors, lots of memory and large hard drives.

Unlike a desktop or laptop PC, you don't sit down at a server and type. Instead, a server provides computer power — and lots of it — through a local area network (LAN) or over the internet. Companies small and large lean on servers to provide information, process orders, track shipping data, crunch scientific formulas and a whole lot more. Servers are often stored on racks in a dedicated server room, which in some companies may resemble warehouses.

Like regular PCs, servers have typical computer components. They have motherboards, RAM, video cards, power supplies and ample network connections for any need. They don't typically have dedicated displays, though. Instead, IT workers use a single monitor to configure and control multiple servers, combining their computing power for ever greater speed.

Ever wonder how a service like Google can anticipate your search inquiries in real time ... and then kick back answers to your deepest questions in just a moment? It's all because of servers. By some estimates, the company maintains and operates roughly 2.5 million servers in huge data centers scattered all around Earth [source: Data Center Knowledge ].

3. Mainframe

IBM z13 mainframe

In the early days of computing, mainframes were huge computers that could fill an entire room or even a whole floor! As the size of computers has diminished while their power has increased, the term "mainframe" has fallen out of use in favor of enterprise server. You'll still hear the term mentioned, though, particularly in large companies to describe the huge machines processing millions of transactions every day, while simultaneously working to fulfill the needs of hundreds, if not thousands of individual users.

Although mainframes traditionally meant a centralized computer linked to less powerful devices like workstations, this definition is blurring as smaller machines gain more power and mainframe computers get more flexible [source: IBM ].

Mainframes first came to life in the post-World War II era, as the U.S. Department of Defense ramped up its energies to fight the Cold War. Even as servers become more numerous, mainframes are still used to crunch some of the biggest and most complex databases in the world. They help to secure countless sensitive transactions, from mobile payments to top-secret corporation information [source: Alba ].

Indeed, IBM, one of the world's most enduring makers of mainframes for more than half a century, saw a spike in mainframe sales in 2018, for the first time in five years. That's in part because mainframes can pack so much calculating muscle into an area that's smaller than a rack of modern, high-speed servers [source: Hall ].

2. Supercomputer

Mistral supercomputer

This type of computer usually costs hundreds of thousands or even millions of dollars. Although some supercomputers are single computer systems, most are composed of multiple high-performance computers working in parallel as a single system. The best known supercomputers are built by Cray.

Supercomputers are different from mainframes. Both types of computers wield incredible computing power for Earth's most intense industrial and scientific calculations. Mainframes are generally tweaked to provide the ultimate in data reliability.

Supercomputers, on the other hand, are the Formula 1 race cars of the computer world, built for breakneck processing speed, so that companies can hurtle through calculations that might take other systems days, weeks, or even months to complete.

They're often found at places like atomic research centers, spy agencies, scientific institutes or weather forecasting stations, where speed is of vital concern. For example, the United States' National Oceanic and Atmospheric Administration, which has some of the world's most advanced weather forecasting capabilities, uses some of the world's fastest computers — capable of more than 8 quadrillion calculations per second [sources: Hardawar , NOAA ].

That kind of heart-stopping computer power comes at an equally heart-stopping price. The U.S. Department of Energy's Oak Ridge National Laboratory's Summit supercomputer, for example, cost $200 million. It is the first supercomputer built to handle AI applications [source: Wolfson ].

1. Wearable

smart watch on man

The latest trend in computing is wearable computers. Essentially, common computer applications (e-mail, database, multimedia, calendar/scheduler) are integrated into watches, cell phones, visors and even clothing. Many other wearables target outdoors enthusiasts and fitness fanatics, allowing them to track their location, altitude, calories burned, steps, speed, and much, much more.

The Apple Watch, now in its eighth incarnation, is one of the best reviewed wearables to date. This small watch has many of the functionalities of a full-blown smartphone. It lets you perform normal texting and email duties. And it has a built-in cell phone, unlike some other smart watches that must be paired with a phone to make calls. It even has a built-in electrical heart sensor that you can use to take an electrocardiogram and share it instantly with your doctor [source: Apple ].

But watches are just the beginning. Sewn-in accessories for clothing are growing, as are smart eyeglasses, smart belts, sleep monitors, heart rate trackers and intelligent ear buds. A company called MC10 is even touting skin patches that will track various biological processes happening in your body [source: Pervasive Computing ].

Wearables are indeed a new horizon in personal computing. Their flexibility and mind-warping potential speak to the idea that the computer revolution isn't over. If anything, the PC era might just be getting underway.

Sources
  • Alba, Davey. "Why on Earth is IBM Still Making Mainframes?" Wired. Jan. 13, 2015. https://www.wired.com/2015/01/z13-mainframe/
  • Abell, John. "Jan. 19, 1983: Apple Gets Graphic With Lisa." Wired. Jan. 19, 2010. https://www.wired.com/2010/01/0119apple-unveils-lisa/
  • Alfred, Randy. "Aug. 12, 1981: IBM Gets Personal with 5150 PC." Wired. August 12, 2011. https://www.wired.com/2011/08/0812ibm-5150-personal-computer-pc/
  • Bellis, Mary. "The History of Laptop Computers." ThoughtCo. April 19, 2018. https://www.thoughtco.com/history-of-laptop-computers-4066247
  • Benton, Brian. "Workstation vs. Desktop Computer: Which Does Your Office Need?" Redshift. March 19, 2013. https://www.autodesk.com/redshift/pc-versus-workstation/
  • Britannica. "PDA Handheld Computer." https://www.britannica.com/technology/PDA
  • Britannica. "Personal Computer." https://www.britannica.com/technology/personal-computer
  • Brown, Michael. "How to Choose a Server for Your Small Business." PC World. March 21, 2012. https://www.pcworld.com/article/251993/how_to_choose_a_server_for_your_small_business.html
  • CNET. "Best Wearable Tech for 2018." https://www.cnet.com/topics/wearable-tech/best-wearable-tech/
  • Comen, Evan. "How Much Did a Personal Computer Cost the Year You Were Born?" USA Today. June 22, 2018. https://www.usatoday.com/story/tech/2018/06/22/cost-of-a-computer-the-year-you-were-born/36156373/
  • Computing History. "Osborne 1." http://www.computinghistory.org.uk/det/504/osborne-1/
  • Data Center Knowledge. "Google Data Center FAQ." March 16, 2017. https://www.datacenterknowledge.com/archives/2017/03/16/google-data-center-faq
  • History.com. "Invention of the PC." https://www.history.com/topics/inventions/invention-of-the-pc
  • Hall, Christine. "Why Mainframes Aren't Going Away Any Time Soon." Data Center Knowledge. Feb. 7, 2018. https://www.datacenterknowledge.com/hardware/why-mainframes-arent-going-away-any-time-soon
  • IBM. "What is a Mainframe? It's a Style of Computing." https://www.ibm.com/support/knowledgecenter/zosbasics/com.ibm.zos.zmainframe/zconc_whatismainframe.htm
  • IEEE Pervasive Computing. "Wearables: The Next Big Thing as Smartphones Mature." Feb. 22, 2018. https://publications.computer.org/pervasive-computing/2018/02/22/wearables-next-big-thing-smartphones/
  • Intel. "Workstation or PC: How to Decide What Type of System is Right for You." https://www.intel.com/content/dam/doc/product-brief/workstation-xeon-e3-workstation-or-pc-comparison-brief.pdf
  • Krynin, Mark. "What is a Netbook?" LifeWire. March 5, 2018. https://www.lifewire.com/what-is-a-netbook-832315
  • Lenovo. "What is a Netbook?" https://www.lenovo.com/us/en/faqs/laptop-faqs/what-is-a-netbook/
  • Mitchell, Bradley, "Servers are the Heart of the Internet." LifeWire. July 2, 2018. https://www.lifewire.com/servers-in-computer-networking-817380
  • Moore-Colyer, Roland. "Desktop PC Sales Slumped in 2017 as the Death Rattle Sounds Once Again." The Inquirer. March 1, 2018. https://www.theinquirer.net/inquirer/news/3027646/desktop-pc-sales-slumped-in-2017-as-death-rattle-sounds-once-again
  • National Oceanic and Atmospheric Administration. "NOAA Kicks Off 2018 with Massive Supercomputer Upgrade." Jan. 10, 2018. http://www.noaa.gov/media-release/noaa-kicks-off-2018-with-massive-supercomputer-upgrade
  • Nguyen, Tuan C. "The History of Smartphones." ThoughtCo. Oct. 3, 2017. https://www.thoughtco.com/history-of-smartphones-4096585
  • PCWorld. "The 25 Greatest PCs of All Time." Aug. 11, 2006. https://www.pcworld.com/article/126692/greatest_pcs_of_all_time.html?page=6
  • TechTarget. "Workstation." https://searchmobilecomputing.techtarget.com/definition/workstation
  • TechTarget. "Personal Digital Assistant." https://searchmobilecomputing.techtarget.com/definition/personal-digital-assistant
  • Tom's Hardware. "Intel Drops 'Pentium' Brand." Jan. 14, 2006. https://www.tomshardware.com/reviews/intel-drops-pentium-brand,1832-2.html
  • Wolfson, Elijah. "The U.S. Surpassed China with a Supercomputer Capable of as Many Calculations per Second as 6.3 Billion Humans." QZ.com. June 9, 2018. https://qz.com/1301510/the-us-has-the-worlds-fastest-supercomputer-again-the-200-petaflop-summit/
  • Zimmerman, Kim Ann. "History of Computers: A Brief Timeline." LiveScience. Sept. 6, 2017. https://www.livescience.com/20718-computer-history.html



Types of computers.

By Dinesh Thakur

Technically, a computer is a programmable machine. This means it can perform a programmed list of instructions and react to new instructions that it is given. It can execute a prerecorded list of instructions (a program), and it can quickly save and retrieve considerable quantities of information.

Today, however, the term is most frequently used to refer to the desktop and laptop computers that most people use. When speaking of a desktop model, the term "computer" technically applies only to the computer itself, not the monitor, keyboard, and mouse. Nonetheless, it is acceptable to refer to everything together as the computer. If you would like to be technical, the box that holds the machine is called the "system."

Types of computers

Computers can perform complex and repetitive processes quickly, precisely, and reliably. Modern computers are digital. The actual machinery (cables, transistors, and circuits) is called hardware; the instructions and data are called software. Some of the significant parts of a personal computer (or PC) include:

The central processing unit (CPU): It is part of any electronic computer system; it is the component composed of the primary memory, control unit, and arithmetic-logic unit. It represents the physical center of the whole computer system and is connected to various peripheral equipment, including input/output devices and additional storage units. In modern computers, the CPU is contained on an integrated circuit chip called a microprocessor.

Memory (or RAM): fast, relatively expensive, short-term memory. It is a speedy type of computer memory that temporarily stores all of the information your PC needs right now and in the near future.

Hard drive or mass storage device: slower, cheaper, long-term memory. It is a hardware device used to retain large quantities of information, such as applications and documents, permanently. The primary hard disk in a PC is usually the C: drive.

While personal computers are undoubtedly the most common type of machine today, there are several other kinds of computers. For instance, a "minicomputer" is a powerful computer that can support many users at once. A "mainframe" is a large, high-powered computer that can perform billions of calculations from several sources at one time. Finally, a "supercomputer" is a machine that can process billions of instructions a second and is used to compute exceptionally complex calculations.

Broadly, computers can be classified based on:

(a) data-handling capability and the way they perform signal processing, and

(b) size, in terms of capacity and speed of operation.

Hierarchy of Computer Types

Based on the type of input they accept, computers are of three types:

We’ll be covering the following topics in this tutorial:

1. Analogue Computer

Everything we hear and see is changing continuously. This variable, continuous stream of data is known as analogue data. Analogue computers may be used in scientific and industrial applications, such as measuring electric current, frequency, and the resistance of a capacitor.

Analogue computers directly accept the data from the measuring device without first converting it into codes and numbers.

Examples of analogue quantities are temperature, pressure, telephone-line signals, speedometer readings, the resistance of a capacitor, signal frequency, and voltage.

2. Digital Computer

The digital computer is the most widely used type of computer; it processes data as numbers, usually using the binary number system.

A digital computer is intended to perform calculations and logical operations at a high rate. It takes raw data as digits or numbers and processes it using programs stored in its memory to produce output. All modern computers, such as the laptops and desktops we use at the office or at home, are digital computers.

It works on data, such as magnitudes, letters, and symbols, which are expressed in binary code, i.e., with just the two digits 1 and 0. By counting, comparing, and manipulating those digits or their combinations according to a set of instructions stored in its memory, a digital computer can perform such tasks as controlling industrial processes and the operation of machinery, examining and organizing vast amounts of business data, and simulating the behaviour of dynamic systems (e.g., global climate patterns and chemical reactions) in scientific research.

Digital computers supply accurate results, but they are slow compared to analogue computers.
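The "counting, comparing, and manipulating" of binary digits mentioned above is exactly what bitwise and comparison operations do. A small illustrative sketch; the operand values are arbitrary:

```python
a, b = 0b1100, 0b1010            # two 4-bit values (12 and 10 in decimal)

print(format(a & b, "04b"))      # AND        -> 1000
print(format(a | b, "04b"))      # OR         -> 1110
print(format(a ^ b, "04b"))      # XOR        -> 0110
print(a > b)                     # comparison -> True
```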

3. Hybrid Computer

A hybrid computer combines aspects of a digital computer and an analogue computer. It is quick like an analogue computer and has memory and precision like a digital computer. It is intended to incorporate a working analogue unit that is effective for calculations, yet it has a readily accessible digital memory. In large businesses and companies, a hybrid computer may be employed to handle logical operations in addition to providing efficient processing of differential equations.

For instance, a gas pump includes a processor that converts measurements of fuel flow into volume and cost.

Hybrid computers are also used in hospitals, for example to measure a patient's heartbeat.

Different kinds and sizes of computer

Since the arrival of the very first computer, machines of different kinds and sizes have been providing various services. Computers can be as large as a massive building or as small as a notebook, or even a microcontroller in an embedded or mobile system.

Computers can be generally classified by kind or size and power as follows, although there’s considerable overlap.

Supercomputer

A supercomputer is the fastest type of computer on earth; it can process a considerable amount of information very quickly. The computing performance of a supercomputer is measured in FLOPS (floating-point operations per second) rather than MIPS (million instructions per second).
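FLOPS can be estimated on any machine by timing a known amount of floating-point work. The rough sketch below times a dense matrix multiplication with NumPy and divides the operation count by the elapsed time; the result depends entirely on the hardware and the underlying linear-algebra library, so treat the number as an illustration rather than a benchmark:

```python
import time
import numpy as np

n = 1000
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
c = a @ b                        # dense n x n matrix product
elapsed = time.perf_counter() - start

flops = 2 * n ** 3               # approximate multiply-add count for the product
print(f"{flops / elapsed / 1e9:.1f} GFLOPS")
```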

supercomputer

These computers are massive in size; the most powerful supercomputers can occupy anywhere from several square feet to hundreds of square feet. Supercomputers are extremely expensive, ranging from a couple of hundred thousand dollars to over 100 million dollars.

Supercomputers emerged in the 1960s, developed by Seymour Cray, alongside the Atlas at the University of Manchester. Cray designed the CDC 1604, one of the first supercomputers in the world, which replaced vacuum tubes with transistors.

Uses of Supercomputers

Today's supercomputers don't just perform calculations; they process enormous amounts of information in parallel by distributing computing jobs to tens of thousands of CPUs. Supercomputers can be found at work in research centers, government agencies, and companies, performing mathematical calculations as well as gathering, collating, categorizing, and analyzing data.

Weather Forecasting

The regional weather forecaster bases predictions on information provided by supercomputers run by NOAA, the National Oceanic and Atmospheric Administration. NOAA's systems execute database operations and mathematical and statistical analysis on enormous amounts of information gathered from throughout the country and around the globe. The processing capacity of supercomputers helps climatologists forecast not merely the probability of rain in your neighborhood but also the paths of hurricanes and the likelihood of whale strikes.

Scientific Research

Much like weather forecasting, scientific research depends on the number-crunching capability of supercomputers. For instance, astronomers at NASA examine data flowing in from satellites orbiting the planet, ground-based radio and optical telescopes, and probes exploring the solar system. Researchers at the European Organization for Nuclear Research, or CERN, discovered the Higgs boson particle by analyzing the huge amounts of data created by the Large Hadron Collider.

Data Mining

Supercomputers are often needed to extract meaning from raw information accumulated in data farms on the ground or in the cloud. For instance, companies can analyze data gathered from their cash registers to help control stock or spot market trends. Life insurance businesses use supercomputers to lessen their actuarial risks. Likewise, companies that offer health insurance reduce costs and customer premiums by using supercomputers to analyze the benefits of different treatment options.

The Top Five Popular Supercomputers

• JAGUAR, Oak Ridge National Laboratory

• NEBULAE, China

• ROADRUNNER, Los Alamos National Laboratory

• KRAKEN, National Institute for Computational Sciences

• JUGENE, Juelich Supercomputing Centre, Germany

Mainframe computer

The mainframe is the sort of computer that runs a whole corporation. Because of their dimensions, mainframe computers are housed in large air-conditioned rooms. In the current world, all business, transactions, and communications happen in real time.

mainframe computer

To do all this work, a highly capable computer is needed on the server side, one that processes instructions and supplies output in moments. Based on how computers are used in the modern world, we can classify them into supercomputer, mainframe, minicomputer, and microcomputer types. A mainframe computer is more powerful than a mini or microcomputer, but less powerful than a supercomputer. Mainframe computers are used at large businesses.

The main distinction between a supercomputer and a mainframe is that a supercomputer channels all its power into executing a few programs as quickly as possible. In contrast, a mainframe uses its capability to run many applications simultaneously. In certain ways, mainframes are more effective than supercomputers because they support more simultaneous applications. However, supercomputers can run a single program faster than a mainframe.

Popular Mainframe computers

• IBM 1400 series.

• 700/7000 series.

• System/360.

• System/370.

• IBM 308X.

Minicomputer

A minicomputer, also referred to as a mini, is a category of small computers that was introduced to the world in the mid-1960s. Minicomputers are used by small businesses. A minicomputer has all of the qualities of a large-size computer, but its size is significantly smaller. A minicomputer is also known as a mid-range computer. Minicomputers are primarily multi-user systems where more than one user can operate concurrently.

Minicomputer

A minicomputer can support multiple users at one time; in other words, a minicomputer is a multiprocessing system.

That said, the processing power of minicomputers is not greater than that of mainframes and supercomputers.

Different Types of Minicomputers

• Tablet PCs

• Smartphones

• Notebooks

• Touch Screen Pads

• High-End Music Plays

• Desktop Mini Computers

Microcomputer

A microcomputer is a small computer. Your personal machines are microcomputers. The mainframe and the minicomputer are the ancestors of the microcomputer; integrated circuit manufacturing technology reduced the size of mainframes and minicomputers.

Micro Computer

Technically, a microcomputer is a computer in which the CPU (central processing unit, the brains of the machine) is contained on a single chip, a microprocessor, together with input/output devices and a storage (memory) unit. These elements are essential for the proper functioning of the microcomputer.

Microcomputers are created especially for general uses such as entertainment, education, and work. Well-known examples of microcomputers include the following.

Types of Micro Computer

• Desktop computers

• personal digital assistant (PDA)

• telephones


We’ll explore everything from personal computers to supercomputers, detailing their unique features and how they serve different purposes in our daily lives. So, whether you’re a tech enthusiast wanting to expand your knowledge or someone who’s just curious about the various analog and digital computers and other devices now surrounding us, this guide has something for you.


Introduction

Defining Computer Systems

In the simplest terms, a computer system is a set of interconnected components that work together to perform computational tasks. These tasks can range from simple calculations to complex data processing and analysis. The two primary parts of a computer system are hardware, the tangible components like the CPU and RAM, and software, the intangible components like operating systems and applications.

The variety of computers and systems is extensive, with estimates suggesting that there are billions of mobile computers and over 2 billion personal computers in use globally, underscoring their ubiquity in homes and businesses.

Understanding The Scope Of Different Types Of Computer Systems

Computer systems come in a wide array of types, each designed to serve specific needs and applications. They vary in size, power, complexity, and cost, depending on their intended use. By understanding these differences, we can better appreciate how these machines have become an integral part of our everyday lives, from pocket-sized smartphones to the powerful mainframes that keep our digital world running.

Types Of Computer Systems

1. Personal Computers (PCs)

Personal computers (PCs) are the most common type of computer system, found in homes, schools, and workplaces worldwide. They're designed for individual use and come with a variety of software applications for tasks such as word processing, browsing the internet, playing games, and much more.

Desktop Computers

Desktop computers are a type of PC that typically sits on a desk, hence the name. They consist of several separate parts, including the monitor, the keyboard, the mouse, and the system unit housing the central processing unit and other components. Desktops are known for their power, storage capacity, and ability to perform multiple tasks simultaneously. They're ideal for demanding tasks like graphic design, video editing, and gaming.

Laptops And Notebooks

Laptops and notebooks, on the other hand, are portable PCs. They integrate the system unit, monitor, keyboard, and pointing device into a single compact unit. Laptops and notebooks offer similar functionality to desktops but with the added advantage of mobility. This makes them perfect for users who need to work or study on the go.

Mainframe computer systems, known for their robust processing capabilities, continue to play a crucial role in large-scale data processing, with approximately 92% of the world's top banks relying on mainframes for financial transactions.


2. Workstations

Workstations are high-performance computers designed for technical or scientific applications. They’re more powerful than personal computers and have advanced graphics capabilities, making them suitable for tasks such as 3D modeling, computer-aided design (CAD), and complex data analysis.

Characteristics And Uses Of Workstations

Workstations are characterized by their superior processing power, large memory capacity, and high-quality graphics adapters. They often come with specialized software tailored to specific tasks or industries. For instance, film studios use workstations for animation and visual effects, while engineers use them for CAD and simulations.

Quantum computers, at the forefront of computing power and innovation, utilize qubits for processing. IBM’s Quantum Hummingbird, with 65 qubits , exemplifies the rapid progress in quantum computing capabilities, marking a significant advancement in this nascent field.

3. Servers

Servers are computer systems that manage network resources and provide services to other computers, known as clients. They store, retrieve, and send data to clients, enabling many users to access shared resources and services like internet access, file storage, and email systems simultaneously.

Server Architectures And Their Roles

The architecture of servers can vary widely, from simple file servers that store and manage files for network users, to powerful database servers that handle large amounts of data for businesses. Web servers host websites, while game servers enable multiplayer online gaming. Each server type plays a vital role in ensuring smooth digital communication and operations.
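To make the client-server idea concrete, here is a minimal, illustrative sketch in Python of a server that provides a trivial "service" to clients over a network socket. The host address, port number, and acknowledgement format are arbitrary examples, not details taken from this article; real file, database, web, or game servers are of course far more elaborate.

```python
# Minimal sketch of the client-server model (illustrative only).
# The host and port below are arbitrary example values for local testing.
import socket

HOST, PORT = "127.0.0.1", 8080

def run_server() -> None:
    """Accept one client at a time and echo back an acknowledgement."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen()
        print(f"Serving on {HOST}:{PORT}")
        while True:
            conn, addr = srv.accept()             # a client connects
            with conn:
                request = conn.recv(1024)         # read the client's request
                conn.sendall(b"ACK: " + request)  # provide the "service"

if __name__ == "__main__":
    run_server()
```

A client on the same machine could connect with socket.create_connection(("127.0.0.1", 8080)), send a request, and read the acknowledgement back.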

Embedded computer systems are omnipresent in everyday devices, and it's projected that by 2025, there will be over 75 billion connected IoT devices worldwide, showcasing the pervasive integration of embedded systems.


4. Mainframe Computers

Mainframe computers are large, powerful systems used by large organizations for critical applications, bulk data processing, enterprise resource planning, and transaction processing. They're capable of supporting thousands of users and applications simultaneously.

Supercomputers, designed for high-performance computing tasks, are a specialized category of computers. As of 2021, the fastest supercomputer in the world, Fugaku, boasts a processing speed of over 442 petaflops, showcasing the extraordinary computational power of these systems.

Features And Functions Of Mainframe Systems

Mainframes are known for their high availability, reliability, and security. They have extensive input/output facilities, allowing them to handle enormous volumes of data quickly. These systems are commonly used by banks, airlines, and government agencies where continuous, large-scale data processing is required.

5. Supercomputers

Supercomputers are the most powerful computing machines available. They're used for highly calculation-intensive, specialized applications such as weather forecasting, climate research, nuclear simulations, and quantum physics.

Capabilities And Applications Of Supercomputers

Supercomputers can process trillions of calculations per second, making them indispensable in fields requiring complex mathematical computations. They’re used in scientific research to simulate natural phenomena, in engineering for product design and testing, and in healthcare for drug discovery.

Server systems are fundamental to the functioning of the internet. It’s estimated that there are over 340 million public servers globally, supporting websites, applications, and various online services.


6. Embedded Systems

Embedded systems are computer systems with a dedicated function within a larger mechanical or electrical system. They're embedded as part of a complete device that includes other hardware, such as electrical and mechanical parts.

Explanation Of Embedded Systems In Daily Use

Examples of embedded systems include digital watches, traffic light controllers, and the system controlling your car’s engine. These systems have specific functions and operate independently without user intervention. They’re designed to perform a dedicated task with maximum efficiency.

Personal computers and computer hardware have evolved over the years, and as of 2021, the global market for desktop and laptop computers is valued at around $300 billion, reflecting the ongoing demand for these computing devices.


We've travelled from the desk at home to the depths of scientific research labs, exploring the different types of computer systems. It's clear that these machines, in their various forms, have a tremendous impact on nearly every aspect of our lives. Whether it's the personal computer that connects us to the world, the mainframe that protects our financial transactions, or the supercomputer pushing the boundaries of scientific discovery, each type of computer system has its unique role and significance.



Types of Research in Computing Science, Software Engineering and Artificial Intelligence

1. Background (and Updates)

A Research Strategy conference was organised by the CPHC (UK Conference of Professors and Heads of Computer Science) at the University of Manchester, 6-7 Jan 2000. It was attended by about 100(?) people (not only professors and heads, including some researchers in industry). On the first afternoon there was an introductory panel session concerned with how the Computing Science community should present its research objectives and achievements to EPSRC and the bodies which award funding to EPSRC. During the ensuing discussion I suggested a high-level way of dividing up research aims into four main categories (later expanded to five), which, in part, need to be evaluated differently. Both during the conference and subsequently I received comments and requests for clarification and references. So I thought I should write down what I had said, expand it a bit, and circulate it for comment and criticism. The resulting document is in this file at http://www.cs.bham.ac.uk/research/projects/cogaff/misc/cs-research.html [1] Whenever I have afterthoughts, or receive criticisms, comments and suggestions for improvements, I may modify/correct/extend the file, with acknowledgements where appropriate in the notes at the end.

NOTE 17 Aug 2016

The viewpoint expressed here (and by others at the conference in 2000) is inconsistent with the slogan (by Fred Brooks) highlighted below, claiming that computer science is an engineering discipline. The ever increasing overlap between CS and other disciplines, going far beyond provision of tools, is evidence that Brooks had a blinkered view of CS. He is not alone. He wrote:

"Perhaps the most pertinent distinction is that between scientific and engineering disciplines. That distinction lies not so much in the activities of the practitioners as in their purposes. A high-energy physicist may easily spend most of his time building his apparatus; a spacecraft engineer may easily spend most of his time studying the behavior of materials in vacuum. Nevertheless, the scientist builds in order to study; the engineer studies in order to build." ....... "In a word, the computer scientist is a toolsmith--no more, but no less. It is an honorable calling."

F.P. Brooks, Jr., "The Computer Scientist as Toolsmith", Communications of the ACM, March 1996, Vol. 39, No. 3. ACM award acceptance lecture delivered at SIGGRAPH 94. http://www.cs.unc.edu/~brooks/Toolsmith-CACM.pdf

NOTE added 18 Dec 2007

A paper written by Allen Newell addresses some of the issues listed here: A. Newell (1983), "Intellectual issues in the history of artificial intelligence", in The Study of Information: Interdisciplinary Messages, pp. 187-227, Eds. F. Machlup and U. Mansfield, John Wiley & Sons, New York. Available in the Newell Archives: http://diva.library.cmu.edu/webapp/newell/item.jsp?q=box00034/fld02334/bdl0002/doc0001/

NOTE added 2 Nov 2007

Alan Bundy has developed a web site which serves some of the same purposes as this one, here http://www.inf.ed.ac.uk/teaching/courses/irm/notes/hypotheses.html The Need for Hypotheses in Informatics

NOTE added 28 Feb 2006

Most of this document was written before the UKCRC initiative on Research Grand challenges . Several of the grand challenge proposals that emerged within that initiative are examples of the view reported in this document. They presented computing research problems beyond the scope of "traditional" computer science, especially GC1: In Vivo -- In Silico GC5: The Architecture of Brain & Mind GC7: Journeys in Nonclassical Computation

NOTE Added 19 Dec 2004

The UKCRC Grand Challenge initiative proposed in 2003, illustrates some of the points made below about different kinds of research. Discussions of Grand Challenge 5 ('Architecture of Brain and Mind'), which is one of the long term grand challenges with no definite end point (like many scientific and medical grand challenges), raised the difficult question of how to identify progress. This is an issue addressed in a very relevant way in the writings of Imre Lakatos, who extended some of the ideas of Karl Popper by making a distinction between 'progressive' and 'degenerating' research programmes, where the important point is that it may be impossible to decide whether a research programme is of one type or another at early stages in the programme: the decision requires analysis of an extended period of research. There are many internet sites discussing, summarising, criticising or reproducing Lakatos papers. A very short summary of his ideas can be found here . A slightly longer summary can be found here . In the context of Grand Challenge 5 I offered a scenario-based methodology that is useful both for planning research and for evaluating it, based on development of a large collection of (partially) ordered scenarios of varying depth and difficulty. The methodology is summarised here . It is also being used in connection with an ambitious EU-funded project that began in September 2004.

NOTE Added 9 Jan 2001

Following recent discussions about UK CS research on the cphc-members email list, and a note circulated by Alan Bundy referring to a list of research topics produced some time ago by a CPHC committee [2] , I have added a new category of research topics, "Research on Social and Economic Issues". So although there were originally four categories, there are now five, although the new one is not a sub-discipline of Computer Science but rather a multi-disciplinary research area, with a large component of Computer Science.

2. The Five Categories of Research in CS and AI

Research in Computing Science and AI falls into four main categories, with different types of aims, and different success/failure criteria, though each of the categories feeds on and contributes to the others, and there are some kinds of research which straddle categories. There is a fifth cross-disciplinary category which is of great interest to many computer scientists though it is not strictly a part of Computer Science or AI, though concepts and techniques from both form part of its subject matter and can also be used to further its aims.

1. The study of what is possible -- and its scope and limits. Including both mathematical and less formal modes of theorising. [3]

2. The study of existing (naturally occurring) information-processing systems. E.g. animals, societies, brains, minds, .... Sometimes described as "Natural computation".

3. Research involving creation of new useful information-processing systems. [4] I.e. research directly related to engineering applications.

4. The creation and evaluation of tools, formalisms and techniques to support all these activities.

5. Research on social and economic issues. Including studies of the social and economic impact of computing and AI, ethical issues, changing views of humanity, etc.

These categories are described in more detail below. Because different kinds of activity need to be evaluated in different ways (see below), there are implications regarding how EPSRC ought to organise its reviewing of grant proposals, and perhaps also implications regarding what proposers should say about their objectives. In particular, we should strongly resist real or imagined pressures to force all our research into Category 3, and should not be tempted to disguise research in the other categories, or justify it merely as a contribution to Category 3.

Added 30 Mar 2005: In the light of recent discussions on the CPHC email list it may be worth subdividing this category in various ways. E.g. some of the research contributions to practical applications involved a relatively simple yet new and powerful key idea (e.g. the original idea of the World Wide Web), whereas others are inherently concerned with production of something large and complex requiring the development of a large and complex collection of ideas, e.g. the design of a secure and robust air traffic control system, or a novel nationwide information system for the health service. Many such systems require the use of knowledge and techniques from many disciplines.

The next section explains in more detail what the above categories are, and how they are related and mutually dependent. The section after that explains how the evaluation criteria relevant to these categories of research differ and where they overlap. (In what follows I use "type" and "category" interchangeably as terms of ordinary English, not as technical terms.) NB: where lists of examples are given they are merely illustrative and are not intended to be exhaustive, or to define a category.

3. Comments on the categories

3.1. The study of what is possible -- and its scope and limits

This includes a lot of work using mathematics and logic, such as work on semantics of computation, and theorems relating to limits of computation, complexity, properties of mechanisms for cryptography, mathematical analysis of different classes of computations, studies of the expressive power of different formalisms, analysis of properties of various kinds of information-processing architectures, network protocols, scheduling algorithms, etc. etc. Much of this work involves the study of types of virtual machines and their properties. They need not be machines which could exist in nature: e.g. some might be infinite machines.

This category also includes less formal, and possibly less rigorous [3], exploratory investigations of new types of architectures, including virtual machine architectures, hardware and software mechanisms, forms of communication, ontologies, etc. in order to investigate their properties and their trade-offs. Examples in AI include explorations of various forms of representation or high-level architectures for use in intelligent systems. Sometimes work that starts off in this informal way leads to new formal, mathematical developments, as has happened throughout the history of mathematics.

Often work in Category 1 builds on and abstracts from experience gained in tasks in the other categories, just as much of mathematics derives from attempts to find good ways of modelling complex physical structures and processes, e.g. Newton's and Leibniz' invention of Calculus, and the early work on probability theory inspired by gambling devices. Very often this theoretical work addresses problems that are sufficiently complex to require the use of tools of the sorts developed in research of Type 4.

Purely theoretical work often develops in such a way as to provide concepts, models, theorems and techniques relevant to the other three kinds of research, though even if it does not do so it can still be of great interest and worth doing as a contribution to human knowledge. It has intrinsic value comparable to that of music, poetry, painting, sculpture, literature, mathematics and, dare I say, philosophy.

3.2. The study of existing (naturally occurring) information-processing systems

(Sometimes described as "Natural computation".)

This is scientific research of another kind: the attempt to understand, explain or model things that exist in the world, as opposed to exploring what is possible (Category 1) or finding ways of creating new useful things (Category 3). Of course such understanding can sometimes lead to useful practical applications, by enabling us to predict, control or modify some of the behaviour of systems after we understand them. But that is not a requirement for the work to be of great scientific value (though it can be part of the selection process where there are competing theories).

There are many kinds of naturally occurring systems, including machines that manipulate matter, machines that manipulate forces and energy and machines that manipulate information -- including virtual machines that cannot be observed and measured as physical machines can. Long before there were computers or computer science there were many types of extremely sophisticated information-processing systems, including animal brains, insect colonies, animal societies, human social and economic systems, business organisations, etc. More recently new systems have grown which are enabled by information-processing artefacts, but are as much natural systems worthy of study as a society or the weather, for instance traffic systems or the internet.

The processes of biological evolution form another such naturally occurring information-processing system. Over huge timescales, using mechanisms which are still only partially understood, it compiles information about many types of environments and many kinds of tasks (e.g. serving needs of organisms) into a diverse collection of wonderfully complex and extremely successful designs for working systems, far exceeding in complexity, sophistication and amazing robustness anything yet produced by human designers of information-processing machines. Some physicists argue that even the physical universe is best construed as ultimately composed of information-processing systems, not yet fully understood. Whether work in computing science will contribute to that understanding I do not know, though there are attempts in that direction.

Prior to the development of computing science the study of complex naturally occurring information-processing systems was often very shallow, mostly just empirical data-collection, often using theories expressed only in crude general forms or coarse-grained equations or statistical correlations which failed to capture or explain any of the intricate detail of processes observed. Since the middle of the last century, the study of different forms of computation has enriched our ability to find new ways of formulating and testing powerful models and theories for explaining and predicting natural phenomena. Information-processing models and theories are being developed in many scientific domains, as people find that they provide richer, more powerful explanatory capabilities than the old paradigms (e.g. equations relating observed or measurable quantities). This in turn is feeding new ideas into computing science. This has most obviously happened over the last 50 years or so in work in Artificial Intelligence, a discipline whose scientific "arm" has in the past mainly focused on attempts to model and explain aspects of human intelligence, though there are increasingly attempts at modelling various kinds of animal intelligence. (See the overview of AI in http://www.cs.bham.ac.uk/~axs/courses/ai.html .)

Unfortunately many psychologists have no appreciation of this, as shown by the pressures by which the British Psychological Society causes Psychology departments to stop allowing their students to take AI courses, which are not recognised as relevant. (Behind all that is an out-dated philosophy of science based on an incorrect model of physics as a science that collects lots of measurements and then searches for correlations.)

A more recent development is the growing interest in interpreting biological evolution as a form of information-processing, which has also inspired exploration of novel forms of computation which may or may not turn out to be useful for modelling nature.

It is arguable that the activity of engineers, working individually or in teams, is an example of a naturally occurring process and therefore empirical investigations of different kinds of practices, methodologies, languages, tools etc., and how they work, could fit into Category 2. This is usually an intrinsic part of research in Category 4, which is primarily intended to support Category 3. However, analysis and simulation of human engineering activities can fit into Category 2, and work in AI/Cognitive science on simulation of human design processes would clearly do so. [5]

3.3. Research involving creation of new useful information-processing systems. [4]

Research closely related to production, analysis and evaluation of practical applications is the main engineering branch of computing science, though Category 4 also includes a type of engineering. Category 3 overlaps with Category 2 insofar as the creation of explanatory theories and models often involves designing and implementing new and complex systems requiring significant engineering skills. There is also overlap insofar as building useful devices often requires a deep understanding of the environment in which they are to operate. E.g. many software engineering projects producing systems to be used by or interact with humans, including HCI projects, have failed because they used shallow and grossly inadequate models of human cognition, motivation, learning, etc.

Despite some overlap with Categories 1 and 2, the primary goal of research in Category 3 is not to study theoretically possible systems and their properties, nor to help us understand already occurring information-processing systems. The goal is to enable us to create new practically useful systems, which may either: (a) provide new (or improved) types of artefacts capable of performing functions that were previously performed only by natural systems such as humans and other animals (e.g. doing numerical computations, proving mathematical theorems, translating from one language to another, designing new machines, managing office records, recognising faces) or, increasingly often, (b) develop systems to perform tasks that could not be achieved at all previously, e.g. the construction of global communication networks, accurately forecasting the weather, controlling extremely complex machines and factories, safely giving trainee pilots experience of flying an Airbus without leaving the ground, etc.

However, for this to count as research it must also increase knowledge. If it merely uses existing computing knowledge to produce new tools that are useful to increase knowledge about some other domain (e.g. physics, biology, etc.) that may make it research in the other discipline. If it increases our explicit re-usable knowledge about how to specify, design, build, test, maintain, improve, or evaluate information-processing systems then it is research in the field of software or computer engineering, or AI engineering. (This is not intended to be a precise definition: there may not be one.)

Scientific and engineering research work in Category 3 can be contrasted with a great deal of system development activity that may be of practical use, but either (i) directly deploys existing knowledge in standard ways without extending that knowledge, or (ii) depends only on the intuitive, often unarticulated, grasp of what does and does not work. As regards (ii), unarticulated intuitive knowledge and skills gained through practical experience (perhaps combined with natural gifts) may be called craft, since it does not require the use and development of explicit theories about what does and does not work and why (the result of research of Types 1 and 2). Even when such craft work extends what we can do, it is not in itself research and should not be treated or evaluated as such, though it may be a precursor to important research. It may produce useful results but does not, in the process, advance communicable knowledge.
However craft in building computing systems, like many other types of craft, can, and often does, later stimulate more explicit science and engineering: we often first discover that we can do something, then later wonder how and seek explanations. [6] The resulting articulation leads us to understand precisely what was achieved, the conditions under which it can be achieved, how it can be controlled, varied, extended, etc.

3.4. The creation and evaluation of tools, formalisms and techniques to support these activities

Category 4 can be seen as a subset of Category 3, though it may be useful to separate it out because its engineering goals are concerned with the processes of performing the tasks in the previous categories (and this category) and to that extent involves the pursuit of goals which could not have existed but for the existence of computing science. (That's only an approximate truth!) This category involves a diverse range of activities, including designing new programming languages, new formalisms for expressing requirements, compilers, tools for validating or checking programs or other specifications, tools for designing new computing hardware or checking hardware designs, automatic program synthesizers, tools to support exploratory design of software (e.g. most AI development environments) and many more. Research on design, analysis and testing methodologies, as well as tools to support them, can be included in this category, though it overlaps with other categories. [7] The design and production of new general purpose computers, compilers, operating systems, high level languages, graphical and other interaction devices and many more, clearly falls into both the third and fourth categories. Moreover, many tools which are initially of Type 4 can migrate into tools of Type 3, e.g. early AI software development tools which were later expanded into expert system shells. However, it is possible for a tool of Type 4 to have no obvious use outside computing science and yet be of great value. Perhaps an example might be a tool for automatic analysis and checking of the type-structure of a complex formula in a language used only by theorists, or a tool for analysing the structures of complex ontologies developed entirely for research purposes.

3.5. Research on social and economic issues

Research in this category normally requires collaboration with researchers from other disciplines such as psychology, sociology, anthropology, economics, law, management science, political science and philosophy. It includes attempting to understand all the various ways in which developments in computing technology and artificial intelligence have influenced social, educational, economic, legal and political processes and structures, and ways in which they may influence such processes in the future. It can also include exposing and analysing ethical implications, including the implications of the impact of the new technology on opportunities, resources, jobs, power structures, etc. for various social groups within countries and also the impact on international relations and relative power of nations, international companies, etc. It can also include analysis of ethical implications of views of the human mind arising out of developments in AI.

4. Evaluation Criteria for the above types of research

The five types of research have different evaluation criteria, though there is partial overlap. It is possible that the differences are not fully understood, either by politicians and civil servants who are concerned with funding decisions, or by some of the referees who comment on grant proposals. In particular, where the research is concerned with testing or developing explanatory or predictive theories, the history of science shows that there can be rival theories which are both partially successful and both better than other theories attempting to explain the same phenomena, without there being any decisive way of telling which theory is better, at any particular time. However, as Imre Lakatos showed in:

I. Lakatos (1980), "The methodology of scientific research programmes", in Philosophical Papers, Vol. I, Eds. J. Worrall and G. Currie, Cambridge University Press.

A. Sloman (1978), The Computer Revolution in Philosophy: Philosophy, Science and Models of Mind, Harvester Press (and Humanities Press). Online here: http://www.cs.bham.ac.uk/research/cogaff/crp

4.1 The study of what is possible -- and its scope and limits

The criteria for evaluation of this kind of research are subtle, unobvious, and closely related to criteria for evaluation of research in mathematics, logic, philosophy, theoretical physics, theoretical biology, etc. They involve notions like "depth", "power", "generality", "elegance", "difficulty", "potential applicability", "relevance to other problems", "synthesis", "integration", "opening up new research fields", etc. It can be very hard for some people who have not done this kind of research to appreciate its value. But there are plenty of widely referenced examples, e.g. Turing's invention of the notion of a Turing machine and his and Goedel's work on limit theorems and (less widely known) McCarthy's invention of a programming language that can operate on expressions in the language -- Lisp. (Alas, many developers of programming languages since then have ignored this idea!) It often turns out that new theories about what is possible also have enormous practical applications, though sometimes these are not understood, or deployable, until many years later. Many deep theoretical advances have had unexpected practical applications after considerable delay. E.g. the problem to be solved may not turn up for a long time, or the application may require additional developments which take a long time: most of the practical deployment of ideas about forms of information-processing had to wait for advances in physics, materials science, manufacturing technology, etc. to produce computers with the power, weight, size, price and diversity of uses that we know today. Because research of Type 1 is so hard to evaluate and of such potential importance, it may be necessary to devise mechanisms to keep it going and to keep diversifying it with minimal concern for evaluation by generally agreed criteria. (Compare the use of stochastic search mechanisms to solve really hard problems!)

4.2. The study of existing information-processing systems

Here the criteria for evaluation are more like those in empirical sciences, like experimental physics, biology, psychology, etc. The theories have to be tested against the facts. This can sometimes be done by using the theories to make predictions about behaviour of naturally occurring systems, or by showing how large numbers of different previously observed phenomena can be uniformly explained. Sometimes theories about natural information-processing mechanisms can be confirmed or disconfirmed by evidence gained by opening up the physical system or by sophisticated non-invasive techniques for observing internal processes (e.g. fmri scanners). Often however empirical testing is extremely difficult and has to be indirect, especially when the theory relates to a very complex virtual machine whose structure does not relate in any simple way to the underlying physical machinery, or where the complexity of the physical or physiological mechanisms makes de-compiling an intractable task. In that case theories may inevitably remain highly conjectural, making it hard to choose between rival alternatives with similar behaviour consequences. Sometimes this leads to sterile debates that would be better postponed until there is a better basis for choosing, while work in the rival camps continues to be supported. Often rival theories cannot be properly compared until long after they are first proposed. Sometimes, choosing between alternative theories requires introducing very indirect evidence: e.g. showing that the mechanisms of evolution could have produced one sort of architecture but not another, in order to rule out the second as a correct theory of how a human mind works. But truth is not enough for an explanatory theory to be valuable, for there are trivial or shallow truths: again notions like "depth", "generality", "explanatory power", "elegance", and a theory's ability to open up new research problems, are relevant to the evaluation of the theory as a contribution to science. All this is just a special case of philosophy of science, though most philosophers of science are unaware of the special complexities of scientific theories about information-processing systems, because they were brought up to philosophise about simpler sciences such as physics!

4.3. Research involving creation of new useful information-processing systems.

This sort of work has two kinds of criteria for evaluation: how well it extends knowledge and how useful the results are. Often the work involves both producing new developments of Category 1 or 2 and also deploying them in creating something useful, e.g. exploring ideas about forms of computation, and then later building usable physical implementations of those ideas, or finding a deep explanation of certain diseases then using that explanation in the search for a cure. The two kinds of work need not proceed in that order: in some cases the practical results and explanatory theories may be developed in parallel, or practical difficulties in applying old ideas may point to the need to improve existing theories, formalisms, conceptual frameworks, etc. In all these cases the criteria for work of Category 1 or 2 are relevant to evaluating work in Category 3 because the work is composite in nature.

But there is also evaluation of the usefulness of new systems. However, usefulness has its own rewards (e.g. financial rewards) and unless there is also some advance in knowledge it is not research. This must be remembered in evaluating such projects in a research context. Not everyone will agree on the criteria to be used in evaluating practical applications. Most people would agree that results can be evaluated in terms of the benefits they bring in enhancing quality of life, including new forms of entertainment, or facilitating other activities with important practical goals, e.g. preventing air traffic collisions, allowing secure transmission of confidential messages, automatically diagnosing skin cancer at a very early stage, or designing a better tool for teaching mathematics. But some people will regard work that builds more powerful weapons that can bring death and destruction (euphemistically named "defence") as valuable, whereas others will condemn such applications. Recent debates about genetically modified food illustrate this point. Moreover, as any Which? report shows, evaluation can often be multi-dimensional with at best a partial ordering of the options available.

In addition to the evaluation of the costs and benefits of new applicable systems, they can also sometimes be evaluated intrinsically, e.g. in terms of how elegant they are, how difficult they were to achieve, how ingenious or original their creators had to be. Some railway steam engines were beautiful as well as being powerful and fast, and some very useful bridges are also works of art. Lisp (the original version) and Prolog both have a type of beautiful simplicity in relation to their power as programming formalisms, unlike several others I dare not name. Those who attempt to convey a sense of style when teaching programming appreciate this point, apart from the fact that a good style can also have practical consequences, such as maintainability and re-usability.

4.4. The creation of tools, formalisms and techniques

These things can be evaluated both according to how well they facilitate work in the other three categories, and also according to the previously mentioned criteria which are independent of usefulness. Of course producing good tools for doing other things (e.g. for designing and testing models, for building applications, etc.) can be thought of simply as part of those other activities, and evaluated in relation to their indirect benefits. But the good ones have a kind of generality and power that is of value independently of the particular uses to which they are put. It could be argued that this fourth category is spurious: it should be lumped in as part of the third category, sharing its evaluation criteria. At first I was tempted to do this. However, the development of computing both as science and as engineering has depended on a remarkable amount of bootstrapping, where the most important applications of many tools, concepts, formalisms and techniques are the processes of producing more of the same. A spectacular example is the role of previous generations of hardware and software in producing each new generation of smaller, faster, cheaper, more powerful, computers.

4.5. Research on social and economic issues

The evaluation of research in this area is a huge topic beyond the scope of this note. However it links up with criteria for evaluating research in all the other disciplines involved in this research, including psychology, sociology, anthropology, economics, law, management science, political science and philosophy. In some cases there are significant disputes about how to evaluate research in these fields and the relevance of those disputes is likely to be inherited by research in this category.
[1] The ideas here overlap with those that went into the overview of AI which was produced (with help from colleagues in various places) for the QAA computing science benchmarking panel: http://www.cs.bham.ac.uk/~axs/courses/ai.html

[2] Note: Alan Bundy has a collection of papers by various authors on "Generic Questions" relating to CS: https://sweb.inf.ed.ac.uk/bundy/Generic_Questions/Generic%20Questions.html (Links restored 24 Aug 2016)

[3] I am grateful to Jim Doran for reminding me of the need to allow less mathematical work in this category, especially as it applies to most of my work as a philosopher doing AI!

[4] Michael Kay, ICL, pointed out that my original title for the third kind of research ("Creation of new useful information-processing systems") was misleading. Many people work on creating new useful information-processing systems but are not doing research. The description in section 3.3 was rephrased to accommodate his comments. Rachel Harrison, Reading University, suggested including evaluation in this category.

[5] Rachel Harrison drew my attention to this point.

[6] It may be that the only way to produce excellent engineers is to start by making them expert craftsmen and women!

[7] Tom Addis, at Portsmouth University, pointed out in response to the first draft that I had not said anything about research on design and development methodologies. I have now placed this in Category 4, though some aspects of this work clearly belong in other categories, e.g. exploration of possible methodologies and modelling of human designers. Rachel Harrison pointed out that besides design methodologies there are also analysis and testing methodologies. I have grouped research on all of these together as supporting research of the other types. However, this can also be seen as an aspect of Category 3.


Oliveboard

Classification of Computers: By Size, Usage, Type and Purpose


Classification of Computers

Classification of Computers – According to their uses and applications, computers come in a variety of sizes and shapes with varying processing capabilities. In the beginning, computers were as large as entire rooms, and processing speeds were relatively slow. With the introduction of microprocessor technology, the size of the computer was drastically reduced and processing speed increased.

Classification of Computers – Based on their Functionality and Sizes:

Computers are categorized into four groups according to their external dimensions, internal capabilities, and external uses.

Here is a list of computers classified by functionality:

  • Supercomputer
  • Mainframe computer
  • Minicomputer
  • Microcomputer

Supercomputer:

Among digital computers, supercomputers are the biggest, fastest, strongest, and priciest. The first supercomputer was created in the 1960s for the American Department of Defense (USA). Supercomputers use several processors to increase their speed, and many people can use them simultaneously. Supercomputers are generally utilized for scientific purposes and large-scale, complex calculations.

They are widely used in the aerospace, automotive, chemical, electronics, and petroleum industries, as well as for weather forecasting and seismic analysis.

Examples of supercomputers: Jaguar, Nebulae, Roadrunner, Kraken, Tianhe-1.
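As noted above, supercomputers gain their speed partly by spreading work across many processors. The sketch below is only a small-scale illustration of that idea on an ordinary PC, using Python's standard multiprocessing module; the task (summing squares over sub-ranges) and the four-way split are invented for demonstration.

```python
# Illustrative sketch: splitting one job across several processors,
# the same principle a supercomputer applies at a vastly larger scale.
from multiprocessing import Pool

def sum_of_squares(bounds):
    """Work assigned to one worker process: sum squares over a sub-range."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    # Invented example: split the range 0..1,000,000 into four chunks.
    chunks = [(0, 250_000), (250_000, 500_000),
              (500_000, 750_000), (750_000, 1_000_000)]
    with Pool(processes=4) as pool:            # four workers run in parallel
        partials = pool.map(sum_of_squares, chunks)
    print("Total:", sum(partials))             # combine the partial results
```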


Mainframe Computer:

Mainframe computers, also known as mainframes, are the type of digital computer most commonly used in large industries for controlling processes, as well as in offices for maintaining networks and providing access to shared resources. IBM is estimated to control two-thirds of the mainframe market. Mainframes are far more suitable than supercomputers for transaction-intensive operation. Many modern mainframes can multitask; however, they are typically limited to eight or fewer processors.

Megaflops (millions of floating-point arithmetic operations per second) are used to measure processor speed. Mainframe computer systems are powerful enough to support hundreds of users at remote terminals at the same time. They do this by keeping multiple programs in primary memory and switching between them quickly. Multi-programming refers to this ability to run multiple programs at the same time for multiple users.
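To make the megaflops unit concrete, here is a rough, illustrative Python sketch that times a fixed number of floating-point multiply-add operations and reports the rate in megaflops. The loop size and the two-operations-per-iteration counting are assumptions for demonstration only, and interpreter overhead means the figure will be far below what the same hardware achieves in optimised code.

```python
# Illustrative sketch only: a crude estimate of floating-point operations
# per second on whatever machine runs it (not a standard benchmark).
import time

def estimate_mflops(n_ops: int = 10_000_000) -> float:
    """Time n_ops multiply-add iterations and return megaflops."""
    x = 1.000001
    acc = 0.0
    start = time.perf_counter()
    for _ in range(n_ops):
        acc += x * x                  # one multiply and one add per iteration
    elapsed = time.perf_counter() - start
    flops = (2 * n_ops) / elapsed     # 2 floating-point operations per iteration
    return flops / 1e6                # convert to millions (megaflops)

if __name__ == "__main__":
    print(f"Roughly {estimate_mflops():,.0f} MFLOPS (interpreter overhead included)")
```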

Examples of mainframe computers: IBM-3000 series, IBM 4300, IBM 3090.


Mini Computer:

Most minicomputers, like mainframes, are multiuser and general-purpose computers. The primary distinction between mainframes and minicomputers is that minicomputers are slower even when performing the same tasks as mainframes.

Examples of minicomputers: the PDP series.


Micro Computer: 

The most common type of computer, widely used in homes, schools, banks, and offices, among other places. It is a low-cost digital computer with a single microprocessor, storage unit, and input/output device. Microcomputers are typically designed for individual use only.

They were originally referred to as microcomputers because they were so small in size compared to supercomputers and mainframes. They are commonly used in homes, offices, and for personal use, so they are also referred to as personal computers.

Examples of microcomputers: desktop computers and portable computers such as laptops and personal digital assistants (PDAs).


Desktop Computer:

A desktop computer, also known as a PC (personal computer), is the most common type of microcomputer. It has a CPU (central processing unit), a keyboard and a mouse for input, and a monitor or display unit for output. The system unit is made up of a microprocessor, main memory, a secondary storage unit such as a hard drive or optical drive, and a power supply unit, all housed in a single cabinet.

Examples of desktop computer brands: Apple, Dell, HP, Lenovo.

Portable Computer:

Portable computers, such as laptops and PDAs, have surpassed desktop computers in popularity. The best feature of this portable computer is that it is lightweight and portable. Laptops have all of the same components as desktop computers, but they are more compact and smaller in size.

A palm-sized personal digital assistant (PDA) is another type of portable computer. As a result, it is also known as a palmtop computer. PDAs are used to keep track of appointments, take important notes, set reminders, perform mathematical calculations, play games, and even surf the internet and send emails. In 1993, Apple released the Newton, the first personal digital assistant.

Examples of portable computers: Palm Pilot, Handspring Visor, HP Jornada, Compaq Aero, Franklin eBookMan.

Classification of Computers – Based on Purpose:

Computers are broadly classified into two types based on their purpose:

  • General-purpose computer
  • Specific-purpose computer

General Purpose Computer:

A general-purpose computer is built to perform a variety of common tasks. Computers of this type can store multiple programs. They can be used in the workplace, in science, in education, and even at home. Such computers are adaptable, but they are also less efficient and slower than special-purpose machines.

Specific Purpose Computer:

A specific-purpose computer is designed to execute a single, specific task. It is not made to manage several programs and is therefore not adaptable. Since it is built to handle one particular task, it is more efficient and faster than a general-purpose computer. These computers are used for applications such as airline reservations, air traffic control, and satellite tracking.

Classification of Computers – Based on Data Handling:

Computers are further classified into three types based on how they handle or process incoming data:

  • Digital computer
  • Analog computer
  • Hybrid computer

Digital Computer:

A digital computer deals with data that can be stored in binary format, i.e. in the form of 0s and 1s. This computer stores data or information as voltage pulses that indicate either 0 or 1. Before being stored in a computer's memory, all types of data, including text documents, music files, and graphic images, are transformed into binary format. It is a machine that manipulates discrete data and executes logical and mathematical operations.
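As a small illustration of the point above, the sketch below shows how a short piece of text and a number can be rendered as the 0s and 1s a digital computer actually stores; the sample values are arbitrary.

```python
# Illustrative sketch: how data ends up as binary (0s and 1s) inside a
# digital computer. The sample text and number are arbitrary examples.
def to_bits(data: bytes) -> str:
    """Return the binary (0/1) representation of a sequence of bytes."""
    return " ".join(f"{byte:08b}" for byte in data)

text = "Hi"                            # a tiny text document
print(to_bits(text.encode("utf-8")))   # 01001000 01101001

number = 42                            # a discrete numeric value
print(f"{number:08b}")                 # 00101010
```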

Analog Computer:

An analog computer is used to process analog data. Analog data is data that is constantly changing or varying. Analog computers are used to measure continuously varying aspects of physical quantities such as electrical current, voltage, hydraulic pressure, and other electrical and mechanical properties. They do not measure discrete values and are employed in scientific and industrial applications.

Hybrid Computer:

A hybrid computer is a combination of a digital computer system and an analog one. The hybrid computer has the capacity to handle both analog and digital input. While the digital half of the system manages the numerical and logical operations, the analog portion handles the continuously varying aspects of complex mathematical computation. The system's controller is also part of the digital component.

Hybrid computers are used in medical science, for example to measure a patient's heartbeat, as well as in controlling industrial processes and in scientific applications.
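The following sketch is a purely digital simulation, offered only to illustrate the division of labour described above: a continuously varying "analog" signal (a synthetic heartbeat waveform) is sampled into discrete values, and the digital side then counts beats. The waveform, sampling rate, and detection threshold are all invented for the example.

```python
# Illustrative sketch: sampling a continuously varying ("analog") signal and
# processing it digitally. All values below are invented for demonstration.
import math

SAMPLE_RATE_HZ = 100      # samples per second taken from the "analog" input
DURATION_S = 10           # seconds of signal to analyse
BEAT_RATE_HZ = 1.2        # synthetic heart rate: 1.2 beats per second (72 bpm)

def analog_signal(t: float) -> float:
    """Continuously varying input, standing in for the analog portion."""
    return math.sin(2 * math.pi * BEAT_RATE_HZ * t)

# Digital portion: sample the signal and count upward threshold crossings.
samples = [analog_signal(i / SAMPLE_RATE_HZ)
           for i in range(SAMPLE_RATE_HZ * DURATION_S)]
beats = sum(1 for prev, cur in zip(samples, samples[1:])
            if prev < 0.9 <= cur)
print(f"Estimated heart rate: {beats * 60 / DURATION_S:.0f} beats per minute")
```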


Classification of Computers: Frequently Asked Questions:

Q. How are computers classified?
Ans. The computer system can be divided into three categories: 1. Computers are classified into Super, Mainframe, Mini, and Micro Computers based on their size and capacity. 2. Purpose-based computers include general-purpose and special-purpose computers. 3. Analog, Digital, and Hybrid Computers are types of computers based on hardware design and data handling.

Q. What are the various types of computers?
Ans. Supercomputers, mainframe computers, minicomputers, personal computers, mobile computers, laptop computers, tablet computers, portable computers, personal digital assistants, calculators, handheld game consoles, information appliances, and embedded systems are the various types of computers.

Q. Which computers are the most powerful?
Ans. Supercomputers are the most powerful computers.

Q. How are computers classified by size?
Ans. Computers are classified into four sizes: supercomputers, mainframe computers, minicomputers, and microcomputers.




Types of Research – Explained with Examples

DiscoverPhDs

  • By DiscoverPhDs
  • October 2, 2020


Types of Research

Research is about using established methods to investigate a problem or question in detail with the aim of generating new knowledge about it.

It is a vital tool for scientific advancement because it allows researchers to prove or refute hypotheses based on clearly defined parameters, environments and assumptions. Due to this, it enables us to confidently contribute to knowledge as it allows research to be verified and replicated.

Knowing the types of research and what each of them focuses on will allow you to better plan your project, utilise the most appropriate methodologies and techniques, and better communicate your findings to other researchers and supervisors.

Classification of Types of Research

There are various types of research that are classified according to their objective, depth of study, analysed data, time required to study the phenomenon and other factors. It’s important to note that a research project will not be limited to one type of research, but will likely use several.

According to its Purpose

Theoretical Research

Theoretical research, also referred to as pure or basic research, focuses on generating knowledge , regardless of its practical application. Here, data collection is used to generate new general concepts for a better understanding of a particular field or to answer a theoretical research question.

Results of this kind are usually oriented towards the formulation of theories and are usually based on documentary analysis, the development of mathematical formulas and the reflection of high-level researchers.

Applied Research

Here, the goal is to find strategies that can be used to address a specific research problem. Applied research draws on theory to generate practical scientific knowledge, and its use is very common in STEM fields such as engineering, computer science and medicine.

This type of research is subdivided into two types:

  • Technological applied research : looks towards improving efficiency in a particular productive sector through the improvement of processes or machinery related to said productive processes.
  • Scientific applied research : has predictive purposes. Through this type of research design, we can measure certain variables to predict behaviours useful to the goods and services sector, such as consumption patterns and viability of commercial projects.


According to its Depth of Scope

Exploratory Research

Exploratory research is used for the preliminary investigation of a subject that is not yet well understood or sufficiently researched. It serves to establish a frame of reference and a hypothesis from which an in-depth study can be developed that will enable conclusive results to be generated.

Because exploratory research is based on the study of little-studied phenomena, it relies less on theory and more on the collection of data to identify patterns that explain these phenomena.

Descriptive Research

The primary objective of descriptive research is to define the characteristics of a particular phenomenon without necessarily investigating the causes that produce it.

In this type of research, the researcher must take particular care not to intervene in the observed object or phenomenon, as its behaviour may change if an external factor is involved.

Explanatory Research

Explanatory research is the most common type of research method and is responsible for establishing cause-and-effect relationships that allow generalisations to be extended to similar realities. It is closely related to descriptive research, although it provides additional information about the observed object and its interactions with the environment.

Correlational Research

The purpose of this type of scientific research is to identify the relationship between two or more variables. A correlational study aims to determine the extent to which, when one variable changes, the other elements of the observed system change with it.
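For instance, the strength of such a relationship is often summarized with a correlation coefficient. The short Python sketch below is purely illustrative, using invented figures for two hypothetical variables (weekly study hours and exam scores); it is not taken from any study described here.

    # Minimal sketch of a correlational analysis on invented data.
    from scipy.stats import pearsonr

    study_hours = [2, 5, 8, 3, 10, 6, 4, 7, 9, 1]           # hypothetical observations
    exam_scores = [55, 64, 78, 58, 88, 70, 62, 75, 82, 50]  # hypothetical observations

    r, p_value = pearsonr(study_hours, exam_scores)
    print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
    # An r close to +1 or -1 indicates a strong relationship, but correlation
    # alone does not establish cause and effect.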

According to the Type of Data Used

Qualitative Research

Qualitative research is often used in the social sciences to collect, compare and interpret information. It has a linguistic-semiotic basis and relies on techniques such as discourse analysis, interviews, surveys, records and participant observation.

Validating results with statistical methods requires that the observations collected be evaluated numerically. Qualitative research, however, tends to be subjective, since not all data can be fully controlled. This type of research design is therefore better suited to extracting the meaning of an event or phenomenon (the ‘why’) than its cause (the ‘how’).

Quantitative Research

Quantitative research delves into a phenomenon through quantitative data collection, using mathematical, statistical and computer-aided tools to measure it. This allows generalised conclusions to be projected over time.

Types of Research Methodology

According to the Degree of Manipulation of Variables

Experimental Research

It is about designing or replicating a phenomenon whose variables are manipulated under strictly controlled conditions in order to identify or discover their effect on a dependent variable or object. The phenomenon to be studied is measured through study and control groups, and according to the guidelines of the scientific method.

Non-Experimental Research

Also known as an observational study, it focuses on the analysis of a phenomenon in its natural context. As such, the researcher does not intervene directly, but limits their involvement to measuring the variables required for the study. Due to its observational nature, it is often used in descriptive research.

Quasi-Experimental Research

It controls only some variables of the phenomenon under investigation and is therefore not entirely experimental. In this case, the study and control groups cannot be randomly selected, but are chosen from existing groups or populations. This is to ensure the collected data is relevant and that the knowledge, perspectives and opinions of the population can be incorporated into the study.

According to the Type of Inference

Deductive Investigation

In this type of research, reality is explained by general laws that point to certain conclusions; conclusions are expected to follow from the premises of the research problem and are considered correct if the premises are valid and the deductive method is applied correctly.

Inductive Research

In this type of research, knowledge is generated from an observation to achieve a generalisation. It is based on the collection of specific data to develop new theories.

Hypothetical-Deductive Investigation

It is based on observing reality to make a hypothesis, then use deduction to obtain a conclusion and finally verify or reject it through experience.

Descriptive Research Design

According to the Time in Which it is Carried Out

Longitudinal Study (also referred to as Diachronic Research)

It is the monitoring of the same event, individual or group over a defined period of time. It aims to track changes in a number of variables and see how they evolve over time. It is often used in medical, psychological and social areas .

Cross-Sectional Study (also referred to as Synchronous Research)

Cross-sectional research design is used to observe phenomena, an individual or a group of research subjects at a given time.

According to The Sources of Information

Primary Research

This fundamental research type is defined by the fact that the data is collected directly from the source, that is, it consists of primary, first-hand information.

Secondary Research

Unlike primary research, secondary research is developed with information from secondary sources, which are generally based on scientific literature and other documents compiled by another researcher.

Action Research Methods

According to How the Data is Obtained

Documentary (Desk Research)

Documentary research, drawing on secondary sources, is based on a systematic review of existing sources of information on a particular subject. This type of scientific research is commonly used when undertaking literature reviews or producing a case study.

Field

Field research involves the direct collection of information at the location where the observed phenomenon occurs.

From Laboratory

Laboratory research is carried out in a controlled environment in order to isolate a dependent variable and establish its relationship with other variables through scientific methods.

Mixed-Method: Documentary, Field and/or Laboratory

Mixed research methodologies combine results from both secondary (documentary) sources and primary sources through field or laboratory research.



Types of Computers | Computer Knowledge for Competitive Exams

In this article, we shall discuss at length the different types of computers, their uses, and some sample questions based on this topic for the upcoming Government exams.

Questions based on Computer Knowledge may be asked not just in exams where a separate section has been allotted for this subject, but also in the General Awareness section, where candidates are expected to have basic computer awareness as well.

To learn about the fundamentals of computers , candidates can visit the linked article.

Bank, SSC, RRB, and Insurance exams are among the most popular exams in which questions based on the types of computers may be asked, and candidates can easily score marks as the questions are direct and not complex.

For information related to the other Government sector examinations conducted in the country, aspirants can check the Government Exams page. 


Classification of Computers

There are three major categories based on which computers can be classified. These are:

  • Based on Size
  • Based on Purpose
  • Based on Types

The image given below gives a clear classification of the Types of Computers:

Types of Computers - Classifications of Computer

Further in this article, we shall discuss the above-mentioned types of computers in detail so that candidates can understand them easily and efficiently.

Types of Computer-Based on Types

The three types of computers along with their functions are given below:

  • Analog Computer – An analog computer is one that uses the continuously changeable aspects of physical phenomena, such as electrical, mechanical, or hydraulic quantities, to model the problem being solved. These quantities can be complex to work with, so such computers are mostly used for scientific and industrial applications. Examples of analog computers include the thermometer, operational amplifiers, electric integrators, etc.
  • Digital Computer – Such computers are capable of solving problems in discrete form. They operate only on data entered in binary language and can manage large amounts of data while regulating the operations of the machine. Examples of digital computers are desktops, laptops, mobile phones, etc.
  • Hybrid Computer – Computers that exhibit features of both analog and digital computers are called Hybrid Computers. Logical operations are solved by the digital components, while differential equations are solved using the analog features. A few important examples of hybrid computers include the systems used in space flight and food processing plants.

Types of Computers – Based on Size

Described below are the four types of Computers based on their sizes along with their functions:

  • Micro Computers – A relatively inexpensive and small computer built around a microprocessor, which serves as its Central Processing Unit (CPU), is called a Microcomputer. Such computers are made with minimal circuitry mounted on a single circuit board. Examples include desktops, laptops, etc.
  • Mini Computer – Developed in the mid-1960s, minicomputers are comparatively smaller than mainframe computers. They were developed with human interaction, control instrumentation, and cost-effectiveness in mind. Examples include smartphones, iPads, etc.
  • Mainframe Computer – Computers used by large organisations to manage bulk data are called Mainframe computers. Their main functions include managing customer statistics, census data, and other heavy data sets on a single device, for example the systems used at trading companies.
  • Super Computer – Computers used at organisations dealing with weather forecasting, quantum mechanics, climate research, etc., where a high level of computational performance is required, are called Supercomputers.

Apart from the Computer Awareness section, candidates can also get the detailed subject-wise syllabus for the various Government exams.

Types of Computer – By Purpose

On the basis of purpose, there are just two varieties of computers. Both are discussed in detail below:

  • General Purpose – Computers designed to perform everyday tasks are known as General Purpose computers. Their common functions include:
  • Basic Input/Output functions
  • Calculations
  • Data saving on a smaller scale
  • General performing activities

These may include basic calculators, laptops, desktop computers, mobile phones, etc., which help people with their basic necessary functions and are included in the General Purpose computer type.

  • Special Purpose – When a computer is designed specifically to perform a certain function, it is known as a Special Purpose computer. These types may include:
  • Thermometers to test temperature
  • Generators to manage electricity
  • Devices used for analysing Climate Change
  • Large computers for IT Companies
  • Machines used at Manufacturing Units and the list goes on and on

Special-purpose computers are important for various organisations, and their applications are designed to make work easy and efficient.

Aspirants are also advised to check the Preparation Strategy for Competitive Exams at the linked article to get the best tips and strategies to ace the upcoming Government exams.

Types of Computers – Sample Questions

Given below are a few sample questions based on the different types of computers which may be asked in the Government exams.

Q 1. Which of the given computers can be operated with the touch of the fingers?

Answer: Tablets

Q 2. Which of the given computers is the most expensive?

Answer: Mainframe Computer

Q 3. Which is the most powerful type of computer?

Answer: Supercomputer

Q 4. Which of the given types of computers works on batteries?

Answer: Laptop

The questions given above are just for candidates' reference, and similar types of questions may be asked from this topic. Thus, candidates can prepare themselves accordingly.

For any further information regarding the upcoming Government exam, the important exam dates, study material and preparation tips, candidates can turn to BYJU’S. 

Apart from types of computers, there are various other important concepts which one must be aware of while preparing for the upcoming competitive exams.

Frequently Asked Questions on Types of Computer

Q 1. How many types of computers are there, based on data handling capability?

Q 2. Is there a full form for COMPUTER?

Q 3. What are the different types of computer?

Ans. The different types of computers include:

  • Mini Computer
  • Micro Computer
  • Super Computer

Q 4. Which was the first computer?

Q 5. Who is known as the father of computers in India?


By Shimon Brathwaite

Microsoft Forms cheat sheet: How to get started

Online forms are an excellent way to conduct research, collect feedback, test knowledge, and more. Here's how to use Microsoft Forms to create surveys, feedback forms, quizzes, and other interactive forms.

microsoft forms cw primary

Microsoft Forms is a web app that allows users to create various types of forms that gather information from people online and store that data in the cloud for review.

Why is this useful? Surveys, questionnaires, and other interactive forms are a vital part of doing business. They provide a great way to interact with employees, teammates, customers, and potential business partners. You can use online forms to collect customer feedback or business requirements, conduct market research, gauge employee satisfaction, register attendees for an upcoming event, test learners’ knowledge after a training course, and more.

Forms is included with Microsoft 365 subscriptions for individuals and businesses, and a limited version is available for free to anyone with a Microsoft account. In this cheat sheet, we will cover how to use this program to create questionnaires, add specific types of questions, and view and analyze the responses.

6 steps to creating and using a form in Microsoft Forms

  • Create a form from scratch
  • Create a form from a template
  • Create a quiz
  • Change your form's theme
  • Share your form for others to respond to
  • View responses

Now let’s get started.

How to create a form from scratch

There are a couple of ways to start using the Microsoft Forms app. One way is to navigate to your Microsoft 365 home page , sign in if you haven’t already, and click on the Apps icon in the left panel. The Forms app should appear on the main part of your screen near the bottom. If it isn’t there, use the search bar at the top of the screen to search for forms and launch the app.

microsoft forms 01 m365 home

You can launch Microsoft Forms from the Microsoft 365 home page. (Click image to enlarge it.)

On the next page, click the New Form button.

microsoft forms 02 new form button

Click the New Form button to start a new form. (Click image to enlarge it.)

Alternatively, you can go directly to forms.microsoft.com and click the New Form button.

Either way, you’ll start a new, blank form in the Forms app. Here you can do multiple things, including adding questions, viewing responses, and changing the aesthetics of the form.

microsoft forms 03 new blank form

A new, blank form in Microsoft Forms. (Click image to enlarge it.)

Changing the form’s title

You will first want to change the title of your form and add a description. This is the first thing anyone will see when they open your questionnaire, so you want to make sure the title is easy to understand and explains what it is you’re trying to do.

To add a title, simply click on Untitled form , and you will be able to edit the title and add a description.

microsoft forms 04 change title description

Change the title and add a description for your form. (Click image to enlarge it.)

Adding questions

To add a new question, click the Add new button. A toolbar appears showing four types of questions you can add to your form.

microsoft forms 05 question toolbar

Choose which kind of question you want to add. (Click image to enlarge it.)

Choice: Multiple-choice questions allow you to preselect a set of answers from which the user can choose. You can also add an Other option where users can type in a unique response.

By default, a multiple-choice question allows the user to select just one answer. To change this, click the Multiple answers slider at the lower right to toggle it on. The radio buttons next to the answers change to checkboxes, and users can choose more than one.

microsoft forms 06 multiple choice question

This multiple-choice question lets respondents choose more than one answer. (Click image to enlarge it.)

To rearrange the answers in a multiple-choice question, hover your cursor over the answer you want to move until you see six dots appear to the left of the item. Click and hold the six dots, then drag and drop the answer to its new location.

Text: This is an open-ended question where you allow the user to type in an answer — good when you want to collect individual information such as an email address or hear detailed thoughts from respondents. By default, text questions accept short answers, but you can enable longer responses by turning on the Long answer toggle.

To restrict responses to number format, click the three-dot icon in the lower-right corner of the question box and select Restrictions . To specify that the number be within a certain range, such as between 10 and 500, click the Number dropdown, select Between , and type in the appropriate numbers.

microsoft forms 07 text question

Restricting the responses for a text question to numbers between 10 and 500. (Click image to enlarge it.)

Rating: This question allows respondents to rate performance, typically on a scale of 1 to 5 (bad to excellent). This can give you an idea of how employees feel about their manager, for instance, or how customers view your product or service. You can adjust the number of levels provided (up to 10) or change the rating symbols from stars to numbers, hearts, smiley faces, checkmarks, or others.

Date: This question displays a calendar and asks respondents to select a specific date, such as the date an item is requested.

Other question types: If you click the down arrow at the right end of the question type toolbar, a pop-up menu appears with four additional question types that you’ll probably use less frequently:

  • Ranking: Lets respondents rank items in order of preference or importance to them.
  • Likert: Displays a list of items, each with its own rating scale. A common scenario for this type of question would be to find out how satisfied employees are with various company benefits.
  • Upload File: Lets respondents upload a file. Supported file types include Word, Excel, PowerPoint, PDF, images, videos, and audio files.
  • Net Promoter Score: Asks respondents how likely they are to recommend your product or service, on a scale from 0 (not at all likely) to 10 (extremely likely).

microsoft forms 08 net promoter score question

A typical Net Promoter Score question. (Click image to enlarge it.)
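Because the Net Promoter Score is plain arithmetic over the 0–10 answers (the percentage of promoters, who answer 9 or 10, minus the percentage of detractors, who answer 0 through 6), you can recompute it yourself from the collected responses. The snippet below is a minimal sketch with invented ratings; it is not part of Forms itself.

    # Minimal NPS calculation over a list of 0-10 ratings (invented sample data).
    ratings = [10, 9, 8, 7, 10, 6, 3, 9, 10, 5]

    promoters = sum(1 for r in ratings if r >= 9)   # answered 9 or 10
    detractors = sum(1 for r in ratings if r <= 6)  # answered 0 through 6
    nps = 100 * (promoters - detractors) / len(ratings)

    print(f"NPS = {nps:.0f}")  # ranges from -100 to +100; here it prints NPS = 20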

Once you’ve selected the question type, enter the question and responses you want respondents to see, then make any adjustments or restrictions, such as the “multiple answers” option for multiple-choice questions.

Here are a few additional tasks you’ll likely use when adding questions to your form:

  • To make a question required (i.e., respondents must answer it in order to complete the survey): turn the Required toggle on at the lower right of the question box.
  • To explore additional options for a question , such as the ability to shuffle responses or add a subtitle: click the three-dot icon to the right of the Required toggle.
  • To add an image or video to a question: click the image icon at the right end of the field where you enter the question text. On the “Insert media” pane that opens, choose Insert Image or Insert Video . For an image, you can do a Bing web search, browse your OneDrive folders, or upload an image from your computer. For a video, you can paste in a Microsoft Stream or YouTube URL. In a multiple-choice question, you can also add images to the responses.

microsoft forms 09 insert media pane

You can add an image or video to a question. (Click image to enlarge it.)

Building out your form

To add more questions to your form, just keep clicking the Add new button and repeating the steps above. Here are a few more things that are useful to know how to do:

To duplicate a question: select the question and click the Copy question button at the upper right of the question box. A copy of the question appears immediately below it. This is handy if you have more than one question with similar formatting: you can save time by duplicating the question and editing it rather than starting from scratch each time.

microsoft forms 10 copy question delete move up down icons

Use the buttons at the upper right to duplicate a question, delete it, move it up, or move it down. (Click image to enlarge it.)

To move a question up or down: select the question and use the up or down arrow icons at the upper right of the question box.

To insert a question in between existing questions: select the question above the place where you want to insert the new question. Click the Insert new button (which appears in place of “Add new”) and proceed as usual.

To delete a question: select the question and click the trash can icon in its question box.

To add a new section to the survey: select the question above the place where you want the new section to appear. Select Add new or Insert new , click the down arrow at the right end of the toolbar, and select Section from the pop-up menu. Enter a title for the new section. You can optionally add a subtitle and image or video as well.

microsoft forms 11 new section

It can be helpful to break a form into sections. (Click image to enlarge it.)

Adding branching to your form

This feature is optional, but it’s powerful: You may have one or more questions in your form that you want to branch — that is, if the respondent answers the question one way, you want to send them to a different follow-up question than if they answer the question another way. Thus, branching makes the most sense for multiple-choice questions.

It’s best to wait until you’ve added all your questions to the survey before you add branching. Once you’ve done so, select the question you want to branch, click the three-dot icon at its lower right, and select Add branching .

A “Go to” box appears next to each of the answers. Click the drop-down menu next to each answer and choose where you want to send respondents who choose that answer — the next question (the default), the end of the form, or a specific question or section in the form.

microsoft forms 12 branching question

Adding branching to a question lets you set different follow-up actions for different responses. (Click image to enlarge it.)
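Conceptually, each branching rule is just a mapping from an answer to a destination: the next question (the default), a specific question or section, or the end of the form. The hypothetical sketch below models that idea in Python simply to make the flow explicit; it is not Forms' actual implementation or API.

    # Hypothetical model of one question's branching rules: answer -> destination.
    branching = {
        "Yes": "Question 3",      # ask for details
        "No": "End of form",      # nothing more to ask
        "Not sure": "Section 2",  # route to an explanatory section
    }

    def next_step(answer: str) -> str:
        # Default mirrors the form's normal behavior: continue with the next question.
        return branching.get(answer, "Next question")

    print(next_step("No"))     # End of form
    print(next_step("Maybe"))  # Next question (no rule defined for this answer)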

How to create a form from a template

Rather than starting a new form from scratch every time, you can get a head start by using one of the templates Microsoft provides. Go to Microsoft’s Forms template gallery , where you can choose from a variety of templates including a market research survey, manager feedback survey, office facility request form, and more. Click any template to open it in your browser.

microsoft forms 14 template gallery

The Forms template gallery has more than a dozen templates to choose from. (Click image to enlarge it.)

You’ll see a form that’s prepopulated with questions and answers. You can edit any of the existing questions, delete those you don’t want, and add your own questions into the mix.

microsoft forms 15 employee satisfaction survey template

Using templates gives you a head start on many standard business forms. (Click image to enlarge it.)

Starting from a template not only saves you from having to enter all your questions manually, it may also provide valuable questions you wouldn’t think of on your own.

How to create a quiz

Quizzes are similar to surveys and other questionnaires, but there are correct and incorrect responses. You can assign points to each question, report respondents’ scores, and explain why certain responses are right or wrong. A quiz is a good way to assess how well attendees of a training course have learned the subject matter and coach them in areas they don’t fully understand.

To create a new quiz, go to forms.microsoft.com and click the New Quiz button at the top of the page.

Alternatively, you can go to your Microsoft 365 home page and launch the Forms app as described earlier in the story. On the Forms start page, click the down arrow next to the New Form button and select New Quiz .

Creating a quiz is just like creating a form — you add a title/description and questions the same way — except that you designate the correct answer and assign a point score to each question. When you enter the answers for a question, you’ll see a circled checkmark to the left of each answer. Click one of the checkmarks to mark it as the correct answer. Then go to the Points box at the bottom of the question box and type the number of points the question is worth.

microsoft forms 13 quiz format

Quizzes let you test respondents’ knowledge. (Click image to enlarge it.)
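In effect, grading a quiz means comparing each response with the designated correct answer and adding up the points of the questions answered correctly. The snippet below is only a hypothetical illustration of that logic (invented questions, answers, and points), not Forms' internal code.

    # Hypothetical answer key: question -> (correct answer, points).
    answer_key = {
        "Q1": ("Paris", 2),
        "Q2": ("4", 1),
        "Q3": ("1969", 2),
    }

    responses = {"Q1": "Paris", "Q2": "5", "Q3": "1969"}  # one respondent, invented

    score = sum(points
                for q, (correct, points) in answer_key.items()
                if responses.get(q) == correct)
    total = sum(points for _, points in answer_key.values())

    print(f"Score: {score} / {total}")  # Score: 4 / 5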

How to change your form’s theme

Now that we have covered the functional aspects of Microsoft Forms, let’s look at how you can change the look and feel of your questionnaire. On the top right of your form, click the Theme button to open a panel full of theme ideas that you can use to change how your questionnaire looks. Look around this tab and select a theme that you like to represent your company.

microsoft forms 16 theme ideas panel

Choose a theme that suits your company and the form itself. (Click image to enlarge it.)

To preview how your form will look to respondents as they’re filling it out, click the Preview button to the left of the Theme button at the top right of the page. You can toggle between Computer view and Mobile view by clicking the buttons at the top right of the preview page.

microsoft forms 17 mobile preview

You can see how your form will look to both desktop and mobile users. (Click image to enlarge it.)

How to share your form for others to respond to

Once your form is finalized and you’re ready to start sending it to clients, employees, or other respondents, select the Collect responses button at the top right. On the pane that appears, you can create and customize the link that you will use to share your questionnaire with others.

microsoft forms 18 send collect responses pane

You can send out a survey link publicly or privately. (Click image to enlarge it.)

If this survey is meant for people outside your company, click the option that allows anyone to respond. If it’s meant for employees in your company, choose the second option. And if you’re looking for feedback only from specific people in your organization, choose the third option and enter the names or email addresses of those people.

Next, select the option to shorten your URL so that it’s less spammy and easier to share with other people in a text, email, or instant message. You can send the link out by clicking the Copy button and pasting it into an email or other message. Alternatively, you can fill out the form on the right to send an email with an embedded link.

How to view responses

Microsoft automatically keeps track of all responses to your form and provides you with a summary of that information in a visual dashboard. Simply click the tab at the top right that says Responses to view your summary:

microsoft forms 19 responses summary

Forms collects and summarizes your survey’s responses. (Click image to enlarge it.)

To view responses individually, click the View results button on the left-hand side under your initial summary of responses. On this page, you can scroll through all of the responses that you’ve received to your form.

microsoft forms 20 individual responses

Viewing the answers from an individual respondent. (Click image to enlarge it.)

You can also export your results to Excel for offline viewing. Click Open in Excel on the right under the initial summary.

microsoft forms 21 open in excel

Click this button to export your results to Excel. (Click image to enlarge it.)
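Once the results are in Excel, you can analyze them with whatever tool you prefer. As a rough illustration, the sketch below loads a hypothetical export with pandas and tallies the answers in one multiple-choice column; the file name and column header are assumptions for the example, since your real export will carry your own question text.

    # Minimal sketch: summarizing an exported responses workbook (hypothetical names).
    import pandas as pd

    df = pd.read_excel("survey_responses.xlsx")           # assumed file name
    counts = df["How satisfied are you?"].value_counts()  # assumed question column

    print(counts)                  # number of respondents per answer choice
    print(counts / len(df) * 100)  # the same tally expressed as percentages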

Lastly, you can share this results page with anyone you want via a link provided by Microsoft. Click on the three-dot icon to the right of the “Open in Excel” link and choose Share a summary link from the pop-up menu. Forms will generate a link that you can copy and share.

microsoft forms 22 share summary link

Sharing a summary link. (Click image to enlarge it.)


Shimon Brathwaite

Shimon Brathwaite is a cybersecurity professional, consultant, and writer at SecurityMadeSimple . He is a graduate of Ryerson University in Toronto, Canada and has worked in several businesses in security-focused roles. His professional certifications include GCIH, Security+, CEH, and AWS Security Specialist. Contact him for writing engagements, consulting, or to ask questions!


Promoting pre-service teachers’ knowledge integration from multiple text sources across domains with instructional prompts

  • Development Article
  • Open access
  • Published: 10 April 2024


  • Inka Sara Hähnlein   ORCID: orcid.org/0000-0003-2758-634X 1 &
  • Pablo Pirnay-Dummer 1  

Multiple document comprehension and knowledge integration across domains are particularly important for pre-service teachers, as integrated professional knowledge forms the basis for teaching expertise and competence. This study examines the effects of instructional prompts and relevance prompts embedded in pre-service teachers’ learning processes on the quality of their knowledge integration in multiple document comprehension across domains. A total of 109 pre-service teachers participated in an experimental study. They read four texts on “competencies” from different knowledge domains and wrote a text on a given scenario. Experimental group 1 was aided with instructional and relevance prompts, while experimental group 2 received only relevance prompts. The control group received no prompting. Perceived relevance of knowledge integration was assessed in a pre-post-test. Pre-service teachers’ separative and integrative learning, epistemological beliefs, metacognition, study-specific self-concept, and post-experimental motivation were assessed as control variables. Participants’ texts were analyzed for knowledge integration by raters and with computer-linguistic measures. A key finding is that combined complex prompting enhances pre-service teachers’ perceived relevance of knowledge integration. This study found effects of prompting types on the pre-service teachers’ semantic knowledge structures. Implications for transfer are discussed.


Introduction

Multiple document comprehension and knowledge integration across domains are especially important for pre-service teachers, as they are necessary for forming integrated professional knowledge. This in turn is the basis for the professional competence of future teachers (Lehmann, 2020a , 2020b ). Since pre-service teachers find it difficult to integrate knowledge, and teacher education is hardly designed to help in the integration of knowledge (Hudson & Zgaga, 2017 ), pre-service teachers are in need of support. Instructional strategies such as cognitive prompts have been found to be effective aids (Lehmann, Pirnay-Dummer, et al., 2019a , 2019b ; Lehmann, Rott, et al., 2019a , 2019b ). However, more empirical research is needed on how to promote cognitive knowledge integration across domains.

The present study aims to promote and facilitate pre-service teachers’ knowledge integration from multiple text sources across domains with two kinds of cognitive instructional strategies. In our experimental study, pre-actional relevance prompts and pre-actional instructional prompts combined with relevance prompts are embedded at multiple time points in pre-service teachers’ self-regulated learning processes. The learning process includes a reading phase and a writing task designed to integrate and transfer what has been read. We hypothesize that the types of prompts will influence the three criteria of pre-service teachers’ text quality, knowledge structure, and perceived relevance of knowledge integration in various ways.

Theoretical background

Knowledge integration across disciplines as a precondition for teaching expertise

For pre-service teachers, pedagogical knowledge (PK), pedagogical content knowledge (PCK), and content knowledge (CK) form the core of their professional competence as future teachers (Shulman, 1986 ; Voss et al., 2011 ). Technological, organizational and counseling knowledge as well as self-regulation skills, motivational orientations, beliefs, and values are also part of teachers’ professional competence (Model of Professional Competence of Teachers, Baumert & Kunter, 2006 ; Technological Pedagogical Content Knowledge, TPACK, Mishra & Koehler, 2006 ; Koehler & Mishra, 2008 ; Mishra, 2019 ; Krauskopf et al., 2020 ).

These aspects constitute the professional knowledge of teachers in training and are the basis for successfully teaching a specific subject (i.e., analyzing, planning, designing, developing, and evaluating instruction and instructional interactions; Seel et al., 2017 ). Teacher education includes study components from educational sciences, didactics, pedagogy, and the teaching subjects. Text is still by far the main source of information in academia (Pirnay-Dummer, 2020 ). For pre-service teachers, these texts come from different disciplines. Pre-service teachers are thus confronted with texts that take different perspectives on the same topics, and integrating these perspectives is not a simple straightforward task. Texts provide different domain-specific rationales for specific instructional decisions that interact with each other in a complex way (Lehmann, 2020a ; Pirnay-Dummer, 2020 ).

Knowledge integration and multiple-document comprehension

According to Lehmann ( 2020a ), knowledge integration is defined in two ways, as first- and second-order knowledge integration. First-order knowledge integration, as a form of constructive learning, is the active linking, merging, distinguishing, organizing, and structuring of knowledge structures into a coherent model (Lehmann, 2020a , 2020b ; Linn, 2000 ; Schneider, 2012 ). Second-order knowledge integration, as a form of knowledge application, is the simultaneous transfer of knowledge from different domains with the goal of reaching a suitable problem solution (Graichen et al., 2019 ; Janssen & Lazonder, 2016 ; Lehmann, 2020a , 2020b ).

Integrating knowledge from text across disciplines, a competency pre-service teachers are expected to possess, is a complex process that relies on single-text comprehension (Construction-Integration Model; Kintsch, 1988 , 1998 ; Trevors et al., 2016 ; van Dijk & Kintsch, 1983 ) but furthermore requires multiple document comprehension (Document Model Framework; Braasch et al., 2012 ; Britt & Rouet, 2012 ; Britt et al., 2018 ; Perfetti et al., 1999 ; Rouet, 2006 ).

The construction of a coherent integrated model of multiple texts requires readers to form an integrated mental model of the content of the texts (integrated mental model), including contradictions and their possible or impossible resolutions, as well as representations of the text sources and how these sources are related to each other as an intertext model (Bråten & Braasch, 2018 ; Britt & Rouet, 2012 ; Perfetti et al., 1999 ; Rouet, 2006 ).

Relevance of knowledge integration for teaching

Empirical evidence shows the importance of knowledge integration for pre-service teachers: The level of knowledge integration influences their degree of expertise and professional competence as teachers later on (Baumert & Kunter, 2006 ; Bromme, 2014 ; Graichen et al., 2019 ; Janssen & Lazonder, 2016 ; König, 2010 ; Lehmann, 2020a ). Teachers’ knowledge integration is also positively related to the learning performances of students (Hill et al., 2005 ).

For pre-service teachers and in-service teachers, successful knowledge integration means being able to apply integrated knowledge to solve complex (ill-defined) problems (Jonassen, 2012 ), such as instructional planning (Brunner et al., 2006 ; Dörner, 1976 ; Lehmann, 2020a ; Norton et al., 2009 ), and to make informed instructional decisions, taking into account multiple perspectives and their complex interaction (Lehmann, 2020a ).

Although successful knowledge integration is particularly important for prospective teachers, teacher education is hardly designed to link subject areas. Subject knowledge, didactics, and educational science are taught separately (Ball, 2000 ; Blömeke, 2009 ; Darling-Hammond, 2006 ; Hudson & Zgaga, 2017 ). As Zeeb et al. ( 2020 , p. 202) point out, this way of knowledge acquisition increases the risk that students will develop structural deficits in their knowledge, in the sense of compartmentalization of knowledge (e.g., Whitehead, 1929 ), which in turn explains the fragmented, inert knowledge (Renkl et al., 1996 ). Accordingly, the professional knowledge present in pre-service teachers and in-service teachers tends to be fragmented and poorly integrated. There is hardly any systematic linking across subject areas and faculty boundaries. The principle of separating subject knowledge, subject didactics, educational science, and pedagogy prevails (Ball, 2000 ; Darling-Hammond, 2006 ; Graichen et al., 2019 ; Harr et al., 2014 ; Janssen & Lazonder, 2016 ).

Since integrating knowledge is an important but complex task, and teacher education is hardly designed to initiate and train integration of knowledge (Hudson & Zgaga, 2017 ), pre-service teachers are in need of support.

Supporting pre-service teachers’ knowledge integration and multiple document comprehension

There are different approaches to the promotion of knowledge integration:

The MD-TRACE model (“Multiple-Document Task-Based Relevance Assessment and Content Extraction”) describes how readers represent their goals by forming a task model in multiple-document comprehension (Rouet, 2006 ; Rouet & Britt, 2011 ). The reading context is vital for multiple-document processing. According to the RESOLV theory, readers initially construct a model of the reading context that influences the interpretation of the task at hand, for instance the interpretation of writing tasks (Rouet & Britt, 2011 ; Rouet et al., 2017 ). There is ample evidence to suggest that specific writing tasks can elicit changes in learning activities (Wiley & Voss, 1996 , 1999 ; Bråten & Strømsø 2009 , 2012 ; Lehmann, Rott, et al., 2019a , 2019b ). Specific writing tasks can promote integrated understanding by stimulating elaborative processes of knowledge-transforming rather than simple knowledge-telling (Scardamalia & Bereiter, 1987 , 1991 ). Writing a summary of multiple documents or answering overarching questions may also promote knowledge integration (Britt & Sommer, 2004 ; Wiley & Voss, 1999 ). Tasks asking pre-service teachers to combine information from multiple sources can foster the integrated application of knowledge (Graichen, et al., 2019 ; Harr et al., 2015 ; Lehmann, Rott, et al., 2019a , 2019b ; Wäschle et al., 2015 ).

A growing body of empirical evidence shows that prompts are effective instructional strategies for supporting knowledge integration across domains (Lehmann, Pirnay-Dummer, et al., 2019a , 2019b ; Lehmann, Rott, et al., 2019a , 2019b ). Implemented as statements, focus questions, incomplete sentences, or relevance instructions, among other formats, they promote the cyclical process of self-regulated learning (SRL) (Schiefele & Pekrun, 1996 ; Zimmerman, 2002 ). Prompts can be embedded in the pre-actional, actional, or post-actional phase of SRL. They elicit the use of cognitive, metacognitive, and motivational learning strategies that promote learning at the respective level (Bannert, 2009 ; Ifenthaler, 2012 ; Lehmann et al., 2014 ; Lehmann, Rott, et al., 2019a , 2019b ; Reigeluth & Stein, 1983 ; Zimmerman, 2002 ).

Implemented pre-actionally, that is, provided prior to learning, prompts can assist learners in constructing an appropriate task model (Ifenthaler & Lehmann, 2012 ; Lehmann et al., 2014 ). Empirical evidence demonstrates that pre-actional cognitive prompts promote an integrated deep understanding across core domains of professional knowledge of pre-service teachers (Lehmann, Rott, et al., 2019a , 2019b ; Wäschle et al., 2015 ).

Relevance instruction enhances integrated knowledge in pre-service teachers by encouraging the use of integrative strategies (e.g. Zeeb et al., 2020 ). Relevance prompts can be implemented either specifically or independently. They can emphasize the relevance of the specific learning content (Cerdán & Vidal-Abarca, 2008 ) or learning task (Gil et al., 2010 ), or they can be implemented independently and refer to the relevance of knowledge integration in general (Zeeb, Biwer, et al., 2019 ; Zeeb et al., 2020 ). Zeeb et al. ( 2020 ) argue in favor of the independent implementation, since knowledge integration is important for teacher education across domains and not just within specific subjects or topics. Repeated relevance instructions have been shown to be superior to one-time instructions (Zeeb et al., 2020 ).

The above described theoretical foundation shows the importance of multiple document comprehension and knowledge integration across domains for pre-service teachers and that prompts might be used to help them.

In our experiment, we investigated how to promote and facilitate pre-service teachers’ knowledge integration from multiple text sources across domains. The design examined the effects of two different kinds of prompts embedded in the students’ learning processes at three different points in an experimental design.

Research question

Do instructional and relevance prompts embedded in the learning process promote pre-service teachers’ knowledge integration from multiple texts across domains?

The evidence described above indicates that providing students with pre-actional instructional prompts should lead to an increased quality of knowledge integration. The same is to be expected for repeated pre-actional relevance prompts . Our experiment combined the two types of prompts and embedded them at different stages in the learning process, in experimental contrast to providing relevance prompts only. A control group was provided with no support, as this is still often the standard in pre-service teacher training.

We hypothesized that the combined prompts would lead to higher rated text quality (dependent variable 1, DV1) than relevance prompts alone. We also hypothesized that support from relevance prompts alone would be better than no support at all.

Moreover, we hypothesized that there would be systematic differences between the three groups both in terms of knowledge structure and knowledge semantics (dependent variable 2, DV2) when compared with the knowledge model of the source material.

In addition, we hypothesized that repeated pre-actional relevance prompts would lead to an increase in perceived relevance (dependent variable 3, DV3).

We assessed pre-service teacher students’ perceived integration and separation learning in teacher education, epistemological beliefs, metacognition, study-specific self-efficacy, and post-experimental motivation as control variables, because of their relevance for self-regulated knowledge integration (Barzilai & Strømsø, 2018 ; Lehmann, 2022 ).

Participants

The experiment was conducted with N = 109 of 119 pre-service teacher students (see Analysis and Results for explanation of dropout) who attended one of four courses on research methods and statistics in the 2021/22 fall term at a German university (81 females, 27 males, 1 n/a; age: M = 22.46, SD = 3.23). Of the 108 student teachers who specified their school type, most are studying elementary school teaching (45), while 36 are studying high school teaching, 17 secondary school teaching, 8 special education, and only 1 elementary and special education. The most common major subject is German (43), followed by mathematics (23), history (9), biology (6), English (5), sports (5) and others. The most commonly chosen second subjects are mathematics (29), German (24), biology (8), and English (5). The participants have been studying their major subject for an average of about six semesters ( M = 5.9, SD = 2.3).

Procedure and design

This experimental intervention study has a cross-sectional control group design. Figure 1 shows the experimental procedure and design. First, the pre-service teacher students were informed about the study at the beginning of the term (calendar week 40) in the introductory sessions of the courses on research methods. Participation was voluntary with no consequences for not participating. All students enrolled in the course chose to participate in this study giving their informed consent. The participants were anonymized by means of randomly assigned codes known only to themselves. Then, the participants were randomly assigned to three experimental conditions: experimental group 1 (EG1), experimental group 2 (EG2), and control group (CG). All assessments and the intervention took place at course time in the course room to make it as easy as possible for the students to participate. Students who dropped out of the course automatically terminated their participation in this study.

figure 1

Experimental procedure with three experimental conditions in an experimental intervention study with cross-sectional design

In weeks two and three of the term (calendar weeks 41/42), we collected the students’ demographics, and as control variables assessed their epistemological beliefs (Students’ Epistemological Beliefs; StEB; Hähnlein, 2018 ), metacognition in the learning process (MILP; Hähnlein & Pirnay-Dummer, 2019 ), study-specific self-efficacy (WIRKSTUD; Schwarzer & Jerusalem, 2003 ), as well as their self-reported separative and integrative learning in teacher education (SILTE; Lehmann et al., 2020 ). All data collection was implemented online on Limesurvey.

To avoid test fatigue in the participants, the actual intervention was not conducted until several weeks after the control variables were collected. For all participants, the time interval between pre-survey and experiment was the same. The experiment was conducted six to seven weeks (calendar weeks 46/47) into the term and took approximately 90 to 105 minutes for CG and EG2, but about 120 minutes for EG1 to account for the prolonged instruction.

The experiment started with a 3-minute pre-test regarding students’ perceived relevance of knowledge integration (Figure 1 ). After that, to initiate a learning process, all participants received a learning task with an instructional part, a reading task, and a writing task (Figure 1 ). Participants were instructed to work through the learning process and the remaining test procedure at their own pace and were allowed to leave the experiment when they were finished, but not before 90 minutes had passed (normal course duration).

For all participants, the reading task consisted of four texts about “the concept of competence in teaching” from different subject domains of teacher education (8 pages in total). We recorded the time participants spent studying the text material including reading time (M = 38, SD = 17, Min = 14, Max = 78). Students’ perceived handling of the text material was evaluated right after the reading phase (Text Material Questionnaire I; Deci & Ryan, n.d.; German adaptation). This took just 2 minutes. The writing task was on a fictional scenario that required integrating the knowledge from all four source texts to derive implications for the application of the knowledge.

The participants of the control group (CG: no prompts) received just organizational information in the instructional phase of the learning process and no aid regarding knowledge integration for the reading and writing task (Figure 1 ).

The participants of experimental group 2 (EG2: relevance prompts) received relevance prompts in verbal and written form embedded in their learning processes in the instructional phase as well as the reading and writing phase. Experimental group 1 (EG1: relevance prompts and instructional prompts) received both relevance prompts and instructional prompts (Figure 1 ). Both types of prompts were embedded in the students’ learning process at three time points: the instructional phase, the reading phase, and the writing phase.

Following the learning process, we again assessed students’ perceived relevance of knowledge integration (3 minutes) as well as their post-experimental intrinsic motivation (Intrinsic Motivation Inventory, IMI; Deci & Ryan, n.d.; German version, 12 minutes). All survey instruments used are introduced below in the section Survey Instruments .

Instructional prompts in experimental group 1

In the instructional phase, the participants of experimental group 1 received a 10-minute PowerPoint-based introduction to knowledge integration to stimulate their pre-flexion prior to reading. The introduction used the example of lesson planning to outline what knowledge integration is, how it works, and why it is important for future teachers. This served as a pre-actional cognitive prompt.

To support the reading phase, focus questions were developed in our research team specifically for the text material at hand. The participants received the following focus questions related to knowledge integration to apply to the texts to be read (translated from German):

What are similarities and differences in the understanding of competence between the texts?

What level does the knowledge from the different texts refer to?

Does the knowledge refer to what competencies are?

Does the knowledge relate to abstract or specific objectives? (What is to be achieved?)

Does the knowledge relate to application? Does it relate to the process of how to accomplish something?

Does the knowledge relate to why? (Why does something work this way and not another way?)

How do the competency perspectives interact with each other? Do the knowledge contents and their levels complement each other or do contradictions arise? (Bridges between texts?)

Can different things be derived from the different perspectives for application?

What can be derived for the application from the integrated impression of all 4 texts?

Furthermore, the participants were asked to model the interrelation between the texts, for instance by drawing a mind map. The focus questions served as pre-actional cognitive prompts for the reading task, while the modeling served as an actional cognitive prompt for the reading phase and a pre-actional cognitive prompt for the writing phase.

Before writing, another pre-actional prompt was given: The participants in this group were explicitly asked to use their elaborations from the reading phase while writing and to explicitly connect the knowledge instead of just summarizing it.

Relevance prompts in experimental group 1 and 2

During instruction, the relevance prompt was provided to the participants via an oral explanation of the importance of knowledge integration for their future teaching proficiency and the reasons for it, supported by an anthology on knowledge integration research held up during the presentation of a PowerPoint slide. This served as pre-actional prompt.

In both the reading and the writing phase, the participants were again reminded in writing of the relevance of knowledge integration.

For both experimental groups, the task sheet remained with the participants so that they could access the prompts even while performing the task.

The control group received neither relevance prompts nor instructional prompts but just the reading and writing task.

Reading material

The reading material was four selected texts on the concept of competence in teaching. The texts come from different disciplines and are all academic in source and nature (educational science/humanities, educational psychology, didactics, and policy-making). The texts were selected and discussed beforehand by an interdisciplinary team both for their relevance within each field and for their potential for not being too easy to integrate. However, since the four texts are from different disciplines, each of them takes a different professional perspective on the topic.

Text 1 takes the perspective of educational science, or pedagogy. It is taken from an introductory textbook on educational science and is two pages long (Textbook: Thompson, 2020, pp. 131–133). This text is about how the concept of competence is defined from the perspective of competence research.

Text 2 takes the point of view of educational psychology. It is two and a half pages long and deals with what distinguishes the concept of competence from established categories such as ability, skill, or intelligence (Article: Wilhelm & Nickolaus, 2013, pp. 23–26).

Using the example of learning to read, text 3 deals with the distinction between different levels of competence (Textbook: Philipp, 2012, pp. 11–15). It is a text from didactics and is one and a half pages long.

Text 4 is a curricular description from the Standing Conference of the Ministers of Education and Cultural Affairs (Kultusministerkonferenz, 2009 , pp. 1–5). It takes an educational policy perspective on the subject matter and is two and a half pages long. This text is about the competency level model for the educational standards in the competency area speaking and listening for secondary school.

Writing task

The writing task was part of a fictional scenario requiring participants to integrate knowledge from the four source texts and to draw integrated conclusions for its application to teaching in order to help a friend in need. The scenario with task (translated from German) read as follows:

During your school internship, a future colleague has guided you through many a challenging situation thanks to her professional experience and appreciative nature. This teacher, Monique Gerber, recently turned to you and somewhat bashfully told you that she herself is currently facing a rather challenging situation. She has a school evaluation coming up next week. In itself, this is not a problem for Monique. However, she has learned in advance that dealing with competence and its scientific foundation in teaching is a central theme of the evaluation. She says that the academic discussion has been going on for far too long and that she would like you to give her an informative summary of the topic of competence: What should she look for when teaching? How should she justify things? How does she relate what she does well in class to existing scientific knowledge? A little flattered, and knowing of Monique’s distress, you set out to help her. Task: Write a text yourself on the basis of the four short texts on the topic of competence. Your text for Monique should explain step by step the current scientific understanding of competence and show her how it can be used to help her plan and design lessons. (The text should be written in complete sentences. It must be at least 400 words.)

This complex task requires both knowledge-telling and knowledge-transforming (Scardamalia & Bereiter, 1987 , 1991 ). An integrated mental model of the text sources (Bråten & Strømsø, 2009 ; Wiley & Voss, 1999 ) is needed to derive implications for lesson planning.

Survey instruments

The StEB Inventory (Hähnlein, 2018 ) is designed to assess pre-service teachers’ epistemological beliefs. The instrument development is based on the theoretical conceptualization of the epistemological belief system by Schommer ( 1994 ), the core dimensions by Hofer and Pintrich ( 1997 ) and Conley et al. ( 2004 ), as well as the Integrative Model for Personal Epistemology ( IM, Bendixen & Rule, 2004 ; Rule & Bendixen, 2010 ) to explain the mechanism of change, and the Theory of Integrated Domains in Epistemology (TIDE, Muis et al., 2006 ) to explain the context dependency and discipline specificity of epistemological beliefs. The StEB questionnaire consists of four subscales: beliefs about the simplicity of knowledge , the absoluteness of knowledge , the multimodality of knowledge , and the development of knowledge . The questionnaire consists of 26 items. Agreement with the statements is indicated on a 5-point Likert scale (from does not apply at all to completely applies ).

The MiLP Inventory (Hähnlein & Pirnay-Dummer, 2019) assesses students' metacognitive activities in the form of learning judgments. The instrument development is based on the theoretical model of Nelson and Narens (1990, 1994). It distinguishes between metacognitive monitoring and control as well as the three phases of learning: knowledge acquisition, retention, and retrieval. The questionnaire consists of 33 items and six subscales. Four subscales concern the metacognitive activities in the knowledge acquisition phase, two each for metacognitive monitoring and metacognitive regulation. One subscale concerns metacognitive monitoring in the retention phase and one that of knowledge retrieval. The response scale has a 5-level Likert format (from does not apply to does apply). The six subscales are as follows:

Anco: Assesses a learner's ability to regulate his/her learning in the phase of knowledge acquisition by means of adequate learning strategies. (10 items)

Abmo: Assesses a learner's ability to monitor his/her retrieval of knowledge in a way that he/she is able to successfully remember the learning content. (8 items)

Anmo: Assesses a learner's ability to monitor his/her knowledge acquisition by means of assessing the difficulty of the learning content. (5 items)

Akco: Assesses a learner's ability to regulate his/her knowledge acquisition in a way that he/she is able to successfully differentiate between important and unimportant learning content. (3 items)

Bemo: Assesses a learner's ability to monitor his/her retention of knowledge in a way that he/she is able to remember the learning content. (4 items)

Akmo: Assesses a learner's ability to monitor his/her knowledge acquisition in a way that he/she is able to figure out if the knowledge acquisition was successful. (3 items)

The SILTE Short Scales (Lehmann et al., 2020 ) are used to measure the self-reported knowledge integration of pre-service teacher students in teacher education across domains. With its two dimensions, it measures integrative learning with 7 items and separative learning with 5 items. The two scales have a five-point response format (ranging from does not apply at all to fully applies ). According to Lehmann et al. ( 2020 , p. 156), the theoretical foundation of the SILTE questionnaire is the model of knowledge building (e.g. Chan et al., 1997 ; Scardamalia & Bereiter, 1994 , 1999 ), which can be assigned to the constructivist approaches to strategic learning. In addition, the questionnaire is based on the concepts of cognitive fragmentation and knowledge integration in teacher education and learning to teach (e.g. Ball, 2000 ; Darling-Hammond, 2006 ; Lehmann, 2020b ).

Study-specific self-efficacy is assessed using the WIRKSTUD scale (Schwarzer & Jerusalem, 2003 ). It is one-dimensional and has 7 items with a four-point rating scale ( does not apply at all , hardly applies , applies , applies completely ). The conception of the scale is based on Bandura’s ( 1978 ) social-cognitive learning theory and the concept of positive situation-action expectations contained therein.

The Intrinsic Motivation Inventory (IMI) is a multidimensional measurement that comes in different versions. It is intended to assess "participants' subjective experience related to a target activity in laboratory experiments" (Deci & Ryan, n.d., p. 1). The Post-Experimental Intrinsic Motivation Inventory (Deci & Ryan, n.d.) originally consists of 45 items and seven scales that can be selected according to the requirements of the experimental setting. In our study, the following six scales were used: interest/enjoyment (7 items), perceived competence (6 items), effort/importance (5 items), pressure/tension (5 items), perceived choice (7 items), and value/usefulness (7 items). The scale relatedness (8 items) was not used in this study. A five-point response format (ranging from does not apply at all to fully applies) was used for all IMI measures.

The Text Material Questionnaire consists of three of the subscales of the IMI questionnaire (Deci & Ryan, n.d.) adapted to text material. It assesses students’ interest and pleasure (5 items), felt pressure (2 items), and perceived competence (2 items) in dealing with the text.

The value/usefulness subscale of the IMI (7 items; Deci & Ryan, n.d.), which is adaptable to different content, was used to measure pre-service teachers' perceived relevance of knowledge integration in the pretest and posttest.

Table 1 shows the internal consistencies of the survey instruments used. Reliabilities are reported for the current study as well as for the previous development and validation studies. All in all, the reliabilities can be considered acceptable. For individual scales, however, there are very low internal consistencies with values below α = .70.
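
As a minimal sketch of how such internal consistencies can be obtained, the following R code computes Cronbach's alpha for one subscale with the psych package. The item columns are simulated placeholders only; they are not the study data and not the authors' analysis script.

```r
# Minimal sketch (not the authors' code): Cronbach's alpha for one subscale.
# The data are random placeholders, so the resulting alpha is meaningless;
# only the call itself is illustrated.
library(psych)

set.seed(123)
items <- data.frame(matrix(sample(1:5, 7 * 100, replace = TRUE), ncol = 7))
names(items) <- paste0("int_", 1:7)   # hypothetical items of one 7-item scale

rel <- psych::alpha(items)            # item analysis for the subscale
rel$total$raw_alpha                   # Cronbach's alpha, as reported in Table 1
```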

Text rating measure

To score the quality of the participants' texts, we used a rating scheme with three criteria: degree of transfer, validity of conclusions, and degree of integration. Each criterion was rated on a scale of 0 to 3 points. The criteria were initially developed by an expert group of five persons with regard to content validity. For this study, a two-person group re-evaluated the criteria; only minor changes were made, and only with respect to the specific content. The text criteria were not revealed to the participants. The rating criteria (translated from German) were as follows:

Degree of transfer (credit for transfer is only given if the transfer can also be derived from the texts read).

0: There is no transfer to the action level.

1: A transfer to the action level is made only by naming goals. For this purpose, less specific "should" statements are used (for example, "the lessons should be designed in a friendly way" or "the lessons should have a good relationship level").

2: Ideas on how aspects of the transfer can be implemented are formulated only sporadically (or in an unconnected, list-like form).

3: There are concrete, interrelated derivations from the texts with regard to a realistic lesson design.

Validity of the conclusions

0: The conclusions cannot be derived with certainty from the sources (e.g., purely intuitive assumptions).

1: Unconnected (e.g., purely abductive) assumptions are present, but at most as a list-like series of unrelated individual statements.

2: The conclusions are largely clear from the sources.

3: The conclusions emerge unambiguously and deductively from the sources and are logically related to application.

Degree of integration

0: Assumptions are treated separately per text.

1: The assumptions are treated separately, but, e.g., any contradictions and compatibilities discovered are contrasted, mentioned, and/or discussed.

2: The different models are treated together, with reference to each other. They are not worked through sequentially.

3: The different models in the texts are processed in an integrated and coherent way, explicitly integrating the areas of knowledge into each other.

Computer linguistic methods

Computer linguistic methods were used in this research project for computational modeling of the semantic knowledge structures contained in both source texts and student texts (Pirnay-Dummer, 2006 , 2010 , 2014 , 2015a , 2015b ; Pirnay-Dummer et al., 2010 ).

Mental model-based (Seel, 1991 , 2003 ) knowledge elicitation techniques have relied on recreating propositional networks from human knowledge (Ifenthaler, 2010 ; Jonassen, 2000 , 2006 ; Jonassen & Cho, 2008 ; Pirnay-Dummer, 2015a , 2015b ).

The computational linguistic heuristic technology T-MITOCAR (Text-Model Inspection Trace of Concepts and Relations; Pirnay-Dummer, 2006 , 2007 ; Pirnay-Dummer et al., 2010 ) was developed as a means of automatically analyzing, modeling, visualizing, and comparing the semantic knowledge structure of texts. The approach behind T-MITOCAR is closely based on the psychology of knowledge, knowing, and epistemology (Spector, 2010; Strasser, 2010). Its associative core functions are founded strictly on mental model theory (Gentner & Stevens, 1983 ; Johnson-Laird, 1983 ; Pirnay-Dummer et al., 2012 ; Pirnay-Dummer & Seel, 2018 ; Seel, 2012 ; Seel et al., 2013 ) and on how, when, and why parts of knowledge are reproduced in the semantics of natural language (Evans & Green, 2006 ; Helbig, 2006 ; Partee, 2004 ; Taylor, 2007 ). The algorithms work through the propositional relations of a text to identify central relations between concepts and build a network (a graph), while always heuristically reconstructing the parts as closely as possible to how human knowledge is constructed, conveyed, and reconstructed through text.
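
T-MITOCAR itself is the heuristic technology described in the cited literature. Purely to illustrate the general idea of deriving a weighted association graph from text, the following R sketch counts within-sentence co-occurrences of content words and rescales the counts to weights between zero and one. It is a simplified stand-in under our own assumptions, not the T-MITOCAR algorithm.

```r
# Illustrative stand-in, NOT the T-MITOCAR algorithm: build a weighted
# concept graph from within-sentence co-occurrences of (lower-cased) words.
text <- "Competence integrates knowledge and skills. Knowledge supports competence."

sentences <- unlist(strsplit(tolower(text), "[.!?]\\s*"))
edges <- list()
for (s in sentences) {
  words <- unique(unlist(strsplit(s, "[^a-zäöüß]+")))
  words <- words[nchar(words) > 3]              # crude content-word filter
  if (length(words) < 2) next
  pairs <- t(combn(sort(words), 2))             # all concept pairs within the sentence
  edges[[length(edges) + 1]] <- data.frame(from = pairs[, 1], to = pairs[, 2])
}
edges <- do.call(rbind, edges)
graph <- aggregate(list(weight = rep(1, nrow(edges))),
                   by = edges, FUN = sum)       # co-occurrence counts as association weights
graph$weight <- graph$weight / max(graph$weight)  # rescale strongest association to 1
graph
```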

T-MITOCAR can also automatically compare different knowledge structures from text quantitatively, using measures based on graph theory (Tittmann, 2010): four structural measures (Table 2) and three semantic measures (Table 3). Each measure yields a similarity s between zero and one. Tables 2 and 3 provide only an overview of their interpretations, which is necessary to understand them as criteria within this study.

In this research project, we used T-MITOCAR technology to compare the student texts with a reference model of the source texts on the basis of the seven similarity measures. Although we classified the measures into structural and semantic indices, as Tables 2 and 3 show, the similarity indices measure different features and can therefore not be treated like a subsuming scale (e.g., on a test): The measures are not items for the same but for different properties of knowledge graphs. The structural measures indicate different properties of structure, whereas structure itself is not a property. The same holds true for the semantic measures.
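
The formal definitions of the seven measures are given in the cited literature and summarized in Tables 2 and 3. As a hedged illustration of how a similarity s between zero and one can be derived from two models, the R sketch below computes a simple set-overlap score for the concept sets of a student model and a reference model. The function name and the toy concept lists are illustrative assumptions of ours, not part of T-MITOCAR.

```r
# Illustrative only: a set-overlap similarity in [0, 1] between the concept
# sets of two knowledge graphs. This mimics the *kind* of score reported as
# concept matching, not its actual T-MITOCAR definition.
set_similarity <- function(a, b) {
  shared <- length(intersect(a, b))
  shared / (shared + length(setdiff(a, b)) + length(setdiff(b, a)))
}

reference_concepts <- c("competence", "knowledge", "skill", "standard", "lesson")
student_concepts   <- c("competence", "knowledge", "motivation", "lesson")

set_similarity(student_concepts, reference_concepts)  # s = 3 / 6 = 0.5
```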

Analysis and results

The statistical software R (R-Core-Team, 2022 ) was used for analysis. Of the 119 students enrolled in the four courses at the beginning, five dropped out during term and terminated their participation in this study. Two people were absent on the day of the experiment for health reasons. Three other participants participated in the experiment but did not submit their self-written texts for unknown reasons. Since this was interpreted as a withdrawal of consent, these three datasets were not evaluated. The data of 109 participants were available for further analysis.

First, we checked for differences in the control variables between the experimental conditions using MANOVA and ANOVA. Results from MANOVA showed no significant differences in students’ separative and integrative learning in teacher education between the groups (Wilks’ λ = 0.97, F [4,220] = 0.93, p = .45). Also, no significant differences between the three groups were found in the students’ metacognitive abilities (MANOVA, Wilks’ λ = 0.88, F [12,222] = 1.27, p = .24), epistemological beliefs (MANOVA, Wilks’ λ = 0.95, F [8,226] = 0.81, p = .60), and study-specific self-efficacy (ANOVA, F [2,110] = 2.29, p = .11). The participants of the three groups did not perceive the reading material differently (MANOVA, Wilks’ λ = 0.95, F [6,212] = 0.99, p = .43). For post-experimental intrinsic motivation, no significant differences between the three groups were found (MANOVA, Wilks’ λ = 0.79, F [12,176] = 1.79, p = .053).
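
A minimal R sketch of such a baseline check, with simulated placeholder data standing in for the study data (which is available only on request), could look like this:

```r
# Minimal sketch of a baseline check (simulated placeholder data, not the study data):
# MANOVA on two pretest scores across the three conditions, evaluated with Wilks' lambda.
set.seed(1)
pretest <- data.frame(
  group         = factor(rep(c("EG1", "EG2", "CG"), each = 36)),
  integrative   = rnorm(108, mean = 3.5, sd = 0.5),  # e.g., SILTE integrative learning
  separative    = rnorm(108, mean = 3.0, sd = 0.5),  # e.g., SILTE separative learning
  self_efficacy = rnorm(108, mean = 3.0, sd = 0.4)   # e.g., WIRKSTUD score
)

fit <- manova(cbind(integrative, separative) ~ group, data = pretest)
summary(fit, test = "Wilks")                         # Wilks' lambda, as reported in the text

summary(aov(self_efficacy ~ group, data = pretest))  # single control variable: one-way ANOVA
```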

Using ANOVA, we analyzed the group differences in the time (minutes) participants spent studying the text material, to check whether the instructional prompts (focus questions) were used. Time spent studying the text material differed significantly between the three groups (ANOVA, F[2,89] = 238.2, p < .001, ηp² = .84). Participants with the combined prompts spent an average of 63 minutes (SD = 7.40) studying the text material, while participants with only relevance prompts (EG2: M = 29, SD = 7.25, p < .001) or no prompts (CG: M = 27, SD = 5.67, p < .001) spent significantly less time.
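
The corresponding time-on-task check can be sketched as a one-way ANOVA followed by Tukey HSD contrasts; the data below are simulated placeholders based on the reported group means, not the study data.

```r
# Sketch with simulated placeholder data: one-way ANOVA on study time (minutes)
# plus Tukey HSD pairwise group comparisons.
set.seed(2)
timelog <- data.frame(
  group   = factor(rep(c("EG1", "EG2", "CG"), each = 30)),
  minutes = c(rnorm(30, 63, 7.40), rnorm(30, 29, 7.25), rnorm(30, 27, 5.67))
)

time_fit <- aov(minutes ~ group, data = timelog)
summary(time_fit)              # F test for the group effect
TukeyHSD(time_fit, "group")    # EG1 vs. EG2, EG1 vs. CG, EG2 vs. CG contrasts
```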

As a further check on prompt usage, participants of EG1 were asked to hand in their models showing the interrelations between the texts. Three subjects who had not created a model, i.e. had not completed this learning phase, were excluded from further analysis.

Quality of knowledge integration (DV1)

We analyzed the quality of the pre-service teachers’ texts (DV1). Two trained raters assessed the student texts ( N = 109, EG1: n1 = 37, EG2: n2 = 37, CG: n3 = 35) for quality on a scale of 0 to 3 points on the basis of the following criteria: degree of transfer, validity of conclusions, and degree of integration.

Table 4 shows the mean ratings for the criteria per group for both raters. The average points achieved for integration, transfer, and conclusions are lower for rater 1, who seems to evaluate the student texts more strictly than rater 2.

Interrater reliability (Kendall's τ, type b) was high for transfer (rτ = .52, z = 6.08, p < .001) but low for conclusion (rτ = .21, z = 2.23, p = .02) and integration (rτ = .13, z = 1.60, p = .11), as well as for the overall rating (rτ = .24, z = 3.30, p < .001). Due to the low interrater reliability, the ratings were treated separately in the following analysis. Both raters used the whole range of the criteria (0–3) for each item. MANOVA with the three rating scores per rater as dependent measures and the experimental conditions as independent measures indicated no significant differences between types of prompting (Wilks' λ = 0.85, F[12,202] = 1.43, p = .16).
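
Kendall's τ-b between two raters can be obtained in base R with cor.test(); the rating vectors below are placeholders on the 0–3 scale used here, not the actual ratings.

```r
# Sketch: interrater agreement for one criterion (e.g., transfer).
# The rater scores are placeholders on the 0-3 scale used in this study.
rater1 <- c(0, 1, 2, 3, 1, 2, 0, 3, 2, 1)
rater2 <- c(0, 2, 2, 3, 1, 1, 1, 3, 2, 2)

# With tied ranks, cor.test() reports tau-b together with a z statistic and
# p value (using a normal approximation), as given in the text.
cor.test(rater1, rater2, method = "kendall")
```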

Computer linguistic analyses (DV2)

Comparison of participants’ texts with the reference model of source material.

Using T-MITOCAR technology, we combined the four source texts to generate a reference model of their semantic knowledge structures across the respective domains (see Figure 3). Because the entire model would be too large, Figure 2 shows only a section of the resulting reference model for illustrative purposes.

Figure 2. Section of the reference model of the source material

The source reference model consists of concepts that are bound by links. The links are association strengths as determined by T-MITOCAR. At the links (Figure 2) are measures of association as weights between 0 and 1: one stands for the strongest association within the text, and zero would stand for no association. Only the strongest links are included in such a graph. Within the parentheses is a linear transformation of the same value, so that the weakest links that still made it into the graph show a zero and the strongest show a one (Pirnay-Dummer, 2015b). The meaning of a concept is constituted by the context structure in which it is located in the network. The meaning of such a T-MITOCAR-generated semantic knowledge structure (in this case, the reference model of the source texts) lies in the way concepts are linked with each other (e.g., cyclic, hierarchical, sequential) and in the connections they make to specific concepts but not to others.

To explore the effect of prompting on the pre-service teachers' knowledge structures, we compared the student texts (N = 109, EG1: n1 = 37, EG2: n2 = 37, CG: n3 = 35) with the reference model of the source texts using the T-MITOCAR similarity measures. The participants in CG and EG2 tended to write longer texts than those in EG1. However, differences in the average word count between the groups were not significant (ANOVA, F[2,106] = 1.67, p = .19; M EG1 = 407.11, SD EG1 = 122.59; M EG2 = 429.73, SD EG2 = 86.21; M CG = 448.8, SD CG = 74.56).

The logic of the computer-linguistic analysis in this study is shown in Figure 3 .

Figure 3. Computer-linguistic analysis of the student texts

Means and standard deviations for the computer-linguistic comparison measures per group between participants’ texts and reference text model are shown in Table 5 .

MANOVA (Type III), with the four structural (SUR, GRA, STRU, GAMMA) and three semantic similarity scores (CONC, PROP, BSM) as dependent measures and the experimental conditions as independent measures, indicated significant differences between type of prompting (Wilks’ λ = 0.79, F [14,200] = 1.83, p = .04).

We conducted follow-up univariate ANOVAs (Type III) for each of the dependent measures. The results indicated significant differences between the experimental conditions for structural matching (STRU: F[2,106] = 7.44, p < .001, ηp² = .12; Figure 5), propositional matching (PROP: F[2,106] = 4.78, p = .01, ηp² = .08; Figure 4), and balanced semantic matching (BSM: F[2,106] = 3.49, p = .03, ηp² = .06). The differences in surface matching (SUR: F[2,106] = 1.69, p = .19), graphical matching (GRA: F[2,106] = 2.10, p = .13), concept matching (CONC: F[2,106] = 1.80, p = .17), and gamma (GAMMA: F[2,106] = 1.28, p = .28) were not significant.
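
A compact way to run such univariate follow-ups over all seven similarity measures is sketched below; the data frame is simulated and its column names are placeholders that mirror the abbreviations in Tables 2 and 3.

```r
# Sketch with simulated placeholder data: univariate follow-up ANOVAs for each
# of the seven similarity scores (column names mirror Tables 2 and 3).
set.seed(3)
measures <- c("SUR", "GRA", "STRU", "GAMMA", "CONC", "PROP", "BSM")
sim <- data.frame(group = factor(rep(c("EG1", "EG2", "CG"), each = 36)))
for (m in measures) sim[[m]] <- runif(nrow(sim), 0, 1)   # placeholder similarities in [0, 1]

followups <- lapply(measures, function(m) {
  fit <- aov(reformulate("group", response = m), data = sim)
  summary(fit)[[1]][1, c("F value", "Pr(>F)")]           # first row = group effect
})
names(followups) <- measures
followups                                                # F and p per measure
```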

Figure 4. Propositional similarity between participants' texts and reference source model per group

Tukey HSD post-hoc tests for structural matching revealed that the control group (STRU: M = 0.43, SD = 0.10) achieved significantly higher similarities with the reference model than experimental group 1 (STRU: M = 0.31, SD = 0.15, p < .001) and experimental group 2 (STRU: M = 0.34, SD = 0.14, p = .02; see Figure 5). The control group (PROP: M = 0.08, SD = 0.03) also achieved significantly higher propositional similarity to the reference model than experimental group 1 (PROP: M = 0.06, SD = 0.03, p = .009; see Figure 4). The contrasts for balanced semantic matching were not significant (BSM: p > .05).

Figure 5. Structural similarity between participants' texts and reference source model per group

Perceived relevance (DV3)

We used a mixed ANOVA (Type III, with Greenhouse-Geisser correction for violation of sphericity) to analyze differences in the students' perceived relevance of knowledge integration over time (R package: afex; function: aov_4). There was no significant main effect of the within-subject factor time (F[1,105] = 0.77, p = .38) or of the between-subject factor group (F[2,105] = 0.20, p = .82). However, as Figure 6 illustrates, a significant interaction effect of time (within) and group (between) on perceived relevance was found (F[2,105] = 5.68, p = .005, ηG² = .027).

Figure 6. Differences in perceived relevance of knowledge integration between groups over time

Tukey-HSD-adjusted mixed post-hoc analysis revealed a significant (p = .02) increase in perceived relevance of knowledge integration only for EG1, the group receiving combined prompts (M Diff = −0.31, 95% CI [−0.58, −0.033]).
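
The mixed ANOVA named above (afex, aov_4) together with Tukey-adjusted follow-up contrasts can be sketched as follows; the long-format data frame and its variable names are simulated placeholders for the study data.

```r
# Sketch with simulated placeholder data (not the study data): mixed ANOVA on
# perceived relevance with the within factor time (pre/post) and the between
# factor group, followed by Tukey-adjusted pre-post contrasts per group.
library(afex)     # provides aov_4, as named in the text
library(emmeans)  # follow-up contrasts

set.seed(4)
relevance_long <- data.frame(
  id        = factor(rep(1:90, times = 2)),
  group     = factor(rep(rep(c("EG1", "EG2", "CG"), each = 30), times = 2)),
  time      = factor(rep(c("pre", "post"), each = 90)),
  relevance = rnorm(180, mean = 3.5, sd = 0.6)
)

mixed_fit <- aov_4(relevance ~ group + (time | id), data = relevance_long)
mixed_fit                                   # ANOVA table with time, group, and interaction

emm <- emmeans(mixed_fit, ~ time | group)   # estimated marginal means per group and time
pairs(emm, adjust = "tukey")                # pre-post contrast within each group
```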

Discussion

The main objective of this study was to examine the effects of instructional prompts and relevance prompts embedded in pre-service teachers' learning processes on the quality of their knowledge integration in multiple-document comprehension across domains.

We found that students receiving no prompts (CG) achieved a closer structural match to the overall reference model than the students aided by prompts (EG1, EG2). Also, their propositions were more similar to the reference model than those of the students with combined prompts (EG1). When interpreting the results, it is important to separate the knowledge level from the integration level of a text. The way the reference model to which the student texts were compared was created explains why higher similarity does not indicate higher knowledge integration, but rather more knowledge-telling (Scardamalia & Bereiter, 1987, 1991). The reference model was created by combining all four source texts into one document. From this, the overall model of the textual knowledge structure was then created using T-MITOCAR. Thus, the resulting model of the source texts is not an integrated knowledge model but rather a combined one. Students who solved the writing task by preparing a short summary of each of the reference texts in turn achieved a higher structural and propositional similarity than students who actually made the effort to produce an integrated text of their own from the four reference texts. Thus, our results for structural and propositional matching provide a substantial empirical indication that students who do not receive prompts about knowledge integration are more prone to knowledge-telling than students who receive prompts.

However, no conclusions can be drawn from this as to whether students supported by the prompts actually integrated their knowledge better. We found no direct evidence that pre-service teachers were supported in their knowledge integration by pre-actional cognitive prompts in the form of task-supplemental focus questions in combination with repeated content-independent relevance instruction. This is contrary to previous studies that succeeded in prompting knowledge integration (e.g., Lehmann et al., 2019a, 2019b; Zeeb, Biwer, et al., 2019; Zeeb et al., 2020).

Overall, very few of the pre-service teachers in our study succeeded in writing integrated texts that contained transfer. This again is in line with previous empirical findings that student teachers struggle with knowledge integration (Graichen et al., 2019; Harr et al., 2014; Harr et al., 2015; Janssen & Lazonder, 2016). Pre-service teachers have little or no formal experience in knowledge integration because their study domains are taught separately (Ball, 2000; Darling-Hammond, 2006; Hudson & Zgaga, 2017). The complex task we used in our experiment required students not only to integrate knowledge from four texts across domains but also to draw integrated conclusions for transfer. Low integration despite repeated, combined prompting can be interpreted as an indication that transfer-oriented knowledge integration should not be treated as a mere function of multi-document integration, even when the latter is as easy to control as it is within studies on knowledge integration. Rather than a text-inherent process of a task that is limited to a particular domain and to specifically trained expectations about kinds of integration, transfer extending to the truly expected task (what students believe their knowledge application will look like later in "real life") seems to modify the kind of integration as well as its outcome. The students in this study seem to have been induced to leave their academic knowledge behind and to rely more on their world knowledge. When the task and its way of introducing integration, and thus transfer, prompt them to leave the specific domain, they no longer perceive the same specific content to be as accessible or relevant. This is only a post-hoc interpretation at this point and by no means definitive evidence, but it points to a recognizable danger to academic transfer and may even help to explain a lack of dissemination. In future analyses, we will also try to map commonplace knowledge onto the solutions; this could be difficult, however, because we would first need to carefully create a comparable knowledge base as an additional outside criterion.

Low integration and transfer might suggest a floor effect that may have made it additionally difficult for raters, as there was little transfer and integrated knowledge to be found. This suggests that knowledge integration training and practice in the use of knowledge integration support is needed in handling complex tasks especially when the integration is directed at academic transfer.

We did not find significant group differences for text quality (transfer, conclusion, and integration criteria). However, the low interrater reliability is a limitation of this study; new raters should be trained in more detail to obtain more reliable assessments of the texts. At the same time, we found group differences in the time participants spent reading and working with the given text material. This shows that the focus questions given as instructional prompts to EG1 were indeed used and stimulated a significantly longer engagement with the texts compared to the relevance prompts alone or no prompts.

Contradicting our third hypothesis (see Objectives) and previous findings (Zeeb, Biwer, et al., 2019; Zeeb et al., 2020), relevance prompts alone were not enough to enhance the perception of relevance, even though they were repeated and emphasized knowledge integration independently of the source material content and the task (DV3). Only in combination with the complex instructional prompts did the perceived relevance increase over time. However, long-term effects were not part of this study, which limits the validity of this point.

The results of this study are surprising in some important ways: Knowledge integration seems to be even more complex than previously known, particularly when it spans interdisciplinary domains and is aimed at transfer. Everyday knowledge may get in the way of academically sound transfer in a much deeper sense than previously assumed. This should be considered a prerequisite of transfer; just leaving it to practical imagination clearly does not suffice. Scholars and practitioners alike need to know about this gap before they can effectively train for integrated transfer.

Data availability

The data that support the findings of this study are not openly available but are available from the corresponding author upon request.

Ball, D. L. (2000). Bridging practices: Intertwining content and pedagogy in teaching and learning to teach. Journal of Teacher Education, 51 (3), 241–247. https://doi.org/10.1177/0022487100051003013

Bandura, A. (1978). Self-efficacy: Toward a unifying theory of behavioral change. Advances in Behaviour Research and Therapy, 1 (4), 139–161. https://doi.org/10.1016/0146-6402(78)90002-4

Bannert, M. (2009). Promoting self-regulated learning through prompts. Zeitschrift für Pädagogische Psychologie, 23 (2), 139–145. https://doi.org/10.1024/1010-0652.23.2.139

Barzilai, S., & Strømsø, H. I. (2018). Individual differences in multiple document comprehension. In J. L. G. Braasch, I. Bråten, & M. T. McCrudden (Eds.), Handbook of multiple source use (pp. 99–115). Routledge. https://doi.org/10.4324/9781315627496-6

Baumert, J., & Kunter, M. (2006). Stichwort: Professionelle Kompetenz von Lehrkräften. Zeitschrift für Erziehungswissenschaft, 9 (4), 469–520. https://doi.org/10.1007/s11618-006-0165-2

Bendixen, L. D., & Rule, D. C. (2004). An integrative approach to personal epistemology: A guiding model. Educational Psychologist, 39 (1), 69–80. https://doi.org/10.1207/s15326985ep3901_7

Blömeke, S. (2009). Lehrerausbildung. In S. Blömeke, T. Bohl, L. Haag, G. Lang-Wojtasik, & W. Sacher (Eds.), Handbuch schule. Theorie—organisation—entwicklung (pp. 483–490). Klinkhardt.

Braasch, J. L. G., Rouet, J.-F., Vibert, N., & Britt, M. A. (2012). Readers’ use of source information in text comprehension. Memory & Cognition, 40 , 450–465. https://doi.org/10.3758/s13421-011-0160-6

Bråten, I., & Braasch, J. L. G. (2018). The role of conflict in multiple source use. In J. L. G. Braasch, I. Bråten, & M. T. McCrudden (Eds.), Handbook of multiple source use (pp. 184–201). Routledge. https://doi.org/10.4324/9781315627496-11

Bråten, I., & Strømsø, H. I. (2009). Effects of task instruction and personal epistemology on the understanding of multiple texts about climate change. Discourse Processes, 47 (1), 1–31. https://doi.org/10.1080/01638530902959646

Bråten, I., & Strømsø, H. I. (2012). Knowledge acquisition: Constructing meaning from multiple information sources. In N. M. Seel (Ed.), Encyclopedia of the sciences of learning (pp. 1677–1680). Springer. https://doi.org/10.1007/978-1-4419-1428-6_807

Britt, M. A., & Rouet, J.-F. (2012). Learning with multiple documents: Component skills and their acquisition. In J. R. Kirby & M. J. Lawson (Eds.), Enhancing the quality of learning: Dispositions, instruction, and learning processes (pp. 276–314). Cambridge University Press. https://doi.org/10.1017/CBO9781139048224.017

Britt, M. A., Rouet, J.-F., & Durik, A. M. (2018). Representations and processes in multiple source use. In J. L. G. Braasch, I. Bråten, & M. T. McCrudden (Eds.), Handbook of multiple source use (pp. 17–32). Routledge. https://doi.org/10.4324/9781315627496-2

Britt, M. A., & Sommer, J. (2004). Facilitating textual integration with macro-structure focusing tasks. Reading Psychology, 25 (4), 313–339. https://doi.org/10.1080/02702710490522658

Bromme, R. (2014). Der Lehrer als Experte: Zur Psychologie des professionellen Wissens (Vol. 7). Waxmann Verlag.

Brunner, M., Kunter, M., Krauss, S., Baumert, J., Blum, W., Dubberke, T., & Neubrand, M. (2006). Welche Zusammenhänge bestehen zwischen dem fachspezifischen Professionswissen von Mathematiklehrkräften und ihrer Ausbildung sowie beruflichen Fortbildung? Zeitschrift für Erziehungswissenschaft, 9 (4), 521–544. https://doi.org/10.1007/s11618-006-0166-1

Cerdán, R., & Vidal-Abarca, E. (2008). The effects of tasks on integrating information from multiple documents. Journal of Educational Psychology, 100 (1), 209–222. https://doi.org/10.1037/0022-0663.100.1.209

Chan, C., Burtis, J., & Bereiter, C. (1997). Knowledge building as a mediator of conflict in conceptual change. Cognition and Instruction, 15 (1), 1–40. https://doi.org/10.1207/s1532690xci1501_1

Conley, A. M., Pintrich, P. R., Vekiri, I., & Harrison, D. (2004). Changes in epistemological beliefs in elementary science students. Epistemological Development and Its Impact on Cognition in Academic Domains, 29 (2), 186–204. https://doi.org/10.1016/j.cedpsych.2004.01.004

Darling-Hammond, L. (2006). Constructing 21st-century teacher education. Journal of Teacher Education, 57 (3), 300–314. https://doi.org/10.1177/0022487105285962

Deci, E. L., & Ryan, R. M. (n.d.). Intrinsic motivation inventory (IMI). http://selfdeterminationtheory.org/intrinsic-motivation-inventory/

Dörner, D. (1976). Problemlösen als Informationsverarbeitung (1st ed.). Kohlhammer.

Evans, V., & Green, M. (2006). Cognitive linguistics. An Introduction. Edinburgh: Edinburgh University Press. https://doi.org/10.4324/9781315864327

Gentner, D., & Stevens, A. L. (1983). Mental models . Erlbaum.

Gil, L., Bråten, I., Vidal-Abarca, E., & Strømsø, H. I. (2010). Summary versus argument tasks when working with multiple documents: Which is better for whom? Contemporary Educational Psychology, 35 (3), 157–173. https://doi.org/10.1016/j.cedpsych.2009.11.002

Graichen, M., Wegner, E., & Nückles, M. (2019). Wie können Lehramtsstudierende beim Lernen durch Schreiben von Lernprotokollen unterstützt werden, dass die Kohärenz und Anwendbarkeit des erworbenen Professionswissens verbessert wird? Unterrichtswissenschaft, 47 (1), 7–28. https://doi.org/10.1007/s42010-019-00042-x

Hähnlein, I. (2018). Erfassung epistemologischer Überzeugungen von Lehramtsstudierenden. Entwicklung und Validierung des StEB Inventar. (Dissertation). Universität Passau. https://opus4.kobv.de/opus4-uni-passau/frontdoor/index/index/docId/588 .

Hähnlein, I., & Pirnay-Dummer, P. (2019). Assessing Metacognition in the Learning Process. Construction of the Metacognition in the Learning Process Inventory MILP . Paper presented at the European Association for Research on Learning and Instruction (EARLI) Conference, Aachen, Germany.

Harr, N., Eichler, A., & Renkl, A. (2014). Integrating pedagogical content knowledge and pedagogical/psychological knowledge in mathematics. Frontiers in Psychology, 5 , 1–10. https://doi.org/10.3389/fpsyg.2014.00924

Harr, N., Eichler, A., & Renkl, A. (2015). Integrated learning: Ways of fostering the applicability of teachers’ pedagogical and psychological knowledge. Frontiers in Psychology, 6 , 1–16. https://doi.org/10.3389/fpsyg.2015.00738

Helbig, H. (2006). Knowledge representation and the semantics of natural language. Springer . https://doi.org/10.1007/3-540-29966-1

Hill, H. C., Rowan, B., & Ball, D. L. (2005). Effects of teachers’ mathematical knowledge for teaching on students’ achievement. American educational research journal, 42 (2), 371–406. https://doi.org/10.3102/00028312042002371

Hofer, B. K., & Pintrich, P. R. (1997). The development of epistemological theories: Beliefs about knowledge and knowing and their relation to learning. Review of Educational Research, 67 (1), 88–140. https://doi.org/10.3102/00346543067001088

Hudson, B., & Zgaga, P. (2017). History, context and overview: Implications for teacher education policy, practice and future research. In H. Brian (Ed.), Overcoming Fragmentation in Teacher Education Policy and Practice (pp. 1–10). Cambridge University Press.

Ifenthaler, D. (2010). Relational, structural, and semantic analysis of graphical representations and concept maps. Educational Technology Research and Development, 58 (1), 81–97. https://doi.org/10.1007/s11423-008-9087-4

Ifenthaler, D. (2012). Determining the effectiveness of prompts for self-regulated learning in problem-solving scenarios. Journal of Educational Technology & Society, 15 (1), 38–52.

Ifenthaler, D., & Lehmann, T. (2012). Preactional self-regulation as a tool for successful problem solving and learning. Technology, Instruction, Cognition and Learning, 9 (1–2), 97–110.

Janssen, N., & Lazonder, A. W. (2016). Supporting pre-service teachers in designing technology-infused lesson plans. Journal of Computer Assisted Learning, 32 (5), 456–467. https://doi.org/10.1111/jcal.12146

Johnson-Laird, P. N. (1983). Mental models: Toward a cognitive science of language, inference, and consciousness. Cambridge University Press. https://doi.org/10.2307/414498

Jonassen, D. H. (2000). Computers as mindtools for schools: Engaging critical thinking. Prentice Hall.

Jonassen, D. H. (2006). On the role of concepts in learning and instructional design. Educational Technology Research and Development, 54 (2), 177–196. https://doi.org/10.1007/s11423-006-8253-9

Jonassen, D. H. (2012). Problem typology. In N. M. Seel (Ed.), Encyclopedia of the Sciences of Learning. Springer. https://doi.org/10.1007/978-1-4419-1428-6_209

Jonassen, D. H., & Cho, Y. H. (2008). Externalizing mental models with mindtools. In D. Ifenthaler, P. Pirnay-Dummer, & J. M. Spector (Eds.), Understanding models for learning and instruction Essays in honor of Norbert M Seel (pp. 145–159). Springer. https://doi.org/10.1007/978-0-387-76898-4_7

Kintsch, W. (1988). The role of knowledge in discourse comprehension: A construction integration model. Psychological review, 95 (2), 163–182. https://doi.org/10.1037/0033-295X.95.2.163

Kintsch, W. (1998). Comprehension: A paradigm for cognition . Cambridge University Press.

Koehler, M. J., & Mishra, P. (2008). Introducing TPCK. In AACTE Committee on Innovation and Technology (Eds.), Handbook of technological pedagogical content knowledge (TPCK) for educators (pp. 2–29). Routledge.

König, J. (2010). Lehrerprofessionalität. Konzepte und Ergebnisse der internationalen und deutschen Forschung am Beispiel fachübergreifender, pädagogischer Kompetenzen. In J. König & B. Hofmann (Eds.), Professionalität von Lehrkräften. Was sollen Lehrkräfte im Lese- und Schreibunterricht wissen und können (pp. 40–106). Dgls.

Krauskopf, K., Zahn, C., & Hesse, F. W. (2020). Concetualizing (pre-service) teachers’ professional knowledge for complex domains. In T. Lehmann (Ed.), International perspectives on knowledge integration: Theory, research, and good practice in pre-service teacher and higher education (pp. 31–57). Brill Sense. https://doi.org/10.1163/9789004429499_003

Kultusministerkonferenz. (2009). Kompetenzstufenmodell zu den Bildungsstandards im Kompetenzbereich: Sprechen und Zuhören—hier Zuhören—für den Mittleren Schulabschluss. Beschluss der Kultusministerkonferenz.

Lehmann, T. (2020a). What is knowledge integration of multiple domains and how does it relate to teachers’ professional competence? In T. Lehmann (Ed.), International perspectives on knowledge integration: Theory, research, and good practice in pre-service teacher and higher education (pp. 9–29). Brill Sense. https://doi.org/10.1163/9789004429499_002

Lehmann, T. (2020b). International perspectives on knowledge integration: Theory, research, and good practice in pre-service teacher and higher education . Brill Academic Publishers. https://doi.org/10.1163/9789004429499

Lehmann, T. (2022). Student teachers’ knowledge integration across conceptual borders: The role of study approaches, learning strategies, beliefs, and motivation. European Journal of Psychology of Education . https://doi.org/10.1007/s10212-021-00577-7

Lehmann, T., Hähnlein, I., & Ifenthaler, D. (2014). Cognitive, metacognitive and motivational perspectives on preflection in self-regulated online learning. Computers in Human Behavior, 32 , 313–323. https://doi.org/10.1016/j.chb.2013.07.051

Lehmann, T., Klieme, K., & Schmidt-Borcherding, F. (2020). Separative and integrative learning in teacher education: Validity and reliability of the "SILTE" short scales. In T. Lehmann (Ed.), International perspectives on knowledge integration: Theory, research, and good practice in pre-service teacher and higher education (pp. 155–177). Brill Academic Publishers.

Lehmann, T., Pirnay-Dummer, P., & Schmidt-Borcherding, F. (2019a). Fostering integrated mental models of different professional knowledge domains: Instructional approaches and model-based analyses. Educational Technology Research and Development, 68 (3), 905–927. https://doi.org/10.1007/s11423-019-09704-0

Lehmann, T., Rott, B., & Schmidt-Borcherding, F. (2019b). Promoting pre-service teachers’ integration of professional knowledge: Effects of writing tasks and prompts on learning from multiple documents. Instructional Science, 47 (1), 99–126. https://doi.org/10.1007/s11251-018-9472-2

Linn, M. C. (2000). Designing the knowledge integration environment. International Journal of Science Education, 22 (8), 781–796. https://doi.org/10.1080/095006900412275

Mishra, P. (2019). Considering contextual knowledge: The TPACK diagram gets an upgrade. Journal of Digital Learning in Teacher Education, 35 (2), 76–78. https://doi.org/10.1080/21532974.2019.1588611

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108 (6), 1017–1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x

Muis, K. R., Bendixen, L. D., & Haerle, F. C. (2006). Domain-generality and domain-specificity in personal epistemology research: Philosophical and empirical reflections in the development of a theoretical framework. Educational Psychology Review, 18 (1), 3–54.

Nelson, T. O., & Narens, L. (1990). Metamemory: A theoretical framework and some new findings. The Psychology of Learning and Motivation, 26 , 125–173. https://doi.org/10.1016/S0079-7421(08)60053-5

Nelson, T. O., & Narens, L. (1994). Why investigate metacognition. In J. Metcalfe & A. Shimamura (Eds.), Metacognition knowing about knowing (pp. 1–25). Cambridge University Press.

Norton, P., van Rooij, S. W., Jerome, M. K., Clark, K., Behrmann, M., & Bannan-Ritland, B. (2009). Linking theory and practice through design: An instructional technology program. In M. Orey, V. J. McClendon, & R. M. Branch (Eds.), Educational media and technology yearbook (pp. 47–59). Springer. https://doi.org/10.1007/978-0-387-09675-9_4

Partee, B. H. (2004). Compositionality in formal semantics: Selected papers of Barbara Partee. Blackwell Pub . https://doi.org/10.1002/9780470751305

Perfetti, C. A., Rouet, J.-F., & Britt, M. A. (1999). Toward a theory of documents representation. In H. van Oostendorp & S. R. Goldman (Eds.), The construction of mental representations during reading (pp. 99–122). Lawrence Erlbaum Associates.

Pirnay-Dummer, P. (2006). Expertise und Modellbildung-MITOCAR. (Dissertation). Albert-Ludwigs-Universität Freiburg (Breisgau).

Pirnay-Dummer, P. (2007). Model inspection trace of concepts and relations: A heuristic approach to language-oriented model assessment. Paper presented at the annual meeting of AERA, Chicago IL.

Pirnay-Dummer, P. (2010). Complete structure comparison. In D. Ifenthaler, P. Pirnay-Dummer, & N. M. Seel (Eds.), Computer-based diagnostics and systematic analysis of knowledge (pp. 235–258). Springer. https://doi.org/10.1007/978-1-4419-5662-0_13

Pirnay-Dummer, P. (2014). Gainfully guided misconception: How automatically generated knowledge maps can help companies within and across their projects. In D. Ifenthaler & R. Hanewald (Eds.), Digital knowledge maps in higher education: Technology-enhanced support for teachers and learners (pp. 253–274). Springer. https://doi.org/10.1007/978-1-4614-3178-7_14

Pirnay-Dummer, P. (2015a). Knowledge elicitation. In J. M. Spector (Ed.), The SAGE encyclopedia of educational technology (pp. 438–442). SAGE. https://doi.org/10.4135/9781483346397.n183

Pirnay-Dummer, P. (2015b). Linguistic analysis tools. In C. A. MacArthur, M. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (pp. 427–442). Guilford Publications.

Pirnay-Dummer, P. (2020). Knowledge and structure to teach: A model-based computer-linguistic approach to track, visualize, compare and cluster knowledge and knowledge integration in pre-service teachers. In T. Lehmann (Ed.), International perspectives on knowledge integration: Theory, research, and good practice in pre-service teacher and higher education (pp. 133–154). Brill Sense. https://doi.org/10.1163/9789004429499_007

Pirnay-Dummer, P., Ifenthaler, D., & Seel, N. M. (2012). Designing model-based learning environments to support mental models for learning. In D. H. Jonassen & S. M. Land (Eds.), Theoretical foundations of learning environments (pp. 55–90). Routledge. https://doi.org/10.4324/9780203813799

Pirnay-Dummer, P., Ifenthaler, D., & Spector, J. M. (2010). Highly integrated model assessment technology and tools. Educational Technology Research and Development, 58 (1), 3–18. https://doi.org/10.1007/s11423-009-9119-8

Pirnay-Dummer, P., & Seel, N. M. (2018). The sciences of learning. In L. Lin & J. M. Spector (Eds.), The Sciences of learning and instructional design: Constructive articulation between communities (pp. 8–35). Routledge. https://doi.org/10.4324/9781315684444-2

R-Core-Team (2022). R: A language and environment for statistical computing (Version 4.2.2): R Foundation for Statistical Computing. https://www.R-project.org/ .

Reigeluth, C. M., & Stein, F. S. (1983). The elaboration theory of instruction. In C. M. Reigeluth (Ed.), Instructional design theories and models: An overview of their current status (pp. 335–382). Erlbaum. https://doi.org/10.4324/9780203824283

Renkl, A., Mandl, H., & Gruber, H. (1996). Inert knowledge: Analyses and remedies. Educational Psychologist, 31 (2), 115–121. https://doi.org/10.1207/s15326985ep3102_3

Rouet, J.-F. (2006). The skills of document use: From text comprehension to web-based learning. Lawrence Erlbaum Associates . https://doi.org/10.4324/9780203820094

Rouet, J.-F., & Britt, M. A. (2011). Relevance processes in multiple document comprehension. In M. T. McCrudden, J. P. Magliano, & G. Schraw (Eds.), Text relevance and learning from text (pp. 19–52). IAP Information Age Publishing.

Rouet, J.-F., Britt, M. A., & Durik, A. M. (2017). RESOLV: Readers’ representation of reading contexts and tasks. Educational Psychologist, 52 (3), 200–215. https://doi.org/10.1080/00461520.2017.1329015

Rule, D. C., & Bendixen, L. D. (2010). The integrative model of personal epistemology development: Theoretical underpinnings and implications for education. In L. D. Bendixen & F. C. Feucht (Eds.), Personal epistemology in the classroom (pp. 94–123). Cambridge University Press. https://doi.org/10.1017/CBO9780511691904.004

Scardamalia, M., & Bereiter, C. (1987). Knowledge telling and knowledge transforming in written composition. In S. Rosenberg (Ed.), Advances in applied psycholinguistics (Vol. 2, pp. 142–175). Cambridge University Press. https://doi.org/10.1017/S0142716400008596

Scardamalia, M., & Bereiter, C. (1991). Literate expertise. In K. A. Ericsson & J. Smith (Eds.), Toward a general theory of expertise: Prospects and limits. Cambridge University Press.

Scardamalia, M., & Bereiter, C. (1994). Computer support for knowledge-building communities. The Journal of the Learning Sciences, 3 (3), 265–283. https://doi.org/10.1207/s15327809jls0303_3

Scardamalia, M., & Bereiter, C. (1999). Schools as knowledge-building organizations. In D. P. Keating & C. Hertzman (Eds.), Developmental health and the wealth of nations: Social, biological, and educational dynamics (pp. 274–289). Guilford Press.

Schiefele, U., & Pekrun, R. (1996). Psychologische Modelle des selbstgesteuerten und fremdgesteuerten Lernens. In F. E. Weinert (Ed.), Enzyklopädie der Psychologie—Psychologie des Lernens und der Instruktion Pädagogische Psychologie (Vol. 2, pp. 249–287). Hogrefe.

Schneider, M. (2012). Knowledge integration. In N. M. Seel (Ed.), Encyclopedia of the sciences of learning (pp. 1684–1686). Springer. https://doi.org/10.1007/978-1-4419-1428-6_807

Schommer, M. (1994). An emerging conceptualization of epistemological beliefs and their role in learning. In R. Garner & P. A. Alexander (Eds.), Beliefs about text and instruction with text (pp. 25–40). Erlbaum.

Schwarzer, R., & Jerusalem, M. (2003). SWE: Skala zur Allgemeinen Selbstwirksamkeitserwartung [Verfahrensdokumentation, Autorenbeschreibung und Fragebogen]. In L.-I. f. P. (ZPID) (Ed.), Open Test Archive . https://doi.org/10.23668/psycharchives.4515

Seel, N. M. (1991). Weltwissen und mentale Modelle. Hogrefe Verlag.

Seel, N. M. (2003). Model-centered learning and instruction. Technology, Instruction, Cognition and Learning, 1 (1), 59–85.

Seel, N. M. (2012). Learning and thinking. In N. M. Seel (Ed.), Encyclopedia of the sciences of learning (Vol. 4, pp. 1797–1799). Springer.

Seel, N. M., Ifenthaler, D., & Pirnay-Dummer, P. (2013). Mental models and their role in learning by insight and creative problem solving. In J. M. Spector, B. B. Lockee, S. E. Smaldino, & M. Herring (Eds.), Learning, problem solving, and mind tools: Essays in honor of David H. Jonassen (pp. 10–34). Springer.

Seel, N. M., Lehmann, T., Blumschein, P., & Podolskiy, O. A. (2017). Instructional design for learning: Theoretical foundations. Sense . https://doi.org/10.1007/978-94-6300-941-6

Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15 (2), 4–14. https://doi.org/10.3102/0013189X015002004

Taylor, J. R. (2007). Cognitive linguistics and autonomous linguistics. In D. Geeraerts & H. Cuyckens (Eds.), Cognitive Linguistics (pp. 566–588). Oxford University Press.

Thompson, C. (2020). Allgemeine Erziehungswissenschaft . Eine Einführung (Vol. 3). http://www.klinkhardt.de/ewr/97831726165.html .

Tittmann, P. (2010). Graphs and networks. In D. Ifenthaler, P. Pirnay-Dummer, & N. M. Seel (Eds.), Computer-based diagnostics and systematic analysis of knowledge (pp. 177–188). Springer. https://doi.org/10.1007/978-1-4419-5662-0_10

Trevors, G. J., Muis, K. R., Pekrun, R., Sinatra, G. M., & Winne, P. H. (2016). Identity and epistemic emotions during knowledge revision: A potential account for the backfire effect. Discourse Processes, 53 (5–6), 339–370. https://doi.org/10.1080/0163853X.2015.1136507

van Dijk, T. A., & Kintsch, W. (1983). Strategies of discourse comprehension. Academic Press . https://doi.org/10.2307/415483

Voss, T., Kunter, M., & Baumert, J. (2011). Assessing teacher candidates’ general pedagogical/psychological knowledge: Test construction and validation. Journal of Educational Psychology, 103 (4), 952–969. https://doi.org/10.1037/a0025125

Wäschle, K., Lehmann, T., Brauch, N., & Nückles, M. (2015). Prompted journal writing supports preservice history teachers in drawing on multiple knowledge domains for designing learning tasks. Peabody Journal of Education, 90 (4), 546–559. https://doi.org/10.1080/0161956X.2015.1068084

Whitehead, A. N. (1929). The aims of education . Macmillan.

Wiley, J., & Voss, J. F. (1996). The effects of ‘playing historian’ on learning in history. Applied Cognitive Psychology, 10 (7), 63–72.

Wiley, J., & Voss, J. F. (1999). Constructing arguments from multiple sources: Tasks that promote understanding and not just memory from text. Journal of Educational Psychology, 91 (2), 301–311. https://doi.org/10.1037/0022-0663.91.2.301

Wilhelm, O., & Nickolaus, R. (2013). Was grenzt das Kompetenzkonzept von etablierten Kategorien wie Fähigkeit, Fertigkeit oder Intelligenz ab? Zeitschrift für Erziehungswissenschaft, 16 , 23–26. https://doi.org/10.1007/s11618-013-0380-6

Zeeb, H., Biwer, F., Brunner, G., Leuders, T., & Renkl, A. (2019). Make it relevant! How prior instructions foster the integration of teacher knowledge. Instructional Science, 47 (6), 711–739. https://doi.org/10.1007/s11251-019-09497-y

Zeeb, H., Spitzmesser, E., Röddiger, A., Leuders, T., & Renkl, A. (2020). Using relevance instructions to support the integration of teacher knowledge. In T. Lehmann (Ed.), International perspectives on knowledge integration: Theory, research, and good practice in pre-service teacher and higher education (pp. 201–229). Brill, Sense. https://doi.org/10.1163/9789004429499_010

Zimmerman, B. J. (2002). Becoming a self-regulated learner: An overview. Theory into practice, 41 (2), 64–70. https://doi.org/10.1207/s15430421tip4102_2

Open Access funding enabled and organized by Projekt DEAL.

Author information

Authors and affiliations.

Faculty of Philosophy III, Institute of Pedagogy, Educational Psychology, Martin-Luther-University Halle-Wittenberg, Halle, Germany

Inka Sara Hähnlein & Pablo Pirnay-Dummer

Corresponding author

Correspondence to Inka Sara Hähnlein .

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Hähnlein, I.S., Pirnay-Dummer, P. Promoting pre-service teachers’ knowledge integration from multiple text sources across domains with instructional prompts. Education Tech Research Dev (2024). https://doi.org/10.1007/s11423-024-10363-z

Accepted : 19 February 2024

Published : 10 April 2024

DOI : https://doi.org/10.1007/s11423-024-10363-z


Keywords

  • Knowledge integration
  • Multiple-document comprehension
  • Pre-service teacher education
Chinese researchers develop new luminous smart fiber

SHANGHAI, April 8 (Xinhua) -- A Chinese research team has developed a new type of smart fiber that can emit light and generate electricity without being plugged in.

The fiber integrates functions including wireless energy collection, information perception and transmission, and can be made into textiles that can realize human-computer interaction functions such as luminous display and touch control without chips and batteries.

The study, recently published in the journal Science, is expected to change the way people interact with the environment and with each other, and is of great significance for the application of smart textiles.

Smart wearable devices have become part of daily life and play an important role in health monitoring, telemedicine, human-computer interaction and other fields.

Compared with traditional rigid semiconductor components or flexible thin film devices, electronic textiles made of smart fibers are more breathable and softer.

However, the current development of smart fibers uses complex multi-module integration, which increases the volume, weight and rigidity of textiles.

A research team from Donghua University's College of Materials Science and Engineering accidentally discovered that fibers emitted light in a radio field during an experiment.

Based on the findings, the team developed a new type of smart fiber that uses electromagnetic energy as a wireless driving force.

The new fiber is made from cost-effective raw materials using mature processing technology, said Yang Weifeng, a member of the research team.

It can provide fabric-based display, wireless command transmission, and other functions without chips or batteries.

Clothes made of the new fibers are interactive and luminous, and can wirelessly control electronic devices by generating distinct signals for different user postures, said Hou Chengyi, a researcher at Donghua University. (A minimal sketch of this idea follows below.)
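The article does not describe the signal scheme itself, but the basic idea of mapping posture-specific signal patterns from the fabric to device commands can be sketched in a few lines. The Python snippet below is purely illustrative: the signal features, thresholds, and posture-to-command mapping are hypothetical and are not taken from the Donghua University study.

```python
# Hypothetical sketch only: the actual signal encoding used by the
# Donghua University fibers is not described in the article.
# Feature names and thresholds below are invented for illustration.

POSTURE_TO_COMMAND = {
    "arm_raised": "lights_on",
    "arm_lowered": "lights_off",
    "wrist_twist": "next_track",
}

def classify_posture(amplitude: float, frequency_hz: float) -> str | None:
    """Toy classifier: bucket a received fabric signal into a posture label."""
    if amplitude > 0.8 and frequency_hz < 100:
        return "arm_raised"
    if amplitude < 0.2:
        return "arm_lowered"
    if 100 <= frequency_hz <= 300:
        return "wrist_twist"
    return None  # unrecognized signal

def dispatch(amplitude: float, frequency_hz: float) -> str | None:
    """Map a classified posture to the command a nearby device would receive."""
    posture = classify_posture(amplitude, frequency_hz)
    return POSTURE_TO_COMMAND.get(posture) if posture else None

if __name__ == "__main__":
    print(dispatch(0.9, 50.0))   # -> "lights_on"
    print(dispatch(0.1, 250.0))  # -> "lights_off"
```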

The research team said it will further study how to make the new fiber harvest energy from the surrounding space more effectively, in order to develop more functions, including display, deformation, and computing.

A researcher shows the luminous mechanism of new smart fiber at Donghua University in Shanghai, east China, in March 2024. (Donghua University/Handout via Xinhua)


