A microcontroller or MCU (“microcontroller unit”) is a minimal computer implemented on a single integrated circuit. It contains memory, one or more small CPU cores, and programmable I/O. Microcontrollers are used in a wide range of digital electronic devices, such as smartphones, automobiles, and computer peripherals. Their small size and low power requirements make them ideal for embedded systems, IoT appliances, devices on LPWANs (low-power wide-area networks), and SoCs (systems on a chip).

Today, billions of microcontrollers are manufactured and sold each year, making up approximately half of the global processor market.

History

The first microcontroller was created by Texas Instruments engineer Gary Boone in 1971. Boone’s chip, the TMS1802NC, implemented a multi-function calculator on a single integrated circuit (excluding only the keypad and display). The chip contained over 5,000 transistors and 3,000 bits of memory, and was immediately recognized as a major technological breakthrough.

On June 21, 2018, the University of Michigan announced the creation of the “world’s smallest computer,” an ARM Cortex-M0 microcontroller measuring 0.3 mm on each side.

Related terms: Arduino, BeagleBoard, CPU terms, Hardware terms, Raspberry Pi