MCU

An MCU (microcontroller unit) is an intelligent semiconductor IC that consists of a processor unit, memory modules, communication interfaces and peripherals. MCUs are used across a broad range of applications, including washing machines, robots, drones, radios and game controllers.

The history of the MCU can be traced back to the invention of MOSFET technology. In its early days, the MCU was a primitive semiconductor IC containing only a processor unit and a memory module. Generally, MCUs are based on the Harvard architecture. Over the decades, manufacturers such as Intel, Motorola, Microchip and Atmel took the innovation further. Most of the MCUs developed by these manufacturers are 8-bit devices with proprietary architectures. The exception is the ARM-based MCU, where the ARM architecture is licensed to the manufacturers. The ARM architecture currently dominates the market for 32-bit MCUs.

While an MCU has a processor unit, it does more than perform arithmetic operations on binary values. The true value of an MCU is its ability to interface with the physical world through its built-in communication interfaces and peripherals.

Technically, an MCU functions by executing the program instructions stored in its non-volatile memory. Early MCUs were ROM-based, so erasing the program data was difficult, if not impossible. As flash technology revolutionized semiconductor manufacturing, MCUs began storing program instructions in built-in flash memory.

Most modern MCUs use a RISC (Reduced Instruction Set Computer) architecture for their fundamental instruction processing. RISC offers a shorter instruction execution cycle than its predecessor, CISC. To develop a program for an MCU, embedded system developers use assembly or the C programming language. The finished program is then loaded onto the MCU with a programming tool.
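
To make the development flow concrete, here is a minimal bare-metal sketch in C of the kind of program an MCU executes from flash: it configures one GPIO pin as an output and toggles it in a loop. The register names and addresses (GPIO_DIR, GPIO_OUT, 0x40020000, 0x40020004) and the LED pin number are hypothetical placeholders; a real program would use the addresses and header files from the specific device's datasheet or vendor SDK.

    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO registers; real addresses come
       from the target MCU's datasheet or a vendor-supplied header. */
    #define GPIO_DIR  (*(volatile uint32_t *)0x40020000u)  /* direction: 1 = output */
    #define GPIO_OUT  (*(volatile uint32_t *)0x40020004u)  /* output pin levels     */
    #define LED_PIN   (1u << 5)                            /* assume an LED on pin 5 */

    /* Crude busy-wait delay; actual timing depends on the CPU clock. */
    static void delay(volatile uint32_t count)
    {
        while (count--) {
            /* spin */
        }
    }

    int main(void)
    {
        GPIO_DIR |= LED_PIN;          /* configure the LED pin as an output */

        for (;;) {                    /* embedded programs typically never return */
            GPIO_OUT ^= LED_PIN;      /* toggle the LED */
            delay(100000u);
        }
    }

In practice, such a program is built with a cross-compiler toolchain for the target architecture and then written into the MCU's flash memory with the manufacturer's programming tool.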

