This document is a compilation of my notes for COE 538: Microprocessor Systems at TMU. All information comes from my professor’s lectures, as well as the course textbook

Adam Szava - 2tor.ca

F2022

Chapter 1: Introduction to the HCS12 Microcontroller

1.2 Number System Issue

We begin this course by looking at how computers represent numbers. Due to the on-off nature of electricity, computers use binary to represent numbers. The unit used to represent the on-off state is the bit. Computers are meant to interact with humans, for whom binary is not ideal, and so conversion between bases is common. In fact, we often see mixed use of bases in the same program at the microcontroller level.

The following table shows the four main bases we use, each with its designated prefix:

| Base | Name | Prefix |
| --- | --- | --- |
| 2 | Binary | `%` |
| 8 | Octal | `@` |
| 10 | Decimal | (none) |
| 16 | Hexadecimal | `$` |

For example:

$$ 15 = \%1111 = @17 = \$F $$

<aside> 💡 Note that it’s also very common to use $\texttt{0x}$ as a prefix for hexadecimal, as in $\$3F = \texttt{0x3F}$.

</aside>
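As a quick sketch of these equivalences, the snippet below writes the same value using each base. Note that Python's literal prefixes (`0b`, `0o`, `0x`) differ from the HCS12 assembly prefixes (`%`, `@`, `$`) shown above, but the underlying quantity is identical:

```python
# The same number written in each of the four bases.
# HCS12 assembly: %1111 (binary)  @17 (octal)  15 (decimal)  $F (hex)
# Python:         0b1111          0o17         15            0xF
value = 15
assert value == 0b1111 == 0o17 == 0xF

# Converting back to each textual representation:
print(bin(value), oct(value), hex(value))  # -> 0b1111 0o17 0xf
```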

The number of bits used by a computer to represent a number is usually a multiple of $8$; a group of $8$ bits is called a byte. We often measure computation capacity by the number of bits that a computer can operate on in one operation.

1.3 Computer Hardware Organization

In this subsection we look at the different elements of hardware inside of a computer.

The Processor

The processor (CPU) is responsible for performing all computational operations and coordinating the resources available to it. It consists of three major components:

  1. Arithmetic logic unit (ALU)
  2. Control unit
  3. Registers