The 122nd Installment
The vague boundary between hardware and software
by Syuichi Oikawa, Professor
Generally, it is said that software is not physical and is therefore easy to change, while hardware is physical and therefore cannot be changed, or is very difficult to change, once it has been created. This is the sort of thing said in software engineering lectures, and I know it is not quite that simple, but I will leave that point aside so as not to get sidetracked.
Here, I think it is also generally recognized that the computer itself is the hardware and the applications that the computer runs are the software. Whether desktops or laptops, computers are visible and operated by hand, so it is natural to perceive them as physical objects.
But what about so-called servers? A long time ago (perhaps even longer ago than that), they were called large-scale computers, and universities had facilities called computer centers, where you could catch a glimpse, through a small window, of the large boxes in a machine room droning with cooling fans and air conditioning. Even though they could only be used through a terminal in another room and there was no opportunity to touch them directly, the hardware itself was undoubtedly physical.
Currently, cloud computing is the mainstream form of server computing. As the name suggests, much like a cloud, the physical location of the data centers is often undisclosed and cannot be seen. In addition, cloud computing incorporates various virtualization technologies. Virtualization technology has long been used in computers. Computers also use a technology called virtual memory, which borrows storage such as an HDD or SSD to make it appear that there is more memory than is physically installed. Processors (or CPUs) can likewise be shared by multiple users by dividing their time, but this by itself is not called virtualization. Processor virtualization is a technology that makes a single computer appear to be multiple independent computers, each with its own memory. On each of these virtual computers an operating system runs, and applications run on top of it.
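To make the virtual memory part of this concrete, here is a minimal sketch of my own (assuming a Linux system with mmap; none of it comes from the column itself) in which a program reserves far more address space than a typical machine has as physical RAM. The mapping succeeds because pages are backed by physical memory, or by swap on an HDD or SSD, only when they are actually touched.

/* Minimal sketch of virtual memory: ask the OS for far more address
 * space than most machines have as physical RAM. Pages are backed by
 * RAM (or swap on HDD/SSD) only when touched. Assumes Linux, where
 * MAP_ANONYMOUS and MAP_NORESERVE are available. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>

int main(void)
{
    size_t size = (size_t)64 << 30;   /* 64 GiB of virtual address space */

    char *p = mmap(NULL, size, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANONYMOUS | MAP_NORESERVE, -1, 0);
    if (p == MAP_FAILED) {
        perror("mmap");
        return EXIT_FAILURE;
    }

    /* Touch only one page per GiB; only these pages consume real memory. */
    for (size_t off = 0; off < size; off += (size_t)1 << 30)
        p[off] = 1;

    printf("Reserved %zu GiB of virtual address space; touched one page per GiB.\n",
           size >> 30);
    munmap(p, size);
    return 0;
}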
Computer virtualization has a surprisingly long history. The first case of computer virtualization is said to be the IBM CP-40, which was developed in 1967 to virtualize the IBM System/360. At that time, operating systems were being developed to meet the demand for multiple users to use a computer simultaneously, but their development was taking a long time, and some say that virtualization technology was developed as an alternative. Virtualization does need hardware support, but it is basically realized through software. In other words, the interface that hardware normally presents is provided by software: when software runs in a virtual environment, the parts that would normally be executed by hardware are instead executed, that is, emulated, by software.
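As a rough illustration of that last point, the following toy loop is purely a conceptual sketch of my own, not how CP-40 or any real hypervisor is built: harmless "guest" instructions run directly, while privileged ones are diverted to a software routine that emulates their effect on a virtual machine's state.

/* Toy trap-and-emulate sketch: ordinary instructions run "directly",
 * while a privileged one (OP_OUT, standing in for an I/O instruction)
 * traps to software that emulates its effect on virtual machine state. */
#include <stdio.h>

enum op { OP_ADD, OP_OUT, OP_HALT };

struct insn { enum op op; int operand; };

struct vm {
    int acc;            /* accumulator register of the virtual CPU */
    int out_port;       /* state of a virtual output device */
};

/* The "trap handler": privileged work is emulated in software. */
static void emulate_privileged(struct vm *vm, const struct insn *i)
{
    if (i->op == OP_OUT) {
        vm->out_port = vm->acc;
        printf("emulated OUT: virtual device now holds %d\n", vm->out_port);
    }
}

int main(void)
{
    struct insn program[] = {
        { OP_ADD, 40 }, { OP_ADD, 2 }, { OP_OUT, 0 }, { OP_HALT, 0 },
    };
    struct vm vm = { 0, 0 };

    for (const struct insn *i = program; i->op != OP_HALT; i++) {
        if (i->op == OP_ADD)
            vm.acc += i->operand;       /* harmless: execute directly */
        else
            emulate_privileged(&vm, i); /* privileged: trap and emulate */
    }
    return 0;
}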
In the System/360, besides virtualization, this kind of software execution, or emulation, of functions normally performed by hardware was also used for ordinary machine-language instructions. The System/360 was known for offering a range of models, from small and inexpensive to high-performance and expensive, all supporting the same instruction set. Even though the instruction set was the same, the more expensive models achieved high performance by implementing most of it directly in hardware, while the cheaper models reduced the amount of hardware, and thus the price, by emulating many instructions in microcode.
Software-based emulation of hardware is still used on computers today. Many modern computers have a function that makes a USB keyboard look like a PS/2 keyboard, the older interface, and this is achieved by the BIOS, the computer's firmware, emulating the legacy interface using a feature of Intel processors called System Management Mode. It seems that microcode is still alive and well, too: when new instructions were added (for example, to support virtualization), they were first implemented in microcode and then gradually replaced by silicon implementations, starting with the most frequently used instructions.
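The shape of that keyboard trick can be sketched as follows. The legacy i8042 port numbers (data 0x60, status 0x64) are real, but everything else here is a made-up user-space stand-in: an actual BIOS does this inside System Management Mode, servicing accesses to the legacy ports from USB HID data.

/* Conceptual sketch only: a fake "trap handler" answers reads of the
 * legacy keyboard-controller ports from a pretend USB scancode queue,
 * so legacy software sees what looks like a PS/2 keyboard. */
#include <stdint.h>
#include <stdio.h>

#define KBD_DATA_PORT   0x60
#define KBD_STATUS_PORT 0x64
#define STATUS_OBF      0x01            /* "output buffer full" bit */

/* Stand-in for the USB side: scancodes supposedly taken from HID reports. */
static uint8_t usb_queue[] = { 0x1e, 0x30, 0x2e };   /* make codes for 'a', 'b', 'c' */
static unsigned usb_head;

static int usb_has_key(void) { return usb_head < sizeof usb_queue; }
static uint8_t usb_next_key(void) { return usb_queue[usb_head++]; }

/* Stand-in for the firmware trap handler: a read of a legacy port is
 * answered from USB data instead of a real PS/2 controller. */
static uint8_t emulated_port_read(uint16_t port)
{
    switch (port) {
    case KBD_STATUS_PORT:
        return usb_has_key() ? STATUS_OBF : 0;
    case KBD_DATA_PORT:
        return usb_has_key() ? usb_next_key() : 0;
    default:
        return 0xff;
    }
}

int main(void)
{
    /* Legacy software polls the old interface and never notices
     * that the "PS/2 keyboard" is really USB underneath. */
    while (emulated_port_read(KBD_STATUS_PORT) & STATUS_OBF)
        printf("scancode 0x%02x\n", emulated_port_read(KBD_DATA_PORT));
    return 0;
}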
In this way, the boundary between hardware and software has never been entirely clear, from the past right up to the present. There will certainly always be parts that cannot be changed, or are very difficult to change, once they are created, but the boundary itself is quite vague.