Time-sharing proved popular through the 1960s and ’70s as businesses such as banks, insurance companies and retailers installed multiple remote terminals that needed to communicate simultaneously with a central computer. With the rise of microcomputing in the early 1980s, time-sharing began to wane. Individual microprocessors had become powerful and inexpensive enough that a single person could have all of the central processing unit’s time dedicated solely to their needs, even if the machine sat idle much of the time.
But the popularity of the internet and cloud computing has brought the concept of time-sharing and its associated technologies full circle. In the time-sharing heyday of the 1970s, IBM released an operating system called VM that permitted admins on its System/370 mainframe systems to run multiple virtual systems, or virtual machines (VMs), on a single physical node. This offshoot of time-sharing, an early form of virtualization, became a catalyst for some of the biggest advances in communications and computing. Today’s cloud computing environments, including IBM Cloud, contain basic functions that grew out of VM. Modern cloud applications let end users send and receive information at a rate that occupies a central server for only a small fraction of each session, so one centralized server can fulfill the needs of many users simultaneously.
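The scheduling idea at the heart of time-sharing can be shown with a minimal sketch. The example below is purely illustrative, not IBM's implementation: it uses a simple round-robin scheduler in which one "central processor" hands out short time slices to each waiting user, and the user names and workloads are hypothetical.

```python
from collections import deque

def time_share(requests, slice_ms=50):
    """Round-robin sketch of time-sharing: `requests` maps a user name to the
    total milliseconds of CPU work they need; `slice_ms` is the time slice."""
    queue = deque(requests.items())          # (user, remaining work) pairs
    log = []
    while queue:
        user, remaining = queue.popleft()
        worked = min(slice_ms, remaining)    # run this user for one slice
        log.append((user, worked))
        remaining -= worked
        if remaining > 0:                    # unfinished work returns to the queue
            queue.append((user, remaining))
    return log

# Three hypothetical terminal users sharing one processor:
for user, worked in time_share({"bank": 120, "insurer": 80, "retailer": 60}):
    print(f"{user:8s} ran for {worked} ms")
```

Because each user's work is broken into short slices, every terminal gets frequent turns on the processor and appears to have the machine to itself, which is the effect the paragraph above describes.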
All of which is to say, a century after Thomas J. Watson Sr. first advanced the concept of time-sharing, his idea is still going strong.