Time-sharing

Time-sharing definition

Time-sharing is a computing technique that lets several users or processes share one computer’s resources, such as the CPU and memory, seemingly at the same time. The system switches rapidly between them, so each user or process works as if it had a dedicated computer even though the hardware is shared. Time-sharing lets users run programs, access data, and work on tasks interactively, increasing efficiency, and it also allows remote users to connect to a central computer from different locations (see the scheduling sketch below).

See also: cloud architecture, cloud operating system, vMem
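
The following Python sketch illustrates the core idea behind time-sharing: a round-robin scheduler hands each task a short slice of a single CPU in turn, so every task makes steady progress and appears to have the machine to itself. The task names, slice length, and the round_robin helper are hypothetical, chosen only for illustration.

  from collections import deque

  TIME_SLICE = 2  # units of work a task may run before it is preempted

  def round_robin(tasks):
      # tasks: dict mapping a task name to the total units of work it needs
      queue = deque(tasks.items())
      clock = 0
      while queue:
          name, remaining = queue.popleft()
          ran = min(TIME_SLICE, remaining)
          clock += ran
          print(f"t={clock:>2}: {name} ran {ran} unit(s), {remaining - ran} left")
          if remaining > ran:
              queue.append((name, remaining - ran))  # unfinished: back of the queue

  # Three users' jobs share one CPU; the scheduler interleaves them.
  round_robin({"alice_report": 5, "bob_compile": 3, "carol_query": 4})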

History of time-sharing

  • Pioneering days (1960s). The concept of time-sharing was first proposed by John McCarthy in 1959, but it gained practical significance in the 1960s. One of the earliest systems was the Compatible Time-Sharing System (CTSS), first demonstrated at MIT in 1961. CTSS allowed many users to share a single computer, giving each the impression of working on a dedicated machine.
  • Multics and Unix (late 1960s). Multics was a joint effort by MIT, General Electric, and Bell Labs to create an advanced time-sharing system. Although Multics saw only limited commercial success, it introduced influential concepts such as the hierarchical file system. Meanwhile, Unix, which began at Bell Labs as a simpler system after the lab withdrew from the Multics project, evolved into a powerful operating system known for its multitasking and multi-user capabilities.
  • Expansion and commercialization (1970s). Time-sharing grew, with companies such as IBM and DEC creating their own time-sharing operating systems. IBM’s CP-67 and CMS let many users run programs in separate virtual machines on IBM mainframes. DEC’s TOPS-10 and TOPS-20 provided interactive computing on PDP-10 systems, and DEC’s VAX/VMS became a popular time-sharing system for businesses and research institutions.
  • Microcomputers and networks (1980s). As microcomputers spread in the 1980s, time-sharing expanded over networks: users could connect to central computers from remote locations, and Ethernet and local area networks (LANs) made shared resources even more accessible.
  • The internet and the web (1990s). The internet and the World Wide Web extended time-sharing across global networks: users could access computing resources and data remotely through web-based applications.
  • Cloud era (21st century). Cloud computing emerged, offering scalable, on-demand time-sharing of computing resources over the internet. Services like AWS, Google Cloud, and Azure provide a wide range of cloud-based time-sharing options, from virtual machines to storage and software accessible from anywhere with an internet connection.
