The cloud seems like a distinctly modern capability, a perception likely reinforced by how recently businesses have embraced this approach to storage and hosting. In reality, the underlying concept dates back to the dawn of modern computing.
The original framework for what is now cloud computing emerged more than 50 years ago. In 1963, the Advanced Research Projects Agency (ARPA, later renamed DARPA), a division of the U.S. Department of Defense, granted MIT roughly $2 million to develop a way for a single computer to be used by multiple people simultaneously. Computers at the time were a far cry from today's MacBooks and ThinkPads, storing data on large reels of magnetic tape. Despite the unusual nature of the request, MIT's computer scientists succeeded in letting several people work with the same machine at the same time. This approach became known as time-sharing, and it laid the groundwork for what would later be called "virtualization."
Virtualization, despite its early origins, remains an important concept in computing, though not quite in its original sense; virtual servers, for instance, are increasingly popular. Anyone who has used a virtual private network to connect to work servers from home, or a tool like GoToMyPC or Remote Desktop Protocol to access a remote desktop, has experienced a current iteration of this concept.
Virtualization also pre-dated the internet, whose groundwork was formally laid in 1969 with ARPANET, the Advanced Research Projects Agency Network. ARPANET grew out of the vision of J.C.R. Licklider, who imagined what he called the Intergalactic Computer Network, a concept that in practice evolved into today's internet. And while cloud computing and the internet are not one and the same, the internet is a key element of cloud computing; without it, cloud computing as it exists today would be impossible.
The idea of a web-supported server first began to make waves in the 1990s as the internet entered the public eye. In this original context, the "cloud" referred to the space between a service provider and an end user, often drawn as a literal cloud in network diagrams, and to the potential that space held for delivering software and applications. Salesforce was among the first to bring modern cloud computing into reality in 1999, delivering its products to users over the internet rather than on physical media like floppy disks or CD-ROMs.
Amazon is another key player in the debut of cloud computing. In 2006, it launched Amazon Web Services (AWS), a still-popular hosting and storage option for those who do not want or need servers of their own. Google Docs, Google's web-based answer to Microsoft Office and the forerunner of Google Drive, launched in 2006 as well. Google partnered with IBM on cloud computing the following year, and OpenNebula, an open-source platform for managing both hybrid and private clouds, released its first version in 2008.
Over the last decade, cloud-based offerings have grown exponentially, with countless companies providing software, hosting, and infrastructure services designed to streamline workflows and manage business costs in ways traditional on-premises servers cannot.