March 5, 2017
Four major colocation and data center markets in Europe have reached a new milestone in power consumption. Frankfurt, London, Amsterdam and Paris together registered a power take-up of 155 megawatts (MW) in 2016, according to commercial real estate advisory firm CBRE.
December 29, 2016
The world of web hosting looked fairly simple for most of the last two decades. Most of the terms and niche hosting services were easy to explain, even to someone with very limited technology awareness. Most IT hosting and computing services used to be delivered from physical, bare-metal servers or from virtualized environments (virtual hosts) that ran on top of a stand-alone physical server. Compute clusters and other networked models of computer systems had been around for a long time, but they were immature as technologies, lacked automation, and most were infrastructure models built for internet use. All of this began to change around 2009.
The Good Old Dedicated Server Hosting
Ten years ago, when a business needed to run a resource-demanding website or computer program, it would buy powerful physical servers. If a project required a huge amount of computing resources, companies used some form of clustering (also known as “grid computing”) of physical servers (or of virtual machines wherever possible). Some enterprise computer systems also used external storage.
Most processing work and storage services have always been delivered from stand-alone physical servers. Therefore, terms like “Dedicated Hosting”, “Dedicated Server”, “Dedicated Hosting Services”, “Dedicated Server Hosting” and other related phrases have always meant a physical server used by a single client. Terms like “Bare-Metal Server” or even “physical server” were not in use, or at least were not popular, in the industry.
Over the course of the massive transformation of the traditional economy, as we had known it for decades, into an economy based on computing technologies (the Digital Economy), and with the emergence of an Internet in which appliances, and even everyday objects, require network connectivity to function and exchange data (the “Internet of Things”), the concept of building and operating IT infrastructures has changed. Computer virtualization technologies have been transformed into Cloud Computing: a networked model of computer systems in which the infrastructure used for processing operations is physically separated from the data storage appliances.
Since 2004, the use of physical dedicated servers for hosting websites or software of any kind has declined sharply. The graph featured below represents the drop in Google searches for Dedicated Hosting since 2004.
While the number of physical servers used for web hosting has increased significantly within the last decade, the number of physical dedicated servers used by companies and individuals has decreased as a percentage of total IT installations. Since 2004, the industry of provisioning dedicated (guaranteed) computing resources has been transformed from “Dedicated Hosting” (Dedicated Servers) to “Virtual Dedicated Servers”.
The Rise of Cloud Computing
As cloud computing software and infrastructure automation platforms have matured since 2004, the use of physical dedicated servers for hosting applications and provisioning IT services has dropped even further. The launch of the major compute infrastructure clouds has completely revamped the IT hosting industry.
Amazon Elastic Compute Cloud (EC2)
Amazon was one of the first enterprise infrastructure providers to launch a compute cloud service available for public use, releasing EC2 in 2006. At the beginning, EC2 was simply a large physical compute infrastructure used for provisioning virtual machines (what would today be called Virtual Dedicated Servers) created with the Xen virtualization technology.
A user was able to create, launch, and terminate server instances as needed, paying by the hour for active servers, which is where the term “Elastic” comes from. EC2 launched as part of Amazon’s broader Amazon Web Services (AWS) portfolio. Both EC2 and AWS as a whole emphasized flexibility: they allowed users to scale their computing instances up and down and offered flexible contractual terms. However, EC2 was not created as a computing infrastructure offering genuine cloud computing services such as high availability, failover or load balancing.
VMware vCloud Air
In 2009, VMware became one of the first major virtualization software providers to create an infrastructure cloud service, later named vCloud Air. vCloud Air is a public cloud computing service built on VMware’s virtualization technology, vSphere. It has been designed as an “Infrastructure-as-a-Service” (IaaS) subscription IT hosting service. It offers “Dedicated Cloud”, “Virtual Private Cloud” and “Cloud Disaster Recovery” products, as well as a pay-as-you-go service named “Virtual Private Cloud OnDemand”.
Microsoft Azure
Another major compute cloud is Microsoft Azure. Microsoft arrived late to the cloud and struggled with cloud computing for a long time. Azure was first announced in October 2008, but it took Microsoft well over a year to launch it. The service started in February 2010 as Windows Azure and was subsequently renamed Microsoft Azure.
It is a cloud computing platform and infrastructure service for building, deploying, and managing applications and services through a network of Microsoft-managed data centers. Azure provides Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) offerings and supports various tools and programming frameworks.
Find out more about how dedicated hosting and web hosting terminology has changed in the article “Dedicated Hosting And Cloud Computing Terms, A Knowledge Boost?”.