Cloud Computing is nothing new; the underlying idea has been around for a very long time. One could argue that the original mainframe machines were early Cloud computers, serving thin-client applications to dumb terminals. But the term and current definition of Cloud Computing are new, having risen out of standardisation, virtualisation and the Internet generation.
The Evolution of Cloud Computing
Before we had standard protocols such as Ethernet, IP or SMTP, the major Original Equipment Manufacturers (OEMs) had their own protocols, hardware and software. They were mainly hardware companies with technical expertise in developing ‘transport’ layers for communicating between devices. It was a nightmare for everyone for one simple reason: no two devices from different OEMs could interconnect. Devices would only talk to other devices from the same manufacturer running the same protocols. The reality was that OEMs were spending R&D budget on developing protocols, hardware and software, so the pace of product development and time to market was rather slow.
Basically, IT was very expensive, very clunky, very specialist and technically very complex.
Standardisation in IT
A major breakthrough came with the adoption of common standards. The Institute of Electrical and Electronics Engineers (IEEE) was the main driving force behind the development and adoption of Ethernet, for example: a much-needed standard that is now the de facto LAN standard.
Additionally, organisations like the ISO (International Organisation for Standardisation) developed interconnect frameworks such as Open Systems Interconnection, better known to many as the OSI 7-layer stack. It essentially breaks down the fundamentals of how technology is built at every level: from physical ‘plumbing’, such as the actual plug you use (RJ45, for example), through the data link layer (such as Ethernet), to the network and transport layers (such as IP and TCP), right up to the presentation and application layers.
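The layering the OSI model describes is something every modern program relies on without noticing. As a minimal sketch (a Python illustration, not part of the original article), the application code below speaks only to the transport layer via a TCP socket over loopback; everything beneath it (IP routing, the data link and the physical plug) is handled for us by the operating system:

```python
import socket
import threading

def echo_server(ready, port_holder):
    # Application layer: a trivial echo service. Nothing here mentions
    # Ethernet, Wi-Fi or cabling; the lower layers are the OS's problem.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", 0))          # network layer: an IP address
        port_holder.append(srv.getsockname()[1])
        srv.listen(1)
        ready.set()
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))   # echo the bytes straight back

ready = threading.Event()
port_holder = []
t = threading.Thread(target=echo_server, args=(ready, port_holder))
t.start()
ready.wait()

# Transport layer: TCP presents a reliable byte stream to the application.
with socket.create_connection(("127.0.0.1", port_holder[0])) as client:
    client.sendall(b"hello")
    reply = client.recv(1024)
t.join()
print(reply)
```

That separation of concerns is exactly what the standards bodies bought the industry: any vendor's application can run over any vendor's network kit.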
A brave new world for the OEMs
These standards were significant for the OEMs, mainly because they no longer had to develop protocols and standards of their own. They could now focus on what they were really trying to build: machines.
But as machines became more powerful through advances in processing power and storage, the OEMs also began to evolve. They soon found that, over a relatively short period, the balance flipped from being roughly 80% hardware company and 20% software to the reverse.
Suddenly the role of hardware was to support and run applications. Big internal battles for power and control were fought in corporate offices and inside the OEMs, between the die-hard electronics engineers and the software programmers. These battles still continue today in organisations that design and manufacture hardware in-house.
The Internet and the CTO
Next came the Internet, along with advances in network protocols such as ATM and, latterly, MPLS. With the Internet came more challenges, like security. The reality was that most organisations didn’t really know what the Internet meant for them, how to use it or why they needed it. E-commerce, e-procurement, ERP, CRM and JIT (Just In Time) production were all slowly being adopted. With clunky user interfaces, poor deployments, heavy customisation and massive expense, these applications initially caused as many challenges as they solved.
But along with the evolution in applications came a realisation and major transformation in the large businesses that adopted them. The realisation was simple for many businesses: they were transforming into IT companies. Take a bank, a supermarket or a car manufacturer; where are they without IT? Handicapped; and if they went without IT for any length of time, they would be out of business.
The lasting result of this evolutionary phase was the birth and fast rise to prominence of the CTO, who quickly became an internal powerhouse, commanding a huge percentage of operational budget.
The CTO’s Agenda
Bottom line: the CTO is responsible for keeping the IT lights green. The buck stops with them.
It is their responsibility to secure and protect the electronic assets of the organisation; to make sure that, in case of a disaster, the organisation can keep running; and to ensure data flows between departments, between stakeholders inside and outside the organisation, and between business processes. And when an organisation goes through a merger or acquisition, it’s the CTO who has to integrate the businesses and all their processes so they can realise the rationalisation and operational benefits.
The CTO is also held accountable by the Board and shareholders for achieving this ideal state.
So how did Cloud Computing come about?
Organisations needed connectivity between offices so their people could have access to the knowledge, systems and processes that now powered the organisation.
But this was no simple task. Commonly, organisations would host the applications at HQ and have all the users access these systems over the corporate WAN; really an internal ‘private’ cloud. But this caused plenty of challenges. Just consider security, for example: suddenly your entire corporate network was only as strong as its weakest element (or individual user)! Additionally, the network required to support this hub-and-spoke design is immense, as it needs to support peak usage.
Planning, designing, securing budget for and implementing network designs and upgrades was a major undertaking and a never-ending cycle. More and more business applications would be added that required more bandwidth. More and more data is being created, shared and stored; 90% of the world’s electronic data was created in the last two years alone. Think about what that means for someone trying to manage a network!
So many CTOs turned to the Cloud to help them overcome some of the challenges they faced. Cloud Computing gives an organisation the ability to ‘host’ a service anywhere and allow users access to it. That could be onsite (inside a corporate organisation) in a ‘private’ cloud, or with a Cloud Service Provider. The provider either supplies the physical elements, such as space, connectivity and power, or delivers a fully managed service which the organisation consumes on a pay-per-user model.
For example, rather than buying an application, a server (or two for resiliency), security, databases and anti-virus, then installing, configuring, testing, deploying, managing, fixing, upgrading, adding, changing, moving and maintaining it all, an organisation can pay a price per user per month to consume a pre-built service.
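To see why that trade can be attractive, a back-of-the-envelope comparison helps. The figures below are purely hypothetical assumptions for illustration (the capex, opex and per-user price are not from the article; real pricing varies widely):

```python
def on_premises_cost(years, capex=50_000, annual_opex=20_000):
    """Buy the kit up front, then pay every year to run, patch and upgrade it.
    capex and annual_opex are illustrative placeholder figures."""
    return capex + annual_opex * years

def cloud_cost(years, users=100, price_per_user_per_month=15):
    """Pay-per-user-per-month: no upfront spend, cost scales with headcount.
    users and price_per_user_per_month are illustrative placeholders."""
    return users * price_per_user_per_month * 12 * years

# Compare total spend over 1, 3 and 5 years under these assumptions.
for years in (1, 3, 5):
    print(years, on_premises_cost(years), cloud_cost(years))
```

Which side wins depends entirely on user count and contract length; the point is that the Cloud model turns a lumpy capital cost into a predictable operating cost.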
This may be a simplified view for some readers, but it’s clear to see why many CTOs push services to the Cloud in one form or another. Cloud is being adopted, it is significant and it will continue to be a major platform of choice for many organisations. The main software vendors continue to invest heavily in Cloud technologies; Microsoft, for example, has been investing around 90% of its $9.6bn annual R&D budget into areas such as Windows Azure, Microsoft Lync and Office 365. These R&D investments show no signs of reducing.
Cloud Computing, coming of age
There are many benefits of Cloud Computing for organisations, and several key drivers which are well understood and have developed over a long period of time. Interestingly, trends such as Bring Your Own Device (a term that emerged with the launch of the tablet) have helped accelerate demand for Cloud services, but have also raised even more concern over data protection and security.
Additionally, the next generation of school and university leavers are progressing through the corporate ranks. They are unfamiliar with the blue screen of death and the Windows sand timer! They expect to be able to work however, wherever and whenever they want.
Technology has been flipped. People used to flex around technology, essentially working for it and putting up with its inflexibility, which was protected by corporate security policies and stubborn IT managers. Now, however, we have access to a limitless array of applications, and if we want, we can even create our own.
This is creating mayhem for IT managers, who shudder at applications like Dropbox. Suddenly employees can get around the 6MB limit on email attachments by dropping the file into Dropbox and sending a link or sharing a folder. That means the very data the IT managers are trying to protect has effectively been uploaded to a completely uncontrolled platform and shared via an external service.
So now the CTO needs to build a clear strategy that incorporates how users work, rather than dictating it to them.
So for reasons of flexibility, security, data protection, agility and cost, many organisations are embracing Cloud Computing. Ironically, they are moving to a modernised version of the mainframe set-up, whereby computers are no longer needed on a user’s desk; instead, they have a dumb terminal (now called a Zero client or a VDI unit). The full desktop and operating system are virtualised and delivered to the user: Desktop as a Service. So your Windows operating system and Microsoft Office suite can now run on a server as a virtual computer just for you, accessible from a Zero client, any smart device or a web browser.
All your business applications and data can be virtualised too, and accessed via your virtual Desktop as a Service. This new ecosystem gives control back to the corporate IT team, but allows users much wider access than before. With the IT design, build, deploy and manage model being disrupted, for the first time in a while IT staff can actually focus on adding value to the organisation, rather than being run ragged trying to keep the lights green and keep up with the pace of growth.
So Cloud Computing offers something of real value to CTOs: a platform on which they can build their business systems in a much more structured way and without some of the day-to-day challenges. Cloud Computing still has a long way to go and many hurdles await, but we are at a point where Cloud services are delivering significant benefits to businesses, and it’s clear that the trend towards the Cloud will continue for some time to come.