by Uwe B. Meding 
Cloud-computing is a much-discussed subject in the IT world. Behind the hype is an approach that is reshaping information technology. The details, however, are far from trivial and often fuzzy.

The rise of the IT clouds

Cloud-computing is the continuing development of the PC technology of the 80s, the networking technologies of the 90s, and the Internet technologies of the new millennium. Other contributions are virtualization on one side and Web 2.0 and the Software-as-a-Service approach on the other. Together, these technologies simplified the creation of big data centers that can handle large volumes of transactions.

[Figure: Cloud computing concept]

Cloud-based services are possible through mature virtualization techniques, widely accessible broadband Internet access, and grid-computing solutions. Grid computing has long been used in science and research: the compute capacities of various data centers are connected over the Internet to realize large simulations or calculations. In this way, distributed compute resources can be assigned and organized on demand. Cloud-computing extends this concept to general enterprise applications, for example enterprise resource planning (ERP) or storage.
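
As a rough illustration of this on-demand distribution idea, the sketch below splits one large calculation into chunks and farms them out to a pool of workers. A local process pool stands in for the grid or cloud nodes, and the chunking scheme is purely hypothetical.

    # Illustrative sketch: distribute a large calculation across workers.
    # A local process pool stands in for grid/cloud compute nodes.
    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(chunk):
        """Work unit executed on one 'node': sum of squares of a chunk."""
        return sum(x * x for x in chunk)

    def distributed_sum_of_squares(values, workers=4):
        # Split the input into roughly equal chunks, one per worker.
        size = max(1, len(values) // workers)
        chunks = [values[i:i + size] for i in range(0, len(values), size)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            # Each chunk is processed independently; results are combined locally.
            return sum(pool.map(partial_sum, chunks))

    if __name__ == "__main__":
        print(distributed_sum_of_squares(list(range(1_000_000))))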

Elements of success

Economy of scale is one of the driving factors behind this technology. Companies expect a reduction of IT-related expenses by leveraging outsourced data centers. A pay-per-use model offers cost transparency as well as cost control.
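
To make the pay-per-use argument concrete, here is a minimal cost sketch comparing on-demand hourly billing with a fixed in-house server budget. All prices and usage figures are made-up assumptions, not quotes from any provider.

    # Minimal cost sketch: pay-per-use vs. fixed in-house capacity.
    # All numbers are hypothetical assumptions for illustration only.

    HOURLY_RATE = 0.12          # assumed $ per instance-hour in the cloud
    FIXED_MONTHLY_COST = 900.0  # assumed monthly cost of owning a comparable server

    def cloud_monthly_cost(instance_hours: float) -> float:
        """Pay only for the hours actually used."""
        return instance_hours * HOURLY_RATE

    for hours in (200, 2000, 8000):
        cloud = cloud_monthly_cost(hours)
        print(f"{hours:>5} instance-hours: cloud ${cloud:8.2f} "
              f"vs. fixed ${FIXED_MONTHLY_COST:8.2f}")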

Another big success factor is the unprecedented flexibility cloud-computing offers. For example, the New York Times used a large external provider to digitize around 11 million articles within 24 hours for its on-line archive. The infrastructure required for such a project, both financially and in personnel, is beyond what most companies can build themselves.

Companies can use cloud services to react more quickly to market demands for more storage or more compute resources in times of high Internet traffic. They can also test new applications relatively inexpensively and thereby try out different market opportunities.

The flexible extension and expansion of cloud services puts companies in a position to innovate and shorten development cycles, without an enormous capital investment. This also applies to the IT infrastructure, because users always work with the latest state-of-the-art technologies and profit from every enhancement. Standardized environments contribute to shorter development cycles, less maintenance effort, and simplified usage.

With simple access through the Web, cloud-computing enables telecommuting. Data is accessible independent of location and time, because it is distributed globally across several data centers. Even enterprise security compliance rules can be observed on any device, for example through Web-security cloud services. Employee mobility increases productivity and thereby contributes to growth and innovation.

The counterweight to all these advantages is the level of trust required in the security of the data, as well as technical compatibility.

Cloud security

In a poll of 500 companies by the IT consulting firm Avanade, 65% of the companies expected financial savings from implementing cloud-computing. With respect to data security, however, 72% would rather build their own systems. In many cases this concern seems valid, since the user has no control over how and where the cloud provider stores the data.

Security is a particularly sensitive subject for business-critical applications and needs to be analyzed ahead of time. Gartner analysts recommend that companies perform detailed risk assessments specifically for data security, integrity, backups, and security compliance. For example, companies need to make sure that multiple systems can perform comparable tasks and that all data is stored on redundant systems to provide a highly available solution.

Where individual pieces of information are stored is usually hard to trace. Strict country-specific rules on data security and data handling present one of the biggest hurdles to the general adoption of cloud-computing. Security is critical in the finance and health-care industries: for example, data may not cross country borders, or may only be accessible to a limited number of administrators. As a result, these industries have a high need for consulting to determine how and where to implement cloud services.

Companies that work on contracts for the U.S. government have to follow federal cloud security standards, especially in the defense space. The DOD Cloud Computing Security Requirements Guide (v1) outlines the security requirements that Department of Defense (DOD) mission owners must adhere to when procuring cloud-based services.

Qualified consulting and transparency are key in these situations. More often than not, a cloud system is built using several providers. It needs to be clear where the data is stored, who stores and encrypts it, and, very importantly, what happens when a provider changes.

Securing the private cloud

For increased security, IT cloud providers offer so-called private clouds. These let a customer determine where the data is stored (country and data center). Data is transferred through encrypted channels and stored in encrypted databases. In this way, pieces of the cloud can be assigned directly to a customer, without sharing them with others, and made manageable by specific administrators only.
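
As a rough sketch of the "encrypted in transit and at rest" idea, the snippet below encrypts a record on the customer side before it ever reaches shared cloud storage. It uses the third-party cryptography package's Fernet recipe, and the storage call is a placeholder, not a real provider API.

    # Sketch: encrypt data on the customer side before it reaches cloud storage.
    # Requires the 'cryptography' package; store_blob() is a placeholder, not a real API.
    from cryptography.fernet import Fernet

    def store_blob(name: bytes, ciphertext: bytes) -> None:
        # Placeholder for an upload to the provider's storage service.
        print(f"uploading {len(ciphertext)} encrypted bytes as {name!r}")

    key = Fernet.generate_key()   # kept by the customer, never by the provider
    cipher = Fernet(key)

    record = b"patient-id=4711;diagnosis=..."
    token = cipher.encrypt(record)      # encrypted before leaving the premises
    store_blob(b"records/4711", token)

    # Only the key holder can recover the plaintext.
    assert cipher.decrypt(token) == record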

The disadvantage, of course, is that the more secure a cloud service becomes, the more inflexible it becomes. Dynamic distribution across shared resources is no longer possible, a large part of the economies of scale of a shared data center is eliminated, and with it much of the cost reduction.

In the long run, hybrid solutions will provide this separation. Non-critical applications like e-mail services or collaboration environments will be handled in the cloud, while sensitive data stays in dedicated data centers. This distinction may look trivial, but it places enormous demands on an interoperable infrastructure.

Cloud integration

First and foremost, cloud-computing demands that the same data be available and accessible world-wide. This requires distribution technologies that synchronize information almost latency-free. There are two parts to this, however: the actual distributed storage and fast access to the data. Only a joint solution provides high service quality.
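
The toy sketch below separates the two concerns named here: writes are replicated to several regional copies, and reads go to the replica with the lowest assumed latency. The region names and latency figures are invented for illustration.

    # Toy sketch: replicate writes to all regions, read from the closest one.
    # Region names and latencies are invented for illustration.

    REPLICA_LATENCY_MS = {"eu-central": 12, "us-east": 85, "ap-south": 140}

    class ReplicatedStore:
        def __init__(self, regions):
            self.replicas = {region: {} for region in regions}

        def put(self, key, value):
            # Synchronous fan-out keeps every copy identical (real systems
            # replicate asynchronously and tolerate brief inconsistency).
            for replica in self.replicas.values():
                replica[key] = value

        def get(self, key):
            # Serve the read from the replica with the lowest latency.
            nearest = min(self.replicas, key=REPLICA_LATENCY_MS.get)
            return self.replicas[nearest].get(key)

    store = ReplicatedStore(REPLICA_LATENCY_MS)
    store.put("report-2024", "quarterly figures")
    print(store.get("report-2024"))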

In the short run, technical innovation will solve the fast-access problem; integrating an increasingly heterogeneous IT landscape will be a much more difficult task. Most cloud solutions today are largely proprietary and have no standardized interfaces. The reasons for this are market strategy and the need for providers to differentiate themselves. There is therefore no guarantee of interoperability even between existing cloud implementations. David Linthicum gave a great talk about this in The Death of Traditional Integration.

In the past years, companies have been working toward end-to-end integration to enable improved business processes, higher productivity, and transparency. The customer demand for best-of-breed solutions tends to create island solutions, which in turn need to be integrated. Many operating systems and applications are not cloud-compatible, which makes application integration a difficult proposition. Only a few cases can be realized using standard adapters. Customer-specific solutions can close the gap, but they require a competent system integrator that can not only work with hybrid models but also provide the required transformations and integration of the various systems.
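
A minimal sketch of the adapter idea: each proprietary cloud interface gets a small wrapper that maps it onto one common in-house contract, so applications integrate against the contract rather than against any single provider. Both provider classes and their method names are hypothetical stand-ins.

    # Sketch of the adapter approach: hypothetical provider interfaces are
    # wrapped behind one common in-house contract.
    from abc import ABC, abstractmethod

    class StorageAdapter(ABC):
        """Common contract the in-house applications integrate against."""
        @abstractmethod
        def upload(self, name: str, data: bytes) -> None: ...

    class ProviderAClient:               # stand-in for one proprietary API
        def put_object(self, bucket: str, key: str, body: bytes) -> None:
            print(f"A: {bucket}/{key} ({len(body)} bytes)")

    class ProviderBClient:               # stand-in for a second proprietary API
        def save(self, path: str, payload: bytes) -> None:
            print(f"B: {path} ({len(payload)} bytes)")

    class ProviderAAdapter(StorageAdapter):
        def __init__(self, client: ProviderAClient, bucket: str):
            self.client, self.bucket = client, bucket
        def upload(self, name, data):
            self.client.put_object(self.bucket, name, data)

    class ProviderBAdapter(StorageAdapter):
        def __init__(self, client: ProviderBClient):
            self.client = client
        def upload(self, name, data):
            self.client.save(f"/archive/{name}", data)

    def archive(adapter: StorageAdapter, name: str, data: bytes):
        adapter.upload(name, data)       # the application never sees the provider API

    archive(ProviderAAdapter(ProviderAClient(), "erp-backups"), "jan.dat", b"...")
    archive(ProviderBAdapter(ProviderBClient()), "jan.dat", b"...")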

Approaches for comprehensive services

The spectrum of cloud-computing solutions covers Software-as-a-Service (SaaS) through Infrastructure-as-a-Service (IaaS) to Platform-as-a-Service (PaaS). Ideally, the cloud service provider is independent of the hardware and software providers and can provide the best solution for any application.

  • Software-as-a-Service: At first glance this looks much like the older approach of Application Service Providing (ASP). Here, however, highly standardized applications are accessible through the Web by many users simultaneously, which requires a clean separation of clients and client spaces. This makes SaaS a lower-cost and flexible solution. All users profit from improvements at the same time, and for the provider, application maintenance is inexpensive because updates can be performed cost-effectively in the background.
  • Infrastructure-as-a-Service: This is the continued development of the physically managed server into a virtually managed server. Within minutes, through a portal, companies can use infrastructure services such as compute, storage, or archiving (see the sketch after this list). The federated system guarantees high availability, even at high data volumes or peak access times. This is a very attractive possibility for digitizing documents, for example in the health-care sector for digitally archiving patient data. Another advantage for the user: compute and storage capacity can be adapted flexibly to present needs. This eliminates some of the problems of classical outsourcing, where companies have to share business forecasts with their providers. With the on-demand model, companies can flexibly scale their IT computing needs.
  • Platform-as-a-Service: This approach goes beyond IaaS and provides an operating system and development tools in the “cloud”. Customers can develop new applications with these tools or adapt existing applications so that they can run in the cloud. The customer develops and maintains the application, and the PaaS provider furnishes the platform and possibly the billing processes for the customers and their application. This model is very attractive for small and medium-sized companies that typically do not have the required server and development infrastructure. For example, media companies can offer their customers innovative products around their content. Much of this still needs development: the platforms and tools must evolve to accommodate the hybrid application model, in which applications run in the customer's data centers as well as in the cloud and in turn use the appropriate services of the cloud provider.
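
As a rough illustration of the IaaS provisioning mentioned above, the sketch below requests a virtual server with extra storage through a generic REST-style call. The endpoint, parameters, and response fields are all hypothetical and do not correspond to any real provider's API.

    # Hypothetical IaaS-style provisioning request (illustration only).
    # The endpoint and its parameters do not correspond to any real provider API.
    import json
    import urllib.request

    def provision_server(cpu_cores: int, memory_gb: int, storage_gb: int) -> dict:
        payload = json.dumps({
            "cpu_cores": cpu_cores,
            "memory_gb": memory_gb,
            "storage_gb": storage_gb,
        }).encode("utf-8")
        request = urllib.request.Request(
            "https://iaas.example.com/v1/servers",   # placeholder endpoint
            data=payload,
            headers={"Content-Type": "application/json",
                     "Authorization": "Bearer <api-token>"},
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return json.load(response)               # e.g. server id and IP address

    # A few minutes later the server would be reachable and billing starts per use:
    # print(provision_server(cpu_cores=4, memory_gb=16, storage_gb=500))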

Looking ahead

Cloud-computing has changed the IT market – that much is clear. Many market observers think that classical outsourcing, with its long-term agreements and legal bindings, is a thing of the past. Businesses are becoming more flexible and nimble; today, average contract durations for private clouds are about a month. Nevertheless, not all issues have been addressed: there are data-security aspects, sometimes insufficient quality of service, and the question of user security itself.

Many factors in cloud computing are not directly manageable, so many providers are unable to offer dedicated service-level agreements (SLAs). Providers typically offer price reductions or future credits in case the services are not available. For private users this can be acceptable. If the systems and services are a critical part of a company's business, however, the lost business opportunities will dwarf such credits many times over. Companies must weigh risk and reward before engaging a cloud service provider.
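
A back-of-the-envelope sketch of that risk calculation: compare the provider's typical outage credit with the revenue lost during the same downtime. Every number here is an assumption to be replaced with the company's own figures.

    # Back-of-the-envelope risk sketch: SLA credit vs. lost business during an outage.
    # Every figure is an assumption; substitute the company's own numbers.

    monthly_fee = 5_000.0          # assumed monthly cloud service fee ($)
    credit_rate = 0.10             # assumed credit: 10% of the fee for missing the SLA
    revenue_per_hour = 20_000.0    # assumed revenue that depends on the service ($/h)
    outage_hours = 6.0             # assumed length of the outage

    sla_credit = credit_rate * monthly_fee
    lost_revenue = revenue_per_hour * outage_hours

    print(f"SLA credit received : ${sla_credit:10,.2f}")
    print(f"Business lost       : ${lost_revenue:10,.2f}")
    print(f"Shortfall           : ${lost_revenue - sla_credit:10,.2f}")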
