
Cloud Computing/Introduction


Cloud computing is the use of computing resources (hardware and software) that are delivered as a service over a network (typically the Internet). The name comes from the cloud-shaped symbol used in system diagrams as an abstraction for the complex infrastructure behind it. Cloud computing entrusts remote services with a user's data, software and computation.

Introduction

The Cloud brings a new tool to the toolbox of today's Chief Information Officer. Computing is delivered as a packaged service rather than as a product. Shared resources, software and information are delivered in a flexible way to the client through the Internet.

Almost like a utility such as water or electricity, Cloud computing provides IT solutions for the business as a black-box service. The service is simply there, somewhere on the global network, without the customer having to worry about the technical and operational aspects of keeping it running.

This Pocket Guide provides a general, high-level introduction to Cloud computing. Its contents have been aligned with the topics for the EXIN Cloud Computing Foundation exam. More information about the EXIN exam can be found at the end of this guide. For more information, go to www.exin.com.

Principles of Cloud computing

The Cloud is defined as a large pool of usable and accessible virtualized resources. It covers a dynamic delivery model for IT services based on Internet protocols. Scalable and typically virtualized resources are provisioned at the application, platform or infrastructure level. Appropriately, these services are also referred to as Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS):

  • A SaaS service provides application functionality that a user can directly use, e.g. Customer Relationship Management (CRM) functionality, e-mail functionality, Office functionality.
  • A PaaS service provides the platform for running a custom application, e.g. a .NET Framework on a server with access to a number of libraries and related services.
  • An IaaS service covers infrastructure such as a Windows or Linux server delivered from the Cloud, including required underlying resources like storage and connectivity.

Various other functionality is available "as a Service" from the Cloud. For example, Communication as a Service (CaaS) is an outsourced enterprise communications solution that can be hired from an external provider. Such a CaaS solution can cover functionality for Voice over IP, Instant Messaging (IM) and videoconferencing. CaaS can also stand for Compliance-as-a-Service. Other examples include Identity-as-a-Service (IDaaS), Monitoring-as-a-Service (MaaS) and Storage-as-a-Service.

Using the Cloud

Using a Web browser

A typical example of using the Cloud is accessing web-based tools or applications through a Web browser, as if the programs were installed locally on the client. They can also be accessed through dedicated desktop or mobile applications. The software and data are stored on a server at a remote location and are accessed and used via the Internet.

Users can access Web Applications through a Web browser. Of course, the browser on the client must be compatible with the application. Programming code runs on the server, independently from the client, and users do not even need to know the location of the application.
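
To make this concrete, the short sketch below retrieves a page from a cloud-hosted Web application over HTTP, just as a browser would. It is a minimal illustration using only the Python standard library; the URL is a hypothetical placeholder, not a real service.

    # Minimal sketch: fetch a page from a cloud-hosted Web application,
    # the same way a browser does (an HTTP GET over the Internet).
    # The URL below is a hypothetical placeholder, not a real service.
    import urllib.request

    url = "https://app.example.com/dashboard"   # hypothetical SaaS application
    with urllib.request.urlopen(url, timeout=10) as response:
        status = response.status                # e.g. 200 when the request succeeds
        html = response.read().decode("utf-8")  # the HTML a browser would render

    print(status, len(html), "bytes received")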

Using Fat Clients or Thin Clients

Different types of devices can be used to access the cloud. Depending on the situation, it is possible to use a Thin Client, a client computer without a hard drive. Multiple users work on the same server machine, accessed through their Thin Client as a console.

A Thin Client environment typically allows the user less freedom to customize and control the software and settings on his or her desktop. On a Virtual Desktop Infrastructure (VDI), each user has his or her own virtual machine on the server, with almost the same control and flexibility as on a local desktop PC.

Increasingly, various mobile devices can also be used. Today, most cell phones have sufficient memory, and applications in the cloud often do not need storage on the client. Thousands of applications are available for mobile platforms. Additionally, specific applications can be tailored for use on the mobile device. Different platforms are available for mobile devices, so standardization of applications is important. Not all applications are available for all mobile platforms.

Especially when working on mobile devices, sufficient attention must be paid to safety. For instance, text messaging while driving a car may be dangerous, and looking at a screen while working in a dangerous environment may distract the user from taking appropriate measures. In some countries, regulations exist, e.g. prohibiting the use of a cell phone while driving.

Benefits and limitations

Benefits

Cloud computing comes with the promise of reduced cost. Because large-scale data centers benefit from the investments made by the Service Provider, they can offer substantial economies of scale.

Using resources like storage in the Cloud requires only investment in the capacity needed. Additional needs do not require budget for new large devices. Resources and services can be activated and deactivated over time in a scalable, flexible and cost-efficient way. The Cloud indeed allows for a pay-as-you-go model (pay-for-use).
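
As an illustration of this pay-for-use idea, the sketch below compares an up-front hardware purchase with hourly cloud pricing for a short-lived workload. All prices and durations are made-up assumptions, not vendor quotes.

    # Illustrative pay-as-you-go comparison (all figures are assumptions).
    HOURS_NEEDED = 6 * 30 * 24        # workload needed for roughly six months
    CLOUD_RATE_PER_HOUR = 0.10        # assumed price of one virtual server, per hour
    UPFRONT_SERVER_COST = 4000.00     # assumed purchase price of a comparable physical server

    cloud_cost = HOURS_NEEDED * CLOUD_RATE_PER_HOUR
    print(f"Cloud (pay-for-use): {cloud_cost:.2f}")
    print(f"Own hardware (up-front): {UPFRONT_SERVER_COST:.2f}")
    # With these assumptions the temporary workload is cheaper in the Cloud,
    # because you stop paying the moment the resources are deactivated.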

For instance, imagine the need for a test and development platform or infrastructure for a new application project for only a limited period of time. Rather than buying new server and storage hardware and database and system software for this project, you can also use the temporary capacity from the Cloud for the time you need it.

Other typical candidates for the Cloud include applications or services where demand for capacity is difficult to predict: demand may be growing or shrinking rapidly, it may vary or burst heavily over time, or it may follow a cyclical pattern.

Also, going live with a new application or service may be possible in hours rather than weeks or months. This may support more flexible planning for new releases and therefore improve Time-to-Market and Time-to-Value.

Limitations

On the other hand, Cloud computing comes with a dependency on Internet connectivity. Sufficient bandwidth is needed. In case of network problems, resources in the Cloud may have lower availability and longer access times. Depending on the application's needs, latency on the line may directly influence application performance. Regulations and compliance requirements may vary from country to country and from situation to situation. Government entities in particular may be subject to additional restrictions on where data is located. This may make it more difficult to rely on a public cloud provider with datacenters in other countries.

Cloud computing comes with inherent needs to control security appropriately. Resources in the Cloud may indeed be more difficult to protect against intrusions and other security risks. Part of the security challenge is linked to running applications over a publicly accessible network. Additionally, leaving control to an external service provider, operating on a large shared environment, comes with a need to put trust in the service provider.

Possibly even more than for traditional service contracts, contractual commitments from the external provider, as well as roles and responsibilities, need to be very clear to all parties involved. Contracts should come with a clear Service Level Agreement (SLA) and sometimes also a penalty mechanism for breaches.

Basic background reading

Historical background

Cloud computing can be situated within an evolution of IT. Early mainframe computers were large accounting machines, serving a single purpose, long before other functions like word processing, designing and gaming were available. The Personal Computer brought IT resources closer to the user. Then, the Client / Server model brought a combination of the use of a Client and Server resources accessed through the network. Today, various IT resources are used on the local client and over the network on a wide range of devices.

Server infrastructures use virtualization and high-performance networks to offer ever increasing flexibility. Over the past decades, networks have become interconnected through the Internet, which evolved from an initiative of the US Department of Defense. Reliability and bandwidth of connections have improved, and costs are decreasing. This has opened new technical possibilities to access applications and other resources over the Internet.

At the same time, organizations have opted to focus on their core business. Some aspects of IT have turned out to be strategic to the business and have been considered key assets for the organization. But over time, other aspects of IT have come to be considered commodity services that can be outsourced or operated with partners. Some organizations decided to rely on Managed Services for the management of their IT services. Such organizations no longer want to be concerned with the technical and operational details of their IT resources and IT infrastructure.

Networking technologies

A network is not in itself a Cloud, but it is an essential element of one. Networks have evolved from early incompatible proprietary networks into today's standards, which include Ethernet, TCP/IP and HTTP.

  • Ethernet is the leading link-layer technology for Local Area Networks (LANs). Early standardization by the Institute of Electrical and Electronics Engineers (IEEE) enabled its success against competing technologies.
  • The Internet protocol suite (TCP/IP) is the set of communications protocols used for the Internet. The Transmission Control Protocol (TCP) and the Internet Protocol (IP) are its core protocols.
  • The Hypertext Transfer Protocol (HTTP) is the basis for data communication on the World Wide Web. It is now an official standard from the Internet Engineering Task Force (IETF) and the World Wide Web Consortium (W3C), documented in so-called Requests for Comments (RFCs). HTTP defines nine request methods (actions on data), including HEAD, GET and POST (see the sketch after this list).
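
As a small illustration of these request methods, the sketch below issues a HEAD and a GET request with Python's standard http.client module; "www.example.com" is only an example host.

    # Minimal sketch of two HTTP request methods (HEAD and GET),
    # using only the Python standard library. "www.example.com" is an example host.
    import http.client

    conn = http.client.HTTPSConnection("www.example.com", timeout=10)

    conn.request("HEAD", "/")          # HEAD: ask for headers only, no body
    head_resp = conn.getresponse()
    print("HEAD:", head_resp.status, head_resp.reason)
    head_resp.read()                   # drain the (empty) body before reusing the connection

    conn.request("GET", "/")           # GET: retrieve the resource itself
    get_resp = conn.getresponse()
    print("GET:", get_resp.status, "body length:", len(get_resp.read()))

    conn.close()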

Networking protocols are commonly mapped on the 7 layers of the Open Systems Interconnection model (OSI model).

Although no end user would typically need to debug the use of these standard protocols, troubleshooting is certainly possible using protocol analyzers (sniffers). Depending on the configuration of the network, such tools may also be used to reconstruct confidential data being transferred. Therefore, their use may be prohibited or considered a security threat.

Formatting standards

When supporting Cloud computing, some understanding of scripting and content-formatting languages and approaches for back-up and recovery can be useful.

The Web server typically covers most of the computing to be done, in server-side programming languages. The client typically receives pages in the Hypertext Markup Language (HTML), along with additional content that can be displayed through browser plug-ins. Such pages can contain scripting that runs on the client, written in JavaScript. JavaScript Object Notation (JSON) is a data format based on a subset of JavaScript, typically used to exchange data. Extensible Markup Language (XML) is a standard of the World Wide Web Consortium for the syntax of markup languages for structured data. XML can be read by a number of programs, including the browser.
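
For instance, the sketch below parses a small JSON document and a small XML document with Python's standard library; the data itself is made up.

    # Minimal sketch: reading the two data formats mentioned above.
    # The JSON and XML snippets are made-up examples.
    import json
    import xml.etree.ElementTree as ET

    json_text = '{"customer": "Acme", "licenses": 25}'
    data = json.loads(json_text)                 # JSON -> Python dict
    print(data["customer"], data["licenses"])

    xml_text = "<order><customer>Acme</customer><licenses>25</licenses></order>"
    root = ET.fromstring(xml_text)               # XML -> element tree
    print(root.find("customer").text, root.find("licenses").text)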

It is useful to have a basic understanding of storage concepts. Network-attached storage (NAS) refers to a storage device that is connected to the network; NAS devices are in fact complete file servers. The Common Internet File System (CIFS), a dialect of Server Message Block (SMB), is a network protocol for shared access to files and other resources.

More generally, standards are important. Applying standards provides uniformity and portability. Standards organizations include the International Organization for Standardization (ISO) and industry organizations. The Open Cloud Consortium (OCC) is a not-for-profit organization supporting the cloud community by operating cloud infrastructure.

Concepts of Virtualization and Management

Virtualization is the creation of a virtual version of something, such as a hardware platform, operating system, a storage device or network resources. Server hardware virtualization involves running several virtual machines (also called guest machines) on the same physical server (also called the host machine). The server runs a hypervisor, which lies beneath the guest operating systems and manages the execution of them.
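
As a hedged illustration, the sketch below asks a hypervisor which guest machines it is running, using the libvirt Python bindings. It assumes the third-party libvirt-python package is installed and a local QEMU/KVM hypervisor is available, which may not match your environment; other hypervisors expose similar management APIs.

    # Sketch: list the virtual machines (guests) known to a hypervisor.
    # Assumes the libvirt-python package and a local QEMU/KVM hypervisor.
    import libvirt

    conn = libvirt.open("qemu:///system")        # connect to the local hypervisor
    for domain in conn.listAllDomains():         # each "domain" is a guest machine
        state, _ = domain.state()
        running = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "not running"
        print(domain.name(), "-", running)
    conn.close()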

Benefits of today's virtualization technologies include better usage of available hardware, more flexibility in using hardware and other resources, high availability by performing both planned and unplanned fail-over between physical machines, and the potential for specific management functionality on the virtualization layer.

The hypervisor itself does not run antivirus software. When one of the virtual machines is affected by a virus, only that virtual machine is infected.

Open Virtualization Format (OVF) is an open standard for packaging and distributing virtual appliances or more generally software to be run in virtual machines.

Relevant standards and organizations for systems management include the following:

  • WBEM: Web-Based Enterprise Management, a set of systems management standards
  • DMTF: the Distributed Management Task Force, the industry organization behind several of these standards
  • SMI-S: Storage Management Initiative Specification, for managing storage systems
  • SMASH: Systems Management Architecture for Server Hardware

Cloud architectures and security

Cloud architectures

Cloud computing comes as a logical evolution from concepts like Service Oriented Architecture, distributed computing and virtualization. Today, Tiered, Multipurpose and Datacenter architectures are used. A typical tiered Web architecture consists of three tiers:

  • The Web front-end is responsible for responding to client requests.
  • The Business logic tier enables multiple back-end application servers.
  • The Database tier ensures load balanced databases.

A load balancer makes sure that the workload is properly distributed across the back-end servers. With a Local Cloud implementation, servers are kept in-house. Servers in a local cloud may be accessed both by users at the location of the datacenter and from outside it. Servers and the clients using them are not necessarily in the same location or even in the same country.
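
To illustrate the role of the load balancer in such a tiered architecture, the sketch below distributes incoming requests over back-end servers in simple round-robin fashion. The server names are placeholders; real load balancers also add health checks, session handling and more.

    # Simple round-robin load-balancing sketch (server names are placeholders).
    # Real load balancers also perform health checks, SSL termination, etc.
    import itertools

    backend_servers = ["app-server-1", "app-server-2", "app-server-3"]
    next_server = itertools.cycle(backend_servers)   # endless round-robin iterator

    def dispatch(request_id: int) -> str:
        """Pick the back-end server that should handle this request."""
        server = next(next_server)
        return f"request {request_id} -> {server}"

    for request_id in range(6):
        print(dispatch(request_id))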

The Private Cloud implementation is to be seen in contrast to Public Cloud and Hybrid Cloud implementations. The term Partner Cloud is sometimes used for a hosted solution that provides a higher level of control than Public Cloud implementations, yet runs off-premises with the economies of scale that come with a (semi-)shared environment.

Security principles

Cloud computing comes with security risks, including concerns over the protection of data. Appropriate mitigating measures need to be taken. There must be appropriate attention to privacy and compliance issues and safeguards; e.g. the Health Insurance Portability and Accountability Act (HIPAA) privacy rule provides protection of health information.

The essential elements of security are Confidentiality, Integrity and Availability. Data should indeed only be accessed by authorized people (confidentiality) and modifications should only come from authorized people and processes (integrity). Backups help ensure availability and integrity of information.

Measures for authorized use include Authentication, Authorization and Accounting (AAA). Authentication is the verification of someone's identity, e.g. with a username and password. Authorization determines whether someone is allowed to perform a given activity. Accounting involves the tracking of resource usage for different purposes.
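
As a minimal illustration of the authentication step, the sketch below verifies a password against a stored salted hash using Python's standard library. It is a teaching sketch under simple assumptions, not a production authentication system.

    # Sketch of password-based authentication: store a salted hash, never the password.
    # This is a teaching example, not a production-ready authentication system.
    import hashlib
    import hmac
    import os

    def hash_password(password: str, salt: bytes) -> bytes:
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

    # Enrolment: the stored record contains only the salt and the derived hash.
    salt = os.urandom(16)
    stored_hash = hash_password("correct horse battery staple", salt)

    # Authentication: derive a hash from the supplied password and compare securely.
    attempt = hash_password("correct horse battery staple", salt)
    print("authenticated" if hmac.compare_digest(stored_hash, attempt) else "rejected")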

One aspect of security management is network security. Well-known types of attacks include Denial-of-Service (DoS), Distributed Denial-of-Service (DDoS) and Man-in-the-Middle attacks.

A Virtual Private Network (VPN) is a secured connection that can be used for remote access to a Local Area Network. Internet Protocol Security (IPsec) is a typical security protocol used for building such a VPN. Pretty Good Privacy (PGP) provides data encryption and decryption for communication.

Identity Management in the Cloud

Identity management in the Cloud comes with the use of Federation and the use of presence data. Federation refers to cloud based identity management enabling single sign-on for multiple systems.

  • Permissive federation: all connections are accepted
  • Verified federation: weak verification of the peer
  • Encrypted federation: encrypted connections with verification of the digital certificate
  • Trusted federation: certificates are issued by a trusted root certificate authority (CA)

Presence data indicate whether a user is available, and may also include personal information provided by the user about his or her mood, location or activity. A standard protocol for presence is the Extensible Messaging and Presence Protocol (XMPP). It can be part of an Instant Messaging and Presence Service (IMPS).

Location information is used to determine the geographic location of the user of an application. Location data cannot be used to identify or track an individual user nor to determine who has accessed a document stored in the cloud. The user's identity may or may not be known to the application, depending on the application.

Security standards like OpenID can be used. Thanks to the OpenID standard, users can create an account once and then use that account to sign on to any website which accepts OpenID authentication. OpenID has become more and more common in recent years, and adoption is still growing.

Public Cloud implementations

There are several Public Cloud implementations; among the most popular are Amazon Web Services (AWS), Google Cloud Platform (GCP), Microsoft Azure, IBM Cloud, DigitalOcean and Alibaba Cloud. Almost all cloud computing providers offer free trials, and most require a credit or debit card to sign up; IBM Cloud does not require a credit card for its free trial.
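
As one concrete example, the sketch below lists the storage buckets in an AWS account using the boto3 SDK. It assumes boto3 is installed and AWS credentials are already configured; the other providers offer comparable SDKs and command-line tools.

    # Sketch: list the S3 storage buckets in an AWS account with the boto3 SDK.
    # Assumes boto3 is installed and AWS credentials are configured
    # (for example via environment variables or ~/.aws/credentials).
    import boto3

    s3 = boto3.client("s3")
    response = s3.list_buckets()
    for bucket in response["Buckets"]:
        print(bucket["Name"], "created", bucket["CreationDate"])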


Evaluating Cloud computing

When evaluating Cloud computing implementations, performance factors, management requirements and satisfaction factors are relevant, as well as the service providers and their services.

For example, when considering the use of Software-as-a-Service, a trial period can help to evaluate whether the selected application is appropriate. Changes in the infrastructure can still be performed after the trial period. Other aspects are more difficult to assess: the bandwidth during the trial period may not be realistic (although bandwidth can be enlarged while using the application), and the trial period will typically be too short to sufficiently evaluate the SLA with the vendor.

The business case for Cloud computing can be built on a comparison between the costs and the possible savings of Cloud computing. Using Cloud computing may require less staff, but it also comes with the need for specialized skills. It may place less stress on IT staff, who are freed from normal daily activities seen in typical data centers, such as making backups.

When considering the Total Cost of Ownership (TCO) of both a potential Cloud service and the alternative, all relevant aspects need to be taken into account. For instance, facilities and environmental aspects, including electricity, play a role. Practice has shown that the Cloud is not an all-or-nothing scenario for the entire IT landscape. Recent adopters have seen the benefits of a mixed (hybrid) environment, based on a thorough understanding of their business and their specific needs.
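
As a rough illustration of such a TCO comparison, the sketch below adds up a few in-house cost components and compares the total with a monthly cloud subscription over three years. Every figure is a made-up assumption; a real business case would include many more factors (migration, staffing, compliance, and so on).

    # Rough TCO comparison sketch over a three-year horizon.
    # Every figure below is a made-up assumption, not a benchmark.
    YEARS = 3

    in_house = {
        "server hardware (one-off)": 12000,
        "electricity and cooling / yr": 1800,
        "datacenter floor space / yr": 1200,
        "administration effort / yr": 6000,
    }
    cloud_subscription_per_month = 450

    in_house_total = in_house["server hardware (one-off)"] + YEARS * (
        in_house["electricity and cooling / yr"]
        + in_house["datacenter floor space / yr"]
        + in_house["administration effort / yr"]
    )
    cloud_total = YEARS * 12 * cloud_subscription_per_month

    print(f"In-house TCO over {YEARS} years: {in_house_total}")
    print(f"Cloud TCO over {YEARS} years:    {cloud_total}")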

Test yourself

Part 1

1. Which functionality is best suited to provide single sign-on on a cloud application using your own Active Directory server?
A. Federation services
B. Virtual Private Network
C. Hypertext Transfer Protocol
D. None of the above

2. Which of the following is most typical of a Local Cloud implementation?
A. It is not shared with other users
B. Servers are kept in-house
C. The application runs as a service
D. The Service Provider ensures end-to-end coverage

3. Which of the following may typically have a negative impact on the business case for Cloud computing?
A. Costs of bandwidth and networking
B. Costs of operational staffing
C. Costs of new hardware investments
D. Technical knowledge of the Service Desk

4. When read access to data by unauthorized people from the Internet is a threat, which aspect of security is at risk?
A. Confidentiality
B. Integrity
C. Availability
D. Authentication

5. What is the hypervisor used for?
A. Linking between applications
B. Strong authentication
C. Privacy regulations
D. Virtualization

Part 2

1. On which part of the OSI reference model would application protocols like HTTP, FTP, SSH, Telnet be mapped?
A. Frame Relay
B. Layer 7
C. Layer 10
D. Core protocols

2. Which of the following is NOT a benefit of virtualization?
A. Improved database performance
B. Fail-over between machines
C. Multiple guest operating systems running on the physical machine
D. Better hardware capacity usage

3. What could be the impact of network downtime when using Cloud computing?
A. Integrity of data can be compromised
B. Application availability may be lower for the user
C. Service provider assumes responsibility for Service Level
D. Fail-over to a local network is required

4. Which of the following is the BEST definition for the Cloud?
A. A large pool of usable and accessible virtualized resources
B. A software package from Google for Office use
C. A series of interconnected Web sites
D. Ethernet based access to virtualized applications

5. What would be a typical use for the XMPP protocol?
A. Instant Messaging and Presence Service
B. Network Analyzer
C. Antivirus functionality for firewalls
D. Systems management of virtual servers

Answer keys

Part 1: 1A - 2B - 3A - 4A - 5D
Part 2: 1B - 2A - 3B - 4A - 5A  

Cloud computing Foundation exam

EXIN offers an examination and certification program about Cloud computing essentials. The exam takes 60 minutes and contains 40 multiple-choice questions. Use of notes, books or other aids is not allowed. In order to pass the exam, you need at least 26 correct answers out of 40 questions (65%). There are no prerequisites for taking this exam. It costs around € 140.00. Attending training is recommended but not required. Most exam locations offer the exam in a computer-based environment. For more information, visit the EXIN Web site: www.exin-exams.com

See also

  • EXIN (ed.), Cloud Computing Foundation - Preparation Guide, EXIN, 2011
  • EXIN (ed.), Cloud Computing Foundation - Sample Questions, EXIN, 2011
  • EXIN (ed.), Cloud Computing Foundation, basic training material, EXIN, 2011
  • Chris Harding, Cloud Computing for Business, The Open Group Guide, Van Haren Publishing, 2011, ISBN 978-90-8753-657-2
  • AWS Cloud Practitioner
  • DevOps
  • w:Cloud_computing
  • Hyper-converged infrastructure (HCI)