
Cloud Provider Transparency: An Empirical Evaluation

Wayne A. Pauley, EMC

Cloud computing promises many enterprise benefits. The author's study aims to help businesses assess the transparency of a cloud provider's security, privacy, auditability, and service-level agreements via self-service Web portals and publications.
External IT services have been in use for several decades now, evolving from time-sharing services to application service providers to the current cloud computing phenomenon [1]. The US National Institute of Standards and Technology (NIST) has developed a good working definition of cloud computing that breaks it into three service models: software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS) [2]. (For a detailed explanation, see the "Cloud Computing Terminology" sidebar.) Cloud computing promises a ubiquitous platform that can automatically scale up, down, or out on demand. It also promises to be self-service and highly automated, allowing an enterprise to get started with nothing more than a browser and a credit card.

An important challenge for IT comes from lines of business (LOBs) that are unsatisfied with how long IT takes to respond to new application requests. Several decades ago, the mainframe environment had an acceptable response time of 12 to 18 months for a new application request. Highly virtualized datacenters can now procure and provision an application environment in four to six weeks. The challenge facing IT occurs when the business manager responds to a four- to six-week answer from IT by producing a credit card and getting something running on Amazon Web Services (AWS) in a matter of hours. IT must be able to respond to that kind of dynamic demand internally from the LOB or find ways to insert itself into the process of assessing and validating publicly available cloud services.

Whether a corporate IT department wants to let its company's crown jewels reside in a public cloud is certainly a question each organization must answer for itself. For this study's purposes, let's assume that IT is being driven to the cloud because of potential economic and time-to-market benefits. IT will need a new assessment process to proactively evaluate the cloud along four key dimensions: security, privacy, auditability, and service levels. Openly publishing the results of this type of assessment gives IT what it needs to transparently evaluate the environment's risk.

The Study's Purpose

This study has two aims:

• to create a scorecard for evaluating a cloud's transparency via the cloud provider's self-service portals and published Web content, and

• to empirically evaluate a small population of cloud providers to test the scorecard and assess the population's transparency.

Kim Wüllenweber and Tim Weitzel built on the theories of perceived risk and reasoned action to empirically show that standardization reduces the perception of risk in outsourced services (what I will call transparency) [3]. In this study, I evaluated cloud providers' transparency on the basis of their use of standards, best practices, policies, procedures, and contractual service-level guarantees available on their cloud services portals. The study also looked at publicly available information about past problems the providers might have had related to breaches and downtime.

To perform this study, I developed the Cloud Provider Transparency Scorecard (CPTS), an instrument to assess and score the information that I collected from published Web sources by or about cloud providers. Each of the four domains I considered included a series of questions based on key areas outlined by the Cloud Security Alliance (CSA) [4], NIST [2], and the European Network and Information Security Agency (ENISA) [5]. Each question equated to a "0 = no, 1 = yes" value; I totaled each domain and gave an overall score based on the total of all scores. I then divided each domain score by that domain's maximum possible score, and the overall score by the total possible score, to derive simple percentage equivalents.
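To make the scoring arithmetic concrete, here's a minimal sketch in Python (my own illustration, not the author's tooling; the domain sizes follow the scorecard described here, the sample answers are hypothetical, and the SLA domain's weighted uptime question is omitted for simplicity):

```python
# Minimal sketch of the CPTS scoring arithmetic (illustrative only).
# Each domain maps to a list of answers scored 0 = no, 1 = yes.
answers = {
    "security": [1, 1, 1, 0, 1, 0, 0, 0, 0, 0],  # 10 questions
    "privacy":  [1, 1, 1, 1, 0, 0],              # 6 questions
    "audit":    [1, 0, 1, 1],                    # 4 questions
}

def domain_score(values):
    """Return a domain's total and its fraction of the possible score."""
    total = sum(values)
    return total, total / len(values)

for name, values in answers.items():
    total, fraction = domain_score(values)
    print(f"{name}: {total}/{len(values)} = {fraction:.2f}")

overall = sum(sum(v) for v in answers.values())
possible = sum(len(v) for v in answers.values())
print(f"overall: {overall}/{possible} = {overall / possible:.2f}")
```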

What Makes a Cloud Provider Transparent?

Researchers have addressed trust in e-commerce extensively, showing that it can positively affect e-commerce usage by reducing concern, which in turn improves disclosure, reduces the demand for legislation, and reduces the perceived risk [6]. A business engaging a self-service cloud provider is consuming an e-commerce-based service that provides infrastructure services instead of traditional goods such as books or music. Privacy statements, security policies and assessments [5], and availability guarantees are effective for evaluating trust in e-commerce service providers. For the purpose of this research, I extended the definition of an e-commerce service provider to include cloud providers as a new type of e-commerce.

Preassessment

One approach to assessing the cloud would be to use a third-party security firm with experience in cloud applications. Another would be to use internal resources and leverage recently published assessment methods from the CSA or ENISA. Both methods provide steps for security and privacy assessment and detail focus areas for audit and governance, specifically for cloud infrastructures. The challenge with these existing methods is that they are based on extensive surveys requiring the cloud provider's staff involvement, whereas cloud providers rely on a self-service model for customer engagement. The low-touch self-service model economically benefits both the cloud provider, which can reduce service costs, and the customer, who is charged less and can directly procure and provision resources.

An alternative approach that matches the cloud provider engagement model is for providers to make all the information required for assessing their clouds publicly available via their Web portals. To preassess cloud providers, prospective customers search the Web for news articles on issues, breaches, and outages (for example, Privacy Rights Clearinghouse keeps a chronology of reported breach data [7]) and check whether the cloud provider tracks and reports outage data on its website. Another step should include inspecting the type of customers using the cloud provider to validate whether its customers have similar applications, scale, and customer base. One way to accomplish this would be to directly contact the cloud provider's customers to see what their experiences have been.

Cloud Computing Terminology

The US National Institute of Standards and Technology defines the cloud as including five essential characteristics.

On-demand self-service is the consumer's ability to procure and provision cloud services, such as storage or compute services, via a portal mechanism without the cloud service provider's assistance.

Broad network access is the ability to connect to cloud services anywhere, with any form of client, such as a mobile phone, laptop, smart phone, or any Web-enabled device. Depending on the type of information, where it physically resides can have regulatory ramifications; for example, personally identifiable information and personal health records are regulated in the US.

Resource pooling, or multi-tenancy, is when the provider's resources are pooled and dynamically allocated based on application demand. Each physical machine could have multiple tenants (business users) on it, or, if the cloud provider offers it and the customer is willing to pay for it, a physical server could run only one tenant's virtual machines.

Rapid elasticity is the ability to scale up, down, or out automatically as workload requirements change. This characteristic lets the customer pay for resources as needed and allows specific demands to be met with seemingly unlimited resources. For example, if a business experiences peak workloads at the end of the month, the cloud will support the demand transparently to the business. Another example would be to use the cloud for scale testing.

Measured service, or fairly fine-grained metering, becomes necessary with an on-demand, auto-scaling service and a pay-as-you-go financial model. The metering must include monitoring, controlling (for example, setting maximums), and reporting.

A service-level agreement (SLA) between a cloud service provider and a business details the expectations for both parties. One example is service availability and the penalties for service loss; another is response time. In the case of Amazon Web Services (AWS), Simple Storage Service (S3) provides an SLA of 99.9 percent availability, which translates to about 8.8 hours of downtime a year. The buyer must be aware that SLAs can vary within a cloud provider. Staying with AWS, Elastic Compute Cloud (EC2) guarantees 99.95 percent uptime, which translates into about 4.4 hours of downtime a year.

One last concept that's important when evaluating the aggregation of services within a cloud provider is the impact of transitivity when SLAs are aggregated. Using the previous AWS examples, where S3 has a 99.9 percent SLA and EC2 has a 99.95 percent SLA, an application that uses both services together is guaranteed no more than the lowest of the two, 99.9 percent (and if the services fail independently, its expected availability is their product, roughly 99.85 percent).
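The downtime arithmetic behind these SLA figures is easy to reproduce. Here's a minimal sketch (the function name is mine, and the figures assume a 365-day year):

```python
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

def downtime_hours_per_year(sla_percent):
    """Annual downtime budget allowed by an availability SLA."""
    return HOURS_PER_YEAR * (1 - sla_percent / 100)

print(downtime_hours_per_year(99.9))   # ~8.76 hours (S3's 99.9 percent SLA)
print(downtime_hours_per_year(99.95))  # ~4.38 hours (EC2's 99.95 percent SLA)
```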


In addition, does the cloud provider participate in cloud standards bodies such as CloudAudit [8], the Open Cloud Computing Interface [9], CSA, and ENISA? Participating in cloud standards activities is one way the cloud provider can demonstrate that it is interested in improving trust and interoperability in the cloud. The basic business assessment also includes such questions as

• “What service models do you offer (IaaS, PaaS, and/or SaaS)?”

• "Are you public or private?"
• "Are you profitable?"

These are samples of the types of questions that prospective customers should ask during the preassessment phase to determine whether the cloud provider could be included in a full assessment and whether it's a good business fit.

As a final preassessment step, evaluate the cloud provider as a business entity. How long has it been in business? According to the US Small Business Administration, approximately 50 percent of businesses fail in the first five years [10]. Has the cloud provider had any financial difficulties? What happens if it's acquired or shuts down its cloud offering? Does it provide services in all the locations or countries needed?

The Detailed Assessment

After preassessing the cloud provider, the next step is to perform a more detailed assessment using the CPTS as one of the tools.

Security

To perform a detailed assessment, use a browser to visit each cloud site and collect and log the various security, privacy, and service-level policies and procedures. Is all the information located in one place and easy to access? Are the policies and procedures published? Does the provider offer an email address for additional questions? Does it offer professional services such as security assessments of customer environments?

What kind of security controls does the cloud provider have in place? If it publishes its security policy and procedures, does it also perform standardized assessments? Several cloud providers perform security assessments such as COBIT [15], ISO 27000 [16], or NIST SP800-53 [17] on their environments. Is the cloud provider a member of, or does it contribute to, ENISA or CSA? Does it use the ENISA or CSA recommendations for governance?

What kind of security education and certifications does the staff hold? Are their certifications published? For example, although AWS doesn't share that information, other cloud service providers such as Terremark, SAVVIS, and Rackspace list their employees' certifications on their websites and offer specific details to paying customers. Are the employees subject to background checks? Cloud providers often provide this information; for example, AWS publishes most of it on its website and in its security white paper.

Privacy

Does the cloud provider have a privacy portal? Does it publish its privacy policy? Does it manage its privacy policy over time? Does the privacy policy apply to all of the cloud provider's services, or are there separate ones for separate services? If the cloud provider bundles other providers' services within its own service, does it have a bilateral agreement holding the other providers to the same standard? Does the cloud provider offer a special email address or forum for privacy questions or issues? Does it offer professional services specific to privacy, such as working with customers on Health Insurance Portability and Accountability Act (HIPAA) compliance?

Audit

If a customer has requirements for financial, healthcare, or personally identifiable information, the customer should review the cloud provider's site for third-party audit mechanisms. For example, does the cloud provider comply with the Statements on Auditing Standards (SAS) No. 70 Type II [11], the Payment Card Industry Data Security Standard [12], HIPAA [13], or Sarbanes-Oxley [14]? Several cloud providers, such as AWS [18], publish the fact that they perform SAS 70 audits but don't publish the control groups that they've audited.

Service Levels

What service-level agreements (SLAs) does the cloud provider guarantee? Do they apply to all the cloud provider's services? For example, Amazon Elastic Compute Cloud (EC2) carries a 99.95 percent uptime guarantee, but Amazon Simple Queue Service (SQS) carries no SLA guarantee at all. If you combine SQS with EC2, the net SLA is 0 percent. Does the cloud provider use a service-level management process such as the Information Technology Infrastructure Library [19]?
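The net-SLA point is worth encoding. Here's a hedged sketch of the composition rule implied above (the function name and representation are mine): an application that depends on every service in a set is guaranteed no more than the weakest SLA, and a dependency with no SLA at all leaves the application with no guarantee.

```python
def composite_sla(slas):
    """Effective guarantee for an app that needs every service in `slas`.

    Each entry is an availability guarantee in percent, or None when the
    service carries no SLA (as with SQS above). One unguaranteed link
    leaves the whole application unguaranteed.
    """
    if any(s is None for s in slas):
        return 0.0
    return min(slas)

print(composite_sla([99.95, None]))  # EC2 + SQS (no SLA) -> 0.0
print(composite_sla([99.95, 99.9]))  # EC2 + S3 -> guaranteed floor of 99.9
```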

Next Steps Postassessment

Once the customer has gathered this data, the next step is to contrast the cloud provider's standards against corporate policies and the requirements of the application being provisioned on the cloud. Evaluate the cloud provider's policies and practices against internal ones to see whether differences exist in the security and privacy policies. Does the cloud provider meet or exceed the security and privacy policy levels used internally? Does it provide enough information via its self-service model to determine that?

Results of the Preassessment

For this study, I chose a relatively small population of six cloud providers (see Table 1). The offerings and structure vary among providers. NIST defines four cloud deployment models: private, public, community, and hybrid clouds. Private clouds operate specifically for one organization, while public clouds are available to the general public. Community clouds support a specific community, such as an academic or government function. A hybrid cloud is a federation of several clouds composed of either the same deployment model or different models. The study included only public cloud providers that prospective customers could access from the Internet and that offered their services via a self-service method. For simplicity, I anonymize the six cloud providers (Amazon, Google, Microsoft, IBM, Terremark, and Savvis) by referring to their results as coming from CP1 through CP6.

Within the public cloud provider category are different classes of providers. From the providers chosen, I selected Amazon and Google as representative of Web-based companies that repurpose and extend existing infrastructure and software to support cloud services. Microsoft and IBM provide various managed and application services that they've extended as cloud services. Terremark and SAVVIS provide various managed services to commercial customers and have recently created cloud computing offerings targeting IaaS and leveraging virtualization technology.

In the preassessment (Figure 1), I found that almost all providers had published outages, along with the fault that caused the outage and the corrective action. Researching breaches in the Datalossdb database showed no breaches tied to any of the cloud providers studied. CP2 did show up in the database owing to the loss of a laptop containing CP2 employee data. Breaches that affect a cloud provider's customer data wouldn't necessarily end up in Datalossdb unless regulatory rules required the cloud provider to inform those harmed. Given cloud providers' public profile and the nature of their services, problems have a higher probability of being divulged publicly, and as one cloud provider posted, full disclosure and transparency is a best practice. Microsoft's loss of Sidekick data in 2009 was highly publicized and analyzed by the cloud provider technical community [20]. (Cloud providers aren't compelled or regulated to share breach information as long as data protected by regulations haven't been affected.) I also found that all providers belonged to at least one cloud standards group, showing common interest in interoperability and governance standards.

Figure 1 has a mixed scoring method designed to create a maximum score of 7 (the best possible score). Several of the questions are negative, making a "yes" answer a negative response and thereby scoring "0" for that question. All the cloud providers I evaluated scored better than 70 percent, which I considered adequate for inclusion in the CPTS assessment.

Figure 1. The Cloud Provider Transparency Scorecard preassessment. I used the scorecard to examine a variety of cloud computing providers, assessing business factors such as years in business and security or privacy breaches to create a total preassessment transparency score.

Preassessment (business factors) | CP1 | CP2 | CP3 | CP4 | CP5 | CP6 | Scoring
Length in years in business | 16 | 12 | 31 | 114 | 28 | 15 | Total years
1. Length in years in business > 5? | 1 | 1 | 1 | 1 | 1 | 1 | 0 = no, 1 = yes
2. Published security or privacy breaches? | 1 | 1 | 1 | 1 | 1 | 1 | 0 = yes, 1 = no
3. Published outages? | 0 | 0 | 0 | 0 | 1 | 0 | 0 = yes, 1 = no
4. Published data loss? | 1 | 0 | 0 | 1 | 1 | 1 | 0 = yes, 1 = no
5. Similar customers? | 1 | 1 | 1 | 1 | 1 | 1 | 0 = no, 1 = yes
6. Member of ENISA, CSA, CloudAudit, OCCI, or other cloud standards groups? | 1 | 1 | 1 | 1 | 1 | 1 | 0 = no, 1 = yes
7. Profitable or public? | 1 | 1 | 1 | 1 | 1 | 1 | 0 = no, 1 = yes
Preassessment total score | 6 | 5 | 5 | 6 | 7 | 6 | Total
Percentage score | 0.86 | 0.71 | 0.71 | 0.86 | 1.00 | 0.86 | Score/7
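Because the mixed polarity is easy to get wrong, here's a sketch of the preassessment scoring (my own illustration; question labels are abbreviated, the polarity flags follow Figure 1's key, and the example answers are hypothetical):

```python
# Preassessment questions with their polarity, per Figure 1's key.
# True means "yes" is the favorable answer; False means "no" is.
QUESTIONS = [
    ("in business more than 5 years", True),
    ("published breaches", False),
    ("published outages", False),
    ("published data loss", False),
    ("similar customers", True),
    ("cloud standards group member", True),
    ("profitable or public", True),
]

def preassessment_score(yes_answers):
    """Score a provider given the set of questions it answers 'yes' to."""
    return sum(
        1 if (label in yes_answers) == yes_is_good else 0
        for label, yes_is_good in QUESTIONS
    )

# Example: a provider with published outages but otherwise favorable answers.
score = preassessment_score({
    "in business more than 5 years", "published outages",
    "similar customers", "cloud standards group member",
    "profitable or public",
})
print(score, round(score / len(QUESTIONS), 2))  # 6 0.86
```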

Table 1. Cloud provider overview.

Provider/offerings | Service model | Sample customers | Comments
Google App Engine (GAE) | Platform as a service (PaaS) | Best Buy, Ubisoft, Flickr | Appeals to startups, small-to-medium-sized businesses (SMBs), enterprise businesses, and students and schools as an integrated development environment
Amazon Web Services (AWS) | Infrastructure as a service (IaaS) | Autodesk, Qualcomm, Second Life, Washington Post, Harvard Medical School | Appeals to startups, SMBs, and enterprise businesses as an operational-expense option for infrastructure, with price tiering based on scale and options
Microsoft Windows Azure, Microsoft SQL Azure, and Windows Azure platform AppFabric | IaaS and PaaS | 3M, Verisign, Associated Press, Kelley Blue Book, Accenture, Siemens | Appeals to .NET developers and all businesses; provides a way to bridge Microsoft datacenter apps with the cloud
IBM Computing on Demand, IBM Smart Business, IBM Smart Analytics, and so on | IaaS, PaaS, and software as a service (SaaS) | US Air Force, SK Telecom | Provides full services for all company sizes with price tiering for scale
Terremark Enterprise Cloud and vCloud Express | IaaS | USA.gov, Agora Games, Engine Yard | Infrastructure services for all company sizes
Savvis Cloud Compute, Savvis Dedicated Cloud, and Savvis Open Cloud Compute | IaaS | Hallmark, Easyjet, Universal Music Group, Wall Street | Infrastructure services for all company sizes


Assessment Results

I recorded, broke down, and summarized the assessment's qualitative results by the domains of security, privacy, audit, and SLA, as Table 2 depicts.

Table 2. Cloud Provider Transparency Scorecard analysis. Each cell shows the domain score, with the fraction of the maximum possible score in parentheses.

Domain | CP1 | CP2 | CP3 | CP4 | CP5 | CP6 | Maximum score
Security | 4 (0.40) | 4 (0.40) | 8 (0.80) | 5 (0.50) | 7 (0.70) | 7 (0.70) | 10 (1.00)
Privacy | 4 (0.67) | 5 (0.83) | 6 (1.00) | 3 (0.50) | 4 (0.67) | 6 (1.00) | 6 (1.00)
Audit | 3 (0.75) | 1 (0.25) | 4 (1.00) | 2 (0.50) | 4 (1.00) | 4 (1.00) | 4 (1.00)
SLA | 3 (0.33) | 5 (0.56) | 4 (0.44) | 1 (0.11) | 8 (0.89) | 4 (0.44) | 9 (1.00)
Total | 14 (0.48) | 15 (0.52) | 22 (0.76) | 11 (0.38) | 23 (0.79) | 21 (0.72) | 29 (1.00)

Security Scores

CP3 had the strongest security score, at 0.80. Two service providers, CP5 and CP6, scored 0.70. The lowest scores were from CP1 and CP2, primarily due to a lack of certifications and professional services and to not sharing employee certifications. CP4's relatively low score of 0.50 is likely due to problems encountered in navigating its website. The study was based on using a self-service method to perform the assessment, as opposed to email or chat inquiries or calling the cloud provider. Ease of use and navigation of Web portals are important characteristics when a service is designed to be self-service.

Privacy Scores

CP6 and CP3 had perfect privacy scores because their policies were easy to find, well detailed, and accompanied by privacy explanations in white papers. CP2 lost a point due to the lack of professional services, which it claims are provided through a partner community. CP4 had the lowest score, 0.50, due to the lack of an easy-to-find privacy policy for its cloud offerings.

Audit Scores

All the cloud providers claim to perform SAS 70 Type II audits on their infrastructure. None of them offers public information about which control groups they use from SAS 70, although it was possible to acquire control group information via direct email with one of the cloud providers. CP3, CP5, and CP6 all had perfect scores in the audit section. Having internal and external audits and publishing them helps provide proof of capability for specific data types, especially those that are regulated.

SLA Scores

As Table 2 shows, only CP5 scored well on its SLA, with a 0.89. The SLA outcomes were skewed by the use of a weighted value that ranged from 1 to 5 for guaranteed uptimes from 99.9 to 100 percent (see question 23 in Figure 2). If the cloud provider had several different SLAs for different services, I used the lowest SLA for the score. In the case of CP4, I couldn't find SLA information on the cloud portal. CP5 was the only cloud provider that provided a 100 percent service uptime guarantee. CP5 and CP6 didn't have any published outage events, which I discount given the short time they've been offering cloud services.
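A sketch of that weighting, following the mapping given in Figure 2's question 23 (the helper name is mine):

```python
# Weighted SLA value from Figure 2, question 23.
SLA_WEIGHTS = {99.9: 1, 99.95: 2, 99.99: 3, 99.999: 4, 100.0: 5}

def sla_weight(lowest_sla_percent):
    """Weight for the provider's lowest published SLA; 0 when none is found."""
    return SLA_WEIGHTS.get(lowest_sla_percent, 0)

print(sla_weight(100.0))  # 5 -- CP5's 100 percent uptime guarantee
print(sla_weight(99.9))   # 1
```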

Overall Scores

CP3, CP5, and CP6 had the highest overall scores, as Table 2 shows, with 0.76, 0.79, and 0.72, respectively. CP4's score (0.38) was brought down by an overall lack of information available on its website. CP1 and CP2 both scored near 50 percent, with 0.48 and 0.52, respectively, but removing the two professional services questions drops their scores to 0.44 and 0.48.

Cloud-Specific Challenges

The assessment includes a question about specific characteristics of the cloud from the NIST definition regarding resource pooling. Resource pooling is more commonly called multi-tenancy, and many researchers have addressed it. The question concerned whether the security policy had any specific discussion of multi-tenancy; none of the cloud providers had any specific security-related documentation. The CSA document discusses multi-tenancy and other cloud characteristics, providing guidance on topics such as administration, threat models, and virtual machine regulatory issues.

I designed the scorecard shown in Figure 2 to cover the assessment areas frequently raised in the research and to begin establishing a high-level exemplar for assessing provider transparency. Assessing cloud providers this early in the technology's maturity cycle brings the caveat that providers don't yet have established transparency standards. Market forces, competition, and further research are needed to determine the standard for measuring provider transparency.

Figure 2. The Cloud Provider Transparency Scorecard full assessment. The assessment examines the cloud provider's security, privacy, external audits or certifications, and service-level agreements to create a total transparency score.

Full assessment | CP1 | CP2 | CP3 | CP4 | CP5 | CP6
Security
1. Portal area for security information? | 1 | 1 | 1 | 1 | 0 | 1
2. Published security policy? | 1 | 1 | 1 | 0 | 0 | 0
3. White paper on security standards? | 1 | 1 | 1 | 1 | 1 | 1
4. Does the policy specifically address multi-tenancy issues? | 0 | 0 | 0 | 0 | 0 | 0
5. Email or online chat for questions? | 1 | 1 | 1 | 1 | 1 | 1
6. ISO/IEC 27000 certified? | 0 | 0 | 1 | 0 | 1 | 1
7. COBIT certified? | 0 | 0 | 1 | 0 | 1 | 1
8. NIST SP800-53 security certified? | 0 | 0 | 0 | 0 | 1 | 0
9. Offer security professional services (assessment)? | 0 | 0 | 1 | 1 | 1 | 1
10. Employees CISSP, CISM, or other security certified? | 0 | 0 | 1 | 1 | 1 | 1
Security subtotal score | 4 | 4 | 8 | 5 | 7 | 7
Privacy
11. Portal area for privacy information? | 1 | 1 | 1 | 0 | 0 | 1
12. Published privacy policy? | 1 | 1 | 1 | 0 | 0 | 1
13. White paper on privacy standards? | 1 | 1 | 1 | 1 | 1 | 1
14. Email or online chat for questions? | 1 | 1 | 1 | 1 | 1 | 1
15. Offer privacy professional services (assessment)? | 0 | 0 | 1 | 1 | 1 | 1
16. Employees CIPP or other privacy certified? | 0 | 1 | 1 | 0 | 1 | 1
Privacy subtotal score | 4 | 5 | 6 | 3 | 4 | 6
External audits or certifications
17. SAS 70 Type II | 1 | 1 | 1 | 1 | 1 | 1
18. PCI-DSS | 0 | 0 | 1 | 1 | 1 | 1
19. SOX | 1 | 0 | 1 | 0 | 1 | 1
20. HIPAA | 1 | 0 | 1 | 0 | 1 | 1
Audit subtotal score | 3 | 1 | 4 | 2 | 4 | 4
Service-level agreements
21. Does it offer an SLA? | 1 | 1 | 1 | 0 | 1 | 1
22. Does the SLA apply to all services? | 0 | 1 | 1 | 0 | 1 | 1
23. Lowest SLA weight (99.9 = 1, 99.95 = 2, 99.99 = 3, 99.999 = 4, 100 = 5) | 1 | 2 | 1 | 0 | 5 | 1
24. ITIL-certified employees? | 0 | 0 | 0 | 0 | 1 | 1
25. Publish outage and remediation? | 1 | 1 | 1 | 1 | 0 | 0
SLA subtotal score | 3 | 5 | 4 | 1 | 8 | 4
Total score | 14 | 15 | 22 | 11 | 23 | 21

An area for future research would be to evaluate whether the cloud provider offers performance-monitoring tools covering utilization, response times, and availability. As an example, AWS recently launched CloudWatch, which lets customers monitor resource utilization, performance, and demand patterns. External monitors such as CloudClimate.com also provide performance data, while companies like Keynote perform remote availability and quality testing of networked resources.

One assessment method that I didn't include was Shared Assessments (SA) [21], which the US Federal Financial Institutions Council supports as a financial services industry standard. SA is specifically designed for outsourcing assessment covering the financial services industry's stringent requirements and regulations. I didn't include it because only one cloud provider is currently a member, and that membership wasn't connected to the provider's cloud services.

The CPTS provides a guideline for how an organization can evaluate the adequacy of a cloud provider's transparency. The methodology's simplicity and high-level approach might not be adequate for a specific organization's requirements. As the cloud becomes more important for IT to meet its business objectives, the need for transparency will only increase. Standardization, open reporting of information in the methodology's sample domains, and making it readily available via the self-service model will greatly enhance businesses' ability to evaluate and engage cloud providers' services.

Acknowledgments

A special thank you to Randy Bias, CEO, founder, and cloud strategist of Cloudscaling, for reviewing the cloud provider instrument for completeness and making suggestions for improvements. I also thank Mark Rosenbaum, doctoral candidate at Nova Southeastern University, for reviewing the document and, as usual, providing excellent feedback where the document needed improvement.

References

1. K.S. Candan et al., "Frontiers in Information and Software as Services," Proc. 2009 IEEE Conf. Data Eng., IEEE CS Press, 2009, pp. 1761–1768.
2. P. Mell and T. Grance, "The NIST Definition of Cloud Computing," Nat'l Inst. of Standards and Technology, Computer Security Division, 7 Oct. 2009; http://csrc.nist.gov/groups/SNS/cloud-computing/cloud-def-v15.doc.
3. K. Wüllenweber and T. Weitzel, "An Empirical Exploration of How Process Standardization Reduces Outsourcing Risk," Proc. 40th Ann. Hawaii Int'l Conf. System Sciences, IEEE CS Press, 2007, p. 240c.
4. "Security Guidance for Critical Areas of Focus in Cloud Computing V2.1," Cloud Security Alliance, 2009; www.cloudsecurityalliance.org/csaguide.pdf.
5. "Cloud Computing Security Risk Assessment," European Network and Information Security Agency, 20 Nov. 2009; www.enisa.europa.eu/act/rm/files/deliverables/cloud-computing-risk-assessment.
6. H.R. Nemati and T. Van Dyke, "Do Privacy Statements Really Work? The Effect of Privacy Statements and Fair Information Practices on Trust and Perceived Risk in E-Commerce," Int'l J. Information Security and Privacy, vol. 3, no. 1, 2009, pp. 45–65.
7. "Chronology of Data Breaches," Privacy Rights Clearinghouse, 2 Mar. 2010; www.privacyrights.org/ar/ChronDataBreaches.htm.


8. "CloudAudit and the Automated Audit, Assertion, Assessment, and Assurance API (A6)," CloudAudit, 2010; www.cloudaudit.org.
9. "Open Grid Forum Open Cloud Computing Interface Working Group," OCCI, 2010; www.occi-wg.org/doku.php.
10. "Frequently Asked Questions," US Small Business Administration Office of Advocacy, Sept. 2009; www.sba.gov/advo/stats/sbfaq.pdf.
11. AU Section 324, Service Organizations: Sources SAS No. 70, SAS No. 78, SAS No. 88, SAS No. 98, Am. Inst. of Certified Public Accountants; www.aicpa.org/Research/Standards/AuditAttest/DownloadableDocuments/AU-00324.pdf.
12. "Payment Card Industry Data Security Standard: Navigating PCI DSS V1.2," Payment Card Industry Security Standards Council, 2008; www.pcisecuritystandards.org/pdfs/pci_dss_saq_navigating_dss.pdf.
13. "The Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy and Security Rules," US Dept. of Health and Human Services, 2006; www.hhs.gov/ocr/privacy/hipaa/administrative/privacyrule/adminsimpregtext.pdf.
14. "Sarbanes-Oxley Act of 2002 (Public Company Accounting Reform and Investor Protection)," US Government Accountability Office, 2002.
15. "COBIT Framework for IT Governance and Control," Information Systems Audit and Control Association, 2007; www.isaca.org/Knowledge-Center/COBIT/Pages/Overview.aspx.
16. ISO/IEC 27000:2009, Information Technology, Security Techniques, Information Security Management Systems, Overview and Vocabulary, Int'l Org. for Standardization and Int'l Electrotechnical Commission, 2009; www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=41933.


17. R. Ross et al., "Recommended Security Controls for Federal Information Systems," NIST Special Publication 800-53, revision 2, Dec. 2007; http://csrc.nist.gov/publications/nistpubs/800-53-Rev2/sp800-53-rev2-final.pdf.
18. "AWS Completes SAS70 Type II Audit," Amazon Web Services, 2009; http://aws.amazon.com/about-aws/whats-new/2009/11/11/aws-completes-sas70-type-ii-audit.
19. "Information Technology Infrastructure Library," ITIL, 12 Mar. 2010; www.itil-officialsite.com/home/home.asp.
20. M.W. Jones, "Microsoft's Sidekick Cloud Outage Gets Worse," Tech.Blorge, 11 Oct. 2009; http://tech.blorge.com/Structure:%20/2009/10/11/microsofts-sidekick-cloud-outage-gets-worse.
21. "Setting the Standards for Vendor Assessments," Shared Assessments, 13 Mar. 2010; www.sharedassessments.org.

Wayne A. Pauley is a cloud and security evangelist at EMC and an executive in its Unified Storage Division. He's also a doctoral candidate in information systems science at Nova Southeastern University. His research interests include cloud security and privacy. Pauley has an MS in information technology management from Franklin Pierce University. Contact him at [email protected].
