International Journal of Emerging Technologies and Engineering (IJETE)
Volume 1, Issue 9, October 2014, ISSN 2348-8050
PUBLIC AUDITING AND USER REVOCATION IN DYNAMIC CLOUD ENVIRONMENT
P. Banumathi 1, S. Satheeshkumar 2, S. Kaliraj 3
1 PG Student, Dept. of CSE
2,3 Assistant Professor, Dept. of CSE
Kongunadu College of Engineering & Technology, Trichy
Abstract: Cloud computing has been envisioned as the next-generation architecture of the IT enterprise. In contrast to conventional solutions, where IT services are under proper physical, logical and personnel controls, cloud computing moves the application software and databases to large data centers, where the management of the data and services may not be fully trustworthy. With cloud computing and storage, users can access and share resources offered by cloud service providers at a lower marginal cost. With cloud computing and storage services, data is not only stored in the cloud, but also regularly shared among a large number of users in a group. In this project, we propose Oruta, a privacy-preserving auditing scheme for shared data with large groups in the cloud. We utilize ring signatures to compute verification information on shared data, so that the TPA is able to audit the correctness of shared data but cannot reveal the identity of the signer on each block. We implement a batch auditing scheme to perform efficient public auditing that protects both identity and data privacy in cloud environments. We also extend this project to handle duplicated data in cloud storage using privilege keys: the proposed deduplication scheme detects redundant data and allows the duplicate check to be performed only for files marked with the corresponding privileges.
Keywords: Deduplication, privacy-preserving, shared
data, cloud computing
1. Introduction
Cloud computing means storing and accessing data and programs over the Internet instead of on your computer's hard drive. It goes back to the days of flowcharts and presentations that would represent the enormous server-farm infrastructure of the Internet as nothing but a puffy, white cumulonimbus cloud, accepting connections and doling out information as it floats. When you store data on or run programs from the hard drive, that is called local storage: everything you need is physically close to you, which means accessing your data is quick and easy. Working off your hard drive is how the computer industry functioned for decades, and some argue it is still superior to cloud computing. The cloud is also not about having a dedicated hardware server in residence; storing data on a home or office network does not count as using the cloud. For it to be considered "cloud computing," you need to access your data over the Internet, or at the very least have that data synchronized with other information over the net. In a big business, you may know everything about what is on the other side of the connection; as an individual user, you may never have any idea what kind of massive data processing is happening on the other end. The end result is the same: with an online connection, cloud computing can be done anywhere, anytime. Cloud computing is thus a model for delivering IT services in which resources are retrieved from the Internet through web-based tools and applications, rather than through a direct connection to a server. Data and software packages are stored on servers, and a cloud computing configuration allows access to data as long as an electronic device has access to the web. This type of system allows employees to work remotely.
2. Related Work
In [1], Shucheng Yu et al. observe that, as promising as it is, cloud computing also faces many challenges that, if not well resolved, may impede its fast growth. Data security, as in many other applications, is among the challenges that raise great concern for users when they store sensitive information on cloud servers. These concerns originate from the fact that cloud servers are usually operated by commercial providers, which are very likely to be outside the trusted domain of the users. Data confidentiality against cloud servers is hence
frequently desired when users outsource data for storage in the cloud. In some practical application systems, data confidentiality is not only a security/privacy issue but also a juristic concern. For example, in healthcare application scenarios, the use and disclosure of protected health information (PHI) should meet the requirements of the Health Insurance Portability and Accountability Act (HIPAA), and keeping user data confidential against the storage servers is not just an option but a requirement. To enforce these access policies, the data owners would, on one hand, like to take advantage of the abundant resources that the cloud provides for efficiency and economy; on the other hand, they may want to keep the data contents confidential against cloud servers.
In [2], Guojun Wang et al. propose a hierarchical attribute-based encryption (HABE) model that combines a HIBE system and a CP-ABE system to provide fine-grained access control and full delegation. Based on the HABE model, they construct a HABE scheme that makes a performance-expressivity tradeoff to achieve high performance. Finally, they propose a scalable revocation scheme that delegates most of the computing tasks in revocation to the CSP, in order to handle a dynamic set of users efficiently. To keep sensitive user data confidential against untrusted CSPs, a natural way is to apply cryptographic approaches that disclose decryption keys only to authorized users. Although some CP-ABE schemes support delegation between users, which enables a user to generate attribute secret keys containing a subset of his own attribute secret keys for other users, the goal here is full delegation, that is, a delegation mechanism between attribute authorities (AAs), which independently make decisions on the structure and semantics of their attributes, so as to help enterprise users efficiently share confidential data on cloud servers. Specifically, the scheme aims to be more applicable to cloud computing by simultaneously achieving fine-grained access control, high performance, practicability, and scalability.
3. PDP (Provable Data Possession)
Provable data possession (PDP) is a technique for ensuring the integrity of data in storage outsourcing. We describe a framework for provable data possession; this provides background for related work and for the specific description of our schemes. A PDP protocol (Fig. 1) checks that an outsourced storage site keeps a file, which consists of a collection of n blocks. The client C pre-processes the file, producing a small piece of metadata that is stored locally, transmits the file to the server S, and may delete its local copy. The server stores the file and responds to challenges issued by the client. Storage at the server is in Ω(n) and storage at the client is in O(1), conforming to our notion of an outsourced storage relationship. As part of pre-processing, the client may alter the file to be stored at the server; for example, it may expand the file or include additional metadata to be stored at the server. Before deleting its local copy of the file, the client may execute a data possession challenge to make sure the server has successfully stored the file. Clients may encrypt a file prior to outsourcing the storage. For our purposes, encryption is an orthogonal issue; the file may consist of encrypted data, and our metadata does not include encryption keys. At a later time, the client issues a challenge to the server to establish that the server has retained the file. The client requests that the server compute a function of the stored data, which it sends back to the client. Using its locally stored metadata, the client verifies the response.
Figure 1: Protocol for PDP
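As a concrete illustration of this flow, the sketch below implements a much-simplified, HMAC-based spot check: the client keeps only a secret key (O(1) storage), outsources the blocks together with their tags, and later challenges a random subset of blocks. This is a hedged toy, not the homomorphic-tag PDP construction from the literature; all function names are illustrative.

```python
# Minimal PDP-style challenge-response sketch using HMAC tags (illustrative only).
import hmac, hashlib, os, random

BLOCK_SIZE = 4096

def preprocess(key, data):
    """Client side: split the file into blocks and tag each block.
    Blocks and tags are outsourced; the client keeps only `key`."""
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    tags = [hmac.new(key, str(i).encode() + b"|" + blk, hashlib.sha256).digest()
            for i, blk in enumerate(blocks)]
    return blocks, tags

def prove(blocks, tags, challenge):
    """Server side: return the challenged blocks and their tags."""
    return [(i, blocks[i], tags[i]) for i in challenge]

def verify(key, proof):
    """Client side: recompute each tag and compare."""
    return all(hmac.compare_digest(
                   t, hmac.new(key, str(i).encode() + b"|" + blk, hashlib.sha256).digest())
               for i, blk, t in proof)

key = os.urandom(32)
blocks, tags = preprocess(key, os.urandom(100_000))   # client outsources blocks + tags
challenge = random.sample(range(len(blocks)), 5)      # client spot-checks 5 random blocks
assert verify(key, prove(blocks, tags, challenge))    # server's proof verifies
```

Because the per-block tags live at the server, the client's storage stays constant while the server's grows linearly with the file, matching the asymmetry described above.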
4. Cloud Security
Many organizations today are feeling pressure to reduce IT costs and optimize IT operations. Cloud computing is quickly emerging as a practical means to
create dynamic, rapidly provisioned resources for operating platforms, development environments, storage, applications, backup capability, and many more IT functions. A tremendous number of security considerations exist that information security professionals need to weigh when evaluating the risks of cloud computing. The first fundamental issue is the loss of hands-on control of system, application, and data security. Several of the existing best-practice security controls that infosec professionals have come to rely on are not available in cloud environments, are weakened in numerous ways, or cannot be controlled by security teams. Security professionals must become heavily involved in the development of contract language and Service Level Agreements (SLAs) when doing business with Cloud Service Providers (CSPs). Compliance and auditing concerns are compounded: control verification and audit visibility within CSP environments may be less in-depth and less frequent than audit and security teams require. The SANS Cloud Security Fundamentals course starts out with a detailed introduction to the various delivery models of cloud computing, ranging from Software as a Service (SaaS) to Infrastructure as a Service (IaaS) and everything in between. Each of these delivery models represents an entirely distinct set of security considerations, particularly when combined with the various cloud types: public, private, and hybrid. An overview of security issues within each of these models is covered, with in-depth discussion of the risks to consider. Attendees go in-depth on architecture and infrastructure fundamentals for private, public, and hybrid clouds. A wide range of topics is covered, including patch and configuration management, virtualization security, application security, and change management. Policy, risk assessment, and governance within cloud environments are covered, with recommendations for both internal policies and contract requirements to consider. This leads to a discussion of compliance and regulatory concerns. The first day concludes with a number of fundamental scenarios for students to evaluate.
Attendees start the second day with coverage of audits and assessments for cloud environments. The day includes hands-on exercises in which students learn about new models and approaches for performing assessments, as well as evaluating audit and monitoring controls. The class then turns to protecting the data itself. New approaches to data encryption, network encryption, key management, and data lifecycle concerns are covered in depth, along with the challenges of identity and access management in cloud environments. The course then moves into disaster recovery and business continuity planning using cloud models and architecture. Intrusion detection and incident response in cloud environments are covered, along with how best to manage these critical security processes and the technologies that support them, given that most controls are managed by the CSP.
5. Cloud Auditing
CloudAudit is a specification for the presentation of information about how a cloud computing service provider addresses control frameworks. The specification provides a standard way to present and share detailed, automated data about performance and security. The goal of CloudAudit is to provide a common interface and namespace that allow enterprises interested in streamlining their audit processes, as well as cloud computing providers, to automate the audit, assertion, assessment, and assurance of their infrastructure (IaaS), platform (PaaS), and application (SaaS) environments, and to allow authorized consumers of their services to do likewise via an open, extensible and secure interface and methodology. CloudAudit is a volunteer cross-industry effort from the best minds and talent in cloud, networking, security, audit, assurance, and architecture backgrounds. The CloudAudit working group was formally launched in January 2010 and has the participation of many of the largest cloud computing providers, consultants, and integrators.
Figure 2: Audit system architecture for cloud computing
6. Homomorphic Encryption
Homomorphic encryption systems are used to perform operations on encrypted data without knowing the private key; the client is the only holder of the secret key. When we decrypt the result of any operation, it is the same as if we had carried out the computation on the raw data. An encryption scheme is homomorphic if, from Enc(a) and Enc(b), it is possible to compute Enc(f(a, b)), where f can be +, ×, etc., without using the private key. Homomorphic encryption is hardly a new discovery, and cryptographers have long been aware of its promise. Back in 1978, Rivest, Adleman and Dertouzos proposed homomorphic encryption methods that supported interesting functions on encrypted data. Thus, the agenda for researchers was twofold: (1) come up with secure encryption schemes that could handle useful homomorphisms, and (2) figure out how to do interesting things with them.
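As a concrete example, the Paillier cryptosystem is additively homomorphic: multiplying ciphertexts adds plaintexts, i.e., Enc(a) · Enc(b) mod n² = Enc(a + b). The sketch below uses deliberately tiny demo primes so the arithmetic stays visible; a real deployment would use 2048-bit moduli and a vetted library.

```python
# Toy Paillier encryption: additively homomorphic. Demo primes only;
# these parameters are far too small to be secure.
from math import gcd, lcm
import random

p, q = 1009, 1013                  # demo primes
n, n2 = p * q, (p * q) ** 2
g = n + 1                          # standard simple generator choice
lam = lcm(p - 1, q - 1)            # private key
mu = pow(lam, -1, n)               # precomputed decryption factor

def encrypt(m: int) -> int:
    r = random.randrange(1, n)     # fresh randomness per ciphertext
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(c: int) -> int:
    # L(x) = (x - 1) // n extracts the plaintext from c^lam mod n^2
    return (pow(c, lam, n2) - 1) // n * mu % n

a, b = 17, 25
assert decrypt(encrypt(a) * encrypt(b) % n2) == a + b   # Enc(a)*Enc(b) = Enc(a+b)
```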
7. Ring Signature
Early ring signatures consist of only two algorithms, Sign and Verify; this encapsulates the intuition that ring signatures are essentially setup-free and unconditionally anonymous. In more recently proposed ring signature schemes, however, a KeyGen algorithm has been added as a way to guarantee that all users have the same type of keys. Therefore, for the purposes of security definitions, suppose that a ring signature scheme consists of three algorithms: KeyGen, Sign, and Verify. Each user runs KeyGen individually; this algorithm, on input the security parameter 1^k, outputs a key pair. The Sign algorithm, on input a secret key sk, a ring R, and a message m, outputs a signature on m. Finally, the Verify algorithm, on input the ring R, a message m, and a signature, outputs 1 if some member of R created the signature on m and 0 otherwise.
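To make the interface concrete, here is a minimal Schnorr-style ring signature in the spirit of Abe, Ohkubo and Suzuki, over a deliberately tiny group (p = 2039). It is a sketch of the KeyGen/Sign/Verify contract described above, not a secure or production construction.

```python
# Toy Schnorr-style (AOS) ring signature; demo parameters, not secure.
import hashlib, random

p, q, g = 2039, 1019, 4        # p = 2q + 1; g generates the order-q subgroup

def H(msg, ring, point):
    data = msg + b"|" + b",".join(str(y).encode() for y in ring) + b"|" + str(point).encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def keygen():
    x = random.randrange(1, q)             # secret key
    return x, pow(g, x, p)                 # (sk, pk)

def sign(msg, ring, s, x_s):
    n = len(ring)
    c, z = [0] * n, [0] * n
    alpha = random.randrange(1, q)
    c[(s + 1) % n] = H(msg, ring, pow(g, alpha, p))
    for k in range(1, n):                  # walk the ring back around to the signer
        i = (s + k) % n
        z[i] = random.randrange(1, q)
        c[(i + 1) % n] = H(msg, ring, pow(g, z[i], p) * pow(ring[i], c[i], p) % p)
    z[s] = (alpha - x_s * c[s]) % q        # only the real signer can close the ring
    return c[0], z

def verify(msg, ring, sig):
    c0, z = sig
    c = c0
    for i in range(len(ring)):
        c = H(msg, ring, pow(g, z[i], p) * pow(ring[i], c, p) % p)
    return c == c0                         # the challenge chain must close

keys = [keygen() for _ in range(4)]                 # a ring of four users
ring = [pk for _, pk in keys]
sig = sign(b"shared block", ring, 2, keys[2][0])    # user 2 signs anonymously
assert verify(b"shared block", ring, sig)
```

Verification walks the ring of challenges once around and accepts only if the chain closes, so a verifier learns that some ring member signed, but not which one.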
8. Dynamic Group Analysis
If a new user can be added to the group or an existing user can be revoked from the group, the group is called a dynamic group. To support dynamic groups while still allowing the public verifier to perform public auditing, all the ring signatures on shared data need to be re-computed with the signer's private key and all the current users' public keys whenever the membership of the group changes. A dynamic group implies that newly joining members must not be able to read past group communications, and that departing members must not be able to follow future communications. Newly granted users in a dynamic group can directly decrypt data files uploaded before their participation without contacting the data owners. User revocation can be achieved simply through a revocation list, without updating the secret keys of the remaining users; the size and computation cost of encryption are constant and independent of the number of revoked users.
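The bookkeeping implied by this section can be sketched as follows; the class, the epoch counter, and the method names are hypothetical illustrations, not constructs from the paper. A membership change bumps an epoch, marking previously computed ring signatures as stale, while revocation only updates a list rather than the remaining users' keys.

```python
# Hypothetical dynamic-group bookkeeping with a revocation list.
from dataclasses import dataclass, field

@dataclass
class DynamicGroup:
    members: dict = field(default_factory=dict)   # user_id -> public key
    revoked: set = field(default_factory=set)     # revocation list
    epoch: int = 0                                # bumped on every membership change

    def add_user(self, user_id, public_key):
        self.members[user_id] = public_key
        self.epoch += 1                           # existing signatures become stale

    def revoke_user(self, user_id):
        self.revoked.add(user_id)                 # no key update for remaining users
        self.members.pop(user_id, None)
        self.epoch += 1

    def current_ring(self):
        """Public keys a ring signature must now be computed over."""
        return [pk for uid, pk in self.members.items() if uid not in self.revoked]

    def needs_resign(self, block_epoch):
        """Blocks signed under an older membership epoch must be re-signed."""
        return block_epoch < self.epoch

group = DynamicGroup()
group.add_user("alice", "pk_a")
group.add_user("bob", "pk_b")
signed_at = group.epoch                     # a block signed over today's ring
group.revoke_user("bob")                    # membership changed
assert group.needs_resign(signed_at)        # old ring signatures must be recomputed
assert group.current_ring() == ["pk_a"]
```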
9. Batch Auditing
Batch auditing provides the TPA with secure and efficient auditing capability to cope with multiple auditing delegations from a possibly large number of different users simultaneously. In batch auditing, several delegated auditing tasks from different users are performed concurrently by the TPA, which handles multiple users' requests at the same time and thereby reduces communication and computation overhead. With the establishment of privacy-preserving public auditing in cloud computing, the TPA may handle numerous auditing delegations upon different users' requests, and auditing these tasks individually can be tedious and very inefficient. Batch auditing not only permits the TPA to carry out multiple auditing tasks concurrently, but also greatly decreases the computation cost on the TPA side.
Figure 3: Batch Auditing
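Below is a minimal sketch of the batched flow, reusing the HMAC-style spot check from the PDP sketch in Section 3: the TPA gathers one combined challenge for several delegating users and verifies the cloud's single combined response in one pass. Assuming the TPA holds the users' verification keys is a simplification that sidesteps the privacy machinery of the real scheme; all names are illustrative.

```python
# Illustrative batch auditing: one challenge round and one verification
# pass covering several users' delegated auditing tasks.
import hmac, hashlib, os, random

def tag(key, i, block):
    return hmac.new(key, str(i).encode() + b"|" + block, hashlib.sha256).digest()

# Each user outsources (blocks, tags) and delegates auditing to the TPA.
users = {}
for uid in range(3):
    key = os.urandom(32)
    blocks = [os.urandom(256) for _ in range(50)]
    users[uid] = {"key": key, "blocks": blocks,
                  "tags": [tag(key, i, blk) for i, blk in enumerate(blocks)]}

# TPA builds one batched challenge covering all delegating users ...
batch_challenge = {uid: random.sample(range(50), 4) for uid in users}

# ... the cloud answers the whole batch in a single response ...
batch_proof = {uid: [(i, users[uid]["blocks"][i], users[uid]["tags"][i]) for i in idxs]
               for uid, idxs in batch_challenge.items()}

# ... and the TPA verifies every delegation in one pass.
def batch_verify(proofs):
    return all(hmac.compare_digest(t, tag(users[uid]["key"], i, blk))
               for uid, items in proofs.items() for i, blk, t in items)

assert batch_verify(batch_proof)
```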
10. Deduplication
Data deduplication is a method for reducing the amount of storage space an organization needs to keep its data. In most organizations, the storage systems hold duplicate copies of many pieces of data. For example, the same file may be saved in several different places by different users, or two or more files that are not identical may still contain much of the same data. Deduplication discards these extra
copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the unique copy. Companies regularly use deduplication in backup and disaster recovery applications, but it can be used to free up space in primary storage as well.
In its simplest form, deduplication takes place at the file level; that is, it discards duplicate copies of the same file. This type of deduplication is sometimes called file-level deduplication or single-instance storage (SIS). Deduplication can also take place at the block level, discarding duplicated blocks of data that occur in non-identical files. Block-level deduplication frees up more space than SIS, and a particular type known as variable block or variable-length deduplication has become very popular. Often the term data deduplication is used as a synonym for block-level or variable-length deduplication.
Figure 4: Deduplication
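The pointer-swapping idea can be sketched with content hashes, as below: a toy in-memory store keeps one copy per unique hash, at file or block granularity. This illustrates plain deduplication only, not the privilege-aware duplicate check this paper proposes.

```python
# Toy content-hash deduplication store (in-memory, illustrative only).
import hashlib

BLOCK = 4096
store = {}                         # content hash -> the single stored copy

def put(data):
    """Store one copy of `data`; duplicates collapse to the same pointer."""
    h = hashlib.sha256(data).hexdigest()
    store.setdefault(h, data)      # keep only the first copy seen
    return h                       # the 'pointer' that replaces extra copies

def put_file(data):
    """Block-level dedup: identical blocks are shared even across files."""
    return [put(data[i:i + BLOCK]) for i in range(0, len(data), BLOCK)]

file_a = b"report " * 2000
file_b = b"report " * 2000 + b"appendix"
ptrs_a, ptrs_b = put_file(file_a), put_file(file_b)
assert ptrs_a[:-1] == ptrs_b[:len(ptrs_a) - 1]   # shared prefix blocks stored once
```

Here the two files share their common prefix blocks, so only the differing tail consumes new space.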
11. Conclusion
In this approach there are three main parties: the cloud server, the TPA, and the cloud users. Users fall into two categories, the original user and the group users; the original user is the owner of the outsourced data and has the ability to control the data and its transactions. To carry out auditing that verifies the correctness of the data, all users send their requests to the TPA. A Homomorphic Authenticable Ring Signature (HARS) scheme, consisting of three algorithms (KeyGen, RingSign and RingVerify), is built here to achieve privacy-preserving auditing. By focusing on an efficient auditing process, the approach mainly ensures the integrity of shared data in a grouped-user environment. The deduplication scheme is used to avoid saving the same data in the cloud more than once, which helps to save the users' storage space.
References
[1] B. Wang, B. Li, and H. Li, "Oruta: Privacy-Preserving Public Auditing for Shared Data in the Cloud," Proc. IEEE Fifth Int'l Conf. Cloud Computing, pp. 295-302, 2012.
[2] M. Armbrust, A. Fox, R. Griffith, A.D. Joseph, R.H. Katz, A. Konwinski, G. Lee, D.A. Patterson, A. Rabkin, I. Stoica, and M. Zaharia, "A View of Cloud Computing," Comm. ACM, vol. 53, no. 4, pp. 50-58, Apr. 2010.
[3] K. Ren, C. Wang, and Q. Wang, "Security Challenges for the Public Cloud," IEEE Internet Computing, vol. 16, no. 1, pp. 69-73, 2012.
[4] D. Song, E. Shi, I. Fischer, and U. Shankar, "Cloud Data Protection for the Masses," Computer, vol. 45, no. 1, pp. 39-45, 2012.
[5] C. Wang, Q. Wang, K. Ren, and W. Lou, "Privacy-Preserving Public Auditing for Data Storage Security in Cloud Computing," Proc. IEEE INFOCOM, pp. 525-533, 2010.
[6] M. Bellare and A. Palacio, "GQ and Schnorr Identification Schemes: Proofs of Security against Impersonation under Active and Concurrent Attacks," Proc. CRYPTO, pp. 162-177, 2002.
[7] S. Bugiel, S. Nurnberger, A. Sadeghi, and T. Schneider, "Twin Clouds: An Architecture for Secure Cloud Computing," Workshop on Cryptography and Security in Clouds (WCSC 2011), 2011.
[8] J.R. Douceur, A. Adya, W.J. Bolosky, D. Simon, and M. Theimer, "Reclaiming Space from Duplicate Files in a Serverless Distributed File System," Proc. ICDCS, pp. 617-624, 2002.
[9] D. Ferraiolo and R. Kuhn, "Role-Based Access Controls," Proc. 15th NIST-NCSC National Computer Security Conf., 1992.
[10] GNU Libmicrohttpd, http://www.gnu.org/software/libmicrohttpd/