15-09-2016, 11:06 AM
1. ABSTRACT
Data deduplication is one of the most important data compression techniques for eliminating duplicate copies of repeating data, and has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, the convergent encryption technique has been proposed to encrypt the data before outsourcing. To better protect data security, this paper makes the first attempt to formally address the problem of authorized data deduplication. Different from traditional deduplication systems, the differential privileges of users are further considered in the duplicate check besides the data itself. We also present several new deduplication constructions supporting authorized duplicate check in a hybrid cloud architecture. Security analysis demonstrates that our scheme is secure in terms of the definitions specified in the proposed security model. As a proof of concept, we implement a prototype of our proposed authorized duplicate check scheme and conduct test bed experiments using our prototype. We show that our proposed authorized duplicate check scheme incurs minimal overhead compared to normal operations.
Key Words: Deduplication, Hybrid Cloud, Convergent key
2. INTRODUCTION
What is cloud computing?
Cloud computing provides seemingly unlimited "virtualized" resources to users as services across the whole Internet, while hiding platform and implementation details. Today's cloud service providers offer both highly available storage and massively parallel computing resources at relatively low costs. As cloud computing becomes prevalent, an increasing amount of data is being stored in the cloud and shared by users with specified privileges, which define the access rights of the stored data. One critical challenge of cloud storage services is the management of the ever-increasing volume of data. To make data management scalable in cloud computing, deduplication has become a well-known technique and has attracted more and more attention recently. Data deduplication is a specialized data compression technique for eliminating duplicate copies of repeating data in storage. The technique is used to improve storage utilization and can also be applied to network data transfers to reduce the number of bytes that must be sent. Instead of keeping multiple data copies with the same content, deduplication eliminates redundant data by keeping only one physical copy and referring all other redundant data to that copy.
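The "keep one physical copy, reference the rest" idea above can be sketched with a toy content-addressed store. The `DedupStore` class and the use of a SHA-256 fingerprint as the block identifier are illustrative assumptions, not details from the paper:

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical uploads are kept only once."""

    def __init__(self):
        self.blocks = {}  # fingerprint -> data (the single physical copy)
        self.refs = {}    # fingerprint -> how many logical copies point at it

    def put(self, data: bytes) -> str:
        fp = hashlib.sha256(data).hexdigest()  # content fingerprint
        if fp not in self.blocks:              # store only the first copy
            self.blocks[fp] = data
        self.refs[fp] = self.refs.get(fp, 0) + 1
        return fp                              # caller keeps a reference, not the bytes

    def get(self, fp: str) -> bytes:
        return self.blocks[fp]

store = DedupStore()
a = store.put(b"quarterly-report contents")
b = store.put(b"quarterly-report contents")  # second user uploads the same file
assert a == b and len(store.blocks) == 1     # one physical copy, two references
```

Real systems apply the same principle at file or block granularity; the fingerprint also lets a client ask "do you already have this?" before transferring any bytes, which is where the bandwidth savings come from.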
3. LITERATURE SURVEY
The S-CSP (storage cloud service provider) acts as a data storage service in the public cloud, storing data on behalf of users. The S-CSP eliminates duplicate data using deduplication and keeps only one copy of each unique file.
A user who wants to access data or files contacts the S-CSP. The user generates a key and stores that key in the private cloud. In a storage system supporting deduplication, the user uploads only unique data and does not upload any duplicate data, which may be owned by the same user or by different users; this saves upload bandwidth.
In general, to provide more security, the user relies on the private cloud rather than the public cloud for key management: the user stores the generated key in the private cloud. At download time, the system asks for the key before releasing the file. Because the user cannot safely store the secret key locally, the private cloud is used to give the key proper protection.
The public cloud is used for storage. Users upload their files to the public cloud, which plays the role of the S-CSP. When a user wants to download a file from the public cloud, the system asks for the key that was generated and stored in the private cloud. Only when the user's key matches the file's key can the user download the file; without the key, the user cannot access it. Thus only authorized users can access the file.
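The convergent-key flow described above can be sketched as follows: the key is derived from the file content itself, so two users encrypting the same file produce the same ciphertext and the same duplicate-check tag, which lets the S-CSP deduplicate without seeing the plaintext. This is a minimal illustration using only the standard library; the SHA-256-counter keystream stands in for a real cipher (e.g. AES-CTR) and must not be used as production cryptography:

```python
import hashlib

def _keystream(key: bytes, n: int) -> bytes:
    # Illustrative keystream only; a real system would use a vetted cipher.
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def convergent_encrypt(message: bytes):
    key = hashlib.sha256(message).digest()  # convergent key: derived from content
    cipher = bytes(m ^ k for m, k in zip(message, _keystream(key, len(message))))
    tag = hashlib.sha256(cipher).hexdigest()  # tag the S-CSP uses for duplicate check
    return key, cipher, tag

def convergent_decrypt(key: bytes, cipher: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(cipher, _keystream(key, len(cipher))))

# Two independent users, same file -> same ciphertext and tag.
k1, c1, t1 = convergent_encrypt(b"shared design document")
k2, c2, t2 = convergent_encrypt(b"shared design document")
assert c1 == c2 and t1 == t2
assert convergent_decrypt(k1, c1) == b"shared design document"
```

The key `k1` is what the user would deposit in the private cloud; anyone holding only the ciphertext and tag (the S-CSP) can deduplicate but cannot decrypt.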
4. PROPOSED SYSTEM
In this project, aiming at efficiently solving the problem of deduplication with differential privileges in cloud computing, we consider a hybrid cloud architecture consisting of a public cloud and a private cloud. Unlike existing data deduplication systems, the private cloud is involved as a proxy that allows data owners/users to securely perform the duplicate check with differential privileges. Furthermore, we enhance the security of our system. Specifically, we present an advanced scheme that supports stronger security by encrypting the file with differential privilege keys. In this way, users without the corresponding privileges cannot perform the duplicate check.
Finally, we implement a prototype of the proposed authorized duplicate check and conduct test bed experiments to evaluate the overhead of the prototype. We show that the overhead is minimal compared to the normal convergent encryption and file upload operations.
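The differential-privilege duplicate check can be illustrated by binding the duplicate-check token to both the file and a privilege key held only by the private cloud: a user without the right privilege key cannot compute the token for a file, so they cannot even test whether it exists. The privilege names, key values, and the HMAC-based token construction below are all illustrative assumptions, not the paper's exact scheme:

```python
import hashlib
import hmac

# Hypothetical privilege keys held by the private cloud (values illustrative).
PRIVILEGE_KEYS = {"admin": b"k-admin-secret", "staff": b"k-staff-secret"}

def duplicate_check_token(file_data: bytes, privilege: str) -> str:
    """Token the private cloud would issue for an authorized duplicate check."""
    file_tag = hashlib.sha256(file_data).digest()
    key = PRIVILEGE_KEYS[privilege]  # never leaves the private cloud
    return hmac.new(key, file_tag, hashlib.sha256).hexdigest()

t_admin = duplicate_check_token(b"payroll records", "admin")
t_staff = duplicate_check_token(b"payroll records", "staff")
assert t_admin != t_staff  # same file, different privilege -> different tokens
```

Because the token depends on the privilege key, the public cloud can compare tokens to detect duplicates within a privilege class, while users outside that class learn nothing from a failed check.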
1. A deduplication system in cloud storage that reduces the storage size of the tags used for integrity checks.
2. Enhanced security of deduplication and protection of data confidentiality, achieved by transforming a predictable message into an unpredictable one. In this system, another third party, called a key server, is introduced to generate the file tag for the duplicate check.
3. A novel encryption scheme that provides the essential security for both popular and unpopular data. For popular data that are not particularly sensitive, traditional conventional encryption is performed.
7. CONCLUSION
The notion of authorized data deduplication was proposed to protect data security by including the differential privileges of users in the duplicate check. We also presented several new deduplication constructions supporting authorized duplicate check in a hybrid cloud architecture, in which the duplicate-check tokens of files are generated by the private cloud server with private keys. Security analysis demonstrates that our schemes are secure against the insider and outsider attacks specified in the proposed security model. As a proof of concept, we implemented a prototype of our proposed authorized duplicate check scheme and conducted test bed experiments on our prototype. We showed that our authorized duplicate check scheme incurs minimal overhead compared to convergent encryption and network transfer.