22-09-2014, 04:13 PM
TOWARDS SECURE & DEPENDABLE STORAGE SERVICES IN CLOUD COMPUTING
TOWARDS SECURE.doc (Size: 861 KB / Downloads: 12)
ABSTRACT
Though cloud storage lets users keep their data on a remote machine, its major drawback is security risk. Users can access on-demand, high-quality cloud applications without the burden of local hardware and software management, but the service does not by itself guarantee security for the outsourced data. To provide a secure and dependable cloud storage service, we propose a flexible distributed storage integrity auditing mechanism utilizing homomorphic tokens and distributed erasure-coded data. The proposed design further supports secure and efficient dynamic operations on outsourced data. Analysis shows the proposed scheme is highly efficient and resilient against Byzantine failure, malicious data modification attacks, and even server colluding attacks.
INTRODUCTION
Here we introduce these cloud services through an online banking application. Compared to many of its predecessors, which only provide binary results about the storage status across the distributed servers, we propose this new system. The new scheme further supports secure and efficient dynamic operations on data blocks. Our system is also highly efficient because it uses token generation for secure login to keep intruders out.
LITERATURE SURVEY
Cloud Computing has been envisioned as the next-generation architecture of IT Enterprise. In contrast to traditional solutions, where the IT services are under proper physical, logical and personnel controls, Cloud Computing moves the application software and databases to large data centers, where the management of the data and services may not be fully trustworthy. This unique attribute, however, poses many new security challenges which have not been well understood. In this article, we focus on cloud data storage security, which has always been an important aspect of quality of service. To ensure the correctness of users’ data in the cloud, we propose an effective and flexible distributed scheme with two salient features, as opposed to its predecessors. By utilizing the homomorphic token with distributed verification of erasure-coded data, our scheme achieves the integration of storage correctness insurance and data error localization, i.e., the identification of misbehaving server(s). Unlike most prior works, the new scheme further supports secure and efficient dynamic operations on data blocks, including data update, delete and append. Extensive security and performance analysis shows that the proposed scheme is highly efficient and resilient against Byzantine failure, malicious data modification attacks, and even server colluding attacks.
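The token idea above can be sketched in a few lines. This is a hypothetical simplification, not the paper's exact construction: the client pre-computes, per server, a token over a pseudo-randomly chosen set of blocks; at audit time the server recomputes the same value from its stored blocks, and a mismatch localizes that server as misbehaving.

```python
# Hedged sketch of homomorphic-token verification over one server's blocks.
# All constants and helper names here are illustrative assumptions.
import hashlib
import random

P = 2**31 - 1  # public prime modulus (illustrative choice)

def prf_indices(key: bytes, round_no: int, n_blocks: int, sample: int):
    """Derive the challenged block indices for this audit round from a secret key."""
    seed = hashlib.sha256(key + round_no.to_bytes(4, "big")).digest()
    return random.Random(seed).sample(range(n_blocks), sample)

def token(blocks, indices, alpha):
    """Homomorphic token: a random linear combination of the challenged blocks."""
    return sum(pow(alpha, j + 1, P) * blocks[i] for j, i in enumerate(indices)) % P

# Client side: pre-compute a token for one server's encoded blocks.
server_blocks = [17, 42, 99, 7, 310, 55, 23, 81]  # toy stand-in for erasure-coded blocks
key, alpha = b"client-secret", 12345
idx = prf_indices(key, round_no=0, n_blocks=len(server_blocks), sample=4)
precomputed = token(server_blocks, idx, alpha)

# Audit: an honest server reproduces the token; a tampered block is detected.
assert token(server_blocks, idx, alpha) == precomputed
tampered = list(server_blocks)
tampered[idx[0]] += 1
assert token(tampered, idx, alpha) != precomputed
```

Because each server is challenged over the same pseudo-random positions, a failing comparison directly names the misbehaving server, which is the error-localization property described above.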
Provable Data Possession at Untrusted Stores
We introduce a model for provable data possession (PDP) that allows a client that has stored data at an untrusted server to verify that the server possesses the original data without retrieving it. The model generates probabilistic proofs of possession by sampling random sets of blocks from the server, which drastically reduces I/O costs. The client maintains a constant amount of metadata to verify the proof. The challenge/response protocol transmits a small, constant amount of data, which minimizes network communication. Thus, the PDP model for remote data checking supports large data sets in widely-distributed storage systems.
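The spot-checking idea behind PDP can be illustrated with a simplified stand-in. Note this is not the actual PDP scheme, which uses homomorphic RSA tags so the server can answer without returning the blocks; here the server returns the sampled blocks plus MAC tags, and the client keeps only a constant-size secret key as metadata.

```python
# Simplified, assumption-laden stand-in for PDP-style probabilistic spot checking.
import hashlib
import hmac
import os
import random

KEY = os.urandom(32)  # the only state the client retains (constant metadata)

def tag(index: int, block: bytes) -> bytes:
    """MAC tag binding a block to its position, so swapped blocks are caught."""
    return hmac.new(KEY, index.to_bytes(8, "big") + block, hashlib.sha256).digest()

# Setup: client tags every block, then outsources blocks and tags together.
blocks = [os.urandom(64) for _ in range(100)]
outsourced = [(b, tag(i, b)) for i, b in enumerate(blocks)]

# Challenge: the client samples a few random indices; the server answers
# with the corresponding (block, tag) pairs.
challenge = random.sample(range(len(outsourced)), 10)
response = [(i, *outsourced[i]) for i in challenge]

# Verify: recompute each MAC; any corrupted sampled block fails the check.
assert all(hmac.compare_digest(t, tag(i, b)) for i, b, t in response)
```

Sampling gives a probabilistic guarantee: if the server has discarded a fraction of the blocks, each audit round detects this with probability growing quickly in the sample size, while the client's storage and bandwidth stay small.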
Auditing to Keep Online Storage Services Honest
A growing number of online service providers offer to store customers' photos, email, file system backups, and other digital assets. Currently, customers cannot make informed decisions about the risk of losing data stored with any particular service provider, reducing their incentive to rely on these services. We argue that third-party auditing is important in creating an online service-oriented economy, because it allows customers to evaluate risks, and it increases the efficiency of insurance-based risk mitigation. We describe approaches and system hooks that support both internal and external auditing of online storage services, describe motivations for service providers and auditors to adopt these approaches, and list challenges that need to be resolved for such auditing to become a reality.
INTRODUCTION TO ORGANIZATION
NeoApp Profile
NeoApp develops custom software solutions for companies in a variety of industries. Since its beginning in August 2008, NeoApp has offered efficient, reliable and cost-effective solutions of good quality by implementing CMMI practices from its development facility in Hyderabad.
NeoApp has expertise in the latest technologies and caters to your exacting requirements. NeoApp helps you from concept to completion of a project with a full range of service offerings.
Most importantly, NeoApp combines the right strategy with the right products and the right people, ensuring technical superiority, high-quality deliverables and timely implementations. NeoApp supports different delivery and billing models to fit your requirements. By having NeoApp involved with your software development projects, you benefit from reduced costs and faster development cycles. To reduce development costs, NeoApp strictly adheres to a reusable component model with a plug-and-play architecture.
The offshore outsourcing model is easily adoptable and delivers benefits beyond cost reduction. Offshore outsourcing with NeoApp includes full-spectrum services and manifold benefits.
NeoApp, with its experience in executing offshore projects ranging from large enterprise solutions to small plug-in applications, helps customers achieve the offshore outsourcing goals.
NeoApp establishes suitable project execution methodologies for each project and accomplishes offshore execution on time and on budget. NeoApp pays
EXISTING SYSTEM
The outsourced data has no inherent security guarantee: the Cloud Service Provider (CSP) may discard rarely accessed data, or attempt to hide data loss incidents so as to maintain its reputation. Although outsourcing data into the cloud is economically attractive given the cost and complexity of long-term, large-scale data storage, the lack of strong assurance of data integrity and availability may impede its wide adoption by both enterprise and individual cloud users. Data may also be lost due to Byzantine failures.
Software Requirement Specification
Software Requirements Specification plays an important role in creating quality software solutions. Specification is basically a representation process. Requirements are represented in a manner that ultimately leads to successful software implementation.
Requirements may be specified in a variety of ways. However, there are some guidelines worth following:
• Representation format and content should be relevant to the problem
• Information contained within the specification should be nested
• Diagrams and other notational forms should be restricted in number and consistent in use.
• Representations should be revisable.
SYSTEM DESIGN
INPUT DESIGN
The input design is the link between the information system and the user. It comprises the specifications and procedures for data preparation, and the steps necessary to put transaction data into a usable form for processing, whether by having the computer read data from a written or printed document or by having people key the data directly into the system. The design of input focuses on controlling the amount of input required, controlling errors, avoiding delay, avoiding extra steps and keeping the process simple. The input is designed to provide security and ease of use while retaining privacy. Input design considers the following:
• What data should be given as input?
• How should the data be arranged or coded?
• The dialog to guide the operating personnel in providing input.
• Methods for preparing input validations, and steps to follow when errors occur.
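As a concrete instance of the last point, input validation for the login form of the banking application might look like the following. This is an illustrative sketch; the field names and rules are assumptions, not taken from the report.

```python
# Hypothetical input-validation sketch for a login form (field names assumed).
import re

def validate_login(form: dict) -> list:
    """Return a list of error messages; an empty list means the input is usable."""
    errors = []
    user = form.get("username", "").strip()
    pwd = form.get("password", "")
    if not re.fullmatch(r"[A-Za-z0-9_]{4,32}", user):
        errors.append("username must be 4-32 letters, digits, or underscores")
    if len(pwd) < 8:
        errors.append("password must be at least 8 characters")
    return errors

# Valid input passes; malformed input is rejected with readable messages.
assert validate_login({"username": "alice_01", "password": "s3cretpass"}) == []
assert validate_login({"username": "a!", "password": "x"}) != []
```

Returning a list of messages rather than raising on the first failure lets the form report every problem at once, which supports the goals of controlling errors and avoiding extra steps.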
OUTPUT DESIGN
A quality output is one which meets the requirements of the end user and presents the information clearly. In any system, the results of processing are communicated to the users and to other systems through outputs. Output design determines how the information is to be displayed for immediate need, as well as the hard-copy output. Output is the most important and direct source of information for the user. Efficient and intelligent output design improves the system’s relationship with the user and supports decision-making.
1. Designing computer output should proceed in an organized, well thought out manner; the right output must be developed while ensuring that each output element is designed so that people will find the system easy and effective to use. When analysts design computer output, they should identify the specific output that is needed to meet the requirements.
2. Select methods for presenting information.
3. Create the document, report, or other format that contains information produced by the system.
The output form of an information system should accomplish one or more of the following objectives.
Convey information about past activities, current status, or projections of the future.
Signal important events, opportunities, problems, or warnings.
Trigger an action.
Confirm an action.
Third Party Auditing
As discussed in our architecture, if the user does not have the time, feasibility or resources to perform the storage correctness verification, he can optionally delegate this task to an independent third-party auditor (TPA), making the cloud storage publicly verifiable. However, as pointed out by recent work, to securely introduce an effective TPA, the auditing process should introduce no new vulnerabilities to user data privacy. Namely, the TPA should not learn the user’s data content through the delegated data auditing.
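The delegation idea can be sketched as follows. This is a hypothetical simplification illustrating only the privacy goal: the user hands the TPA the pre-computed tokens and challenge parameters, never the data itself, so the TPA only ever compares a single aggregate value against the expected token. The actual protocols in the literature additionally blind the server's response.

```python
# Hedged sketch of delegated auditing: the TPA holds tokens, not data.
P = 2**31 - 1  # public prime modulus (illustrative)

def aggregate(blocks, indices, alpha):
    """Server side: combine the challenged blocks into one aggregate value mod P."""
    return sum(pow(alpha, j + 1, P) * blocks[i] for j, i in enumerate(indices)) % P

# User side (offline): pre-compute the expected token, then delegate.
data = [5, 11, 29, 3, 101, 57]          # the user's blocks; never given to the TPA
indices, alpha = [1, 4, 2], 99991       # challenge parameters for this round
delegated = {"indices": indices, "alpha": alpha,
             "token": aggregate(data, indices, alpha)}  # handed to the TPA

# TPA side: challenge the server, receive one aggregate, compare to the token.
server_answer = aggregate(data, delegated["indices"], delegated["alpha"])
assert server_answer == delegated["token"]  # storage intact; no raw blocks seen
```

Because the TPA receives only the aggregate and the expected token, a successful audit reveals that the data is intact without exposing individual block contents, which is exactly the privacy requirement stated above.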