23-06-2012, 02:12 PM
Efficient and Secure Content Processing and Distribution by Cooperative Intermediaries
Abstract
Content services such as content filtering and transcoding adapt content to meet system requirements, display capabilities, or user preferences. Data security in such a framework is an important problem and is crucial for many Web applications. In this paper, we propose an approach that addresses data integrity and confidentiality in content adaptation and caching by intermediaries. Our approach permits multiple intermediaries to simultaneously perform content services on different portions of the data. Our protocol supports decentralized proxy and key management and flexible delegation of services. Our experimental results show that our approach is efficient and minimizes the amount of data transmitted across the network.
Introduction
With the emergence of various network appliances and heterogeneous client environments, new requirements have arisen for content services performed by intermediaries. For example, content may be transformed to satisfy the requirements of a client's security policy, device capabilities, preferences, and so forth. Several content services have therefore been identified, including but not limited to content transcoding, in which data is transformed from one format into another, data filtering, and value-added services such as watermarking. Other relevant services are related to personalization, whereby special-purpose proxies tailor content based on user preferences, current activities, and past access history. Many studies have been carried out on intermediary content services; however, the problem of data security in these settings has not received much attention. Confidentiality and integrity are the two main security properties that must be ensured for data in distributed cooperative application domains such as collaborative e-commerce, distance learning, telemedicine, and e-government. Confidentiality means that data can be accessed only under the proper authorizations. Integrity means that data can be modified only by authorized subjects. The approaches developed for securely transferring data from a server to clients are not suitable when the data must be transformed by intermediaries. When a proxy mediates data transmission, enciphering the data during transmission ensures security, but it makes it impossible for intermediaries to modify the data. Conversely, when intermediaries are allowed to modify the data, it becomes difficult to enforce security.
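The tension described above can be made concrete with a small sketch (the key, content, and transcoding operation below are our own illustrative assumptions, not details from the paper): if the owner protects content with an end-to-end keyed tag, any legitimate transformation by an intermediary invalidates the client's integrity check.

```python
import hmac
import hashlib

# Assumed pre-shared key between the content owner and the client.
OWNER_KEY = b"shared-owner-client-key"

def sign(data: bytes) -> bytes:
    """Owner-side integrity tag over the whole content (end-to-end)."""
    return hmac.new(OWNER_KEY, data, hashlib.sha256).digest()

def verify(data: bytes, tag: bytes) -> bool:
    """Client-side check that the content is exactly what the owner signed."""
    return hmac.compare_digest(sign(data), tag)

# The owner publishes signed content.
content = b"<article>Hello, World</article>"
tag = sign(content)

# An intermediary performs a legitimate content service
# (lowercasing stands in for transcoding or filtering).
transcoded = content.lower()

print(verify(content, tag))     # unmodified content verifies
print(verify(transcoded, tag))  # any intermediary change breaks the tag
```

This is exactly the dilemma the paper targets: end-to-end protection rules out intermediary services, while dropping it leaves modifications unchecked.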
LITERATURE SURVEY
In order to enhance the performance of content distribution networks (CDNs), several approaches have been developed based on the use of content management services provided by intermediary proxies. The work by Lum and Lau discussed the trade-off between transcoding overhead and spatial consumption in content adaptation [16]. CoralCDN, a peer-to-peer CDN, was recently presented; it combines peer-to-peer systems and Web-based content delivery [11]. Chi and Wu [8] proposed a Data Integrity Service Model (DISM) to enforce the integrity of data transformed by intermediaries. In such a model, integrity is enforced by using metadata expressing modification policies specified by content owners. However, in DISM, every subject can access the data; thus, confidentiality is not enforced. Another problem with DISM is its lack of efficiency: it does not exploit the parallelism inherent in data relationships and in the access control policies. In several applications, such as multimedia content adaptation [2], efficiency is crucial. In the partial and preliminary version of this paper [14], a protocol was proposed to ensure confidentiality and integrity for XML document updates in distributed and cooperative systems. In this paper, we present a general and improved protocol that meets the high-availability requirement of large-scale network services.
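To make the metadata-driven approach concrete, here is a minimal sketch of owner-specified modification policies (the portion names, roles, and dictionary format are our own simplification, not DISM's actual metadata language): the owner tags each content portion with the set of proxy roles allowed to transform it, and an intermediary's operation is accepted only if the policy permits it.

```python
# Owner-specified metadata: portion id -> roles allowed to modify it.
policy = {
    "images": {"transcoder"},
    "text":   {"filter", "transcoder"},
    "header": set(),  # no intermediary may touch the header
}

def may_modify(portion: str, proxy_role: str) -> bool:
    """Return True iff the owner's policy lets this role alter the portion."""
    return proxy_role in policy.get(portion, set())

print(may_modify("images", "transcoder"))  # allowed by the policy
print(may_modify("header", "filter"))      # denied: header is immutable
```

Note what such metadata does and does not give you, matching the critique above: modifications can be policed (integrity), but nothing stops any subject from reading the data (no confidentiality).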
Existing System
Previous work on content distribution networks addresses data adaptation and content delivery through intermediary proxies. However, existing approaches such as DISM enforce only integrity, not confidentiality, and they do not exploit the parallelism inherent in data relationships and access control policies, which limits their efficiency.
Proposed System
The proposed system combines peer-to-peer content delivery with the Data Integrity Service Model to enforce the integrity of data transformed by intermediaries. In such a model, integrity is enforced by using metadata expressing modification policies specified by content owners. In this paper, we present a general and improved protocol that also ensures confidentiality, allows multiple intermediaries to simultaneously perform content services on different portions of the data, and meets the high-availability requirement of large-scale network services.
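The per-portion, parallel service model can be sketched as follows (an illustrative simplification under our own assumptions: the per-portion keys, portion names, and the uppercasing "service" are hypothetical, and HMAC stands in for the paper's actual key-management scheme). Each portion carries its own key, so the proxy authorized for a portion can transform and re-sign it, and different portions are serviced simultaneously.

```python
import hmac
import hashlib
from concurrent.futures import ThreadPoolExecutor

# Assumed per-portion keys: a proxy holds only the key for the portion
# it is authorized to service.
portion_keys = {"text": b"k-text", "images": b"k-images"}

def tag(key: bytes, data: bytes) -> bytes:
    """Keyed integrity tag over a single portion."""
    return hmac.new(key, data, hashlib.sha256).digest()

def proxy_service(portion: str, data: bytes):
    """An authorized proxy transforms its portion and re-signs it."""
    transformed = data.upper()  # stand-in for transcoding/filtering
    return portion, transformed, tag(portion_keys[portion], transformed)

document = {"text": b"hello", "images": b"<image bytes>"}

# Different intermediaries handle different portions at the same time.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda item: proxy_service(*item), document.items()))

# The client verifies each portion independently with its portion key.
for portion, data, t in results:
    assert hmac.compare_digest(tag(portion_keys[portion], data), t)
    print(portion, data)
```

Because integrity is checked per portion rather than over the whole document, a change to one portion never invalidates the others, which is what makes simultaneous servicing by multiple intermediaries possible.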