Abstract:- With the present-day, growing demand for cloud computing architectures, especially Data-as-a-Service platforms, there is a steadily increasing need to store the data that data owners outsource to cloud data servers. This leads to a great increase in the quantity of user data accumulated on the cloud server. Data access capability therefore needs to be enhanced with a focus on maintaining data integrity auditing, and the data duplication that can arise must be addressed effectively. We therefore adopt a secure data retrieval access policy based on a tag evaluation strategy over duplicate data. Since most of the outsourced data is sensitive personal information, confidentiality must be facilitated and privacy preservation initiated to ensure the reliability of data users' sensitive information.
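To make the tag evaluation idea concrete, the following is a minimal sketch of tag-based deduplication, assuming a plain SHA-256 content hash as the tag; the names TagStore and compute_tag are our own illustrations, not the exact construction proposed here.

```python
import hashlib

def compute_tag(data: bytes) -> str:
    """Content tag; a plain SHA-256 digest stands in for the scheme's tag evaluation."""
    return hashlib.sha256(data).hexdigest()

class TagStore:
    """Hypothetical server-side index mapping each tag to one stored copy and its owners."""
    def __init__(self) -> None:
        self._blocks: dict[str, bytes] = {}
        self._owners: dict[str, set[str]] = {}

    def upload(self, user: str, data: bytes) -> str:
        tag = compute_tag(data)
        if tag in self._blocks:
            # Duplicate detected: keep a single physical copy and
            # only record the additional owner reference.
            self._owners[tag].add(user)
        else:
            self._blocks[tag] = data
            self._owners[tag] = {user}
        return tag

    def retrieve(self, user: str, tag: str) -> bytes:
        # Access policy: only registered owners may fetch the unique copy.
        if user not in self._owners.get(tag, set()):
            raise PermissionError("user has no access to this tag")
        return self._blocks[tag]
```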
In general, cloud computing demands effective and efficient data retrieval processes and policies at a large scale, which calls for highly secure confidentiality preservation of sensitive data formats while preventing the stored volume from being inflated by duplicate data. A data user should thus retrieve the desired data as a single unique copy, and retrieval time should be reduced to improve system performance. This optimal mechanism lets us handle a large volume of data with a flexible retrieval process under a secure auditing process. Data integrity of shared data under data access can be ensured through syndicate (batch) verification of a collection of data users with segment-based security keys. Each data segment is mapped to a distinct security key tied to the corresponding user, even across data modifications made by different data users.
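Batch (syndicate) verification with segment-based keys can be sketched as follows. Real auditing schemes typically rely on homomorphic authenticators so that many proofs aggregate into one; this simplified sketch instead uses per-segment HMAC tags under per-user keys and only illustrates the structure of checking a collection of segments in a single pass. All identifiers here are hypothetical.

```python
import hmac
import hashlib

def segment_tag(key: bytes, index: int, segment: bytes) -> bytes:
    """Authenticator for one segment; binds the segment to its position."""
    return hmac.new(key, index.to_bytes(8, "big") + segment, hashlib.sha256).digest()

def batch_verify(jobs: list[tuple[bytes, int, bytes, bytes]]) -> bool:
    """
    Verify a batch of (user_key, index, segment, tag) claims in one pass.
    Returns True only if every segment tag checks out.
    """
    ok = True
    for key, index, segment, tag in jobs:
        expected = segment_tag(key, index, segment)
        # Constant-time comparison; accumulate rather than short-circuit.
        ok &= hmac.compare_digest(expected, tag)
    return ok

# Example: two users, each holding a key for their own segment.
alice_key, bob_key = b"alice-seg-key", b"bob-seg-key"
jobs = [
    (alice_key, 0, b"segment-0", segment_tag(alice_key, 0, b"segment-0")),
    (bob_key,   1, b"segment-1", segment_tag(bob_key,   1, b"segment-1")),
]
assert batch_verify(jobs)
```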
Keywords:- Cloud storage, public cloud auditing, secure deduplication, batch verification