Cloud Computing Technology Assignment


1. Introduction

Cloud computing is a unique paradigm that combines various internet-based technologies and ideas to provide an enterprise with a flexible and inexpensive IT platform. A variety of sectors have increased their use of cloud computing recently, which has benefited the development of related markets (Hammami et al., 2017). Cloud computing services like Amazon EC2 are becoming more popular, yet many users remain hesitant about next steps due to security concerns, and only some have a firm grasp of what cloud computing actually is. At its simplest, cloud computing is the sharing of server space and other hardware through the Internet. The concept is now integral to most modern technological conversations because it has received increasing attention from many different fields. Businesses across all sectors have been reducing expenses in various ways (Elzamly et al., 2019), and cloud computing is one avenue that may be pursued. "The Cloud" is a common internet metaphor for the complicated platforms and infrastructures involved in cloud data management (Bo, 2018). Whenever customers expect their data to be sent across an IT network to provide services, safety is a concern. During data migration, it is reasonable to place a premium on security to ensure the data arrives safely at its final destination. To ensure a smooth transition from one platform to another, it is also important to consider the legal implications of cloud data transfer and implement the appropriate measures.

Cloud computing is both a computational paradigm and a distributed architecture. Its primary goal is to provide safe, rapid, and easy data storage and internet computing services: all computing resources are treated as services and delivered over the Internet. The Cloud improves cooperation, agility, scalability, availability, and the capacity to respond to variations in demand. It also speeds up development work and allows for cost reductions via streamlined and efficient computing. "Cloud computing" refers to a system that uses the Internet and integrates several separate computing ideas and technologies, such as Service-Oriented Architecture (SOA), Web 2.0, and virtualization. This strategy allows clients to leave their software and data on providers' servers while satisfying their computing needs through common business applications made available online via web browsers. As a marketing term, "cloud computing" depicts the maturation of these technologies and the services they offer. Adopting cloud computing can provide many advantages; nevertheless, doing so is fraught with challenging obstacles. After security, the adoption hurdles worth noting include compliance, privacy, and legal difficulties. Because cloud computing is still a relatively new paradigm, more knowledge is needed about how security may be handled at the different levels (network, host, application, and data) and how applications may securely relocate to the cloud. Many IT managers have voiced security worries regarding cloud computing due to all these unknowns (Hashizume et al., 2013).
Off-site data storage, dependence on the Internet, internal control gaps, a large number of tenants, and the mixing of external and internal security measures are all potential security weaknesses. The Cloud stands apart from more conventional technologies thanks to a number of distinctive characteristics, two examples being its vast scale and the completely different, scattered, and virtualized nature of the resources under cloud provider management.

Traditional security measures such as identification, authentication, and authorization are no longer sufficient for clouds in their present state. Most of the time, the security controls used in cloud computing are not much different from those used in any other IT environment (Hashizume et al., 2013). On the other hand, cloud computing may expose an organization to risks distinct from those posed by traditional information technology solutions, owing to the operational models, cloud service models, and supporting technologies involved. Unfortunately, it is a common misconception that making these systems more secure will limit their adaptability. Businesses considering expanding their operations beyond their own data center's network often confront the frightening prospect of moving sensitive data and mission-critical applications to environments hosted in public clouds. To alleviate these concerns, a cloud solution provider must ensure that customers retain the same security and privacy controls over their applications and services, provide evidence that the provider's organization is secure and can meet its service-level agreements, and demonstrate compliance to auditors. In addition, customers must be provided with evidence that cloud solution providers can meet regulatory requirements (Sen, 2016).

1.1 Deploying cloud computing technologies

Four market-leading technologies may be used to build and put into practice a strategic and relevant solution for an organization's infrastructure, and recent academic studies show that these technologies are increasingly employed in the business sphere. Tavbulatova et al. (2020) explore the different cloud deployment options, starting with the private cloud, whose benefits include lower energy and maintenance costs, more security and storage space, quicker data transfer rates, more organizational resource flexibility, and user-friendly payment methods. There are, however, some restrictions, such as the cost of the necessary hardware and software licenses, the time and effort needed for administration, and the potential for data damage (Habiba et al., 2014). They then move on to the deployment of public clouds, which are quick and simple to set up, offer high levels of physical and software data security through large data centers, can handle a practically unlimited amount of computing resources, and are easy to use and efficient; hardware and software costs are kept to a minimum since only an internet connection is necessary. On the other hand, this technology has several drawbacks: administration of the cloud is outside the organization's control, and users can only depend on the service provider and the Internet to work. A hybrid cloud architecture makes it feasible to maintain data privacy and save expenses by moving workloads to the cloud, while a community cloud enables information access at a fair cost from anywhere; costly cloud setup, limited storage capacity, and limited data protection are the drawbacks (Habiba et al., 2014).

1.2 Review of cloud computing

Thousands upon thousands of people around the globe have easy access to this powerful tool, and it is used in a wide variety of contexts. It resembles a real cloud in that it exhibits dynamics such as scalability, unknown location, and abstract boundaries. The Cloud includes a broad range of information technologies, and advancements in those technologies have aided its extension and development. NIST defines cloud computing as a model that enables ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources, such as servers, applications, networks, and services, which can be rapidly provisioned and released with minimal management effort or service provider interaction. As a group of technologies, cloud computing comprises four deployment models, three service models, and five essential characteristics. Traditional computing methods and several networking philosophies are combined in cloud computing.

Traditional computing techniques are used with various networking mechanisms in cloud computing. The aim of distributed computing is to disaggregate a huge computing task into smaller, easier-to-handle components; once these components complete, their results are assembled and examined. Parallel computing can solve problems that demand a high degree of efficiency: it assembles a large number of resources for computing and assessing a particular activity.
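The disaggregate-then-assemble pattern described above can be sketched in a few lines. This is a minimal illustration, not a production framework: a large summation is split into chunks, partial results are computed by a pool of workers, and the pieces are assembled at the end.

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker handles one easy-to-manage component of the larger task.
    return sum(chunk)

def distributed_sum(data, workers=4):
    """Disaggregate a large summation, farm the pieces out, assemble the results."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(distributed_sum(list(range(1001))))  # 500500
```

The same decomposition idea underlies both distributed computing (components on separate machines) and parallel computing (components on separate processors of one machine).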

1.3 Thesis statement: The impact of cloud computing, security risks, and future perspectives for data movement to provide a service via a network.

2. Literature review

2.1 Types of Cloud Providers

From the perspective of cloud computing, the ease and security of data transmission depend on the cloud service providers used. The resources and services offered by providers, and the potential risks, will vary (Lynn et al., 2017). There are several cloud service providers, and some even offer data migration as a service. Depending on the nature of the company and its needs, an organization may use a wide range of service providers (Habjan & Pucihar, 2017). In particular, cloud computing may be divided into three categories: platform as a service, infrastructure as a service, and software as a service. "Software as a service" (SaaS) refers to the increasingly popular practice of making software accessible to users through the Internet in a way that is entirely controlled by remote "cloud" services. As a consequence, service providers bear all responsibility for service management and any required service changes. Since everyone uses the same application version, SaaS's main advantage is how easily its features may be modified.

PaaS, or "platform as a service," provides consumers access to an application platform; the Google App Engine is an excellent example. With such a software platform, customers may deploy programs written in a language the provider supports. Customers may modify only their own programs and utilize the provider-supplied programming tools, while responsibility for managing the application's underlying infrastructure remains with the suppliers. Infrastructure as a service (IaaS), which supplies everything from hardware resources and network components to designated disk space, is the last service delivery paradigm (Habjan & Pucihar, 2017). Clients have full administrative control over the services they have acquired and virtual access to the required physical resources.
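The division of responsibility across the three service models can be summarized in code. The layer names and the exact split below are a common textbook simplification, not a formal standard; they are chosen here only to make the contrast between SaaS, PaaS, and IaaS concrete.

```python
# Illustrative stack, from the layer closest to the user down to hardware.
LAYERS = ["application", "runtime", "operating_system", "virtualization", "hardware"]

# Which layers the provider manages under each model (a common simplification).
MANAGED_BY_PROVIDER = {
    "IaaS": {"virtualization", "hardware"},
    "PaaS": {"runtime", "operating_system", "virtualization", "hardware"},
    "SaaS": set(LAYERS),
}

def customer_managed(model):
    """Layers the customer remains responsible for under a given service model."""
    return [layer for layer in LAYERS if layer not in MANAGED_BY_PROVIDER[model]]

print(customer_managed("PaaS"))  # ['application']
```

Running `customer_managed("IaaS")` shows the customer still administering the application, runtime, and operating system, which matches the text's point that IaaS clients retain full administrative control over what they deploy.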

2.2 Data Migration: Security Concerns and Solutions

2.2.1 Problems Integrating with Current Infrastructure

One essential piece of information that must be established is the compatibility between the cloud service and the present IT system. The intrinsic complexity of the current infrastructure is, to a significant measure, the source of risks (Ahuja & Deval, 2018): a sophisticated platform requires more time to migrate from. To speed up data transmission, the business must hire personnel knowledgeable in information technology who can quickly adapt the whole architecture to the Cloud (Caldarella, Ferri, & Maffei, 2017); even so, migration remains a major problem with a high risk of danger. Businesses that previously employed decentralized, simplified frameworks and microservices in their systems often find the migration process less complicated, since the various informational components can be easily distinguished. It should be kept in mind that not every network can be trusted to transfer data safely (Sharma, Husain & Ali, 2017), and certain simpler architectural solutions may be more vulnerable than others to data management and transfer issues. It is clear how this will affect the organizations involved.

In their article "A Security Approach for Data Transfer on Cloud Computing Based on Human Genetics," Hammami et al. (2017) alert readers to the many dangers associated with data migration to the cloud. Balancing the possible benefits and drawbacks of this issue requires both quantitative and qualitative research. For the balancing process to succeed, the question of who is accountable for what in the organization's data management protocols must be considered. Even if the organization's benefits outweigh its inefficiencies, data transfer risks will remain significant (Corcoran & Datta, 2016), so it is critical to weigh the dangers at play against the safeguards in place to reduce them. Data security is among the cloud computing topics that receive the most attention. Since service providers cannot access the customer's security system directly, they depend largely on cloud data centers, and it is essential to verify that proposed security processes have actually been applied (Hammami et al., 2017). The risks rise considerably if the infrastructure cannot guarantee the confidentiality and accountability of the data as it travels. The business must use a trusted platform module (TPM) that enables remote attestation to produce non-forgeable system reports and summaries; the inability to apply a TPM could expose a corporation to hazards that might cost it money.
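The idea behind a non-forgeable system report can be illustrated with a much-simplified software stand-in for TPM-backed attestation: a keyed digest of the system state that tampering invalidates. This sketch uses an HMAC with an in-memory key; in a real TPM the signing key is sealed inside the chip and never exposed to host software, and the report format here is an illustrative assumption, not the TPM specification.

```python
import hashlib
import hmac
import json

# Illustrative only: in a real TPM this key never leaves the hardware module.
ATTESTATION_KEY = b"sealed-inside-the-tpm"

def make_report(system_state: dict) -> dict:
    """Produce a report whose authenticity tag covers the measured state."""
    payload = json.dumps(system_state, sort_keys=True).encode()
    digest = hashlib.sha256(payload).hexdigest()
    tag = hmac.new(ATTESTATION_KEY, payload, hashlib.sha256).hexdigest()
    return {"state": system_state, "digest": digest, "tag": tag}

def verify_report(report: dict) -> bool:
    """A remote verifier recomputes the tag; forgeries fail the comparison."""
    payload = json.dumps(report["state"], sort_keys=True).encode()
    expected = hmac.new(ATTESTATION_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, report["tag"])

report = make_report({"boot_loader": "v2.1", "kernel": "5.15"})
assert verify_report(report)               # untampered report verifies
report["state"]["kernel"] = "patched"      # forgery attempt
assert not verify_report(report)           # tampering is detected
```

The point the sketch makes is the one in the text: without such a mechanism, nothing stops a compromised host from presenting a fabricated account of its own state.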

2.2.2 Urge for Secure Data Migration Process

Data migration to the Cloud is one of the areas in which managers confront significant challenges. To enable virtual task execution, the provision of cloud services requires the movement of data across networks, which is the responsibility of the service providers: data must be transferred from the firm's server to a cloud server located elsewhere. Several factors are at play, each contributing to the challenges that service providers and those in charge of their operations confront (Jain & Menache, 2017), and transferring data from one location to another comes with various related hazards. To begin, cloud computing functions through an interface that provides companies and their customers with the means to access data and information stored in a digital setting; the effectiveness of the tasks therefore relies heavily on the knowledge and experience of the cloud service providers involved. In addition, failing to comply with systematic data migration methods will ultimately raise further concerns about the data's privacy and the situation's general security (Patrignani et al., 2017). It is therefore necessary to engage cloud service providers with significant expertise and abilities in data management; this is the only way to guarantee that cloud-based data transfer is handled correctly and in the needed manner. When a company contemplates moving its data to the Cloud, the first step should be to get in touch with cloud service providers so they can supply solutions. The providers will then initialize the required procedures to guarantee that security is taken into account at every stage of the process.

2.2.3 Added Latency and Poor Visibility and Control

Several concerns associated with data transfer are often overlooked. These dangers are known to exist and to have impacts, yet it is generally accepted that they are too insignificant to have any real bearing on the situation. Additional latency is, in fact, a significant threat that poses potential problems, and potentially negative impacts, for an organization engaged in data transmission. Hazards arise when the applications, systems, and databases used during data transmission and cloud service supply are not in line with the urgency of the services required (Tchernykh et al., 2019). Latency becomes extremely important when dealing with eCommerce services and the Internet of Things, since these activities demand a positive client experience. The data migration process includes specific activities that call for the prompt processing of data and information; seen in this way, added latency will result not only in difficulties but also in disappointment in the data transfer.

Another significant factor that contributes to the dangers associated with data transfer is a cloud computing environment that provides insufficient visibility and control, or none at all. The visibility of network traffic has a substantial influence on both its use and its performance. When a company uses privately held data centers, it often has greater or complete control over the problems that need to be solved, since it is the sole owner of those centers. The organization also has complete authority over the resources used and the typical activities performed with them, and both the physical and host networks are usually managed appropriately. The data migration process, however, involves moving to cloud service providers outside the organization. In circumstances like these, the organization no longer has complete control over the services offered, the networks used, or the resources put to use. Because of this, the organization struggles to gain insight into the processes carried out on its data in the Cloud and the workloads executed in public cloud environments. A recent study by Dimensional Research found that more than 95% of the sample group questioned stated that visibility concerns are the primary reasons for the difficulties associated with cloud service delivery. According to Zelenkov (2016), the performance of companies in the administration and processing of data via the Cloud has suffered network challenges due to visibility. The findings also showed that 38 percent of the overall sample identified visibility as a critical challenge in the context of network disruptions on cloud servers.

2.2.4 Data Loss and Wasted Costs

One of the most prevalent and risky aspects of cloud computing is the possibility of losing data while migrating it to offer services across a network. Before beginning the data migration process, it is necessary to be certain that the correct method is in place to avoid losing any data in the transition. According to Al-Badi, Tarhini, and Al-Qirim (2018), the data files undergoing migration must be handled with extreme care to prevent the irreversible loss of data. Errors, damaged data, and missing files are all rather frequent concerns during migration. Before migrating the information, it is important to put pre-defined solutions into action, such as backing up the data. Another major danger associated with data transfer is the possibility of squandering financial resources.
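A standard defense against the errors, damaged data, and missing files mentioned above is to fingerprint each file before migration and verify the copy afterwards. The sketch below is a minimal local illustration in which `shutil.copyfile` stands in for the actual cloud transfer; the principle (checksum before, verify after) is what matters.

```python
import hashlib
import shutil

def sha256_of(path):
    """Fingerprint a file so loss or damage in transit is detectable."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

def migrate_with_verification(src, dst):
    """Record a checksum before the move, copy, then confirm nothing was lost."""
    expected = sha256_of(src)       # pre-migration fingerprint
    shutil.copyfile(src, dst)       # stands in for the cloud transfer itself
    if sha256_of(dst) != expected:  # damaged or truncated copy
        raise IOError("migrated copy does not match the source")
    return expected
```

Combined with an independent backup of the source, this makes the loss irreversible only if both the original and its verified copy fail, which is precisely the pre-defined safeguard the text recommends.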

Various price points are associated with the various cloud computing models, and the flexibility and dependability of a model's cloud services are directly correlated to the rates charged. When using cloud computing, an organization is responsible for paying for the storage, transfer, and computing services it uses. Even when the payments have been made, a problem with one service might have a large effect on the other services being supplied (Al-Badi, Tarhini & Al-Qirim, 2018). It may be difficult to track down a dependable platform that can address all of an organization's concerns; in certain cases, finding the optimal model or platform requires an expensive process of trial and error. Because of this, there is a potential threat to the organization's ability to properly allocate whatever funds are available.

2.2.5 Review of intrusion detection and prevention

Aldwairi et al.'s analysis considers how the growth of the Internet has produced a society in which everything and everyone is linked; as a result, networked systems have become easy targets for hackers in any location. The typical pattern of these incursions is for a hostile actor first to locate the infrastructure, then search for a weak point before moving on to more damaging action against the target environment. As assaults progress, more sophisticated techniques are often used. Button et al. outline sophisticated attacks utilizing several assault bases and network obscurants; as a result, countermeasure techniques like IDS must change equally often. For instance, Handa et al. offer an intrusion detection system/intrusion prevention system (IDS/IPS) based on machine learning for wireless sensor networks that are part of the Internet of Things (IoT). In contrast to the computationally costly nature of many machine learning methods, they provide an anomalous intrusion detection protocol (AIDP), which adopts a lightweight attack and fault detection strategy. The method comprises three steps: preparation, exchange, and replenishment. During the learning phase, values associated with experience change in response to warnings (TAFDS). In the exchange phase, every node expresses its opinion on the other nodes depending on the level of competence with which it is equipped. Finally, standing is updated during the refreshing phase to reflect the new standing in terms of ability, esteem, and trust.

In addition, H. Gupta and S. Sharma researched the dangers of using IoT for smart networks. To classify different types of attack, these authors use a layered strategy. The first layer, the "Perception Layer," contains strategies as varied as physical injury, jamming, and malicious code injection. The "Network Layer" includes attacks against routers, proxies, and firewalls as well as traffic analysis and flooding. Attacks on the application layer include malware, code injection, and social engineering, while attacks spanning multiple levels include distributed denial of service, spyware, and cryptanalytics. Khraisat et al. propose a software-defined-network-aided intrusion detection system. Their intrusion detection technology was Snort, which enables several instances of the software to share hardware. A software-defined network (SDN) controller sends questionable data to certain network nodes for examination; the networking between the many servers that make up the SDN is handled by Docker containers operating on top of a GNS3 virtual machine. Huang et al.'s study of the protocols used for intrusion detection and prevention systems in wireless sensor networks within an Internet of Things deployment considers both the current state of the art and potential future problems. In their investigation of the many security requirements of wireless sensor networks and the Internet of Things, key security characteristics are examined, including authentication, integrity, confidentiality, non-repudiation, authorization, freshness, availability, forward secrecy, and backward secrecy. They also consider prevalent security risks in wireless sensor networks and the Internet of Things, using a multi-tiered framework to describe attacks. This study examines the successful, thorough, and strategic deployment of an intrusion detection system to lower risks in this setting.
In other words, the system should produce few false positives and false negatives. It should do no harm and provide no new entry points for attackers, and its implementation should be inexpensive and require minimal supporting infrastructure. Cairns et al. review current approaches for detecting and preventing intrusions in service-oriented vehicular networks. They start by considering common threats to such networks, including Sybil attacks, denial-of-service attacks, and attacks using false alarm generators. Different kinds of IDS are contrasted, and a method is suggested in which one automobile uses an IDS to keep watch on nearby vehicles. They provide a rule-based intrusion detection method that can defend against common dangers like Sybil and denial-of-service attacks.
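The rule-based style of detection surveyed above can be sketched very simply. The example below is a toy illustration, not any of the cited systems: a single rate rule flags a source as a possible denial-of-service participant when its request count in one observation window exceeds a threshold. The threshold and window are assumed parameters chosen for the demonstration.

```python
from collections import Counter

# Assumed rule parameter: requests allowed per source per observation window.
DOS_THRESHOLD = 100

def detect_dos(window_events):
    """Return sources whose request rate in one window breaks the rule.

    window_events: list of source identifiers, one entry per request seen
    during a single time window.
    """
    counts = Counter(window_events)
    return sorted(src for src, n in counts.items() if n > DOS_THRESHOLD)

events = ["10.0.0.5"] * 150 + ["10.0.0.9"] * 3
print(detect_dos(events))  # ['10.0.0.5']
```

Keeping the rule this explicit is what makes rule-based systems cheap to run and easy to audit, at the cost of the poorer coverage of novel attacks that motivates the machine-learning approaches discussed earlier in this section.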

2.3 Comparison between cloud computing and grid computing

A system in which resources are arranged in a hierarchical structure is referred to as "grid computing." This method is used to split and distribute hardware and software components across numerous users, and it is applicable in a wide range of business settings. In a grid computing system, resources may include anything from a network, a software license, a remote device, a printer, memory, disk space, or a scanner to processor cycles. Grid computing is similar to cloud computing; the cloud differs in that it is owned and operated by a single company. According to Zissis, people who take part in grid computing have a duty to share their resources with others in line with a timetable established by grid administrators. Grid computing and cloud computing share a number of characteristics: both approaches group numerous computers together and utilize their combined scaling capacity to perform one or more difficult tasks that would be too demanding for any one machine to complete on its own. The grid computing technique is ideal for scientific and educational projects due to its effectiveness and low cost.

Customers must first submit a proposal outlining the nature of the research effort and the resources needed in order to start conversations with providers concerning grid resource consumption. Grid computing tries to make use of idle computer resources that would otherwise go unexploited. The advantage of grid computing over cloud computing, according to Tripathi, is that users are not dependent on a system that prioritizes one user's processing demands above those of other users. The emphasis of cloud computing, in contrast, is on for-profit enterprises that make their services accessible to the public for a price; this approach is intended to serve businesses that either do not want to, or are unable to, manage their own computing development and management. One of the primary purposes of cloud computing, according to Yangui et al., is to split material into smaller parts that may be distributed to consumers depending on their unique interests and preferences. From a technical standpoint, grid computing merges the computing capacity of several institutions into a single, bigger system, something that is not possible with a single cloud computing architecture. These institutions are not constrained in their choice of locations, and they control who uses their computers.

Computing on a grid is often accomplished via grid middleware, a kind of software that offers generic services to hide the distributed and heterogeneous nature of the underlying infrastructure. Thanks to the middleware, execution management, data management, information services, and security services may all work together efficiently. It is standard procedure to employ an information resource to centrally keep comprehensive records of all grid assets; all services must therefore be kept up to date to run on the computers in use today. Utilizing security resources is essential to ensure that institutional internal resources are shielded from unwanted access and that regional administrative and communication regulations are followed. Solutions that enhance data availability, migration, replication, and integration are produced through the use of data management resources. Execution management, as described by Sabahi, is used to get things done by making the most of available digital assets; it is also used to control computation outcomes and keep track of work progress. In contrast, cloud computing software is often used to provide cloud-based resources. Depending on the cloud service, cloud software may appeal to a wide variety of customers. It keeps track of all the available hardware and software components so that virtual machines may be made accessible to users and managed in response to their requirements. In addition to assisting in the implementation, setup, and launch of software applications, cloud computing software may handle pricing, accounting, and user management. Utilizing computing services to their fullest potential calls for procedures and methods for deciding where to create virtual machines and when to launch and shut them down. Handa et al. argue that user management is crucial to ensuring proper resource use.
Since cloud computing shields consumers from technical challenges, it may be useful for both novice and seasoned creators. Since everything in the cloud is controlled by a single management system, implementing and administering it is straightforward.
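The decision of "where to create virtual machines" mentioned above is, at its simplest, a bin-packing problem. The sketch below uses a first-fit heuristic over a single CPU dimension; real cloud schedulers weigh many dimensions (memory, network, affinity), so this is an illustrative simplification, not how any particular platform works.

```python
def first_fit_placement(vm_demands, host_capacity):
    """Assign each VM (given as a CPU demand) to the first host with room,
    opening a new host when none fits. Returns one VM-index list per host."""
    hosts = []  # each entry: [remaining_capacity, [vm indices placed here]]
    for i, demand in enumerate(vm_demands):
        for host in hosts:
            if host[0] >= demand:
                host[0] -= demand
                host[1].append(i)
                break
        else:
            # No existing host fits: launch a fresh one for this VM.
            hosts.append([host_capacity - demand, [i]])
    return [h[1] for h in hosts]

print(first_fit_placement([4, 3, 2, 5, 1], host_capacity=8))  # [[0, 1, 4], [2, 3]]
```

Packing VMs onto fewer hosts is also what lets a cloud operator shut idle machines down, which connects placement directly to the cost reductions discussed throughout this section.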

3. Methodology

In this section, the regulatory, security, and privacy challenges of cloud computing are discussed.

3.1 Cloud computing privacy and security Issues

Cloud security uses third-party assurance and controls, much as traditional outsourcing contracts do. The fact that there is presently no established security standard for cloud computing, however, creates further problems. Because providers use various security standards, technologies, and models, it is crucial to evaluate each cloud service provider on its own merits. In a vendor cloud model, the onus is on the adopting company to collect requirements, conduct risk evaluations of the cloud service provider, and perform assurance operations to ensure that the cloud service provider is meeting the company's security standards. As a result, companies that are thinking about switching to the cloud face the same security challenges as those currently utilizing in-house management: both internal and external risks must be considered, reduced, or accepted (Basu et al., 2018). The information security concerns that adopting businesses must consider are examined in the sections that follow. These concerns can be addressed directly by designing and implementing security policies in a private cloud, or indirectly by relying on the assurance efforts of vendors or public cloud providers. We particularly concentrate on the following issues: the rules that apply to cloud-based data storage; attacks by a range of adversaries; how to respond to and prevent attacks in the cloud; the growing prevalence of cloud security issues; unsafe cloud instances; and the probability of cloud-based data loss. Broadly, cloud security issues fall into three categories: traditional security concerns, issues with availability, and concerns caused by a lack of control on the part of data owners (Basu et al., 2018).

3.1.1 Traditional security concerns

Moving to the cloud creates additional security risks, such as the potential for, or ease of, network and computer intrusion. Cloud providers contend that their security procedures are more tested and proven than those of traditional businesses, which have been sluggish to embrace cloud computing. The Jericho Forum asserts that outsourcing information management may make it easier to protect data if a company is concerned about insider risks, and using ISP contracts rather than internal rules may make security enforcement easier. This category includes the following issues (Subramanian and Jeyaraj, 2018). Virtual-machine-level attacks: the use of virtual machines (VMs) and hypervisors by cloud service providers creates a possible security concern in multitenant systems. Security flaws exist in virtual machines made by Xen, Microsoft (Virtual PC, Virtual Server), and VMware (the VMware Shared Folder bug; Microsoft Security Bulletin MS07-049). VM-level monitoring and firewalls are two examples of how Third Brigade addresses such vulnerabilities. Common faults in cloud hosting: these might be weaknesses in the underlying platform, such as SQL injection or cross-site scripting; a few flaws in Google Docs are a recent instance. Rational AppScan, a web-based security solution, is now accessible as IBM's web service vulnerability scanning tool (IBM Blue Cloud Initiative). Businesses like Salesforce.com are alerting their consumers to the growth of phishing from cloud service providers. The infrastructure required to connect to and interact with the cloud must be secured by the cloud user, since it is often outside the firewall and hence extends the attack surface of the network; an example is a cloud assault on a linked computer (Security Evaluation of Grid).
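SQL injection, one of the hosting-platform weaknesses named above, has a standard remedy: parameterized queries. The sketch below uses Python's built-in `sqlite3` module with an illustrative `users` table; the placeholder keeps attacker-controlled input out of the SQL text, so a classic injection string is treated as a literal value rather than as SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user(name):
    # The ? placeholder binds the value separately from the SQL text,
    # so "' OR '1'='1" is matched as a name, never executed as SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()

print(find_user("alice"))        # [('admin',)]
print(find_user("' OR '1'='1"))  # [] -- the injection attempt matches nothing
```

Had `find_user` built the query by string concatenation instead, the second call would have returned every row, which is exactly the class of platform flaw the scanning tools mentioned above are designed to catch.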
The company's authentication and authorization framework may not extend to cloud-based services (Tank, 2019). How does a company link its identity systems to the cloud's when adopting cloud computing? A related issue is how a company can combine any security data the cloud provides with its own security metrics and policies. The CLOIDIFIN project argues that "traditional digital forensic techniques allow investigators to collect devices and undertake in-depth analysis on the media and data retrieved", making it very improbable that an offender could erase, overwrite, delete, or otherwise destroy the evidence. A cloud computing environment, by contrast, is more akin to a smaller enterprise that owns and operates its own multi-server infrastructure: given the enormous size of the cloud and the speed at which data is rewritten, forensic recovery is a genuine cause for concern. Availability is another key worry: essential resources such as data and applications must remain reachable. Notable cloud disruptions include the extended Gmail outage in mid-October 2008, the seven-hour Amazon S3 outage on July 20, and the eighteen-hour FlexiScale outage on October 31. Substantial risks in this category include maintaining availability through temporary spikes in demand, protecting against denial-of-service attacks (particularly at single points of failure), and guaranteeing computational integrity (Tank, 2019).
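Outages like the S3 and FlexiScale incidents above are one reason cloud clients wrap remote calls in retry logic with exponential backoff, so transient unavailability does not cascade into application failure. A minimal sketch; the function names and the simulated flaky endpoint are invented for illustration:

```python
import time

def call_with_backoff(operation, max_attempts=4, base_delay=0.01):
    """Retry a flaky cloud call, doubling the wait after each failure.
    `operation` is any zero-argument callable."""
    for attempt in range(max_attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # exhausted all attempts; surface the error
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s, ...

# Simulated endpoint that fails twice before recovering.
calls = {"count": 0}
def flaky_fetch():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("service unavailable")
    return "payload"

print(call_with_backoff(flaky_fetch))  # succeeds on the third attempt
```

Backoff spreads retries out over time, which also avoids aggravating an overloaded provider, a small courtesy that matters during the kind of wide outages described above.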

3.1.2 Third Party Data management

The legal ramifications of outsourcing the storage of data and applications are murky and difficult to predict. Handing data to a third party increases the risk of losing access to and control over it. Cloud marketing often stresses the cloud's independence from any particular implementation, yet legal compliance requires transparency about where and how data is held. To gain the benefits of cloud computing without exposing their data to security or privacy threats, several companies are building their own private clouds (Tank, 2019). Even so, the following difficulties must be resolved. Can a cloud customer guarantee that their provider will comply with a subpoena or other legal demand within the designated time frame? Provability of deletion poses a comparable difficulty for a company's data retention policies: how long does data persist in the cloud after it is deleted? The loss of control in cloud computing also affects auditability. Is there sufficient access to the cloud provider's internal workings to permit auditing? Record-keeping and manual audits have traditionally provided this degree of transparency, but an on-site audit is impracticable for a dispersed, dynamic, multi-tenant computing system. Regulatory constraints may further require that information and processing remain in specified locations. Beyond the possible misalignment of interests, using another company's infrastructure can carry unanticipated legal ramifications (Tank, 2019).
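The auditability and provability-of-deletion concerns above are often approached with a tamper-evident audit trail, where each log entry incorporates a hash of the previous one so that silently removing or altering a record breaks the chain. A small sketch under that assumption; the class and field names are hypothetical:

```python
import hashlib
import json

class AuditLog:
    """Append-only log; each entry hashes its predecessor, so silent
    alteration or removal of a deletion record is detectable."""
    def __init__(self):
        self.entries = []

    def record(self, action, object_id):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"action": action, "object": object_id, "prev": prev_hash}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})

    def verify(self):
        prev = "0" * 64
        for e in self.entries:
            body = {"action": e["action"], "object": e["object"], "prev": e["prev"]}
            expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False  # chain broken: an entry was altered or dropped
            prev = e["hash"]
        return True

log = AuditLog()
log.record("store", "customer-42")
log.record("delete", "customer-42")
print(log.verify())  # True: the deletion record is intact and in sequence
```

Such a trail does not prove the bytes are physically gone from the provider's disks, but it does give an auditor evidence that a deletion was ordered and when, which is the contractual half of the problem.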

3.1.3 Cost-effective defense of availability

Availability must also be considered when facing an adversary whose primary objective is to sabotage operations. As political conflict moves online, this kind of adversary is becoming more prevalent, as seen in the cyberattacks on Lithuania (Lithuania Weathers Cyber Attack). When anything goes wrong, the result is lost productivity, eroded user confidence in the system, and the additional cost of backup procedures. As cloud computing expands, thin clients may become attractive to users. Instead of purchasing a license and downloading and installing software on their own PCs, users of cloud-based applications log in each time they want to use the service. This approach makes software piracy more difficult, enables centralized monitoring, and allows access to crucial information to be denied to untrusted clients (Sen, 2022). Greater user mobility is possible with this architecture, but it requires stronger authentication procedures: keeping data and programs in cloud storage rather than on users' local computers is expected to make users more attractive targets for phishing and credential theft. As cloud computing becomes more pervasive, data mashups that require authorization will proliferate, raising both the possibility of data breaches and the number of sources a data consumer must assemble for analysis. This imposes restrictions on who may access data and how, and in such a deployment scenario a fully centralized access control system may not be feasible. Facebook is a good example: its users share private and sensitive information, which Facebook and third-party applications use to serve their customers. Because Facebook does not regularly monitor these applications, malicious software running on its cloud infrastructure may steal sensitive data (Sen, 2022).
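The log-in-per-session model described above is commonly implemented with short-lived signed tokens: the service issues a token at login and checks its signature and expiry on every request, giving the centralized monitoring and revocation-by-expiry the paragraph mentions. A sketch using an HMAC-signed token; the secret, field names, and time-to-live are illustrative only:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"server-side-secret"  # illustrative; a real secret stays out of source control

def issue_token(user, ttl=3600, now=None):
    """Issue a short-lived token after login: payload plus HMAC signature."""
    now = int(time.time()) if now is None else now
    payload = json.dumps({"user": user, "exp": now + ttl}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def verify_token(token, now=None):
    """Return the user if the token is authentic and unexpired, else None."""
    now = int(time.time()) if now is None else now
    payload_b64, sig = token.rsplit(".", 1)
    payload = base64.urlsafe_b64decode(payload_b64)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # forged or tampered token
    claims = json.loads(payload)
    return claims["user"] if claims["exp"] > now else None  # None once expired

tok = issue_token("alice", ttl=60, now=1000)
print(verify_token(tok, now=1030))  # "alice": valid and unexpired
print(verify_token(tok, now=2000))  # None: expired, user must log in again
```

Because the token expires on its own, a stolen credential has a bounded window of usefulness, which partially offsets the phishing exposure noted above.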

3.1.4 Cloud security and privacy trends

Cloud computing environments are multi-domain settings in which each domain may have its own requirements for security, privacy, and trust and, consequently, its own procedures, interfaces, and semantics. The word "domain" here may refer to independently operated services, to additional application or infrastructure components, or to both. Service-oriented architectures (SOAs), which enable the composition and orchestration of a variety of services, can facilitate the development of such multi-domain organizations. A thorough policy-based management framework for cloud computing environments should therefore be built on the safe composition of services and on earlier research into multi-domain policy integration (Takabi, Joshi and Ahn, 2010). The following highlights some of the most urgent security and privacy issues that must be resolved if cloud computing is to be adopted at scale. Identity verification and authorization: storing data in the cloud lets customers share it swiftly and simply with other online services. An identity management (IDM) system confirms a user's identity and access rights by matching their credentials and other identifying information against a database (Mburu, Nderu and Tobias, 2019). One of the main issues with IDM in the cloud is the possibility of interoperability problems arising from the range of identity tokens and identity negotiation protocols in use. Password-based authentication as currently practiced is both limited and risky. The main purpose of an IDM system is to protect private and proprietary information about users and their operations; yet identity data privacy is poorly understood and could be jeopardized in multi-tenant cloud configurations, and the involvement of multiple jurisdictions makes effective protections harder to develop.
When interacting with other services, front-end services may need to take security measures to protect users' privacy. In multi-tenant clouds, providers must keep users' identities and authentication information separated. It is also crucial that authentication and IDM components can be connected easily to other security components. Designing and implementing reliable identity management and authentication processes is a critical need for cloud computing (Bulusu and Sudia, 2012c).
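The credential matching an IDM system performs, as described above, should never compare stored plaintext passwords. A common baseline is to store a salted, slow hash and compare in constant time; a minimal sketch with Python's standard library (the round count and salt size are illustrative choices, not a recommendation from the sources cited):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None, rounds=100_000):
    """Derive a salted PBKDF2 hash for storage; plaintext is never kept."""
    salt = os.urandom(16) if salt is None else salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    return salt, digest

def check_password(password, salt, stored_digest, rounds=100_000):
    """Recompute the hash for a login attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, rounds)
    return hmac.compare_digest(candidate, stored_digest)

salt, stored = hash_password("correct horse")
print(check_password("correct horse", salt, stored))  # True
print(check_password("wrong guess", salt, stored))    # False
```

The per-user salt defeats precomputed rainbow tables, and the deliberately slow key derivation raises the cost of brute force if a multi-tenant credential store leaks.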

3.2 Emerging trends in privacy and data in cloud computing

The variety and diversity of cloud services, and the differing access needs of their domains, make fine-grained access control mechanisms necessary. In particular, the principle of least privilege must be upheld, and access control services must be flexible enough to accommodate dynamic, context-, attribute-, or credential-based access requirements. Such access control systems may also have to capture the privacy-protection requirements stated in complicated legislation. The cloud access control system should be simple to use, and the distribution of rights should be carefully considered. To allay worries about access from various domains, cloud delivery models must give users a policy-neutral framework for defining and enforcing access control. The construction of a privacy-aware framework for access control and accounting services suited to compliance monitoring is a crucial problem that has not received enough academic attention. In the cloud, several service providers coexist and collaborate to deliver a range of services, yet each may use distinctive security and privacy procedures, making it difficult to manage trust and integrate policies; we must therefore deal with the reality that these providers' policies can be mutually inconsistent. To deliver more complex application services, cloud providers may need to combine a variety of separate solutions. Mechanisms are therefore required to ensure that such dynamic cooperation is maintained effectively and that security breaches are monitored adequately throughout the interoperation. Prior studies have shown that security breaches may occur during integration even when each domain's policies are carefully examined in isolation (Hosseinian-Far, Ramachandran and Slack, 2017). Service providers must thus keep an eye on their access control policies to ensure that the blending of policies does not jeopardize data security.
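The attribute- and context-based access requirements above can be made concrete with a small ABAC-style policy evaluator: each rule names the actions it covers, the subject attributes it demands, and the context conditions under which it applies, with default deny as the least-privilege baseline. All rule and attribute names below are invented for illustration:

```python
def is_allowed(policy, subject, action, context):
    """Grant access only if some rule covers the action AND all of its
    subject-attribute and context conditions hold."""
    for rule in policy:
        if action not in rule["actions"]:
            continue
        if all(subject.get(k) == v for k, v in rule["subject"].items()) and \
           all(context.get(k) == v for k, v in rule["context"].items()):
            return True
    return False  # default deny: the least-privilege baseline

policy = [
    {"actions": {"read"},
     "subject": {"role": "analyst", "clearance": "high"},
     "context": {"network": "corporate"}},
]

alice = {"role": "analyst", "clearance": "high"}
print(is_allowed(policy, alice, "read", {"network": "corporate"}))   # True
print(is_allowed(policy, alice, "read", {"network": "public"}))      # False: wrong context
print(is_allowed(policy, alice, "write", {"network": "corporate"}))  # False: no rule grants write
```

Because the decision depends on attributes and context rather than a fixed identity list, the same evaluator can serve the policy-neutral, multi-domain enforcement framework the paragraph calls for.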
In cloud computing, interactions across service domains may be brief, intensive, and driven by the need to deliver a particular service. A trust framework must therefore accommodate shifting trust and interaction/sharing requirements by effectively capturing a broad range of attributes. The cloud's policy integration must additionally overcome challenges such as semantic heterogeneity, safe interoperability, and the management of policy evolution. Because consumer behavior is dynamic, an adaptive policy integration framework is needed to build, negotiate, and maintain trust. Efficient trust management frameworks have been extensively researched for mobile and P2P networks (Sen, 2022), but the need for safe and reliable trust models suited to cloud computing environments remains urgent, and the growing use of varied cloud service delivery techniques, with the interoperability issues they bring, will make the problem harder to resolve. Secure service management: in cloud computing settings, service integrators and cloud service providers collaborate to deliver custom solutions for their customers. Independent service providers may collaborate on the service integrator's platform, which supports the orchestration and interworking of services, to meet their customers' security requirements. Many cloud providers describe their offerings with the Web Services Description Language (WSDL), yet WSDL is insufficient to fully describe cloud computing services: concerns such as service quality, pricing, and service level agreements (SLAs) are crucial when discovering and composing cloud services.
To properly characterize services and their capabilities, offer the most interoperable alternatives, integrate services without breaking the service owner's rules, and guarantee adherence to service level agreements, all of these concerns must be addressed (Takabi, Joshi and Ahn, 2010). In short, it is critical to establish a reliable, systematic framework for provisioning and composing services that takes confidentiality and privacy into account.
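The claim that WSDL alone cannot carry quality, pricing, and SLA information can be illustrated with a richer service descriptor and a selection step that filters on SLA terms before comparing price. The field names and figures below are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class ServiceOffer:
    """A descriptor carrying the SLA and pricing fields that a plain
    WSDL interface description omits (all values illustrative)."""
    name: str
    availability: float   # promised uptime fraction, e.g. 0.999
    latency_ms: int       # promised p95 latency
    price_per_hour: float

def pick_offer(offers, min_availability, max_latency_ms):
    # Keep only offers that satisfy the SLA, then take the cheapest.
    viable = [o for o in offers
              if o.availability >= min_availability and o.latency_ms <= max_latency_ms]
    return min(viable, key=lambda o: o.price_per_hour) if viable else None

offers = [
    ServiceOffer("vendor-a", 0.999,  120, 0.40),
    ServiceOffer("vendor-b", 0.9999,  80, 0.65),
    ServiceOffer("vendor-c", 0.99,    60, 0.20),
]
best = pick_offer(offers, min_availability=0.999, max_latency_ms=150)
print(best.name)  # vendor-a: the cheapest offer that still meets the SLA
```

The cheapest raw offer (vendor-c) loses because it misses the availability floor, which is exactly the kind of trade-off a purely functional interface description cannot express.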

3.2.1 Privacy and data protection

Cloud computing's privacy challenges revolve around safeguarding sensitive data such as user identities, policy components, and transaction histories. For many organizations, the thought of storing data and applications on equipment outside their own data centers is daunting (Varghese and Buyya, 2018). When private customer information is moved to a shared infrastructure, it becomes more exposed to attack. Cloud service providers therefore have a responsibility to reassure customers and to be highly transparent about their business practices and privacy assurances. Privacy safeguards must be part of any strategy for protecting cloud data. A significant challenge in this field is tracking the provenance of data, from its original creator through any later editors and their alterations; provenance has many uses, including tracing, auditing, and granting rights based on previous use. Because clouds eliminate physical boundaries, it is hard to strike a balance between data provenance and privacy, and this remains a serious open research issue. Implementation of security policies in organizations: when an enterprise moves its operations to the cloud, its security management and information security lifecycle models change significantly. Shared governance in particular can become a major issue if it is managed poorly. Although cloud computing offers many potential benefits, it can also make it harder for the different communities of interest inside client organizations to collaborate. An organization's reliance on third parties may raise questions about the effectiveness of its business continuity and disaster recovery strategies, and about its chances of a quick response to security incidents.
Opinions from those third parties will also be needed on matters of risk and cost-benefit analysis (Sun, 2019). In a perimeter-less setting, therefore, users must consider threats such as data leakage inside multi-tenant clouds and resilience concerns such as their provider's financial health and exposure to local disasters. The danger of insider attack also rises when data and processes move to the cloud, and a targeted attack on one tenant of a multi-tenant system can have far-reaching effects on all the others. Ensuring that consumers benefit from cloud computing requires rethinking existing techniques for lifecycle modeling, risk analysis and management, penetration testing, and service attestation. The field of information security has long struggled to identify appropriate metrics for evaluating and assessing risk; to guarantee the rollout and widespread use of secure clouds, we should examine best practices and develop standards. These concerns call for a more structured cyber insurance market, though building one is a formidable task given the worldwide nature of cloud computing. Cloud-specific advances, as well as larger trends in the information technology industry, will shape how cloud computing services evolve in their future offerings, architectures, and innovations (Sun, 2019).
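The tension between provenance and privacy raised above can be prototyped by recording a data item's full edit trail while storing only salted pseudonyms of the actors, so auditors can see how many distinct parties touched the data without learning who they are. Everything in this sketch (the salt, the record structure, the identities) is hypothetical:

```python
import hashlib

def pseudonym(identity, salt="org-salt"):  # salt value is illustrative
    """Stable, salted pseudonym: same actor maps to the same token,
    but the token alone does not reveal the identity."""
    return hashlib.sha256((salt + identity).encode()).hexdigest()[:12]

def new_record(creator, content):
    """Start a provenance trail: the originator, then every later editor."""
    return {"content": content, "trail": [("created", pseudonym(creator))]}

def edit_record(record, editor, new_content):
    record["content"] = new_content
    record["trail"].append(("edited", pseudonym(editor)))

doc = new_record("alice@example.com", "draft v1")
edit_record(doc, "bob@example.com", "draft v2")
# Two trail entries from two distinct (pseudonymous) actors.
print(len(doc["trail"]), doc["trail"][0][1] != doc["trail"][1][1])
```

This only dents the problem: an authorized party holding the salt can still re-identify actors, which mirrors the paper's point that balancing provenance against privacy in a boundary-less cloud remains open.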

4. Results and discussion

Along with a concise explanation of the problems and some potential remedies, the following presents future trends in cloud computing deployment.

  • Uploaded By : Katthy Wills
  • Posted on : February 28th, 2023