
Journal articles on the topic 'LEADING CLOUD PROVIDER'



Consult the top 50 journal articles for your research on the topic 'LEADING CLOUD PROVIDER.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Basu, Aveek, and Sanchita Ghosh. "Implementing Fuzzy TOPSIS in Cloud Type and Service Provider Selection." Advances in Fuzzy Systems 2018 (November 15, 2018): 1–12. http://dx.doi.org/10.1155/2018/2503895.

Abstract:
Cloud computing can be considered one of the leading-edge technological advances in the current IT industry. Cloud computing, or simply the cloud, builds on service-oriented architecture. Every organization is trying to utilize the benefits of the cloud, not only to reduce the cost overhead of infrastructure, network, hardware, software, etc., but also to provide seamless, scalable service to end users. The concept of multitenancy helps cloud service providers spread costs by serving multiple users/companies at the same time via shared resources. There are several cloud service providers currently in the market, and they are rapidly changing and reorienting themselves as per market demand. To gain market share, cloud service providers try to offer the latest technology to end users/customers at reduced cost. In such a scenario, it becomes extremely difficult for cloud customers to select the best service provider for their requirements. It is also difficult to decide on a deployment model among the existing ones: different deployment models suit different companies, and their divergent criteria are not tailor-made for any one organization. As a cloud customer, it is hard to decide on the model and determine the appropriate service provider. A multicriteria decision-making method is applied to find the most suitable service provider among the top four existing companies and to choose the deployment model as per requirement.
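The multicriteria method the authors implement is fuzzy TOPSIS; as a rough illustration of the underlying (crisp) TOPSIS ranking step, the sketch below scores four hypothetical providers against weighted criteria. All matrix values, weights and provider names are invented for illustration, not taken from the paper.

```python
import math

# Hypothetical decision matrix: rows = cloud providers, columns = criteria
# (e.g. cost, scalability, security, support). All values are illustrative.
providers = ["A", "B", "C", "D"]
X = [
    [0.7, 0.9, 0.8, 0.6],
    [0.8, 0.7, 0.9, 0.7],
    [0.6, 0.8, 0.7, 0.9],
    [0.9, 0.6, 0.6, 0.8],
]
weights = [0.3, 0.3, 0.2, 0.2]          # assumed criterion weights
benefit = [False, True, True, True]     # cost: smaller is better

m, n = len(X), len(X[0])
# 1. Vector-normalize each column and apply the weights.
norms = [math.sqrt(sum(X[i][j] ** 2 for i in range(m))) for j in range(n)]
V = [[weights[j] * X[i][j] / norms[j] for j in range(n)] for i in range(m)]

# 2. Ideal and anti-ideal values per criterion.
cols = list(zip(*V))
ideal = [max(c) if b else min(c) for c, b in zip(cols, benefit)]
anti = [min(c) if b else max(c) for c, b in zip(cols, benefit)]

# 3. Closeness coefficient: distance to anti-ideal / total distance.
def dist(row, ref):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ref)))

closeness = [dist(v, anti) / (dist(v, anti) + dist(v, ideal)) for v in V]
ranking = sorted(providers, key=lambda p: -closeness[providers.index(p)])
print(ranking, [round(c, 3) for c in closeness])
```

The fuzzy variant in the paper replaces the crisp matrix entries with fuzzy numbers before the same ideal/anti-ideal distance computation.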
2

Malouf, Lina Samir. "Towards Query Processing Over Homomorphic Encrypted Data in Cloud." Association of Arab Universities Journal of Engineering Sciences 26, no. 4 (December 31, 2019): 65–72. http://dx.doi.org/10.33261/jaaru.2019.26.4.008.

Abstract:
With data growing very fast, the need to store and manage data in the cloud in a secure way is rapidly increasing, leading developers to find secure data-management solutions through new technologies. One of the most advanced technologies at present is cloud computing, which functions as an online service. Cloud computing relies on an external provider to deliver on-demand services online. On the other hand, it is pay-per-use technology, which means that the user must pay for each service provided by the provider. Looking back at the literature, we find that regular database management systems with query-processing capabilities do not meet the requirements of cloud computing. This paper focuses on homomorphic encryption, which is used primarily for data security in the cloud. Homomorphic encryption is an encryption technique in which specific operations can be performed directly on encrypted data.
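As a toy illustration of the homomorphic property such schemes rely on: textbook RSA (without padding) is multiplicatively homomorphic, so multiplying two ciphertexts yields a ciphertext of the product, letting a server compute on data it cannot read. The parameters below are deliberately tiny and insecure; this only demonstrates the principle, not the paper's scheme.

```python
# Toy demonstration of a (partially) homomorphic cipher using textbook RSA:
# E(a) * E(b) mod n decrypts to a * b, so the server multiplies ciphertexts
# without ever seeing the plaintexts. Insecure toy parameters for clarity.
p, q = 61, 53
n = p * q                           # modulus 3233
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
product_cipher = (encrypt(a) * encrypt(b)) % n  # computed on ciphertexts only
assert decrypt(product_cipher) == a * b          # the server never saw 7 or 6
print(decrypt(product_cipher))  # 42
```

Fully homomorphic schemes extend this so that both addition and multiplication (and hence general query predicates) can be evaluated over ciphertexts.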
3

Singh, Jitendra, and Kamlesh Kumar Raghuvanshi. "Regulations and Standards in Public Cloud." Journal of Information Technology Research 13, no. 3 (July 2020): 21–36. http://dx.doi.org/10.4018/jitr.2020070102.

Abstract:
Security is a critical issue, particularly in the public cloud, as it rests with the cloud providers. During security implementation, prevailing security threats and regulatory standards are borne in mind. Regulatory compliance varies from one cloud provider to another according to their maturity and the location of the data center. Thus, subscribers need to verify that the security requirements meeting their objectives are the ones implemented by the public cloud provider. To this end, subscribers currently need to visit each cloud provider's site to view its compliance. This is a time-consuming activity, and the information is often difficult to locate on a website. This work presents the prominent security standards suggested by leading security institutions, including NIST, CSA, ENISA and ISO, that are applicable to the public cloud. A centrally driven scheme is proposed to let subscribers know which regulations and standards apply according to the services they need. The availability of an exhaustive list in one place will reduce users' hassle at subscription time.
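At its core, the centrally driven scheme amounts to a lookup from a subscriber's service needs to the applicable standards. A minimal sketch, with an illustrative mapping (the entries below are examples, not an authoritative compliance matrix):

```python
# Minimal sketch of a central registry mapping service needs to applicable
# standards. The mapping is illustrative only.
REGISTRY = {
    "data-storage": ["ISO/IEC 27001", "ISO/IEC 27017", "NIST SP 800-53"],
    "healthcare": ["HIPAA", "ISO/IEC 27018"],
    "payments": ["PCI DSS"],
    "eu-personal-data": ["GDPR", "ENISA cloud security guidelines"],
}

def applicable_standards(service_needs):
    """Return de-duplicated standards for a list of service needs."""
    found = []
    for need in service_needs:
        for std in REGISTRY.get(need, []):
            if std not in found:
                found.append(std)
    return found

print(applicable_standards(["data-storage", "payments"]))
```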
4

R, Natarajan. "Telecom Operations in Cloud Computing." International Journal for Research in Applied Science and Engineering Technology 10, no. 1 (January 31, 2022): 1323–26. http://dx.doi.org/10.22214/ijraset.2022.40051.

Abstract:
Previous generations of wireless connectivity focused on voice and data capabilities; 5G is architected to better enable consumer business models. Edge compute (both on-prem/device edge and provider edge) and related services to support 5G use cases appear to be the leading driver behind recent announcements. These use cases will need to be managed by OSS/BSS for the telco operators and their customers. Keywords: AWS Telecom, Google Telecom, VMware Telecom, Red Hat Telecom
5

Kouatli, Issam. "People-process-performance benchmarking technique in cloud computing environment." International Journal of Productivity and Performance Management 69, no. 9 (May 23, 2019): 1955–72. http://dx.doi.org/10.1108/ijppm-04-2017-0083.

Abstract:
Purpose – Cloud computing is a relatively new type of technology demanding new management techniques to attain security and privacy, leading to customer satisfaction on the "business protection" measure. As cloud computing businesses are usually composed of multiple colocation sites/departments, the purpose of this paper is to propose a benchmark operation to measure and compare the overall integrated people-process-performance (PPP) among different departments within a cloud computing organization, and to motivate staff/units to improve process performance and meet standards through competition among business units. Design/methodology/approach – The research was conducted at Cirrus Ltd, a cloud computing service provider, with a focus group consisting of six IT professionals/managers. The objective of the focus group was to investigate the proposed technique by selecting the relevant best-practice criteria, with the relevant sub-criteria, as a benchmarking performance tool to measure PPP via an analytic hierarchy process (AHP) approach. The standard pairwise comparative AHP scale was used to measure the performance of three different teams: the production team, the user acceptance testing team and the development team. Findings – Based on best-practice performance measurement of cloud computing (reviewed in this paper), the proposed AHP model was implemented in a local medium-sized cloud service provider named "Cirrus" with a single-site data center. The actual criteria relevant to Cirrus were an adaptation of the "best practice" described in the literature. The main reason for the adaptation was that the principle of PPP assumes multiple departments/datacenters located in different geographical areas in large service providers; as Cirrus is an SME, performance measurement was adapted to teams within the same data center location. Irrespective of this adaptation, measuring vendor KPIs using the AHP technique as a specific output of PPP remains valid. Practical implications – This study provides guidance for achieving cloud computing performance measurement using the AHP technique; the proposed technique is an integrated model to measure PPP in a monitored cloud environment. Originality/value – The proposed technique measures and manages the performance of cloud service providers and implicitly acts as a catalyst to attain trust in such a highly information-sensitive environment, leading to organizational effectiveness in managing cloud organizations.
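The AHP step described above can be sketched as follows: derive a priority vector from a pairwise comparison matrix (via the row geometric-mean approximation of the principal eigenvector) and check consistency. The matrix values below are illustrative, not the paper's data.

```python
# Pairwise comparison matrix on the standard 1-9 AHP scale for three
# hypothetical teams (production, UAT, development); values are illustrative.
A = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 0.5, 1.0],
]
n = len(A)

# Approximate the principal eigenvector with the row geometric-mean method.
gm = [(row[0] * row[1] * row[2]) ** (1.0 / n) for row in A]
priorities = [g / sum(gm) for g in gm]

# Consistency check: estimate lambda_max, then CI and CR
# (Saaty's random index RI = 0.58 for n = 3; CR < 0.1 is acceptable).
Aw = [sum(A[i][j] * priorities[j] for j in range(n)) for i in range(n)]
lam = sum(Aw[i] / priorities[i] for i in range(n)) / n
CI = (lam - n) / (n - 1)
CR = CI / 0.58
print([round(p, 3) for p in priorities], round(CR, 3))
```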
6

Yuvaraj, Mayank. "Security threats, risks and open source cloud computing security solutions for libraries." Library Hi Tech News 32, no. 7 (September 7, 2015): 16–18. http://dx.doi.org/10.1108/lhtn-04-2015-0026.

Abstract:
Purpose – In recent years, a large number of organizations have found that cloud computing has many advantages, leading to a surge in its adoption. Design/methodology/approach – Cloud computing involves the use of large servers for the access, storage and manipulation of data, as well as the provisioning of other services. Findings – When infrastructure, applications, data and storage are hosted by cloud providers, there are huge security risks associated with each type of service offered. Originality/value – There are a number of considerations, apart from cost, which must be evaluated before choosing any particular provider. Sometimes the physical location of the servers may also be a factor to consider, if sensitive data are involved.
7

Murad, Salah Eddin, and Salah Dowaji. "Service value optimization of cloud hosted systems using particle swarm technique." Journal of Enterprise Information Management 29, no. 4 (July 11, 2016): 612–26. http://dx.doi.org/10.1108/jeim-02-2015-0018.

Abstract:
Purpose – Cloud computing has become a promising technology with potential opportunities, reducing the high cost of running traditional business applications and leading to new business models. Nonetheless, this technology is fraught with many challenges. From a Software as a Service (SaaS) provider's perspective, deployment choices are one of the major perplexing issues in determining the degree to which the application owner's objectives are met while considering their customers' targets. The purpose of this paper is to present a new model that allows the service owner to optimize resource selection based on defined metrics when responding to many customers with various priorities. Design/methodology/approach – More than 65 academic papers were collected, a short list of the 35 most relevant papers was reviewed, and the functionality of major cloud systems was assessed. A potential set of techniques was investigated to determine the most appropriate ones. Moreover, a new model was built and a study of different simulation platforms was conducted. Findings – The findings demonstrate that serving many SaaS customer requests, with different agreements and expected outcomes, involves mutual influences that affect the overall provider objectives. Furthermore, this paper investigates how tagging those customers with various priorities, reflecting their importance to the provider, permits controlling and aligning the selection of computing resources with the current objectives and defined priorities. Research limitations/implications – This study provides researchers with a useful literature review of the subject. Additionally, it uses a value-based approach and the particle swarm technique to model and solve the optimization of computing-resource selection, considering the business objectives of both stakeholders, providers and customers. The study derives priorities for a number of factors by which service providers can make strong and adaptive decisions. Practical implications – The paper includes implications on how the SaaS provider can decide which virtual machine types to select, driven by its own preferences. Originality/value – This paper rests on the use of particle swarm optimization to optimize the business value of the service provider, together with a value-based approach. This helps model that value so as to combine the total profit of the provider with customer satisfaction, based on the budget and processing time agreed with the customer. An additional approach uses a customer severity factor that allows the provider to reflect customer importance when making the placement decision.
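A minimal particle swarm optimization loop, in the spirit of the paper's resource-selection optimizer but reduced to one decision variable and a toy cost function (all parameters, and the cost function itself, are assumptions made for illustration):

```python
import random

def cost(x):
    # Toy stand-in for the provider's objective (e.g. a cost vs. customer
    # satisfaction trade-off); the model in the paper is far richer.
    return (x - 3.0) ** 2 + 1.0

random.seed(0)
n, iters, w, c1, c2 = 10, 50, 0.7, 1.5, 1.5
pos = [random.uniform(-10, 10) for _ in range(n)]
vel = [0.0] * n
pbest = pos[:]                   # each particle's best-known position
gbest = min(pos, key=cost)       # swarm's best-known position

for _ in range(iters):
    for i in range(n):
        r1, r2 = random.random(), random.random()
        # Inertia + pull toward personal best + pull toward global best.
        vel[i] = (w * vel[i]
                  + c1 * r1 * (pbest[i] - pos[i])
                  + c2 * r2 * (gbest - pos[i]))
        pos[i] += vel[i]
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i]
        if cost(pos[i]) < cost(gbest):
            gbest = pos[i]

print(round(gbest, 2))  # converges near the optimum x = 3
```

In the paper, each particle position encodes a VM-selection decision and the fitness folds in budget, processing time and the customer severity factor.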
8

Lowe, Devesh, and Bhavna Galhotra. "An Overview of Pricing Models for Using Cloud Services with analysis on Pay-Per-Use Model." International Journal of Engineering & Technology 7, no. 3.12 (July 20, 2018): 248. http://dx.doi.org/10.14419/ijet.v7i3.12.16035.

Abstract:
The emergence of cloud computing in recent years has provided end users with various options for cloud services. Different end users have different requirements for cloud services such as IaaS, PaaS and SaaS, and these services can be availed under different pricing mechanisms, such as PPU, PFR, lease-based, subscription-based and dynamic pricing, based on factors such as initial cost, lease period, QoS, age of resources and cost of maintenance. The authors' work focuses on the pay-per-use model of cloud pricing, studying various aspects of this model and comparing the current pricing rates of leading cloud service providers. Through this paper, the authors analyse the pricing model used by a provider by comparing similar pricing offered by competitors, and try to establish the fairness of pricing as a basis for designing a better model for such services. The idea of pay-per-use has emerged to counter rampant software piracy while capturing the marginal and heterogeneous users who have often been found to use pirated software, as the acquisition cost of perpetual usage is too high; marginal usage does not justify the huge capital investment of a perpetual license, thereby leading to software piracy. The authors also discuss the pay-per-use SaaS model and how it improves on perpetual licensing. Pay-per-use primarily depends on certain market conditions: higher potential for piracy, lower inconvenience costs, a majority of marginal users, and a strong cloud network presence. Whereas perpetual licensing is important for heavy users, a market with the above-mentioned conditions will always benefit the SaaS pay-per-use model. So while the developer gains an increased authorized-user network, lower marketing costs, enhanced customer reliability and a lesser impact of piracy, the rewards for users are even greater, as they get to use licensed, fully updated versions for a small fee, even for minor everyday usage, without incurring the huge expenditure of acquisition.
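The pay-per-use versus perpetual-license choice described above reduces to a break-even calculation on expected usage; the figures below are purely illustrative, not real provider prices.

```python
# Break-even sketch: pay-per-use vs. perpetual license.
# All figures are illustrative assumptions.
perpetual_cost = 500.0   # one-off perpetual license
price_per_use = 2.5      # fee charged per use/session

def cheaper_option(expected_uses: int) -> str:
    """Return the cheaper licensing model for a given expected usage."""
    return "pay-per-use" if expected_uses * price_per_use < perpetual_cost else "perpetual"

break_even = perpetual_cost / price_per_use  # uses at which the models cost the same
print(break_even, cheaper_option(50), cheaper_option(400))
```

This is why the abstract predicts marginal users favour pay-per-use while heavy users stay with perpetual licensing: everything hinges on which side of the break-even point their usage falls.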
9

Mangalampalli, Sudheer, Ganesh Reddy Karri, and Ahmed A. Elngar. "An Efficient Trust-Aware Task Scheduling Algorithm in Cloud Computing Using Firefly Optimization." Sensors 23, no. 3 (January 26, 2023): 1384. http://dx.doi.org/10.3390/s23031384.

Abstract:
Task scheduling in the cloud computing paradigm poses a challenge for researchers, as the workloads that come onto cloud platforms are dynamic and heterogeneous; scheduling these heterogeneous tasks to the appropriate virtual resources is therefore a huge challenge. Inappropriate assignment of tasks to virtual resources degrades the quality of service, leads to violation of SLA metrics, and ultimately degrades the cloud user's trust in the cloud provider. Therefore, to preserve trust in the cloud provider and to improve the scheduling process in the cloud paradigm, we propose an efficient task scheduling algorithm that considers the priorities of tasks as well as of virtual machines, thereby scheduling tasks accurately to appropriate VMs. This scheduling algorithm is modeled using firefly optimization. The workload for this approach uses fabricated datasets with different distributions as well as the real-time worklogs of HPC2N and NASA. The algorithm was implemented in a CloudSim simulation environment, and our proposed approach was compared against the baseline approaches of ACO, PSO and the GA. The simulation results revealed that our proposed approach has a significant impact over the baselines, minimizing the makespan while improving availability, success rate and turnaround efficiency.
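A minimal continuous firefly optimization loop for intuition: the paper maps fireflies to discrete task-to-VM assignments, which this sketch abstracts into a single real variable and a toy makespan-like objective (all parameters are assumptions):

```python
import math
import random

def makespan(x):
    # Toy stand-in for the scheduling objective; in the paper a firefly
    # encodes a whole task-to-VM assignment rather than one real number.
    return (x - 2.0) ** 2

random.seed(1)
n, iters = 8, 60
beta0, gamma, alpha = 1.0, 0.1, 0.2   # attraction, light absorption, jitter
x = [random.uniform(-5, 5) for _ in range(n)]

for _ in range(iters):
    for i in range(n):
        for j in range(n):
            if makespan(x[j]) < makespan(x[i]):   # firefly j is "brighter"
                r = abs(x[i] - x[j])
                beta = beta0 * math.exp(-gamma * r * r)
                # Move i toward j, attraction decaying with distance,
                # plus a small random exploration step.
                x[i] += beta * (x[j] - x[i]) + alpha * (random.random() - 0.5)

best = min(x, key=makespan)
print(round(best, 2))  # converges near the optimum x = 2
```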
10

Basu, Srijita, Sandip Karmakar, and Debasish Bera. "Securing Cloud Virtual Machine Image Using Ethereum Blockchain." International Journal of Information Security and Privacy 16, no. 1 (January 2022): 1–22. http://dx.doi.org/10.4018/ijisp.295868.

Abstract:
A Virtual Machine Image (VMI) is the building block of cloud infrastructure; it encapsulates the various applications and data deployed at the Cloud Service Provider (CSP) end. With the leading advances of cloud computing comes the added concern of its security: securing the cloud infrastructure as a whole rests on the security of the underlying Virtual Machine Images (VMIs). In this paper, an attempt has been made to highlight the various risks faced by the CSP and the Cloud Service Consumer (CSC) in the context of VMI-related operations. A formal model of the cloud infrastructure is then proposed. Finally, the Ethereum blockchain is incorporated to secure, track and manage all the vital operations on the VMIs. The immutable and decentralized nature of blockchain not only makes the proposed scheme more reliable but also guarantees auditability of the system by maintaining the entire VMI history on the blockchain.
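The integrity mechanism can be illustrated in miniature: fingerprint a VMI with SHA-256 and record the digest in an append-only log. The paper stores such records in an Ethereum smart contract; here a local list stands in for the chain, purely to show the register/verify cycle.

```python
import hashlib

# Append-only log of (vmi_id, sha256 digest) records. In the paper this
# history lives in an Ethereum smart contract, which makes it immutable;
# a local list is only a stand-in to illustrate the integrity check.
ledger = []

def register_vmi(vmi_id: str, image_bytes: bytes) -> str:
    """Fingerprint a VM image and append the record to the ledger."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    ledger.append((vmi_id, digest))
    return digest

def verify_vmi(vmi_id: str, image_bytes: bytes) -> bool:
    """Check that the image matches a previously registered fingerprint."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return any(i == vmi_id and d == digest for i, d in ledger)

original = b"...VMI contents..."
register_vmi("vmi-001", original)
print(verify_vmi("vmi-001", original))               # True
print(verify_vmi("vmi-001", original + b"tamper"))   # False: image altered
```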
11

Achar, Sandesh. "Cloud and HPC Headway for Next-Generation Management of Projects and Technologies." Asian Business Review 10, no. 3 (2020): 187–92. http://dx.doi.org/10.18034/abr.v10i3.637.

Abstract:
In the last decade, cloud computing has changed dramatically. More providers and service offerings have entered the market, and cloud infrastructure, once limited to single-provider data centers, is expanding. This article discusses the shifting cloud foundation and the benefits of decentralizing computing away from data centers. These trends necessitate novel cloud computing architectures, which may affect connecting people and devices, data-intensive computing, the service space and self-learning frameworks. Finally, we compiled a list of issues to consider when assessing modern cloud frameworks. Architectural and urban design projects breach scale and predictability constraints and seek enhanced competency, maintainability, energy performance and cost-efficiency. Simulation and large-scale information processing drive this cycle. Advances in algorithms and computing power help address the complex elements of a coordinated whole-structure framework. Adaptability is a barrier to the configuration, control and development of whole-system frameworks. This position paper proposes several solutions for semi- or fully automated projects, such as early design-space exploration, large-scope high-accuracy simulation and integrated multidisciplinary development. These compute-intensive operations were previously accessible only to the research community. Once empowered by cloud computing and high-performance computing, these methods can stimulate intelligent design measures, leading to enhanced results and shorter development times.
12

Balagoni, Yadaiah, et al. "AN INTEGRATED FRAMEWORK FOR SLA-AWARE MULTI-OBJECTIVE TASK SCHEDULING IN CLOUD COMPUTING." INFORMATION TECHNOLOGY IN INDUSTRY 9, no. 2 (April 10, 2021): 913–28. http://dx.doi.org/10.17762/itii.v9i2.428.

Abstract:
Cloud services are offered to consumers based on Service Level Agreements (SLAs) signed between the Cloud Service Provider (CSP) and the consumer. Due to on-demand provisioning of resources, there is exponential growth in cloud consumers. Job scheduling is one of the areas that has attracted researchers seeking to improve the performance of cloud management systems. Along with on-premise infrastructure, Small and Medium Enterprises (SMEs) also depend on public cloud infrastructure (leading to hybrid cloud) for seamless continuity of their businesses. In this context, ensuring SLAs and effectively managing hybrid cloud resources are major challenges. Hence, there is a need for an effective scheduling algorithm that considers multiple objective functions, such as SLA (deadline), cost and energy, while making scheduling decisions. Most state-of-the-art schedulers in hybrid cloud environments consider a single objective function; in the real world, however, this is inadequate for scheduling effectiveness. To overcome this problem, we propose an integrated framework that ensures SLAs (deadline), cost effectiveness and energy efficiency with an underlying scheduling algorithm known as SCE-TS. The algorithm is evaluated with different workloads and SLAs on a cloud platform. The empirical study revealed that the proposed framework improves scheduling efficiency in terms of meeting SLAs, cost and energy efficiency, and is effective in making scheduling decisions in cloud environments compared with the state of the art.
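A multi-objective scheduling decision of this kind can be sketched as a weighted score over deadline slack, cost and energy for each candidate VM. This is an illustrative scoring rule with invented numbers, not the paper's SCE-TS algorithm.

```python
# Sketch of a multi-objective (deadline, cost, energy) placement decision.
# Candidate figures and weights are illustrative assumptions.
candidates = {
    # vm: (estimated_finish_s, cost_per_task, energy_joules)
    "on-prem-vm":  (95.0, 0.10, 40.0),
    "public-vm-a": (60.0, 0.25, 55.0),
    "public-vm-b": (80.0, 0.15, 50.0),
}
deadline = 90.0
w_sla, w_cost, w_energy = 0.5, 0.3, 0.2

def score(finish, cost, energy):
    if finish > deadline:          # hard SLA violation: reject outright
        return float("-inf")
    slack = (deadline - finish) / deadline   # normalized deadline headroom
    return w_sla * slack - w_cost * cost - w_energy * (energy / 100.0)

best_vm = max(candidates, key=lambda v: score(*candidates[v]))
print(best_vm)
```

Here the on-prem VM is excluded for missing the deadline, and the remaining candidates trade cost against energy and slack.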
13

Niranjanamurthy, M., H. K. Yogish, K. L. Abhishek, and M. P. Amulya. "Progression of Information Sharing, Managing and Storage Using Cloud Environment." Journal of Computational and Theoretical Nanoscience 17, no. 9 (July 1, 2020): 4525–30. http://dx.doi.org/10.1166/jctn.2020.9329.

Abstract:
Nowadays every big and small IT industry has at least one application using cloud technology. In 2016, Amazon Web Services, a leading public cloud provider, generated $12.2 billion in net sales, a figure that increases each year. The proposed cloud framework is a service wherein data is remotely managed, maintained and backed up; the service is available to users over a network, typically the Internet. The framework offers two options for users to manage files: one is Cloudinary, the other is within the application. Users have the option to back up their data and files, and can upload and create files online, placing them in a preferred location. In particular, backups become more portable and convenient when files are available on the Internet, and users can access their data through a variety of devices. The application also implements a chat system that lets users chat with other users, both in groups and one-to-one. It is accessible from any digital device with a web browser and Internet connectivity. Users can additionally rename files, delete files from their account, and export files.
14

Kollias, P., N. Bharadwaj, E. E. Clothiaux, K. Lamer, M. Oue, J. Hardin, B. Isom, et al. "The ARM Radar Network: At the Leading Edge of Cloud and Precipitation Observations." Bulletin of the American Meteorological Society 101, no. 5 (May 1, 2020): E588—E607. http://dx.doi.org/10.1175/bams-d-18-0288.1.

Abstract:
Improving our ability to predict future weather and climate conditions is strongly linked to achieving significant advancements in our understanding of cloud and precipitation processes. Observations are critical to making these advancements because they both improve our understanding of these processes and provide constraints on numerical models. Historically, instruments for observing cloud properties have limited cloud–aerosol investigations to a small subset of cloud-process interactions. To address these challenges, the last decade has seen the U.S. DOE ARM facility significantly upgrade and expand its surveillance radar capabilities toward providing holistic and multiscale observations of clouds and precipitation. These upgrades include radars that operate at four frequency bands covering a wide range of scattering regimes, improving upon the information contained in earlier ARM observations. The traditional ARM emphasis on the vertical column is maintained, providing more comprehensive, calibrated, and multiparametric measurements of clouds and precipitation. In addition, the ARM radar network now features multiple scanning dual-polarization Doppler radars to exploit polarimetric and multi-Doppler capabilities that provide a wealth of information on storm microphysics and dynamics under a wide range of conditions. Although the diversity in wavelengths and detection capabilities is unprecedented, there is still considerable work ahead before the full potential of these radar advancements is realized. This includes synergy with other observations, improved forward and inverse modeling methods, and well-designed data–model integration methods. The overarching goal is to provide a comprehensive characterization of a complete volume of the cloudy atmosphere and to act as a natural laboratory for the study of cloud processes.
15

Riemann, Ute. "Benefits and Challenges for BPM in the Cloud." International Journal of Organizational and Collective Intelligence 5, no. 1 (January 2015): 32–61. http://dx.doi.org/10.4018/ijoci.2015010103.

Abstract:
Business processes are not only variable, they are dynamic as well. A key benefit of BPM is the ability to adjust processes in response to changing market requirements. In parallel to BPM, enterprise cloud computing technology has emerged to provide more cost-effective solutions to businesses and services, making use of inexpensive computing that combines pervasive, internet and virtualization technologies. Despite the slow start, the business benefits of cloud computing are such that the transition of BPM to the cloud is now underway. Cloud services refer to the operation of a virtualized, automated and service-oriented IT landscape that allows the flexible provision and usage-based invoicing of resources, services and applications via a network or the Internet. The generic term "X-as-a-Service" summarizes the business models delivering almost everything as a service. BPM in the cloud is often regarded as a SaaS application. More recently, BPM is being regarded as a PaaS, as it facilitates the creation and deployment of applications, in this case business process solutions. The PaaS landscape is the least developed of the four cloud-based software delivery models previously discussed. PaaS vendors such as IBM, Oracle and Microsoft delivered an application platform with managed cloud infrastructure services; more recently, however, the PaaS market has begun to evolve to include other middleware capabilities, including process management. BPM PaaS is the delivery of BPM technology as a service via a cloud service provider. To be classified as a PaaS, a BPM suite requires the following capabilities: the architecture should be multi-tenant, it should be hosted off premise, and it should offer elasticity and metering-by-use capabilities. When we refer to BPM in the cloud, what we are really referring to is a combination of BPM PaaS and BPaaS (Business Process as a Service). 
Business Process as a Service (BPaaS) is a set of pre-defined business processes that allows the execution of customized business processes in the cloud. BPaaS is a complete, pre-integrated BPM platform hosted in the cloud and delivered as a service for the development and execution of general-purpose business process applications. Although such a service harbors economic potential, questions that need to be answered are as follows: Can an individual, company-specific business process be supported by a standardized cloud solution, or should we protect process creativity and competitive differentiation by allowing the company to design its processes individually, with the cloud solution supporting only basic data flows and structures? Does it make sense to take a software solution "out of the box" that handles both data and process in a cloud environment, or would this hinder the creativity of business (process) development, leading to lower-quality processes and consequently a weaker competitive position for the company? And how should the inherent compliance and security topic be managed? Within a completely integrated business application system, all required security aspects can be implemented as safeguards at reasonable cost. Within the cloud, however, advanced standards and identity proofing are required to monitor and measure information exchange across the federation. There seems to be no need to develop new protocols, but rather a standardized way to collect and evaluate the gathered information.
16

Riemann, Ute. "Benefits and Challenges for Business Process Management in the Cloud." International Journal of Organizational and Collective Intelligence 5, no. 2 (April 2015): 80–104. http://dx.doi.org/10.4018/ijoci.2015040104.

Abstract:
Business processes are not only variable, they are dynamic as well. A key benefit of Business Process Management (BPM) is the ability to adjust business processes in response to changing market requirements. In parallel to BPM, enterprise cloud computing technology has emerged to provide more cost-effective solutions to businesses and services, making use of inexpensive computing that combines pervasive, internet and virtualization technologies. Despite the slow start, the business benefits of cloud computing are such that the transition of BPM to the cloud is now underway. Cloud services refer to the operation of a virtualized, automated and service-oriented IT landscape allowing the flexible provision and usage-based invoicing of resources, services and applications via a network or the Internet. The generic term "X-as-a-Service" summarizes the business models delivering almost everything as a service. BPM in the cloud is often regarded as a SaaS application. More recently, BPM is being regarded as a PaaS, as it facilitates the creation and deployment of applications, in this case business process solutions. The PaaS landscape is the least developed of the four cloud-based software delivery models previously discussed. PaaS vendors such as IBM, Oracle and Microsoft delivered an application platform with managed cloud infrastructure services; more recently, however, the PaaS market has begun to evolve to include other middleware capabilities, including process management. BPM PaaS is the delivery of BPM technology as a service via a cloud service provider. For classification as a PaaS, a BPM suite requires the following capabilities: the architecture should be multi-tenant, hosting should be off premise, and it should offer elasticity and metering-by-use capabilities. When we refer to BPM in the cloud, what we are really referring to is a combination of BPM PaaS and BPaaS (Business Process as a Service). 
Business Process as a Service (BPaaS) is a set of pre-defined business processes that allows the execution of customized business processes in the cloud. BPaaS is a complete, pre-integrated BPM platform hosted in the cloud and delivered as a service for the development and execution of general-purpose business process applications. Although such a service harbors economic potential, questions remain: Can an individual, company-specific business process be supported by a standardized cloud solution, or should we protect process creativity and competitive differentiation by allowing the company to design its processes individually, with the cloud solution supporting only basic data flows and structures? Does it make sense to take a software solution "out of the box" that handles both data and process in a cloud environment, or would this hinder the creativity of business (process) development, leading to lower-quality processes and consequently a weaker competitive position for the company? And how should the inherent compliance and security topic be managed? Within a completely integrated business application system, all required security aspects can be implemented as safeguards at reasonable cost. Within the cloud, however, advanced standards and identity proofing are required to monitor and measure information exchange across the federation. There seems to be no need to develop new protocols, but rather a standardized way to collect and evaluate the gathered information.
APA, Harvard, Vancouver, ISO, and other styles
17

Dhokale, Ishani, and Pratibha Chavan. "Fundamentals of SD WAN Communication Technologies." Journal of Network Security Computer Networks 8, no. 2 (August 9, 2022): 53–58. http://dx.doi.org/10.46610/jonscn.2022.v08i02.005.

Full text
Abstract:
Let us take a look at these new high-tech communications technologies and how they shape the industry. A software-defined wide area network (SD-WAN) is a software system that manages wide area networks: it provides ease of use and single-site management, minimizes cost, and can enhance communication with branch offices and the cloud. Of the recent changes in networking, nothing has been more relevant than software-defined WAN, or SD-WAN, which changes the way network professionals think about improving connectivity over transports such as Multiprotocol Label Switching (MPLS), independent transfer, and Digital Subscriber Line (DSL). The separation of hardware from the control plane is similar to how software-defined networks use virtualization technology to improve the management and operation of data centers. Unlike traditional WANs, which are set up and managed through proprietary systems such as Cisco IOS, SD-WAN means less hardware and less device-level control over that hardware. SD-WAN is a leading approach that enables organizations to build highly efficient WANs using cost-effective, turnkey Internet access, allowing them to partially or completely replace more expensive WAN communication technologies such as MPLS. Customers can thus easily manage their network regardless of the connection provider. SD-WAN is currently one of the hottest topics, with a real impact on cloud computing services and the WAN environment. SD-WAN is already affecting the way we think about how we use web services. More importantly, it has great potential to change the way we use communication services in the future. Several industries present interesting SD-WAN deployment scenarios.
APA, Harvard, Vancouver, ISO, and other styles
18

Panth, Deepak, Dhananjay Mehta, and Rituparna Shelgaonkar. "A Survey on Security Mechanisms of Leading Cloud Service Providers." International Journal of Computer Applications 98, no. 1 (July 18, 2014): 34–37. http://dx.doi.org/10.5120/17149-7184.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Kölling, Tobias, Tobias Zinner, and Bernhard Mayer. "Aircraft-based stereographic reconstruction of 3-D cloud geometry." Atmospheric Measurement Techniques 12, no. 2 (February 22, 2019): 1155–66. http://dx.doi.org/10.5194/amt-12-1155-2019.

Full text
Abstract:
Abstract. This work describes a method to retrieve the location and geometry of clouds using RGB images from a video camera on an aircraft and data from the aircraft's navigation system. As opposed to ordinary stereo methods, for which two cameras with fixed relative position at a certain distance are used to match images taken at the exact same moment, this method uses only a single camera and the aircraft's movement to provide the needed parallax. Advantages of this approach include a relatively simple installation on a (research) aircraft and the possibility to use different image offsets that are even larger than the size of the aircraft. Detrimental effects are the evolution of observed clouds during the time offset between two images as well as the background wind. However, we will show that some wind information can also be recovered and subsequently used for the physics-based filtering of outliers. Our method allows the derivation of cloud top geometry which can be used, e.g., to provide location and distance information for other passive cloud remote sensing products. In addition it can also improve retrieval methods by providing cloud geometry information useful for the correction of 3-D illumination effects. We show that this method works as intended through comparison to data from a simultaneously operated lidar system. The stereo method provides lower heights than the lidar method; the median difference is 126 m. This behavior is expected as the lidar method has a lower detection limit (leading to greater cloud top heights for the downward view), while the stereo method also retrieves data points on cloud sides and lower cloud layers (leading to lower cloud heights). Systematic errors across the measurement swath are less than 50 m.
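The motion-parallax idea can be illustrated with elementary pinhole-stereo geometry; this sketch is our simplification (nadir view, known frame offset, pixel-unit focal length), not the authors' reconstruction code:

```python
def cloud_top_height(aircraft_alt_m, speed_ms, dt_s, disparity_px, focal_px):
    """Triangulate cloud-top height from two frames of one nadir-pointing camera.

    The aircraft's own motion supplies the stereo baseline:
    baseline = speed * dt.  Standard pinhole stereo then gives the
    distance below the aircraft as z = f * B / d, where f is the focal
    length (in pixels) and d the pixel disparity of a matched feature.
    """
    baseline_m = speed_ms * dt_s
    depth_m = focal_px * baseline_m / disparity_px
    return aircraft_alt_m - depth_m

# e.g. 10 km altitude, 200 m/s ground speed, 1 s frame offset,
# 25 px disparity at a 1000 px focal length -> cloud top near 2 km
```

Because the baseline grows with the chosen frame offset, offsets much larger than any physical camera separation are possible, which is the advantage noted in the abstract.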
APA, Harvard, Vancouver, ISO, and other styles
20

Li, Zhihui, Philip F. Hopkins, Jonathan Squire, and Cameron Hummels. "On the survival of cool clouds in the circumgalactic medium." Monthly Notices of the Royal Astronomical Society 492, no. 2 (December 19, 2019): 1841–54. http://dx.doi.org/10.1093/mnras/stz3567.

Full text
Abstract:
ABSTRACT We explore the survival of cool clouds in multiphase circumgalactic media. We revisit the ‘cloud-crushing problem’ in a large survey of simulations including radiative cooling, self-shielding, self-gravity, magnetic fields, and anisotropic Braginskii conduction and viscosity (with saturation). We explore a wide range of parameters including cloud size, velocity, ambient temperature and density, and a variety of magnetic field configurations and cloud turbulence. We find that realistic magnetic fields and turbulence have weaker effects on cloud survival; the most important physics is radiative cooling and conduction. Self-gravity and self-shielding are important for clouds that are initially Jeans-unstable, but largely irrelevant otherwise. Non-self-gravitating, realistically magnetized clouds separate into four regimes: (1) at low column densities, clouds evaporate rapidly via conduction; (2) a ‘failed pressure confinement’ regime, where the ambient hot gas cools too rapidly to provide pressure confinement for the cloud; (3) an ‘infinitely long-lived’ regime, in which the cloud lifetime becomes longer than the cooling time of gas swept up in the leading bow shock, so the cloud begins to accrete and grow; and (4) a ‘classical cloud destruction’ regime, where clouds are eventually destroyed by instabilities. In the final regime, the cloud lifetime can exceed the naive cloud-crushing time owing to conduction-induced compression. However, small and/or slow-moving clouds can also evaporate more rapidly than the cloud-crushing time. We develop simple analytic models that explain the simulated cloud destruction times in this regime.
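The "naive cloud-crushing time" invoked above is conventionally t_cc = sqrt(chi) * R_cl / v, where chi is the cloud-to-ambient density contrast; a minimal sketch with our own unit conventions and function name:

```python
import math

def cloud_crushing_time_myr(r_cloud_pc, v_wind_kms, chi):
    """Classic cloud-crushing timescale t_cc = sqrt(chi) * R_cl / v_wind.

    chi: cloud-to-ambient density contrast (dimensionless);
    result returned in Myr.
    """
    PC_IN_KM = 3.086e13   # km per parsec
    MYR_IN_S = 3.156e13   # seconds per Myr
    t_s = math.sqrt(chi) * r_cloud_pc * PC_IN_KM / v_wind_kms
    return t_s / MYR_IN_S

# e.g. a 10 pc cloud, 100 km/s wind, chi = 100 -> t_cc of order 1 Myr
```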
APA, Harvard, Vancouver, ISO, and other styles
21

Wang, Yunhe, Xiaojun Yuan, and Mark A. Cane. "Coupled mode of cloud, atmospheric circulation, and sea ice controlled by wave-3 pattern in Antarctic winter." Environmental Research Letters 17, no. 4 (March 29, 2022): 044053. http://dx.doi.org/10.1088/1748-9326/ac5272.

Full text
Abstract:
Abstract This study examines coupled relationships among clouds, atmospheric circulation, and sea ice in Antarctic winter. We find that the wave-3 pattern dominates the leading covariability mode among cloud, atmospheric circulation, and sea ice. Both horizontal transport and vertical motion contribute to cloud formation, resulting in maximum cloud anomalies spatially between maximum meridional wind and pressure anomalies in the coupled system. The radiative effect of the clouds related to the wave-3 pattern can generate sea ice anomalies up to 12 cm thick in one month in the Amundsen Sea. It also strengthens the sea ice anomalies that are directly induced by low-level atmospheric circulation anomalies. In addition, the radiative forcing of the leading cloud mode in the lower troposphere is suppressed by the dynamic and thermodynamic effects of the circulation anomalies. These discoveries provide a better understanding of Antarctica’s interactive processes, and also offer physical evidence for climate model validations.
APA, Harvard, Vancouver, ISO, and other styles
22

L’Ecuyer, Tristan S., Philip Gabriel, Kyle Leesman, Steven J. Cooper, and Graeme L. Stephens. "Objective Assessment of the Information Content of Visible and Infrared Radiance Measurements for Cloud Microphysical Property Retrievals over the Global Oceans. Part I: Liquid Clouds." Journal of Applied Meteorology and Climatology 45, no. 1 (January 1, 2006): 20–41. http://dx.doi.org/10.1175/jam2326.1.

Full text
Abstract:
Abstract The importance of accurately representing the role of clouds in climate change studies has become increasingly apparent in recent years, leading to a substantial increase in the number of satellite sensors and associated algorithms that are devoted to measuring the global distribution of cloud properties. The physics governing the radiative transfer through clouds is well understood, but the impact of uncertainties in algorithm assumptions and the true information content of the measurements in the inverse retrieval problem are generally not as clear, making it difficult to determine the best product to adopt for any particular application. This paper applies information theory to objectively analyze the problem of liquid cloud retrievals from an observing system modeled after the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument currently operating on the Aqua and Terra platforms. It is found that four diagnostics—the retrieval error covariance, the information content, the number of degrees of freedom for signal, and the effective rank of the problem—provide a rigorous test of an observing system. Based on these diagnostics, the combination of the 0.64- and 1.64-μm channels during the daytime and the 3.75- and 11.0-μm channels at night provides the most information for retrieving the properties of the wide variety of liquid clouds modeled. With an eye toward developing a coherent representation of the global distribution of cloud microphysical and radiative properties, these four channels may be integrated into a suitable multichannel inversion methodology such as the optimal estimation or Bayesian techniques to provide a common framework for cloud retrievals under varying conditions. The expected resolution of the observing system for such liquid cloud microphysical property retrievals is also explored.
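The diagnostics named above are standard quantities of linearized optimal-estimation theory; as a hedged illustration (not the authors' code), two of them can be computed from the Jacobian and the prior and noise covariances like this:

```python
import numpy as np

def retrieval_diagnostics(K, S_a, S_e):
    """Two information-theory diagnostics for a linearized retrieval.

    K: Jacobian (measurements x state); S_a: prior covariance;
    S_e: measurement-noise covariance.  Returns the degrees of freedom
    for signal (trace of the averaging kernel) and the Shannon
    information content in bits.
    """
    S_e_inv = np.linalg.inv(S_e)
    S_hat_inv = K.T @ S_e_inv @ K + np.linalg.inv(S_a)   # posterior precision
    A = np.linalg.solve(S_hat_inv, K.T @ S_e_inv @ K)    # averaging kernel
    dfs = np.trace(A)
    h_bits = 0.5 * np.log2(np.linalg.det(S_a @ S_hat_inv))
    return dfs, h_bits
```

Channels can then be ranked by how much they raise the degrees of freedom for signal or the information content relative to the prior.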
APA, Harvard, Vancouver, ISO, and other styles
23

Henderson, David S., Jason A. Otkin, and John R. Mecikalski. "Evaluating Convective Initiation in High-Resolution Numerical Weather Prediction Models Using GOES-16 Infrared Brightness Temperatures." Monthly Weather Review 149, no. 4 (April 2021): 1153–72. http://dx.doi.org/10.1175/mwr-d-20-0272.1.

Full text
Abstract:
Abstract. The evolution of model-based cloud-top brightness temperatures (BT) associated with convective initiation (CI) is assessed for three bulk cloud microphysics schemes in the Weather Research and Forecasting Model. Using a composite-based analysis, cloud objects derived from high-resolution (500 m) model simulations are compared to 5-min GOES-16 imagery for a case study day located near the Alabama–Mississippi border. Observed and simulated cloud characteristics for clouds reaching CI are examined by utilizing infrared BTs commonly used in satellite-based CI nowcasting methods. The results demonstrate the ability of object-based verification methods with satellite observations to evaluate the evolution of model cloud characteristics, and the BT comparison provides insight into a known issue of model simulations producing too many convective cells reaching CI. The timing of CI from the different microphysical schemes is dependent on the production of ice in the upper levels of the cloud, which typically occurs near the time of maximum cloud growth. In particular, large differences in precipitation formation drive differences in the amount of cloud water able to reach upper layers of the cloud, which impacts cloud-top glaciation. Larger cloud mixing ratios are found in clouds with sustained growth leading to more cloud water lofted to the upper levels of the cloud and the formation of ice. Clouds unable to sustain growth lack the necessary cloud water needed to form ice and grow into cumulonimbus. Clouds with slower growth rates display similar BT trends as clouds exhibiting growth, which suggests that forecasting CI using geostationary satellites might require additional information beyond those derived at cloud top.
APA, Harvard, Vancouver, ISO, and other styles
24

Gryspeerdt, Edward, Johannes Quaas, Tom Goren, Daniel Klocke, and Matthias Brueck. "An automated cirrus classification." Atmospheric Chemistry and Physics 18, no. 9 (May 3, 2018): 6157–69. http://dx.doi.org/10.5194/acp-18-6157-2018.

Full text
Abstract:
Abstract. Cirrus clouds play an important role in determining the radiation budget of the earth, but many of their properties remain uncertain, particularly their response to aerosol variations and to warming. Part of the reason for this uncertainty is the dependence of cirrus cloud properties on the cloud formation mechanism, which itself is strongly dependent on the local meteorological conditions. In this work, a classification system (Identification and Classification of Cirrus or IC-CIR) is introduced to identify cirrus clouds by the cloud formation mechanism. Using reanalysis and satellite data, cirrus clouds are separated into four main types: orographic, frontal, convective and synoptic. Through a comparison to convection-permitting model simulations and back-trajectory-based analysis, it is shown that these observation-based regimes can provide extra information on the cloud-scale updraughts and the frequency of occurrence of liquid-origin ice, with the convective regime having higher updraughts and a greater occurrence of liquid-origin ice compared to the synoptic regimes. Despite having different cloud formation mechanisms, the radiative properties of the regimes are not distinct, indicating that retrieved cloud properties alone are insufficient to completely describe them. This classification is designed to be easily implemented in GCMs, helping improve future model–observation comparisons and leading to improved parametrisations of cirrus cloud processes.
APA, Harvard, Vancouver, ISO, and other styles
25

Medeiros, Brian, Bjorn Stevens, Isaac M. Held, Ming Zhao, David L. Williamson, Jerry G. Olson, and Christopher S. Bretherton. "Aquaplanets, Climate Sensitivity, and Low Clouds." Journal of Climate 21, no. 19 (October 1, 2008): 4974–91. http://dx.doi.org/10.1175/2008jcli1995.1.

Full text
Abstract:
Abstract Cloud effects have repeatedly been pointed out as the leading source of uncertainty in projections of future climate, yet clouds remain poorly understood and simulated in climate models. Aquaplanets provide a simplified framework for comparing and understanding cloud effects, and how they are partitioned as a function of regime, in large-scale models. This work uses two climate models to demonstrate that aquaplanets can successfully predict a climate model’s sensitivity to an idealized climate change. For both models, aquaplanet climate sensitivity is similar to that of the realistic configuration. Tropical low clouds appear to play a leading role in determining the sensitivity. Regions of large-scale subsidence, which cover much of the tropics, are most directly responsible for the differences between the models. Although cloud effects and climate sensitivity are similar for aquaplanets and realistic configurations, the aquaplanets lack persistent stratocumulus in the tropical atmosphere. This, and an additional analysis of the cloud response in the realistically configured simulations, suggests the representation of shallow (trade wind) cumulus convection, which is ubiquitous in the tropics, is largely responsible for differences in the simulated climate sensitivity of these two models.
APA, Harvard, Vancouver, ISO, and other styles
26

Baniata, Hamza, Sami Mahmood, and Attila Kertesz. "Assessing anthropogenic heat flux of public cloud data centers: current and future trends." PeerJ Computer Science 7 (May 5, 2021): e478. http://dx.doi.org/10.7717/peerj-cs.478.

Full text
Abstract:
Global average temperature has been significantly increasing during the past century, mainly due to the growing rates of greenhouse gas (GHG) emissions, leading to a global warming problem. Many research works indicated other causes of this problem, such as the anthropogenic heat flux (AHF). Cloud computing (CC) data centers (DCs), for example, perform massive computational tasks for end users, leading to the emission of huge amounts of waste heat towards the surrounding (local) atmosphere in the form of AHF. Out of the total power consumption of a public cloud DC, nearly 10% is wasted in the form of heat. In this paper, we quantitatively and qualitatively analyze the current state of AHF emissions of the top three cloud service providers (i.e., Google, Azure and Amazon) according to their average energy consumption and the global distribution of their DCs. In this study, we found that Microsoft Azure DCs emit the highest amounts of AHF, followed by Amazon and Google, respectively. We also found that Europe is the most negatively affected by AHF of public DCs, due to its small area relative to other continents and the large number of cloud DCs within. Accordingly, we present mean estimations of continental AHF density per square meter. Following our results, we found that the top three clouds (with waste heat at a rate of 1,720.512 MW) contribute an average of more than 2.8% of averaged continental AHF emissions. Using this percentage, we provide future trend estimations of AHF densities in the period 2020–2100. In one of the presented scenarios, our estimations predict that by 2100, the AHF of public cloud DCs will reach 0.01 W m−2.
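The continental density estimates above reduce to simple arithmetic: waste heat divided by continental area, with roughly 10% of consumed power assumed to leave as waste heat. A sketch under those assumptions (function names and the Europe-area figure in the comment are ours, not the paper's):

```python
def dc_waste_heat_mw(total_power_mw, waste_fraction=0.10):
    """Waste heat of a data center, assuming ~10% of consumed power
    is emitted as waste heat (the fraction cited in the abstract)."""
    return total_power_mw * waste_fraction

def ahf_density_wm2(waste_heat_mw, area_km2):
    """Mean anthropogenic heat flux density (W m^-2): heat over area."""
    return waste_heat_mw * 1e6 / (area_km2 * 1e6)

# e.g. spreading the quoted 1,720.512 MW over an area the size of
# Europe (~10.18 million km^2) gives a density of order 1.7e-4 W m^-2
```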
APA, Harvard, Vancouver, ISO, and other styles
27

Poulsen, C. A., P. D. Watts, G. E. Thomas, A. M. Sayer, R. Siddans, R. G. Grainger, B. N. Lawrence, E. Campmany, S. M. Dean, and C. Arnold. "Cloud retrievals from satellite data using optimal estimation: evaluation and application to ATSR." Atmospheric Measurement Techniques Discussions 4, no. 2 (April 28, 2011): 2389–431. http://dx.doi.org/10.5194/amtd-4-2389-2011.

Full text
Abstract:
Abstract. Clouds play an important role in balancing the Earth's radiation budget. Clouds reflect sunlight which cools the Earth, and also trap infrared radiation in the same manner as greenhouse gases. Changes in cloud cover and cloud properties over time can have important consequences for climate. The Intergovernmental Panel for Climate Change (IPCC) has identified current gaps in the understanding of clouds and related climate feedback processes as a leading cause of uncertainty in forecasting climate change. In this paper we present an algorithm that uses optimal estimation to retrieve cloud parameters from satellite multi-spectral imager data, in particular the Along-Track Scanning Radiometers ATSR-2 and AATSR. The cloud parameters retrieved are the cloud top pressure, cloud optical depth, cloud effective radius, cloud fraction and cloud phase. Importantly, the technique also provides estimated errors along with the retrieved values and quantifies the consistency between retrieval representation of cloud and satellite radiances. This should enable the effective use of the products for comparison with climate models or for exploitation via data assimilation. The technique is evaluated by performing retrieval simulations for a variety of simulated single layer and multi-layer conditions. Examples of applying the algorithm to ATSR-2 flight data are presented and the sensitivity of the retrievals assessed. This algorithm has been applied to both ATSR-2 and AATSR visible and infrared measurements in the context of the GRAPE (Global Retrieval and cloud Product Evaluation) project to produce a 14 year consistent record for climate research (Sayer et al., 2010).
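Optimal-estimation retrievals of this kind are typically solved by Gauss-Newton iteration; the following is a generic sketch under that assumption (not the GRAPE implementation), which also returns the posterior error covariance, i.e. the "estimated errors along with the retrieved values" the abstract highlights:

```python
import numpy as np

def oe_retrieval(y, x_a, S_a, S_e, forward, jacobian, n_iter=10):
    """Gauss-Newton optimal estimation in the style of Rodgers (2000).

    y: measurement vector; x_a: prior state; S_a, S_e: prior and
    measurement-noise covariances; forward/jacobian: forward model and
    its Jacobian.  Returns the retrieved state and its posterior error
    covariance S_hat.
    """
    S_a_inv, S_e_inv = np.linalg.inv(S_a), np.linalg.inv(S_e)
    x = x_a.astype(float).copy()
    for _ in range(n_iter):
        K = jacobian(x)  # linearize the forward model at the current state
        S_hat_inv = S_a_inv + K.T @ S_e_inv @ K
        rhs = K.T @ S_e_inv @ (y - forward(x)) - S_a_inv @ (x - x_a)
        x = x + np.linalg.solve(S_hat_inv, rhs)
    return x, np.linalg.inv(S_hat_inv)
```

The cost-function value at the solution quantifies the consistency between the retrieved cloud representation and the measured radiances, which is what makes such products usable for data assimilation.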
APA, Harvard, Vancouver, ISO, and other styles
28

Li, Pak Shing, and Richard I. Klein. "Magnetized interstellar molecular clouds – II. The large-scale structure and dynamics of filamentary molecular clouds." Monthly Notices of the Royal Astronomical Society 485, no. 4 (March 27, 2019): 4509–28. http://dx.doi.org/10.1093/mnras/stz653.

Full text
Abstract:
Abstract We perform ideal magnetohydrodynamics high-resolution adaptive mesh refinement simulations with driven turbulence and self-gravity and find that long filamentary molecular clouds are formed at the converging locations of large-scale turbulence flows and the filaments are bounded by gravity. The magnetic field helps shape and reinforce the long filamentary structures. The main filamentary cloud has a length of ∼4.4 pc. Instead of a monolithic cylindrical structure, the main cloud is shown to be a collection of fibre/web-like substructures similar to filamentary clouds such as L1495. Unless the line-of-sight is close to the mean field direction, the large-scale magnetic field and striations in the simulation are found roughly perpendicular to the long axis of the main cloud, similar to L1495. This provides strong support for a large-scale moderately strong magnetic field surrounding L1495. We find that the projection effect from observations can lead to incorrect interpretations of the true three-dimensional physical shape, size, and velocity structure of the clouds. Helical magnetic field structures found around filamentary clouds that are interpreted from Zeeman observations can be explained by a simple bending of the magnetic field that pierces through the cloud. We demonstrate that two dark clouds form a T-shaped configuration that is strikingly similar to the infrared dark cloud SDC13, leading to the interpretation that SDC13 results from a collision of two long filamentary clouds. We show that a moderately strong magnetic field (${{\cal M}_{\rm A}}\sim 1$) is crucial for maintaining a long and slender filamentary cloud for a long period of time ∼0.5 Myr.
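The "moderately strong magnetic field (M_A ~ 1)" above refers to the Alfvenic Mach number; a small sketch of how it can be estimated from bulk cloud parameters (CGS constants; the function name and example values are ours):

```python
import math

def alfven_mach(v_kms, b_microgauss, n_cm3, mu=2.33):
    """Alfvenic Mach number M_A = v / v_A, with v_A = B / sqrt(4 pi rho).

    mu: mean molecular weight per particle (~2.33 for molecular gas);
    n_cm3: number density in cm^-3; B in microgauss; v in km/s.
    """
    m_h = 1.6726e-24                 # hydrogen mass, g
    rho = mu * m_h * n_cm3           # mass density, g cm^-3
    v_a_cms = (b_microgauss * 1e-6) / math.sqrt(4 * math.pi * rho)
    return (v_kms * 1e5) / v_a_cms   # convert v to cm/s and divide
```

For n ~ 100 cm^-3 and B ~ 10 microgauss, the Alfven speed is about 1.4 km/s, so turbulent velocities of that order correspond to the trans-Alfvenic (M_A ~ 1) regime the paper finds crucial.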
APA, Harvard, Vancouver, ISO, and other styles
29

Murray-Watson, Rebecca J., Edward Gryspeerdt, and Tom Goren. "Investigating the development of clouds within marine cold-air outbreaks." Atmospheric Chemistry and Physics 23, no. 16 (August 24, 2023): 9365–83. http://dx.doi.org/10.5194/acp-23-9365-2023.

Full text
Abstract:
Abstract. Marine cold-air outbreaks are important parts of the high-latitude climate system and are characterised by strong surface fluxes generated by the air–sea temperature gradient. These fluxes promote cloud formation, which can be identified in satellite imagery by the distinct transformation of stratiform cloud “streets” into a broken field of cumuliform clouds downwind of the outbreak. This evolution in cloud morphology changes the radiative properties of the cloud and therefore is of importance to the surface energy budget. While the drivers of stratocumulus-to-cumulus transitions, such as aerosols or the sea surface temperature gradient, have been extensively studied for subtropical clouds, the factors influencing transitions at higher latitudes are relatively poorly understood. This work uses reanalysis data to create a set of composite trajectories of cold-air outbreaks moving off the Arctic ice edge and co-locates these trajectories with satellite data to generate a unique view of liquid-dominated cloud development within cold-air outbreaks. The results of this analysis show that clouds embedded in cold-air outbreaks have distinctive properties relative to clouds following other trajectories in the region. The initial strength of the outbreak shows a lasting effect on cloud properties, with differences between clouds in strong and weak events visible over 30 h after the air has left the ice edge. However, while the strength (measured by the magnitude of the marine cold-air outbreak index) of the outbreak affects the magnitude of cloud properties, it does not affect the timing of the transition to cumuliform clouds or the top-of-atmosphere albedo. In contrast, the initial aerosol conditions do not strongly affect the magnitude of the cloud properties but are correlated to cloud break-up, leading to an enhanced cooling effect in clouds moving through high-aerosol conditions due to delayed break-up. 
Both the aerosol environment and the strength and frequency of marine cold-air outbreaks are expected to change in the future Arctic, and these results provide insight into how these changes will affect the radiative properties of the clouds. These results also highlight the need for information about present-day aerosol sources at the ice edge to correctly model cloud development.
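The marine cold-air outbreak index mentioned above is commonly defined as the difference between the potential temperature of the sea surface and that of a mid-tropospheric level, often 800 hPa; a sketch under that common definition (which may differ in detail from the one used in the paper):

```python
def mcao_index(sst_k, t800_k, p_surf_hpa=1000.0):
    """Marine cold-air outbreak index: theta_sst - theta_800 (K).

    Positive values indicate an outbreak: cold air advected over
    warmer water, driving strong surface heat fluxes.
    """
    kappa = 0.286  # R/cp for dry air
    theta_sst = sst_k * (1000.0 / p_surf_hpa) ** kappa
    theta_800 = t800_k * (1000.0 / 800.0) ** kappa
    return theta_sst - theta_800

# e.g. 275 K water under 255 K air at 800 hPa gives an index near +3 K
```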
APA, Harvard, Vancouver, ISO, and other styles
30

Chang, D. Y., H. Tost, B. Steil, and J. Lelieveld. "Aerosol–cloud interactions studied with the chemistry-climate model EMAC." Atmospheric Chemistry and Physics Discussions 14, no. 15 (August 27, 2014): 21975–2043. http://dx.doi.org/10.5194/acpd-14-21975-2014.

Full text
Abstract:
Abstract. This study uses the EMAC atmospheric chemistry-climate model to simulate cloud properties and estimate cloud radiative effects induced by aerosols. We have tested two prognostic cloud droplet nucleation parameterizations, i.e., the standard STN (osmotic coefficient model) and hybrid (HYB, replacing the osmotic coefficient by the κ hygroscopicity parameter) schemes to calculate aerosol hygroscopicity and critical supersaturation, and consider aerosol–cloud feedbacks with a focus on warm clouds. Both prognostic schemes (STN and HYB) account for aerosol number, size and composition effects on droplet nucleation, and are tested in combination with two different cloud cover parameterizations, i.e., a relative humidity threshold and a statistical cloud cover scheme (RH-CLC and ST-CLC). The use of either STN and HYB leads to very different cloud radiative effects, particularly over the continents. The STN scheme predicts highly effective CCN activation in warm clouds and hazes/fogs near the surface. The enhanced CCN activity increases the cloud albedo effect of aerosols and cools the Earth's surface. The cooler surface enhances the hydrostatic stability of the lower continental troposphere and thereby reduces convection and convective precipitation. In contrast, the HYB simulations calculate lower, more realistic CCN activation and consequent cloud albedo effect, leading to relatively stronger convection and high cloud formation. The enhanced high clouds increase greenhouse warming and moderate the cooling effect of the low clouds. With respect to the cloud radiative effects, the statistical ST-CLC scheme shows much higher sensitivity to aerosol–cloud coupling for all continental regions than the RH-CLC threshold scheme, most pronounced for low clouds but also for high clouds. Simulations of the short wave cloud radiative effect at the top of the atmosphere in ST-CLC are a factor of 2–8 more sensitive to aerosol coupling than the RH-CLC configurations. 
The long wave cloud radiative effect responds about a factor of 2 more sensitively. Our results show that the coupling with the HYB scheme (κ approach) outperforms the coupling with STN (osmotic coefficient), and also provides a more straightforward approach to account for physicochemical effects on aerosol activation into cloud droplets. Accordingly, the sensitivity of CCN activation to chemical composition is highest in HYB. Overall, the prognostic schemes of cloud cover and cloud droplet formation help improve the agreement between model results and observations, and for the ST-CLC scheme it seems to be a necessity.
APA, Harvard, Vancouver, ISO, and other styles
31

Frolov, Vyacheslav, Oleksandr Oriekhov, Vyacheslav Kharchenko, and Oleksandr Frolov. "Analysis of Variants of Two-Version Multi-Module Web Applications Using Cloud Services." RADIOELECTRONIC AND COMPUTER SYSTEMS, no. 2 (April 26, 2020): 80–91. http://dx.doi.org/10.32620/reks.2020.2.07.

Full text
Abstract:
The article is devoted to the analysis of a variant of two-version multi-module web applications using cloud services. As the design and development of web applications become increasingly active, there is a need to increase their reliability in the face of the growing complexity of the applications themselves and the infrastructure on which they are based. One of the key solutions to this problem is the use of cloud services, which can greatly simplify the task of ensuring the reliability and security of various applications. At the same time, cloud providers cannot fully guarantee the fault tolerance of applications that run in their environment, so users should address this themselves. One of the most promising approaches is the use of diversity to increase the security and reliability of web applications hosted in the clouds. The object of research and analysis of this work is a multi-module web application designed using cloud services. This work aims to compare modern solutions and technologies that allow implementing diversity for a web application. Since many companies are moving their infrastructure to the clouds, it becomes necessary to consider the possibility of implementing diversity through cloud services. These services allow web applications developed in various programming languages to be created and deployed on the servers of cloud providers; thus, part of the responsibility for ensuring reliability is transferred to the provider. However, it is still necessary to ensure the resiliency of your own programs, which may fail due to defects in the program code. One of the main solutions to this problem is N-version programming, which allows you to create an application from several independent versions. Each version can be written in a different programming language, using different technologies, by a separate development team, thereby increasing the reliability of the final software product. 
As a result, we conclude in this paper that leading cloud providers make it possible to implement diversity using services of different service models, such as IaaS and PaaS. Using the principle of diversity, one can design a reliable web application that avoids failure in case of an error in the program code.
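N-version programming as described above ultimately needs an adjudication step that compares the outputs of the independently developed versions; a minimal majority-vote sketch (the function name and error handling are illustrative, not from the article):

```python
from collections import Counter

def n_version_vote(results):
    """Majority vote over the outputs of independently developed versions.

    Each element of `results` is the output of one version for the same
    request.  Returns the agreed result, or raises when no strict
    majority exists (a defect has made the versions disagree).
    """
    winner, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority among versions")
    return winner
```

In a cloud deployment, each version would run on its own service (e.g. on different IaaS/PaaS backends) and a front-end adjudicator of this kind would compare their responses.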
APA, Harvard, Vancouver, ISO, and other styles
32

Marshall, Ariela L., Juliana Perez Botero, John A. Heit, Aneel A. Ashrani, Rajiv K. Pruthi, Ashish V. Chintakuntlawar, Jennifer Guenther, and Mrinal M. Patnaik. "The Impact of Antithrombin Deficiency on Women's Reproductive Health Experiences and Healthcare Decision-Making: A Qualitative Patient-Oriented Survey Study." Blood 128, no. 22 (December 2, 2016): 3588. http://dx.doi.org/10.1182/blood.v128.22.3588.3588.

Full text
Abstract:
Abstract Aims/Objective: Congenital antithrombin (AT) deficiency, estrogen-containing oral contraceptive pills (OCPs), and pregnancy are associated with an increased risk of venous thromboembolism (VTE). Little is known about reproductive decision-making in women with AT deficiency. We sought to explore the attitudes and choices regarding reproductive health in this population. Study Design/Methods: We conducted a provider-administered survey of women identified from the Mayo Clinic congenital AT deficiency database. Participants were asked to discuss their diagnosis of AT deficiency (age at diagnosis and VTE history, including diagnosis and treatment) and questioned about (1) any lifetime methods of contraception, (2) pregnancies and pregnancy outcomes, (3) use of assisted reproductive technology, (4) history of menorrhagia, and (5) how their AT diagnosis had impacted each reproductive health experience and how this impact made them feel (open-ended question). Results: Of 31 women with congenital AT deficiency, 18 (58%) were reached, 8 (26%) were not reachable, and 5 (16%) were deceased. Of the 18 women who completed the survey, the median age at AT deficiency diagnosis was 40 years (range 7-65). Twelve (67%) women had experienced a VTE; 2 and 5 women had VTE in the setting of pregnancy and OCP use, respectively. Reported contraceptive methods included OCPs, intrauterine devices (IUDs), and male condoms. Of the 5 women diagnosed with AT deficiency while taking OCPs, three switched to an IUD, one to condoms, and one did not report an alternative method of contraception. Fifteen (83%) women reported at least one pregnancy (median number of pregnancies = 2; range 1-6). Of 42 total pregnancies among the 18 women, 33 (79%) resulted in live term birth, 3 (7%) in live preterm birth, and 6 (14%) in miscarriage/spontaneous abortion at a median of 12 weeks into the pregnancy. 
Four women (22%) reported the use of anticoagulation during pregnancy (presumed to be low-molecular-weight heparin, though most women could not specify), and 1 woman reported use of assisted reproductive technology leading to a successful pregnancy. Eleven women (61%) reported menorrhagia, 4 (36%) while on anticoagulation for VTE events. Ten of 18 women (56%) reported that the AT deficiency diagnosis had affected their reproductive health in at least one way. Patient-reported comments are presented in Table 1, and a "word cloud" diagram (an image composed of words from the women's comments regarding contraception, in which the size of each word indicates its frequency) is presented in Figure 1. Conclusion: Women with AT deficiency require careful multidisciplinary management to avoid complications in the settings of hormonal contraception and pregnancy. Many women reported that their reproductive health experiences were impacted by their diagnosis, but also reported successful pregnancy outcomes. Validated questionnaires should be developed to assess women's reproductive health experiences. Strategies for reproductive risk management should be discussed carefully, and patient involvement in reproductive decision-making is key in this population. Figure 1: word cloud depicting the frequency of women's comments regarding AT deficiency and contraception. Disclosures: No relevant conflicts of interest to declare.
33

Rabbani, Imran Mujaddid, Muhammad Aslam, Ana Maria Martinez Enriquez, and Zeeshan Qudeer. "Service Association Factor (SAF) for Cloud Service Selection and Recommendation." Information Technology And Control 49, no. 1 (March 25, 2020): 113–26. http://dx.doi.org/10.5755/j01.itc.49.1.23251.

Full text
Abstract:
Cloud computing is one of the leading technologies in the IT and computer science domain, and business IT infrastructures are increasingly being equipped with clouds. With so many options available, the selection decision becomes vital when there is no supporting information. Global clouds also need evaluation and assessment from their users: what they think of a service, and how new users can make a selection matching their needs. Recommender systems have been built to propose the best services using customer feedback, quality-of-service parameters, assigned scores, trustworthiness, and clustering in various forms and models. These techniques, however, did not record and use the interrelationships between services, which reflect the true impact of service utilization. In the proposed approach, a service association factor calculates the value of the interrelations among the services used by the end user. An intelligent learning-based recommendation system is developed to assist users in selecting services according to their respective preferences. The technique is evaluated on leading service providers, and the results show that the learning-based system performs well on all types of cloud models.
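Since the abstract does not give the paper's exact formula, the following is only an illustrative sketch of a co-usage association score between services: a Jaccard-style ratio counting how often two services appear in the same usage session. All names and data are hypothetical:

```python
from itertools import combinations
from collections import Counter

def association_factors(sessions):
    """Illustrative co-usage score: how often two services are used together,
    normalised by how often either appears (Jaccard-style ratio)."""
    pair_counts = Counter()
    service_counts = Counter()
    for used in sessions:
        service_counts.update(set(used))
        for a, b in combinations(sorted(set(used)), 2):
            pair_counts[(a, b)] += 1
    return {
        pair: n / (service_counts[pair[0]] + service_counts[pair[1]] - n)
        for pair, n in pair_counts.items()
    }

sessions = [
    {"compute", "storage"},
    {"compute", "storage", "cdn"},
    {"compute", "queue"},
]
scores = association_factors(sessions)
print(scores[("compute", "storage")])  # 2 / (3 + 2 - 2) = 0.666...
```

A recommender could then rank candidate services for a user by their association score with the services the user already consumes.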
34

Petras, Vaclav, Anna Petrasova, James B. McCarter, Helena Mitasova, and Ross K. Meentemeyer. "Point Density Variations in Airborne Lidar Point Clouds." Sensors 23, no. 3 (February 1, 2023): 1593. http://dx.doi.org/10.3390/s23031593.

Full text
Abstract:
In spite of increasing point density and accuracy, airborne lidar point clouds often exhibit point density variations. Some of these density variations indicate issues with point clouds, potentially leading to errors in derived products. To highlight these issues, we provide an overview of point density variations and show examples in six airborne lidar point cloud datasets that we used in our topographic and geospatial modeling research. Using the published literature, we identified sources of point density variations and the issues indicated or caused by these variations. Lastly, we discuss reducing point density variations through decimation and homogenization, and the applicability of these techniques.
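The decimation and homogenization mentioned above can be sketched as a simple grid filter that caps local point density. This is an illustrative 2D sketch, not the paper's method; all names are ours:

```python
def voxel_decimate(points, cell):
    """Keep one point per grid cell: a simple homogenisation that caps
    local point density at one point per cell (2D sketch; add z for 3D)."""
    seen = {}
    for x, y in points:
        key = (int(x // cell), int(y // cell))
        seen.setdefault(key, (x, y))  # first point wins within each cell
    return list(seen.values())

# Two tight clusters collapse to one representative point each.
dense = [(0.1, 0.1), (0.2, 0.15), (3.4, 0.2), (3.6, 0.1)]
print(voxel_decimate(dense, 1.0))  # one point per 1x1 cell
```

Real lidar toolchains offer richer variants (random thinning, keeping the lowest/highest return per cell), but the density-capping idea is the same.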
35

Schween, Jan H., Camilo del Rio, Juan-Luis García, Pablo Osses, Sarah Westbrook, and Ulrich Löhnert. "Life cycle of stratocumulus clouds over 1 year at the coast of the Atacama Desert." Atmospheric Chemistry and Physics 22, no. 18 (September 20, 2022): 12241–67. http://dx.doi.org/10.5194/acp-22-12241-2022.

Full text
Abstract:
Abstract. Marine stratocumulus clouds of the eastern Pacific play an essential role in the earth's energy and radiation budget. Parts of these clouds off the western coast of South America form the major source of water to the hyperarid Atacama Desert coastal region at the northern coast of Chile. For the first time, a full year of vertical structure observations of the coastal stratocumulus and their environment is presented and analyzed. Installed at Iquique Airport in northern Chile in 2018/2019, three state-of-the-art remote sensing instruments provide vertical profiles of cloud macro- and micro-physical properties, wind, turbulence, and temperature as well as integrated values of water vapor and liquid water. Distinct diurnal and seasonal patterns of the stratocumulus life cycle are observed. Embedded in a land–sea circulation with a superimposed southerly wind component, cloud occurrence and vertical extent reach their maxima at night and their minima at local noon. Nighttime clouds are maintained by cloud-top cooling, whereas afternoon clouds reappear within a convective boundary layer driven through local moisture advection from the Pacific. During the night, these clouds finally re-connect to the maritime clouds in the upper branch of the land–sea circulation. The diurnal cycle is much more pronounced in austral winter, with lower, thicker, and more abundant (5×) clouds than in summer. This can be associated with different sea surface temperature (SST) gradients in summer and winter, leading to a stable or neutral stratification of the maritime boundary layer at the coast of the Atacama Desert in Iquique.
36

Zhao, Xi, Xiaohong Liu, Vaughan T. J. Phillips, and Sachin Patade. "Impacts of secondary ice production on Arctic mixed-phase clouds based on ARM observations and CAM6 single-column model simulations." Atmospheric Chemistry and Physics 21, no. 7 (April 15, 2021): 5685–703. http://dx.doi.org/10.5194/acp-21-5685-2021.

Full text
Abstract:
Abstract. For decades, measured ice crystal number concentrations have been found to be orders of magnitude higher than measured ice-nucleating particle number concentrations in moderately cold clouds. This observed discrepancy reveals the existence of secondary ice production (SIP) in addition to the primary ice nucleation. However, the importance of SIP relative to primary ice nucleation remains highly unclear. Furthermore, most weather and climate models do not represent SIP processes well, leading to large biases in simulated cloud properties. This study demonstrates a first attempt to represent different SIP mechanisms (frozen raindrop shattering, ice–ice collisional breakup, and rime splintering) in a global climate model (GCM). The model is run in the single column mode to facilitate comparisons with the Department of Energy (DOE)'s Atmospheric Radiation Measurement (ARM) Mixed-Phase Arctic Cloud Experiment (M-PACE) observations. We show the important role of SIP in four types of clouds during M-PACE (i.e., multilayer, single-layer stratus, transition, and frontal clouds), with the maximum enhancement in ice crystal number concentrations up to 4 orders of magnitude in moderately supercooled clouds. We reveal that SIP is the dominant source of ice crystals near the cloud base for the long-lived Arctic single-layer mixed-phase clouds. The model with SIP improves the occurrence and phase partitioning of the mixed-phase clouds, reverses the vertical distribution pattern of ice number concentrations, and provides a better agreement with observations. The findings of this study highlight the importance of considering SIP in GCMs.
37

French, Jeffrey R., Katja Friedrich, Sarah A. Tessendorf, Robert M. Rauber, Bart Geerts, Roy M. Rasmussen, Lulin Xue, Melvin L. Kunkel, and Derek R. Blestrud. "Precipitation formation from orographic cloud seeding." Proceedings of the National Academy of Sciences 115, no. 6 (January 22, 2018): 1168–73. http://dx.doi.org/10.1073/pnas.1716995115.

Full text
Abstract:
Throughout the western United States and other semiarid mountainous regions across the globe, water supplies are fed primarily through the melting of snowpack. Growing populations place higher demands on water, while warmer winters and earlier springs reduce its supply. Water managers are tantalized by the prospect of cloud seeding as a way to increase winter snowfall, thereby shifting the balance between water supply and demand. Little direct scientific evidence exists that confirms even the basic physical hypothesis upon which cloud seeding relies. The intent of glaciogenic seeding of orographic clouds is to introduce aerosol into a cloud to alter the natural development of cloud particles and enhance wintertime precipitation in a targeted region. The hypothesized chain of events begins with the introduction of silver iodide aerosol into cloud regions containing supercooled liquid water, leading to the nucleation of ice crystals, followed by ice particle growth to sizes sufficiently large such that snow falls to the ground. Despite numerous experiments spanning several decades, no direct observations of this process exist. Here, measurements from radars and aircraft-mounted cloud physics probes are presented that together show the initiation, growth, and fallout to the mountain surface of ice crystals resulting from glaciogenic seeding. These data, by themselves, do not address the question of cloud seeding efficacy, but rather form a critical set of observations necessary for such investigations. These observations are unambiguous and provide details of the physical chain of events following the introduction of glaciogenic cloud seeding aerosol into supercooled liquid orographic clouds.
38

G, Sandhya, and Dr H. S. Guruprasad. "Security Monitoring for Multi-Cloud Native Network Service Based Functions." International Journal for Research in Applied Science and Engineering Technology 10, no. 9 (September 30, 2022): 1473–78. http://dx.doi.org/10.22214/ijraset.2022.46634.

Full text
Abstract:
Abstract: Nowadays, enterprises are adopting a cloud-native approach to provide rapid change, large scale, and resilience in their applications. Applications are built as independent services and packaged as self-contained, lightweight containers. To deliver leading applications with better performance, and to avoid getting locked into a particular cloud provider's infrastructure, enterprises choose to deploy cloud-native applications on multi-cloud infrastructure. While this native multi-cloud strategy has many benefits, it adds considerable management complexity. We propose a framework that creates an abstraction layer providing security and visibility across these multi-cloud services. We visualize metrics such as requests per second, status codes, bandwidth, and latencies for two sample API services (Users and Products), which are publicly available and deployed on Google Cloud Functions and on Cloudflare.
39

Mebel, Alexander M., Marcelino Agúndez, José Cernicharo, and Ralf I. Kaiser. "Elucidating the Formation of Ethynylbutatrienylidene (HCCCHCCC; X1A′) in the Taurus Molecular Cloud (TMC-1) via the Gas-phase Reaction of Tricarbon (C3) with the Propargyl Radical (C3H3)." Astrophysical Journal Letters 945, no. 2 (March 1, 2023): L40. http://dx.doi.org/10.3847/2041-8213/acbf41.

Full text
Abstract:
Abstract The recent astronomical detection of ethynylbutatrienylidene (HCCCHCCC)—a high-energy isomer of triacetylene (HCCCCCCH) and hexapentaenylidene (H2CCCCCC)—in TMC-1 puzzled the laboratory astrophysics community since proposed reaction pathways could not synthesize the ethynylbutatrienylidene (HCCCHCCC) under cold molecular cloud conditions. Exploiting a retrosynthesis coupled with electronic structure calculations and astrochemical modeling, we reveal that the observed fractional abundance of ethynylbutatrienylidene (HCCCHCCC) of 1.3 ± 0.2 × 10⁻¹¹ can be quantitatively replicated through the barrierless and exoergic reaction of tricarbon (C3) with the resonantly stabilized propargyl radical (C3H3) after a few 10⁵ yr—typical ages of cold molecular clouds. Our study provides persuasive evidence that previously assumed “dead” reactants such as tricarbon (C3) and the propargyl radical (C3H3) provide fundamental molecular building blocks in molecular mass growth processes leading to exotic, high-energy isomers of hydrocarbons: ethynylbutatrienylidene (HCCCHCCC).
40

Ferry, Eugene, John O Raw, and Kevin Curran. "Security evaluation of the OAuth 2.0 framework." Information & Computer Security 23, no. 1 (March 9, 2015): 73–101. http://dx.doi.org/10.1108/ics-12-2013-0089.

Full text
Abstract:
Purpose – The interoperability of cloud data between web applications and mobile devices has vastly improved over recent years. The popularity of social media, smartphones and cloud-based web services have contributed to the level of integration that can be achieved between applications. This paper investigates the potential security issues of OAuth, an authorisation framework for granting third-party applications revocable access to user data. OAuth has rapidly become an interim de facto standard for protecting access to web API data. Vendors have implemented OAuth before the open standard was officially published. To evaluate whether the OAuth 2.0 specification is truly ready for industry application, an entire OAuth client-server environment was developed and validated against the specification threat model. The research also included the analysis of the security features of several popular OAuth integrated websites and comparing those to the threat model. High-impacting exploits leading to account hijacking were identified with a number of major online publications. It is hypothesised that the OAuth 2.0 specification can be a secure authorisation mechanism when implemented correctly. Design/methodology/approach – To analyse the security of OAuth implementations in industry a list of the 50 most popular websites in Ireland was retrieved from the statistical website Alexa (Noureddine and Bashroush, 2011). Each site was analysed to identify if it utilised OAuth. Out of the 50 sites, 21 were identified with OAuth support. Each vulnerability in the threat model was then tested against each OAuth-enabled site. To test the robustness of the OAuth framework, an entire OAuth environment was required. The proposed solution would compose of three parts: a client application, an authorisation server and a resource server. The client application needed to consume OAuth-enabled services. The authorisation server had to manage access to the resource server. 
The resource server had to expose data from the database based on the authorisation the user would be given from the authorisation server. It was decided that the client application would consume emails from Google’s Gmail API. The authorisation and resource server were modelled around a basic task-tracking web application. The client application would also consume task data from the developed resource server. The client application would also support Single Sign On for Google and Facebook, as well as a developed identity provider “MyTasks”. The authorisation server delegated authorisation to the client application and stored cryptography information for each access grant. The resource server validated the supplied access token via public cryptography and returned the requested data. Findings – Two sites out of the 21 were found to be susceptible to some form of attack, meaning that 10.5 per cent were vulnerable. In total, 18 per cent of the world’s 50 most popular sites were in the list of 21 OAuth-enabled sites. The OAuth 2.0 specification is still very much in its infancy, but when implemented correctly, it can provide a relatively secure and interoperable authentication delegation mechanism. The IETF are currently addressing issues and expansions in their working drafts. Once a strict level of conformity is achieved between vendors and vulnerabilities are mitigated, it is likely that the framework will change the way we access data on the web and other devices. Originality/value – OAuth is flexible, in that it offers extensions to support varying situations and existing technologies. A disadvantage of this flexibility is that new extensions typically bring new security exploits. Members of the IETF OAuth Working Group are constantly refining the draft specifications and are identifying new threats to the expanding functionality. OAuth provides a flexible authentication mechanism to protect and delegate access to APIs. 
It solves the problem of password re-use across multiple accounts and stops the user from having to disclose their credentials to third parties. Filtering access to information by scope and giving the user the option to revoke access at any point gives the user control of their data. OAuth does raise security concerns, such as defying phishing education, but there are always going to be security issues with any authentication technology. Although several high-impacting vulnerabilities were identified in industry, the developed solution proves the predicted hypothesis that a secure OAuth environment can be built when implemented correctly. Developers must conform to the defined specification and are responsible for validating their implementation against the given threat model. OAuth is an evolving authorisation framework. It is still in its infancy, and much work needs to be done in the specification to achieve stricter validation and vendor conformity. Vendor implementations need to become better aligned in order to provide a rich and truly interoperable authorisation mechanism. Once these issues are resolved, OAuth will be on track for becoming the definitive authentication standard on the web.
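As a hedged illustration of the authorization code grant that such a client-server environment exercises, the first step (redirecting the user agent to the authorisation server) can be constructed as below. Endpoints and identifiers are hypothetical; only Python's standard library is used:

```python
from urllib.parse import urlencode

# Hypothetical authorisation endpoint for a generic OAuth 2.0 provider.
AUTHZ_ENDPOINT = "https://auth.example.com/authorize"

def build_authorization_url(client_id, redirect_uri, scope, state):
    """Step 1 of the authorization code grant: send the user agent here.

    `state` is an unguessable value the client checks when the user returns,
    mitigating the CSRF attack class discussed in the OAuth 2.0 threat model.
    """
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
        "state": state,
    }
    return f"{AUTHZ_ENDPOINT}?{urlencode(params)}"

url = build_authorization_url("mytasks-client", "https://mytasks.example/cb",
                              "tasks.read", "af0ifjsldkj")
print(url)
```

After the user consents, the authorisation server redirects back with a short-lived `code`, which the client exchanges server-side for an access token; the resource server then validates that token on each API call.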
41

Chen, Anthony, Shiwen Mao, Zhu Li, Minrui Xu, Hongliang Zhang, Dusit Niyato, and Zhu Han. "An Introduction to Point Cloud Compression Standards." GetMobile: Mobile Computing and Communications 27, no. 1 (May 17, 2023): 11–17. http://dx.doi.org/10.1145/3599184.3599188.

Full text
Abstract:
The prevalent point cloud compression (PCC) standards of today are used to encode various types of point cloud data, allowing for reasonable bandwidth and storage usage. With growing demand for high-fidelity three-dimensional (3D) models across a large variety of applications, including immersive visual communication, augmented reality (AR) and virtual reality (VR), navigation, autonomous driving, and smart cities, point clouds are seeing increasing usage and development. However, with the advancements in 3D modelling and sensing, the amount of data required to accurately depict such representations and models is ballooning to increasingly large proportions, motivating the development and standardization of point cloud compression. In this article, we provide an overview of some topical and popular MPEG point cloud compression (PCC) standards. We discuss the development and applications of the Geometry-based PCC (G-PCC) and Video-based PCC (V-PCC) standards as they escalate in importance in an era of virtual reality and machine learning. Finally, we conclude the article by describing future research directions and applications of today's PCC standards.
42

Nogherotto, Rita, Adrian Mark Tompkins, Graziano Giuliani, Erika Coppola, and Filippo Giorgi. "Numerical framework and performance of the new multiple-phase cloud microphysics scheme in RegCM4.5: precipitation, cloud microphysics, and cloud radiative effects." Geoscientific Model Development 9, no. 7 (July 27, 2016): 2533–47. http://dx.doi.org/10.5194/gmd-9-2533-2016.

Full text
Abstract:
Abstract. We implement and evaluate a new parameterization scheme for stratiform cloud microphysics and precipitation within regional climate model RegCM4. This new parameterization is based on a multiple-phase one-moment cloud microphysics scheme built upon the implicit numerical framework recently developed and implemented in the ECMWF operational forecasting model. The parameterization solves five prognostic equations for water vapour, cloud liquid water, rain, cloud ice, and snow mixing ratios. Compared to the pre-existing scheme, it allows a proper treatment of mixed-phase clouds and a more physically realistic representation of cloud microphysics and precipitation. Various fields from a 10-year long integration of RegCM4 run in tropical band mode with the new scheme are compared with their counterparts using the previous cloud scheme and are evaluated against satellite observations. In addition, an assessment using the Cloud Feedback Model Intercomparison Project (CFMIP) Observational Simulator Package (COSP) for a 1-year sub-period provides additional information for evaluating the cloud optical properties against satellite data. The new microphysics parameterization yields an improved simulation of cloud fields, and in particular it removes the overestimation of upper level cloud characteristics of the previous scheme, increasing the agreement with observations and leading to an amelioration of a long-standing problem in the RegCM system. The vertical cloud profile produced by the new scheme leads to a considerably improvement of the representation of the longwave and shortwave components of the cloud radiative forcing.
43

Martynenko, A. P. "Cloud technologies for educational institutions." CTE Workshop Proceedings 3 (March 20, 2015): 46–50. http://dx.doi.org/10.55056/cte.244.

Full text
Abstract:
Research goals: to examine the cloud solutions that leading cloud providers offer to educational institutions. Research objectives: to examine the pros and cons of cloud-based tools for use in the educational process of universities. Object of research: the university learning process. Subject of research: the use of cloud technologies in the educational process of the university. Research methods used: analysis of publications. Results of the research. Based on the analysis of scientific publications, the advantages and disadvantages of cloud technologies in education compared with traditional information technology were identified. The cloud solutions that leading providers offer to educational institutions were studied: Microsoft Office 365, Windows Azure Platform, Google Apps Education Edition, and jParus – Educational Institutions. The main conclusions and recommendations: cloud technologies hold significant promise in the field of education.
44

Dahunsi, Folasade, John Idogun, and Abayomi Olawumi. "Commercial Cloud Services for a Robust Mobile Application Backend Data Storage." Indonesian Journal of Computing, Engineering and Design (IJoCED) 3, no. 1 (March 10, 2021): 31–45. http://dx.doi.org/10.35806/ijoced.v3i1.139.

Full text
Abstract:
Rapid advancements in the infrastructure of Information and Communication Technology (ICT) have led to a radically new but now ubiquitous technology: cloud computing. Cloud computing has gracefully emerged, offering services with on-demand scalability, huge computing power, and utility-like availability, all at a relatively low cost. It has unsurprisingly become a paradigm shift in ICT, gaining adoption in all forms of application, whether personal, academic, business, or government, not only for its cost-effectiveness but also for its inherent ability to meet business goals and provide strategic ICT resources. More recently, advances in cloud computing have led to the evolution of newer commercial cloud services, one of which is the Mobile Backend as a Service (MBaaS). MBaaS is important and required for robust mobile application back-end data storage and management. Its wide adoption and importance stem from its ability to simplify application development and deployment. MBaaS is also robust, coping with errors and providing handy tools and other features that enable rapid scaffolding of mobile applications. This paper reviews Mobile Backend as a Service (MBaaS) and provides the background knowledge on cloud services and their providers needed to enable stakeholders to make informed decisions and appropriate choices.
45

Rodrigues, Pedro, Filipe Freitas, and José Simão. "QuickFaaS: Providing Portability and Interoperability between FaaS Platforms." Future Internet 14, no. 12 (November 30, 2022): 360. http://dx.doi.org/10.3390/fi14120360.

Full text
Abstract:
Serverless computing hides infrastructure management from developers and runs code on-demand, automatically scaled and billed during the code’s execution time. One of the most popular serverless backend services is called Function-as-a-Service (FaaS), in which developers are often confronted with cloud-specific requirements. Function signature requirements, and the usage of custom libraries that are unique to cloud providers, were identified as the two main reasons for portability issues in FaaS applications, leading to various vendor lock-in problems. In this work, we define three cloud-agnostic models that compose FaaS platforms. Based on these models, we developed QuickFaaS, a multi-cloud interoperability desktop tool targeting cloud-agnostic functions and FaaS deployments. The proposed cloud-agnostic approach enables developers to reuse their serverless functions in different cloud providers with no need to change code or install extra software. We also provide an evaluation that validates the proposed solution by measuring the impact of a cloud-agnostic approach on the function’s performance, compared to a non-agnostic one. The study shows that a cloud-agnostic approach does not significantly impact the function’s performance.
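The signature-portability problem described above can be illustrated with a provider-neutral handler plus thin per-provider adapters. This is a hedged sketch of the general idea, not QuickFaaS itself; the event shapes are simplified and all names are ours:

```python
# The business logic is written once against a provider-neutral signature;
# thin adapters map each provider's native event shape onto it.

def agnostic_handler(request: dict) -> dict:
    """Provider-neutral function: plain dict in, plain dict out."""
    name = request.get("name", "world")
    return {"status": 200, "body": f"hello, {name}"}

def aws_lambda_adapter(event, context):
    # AWS Lambda passes (event, context); map onto the neutral signature.
    resp = agnostic_handler(event.get("queryStringParameters") or {})
    return {"statusCode": resp["status"], "body": resp["body"]}

def gcf_adapter(request):
    # Google Cloud Functions passes a Flask-style request object.
    resp = agnostic_handler(dict(request.args))
    return resp["body"], resp["status"]

print(aws_lambda_adapter({"queryStringParameters": {"name": "faas"}}, None))
```

Only the adapters are provider-specific, so moving the function to another cloud means swapping a few lines of glue rather than rewriting the logic.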
46

Diaz, Jonathan, and Kenji Bekki. "The Effect of Drag from the Galactic Hot Halo on the Magellanic Stream and Leading Arm." Publications of the Astronomical Society of Australia 28, no. 2 (2011): 117–27. http://dx.doi.org/10.1071/as10044.

Full text
Abstract:
AbstractWe study the effect of drag induced by the Galactic hot halo on the two neutral hydrogen (HI) cloud complexes associated with the Large and Small Magellanic Clouds: the Magellanic Stream (MS) and the Leading Arm (LA). In particular, we adopt the numerical models of previous studies and re-simulate the tidal formation of the MS and LA with the inclusion of a drag term. We find that the drag has three effects which, although model-dependent, may bring the tidal formation scenario into better agreement with observations: correcting the LA kinematics, reproducing the MS column density gradient, and enhancing the formation of MS bifurcation. We furthermore propose a two-stage mechanism by which the bifurcation forms. In general, the inclusion of drag has a variety of both positive and negative effects on the global properties of the MS and LA, including their on-sky positions, kinematics, radial distances, and column densities. We also provide an argument which suggests that ram-pressure stripping and tidal stripping are mutually exclusive candidates for the formation of the MS and LA.
47

Barry, James, Stefanie Meilinger, Klaus Pfeilsticker, Anna Herman-Czezuch, Nicola Kimiaie, Christopher Schirrmeister, Rone Yousif, et al. "Irradiance and cloud optical properties from solar photovoltaic systems." Atmospheric Measurement Techniques 16, no. 20 (October 27, 2023): 4975–5007. http://dx.doi.org/10.5194/amt-16-4975-2023.

Full text
Abstract:
Abstract. Solar photovoltaic power output is modulated by atmospheric aerosols and clouds and thus contains valuable information on the optical properties of the atmosphere. As a ground-based data source with high spatiotemporal resolution it has great potential to complement other ground-based solar irradiance measurements as well as those of weather models and satellites, thus leading to an improved characterisation of global horizontal irradiance. In this work several algorithms are presented that can retrieve global tilted and horizontal irradiance and atmospheric optical properties from solar photovoltaic data and/or pyranometer measurements. The method is tested on data from two measurement campaigns that took place in the Allgäu region in Germany in autumn 2018 and summer 2019, and the results are compared with local pyranometer measurements as well as satellite and weather model data. Using power data measured at 1 Hz and averaged to 1 min resolution along with a non-linear photovoltaic module temperature model, global horizontal irradiance is extracted with a mean bias error compared to concurrent pyranometer measurements of 5.79 W m−2 (7.35 W m−2) under clear (cloudy) skies, averaged over the two campaigns, whereas for the retrieval using coarser 15 min power data with a linear temperature model the mean bias error is 5.88 and 41.87 W m−2 under clear and cloudy skies, respectively. During completely overcast periods the cloud optical depth is extracted from photovoltaic power using a lookup table method based on a 1D radiative transfer simulation, and the results are compared to both satellite retrievals and data from the Consortium for Small-scale Modelling (COSMO) weather model. Potential applications of this approach for extracting cloud optical properties are discussed, as well as certain limitations, such as the representation of 3D radiative effects that occur under broken-cloud conditions. 
In principle this method could provide an unprecedented amount of ground-based data on both irradiance and optical properties of the atmosphere, as long as the required photovoltaic power data are available and properly pre-screened to remove unwanted artefacts in the signal. Possible solutions to this problem are discussed in the context of future work.
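The validation statistic quoted in this abstract is the mean bias error between retrieved and reference irradiance. As a minimal illustration of that metric (the function name and sample values below are hypothetical, not taken from the paper):

```python
import numpy as np

def mean_bias_error(retrieved, reference):
    """Mean bias error: average signed difference between retrieved
    and reference values, e.g. irradiance in W m^-2."""
    retrieved = np.asarray(retrieved, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.mean(retrieved - reference))

# Hypothetical 1 min GHI samples (W m^-2): PV-retrieved vs. pyranometer
pv_ghi = [650.0, 655.0, 660.0]
pyr_ghi = [645.0, 650.0, 652.0]
print(mean_bias_error(pv_ghi, pyr_ghi))  # 6.0
```

Because the differences are signed, systematic over- and underestimation can cancel, which is why bias is usually reported alongside a spread measure.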
48

Gautam, Monika, and Jyoti . "Implementing Security in Fog Computing." International Journal for Research in Applied Science and Engineering Technology 10, no. 7 (July 31, 2022): 4008–12. http://dx.doi.org/10.22214/ijraset.2022.45953.

Abstract:
In fog computing, services can be hosted at end devices such as set-top boxes or access points. The infrastructure of this new distributed computing paradigm allows applications to run as close as possible to the sensed, actionable, and massive data coming from people, processes, and things. This fog computing concept, essentially cloud computing close to the 'ground', creates the automated responses that drive value. As data generation rates increase, providing efficient storage becomes a demanding task for cloud storage providers. Cloud storage providers use different techniques to improve storage efficiency, and one of the leading techniques they employ is de-duplication. Once data is deployed to cloud servers, it lies beyond the security perimeter of the data owner, so most owners prefer to outsource their data in encrypted form. We also propose an efficient encryption approach for providing security in fog computing.
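This abstract points at a well-known tension: server-side de-duplication needs identical files to produce identical stored bytes, while per-user encryption normally prevents that. Convergent encryption is one standard way to reconcile the two, though the paper's own scheme may differ. A toy sketch, with a hash-chain keystream standing in for a real cipher:

```python
import hashlib

# Convergent encryption: the key is derived from the plaintext itself,
# so identical files encrypt to identical ciphertexts and the provider
# can de-duplicate them without ever learning the key.

def convergent_key(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy keystream from the key (illustration only, not a real cipher).
    # XOR is its own inverse, so the same call decrypts.
    stream = hashlib.sha256(key + b"stream").digest()
    out = bytearray()
    for i, b in enumerate(data):
        if i > 0 and i % len(stream) == 0:
            stream = hashlib.sha256(stream).digest()
        out.append(b ^ stream[i % len(stream)])
    return bytes(out)

file_a = b"quarterly report"
file_b = b"quarterly report"  # duplicate uploaded by another user
ct_a = xor_stream(file_a, convergent_key(file_a))
ct_b = xor_stream(file_b, convergent_key(file_b))
print(ct_a == ct_b)  # True: the server can store a single copy
```

The trade-off is that convergent encryption leaks equality of files, so in practice it is combined with additional protections for predictable data.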
49

Krisanski, Sean, Mohammad Sadegh Taskhiri, Susana Gonzalez Aracil, David Herries, and Paul Turner. "Sensor Agnostic Semantic Segmentation of Structurally Diverse and Complex Forest Point Clouds Using Deep Learning." Remote Sensing 13, no. 8 (April 7, 2021): 1413. http://dx.doi.org/10.3390/rs13081413.

Abstract:
Forest inventories play an important role in enabling informed decisions to be made for the management and conservation of forest resources; however, the process of collecting inventory information is laborious. Despite advancements in mapping technologies allowing forests to be digitized in finer granularity than ever before, it is still common for forest measurements to be collected using simple tools such as calipers, measuring tapes, and hypsometers. Dense understory vegetation and complex forest structures can present substantial challenges to point cloud processing tools, often leading to erroneous measurements, and making them of less utility in complex forests. To address this challenge, this research demonstrates an effective deep learning approach for semantically segmenting high-resolution forest point clouds from multiple different sensing systems in diverse forest conditions. Seven diverse point cloud datasets were manually segmented to train and evaluate this model, resulting in per-class segmentation accuracies of Terrain: 95.92%, Vegetation: 96.02%, Coarse Woody Debris: 54.98%, and Stem: 96.09%. By exploiting the segmented point cloud, we also present a method of extracting a Digital Terrain Model (DTM) from such segmented point clouds. This approach was applied to a set of six point clouds that were made publicly available as part of a benchmarking study to evaluate the DTM performance. The mean DTM error was 0.04 m relative to the reference with 99.9% completeness. These approaches serve as useful steps toward a fully automated and reliable measurement extraction tool, agnostic to the sensing technology used or the complexity of the forest, provided that the point cloud has sufficient coverage and accuracy. Ongoing work will see these models incorporated into a fully automated forest measurement tool for the extraction of structural metrics for applications in forestry, conservation, and research.
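The DTM-extraction step described above, producing a terrain model from the terrain-labelled subset of a segmented point cloud, can be sketched under the simple assumption of keeping the lowest terrain point per grid cell (the paper's actual gridding and interpolation method may differ; all names and values below are hypothetical):

```python
import numpy as np

def dtm_from_terrain_points(points, labels, terrain_class=0, cell=1.0):
    """Grid a semantically segmented point cloud into a sparse DTM by
    taking the lowest terrain-labelled point in each (x, y) cell."""
    pts = np.asarray(points, dtype=float)
    terrain = pts[np.asarray(labels) == terrain_class]
    ix = np.floor(terrain[:, 0] / cell).astype(int)
    iy = np.floor(terrain[:, 1] / cell).astype(int)
    dtm = {}
    for cx, cy, z in zip(ix, iy, terrain[:, 2]):
        key = (int(cx), int(cy))
        if key not in dtm or z < dtm[key]:
            dtm[key] = float(z)
    return dtm  # {(cell_x, cell_y): ground elevation}

# Hypothetical labelled points [x, y, z]; label 0 = terrain, 1 = vegetation
pts = [[0.2, 0.3, 10.1], [0.7, 0.4, 10.0], [0.5, 0.5, 14.0], [1.4, 0.2, 10.3]]
labels = [0, 0, 1, 0]
print(dtm_from_terrain_points(pts, labels))  # {(0, 0): 10.0, (1, 0): 10.3}
```

Taking the per-cell minimum discards low vegetation returns that slipped through segmentation, at the cost of bias in cells where the lowest terrain return is noisy.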
50

Naud, C. M., J.-P. Muller, E. C. Slack, C. L. Wrench, and E. E. Clothiaux. "Assessment of the Performance of the Chilbolton 3-GHz Advanced Meteorological Radar for Cloud-Top-Height Retrieval." Journal of Applied Meteorology 44, no. 6 (June 1, 2005): 876–87. http://dx.doi.org/10.1175/jam2244.1.

Abstract:
Abstract The Chilbolton 3-GHz Advanced Meteorological Radar (CAMRa), which is mounted on a fully steerable 25-m dish, can provide three-dimensional information on the presence of hydrometeors. The potential for this radar to make useful measurements of low-altitude liquid water cloud structure is investigated. To assess the cloud-height assignment capabilities of the 3-GHz radar, low-level cloud-top heights were retrieved from CAMRa measurements made between May and July 2003 and were compared with cloud-top heights retrieved from a vertically pointing 94-GHz radar that operates alongside CAMRa. The average difference between the 94- and 3-GHz radar-derived cloud-top heights is shown to be −0.1 ± 0.4 km. To assess the capability of 3-GHz radar scans to be used for satellite-derived cloud-top-height validation, multiangle imaging spectroradiometer (MISR) cloud-top heights were compared with both 94- and 3-GHz radar retrievals. The average difference between 94-GHz radar and MISR cloud-top heights is shown to be 0.1 ± 0.3 km, while the 3-GHz radar and MISR average cloud-top-height difference is shown to be −0.2 ± 0.6 km. In assessing the value of the CAMRa measurements, the problems associated with low-reflectivity values from stratiform liquid water clouds, ground clutter, and Bragg scattering resulting from turbulent mixing are all addressed. It is shown that, despite the difficulties, the potential exists for CAMRa measurements to contribute significantly to liquid water cloud-top-height retrievals, leading to the production of two-dimensional transects (i.e., maps) of cloud-top height.