Academic literature on the topic 'LEADING CLOUD PROVIDER'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'LEADING CLOUD PROVIDER.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "LEADING CLOUD PROVIDER"

1

Basu, Aveek, and Sanchita Ghosh. "Implementing Fuzzy TOPSIS in Cloud Type and Service Provider Selection." Advances in Fuzzy Systems 2018 (November 15, 2018): 1–12. http://dx.doi.org/10.1155/2018/2503895.

Full text
Abstract:
Cloud computing can be considered as one of the leading-edge technological advances in the current IT industry. Cloud computing or simply cloud is attributed to the Service Oriented Architecture. Every organization is trying to utilize the benefit of cloud not only to reduce the cost overhead in infrastructure, network, hardware, software, etc., but also to provide seamless service to end users with the benefit of scalability. The concept of multitenancy assists cloud service providers to leverage the costs by providing services to multiple users/companies at the same time via shared resource. There are several cloud service providers currently in the market and they are rapidly changing and reorienting themselves as per market demand. In order to gain market share, the cloud service providers are trying to provide the latest technology to end users/customers with the reduction of costs. In such scenario, it becomes extremely difficult for cloud customers to select the best service provider as per their requirement. It is also becoming difficult to decide upon the deployment model to choose among the existing ones. The deployment models are suitable for different companies. There exist divergent criteria for different deployment models which are not tailor made for an organization. As a cloud customer, it is difficult to decide on the model and determine the appropriate service provider. The multicriteria decision making method is applied to find out the best suitable service provider among the top existing four companies and choose the deployment model as per requirement.
APA, Harvard, Vancouver, ISO, and other styles
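As a rough illustration of the ranking step behind the abstract above, the sketch below runs classical (crisp) TOPSIS over an assumed decision matrix of four hypothetical providers and four criteria. The fuzzy variant used in the paper replaces these crisp ratings with fuzzy numbers; all weights and scores here are made-up placeholders.

```python
import numpy as np

# Illustrative decision matrix: rows = candidate cloud providers,
# columns = criteria (cost, scalability, security, support). Values are assumed.
providers = ["Provider A", "Provider B", "Provider C", "Provider D"]
X = np.array([
    [0.70, 0.80, 0.60, 0.75],
    [0.60, 0.90, 0.70, 0.65],
    [0.80, 0.60, 0.80, 0.70],
    [0.65, 0.70, 0.75, 0.80],
])
weights = np.array([0.3, 0.3, 0.25, 0.15])      # assumed criterion weights
benefit = np.array([False, True, True, True])   # cost is a "cost" criterion

# 1. Vector-normalize and weight the matrix.
V = weights * X / np.linalg.norm(X, axis=0)

# 2. Ideal and anti-ideal solutions per criterion.
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

# 3. Closeness coefficient: distance to anti-ideal over total distance.
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)

for name, c in sorted(zip(providers, closeness), key=lambda t: -t[1]):
    print(f"{name}: closeness = {c:.3f}")
```

The provider with the highest closeness coefficient would be ranked first; the deployment-model choice discussed in the abstract would use the same machinery with a different criteria set.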
2

Malouf, Lina Samir. "Towards Query Processing Over Homomorphic Encrypted Data in Cloud." Association of Arab Universities Journal of Engineering Sciences 26, no. 4 (December 31, 2019): 65–72. http://dx.doi.org/10.33261/jaaru.2019.26.4.008.

Full text
Abstract:
With data growing very fast, the need to store and manage data in the cloud in a secure way is rapidly increasing, leading developers to find secure data management solutions through new technologies. One of the most advanced technologies at present is cloud computing, which functions as an online service. Cloud computing technology relies on an external provider to deliver services on demand. On the other hand, it is pay-per-use technology, which means that the user must pay for each service provided by the provider. Looking back at the literature, we find that regular database management systems with query processing capabilities do not meet the requirements of cloud computing. This paper focuses on homomorphic encryption, which is used primarily for data security within the cloud. Homomorphic encryption is an encryption technique in which specific operations can be carried out directly on encrypted data.
APA, Harvard, Vancouver, ISO, and other styles
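The abstract above turns on the homomorphic property that lets a cloud operate on ciphertexts it cannot read. Below is a minimal sketch of that property, assuming the third-party python-paillier (`phe`) package; it demonstrates additive homomorphism only and is not the query-processing scheme proposed in the paper.

```python
# Requires the third-party python-paillier package: pip install phe
from phe import paillier

# Key generation at the data owner's side.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# The client encrypts values before uploading them to the cloud.
salaries = [3200, 4100, 2800]
encrypted = [public_key.encrypt(s) for s in salaries]

# The cloud can aggregate ciphertexts without ever seeing the plaintexts:
# adding ciphertexts corresponds to adding the underlying values.
encrypted_total = sum(encrypted[1:], encrypted[0])

# Only the key holder can decrypt the result of the query.
print(private_key.decrypt(encrypted_total))  # -> 10100
```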
3

Singh, Jitendra, and Kamlesh Kumar Raghuvanshi. "Regulations and Standards in Public Cloud." Journal of Information Technology Research 13, no. 3 (July 2020): 21–36. http://dx.doi.org/10.4018/jitr.2020070102.

Full text
Abstract:
Security is a critical issue particularly in public cloud as it rests with the cloud providers. During security implementation, prevailing security threats and regulatory standards are borne in mind. Regulatory compliance varies from one cloud provider to another according to their maturity and location of the data center. Thus, subscribers need to verify the security requirement meeting their objective and the one implemented by the public cloud provider. To this end, subscribers need to visit each cloud provider's site to view the compliance. This is a time-consuming activity at the same time difficult to locate on a website. This work presents the prominent security standards suggested by the leading security institutions including NIST, CSA, ENISA, ISO, etc., that are applicable to the public cloud. A centrally-driven scheme is proposed in order to empower the subscriber to know the regulation and standards applicable according to their services need. The availability of an exhaustive list at one place will lower the users hassle at subscription time.
APA, Harvard, Vancouver, ISO, and other styles
4

R, Natarajan. "Telecom Operations in Cloud Computing." International Journal for Research in Applied Science and Engineering Technology 10, no. 1 (January 31, 2022): 1323–26. http://dx.doi.org/10.22214/ijraset.2022.40051.

Full text
Abstract:
Previous generations of wireless connectivity focused on voice and data capabilities; 5G is architected to better enable consumer business models. Edge compute (both on-prem/device-edge and provider edge) and related services to support 5G use cases appear to be the leading driver behind recent announcements. These use cases will need to be managed by our OSS/BSS for the telco operators and their customers. Keywords: AWS Telecom, Google Telecom, VMware Telecom, Red Hat Telecom
APA, Harvard, Vancouver, ISO, and other styles
5

Kouatli, Issam. "People-process-performance benchmarking technique in cloud computing environment." International Journal of Productivity and Performance Management 69, no. 9 (May 23, 2019): 1955–72. http://dx.doi.org/10.1108/ijppm-04-2017-0083.

Full text
Abstract:
Purpose Cloud computing is relatively a new type of technology demanding a new method of management techniques to attain security and privacy leading to customer satisfaction regarding “Business Protection” measure. As cloud computing businesses are usually composed of multiple colocation sites/departments, the purpose of this paper is to propose a benchmark operation to measure and compare the overall integrated people-process-performance (PPP) among different departments within cloud computing organization. The purpose of this paper is to motivate staff/units to improve the process performance and meet the standards in a competitive approach among business units. Design/methodology/approach The research method was conducted at Cirrus Ltd, which is a cloud computing service provider where a focus group consists of six IT professionals/managers. The objective of the focus group was to investigate the proposed technique by selecting the best practices relevant criteria, with the relevant sub-criteria as a benchmarking performance tool to measure PPP via an analytic hierarchy processing (AHP) approach. The standard pairwise comparative AHP scale was used to measure the performance of three different teams defined as production team, user acceptance testing team and the development team. Findings Based on best practice performance measurement (reviewed in this paper) of cloud computing, the proposed AHP model was implemented in a local medium-sized cloud service provider named “Cirrus” with their single site data center. The actual criteria relevant to Cirrus was an adaptation of the “Best practice” described in the literature. The main reason for the adaptation of criteria was that the principle of PPP assumes multiple departments/datacenters located in a different geographical area in large service providers. As Cirrus is a type of SMEs, the adaptation of performance measurement was based on teams within the same data center location. Irrelevant of this adaptation, the objective of measuring vendors KPI using the AHP technique as a specific output of PPP is also a valid situation. Practical implications This study provides guidance for achieving cloud computing performance measurement using the AHP technique. Hence, the proposed technique is an integrated model to measure the PPP under monitored cloud environment. Originality/value The proposed technique measures and manages the performance of cloud service providers that also implicitly act as a catalyst to attain trust in such high information-sensitive environment leading to organizational effectiveness of managing cloud organizations.
APA, Harvard, Vancouver, ISO, and other styles
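As a companion to the abstract above, here is a minimal sketch of the AHP step it relies on: deriving priority weights and a consistency ratio from a pairwise comparison matrix. The 3×3 matrix for the production, UAT, and development teams is an assumed example, not data from the Cirrus study.

```python
import numpy as np

# Assumed pairwise comparison matrix (standard 1-9 AHP scale) for three teams:
# production, user acceptance testing (UAT), development.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector from the principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = 0.58  # Saaty's random index for n = 3
cr = ci / ri

for team, w in zip(["production", "UAT", "development"], weights):
    print(f"{team}: weight = {w:.3f}")
print(f"consistency ratio = {cr:.3f} (acceptable if below 0.10)")
```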
6

Yuvaraj, Mayank. "Security threats, risks and open source cloud computing security solutions for libraries." Library Hi Tech News 32, no. 7 (September 7, 2015): 16–18. http://dx.doi.org/10.1108/lhtn-04-2015-0026.

Full text
Abstract:
Purpose – In recent years, a large number of organizations have found that cloud computing has many advantages leading to a surge in its adoption. Design/methodology/approach – Cloud computing involves the usage of large servers for the access of data, its storage and manipulation as well as provisioning of other services. Findings – When infrastructure, applications, data and storage are hosted by cloud providers, there are huge security risks associated with each type of service offered. Originality/value – There are a number of considerations, apart from cost which must be evaluated before choosing any particular provider. Sometimes, the physical location of the servers may also be a factor to consider, if sensitive duty is involved.
APA, Harvard, Vancouver, ISO, and other styles
7

Murad, Salah Eddin, and Salah Dowaji. "Service value optimization of cloud hosted systems using particle swarm technique." Journal of Enterprise Information Management 29, no. 4 (July 11, 2016): 612–26. http://dx.doi.org/10.1108/jeim-02-2015-0018.

Full text
Abstract:
Purpose – Cloud Computing has become a more promising technology with potential opportunities, through reducing the high cost of running the traditional business applications and by leading to new business models. Nonetheless, this technology is fraught with many challenges. From a Software as a Service (SaaS) provider perspective, deployment choices are one of the major perplexing issues in determining the degree to which the application owners’ objectives are met while considering their customers’ targets. The purpose of this paper is to present a new model that allows the service owner to optimize the resources selection based on defined metrics when responding to many customers’ with various priorities. Design/methodology/approach – More than 65 academic papers have been collected, a short list of the most related 35 papers have been reviewed, in addition to assessing the functionality of major cloud systems. A potential set of techniques has been investigated to determine the most appropriate ones. Moreover, a new model has been built and a study of different simulation platforms has been conducted. Findings – The findings demonstrate that serving many SaaS customer requests, with different agreements and expected outcomes, would have mutual influence that impact the overall provider objectives. Furthermore, this paper investigates how tagging those customers with various priorities, with reflection of their importance to the provider, permits controlling and aligning the selection of computing resources as per the current objectives and defined priorities. Research limitations/implications – This study provides researchers with a useful literature, which can assist them in relevant subject. Additionally, it uses a value-based approach and particle swarm technique to model and solve the optimization of the computing resource selection, considering different business objectives for both stakeholders, providers and customers. This study derives priority of a number of factors, by which service providers can make strong and adaptive decisions. Practical implications – The paper includes implications on how the SaaS service provider can make decisions to select the needed virtual machines type driven by his own preferences. Originality/value – This paper rests on the usage of Particle Swarm Optimization technique to optimize the business value of the service provider, as well as the usage of value-based approach. This will help model that value in order to combine the total profit of the provider and the customer satisfaction, based on the agreed budget and processing time requested by the customer. Another additional approach has been charted by using the customer severity factor that allows the provider to reflect the customer importance while making the placement decision.
APA, Harvard, Vancouver, ISO, and other styles
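The abstract above optimizes resource selection with particle swarm optimization. Below is a generic global-best PSO sketch minimizing an assumed cost-plus-SLA-penalty objective over the number of VMs; the objective function, prices, and bounds are illustrative stand-ins, not the value model from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def objective(x):
    """Assumed illustrative cost: hourly price of x VMs plus an SLA penalty
    that grows when too few VMs push the response time over budget."""
    vms = np.clip(x, 1, 50)
    price = 0.12 * vms                  # assumed $/hour per VM
    response_time = 8.0 / vms           # assumed seconds, inversely proportional
    penalty = np.maximum(0.0, response_time - 0.5) * 5.0
    return price + penalty

# Standard global-best PSO over a 1-D search space (number of VMs).
n_particles, n_iters = 20, 100
w, c1, c2 = 0.7, 1.5, 1.5
pos = rng.uniform(1, 50, n_particles)
vel = np.zeros(n_particles)
pbest, pbest_val = pos.copy(), objective(pos)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(n_iters):
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 1, 50)
    val = objective(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(f"best VM count ~ {gbest:.1f}, cost = {objective(gbest):.3f}")
```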
8

Lowe, Devesh, and Bhavna Galhotra. "An Overview of Pricing Models for Using Cloud Services with analysis on Pay-Per-Use Model." International Journal of Engineering & Technology 7, no. 3.12 (July 20, 2018): 248. http://dx.doi.org/10.14419/ijet.v7i3.12.16035.

Full text
Abstract:
Emergence of Cloud computing in recent years has provided various options to end-users w.r.t. cloud services. Different end users have different requirements for cloud services such as IAAS, PAAS & SAAS, but these services can be availed using different pricing mechanisms such as PPU, PFR, leased based, subscription based and dynamic pricing based on factors such as initial cost, lease period, QoS, Age of resources and cost of maintenance. The authors work focusses on ‘pay-per-use’ model of cloud pricing by studying various aspects of this model and comparing the current pricing rates of leading cloud service provider. Through this paper, the authors try to analyse the pricing model used by provider by comparing similar pricing offered by competitors. Authors will also try to establish the fairness of pricing as basis for designing better model for such services. The idea of pay per use has emerged to counter the rampant software piracy, while capturing the marginal and heterogeneous users who have been often found to use pirated software, as the acquisition costs for perpetual usage are too high. The marginal usage does not justify the huge capital investment of perpetual license, thereby leading to software piracy. In this paper the authors have also discussed on the Pay per use SaaS model and how it is better than the perpetual licencing, Pay per use is primarily dependent upon certain market conditions, like higher potential for piracy, lower inconvenience costs, majority of marginal users, and strong cloud network presence. Whereas perpetual licensing is important for heavy users, a market having the above-mentioned conditions will always benefit the SaaS pay per use model. So while the developer finds advantages of increased authorized user network, lower costs of marketing, enhanced customer reliability, and lesser impact of piracy, the rewards for users are even greater as they get to use the licensed full & updated versions for a small fee, even for minor everyday usage without incurring the huge expenditure on acquisition.
APA, Harvard, Vancouver, ISO, and other styles
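To make the pay-per-use versus perpetual-licence trade-off in the abstract above concrete, here is a small sketch computing the break-even usage level under assumed prices; all figures are hypothetical and are not taken from the providers compared in the paper.

```python
def pay_per_use_cost(hours, rate_per_hour):
    """Total metered cost for a given number of usage hours."""
    return hours * rate_per_hour

def perpetual_cost(license_price, annual_maintenance, years):
    """One-off licence plus maintenance over the comparison period."""
    return license_price + annual_maintenance * years

# Assumed illustrative prices.
rate = 2.50            # $ per hour of metered use
license_price = 4000   # $ one-off perpetual licence
maintenance = 600      # $ per year
years = 3

# Break-even: usage over the period at which the two models cost the same.
break_even_hours = perpetual_cost(license_price, maintenance, years) / rate
print(f"break-even usage over {years} years: {break_even_hours:.0f} hours")

for hours in (500, 1500, 3000):
    ppu = pay_per_use_cost(hours, rate)
    perp = perpetual_cost(license_price, maintenance, years)
    cheaper = "pay-per-use" if ppu < perp else "perpetual"
    print(f"{hours:>5} hours: pay-per-use ${ppu:,.0f} vs perpetual ${perp:,.0f} -> {cheaper}")
```

Marginal users fall well below the break-even point, which is the fairness argument the paper makes for metered SaaS pricing.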
9

Mangalampalli, Sudheer, Ganesh Reddy Karri, and Ahmed A. Elngar. "An Efficient Trust-Aware Task Scheduling Algorithm in Cloud Computing Using Firefly Optimization." Sensors 23, no. 3 (January 26, 2023): 1384. http://dx.doi.org/10.3390/s23031384.

Full text
Abstract:
Task scheduling in the cloud computing paradigm poses a challenge for researchers as the workloads that come onto cloud platforms are dynamic and heterogeneous. Therefore, scheduling these heterogeneous tasks to the appropriate virtual resources is a huge challenge. The inappropriate assignment of tasks to virtual resources leads to the degradation of the quality of services and thereby leads to a violation of the SLA metrics, ultimately leading to the degradation of trust in the cloud provider by the cloud user. Therefore, to preserve trust in the cloud provider and to improve the scheduling process in the cloud paradigm, we propose an efficient task scheduling algorithm that considers the priorities of tasks as well as virtual machines, thereby scheduling tasks accurately to appropriate VMs. This scheduling algorithm is modeled using firefly optimization. The workload for this approach is considered by using fabricated datasets with different distributions and the real-time worklogs of HPC2N and NASA were considered. This algorithm was implemented by using a Cloudsim simulation environment and, finally, our proposed approach is compared over the baseline approaches of ACO, PSO, and the GA. The simulation results revealed that our proposed approach has shown a significant impact over the baseline approaches by minimizing the makespan, availability, success rate, and turnaround efficiency.
APA, Harvard, Vancouver, ISO, and other styles
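The abstract above schedules tasks with firefly optimization. The sketch below shows the standard firefly update rule on a toy continuous objective; mapping firefly positions onto discrete task-to-VM assignments, as the paper does, requires an additional encoding step that is not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

def makespan_proxy(x):
    """Toy continuous stand-in for the scheduling objective (sphere function)."""
    return np.sum(x ** 2, axis=-1)

n_fireflies, dim, n_iters = 15, 5, 100
alpha, beta0, gamma = 0.2, 1.0, 1.0
X = rng.uniform(-5, 5, (n_fireflies, dim))
intensity = makespan_proxy(X)

for _ in range(n_iters):
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if intensity[j] < intensity[i]:          # j is brighter (lower objective)
                r2 = np.sum((X[i] - X[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                X[i] += beta * (X[j] - X[i]) + alpha * (rng.random(dim) - 0.5)
                intensity[i] = makespan_proxy(X[i])

print(f"best objective value: {np.min(intensity):.4f}")
```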
10

Basu, Srijita, Sandip Karmakar, and Debasish Bera. "Securing Cloud Virtual Machine Image Using Ethereum Blockchain." International Journal of Information Security and Privacy 16, no. 1 (January 2022): 1–22. http://dx.doi.org/10.4018/ijisp.295868.

Full text
Abstract:
Virtual Machine Image (VMI) is the building block of cloud infrastructure. It encapsulates the various applications and data deployed at the Cloud Service Provider (CSP) end. With the leading advances of cloud computing, comes the added concern of its security. Securing the Cloud infrastructure as a whole is based on the security of the underlying Virtual Machine Images (VMI). In this paper an attempt has been made to highlight the various risks faced by the CSP and Cloud Service Consumer (CSC) in the context of VMI related operations. Later, in this article a formal model of the cloud infrastructure has been proposed. Finally, the Ethereum blockchain has been incorporated to secure, track and manage all the vital operations of the VMIs. The immutable and decentralized nature of blockchain not only makes the proposed scheme more reliable but guarantees auditability of the system by maintaining the entire VMI history in the blockchain.
APA, Harvard, Vancouver, ISO, and other styles
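The abstract above anchors VM image integrity in an Ethereum blockchain. As a purely local illustration of the underlying idea, the sketch below fingerprints an image file and appends the digest to an append-only hash chain; the actual smart-contract integration from the paper is not shown, and the file paths in the usage comment are hypothetical.

```python
import hashlib
import json
import time

def sha256_file(path, chunk_size=1 << 20):
    """Fingerprint a virtual machine image file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

class SimpleLedger:
    """Append-only hash chain standing in for the on-chain VMI registry."""
    def __init__(self):
        self.blocks = []

    def record(self, image_id, image_digest):
        prev_hash = self.blocks[-1]["block_hash"] if self.blocks else "0" * 64
        payload = {"image_id": image_id, "digest": image_digest,
                   "timestamp": time.time(), "prev": prev_hash}
        block_hash = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()).hexdigest()
        self.blocks.append({**payload, "block_hash": block_hash})

    def verify(self, image_id, image_digest):
        """A consumer checks a downloaded image against the recorded digest."""
        return any(b["image_id"] == image_id and b["digest"] == image_digest
                   for b in self.blocks)

# Usage (paths are hypothetical):
# ledger = SimpleLedger()
# ledger.record("ubuntu-22.04-base", sha256_file("/images/ubuntu-22.04-base.qcow2"))
# print(ledger.verify("ubuntu-22.04-base", sha256_file("/tmp/downloaded.qcow2")))
```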

Books on the topic "LEADING CLOUD PROVIDER"

1

McCarthy, David. AWS: The Leading Global Cloud Service Provider Amazon Web Services 2019. Independently Published, 2019.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

McCarthy, David. AWS: All You Need to Know about the Leading Global Cloud Service Provider. Independently Published, 2019.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Falco, Gregory J., and Eric Rosenbach. Confronting Cyber Risk. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780197526545.001.0001.

Full text
Abstract:
Confronting Cyber Risk: An Embedded Endurance Strategy for Cybersecurity is a practical leadership handbook defining a new strategy for improving cybersecurity and mitigating cyber risk. Written by two leading experts with extensive professional experience in cybersecurity, the book provides CEOs and cyber newcomers alike with novel, concrete guidance on how to implement a cutting-edge strategy to mitigate an organization’s overall risk to malicious cyberattacks. Using short, real-world case studies, the book highlights the need to address attack prevention and the resilience of each digital asset while also accounting for an incident’s potential impact on overall operations. In a world of hackers, artificial intelligence, and persistent ransomware attacks, the Embedded Endurance strategy embraces the reality of interdependent digital assets and provides an approach that addresses cyber risk at both the micro level (people, networks, systems and data) and the macro level (the organization). Most books about cybersecurity focus entirely on technology; the Embedded Endurance strategy recognizes the need for sophisticated thinking about hardware and software while also extending beyond to address operational, reputational and litigation risk. This book both provides the reader with a solid grounding in important prevention-focused technologies—such as cloud-based security and intrusion detection—and emphasizes the important role of incident response. By implementing an Embedded Endurance strategy, you can guide your team to blunt major cyber incidents with preventative and resilience measures engaged systematically across your organization.
APA, Harvard, Vancouver, ISO, and other styles
4

Bouchet, Freddy, Tapio Schneider, Antoine Venaille, and Christophe Salomon, eds. Fundamental Aspects of Turbulent Flows in Climate Dynamics. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198855217.001.0001.

Full text
Abstract:
This book collects the text of the lectures given at the Les Houches Summer School on “Fundamental aspects of turbulent flows in climate dynamics”, held in August 2017. Leading scientists in the fields of climate dynamics, atmosphere and ocean dynamics, geophysical fluid dynamics, physics and non-linear sciences present their views on this fast growing and interdisciplinary field of research, by venturing upon fundamental problems of atmospheric convection, clouds, large-scale circulation, and predictability. Climate is controlled by turbulent flows. Turbulent motions are responsible for the bulk of the transport of energy, momentum, and water vapor in the atmosphere, which determine the distribution of temperature, winds, and precipitation on Earth. Clouds, weather systems, and boundary layers in the oceans and atmosphere are manifestations of turbulence in the climate system. Because turbulence remains as the great unsolved problem of classical physics, we do not have a complete physical theory of climate. The aim of this summer school was to survey what is known about how turbulent flows control climate, what role they may play in climate change, and to outline where progress in this important area can be expected, given today’s computational and observational capabilities. This book reviews the state-of-the-art developments in this field and provides an essential background to future studies. All chapters are written from a pedagogical perspective, making the book accessible to masters and PhD students and all researchers wishing to enter this field. It is complemented by online video of several lectures and seminars recorded during the summer school.
APA, Harvard, Vancouver, ISO, and other styles
5

Goswami, B. N., and Soumi Chakravorty. Dynamics of the Indian Summer Monsoon Climate. Oxford University Press, 2017. http://dx.doi.org/10.1093/acrefore/9780190228620.013.613.

Full text
Abstract:
Lifeline for about one-sixth of the world’s population in the subcontinent, the Indian summer monsoon (ISM) is an integral part of the annual cycle of the winds (reversal of winds with seasons), coupled with a strong annual cycle of precipitation (wet summer and dry winter). For over a century, high socioeconomic impacts of ISM rainfall (ISMR) in the region have driven scientists to attempt to predict the year-to-year variations of ISM rainfall. A remarkably stable phenomenon, making its appearance every year without fail, the ISM climate exhibits a rather small year-to-year variation (the standard deviation of the seasonal mean being 10% of the long-term mean), but it has proven to be an extremely challenging system to predict. Even the most skillful, sophisticated models are barely useful with skill significantly below the potential limit on predictability. Understanding what drives the mean ISM climate and its variability on different timescales is, therefore, critical to advancing skills in predicting the monsoon. A conceptual ISM model helps explain what maintains not only the mean ISM but also its variability on interannual and longer timescales.The annual ISM precipitation cycle can be described as a manifestation of the seasonal migration of the intertropical convergence zone (ITCZ) or the zonally oriented cloud (rain) band characterized by a sudden “onset.” The other important feature of ISM is the deep overturning meridional (regional Hadley circulation) that is associated with it, driven primarily by the latent heat release associated with the ISM (ITCZ) precipitation. The dynamics of the monsoon climate, therefore, is an extension of the dynamics of the ITCZ. The classical land–sea surface temperature gradient model of ISM may explain the seasonal reversal of the surface winds, but it fails to explain the onset and the deep vertical structure of the ISM circulation. While the surface temperature over land cools after the onset, reversing the north–south surface temperature gradient and making it inadequate to sustain the monsoon after onset, it is the tropospheric temperature gradient that becomes positive at the time of onset and remains strongly positive thereafter, maintaining the monsoon. The change in sign of the tropospheric temperature (TT) gradient is dynamically responsible for a symmetric instability, leading to the onset and subsequent northward progression of the ITCZ. The unified ISM model in terms of the TT gradient provides a platform to understand the drivers of ISM variability by identifying processes that affect TT in the north and the south and influence the gradient.The predictability of the seasonal mean ISM is limited by interactions of the annual cycle and higher frequency monsoon variability within the season. The monsoon intraseasonal oscillation (MISO) has a seminal role in influencing the seasonal mean and its interannual variability. While ISM climate on long timescales (e.g., multimillennium) largely follows the solar forcing, on shorter timescales the ISM variability is governed by the internal dynamics arising from ocean–atmosphere–land interactions, regional as well as remote, together with teleconnections with other climate modes. Also important is the role of anthropogenic forcing, such as the greenhouse gases and aerosols versus the natural multidecadal variability in the context of the recent six-decade long decreasing trend of ISM rainfall.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "LEADING CLOUD PROVIDER"

1

O’Brien, Martin A., David Mohally, Götz P. Brasche, and Andrea G. Sanfilippo. "Huawei and International Data Spaces." In Designing Data Spaces, 451–69. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-93975-5_27.

Full text
Abstract:
AbstractIn a digitalized and deeply interconnected industrial ecosystem, it is of paramount importance to create mechanisms that seamlessly guarantee standardization and verifiability, interoperability (including regional legislation), transparency, and trustworthiness, in particular in the intermediation of all businesses and stakeholders, including SMEs. In this respect, the International Data Spaces and GAIA-X initiatives pave the way to a framework for collaboration that use secure and trusted data services to safeguard digital sovereignty.Huawei is a leading ICT provider, operating in more than 170 countries, and an active member of more than 600 standardization bodies and industry associations, among them IDS and GAIA-X. With their international footprint, IDS and GAIA-X are of the utmost importance for Huawei in Germany, Europe, and globally. Here, we provide a brief overview of our understanding of the data ecosystem’s inherent issues, and how regulations address them, followed by more specific examples of how IDS and GAIA-X objectives can be supported by Huawei, with emphasis on validating concepts in the manufacturing domain as a prominent reference example for an important European market vertical. We illustrate the use case of an evolved implementation of Industry 4.0 that integrates machines and cloud services. 5G connectivity for trusted networked production is added to the example. We finally highlight the use of GAIA-X compliant federated AI for control, maintenance, and match-making of demand and supply actors.
APA, Harvard, Vancouver, ISO, and other styles
2

Mbandi, Josephine, and Michael Kisangari. "Data Collection Using Wireless Sensor Networks and Online Visualization for Kitui, Kenya." In African Handbook of Climate Change Adaptation, 1735–47. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-45106-6_151.

Full text
Abstract:
AbstractKenya is a developing country with a population of 47,213,282 people this comprises of 56% low-income earners. Small businesses and crop production represent 23% of the income within the country, which is at risk as soils become less productive. Various factors have led to this, climate change and land overuse being leading causes. Without adaptation, the rural to urban migration will continue to increase.Through Internet of Things (IoT) and specifically wireless sensor networks, we can change how we obtain and consume information. Small-scale farmers can collect data and in exchange receive useful information about their soils, temperature, humidity, and moisture content hence make better choices during crop production. Connected end devices bring in data, which is currently sparse in relation to small-scale farming. IoT will enable analysis and informed decision-making including crop selection, support equipment, fertilizers, irrigation, and harvesting. The cloud-based analysis will provide information useful for policy making and improvement.This chapter presents a wireless sensor network (WSN) in mesh topography using XBee communication module, communication, and raspberry pi, combined with a cloud-based data storage and analysis. We successfully set up a proof of concept to test a sensor node that sends information to a RPi and onto an online visualization platform.
APA, Harvard, Vancouver, ISO, and other styles
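The chapter above forwards sensor readings from a Raspberry Pi gateway to an online visualization platform. Below is a minimal sketch of that gateway step using the `requests` library; the endpoint URL, API key, and payload fields are hypothetical placeholders, since the chapter's actual platform and schema are not reproduced here.

```python
import json
import time

import requests  # third-party: pip install requests

# Hypothetical visualization endpoint and API key; the chapter's actual
# platform and payload schema will differ.
ENDPOINT = "https://example-visualization.invalid/api/readings"
API_KEY = "replace-with-real-key"

def publish_reading(node_id, temperature_c, humidity_pct, soil_moisture):
    """Forward one sensor-node reading from the Raspberry Pi gateway."""
    payload = {
        "node": node_id,
        "temperature_c": temperature_c,
        "humidity_pct": humidity_pct,
        "soil_moisture": soil_moisture,
        "timestamp": time.time(),
    }
    resp = requests.post(
        ENDPOINT,
        data=json.dumps(payload),
        headers={"Content-Type": "application/json", "X-Api-Key": API_KEY},
        timeout=10,
    )
    resp.raise_for_status()

# publish_reading("kitui-node-01", 28.4, 61.0, 0.23)
```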
3

Archana, Kande, V. Kamakshi Prasad, M. Ashok, and G. R. Anantha Raman. "Network Architectures and Protocols for Efficient Exploitation of Spectrum Resources in 5G." In 5G and Beyond, 45–62. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-3668-7_3.

Full text
Abstract:
AbstractWith the emergence of 5G technology with its enhanced architecture, there are unprecedented possibilities in terms of Quality of Service (QoS), data rate, latency, and capacity. This chapter throws light on different aspects of 5G technology and associated technological innovations. It has support for user diversity, devise diversity, and other dimensions. It focuses on the architecture of 5G technology, its underlying mechanisms, support for Device-to-Device (D2D) communication, Multiple-Input Multiple-Output (MIMO) enhancements, advanced interference management, enhanced utility of ultra-dense networks, spectrum sharing, and cloud technologies associated with 5G networks. It also proposes a methodology with spectrum broker with underlying components with delay-aware and energy-efficient approach to leverage 5G base stations leading to the reduction of energy consumption. The concept of queueing delays is considered while proposing the scheme for delay-sensitive communications with energy efficiency. The spectrum broker has mechanisms to achieve this. Simulation study with the proposed scheme shows the energy efficiency of the proposed scheme when compared with the state of the art. This chapter not only provides the required know-how on different technologies but also provides the simulation study that may trigger further investigation into the resource management in 5G networks.
APA, Harvard, Vancouver, ISO, and other styles
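The chapter above reasons about queueing delays when scheduling delay-sensitive traffic through the spectrum broker. A worked sketch of the basic M/M/1 sojourn-time formula often used for such back-of-the-envelope estimates follows, with assumed arrival and service rates; the chapter's own model is more detailed.

```python
def mm1_delay(arrival_rate, service_rate):
    """Mean time a request spends in an M/M/1 queue (waiting + service).

    W = 1 / (mu - lambda), valid only when lambda < mu (stable queue).
    """
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    return 1.0 / (service_rate - arrival_rate)

# Assumed illustrative figures: 800 requests/s offered to a base station
# that can serve 1000 requests/s.
lam, mu = 800.0, 1000.0
print(f"utilization       = {lam / mu:.2f}")
print(f"mean delay        = {mm1_delay(lam, mu) * 1e3:.2f} ms")
print(f"mean queue length = {lam * mm1_delay(lam, mu):.1f} requests (Little's law)")
```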
4

Riemann, Ute. "Benefits and Challenges for BPM in the Cloud." In Cloud Security, 1844–68. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8176-5.ch091.

Full text
Abstract:
Business processes are not only variable, they are dynamic as well. A key benefit of BPM is the ability to adjust processes accordingly in response to changing market requirements. In parallel to BPM, enterprise cloud computing technology has emerged to provide a more cost effective solution to businesses and services while making use of inexpensive computing solutions, which combines pervasive, internet, and virtualization technologies (). Despite the slow start the business benefits of cloud computing are as such that the transition of BPM to the cloud is now underway. Cloud services refer to the operation of a virtualized, automated, and service-oriented IT landscape that allows the flexible provision and usage-based invoicing of resources, services, and applications via a network or the Internet. The generic term “X-as-a-Service” summarized the business models delivering almost everything as a service. BPM in the cloud is often regarded as a SaaS application. More recently, BPM is being regarded as a PaaS as it facilitates the creation and deployment of applications, in this case business process solutions. The PaaS landscape is the least developed of the four cloud based software delivery models previously discussed. PaaS vendors, such as IBM, Oracle, Microsoft delivered an application platform with managed cloud infrastructure services however more recently the PaaS market has begun to evolve to include other middleware capabilities including process management. BPM PaaS is the delivery of BPM technology as a service via a cloud service provider. In order to be classified as a PaaS a BPM suite requires the following capabilities: the architecture should be multi-tenant, it should be hosted off premise and it should offer elasticity and metering by use capabilities. When we refer to BPM in the cloud what we are really referring to is a combination of BPM PaaS and BPaaS (Business Process as a Service). Business Process as a Service (BPaaS) is a set of pre-defined business processes that allows the execution of customized business processes in the cloud. BPaaS is a complete pre-integrated BPM platform hosted in the cloud and delivered as a service, for the development and execution of general-purpose business process application. Although such a service harbors an economic potential, questions that need to be answered are as follows: Can an individual and company-specific business process supported by a standardized cloud solution, or should we protect process creativity and competitive differentiation by allowing the company to design the processes individually and solely support basic data flows and structures? Does it make sense to take a software solution “out of the box” that handles both data and process in a cloud environment, or would this hinder the creativity of business (process) development leading to a lower quality of processes and consequently to a decrease in the competitive positioning of a company? How to manage the inherent compliance and security topic. Within a completely integrated business application system, all required security aspects can be implemented as safeguards with just enough money. Within the cloud, however, advanced standards and identity prove is required to monitor and measure information exchange across the federation. Thereby there seems to be no need for developing new protocols, but a standardized way to collect and evaluate the collected information.
APA, Harvard, Vancouver, ISO, and other styles
5

Yadav, Mahendra Pratap, Harishchandra A. Akarte, and Kumar Yadav. "Efficient Resources Utilization of Containerized Applications Using TOPSIS." In Recent Developments in Artificial Intelligence and Communication Technologies, 164–84. BENTHAM SCIENCE PUBLISHERS, 2022. http://dx.doi.org/10.2174/9781681089676122010011.

Full text
Abstract:
Highly demanding services require an appropriate amount of resources to manage the fluctuating workload in a cloud environment, which is a challenging task for cloud service providers over the Internet. Cloud providers offer these services to end-users with a pay-and-use model, such as utility computing. The services are offered to end-users by a cloud provider in a shareable fashion over Infrastructure-as-a-Service. So, IaaS is a type of computing service on which third parties host their application on virtualized platforms, such as either VMs or Containers. Whenever some containers are overloaded or under-loaded, it may cause SLA violation, degrade performance, consume maximum energy, and also cause minimum throughput and maximum response time. It also leads to minimizing the customer satisfaction level along with cloud providers, leading to the penalty. The services hosted on VMs or Containers are highly demanding services, and these highly demanding services are handled with the help of load balancing. Load balancing is a way to automatically transfer the incoming requests or load across a group of back-end containers. It improves the distribution of workload across multiple virtual machines. Traditionally, load balancing algorithms use one or two parameters to balance the load. In this paper, we used one of the popular optimization techniques, namely the Technique for Order of Preferences by Similarity to Ideal Solution (TOPSIS) algorithm, to manage the incoming traffic with the multiple-criteria decision-making (MCDM) technique. When the proposed technique was compared with different other techniques, such as round robin, it was found that TOPSIS gives better performance in terms of efficient resource utilization. It also minimizes the average response time, which prevents the machine from getting overloaded.
APA, Harvard, Vancouver, ISO, and other styles
6

Riemann, Ute. "Benefits and Challenges for BPM in the Cloud." In Web Services, 1681–705. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-7501-6.ch087.

Full text
Abstract:
Business processes are not only variable, they are dynamic as well. A key benefit of BPM is the ability to adjust processes accordingly in response to changing market requirements. In parallel to BPM, enterprise cloud computing technology has emerged to provide a more cost effective solution to businesses and services while making use of inexpensive computing solutions, which combines pervasive, internet, and virtualization technologies (). Despite the slow start the business benefits of cloud computing are as such that the transition of BPM to the cloud is now underway. Cloud services refer to the operation of a virtualized, automated, and service-oriented IT landscape that allows the flexible provision and usage-based invoicing of resources, services, and applications via a network or the Internet. The generic term “X-as-a-Service” summarized the business models delivering almost everything as a service. BPM in the cloud is often regarded as a SaaS application. More recently, BPM is being regarded as a PaaS as it facilitates the creation and deployment of applications, in this case business process solutions. The PaaS landscape is the least developed of the four cloud based software delivery models previously discussed. PaaS vendors, such as IBM, Oracle, Microsoft delivered an application platform with managed cloud infrastructure services however more recently the PaaS market has begun to evolve to include other middleware capabilities including process management. BPM PaaS is the delivery of BPM technology as a service via a cloud service provider. In order to be classified as a PaaS a BPM suite requires the following capabilities: the architecture should be multi-tenant, it should be hosted off premise and it should offer elasticity and metering by use capabilities. When we refer to BPM in the cloud what we are really referring to is a combination of BPM PaaS and BPaaS (Business Process as a Service). Business Process as a Service (BPaaS) is a set of pre-defined business processes that allows the execution of customized business processes in the cloud. BPaaS is a complete pre-integrated BPM platform hosted in the cloud and delivered as a service, for the development and execution of general-purpose business process application. Although such a service harbors an economic potential, questions that need to be answered are as follows: Can an individual and company-specific business process supported by a standardized cloud solution, or should we protect process creativity and competitive differentiation by allowing the company to design the processes individually and solely support basic data flows and structures? Does it make sense to take a software solution “out of the box” that handles both data and process in a cloud environment, or would this hinder the creativity of business (process) development leading to a lower quality of processes and consequently to a decrease in the competitive positioning of a company? How to manage the inherent compliance and security topic. Within a completely integrated business application system, all required security aspects can be implemented as safeguards with just enough money. Within the cloud, however, advanced standards and identity prove is required to monitor and measure information exchange across the federation. Thereby there seems to be no need for developing new protocols, but a standardized way to collect and evaluate the collected information.
APA, Harvard, Vancouver, ISO, and other styles
7

Riemann, Ute. "Benefits and Challenges for Business Process Management in the Cloud." In Web-Based Services, 2096–121. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-9466-8.ch092.

Full text
Abstract:
Business processes are not only variable they are as well dynamic. A key benefit of Business Process Management (BPM) is the ability to adjust business processes accordingly in response to changing market requirements. In parallel to BPM, enterprise cloud computing technology has emerged to provide a more cost effective solution to businesses and services while making use of inexpensive computing solutions, which combines pervasive, internet, and virtualization technologies (). Despite the slow start, the business benefits of cloud computing are as such that the transition of BPM to the cloud is now underway. Cloud services refer to the operation of a virtualized, automated, and service-oriented IT landscape allowing the flexible provision and usage-based invoicing of resources, services, and applications via a network or the internet. The generic term “X-as-a-Service” summarize the business models delivering almost everything as a service. BPM in the cloud is often regarded as a SaaS application. More recently, BPM is being regarded as a PaaS as it facilitates the creation and deployment of applications, in this case business process solutions. The PaaS landscape is the least developed of the four cloud based software delivery models previously discussed. PaaS vendors, such as IBM, Oracle, and Microsoft delivered an application platform with managed cloud infrastructure services however, more recently the PaaS market has begun to evolve to include other middleware capabilities including process management. BPM PaaS is the delivery of BPM technology as a service via a cloud service provider. For the classification as a PaaS a BPM suite requires the following capabilities: the architecture should be multi-tenant, hosting should be off premise and it should offer elasticity and metering by use capabilities. When we refer to BPM in the cloud, what we are really referring to is a combination of BPM PaaS and BPaaS (Business Process as a Service). Business Process as a Service (BPaaS) is a set of pre-defined business processes that allows the execution of customized business processes in the cloud. BPaaS is a complete pre-integrated BPM platform hosted in the cloud and delivered as a service, for the development and execution of general-purpose business process application. Although such a service harbors an economic potential there are remaining questions: Can an individual and company-specific business process supported by a standardized cloud solution, or should we protect process creativity and competitive differentiation by allowing the company to design the processes individually and solely support basic data flows and structures? Does it make sense to take a software solution “out of the box” that handles both data and process in a cloud environment, or would this hinder the creativity of business (process) development leading to a lower quality of processes and consequently to a decrease in the competitive positioning of a company? How to manage the inherent compliance and security topic. Within a completely integrated business application system, all required security aspects can be implemented as a safeguarding with just enough money. Within the cloud, however, advanced standards and identity prove is required to monitor and measure information exchange across the federation. Thereby there seems to be no need for developing new protocols, but a standardized way to collect and evaluate the collected information.
APA, Harvard, Vancouver, ISO, and other styles
8

Krishna, A. V. N. "A Randomized Cloud Library Security Environment." In Cloud Security, 1087–107. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8176-5.ch056.

Full text
Abstract:
Cloud computing is leading the technology development of today's communication scenario. This is because of its cost-efficiency and flexibility. In Cloud computing vast amounts of data are stored in varied and distributed environments, and security to data is of prime concern. RSA or Elliptic Curve Cryptography (ECC) provides a secure means of message transmission among communicating hosts using Diffie Hellman Key Exchange algorithm or ElGamal algorithm. By having key lengths of 160 bits, the ECC algorithm provides sufficient strength against crypto analysis and its performance can be compared with standard algorithms like RSA with a bit length of 1024 bits. In the present work, the plain text is converted to cipher text using RSA or ECC algorithms. As the proposed model is intended to be used in Cloud environment, a probabilistic mathematical model is also used. While the data is being retrieved from the servers, a query is being used which uses the mathematical model to search for the data which is still in encryption form. Final decryption takes place only at user's site by using the private keys. Thus the security model provides the fundamental security services like Authentication, Security, and Confidentiality to the transmitted message and also provides sufficient strength against crypto analysis in Cloud environment.
APA, Harvard, Vancouver, ISO, and other styles
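The chapter above builds on ECC with Diffie-Hellman key exchange. Below is a minimal ECDH sketch using the third-party `cryptography` package, showing two parties deriving the same session key; it stands in for the key-agreement step only, not the randomized query model described in the abstract.

```python
# Requires the third-party cryptography package: pip install cryptography
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral key pair on the same curve.
alice_key = ec.generate_private_key(ec.SECP256R1())
bob_key = ec.generate_private_key(ec.SECP256R1())

# Each side combines its private key with the peer's public key.
alice_shared = alice_key.exchange(ec.ECDH(), bob_key.public_key())
bob_shared = bob_key.exchange(ec.ECDH(), alice_key.public_key())
assert alice_shared == bob_shared

# Derive a symmetric session key from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"cloud-library-session").derive(alice_shared)
print(session_key.hex())
```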
9

Ahuja, Sanjay P., Thomas F. Furman, Kerwin E. Roslie, and Jared T. Wheeler. "System Benchmarking on Public Clouds." In Advances in Systems Analysis, Software Engineering, and High Performance Computing, 37–50. IGI Global, 2015. http://dx.doi.org/10.4018/978-1-4666-8676-2.ch004.

Full text
Abstract:
There are several public cloud providers that provide service across different cloud models such as IaaS, PaaS, and SaaS. End users require an objective means to assess the performance of the services being offered by the various cloud providers. Benchmarks have typically been used to evaluate the performance of various systems and can play a vital role in assessing performance of the different public cloud platforms in a vendor neutral manner. Amazon's EC2 Service is one of the leading public cloud service providers and offers many different levels of service. The research in this chapter focuses on system level benchmarks and looks into evaluating the memory, CPU, and I/O performance of two different tiers of hardware offered through Amazon's EC2. Using three distinct types of system benchmarks, the performance of the micro spot instance and the M1 small instance are measured and compared. In order to examine the performance and scalability of the hardware, the virtual machines are set up in a cluster formation ranging from two to eight nodes. The results show that the scalability of the cloud is achieved by increasing resources when applicable. This chapter also looks at the economic model and other cloud services offered by Amazon's EC2, Microsoft's Azure, and Google's App Engine.
APA, Harvard, Vancouver, ISO, and other styles
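As a small companion to the benchmarking study above, the sketch below shows the kind of crude CPU and memory micro-measurement such comparisons start from; the chapter itself uses established system-level benchmark suites, so treat this only as an illustration of the measurement idea, not a substitute for them.

```python
import time

def cpu_benchmark(n=200_000):
    """Rough CPU score: time a fixed amount of integer work."""
    start = time.perf_counter()
    total = 0
    for i in range(2, n):
        total += i * i % 1_000_003
    return time.perf_counter() - start

def memory_benchmark(size_mb=256):
    """Rough memory-bandwidth proxy: allocate and copy a large byte buffer."""
    data = bytearray(size_mb * 1024 * 1024)
    start = time.perf_counter()
    copy = bytes(data)              # forces a full read and write of the buffer
    elapsed = time.perf_counter() - start
    return (2 * size_mb) / elapsed  # MB moved per second (read + write)

if __name__ == "__main__":
    print(f"CPU loop time     : {cpu_benchmark():.3f} s (lower is better)")
    print(f"Memory throughput : {memory_benchmark():.0f} MB/s (higher is better)")
```

Running the same script on two instance tiers (for example, a micro spot instance and an M1 small instance) gives a first, vendor-neutral sense of the gap that the fuller benchmarks in the chapter quantify.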
10

Ahuja, Sanjay P. "On the Use of System-Level Benchmarks for Comparing Public Cloud Environments." In Handbook of Research on Cloud Computing and Big Data Applications in IoT, 24–38. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8407-0.ch002.

Full text
Abstract:
The proliferation of public cloud providers and services offered necessitate that end users have benchmarking-related information that help compare the properties of the cloud computing environment being provided. System-level benchmarks are used to measure the performance of overall system or subsystem. This chapter surveys the system-level benchmarks that are used for traditional computing environments that can also be used to compare cloud computing environments. Amazon's EC2 Service is one of the leading public cloud service providers and offers many different levels of service. The research in this chapter focuses on system-level benchmarks and looks into evaluating the memory, CPU, and I/O performance of two different tiers of hardware offered through Amazon's EC2. Using three distinct types of system benchmarks, the performance of the micro spot instance and the M1 small instance are measured and compared. In order to examine the performance and scalability of the hardware, the virtual machines are set up in a cluster formation ranging from two to eight nodes.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "LEADING CLOUD PROVIDER"

1

De Carvalho, Leonardo Rebouças, Flavio Vidal, Bruno Kamienski, and Aleteia Araujo. "How Leading Public Cloud Providers Deliver FaaS Environments: A Comparative Study." In 2023 18th Iberian Conference on Information Systems and Technologies (CISTI). IEEE, 2023. http://dx.doi.org/10.23919/cisti58278.2023.10211560.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Sharma, Abhishek, Umesh Kumar Singh, Kamal Upreti, Nishant Kumar, and Suyash Kumar Singh. "A Comparative analysis of security issues & vulnerabilities of leading Cloud Service Providers and in-house University Cloud platform for hosting E-Educational applications." In 2021 IEEE Mysore Sub Section International Conference (MysuruCon). IEEE, 2021. http://dx.doi.org/10.1109/mysurucon52639.2021.9641545.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Suciu, George, Traianlucian Militaru, Alexandru Vulpe, and Gyorgy Todoran. "CLOUD COMPUTING AND BIG DATA AS CONVERGENT TECHNOLOGIES FOR MOBILE E-LEARNING." In eLSE 2014. Editura Universitatii Nationale de Aparare "Carol I", 2014. http://dx.doi.org/10.12753/2066-026x-14-016.

Full text
Abstract:
Big data science is a powerful, pervasive force in knowledge management today, particularly for addressing the complex challenge of E-Learning. Cloud Computing comes in to provide access to entirely new education capabilities through sharing resources and services and managing and assigning resources effectively. Done right, the application of scientific principles to the creation of a true E-Learning optimization strategy can lead to significant improvements for both instructors and students. The paper is a general overview on how to provide a convergent platform that integrates cloud computing and big data for mobile E-Learning. Furthermore, we present how we use a cloud search based application to seek for weak signals in big data by analyzing multimedia data (text, voice, picture, video) and mining online social networks. Our research explains why education can no longer thrive without a science-based E-Learning platform, defines and illustrates the right science-based approach, and calls out the key features and functionalities of leading science-based E-Learning optimization systems. The paper fills a gap in the big data literature by providing an overview of big data tools running on cloud platforms, which could be applied for E-Learning strategies. In particular, given a cloud platform, we propose to leverage trivial and non-trivial connections between different curricula information and data from online social networks, in order to find patterns that are likely to provide innovative solutions to existing E-Learning problems. The aggregation of such weak signals will provide evidence of connections between student related behaviour faster and better than trivial mining of data. As a consequence, the software has a significant potential for matching E-Learning strategies and education challenges that are related in non-obvious ways.
APA, Harvard, Vancouver, ISO, and other styles
4

Meulengracht, Chresten Steen, and Henrik Meidell Poulsen. "An Optimum Cloud Based Approach to Timely and Secure After-Tax Economic Analysis and Decision Making." In SPE/IATMI Asia Pacific Oil & Gas Conference and Exhibition. SPE, 2023. http://dx.doi.org/10.2118/215410-ms.

Full text
Abstract:
The requirement for computing power in economic decision making is normally underestimated. It is already recognized in the oil and gas industry that for geologic and reservoir engineering calculations parallel computing power is necessary to carry out simulations of millions of cells. This paper explores the actual requirements for parallel computing for economic analysis covering decision making at all decision gates, and the added benefits of running calculations and databases in a secure cloud solution. Economic analysis for typical oil and gas project decisions requires extensive analysis of alternative scenarios and assumptions, and robustness evaluation such as identification of after-tax key metrics and break-even values. A specific project scenario analysis is then added to the full analysis for exploration decisions, portfolio mergers and acquisitions (M&A) decisions or selection of optimum development scenario at front end engineering and design (FEED) stage or life-extension projects. A key challenge for economists is that other disciplines do not always recognize the complexity of the analysis involved and the time required to provide the necessary decision basis for investment in multiple-billion dollar projects. Economists need to be involved at an early stage of the work process and have access to powerful computing power provided by appropriate tools in the cloud. Economic analysis is usually the last step in a series of work steps leading up to making an investment decision by management. A framework for such tools in the cloud was developed and applied to several real-case decision gate situations, including exploration, concept select, field development including life extension/enhanced recovery projects and portfolio M&A activities. Other aspects of using this framework with respect to speed, security and user friendliness were also evaluated. An appropriate framework for covering all requirements for timely and secure after-tax economic analysis and decision making was successfully developed and applied to several real cases. This framework allows economists to define inputs and receive results in a fast and timely manner. Traditionally the discipline of economic analysis has not been considered computer-intensive or manpower-intensive, but this paper demonstrates that cloud computing is necessary and provides significant benefits and allows economic analysis to be performed faster and more timely, and still be very secure. This novel cloud-based approach to full and extensive economic analysis fulfills all requirements for fast, timely and secure economic decision making at all decision gates including exploration, concept select, field development incl. life extension/enhanced recovery projects and portfolio M&A activities.
APA, Harvard, Vancouver, ISO, and other styles
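The paper above centres on after-tax key metrics and break-even values. A deliberately simplified sketch of an after-tax NPV and a break-even oil price found by bisection follows; the production profile, fiscal terms, and prices are assumed placeholders, and real decision-gate analysis involves far richer tax and scenario modelling, which is the paper's argument for cloud computing power.

```python
def after_tax_npv(oil_price, production, opex_per_bbl, capex, tax_rate, discount_rate):
    """Very simplified after-tax NPV: one capex outlay, flat price and flat tax."""
    npv = -capex
    for year, barrels in enumerate(production, start=1):
        pre_tax = (oil_price - opex_per_bbl) * barrels
        after_tax = pre_tax * (1 - tax_rate)
        npv += after_tax / (1 + discount_rate) ** year
    return npv

# Assumed illustrative inputs: 10-year declining production profile.
production = [2.0e6 * 0.88 ** t for t in range(10)]   # barrels per year
params = dict(production=production, opex_per_bbl=18.0, capex=450e6,
              tax_rate=0.40, discount_rate=0.10)

print(f"NPV at $70/bbl: ${after_tax_npv(70.0, **params) / 1e6:,.0f} million")

# Break-even price: bisect for the price at which NPV crosses zero.
lo, hi = 10.0, 150.0
while hi - lo > 0.01:
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if after_tax_npv(mid, **params) < 0 else (lo, mid)
print(f"break-even price ~ ${hi:.2f}/bbl")
```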
5

Huber, Florian, Jesse Amato-Grill, Alexei Bylinskii, Sergio H. Cantu, Ming-Guang Hu, Donggyu Kim, Alexander Lukin, Nate Gemelke, and Alexander Keesling. "Cloud-Accessible, Programmable Quantum Simulator Based on Two-Dimensional Neutral Atom Arrays." In Quantum 2.0. Washington, D.C.: Optica Publishing Group, 2022. http://dx.doi.org/10.1364/quantum.2022.qw3a.2.

Full text
Abstract:
Neutral atom arrays recently emerged as one the leading platforms for large-scale quantum computing and simulations [1, 2]. These systems offer a variety of possible qubit encodings with long coherence times along with exceptional programmability and reconfigurability of the array geometry and qubit connectivity. In addition, strong, highly coherent coupling between the qubits can be achieved using Rydberg states of the atoms. QuEra provides a cloud-accessible, programmable 256-qubit quantum simulator based on a two-dimensional array of Rubidium-87 atoms in reconfigurable optical tweezers.
APA, Harvard, Vancouver, ISO, and other styles
6

Suciu, George, Muneeb Anwar, and Roxana Mihalcioiu. "VIRTUALIZED VIDEO AND CLOUD COMPUTING FOR EFFICIENT ELEARNING." In eLSE 2017. Carol I National Defence University Publishing House, 2017. http://dx.doi.org/10.12753/2066-026x-17-114.

Full text
Abstract:
Multimedia content has become more and more popular in current eLearning platforms. However, there are still many use cases which require virtualized resources from cloud platforms: gamification, video analytics and surveillance during exams, video conferencing for coaching, and video streaming (live and video-on-demand) services. This paper presents how video services can exploit virtualization techniques to provide a ubiquitous video learning experience. Virtualization allows multiple operating systems to run on a single physical system. Virtualization shares the underlying hardware resources, providing multiple execution environments for software partitioning and time-sharing. Furthermore, virtualization is the basis for cloud computing, which offers adaptable and secure sharing of resources as a service, leading to ease of use for different types of computing platforms. This gives teachers the ability to deploy their curricula to new channels and to promote them with efficient video communication solutions. It is expected, from this point of view, that virtualization will contribute in a beneficial way to these use cases. The main concept we present in this paper is how to virtualize video services and use them in combination with other eLearning techniques, where the virtualization technique allows full server machines to be virtualized along with the whole operating system. The main objective is to demonstrate that they can be monitored and easily maintained, while remaining extremely secure and providing uninterrupted services. Furthermore, virtualized video using the cloud would establish and provide quality eLearning relationships regardless of distance, devices, or operating systems, being accessible through a web browser.
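As one hedged illustration of how such a virtualized video service might be provisioned programmatically, the sketch below uses the Docker SDK for Python to launch a containerized streaming server on shared hardware. Containers are used here as a lightweight stand-in for the full-machine virtualization the abstract describes, and the image name, ports, and volume paths are assumptions made for the example.

```python
import docker  # Docker SDK for Python (pip install docker)

# Connect to the local container runtime; in a cloud deployment this could just as well
# be a remote Docker endpoint or be replaced by a Kubernetes manifest.
client = docker.from_env()

# Launch an illustrative RTMP/HLS streaming server for lecture video.
# Image name, ports, and the recordings path are assumptions, not from the paper.
stream = client.containers.run(
    "tiangolo/nginx-rtmp",          # community image with an RTMP module (illustrative)
    name="elearning-video",
    ports={"1935/tcp": 1935, "80/tcp": 8080},
    volumes={"/srv/lectures": {"bind": "/var/recordings", "mode": "rw"}},
    restart_policy={"Name": "unless-stopped"},
    detach=True,
)

# Basic monitoring: the same API exposes status and resource statistics, which
# supports the maintainability and uptime goals mentioned in the abstract.
stream.reload()
print(stream.name, stream.status)
```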
APA, Harvard, Vancouver, ISO, and other styles
7

Beinoravičius, Darijus, and Violeta Keršulienė. "PRIVACY PROTECTION IN THE TRANSMISSION OF PERSONAL DATA IN BUSINESS – INSIGHTS FROM LITHUANIA." In 12th International Scientific Conference „Business and Management 2022“. Vilnius Gediminas Technical University, 2022. http://dx.doi.org/10.3846/bm.2022.927.

Full text
Abstract:
In recent decades, extremely rapid technological advances have changed many areas of daily life and the business environment. These advances are leading to the increasing use of electronic communications networks and cloud technologies by individuals, businesses, and organizations to provide services and to store and manage records, especially in the electronic space. The increasing use of these connections offers an unprecedented opportunity to systematically collect and use a variety of data (including personal data) for different purposes. Information and data collected and processed with the help of technology are used not only to meet the needs of natural and legal persons but for various other reasons too. In the context of the collection and use of personal data, which is very widespread in business relations, ensuring the individual's right to privacy becomes problematic, especially if the data have to be transferred to third countries outside the EU. The authors of the article provide an example of how a case of data transfer to a third party was resolved in the Lithuanian courts. The article also provides insights into how data transfers to third (non-EU) countries will change under the Standard Contractual Clauses (SCCs), which will take effect on December 27, 2022.
APA, Harvard, Vancouver, ISO, and other styles
8

Šidlovský, Marko, Filip Ravas, and Václav Jirovský. "Uniqway - students' carsharing project transforms mobility." In FISITA World Congress 2021. FISITA, 2021. http://dx.doi.org/10.46720/f2021-dgt-040.

Full text
Abstract:
"When we try to have a clear look at the evolution of transportation, we could see bright beginnings with shady todays. Many governmental organizations, institutions, technological companies, and even NGOs are trying to deal with current challenges in transport, mostly resulting in costly congestions. Moreover, almost everybody is expecting the change will be instant. A scientific view of the topic cannot be that simple. The current world of transport offers many data-based so-called solutions dealing with congestions. Furthermore, we have new transport devices, smaller, efficient, single purpose. Last but not least, old technological tools are being transformed by a new approach to the relation between the transportation provider and its user. We can see navigation apps enhanced with real-time traffic data promoting selfish behavior leading to the devastation of calm neighborhoods and local roads by directing high traffic through such places. History of narrow city streets is thwarted by oversized shared bicycles waiting for their next rider or literally blocked by piles of e-scooters. Hence, the environment is adapting to these novelties. Often it is performed in the most convenient way, not in an optimal manner. One of the new approaches to old types of transport is carsharing – user still drives a car by himself, but he does not own the vehicle. It is expected that the current global vehicle fleet of private cars could be reduced almost by 30% if proper carsharing systems are widely adopted. However, proper adoption and time to achieve it are the most significant challenges. The average lifecycle of a single vehicle ends after ca. twenty years of service. Technologies, which allowed the rise of carsharing exist and have widely penetrated into the market in the last few years, even though the carsharing itself is known since ca. 1950s. Therefore, if we would like to observe the real benefit of carsharing, we need to wait until the lifecycle of at least 30% of vehicles will end, while the carsharing is highly available to anybody anywhere. It could be a very long time. On the other hand, current trends in ownership show a different approach to the new generations. Some of the studies doubt the longevity of their alternative attitude, as the needs change with age, family life, employment, etc. Nevertheless, dealing with the young generation, such as students, means dealing with people without a car and relatively low income. Thus, on the one hand, it is an excellent opportunity to reduce the number of stationary vehicles on the streets in the future, but it is a tough challenge in promoting relatively expensive service. Project Uniqway is a carsharing service entirely developed and operated by students with the financial support of ŠKODA AUTO. The student team's limited capacity and focus on development resulted in heavy deployment automation and migration from custom high maintenance and virtual machine-based solutions to cloud-based infrastructure. Challenges of this transformation and data-driven approach pointing out specifics of immensely successful carsharing service in the Czech Republic are unveiled in the proposed article."
APA, Harvard, Vancouver, ISO, and other styles
9

Zollo, Alessandra Lucia, and Edoardo Bucchignani. "A Tool for Remote Detection and Nowcasting of In-Flight Icing Using Satellite Data." In International Conference on Icing of Aircraft, Engines, and Structures. 400 Commonwealth Drive, Warrendale, PA, United States: SAE International, 2023. http://dx.doi.org/10.4271/2023-01-1489.

Full text
Abstract:
In-flight icing is a major weather hazard to aviation; therefore, the remote detection of meteorological conditions leading to icing is a highly desired goal for the scientific community. In 2017, the Meteorological Laboratory of CIRA developed a satellite-based tool for in-flight icing detection in collaboration with the Italian Air Force Meteorological Service. Then, in the framework of the European project SENS4ICE, a further maturation of the previously developed algorithm was achieved in order to also consider Supercooled Large Drop (SLD) icing conditions. The tool relies on high-resolution satellite products based on Meteosat Second Generation (MSG) data. The aim of this product is to identify areas potentially affected by an in-flight icing hazard, using information about the properties of clouds, remotely inferred from satellite, and the set of experimental curves and envelopes describing the interrelationship of icing-related cloud variables that represent the icing reference certification rules, namely Appendix C and Appendix O to FAA 14 CFR Part 25 / EASA CS-25. Furthermore, starting from this detection product, a nowcasting tool has been developed with the aim of forecasting the current icing conditions over a short period ahead. In the present work an overall description of the implemented tools for detection and nowcasting of icing conditions is provided. These tools will be used during the SENS4ICE flight test campaign, to be held in April 2023, which represents a good opportunity to validate them and to identify steps for future enhancements.
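A minimal sketch of the detection idea, assuming hypothetical satellite-derived fields: pixels are flagged when cloud-top temperature, liquid water path, and cloud phase fall inside an envelope of icing-favourable values. The thresholds and the phase coding below are placeholders chosen for illustration and do not reproduce the Appendix C / Appendix O certification envelopes or the CIRA tool.

```python
import numpy as np

def icing_risk_mask(cloud_top_temp_c, liquid_water_path, cloud_phase):
    """
    Flag satellite pixels whose cloud properties fall inside an *illustrative*
    icing envelope. The thresholds are placeholders for the example and do not
    reproduce the Appendix C / Appendix O certification envelopes.
    """
    supercooled = (cloud_top_temp_c <= 0.0) & (cloud_top_temp_c >= -40.0)
    enough_liquid = liquid_water_path > 50.0      # g/m^2, illustrative threshold
    liquid_phase = cloud_phase == 1               # 1 = liquid in this hypothetical coding
    return supercooled & enough_liquid & liquid_phase

# Hypothetical 3x3 tile of MSG-derived cloud properties.
ctt = np.array([[-12.0,  -5.0,   3.0],
                [-25.0, -45.0,  -8.0],
                [ -2.0, -15.0, -18.0]])           # cloud-top temperature, deg C
lwp = np.array([[ 80.0,  20.0,  90.0],
                [120.0,  60.0,  70.0],
                [ 10.0,  55.0, 200.0]])           # liquid water path, g/m^2
phase = np.array([[1, 1, 1],
                  [1, 2, 1],
                  [1, 1, 2]])                     # 1 = liquid, 2 = ice (hypothetical)

print(icing_risk_mask(ctt, lwp, phase))
```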
APA, Harvard, Vancouver, ISO, and other styles
10

Melvin, Jharge S. R., B. Rokesh, S. Dheepajyothieshwar, and K. Akila. "Driver Yawn Prediction Using Convolutional Neural Network." In International Research Conference on IOT, Cloud and Data Science. Switzerland: Trans Tech Publications Ltd, 2023. http://dx.doi.org/10.4028/p-43g8x2.

Full text
Abstract:
Driver weariness is the leading cause of road accidents, according to numerous studies. Computer vision algorithms have shown promise in detecting indicators of exhaustion from facial motions like yawning. Precise and consistent yawning recognition is difficult in the real-world driving environment due to the varied facial gestures and expressions of drivers. Yawning causes a mouth deformation within a variety of facial activities and expressions. This paper provides a novel approach based on minor facial motion identification to address the aforementioned concerns. We offer a network for nuanced face activity recognition. Bidirectional and 3D convolutional networks are used in this network. A keyframe selection technique is used to discover the most informative frames from delicate face gestures. This method employs photo histograms to swiftly eliminate redundant frames and the median absolute deviation to locate outliers. A variety of tests are also run on the method.
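A hedged sketch of the keyframe-selection step described above: consecutive-frame grayscale-histogram differences are used to discard near-duplicate frames, and a median-absolute-deviation (MAD) rule keeps the outlier frames most likely to contain a distinct facial motion. OpenCV and NumPy are used for illustration, and the bin count and MAD multiplier are assumptions, not values from the paper.

```python
import cv2
import numpy as np

def select_keyframes(frames, hist_bins=32, mad_k=3.0):
    """
    Illustrative keyframe selection: grayscale histogram differences prune
    near-duplicate frames, and a median-absolute-deviation (MAD) test keeps
    only the outlier frames likely to contain a distinct facial motion.
    Thresholds are assumptions for the example.
    """
    hists = []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [hist_bins], [0, 256]).ravel()
        hists.append(hist / (hist.sum() + 1e-9))      # normalize per frame

    # L1 distance between consecutive frame histograms.
    diffs = np.array([np.abs(hists[i] - hists[i - 1]).sum()
                      for i in range(1, len(hists))])

    # MAD-based outlier rule: large histogram jumps mark candidate keyframes.
    median = np.median(diffs)
    mad = np.median(np.abs(diffs - median)) + 1e-9
    return [i + 1 for i, d in enumerate(diffs) if (d - median) / mad > mad_k]

# Usage sketch on a hypothetical video file:
# cap = cv2.VideoCapture("driver.mp4")
# frames = []
# ok, frame = cap.read()
# while ok:
#     frames.append(frame)
#     ok, frame = cap.read()
# print(select_keyframes(frames))
```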
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "LEADING CLOUD PROVIDER"

1

Abell, Thomas, Arndt Husar, and Lim May-Ann. Cloud Computing as a Key Enabler for Tech Start-Ups across Asia and the Pacific. Asian Development Bank, July 2021. http://dx.doi.org/10.22617/wps210253-2.

Full text
Abstract:
New enterprises that produce digital solutions for businesses, public institutions, civil society, and consumers play a vital role in shaping digital economies. These dynamic start-ups most effectively integrate leading talent and sources of capital. They are driven by an urgency to succeed quickly—if they do not, they will then seek to deploy skills and resources more effectively. Governments need to establish or refine policies and mechanisms that foster vibrant start-up ecosystems, enabled by foundational technologies such as cloud computing. This paper provides an overview of the opportunities and challenges involved and suggests how policymakers can help start-ups make the most of cloud-computing technologies.
APA, Harvard, Vancouver, ISO, and other styles
2

Kolgatin, Oleksandr H., Larisa S. Kolgatina, Nadiia S. Ponomareva, and Ekaterina O. Shmeltser. Systematicity of students’ independent work in cloud learning environment. [б. в.], September 2019. http://dx.doi.org/10.31812/123456789/3247.

Full text
Abstract:
The paper deals with the problem of out-of-class students' independent work in an information and communication learning environment based on cloud technologies. Results of a survey among students of a pedagogical university are discussed. The students answered questions about the systematicity of their learning activity and offered suggestions for improving it. It is determined that the leading problems are the need for more careful instruction on how to complete the tasks, insufficient experience in self-management, and a lack of internal motivation. Most of all, students recommend providing tasks with detailed instructions (oral or written) and paying attention to careful planning of the time necessary for full completion of the task. It is pointed out that such complicated requirements can be satisfied only by the complex use of information and communication technologies together with an automated system of pedagogical diagnostics. Some requirements for the management of students' out-of-classroom independent work are formulated as a result of this discussion.
APA, Harvard, Vancouver, ISO, and other styles
3

Modlo, Yevhenii O., Serhiy O. Semerikov, Stanislav L. Bondarevskyi, Stanislav T. Tolmachev, Oksana M. Markova, and Pavlo P. Nechypurenko. Methods of using mobile Internet devices in the formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects. [б. в.], February 2020. http://dx.doi.org/10.31812/123456789/3677.

Full text
Abstract:
An analysis of the experience of professional training of bachelors of electromechanics in Ukraine and abroad made it possible to determine that one of the leading trends in its modernization is the synergistic integration of various engineering branches (mechanical, electrical, and electronic engineering and automation) in mechatronics for the purpose of design, manufacture, operation, and maintenance of electromechanical equipment. Teaching mechatronics provides for the meaningful integration of the various disciplines of professional and practical training of bachelors of electromechanics based on the concept of modeling, and the technological integration of various organizational forms and teaching methods based on the concept of mobility. Within this approach, the leading learning tools for bachelors of electromechanics are mobile Internet devices (MID) – multimedia mobile devices that provide wireless access to information and communication Internet services for collecting, organizing, storing, processing, transmitting, and presenting all kinds of messages and data. The authors reveal the main possibilities of using MID in learning: ensuring equal access to education, personalizing learning, providing instant feedback and evaluation of learning outcomes, enabling mobile learning, making productive use of time spent in classrooms, creating mobile learning communities, supporting situated learning, developing continuous seamless learning, bridging the gap between formal and informal learning, minimizing educational disruption in conflict and disaster areas, assisting learners with disabilities, improving the quality of communication and the management of the institution, and maximizing cost-efficiency. The bachelor of electromechanics competency in modeling of technical objects is a personal and vocational ability that includes a system of knowledge, skills, and experience in learning and research activities on modeling mechatronic systems, and a positive value attitude towards it; a bachelor of electromechanics should be ready and able to use methods and software/hardware modeling tools for process analysis, system synthesis, and evaluation of their reliability and effectiveness in solving practical problems in the professional field. The competency structure of the bachelor of electromechanics in the modeling of technical objects is reflected in three groups of competencies: general scientific, general professional, and specialized professional. The technique of using MID in teaching bachelors of electromechanics the modeling of technical objects is implemented through an appropriate methodic of use, whose components are partial methods for using MID in the formation of the general scientific component of the bachelor of electromechanics competency in modeling of technical objects; these are disclosed using the example of the academic disciplines "Higher mathematics", "Computers and programming", "Engineering mechanics", and "Electrical machines". The leading tools for the formation of the general scientific component of the bachelor in electromechanics competency in modeling of technical objects are augmented-reality mobile tools (to visualize the objects' structure and modeling results), mobile computer mathematical systems (universal tools used at all stages of modeling learning), cloud-based spreadsheets (as modeling tools) and text editors (to write the program description of a model), mobile computer-aided design systems (to create and view the physical properties of models of technical objects), and mobile communication tools (to organize joint activity in modeling).
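For illustration of the kind of technical-object model referred to above (here drawn from the "Electrical machines" discipline), the short Python sketch below computes the steady-state speed-torque characteristic of a separately excited DC motor; the same model could equally be built in a cloud-based spreadsheet or a mobile computer mathematics system. All parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Steady-state model of a separately excited DC motor:
#   U = E + I*R,  E = k*omega,  T = k*I
# so omega(T) = U/k - R*T/k**2, the familiar drooping speed-torque line.
# Parameter values are illustrative.

U = 220.0   # armature supply voltage, V
R = 1.2     # armature resistance, ohm
k = 1.8     # machine constant, V*s/rad (numerically equal to N*m/A)

def speed(torque_load):
    """Steady-state angular speed (rad/s) for a given load torque (N*m)."""
    return U / k - R * torque_load / k**2

for T in np.linspace(0.0, 40.0, 5):
    print(f"T = {T:5.1f} N*m  ->  omega = {speed(T):7.2f} rad/s")
```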
APA, Harvard, Vancouver, ISO, and other styles