Dissertations / Theses on the topic 'Cloud analysis'

To see the other types of publications on this topic, follow the link: Cloud analysis.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 dissertations / theses for your research on the topic 'Cloud analysis.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Szczodrak, Malgorzata. "Variability of cloud optical depth and cloud droplet effective radius in layer clouds : satellite based analysis." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape15/PQDD_0019/NQ27255.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Brösamlen, Gerd. "Radiative transfer in lognormal multifractal clouds and analysis of cloud liquid water data." Thesis, McGill University, 1994. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=68158.

Abstract:
The study of radiative transfer in multifractal clouds is of great interest, an important application being to Global Climate Models. In this work we develop a formalism analogous to the multifractal singularity formalism for understanding photon scattering statistics in radiative transfer in multifractals, and test the results numerically on lognormal multifractals. Although the results are only exactly valid in the thick cloud limit, the approximation is found to be quite accurate down to optical thicknesses of $\tau \approx 1$-10, so the results may be widely applicable. Furthermore we show the possibility of "renormalizing" the multifractal by replacing it with a near-equivalent homogeneous medium but with a "renormalized" optical thickness $\tau^{1/(1+C_1)}$, where $C_1$ is the codimension of the mean singularity of the cloud. We argue that this approximation is likely to continue to be valid for multiple scattering, and is also compatible with recent results for diffusion on multifractals. Finally we analyze cloud liquid water content data and estimate the universal multifractal indices. We find that the scaling is respected over the whole range 5 m-330 km and that the cloud can in fact be reasonably described by a lognormal multifractal.
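The renormalization described in this abstract can be stated compactly; the following is an illustrative sketch using the abstract's own symbols ($\tau$ the optical thickness, $C_1$ the codimension of the mean singularity), not a reproduction of the thesis' derivation:

```latex
% A multifractal cloud of optical thickness \tau reproduces the transfer
% statistics of a homogeneous medium with a reduced effective thickness:
\tau_{\mathrm{eff}} = \tau^{1/(1+C_1)}, \qquad C_1 > 0,
% so \tau_{\mathrm{eff}} < \tau for \tau > 1: inhomogeneity lets more
% radiation through than a homogeneous cloud of the same mean thickness.
```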
3

Williams, Robyn D. "Studies of Mixed-Phase Cloud Microphysics Using An In-Situ Unmanned Aerial Vehicle (UAV) Platform." Thesis, Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/7252.

Abstract:
Cirrus clouds cover between 20% and 50% of the globe and are an essential component of the climate. Improved understanding of ice cloud microphysical properties is contingent on acquiring and analyzing in-situ and remote sensing data from cirrus clouds. In-situ observations of the microphysical properties of ice and mixed-phase clouds using the mini-Video Ice Particle Sizer (mini-VIPS) aboard robotic unmanned aerial vehicles (UAVs) provide a promising and powerful platform for obtaining valuable data in a cost-effective, safe, and long-term manner. The purpose of this study is to better understand cirrus microphysical properties by analyzing the effectiveness of the mini-VIPS/UAV in-situ platform. The specific goals are: (1) to validate mini-VIPS performance by comparing mini-VIPS data retrieved during an Arctic UAV mission with data retrieved from the millimeter-wavelength cloud radar (MMCR) at the Barrow ARM/CART site; and (2) to analyze mini-VIPS data to survey the properties of high-latitude mixed-phase clouds. The intercomparison between in-situ and remote sensing measurements was carried out by comparing reflectivity values calculated from in-situ measurements with observations from the MMCR facility. Good agreement between observations and measurements is obtained during the time frame in which the sampled volume was saturated with respect to ice. We have also shown that the degree of closure between calculated and observed reflectivity strongly correlates with the assumption of ice crystal geometry observed in the mini-VIPS images. The good correlation increases confidence in the mini-VIPS and MMCR measurements. Finally, the size distribution and ice crystal geometry obtained from the data analysis are consistent with published literature for similar conditions of temperature and ice supersaturation.
4

Self, Lance. "Development and Analysis Cloud." International Foundation for Telemetering, 2010. http://hdl.handle.net/10150/605938.

Abstract:
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California
The development and analysis cloud is a rapid development system being designed to support the Air Force Research Lab (AFRL) Simulation & Technology Assessment Branch. The purpose is to isolate research, development, test, and evaluation of unique software within a Zone D enclave [1], allowing researchers and analysts to develop and test software free of the many IT requirements that hamper development and without risk of contaminating the overall Air Force network. The cloud system is being designed so that researchers and analysts will utilize Software as a Service (SaaS) models. Such a model makes transparent to users such things as where the software originates and any licensing concerns. Utilities, tools, and other enhancing software that users need are published, and using them frees developers to focus on their specific development efforts rather than tertiary development modules.
5

CENCI, ANNALISA. "An infrastructure for decision-making to support neonatal clinical care and research." Doctoral thesis, Università Politecnica delle Marche, 2018. http://hdl.handle.net/11566/253144.

Abstract:
Preterm infants' cribs in the Neonatal Intensive Care Unit (NICU) of the "Women's and Children's Hospital G. Salesi" of Ancona are surrounded by many devices for the monitoring, diagnosis and treatment of diseases, which provide a huge amount of data that, until now, was only displayed on monitors and periodically transcribed in a paper medical record. These manual notes were then regularly, but not immediately, transcribed in an electronic sheet on a NICU PC, with the risk of errors and omissions. In this context, physicians have expressed the need to automatically gather data from all these devices to ensure that no key details for patient care are overlooked, as they are aware that the automation of this process could improve the implementation of the procedures of their daily clinical practice. The objective of this thesis is to allow the interfacing of NICU biomedical instruments into a single cloud-based infrastructure that enables communication between different medical devices and a unique DB. The proposed architecture permits the automation of the process of device data collection, transmission, storage, processing and availability for medical staff. This is guaranteed through the implementation of a web interface that exceeds the functionality of a simple Electronic Medical Record, thanks to the development of innovative clinical tools. They contain data extraction and analysis techniques and decision-making algorithms, which output advice, reminders, alarms, and indicators that can support physicians in predicting and diagnosing diseases, such as neonatal jaundice or motor disabilities, in monitoring physiological parameters, such as respiratory rate and growth parameters, and in making decisions about clinical problems, such as daily nutritional intakes and follow-ups. All the solutions described above have been validated through experiments conducted in the NICU under the supervision of the physicians and the Head Physician.
6

Pettegrew, Brian P. "Analysis of cloud and cloud-to-ground lightning in winter convection." Diss., Columbia, Mo. : University of Missouri-Columbia, 2008. http://hdl.handle.net/10355/5586.

Abstract:
Thesis (Ph. D.)--University of Missouri-Columbia, 2008.
The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file (viewed on June 15, 2009). Vita. Includes bibliographical references.
7

Larson, Bridger Ronald. "Selecting Cloud Platform Services Based On Application Requirements." BYU ScholarsArchive, 2016. https://scholarsarchive.byu.edu/etd/6129.

Abstract:
As virtualization platforms or cloud computing have become more of a commodity, many more organizations have been utilizing them. Many organizations and technologies have emerged to fulfill those cloud needs. Cloud vendors provide similar services, but the differences can have significant impact on specific applications. Selecting the right provider is difficult and confusing because of the number of options. It can be difficult to determine which application characteristics will impact the choice of implementation. There has not been a concise process to select which cloud vendor and characteristics are best suited for the application requirements and organization requirements. This thesis provides a model that identifies crucial application characteristics, organization requirements and also characteristics of a cloud. The model is used to analyze the interaction of the application with multiple cloud platforms and select the best option based on a suitability score. Case studies utilize this model to test three applications against three cloud implementations to identify the best fit cloud implementation. The model is further validated by a small group of peers through a survey. The studies show that the model is useful in identifying and comparing cloud implementations with regard to application requirements.
8

Brück, Heiner Matthias. "Evaluation of statistical cloud parameterizations." Doctoral thesis, Universitätsbibliothek Leipzig, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-212714.

Abstract:
This work is motivated by the question: how much complexity is appropriate for a cloud parameterization used in general circulation models (GCMs)? To approach this question, cloud parameterizations across the complexity range are explored using general circulation models and theoretical Monte-Carlo simulations. Their results are compared with high-resolution satellite observations and simulations that resolve the GCM subgrid-scale variability explicitly. A process-oriented evaluation is facilitated by GCM forecast simulations which reproduce the synoptic state. For this purpose novel methods were developed to a) conceptually relate the underlying saturation deficit probability density function (PDF) to its saturated cloudy part, b) analytically compute the vertically integrated liquid water path (LWP) variability, c) diagnose the relevant PDF moments from cloud parameterizations, d) derive high-resolution LWP from satellite observations and e) deduce the LWP statistics by aggregating the LWP onto boxes equivalent to the GCM grid size. On this basis, this work shows that it is possible to evaluate the subgrid-scale variability of cloud parameterizations in terms of cloud variables. Differences among the PDF types increase with complexity; in particular, the more advanced cloud parameterizations can make use of their double-Gaussian PDF in conditions where cumulus convection forms a separate mode with respect to the remainder of the grid box. Therefore, it is concluded that the difference between unimodal and bimodal PDFs is more important than the shape within each mode. However, the simulations and their evaluation reveal that the advanced parameterizations do not take full advantage of their abilities and their statistical relationships are broadly similar to less complex PDF shapes, while the results from observations and cloud-resolving simulations indicate even more complex distributions. 
Therefore, this work suggests that the use of less complex PDF shapes might yield a better trade-off. With increasing model resolution, initial weaknesses of simpler, e.g. unimodal, PDFs will be diminished. While cloud schemes for coarsely resolved models need to parameterize multiple cloud regimes per grid box, the higher spatial resolution of future GCMs will separate them better, so that the unimodal approximation improves.
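The relation in a) above, between the subgrid PDF and its saturated cloudy part, is commonly written as follows; this is a sketch in standard statistical-cloud-scheme notation (with s the saturation excess, the negative of the saturation deficit), offered as an illustration rather than the thesis' exact formulation:

```latex
% Cloud fraction C is the probability mass of the saturated (s > 0) part of
% the subgrid PDF P(s); the mean condensate \bar{q}_c is its first moment.
C = \int_{0}^{\infty} P(s)\,\mathrm{d}s, \qquad
\bar{q}_c = \int_{0}^{\infty} s\,P(s)\,\mathrm{d}s
```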
9

Kanmantha, Reddy Pruthvi Raj Reddy. "Comparative Analysis of Virtual Desktops in Cloud : Performance comparison of OpenStack Private Cloud and AWS Public cloud." Thesis, Blekinge Tekniska Högskola, Institutionen för kommunikationssystem, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-10840.

10

Gordon, Neil D. "Cluster analysis of cloud properties: a method for diagnosing cloud-climate feedbacks." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2008. http://wwwlib.umi.com/cr/ucsd/fullcit?p3296823.

Abstract:
Thesis (Ph. D.)--University of California, San Diego, 2008.
Title from first page of PDF file (viewed Mar. 24, 2008). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 108-112).
11

Clyman, Matthew J. "Analysis of cloud-based database systems." Thesis, Monterey, California: Naval Postgraduate School, 2015. http://hdl.handle.net/10945/45826.

Abstract:
Approved for public release; distribution is unlimited
To take advantage of cloud computing benefits that boost an enterprise's efficiency, innovation, and cost savings, the Department of Defense's (DOD) cloud computing strategy needs to evaluate databases as a service. If the DOD is going to prioritize outsourced database server hosting, the performance and agility of each application must be assessed to determine whether it can thrive in this new environment. We performed an experiment to compare the performance of a current Naval Postgraduate School standalone database server with a cloud version developed specifically for the experiment. The cloud environment was created with resources less than or equal to, and greater than, those of the live standalone server. We simulated cloud environment traffic based on the type of queries observed in production and collected data to compare its performance against the standalone database. The results show that the cloud database performed similarly to or better than our standalone server with equivalent resources, without utilizing additional resources. When we increased the resources dedicated to our cloud environment to test scalability, the time needed to execute queries decreased significantly. We therefore concluded that our database would perform and scale favorably in a cloud environment.
12

Pennella, Francesco. "Analisi e sperimentazione della piattaforma Cloud Dataflow." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amslaurea.unibo.it/12360/.

Abstract:
This work experiments with the Big Data processing capabilities offered by Cloud Dataflow, a cloud computing platform developed by Google. In particular, the objective is to experimentally analyze and compare the characteristics and performance of Cloud Dataflow with the Apache Hadoop and Apache Spark platforms by running WordCount programs based on the MapReduce model.
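The WordCount benchmark mentioned above is the canonical MapReduce example. As a framework-free sketch of the pattern (map every word to a (word, 1) pair, then reduce by key), not the actual Dataflow/Hadoop/Spark pipelines used in the thesis, it might look like this:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Reduce: group by key and sum the counts for each distinct word.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["the cloud stores data", "the data grows"]
counts = reduce_phase(map_phase(lines))
print(counts["the"])  # "the" appears twice across the two lines
```

In the distributed frameworks compared in the thesis, the map outputs are shuffled across workers between the two phases; the per-record logic stays essentially this simple.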
13

Forsman, Mona. "Point cloud densification." Thesis, Umeå universitet, Institutionen för fysik, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-39980.

Abstract:
Several automatic methods exist for creating 3D point clouds extracted from 2D photos. In many cases, the result is a sparse point cloud, unevenly distributed over the scene. After determining the coordinates of the same point in two images of an object, the 3D position of that point can be calculated using knowledge of camera data and relative orientation. A model created from an unevenly distributed point cloud may lose detail and precision in the sparse areas. The aim of this thesis is to study methods for densification of point clouds. This thesis contains a literature study of different methods for extracting matched point pairs, and an implementation of Least Square Template Matching (LSTM) with a set of improvement techniques. The implementation is evaluated on a set of different scenes of various difficulty. LSTM is implemented by working on a dense grid of points in an image, and Wallis filtering is used to enhance contrast. The matched point correspondences are evaluated with parameters from the optimization in order to keep good matches and discard bad ones. The purpose is to find details close to a plane in the images, or on plane-like surfaces. A set of extensions to LSTM is implemented with the aim of improving the quality of the matched points. The seed points are improved by Transformed Normalized Cross Correlation (TNCC) and Multiple Seed Points (MSP) for the same template, and then tested to see if they converge to the same result. Wallis filtering is used to increase the contrast in the image. The quality of the extracted points is evaluated with respect to correlation with other optimization parameters and comparison of standard deviation in the x- and y-directions. If a point is rejected, the option exists to try again with a larger template size, called Adaptive Template Size (ATS).
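Normalized cross-correlation is the matching score underlying template-matching techniques such as the TNCC variant mentioned above. A minimal sketch of the plain (untransformed) score, with illustrative array values of my own choosing, might look like this:

```python
import numpy as np

def ncc(template, patch):
    # Normalized cross-correlation between two equally sized image windows:
    # subtract each window's mean, correlate, and normalize by the product
    # of the windows' root sums of squares. The result lies in [-1, 1].
    t = template - template.mean()
    p = patch - patch.mean()
    denom = np.sqrt((t ** 2).sum() * (p ** 2).sum())
    return float((t * p).sum() / denom) if denom > 0 else 0.0

w = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(ncc(w, w))            # identical windows score 1.0
print(ncc(w, 2.0 * w + 5))  # invariant to affine intensity change: still 1.0
```

The mean subtraction and normalization are what make the score robust to brightness and contrast differences between the two images, which is why contrast enhancement (e.g. Wallis filtering) and correlation-based seeding combine well.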
14

Kommineni, Mohanarajesh, and Revanth Parvathi. "RISK ANALYSIS FOR EXPLORING THE OPPORTUNITIES IN CLOUD OUTSOURCING." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3656.

Abstract:
Context: Cloud outsourcing is a new form of outsourcing which is not yet widely implemented. It is a form of outsourcing in which software organizations outsource work to e-freelancers available throughout the world, using cloud services via the Internet. Software organizations hand over the respective task to the cloud; from the cloud, e-freelancers undertake the development of the task and then return the finished task to the cloud. Organizations collect the finished task from the cloud, verify it, and then pay the e-freelancer. Objectives: The aim of this study is to identify the sequence of activities involved in the entire process of cloud outsourcing, to find the risks likely to occur during the implementation of this process, and to prioritize the elicited risks according to their probability of occurrence, their impact, and the cost required to mitigate each risk. Methods: Data is collected by literature review and then synthesized. In parallel, interviews with practitioners are conducted to identify the activities involved and the risks likely to occur during the implementation of cloud outsourcing. After this, a survey is conducted in order to prioritize the risks, and a standard risk analysis is conducted to determine which risks are likely to occur. The literature review covers four databases and includes literature from 1990 to date. Results: In total we have identified 21 risks that are likely to occur and 8 activities. By performing risk analysis we present the risks that should be considered first, and relevant countermeasures are suggested to overcome them.
15

Laukka, Lucas, and Carl Fransson. "Cloud risk analysis using OCTAVE Allegro : Identifying and analysing risks of a cloud service." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-176731.

Abstract:
Cybersecurity is currently an important and relevant issue, as more and more industries are taking advantage of the accessibility of storing information online. To create a secure system one must know the potential risks and attacks on that system, making risk analysis a very potent tool. In this study, we performed such an analysis using the risk analysis method OCTAVE Allegro on a company providing a cloud-based service to find out what risks a cloud service provider might be exposed to, and the usefulness of said risk analysis method in this circumstance. We found that OCTAVE Allegro is suitable to use on smaller companies and applicable to cloud services, and the most severe risks identified were connected to leakage of client data with a consequence of damaging the company's reputation. Common areas of concern for these risks were third parties hacking the cloud server or other company systems to gain access to sensitive information. Such risks will likely be found at any company providing a cloud service that manages sensitive data, increasing the importance of risk analysis at these companies.
16

Johansson, Jonas. "Evaluation of Cloud Native Solutions for Trading Activity Analysis." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-300141.

Abstract:
Cloud computing has become increasingly popular over recent years, allowing computing resources to be scaled on demand. Cloud native applications are specifically created to run on the cloud service model. Currently, there is a research gap regarding the design and implementation of cloud native applications, especially regarding how design decisions affect metrics such as execution time and scalability of systems. The problem investigated in this thesis is whether the execution time and quality scalability, ηt, of cloud native solutions are affected when housing the functionality of multiple use cases within the same cloud native application. In this work, a cloud native application for trading data analysis is presented, where the functionality of 3 use cases is implemented in the application: (1) creating reports of trade prices, (2) anomaly detection, and (3) analysis of relation diagrams of trades. The execution time and scalability of the application are evaluated and compared to readily available solutions, which serve as a baseline for the evaluation. The results of use cases 1 and 2 are compared to Amazon Athena, while use case 3 is compared to Amazon Neptune. The results suggest that having functionalities combined into the same application could improve both execution time and scalability of the system. The impact depends on the use case and hardware configuration. When executing the use cases in a sequence, the mean execution time of the implemented system was decreased by up to 17.2% while the quality scalability score was improved by 10.3% for use case 2. The implemented application had significantly lower execution time than Amazon Neptune but did not surpass Amazon Athena for the respective use cases. The scalability of the systems varied depending on the use case. 
While not surpassing the baseline in all use cases, the results show that the execution time of a cloud native system could be improved by having functionality of multiple use cases within one system. However, the potential performance gains differ depending on the use case and might be smaller than the performance gains of choosing another solution.
17

Wong, Leung. "Analysis of NoSQL Storage in the Cloud." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-89990.

Abstract:
Data storage is essential in many Internet services today. As the amount of generated data increases every day, a way to store it is needed. Relational databases do not handle continuously growing data well. An alternative is to use NoSQL. XDIN Linköping is interested in developing an Internet service which uses a non-relational type of data storage to store structured data. The data should be accessible to members of the service. The data storage does not make any assumption about what type of data is stored. Instead, tags attached to each entity are used to identify what type of data is stored. This thesis provides a description of what NoSQL is, along with the proposed database designs. Further, it aims to investigate whether it is reasonable to create a web service which uses NoSQL as the backend storage.
18

Donner, Marc, Sebastian Varga, and Ralf Donner. "Point cloud generation for hyperspectral ore analysis." Technische Universitaet Bergakademie Freiberg Universitaetsbibliothek "Georgius Agricola", 2018. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-231365.

Abstract:
Recent development of hyperspectral snapshot cameras offers new possibilities for ore analysis. A method for generating a 3D dataset from RGB and hyperspectral images is presented. By using Structure from Motion, a reference of each source image to the resulting point cloud is kept. This reference is used for projecting hyperspectral data onto the point cloud. Additionally, with this work flow it is possible to add meta data to the point cloud, which was generated from images alone.
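The projection step this abstract describes (using the image-to-point references kept by Structure from Motion) reduces to projecting each 3D point into a source image and sampling the hyperspectral pixel there. A sketch with a pinhole camera model and hypothetical intrinsics of my own choosing, not the authors' pipeline:

```python
import numpy as np

def project_point(X, K, R, t):
    # Project a 3D world point X into pixel coordinates with a pinhole
    # camera: x ~ K [R | t] X. Returns (u, v) in pixels.
    x_cam = R @ X + t          # world frame -> camera frame
    x_img = K @ x_cam          # camera frame -> homogeneous image coords
    return x_img[0] / x_img[2], x_img[1] / x_img[2]

# Hypothetical camera: 500 px focal length, principal point (320, 240),
# identity pose (camera at the origin looking down +Z).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)

u, v = project_point(np.array([0.0, 0.0, 2.0]), K, R, t)
print(u, v)  # a point on the optical axis lands on the principal point
```

Each cloud point would then take the hyperspectral sample at pixel (u, v) of the source image it is referenced to, attaching spectra (and any other per-image metadata) to the geometry.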
19

Bhattacharjee, Ratnadeep. "An analysis of the cloud computing platform." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/47864.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, System Design and Management Program, 2009.
Includes bibliographical references.
A slew of articles have been written about the fact that computing will eventually go the way of electricity. Just as most software users these days also own the hardware that runs the software, electricity users in days of yore used to generate their own power. However, over time, with standardization in the voltage and frequency of generated power and better distribution mechanisms, the generation of electricity was consolidated among fewer utility providers. The same is being forecast for computing infrastructure. It is being touted that more and more users will rent computing infrastructure from a utility or "cloud" provider instead of maintaining their own hardware. This phenomenon or technology is being referred to as Cloud Computing or Utility Computing. Cloud computing has existed in some form or other since the beginning of computing. However, the advent of vastly improved software, hardware and communication technologies has given special meaning to the term cloud computing and opened up a world of possibilities. It is possible today to start an e-commerce or related company without investing in datacenters. This has turned out to be very beneficial to startups and smaller companies that want to test the efficacy of their idea before making any investment in expensive hardware. Corporations like Amazon, SalesForce.com, Google, IBM, Sun Microsystems, and many more are offering or planning to offer these infrastructure services in one form or another. An ecosystem has already been created, and going by the investment and enthusiasm in this space, the ecosystem is bound to grow. This thesis tries to define and explain the fundamentals of cloud computing. It looks at the technical aspects of this industry and the kinds of applications where the cloud can be used. It also looks at the economic value created by the platform, the network externalities, its effect on traditional software companies and their reaction to this technology. The thesis also tries to apply the principles of multi-homing, coring and tipping to the cloud-computing platform and explain the results. The hurdles for both users and providers of this service are also examined in this thesis.
by Ratnadeep Bhattacharjee.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
20

Donner, Marc, Sebastian Varga, and Ralf Donner. "Point cloud generation for hyperspectral ore analysis." TU Bergakademie Freiberg, 2017. https://tubaf.qucosa.de/id/qucosa%3A23196.

Full text
Abstract:
Recent developments in hyperspectral snapshot cameras offer new possibilities for ore analysis. A method for generating a 3D dataset from RGB and hyperspectral images is presented. By using Structure from Motion, a reference from each source image to the resulting point cloud is kept. This reference is used for projecting hyperspectral data onto the point cloud. Additionally, this workflow makes it possible to add metadata to a point cloud that was generated from images alone.
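The projection step described in this workflow reduces to the standard pinhole camera model. The following is a minimal sketch (illustrative only, not the authors' code; function and variable names are hypothetical) of mapping a 3D point from a point cloud into one source image so its hyperspectral pixel can be sampled, assuming Structure from Motion has supplied the intrinsics K and pose (R, t):

```python
import numpy as np

def project_point(X, K, R, t):
    """Pinhole projection x ~ K (R X + t): map a 3D point-cloud point into
    pixel coordinates of a source image. K (intrinsics), R and t (pose)
    are assumed to come from the Structure from Motion reconstruction."""
    u, v, w = K @ (R @ X + t)
    return np.array([u / w, v / w])

# Identity pose, focal length 100 px, principal point (50, 50):
K = np.array([[100.0, 0.0, 50.0],
              [0.0, 100.0, 50.0],
              [0.0, 0.0, 1.0]])
p = project_point(np.array([0.0, 0.0, 2.0]), K, np.eye(3), np.zeros(3))
print(p)  # a point on the optical axis projects to the principal point
```

With the projected pixel coordinates in hand, the hyperspectral spectrum at that pixel can be attached to the 3D point, which is how per-image metadata ends up on the cloud.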
APA, Harvard, Vancouver, ISO, and other styles
21

Costantino, Lorenzo. "Analysis of aerosol-cloud interaction from space." Versailles-St Quentin en Yvelines, 2012. http://www.theses.fr/2012VERS0004.

Full text
Abstract:
Le but de cette thèse est de fournir une analyse exhaustive des interactions entre nuages et aérosols dans le Sud-Est de l'Atlantique, en quantifiant l'impact des aérosols sur le bilan radiatif régional en ondes courtes. Pour cet objectif, nous avons utilisé les données satellitaires de MODIS, PARASOL et CALIPSO, qui fournissent des observations complémentaires et quasi simultanées. L'idée principale qui a permis une analyse originale est d'utiliser les observations du lidar CALIPSO pour identifier les cas pour lesquels les couches d’aérosols et nuages vues par MODIS et PARASOL sont en interaction (mélangées) et ceux pour lesquels ils sont clairement disjoints. Il ressort de cette analyse que les propriétés des nuages sont fortement influencées par l'interaction avec les aérosols (premier effet indirect). On observe une diminution du rayon efficace de gouttelettes et du contenu en eau sous l'effet d’une hausse de la concentration des particules polluantes. En revanche, nous n’avons pas mis en évidence une modification significative de la réflectance des nuages. Lorsque les aérosols et les nuages sont mélangés, on observe aussi une diminution de l’occurrence des précipitations (second effet indirect) et l'augmentation de la couverture nuageuse. D'autre part, la fraction nuageuse est affectée par la présence d'aérosols, même si les particules de pollution sont situées au-dessus du sommet des nuages (sans interaction physique). Cette observation est interprétée comme étant une conséquence de l'effet radiatif des aérosols. Pour quantifier le forçage radiatif direct et indirect des aérosols, nous avons utilisé un code de transfert radiatif rapide à onde courte, contraint par les observations satellitaires. Sur six ans (2005-2010), le forçage moyen est faible et égal à -0,07 (direct) et -0,05 (indirect) W/m². Le forçage total est donc négatif (refroidissement) et égal à -0,12 W/m².
The aim of this work is to provide a comprehensive analysis of cloud and aerosol interaction over the South-East Atlantic, to quantify the overall aerosol impact on the regional radiation budget. We used data from the MODIS, PARASOL and CALIPSO satellites, which fly in close proximity on the same sun-synchronous orbit and allow for complementary observations of the same portion of the atmosphere within a few minutes. The main idea is to use CALIPSO vertical information to define whether or not aerosol and cloud layers observed by MODIS and PARASOL are mixed and interacting. We found evidence that, in case of interaction, cloud properties are strongly influenced by aerosol presence (first indirect effect). In particular, there is a decrease in cloud droplet effective radius and liquid water path with aerosol enhancement. On the other hand, we could not find evidence of any significant impact on the cloud reflectance. We also analyzed the aerosol impact on precipitation (second indirect effect). In polluted low clouds over the ocean, we found evidence of precipitation suppression and cloud cover increase with increasing aerosol concentration. Furthermore, cloud fraction is shown to be affected by aerosol presence even if pollution particles are located above cloud top, without physical interaction. This observation is interpreted as a consequence of the aerosol radiative effect. Aerosol shortwave direct (DRF) and indirect (IRF) radiative forcing at TOA has been quantified with the use of a radiative transfer model constrained by satellite observations. For the direct effect, there is a competition between cooling (negative, due to light scattering by the aerosols) and warming (positive, due to absorption by the same particles). The six-year (2005-2010) mean estimate is equal to -0.07 (DRF) and -0.05 (IRF) W/m². The resulting total aerosol forcing is negative (cooling) and equal to -0.12 W/m².
APA, Harvard, Vancouver, ISO, and other styles
22

Doddapaneni, Purna, Quincy Wofford, and Nicole Maneth. "MULTI-SENSOR HEALTH PLATFORM WITH CLOUD ANALYSIS." International Foundation for Telemetering, 2016. http://hdl.handle.net/10150/624186.

Full text
Abstract:
What could we learn from monitoring our body processes with various portable sensors and an unconstrained analysis platform? Physiological processes in the human body produce observable biosignals. These signals contain a wealth of information about the condition of the body and its reaction to environmental factors. Our study harnesses 9 unique sensors, integrated by the eHealthSensor platform for Arduino, to transmit data to an Android device. The Android device contains a local PostgreSQL database, which synchronizes with the cloud. Using this platform, researchers can monitor a subject's biosignals as they ride a roller coaster or participate in exercise activities. Nurses can monitor the vitals of multiple patients remotely. Analytic, cloud-based services, managed by healthcare providers, could ultimately enable automated diagnosis of medical conditions.
APA, Harvard, Vancouver, ISO, and other styles
23

Deng, Nan. "Systems Support for Carbon-Aware Cloud Applications." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1439855103.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Parikh, Apoorva. "Cloud security and platform thinking : an analysis of Cisco Umbrella, a cloud-delivered enterprise security." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/121796.

Full text
Abstract:
Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, System Design and Management Program, 2019
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 101-114).
Cisco's security business segment, with over $2 billion in revenue in fiscal 2018, makes Cisco one of the largest enterprise security players in the market. It is also one of the fastest growing business segments for Cisco, with the last five years' CAGR at 12%. While this growth rate is in line with the estimated CAGR for cybersecurity market growth between 2018 and 2022, another leading incumbent's growth rate shows there is an opportunity to grow even faster. What can Cisco do to accelerate its security business group's growth and, more broadly, how can Cisco maintain its leadership position in a rapidly evolving and highly fragmented cybersecurity market? The goal of this thesis is twofold: first, to discover the emerging cybersecurity needs of enterprises under the dynamic threat landscape of the mobile, cloud era, and the resultant growth opportunities and challenges these present to Cisco's security business group; second, to discover the main elements of Cisco security business group's current growth strategies and to evaluate platform thinking as a potential growth strategy for Cisco's Cloud Security business. We find that Cisco Umbrella, a recently launched cloud security offering, exhibits potential to become a foundation for Cisco Cloud Security as an open platform ecosystem. We conclude by discussing a potential future platform direction for Cisco Umbrella and raise follow-on questions for further consideration.
by Apoorva Parikh.
S.M. in Engineering and Management
Massachusetts Institute of Technology, System Design and Management Program
APA, Harvard, Vancouver, ISO, and other styles
25

Hebbal, Yacine. "Semantic monitoring mechanisms dedicated to security monitoring in IaaS cloud." Thesis, Ecole nationale supérieure Mines-Télécom Atlantique Bretagne Pays de la Loire, 2017. http://www.theses.fr/2017IMTA0029/document.

Full text
Abstract:
L’introspection de machine virtuelle (VM) consiste à superviser les états et les activités de celles-ci depuis la couche de virtualisation, tirant ainsi avantage de son emplacement qui offre à la fois une bonne visibilité des états et des activités des VMs ainsi qu’une bonne isolation de ces dernières. Cependant, les états et les activités des VMs à superviser sont vus par la couche de virtualisation comme une suite binaire de bits et d’octets en plus des états des ressources virtuelles. L’écart entre la vue brute disponible à la couche de virtualisation et celle nécessaire pour la supervision de sécurité des VMs constitue un challenge pour l’introspection appelé « le fossé sémantique ». Pour obtenir des informations sémantiques sur les états et les activités des VMs à fin de superviser leur sécurité, nous présentons dans cette thèse un ensemble de techniques basé sur l’analyse binaire et la réutilisation du code binaire du noyau d’une VM. Ces techniques permettent d’identifier les adresses et les noms de la plupart des fonctions noyau d’une VM puis de les instrumenter (intercepter, appeler et analyser) pour franchir le fossé sémantique de manière automatique et efficiente même dans les cas des optimisations du compilateur et de la randomisation de l’emplacement du code noyau dans la mémoire de la VM
Virtual Machine Introspection (VMI) consists in monitoring VM security from the hypervisor layer, which, thanks to its location, offers strong visibility into VM activities in addition to strong isolation from them. However, the hypervisor's view of VMs is just raw bits and bytes in addition to hardware states. The semantic difference between this raw view and the one needed for VM security monitoring presents a significant challenge for VMI called "the semantic gap". In order to obtain semantic information about VM states and activities for monitoring their security from the hypervisor layer, we present in this thesis a set of techniques based on the analysis and reuse of VM kernel binary code. These techniques enable us to identify the addresses and names of most VM kernel functions and then instrument (call, intercept and analyze) them to automatically bridge the semantic gap, regardless of the challenges presented by compiler optimizations and kernel base address randomization.
APA, Harvard, Vancouver, ISO, and other styles
26

Nankervis, Christopher James. "Co-located analysis of ice clouds detected from space and their impact on longwave energy transfer." Thesis, University of Edinburgh, 2013. http://hdl.handle.net/1842/7755.

Full text
Abstract:
A lack of quality data on high clouds has led to inadequate representations within global weather and climate models. Recent advances in spaceborne measurements of the Earth’s atmosphere have provided complementary information on the interior of these clouds. This study demonstrates how an array of space-borne measurements can be used and combined, by close co-located comparisons in space and time, to form a more complete representation of high cloud processes and properties. High clouds are found in the upper atmosphere, where sub-zero temperatures frequently result in the formation of cloud particles that are composed of ice. Weather and climate models characterise the bulk properties of these ice particles to describe the current state of the cloud-sky atmosphere. By directly comparing measurements with simulations undertaken at the same place and time, this study demonstrates how improvements can be made to the representation of cloud properties. The results from this study will assist in the design of future cloud missions to provide better quality input. These improvements will also help improve weather predictions and lower the uncertainty in cloud feedback response to increasing atmospheric temperature. Most clouds are difficult to monitor by more than one instrument due to continuous changes in: large-scale and sub-cloud-scale circulation features, microphysical properties and processes, and characteristic chemical signatures. This study undertakes co-located comparisons of high cloud data with a cloud ice dataset reported from the Microwave Limb Sounder (MLS) instrument onboard the Aura satellite that forms part of the A-train constellation. Data from the MLS science team include vertical profiles of temperature, ice water content (IWC) and the mixing ratios of several trace gases. Their vertical resolutions are 3 to 6 km.
Initial investigations explore the link between cloud-top properties and the longwave radiation budget, developing methods for estimating cloud top heights using longwave radiative fluxes and IWC profiles. Synergistic trios of direct and indirect high cloud measurements were used to validate detections from the MLS by direct comparisons with two different A-train instruments: the NASA Moderate-resolution Imaging Spectroradiometer (MODIS) and the Clouds and the Earth’s Radiant Energy System (CERES) onboard the Aqua satellite. This finding focuses later studies on two high cloud scene types that are well detected by the MLS: deep convective plumes that form from moist ascent, and their adjacent outflows that emanate outwards several hundred kilometres. The second part of the thesis identifies and characterises two different high cloud scenes in the tropics. Direct observational data are used to refine calculations of the climate sensitivity to upper tropospheric humidity and high cloud in different conditions. Several discernible features of convective outflows are identified using a large sample of MLS data. The key finding, facilitated by the use of co-location, is that deep convective plumes exert a large longwave warming effect on the local climate of 52 ± 28 Wm−2, with their adjacent outflows presenting a more modest warming of 33 ± 20 Wm−2.
APA, Harvard, Vancouver, ISO, and other styles
27

Matthews, N. "Models of molecular line emission from star formation regions." Thesis, University of Kent, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.381427.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Josefsson, Tim. "Root-cause analysis through machine learning in the cloud." Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-340428.

Full text
Abstract:
It has been predicted that by 2021 there will be 28 billion connected devices and that 80% of global consumer internet traffic will be related to streaming services such as Netflix, Hulu and Youtube. This connectivity will in turn be matched by a cloud infrastructure that will ensure connectivity and services. With such an increase in infrastructure, the need for reliable systems will also rise. One solution to providing reliability in data centres is root-cause analysis, where the aim is to identify the root cause of a service degradation in order to prevent it or allow for easy localization of the problem. In this report we explore an approach to root-cause analysis using a machine learning model called self-organizing maps. Self-organizing maps provide data classification while also providing visualization of the model, which is something many machine learning models fail to do. We show that self-organizing maps are a promising solution to root-cause analysis. Within the report we also compare our approach to another prominent approach and show that our model performs favorably. Finally, we touch upon some interesting research topics that we believe can further the field of root-cause analysis.
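As an illustration of the underlying technique only (not the thesis's implementation), a self-organizing map reduces to a best-matching-unit search plus a neighbourhood-weighted update. A minimal sketch on synthetic telemetry for two hypothetical fault modes:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic telemetry: (CPU load, latency) samples from two hypothetical fault modes
data = np.vstack([rng.normal([0.2, 0.1], 0.05, (100, 2)),
                  rng.normal([0.9, 0.8], 0.05, (100, 2))])

grid = rng.random((4, 4, 2))                               # 4x4 map of weight vectors
ii, jj = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
for step in range(500):
    x = data[rng.integers(len(data))]
    dist = np.linalg.norm(grid - x, axis=2)
    bi, bj = np.unravel_index(dist.argmin(), dist.shape)   # best-matching unit (BMU)
    h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / 2.0)   # neighbourhood kernel
    lr = 0.5 * (1.0 - step / 500)                          # decaying learning rate
    grid += lr * h[..., None] * (x - grid)                 # pull BMU and neighbours toward x

def bmu(x):
    """Map a telemetry vector to its best-matching map unit."""
    return np.unravel_index(np.linalg.norm(grid - np.asarray(x), axis=2).argmin(), (4, 4))

print(bmu([0.2, 0.1]), bmu([0.9, 0.8]))  # map coordinates of the two fault modes
```

After training, distinct fault signatures land on different map units, which is the classification-plus-visualization property the abstract highlights.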
APA, Harvard, Vancouver, ISO, and other styles
29

Mineart, Gary M. "Multispectral satellite analysis of marine stratocumulus cloud microphysics." Thesis, Monterey, California. Naval Postgraduate School, 1988. http://hdl.handle.net/10945/23321.

Full text
Abstract:
Variations in marine stratocumulus cloud microphysics during FIRE IFO 1987 are observed and analyzed through the use of NOAA-9/10 AVHRR satellite data and aircraft in-cloud measurements. The relationships between channel 3 reflectance and cloud microphysical properties are examined through model reflectances based on Mie theory and the delta-Eddington approximation, and reveal a channel 3 reflectance dependence on cloud droplet size distribution. Satellite observations show significant regions of continental influence over the ocean through higher channel 3 reflectances resulting from the injection of continental aerosols and the associated modification of cloud droplet characteristics. Channel 3 reflectance gradients across individual cloud elements correspond to radially varying cloud droplet size distributions within the elements. Various mesoscale and microscale features such as ship stack effluent tracks and pollution sources are observed in the data. Correlations between reflectance values and aircraft measurements illustrate the potential of estimating cloud droplet size distribution and marine atmospheric boundary layer aerosol composition and concentration through use of satellite data. Such an estimation technique may prove useful in determining climatic implications of cloud reflectance changes due to the influence of natural and man-made aerosol sources, and provide a means to assess the performance of boundary layer electro-optic systems. Keywords: Radiometry; Cloud physics; Theses.
APA, Harvard, Vancouver, ISO, and other styles
30

Morehead, Steven Emory. "Ship track cloud analysis for the North Pacific." Thesis, Monterey, California. Naval Postgraduate School, 1988. http://hdl.handle.net/10945/23388.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Gu, Zhiqiany. "Texture analysis techniques for multi-spectral cloud classification." Thesis, University of Edinburgh, 1988. http://hdl.handle.net/1842/12072.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Lampshire, Shelby. "Metal Dust Cloud Distribution Characterization Through Image Analysis." DigitalCommons@CalPoly, 2021. https://digitalcommons.calpoly.edu/theses/2333.

Full text
Abstract:
With the increasing development of metal additive manufacturing technology, the need for accurate explosivity testing of high-density and exotic metal powders is the subject of active research. The accuracy of such tests depends upon the uniformity of the dust dispersion within the testing chambers during ignition. Further research is needed to understand the dust cloud dispersion process in order to determine the best time for ignition. This study explores a methodology of using high-speed footage and image analysis to characterize the uniformity of a dust cloud temporally, which future applications may build upon. This thesis consisted of the experimental methods used to generate a dust cloud for image acquisition and an in-depth study of processing pixel data in MATLAB to determine points of highest dust cloud uniformity. The image analysis process was applied to the generated footage and the results were assessed through visual means. The analysis was also applied to dust cloud footage generated by Lawrence Livermore National Laboratory on a transparent replica of a modified ANKO 20-L explosivity testing apparatus. The image analysis methodology proved to offer a promising means of determining dust distribution uniformity as it relates to the timing of explosivity ignition.
APA, Harvard, Vancouver, ISO, and other styles
33

Garraghan, Peter Michael. "Holistic cloud computing environmental quantification and behavioural analysis." Thesis, University of Leeds, 2014. http://etheses.whiterose.ac.uk/7192/.

Full text
Abstract:
Cloud computing has been characterized to be large-scale multi-tenant systems that are able to dynamically scale-up and scale-down computational resources to consumers with diverse Quality-of-Service requirements. In recent years, a number of dependability and resource management approaches have been proposed for Cloud computing datacenters. However, there is still a lack of real-world Cloud datasets that analyse and extensively model Cloud computing characteristics and quantify their effect on system dimensions such as resource utilization, user behavioural patterns and failure characteristics. This results in two research problems: First, without the holistic analysis of real-world systems Cloud characteristics, their dimensions cannot be quantified resulting in inaccurate research assumptions of Cloud system behaviour. Second, simulated parameters used in state-of-the-art Cloud mechanisms currently rely on theoretical values which do not accurately represent real Cloud systems, as important parameters such as failure times and energy-waste have not been quantified using empirical data. This presents a large gap in terms of practicality and effectiveness between developing and evaluating mechanisms within simulated and real Cloud systems. This thesis presents a comprehensive method and empirical analysis of large-scale production Cloud computing environments in order to quantify system characteristics in terms of consumer submission and resource request patterns, workload behaviour, server utilization and failures. Furthermore, this work identifies areas of operational inefficiency within the system, as well as quantifies the amount of energy waste created due to failures. We discover that 4-10% of all server computation is wasted due to Termination Events, and that failures contribute to approximately 11% of the total datacenter energy waste. 
These analyses of empirical data enable researchers and Cloud providers to gain an enhanced understanding of real Cloud behaviour, support system assumptions, and provide parameters that can be used to develop and validate the effectiveness of future energy-efficient and dependability mechanisms.
APA, Harvard, Vancouver, ISO, and other styles
34

Peake, Chris. "Accepting the Cloud| A Quantitative Predictive Analysis of Cloud Trust and Acceptance Among IT Security Professionals." Thesis, Capella University, 2018. http://pqdtopen.proquest.com/#viewpdf?dispub=10980831.

Full text
Abstract:

Industry experts recognize the cloud and cloud-based services as advantageous from both operational and economic perspectives, yet individuals and organizations hesitate to accept the cloud because of concerns about security and privacy. The purpose of this study is to examine the factors that may influence cloud acceptance by IT professionals, focusing on the principal research question: to what extent do ease of use, usefulness, attitude, security apprehensions, compatibility, and trust predict IT security professionals’ acceptance of cloud computing? The population for this study consisted of IT security professionals who either held industry security certifications or had been in a security position for at least two years. Sample inclusion criteria consisted of IT professionals with the qualifications described above who were over the age of 18 and living in the United States. The study survey was administered using SurveyMonkey, which randomly selected and recruited potential participants who met the sample criteria from a participant database, resulting in ninety-seven total study participants. Among the six factors examined, perceived usefulness, attitudes, security apprehensions, and trust were found to significantly predict cloud acceptance. The results indicate that cloud service providers should focus their attention on these factors in order to promote cloud acceptance.

APA, Harvard, Vancouver, ISO, and other styles
35

McCulley, Shane. "Forensic Analysis of G Suite Collaborative Protocols." ScholarWorks@UNO, 2017. http://scholarworks.uno.edu/td/2386.

Full text
Abstract:
Widespread adoption of cloud services is fundamentally changing the way IT services are delivered and how data is stored. Current forensic tools and techniques have been slow to adapt to the new challenges and demands of collecting and analyzing cloud artifacts. Traditional methods focusing only on client data collection are incomplete, as the client may have only a (partial) snapshot and misses cloud-native artifacts that may contain valuable historical information. In this work, we demonstrate the importance of recovering and analyzing cloud-native artifacts using G Suite as a case study. We develop a tool that extracts and processes the history of Google Documents and Google Slides by reverse engineering the web application's private protocol. Combined with previous work that has focused on API-based acquisition of cloud drives, this presents a more complete solution to cloud forensics and is generalizable to any cloud service that maintains a detailed log of revisions.
APA, Harvard, Vancouver, ISO, and other styles
36

Islam, Md Zahidul. "A Cloud Based Platform for Big Data Science." Thesis, Linköpings universitet, Programvara och system, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-103700.

Full text
Abstract:
With the advent of cloud computing, resizable, scalable infrastructures for data processing are now available to everyone. Software platforms and frameworks that support data-intensive distributed applications, such as Amazon Web Services and Apache Hadoop, give users the necessary tools and infrastructure to work with thousands of scalable computers and process terabytes of data. However, writing scalable applications that run on top of these distributed frameworks is still a demanding and challenging task. The thesis aimed to advance the core scientific and technological means of managing, analyzing, visualizing, and extracting useful information from large data sets, collectively known as “big data”. The term “big data” in this thesis refers to large, diverse, complex, longitudinal and/or distributed data sets generated from instruments, sensors, internet transactions, email, social networks, twitter streams, and/or all digital sources available today and in the future. We introduced architectures and concepts for implementing a cloud-based infrastructure for analyzing large volumes of semi-structured and unstructured data. We built and evaluated an application prototype for collecting, organizing, processing, visualizing and analyzing data from the retail industry gathered from indoor navigation systems and social networks (Twitter, Facebook etc.). Our finding was that developing a large-scale data analysis platform is often quite complex when the processed data is expected to grow continuously in the future. The architecture varies depending on requirements. If we want to build a data warehouse and analyze the data afterwards (batch processing), the best choices will be Hadoop clusters and Pig or Hive. This architecture has been proven at Facebook and Yahoo for years. On the other hand, if the application involves real-time data analytics, then the recommendation will be Hadoop clusters with Storm, which has been successfully used at Twitter.
After evaluating the developed prototype we introduced a new architecture which will be able to handle large scale batch and real-time data. We also proposed an upgrade of the existing prototype to handle real-time indoor navigation data.
APA, Harvard, Vancouver, ISO, and other styles
37

Rosch, Jan, Thijs Heus, Marc Salzmann, Johannes Mülmenstädt, Linda Schlemmer, and Johannes Quaas. "Analysis of diagnostic climate model cloud parameterisations using large-eddy simulations." Quarterly Journal of the Royal Meteorological Society (2015), 141, 691, Part B, pp. 2199–2205, 2015. https://ul.qucosa.de/id/qucosa%3A14689.

Full text
Abstract:
Current climate models often predict fractional cloud cover on the basis of a diagnostic probability density function (PDF) describing the subgrid-scale variability of the total water specific humidity, qt, favouring schemes with limited complexity. Standard shapes are uniform or triangular PDFs, the width of which is assumed to scale with the grid-box mean qt or the grid-box mean saturation specific humidity, qs. In this study, the qt variability is analysed from large-eddy simulations for two stratocumulus, two shallow cumulus, and one deep convective case. We find that in most cases, triangles are a better approximation to the simulated PDFs than uniform distributions. In two of the 24 slices examined, the actual distributions were so strongly skewed that the simple symmetric shapes could not capture the PDF at all. The distribution width for either shape scales acceptably well with both the mean value of qt and qs, the former being a slightly better choice. The qt variance is underestimated by the fitted PDFs, but overestimated by the existing parameterisations. While the cloud fraction is in general relatively well diagnosed from fitted or parameterised uniform or triangular PDFs, the diagnosis fails to capture cases with small partial cloudiness, and in 10-30% of the cases misdiagnoses clouds in clear skies or vice versa. The results suggest choosing a parameterisation with a triangular shape, where the distribution width scales with the grid-box mean qt using a scaling factor of 0.076. This, however, is subject to the caveat that the reference simulations examined here were partly for rather small domains and driven by idealised boundary conditions.
APA, Harvard, Vancouver, ISO, and other styles
38

Sreenibha, Reddy Byreddy. "Performance Metrics Analysis of GamingAnywhere with GPU accelerated Nvidia CUDA." Thesis, Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-16846.

Full text
Abstract:
The modern world has opened the gates to many advancements in cloud computing, particularly in the field of cloud gaming. The most recent development in this area is the open-source cloud gaming system called GamingAnywhere. The relationship between the CPU and GPU is the main focus of this thesis. Graphical Processing Unit (GPU) performance plays a vital role in analyzing the playing experience and enhancement of GamingAnywhere. This paper concentrates on the virtualization of the GPU and suggests that accelerating this unit using NVIDIA CUDA is the key to better performance while using GamingAnywhere. After extensive research, gVirtuS was chosen as the technique for NVIDIA CUDA virtualization. An experimental study was conducted to evaluate the feasibility and performance of GPU solutions by VMware in the cloud gaming scenarios provided by GamingAnywhere. Performance is measured in terms of bitrate, packet loss, jitter and frame rate. Different resolutions of the game are considered in our empirical research, and our results show that the frame rate and bitrate increased with different resolutions and with the use of the NVIDIA CUDA-enhanced GPU.
APA, Harvard, Vancouver, ISO, and other styles
39

Boström, Patrik. "Revisiting Observed Changes in Cloud Properties over Europe." Thesis, Uppsala universitet, Luft-, vatten och landskapslära, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-179997.

Full text
Abstract:
The Earth’s atmosphere is a vulnerable system that is easily changed by micro- and macrophysical variations. Large decreases in sulfur dioxide pollution levels over Central Europe from the 1980s to the 2000s led to a decreased mass concentration of atmospheric solid and liquid particles. This gives the opportunity to investigate how these particles influence the atmosphere. Newly released satellite climatology data were used to analyse statistics of cloud properties during four years in the highly polluted atmosphere (1985-88) and four years in the less polluted atmosphere (2004-07). These two periods were investigated in collaboration with the Atmospheric Remote Sensing Unit of the research department of the Swedish Meteorological and Hydrological Institute (SMHI). The cloud-top temperature of liquid clouds over polluted regions during the earlier period was colder by more than 2 K, and by more than 5 K for optically thin liquid clouds alone. The changes in mass concentrations of atmospheric particles driven by the sulfur dioxide emissions are shown to be a highly probable factor behind the observed cloud changes.
APA, Harvard, Vancouver, ISO, and other styles
40

Bengtsson, Hampus, and Ludvig Knutsmark. "Lagring av sekretessreglerade uppgifter i molntjänster : En analys kring förutsättningar för användning av molnleverantörer bland myndigheter." Thesis, Blekinge Tekniska Högskola, Institutionen för datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-19723.

Full text
Abstract:
Background: Swedish authorities' use of popular cloud providers is today the subject of an intense debate. Legislation like the U.S. CLOUD Act is applicable across borders, which makes data stored on servers located in Sweden subject to U.S. law. Several Swedish organizations argue that the use of affected cloud providers for the storage of sensitive records breaks the Swedish law Offentlighets- och sekretesslagen. The programme for collaboration between Swedish authorities, eSam, says that compliance with the law may be possible if suitable encryption is used, but states that more research is needed. Objectives: The main objective of this thesis is to research which requirements on encryption mechanisms are needed for Swedish authorities to use cloud providers affected by legislation like the CLOUD Act without breaking Swedish law. The three most popular cloud providers, Amazon Web Services (AWS), Microsoft Azure, and Google Cloud, are compared and examined to see whether the requirements on encryption are met. Historically, the provider's access to encryption keys has been a major threat to data confidentiality. Therefore, an alternative encryption method that withholds both encryption keys and clear text from the provider, but preserves functionality, is researched. Method: To create fair and sound requirements on encryption mechanisms, several threat models are created from the perspective of today's and future laws. A SWOT analysis is used to compare the cloud providers. To research the possibility and usability of alternative encryption in the cloud, a system that withholds both encryption keys and clear-text data from the provider is proposed. Result: The result shows that the most popular services, such as Office 365 and G Suite, are not suitable for use by Swedish authorities for the storage of sensitive records. Instead, Swedish authorities can use IaaS services from both AWS and Microsoft Azure for the storage of sensitive records, provided that the requirements on encryption mechanisms are met. The result also shows that alternative encryption methods can be used as part of a document management system. Conclusion: Swedish authorities should strive to expand their digitalization but should be careful about the use of cloud providers. If laws change, or political tensions rise, the requirements on the encryption mechanisms proposed in this thesis would no longer be applicable. In such a situation, Swedish authorities should instead use alternative solutions that are not affected, such as the document management system proposed in this thesis.
APA, Harvard, Vancouver, ISO, and other styles
41

Barron, John P. "An objective technique for Arctic cloud analysis using multispectral AVHRR satellite imagery." Thesis, Monterey, California. Naval Postgraduate School, 1988. http://hdl.handle.net/10945/23335.

Full text
Abstract:
Approved for public release; distribution is unlimited.
An established cloud analysis routine has been modified for use in the Arctic. The separation of clouds from the snow and sea-ice backgrounds is accomplished through a multispectral technique which utilizes AVHRR channel 2 (visible), channel 3 (near infrared), and channel 4 (infrared) data. The primary means of cloud identification is based on a derived channel 3 reflectance image. At this wavelength, a significant contrast exists between liquid clouds and the Arctic backgrounds, unlike in the standard visible and infrared images. The channel 3 reflectance is obtained by first using the channel 4 emission temperature to estimate the thermal emission component of the total channel 3 radiance. This thermal emission component is subsequently removed from the total radiance, leaving only the solar reflectance component available for analysis. Since many ice clouds do not exhibit a substantially greater reflectance in channel 3, the routine exploits differences in transmissive characteristics between channels 3 and 4 for their identification. The routine was applied to six case studies which had been analyzed by three independent experts to establish 'ground truth'. Verification of the cloud analysis results, through a comparison to the subjective analyses, yielded impressive statistics: a success rate of 77.9% was obtained on an arguably small data base of 131 undisputed scenes.
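The channel-3 reflectance derivation described above can be sketched in a few lines. The Planck-function estimate of the thermal component and the normalisation by an assumed incoming solar radiance are illustrative assumptions, not the thesis' actual AVHRR calibration:

```python
import math

# Physical constants (SI units)
H = 6.626e-34    # Planck constant [J s]
C = 2.998e8      # speed of light [m/s]
KB = 1.381e-23   # Boltzmann constant [J/K]

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of a blackbody [W m^-2 sr^-1 m^-1]."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / (math.exp(b) - 1.0)

def channel3_reflectance(l3_total, t4_kelvin, solar_radiance,
                         wavelength_m=3.7e-6):
    """Estimate the solar-reflectance component of AVHRR channel 3.

    Illustrative sketch of the technique in the abstract: the thermal
    emission in channel 3 is approximated by a blackbody at the
    channel-4 brightness temperature and subtracted from the measured
    total radiance; the residual is normalised by the incoming solar
    radiance. Calibration details are placeholders, not the thesis'.
    """
    thermal = planck_radiance(wavelength_m, t4_kelvin)
    return max(l3_total - thermal, 0.0) / solar_radiance
```

The contrast the thesis exploits comes from the residual term: liquid clouds reflect strongly at 3.7 µm, while snow and sea ice do not, so the residual image separates them even when visible and infrared channels cannot.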
http://archive.org/details/objectivetechniq00barr
Lieutenant, United States Navy
APA, Harvard, Vancouver, ISO, and other styles
42

Bandini, Alessandro. "Programming and Deployment of Cloud-based Data Analysis Applications." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/13803/.

Full text
Abstract:
Cloud computing is a model that enables shared, practical, on-demand network access to different computational resources such as networks, storage, applications, or services. The goal of this work is to describe a project carried out within cloud computing. After an introduction to the theory behind cloud computing technologies comes the practical part of the work, starting from a more specific platform, Hadoop, which allows storage and data analysis, and then moving to more general-purpose platforms, Amazon Web Services and Google App Engine, where different types of services have been tried. The major part of the project is based on Google App Engine, where storage and computational services have been used to run MapReduce jobs. MapReduce is a different programming approach to solving data analysis problems that is suited to big data. The first jobs are written in Python, an imperative programming language. Later on, a functional approach to the same problems has been tried, with the Scala language and the Spark platform, to compare the code. As cloud computing is mainly used to host websites, a simple site was developed as an integral part of the work. The development of the site is not explained in detail, as it goes beyond this thesis' main focus; only the relevant aspects are treated.
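As a reminder of the programming model the thesis uses, here is a toy word count expressed as separate map and reduce phases. This is a local, single-process sketch of the MapReduce idea, not App Engine's actual MapReduce API:

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Map step: emit a (word, 1) pair for every word in a document."""
    return [(word.lower(), 1) for word in document.split()]

def reduce_phase(pairs):
    """Reduce step: sum the counts emitted for each key."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

def mapreduce_wordcount(documents):
    """Run the two phases over a collection of documents."""
    mapped = chain.from_iterable(map_phase(d) for d in documents)
    return reduce_phase(mapped)
```

In a real MapReduce runtime the map calls run in parallel across workers and a shuffle step groups the pairs by key before reduction; the structure of the two user-supplied functions stays the same.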
APA, Harvard, Vancouver, ISO, and other styles
43

Palopoli, Amedeo. "Containerization in Cloud Computing: performance analysis of virtualization architectures." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/14818/.

Full text
Abstract:
The growing adoption of the cloud is strongly influenced by the emergence of technologies that aim to improve the development and deployment processes of enterprise-level applications. The objective of this thesis is to analyse one of these solutions, called "containerization", and to evaluate in detail how this technology can be adopted in cloud infrastructures as an alternative to complementary solutions such as virtual machines. Until today, the traditional virtual machine model has been the predominant solution on the market. The important architectural difference that containers offer has led this technology to rapid adoption, since it greatly improves resource management and sharing and guarantees significant improvements in the provisioning of individual instances. In this thesis, containerization is examined from both the infrastructural and the application point of view. Regarding the first aspect, performance is analysed by comparing LXD, Docker, and KVM as hypervisors of the OpenStack cloud infrastructure, while the second point concerns the development of enterprise-level applications that must be deployed on a set of distributed servers. In that case, high-level services such as orchestration are needed. Therefore, the performance of the following solutions is compared: Kubernetes, Docker Swarm, Apache Mesos, and Cattle.
APA, Harvard, Vancouver, ISO, and other styles
44

Hohenegger, Cathy. "Dynamical analysis of atmospheric predictability in cloud-resolving models /." Zürich : ETH, 2006. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=16871.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Huang, Yujie. "The application of multispectral analysis to reduce cloud interference." Thesis, University of Gävle, University of Gävle, Ämnesavdelningen för samhällsbyggnad, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-6840.

Full text
Abstract:

For multispectral Remote Sensing (RS) image analysis, a big problem is that original data always include Cloud Interference (CI). Especially in bad weather conditions, the CI is evident in RS images, so during the pre-processing of an RS image the CI should be reduced as much as possible. In this paper, reducing CI is researched as the central problem, so that more Ground-Object Features (GOF) can be obtained. An analysis of cloud reflection in different Spectral Bands (SBs) was done based on optical theory and earlier research. Moreover, the relationships between cloud reflection and ground-object reflection are presented to understand what the Digital Number (DN) represents in each SB, and to reduce the impact of CI the Same DN Spectral Matching Method (SDN-SMM), based on the multispectral application, is applied. Finally, two cases are tested using a Matlab programme to indicate the rationality and practicability of SDN-SMM. Some advantages and disadvantages of SDN-SMM are drawn from a discussion of the final results. The method can be used with any kind of multispectral sensor image with simple calculation, while the original data of the cloud-free region are not changed. However, the quality of the CI reduction depends on the precision of the cloud identification and on the SB used for creating the spectral position relationship. At the end of this paper, an improvement is also presented for future work.

APA, Harvard, Vancouver, ISO, and other styles
46

Li, Zhongli. "Towards a Cloud-based Data Analysis and Visualization System." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/35030.

Full text
Abstract:
In recent years, increasing attention has been paid to developing exceptional technologies for efficiently processing the massive collection of heterogeneous data generated by different kinds of sensors. While we have observed great successes in utilizing big data in many innovative applications, the need to integrate information poses new challenges caused by the heterogeneity of the data. In this thesis, we target geo-tagged data and propose a cloud-based platform named City Digital Pulse (CDP), in which a unified mechanism and an extensible architecture are provided to facilitate the various aspects of big data analysis, ranging from data acquisition to data visualization. We instantiate the proposed system using multi-modal data collected from two social platforms, Twitter and Instagram, which include plenty of geo-tagged messages. Data analysis is performed to detect human affect in the user-uploaded content. The emotional information in big social data can be uncovered using a multi-dimensional visualization interface, through which users can easily grasp the evolution of human affective status within a given geographical area and interact with the system. This offers low-cost opportunities to improve decision making in many critical areas. Both the proposed architecture and algorithm are empirically demonstrated to be able to achieve real-time big data analysis.
APA, Harvard, Vancouver, ISO, and other styles
47

Albarrán, Munoz Isaac, and Ruiz De Azúa Manuel Parras. "Telecommunication Services’ Migration to the Cloud : Network Performance analysis." Thesis, KTH, Kommunikationssystem, CoS, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-93841.

Full text
Abstract:
Nowadays, telecommunication services are commonly deployed in private networks, which are controlled and maintained by the telecommunication operators themselves, by co-location services providers, or, to some extent, by their hardware and software providers. However, with the present development of cloud computing resources, one might consider if these services could and should be implemented in the Cloud, thus taking advantage of cloud computing’s high availability, geographic distribution, and ease of usage. Additionally, this migration could reduce the telecommunication operators’ concerns in terms of hardware and network maintenance, leaving those to the Cloud computing providers who will need to supply a highly available and consistent service, to fulfill the telecommunication services’ requirements. Furthermore, virtualization provides the possibility of easily and rapidly changing the Cloud network topology facilitating the addition and removal of machines and services, allowing telecommunication services providers to adapt to their demands on the fly. The aim of this thesis project is to analyze and evaluate the level of performance, from the network point of view, that can be achieved when using Cloud computing resources to implement a telecommunication service, carrying out practical experiments both in laboratory and real environments. These measurements and analyses were conducted using an Ericsson prototype mobile switching center server (MSC-S) application, although the results obtained could be adapted to other applications with similar requirements. In order to potentially test this approach in a real environment, a prior providers’ survey was utilized to evaluate their services based on our requirements in terms of hardware and network characteristics, and thus select a suitable candidate environment for our purposes. One cloud provider was selected and its service was further evaluated based on the MSC-S application requirements. 
We report the results of our benchmarking process in this environment and compare them to the results of testing in a laboratory environment. The results of both sets of testing were well correlated and indicate the potential for hosting telecommunication services in a cloud environment, provided the cloud meets the requirements imposed by the telecom services.
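Of the metrics listed (bitrate, packet loss, jitter, frame rate), jitter is the least obvious to compute. A common definition is the RTP interarrival-jitter estimator from RFC 3550; the thesis does not state which estimator it used, so this sketch is only an illustrative assumption:

```python
def rfc3550_jitter(transit_times_ms):
    """Interarrival jitter estimate in the style of RFC 3550 (App. A.8).

    transit_times_ms: per-packet (arrival - send) offsets in milliseconds.
    The running estimate is J += (|D| - J) / 16, where D is the change
    in transit time between consecutive packets; the divisor 16 gives a
    noise-reducing exponential average.
    """
    j = 0.0
    for prev, cur in zip(transit_times_ms, transit_times_ms[1:]):
        d = abs(cur - prev)
        j += (d - j) / 16.0
    return j
```

A perfectly regular stream (constant transit time) yields zero jitter, while variable queueing delay in the network path drives the estimate up.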
APA, Harvard, Vancouver, ISO, and other styles
48

Kumar, Gunasekar, and Anirudh Chelikani. "Analysis of security issues in cloud based e-learning." Thesis, Högskolan i Borås, Institutionen Handels- och IT-högskolan, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hb:diva-20868.

Full text
Abstract:
Cloud-based e-learning is one of the booming technologies in the IT field, bringing powerful e-learning products with the help of cloud power. Cloud technology has numerous advantages over existing traditional e-learning systems, but at the same time security is a major concern in cloud-based e-learning, so security measures are unavoidable to prevent the loss of users' valuable data through security vulnerabilities. Cloud-based e-learning products also need to satisfy the security needs of customers and overcome the various security threats that target valuable data stored in cloud servers. The study therefore investigates the various security issues involved in cloud-based e-learning technology, with the aim of suggesting solutions in the form of security measures and security management standards. These will help to overcome the security threats in cloud-based e-learning technology. To achieve our thesis aim, we used theoretical and empirical studies. The empirical study is made through information gathered from the websites of various cloud-based e-learning solution vendors, and the theoretical study is made through text analysis of various research articles related to our subject areas. Finally, the constant comparative method is used to compare the empirical findings with the facts discovered in our theoretical findings. These analyses and research studies lead to the identification of various security issues in cloud-based e-learning technology.
Programme: Magisterutbildning i informatik (Master's programme in informatics)
APA, Harvard, Vancouver, ISO, and other styles
49

Owen, Anne M. "Widescale analysis of transcriptomics data using cloud computing methods." Thesis, University of Essex, 2016. http://repository.essex.ac.uk/16125/.

Full text
Abstract:
This study explores the handling and analysis of big data in the field of bioinformatics. The focus has been on improving the analysis of public-domain data for Affymetrix GeneChips, which are a widely used technology for measuring gene expression. Methods to determine the bias in gene expression due to G-stacks associated with runs of guanine in probes have been explored via the use of a grid and various types of cloud computing. An attempt has been made to find the best way of storing and analyzing the big data used in bioinformatics, and the experience gained in using a grid and different clouds is reported. In the case of Windows Azure, a public cloud has been employed in a new way to demonstrate the use of the R statistical language for research in bioinformatics. This work has studied the G-stack bias in a broad range of GeneChip data from public repositories. A wide-scale survey has been carried out to determine the extent of the G-stack bias in four different chips across three different species. The study commenced with the human GeneChip HG U133A. A second human GeneChip, HG U133 Plus2, was then examined, followed by a plant chip, Arabidopsis thaliana, and then a bacterium chip, Pseudomonas aeruginosa. Comparisons have also been made between the use of the widely recognised algorithms RMA and PLIER for the normalization stage of extracting gene expression from GeneChip data.
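The G-stack effect concerns probes whose sequences contain runs of guanine. A minimal sketch of flagging such probes, assuming the common criterion of four or more consecutive G bases (the thesis' exact definition may differ):

```python
import re

# A "G-stack": a run of four or more consecutive guanines in a probe.
G_STACK = re.compile(r"G{4,}")

def has_g_stack(probe_seq):
    """True if the probe sequence contains a run of >= 4 guanines,
    the pattern associated with biased probe intensities."""
    return bool(G_STACK.search(probe_seq.upper()))

def g_stack_fraction(probes):
    """Fraction of probes on a chip flagged as containing a G-stack."""
    flagged = sum(has_g_stack(p) for p in probes)
    return flagged / len(probes)
```

Applied across the millions of 25-mer probes on a chip, a per-probe test like this is exactly the kind of embarrassingly parallel workload that the grid and cloud approaches in the thesis are suited to.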
APA, Harvard, Vancouver, ISO, and other styles
50

Ray, Sujan. "Dimensionality Reduction in Healthcare Data Analysis on Cloud Platform." University of Cincinnati / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin161375080072697.

Full text
APA, Harvard, Vancouver, ISO, and other styles