Dissertations / Theses on the topic 'Green Software'

To see the other types of publications on this topic, follow the link: Green Software.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 34 dissertations / theses for your research on the topic 'Green Software.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Acar, Hayri. "Software development methodology in a Green IT environment." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSE1256/document.

Full text
Abstract:
The number of mobile devices (smartphones, tablets, laptops, etc.) and of Internet users is continually increasing. Due to the accessibility provided by cloud computing, the Internet and the Internet of Things (IoT), users rely on more and more software applications, which have a growing effect on greenhouse gas emissions. ICT (Information and Communication Technologies) is thus responsible for around 2% of worldwide greenhouse gas emissions, roughly equivalent to the emissions of the airline industry. According to recent reports by the Intergovernmental Panel on Climate Change (IPCC), CO2 emissions due to ICT are increasing rapidly. Nevertheless, ICT, by helping to solve complex problems in other sectors, can contribute greatly and easily to reducing a significant portion of the remaining 98% of global CO2 emissions. The use of software implies hardware operations that are physically responsible for energy consumption; software is therefore indirectly involved in energy consumption. Thus, we need to reduce software energy consumption while maintaining the same functionality in order to build sustainable and green software. Firstly, in this thesis work, we define the terms sustainable and green in the area of software development. To build a software product, we need to follow a software engineering process; hence, we define and describe sustainable and green criteria to be respected after each step of this process in order to establish a sustainable and green software engineering process. Then, we focus on estimating software energy consumption. Many research works have proposed various tools to estimate the energy consumption due to software in order to reduce its carbon footprint. Unfortunately, these studies in most cases consider only the CPU and neglect all other components. Existing power consumption methodologies need to be improved by taking into account more of the components likely to consume energy while an application runs. 
Writing sustainable, power-efficient and green software requires understanding the power consumption behaviour of a computer program. One benefit is that developers, by improving their source code implementations, will optimize software power consumption. Moreover, there is a lack of analysis tools that dynamically monitor the source code energy consumption of several components. We therefore propose GMTEEC (Generic Methodology of a Tool to Estimate Energy Consumption), composed of four layers that assist developers in building a tool for estimating software power consumption. Following the layers of GMTEEC, we develop TEEC (Tool to Estimate Energy Consumption), which is based on a mathematical formula established for each component (CPU, memory, hard disk, network); the total software energy consumption is estimated as the sum of the per-component consumptions. Moreover, we add to TEEC the capacity to dynamically locate hotspots, the parts of the source code that consume the greatest amount of energy, in order to help and guide developers in optimizing their source code and building efficient, sustainable and green software. We performed a variety of experiments to validate the accuracy and quality of the sustainable and green software engineering process and of TEEC. The results demonstrate that significant amounts of energy and time can be saved at limited cost, with a substantial positive impact on the environment.
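The additive per-component model described in this abstract can be sketched in a few lines; the component figures and the simple power-times-time formula below are illustrative assumptions, not the actual TEEC formulas from the thesis.

```python
# Hypothetical sketch of an additive energy model in the spirit of TEEC:
# total software energy is the sum of per-component estimates (CPU, memory,
# disk, network), each modelled here as average power (watts) times
# active time (seconds). All figures are invented for illustration.

def component_energy(avg_power_w, active_time_s):
    """Energy in joules for one component: E = P * t."""
    return avg_power_w * active_time_s

def total_energy(samples):
    """Sum per-component energies; `samples` maps component -> (power, time)."""
    return sum(component_energy(p, t) for p, t in samples.values())

run = {
    "cpu":     (12.5, 4.0),   # watts, seconds (illustrative figures)
    "memory":  (3.0, 4.0),
    "disk":    (6.0, 1.5),
    "network": (2.0, 2.5),
}
energy_j = total_energy(run)   # 76.0 J for these made-up figures
```

A real tool would derive the per-component power terms from calibrated hardware models rather than constants, but the summation structure stays the same.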
APA, Harvard, Vancouver, ISO, and other styles
2

Hiryanto, Lely. "Multi-Stage Network Upgrade for Green Software Defined Networking." Thesis, Curtin University, 2022. http://hdl.handle.net/20.500.11937/88898.

Full text
Abstract:
This thesis addresses three versions of a novel problem, called Green Multi-Stage Upgrade (GMSU), to upgrade legacy networks to Software Defined Networks (SDNs). The three versions, namely GMSU-1, GMSU-2, and GMSU-3, consider legacy networks that support IEEE 802.1AX, where each link contains multiple cables. Each version aims to replace a set of legacy switches with SDN switches over multiple stages, so as to turn off as many unused cables adjacent to SDN switches as possible and thereby save energy.
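The stage-by-stage selection idea can be illustrated with a small greedy sketch. The topology, cable counts, and the greedy rule itself are invented for illustration; the thesis formulates GMSU as an optimisation problem, not this heuristic.

```python
# Illustrative greedy for a multi-stage upgrade: at each stage, upgrade the
# legacy switch whose adjacent links carry the most unused cables (which an
# SDN switch could then power off). Topology and numbers are made up.

def unused_cables(switch, links, upgraded):
    # Count unused cables on links touching `switch` that no SDN switch
    # can already turn off (i.e. neither endpoint is upgraded yet).
    return sum(l["cables"] - l["used"]
               for l in links
               if switch in l["ends"] and not (set(l["ends"]) & upgraded))

def plan_upgrade(switches, links, stages):
    upgraded, plan = set(), []
    for _ in range(stages):
        best = max((s for s in switches if s not in upgraded),
                   key=lambda s: unused_cables(s, links, upgraded))
        plan.append(best)
        upgraded.add(best)
    return plan

switches = ["a", "b", "c"]
links = [
    {"ends": ("a", "b"), "cables": 4, "used": 1},
    {"ends": ("b", "c"), "cables": 2, "used": 2},
    {"ends": ("a", "c"), "cables": 3, "used": 1},
]
plan = plan_upgrade(switches, links, stages=2)   # upgrades "a" first
```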
APA, Harvard, Vancouver, ISO, and other styles
3

Crute, Stephen John. "Computer simulations of green spruce aphid populations." Thesis, University of Ulster, 1990. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.281228.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Silva, Dina Raquel Rodrigues Retroz e. "Gestão de projeto de software: caso de estudo green na IUZ Technologies." Master's thesis, Universidade de Aveiro, 2012. http://hdl.handle.net/10773/10853.

Full text
Abstract:
Master's degree in Management
This document describes the traineeship carried out at iUZ Technologies between September 2011 and May 2012, during which the author helped develop an application for a customer of the host company. During this work, in addition to managing the project, the author was involved in the development of the underlying information system, from requirements design through to the validation and testing of the applications developed. Performing this work required multiple planning and project-management techniques, and this report includes an analysis of the state of the art in this area of knowledge in terms of methodologies, which is then confronted with the practical reality of day-to-day business. In addition, an analysis of methodologies for developing and testing information systems was carried out, and the process of developing the applications, based on user stories and agile software development methodologies, is described.
APA, Harvard, Vancouver, ISO, and other styles
5

Sapountzis, Ioannis. "Traffic Monitoring for Green Networking." Thesis, KTH, Skolan för informations- och kommunikationsteknik (ICT), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-147981.

Full text
Abstract:
The notion of the networked society is truer than ever. The Internet has a big impact on our daily lives. Network operators provide the underlying infrastructure and continuously deploy services in order to meet customer demands. The amount of data transported through operator networks is also increasing with the introduction of new high-bandwidth services and over-the-network content. That said, operators most often deploy or operate networks to meet these demands without regard for energy efficiency. As the price of electricity continues to grow, this tends to become a problem with serious implications. To solve this problem, a trend towards more energy-efficient networks has emerged. In this thesis, we investigate a way to facilitate the introduction of new energy-efficiency paradigms for fixed networks. Towards this end, we survey the energy-efficiency schemes proposed to date and select the one we believe is most realistic to deploy. Furthermore, we specify the inputs required by the selected "green" routing approach. Moreover, we study existing and new protocols that can provide the basic network-monitoring functionality needed to acquire these inputs. Finally, a Software Defined Networking (SDN) approach is proposed to facilitate the development of energy-efficiency-aware networks. The details of a basic SDN monitoring application are presented from an abstract architectural point of view, and three designs stemming from this basic architecture are discussed: All_Flow, First_Switch and Port_FlowRemoved. The first two were implemented as steps towards understanding the full capabilities of monitoring in SDN-enabled networks and provided useful input towards realizing the third as a proof of concept. Their usage and faults are discussed, as they can provide useful insight for possible future implementations. 
Port_FlowRemoved is the design and implementation suggested as providing the most fitting results for the monitoring purpose at hand: retrieving the identified inputs for the selected "green" networking approach. The differentiating factor among the three designs is how they collect the required inputs from the network. A fast prototype was created as a proof of concept in order to validate the proposed architecture and thereby strengthen the validity of the idea.
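The core of any counter-based monitoring design like those above is differencing byte counters between polls to estimate link utilisation. The sketch below shows only that arithmetic; the function and its inputs are illustrative and not an actual OpenFlow or ONOS API.

```python
# Illustrative utilisation estimate from two successive polls of a port's
# byte counter. In a real SDN monitoring app the counters would come from
# OpenFlow port/flow statistics; here they are plain numbers.

def utilisation(prev_bytes, curr_bytes, interval_s, capacity_bps):
    """Fraction of link capacity used since the previous poll."""
    bits = (curr_bytes - prev_bytes) * 8
    return bits / (interval_s * capacity_bps)

# 125,000 bytes in 1 s on a 10 Mbit/s link -> 1,000,000 bits -> 10% load
load = utilisation(0, 125_000, 1.0, 10_000_000)
```

Such per-link load figures are exactly the kind of input a "green" routing scheme needs in order to decide which links are candidates for sleeping.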
APA, Harvard, Vancouver, ISO, and other styles
6

Demir, Emrah, Martin Hrusovsky, Werner Jammernegg, and Tom Van Woensel. "Green intermodal freight transportation: bi-objective modelling and analysis." Taylor & Francis, 2019. http://epub.wu.ac.at/6990/1/00207543.2019.pdf.

Full text
Abstract:
Efficient planning of freight transportation requires a comprehensive look at a wide range of factors in the operation and management of any transportation mode to achieve safe, fast and environmentally sound movement of goods. In this regard, a combination of transportation modes offers flexible and environmentally friendly alternatives for transporting high volumes of goods over long distances. The challenge is to develop models and algorithms for Transport Management System software packages that reflect the advantages of each transportation mode. This paper discusses the principles of green logistics required in designing such models and algorithms, which truly represent multiple modes and their characteristics. This research thus provides a unique practical contribution to the green logistics literature by advancing our understanding of multi-objective planning in intermodal freight transportation. An analysis based on a case study of hinterland intermodal transportation in Europe is intended to contribute to the literature on the potential benefits of combining economic and environmental criteria in transportation planning. An insight derived from the experiments conducted is that there is no need to compromise greatly on transportation costs in order to achieve a significant reduction in carbon-related emissions.
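One standard way to combine economic and environmental criteria, as this abstract discusses, is a weighted sum of the two objectives. The sketch below uses that generic technique with invented routes and figures; it is not the model from the paper.

```python
# Minimal weighted-sum sketch of bi-objective route selection: score each
# intermodal route by a convex combination of cost and CO2 emissions,
# then pick the lowest score. Route names and figures are invented.

def score(route, alpha):
    """alpha=1 -> pure cost minimisation, alpha=0 -> pure emissions."""
    return alpha * route["cost"] + (1 - alpha) * route["co2"]

routes = [
    {"name": "road only",   "cost": 100.0, "co2": 80.0},
    {"name": "rail + road", "cost": 110.0, "co2": 40.0},
]

best = min(routes, key=lambda r: score(r, alpha=0.5))
```

Sweeping `alpha` from 0 to 1 traces a set of trade-off solutions, which is one simple way to explore the kind of cost/emissions compromise the paper analyses.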
APA, Harvard, Vancouver, ISO, and other styles
7

Chinenyeze, Samuel Jaachimma. "Mango : a model-driven approach to engineering green Mobile Cloud Applications." Thesis, Edinburgh Napier University, 2017. http://researchrepository.napier.ac.uk/Output/976572.

Full text
Abstract:
With the resource-constrained nature of mobile devices and the resource-abundant offerings of the cloud, several promising optimisation techniques have been proposed by the green computing research community. Prominent techniques and unique methods have been developed to offload resource- and computation-intensive tasks from mobile devices to the cloud. Most of the existing offloading techniques can only be applied to legacy mobile applications, as they are motivated by existing systems; consequently, they are realised with custom runtimes which incur overhead on the application. Moreover, existing approaches which can be applied during the software development phase are difficult to implement (being based on manual processes) and also fall short of overall (mobile-to-cloud) efficiency in software quality attributes or awareness of full-tier (mobile-to-cloud) implications. To address these issues, the thesis proposes a model-driven architecture for integrating software quality with green optimisation in Mobile Cloud Applications (MCAs), abbreviated as the Mango architecture. The core aim of the architecture is to present an approach which easily integrates software quality attributes (SQAs) with the green optimisation objective of Mobile Cloud Computing (MCC). Also, as an MCA is an application domain which spans the mobile and cloud tiers, the Mango architecture takes into account the specification of SQAs across both tiers, for overall efficiency. Furthermore, as a model-driven architecture, models can be built for computation-intensive tasks and their SQAs, which in turn drive the development, for development efficiency. Thus, a modelling framework (called Mosaic) and a full-tier test framework (called Beftigre) are proposed to automate the architecture derivation and demonstrate the efficiency of the Mango approach. 
Using real-world scenarios and applications, Mango has been demonstrated to enhance the MCA development process while achieving overall efficiency in terms of SQAs (including mobile performance and energy usage) compared to existing counterparts.
APA, Harvard, Vancouver, ISO, and other styles
8

Carpa, Radu. "Energy Efficient Traffic Engineering in Software Defined Networks." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSEN065/document.

Full text
Abstract:
This work seeks to improve the energy efficiency of backbone networks by automatically managing the paths of network flows to reduce over-provisioning. Compared to numerous works in this field, we stand out by focusing on low computational complexity and smooth deployment of the proposed solution in the context of Software Defined Networks (SDN). To ensure that we meet these requirements, we validate the proposed solutions on a network testbed built for this purpose. Moreover, we believe it is indispensable for the computer science research community to improve the reproducibility of experiments; one can reproduce most of the results presented in this thesis by following a couple of simple steps. In the first part of this thesis, we present a framework for putting links and line cards into sleep mode during off-peak periods and rapidly bringing them back on when more network capacity is needed. The solution, which we term "SegmenT Routing based Energy Efficient Traffic Engineering" (STREETE), was implemented using state-of-the-art dynamic graph algorithms. STREETE achieves execution times of tens of milliseconds on a 50-node network. The approach was also validated on a testbed using the ONOS SDN controller along with OpenFlow switches. We compared our algorithm against optimal solutions obtained via a Mixed Integer Linear Programming (MILP) model to demonstrate that it can effectively prevent network congestion, avoid turning on unneeded links, and provide excellent energy efficiency. The second part of this thesis studies solutions for maximizing the utilization of existing components, extending the STREETE framework to workloads that are not well handled by its original form. This includes high network loads that cannot be routed through the network without fine-grained management of the flows. 
In this part, we diverge from the shortest-path routing traditionally used in computer networks and perform a particular load balancing of the network flows. In the last part of this thesis, we combine STREETE with the proposed load-balancing technique and evaluate the performance of this combination, both in terms of turned-off links and in its ability to keep the network out of congestion. After that, we use our network testbed to evaluate the impact of our solutions on TCP flows and provide an intuition about the additional constraints that must be considered to avoid instabilities due to traffic oscillations between multiple paths.
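The basic safety condition behind putting links to sleep, as described in this abstract, is that the residual topology must stay connected. The sketch below shows that idea with a naive threshold rule and DFS connectivity check; it is a simplification for illustration, not STREETE's segment-routing algorithm.

```python
# Hedged sketch of off-peak link sleeping: a link may sleep only if its
# utilisation is low AND removing it leaves the network connected.
# Topology, loads and the 10% threshold are illustrative.

def connected(nodes, links):
    if not nodes:
        return True
    adj = {n: set() for n in nodes}
    for a, b in links:
        adj[a].add(b)
        adj[b].add(a)
    seen, stack = set(), [next(iter(nodes))]
    while stack:                       # iterative DFS
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(adj[n] - seen)
    return seen == set(nodes)

def sleepable(nodes, links, loads, threshold=0.1):
    asleep, active = [], list(links)
    for link in sorted(links, key=lambda l: loads[l]):   # least-loaded first
        if loads[link] < threshold:
            trial = [l for l in active if l != link]
            if connected(nodes, trial):
                active = trial
                asleep.append(link)
    return asleep

nodes = {"A", "B", "C"}
links = [("A", "B"), ("B", "C"), ("A", "C")]
loads = {("A", "B"): 0.05, ("B", "C"): 0.5, ("A", "C"): 0.02}
off = sleepable(nodes, links, loads)   # only A-C can sleep safely
```

In the example, A-B is also lightly loaded, but removing it after A-C would disconnect node A, so it stays on, which is exactly the congestion/connectivity trade-off the thesis studies.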
APA, Harvard, Vancouver, ISO, and other styles
9

Warth, Benedikt. "Design and Application of Software Sensors in Batch and Fed-batch Cultivations during Recombinant Protein Expression in Escherichia coli." Thesis, Linköping University, The Department of Physics, Chemistry and Biology, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-12530.

Full text
Abstract:

Software sensors are a potent tool to improve biotechnological real-time process monitoring and control. In the current project, algorithms for six partly novel software sensors were established and tested in a microbial reactor system. Eight batch and two fed-batch runs were carried out with a recombinant Escherichia coli to investigate the suitability of the different software-sensor models in diverse cultivation stages. Special attention was given to effects on the sensors after recombinant protein expression was initiated by addition of an inducer molecule. One objective was to determine the influence of excessive recombinant protein expression on the software-sensor signals.

Two of the developed algorithms calculated the biomass on-line and furthermore estimated the specific growth rate by integrating the biomass changes over time. The first was based on a near-infrared probe providing on-line readings of the optical density; the other algorithm was founded on the titration of ammonia as the only available nitrogen source. The other two sensors analysed the specific consumption of glucose and the specific production of acetate, and are predicated on an in-line HPLC system.

The results showed that all software sensors worked as expected and are rather powerful for estimating important state parameters in real time. In some stages, restrictions may occur due to different limitation effects in the models or in the physiology of the culture. However, the results were very convincing and suggest the development of further, more advanced software-sensor models in the future.
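The biomass-and-growth-rate idea behind the first two sensors can be sketched with textbook relations: a linear calibration from optical density to biomass, and the exponential-growth estimate of the specific growth rate. The calibration constant and readings below are illustrative assumptions, not values from the thesis.

```python
# Sketch of a growth-rate software sensor: estimate biomass from on-line
# optical density via a linear calibration, then the specific growth rate
# mu from two successive readings, assuming exponential growth between them.

from math import log

def biomass(od, k=0.4):
    """Dry-cell-weight estimate (g/L) from optical density; k is an
    illustrative calibration constant."""
    return k * od

def specific_growth_rate(x1, x2, dt_h):
    """mu (1/h) from mu = ln(x2/x1) / dt under exponential growth."""
    return log(x2 / x1) / dt_h

x1 = biomass(1.0)                             # 0.4 g/L
x2 = biomass(2.0)                             # 0.8 g/L one hour later
mu = specific_growth_rate(x1, x2, dt_h=1.0)   # ln(2), about 0.693 1/h
```

A real sensor would smooth noisy OD readings before differencing, but the underlying relation is this one.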

APA, Harvard, Vancouver, ISO, and other styles
10

Birol, Kemal Ozgen. "Design And Analysis Of Energy Saving Buildings Using The Software Energy Plus." Master's thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12614653/index.pdf.

Full text
Abstract:
Being the major consumers of electricity and natural gas, buildings account for more than 70% of electricity and 30% of natural gas consumption. On the way to green buildings and zero-energy buildings, investigating and improving the energy efficiency of buildings will significantly reduce energy demands and CO2 emissions, yield cost savings, and improve thermal comfort as well. The key steps of a successful green, energy-efficient building can be summarized as whole-building design, site design, building-envelope design, lighting and daylighting design, and HVAC system design. The Energy Plus® software is mainly developed to simulate the performance of buildings with respect to the points listed above. Designing a building, or analysing an existing one, with the software shows how efficient the building is or will be, and also helps find the most efficient choice for the whole building system. The thesis focuses on the effect of changes in building-envelope properties. In Turkey, the topic of green buildings has only recently begun to be studied. Therefore, this thesis aims to present efficient technologies providing energy savings in buildings, the green-building concept, and alternative energy-simulation software. In the context of this study, design, method and material guidelines are introduced to reduce the energy needs of buildings and to promote the green-building design concept. Building and system parameters that enhance building energy efficiency and energy savings, together with green-building principles, are summarized. Moreover, whole-building energy-analysis methods and simulation steps are explained; a year-round simulation is performed for a sample building and, as a result, energy savings of about 36% are achieved.
APA, Harvard, Vancouver, ISO, and other styles
11

Eldridge, Jacob Douglas. "A Comparison of Current Anuran Monitoring Methods with Emphasis on the Accuracy of Automatic Vocalization Detection Software." TopSCHOLAR®, 2011. http://digitalcommons.wku.edu/theses/1122.

Full text
Abstract:
Currently, a variety of methods are available to monitor anurans, and little standardization of methods exists. New methods have become available over the past twenty years, including PVC pipe arrays used for tree frog capture and Automated Digital Recording Systems (ADRS) used to remotely monitor calling activity. In addition to ADRS, machine-learning software, automated vocalization recognition software (AVRS), has been developed to automatically detect vocalizations within digital sound recordings. The combination of ADRS and AVRS promises to reduce the number of people, the time, and the resources needed for an effective call survey program. However, little research exists that uses the described tools for wildlife monitoring, especially for anuran monitoring. This study addressed two problems relating to AVRS. The first was the poorly understood relationship between auditory survey methods and physical survey methods. I examined this problem by using current auditory monitoring methods, ADRS and the AVRS Song Scope© (v.3.1), alongside more traditional physical monitoring methods that included drift fences, a PVC pipe array, and visual encounter transects. No significant relationship between physical and auditory community population measures was found. Auditory methods were also effective in detecting call-characteristic differences between urban and rural locations, further suggesting an influence of noise pollution. The second problem addressed was the call-identification errors found in auditory survey methods. I examined the influence of treatments including the ADRS location, listener group, species, and season on the error rates of the AVRS Song Scope© (v.3.1) and of groups of human listeners. Computer error rates were higher than those of human listeners, yet less affected by the treatments. 
Both studies suggested that AVRS is a viable method to monitor anuran populations, but the choice of methods should depend on the species of interest and the objectives of the study.
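Comparing detector and listener error rates, as the second study does, comes down to counting false positives and false negatives against a verified reference. The sketch below shows that generic bookkeeping; the call labels and counts are invented, not data from the thesis.

```python
# Minimal sketch of detection-error accounting: compare automated call
# detections against a human-verified reference set. Labels are invented.

def error_rates(detected, reference):
    detected, reference = set(detected), set(reference)
    fp = len(detected - reference)          # calls flagged but not real
    fn = len(reference - detected)          # real calls missed
    fp_rate = fp / len(detected) if detected else 0.0
    fn_rate = fn / len(reference) if reference else 0.0
    return fp_rate, fn_rate

auto = {"c1", "c2", "c3", "c5"}             # detections by the software
human = {"c1", "c2", "c3", "c4"}            # human-verified calls
fp_rate, fn_rate = error_rates(auto, human) # 0.25 each in this example
```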
APA, Harvard, Vancouver, ISO, and other styles
12

Sztulzaft, Patrick. "Green-expert : un solveur généralisé associé à un générateur de formulations pour la méthode des intégrales de frontières." Grenoble INPG, 1994. https://hal.archives-ouvertes.fr/tel-01331763.

Full text
Abstract:
Investigations in many sectors of industry and research require the modelling of phenomena observed in the physics of continuous media. The partial differential equations describing these phenomena are solved using a wide range of numerical methods. The models used are increasingly sophisticated, from both a physical and a numerical point of view. Software used to solve these problems must therefore be capable of evolving. This work is a continuation of research efforts devoted to the modelling of complex phenomena that began with the development of the Flux-Expert® program, based on the Finite Element Method. In order to extend the possibilities offered by this program, we decided to combine it with the Boundary Element Method. After reviewing the Boundary Element Method, we propose a general decomposition of the numerical solution of a problem using this method. We then describe the Green-Expert software developed on the basis of this analysis. The original aspect of the approach lies in the combination of a formulations generator and a general solver. This solver is capable of solving any problem described using a formulation coming from the Generator and a discrete geometry. The last part of this thesis is devoted to validation. Examples of the combined use of the Boundary Element and Finite Element Methods are presented, and examples of 2D and 3D resolution are used to validate the Green-Expert Solver and Generator.
APA, Harvard, Vancouver, ISO, and other styles
13

Barra, López Daniel. "Análisis del efecto del arbolado urbano sobre la absorción de material particulado respirable (MP2, 5), mediante el software I - Tree Eco al interior del Parque Ecuador en la ciudad de Concepción." Tesis, Universidad de Chile, 2019. http://repositorio.uchile.cl/handle/2250/170493.

Full text
Abstract:
This thesis was carried out within the framework of the Fondecyt Initiation Project No. 1180990, "Construcción social del clima urbano: hacia la calidad y justicia climática en ciudades chilenas".
Thesis submitted to qualify for the professional title of Geographer
Deterioration of air quality is one of the main environmental problems affecting cities around the world, with pollution by fine particulate matter (PM2.5) being the most dangerous and deadly for humans. The solutions proposed to help mitigate the negative effects of air pollution have been diverse; one that has gained the most traction in recent times is the use of urban trees to reduce pollution inside cities. Using the I-Tree Eco software, developed by the U.S. Department of Agriculture (USDA), the effects of urban trees on the interception and subsequent absorption of PM2.5 inside the city of Concepción, Biobío region, Chile, were analyzed. From data on air quality, precipitation and the structure of the urban trees, the software estimates the amount of air pollution removed by trees over a year. To achieve the objective of this research, data from the National Air Quality Information System (SINCA) were used, with missing records filled in through an individual data imputation method (SDEM Model). Complete rainfall data were obtained from the National Agroclimatic Network (AGROMET). Then, through photointerpretation of Google Earth satellite images, together with green-space survey layers extracted from the CEDEUS network and fieldwork, the most relevant elements of the green infrastructure inside the city of Concepción were identified. Ecuador Park was identified as one of the most relevant spaces in this city, and a complete inventory of the urban trees within it was made, recording their structure. The data were then processed by the software to obtain the total removal of PM2.5 pollution, the carbon storage and sequestration capacity, and the emission of biogenic volatile organic compounds (BVOCs) of the park's urban trees.
The results show that the park removed a total of 4.52 kg of PM2.5 per year, at a removal rate of 0.13 g m-2 of tree cover; carbon storage was 350 tonnes of carbon and gross sequestration was 3.24 metric tonnes per year, while the park's species annually emit a total of 50.48 kg of BVOCs.
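The removal-rate figures in this abstract can be cross-checked with a short calculation. A minimal sketch, assuming the canopy area is simply total removal divided by the per-area rate; the abstract does not report the canopy area, so the derived value is illustrative only:

```python
def removal_rate_g_per_m2(total_removed_kg: float, canopy_m2: float) -> float:
    """Annual PM2.5 removal per square metre of tree cover."""
    return total_removed_kg * 1000.0 / canopy_m2

# Figures from the abstract: 4.52 kg removed per year at 0.13 g/m2.
# They imply roughly 34,800 m2 of canopy (a derived, illustrative value):
implied_canopy_m2 = 4.52 * 1000.0 / 0.13

print(round(implied_canopy_m2))                                  # ~34769
print(round(removal_rate_g_per_m2(4.52, implied_canopy_m2), 2))  # 0.13
```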
APA, Harvard, Vancouver, ISO, and other styles
14

Khaled, Haitham El-Mohamdy. "Energy and throughput efficient strategies for heterogeneous future communication networks." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2021. https://ro.ecu.edu.au/theses/2418.

Full text
Abstract:
As a result of the proliferation of wireless-enabled user equipment and data-hungry applications, mobile data traffic has increased exponentially in recent years. This increase has not only forced mobile networks to compete for the scarce wireless spectrum but also intensified their power consumption as they serve an ever-increasing number of user devices. The Heterogeneous Network (HetNet) concept, where mixed types of low-power base stations coexist with large macro base stations, has emerged as a potential solution to power consumption and spectrum scarcity challenges. However, as a consequence of their inflexible, constrained, hardware-based configurations, HetNets have major limitations in adapting to fluctuating traffic patterns. Moreover, for large mobile networks, the number of low-power base stations (BSs) may increase dramatically, leading to severe power consumption that can easily overwhelm the benefits of the HetNet concept. This thesis exploits the adaptive nature of Software-Defined Radio (SDR) technology to design novel and optimal communication strategies. These strategies leverage the spectrum-based cell-zooming technique, the Long-Term Evolution Licensed Assisted Access (LTE-LAA) concept, and green energy to introduce a novel communication framework that minimizes overall network on-grid power consumption and maximizes aggregated throughput, bringing significant benefits for both network operators and their customers. The proposed strategies take into consideration user data demands, BS loads, BS power consumption, and available spectrum to model the research questions as optimization problems. In addition, this thesis leverages the opportunistic nature of the cognitive radio (CR) technique and the adaptive nature of SDR to introduce a CR-based communication strategy.
This CR-based strategy alleviates the power consumption of the CR technique and enhances its security measures according to the confidentiality level of the data being sent. Furthermore, it takes into account user-related factors, such as user battery levels and data types, and network-related factors, such as the number of unutilized bands and the vulnerability level, and models the research question as a constrained optimization problem. Considering the time complexity of the optimal solutions for the above strategies, heuristic solutions were proposed and examined against existing ones. The results show that the proposed strategies can reduce energy consumption by up to 18%, increase user throughput by up to 23%, and achieve better spectrum utilization, offering substantial benefits for both network operators and users.
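The spectrum-based cell-zooming idea can be illustrated with a deliberately naive sketch: switch lightly loaded small cells off and let the macro layer absorb their traffic. All base-station names, loads and the threshold below are invented; the thesis formulates this as a proper optimization problem rather than a single-pass rule.

```python
def zoom_out_idle_cells(cells, load_threshold=0.1):
    """Split cells into ones kept on and small cells switched off.

    cells: list of (name, kind, load) where kind is "macro" or "small"
    and load is utilisation in [0, 1]. Lightly loaded small cells are
    powered down to save on-grid energy; their users re-attach to a macro.
    """
    kept, switched_off = [], []
    for name, kind, load in cells:
        if kind == "small" and load < load_threshold:
            switched_off.append(name)
        else:
            kept.append(name)
    return kept, switched_off

cells = [("macro-1", "macro", 0.60),
         ("pico-1", "small", 0.05),   # nearly idle: candidate to power down
         ("pico-2", "small", 0.40)]
kept, off = zoom_out_idle_cells(cells)
print(kept, off)   # ['macro-1', 'pico-2'] ['pico-1']
```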
APA, Harvard, Vancouver, ISO, and other styles
15

Wu, Xiguang. "Hierarchical reconfiguration management for heterogeneous cognitive radio equipments." Thesis, CentraleSupélec, 2016. http://www.theses.fr/2016SUPL0002/document.

Full text
Abstract:
As digital communication systems evolve from GSM toward 5G, the number of supported standards keeps growing, and communication equipment is expected to support several standards in a single device at the same time. Meanwhile, ever more wireless Internet services are being provided, resulting in explosive growth in data traffic, which increases the energy consumption of communication devices and thus has a significant impact on global CO2 emissions. More and more research has focused on the energy efficiency of wireless communication. Cognitive Radio (CR) has been considered an enabling technology for green radio communications due to its ability to adapt its behavior to a changing environment. To efficiently manage the sensing information and the reconfiguration of a cognitive equipment, it is essential first to gather the necessary metrics so as to provide enough information about the operating conditions to support decision making. On the basis of the metrics obtained, an optimal decision can then be made, followed by a reconfiguration action whose aim is to minimize power dissipation while not compromising performance. A management architecture is therefore needed within the cognitive equipment, acting as the glue that realizes the CR capabilities. We introduce such an architecture, the Hierarchical and Distributed Cognitive Radio Architecture Management (HDCRAM), which has been proposed for CR management by our team. This work focuses on the implementation of HDCRAM on heterogeneous platforms, with the objective of improving energy efficiency through HDCRAM's management. An example of a simplified OFDM system illustrates how HDCRAM efficiently manages the system to adapt to the changing environment
APA, Harvard, Vancouver, ISO, and other styles
16

Seifhashemi, Seyedeh Mahsa. "Impact of cool roof application on commercial buildings: A contribution to sustainable design in Australia." Thesis, Queensland University of Technology, 2015. https://eprints.qut.edu.au/90897/1/Seyedeh%20Mahsa_Seifhashemi_Thesis.pdf.

Full text
Abstract:
This study investigated the effects of cool roof technology on the annual energy savings of a large one-storey commercial building in Queensland, Australia. A computer model of the case study was developed in commercial software using the appropriate geometric and thermal building specifications. Field study data were used to validate the model, which was then used to extend the investigation to other cities in various Australian climate zones. The results of this research show that significant energy savings can be obtained using cool roof technology, particularly in warm, sunny climates, and the thesis contributes a guideline for applying cool roof technology to single-storey commercial buildings throughout Australia.
APA, Harvard, Vancouver, ISO, and other styles
17

Griffitts, Troy Andrew. "Software for the collaborative editing of the Greek New Testament." Thesis, University of Birmingham, 2018. http://etheses.bham.ac.uk//id/eprint/8244/.

Full text
Abstract:
This project was responsible for developing the Virtual Manuscript Room Collaborative Research Environment (VMR CRE), which offers a facility for the critical editing workflow from raw data collection, through processing, to publication, within an open and online collaborative framework for the Institut für Neutestamentliche Textforschung (INTF) and their global partners while editing the Editio Critica Maior (ECM)-- the paramount critical edition of the Greek New Testament which analyses over 5600 Greek witnesses and includes a comprehensive apparatus of chosen manuscripts, weighted by quotations and early translations. Additionally, this project produced the first digital edition of the ECM. This case study, transitioning the workflow at the INTF to an online collaborative research environment, seeks to convey successful methods and lessons learned through describing a professional software engineer’s foray into the world of academic digital humanities. It compares development roles and practices in the software industry with the academic environment and offers insights to how this software engineer found a software team therein, suggests how a fledgling online community can successfully achieve critical mass, provides an outsider’s perspective on what a digital critical scholarly edition might be, and hopes to offer useful software, datasets, and a thriving online community for manuscript researchers.
APA, Harvard, Vancouver, ISO, and other styles
18

Quansah, Solomon. "Life cycle analysis of shea butter biodiesel using GREET software." Kansas State University, 2012. http://hdl.handle.net/2097/13446.

Full text
Abstract:
Master of Science
Department of Chemical Engineering
John Schlup
In this study, a Well-to-Pump (WTP) life cycle analysis (LCA) of shea butter biodiesel is considered, utilizing information gathered from Anuanom Industrial Bio Products Ltd. (AIBP) in Ghana, West Africa. The information presented starts with shea plant cultivation, proceeds through the harvesting of shea fruits and the extraction of shea butter from shea kernels, and finishes with the production of shea butter biodiesel via homogeneous acid-alkali transesterification with methanol. After researching the conversion of shea butter to biodiesel, the GREET software was explored as a tool to perform the LCA. Shea butter is an excellent alternative feedstock for producing biodiesel on an industrial scale. Though research into shea plant cultivation and its subsequent conversion into biodiesel in Ghana has not received formal attention, it has huge potential in the biodiesel industry. The tree originates in Africa and is tropical and drought-resistant. Although even some basic agronomic characteristics of shea are not yet fully understood, the plant enjoys booming interest, which may carry the risk of unsustainable practice. The GREET software from Argonne National Laboratory of the US Department of Energy (DOE) was used for the LCA. It is a very useful tool specifically designed for LCA focused on the energy use and emissions of different production processes, including biodiesel production; it is managed by a DOE research laboratory and is made available for public use. The software allows users to model many existing fuel production processes. To perform an LCA on a feedstock new to the GREET database, such as shea butter, some of the requisite information and data had to be sent to Argonne National Laboratory personnel for input; for such a new feedstock it is therefore important to work with Argonne National Laboratory to perform the LCA.
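The well-to-pump scope described here amounts to summing energy use and emissions over the stages of the fuel pathway. A toy aggregation in that spirit, with placeholder stage names and numbers that are neither data from the thesis nor from GREET:

```python
def well_to_pump_totals(stages):
    """Sum per-stage energy use (MJ) and CO2 emissions (g), both expressed
    per MJ of finished fuel delivered at the pump."""
    energy_mj = sum(s["energy_mj"] for s in stages.values())
    co2_g = sum(s["co2_g"] for s in stages.values())
    return energy_mj, co2_g

# Placeholder pathway for shea butter biodiesel (illustrative values only):
stages = {
    "cultivation_and_harvest": {"energy_mj": 0.05, "co2_g": 4.0},
    "butter_extraction":       {"energy_mj": 0.10, "co2_g": 8.0},
    "transesterification":     {"energy_mj": 0.15, "co2_g": 12.0},
    "transport":               {"energy_mj": 0.02, "co2_g": 1.5},
}
energy, co2 = well_to_pump_totals(stages)
print(round(energy, 2), round(co2, 1))   # 0.32 25.5
```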
APA, Harvard, Vancouver, ISO, and other styles
19

Cirqueira, Alexandre Correia. "Um Mecanismo de Segurança com Adaptação Dinâmica em Tempo de Execução para Dispositivos Móveis." reponame:Repositório Institucional da UFC, 2011. http://www.repositorio.ufc.br/handle/riufc/16904.

Full text
Abstract:
CIRQUEIRA, Alexandre Correia. Um Mecanismo de Segurança com Adaptação Dinâmica em Tempo de Execução para Dispositivos Móveis. 2011. 110 f. : Dissertação (mestrado) - Universidade Federal do Ceará, Centro de Ciências, Departamento de Computação, Fortaleza-CE, 2011.
The increasing use of mobile devices, wireless networks and mobile applications highlights the importance of ensuring information security. This concern arises from the risks involved in transmitting sensitive information over wireless links, since the medium does not limit the risk of attacks as conventional networks do. Additionally, the trend toward sustainable practices advocated by Green Computing imposes the need to design flexible applications that seek to reduce the consumption of resources such as energy. Thus, mechanisms providing confidentiality for information travelling over the wireless medium should consider the efficient allocation of computing resources. This is a key issue to be considered when designing secure mobile applications. Protection mechanisms should therefore balance the required security level against the resources consumed to provide it. Information characterizing the current situation (context) can assist in this task: protection matched to the applications' security requirements and combined with context can identify situations in which the security level needs to be raised or lowered in order to reduce the device's resource consumption. This work proposes a Security Mechanism with Dynamic Adaptation (MeSAD), focused on confidentiality, able to adapt the security level according to the context and reduce the resource consumption of mobile devices. The main objective is to find the balance point in the tradeoff between security level and resource consumption. To achieve this goal, this work also presents a tool that supports the use of MeSAD during the development of mobile applications and enables assessment of the performance of the cryptographic algorithms used on different devices.
APA, Harvard, Vancouver, ISO, and other styles
20

Chevalier, Arthur. "Optimisation du placement des licences logicielles dans le Cloud pour un déploiement économique et efficient." Thesis, Lyon, 2020. http://www.theses.fr/2020LYSEN071.

Full text
Abstract:
This thesis takes place in the field of Software Asset Management (SAM): the management of licenses, usage rights, and compliance with contractual rules. When it comes to proprietary software, these rules are often misinterpreted or totally misunderstood. In exchange for the freedom to license usage as one sees fit, within the terms of the contract, publishers retain the right to audit. They can check that the rules are being followed and, if they are not, impose penalties, often financial ones. This can lead to disastrous situations such as the lawsuit between AbInBev and SAP, where the latter claimed a USD 600 million penalty. The emergence of the Cloud has greatly aggravated the problem because software usage rights were not originally intended for this type of architecture. After an academic and industrial history of SAM, from its roots to the most recent work on the Cloud and software identification, we look at the licensing methods of major publishers such as Oracle, IBM and SAP before introducing the various problems inherent in SAM. The lack of standardization in metrics, specific usage rights, and the difference in paradigm brought about by the Cloud, and soon by the virtualized network, make the situation more complicated than it already was. Our research is oriented toward modeling these licenses and metrics in order to abstract away the legal and blurry side of contracts. This abstraction allows us to develop software placement algorithms that ensure contractual rules are respected at all times. The licensing model also allows us to introduce a deployment heuristic that optimizes several criteria at the time of software placement, such as performance, energy and license cost. We then introduce the problems associated with deploying multiple software products at the same time while optimizing these same criteria, and prove the NP-completeness of the associated decision problem.
To meet these criteria, we present a placement algorithm that approaches the optimum and uses the above heuristic. In parallel, we have developed a SAM tool that builds on this research to offer automated, fully generic software management in a Cloud architecture. All of this work has been conducted in collaboration with Orange and tested in several proofs of concept before being fully integrated into the SAM tool
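A greedy, single-software version of such a placement heuristic can be sketched as follows. Host names, metrics and weights are invented, and the algorithms in the thesis handle multi-software deployment and far richer licence metrics; this only shows the shape of the trade-off (licence cost, energy, performance) under a hard compliance constraint:

```python
def place(software, hosts, w_cost=1.0, w_energy=1.0, w_perf=1.0):
    """Pick the compliant host with the lowest weighted score.

    Compliance (license_ok) is a hard constraint; licence cost and energy
    are minimised, while higher performance lowers the score.
    """
    best_name, best_score = None, float("inf")
    for h in hosts:
        if not h["license_ok"](software):   # never violate contract rules
            continue
        score = (w_cost * h["license_cost"](software)
                 + w_energy * h["energy"]
                 + w_perf / h["perf"])
        if score < best_score:
            best_name, best_score = h["name"], score
    return best_name

hosts = [
    {"name": "vm-a", "energy": 2.0, "perf": 4.0,
     "license_ok": lambda s: True,  "license_cost": lambda s: 10.0},
    {"name": "vm-b", "energy": 1.0, "perf": 2.0,
     "license_ok": lambda s: True,  "license_cost": lambda s: 3.0},
    {"name": "vm-c", "energy": 0.5, "perf": 8.0,
     "license_ok": lambda s: False, "license_cost": lambda s: 1.0},
]
print(place("database", hosts))   # vm-b (vm-c is cheaper but non-compliant)
```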
APA, Harvard, Vancouver, ISO, and other styles
21

Karathanasopoulos, Andreas. "Modeling and trading the Greek stock market with artificial intelligence models." Thesis, Liverpool John Moores University, 2011. http://researchonline.ljmu.ac.uk/6106/.

Full text
Abstract:
The main motivation for this thesis is to introduce some new methodologies for predicting the directional movement of financial assets, with an application to the ASE20 Greek stock index. Specifically, we use several alternative computational methodologies, namely an Evolutionary Support Vector Machine (ESVM), Gene Expression Programming, Genetic Programming algorithms and two hybrid combinations of linear and nonlinear models, for modeling and trading the ASE20 Greek stock index, using as inputs previous values of the ASE20 index and of four other financial indices. For comparison purposes, the trading performance of the ESVM stock predictor, Gene Expression Programming, Genetic Programming algorithms and the two hybrid methodologies has been benchmarked against four traditional strategies (a naïve strategy, a buy-and-hold strategy, a MACD model and an ARMA model) and a Multilayer Perceptron (MLP) neural network model. As it turns out, the proposed methodologies produced a higher trading performance in terms of annualized return and information ratio, while providing information about the relationship between the ASE20 index and other foreign indices.
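The two benchmark statistics named here, annualized return and information ratio, can be computed from a series of periodic strategy returns. A minimal sketch with invented daily returns (not data from the thesis):

```python
import math

def annualized_return(daily_returns, periods_per_year=252):
    """Mean daily return scaled to a yearly figure (simple, not compounded)."""
    return sum(daily_returns) / len(daily_returns) * periods_per_year

def information_ratio(daily_returns, periods_per_year=252):
    """Mean return over its sample standard deviation, annualized."""
    n = len(daily_returns)
    mean = sum(daily_returns) / n
    var = sum((r - mean) ** 2 for r in daily_returns) / (n - 1)
    return mean / math.sqrt(var) * math.sqrt(periods_per_year)

returns = [0.001, -0.002, 0.003, 0.0015, -0.0005]   # invented daily returns
print(round(annualized_return(returns), 4))   # 0.1512
```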
APA, Harvard, Vancouver, ISO, and other styles
22

Poulsen, Andrew Joseph. "Real-time Adaptive Cancellation of Satellite Interference in Radio Astronomy." Diss., CLICK HERE for online access, 2003. http://contentdm.lib.byu.edu/ETD/image/etd238.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Crombette, Pauline. "Contribution des technologies satellitaires Pléiades à l'étude des trames vertes urbaines : entre maintien des connectivités écologiques potentielles et densification des espaces urbains." Thesis, Toulouse 2, 2016. http://www.theses.fr/2016TOU20032/document.

Full text
Abstract:
In urban areas, competition between land development and ecological conservation is intense. To assist decision making, better knowledge of the areas of conflict is required. Given the inadequacy of the data and methods available for ecological network mapping in urban areas, the aim of our study is to develop a method for semi-automatic vegetation extraction from Very High Spatial Resolution (VHSR) Pléiades imagery. Initially applied to training samples, the process is then deployed over four French study areas (Toulouse, Muret, Pierrefite-Nestalas and Strasbourg). The reproducibility of this method over large urbanized areas is ensured by its simplicity and by the results of a pixel-based classification (kappa coefficient higher than 85%). The extraction workflow uses free or open-source software. The vegetation data are then used to model potential ecological connectivity in Toulouse's urban and peri-urban areas. Impacts on biodiversity due to urban planning are assessed using graph theory; the "Boulevard Urbain Nord de Toulouse" road infrastructure project is studied. Graph metrics were calculated to assess the level of connectivity at the habitat-patch and landscape scales. We ranked the importance of the patches and cross-tabulated it with planning documents (PLU, local town-planning plans) in order to locate urban areas of conflict between biodiversity preservation and urbanization. Depending on the issues set out by local actors and seen through this applied filter, this thesis proposes a robust analytical tool and decision-making aid for landscape management and land planning.
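The graph-based connectivity analysis described here can be caricatured in a few lines: habitat patches become nodes, and an edge joins two patches whose distance is within a dispersal threshold. Patch names, coordinates and the threshold below are invented; the thesis applies proper connectivity metrics to real land-cover data.

```python
from itertools import combinations

def build_edges(patches, max_dist):
    """Link patches closer than a dispersal distance (Euclidean, map units)."""
    return [(a, b)
            for (a, pa), (b, pb) in combinations(patches.items(), 2)
            if ((pa[0] - pb[0]) ** 2 + (pa[1] - pb[1]) ** 2) ** 0.5 <= max_dist]

def count_components(nodes, edges):
    """Connected components via union-find: fewer = better-connected landscape."""
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a, b in edges:
        parent[find(a)] = find(b)
    return len({find(n) for n in nodes})

patches = {"park": (0, 0), "riverbank": (1, 0), "wood": (5, 5)}  # invented
edges = build_edges(patches, max_dist=2.0)
print(edges)                                    # [('park', 'riverbank')]
print(count_components(list(patches), edges))   # 2
```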
APA, Harvard, Vancouver, ISO, and other styles
24

Dais, Sofoklis, and Dimitrios Stylianidis. "Collaborate? Let me check if I need you right now! : Collaboration and openness initiatives and activities in six Greek start-ups." Thesis, Högskolan i Gävle, Avdelningen för Industriell utveckling, IT och Samhällsbyggnad, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-20407.

Full text
Abstract:
Context: Start-ups have recently emerged as an operational model for small and newly founded firms globally. This growing business acceptance is present within the European markets, as well as within the Greek market. Research has followed the same upward course as practice, and several start-up-related topics are now well documented; nevertheless, the start-up literature still shows certain limitations that need to be addressed. Theory: This study performs an extensive review of the start-up literature, provides definitions and descriptions of key start-up characteristics, and identifies the main streams and limitations of start-up research, as well as cases of actual start-ups within the Greek business reality. Purpose: The purpose of this study is to provide insight on certain literature limitations by examining start-up customs towards collaboration and openness initiatives and activities. More specifically, the study aims to identify whether start-ups are able to collaborate (the newness and smallness paradox), what the extent (breadth and depth, partner variety, and collaboration content) of their collaboration and openness customs with different partners is, the individual importance of specific partners, and the ways this importance changes through different phases of start-up growth. Documented matters such as the determinants of collaboration and the internal organizational structure of start-ups towards openness and collaboration are also discussed. Design/Methodology/Approach: A multiple-case study following the replication logic is performed. The study focuses on six Greek online start-ups, and extracts information initially from the firms' websites, and then by interviewing one key employee in each start-up.
The combined information from each case is cross-analysed so that behavioural patterns emerge and conclusions can be drawn regarding start-up initiatives and activities towards collaboration and openness. Findings: Start-ups are indeed able to collaborate and practice openness with external partners from the beginning, and the collaboration and openness are closely related to the desired outcome/collaboration content that fulfils a specific need. This desired outcome is connected to the extent (breadth and depth) of the collaboration, but also to the type of each partner. Thus, start-ups closely collaborate with few and selected partners of each kind (e.g. universities, suppliers etc.), with the exception of customers and users: collaboration with customers and users is wide but limited to their feedback, owing to their numbers. Customers, users, suppliers, innovation intermediaries, and universities were identified as the most important partners to start-ups, and the importance of these partners is connected to the start-up growth lifecycle. Innovation intermediaries are the most important start-up partners, while customers, users and suppliers are important from the stabilization phase onwards and throughout the whole start-up lifecycle. The importance of universities was not found to be clearly connected to the start-up growth phases, but mostly to the collaboration content. Some differences in importance may occur among start-ups active in different industries; those differences are industry-specific and affect how and when each start-up collaborates with different partners. Finally, this study confirmed the propositions of previous studies regarding the determinants and internal organizational structure towards collaboration and openness with external partners.
Research limitations and implications: Although the present study shows a set of limitations mostly regarding the number and distribution of the cases, it is the authors’ belief that it also shows a set of theoretical and practical implications. It provides managers and researchers with findings on uncharted territories in start-up literature, it connects its findings to prior start-up research, and provides insight on the almost undeveloped literature on Greek start-ups.
25

Grein, Dirceu. "Uma contribuição para a integração do sistema legados da saúde pública do Brasil usando agentes de software / Dirceu Grein ; (orientador, João da Silva Dias ; co-orientador, Edson E. Scalabrin)." reponame:Biblioteca Digital de Teses e Dissertações da PUC_PR, 2005. http://www.biblioteca.pucpr.br/tede/tde_busca/arquivo.php?codArquivo=1630.

Full text
Abstract:
Dissertation (Master's) - Pontifícia Universidade Católica do Paraná, Curitiba, 2005
Bibliography: f. 131-146
The physical and segmented distribution of health areas, among a great number of organisations, characterises public health as a distributed system with several sources of information. This work presents a software-agent approach to integrate…
26

Moreno, Moreno Flavio David. "Reconocimiento de gestos corporales, utilizando procesamiento digital de imágenes para activar sistema de alarma." Bachelor's thesis, Universidad Ricardo Palma, 2015. http://cybertesis.urp.edu.pe/handle/urp/1283.

Full text
Abstract:
This research on electronic security systems for buildings has as its main objective the recognition of three gestures from the body language of surveillance personnel, and the consequent automatic activation of an alarm. Initially, a survey of building administrations and staff was carried out to learn which occurrences compromised the security of a multi-family building; the images captured by a surveillance camera located at the reception were then observed and analysed, identifying the most vulnerable occurrences and the gestures associated with those events. Three gestures that the surveillance personnel performed unconsciously in such situations were selected. Spatial-processing techniques were applied to selected frames of these images, helped by artificial lighting that was more intense behind the subject of analysis, yielding a binarised silhouette in the Matlab environment. Techniques such as red-plane selection, the most-significant bit plane, image inversion and morphological closing transformations defined a silhouette that supported a mathematical algorithm generating an electrical signal on the computer's USB serial port, to which an Arduino hardware platform that activates the alarm was physically connected. This platform was chosen because Matlab provides a set of instructions for Arduino, achieving synchronised communication between computer and interface. The techniques used recognised 62.5% of the events described in the surveys, events that are not mentioned in similar research. To achieve the objective it was necessary to analyse one frame per second.
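The spatial-processing chain described in this abstract (red-plane selection, most-significant bit plane, inversion, morphological closing) can be sketched in pure Python on a toy image. The thesis used Matlab; everything here, including the 5x5 test scene with a backlit background, is an illustrative assumption.

```python
def red_plane(img):
    """Keep only the red channel of an RGB image (rows of (r, g, b) tuples)."""
    return [[px[0] for px in row] for row in img]

def msb_plane(gray):
    """Most-significant bit plane of an 8-bit image: 1 where value >= 128."""
    return [[1 if v >= 128 else 0 for v in row] for row in gray]

def invert(binary):
    return [[1 - v for v in row] for row in binary]

def _window(binary, keep):
    """Apply a 3x3 window operator; border pixels use the truncated window."""
    h, w = len(binary), len(binary[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            neigh = [binary[i + di][j + dj]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if 0 <= i + di < h and 0 <= j + dj < w]
            out[i][j] = keep(neigh)
    return out

def closing(binary):
    """Morphological closing: dilation followed by erosion."""
    dilated = _window(binary, lambda n: 1 if any(n) else 0)
    return _window(dilated, lambda n: 1 if all(n) else 0)

# Toy scene: bright backlit background (red 200) and dark subject pixels
# (red 50) with a one-pixel gap that the closing should fill.
img = [[(200, 0, 0)] * 5 for _ in range(5)]
img[2][1] = img[2][3] = (50, 0, 0)
silhouette = closing(invert(msb_plane(red_plane(img))))
```

Because the subject is darker than the backlit background in the red plane, its pixels fall below the MSB threshold and the inverted bit plane becomes the silhouette; the closing then fills small gaps in it.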
27

Rua, Rui António Ramada. "GreenSource: repository tailored for green software analysis." Master's thesis, 2018. http://hdl.handle.net/1822/64275.

Full text
Abstract:
Master's dissertation in Computer Science
Both energy consumption analysis and energy-aware development have gained the attention of developers and researchers over the past years. The interest is most notorious with the proliferation of mobile devices, where energy is a key concern. There is an identified gap in tools and information to detect and identify anomalous energy consumption in Android applications. Most existing tools are based on external hardware (costly solutions in terms of setup time), on predictive models (requiring previous hardware calibration), or on static code-analysis methods. So far we could not identify a tool capable of monitoring all the relevant system resources and components that an application uses and of attributing its energy consumption, while being easily integrated with the application and/or its development environment. A natural consequence of this lack of tooling is the lack of information about the energy consumption of applications and the factors that can influence it. This dissertation carries out a study on the energy consumption of applications and mobile devices on the Android platform, for which the GreenSource infrastructure was developed: a repository containing the source code, representative metadata and metrics for a large number of applications (and their executions on physical devices). In order to gather the results, an auxiliary tool was developed to automate the testing process and collect the respective results for each application. This tool is a software-based solution, obtaining consumption results from executions performed directly on a physical device running the Android platform. The developed framework, AnaDroid, can perform static and dynamic analysis of an application, monitoring power consumption and resource usage for each application through test execution.
This follows a white-box testing approach, testing applications at the source-code level. AnaDroid invokes calls to the TrepnLib library at strategic locations of the application code (through instrumentation techniques) to gain control over relevant portions of the source code, such as methods and unit tests. In this way the programmer obtains results about the use, state and consumption of resources such as energy, CPU, GPU, memory and sensors, and about the complexity of the developed test cases. The information gathered by running AnaDroid over a large set of applications was stored in the GreenSource backend. With the collected results, we expect to be able to characterize and classify applications, as well as the tests developed for them. This is intended to be made publicly available and to serve as a reference for future works and studies.
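AnaDroid's instrumentation idea, wrapping strategic code locations with profiler start/stop calls, can be sketched with a Python decorator. Here the standard library's tracemalloc and wall-clock time stand in for the on-device energy readings that TrepnLib provides, so this is a shape-of-the-technique sketch, not the actual tool.

```python
import time
import tracemalloc
from functools import wraps

# method name -> list of (elapsed_seconds, peak_allocation_kib) samples
PROFILE = {}

def instrument(fn):
    """Record wall time and peak allocation for each call, analogous to
    injecting profiler start/stop calls around methods and unit tests."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        tracemalloc.start()
        t0 = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            elapsed = time.perf_counter() - t0
            _, peak = tracemalloc.get_traced_memory()
            tracemalloc.stop()
            PROFILE.setdefault(fn.__name__, []).append((elapsed, peak / 1024))
    return wrapper

@instrument
def build_list(n):
    return [i * i for i in range(n)]

build_list(10_000)
```

After the call, PROFILE holds one sample per instrumented invocation, the same per-method granularity the abstract describes for resource monitoring.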
Both energy consumption analysis and energy-aware development have been attracting the attention of developers and researchers in recent years. The interest is most visible with the proliferation of mobile devices, where energy is a key but still little-explored concern. As such, there are identified gaps in tools and information to detect and identify anomalous energy consumption in Android applications. A large part of the existing tools are based on external hardware (costly solutions in terms of setup time), on predictive models (which require prior calibration), or on static code-analysis methods. We could not identify so far a tool able to precisely monitor all the relevant system resources and components used by an application, and to determine its energy consumption. A natural consequence of this gap is the lack of information about the energy consumption of applications and the factors that can influence it. This dissertation studies energy consumption on the Android platform, for which the GreenSource infrastructure was developed. It contains a repository comprising the source code, results and metrics for a large number of applications. In order to obtain illustrative results for a large number of applications, a tool was developed to automate the testing process and gather the respective results. The developed tool is software-based, obtaining consumption results from executions performed directly on a physical Android device. This framework, called AnaDroid, can analyse applications statically and dynamically, and monitor resource usage and consumption during their execution.
To this end, calls to a library called TrepnLib are injected at strategic locations of the application code to gain control over its relevant parts. This yields results about the use, state and consumption of resources such as energy, CPU, GPU, memory and sensors. The information gathered by running AnaDroid was stored in the GreenSource database. With all the collected results, the goal is to characterise and classify, in energy terms, applications and the tests developed for them. These results are to be made openly available, so that they can serve as a reference for future works, analyses and studies.
28

Pereira, Rui Alexandre Afonso. "Energyware engineering: techniques and tools for green software development." Doctoral thesis, 2018. http://hdl.handle.net/1822/59013.

Full text
Abstract:
Doctoral thesis in Informatics (MAP-i)
Energy consumption is nowadays one of the most important concerns worldwide. While hardware is generally seen as the main culprit for a computer's energy usage, software too has a tremendous impact on the energy spent, as it can cancel the efficiency introduced by the hardware. Green Computing is not a new field of study, but the focus has been, until recently, on hardware. While there have been advancements in Green Software techniques, there is still not enough support for software developers to make their code more energy-aware, with various studies arguing there is both a lack of knowledge and a lack of tools for energy-aware development. This thesis tackles these two problems and aims to push forward research on Green Software. The software energy-consumption issue is approached as a software-engineering question. By using systematic, disciplined, and quantifiable approaches to the development, operation, and maintenance of software, we defined several techniques, methodologies, and tools within this document. These focus on providing software developers more knowledge and tools to help with energy-aware software development, or Energyware Engineering. Insights are provided on the energy influence of several stages of a software development process. We look at the energy efficiency of various popular programming languages, establishing which are the most appropriate when a developer's concern is energy consumption. A detailed study on the energy profiles of different Java data structures is also presented, along with a technique and tool, providing further knowledge on the energy-efficient alternatives a developer can choose from. To help developers with the lack of tools, we defined and implemented a technique to detect energy-inefficient fragments within the source code of a software system.
This technique and tool have been shown to help developers improve the energy efficiency of their programs, even outperforming a runtime profiler. Finally, answers are provided to common questions and misconceptions within this field of research, such as the relationship between time and energy, and how one can improve a program's energy consumption. This thesis makes a substantial effort to support both research and education on this topic, helps green software continue to grow out of its infancy, and contributes to solving the lack of knowledge and tools that exists for Energyware Engineering.
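One way to picture a static detector of energy-inefficient fragments is an AST scan for a known costly pattern, such as string concatenation inside a loop. The checker below is a hypothetical Python analogue of such a tool, not the thesis's actual detector; a real tool would also type-check the operands (this sketch flags any `+=` in a loop).

```python
import ast

class ConcatInLoop(ast.NodeVisitor):
    """Flag `s += ...` inside a for/while loop: repeated concatenation is a
    classic time- and energy-hungry fragment (prefer ''.join)."""
    def __init__(self):
        self.findings = []      # line numbers of suspicious fragments
        self._loop_depth = 0

    def _visit_loop(self, node):
        self._loop_depth += 1
        self.generic_visit(node)
        self._loop_depth -= 1

    visit_For = visit_While = _visit_loop

    def visit_AugAssign(self, node):
        if self._loop_depth and isinstance(node.op, ast.Add):
            self.findings.append(node.lineno)
        self.generic_visit(node)

src = """
out = ""
for word in words:
    out += word
"""
checker = ConcatInLoop()
checker.visit(ast.parse(src))
```

Running the checker over the snippet reports the line of the `out += word` statement as a candidate energy hot spot.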
Energy consumption is nowadays one of the biggest concerns worldwide. Although hardware is, in general, the main culprit for a computer's energy consumption, software also has a significant impact on the energy spent, since it can partly cancel the efficiency introduced by the hardware. Although Green Computing is not a new research area, the focus has until recently been on the hardware component. While Green Software techniques have been evolving, there is still not enough support for programmers to produce energy-conscious code. In fact, several studies argue that there is both a lack of knowledge and a shortage of tools for energy-conscious development. This thesis addresses these two problems and focuses on advancing green software. The topic of energy consumption is approached from a software-engineering perspective. Through systematic, disciplined and quantifiable approaches to the process of developing, operating and maintaining software, it was possible to define new methodologies and tools, presented in this document. These tools and methodologies focus on equipping software programmers with knowledge and tools, so as to support energy-conscious development, or Energyware Engineering. This work yields an understanding of the energy influence of the different phases of the software development process. We examine the most popular programming languages from an energy-efficiency point of view, establishing which are the most appropriate when the programmer is concerned with energy consumption.
We also present a detailed study on the energy profiles of different data structures in Java, accompanied by techniques and tools, providing knowledge of the energy-efficient alternatives available to programmers. To help programmers, we defined and implemented a technique to detect energy-inefficient fragments within the source code of a software system. This technique and tool have been shown to help programmers improve the energy efficiency of their programs, in some cases outperforming a runtime profiler. Finally, answers are given to wrongly formulated questions and conceptions within this research area, such as the relationship between time and energy and how the energy consumption of software can be improved. This thesis represents a determined effort to support both research and education on this topic, helping green computing to mature and grow, and contributing to closing the gap in knowledge and tools to support Energyware Engineering.
This work is partially funded by FCT – Foundation for Science and Technology, the Portuguese Ministry of Science, Technology and Higher Education, through national funds, and co-financed by the European Social Fund (ESF) through the Operational Programme for Human Capital (POCH), with scholarship reference SFRH/BD/112733/2015. Additionally, funding was also provided by the ERDF – European Regional Development Fund – through the Operational Programmes for Competitiveness and Internationalisation COMPETE and COMPETE 2020, and by the Portuguese Government through FCT project Green Software Lab (ref. POCI-01-0145-FEDER-016718); by the project GreenSSCM - Green Software for Space Missions Control, a project financed by the Innovation Agency, SA, Northern Regional Operational Programme, Financial Incentive Grant Agreement under the Incentive Research and Development System, Project No. 38973; and by the Luso-American Foundation in collaboration with the National Science Foundation with grants FLAD/NSF ref. 300/2015 and ref. 275/2016.
29

Wang, Yu-Fang, and 王瑜芳. "Test and Verification of a Green Production Management Software System." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/14979615038405552070.

Full text
Abstract:
Master's thesis
Chung Yuan Christian University
Institute of Industrial and Systems Engineering
97
The EU RoHS directive has a worldwide influence and poses a serious challenge to Taiwan's electrical and electronics manufacturing industries. These industries therefore need to address RoHS issues and do appropriate management planning to keep their competitive advantage. Our research team has developed a Green Production Management System in the hope of helping Taiwan's electrical and electronics manufacturers control the quality of green products, assure that products conform to the EU RoHS standard, and avoid unnecessary penalties. The target of this research is to assure that the Green Production Management System meets the requirements we set up beforehand and conforms to users' requests; we use testing and verification to prove whether the system achieves this target. In this research, integration testing and system testing are the main topics discussed. Through integration testing, we assured that the system reaches the predefined requirements; through system testing, we assured that it conforms to users' requests. We conclude that the system reaches the target set for it and largely meets users' demands. In the future, we hope the system can be tested in industry so that its outcomes can be checked, improved, and applied widely.
30

Cheng, Li-Hao, and 鄭力豪. "Green OpenFlow Packet Classification based on TCAM and Optimal Tree Covers in Software-Defined Networks." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/13187282689026645281.

Full text
Abstract:
Master's thesis
National Chiao Tung University
Institute of Communications Engineering
104
In this thesis, we propose an algorithm that divides classification rules into buckets. When a switch receives a packet, it uses its flow table to decide how to process the packet. Placing the entire flow table in TCAM consumes too much power, so we use our algorithm to divide the rules: the switch first matches an index TCAM to decide which bucket to open, and the buckets that are not opened can be powered off to save energy. The OpenFlow standard attaches counters to each rule, so we know how many packets match each rule; we use this knowledge to estimate the probability that each rule is matched, and feed these probabilities into our algorithm. The objective of the algorithm is to open as few entries as possible in each search. We build a tree to represent the rules and use a dynamic-programming algorithm to divide the rules into buckets. Simulation results show that our algorithm greatly reduces the power consumed by the TCAM.
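The bucket-partitioning idea can be sketched as a small dynamic program: given per-rule match probabilities (in the abstract, derived from OpenFlow counters), split an ordered rule list into contiguous buckets so that the expected number of powered-on TCAM entries per lookup is minimal. The thesis works on tree covers; this flat-list version and its toy probabilities are a simplifying assumption.

```python
from functools import lru_cache

def partition_rules(probs, max_buckets):
    """Split an ordered rule list into contiguous buckets minimising the
    expected number of TCAM entries searched per packet:
        cost = sum over buckets of P(bucket) * |bucket|
    since only the bucket selected by the index TCAM is powered on."""
    n = len(probs)
    prefix = [0.0]
    for p in probs:
        prefix.append(prefix[-1] + p)

    @lru_cache(maxsize=None)
    def best(i, k):
        # Minimal cost of covering rules i..n-1 with at most k buckets,
        # plus the bucket boundaries (end indices) achieving it.
        if i == n:
            return 0.0, ()
        if k == 0:
            return float("inf"), ()
        options = []
        for j in range(i + 1, n + 1):       # bucket = rules i..j-1
            bucket_cost = (prefix[j] - prefix[i]) * (j - i)
            rest_cost, rest_cuts = best(j, k - 1)
            options.append((bucket_cost + rest_cost, (j,) + rest_cuts))
        return min(options)

    return best(0, max_buckets)

# Toy flow table: one hot rule, three cold ones.
cost, cuts = partition_rules([0.7, 0.1, 0.1, 0.1], max_buckets=2)
```

With two buckets, isolating the hot rule is optimal: the expected entries searched drop from 4.0 (one big bucket) to 0.7·1 + 0.3·3 = 1.6.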
31

Sun, Yu-Wen, and 孫郁文. "A Study on Future Classroom: Applying Innovative Green Cloud-based Education Software Services System to Natural Science in Junior High School." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/9xmsy4.

Full text
Abstract:
Master's thesis
National Taipei University of Technology
Graduate Institute of Technological and Vocational Education
101
The cloud-based teaching service system is a highly interactive system that complements the Classroom of the Future. The present study adopted a quasi-experimental pretest-posttest control-group design and explored whether a cloud-based teaching service system was useful to the instructors who used it and to the students, in terms of natural-science learning and learning attitude. Sixty-two eighth graders from two intact classes in New Taipei City participated in the study. One class was assigned as the experimental group and the other as the control group. The experimental group's natural-science instruction was designed around the cloud-based teaching service system, while the control group was instructed with traditional teaching methods. Both groups were given a pretest and a posttest on natural science as well as a learning-attitude questionnaire. The results showed that students in the experimental group performed significantly better on the natural-science posttest and exhibited a more positive learning attitude than the control-group students. Students felt that with the cloud-based teaching service system their class became livelier, more interactive, and safer. The instructors expressed concerns about the system's stability; however, they were positive about the system and its use in natural-science teaching.
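The quasi-experimental comparison described above boils down to testing whether the experimental class's score gains exceed the control class's. A standard-library sketch with Welch's t statistic on hypothetical gain scores (all numbers invented for illustration, not from the study):

```python
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent
    samples; compare t against a t-table for significance."""
    va, vb = stdev(a) ** 2 / len(a), stdev(b) ** 2 / len(b)
    t = (mean(a) - mean(b)) / (va + vb) ** 0.5
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Hypothetical gain scores (posttest minus pretest) for the two classes.
experimental = [12, 15, 9, 14, 11, 13]
control = [6, 8, 5, 9, 7, 6]
t, df = welch_t(experimental, control)
```

A large positive t here would correspond to the study's finding that the experimental group improved significantly more than the control group.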
32

Ghonaim, Fahad A. "Adaptive router bypass techniques to enhance core network efficiency." Thesis, 2018. https://dspace.library.uvic.ca//handle/1828/9283.

Full text
Abstract:
Internet traffic is increasing exponentially, driven by new technologies such as the Internet of Things (IoT) and rich streaming media. The traditional IP router becomes a bottleneck for further Internet expansion due to its high power consumption and its inefficiency in processing the growing traffic. Router bypass has been introduced to overcome the capacity limitations and processing costs of IP routers: a portion of traffic is provisioned to bypass the router and is switched by the transport layer. Router bypass has been shown to provide significant savings in network costs. These advantages are limited by a reduction in statistical multiplexing, since the available bandwidth is typically subdivided into bypass and traditional portions, which limits interest in bypass techniques. This thesis explores multiple techniques to enhance the efficiency of router bypass. The main goals are to address the reduction in statistical multiplexing and to make the router-bypass mechanism dynamic. Recent advancements in the Optical Transport Network (OTN) play a major role in the transport network. This work takes full advantage of OTN in the router-bypassing context by applying recent developments such as Hitless Adjustment of ODUflex (HAO), which allows provisioned channels to be resized without re-establishing the connections, and makes the bypass mechanism flexible enough to meet future traffic behaviour. The thesis studies multiple approaches to enhance the router-bypass mechanism, including an adaptive provisioning style using various degrees of provisioning granularity and controlling the provisioning based on traffic behaviour. In addition, it explores the impact of automation in Software-Defined Networking (SDN) on router bypass.
The application-driven infrastructure in SDN is moving the network towards being more adaptive, which paves the way for an enhanced implementation of router bypass. Many challenges still face the industry in fully integrating the three layers (3, 2, and 1) to transform the current infrastructure into an adaptive, application-driven network: the IP router (layer 3) provisions and restores connections regardless of the underlying layers (2 and 1), and the transport layer does the same regardless of the IP layer. Although allowing every layer to develop without being constrained by the other layers offers a huge advantage, it renders the transport layer static and not fully aware of traffic behaviour. It is my hope that this thesis is a step forward in transforming the current network into a dynamic, efficient and responsive network. A simulation was built to imitate the router-bypassing concept, and numerous measurements were recorded.
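The adaptive provisioning idea described in this abstract, resizing the bypass channel as traffic changes in the spirit of ODUflex hitless adjustment, can be sketched as a simple threshold controller. The thresholds, growth factor and demand trace below are illustrative assumptions, not values from the thesis.

```python
def adjust_bypass(capacity, demand, step=1.25, low=0.5, high=0.9):
    """Threshold-based resizing of a bypass channel: grow before the
    channel saturates, shrink when it sits mostly idle."""
    util = demand / capacity
    if util > high:
        return capacity * step                       # grow hitlessly
    if util < low:
        return max(demand * step, capacity / step)   # reclaim idle bandwidth
    return capacity

# Drive the controller with a synthetic demand trace (arbitrary units).
cap, trace = 10.0, [3, 6, 9.5, 9.8, 4, 2]
history = []
for d in trace:
    cap = adjust_bypass(cap, d)
    history.append(round(cap, 2))
```

The history shows the channel shrinking under light load, growing ahead of the demand peak, and releasing capacity again afterwards, which is the statistical-multiplexing recovery the thesis is after.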
Graduate
33

Παππάς, Ιωάννης. "Απομακρυσμένη διαχείριση συστημάτων και δικτύων και εφαρμογής της στο πανελλήνιο σχολικό δίκτυο." Thesis, 2009. http://nemertes.lis.upatras.gr/jspui/handle/10889/1774.

Full text
Abstract:
This thesis addresses the remote management of systems and networks as a service in a large network such as the Greek School Network. It records the factors that affect the management of the systems of a network and the development of a centralised remote-management service for computing systems over wide-area IP networks. The existing architecture of such a service was studied, and the service was then redesigned and implemented. In parallel, the adaptability of one commercial tool and one open-source tool was examined. The deployment results of both tools are then presented and compared. Finally, software development practices (agile development) are proposed, based on state-of-the-art developments in the management of such teams, showing how they can be applied to the remote-management service and give it a long-term lifecycle of operation and development.
The subject of the thesis is the remote management of the systems of a network, as a service of the Greek School Network (GSN). The thesis presents the factors that influence the management of the systems of a network and the development of a central remote system-management service over wide-area IP networks. We studied the existing state of such a service and then proceeded to its redesign and implementation. We also studied the adaptability of one commercial tool and one open-source tool. The next step was to present the results of implementing the two tools and a comparison between them. In the end, we suggest software development processes according to the principles of agile development, and present how these techniques can give the remote system-management service a long life and sustained support.
34

Μπέκος, Βασίλειος. "Η ελληνική ιστορία στο Διαδίκτυο." Thesis, 2012. http://hdl.handle.net/10889/5376.

Full text
Abstract:
The subject of this master's thesis is the analysis of Greek History on the Internet, together with the creation of a complete website devoted to Greek History and, more specifically, to the history of the city of Nafpaktos. Nafpaktos is a city with a long and particularly interesting history, in which important events took place that influenced the history of the entire Greek nation, mainly during the Ottoman period (with the legendary Naval Battle of Lepanto in 1571). When a people is aware of its history, it is also aware of its identity. History seals the distinctiveness and the magnitude of a people's presence; consequently, a person who does not know the history of his homeland cannot be considered complete. Nowadays, the Internet has contributed significantly to the promotion, presentation and learning of the history of every place. Through the Internet, people from different places, cultures and customs can "visit" and get to know a country and the elements that have made it famous throughout the world. Greece, with its rich and long history, could naturally not be absent from the Internet. Greek History was, and remains, an object of study by historians from all over the world; indeed, for many, the ancient Greek civilisation is perhaps the greatest civilisation that has ever existed. There are certainly many websites devoted to Greek History. In this thesis an effort was made to give a complete picture of the current situation; for this reason, some of the most noteworthy Greek History websites on the Internet were presented. This thesis can thus serve as a useful guide for all those Internet users interested in visiting websites that will give them useful information about the history of Greece.
Furthermore, for the development of the website on the history of Nafpaktos, new open-source web technologies were used, in particular the Joomla! tool. (The XAMPP package was also used to install the Apache HTTP Server together with MySQL, PHP and Perl.) Joomla! is a free content management system (CMS). It is used to publish content on the World Wide Web and on local networks (intranets). Its basic characteristic is that the pages it serves are dynamic, that is, they are generated at the moment they are requested.
The purpose of this postgraduate thesis is the analysis of Greek History on the Internet and the development of a complete web site which will focus on Greek History and more specifically on the history of the city of Nafpaktos. Nafpaktos is a city with a long and very interesting history, in which important events occurred (especially during the Ottoman period, with the legendary Naval Battle of Lepanto in 1571) which affected the history of the entire Greek nation. When the people of a country have awareness of their history, then they are also aware of their identity. History seals the specificity and the magnitude of the presence of the people of a country. Therefore, when a person does not know the history of his country, then he can not be considered a complete person. Nowadays, the Internet has significantly contributed to the promotion and learning of the history of each country. With the Internet, people from different countries and with different cultures and traditions can "visit" and learn about the country and its details that have made it famous around the world. Of course, Greece could never be absent from the Internet with its rich and long history. Greek History was and still is the main point of the study of many historians around the world. Indeed, for many people the ancient Greek civilization is perhaps considered the most significant civilization that has ever existed. There are surely many web sites which are refered to Greek History. In this thesis, it was made an attempt to be given a complete picture of the current situation. For this reason, they were reported some of the most remarkable web sites of Greek History on the Internet. Thus, this thesis can be a useful guide for all the users of the Internet who are interested in visiting web sites which will give them useful information about the history of Greece. 
Also, they were used new Internet “open source” technologies and more specifically the web development tool Joomla!, for the development of the web site which concentrates on the history of Nafpaktos. (It was also used XAMPP for the installation of Apache HTTP Server, MySQL, PHP and Perl). Joomla! is a free Content Management System (CMS). It is used for publishing content on the World Wide Web (WWW) and on intranets. The key feature of Joomla! is that it displays dynamic web pages, namely web pages which are created when required.
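The dynamic-page model the abstract describes (content stored in a database, HTML assembled per request) can be sketched as follows. This is a minimal, hypothetical illustration in Python, not Joomla! code; the `ARTICLES` store and `render_page` function are invented stand-ins for a MySQL table and the CMS rendering layer:

```python
# Minimal sketch of a CMS-style dynamic page: content lives in a
# store (a dict standing in for a MySQL table) and the HTML is
# assembled only at the moment the page is requested.
ARTICLES = {
    "lepanto-1571": {
        "title": "The Naval Battle of Lepanto (1571)",
        "body": "A decisive naval battle fought off the coast of Nafpaktos.",
    },
}

def render_page(slug: str) -> str:
    """Build the HTML for one article at request time."""
    article = ARTICLES.get(slug)
    if article is None:
        return "<h1>404 Not Found</h1>"
    return f"<h1>{article['title']}</h1><p>{article['body']}</p>"
```

In a real CMS the lookup would hit the database and the markup would come from a template, but the principle is the same: no static HTML file exists for the page until it is requested.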
APA, Harvard, Vancouver, ISO, and other styles
