Dissertations / Theses on the topic 'Information modelling, management and ontologies'


Consult the top 50 dissertations / theses for your research on the topic 'Information modelling, management and ontologies.'


1

Nyqvist, Olof. "Information Management for Cutting Tools : Information Models and Ontologies." Doctoral thesis, Stockholm : Industriell produktion, Production Engineering, Kungliga Tekniska högskolan, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4763.

2

Afzal, Muhammad. "Modelling temporal aspects of healthcare processes with Ontologies." Thesis, Jönköping University, JTH, Computer and Electrical Engineering, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-12781.

Abstract:

This thesis presents an ontological model of the temporal aspects of a healthcare organisation. It provides information about activities that take place at different intervals of time at Ryhov Hospital. These activities are series of actions that may happen in a predefined sequence and at predefined times, or at any time, in a general ward or an emergency ward of the hospital.

To achieve this objective, the supervisor conducted a workshop at the start of the thesis, in which domain experts explained the main ideas behind ward activities. From this workshop the author gained considerable knowledge about the activities and their time aspects. The author then carried out a literature review to acquire further knowledge about ward activities, time aspects, and the methodological steps essential for building an ontological model. After the ontological model for time aspects had been developed, the supervisor conducted a second workshop, at which the author presented the model for evaluation.

3

Tchouanguem, Djuedja Justine Flore. "Information modelling for the development of sustainable construction (MINDOC)." Thesis, Toulouse, INPT, 2019. http://www.theses.fr/2019INPT0133.

Abstract:
In recent decades, controlling environmental impact through life-cycle analysis has become a topical issue in the building sector. However, there are problems in exchanging information between experts when conducting studies such as the environmental assessment of a building. There is also heterogeneity between construction product databases, because they do not have the same characteristics and do not use the same basis to measure the environmental impact of each construction product. Moreover, it is still difficult to exploit the full potential of linking BIM, the Semantic Web, and construction product databases, because the idea of combining them is relatively recent. The goal of this thesis is to increase the flexibility needed to assess a building's environmental impact in a timely manner. First, our research identifies gaps in interoperability in the AEC (Architecture, Engineering and Construction) domain. Then, we address some of the shortcomings encountered in the formalisation of building information and the generation of building data in Semantic Web formats. We further promote efficient use of BIM throughout the building life cycle by integrating and referencing environmental data on construction products in a BIM tool. Moreover, semantics has been improved by enhancing a well-known building ontology, namely ifcOWL, the Web Ontology Language (OWL) representation of the Industry Foundation Classes (IFC). Finally, we conducted a case study of a small building to test our methodology.
4

Flycht-Eriksson, (Silvervarg) Annika. "Design and use of ontologies in information-providing dialogue systems." Doctoral thesis, Linköpings universitet, NLPLAB - Laboratoriet för databehandling av naturligt språk, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-5007.

Abstract:
In this thesis, the design and use of ontologies as domain knowledge sources in information-providing dialogue systems are investigated. The research is divided into two parts: theoretical investigations that have resulted in a requirements specification for the design of ontologies to be used in information-providing dialogue systems, and empirical work on the development of a framework for the use of ontologies in such systems. The framework includes three models: a model for ontology-based semantic analysis of questions; a model for ontology-based dialogue management, specifically focus management and clarifications; and a model for ontology-based domain knowledge management, specifically the transformation of user requests into the system-oriented concepts used for information retrieval. The thesis shows that using ontologies to represent and reason about domain knowledge in dialogue systems has several advantages. A deeper semantic analysis is possible in several modules, and a more natural and efficient dialogue can be achieved. Another important aspect is that it facilitates portability, i.e. the ability to reuse and adapt the dialogue system to new tasks and domains, since the domain-specific knowledge is separated from the generic features of the dialogue system architecture. A further advantage is that it reduces the complexity of the linguistic resources produced for various domains.
5

Fragos, Serafeim. "Behavioural modelling in management and accounting information systems." Thesis, Lancaster University, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.483621.

6

Ayub, Muhammad, and Muhammad Jawad. "Structuring and Modelling Competences in the Healthcare Area with the help of Ontologies." Thesis, Jönköping University, School of Engineering, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-9627.

Abstract:

Ontology development is a systematic technique for representing existing and new knowledge about a specific domain, using models that present the system through conceptualisation. This thesis presents the use of ontologies to formally represent an ontology-based competence model for potential users of quality registry reports in a healthcare organisation. The model describes the professional and occupational interests and needs of the users by structuring and describing their skills and qualifications. The individual competence model has two main parts: general competence and occupational competence. The model is implemented in an ontology editor. Although the competence model gives a general view of all medical areas in a hospital, from an implementation point of view only the cardiology area is considered in detail. The potential users of the quality registry are medical staff, county council staff and pharmaceutical staff. The report also uses different classifications of education, occupational fields and diseases. A user can obtain information about a patient and a specific disease, with treatment tips, from various organisational resources such as quality registries, electronic medical reports, and online journals. The model also supports information filtering, which filters information according to the needs and competences of the users.

7

Wong, Siaw Ming. "Analyse des causes d'échec des projets d'affaires à partir d'études de cas en entreprises, et proposition d'un modèle de domaine en langage UML." Phd thesis, Université de La Rochelle, 2010. http://tel.archives-ouvertes.fr/tel-00556609.

Abstract:
Despite efforts to increase the maturity of the project management profession, the failure rate of business projects (as opposed to technical projects) remains high. It has been observed that current project management standards do not take into account the constraints tied to the context in which projects are executed, and that, as a result, the management of business projects has not been studied in depth. The objective of this transdisciplinary research is therefore first to gain a better understanding of the subject by examining why the failure of a business project is considered a failure from the organisation's point of view, and then to formalise the knowledge acquired in a format that allows it to be subsequently enriched and applied. Building on the open systems model, three case studies were conducted to examine the moderating effect of different types of organisational structures and project management information systems on the causal relationship between project management competence and the success of business projects. This work shows that the success of business projects should be measured in terms of achieving the objectives not only of the project but also of the organisation. It also identifies the essential components of business project management: (1) core project management competences; (2) integrated programme management; and (3) an integrated project management information system. Across the three case studies, it also emerges decisively that organisational factors have a significant impact on project success. A theory is proposed which postulates that a business project is very likely to fail if it is not managed as an integral part of the enterprise, treated as a routine operation within the company.
This means that the way business projects are managed today should be reconsidered. The role of IT in supporting the management of such projects should also be reviewed, and a greater distinction should probably be made between business projects and more technical, "traditional" projects. The knowledge acquired in these case studies was then formalised by developing a domain model in the UML modelling language. The domain-modelling approach was devised by modifying the conceptualisation step of the traditional ontology engineering process. Taking as a starting point the theoretical framework covering the essential components of business project management, the model was built in four steps: (1) defining the scope of the work by developing each component from current standards; (2) integrating these developments by reusing work produced and proposed by other researchers; (3) developing and (4) evaluating UML specifications describing both the structural and the dynamic aspects of the subject. The successful development of a domain model, and the demonstration of how it can be used directly to develop a project management information system as well as ontologies covering project management knowledge, show that the approach of building a common semantic basis for modelling both application systems and ontologies is both feasible and valid. Moreover, the proposed domain model can serve as a foundation for progressively accumulating domain knowledge, since the modelling approach allows for the integration of earlier work and proposals.
This result opens new perspectives for the development of software based on a domain model derived directly from research work.
8

Leshi, Olumide. "An Approach to Extending Ontologies in the Nanomaterials Domain." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-170255.

Abstract:
Over the last decade or two, data-driven science workflows have become increasingly popular, and semantic technology has been relied upon to align often parallel research efforts across domains and to foster interoperability and data sharing. A key challenge, however, is the size of the data and the pace at which it is generated, so much so that manual procedures lag behind, calling for the automation of most workflows. This study continues the investigation of ways in which some tasks performed by experts in the nanotechnology domain, specifically in ontology engineering, could benefit from automation. An approach featuring phrase-based topic modelling and formal topical concept analysis, together with formal implication rules, is motivated as a means of uncovering new concepts and axioms relevant to two nanotechnology-related ontologies. A corpus of 2,715 nanotechnology research articles shows that the approach can scale, as seen in a number of experiments. The usefulness of document text ranking as an alternative form of input to topic models is highlighted, as well as the benefit of implication rules for the task of concept discovery. In all, a total of 203 new concepts are uncovered by the approach to extend the referenced ontologies.
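The concept-discovery step this abstract describes can be illustrated with a minimal sketch (not taken from the thesis; the toy corpus, term names and helper functions are all invented for illustration). In formal concept analysis, a formal concept is a pair of a maximal set of documents and the maximal set of terms those documents share:

```python
from itertools import chain, combinations

# Toy document-term incidence: which terms occur in which abstracts.
# Documents and terms are invented for illustration only.
incidence = {
    "doc1": {"nanoparticle", "toxicity"},
    "doc2": {"nanoparticle", "coating"},
    "doc3": {"nanoparticle", "toxicity", "coating"},
}

def common_terms(docs):
    """Terms shared by every document in the set (the concept's 'intent')."""
    sets = [incidence[d] for d in docs]
    return set.intersection(*sets) if sets else set(chain.from_iterable(incidence.values()))

def docs_with(terms):
    """Documents containing every term in the set (the concept's 'extent')."""
    return {d for d, t in incidence.items() if terms <= t}

def formal_concepts():
    """Enumerate all (extent, intent) pairs closed under the two derivation maps."""
    concepts = set()
    docs = list(incidence)
    for r in range(len(docs) + 1):
        for subset in combinations(docs, r):
            intent = common_terms(set(subset))
            extent = docs_with(intent)
            concepts.add((frozenset(extent), frozenset(intent)))
    return concepts

for extent, intent in sorted(formal_concepts(), key=lambda c: -len(c[0])):
    print(sorted(extent), "<->", sorted(intent))
```

Each printed pair is a candidate concept; in the thesis's setting, concepts mined this way from topic-model output over thousands of articles become candidate classes and axioms for extending an ontology.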
9

Thomas, Manoj. "An Ontology Centric Architecture For Mediating Interactions In Semantic Web-Based E-Commerce Environments." VCU Scholars Compass, 2008. http://scholarscompass.vcu.edu/etd/1598.

Abstract:
Information freely generated, widely distributed and openly interpreted is a rich source of creative energy in the digital age that we live in. As we move further into this irrevocable relationship with self-growing and actively proliferating information spaces, we are also finding ourselves overwhelmed, disheartened and powerless in the presence of so much information. We are at a point where, without domain familiarity or expert guidance, sifting through the copious volumes of information to find relevance quickly turns into a mundane task often requiring enormous patience. The realization of accomplishment soon turns into a matter of extensive cognitive load, serendipity or just plain luck. This dissertation describes a theoretical framework to analyze user interactions based on mental representations in a medium where the nature of the problem-solving task emphasizes the interaction between internal task representation and the external problem domain. The framework is established by relating to work in behavioral science, sociology, cognitive science and knowledge engineering, particularly Herbert Simon’s (1957; 1989) notion of satisficing on bounded rationality and Schön’s (1983) reflective model. Mental representations mediate situated actions in our constrained digital environment and provide the opportunity for completing a task. Since assistive aids to guide situated actions reduce complexity in the task environment (Vessey 1991; Pirolli et al. 1999), the framework is used as the foundation for developing mediating structures to express the internal, external and mental representations. Interaction aids superimposed on mediating structures that model thought and action will help to guide the “perpetual novice” (Borgman 1996) through the vast digital information spaces by orchestrating better cognitive fit between the task environment and the task solution. 
This dissertation presents an ontology-centric architecture for mediating interactions in a Semantic Web-based e-commerce environment, developed using the Design Science approach. The potential of the framework is illustrated as a functional model by using it to model the hierarchy of tasks in a consumer decision-making process as it applies in an e-commerce setting. Ontologies are used to express the perceptual operations on the external task environment, the intuitive operations on the internal task representation, and the constraint satisfaction and situated actions conforming to reasoning from the cognitive fit. It is maintained that actions themselves cannot be enforced, but when the meaning from mental imagery and the task environment are brought into coordination, it leads to situated actions that change the present situation into one closer to what is desired. To test the usability of the ontologies, we use the Web Ontology Language (OWL) to express the semantics of the three representations. We also use OWL to validate the knowledge representations and to make rule-based logical inferences on the ontological semantics. An e-commerce application was also developed to show how effective guidance can be provided by constructing semantically rich target pages from the knowledge manifested in the ontologies.
10

Dennie, Keiran. "Scalable attack modelling in support of security information and event management." Master's thesis, University of Cape Town, 2014. http://hdl.handle.net/11427/9205.

Abstract:
While assessing security on single devices can be performed using vulnerability assessment tools, modelling more intricate attacks, which incorporate multiple steps on different machines, requires more advanced techniques. Attack graphs are a promising technique; however, they face a number of challenges. An attack graph is an abstract description of what attacks are possible against a specific network. Nodes in an attack graph represent the state of a network at a point in time, while arcs between nodes indicate the transformation of a network from one state to another via the exploit of a vulnerability. Using attack graphs allows system and network configuration information to be correlated and analysed to indicate imminent threats. This approach is limited by several serious issues, including the state-space explosion, due to the exponential nature of the problem, and the difficulty of visualising an exhaustive graph of all potential attacks. Furthermore, the lack of availability of information regarding exploits, in a standardised format, makes it difficult to model atomic attacks in terms of exploit requirements and effects. The objective of this thesis is to address these issues and to present a proof-of-concept solution: an automated attack-graph-based tool to assist in the evaluation of network security, assessing whether a sequence of actions could lead to an attacker gaining access to critical network resources. Key objectives are the investigation of attacks that can be modelled, the discovery of attack paths, the development of techniques to strengthen networks based on attack paths, and the testing of scalability for larger networks. The proof-of-concept framework, Network Vulnerability Analyser (NVA), sources vulnerability information from the National Vulnerability Database (NVD), a comprehensive, publicly available vulnerability database, and transforms it into atomic exploit actions.
NVA combines these with a topological network model, using an automated planner to identify potential attacks on network devices. Automated planning is an area of Artificial Intelligence (AI) that focuses on computational deliberation over action sequences by measuring their expected outcomes; this technique is applied here to discover the best possible solution to the attack graph that is created. Through the use of heuristics developed for this study, unpromising regions of an attack graph are avoided. Effectively, this prevents the state-space explosion problem associated with modelling large-scale networks, enumerating only critical paths rather than an exhaustive graph. SGPlan5 was selected as the most suitable automated planner for this study and was integrated into the system, employing network and exploit models to construct critical attack paths. A critical attack path indicates the most likely attack vector to be used in compromising a targeted device. Critical attack paths are identified by SGPlan5 using a heuristic that searches the state-space for the attack which yields the highest aggregated severity score. CVSS severity scores were selected as a means of guiding state-space exploration since they are currently the only publicly available metric that can measure the impact of an exploited vulnerability. Two analysis techniques have been implemented to further support the user in making an informed decision as to how to prevent identified attacks. The evaluation of NVA comprised a demonstration of its effectiveness in two case studies and an analysis of its scalability potential. Results demonstrate that NVA can successfully enumerate the expected critical attack paths and use this information to establish a solution to identified attacks. Additionally, performance and scalability testing illustrate NVA's successful application to realistically sized larger networks.
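The critical-attack-path idea can be sketched in a few lines (illustrative only: the hosts, CVE identifiers and severity scores below are invented, and NVA delegates this search to the SGPlan5 planner with pruning heuristics rather than the plain depth-first search shown here):

```python
# Hypothetical network: exploitable edges between hosts, each labelled
# with an invented vulnerability and a CVSS-like severity score.
exploits = {
    ("internet", "web01"): ("CVE-A", 7.5),
    ("web01", "app01"): ("CVE-B", 6.0),
    ("web01", "db01"): ("CVE-C", 4.0),
    ("app01", "db01"): ("CVE-D", 9.8),
}

def critical_attack_path(start, target):
    """Depth-first enumeration of loop-free attack paths, keeping the one
    whose aggregated severity score is highest (the 'critical' path)."""
    best = (None, -1.0)

    def dfs(host, path, score, seen):
        nonlocal best
        if host == target:
            if score > best[1]:
                best = (path, score)
            return
        for (src, dst), (cve, sev) in exploits.items():
            if src == host and dst not in seen:
                dfs(dst, path + [(cve, dst)], score + sev, seen | {dst})

    dfs(start, [], 0.0, {start})
    return best

path, score = critical_attack_path("internet", "db01")
print(path, score)  # the highest-severity route to the database host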
11

Alvarsson, Jonathan. "Ligand-based Methods for Data Management and Modelling." Doctoral thesis, Uppsala universitet, Institutionen för farmaceutisk biovetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-248964.

Abstract:
Drug discovery is a complicated and expensive process in the billion-dollar range. One way of making drug development more efficient is better information handling, modelling and visualisation. The majority of today's drugs are small molecules, which interact with drug targets to cause an effect. Since the 1980s, large numbers of compounds have been systematically tested by robots in so-called high-throughput screening. Ligand-based drug discovery is based on modelling drug molecules. In the field known as Quantitative Structure–Activity Relationship (QSAR) modelling, molecules are described by molecular descriptors which are used for building mathematical models. Based on these models, molecular properties can be predicted, and using the molecular descriptors, molecules can be compared for, e.g., similarity. Bioclipse is a workbench for the life sciences which provides ligand-based tools through a point-and-click interface. The aims of this thesis were to research and develop new or improved ligand-based methods and open-source software, and to work towards making these tools available to users through the Bioclipse workbench. To this end, a series of molecular signature studies was carried out and various Bioclipse plugins were developed. An introduction to the field is provided in the thesis summary, which is followed by five research papers. Paper I describes the Bioclipse 2 software and the Bioclipse scripting language. Paper II describes Brunn, a laboratory information system supporting work with dose-response studies on microtiter plates. Paper III presents a molecular fingerprint based on the molecular signature descriptor; the new fingerprint is evaluated for target prediction and found to perform on par with industry-standard commercial molecular fingerprints.
In Paper IV the effect of different parameter choices when using the signature fingerprint together with support vector machines (SVM) using the radial basis function (RBF) kernel is explored and reasonable default values are found. In Paper V the performance of SVM based QSAR using large datasets with the molecular signature descriptor is studied, and a QSAR model based on 1.2 million substances is created and made available from the Bioclipse workbench.
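The fingerprint comparison underlying this line of work can be sketched briefly (illustrative only: the bit positions below are invented, and real signature fingerprints hash atom-environment signatures into a fixed-length bit vector). Similarity between two bit-vector fingerprints is conventionally measured with the Tanimoto (Jaccard) coefficient:

```python
# Toy hashed fingerprints, represented as sets of "on" bit positions.
# The bit assignments are invented for illustration.
fp_query = {3, 17, 42, 101, 256}
fp_hit = {3, 17, 42, 99, 256}
fp_far = {5, 8, 300}

def tanimoto(a, b):
    """Tanimoto (Jaccard) coefficient between two bit-set fingerprints:
    shared bits divided by total distinct bits."""
    if not a and not b:
        return 1.0  # two empty fingerprints are conventionally identical
    return len(a & b) / len(a | b)

print(round(tanimoto(fp_query, fp_hit), 3))  # high similarity
print(round(tanimoto(fp_query, fp_far), 3))  # no shared bits
```

Ranking a compound library by this coefficient against a query molecule is the basic similarity-search operation that such fingerprints are evaluated on.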
12

Lesch, Ragnar H. "Modelling nonlinear stochastic dynamics in financial time series." Thesis, Aston University, 2000. http://publications.aston.ac.uk/13260/.

Abstract:
For analysing financial time series, two main opposing viewpoints exist: either capital markets are completely stochastic, and therefore prices follow a random walk, or they are deterministic, and consequently predictable. For each of these views a great variety of tools exists with which one can attempt to confirm the corresponding hypothesis. Unfortunately, these methods are not well suited to dealing with data characterised in part by both paradigms. This thesis investigates the two approaches in order to model the behaviour of financial time series. In the deterministic framework, methods are used to characterise the dimensionality of embedded financial data. The stochastic approach includes estimation of the unconditional and conditional return distributions using parametric, non-parametric and semi-parametric density estimation techniques. Finally, it is shown how elements from these two approaches can be combined to achieve a more realistic model for financial time series.
13

Owolabi, Abidemi. "Development of an integrated product information management system." Thesis, Loughborough University, 2004. https://dspace.lboro.ac.uk/2134/2753.

Abstract:
This thesis reports on a research project, undertaken over a four-year period, investigating and developing a software framework and application for integrating and managing building product information for construction engineering. The research involved extensive literature research, observation of industry practices, and interviews with construction industry practitioners and systems implementers to determine how best to represent and present product information to support the construction process. Applicable product models for information representation were reviewed and evaluated to determine their present suitability. The IFC product model was found to be the most applicable. Investigations of technologies supporting the product model led to the development of a software tool, the IFC Assembly Viewer, which aided further investigations into the suitability of the product model (in its current state) for the exchange and sharing of product information. A software framework, or reusable software design and application, called the PROduct Information Management System (PROMIS), was developed based on a non-standard product model, but with the flexibility to work with the IFC product model when it is sufficiently mature. The software comprises three subsystems: ProductWeb, ModelManager.NET and Product/Project Service (or P2Service). The key features of the system are shared project databases, parametric product specification, integration of product information sources, and application interaction and integration through interface components. PROMIS was applied to and tested with a modular construction business for the management of product information and the integration of product and project information through the design and construction (production) process.
14

Zhan, Pei. "An ontology-based approach for semantic level information exchange and integration in applications for product lifecycle management." Online access for everyone, 2007. http://www.dissertations.wsu.edu/Dissertations/Summer2007/P_Zhan_080607.pdf.

15

Snyman, Irene. "A framework for co-located collaborative business process modelling using touch technologies." Thesis, Nelson Mandela Metropolitan University, 2013. http://hdl.handle.net/10948/d1021015.

Abstract:
In recent years the field of Business Process Modelling (BPM) has gained increasing attention from both the business and research communities. One of the primary drivers for BPM is the improved understanding of Business Processes (BPs) and the competitive advantage gained over competitors. In addition, BPM can improve communication in an organisation and facilitate increased support for change management. BPM is a collaborative activity that needs to be carried out in a team environment, and Collaborative Business Process Modelling (CBPM) promotes improved readability, accuracy and quality of process models as well as a reduced workload for modellers. In spite of the increased popularity of CBPM, there is limited research into the collaborative nature of the modelling tasks performed by modellers, and specifically into the synchronisation of shared process models. In addition, tools and techniques to support CBPM do not support this synchronisation effectively or efficiently. This study proposes a conceptual framework for CBPM using touch technologies in a co-located collaborative environment. The main research problem addressed by this study is that modellers experience difficulties conducting BPM activities in a co-located collaborative environment. In order to address the research problem and to clarify and elaborate on the problems of CBPM, a two-fold approach was undertaken. First, after an in-depth literature review, a BPM survey was designed and sent to modellers in South African Information Technology (IT) consulting companies to provide a more in-depth understanding of the status and challenges of CBPM in IT consulting organisations. The results revealed that available BPM software does not adequately cater for CBPM, and that software tools do not enforce versioning and synchronisation. In addition, hardware constraints were reported, as well as problems with integrating the different parts of the process model that the modellers were working on.
The results of the survey also showed that the positive aspects of CBPM are that ideas could be shared and overall there is a better understanding of the BPs being modelled. The second part of the problem elaboration consisted of usability field studies with participants from both education and industry using a traditional popular BPM software tool, Enterprise Architect (EA). Whilst several benefits of CBPM were confirmed, several challenges were encountered, particularly with regard to the integration and synchronisation of models. To overcome the problems of CBPM, a framework was developed that allows for co-located CBPM using tablet PCs. The framework includes a developed prototype of the BPMTouch software which runs on tablet PCs, as well as some theoretical aspects of CBPM. The BPMTouch software supports effective and efficient CBPM and the synchronisation of process models since it allows multiple modellers to work together on one BP model, with each modeller using his/her own tablet. If one modeller makes changes to the model, the changes are immediately reflected on the tablets of the other modellers since the changes to the model are updated in real time. Modellers cannot draw on the same model simultaneously, however, everyone can see what the active modeller (active participant with the green flag) is doing. Other participants can then become the active modeller and make changes to the model once the flag has been released and re-allocated. The results from the field studies, industry surveys and usability evaluations were all incorporated into the BPMTouch software tool design and into the aspects of CBPM in order to assist with the process of co-located CBPM using touch technologies. Usability evaluations were carried out in which industry and student participants used BPMTouch to create an integrated model and simultaneously and synchronously create a process model. 
The evaluations of the BPMTouch prototype revealed that participants prefer this system over traditional BPM software since the BPMTouch removes the need for post modelling integration. The theoretical contribution of the framework consists of aspects proposing that organisations should take the potential benefits and challenges of CBPM into consideration and address the Critical Success Factors (CSFs) before embarking on a CBPM project. These aspects can help with decisions relating to CBPM. The use of this framework can improve the quality of process models, reduce the workload of modellers and in this way increase the success rate of CBPM projects.
APA, Harvard, Vancouver, ISO, and other styles
16

Große, Christine. "Towards an Integrated Framework for Quality and Information Security Management in Small Companies." Thesis, Luleå tekniska universitet, Institutionen för system- och rymdteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-327.

Full text
Abstract:
This master's thesis elaborates on the construction of an integrated framework for the simultaneous initiation of quality management and information security management within micro and small enterprises. Called QISMO, the model collection consists of three parts: (1) a holistic framework that structures the relations and dependencies concerned, dedicated to achieving a shared understanding among key stakeholders; (2) a reference process model visualising the entire process with its related activities; and (3) a lifecycle model illustrating the process loop and clarifying specific phases within it. This study offers an analysis of alternative approaches that results in premises and requirements adapted to micro and small enterprises. Furthermore, major barriers to the improvement of quality and information security management in micro and small enterprises are identified in this study. These include miscalculation of risks, lack of competence, and absence of structured processes. Aside from valuable insights for the further development of enhanced training programs, the study contributes a comprehensive analysis of standards and good practices within the field of IT governance. Moreover, the study shares a concrete reference process model that is adapted to the preconditions of micro and small enterprises. These preconditions are acquired throughout the study. The proposition is to provide a basis for the further improvement of business processes and the models related to them, both in practice and in research.
APA, Harvard, Vancouver, ISO, and other styles
17

Bengtsson, Jonas, and Mikael Grönkvist. "Performing Geographic Information System Analyses on Building Information Management Models." Thesis, KTH, Geodesi och satellitpositionering, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208922.

Full text
Abstract:
As the usage of both BIM (Building Information Modelling) and 3D-GIS (Three-Dimensional Geographic Information Systems) has increased within the field of urban development and construction, so has the interest in connecting these two tools. One possibility of integration is the potential of visualising BIM models together with other spatial data in 3D. Another is the ability to perform spatial 3D analyses on the models. Both of these can be achieved through the use of GIS software. This study explores what an integration of BIM and GIS could look like. The goal was to perform typical GIS analyses in 3D on BIM models. Previous research points towards some success within the field through use of the indicated standard format for each tool: IFC (Industry Foundation Classes) for BIM and CityGML (City Geography Markup Language) for GIS. Transformation between the formats took place through use of the BIM software Revit, the transformation tool FME and the GIS software ArcGIS. A couple of reviewed applications of GIS analyses were chosen for testing on the converted models: indoor network analysis, visibility analysis and spatial analysis for 3D buildings. The input data in the study comprised several BIM models, both models created for real-life usage and others that only function as sample data within the different software. From the results of the practical work it can be concluded that a simple, automated and full-scale integration does not seem to be within reach quite yet. Most transformations between IFC and CityGML failed to some extent, especially the more detailed and complex ones. In some test cases the file could not be imported into ArcGIS, and in others geometries were missing, or present even though they should not have been. There were also examples where geometries had been moved during the process. As a consequence of these problems, most analyses failed or did not give meaningful results. A few of the original analyses did give positive results. 
Combining (flawed) CityGML models with other spatial data for visualisation purposes worked rather well. Both the shadow volume and sightline analyses did also get reasonable results which indicates that there might be a future for those applications. The obstacles for a full-scale integration identified during the work were divided into four different categories. The first is BIM usage and routines where created models need to be of high quality if the final results are to be correct. The second are problems concerning the level of detail, especially the lack of common definitions for the amount of details and information. The third category concerns the connection between local and global coordinate systems where a solution in form of updates to IFC might already be in place. The fourth, and largest, category contains those surrounding the different formats and software used. Here, focus should lie on the transformation between IFC and CityGML. There are plenty of possible, future, work concerning these different problems. There is also potential in developing own tools for integration or performing different analyses than those chosen for this thesis.
In step with the increased use of both BIM and 3D-GIS in the urban development process, interest in bringing the two tools together has also grown. One possibility offered by integration is the potential to visualise BIM models together with other geographic data in 3D. Another is the ability to perform spatial 3D analyses on the models. Both can be carried out using GIS software. This study explores what an integration between BIM and GIS could look like. The goal is to perform typical GIS analyses in 3D on BIM models. Previous research points towards some success in the field through working with the designated standard format for each tool: IFC for BIM and CityGML for GIS. Transformation between the formats was carried out using the software Revit, FME and ArcGIS. A few highlighted applications of GIS analyses were selected for testing on the converted models: indoor network analyses, visibility analyses and spatial analyses for 3D buildings. Several different BIM models were used as input data, both models produced for actual use and models created to serve as sample data within the software. Based on the results of the practical work, it can be concluded that a simple, automated and full-scale integration between the tools appears to lie some way into the future. Most of the transformations between IFC and CityGML failed in some respect, especially the more detailed and complex ones. In some test cases the file could not be imported into ArcGIS; in others, geometries were missing, or unexpected geometries existed even though the import had succeeded. There are also examples where geometries had been displaced. As a consequence of these problems, most 3D analyses could not be carried out at all or failed to give meaningful results. A few of the original analyses did, however, give positive returns. Combining (flawed) CityGML models with other spatial data worked comparatively well for visualisation purposes. 
Both the shadow volume analysis and the derivation of sightlines from the buildings gave reasonably correct results, which indicates that there may be a future for those applications. The obstacles to a full-scale integration identified during the work were divided into four categories. The first is BIM usage, where high quality in the created models is important for correct end results. The second is the level of detail, where the lack of common definitions of the levels of detail causes problems. The third category is coordinate and reference systems, where a solution to the connection between local and global systems may already be in place in one of the later releases of the IFC format. The last and largest category is the set of problems surrounding the formats and software themselves, where more work on the translation between IFC and CityGML will be required. In the future there is plenty of work to be done on these various problems. There is also potential to develop custom tools for the integration, or to undertake analyses other than those selected in this study.
APA, Harvard, Vancouver, ISO, and other styles
18

Dave, B. A. "Developing a construction management system based on lean construction and building information modelling." Thesis, University of Salford, 2013. http://usir.salford.ac.uk/30820/.

Full text
Abstract:
This research aims at improving construction management through the simultaneous implementation of Lean Construction and Building Information Modelling. Specifically, the area of production management and control is addressed by developing a prototype software system that supports Lean Construction processes and provides a visual interface through Building Information Modelling. The research addresses a practically relevant problem and follows the Design Science Research method. The first stage of the research explores the problem area through the author’s own observation of industrial practice, and also through a literature review. At the broad level, a two-fold problem is identified: first, the problems with the production management process itself, and second, the problems with visualisation and management of the product model and its integration with production management. At the fundamental level, it is found that many of these problems are linked with the deficient theory behind production, which is predominantly based on the “Transformation” view of production. Additionally, it is found that previous attempts at solving the problems of construction management through information systems have met with only limited success, as they mostly address the peripheral processes rather than the core area of production management. The second stage of the research explores and puts forward potential solutions to overcome the problems of production management. Lean Construction is identified as a partial solution to the production planning and control process. Specifically, the Last Planner System™ of production control is found to improve the productivity and efficiency of the production process by reducing variability, improving reliability and collaboration, and introducing continuous improvement. 
At the same time, it is found that Building Information Modelling helps overcome many of the problems found with traditional product management techniques (such as 2D and 3D CAD) by providing an object-oriented, parametric and visual representation of the product. It is also found that the application of Building Information Modelling is relevant to all aspects of the construction process. Through a conceptual analysis, significant synergies between Lean Construction and Building Information Modelling are identified, with applications spanning the entire construction lifecycle. Specific benefits to the production management process are also found, backed by empirical evidence. However, it is also found that current Building Information Modelling systems do not fully support an integrated implementation of production management. This particular aspect of an integrated and visual system, which would support the core production management process, is identified as a potential solution area. The third stage of the research is dedicated to the design and development of a software system called VisiLean, which provides a collaborative planning and control platform integrated with the Building Information Modelling platform and which supports the production management process. A prototype system is developed through an iterative and incremental process, with simultaneous feedback, evaluation and review. The fourth stage of the research includes the evaluation of the VisiLean prototype through a demonstration and feedback process. At this stage, the design, development and evaluation process is analysed and discussed. Finally, the contributions to theory and to the body of knowledge are identified, along with suggestions for future development.
APA, Harvard, Vancouver, ISO, and other styles
19

Owusu-Asamoah, Kwasi. "Modelling an information management system for the National Health Insurance Scheme in Ghana." Thesis, Loughborough University, 2014. https://dspace.lboro.ac.uk/2134/16415.

Full text
Abstract:
The National Health Insurance Scheme (NHIS) in Ghana was introduced to alleviate the problem of citizens having to pay for healthcare at the point of delivery, given that many did not have the financial resources needed to do so and as such were unable to adequately access healthcare services. The scheme is managed from the national headquarters in the capital, Accra, through satellite offices located in districts across the length and breadth of the country. It is the job of these offices to oversee the operations of the scheme within their particular districts. Current literature, however, shows that there is a digital divide between the rural and urban areas of the country, which has led to differences in information management between urban-based and rural-based districts. This thesis reviews the variables affecting the management of information within the scheme and proposes an information management model to eliminate identified bottlenecks in the current one. The thesis begins by reviewing the theory of health insurance, then information management, and finally the rural-urban digital divide. In addition to semi-structured interviews with key personnel within the scheme and observation, a survey questionnaire was handed out to staff in nine different district schemes to obtain the raw data for this study. In identifying issues with the current information management system, a comparative analysis was made between the current information management model and the real-world system in place, to determine the changes needed to improve information management in the NHIS. The changes discovered formed an input into developing the proposed information management system with the assistance of Natural Conceptual Modelling Language (NCML). 
The use of a mixed methodology in conducting the study, in addition to the employment of NCML, was an innovation and is the first of its kind in studying the NHIS in Ghana. This study is also the first to examine the differences in information management within the NHIS given the rural-urban digital divide.
APA, Harvard, Vancouver, ISO, and other styles
20

Noaman, Amin Yousef. "Reconciling formal and informal documentation in business modelling." Thesis, McGill University, 1995. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=22783.

Full text
Abstract:
Business modelling, the modelling of the architectures and processes of organizations, should have a broad scope. It should not exclusively capture the basic information of the processes, but should also address the various kinds of documentation related to the processes under consideration. With this combination, organizational models become more expressive and useful.
The research reported here describes and demonstrates a new approach for reconciling formal and informal documentation in business modelling. It is based on the integration of an underlying formal modelling approach with hypertext concepts that provide mechanisms for capturing, manipulating and viewing informal model documentation.
We have developed the Hypertec tool, which complements the Macrotec environment. Macrotec is a business modelling environment based on the formalism of extended colored Petri nets. Hypertec is a hypertext-based component supporting the authoring, display and navigation of all the process documentation that cannot be captured by Macrotec. Our experience with Macrotec/Hypertec shows that their combined functionality substantially facilitates the understanding of business processes and clearly reduces problems such as miscommunication, misinterpretation and misunderstandings about entire processes or some of their components.
APA, Harvard, Vancouver, ISO, and other styles
21

Tolis, Christofer. "Framing the business : business modelling for business development." Doctoral thesis, Stockholm : Economic Research Institute, Stockholm School of Economics (Ekonomiska forskningsinstitutet vid Handelshögskolan) (EFI), 2005. http://web.hhs.se/efi/summary/664.htm.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Geyer, Rian Willem. "Value-adding business process modelling : determining the suitability of a business process modelling technique for a given application." Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019.1/85758.

Full text
Abstract:
Thesis (MScEng)-- Stellenbosch University, 2013.
ENGLISH ABSTRACT: Organizations formally define and document their business processes in order to properly understand them and to subsequently enable their continuous development, improvement and management. In order to formally define and document their business processes, organizations can use Business Process Modelling, which represents the design of graphical models that portray the business processes of organizations. It is, however, noted that it is difficult to select a suitable Business Process Modelling Technique in support of a specific application of Business Process Modelling. This is due to the considerable number of existing Business Process Modelling Techniques, the inherent impact of their varying capabilities and the lack of formal measures available to support evaluations of their suitability for specific modelling applications. It is therefore considered appropriate to execute a research study aimed at the development and validation of a measurement framework that can be used to evaluate the suitability of Business Process Modelling Techniques for specific modelling applications.
AFRIKAANSE OPSOMMING: Organisations formally define and document their business processes in order to understand them properly and consequently to enable their continuous development, improvement and management. To carry out this activity, organisations can use Business Process Modelling to design graphical models of their business processes. It is noted, however, that it is difficult to select a suitable Business Process Modelling Technique in support of a specific application of Business Process Modelling. This is due to the large number of existing Business Process Modelling Techniques, the impact of their varying capabilities, and the lack of formal measures that can be used to evaluate their suitability for specific modelling applications. This leads to the decision to complete a study focused on the development and validation of a measurement framework that can be used to evaluate the suitability of Business Process Modelling Techniques for specific applications of Business Process Modelling.
APA, Harvard, Vancouver, ISO, and other styles
23

Abbasnejad, Behzad. "Building information modelling adoption and implementation in construction firms: A multi-stage model." Thesis, Queensland University of Technology, 2018. https://eprints.qut.edu.au/119686/1/Behzad_Abbasnejad_Thesis.pdf.

Full text
Abstract:
This research developed a stage-based model for the management of building information modelling (BIM) implementation in construction organisations, based on the theories of business process change management, innovation management, and IT implementation. The model identifies five distinct stages: awareness; consideration; readiness assessment; deployment; and evaluation and improvement planning, together with their related enablers, which are aligned with the organisational goals and objectives as a precursor to the successful and sustained implementation of BIM. Comparative case studies of five construction firms were used to test the applicability of the model within the broader spectrum of the construction supply chain in Australia.
APA, Harvard, Vancouver, ISO, and other styles
24

Majcherek, Ewa. "Building Information Modelling in the business of architecture : Case of Sweden." Thesis, KTH, Industriell ekonomi och organisation (Inst.), 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-124358.

Full text
Abstract:
Architectural practice, although its first objective is providing design value, is nevertheless a branch of business. The creative work of architects needs formal managerial guidelines and principles in order to ensure the financial profitability of the firm. One of the challenges of architectural management is carrying innovative solutions through. A prominent example of a recent innovation in the architecture, engineering and construction (AEC) industry is Building Information Modelling (BIM). In Sweden, BIM regulations supporting its further diffusion across the industry were first established in 2013. The research identifies managerial practices which are crucial for the successful implementation of BIM in Swedish architectural offices and which consequently bring significant business benefits to its adopters.
APA, Harvard, Vancouver, ISO, and other styles
25

Alekhtyar, Mumena. "Building Information Modelling and Virtual Design and Construction : Differentiations and interaction." Thesis, KTH, Fastigheter och byggande, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-231602.

Full text
Abstract:
Within the last decade, BIM technology products have provided the construction industry with various tools that can be used in all construction phases and aspects, with a wide set of potentials ranging from visualization to simulation, scheduling and cost estimation (Kam et al., 2016). As a result, the term BIM has begun to take on a new interpretation from a process-oriented perspective, beside the product-oriented interpretation. These different interpretations and definitions of BIM create ambiguity around BIM. The ambiguity about what BIM is takes another direction when the term “Virtual Design and Construction” (VDC) is used as a synonym for BIM in many situations. VDC is defined as the use of integrated multi-disciplinary performance models of design-construction projects to support explicit and public business objectives (Kunz & Fischer, 2012). This study is an attempt to answer the following questions: what are the differences between BIM and VDC, and how do BIM and VDC interact with and affect each other? As a result, a timeline for both terms was created based on a historical analysis of their emergence. Furthermore, further differences between VDC and BIM were identified through literature review and empirical work, and this mapping was used to determine how each term affects the other. The study was conducted at Tyréns, a Swedish consultancy company, and covered two infrastructure projects in which VDC is used.
APA, Harvard, Vancouver, ISO, and other styles
26

Gichuiri, Jane Wanjugu. "Process modelling : an evaluation approach in support of effective management of construction project information." Thesis, University of Salford, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.395702.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Ruci, Xhesika. "Capacity Management in Hyper-Scale Datacenters using Predictive Modelling." Thesis, Luleå tekniska universitet, Institutionen för system- och rymdteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-72487.

Full text
Abstract:
Big data applications have become increasingly popular with the emergence of cloud computing and the explosion of artificial intelligence. Hence, the increasing adoption of data-hungry machines and services is driving the need for more power to keep the datacenters of the world running. It has become crucial for large IT companies such as Google, Facebook and Amazon to monitor the energy efficiency of their datacenter facilities and take action to optimise these heavy consumers of electricity. This master thesis proposes several predictive models to forecast PUE (Power Usage Effectiveness), regarded as the de facto industry metric for measuring a datacenter's IT power efficiency. This approach is a novel capacity management technique for predicting and monitoring the environment in order to prevent future disastrous events, which are strictly unacceptable in the datacenter business.
APA, Harvard, Vancouver, ISO, and other styles
28

Bonzanini, Marco. "Opinion-aware information management : statistical summarisation and knowledge representation of opinions." Thesis, Queen Mary, University of London, 2015. http://qmro.qmul.ac.uk/xmlui/handle/123456789/9084.

Full text
Abstract:
Nowadays, an increasing number of media platforms provide users with opportunities for sharing their opinions about products, companies or people. In order to support users accessing opinion-based information, and to support engineers building systems that require opinion-aware reasoning, intelligent opinion-aware tools and techniques are needed. This thesis contributes methods and technology for opinion-aware information management from two different perspectives, namely document summarisation and knowledge representation. Document summarisation has been widely investigated as a means to reduce information overload. This thesis focuses on statistical models for summarisation, with particular attention to divergence-based models, within the context of opinions. Firstly, topic-based document summarisation is addressed, contributing a study on divergence-based document-to-summary similarity and the definition of a novel algorithm for summarisation based on sentence removal. Secondly, summarisation models are tailored to opinion-oriented content and shown to be useful also when exploited for different tasks such as sentiment classification. Thirdly, summarisation models are applied to knowledge-oriented data, in order to tackle tasks such as entity summarisation. The comprehensive task addressed is the knowledge-based opinion-aware summarisation of content (free text, facts). This thesis also contributes a broad discussion on knowledge representation of opinions. A thorough study of how to model opinions using traditional techniques, such as Entity-Relationship (ER) modelling, underlines that a high-level, opinion-aware layer of conceptual modelling is useful since it hides away implementation details. A conceptual and logical knowledge representation methodology for modelling opinions is hence proposed, with the purpose of guiding engineers towards the use of best practices during the development of sentiment analysis applications. 
Specifically, an extension of traditional ER modelling and the definition of an automatic mapping procedure, which translates opinion-aware components of the conceptual model into a relational model, help achieve a clear separation between conceptual and logical modelling. The mapping procedure yields an automatic and replicable methodology for designing applications which require opinion-aware reasoning.
APA, Harvard, Vancouver, ISO, and other styles
29

Ulriksson, Jenny. "Consistency management in collaborative modelling and simulation." Licentiate thesis, KTH, Microelectronics and Information Technology, IMIT, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-571.

Full text
Abstract:
The aim of this thesis is to exploit the technological capabilities of computer-supported collaborative work (CSCW) in the field of collaborative Modelling and Simulation (M&S). The thesis focuses on addressing two main problems: (i) providing flexible means of consistency management in collaborative M&S, and (ii) providing platform- and application-independent services for collaborative M&S.

In this work, some CSCW technologies, and how some of their concepts can be incorporated in a distributed collaborative M&S environment, have been studied. An environment for component-based simulation development and visualization, which provides support for collaborative M&S, has been designed. Some consistency policies that can be used in conjunction with distributed simulation and the High Level Architecture (HLA) have been investigated. Furthermore, the efficient utilization of HLA and XML in combination as the foundation of a CSCW infrastructure has been demonstrated. Two consistency policies, a strict and an optimistic one, were implemented utilizing HLA in the distributed collaborative environment. Their performance was compared to that of a totally relaxed policy in various collaboration situations.
APA, Harvard, Vancouver, ISO, and other styles
30

Anderson, Alison Mary. "The object-oriented modelling of information systems security risk." Thesis, Queensland University of Technology, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
31

Kiepusewski, Bartosz. "Expressiveness and suitability of languages for control flow modelling in workflows." Thesis, Queensland University of Technology, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
32

Zhu, Junxiang. "Integration of Building Information Modelling and Geographic Information System at Data Level Using Semantics and Geometry Conversion Approach Towards Smart Infrastructure Management." Thesis, Curtin University, 2018. http://hdl.handle.net/20.500.11937/74945.

Full text
Abstract:
This study integrates Building Information Modelling (BIM) and Geographic Information System (GIS) at the data level, using an open source approach for geometry transformation and an automatic attribute-searching algorithm for semantics transfer, for the purpose of facilitating data transformation from BIM to GIS. Based on that, an infrastructure management system has been developed using Web GIS technology in conjunction with models created in BIM and transformed into GIS using the proposed approach.
APA, Harvard, Vancouver, ISO, and other styles
33

Gulliver, John. "Space-time modelling of exposure to air pollution using GIS." Thesis, University of Northampton, 2002. http://nectar.northampton.ac.uk/2810/.

Full text
Abstract:
This thesis develops, tests and applies methods for space-time modelling of exposure to air pollution using GIS. This involves the linkage of five main sub-models: a traffic model, a model of urban air pollution (combining local and ‘background’ pollution models), a network analysis tool for modelling exposure during journeys, and a time-activity model. The model can provide exposure estimates for individuals or population groups. The study took place entirely within Northampton, UK. The model used to estimate hourly PM10 concentrations at outdoor locations gave a moderate fit to monitored data. Results were shown to be comparable with the best results from other studies. This research also found a strong, linear relationship between concentrations of PM10 during simultaneous monitoring of walking and in-car concentrations. This relationship was used to calibrate modelled outdoor pollution levels to give in-car concentrations. Modelled journey-time exposures for walking performed as well as predictions made using a fixed-site monitor located close to the journey routes. The model did not perform as well as the fixed-site monitor in predicting in-car exposures. The application of the model to a walk-to-school policy, in which modelled local traffic levels were reduced by 20%, demonstrated that the benefits of the reduction were not spread evenly across a sample of schoolchildren, but varied depending on the route used to school and the location of homes and schools. For those switching between car and walking there may be positive or negative effects of the policy in terms of savings in average hourly exposures, depending on their specific journey and time-activity patterns. The results from this research showed that, although the model worked reasonably well in estimating exposures, a number of improvements are needed. 
These include better models of background concentrations, more detailed models of in-car conditions, and extending exposure modelling to include dose-response estimates
APA, Harvard, Vancouver, ISO, and other styles
34

Salam, Md Abdus. "The potential of geographical information system-based modelling for aquaculture development and management in South Western Bangladesh." Thesis, University of Stirling, 2000. http://hdl.handle.net/1893/3252.

Full text
Abstract:
This study describes the delineation of appropriate sites for aquaculture using remote sensing, GPS and GIS. A 1996 composite Landsat TM image covering the south-western part of Bangladesh was used to identify water bodies, the extent of brackish water and associated land use features in the image. The remote sensing image was complemented by digitised secondary data from a range of sources, including hard copy maps, to produce a GIS database which included environmental layers such as water bodies, rivers, soils, land use, temperature, rainfall, salinity and pH. The database also included infrastructural issues, such as roads, railways, processing plants, towns and cities. A series of GIS models were developed in order to identify and prioritise the most suitable areas for freshwater prawn, tilapia and carp and brackish water shrimp and crab farming. A range of scenarios for land allocations were used to develop a series of resource use models linked to likely production outcomes. Global warming and accelerated sea level rise are considered in the study area, with sea level rise scenarios of 50, 100, 150 and 200 cm. The consequences of land losses and displacement of the population from the area under the different scenarios are discussed. The economic characteristics of shrimp farming and alternative land uses in the Khulna region were also considered. Five land use options were studied based on economic output and job potential: brackish water shrimp and crab culture, moderately saline-tolerant tilapia and prawn culture, fresh water carp culture, traditional rice production systems, and fresh water prawn culture. Of these, fresh water prawn culture performed best, followed by brackish water shrimp and crab culture. This study showed the extent of potential for aquaculture in the Khulna region and further demonstrates the usefulness of GIS as an aquaculture-planning tool. Model programming was also found to be a very useful tool, enabling multiple scenarios to be regenerated very quickly.
Overall, GIS modelling associated with remote sensing has great potential for informed decision-making in aquatic production systems and optimising management of natural resources in a region where they are already under considerable pressure. The implications for use of these systems in reducing land use conflict and sector planning for the region are discussed.
APA, Harvard, Vancouver, ISO, and other styles
35

Vilanculos, Agostinho Chuquelane Fadulo. "The use of hydrological information to improve flood management-integrated hydrological modelling of the Zambezi River basin." Thesis, Rhodes University, 2015. http://hdl.handle.net/10962/d1018915.

Full text
Abstract:
The recent high profile flooding events – that have occurred in many parts of the world – have drawn attention to the need for new and improved methods for water resources assessment, water management and the modelling of large-scale flooding events. In the case of the Zambezi Basin, a review of the 2000 and 2001 floods identified the need for tools to enable hydrologists to assess and predict daily stream flow and identify the areas that are likely to be affected by flooding. As a way to address the problem, a methodology was set up to derive catchment soil moisture statistics from Earth Observation (EO) data and to study the improvements brought about by an assimilation of this information into hydrological models for improving reservoir management in a data scarce environment. Rainfall data were obtained from the FEWSNet Web site and computed by the National Oceanic and Atmospheric Administration Climatic Prediction Center (NOAA/CPC). These datasets were processed and used to monitor rainfall variability and subsequently fed into a hydrological model to predict the daily flows for the Zambezi River Basin. The hydrological model used was the Geospatial Stream Flow Model (GeoSFM), developed by the United States Geological Survey (USGS). GeoSFM is a spatially semi-distributed physically-based hydrological model, parameterised using spatially distributed topographic data, soil characteristics and land cover data sets available globally from both Remote Sensing and in situ sources. The Satellite rainfall data were validated against data from twenty (20) rainfall gauges located on the Lower Zambezi. However, at several rain gauge stations (especially those with complex topography, which tended to experience high rainfall spatial variability), there was no direct correlation between the satellite estimates and the ground data as recorded in daily time steps. The model was calibrated for seven gauging stations. 
The calibrated model performed quite well at seven selected locations (R2=0.66 to 0.90, CE=0.51 to 0.88, RSR=0.35 to 0.69, PBIAS=−4.5 to 7.5). The observed data were obtained from the National Water Agencies of the riparian countries. After GeoSFM calibration, the simulated flows were fed into a reservoir and hydropower model to optimise the operation of the Kariba and Cahora Bassa dams. The Kariba and Cahora Bassa dams were selected because this study considers these two dams to be the major infrastructures for controlling and alleviating floods in the Zambezi River Basin. Other dams (such as the Kafue and Itezhi-Tezhi) were recognised in terms of their importance, but including them was beyond the scope of this study because of financial and time constraints. The licence of the reservoir model was limited to one year for the same reason. The reservoir model used was MIKE BASIN, a professional engineering software package and quasi-steady-state mass balance modelling tool for integrated river basin management, developed by the Danish Hydraulic Institute (DHI) in 2003. The model was parameterised by the geometry of the reservoir basins (level, area, volume relationships) and by the discharge-level (Q-h) relationships of the dam spillways. The integrated modelling system simulated the daily flow variation for all Zambezi River sub-basins between 1998 and 2008 and was validated between 2009 and 2011. The resulting streamflows are expressed as hydrograph comparisons between simulated and observed flow values at the four gauging stations located downstream of the Cahora Bassa dam. The integrated model performed well in matching observed and forecast streamflows at the four selected gauging stations (R2=0.53 to 0.90, CE=0.50 to 0.80, RSR=0.49 to 0.69, PBIAS=−2.10 to 4.8).
From the results of integrated modelling, it was observed that both Kariba and Cahora Bassa are currently being operated based on the maximum rule curve and both remain focused on maximising hydropower production and ensuring dam safety rather than other potential influences of the Zambezi River (such as flood control downstream – where the communities are located – and environmental issues). In addition, the flood mapping analysis demonstrated that the Cahora Bassa dam plays an important part in flood mitigation downstream of the dams. In the absence of optimisation of flow releases from both the Kariba and Cahora Bassa dams, in addition to the contribution of any other tributaries located downstream of the dams, the impact of flooding can be severe. As such, this study has developed new approaches for flood monitoring downstream of the Zambezi Basin, through the application of an integrated modelling system. The modelling system consists of: predicting daily streamflow (using the calibrated GeoSFM), then feeding the predicted streamflow into MIKE BASIN (for checking the operating rules) and to optimise the releases. Therefore, before releases are made, the flood maps can be used as a decision-making tool to both assess the impact of each level of release downstream and to identify the communities likely to be affected by the flood – this ensures that the necessary warnings can be issued before flooding occurs. Finally an integrated flood management tool was proposed – to host the results produced by the integrated system – which would then be accessible for assessment by the different users. These results were expressed in terms of water level (m). Four discharge-level (Q-h) relationships were developed for converting the simulated flow into water level at four selected sites downstream of Cahora Bassa dam – namely: Cahora Bassa dam site, Tete (E-320), Caia (E-291) and Marromeu (E-285).
However, the uncertainties in these predictions suggested that improved monitoring systems may be achieved if data access at appropriate scale and quality was improved.
APA, Harvard, Vancouver, ISO, and other styles
36

Mazeri, Stella. "Improved use of abattoir information to aid the management of liver fluke in cattle." Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/28869.

Full text
Abstract:
Fasciolosis, caused by the trematode parasite Fasciola hepatica, is a multi-host parasitic disease affecting many countries worldwide. It is a well-recognized clinically and economically important disease of food producing animals such as cattle and sheep. In the UK, the incidence and distribution of fasciolosis has been increasing in the last decade while the timing of acute disease is becoming more variable and the season suitable for parasite development outside the mammalian host has been extended. Meanwhile control is proving increasingly difficult due to changing weather conditions, increased animal movements and developing anthelmintic resistance. Forecasting models have been around for a long time to aid health planning related to fasciolosis control, but studies identifying management related risk factors are limited. Moreover, the lack of information on the accuracy of meat inspection and available liver fluke diagnostic tests hinders effective monitoring of disease prevalence and treatment. So far, the evaluation of tests available for the diagnosis of the infection in cattle has mainly been carried out using gold standard approaches or under experimental settings, the limitations of which are well known. In cattle, the infection mainly manifests as a sub-clinical disease, resulting in indirect production losses, which are difficult to estimate. The lack of obvious clinical signs results in these losses commonly being attributed to other causes such as poor weather conditions or bad quality forage. This further undermines establishment of appropriate control strategies, as it is difficult to convince farmers to treat without demonstrating clear economic losses of sub-clinical disease. 
This project explores the value of slaughterhouse data in understanding the changing epidemiology of fasciolosis, identifying sustainable control measures and estimating the effect of infection on production parameters using data collected at one of the largest cattle and sheep abattoirs in Scotland. Data used in this study include: a) abattoir data routinely collected during 2013 and 2014, b) data collected during three periods of abattoir-based sampling, c) data collected through administration of a management questionnaire, and d) climatic and environmental data from various online sources. A Bayesian extension of the Hui-Walter no-gold-standard model was used to estimate the diagnostic sensitivity and specificity of five diagnostic tests for fasciolosis in cattle, which were applied to 619 samples collected from the abattoir during three sampling periods: summer 2013, winter 2014 and autumn 2014. The results provided novel information on the performance of these tests in a naturally infected cattle population at different times of the year. Meat inspection was estimated to have a sensitivity of 0.68 (95% BCI 0.61-0.75) and a specificity of 0.88 (95% BCI 0.85-0.91). Accurate estimates of sensitivity and specificity will allow for routine abattoir liver inspection to be used as a tool for monitoring the epidemiology of F. hepatica as well as evaluating herd health planning. Linear regression modelling was used to estimate the delay in reaching slaughter weight in beef cattle infected with F. hepatica, accounting for other important factors such as weight, age, sex, breed and farm as a random effect. The model estimated that cattle classified as having fluke based on routine liver inspection had on average 10 (95% CI 9-12) days greater slaughter age, assuming an average carcass weight of 345 kg. Furthermore, estimates from a second model indicated that the increase in age at slaughter was more severe for higher fibrosis scores.
More precisely, the increase in slaughter age was 34 (95% CI 11-57) days for fibrosis score of 1, 93 (95% CI 57-128) days for fibrosis score 2 and 78 (95% CI 30-125) days for fibrosis score 3. Similarly, in a third model comparing different burden categories with animals with no fluke burden, there was a 31 (95% CI 7-56) days increase in slaughter age for animals with 1 to 10 parasites and 77 (95% CI 32-124) days increase in animals with more than 10 parasites found in their livers. Lastly, a multi-variable mixed effects logistic regression model was built to estimate the association between climate, environmental, management and animal specific factors and the risk of an animal being infected by F. hepatica. Multiple imputation methodology was employed to deal with missing data arising from skipped questions in the questionnaire. Results of the regression model confirmed the importance of temperature, rainfall and cattle movements in increasing the risk for fasciolosis, while it indicated that the presence of deer can increase the risk of infection and that male cattle have a reduced risk of infection. Overall, this project has used slaughterhouse data to fill important knowledge gaps regarding F. hepatica infection in cattle. It has provided valuable information on the accuracy of routine abattoir meat inspection, as well as other diagnostic tests. It has also provided estimates of the effect of infection on the time cattle take to reach slaughter weight at different levels of infection and identified relevant risk factors related to the infection. In conclusion, knowledge of the effect of infection on slaughter age, as well as regional risk factors for F. hepatica infection, along with an improved use of abattoir inspection results in the evaluation of treatment strategies, can provide farmers and veterinarians with better incentives and tools to improve their herd health strategies and in the longer term help reduce the incidence of liver fluke in cattle.
APA, Harvard, Vancouver, ISO, and other styles
37

Li, Jinmin. "Integrating Building Information Modelling (BIM), Cost Estimating and Scheduling for Buildings Construction at the Conceptual Design Stage." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/35292.

Full text
Abstract:
Estimating the construction time and cost of a building project is an essential task of construction managers, which benefits owners, engineers and contractors. Construction duration and cost, in particular, have a profound influence on the outcome of a project at the conceptual stage of its life. The conventional methods used to estimate the time and cost of construction projects are based on 2D models and demand much time and effort from the engineers, estimators and schedulers who prepare them, because the whole process is carried out manually, especially when the project has several design alternatives. Building Information Modelling (BIM), a technology that enhances data transfer and ensures cooperation among designers, engineers, and contractors, can therefore provide an efficient way of cost estimating and schedule planning. Sustainability, meanwhile, has drawn more and more attention from the construction industry, because a project’s construction process has crucial impacts on society, the environment, and the economy. Modular construction has been proven to support sustainable construction by reducing negative impacts on the environment, reducing construction time, and improving manpower productivity. This research aims at developing an integrated model that interrelates BIM with construction cost estimation, scheduling, and sustainability at the conceptual design stage of projects. The aim is to reduce preparation time and increase the efficiency of making major decisions for both conventional and modular construction. The proposed model consists of five modules: a data collection module, a cost estimation module, a scheduling module, a sustainability evaluation module, and a 5D integrated module.
Plug-ins were developed in the model to link the BIM tool (i.e., Autodesk Revit) with Microsoft Excel to ensure automatic data transfer among these modules, all within a BIM platform, so that owners and designers can quickly generate a reliable construction cost estimate, construction schedule, preliminary sustainability evaluation, and construction process simulation.
APA, Harvard, Vancouver, ISO, and other styles
38

Cecconi, Corrado. "La strategia Building Information Modelling (BIM) per il Facility Management di un impianto sportivo/natatorio - caso di studio." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2017. http://amslaurea.unibo.it/13066/.

Full text
Abstract:
This thesis addresses the problem of Facility Management (FM) of a swimming facility and how the Building Information Modeling strategy can improve its efficiency through the organisation of information. The choice of topic followed an internship and direct work experience with companies that manage sports and swimming facilities. Confronting the problems that emerged in that work context led to the introduction of BIM digitalisation being chosen as the development theme for Facility Management. Through a series of meetings with key informants, the most important aspects of the FM of a sports swimming facility, that is, the management, operation and maintenance of a sports pool, that could be digitalised were identified and analysed. From the results of this investigation, the objectives to be developed in the thesis were defined. The objectives underlying the thesis are to show how, through the information provided by the BIM model, it is possible to have: • immediate access to all information on the structure and its building systems; • control of the maintenance status of every structural and plant element; • control of consumption, enabling choices to be budgeted accordingly; • control and calculation of operating costs. The thesis, carried out with the collaboration of Ing. Angelo Mingozzi, analysed the case study of the Cà Selvatica swimming pool in Bologna. The BIM model was digitalised with the ALLPLAN software.
APA, Harvard, Vancouver, ISO, and other styles
39

Parsanezhad, Pouriya. "A Lifecycle Approach towards Building Information Management : Technical and procedural implications for the facility management and operations sector." Licentiate thesis, KTH, Projektkommunikation, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-164954.

Full text
Abstract:
A well-structured and coordinated information management practice is central to promoting efficiency in construction. Building information management encompasses authoring, interpretation, communication, coordination and storage of building information. The benefits envisioned by utilizing IT developments such as Building Information Modelling (BIM) in the facility management and operations (FM&O) sector are estimated to be far greater than in other sectors. There is, however, a gap between the knowledge available in the field of building information management and the actual demands of the architectural, engineering, construction and operation (AECO) industry, especially the FM&O sector. The overall aim of this qualitative research is to develop knowledge that conceptualizes the lifecycle-supporting implementation of BIM in the AECO industry, with a focus on its implications for a BIM-enabled FM&O practice. This applied research comprises a number of summative and formative components: paper 1 investigates the existing and emerging information management systems for the FM&O sector and their characteristics. The focus of paper 2 is narrowed down to the technical requirements on building information management systems, while its temporal scope spans the entire lifecycle of buildings. Paper 3 is a further elaboration on the findings of paper 1 and covers the technical requirements of BIM-implementation in the FM&O sector. Paper 4 investigates workflows – another category of the issues identified in paper 1. Paper 1 aims to provide a general understanding of the importance and implications of implementing BIM-enabled systems in the FM&O sector and also identifies the main categories of the issues associated with this approach. This literature-based paper reports on basic research with a descriptive approach and builds upon the information from a non-exhaustive set of literature.
In this paper, workflows, contracts and information technology have been identified as three categories of the issues associated with implementing BIM-enabled systems in the FM&O sector. Paper 2 is also a literature-based study which draws on the notion of BIM repositories and aims to clarify the technical requirements for a more collaborative building industry as well as depicting the current status of building knowledge management technologies, recent trends and future prospects. Open-format BIM repositories have been suggested as the cornerstones of an integrated information management system for AECO firms. The aim of paper 3 is twofold: firstly, to summarize the current status of the building information management technologies applied in facility operation activities and identify prevailing issues; secondly, to devise technical solutions for those issues based on a case project. In the first part of this study, summarized descriptions of the information management configurations in eleven projects were extracted from the literature and the technical issues within those systems were identified. Moreover, five major categories of contemporary technical solutions for enhancing information transfer from BIM to FM&O software were designated. Then, a narrative and illustrative representation and reconstruction of an IT-implementation project was developed. Paper 4 is another literature-based study which aims to provide the theoretical basis for more focused studies on existing and desired processes in the FM&O sector and their associated information transactions. In this paper, firstly, the more common definitions of the key concepts have been revisited and discussed. Then, the generic types of the processes, activities and organizational roles common to FM&O firms, the types of information required by each actor and how such information is acquired have been presented.


APA, Harvard, Vancouver, ISO, and other styles
40

Russell, Nicholas Charles. "Foundations of process-aware information systems." Thesis, Queensland University of Technology, 2007. https://eprints.qut.edu.au/16592/1/Nicholas_Charles_Russell_Thesis.pdf.

Full text
Abstract:
Over the past decade, the ubiquity of business processes and their need for ongoing management in the same manner as other corporate assets has been recognized through the establishment of a dedicated research area: Business Process Management (or BPM). There are a wide range of potential software technologies on which a BPM offering can be founded. Although there is significant variation between these alternatives, they all share one common factor: their execution occurs on the basis of a business process model. Consequently, this field of technologies can be termed Process-Aware Information Systems (or PAIS). This thesis develops a conceptual foundation for PAIS based on the results of a detailed examination of contemporary offerings including workflow and case handling systems, business process modelling languages and web service composition languages. This foundation is based on 126 patterns that identify recurrent core constructs in the control-flow, data and resource perspectives of PAIS. These patterns have been used to evaluate some of the leading systems and business process modelling languages. It also proposes a generic graphical language for defining exception handling strategies that span these perspectives. On the basis of these insights, a comprehensive reference language, newYAWL, is developed for business process modelling and enactment. This language is formally defined and an abstract syntax and operational semantics are provided for it. An assessment of its capabilities is provided through a comprehensive patterns-based analysis which allows direct comparison of its functionality with other PAIS. newYAWL serves as a reference language and many of the ideas embodied within it are also applicable to existing languages and systems. The ultimate goal of both the patterns and newYAWL is to improve the support and applicability of PAIS.
APA, Harvard, Vancouver, ISO, and other styles
41

Russell, Nicholas Charles. "Foundations of process-aware information systems." Queensland University of Technology, 2007. http://eprints.qut.edu.au/16592/.

Full text
Abstract:
Over the past decade, the ubiquity of business processes and their need for ongoing management in the same manner as other corporate assets has been recognized through the establishment of a dedicated research area: Business Process Management (or BPM). There are a wide range of potential software technologies on which a BPM offering can be founded. Although there is significant variation between these alternatives, they all share one common factor: their execution occurs on the basis of a business process model. Consequently, this field of technologies can be termed Process-Aware Information Systems (or PAIS). This thesis develops a conceptual foundation for PAIS based on the results of a detailed examination of contemporary offerings including workflow and case handling systems, business process modelling languages and web service composition languages. This foundation is based on 126 patterns that identify recurrent core constructs in the control-flow, data and resource perspectives of PAIS. These patterns have been used to evaluate some of the leading systems and business process modelling languages. It also proposes a generic graphical language for defining exception handling strategies that span these perspectives. On the basis of these insights, a comprehensive reference language, newYAWL, is developed for business process modelling and enactment. This language is formally defined and an abstract syntax and operational semantics are provided for it. An assessment of its capabilities is provided through a comprehensive patterns-based analysis which allows direct comparison of its functionality with other PAIS. newYAWL serves as a reference language and many of the ideas embodied within it are also applicable to existing languages and systems. The ultimate goal of both the patterns and newYAWL is to improve the support and applicability of PAIS.
APA, Harvard, Vancouver, ISO, and other styles
42

Kamaludin, Adzhar. "A simulation approach for modelling and investigation of inventory inaccuracy in warehouse operation." Thesis, Loughborough University, 2010. https://dspace.lboro.ac.uk/2134/6750.

Full text
Abstract:
This thesis is focused on a simulation modelling approach to address the inventory inaccuracy problems in a warehouse operation. The main motivation which led to this research was a desire to investigate the inventory inaccuracy issues that have been highlighted by a logistics company. Previous and current research into inventory inaccuracy issues is largely related to the development of RFID technology as a possible solution to inventory problems. Since the inventory inaccuracy related to RFID technology is focused on the overall measurement of inventory management and retail business, there are differences between this existing research and the research presented in this thesis which is focused on issues of inventory inaccuracy in a warehouse operation. In this thesis, warehouse operation is studied as a detailed sequence of processes that are involved in the flow of items physically in parallel with related information being stored in the computer system. In these processes there are many places where errors can occur in counting or recording details of inventory, or in physically moving, storing or picking items incorrectly. These details of a warehouse operation are used to develop a conceptual model of inventory inaccuracy in warehouse operations. The study also found that typically a product needs to be considered differently at different stages of its progress through a warehouse (and therefore within different sections of the conceptual model). This is because initially batches of a product are likely to be delivered from a supplier, therefore if errors occur soon after the product is delivered to the warehouse, the error might involve the whole batch (for example the batch may be misplaced and put in an incorrect storage location), or the error might involve just part of the batch (for example poor transportation by forklift truck may damage the packaging carton and some of the items within the carton). 
When the product is stored ready for meeting customer orders, it needs to be considered as individual items (and errors can occur in counting of individual items or individual items may be misplaced or stolen). Finally, when a customer order is received, the product will be picked and grouped to meet the requirements of the order (for example, one order may require 10 of the product whilst another order may require 20 of the product). Errors might again occur to the whole group or to just part of the group. (Continued ...)
APA, Harvard, Vancouver, ISO, and other styles
43

Andrade, Pedro Daniel Medeiros Ferreira de. "Avaliação de benefícios da integração do BIM nas operações de Facilities Management." Master's thesis, Faculdade de Ciências e Tecnologia, 2014. http://hdl.handle.net/10362/12526.

Full text
Abstract:
Dissertation submitted for the degree of Master in Civil Engineering – Construction Profile
Over the life cycle of buildings, much attention is given to design and construction costs. BIM has shown that construction processes can be changed, allowing the industry to reduce costs and increase its productivity. In the overall cost of a building, the largest share of costs is associated not with its construction but with its operation and maintenance. The project horizon is measured not in years but in decades. During that period, buildings naturally degrade and develop defects in their equipment and installations. It is therefore necessary to carry out day-to-day procedures that ensure the building meets the functional requirements inherent to its purpose and that extend its service life. In an increasingly difficult economic environment, building management and maintenance needs to move in this direction, eliminating waste in its operations, not only in large-scale interventions but above all in day-to-day management. BIM introduced a way of communicating and collaborating never before conceived in the AEC industry; this paradigm can also be applied during building operation, likewise enabling gains in productivity and efficiency in managing building information. This work proposes the integration of BIM into the processes of facility management companies, particularly in the sharing of information among stakeholders. To support the BIM implementation, the maintenance management processes of a case study are mapped using the Business Process Modelling and Notation (BPMN) method, and the resulting changes are evaluated using the Value Stream Mapping (VSM) methodology. The integration of BIM into maintenance management procedures yielded notable gains in efficiency and productivity. It reduced operating costs and process variability, and also reduced human error.
APA, Harvard, Vancouver, ISO, and other styles
44

Pitchforth, Jegar Oliver. "Bayesian networks for information synthesis in complex systems." Thesis, Queensland University of Technology, 2015. https://eprints.qut.edu.au/82030/1/Jegar%20Oliver_Pitchforth_Thesis.pdf.

Full text
Abstract:
This thesis introduces a method of applying Bayesian Networks to combine information from a range of data sources for effective decision support systems. It develops a set of techniques for the development, validation, visualisation, and application of Complex Systems models, with a working demonstration in an Australian airport environment. The methods presented here provide a modelling approach that produces highly flexible, informative and applicable interpretations of a system's behaviour under uncertain conditions. These end-to-end techniques are applied to the development of model-based dashboards that support operators and decision makers in the multi-stakeholder airport environment, and that give confidence in the interpretations they provide.
APA, Harvard, Vancouver, ISO, and other styles
45

Al-Kuwari, Wasmiya Dalhem M. D. "Information management within the Nursing Department at Hamad Medical Corporation (HMC), Qatar." Thesis, Loughborough University, 2005. https://dspace.lboro.ac.uk/2134/7811.

Full text
Abstract:
Hamad Medical Corporation, the main healthcare provider in the state of Qatar, sponsored this study to investigate the use of electronic records management as the basis for a novel information management system in its Nursing Department. To assess the viability of an electronic records management system, a questionnaire survey of a representative sample of the staff and interviews with key post holders were undertaken. Results obtained indicated widespread dissatisfaction with the existing manual system. However, the introduction of any computer-based technology requires great care. To assist with identifying any issues with this technological change, Soft Systems Methodology (SSM) was employed to discern what changes could be made to improve the current problematic situation found in the Nursing Department. In fact, the change archetypes uncovered (procedural, attitudinal, structural and cultural) formed an innovative input into obtaining a roadmap for development of the electronic staff records system. This roadmap was facilitated by the use of Nominal Group Technique (NGT) and Interpretive Structural Modelling (ISM); in fact, the roadmap was an ISM intent structure. The roadmap suggested that change could be effected by having written policy documents, and the top goal to be achieved reflected an improvement in manpower placing and budgetary forecasts. The use of a multi-methods approach meant that, as well as this study's main objectives being reached, the process encompassed some methodological innovations. This study is the first to use the output of SSM to facilitate the NGT and ISM interactions. Equally, it is the first study of its sort to be applied to the Nursing Department at HMC, Qatar, which is an example of a cross-cultural eastern philosophical tradition. The methods used here revealed some significant findings, and have helped in the development of an electronic records management system for use at HMC, Qatar.
APA, Harvard, Vancouver, ISO, and other styles
46

Kessels, Henricus. "Wildfire Management in the Southside Region of Canada’s Montane Cordillera - A Systems Modelling Application on Firebreak Strategies." Thesis, Université d'Ottawa / University of Ottawa, 2016. http://hdl.handle.net/10393/35107.

Full text
Abstract:
There is growing recognition of the importance of preserving Canada's forests. Canada's 348 million hectares of forest land cover 35% of its land area, representing 9% of the world's forests and 24% of the world's boreal forests. As a renewable resource, forests offer significant environmental, economic and recreational benefits and innumerable services contributing to the quality of life. Canada has recently entered an era of increased frequency and severity of natural disasters. Ecosystems and communities, especially in western Canada, have recently undergone a trend of increasing pressure from natural disturbances. These disturbances include wildfires associated with increased fuel loads from past fire suppression regimes and a widespread infestation of the mountain pine beetle, in addition to changes in weather patterns. Wildfire activity has reached extreme levels in many recent years. This thesis profiles an area of western Canada within the Montane Cordillera covering the Nechako Lakes Electoral District in central British Columbia and assesses its vulnerability to the specific hazard of wildfires caused by natural and man-made sources. The objectives of this research are to review, simulate and assess the impact of various fuel management strategies in a sub-section of the Nechako Lakes Electoral District called the Southside. Values at risk include private property and old growth forest across timber supply areas, provincial parks, woodlots and community forests. Simulation results show that firebreaks are effective in significantly reducing the area burned in different parts of the landscape. The performance of different strategies shows large variation. Although this has not been investigated further, such variation has likely been caused by topographic aspects and the positioning of firebreaks in the landscape in relation to climatic parameters.
These results can therefore not be extrapolated beyond the simulated area, but do give an indication of the performance variation that may be expected when similar firebreaks are applied elsewhere. The results also show that model performance of all firebreak strategies is heavily and fairly consistently influenced by weather stream parameters. Sensitivity analyses of weather stream parameters show that although the reduction in total area burned varies, the ranking between strategies in their overall performance is consistent regardless of the weather pattern. Combined dry, warm and windy weather conditions lead to a 3.44-fold increase in total area burned as compared to the scenario with average weather conditions. In favourable weather conditions represented by wet, cold and nearly windless conditions, the model shows an 85% reduction in total burned area as compared to the average scenario. These results illustrate the significant impact of uncontrollable variables on the overall result.
APA, Harvard, Vancouver, ISO, and other styles
47

Gething, Peter W. "Spatiotemporal modelling of Health Management Information System data to quantify malaria treatment burdens in the Kenyan Government's formal health sector." Thesis, University of Southampton, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.432717.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Gerrish, Tristan. "Exploring the effectiveness of BIM for energy performance management of non-domestic buildings." Thesis, Loughborough University, 2017. https://dspace.lboro.ac.uk/2134/25094.

Full text
Abstract:
Following several years of research and development around the subject of BIM, its impact on the design and handover of buildings is now becoming visible across the construction industry. Changes in design procedures and information management methods indicate the potential for greater utilisation of a Common Data Environment in areas other than design. Identifying how these changes are influencing the engineering design process, and adapting this process to the needs and requirements of building performance management, requires consideration of multiple factors, relating mainly to the stakeholders and processes employed in these procedures. This thesis is the culmination of a four-year Engineering Doctorate exploring how BIM could be used to support non-domestic building energy performance management. It begins with an introduction to the research aim and objectives, then presents a thorough review of the subject area and the methodologies employed for the research. Research is split between eight sequential tasks using literature review, interviews, data analysis and case-study application, from which findings, conclusions and key recommendations are made. Findings demonstrate disparity between different information environments and provide insight into the necessary steps to enable connection between BIM and monitored building energy performance information. They highlight the following factors essential to providing an information environment suitable for BIM-applied performance management: skills in handling information and the interface between various environments; technology capable of producing structured and accurate information, supporting efficient access for interconnection with other environments; and processes that define the standards to which information is classified, stored and modified, with responsibility for its creation and modification made clear throughout the building life-cycle.
A prototype method for the linking of BIM and monitored building energy performance data is demonstrated for a case-study building, encountering many of the technical barriers preventing replication on other projects. Methodological challenges are identified through a review of existing building design and operation procedures. In conclusion, the research found that BIM is still in its infancy and that, while efforts are being made to apply it in novel ways to support efficient operation, several challenges remain. Opportunities for building energy performance improvement may be visualised using the modelling environment BIM provides, and the ability to interface with descriptive performance data suggests the future potential for BIM utilisation post-handover.
APA, Harvard, Vancouver, ISO, and other styles
49

Falconer, Lynne. "Spatial modelling and GIS-based decision support tools to evaluate the suitability of sustainable aquaculture development in large catchments." Thesis, University of Stirling, 2013. http://hdl.handle.net/1893/19465.

Full text
Abstract:
Land, water and natural resources are under increasing pressure due to rising demands for food and energy from the rapidly growing global population. Across a catchment there can be multiple stakeholders with conflicting opinions over how space and resources should be used and managed. Consequently, it is important to consider the suitability of a catchment for a particular purpose to optimise use of the area and minimise potential conflicts and impacts on the wider environment. Aquaculture is a significant contributor to world food supply and, as fisheries are unlikely to increase, it is expected that the industry will continue to grow and expand in the future to help meet food security requirements. As a result, it is essential that the sector aims for sustainable development within the most suitable locations. However, it can be difficult to assess the suitability of multiple large catchments, and some issues may not be immediately apparent. This project aimed to show how spatial models could be used as decision support tools to evaluate the suitability of large catchments for sustainable aquaculture. Four large areas of importance to aquaculture were selected, covering 10,148km2, 26,225km2, 48,319km2 and 66,283km2 in Bangladesh, China, Thailand and Vietnam respectively. Asia is by far the most dominant aquaculture region in the world and each of the four study areas contributes to local, regional and global food supplies. The study area in Bangladesh was located in Khulna region in the south west of the country and the main species of focus were prawn and shrimp. The Chinese study area was located in the south eastern province of Guangdong and the main species covered were tilapia and shrimp. Similarly, in Thailand, the main species evaluated were tilapia and shrimp, whilst the study area extended across the Central region. Finally, the largest study area was the Mekong Delta in Vietnam and the main species of focus in this area were pangasius catfish and shrimp.
One of the challenges in modelling large catchments is model applicability and data availability. Often, the required data are not available (or accessible) and it would be difficult, time consuming and expensive to collect new information. Furthermore, when assessing multiple areas it is vital that a representative and unbiased approach is used, where no one catchment is favoured over another due to higher-quality data. Therefore, this study used data that are available for almost any area in the world, allowing future application of the models and enabling effective and unbiased decision support. Four modelling stages were employed in this study to evaluate the suitability of large catchments for sustainable aquaculture development. The first stage was the classification of seasonal land use models from satellite imagery. This provides information on what the land is used for and how aquaculture could impact or be impacted by the wider environment. The second step was the development of seasonal models of site suitability using optimal values within a GIS-based multi-stage framework. These models identify which locations are best for culture and can also be used to estimate the availability of areas for food production. The next stage investigated the use of Maxent as a novel approach in site suitability modelling to evaluate the conditions experienced by existing farms. The information from Maxent can be used to identify trends, opportunities and concerns related to sustainable management and farm locations. Finally, qualitative models of non-point source pollution (NPSP) were developed which assess the risk of NPSP within a catchment. NPSP is an issue which can impact both aquaculture and the wider environment. Thus, it is important to understand the areas within a catchment where NPSP risk is higher, enabling the establishment of monitoring and/or mitigation procedures.
The models support the ecosystem approach to aquaculture (EAA) and enable objective planning and management strategies to enhance productivity across large catchments without negatively impacting the environment. In order to meet growing food requirements, large areas will need to be used for agriculture and aquaculture; therefore, analysis at a wider catchment level, which complements assessment at a local scale, is required as it allows a holistic view of the situation. The work presented here illustrates the potential use of spatial models across large catchments and considers the suitability of the areas for aquaculture development.
APA, Harvard, Vancouver, ISO, and other styles
50

Viljoen, Sarel Johannes. "Creation of a hydrological modelling environment to assist in the decision making of water-related activities." Thesis, Bloemfontein : Central University of Technology, Free State, 2007. http://hdl.handle.net/11462/96.

Full text
Abstract:
Thesis (M. Tech.) -- Central University of Technology, Free State, 2007
In South Africa, water is a scarce resource and it has become very important to manage this resource effectively. The State developed a regulating framework, under the auspices of the Minister of Water Affairs and Forestry, which protects the country's water resources from over-exploitation by ensuring that they are protected, used, developed, conserved and managed in a sustainable and equitable manner. The laws and policies governing the use of water resources are contained in the National Water Act (South Africa, 1998), the National Water Policy (South Africa, 1997a), the National Water Resource Strategy, and the Water Services Act (South Africa, 1997b). In addition, some water-related functions were transferred to Catchment Management Agencies and Water Users' Associations, and it is their task to ensure that the strategies, laws and policies are implemented. Effective water management can only be performed by making use of hydroinformatics, which assists with simulations and estimations. As a result, input data will be collected, added to a Relational Database Management System, and output results generated. A Geographic Information System with the support of a geodatabase will allow users to store spatial and temporal data. The research project investigated different water-related data models (ArcHydro, Hydstra, GML, HYMOS, and WinHSPF), as well as hydrological modelling frameworks (BASINS, OMS, OpenMI, SPATSIM, and TIME), to determine whether they were adequate to assist with the decision making of water-related activities. It was found that these data models and hydrological modelling frameworks did not allow users to add new datasets to their existing data structures and in many cases only had a limited set of functions. For these reasons it was decided to develop a comprehensive, modifiable geodatabase that will function in a modelling environment and allow users to save their data in a centralised database.
Additionally, the functionality provided by other data models and modelling frameworks may be linked and used in the new modelling environment. The methodology followed was to first establish the objectives of the research project, gather the necessary data, investigate various data models and hydrological modelling frameworks, determine the requirements for the modelling environment, design and create the modelling environment, design and create the geodatabase, and finally select the study area that would provide the research project with the necessary data. The following findings were made concerning the research project: firstly, that ArcHydro would be used as an example data model to assist in designing the geodatabase. Secondly, that UML would be used as a development tool to assist with the development of the geodatabase. Thirdly, that the geodatabase would be generated from the XML schema and made available to ArcCatalog. Fourthly, that data from different users/providers (Hydstra, Stats SA, Weather Bureau, Department of Water Affairs and Forestry, etc.) would be inserted into the geodatabase. Fifthly, that any other hydrological modelling framework may make use of the data stored in the geodatabase. Finally, ArcGIS was selected as the GIS application and Microsoft Access as the storage area.
APA, Harvard, Vancouver, ISO, and other styles

To the bibliography