Dissertations / Theses on the topic 'Information Value Method'


Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 32 dissertations / theses for your research on the topic 'Information Value Method.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Martin, Nancy L. "The strategic value of business method patents in information systems /." Available to subscribers only, 2006. http://proquest.umi.com/pqdweb?did=1212781781&sid=11&Fmt=2&clientId=1509&RQT=309&VName=PQD.

2

Gammelgård, Magnus. "Business value assessment of IT investments : an evaluation method applied to the electrical power industry /." Stockholm : Elektro- och systemteknik, Kungliga Tekniska högskolan, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4505.

3

Ketcham, Barbara Lynn. "Incorporating information value into Navy tactical data system system configuration management through the Delphi method." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/25994.

Abstract:
Incorporating information value judgments into configuration management decisions for command and control systems is difficult. This thesis reviews two command and control process models, decision theory as it relates to command and control, and the current tactical data link configuration management method. The Delphi method is discussed and a means of incorporating it into configuration management is introduced. The Delphi method allows a systematic gathering of subjective information from selected respondents, which then enables the formulation of a group position. Use of this method would enable subjective assessments, such as the perceived operational impact of tactical data link changes, to be systematically considered in Navy tactical data link configuration management decisions.
4

Radhakrishnan, Rahul Lal. "A Method to Improve the Security of Information Diffusion in Complex Networks— Node Trust Value Management Mechanism." Thesis, Högskolan i Halmstad, Akademin för informationsteknologi, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-42447.

Abstract:
In a sensing-field placement of nodes, communication flows from the data acquisition points to the control center, which receives the acquired data. This project uses an algorithm that dynamically updates the trust values of data sense points; once the trust levels are determined, it forwards data through the sense points with the highest trust values. The NTTUA algorithm is then compared with a Bayesian trust method, in which the path with the highest Bayesian trust is chosen. The comparison between NTTUA and the Bayesian method covers multiple parameters and shows good performance, better residual energy, and higher throughput.
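The abstract leaves the NTTUA update rule unspecified; the sketch below only illustrates the two ideas it names, dynamically updated trust values and forwarding through the highest-trust node. Names and the smoothing rule are assumptions, not the thesis's actual algorithm:

```python
import random

class Node:
    def __init__(self, name, trust=0.5):
        self.name = name
        self.trust = trust                      # trust value in [0, 1]

    def update_trust(self, success, alpha=0.1):
        # Assumed exponential-smoothing rule (illustrative only):
        # successful forwards raise trust, failures lower it.
        self.trust = (1 - alpha) * self.trust + alpha * (1.0 if success else 0.0)

def pick_forwarder(neighbors):
    # Forward through the data sense point with the highest current trust.
    return max(neighbors, key=lambda n: n.trust)

random.seed(0)
neighbors = [Node("A"), Node("B"), Node("C")]
for _ in range(50):                             # observe forwarding outcomes
    for n in neighbors:
        success = (n.name != "C") or (random.random() < 0.2)  # C misbehaves
        n.update_trust(success)
print({n.name: round(n.trust, 2) for n in neighbors})
print("chosen forwarder:", pick_forwarder(neighbors).name)
```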
5

Nilsson, Erik, and Goidaragh Safiyeh Alizadeh. "Kundnyttan av Tekla Structures som verktyg i broprojektering" [The customer value of Tekla Structures as a tool in bridge design]. Thesis, KTH, Byggteknik och design, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-125749.

Abstract:
The building industry is going through a transitional stage, from 2D to 3D. BIM, as a working method, and 3D models are becoming more common. In bridge projects, however, the demand for 3D models is low. At WSP in Stockholm the 3D modelling program Tekla Structures is widely used in several departments, and there is a desire to expand its use to bridge design. To make that possible, greater demand from the company's customers is required. The purpose of this thesis is to evaluate Tekla Structures as a tool from a customer perspective and to analyse the value customers derive from Tekla models. To shed light on this issue, interviews were conducted with different customers of WSP. The customers' needs and their views on BIM and Tekla Structures in bridge projects are mapped out and analysed. Aspects raised by the customers in the interviews served as a basis for analysing a 3D model built in Tekla Structures, as well as two already designed bridges modelled in the same program. The analysis shows that the customer value of using Tekla models in bridge projects is high, but that managing the models requires considerable effort. An evaluation of the software also shows its strengths and weaknesses. The results are presented in the following pages.
6

Niesel, Christoph Ryo. "Older workers' adaptation to information technologies in the workplace: A study in the context of non-standard employment." Thesis, Queensland University of Technology, 2021. https://eprints.qut.edu.au/212786/1/Christoph_Niesel_Thesis.pdf.

Abstract:
Growing diversification of working arrangements, greater labour decentralisation and increasing reliance on often changing workplace information technologies (ITs) are turning many older workers to Non-Standard Employment (NSE). This study therefore sought to explore the motivations for participation in and IT adaptation behaviours of older workers in NSE. Using qualitative methods, and an Expectancy-Value-Cost theoretical perspective, factors pertaining to the NSE context were found to drive specific adaptation expectancies, values and costs, which led to problem-focused and emotion-focused strategies for older workers dealing with IT adaptation. Meanwhile, financial stability, flexibility, continued activity, socialisation, and maintaining self-identity were motivators for NSE participation.
7

Au, Manix. "Automatic State Construction using Decision Trees for Reinforcement Learning Agents." Thesis, Queensland University of Technology, 2005. https://eprints.qut.edu.au/15965/1/Manix_Au_Thesis.pdf.

Abstract:
Reinforcement Learning (RL) is a learning framework in which an agent learns a policy from continual interaction with the environment. A policy is a mapping from states to actions. The agent receives rewards as feedback on the actions performed. The objective of RL is to design autonomous agents to search for the policy that maximizes the expectation of the cumulative reward. When the environment is partially observable, the agent cannot determine the states with certainty. These states are called hidden in the literature. An agent that relies exclusively on the current observations will not always find the optimal policy. For example, a mobile robot needs to remember the number of doors it has gone by in order to reach a specific door down a corridor of identical doors. To overcome the problem of partial observability, an agent uses both current and past (memory) observations to construct an internal state representation, which is treated as an abstraction of the environment. This research focuses on how features of past events are extracted with variable granularity for the internal state construction. The project introduces a new method that applies information theory and decision tree techniques to derive a tree structure, which represents the state and the policy. The relevance of a candidate feature is assessed by the Information Gain Ratio ranking with respect to the cumulative expected reward. Experiments carried out on three different RL tasks have shown that our variant of the U-Tree (McCallum, 1995) produces a more robust state representation and faster learning. This better performance can be explained by the fact that the Information Gain Ratio exhibits a lower variance in return prediction than the Kolmogorov-Smirnov statistical test used in the original U-Tree algorithm.
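The relevance test named here, the Information Gain Ratio, can be sketched compactly. Below is a minimal C4.5-style computation over discretized returns; it is illustrative only, and the fringe-expansion machinery of the U-Tree variant around it is not reproduced:

```python
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gain_ratio(feature_values, labels):
    """Gain ratio of a candidate feature w.r.t. discretized return labels."""
    n = len(labels)
    base = entropy(labels)
    cond, split_info = 0.0, 0.0
    for v in set(feature_values):
        sub = [y for x, y in zip(feature_values, labels) if x == v]
        w = len(sub) / n
        cond += w * entropy(sub)          # conditional entropy after the split
        split_info -= w * math.log2(w)    # penalizes many-valued features
    return (base - cond) / split_info if split_info > 0 else 0.0

# Toy example: returns discretized to 'low'/'high'; the feature splits them well.
feature = [0, 0, 0, 1, 1, 1]
returns = ["low", "low", "high", "high", "high", "high"]
print(gain_ratio(feature, returns))
```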
8

Au, Manix. "Automatic State Construction using Decision Trees for Reinforcement Learning Agents." Queensland University of Technology, 2005. http://eprints.qut.edu.au/15965/.

9

Mac, Dermed Liam Charles. "Value methods for efficiently solving stochastic games of complete and incomplete information." Diss., Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/50270.

Abstract:
Multi-agent reinforcement learning (MARL) poses the same planning problem as traditional reinforcement learning (RL): What actions over time should an agent take in order to maximize its rewards? MARL tackles a challenging set of problems that can be better understood by modeling them as having a relatively simple environment but with complex dynamics attributed to the presence of other agents who are also attempting to maximize their rewards. A great wealth of research has developed around specific subsets of this problem, most notably when the rewards for each agent are either the same or directly opposite each other. However, there has been relatively little progress made for the general problem. This thesis addresses that gap. Our goal is to tackle the most general, least restrictive class of MARL problems. These are general-sum, non-deterministic, infinite horizon, multi-agent sequential decision problems of complete and incomplete information. Towards this goal, we engage in two complementary endeavors: the creation of tractable models and the construction of efficient algorithms to solve these models. We tackle three well known models: stochastic games, decentralized partially observable Markov decision problems, and partially observable stochastic games. We also present a new fourth model, Markov games of incomplete information, to help solve the partially observable models. For stochastic games and decentralized partially observable Markov decision problems, we develop novel and efficient value iteration algorithms to solve for game theoretic solutions. We empirically evaluate these algorithms on a range of problems, including well known benchmarks, and show that our value iteration algorithms perform better than current policy iteration algorithms. Finally, we argue that our approach is easily extendable to new models and solution concepts, thus providing a foundation for a new class of multi-agent value iteration algorithms.
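As a flavor of value iteration in stochastic games, here is a sketch of the classical zero-sum two-player backup, where each state's update solves a matrix game by linear programming. The thesis targets the far harder general-sum and partially observable settings, so this is only the textbook baseline, with assumed data structures:

```python
import numpy as np
from scipy.optimize import linprog

def solve_matrix_game(A):
    """Maximin value and strategy for the row player of a zero-sum matrix game."""
    n, m = A.shape
    c = np.zeros(n + 1); c[-1] = -1.0          # variables: x_1..x_n, v; maximize v
    A_ub = np.hstack([-A.T, np.ones((m, 1))])  # v - sum_i x_i A[i,j] <= 0 for all j
    b_ub = np.zeros(m)
    A_eq = np.zeros((1, n + 1)); A_eq[0, :n] = 1.0   # strategy sums to 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * n + [(None, None)])
    return res.x[-1], res.x[:n]

def value_iteration(R, P, gamma=0.9, iters=100):
    """R[s]: row player's payoff matrix in state s; P[s][a1][a2]: next-state dist."""
    S = len(R)
    V = np.zeros(S)
    for _ in range(iters):
        V_new = np.empty(S)
        for s in range(S):
            n, m = R[s].shape
            Q = np.array([[R[s][i, j] + gamma * P[s][i][j] @ V
                           for j in range(m)] for i in range(n)])
            V_new[s], _ = solve_matrix_game(Q)   # Bellman backup via an LP
        V = V_new
    return V

# Toy check: one state, matching-pennies payoffs, self-loop transitions.
R = [np.array([[1.0, -1.0], [-1.0, 1.0]])]
P = [[[np.array([1.0]), np.array([1.0])], [np.array([1.0]), np.array([1.0])]]]
print(value_iteration(R, P))   # ~0, the value of matching pennies
```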
10

Buchanan, Aeron Morgan. "Tracking non-rigid objects in video." Thesis, University of Oxford, 2008. http://ora.ox.ac.uk/objects/uuid:82efb277-abc9-4725-9506-5d114a83bd96.

Abstract:
Video is a sequence of 2D images of the 3D world generated by a camera. As the camera moves relative to the real scene and elements of that scene themselves move, correlated frame-to-frame changes in the video images are induced. Humans easily identify such changes as scene motion and can readily assess attempts to quantify it. For a machine, the identification of the 2D frame-to-frame motion is difficult. This problem is addressed by the computer vision process of tracking. Tracking underpins the solution to the problem of augmenting general video sequences with artificial imagery, a staple task in the visual effects industry. The problem is difficult because tracking in general video sequences is complicated by the presence of non-rigid motion, repeated texture and arbitrary occlusions. Existing methods provide solutions that rely on imposing limitations on the scenes that can be processed or that rely on human artistry and hard work. I introduce new paradigms, frameworks and algorithms for overcoming the challenges of processing general video and thus provide solutions that fill the gap between the 'automated' and 'manual' approaches. The work is easily sectioned into three parts, which can be considered separately or taken together for dealing with video without limitations. The initial focus is on directly addressing practical issues of human interaction in the tracking process: a new solution is developed by explicitly incorporating the user into an interactive algorithm. It is a novel tracking system based on fast full-frame patch searching and high-speed optimal track determination. This approach makes only minimal assumptions about motion and appearance, making it suitable for the widest variety of input video. I detail an implementation of the new system using k-d trees and dynamic programming. The second distinct contribution is an important extension to tracking algorithms in general. It can be noted that existing tracking algorithms occupy a spectrum in their use of global motion information. Local methods are easily confused by occlusions, repeated texture and image noise. Global motion models offer strong predictions to see through these difficulties and have been used in restricted circumstances, but are defeated by scenes containing independently moving objects or modest levels of non-rigid motion. I present a well-principled way of combining local and global models to improve tracking, especially in these highly problematic cases. By viewing rank-constrained tracking as a probabilistic model of 2D tracks instead of 3D motion, I show how one can obtain a robust motion prior that can be easily incorporated in any existing tracking algorithm. The development of the global motion prior is based on rank-constrained factorization of measurement matrices. A common difficulty comes from the frequent occurrence of occlusions in video, which means that the relevant matrices are often not complete due to missing data. This defeats standard factorization algorithms. To fully explain and understand the algorithmic complexities of factorization in this practical context, I present a common notation for the direct comparison of existing algorithms and propose a new family of hybrid approaches that combine the superb initial performance of alternation methods with the convergence power of the Newton algorithm. Together, these investigations provide a wide-ranging, yet coherent exploration of tracking non-rigid objects in video.
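The rank-constrained factorization with missing data discussed here is concrete enough to sketch. Below is a minimal alternation baseline of the kind the proposed hybrid methods start from; the Newton-style refinement the thesis combines it with is not shown, and all names are illustrative:

```python
import numpy as np

def factorize_missing(M, W, r, iters=100, lam=1e-6):
    """Rank-r factorization M ~ A @ B under mask W (1 = observed) by alternation.
    Each half-step is a closed-form ridge-regularized least-squares solve."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    A = rng.standard_normal((m, r))
    B = rng.standard_normal((r, n))
    I = lam * np.eye(r)
    for _ in range(iters):
        for i in range(m):               # update row i of A from observed entries
            cols = W[i] > 0
            Bi = B[:, cols]
            A[i] = np.linalg.solve(Bi @ Bi.T + I, Bi @ M[i, cols])
        for j in range(n):               # update column j of B symmetrically
            rows = W[:, j] > 0
            Aj = A[rows]
            B[:, j] = np.linalg.solve(Aj.T @ Aj + I, Aj.T @ M[rows, j])
    return A, B

# Toy check: recover a rank-2 matrix with ~30% of its entries missing.
gen = np.random.default_rng(1)
M = gen.standard_normal((20, 2)) @ gen.standard_normal((2, 15))
W = (gen.random(M.shape) < 0.7).astype(float)
A, B = factorize_missing(M, W, r=2)
print(np.abs((A @ B - M) * W).max())     # small residual on observed entries
```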
11

Moy, Mae. "Evaluating Federal Information Technology Program Success Based on Earned Value Management." ScholarWorks, 2016. https://scholarworks.waldenu.edu/dissertations/2075.

Abstract:
Despite the use of earned value management (EVM) techniques to track development progress, federal information technology (IT) software programs continue to fail by not meeting identified business requirements. The purpose of this logistic regression study was to examine, using IT software data from federal agencies from 2011 to 2014, whether schedule variance (SV), cost variance (CV), and actual cost (AC) could predict the success of an IT software program, as operationalized by meeting the identified business requirements. The population of interest was 132 IT software programs developed between 2011 and 2014 for federal agencies. The sample source was an archival database located at ITdashboard.gov. The theoretical framework for the study was earned value (EV) project management theory. EV project management theory is a project performance measurement system that integrates cost, schedule, and performance elements for planning and control. EVM contributes to project success by providing early warnings when programs deviate from cost and schedule plans. This study found that only SV was significant (SV days, p = .002). The null hypothesis was rejected, suggesting that a relationship exists between IT program success and SV, CV, and AC. This study may contribute to social change by increasing program managers' understanding of EV in federal project management and by decreasing federal spending through successful programs and more cost-efficient use of taxpayers' money.
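For readers unfamiliar with EVM, the predictors used in the study are simple functions of planned value (PV), earned value (EV), and actual cost (AC). A quick illustration with made-up figures:

```python
def earned_value_metrics(pv, ev, ac):
    """Standard EVM measures; positive variances are favorable."""
    return {
        "SV": ev - pv,      # schedule variance = EV - PV
        "CV": ev - ac,      # cost variance     = EV - AC
        "SPI": ev / pv,     # schedule performance index
        "CPI": ev / ac,     # cost performance index
    }

# Hypothetical program: $100k of work planned, $80k earned, $95k spent.
print(earned_value_metrics(pv=100_000, ev=80_000, ac=95_000))
# -> SV = -20000 (behind schedule), CV = -15000 (over budget)
```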
12

Ezingeard, Jean-Noel. "Heuristic methods to aid value assessment in the management of Manufacturing Information and Data Systems." Thesis, Brunel University, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.336661.

13

Beyene, Mussie Abraham. "Modelling the Resilience of Offshore Renewable Energy System Using Non-constant Failure Rates." Thesis, Uppsala universitet, Institutionen för elektroteknik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-445650.

Abstract:
Offshore renewable energy systems, such as Wave Energy Converters or offshore wind turbines, must be designed to withstand extremes of the weather environment. For this, it is crucial to have a good understanding both of the wave and wind climate at the intended offshore site and of the system's reaction and possible failures under different weather scenarios. Based on these considerations, the first objective of this thesis was to model and identify the extreme wind speed and significant wave height at an offshore site, based on measured wave and wind data. The extreme wind speeds and wave heights were characterized as return values after 10, 25, 50, and 100 years, using the Generalized Extreme Value method. Based on a literature review, fragility curves for wave and wind energy systems were identified as functions of significant wave height and wind speed. For a wave energy system, a varying failure rate as a function of wave height was obtained from the fragility curves and used to model the resilience of a wave energy farm as a function of the wave climate. The cases of non-constant and constant failure rates were compared, and the non-constant failure rate was found to have a high impact on the farm's resilience: when a non-constant failure rate as a function of wave height was applied, the number of Wave Energy Converters available in the farm and the energy absorbed by the farm were nearly zero. The non-constant failure rate was also compared with a constant rate set to its average, and investigating the resilience of the wave energy farm with this averaged constant failure rate resulted in better resilience. Based on the findings of this thesis, it is therefore recommended to identify and characterize offshore extreme weather climates, to maintain a high repair rate, and to use repair vessels with a high operating threshold so as to withstand the harsh offshore weather environment.
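The return-value characterization is standard extreme value practice: fit a GEV distribution to block maxima and read off the upper quantiles. A small sketch with scipy, using synthetic stand-in data rather than the thesis's measured wave and wind series:

```python
from scipy.stats import genextreme

# Synthetic stand-in for 40 years of annual maxima of significant wave height (m)
annual_maxima = genextreme.rvs(c=-0.1, loc=6.0, scale=1.2, size=40, random_state=0)

c, loc, scale = genextreme.fit(annual_maxima)   # maximum-likelihood GEV fit
for T in (10, 25, 50, 100):
    # T-year return level: the quantile exceeded on average once every T years
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.2f} m")
```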
14

Sadatsafavi, Mohsen. "Advancing the methods and accessibility of cost-effectiveness and value of information analyses in health care." Thesis, University of British Columbia, 2012. http://hdl.handle.net/2429/40867.

Abstract:
This thesis comprises three methodological advancements that address important issues related to cost-effectiveness analysis (CEA) and expected value of information (EVI) analysis in health technology assessment. Aims: 1) To develop a practical sampling scheme for the incorporation of external evidence in CEAs conducted alongside randomized controlled trials (RCT); 2) To develop non-parametric methods for the calculation of the expected value of sample information (EVSI) for RCT-based CEAs; 3) To develop a computationally efficient algorithm for the calculation of single-parameter expected value of partial perfect information (EVPPI) for RCT-based and model-based CEAs. The theories and methods laid out in this work are accompanied by real-world CEA and EVI analyses of the Canadian Optimal Therapy of Chronic Obstructive Pulmonary Diseases (OPTIMAL) trial, a RCT on combination pharmaceutical therapies in chronic obstructive pulmonary diseases (COPD). Results: 1) The ‘vetted bootstrap’ is a semi-parametric algorithm based on rejection sampling and bootstrapping that allows the incorporation of external evidence into RCT-based CEAs. Implementing this method to incorporate external information on the effect size of treatment in the OPTIMAL trial required only minor modifications to the original CEA algorithm. 2) A Bayesian interpretation of the bootstrap allows non-parametric calculation of EVSI through two-level resampling. In the case study, incorporation of missing value imputation and adjustment for covariate imbalance in EVI calculations generated EVSI and the expected value of perfect information (EVPI) values that were significantly different than those calculated conventionally, demonstrating the flexibility of this method and the potential impact of modeling such aspects of the analysis on EVI calculations. 3) The new method enabled the calculation of EVPPI for the effect size of treatment for the exemplary RCT data, and showed a significant (up to 25 times in terms of root-mean-squared error) improvement in efficiency compared to the conventional EVPPI calculation methods in a series of simulations. Summary: This thesis provides several original advancements in the methodology of the CEA and EVI analysis of RCTs and enables several analytical approaches that have hitherto been available only through parametric modeling of RCT data.
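The 'vetted bootstrap' is the thesis's own contribution and its exact weighting is not reproduced here; the sketch below only combines the two named ingredients, bootstrapping the trial and rejection sampling against an external prior on the effect size, with invented variable names and a simplistic acceptance rule:

```python
import numpy as np
from scipy.stats import norm

def vetted_bootstrap(d_costs, d_effects, prior_mean, prior_sd, n_rep=2000, seed=1):
    """Schematic vetting: bootstrap the trial, then accept each replicate with
    probability proportional to an external prior's density at the replicate's
    effect estimate (a rejection-sampling step)."""
    rng = np.random.default_rng(seed)
    n = len(d_effects)
    p_max = norm.pdf(prior_mean, prior_mean, prior_sd)   # densest point of prior
    accepted = []
    while len(accepted) < n_rep:
        idx = rng.integers(0, n, n)              # one bootstrap resample
        eff = d_effects[idx].mean()              # replicate's effect estimate
        if rng.random() < norm.pdf(eff, prior_mean, prior_sd) / p_max:
            accepted.append((d_costs[idx].mean(), eff))
    return np.array(accepted)

# Hypothetical per-patient incremental costs/effects from a two-arm trial
gen = np.random.default_rng(0)
d_costs = gen.normal(500.0, 150.0, 300)
d_effects = gen.normal(0.05, 0.40, 300)
reps = vetted_bootstrap(d_costs, d_effects, prior_mean=0.10, prior_sd=0.05)
print(reps.mean(axis=0))   # bootstrap cloud pulled toward the external prior
```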
15

Högberg, Michel, and Paulina Persson. "PaKS : Ramverk för prioritering av krav i systemutvecklingsprojekt" [PaKS: A framework for prioritizing requirements in system development projects]. Thesis, KTH, Programvaruteknik och Datorsystem, SCS, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-220124.

Abstract:
Prioritizing system requirements is an important issue for utilizing the resources of system development departments. Problems arise when the number of requirements to prioritize exceeds the system development resources available. The system requirements that a company wants carried out by its system development department must therefore be prioritized in a structured way. To make prioritization structured, support is needed in the form of regulations, frameworks, and methods; today, no framework provides this support. This thesis explores which components a framework should consist of to support the prioritization of system requirements. The aim is to create a framework that introduces structured decision support with prioritization methods. A well-designed research strategy containing three research phases is applied: exploration, design, and evaluation. The research phases guide the work in the right direction and counteract validity threats that may arise. The research method is of a qualitative and inductive nature. Facts were obtained through literature studies, interviews, and a preliminary study, from whose analysis a first version of the framework's components was designed. Evaluation interviews were conducted with respondents who have many years of experience of prioritization in system development. The interviews show that companies, regardless of industry, have similar conditions and structures for applying the framework. The first version of the framework was validated as relevant, appropriate, and functional for prioritizing system requirements, with minor adjustments. Following an analysis of the evaluation, a final version of the framework for prioritizing requirements in system development projects, called PaKS, was created. The outcome of the evaluation shows that PaKS is appropriate, complete, and useful in its overall design, with respondents contributing proposals for further refinement.
16

Витвицька, О. М. "Економічна оцінка інформаційного капіталу нафтогазовидобувних підприємств" [Economic evaluation of the information capital of oil and gas production enterprises]. Thesis, Івано-Франківський національний технічний університет нафти і газу, 2013. http://elar.nung.edu.ua/handle/123456789/4436.

Abstract:
The dissertation examines the role of information and information resources at the current stage of society's development, investigates the essence of the economic category "information capital", singles out its characteristics, and determines the role of information capital in the management of oil and gas production enterprises. As a tool for improving enterprise management, an economic evaluation of information capital is proposed, comprising a cost-based and a qualitative evaluation of information as well as an evaluation of the efficiency of the enterprise's use of information capital. Theoretical approaches to the economic evaluation of information capital are considered, and the peculiarities and possibilities of their application to oil and gas production enterprises are determined. The internal and external environment of oil and gas production enterprises is analyzed, and on this basis a model of their information environment with three components is proposed: the external micro-information environment, the external macro-information environment, and the internal information environment. A system of characteristics and indicators of this environment is formed that accounts for the particularities of these enterprises' operations, and the enterprises' existing information systems and directions for their improvement are reviewed. Methodical approaches to the cost estimation of information are developed using the method of economic benefits, the entropy function, and a function of average risk; the entropy-based approach accounts for the influence of the completeness of information on its value. Qualitative characteristics of information are analyzed and a system of indicators for the quantitative measurement of information quality is formed. Approaches to evaluating the efficiency of information capital use are considered and improved, and a mechanism for managing the information capital of oil and gas production enterprises is proposed.
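The entropy-based costing is only summarized above. As an illustrative assumption, not the author's stated model, the idea that the completeness of information affects its value can be formalized by tying value to the entropy reduction the information delivers, with an average-risk function capturing the residual expected loss:

```latex
% Uncertainty before/after acquiring the information set I, and its value
H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i , \qquad
V(I) \;\propto\; \Delta H = H_{\text{before}} - H_{\text{after}} , \qquad
\bar{R}(I) = \sum_{i=1}^{n} p_i \, L_i
```

Here $p_i$ are outcome probabilities, $L_i$ the associated losses, and $\bar{R}$ the average-risk function evaluated on the uncertainty remaining after the information is obtained.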
17

Groll, Pavel. "Vazby podnikových metod řízení a podnikových informačních systémů" [Ties between enterprise management methods and enterprise information systems]. Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-82011.

Abstract:
The thesis deals with enterprise management methods and the enterprise information system; its main goal is to depict the ties between them. The other goals are derived from the main one: structuring the management methods and evaluating their appropriateness for managing IT in small and medium enterprises. The first step in thoroughly mapping the ties is structuring the methods into clearly defined approaches. Criteria gained from a systematic approach to the enterprise are used for the analysis of the management methods; the analysis covers widely used management methods that have an impact on operational activity and a distinctly innovative character. In the next chapter, the basic ties of these approaches to the business informatics department, the information system, and the methods of business informatics are established, and the outputs relevant to business informatics management are extracted. The main contribution of the thesis lies in the systematic approach to management methods, which enables further research into their ties to the information system and its management. A second, more concrete contribution lies in the basic definition of the conclusions for the needs of business informatics.
18

Kransell, Martin. "The Value of Data Regarding Traceable Attributes in a New Era of Agriculture : Bridging the Information Gap Between Consumers and Producers of Organic Meat." Thesis, Linnéuniversitetet, Institutionen för informatik (IK), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-35089.

Abstract:
Purpose – This study aims to explore, and suggest solutions to, the gap between the supply of information from organic meat producers and the demand of information from consumers regarding traceable characteristics (attributes) of meat in a limited geographical area, in order to maximize the utilization and value of collected data.
Design/methodology/approach – A mixed methods research design is applied to collect both quantitative data from consumers and qualitative data from suppliers to produce empirical results of the supply and demand of information. A theoretical framework of organic food purchase intent is used for the quantitative study, as well as the correlation between consumers' perceived importance of attributes and their willingness-to-pay for meat. The results of the empirical studies are compared to each other in an effort to expose a possible gap using a gap analysis.
Findings – Meat is shifting from a price-based commodity to a product based on characteristics. This study reveals that there is now a gap between the information made available by organic meat producers and the demand of information from consumers that needs to be recognized in order to maximize the value of collected data. Information regarding the environmental impact of raising and transporting the animals is not extensively collected. A substantial amount of data about attributes of perceived importance, such as safety and handling, animal welfare, and medication or other treatments, is collected but not extensively shared with consumers.
Research limitations/implications – The small sample size in a unique area and the scope of the survey data do not provide a result that can be truly generalized. It is therefore suggested that future studies produce results from a larger sample that incorporates the perceived accessibility of important information for consumers.
Practical implications – This contributes to the emerging literature of organic food production by comparing both the supply and the demand of information regarding attributes of meat. This information is valuable to organic meat producers and marketers as well as developers of agricultural systems and databases, which should shift their focus to consumer-oriented traceability systems.
Originality/value – This study goes beyond the substantial body of literature regarding attributes of organic food and consumer preferences by comparing these factors to the available supply of information by meat producers and by suggesting solutions to bridge the gap between them.
Keywords – Organic meat, Organic agriculture, e-Agriculture, Traceability, Traceability systems, Consumer oriented, Consumer behavior, Willingness-to-pay, Supply and demand, Information gap, Gap analysis, Business development, United States of America, Sense-making theory, Mixed methods
Paper type – Research paper, Bachelor's thesis
19

Assareh, Hassan. "Bayesian hierarchical models in statistical quality control methods to improve healthcare in hospitals." Thesis, Queensland University of Technology, 2012. https://eprints.qut.edu.au/53342/1/Hassan_Assareh_Thesis.pdf.

Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and continuous improvement philosophy, and are now applied widely to improve the quality of products and services in industrial and business sectors. Recently SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore analysing and comparing paradigms and the characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects: a framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flows, and a data capturing algorithm using Bayesian decision making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle. Once clinical data quality is ensured, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from an industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, the adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause efforts within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal, and it yields highly informative, probability-distribution-based estimates of the change point parameters. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and the magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.
The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared against a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then extended to healthcare surveillance for processes in which pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix, via covariates, through the risk models underlying risk-adjusted control charts; variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention monitored by risk-adjusted survival time control charts; here the survival time is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component and the empirical results highly recommend the developed Bayesian estimators as a strong alternative for change point estimation within the quality improvement cycle, in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed model are also considered, and the advantages of the Bayesian approach seen in the general context of quality control may also be extended to the industrial and business domains where quality monitoring was initially developed.
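For the simplest of the change scenarios studied, a single step change in a Poisson rate, Gamma-Poisson conjugacy gives the posterior over the change time in closed form. A toy sketch of that case follows; the thesis's MCMC-based, risk-adjusted estimators go much further:

```python
import numpy as np
from scipy.special import gammaln

def poisson_changepoint_posterior(y, a=1.0, b=1.0):
    """Posterior over the change time tau of a Poisson count series, with the
    two segment rates integrated out under Gamma(a, b) priors."""
    y = np.asarray(y, dtype=float)
    n = len(y)

    def log_marginal(seg):
        s, m = seg.sum(), len(seg)
        # log of the Gamma-Poisson marginal likelihood, dropping y!-terms
        # (they are constant across all candidate split points).
        return (gammaln(a + s) - gammaln(a)
                + a * np.log(b) - (a + s) * np.log(b + m))

    log_post = np.array([log_marginal(y[:t]) + log_marginal(y[t:])
                         for t in range(1, n)])      # uniform prior on tau
    log_post -= log_post.max()
    post = np.exp(log_post)
    return post / post.sum()                         # posterior over tau = 1..n-1

counts = [4, 5, 3, 6, 4, 5, 9, 11, 10, 12, 9]        # toy in-control then shifted
post = poisson_changepoint_posterior(counts)
print("MAP change point index:", 1 + int(post.argmax()))   # -> 6
```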
20

Alaofin, Babatunde Ayodele. "The Value of Diagnostic Software and Doctors' Decision Making." ScholarWorks, 2015. https://scholarworks.waldenu.edu/dissertations/344.

Abstract:
The prevalence of medical misdiagnosis has remained high despite the adoption of diagnostic software. The ongoing controversy about the role of technology in mitigating the problem of misdiagnosis centers on the question of whether diagnostic software reduces the incidence of misdiagnosis if properly relied upon by physicians. The purpose of this quantitative, cross-sectional study based on planned behavior theory was to measure doctors' opinions of diagnostic technology's medical utility. Recruitment e-mails were sent to 3,100 AMA-accredited physicians through the AMA database, yielding a sample of 99 physicians for the study. One-sample t tests and, where appropriate because of non-normal data, one-sample Wilcoxon signed-rank tests were conducted on the data to address the following key research questions: whether diagnostic software decreases misdiagnosis in healthcare versus unassisted human diagnostic methods, whether physicians use diagnostic software frequently enough to decrease misdiagnosis in healthcare, and whether liability concerns prevent physicians from using diagnostic software. It was found that, in the opinion of those surveyed, (a) diagnostic software was likely to result in fewer misdiagnoses in healthcare than unassisted human diagnostic methods, (b) when speaking for themselves, physicians thought they used diagnostic software frequently enough to decrease misdiagnoses, and (c) physicians agreed they were not prevented from using diagnostic software by liability concerns. The study's social significance is the affirmation of diagnostic software's usefulness: policy and technology stakeholders can use this finding to speed the adoption of diagnostic software, leading to a reduction in the socially costly problem of misdiagnosis.
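The testing pipeline described, one-sample t tests with a Wilcoxon signed-rank fallback for non-normal data, is straightforward to outline. A sketch with hypothetical Likert-scale responses (data and the neutral midpoint of 3 are assumptions, not the study's data):

```python
import numpy as np
from scipy.stats import ttest_1samp, wilcoxon, shapiro

rng = np.random.default_rng(2)
# Hypothetical 5-point Likert responses, tested against the neutral midpoint 3
responses = rng.integers(1, 6, size=99).astype(float)

if shapiro(responses).pvalue > 0.05:          # roughly normal -> one-sample t test
    stat, p = ttest_1samp(responses, popmean=3.0)
else:                                         # non-normal -> signed-rank test
    stat, p = wilcoxon(responses - 3.0)       # zeros are dropped by default
print(stat, p)
```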
21

Ghosh, Suvankar. "Essays on Emerging Practitioner-Relevant Theories and Methods for the Valuation of Technology." Kent State University / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=kent1246573195.

22

Wilfong, Jeffery D. "Organizational culture and information technology (IT) project success and failure factors| A mixed-methods study using the competing values framework and Schein's three levels approach." Thesis, Saybrook Graduate School and Research Center, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=3628342.

Abstract:

The percentage of failure in traditional project management is high, as nearly 70% of projects fail (The Standish Group, 2009). Unsuccessful projects impact businesses, customers, and society in sizable ways.

Project success and failure research fits into two categories: (a) project management methodological issues and (b) leadership and organizational behavior issues. Most research focuses on the former. This research addressed the latter, specifically examining Information Technology (IT) project workers who reside in the United States.

The central research question was, What is the optimal organizational culture for IT project teams such that success factors are enhanced and failure factors are lessened? A mixed-methods study was designed and implemented. For Phase One, an internet survey was conducted using Cameron and Quinn's (2006) Competing Values Framework (Organizational Culture Assessment Inventory (OCAI)) and compared to a measure of IT Project Success. For Phase Two, qualitative interviews were carried out using Schein's (2004) Three Levels Model of organizational culture, and then a Thematic Analysis was completed to obtain an optimal culture model.

One hundred forty-one participants completed Phase One. Results showed no significant correlation between the four culture types (Clan, Adhocracy, Market, and Hierarchy) and IT project success. For Phase Two, 15 participants of varying job roles and demographics completed interviews. Applying Thematic Analysis techniques, 175 codes related to leadership and organizational behavior issues were determined, which produced twenty-six themes.

The findings from Phase Two produced a set of interrelated organizational culture factors that IT project workers believed were optimal for project success. The framework was termed the Enlightened Information Technology Project Culture (EITPC™) and comprised four dimensions: (a) organizational behavior/leadership, (b) processes, (c) support, and (d) technology. The results suggest that if managers and consultants implemented this model, or applicable factors, their IT projects would likely have greater success, or lower degrees of failure.

A suggestion for future research is to continue studying the leadership and organizational behavior issues of project teams. Additional research is needed on the Enlightened Information Technology Project Culture (EITPC™) framework to determine whether differing demographics of IT workers and company (or project) types impact the results.

23

Ammari, Ahmad N. "Transforming user data into user value by novel mining techniques for extraction of web content, structure and usage patterns : the development and evaluation of new Web mining methods that enhance information retrieval and improve the understanding of users' Web behavior in websites and social blogs." Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/5269.

Abstract:
The rapid growth of the World Wide Web in the last decade has made it the largest publicly accessible data source in the world and one of the most significant and influential information revolutions of modern times. The influence of the Web has impacted almost every aspect of human life and activity, causing paradigm shifts and transformational changes in business, governance, and education. Moreover, the rapid evolution of Web 2.0 and the Social Web in the past few years, such as social blogs and friendship networking sites, has dramatically transformed the Web from a raw environment for information consumption to a dynamic and rich platform for information production and sharing worldwide. However, this growth and transformation of the Web has resulted in an uncontrollable explosion and abundance of textual content, creating a serious challenge for any user to find and retrieve the relevant information that they truly seek on the Web. The process of finding a relevant Web page in a website easily and efficiently has become very difficult to achieve. This has created many challenges for researchers to develop new mining techniques to improve the user experience on the Web, as well as for organizations to understand the true informational interests and needs of their customers in order to improve their targeted services accordingly by providing the products, services and information that truly match the requirements of every online customer. With these challenges in mind, Web mining aims to extract hidden patterns and discover useful knowledge from Web page contents, Web hyperlinks, and Web usage logs. Based on the primary kinds of Web data used in the mining process, Web mining tasks can be categorized into three main types: Web content mining, which extracts knowledge from Web page contents using text mining techniques; Web structure mining, which extracts patterns from the hyperlinks that represent the structure of the website; and Web usage mining, which mines users' Web navigational patterns from Web server logs that record the Web page accesses made by every user, representing the interactional activities between the users and the Web pages in a website. The main goal of this thesis is to contribute toward addressing the challenges that have resulted from the information explosion and overload on the Web, by proposing and developing novel Web mining-based approaches. Toward achieving this goal, the thesis presents, analyzes, and evaluates three major contributions. First, the development of an integrated Web structure and usage mining approach that recommends a collection of hyperlinks for the surfers of a website to be placed at the homepage of that website. Second, the development of an integrated Web content and usage mining approach to improve the understanding of the user's Web behavior and discover the user group interests in a website. Third, the development of a supervised classification model based on recent Social Web concepts, such as Tag Clouds, in order to improve the retrieval of relevant articles and posts from Web social blogs.
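Of the three mining types enumerated above, Web usage mining is the easiest to illustrate: from sessionized server-log click paths, count page-to-page transitions, the raw material for the homepage hyperlink recommendations developed in the thesis. A toy sketch with invented paths:

```python
from collections import Counter

# Hypothetical, already-sessionized click paths from a Web server log
sessions = [
    ["/home", "/products", "/products/item42", "/cart"],
    ["/home", "/blog", "/blog/post-7"],
    ["/home", "/products", "/cart", "/checkout"],
]

# First step of Web usage mining: count page-to-page transitions, which can
# seed recommendations for hyperlinks to surface on the homepage.
transitions = Counter(
    (a, b) for path in sessions for a, b in zip(path, path[1:])
)
for (src, dst), n in transitions.most_common(3):
    print(f"{src} -> {dst}: {n}")
```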
24

Ammari, Ahmad N. "Transforming user data into user value by novel mining techniques for extraction of web content, structure and usage patterns: The development and evaluation of new Web mining methods that enhance information retrieval and improve the understanding of users' Web behavior in websites and social blogs." Thesis, University of Bradford, 2010. http://hdl.handle.net/10454/5269.

25

André, Étienne. "Les actifs incorporels de l'entreprise en difficulté." Thesis, Lyon, 2018. http://www.theses.fr/2018LYSE3076.

Full text
Abstract:
The transformation of modern economies has profoundly changed the nature of wealth by dematerialising it. Within companies this has meant a growing share of intangible value, which becomes especially apparent when those companies run into difficulty. The notion of intangible assets places value at the centre of concerns and refers to a reality that is both legal and economic. This approach reveals the singularity of these assets in a context of insolvency, observed through valuation and realisation operations. On the one hand, the valuation of intangible assets proves defective, exposing the shortcomings of French accounting, which struggles to capture the value of these assets, and, more broadly, highlighting the limits of the methods used to value them in a distressed context. On the other hand, the realisation of intangible assets is complicated by the methods of sale and the security interests attached to them. The singular nature of intangible assets therefore makes them difficult to master. Solutions can nonetheless be found within the law governing companies in difficulty. A reading grid for intangible assets can already be built around value and its interaction with the business. Some intangible assets, such as software or a client file, are directly correlated with the company's activity and tend to lose value as its difficulties deepen. Other intangible assets, such as receivables and shareholdings, rest on elements external to the company and do not automatically lose value when difficulties arise. Intangible assets can therefore be divided between those whose value is established by reference to the business and those whose value is not directly tied to it. These assets also call for the law governing companies in difficulty to evolve in its valuation and realisation operations so that they can be better apprehended. Taking these developments into account is essential: ignoring the growing importance of intangible assets within distressed companies would risk weakening those companies further and undermining the credibility of the judicial framework for treating companies in difficulty.
APA, Harvard, Vancouver, ISO, and other styles
26

Бебех, Я. С. "Комп’ютерна система пошуку оптимальних параметрів магнітного тахометричного підсилювача." Master's thesis, Сумський державний університет, 2018. http://essuir.sumdu.edu.ua/handle/123456789/72166.

Full text
Abstract:
An algorithm and a computer program for finding the optimal parameters of a magnetic tachometric amplifier have been developed. Their operation was verified on a test example. The program can be applied in the design of such amplifiers.
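The abstract does not specify which search algorithm the program implements. Purely as an illustrative sketch of an optimal-parameter search, the exhaustive grid search below uses invented parameter names (winding turns, control current) and an invented gain function; none of it is taken from the thesis.

```python
import itertools

def gain(turns, control_current):
    """Hypothetical objective: amplifier gain as a function of two
    design parameters (a stand-in for the real magnetic model)."""
    return turns * control_current / (1.0 + 0.01 * turns ** 2)

# Exhaustive grid search over assumed parameter ranges.
best = max(
    itertools.product(range(50, 501, 10),               # winding turns
                      [i / 100 for i in range(5, 51)]),  # control current, A
    key=lambda p: gain(*p),
)
print("best parameters:", best, "gain:", gain(*best))
```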
APA, Harvard, Vancouver, ISO, and other styles
27

Wang, Chuan-Yi, and 王銓億. "Evaluating Value of Information on Debris-Flow Monitoring System- Payment Card Method." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/27992574167684521812.

Full text
Abstract:
Master's thesis
National Chung Hsing University
Department of Applied Economics
95
To reduce the damage caused by debris flows, the Soil and Water Conservation Bureau of the Council of Agriculture established a debris-flow monitoring system comprising 13 fixed and 2 mobile stations. The main purpose of this study is to estimate the benefit of this monitoring system. We applied the Contingent Valuation Method (CVM), using in-person interviews with a payment-card questionnaire. Respondents were residents aged 20 and above of Taichung townships containing potential debris-flow torrents. The results can serve as a reference for policy and academic research. The surveys indicate that residents would pay 342 NT dollars per person per year to maintain the debris-flow monitoring system. Moreover, willingness to pay (WTP) is higher among residents who face higher debris-flow risk, who demand a more precise debris-flow alarm rate, whose WTP changes with different types of disaster, and who obtain debris-flow information from the Internet.
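In a payment-card CVM survey, each respondent selects the highest amount on a card of listed values that they would pay, so the true WTP is known only to lie in an interval. A common simple estimator takes interval midpoints; the sketch below illustrates this with assumed card values and invented responses, neither of which comes from the thesis.

```python
# Hypothetical payment card (NT$ per person per year) and the amount
# each respondent selected as the most they would pay (made-up answers).
card = [0, 100, 200, 300, 500, 800, 1200]
chosen = [200, 300, 100, 500, 300, 0, 800, 300, 200, 500]

def midpoint_wtp(choice, card):
    """True WTP lies between the chosen amount and the next card value;
    approximate it by the interval midpoint (top value caps the interval)."""
    i = card.index(choice)
    upper = card[i + 1] if i + 1 < len(card) else card[-1]
    return (choice + upper) / 2

estimates = [midpoint_wtp(c, card) for c in chosen]
mean_wtp = sum(estimates) / len(estimates)
print(f"estimated mean WTP: NT${mean_wtp:.0f} per person per year")
```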
APA, Harvard, Vancouver, ISO, and other styles
28

Wu, Chiu-hua, and 吳秋華. "The study of the value in using of exercise administration information system on elementary school by MEC method." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/84321089743611571191.

Full text
Abstract:
Master's thesis
University of Kang Ning
Graduate Institute of International Business Administration
101
With the arrival of the knowledge economy, enterprises have gradually realised that sustainable competitive advantage no longer depends only on visible land, labour and capital, but on intangible knowledge capital such as wisdom and innovation, and on its management. Academia, schools at all levels and government agencies likewise feel the trend toward, and the necessity of, knowledge management; the development and innovative management of information technology and of information or work systems has therefore arisen to serve that goal and occupies a very important position. Secondary schools have long been short of manpower for the sports administration work involved in inter-school competitions and athletic games. This study collected relevant information and used information technology, introduced at the Provincial Yuan Lin High School of Commerce and Home Economics, to integrate an information management system for knowledge management and innovation, developing an "Exercise administration information system" shared by 13 secondary schools and two universities. This system is the subject of the study; it aims to relieve the tedious workload and the pressure that school games have long imposed on sports administration and to improve the efficiency with which schools handle games administration. Finally, using the steps of means-end chains and the laddering interview method, interviews and data analysis were conducted with users of this games administration system, linking system attributes, consequences and values to derive the users' hierarchical value map (HVM) and providing the system developers with a reference for revision and improvement. It is expected that the system can improve and transform the management style of sports administration, raise the effectiveness of school games administration, create a culture of high-quality knowledge sharing, and encourage sports teachers to become knowledge workers and innovators, thereby promoting the "vitality" and "competitiveness" of school sports. Key words: Exercise administration information system, Means-end chains, Laddering interview method, Hierarchical value map, use value.
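Means-end-chain analysis of this kind typically aggregates laddering interviews into an implication matrix (how often element A leads directly to element B) and keeps links above a cutoff level to draw the hierarchical value map. The sketch below illustrates that aggregation step with invented ladders and an assumed cutoff of 2; it is not the thesis's data or code.

```python
from collections import Counter

# Each ladder is an attribute -> consequence -> value chain elicited
# from one laddering interview (invented examples, not thesis data).
ladders = [
    ["online entry", "less paperwork", "work efficiency"],
    ["online entry", "less paperwork", "job satisfaction"],
    ["shared database", "fewer errors", "work efficiency"],
    ["online entry", "less paperwork", "work efficiency"],
    ["shared database", "less paperwork", "work efficiency"],
]

# Implication matrix: count direct links between adjacent elements.
links = Counter()
for ladder in ladders:
    for a, b in zip(ladder, ladder[1:]):
        links[(a, b)] += 1

CUTOFF = 2  # assumed cutoff level for drawing the HVM
hvm_edges = [(a, b, n) for (a, b), n in links.items() if n >= CUTOFF]
for a, b, n in sorted(hvm_edges, key=lambda e: -e[2]):
    print(f"{a} -> {b}  (mentioned {n} times)")
```

Links surviving the cutoff become the arrows of the HVM, with attributes at the bottom, consequences in the middle and end values at the top.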
APA, Harvard, Vancouver, ISO, and other styles
29

Grobler, Chris Daniel. "A strategic theoretical framework to safeguard business value for information systems." Thesis, 2017. http://hdl.handle.net/10500/24724.

Full text
Abstract:
The phenomenon of business value dissipation in mature organisations as an unintended by-product of the adoption and use of information systems has been a highly debated topic in the corporate boardroom, awakening the interest of practitioners and academics alike. Much of the discourse tends to focus on the inability of organisations to unlock and realise the intended benefits of large information systems investments. While the business case for investing in large technology programmes has been thoroughly investigated, the human agent who causes value erosion through his interaction with information systems (IS) has not received the studied attention it deserves. This study examines the use of technology in organisations by considering the dichotomy inherent in IS: introduced for the purpose of creating new or sustaining existing business value, IS subsequently also, inadvertently, dissipate value. The study investigates the root people-induced causes of this unintentional dissipation of value and presents an empirically validated model suggesting that human agents not only create value for organisations through their use of IS but at the same time, deliberately or inadvertently, dissipate value. These root people-induced causes are delineated within a Theoretical Technology Value Framework, constructed from a review of the extant literature, which sets out the overall unintentional value-destroying causes and effects of IS on organisations. The Theoretical Technology Value Framework is then applied as the basis for a set of questions supporting both qualitative and quantitative investigations, from which an Archetypical Technology Value Model is derived. Finally, the Archetypical Technology Value Model is presented as a benchmark and basis to identify, investigate, mitigate and minimise or eliminate the unintentional value-destroying effects of IS on Information Technology driven organisations. The study concludes with implications for both theory and practice and suggestions on how value erosion through the activities of the human agent may be identified, modelled and mitigated. Ultimately, recommendations are offered toward the crafting of more effective IS.
School of Computing
Ph. D. (Information Systems)
APA, Harvard, Vancouver, ISO, and other styles
30

Xu, Jin. "Essays in Financial Econometric Investigations of Farmland Valuations." Thesis, 2013. http://hdl.handle.net/1969.1/150974.

Full text
Abstract:
This dissertation consists of three essays in which tools of financial econometrics are used to study three aspects of the farmland valuation puzzle: short-term boom-bust cycles, overpricing of farmland, and the inconclusive effects of direct government payments. Essay I addresses the causes of unexplained short-term boom-bust cycles in farmland values in a dynamic land pricing model (DLPM). The analysis finds that the gross return rate of the farmland asset decreases as the farmland asset level increases, and that this diminishing return function contributes to the boom-bust cycles in farmland values. Furthermore, it is mathematically proved that land values are potentially unstable under diminishing return functions. We also find that the intertemporal elasticity of substitution, risk aversion, and transaction costs are important determinants of farmland asset values. Essay II examines the apparent overpricing of farmland by decomposing the forecast error variance of farmland prices into forward-looking and backward-looking components. The analysis finds that, in the short run, the forward-looking Capital Asset Pricing Model (CAPM) portion of the forecast errors is significantly higher in a boom or bust stage than in a stable stage. This shows that the farmland market absorbs economic information discriminatively according to the stability of the market, and that the market (and the actors therein) responds to new information gradually, as suggested by theory. This helps to explain the overpricing of farmland, though the explanation works primarily in the short run. Finally, essay III investigates the dual effects of direct government payments and climate change on farmland values, using a smooth coefficient semi-parametric panel data model. The analysis finds that land valuation is affected by climate change and government payments, both through discounted revenues and through effects on the risk aversion of land owners. This essay shows that including heterogeneous risk aversion is an efficient way to mitigate the impact of misspecification in a DLPM, and that precipitation is a good explanatory variable. In particular, precipitation affects land values in a bimodal manner, indicating that farmland prices could have multiple peaks in precipitation due to adaptation through crop selection and technology alternation.
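Dynamic land pricing models of this kind build on the standard present-value relation in which land price equals the discounted stream of expected returns, V = sum over t of R/(1+r)^t, which converges to R/r for a constant return R. The numbers below are assumed and illustrate only this baseline relation, not the dissertation's full DLPM.

```python
# Baseline capitalization formula behind dynamic land pricing models:
# with constant annual return R and discount rate r, the present value
# of farmland converges to R / r. Numbers are assumed for illustration.
R = 250.0   # annual net return per acre (assumed)
r = 0.05    # discount rate (assumed)

pv = sum(R / (1.0 + r) ** t for t in range(1, 2001))  # long finite horizon
print(f"discounted sum:   {pv:.2f}")    # ~5000.00
print(f"closed form R/r:  {R / r:.2f}")  # 5000.00
```

Essay I's point is that once the return R is itself a diminishing function of the asset level, this simple capitalization logic can produce unstable price paths.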
APA, Harvard, Vancouver, ISO, and other styles
31

Tognaccini, Sofia. "Geomorfologia applicata all'individuazione dello stato di attività dei movimenti gravitativi e analisi di suscettibilità da frana in diversi contesti geologico-strutturali." Doctoral thesis, 2019. http://hdl.handle.net/2158/1198143.

Full text
Abstract:
The analyses in this thesis constitute a basic cognitive tool, indispensable for all areas where landslides or other processes induce geological hazard by interfering with the environment and man-made structures, and they allow a good knowledge of even rather large territories at very low cost. All procedures and software used in this work are open source and freely available online. Geomorphological surveying was carried out in three study areas characterised by different geological-structural settings. This surveying, supported by a multi-temporal analysis conducted through photo-interpretation, proved very useful for identifying the boundaries of landslides, their degree of activity and their evolution over time. The statistical analyses made it possible to identify the variability of the predisposing parameters in the different study areas and to indicate which classes of each parameter contain the largest number of landslides and are therefore the most prone to sliding. The susceptibility studies carried out for the different study areas with the bivariate statistical method show a high percentage of landslide areas falling within the zones predicted as most susceptible, and they can be applied at both small and large scales, for different landslide types. Finally, the results obtained with the predominantly deterministic model based on the r.slope.stability module (http://www.slopestability.org/) are rather reliable and representative of the real situation, as demonstrated by the validation results obtained through AROC estimation. The possibility of using GIS software not only for data handling but also for data updating and for the automatic execution of the bivariate procedures (for example through the Python code written during this thesis work) greatly increases the possibility of extending these analyses to new and different study areas at almost no cost.
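The bivariate statistical method mentioned in the abstract is commonly implemented as the information value method: for each class i of a predisposing parameter, the weight IV_i = ln[(N_i/S_i) / (N/S)] compares the landslide density in that class with the average density over the whole area, where N_i and S_i are the landslide and total pixel counts in the class and N and S the overall totals. A minimal sketch with invented counts, not the thesis data:

```python
import math

# Landslide (N_i) and total (S_i) pixel counts per slope class;
# invented numbers, not the thesis data.
classes = {
    "0-10 deg":  {"landslide": 20,  "total": 50000},
    "10-25 deg": {"landslide": 300, "total": 40000},
    ">25 deg":   {"landslide": 680, "total": 10000},
}
N = sum(c["landslide"] for c in classes.values())  # all landslide pixels
S = sum(c["total"] for c in classes.values())      # all pixels

for name, c in classes.items():
    # Information value: log of class landslide density over mean density.
    iv = math.log((c["landslide"] / c["total"]) / (N / S))
    print(f"{name}: IV = {iv:+.2f}")
# Positive IV marks classes more prone to sliding; the susceptibility
# map is obtained by summing the IVs of all parameters per map cell.
```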
APA, Harvard, Vancouver, ISO, and other styles
32

Thenga, Godfrey. "The value of modus operandi in investigating child support grant fraud." Diss., 2014. http://hdl.handle.net/10500/18754.

Full text
Abstract:
While establishing modus operandi information is an investigative technique used globally in the investigation of fraud, investigators of child support grant fraud in South Africa do not use this technique properly in their investigations. The study sought to examine the modus operandi used by civil servants to unduly access child support grants. The study adopted a qualitative approach, with structured interviews and literature as data-collection methods, and two sample groups were used to gather data. The data was analysed using the spiral method. The study found that modus operandi, although a valuable investigative tool, was not properly used and in some instances was never used in the investigations. On the basis of the findings, it is recommended that modus operandi information captured on police and corporate database systems be used as a reference for comparing the modus operandi of known grant fraudsters, and that training and skills be provided to public and corporate investigators on the use of modus operandi information in the investigation of social grant fraud. To ensure that good practice develops, a modus operandi guideline document should be crafted by the South African Police Service's detective division at national level and by corporate investigators respectively, with a view to improving the conviction rate.
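The recommendation to compare modus operandi information against database records can, in software terms, be as simple as a set-similarity match. The sketch below ranks known cases by Jaccard similarity over invented modus operandi features; it is purely illustrative and is not based on any police or corporate system described in the study.

```python
# Invented modus operandi feature sets; not actual case data.
known_fraudsters = {
    "case_A": {"forged id", "ghost beneficiary", "internal clerk"},
    "case_B": {"forged id", "duplicate application"},
    "case_C": {"intercepted card", "pin sharing"},
}

def jaccard(a, b):
    """Set overlap: |A intersect B| / |A union B|."""
    return len(a & b) / len(a | b)

new_case = {"forged id", "ghost beneficiary"}
ranked = sorted(known_fraudsters.items(),
                key=lambda kv: jaccard(new_case, kv[1]),
                reverse=True)
for case, features in ranked:
    print(f"{case}: similarity {jaccard(new_case, features):.2f}")
```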
Criminology and Security Science
M. Tech. (Forensic Investigation)
APA, Harvard, Vancouver, ISO, and other styles
