To see the other types of publications on this topic, follow the link: Empirical processes.

Dissertations / Theses on the topic 'Empirical processes'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Empirical processes.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Tusche, Marco. "Empirical processes of multiple mixing data." Thesis, Tours, 2013. http://www.theses.fr/2013TOUR4033/document.

Full text
Abstract:
The present thesis studies weak convergence of empirical processes of multiple mixing data. It is based on the articles Durieu and Tusche (2012), Dehling, Durieu, and Tusche (2012), and Dehling, Durieu, and Tusche (2013). We follow the approximating class approach introduced by Dehling, Durieu, and Volný (2009) and Dehling and Durieu (2011), who established empirical central limit theorems for dependent R- and R^d-valued random variables, respectively. Extending their technique, we generalize their results to arbitrary state spaces and to empirical processes indexed by classes of functions. Moreover, we study sequential empirical processes. Our results apply to B-geometrically ergodic Markov chains, iterative Lipschitz models, dynamical systems with a spectral gap on the Perron-Frobenius operator, and ergodic torus automorphisms. We establish conditions under which the empirical process of such models converges weakly to a Gaussian process.
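To make the thesis's central object concrete, here is a minimal Python sketch of the classical empirical process for i.i.d. uniform data - the baseline case that the thesis extends to multiple mixing sequences and general state spaces. The sample size and evaluation grid are arbitrary illustrative choices.

```python
import numpy as np

# Classical empirical process for i.i.d. Uniform(0,1) data:
# G_n(t) = sqrt(n) * (F_n(t) - t), which converges weakly to a
# Brownian bridge; the thesis studies analogues of this limit
# theorem for multiple mixing data.
rng = np.random.default_rng(0)
n = 10_000
x = rng.uniform(size=n)

t = np.linspace(0, 1, 201)                  # evaluation grid
F_n = np.searchsorted(np.sort(x), t, side="right") / n
G_n = np.sqrt(n) * (F_n - t)

# Kolmogorov-Smirnov statistic = sup norm of the empirical process
print("sup_t |G_n(t)| =", np.abs(G_n).max())
```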
APA, Harvard, Vancouver, ISO, and other styles
2

Xie, Chen. "DYNAMIC DECISION APPROXIMATE EMPIRICAL REWARD (DDAER) PROCESSES." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1398991609.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Forster, Emma. "Migration decision-making processes : an empirical investigation." Thesis, Edinburgh Napier University, 2000. http://researchrepository.napier.ac.uk/Output/3711.

Full text
Abstract:
This thesis has two purposes. The first is to investigate the motivation for household migration - in particular, the associations between the different reasons for moving and the characteristics of owner-occupier movers in Scotland, their houses and the distances they travel. The second is to investigate the extent to which the migration decision is a longitudinal one, and from this longitudinal analysis to highlight the extent of latent migration. Little longitudinal research has previously been carried out on the migration decision. The thesis uses two recent, large-scale and under-utilised data sources to investigate each of these issues. Firstly, the associations with motivations for migration are investigated using the 'Migration and Housing Choice Survey' (MHCS), which contains information from 10,010 households. The advantage of this cross-sectional source lies in its provision of detailed information on motivations at a national level of coverage. The large-scale, national coverage makes it possible to investigate many types of migration flow. This advantage is not shared by any other British research into motivations for migration and only three other data sets elsewhere. Secondly, the extent to which the decision to migrate is part of an on-going process is investigated using the 'British Household Panel Survey' (BHPS). This new and under-exploited source of migration data contains longitudinal information from 10,264 individuals in the first wave and holds approximately this sample size through each of the following four waves. This thesis makes four key contributions to knowledge. The first three are based on the detailed and systematic analysis of the reasons for residential migration behaviour of owner-occupiers in Scotland, using the MHCS. Firstly, the reasons for moving, as suggested by previously small-scale research, have been confirmed by this large-scale data set. Secondly, this thesis has extended - and in some cases refuted - the findings of previous research by investigating the bivariate associations between each of the reasons for moving and each possible explanatory variable (these being characteristics of migrants, of their home and of the distances they move). This has been investigated using a much wider selection of reasons for moving and of characteristics than has been previously done. Thirdly, this thesis has shown that life-cycle stage exerts a considerable amount of influence on the reasons given for moving, whilst still operating in conjunction with other variables, such as distance moved and housing features. The MHCS can, for the first time, enable research into the connection between the factors influencing migration flows and the factors influencing motivations for migration. Fourthly, this thesis has investigated how migration decisions and preference for migration relate over time, using longitudinal data (the BHPS). This has shown that a considerable amount of latent mobility is present in Britain, and even more importantly, has identified the characteristics of the latent migrants and frequent movers. In addition, this thesis has offered some methodological pointers for future migration research. Overall, the use of these two important but under-utilised data sets, the MHCS and the BHPS, has enabled analyses to be undertaken that are unique in the history of migration research.
APA, Harvard, Vancouver, ISO, and other styles
4

Karimi, Fariba. "Tightly knit : spreading processes in empirical temporal networks." Doctoral thesis, Umeå universitet, Institutionen för fysik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-98885.

Full text
Abstract:
We live in a tightly knit world. Our emotions, desires, perceptions and decisions are interlinked in our interactions with others. We are constantly influencing our surroundings and being influenced by others. In this thesis, we unfold some aspects of social and economic interactions by studying empirical datasets. We project these interactions into a network representation to gain insight into how socio-economic systems form and function and how they change over time. Specifically, this thesis is centered on four main questions: How do the means of communication shape our social network structures? How can we uncover the underlying network of interests from massive observational data? How does a crisis spread in a real financial network? How do the dynamics of interaction influence spreading processes in networks? We use a variety of methods from physics, psychology, sociology, and economics as well as computational, mathematical and statistical analysis to address these questions.
APA, Harvard, Vancouver, ISO, and other styles
5

Bühlmann, Peter Lukas. "The blockwise bootstrap in time series and empirical processes." Zürich, 1993. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=10354.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Hoppitt, W. J. E. "Social processes influencing learning : combining theoretical and empirical approaches." Thesis, University of Cambridge, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.604231.

Full text
Abstract:
There are a number of processes which can result in social transmission of behaviour patterns, so much effort in social learning research has gone into devising experimental procedures that can isolate imitation from other social learning processes. In this thesis I develop methods for distinguishing "simple" social learning processes and test these methods using the domestic fowl as a model organism. Strong evidence is presented for a response facilitation effect on a number of behaviour patterns, a process which might function to the same ends as imitation in animal populations. Recent models have suggested that an ability to imitate might be dependent on prior experience rather than specialised learning mechanisms. A neural network model is used to investigate these hypotheses and to generate predictions as to the conditions under which a capacity for imitation should arise. The model predicts that processes such as behavioural synchrony between individuals might result in the formation of appropriate neural links for an imitative ability. These predictions are tested using experimental and observational data on the domestic fowl. The model also suggests that similar mechanisms and developmental processes might underlie imitation and simpler social learning processes, such as response facilitation and observational conditioning. This suggests that a process of positive feedback might operate, with social learning promoting behavioural synchrony, which in turn promotes the development of social learning mechanisms.
APA, Harvard, Vancouver, ISO, and other styles
7

Al-Besbasi, Ibrahim. "An empirical investigation of some cognitive processes of translation." Thesis, University of Exeter, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.280690.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kawczak, Janusz. "Weak convergence of a certain class of residual empirical processes." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape15/PQDD_0014/NQ31159.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Cockburn, A. D. "An empirical study of classroom processes in infant mathematics education." Thesis, University of East Anglia, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.374687.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Zhou, Bi Ting. "Testing for jumps of discretely observed processes: an empirical analysis." Thesis, University of Macau, 2015. http://umaclib3.umac.mo/record=b3335821.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Willott, Sara. "Reflecting team processes in family therapy : critical review & empirical exploration." Thesis, University of Birmingham, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.435386.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Shao, Wei. "Consumer Decision-Making: An Empirical Exploration of Multi-Phased Decision Processes." Griffith University. Griffith Business School, 2007. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20070725.144459.

Full text
Abstract:
Over the past 50 years, a great deal of research has conceptualised and modelled consumer decision-making as a single- or two-stage decision process. Today, decision complexity has increased and consumers need to filter out a large amount of information prior to the final choice decision. This poses a challenge for marketing modellers to develop decision models that are more representative of real-world decision-making. An important rationale for the present study is to improve our understanding of consumer decision-making by providing empirical evidence that consumer decision-making may go beyond a single- or two-stage structure. This thesis aims to provide an insightful view of consumer decision-making, which may help marketers to develop and reinforce marketing programs to address consumer needs and hence increase profits, with knowledge of the types of decisions made and how decisions are made at different stages of the decision process. The literature review identified single- and two-stage decision models. Data analysis did not fully support this conceptualisation. An empirical exploration of consumer decision-making for a durable product revealed that the existing literature is limited in scope and predictability, as it failed to capture multi-phase decision processes, which accounted for approximately one-half of consumer decisions. Empirical evidence was found suggesting that consumers seldom use a single strategy throughout the decision process. Consumer heterogeneity was also evident in this research as different consumers approached the same decision task with different processes and outcomes. Finally, this research identified those aspects of decision processes that have not been captured by the literature-based decision strategies. This research suggests that consumer decisions are more contingent than previously conceived in single- and two-stage models. This research recommends that marketers should reconsider their understanding of consumer decision-making and bear in mind that one marketing strategy does not fit all customers. Marketers need to develop marketing strategies to address the entire decision process instead of focusing only on the decision outcome. By identifying the different decision paths that are used by consumers, marketers can effectively segment the market; marketers can also benchmark consumers' perceptions of their performance on the important attributes against competitors to ensure that their product/brand is not eliminated prior to the selection from within the choice set. Future research requires us to understand how consumer differences interact with the decision environment to influence decision processes and outcomes. To do so, researchers must adopt a multi-phase perspective.
APA, Harvard, Vancouver, ISO, and other styles
13

Bare, Marshall Edwin. "Structuring empirical methods for reuse and efficiency in product development processes." Diss., 2007. http://contentdm.lib.byu.edu/ETD/image/etd1676.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

David, Stella Veronica. "Central limit theorems for empirical product densities of stationary point processes." kostenfrei, 2008. http://d-nb.info/994494777/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Safoutin, Michael John. "A methodology for empirical measurement of iteration in engineering design processes /." Thesis, Connect to this title online; UW restricted, 2003. http://hdl.handle.net/1773/7111.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Shao, Wei. "Consumer Decision-Making: An Empirical Exploration of Multi-Phased Decision Processes." Thesis, Griffith University, 2007. http://hdl.handle.net/10072/365297.

Full text
Abstract:
Over the past 50 years, a great deal of research has conceptualised and modelled consumer decision-making as a single- or two-stage decision process. Today, decision complexity has increased and consumers need to filter out a large amount of information prior to the final choice decision. This poses a challenge for marketing modellers to develop decision models that are more representative of real-world decision-making. An important rationale for the present study is to improve our understanding of consumer decision-making by providing empirical evidence that consumer decision-making may go beyond a single- or two-stage structure. This thesis aims to provide an insightful view of consumer decision-making, which may help marketers to develop and reinforce marketing programs to address consumer needs and hence increase profits, with knowledge of the types of decisions made and how decisions are made at different stages of the decision process. The literature review identified single- and two-stage decision models. Data analysis did not fully support this conceptualisation. An empirical exploration of consumer decision-making for a durable product revealed that the existing literature is limited in scope and predictability, as it failed to capture multi-phase decision processes, which accounted for approximately one-half of consumer decisions. Empirical evidence was found suggesting that consumers seldom use a single strategy throughout the decision process. Consumer heterogeneity was also evident in this research as different consumers approached the same decision task with different processes and outcomes. Finally, this research identified those aspects of decision processes that have not been captured by the literature-based decision strategies. This research suggests that consumer decisions are more contingent than previously conceived in single- and two-stage models. This research recommends that marketers should reconsider their understanding of consumer decision-making and bear in mind that one marketing strategy does not fit all customers. Marketers need to develop marketing strategies to address the entire decision process instead of focusing only on the decision outcome. By identifying the different decision paths that are used by consumers, marketers can effectively segment the market; marketers can also benchmark consumers' perceptions of their performance on the important attributes against competitors to ensure that their product/brand is not eliminated prior to the selection from within the choice set. Future research requires us to understand how consumer differences interact with the decision environment to influence decision processes and outcomes. To do so, researchers must adopt a multi-phase perspective.
Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
Griffith Business School
APA, Harvard, Vancouver, ISO, and other styles
17

Nyquist, Pierre. "Large deviations for weighted empirical measures and processes arising in importance sampling." Licentiate thesis, KTH, Matematisk statistik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-117810.

Full text
Abstract:
This thesis consists of two papers related to large deviation results associated with importance sampling algorithms. As the need for efficient computational methods increases, so does the need for theoretical analysis of simulation algorithms. This thesis is mainly concerned with algorithms using importance sampling. Both papers make theoretical contributions to the development of a new approach for analyzing efficiency of importance sampling algorithms by means of large deviation theory. In the first paper of the thesis, the efficiency of an importance sampling algorithm is studied using a large deviation result for the sequence of weighted empirical measures that represent the output of the algorithm. The main result is stated in terms of the Laplace principle for the weighted empirical measure arising in importance sampling and it can be viewed as a weighted version of Sanov's theorem. This result is used to quantify the performance of an importance sampling algorithm over a collection of subsets of a given target set as well as quantile estimates. The method of proof is the weak convergence approach to large deviations developed by Dupuis and Ellis. The second paper studies moderate deviations of the empirical process analogue of the weighted empirical measure arising in importance sampling. Using moderate deviation results for empirical processes the moderate deviation principle is proved for weighted empirical processes that arise in importance sampling. This result can be thought of as the empirical process analogue of the main result of the first paper and the proof is established using standard techniques for empirical processes and Banach space valued random variables. The moderate deviation principle for the importance sampling estimator of the tail of a distribution follows as a corollary. From this, moderate deviation results are established for importance sampling estimators of two risk measures: The quantile process and Expected Shortfall. The results are proved using a delta method for large deviations established by Gao and Zhao (2011) together with more classical results from the theory of large deviations. The thesis begins with an informal discussion of stochastic simulation, in particular importance sampling, followed by short mathematical introductions to large deviations and importance sampling.
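As a concrete illustration of the weighted empirical measures arising in importance sampling that the thesis analyzes, the following Python sketch estimates a Gaussian tail probability by exponential tilting; the target level and proposal are illustrative assumptions, not taken from the thesis.

```python
import numpy as np
from scipy.stats import norm

# Importance sampling estimate of p = P(X > a) for X ~ N(0,1),
# sampling from the tilted proposal N(a,1). The algorithm output
# is the weighted empirical measure sum_i w_i * delta_{Y_i}.
rng = np.random.default_rng(1)
a, n = 3.0, 100_000

y = rng.normal(loc=a, size=n)               # draws from proposal N(a,1)
w = norm.pdf(y) / norm.pdf(y, loc=a)        # likelihood ratios dP/dQ

p_hat = np.mean(w * (y > a))                # weighted empirical probability
print(f"IS estimate {p_hat:.3e} vs exact {norm.sf(a):.3e}")
```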

APA, Harvard, Vancouver, ISO, and other styles
18

Rakhlin, Alexander. "Applications of empirical processes in learning theory : algorithmic stability and generalization bounds." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/34564.

Full text
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Brain and Cognitive Sciences, 2006.
Includes bibliographical references (p. 141-148).
This thesis studies two key properties of learning algorithms: their generalization ability and their stability with respect to perturbations. To analyze these properties, we focus on concentration inequalities and tools from empirical process theory. We obtain theoretical results and demonstrate their applications to machine learning. First, we show how various notions of stability upper- and lower-bound the bias and variance of several estimators of the expected performance for general learning algorithms. A weak stability condition is shown to be equivalent to consistency of empirical risk minimization. The second part of the thesis derives tight performance guarantees for greedy error minimization methods - a family of computationally tractable algorithms. In particular, we derive risk bounds for a greedy mixture density estimation procedure. We prove that, unlike what is suggested in the literature, the number of terms in the mixture is not a bias-variance trade-off for the performance. The third part of this thesis provides a solution to an open problem regarding the stability of Empirical Risk Minimization (ERM). This algorithm is of central importance in Learning Theory. By studying the suprema of the empirical process, we prove that ERM over Donsker classes of functions is stable in the L1 norm. Hence, as the number of samples grows, it becomes less and less likely that a perturbation of o(√n) samples will result in a very different empirical minimizer. Asymptotic rates of this stability are proved under metric entropy assumptions on the function class. Through the use of a ratio limit inequality, we also prove stability of expected errors of empirical minimizers. Next, we investigate applications of the stability result. In particular, we focus on procedures that optimize an objective function, such as k-means and other clustering methods. We demonstrate that stability of clustering, just like stability of ERM, is closely related to the geometry of the class and the underlying measure. Furthermore, our result on stability of ERM delineates a phase transition between stability and instability of clustering methods. In the last chapter, we prove a generalization of the bounded-difference concentration inequality for almost-everywhere smooth functions. This result can be utilized to analyze algorithms which are almost always stable. Next, we prove a phase transition in the concentration of almost-everywhere smooth functions. Finally, a tight concentration of empirical errors of empirical minimizers is shown under an assumption on the underlying space.
by Alexander Rakhlin.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
19

Melo, Claudia de Oliveira. "Productivity of agile teams: an empirical evaluation of factors and monitoring processes." Universidade de São Paulo, 2013. http://www.teses.usp.br/teses/disponiveis/45/45134/tde-25052015-120242/.

Full text
Abstract:
Lower cost and shorter time-to-market expectations are the major drivers of software productivity improvements. To manage productivity effectively, it is important to identify the most relevant difficulties and develop strategies to cope with them. Agile methods, including Extreme Programming and Scrum, have evolved as approaches to simplify the software development process, potentially leading to better productivity. They aim to shorten development time and handle the inevitable changes resulting from market dynamics. Although the industry has extensively adopted agile methods, little research has empirically examined the software development agility construct regarding its dimensions, determinants, and effects on software development performance. Understanding this construct could help determine where to concentrate management efforts (and related financial resources) from a practical standpoint and where to focus research efforts from an academic perspective. Considerable research has been directed at identifying factors that have a significant impact on software development productivity. In general, the studied productivity factors were related to product, personnel, project, process, or organizational issues. Continuously evaluating productivity factors is important, as factors may change under new software engineering practices. However, little research has investigated the major factors influencing agile team productivity. The goal of this thesis was to explore productivity definitions, factors, and monitoring in agile teams and to improve the practice based on the collected evidence and gained knowledge. This thesis presents five novel contributions: C1 - Empirical verification of the importance of productivity for companies adopting agile methods and its perceived benefits; C2 - Rationale for the definition of productivity in the context of agile methods; C3 - Empirical verification of agile team productivity factors; C4 - A conceptual framework for agile team productivity factors and their impact; C5 - A team productivity monitoring process considering adaptability and an evaluation of the usefulness of agile team productivity metrics.
APA, Harvard, Vancouver, ISO, and other styles
20

Dehe, Benjamin. "An empirical investigation in the decision-making processes of new infrastructure development." Thesis, University of Huddersfield, 2014. http://eprints.hud.ac.uk/id/eprint/23706/.

Full text
Abstract:
The aim of this research is to present and discuss the development and deployment of Lean thinking models and techniques applied to improve the decision-making within the planning and design processes of new infrastructures, within a healthcare organisation. In the UK, healthcare organisations are responsible for planning, designing, building and managing their own infrastructures, through which their services are delivered to the local population (Kagioglou & Tzortzopoulos, 2010). These processes are long and complex, involving a large range of stakeholders who are implicated within the strategic decision-making. It is understood that the NHS lacks models and frameworks to support the decision-making associated with new infrastructure development, and that ad-hoc methods used at local level lead to inefficiencies and weak performance, despite the contractual efforts made throughout the PPP and PFI schemes (Baker & Mahmood, 2012; Barlow & Koberle-Gaiser, 2008). This is illustrated by the long development cycle time: it can take up to 15 years from conception to completion of a new infrastructure. Hence, in collaboration with an NHS organisation, an empirical action research study, embedded within a mixed-methodology approach, has been designed to analyse the root-cause problems and assess to what extent Lean thinking can be applied to the built environment to improve the speed and fitness for purpose of new infrastructures. Firstly, this multiphase research establishes the main issues responsible for the weak process performance, via an inductive-deductive cycle, and then demonstrates how Lean thinking inspired techniques - Multiple Criteria Decision Analysis (MCDA) using ER and AHP, Benchmarking, and Quality Function Deployment (QFD) - have been implemented to optimise the decision-making, in order to speed up the planning and design decision-making processes and to enhance the fitness for purpose of new infrastructures. Academic literature on Lean thinking, decision theories and the built environment has been reviewed in order to establish a reliable knowledge base of the context and to develop relevant solutions. The bespoke models developed have been tested and implemented in collaboration with a local healthcare organisation in the UK, as part of the construction of a £15 million health centre project. A substantial set of qualitative and quantitative data was collected during the 450 days over which the researcher was granted full access, plus a total of 25 sets of interviews, a survey (N=85) and 25 experimental workshops. This mixed-methodology research is composed of an exploratory sequential design and an embedded-experiment variant, enabling the triangulation of different data, methods and findings to be used to develop an innovative solution, thus improving the new infrastructure development process. The emerging conceptual model represents a non-prescriptive approach to planning and designing new healthcare infrastructures, using Lean thinking principles to optimise the decision-making and reduce the complexity. This Partial & Bespoke Lean Construction Framework (PBLCF) has been implemented as good practice by the healthcare organisation, to speed up the planning phases, enhance the quality of the design and reduce the development cost, in order to generate a competitive edge. It is estimated that a reduction of 22% of the cycle time and 7% of the cost is achievable.
This research makes a contribution by empirically developing and deploying a partial Lean implementation into the healthcare built environment, and by providing non-prescriptive models to optimise the decision-making underpinning the planning and design of complex healthcare infrastructure. This has the potential to be replicated in other healthcare organisations and can also be adapted to other construction projects.
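As a pointer to one of the techniques named above, here is a minimal Python sketch of standard AHP priority derivation (the textbook method, not necessarily the bespoke variant developed in the thesis); the pairwise-comparison matrix is invented purely for illustration.

```python
import numpy as np

# Standard AHP priority derivation: the weight vector is the
# principal eigenvector of a reciprocal pairwise-comparison matrix.
# The 3x3 matrix below (e.g. cost vs quality vs speed) is a
# hypothetical example, not data from the thesis.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # normalised priority weights

# Consistency ratio: CR = ((lambda_max - n) / (n - 1)) / RI, RI(3) = 0.58
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
print("weights:", np.round(w, 3), " CR:", round(ci / 0.58, 3))
```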
APA, Harvard, Vancouver, ISO, and other styles
21

McNelis, Robert J. "The measurement and empirical evaluation of quality and productivity for manufacturing processes." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-06102009-063228/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Bare, Marshall Edwin. "Structuring Empirical Methods for Reuse and Efficiency in Product Development Processes." BYU ScholarsArchive, 2006. https://scholarsarchive.byu.edu/etd/1032.

Full text
Abstract:
Product development requires that engineers have the ability to predict product performance. When product performance involves complex physics and natural phenomena, mathematical models are often insufficient to provide accurate predictions. Engineering companies compensate for this deficiency by testing prototypes to obtain empirical data that can be used in place of predictive models. The purpose of this work is to provide techniques and methods for efficient use of empirical methods in product development processes. Empirical methods involve the design and creation of prototype hardware and the testing of that hardware in controlled environments. Empirical methods represent a complete product development sub-cycle within the overall product development process. Empirical product development cycles can be expensive in both time and resources. Global economic pressures have caused companies to focus on improving the productivity of their product development cycles. A variety of techniques for improving the productivity of product development processes have been developed. These methods focus on structuring process steps and product artifacts for reuse and efficiency. However, these methods have, to this point, largely ignored the product development sub-cycle of empirical design. The same techniques used on the overall product development processes can and should be applied to the empirical product development sub-cycle. This thesis focuses on applying methods of efficient and reusable product development processes to the empirical development sub-cycle. It also identifies how to efficiently link the empirical product development sub-cycle into the overall product development process. Specifically, empirical product development sub-cycles can be characterized by their purposes into three specific types: first, obtaining data for predictive model coefficients, boundary conditions and driving functions; second, validating an existing predictive model; and third, providing the basis for predictions using interpolation and extrapolation of the empirical data when a predictive model does not exist. These three types of sub-cycles are structured as reusable processes in a standard form that can be used generally in product development. The roles of these three types of sub-cycles in the overall product development process are also established and the linkages defined. Finally, the techniques and methods provided for improving the efficiency of empirical methods in product development processes are demonstrated in a form that shows their benefits.
APA, Harvard, Vancouver, ISO, and other styles
23

Hoyos, Carlos D. "Intraseasonal Variability: Processes, Predictability and Prospects for Prediction." Diss., Georgia Institute of Technology, 2006. http://etd.gatech.edu/theses/available/etd-04102006-135125/.

Full text
Abstract:
Thesis (Ph. D.)--Earth and Atmospheric Sciences, Georgia Institute of Technology, 2006.
Dr. Peter J. Webster, Committee Chair ; Dr. Judith A. Curry, Committee Member ; Dr. Robert Dickinson, Committee Member ; Dr. Robert X. Black, Committee Member ; Dr. Predrag Cvitanovic, Committee Member.
APA, Harvard, Vancouver, ISO, and other styles
24

Cissokho, Youssouph. "Extremal Covariance Matrices." Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/37124.

Full text
Abstract:
The tail dependence coefficient (TDC) is a natural tool to describe extremal dependence. Estimation of the tail dependence coefficient can be performed via empirical process theory. In case of extremal independence, the limit degenerates and hence one cannot construct a test for extremal independence. In order to deal with this issue, we consider an analog of the covariance matrix, namely the extremogram matrix, whose entries depend only on extremal observations. We show that under the null hypothesis of extremal independence and for finite dimension d ≥ 2, the largest eigenvalue of the sample extremogram matrix converges to the maximum of d independent normal random variables. This allows us to conduct a hypothesis test for extremal independence by means of the asymptotic distribution of the largest eigenvalue. Simulation studies are performed to further illustrate this approach.
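For readers unfamiliar with the estimator underlying this construction, the following Python sketch computes the standard rank-based estimate of the upper tail dependence coefficient on simulated data; the sample size, threshold k, and Gaussian model are illustrative assumptions, not choices made in the thesis.

```python
import numpy as np

# Nonparametric estimate of the (upper) tail dependence coefficient
# lambda = lim_{u -> 1} P(F2(Y) > u | F1(X) > u), using the k most
# extreme observations in each margin.
rng = np.random.default_rng(2)
n, k = 50_000, 500

x, y = rng.normal(size=(2, n))              # independent pair: extremally independent
r1 = np.argsort(np.argsort(x)) + 1          # ranks of x
r2 = np.argsort(np.argsort(y)) + 1          # ranks of y

# Fraction of the k largest x-observations that are also among
# the k largest y-observations.
tdc_hat = np.mean((r1 > n - k) & (r2 > n - k)) * n / k
print("estimated TDC (true value 0 here):", tdc_hat)
```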
APA, Harvard, Vancouver, ISO, and other styles
25

Binkowski, Karol Patryk. "Pricing of European options using empirical characteristic functions." PhD thesis, Australia: Macquarie University, 2008. http://hdl.handle.net/1959.14/28623.

Full text
Abstract:
Thesis (PhD)--Macquarie University, Division of Economic and Financial Studies, Dept. of Statistics, 2008.
Bibliography: p. 73-77.
Introduction -- Lévy processes used in option pricing -- Option pricing for Lévy processes -- Option pricing based on empirical characteristic functions -- Performance of the five models on historical data -- Conclusions -- References -- Appendix A. Proofs -- Appendix B. Supplements -- Appendix C. Matlab programs.
Pricing problems of financial derivatives are among the most important ones in Quantitative Finance. Since 1973, when a Nobel prize winning model was introduced by Black, Merton and Scholes, the Brownian Motion (BM) process gained huge attention from professionals. It is now known, however, that stock market log-returns do not follow the very popular BM process. Derivative pricing models which are based on more general Lévy processes tend to perform better. Carr & Madan (1999) and Lewis (2001) (CML) developed a method for vanilla options valuation based on a characteristic function of asset log-returns, assuming that they follow a Lévy process. Assuming that at least part of the problem is in adequate modeling of the distribution of log-returns of the underlying price process, we use instead a nonparametric approach in the CML formula and replace the unknown characteristic function with its empirical version, the empirical characteristic function (ECF). We consider four modifications of this model based on the ECF. The first modification requires only historical log-returns of the underlying price process. The other three modifications of the model need, in addition, a calibration based on historical option prices. We compare their performance based on the historical data of the DAX index and on ODAX options written on the index between the 1st of June 2006 and the 17th of May 2007. The resulting pricing errors show that one of our models performs, at least in the cases considered in the project, better than the Carr & Madan (1999) model based on calibration of a parametric Lévy model, called a VG model. Our study seems to confirm the necessity of using implied parameters, apart from an adequate modeling of the probability distribution of the asset log-returns. It indicates that to precisely reproduce the behaviour of real option prices, yet other factors like stochastic volatility need to be included in the option pricing model. Fortunately, the discrepancies between our model and real option prices are reduced by introducing the implied parameters, which seem to be easily modeled and forecasted using a mixture of regression and time series models. Such an approach is computationally less expensive than the explicit modeling of the stochastic volatility as in the Heston (1993) model and its modifications.
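The empirical characteristic function at the core of this approach is straightforward to compute. The sketch below evaluates it on simulated heavy-tailed log-returns (not the DAX data used in the thesis); plugging it into the Carr & Madan / Lewis pricing formula is beyond the scope of this illustration.

```python
import numpy as np

# Empirical characteristic function (ECF) of log-returns:
# phi_hat(u) = (1/n) * sum_j exp(i * u * X_j).
# In the thesis the ECF replaces the unknown model characteristic
# function inside the CML pricing formula; here we only show the
# ECF itself, on simulated toy data.
rng = np.random.default_rng(3)
log_returns = rng.standard_t(df=4, size=2_000) * 0.01   # heavy-tailed toy returns

def ecf(u, x):
    """Empirical characteristic function evaluated at the points u."""
    return np.exp(1j * np.outer(u, x)).mean(axis=1)

u = np.linspace(-50, 50, 5)
print(np.round(ecf(u, log_returns), 4))
```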
Mode of access: World Wide Web.
x, 111 p. ill., charts
APA, Harvard, Vancouver, ISO, and other styles
26

Ødegaard, Fredrik. "Analytical and empirical models of online auctions." Thesis, University of British Columbia, 2008. http://hdl.handle.net/2429/1615.

Full text
Abstract:
This thesis provides a discussion of some analytical and empirical models of online auctions. The objective is to provide an alternative framework for analyzing online auctions, and to characterize the distribution of intermediate prices. Chapter 1 provides a mathematical formulation of the eBay auction format and background to the data used in the empirical analysis. Chapter 2 analyzes policies for optimally disposing of inventory using online auctions. It is assumed that a seller has a fixed number of items to sell using a sequence of, possibly overlapping, single-item auctions. The decision the seller must make is when to start each auction. The decision involves a trade-off between a holding cost for each period an item remains unsold, and a cannibalization effect among competing auctions. Consequently the seller must trade off the expected marginal gain for the ongoing auctions with the expected marginal cost of the unreleased items by further deferring their release. The problem is formulated as a discrete time Markov Decision Problem. Conditions are derived to ensure that the optimal release policy is a control limit policy in the current price of the ongoing auctions. Chapter 2 focuses on the two-item case, which has sufficient complexity to raise challenging questions. An underlying assumption in Chapter 2 is that the auction dynamics can be captured by a set of transition probabilities. Chapter 3 shows, with two fixed bidding strategies, how the transition probabilities can be derived for a given auction format and bidder arrival process. The two specific bidding strategies analyzed are when bidders bid: 1) a minimal increment, and 2) their true valuation. Chapters 4 and 5 provide empirical analyses of 4,000 eBay auctions conducted by Dell. Chapter 4 provides a statistical model where, over discrete time periods, prices of online auctions follow a zero-inflated gamma distribution. Chapter 5 provides an analysis of the 44,000 bids placed in the auctions, based on bids following a gamma distribution. Both models presented in Chapters 4 and 5 are based on conditional probabilities given the price and elapsed time of an auction, and certain parameters of the competing auctions. Chapter 6 concludes the thesis with a discussion of the main results and possible extensions.
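As an illustration of the zero-inflated gamma specification mentioned for Chapter 4, here is a minimal Python fitting sketch on simulated prices; the mixing proportion and gamma parameters are invented, and the simple two-step fit stands in for whatever estimation procedure the thesis actually uses.

```python
import numpy as np
from scipy import stats

# A minimal zero-inflated gamma fit: a point mass at zero plus a
# gamma law for positive prices. The simulated prices below are
# illustrative, not the Dell/eBay data analyzed in the thesis.
rng = np.random.default_rng(4)
prices = np.where(rng.uniform(size=3_000) < 0.3,          # ~30% still at zero
                  0.0,
                  rng.gamma(shape=2.0, scale=50.0, size=3_000))

p_zero = np.mean(prices == 0)                             # point-mass estimate
pos = prices[prices > 0]
shape, loc, scale = stats.gamma.fit(pos, floc=0)          # MLE for positive part

print(f"P(price = 0) = {p_zero:.3f}, gamma shape = {shape:.2f}, scale = {scale:.1f}")
```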
APA, Harvard, Vancouver, ISO, and other styles
27

Chirila, Costel. "EMPIRICAL PROCESSES AND ROC CURVES WITH AN APPLICATION TO LINEAR COMBINATIONS OF DIAGNOSTIC TESTS." UKnowledge, 2008. http://uknowledge.uky.edu/gradschool_diss/674.

Full text
Abstract:
The Receiver Operating Characteristic (ROC) curve is the plot of Sensitivity vs. 1 - Specificity of a quantitative diagnostic test, for a wide range of cut-off points c. The empirical ROC curve is probably the most widely used nonparametric estimator of the ROC curve. The asymptotic properties of this estimator were first developed by Hsieh and Turnbull (1996), based on strong approximations for quantile processes. Jensen et al. (2000) provided a general method to obtain regional confidence bands for the empirical ROC curve, based on its asymptotic distribution. Since most biomarkers do not have high enough sensitivity and specificity to qualify as a good diagnostic test, a combination of biomarkers may result in a better diagnostic test than each one taken alone. Su and Liu (1993) proved that, if the panel of biomarkers is multivariate normally distributed for both diseased and non-diseased populations, then the linear combination using Fisher's linear discriminant coefficients maximizes the area under the ROC curve of the newly formed diagnostic test, called the generalized ROC curve. In this dissertation, we will derive the asymptotic properties of the generalized empirical ROC curve, the nonparametric estimator of the generalized ROC curve, by using empirical process theory as in van der Vaart (1998). The pivotal result used in finding the asymptotic behavior of the proposed nonparametric estimator is the result on random functions which incorporate estimators, as developed by van der Vaart (1998). By using this powerful lemma we will be able to decompose an equivalent process into a sum of two other processes, usually called the Brownian bridge and the drift term, via Donsker classes of functions. Using a uniform convergence rate result given by Pollard (1984), we derive the limiting process of the drift term. Due to the independence of the random samples, the asymptotic distribution of the generalized empirical ROC process will be the sum of the asymptotic distributions of the decomposed processes. For completeness, we will first re-derive the asymptotic properties of the empirical ROC curve in the univariate case, using the same technique described before. The methodology is used to combine biomarkers in order to discriminate lung cancer patients from normals.
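Here is a minimal Python sketch of the empirical ROC curve described above, computed on synthetic biomarker scores; the Gaussian samples and sample sizes are illustrative assumptions.

```python
import numpy as np

# Empirical ROC curve: sensitivity vs 1 - specificity over all
# cut-offs c, from diseased and non-diseased test scores. The two
# Gaussian samples are synthetic stand-ins for a real biomarker.
rng = np.random.default_rng(5)
diseased = rng.normal(loc=1.0, size=300)      # scores in the diseased group
healthy = rng.normal(loc=0.0, size=300)       # scores in the non-diseased group

cutoffs = np.sort(np.concatenate([diseased, healthy]))
sens = [(diseased > c).mean() for c in cutoffs]   # sensitivity at cut-off c
fpr = [(healthy > c).mean() for c in cutoffs]     # 1 - specificity at cut-off c

# Empirical AUC via the Mann-Whitney statistic
auc = (diseased[:, None] > healthy[None, :]).mean()
print("empirical AUC:", round(auc, 3))
```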
APA, Harvard, Vancouver, ISO, and other styles
28

Eppler, Martin R. "Information quality in knowledge-intensive processes : problem analysis, conceptual framework and empirical evidence /." St. Gallen, 2002. http://aleph.unisg.ch/hsgscan/hm00131749.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Loukrati, Hicham. "Tail Empirical Processes: Limit Theorems and Bootstrap Techniques, with Applications to Risk Measures." Thesis, Université d'Ottawa / University of Ottawa, 2018. http://hdl.handle.net/10393/37594.

Full text
Abstract:
In recent years, important changes in the fields of insurance and finance have drawn increasing attention to the need to develop a standardized framework for the measurement of risk. Recently, there has been growing interest among insurance experts in the use of the conditional tail expectation (CTE), because it shares properties considered desirable and applicable in various situations. In particular, it meets the requirements of a "coherent" risk measure, in the sense of Artzner [2]. This thesis contributes to statistical inference by developing tools, based on the convergence of functional integrals, for the estimation of the CTE, which are of considerable interest to actuarial science. First, we develop a tool for estimating the conditional mean E[X|X > x]; we then construct estimators of the CTE, develop the asymptotic theory needed for these estimators, and use that theory to build confidence intervals. For the first time, the nonparametric bootstrap approach is explored in this thesis by developing new results applicable to the value at risk (VaR) and the CTE. Simulation studies illustrate the performance of the bootstrap technique.
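As a concrete companion to this abstract, here is a minimal Python sketch of the empirical CTE estimator with a nonparametric bootstrap percentile interval; the loss distribution, level, and bootstrap size are illustrative assumptions, not the procedures proved in the thesis.

```python
import numpy as np

# Empirical conditional tail expectation CTE_a = E[X | X > VaR_a],
# with a nonparametric bootstrap percentile confidence interval.
rng = np.random.default_rng(6)
losses = rng.pareto(3.0, size=5_000) + 1.0    # heavy-tailed toy losses
alpha = 0.95

def cte(x, a):
    var = np.quantile(x, a)                   # empirical VaR at level a
    return x[x > var].mean()                  # mean of the exceedances

boot = [cte(rng.choice(losses, size=losses.size, replace=True), alpha)
        for _ in range(1_000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"CTE_0.95 = {cte(losses, alpha):.3f}, 95% bootstrap CI: ({lo:.3f}, {hi:.3f})")
```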
APA, Harvard, Vancouver, ISO, and other styles
30

Ho, Phu Van. "Total Quality Management Approach To The Information Systems Development Processes: An Empirical Study." Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/28216.

Full text
Abstract:
The purpose of this dissertation is to study the application of Total Quality Management (TQM) in the Information Systems (IS) development processes. The study describes and evaluates TQM concepts and techniques in the IS development processes and interprets sub-organizational elements in the application of TQM in the public sector. This dissertation uses a multiple case study methodology to study the development processes of IS in three public agencies. This study attempts to examine what quality means across these public organizations and to discover the differences between IS development methodologies that do or do not apply TQM concepts and techniques. The late Dr. W. Edwards Deming, regarded as the "father" of the post-war Japanese economic miracle as well as a leading advocate of the TQM movement in the United States, developed a systematic approach to solving quality related problems which aims to fulfill customer expectations. His system of management is adopted as the theoretical basis of this dissertation. The "lessons learned" from these case studies, empirically and in the literature, reveal multiple experiences of TQM applications to IS development processes in the public sector.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
31

Öztel, Hülya. "Economic development partnerships in France : an empirical investigation of inter-organisational learning processes." Thesis, University of Warwick, 2004. http://wrap.warwick.ac.uk/39999/.

Full text
Abstract:
This dissertation focuses on partnership learning in the context of economic development policy implementation. In the field of business support services, local partnerships' ability to engage in inter-organisational learning can shape the effectiveness of their services, their impact on regional development and governance systems. Where partners are able to change their partnership's tacit norms and values, strong synergies can be achieved and sustainable double-loop learning may occur. But when a partnership is unable to transcend each collaborating partner's agenda, organisational rivalry, conflict of interest and power struggles can inhibit collaborative learning. We need to understand the processes that underpin partnership learning and unveil how partnerships can overcome the crises and conflicts bound to occur during their existence. The empirical investigation of the issues outlined above is conducted in the context of the French government's Reseau de Diffusion Technologique initiative. The study is conducted using a case study format, following interpretive traditions in social sciences. Presence Rhone-Alpes (PRA) and Reseau Nord Pas de Calais Technologie (NPC) were selected as polar cases. The findings indicate that the broad notion of partnership learning can be analysed in terms of social learning and process learning. Specifically, process learning (linked with the implementation of operational goals) is strongly dependent on pre-requisite tacit knowledge developed through social learning. Indeed, the comparative analysis demonstrated that although both partnerships had similar problems, only PRA was able to resolve the deep-rooted causes of the crises it experienced, through the progressive creation of a governing elite in the region. The presence of such an elite, with clear - albeit tacit - rules for decision making, facilitated partnership "process learning", which meant that negotiations over operational objectives, partnership strategy and even regional policy became opportunities to exert influence and collective power as opposed to instances where collaborating organisations fought to protect their individual turf.
APA, Harvard, Vancouver, ISO, and other styles
32

Alves, Kyle Vierra. "What are the delivery system design characteristics of information-centric mass claims processes?" Thesis, University of Exeter, 2017. http://hdl.handle.net/10871/32564.

Full text
Abstract:
This thesis examines the operational delivery systems of information-centric Mass Claims Processes (MCPs). Empirical data is presented which builds upon existing literature within the Operations Management discipline. This thesis aims to extend the area of knowledge which focuses on the rendering of assistance to very large groups of individuals disadvantaged through particular events such as armed conflict, civil unrest, acts of government and other similarly sweeping actions. One such approach to aid delivery is through a legal process known as a Mass Claims Process. This research examines how this assistance is rendered to the individual, the 'claimant', through a legally guided and controlled analysis of claimant-provided information. Such organisations are typically either publicly funded or funded through social schemes, which introduces significant pressure for efficiency. Similarly, the legal nature of MCPs emphasises the need for accuracy in the delivery of justice and law. The research addresses a number of areas not fully explored by the extant literature. There is a lack of research which explores the apparent trade-off between efficiency and accuracy in large-scale legal services. Little empirical evidence exists on the application of Postponement strategies in information-centric operations. This research also investigates a previously unexplored context in which strategic frameworks must find optimal alignment between the service concept and the design of the delivery system in a restricted and challenging environment. Fieldwork was carried out over a three-year period in two separate organisations, and utilised a polar case approach to increase the validity of the findings. The phenomenon of information interrelation, previously unidentified in the literature, is shown to have significant impact in this context. Several models are presented to describe the dynamic relationships between the characteristics and the strategic choices of the MCP. The results produce a set of findings illustrating optimal design choices for the key delivery system characteristics associated with MCPs. The financial impact of such organisations reaches into the billions (USD), and will continue to be a significant economic consideration for the foreseeable future. As such, research in this area has the ability to increase the efficient use of organisational resources for the organisations, while improving the service for the applicants. Whilst this thesis contributes to the body of knowledge for delivery system design, further research is welcomed, especially on the phenomenon of information interrelation, for the growing area of information-centric organisations.
APA, Harvard, Vancouver, ISO, and other styles
33

Bernroider, Edward. "Effective ERP adoption processes: the role of project activators and resource investments." Palgrave Macmillan, 2013. http://dx.doi.org/10.1057/ejis.2012.51.

Full text
Abstract:
The aim of this paper is to demonstrate whether stakeholders activating a project shape team building, the structure and magnitude of resource investment levels, and to what extent these levels impact ERP project effectiveness. The process view of an ERP project includes project initiation, system justification and funding, implementation, and early system use. Results from a nationwide empirical survey conducted in Austria (N = 88) show that activating actors influence team formation and resource investments, which impact project effectiveness levels. Resource-intensive justification and funding phases tend to precede resource-intensive implementations in heavy-weight projects, which seem to be less effective than light-weight projects. Resource and change conflicts are associated with lower project performance and are more common in resource-intensive ERP projects, where early system use appears to be relatively less stable. (author's abstract)
APA, Harvard, Vancouver, ISO, and other styles
34

Altunoglu, Ali Ender. "Effects of environment, structure and past performance on strategic decision processes : an empirical investigation." Thesis, University of Leicester, 2000. http://hdl.handle.net/2381/31106.

Full text
Abstract:
This research examines the nature and impact of environmental and organisational variables on strategy processes and firm performance in Fortune 500 firms. For several decades, research on decision processes has developed conflicting findings about the superiority of the different types of decision process. This study maintains that the environmental and organisational conditions of the firm ought to be examined by the strategist. This thesis has three main objectives: (i) to provide a detailed review of the synoptic and incremental schools; (ii) to investigate how environmental and intraorganisational variables affect the decision processes; and (iii) to investigate the interaction effects of environmental and organisational factors with decision processes on firm performance. To attain these objectives, multiple regression analysis is applied. The first main finding is that environmental munificence should be taken into consideration in the strategic decision process. Secondly, the organisational variables centralisation, formalisation and size have considerable impact on the variations in the strategy process. This thesis maintains that as organisational structure becomes more centralised and formalised and firms grow in size, top executives tend to employ more rational and comprehensive decision processes. Another main finding is that organisations which use the synoptic process in less uncertain environments are likely to perform better than firms which implement incremental processes. The findings imply that environmental and organisational factors are crucial in the synoptic-incremental dimension. In line with contingency theory, this thesis suggests that the strategic decision process is affected by the external environment and organisational variables.
APA, Harvard, Vancouver, ISO, and other styles
35

Hughes, Amanda. "Insights into Contractional Fault-Related Folding Processes Based on Mechanical, Kinematic, and Empirical Studies." Thesis, Harvard University, 2012. http://dissertations.umi.com/gsas.harvard:10462.

Full text
Abstract:
This dissertation investigates contractional fault-related folding, an important mechanism of deformation in the brittle crust, using a range of kinematic and mechanical models and data from natural structures. Fault-related folds are found in a wide range of tectonic settings, including mountain belts and accretionary prisms. There are several different classes of fault-related folds, including fault-bend, fault-propagation, shear-fault-bend, and detachment folds. They are distinguished by the geometric relationships between the fold and fault shape, which are driven by differences in the nature of fault and fold growth. The proper recognition of the folding style present in a natural structure, and the mechanical conditions that lead to the development of these different styles, are the focus of this research. By taking advantage of recent increases in the availability of high-quality seismic reflection data and computational power, we seek to further develop the relationship between empirical observations of fault-related fold geometries and the kinematics and mechanics of how they form. In Chapter 1, we develop an independent means of determining the fault-related folding style of a natural structure through observation of the distribution of displacement along the fault. We derive expected displacements for kinematic models of end-member fault-related folding styles, and validate this approach for natural structures imaged in seismic reflection data. We then use this tool to gain insight into the deformational history of more complex structures. In Chapter 2, we explore the mechanical and geometric conditions that lead to the transition between fault-bend and fault-propagation folds. Using the discrete element modeling (DEM) method, we investigate the relative importance of factors such as fault dip, mechanical layer strength and anisotropy, and fault friction on the style of structure that develops. We use these model results to gain insight into the development of transitional fault-related folds in the Niger Delta. In Chapter 3, we compare empirical observations of fault-propagation folds with results from mechanical models to gain insight into the factors that contribute to the wide range of structural geometries observed within this structural class. We find that mechanical layer anisotropy is an important factor in the development of different end-member fault-propagation folding styles.
Earth and Planetary Sciences
APA, Harvard, Vancouver, ISO, and other styles
36

Huang, Jimmy C. "Knowledge integration processes and dynamics : an empirical study of two cross-functional programme teams." Thesis, University of Warwick, 2000. http://wrap.warwick.ac.uk/36379/.

Full text
Abstract:
This thesis critically reviews and evaluates theories of organisational knowledge and knowledge-related activities. Specifically, it assesses and synthesises relevant theories and thoughts to develop a conceptual model of the knowledge integration process. Empirical evidence, collected from two organisations (Boots The Chemists and NatWest Global Financial Markets), is also exploited as a means of building a grounded theory of knowledge integration. This theory explains the processes of knowledge integration within the context of cross-functional project teams. It also considers the general factors that influence these processes, as well as the dynamic interrelationships between the proposed processes. The theory provides a framework not only for future research to systematically examine and test knowledge integration processes within different organisations, but also allows management to continuously anticipate knowledge integration activities within their own organisations. Based on a social construction perspective, this thesis demonstrates that knowledge integration is more than merely the representation of intellectual activities underlying the planning, redesign and implementation stages of a cross-functional programme. It also argues that cross-functional knowledge integration is a continuous process in which programme participants establish emotional alignment through social interaction. This research contributes to studies of organisational knowledge and knowledge-related activities by providing an explorative account that synthesises existing literature with empirical evidence. Secondly, this research contributes to the theoretical development of knowledge integration by focusing on its processes rather than just its outcomes and implications, which have been the main concern of other researchers. Finally, the development of a cross-functional knowledge integration theory contributes to the consolidation of the intellectual and emotional dimensions of knowledge-related activities that have in the past been treated in isolation.
APA, Harvard, Vancouver, ISO, and other styles
37

Hsu, Frederick Bei-Min. "Strategic decision processes and effectiveness : an empirical examination by Lord and Maher's integrative framework." Thesis, University of Manchester, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.631666.

Full text
Abstract:
This thesis addresses a long-standing debate in the field of strategic decision research between theories of rationality (decision models which call for adoption of comprehensive information search to achieve best possible outcomes) and theories of bounded rationality (decision models which address decision makers' limited capacity in information processing). Empirical studies have produced inconclusive findings under conditions of rapid and unpredictable environmental change (turbulence). The various debates and controversies have been compounded by repeated failures to specify limits of proposed mechanisms, and by lack of clarity in terminology. The thesis therefore implemented a research agenda based on Lord and Maher's (1990, 1993) integrative framework. This served as a sensitising device for studying empirically observed behaviours in three major experiments. The framework is developed around three dimensions: rationality (incorporating rational models and bounded rational variants); expertness (adoption of expert judgement); and cyberneticness (trial-and-error learning). An experimental research design allowing for longitudinal observation was adopted to explore the relationships between decision processes and effectiveness under different environmental conditions. Decision environments were split into discontinuous and continuous ones, differentiated by the presence or absence of unexpected jolts. Locating changes in decision style and effectiveness in the discontinuous environments lies at the core of this research. The decision to focus on laboratory experiments has the advantage of providing the exclusive opportunities for close observation of multiple decisions in a short time-scale. However, this reduces the capacity of the study to provide direct links with real-life strategic decision situations. The results are therefore suggestive rather than definitive and are offered for cross-validation in more open decision environments. Within the laboratory settings, MBA students and senior executives participating in strategic decision making were observed. Qualitative data were collected from 124 decision tasks. A total of 330 questionnaires were further collected for the quantitative data analysis. Preliminary analysis results from the qualitative data were supported by those from the quantitative data. This exploratory research paved the way for further research in this field with similar research designs. The core finding was that a specific kind of rationality was identified as effective in discontinuous environments. This decision mode is characterised by a combination of activities directed toward a wide search for information and possibilities (rationality), and testing out the search results (cyberneticness). This mode was labelled as Promethean rationality. Traditionally, based on a cross-sectional view, rationality and cyberneticness in decision making have been seen as mutually exclusive (e.g., Kleinmuntz and Thomas, 1987; Steinbruner, 1974). In light of the discovery of this mode, assumptions in the literature may need re-examination. Additional findings are reported under conditions of environmental continuity. Again Promethean rationality was found to be effective. Also, a decision mode involving a combination of rationality and utilisation of expert judgement (labelled as Confucian rationality) was found to persist, and was identified as effective. Confucian rationality was found under these conditions to support what was partially expected in the literature. In the past, the combination was stressed as important (e.g., Eisenhardt, 1989; Fredrickson, 1985; Simon, 1987). The fact that Promethean rationality was effective under two types of environmental conditions and previously undetected in the more critical conditions gives it higher potential for theory development than Confucian rationality. Identification of the effectiveness of Confucian rationality under environmental continuity could be a useful 'by-product' which adds value to this research. The bounded nature of the experimental trials precludes any 'safe' extrapolation of these claims with confidence to real-life situations. However, it does permit the development of insights ('intuitions') regarding the kind of real-life situations most likely to be fruitful in the search for understanding of the experimental modes identified here.
APA, Harvard, Vancouver, ISO, and other styles
38

El-Khouly, T. A. I. "Creative discovery in architectural design processes : an empirical study of procedural and contextual components." Thesis, University College London (University of London), 2015. http://discovery.ucl.ac.uk/1467244/.

Full text
Abstract:
This research aims to collect empirical evidence on the nature of design by investigating the question: What role do procedural activities (where each design step reflects a unit in a linear process) and contextual activities (an action based on the situation, environment and affordances) play in the generation of creative insights, critical moves, and the formation of design concepts in the reasoning process? The thesis shows how these activities can be identified through the structure of a linkograph, for better understanding the conditions under which creativity and innovation take place. Adopting a mixed methodology, a deductive approach evaluates the existing models that aim to capture the series of design events, while an inductive approach collects data and ethnographic observations for an empirical study of architectural design experiments based on structured and unstructured briefs. A joint approach of quantitative and qualitative analyses is developed to detect the role of evolving actions and structural units of reasoning, particularly the occurrence of creative insights (‘eureka’ and ‘aha!’ moments) in the formation of concepts by judging the gradual transformation of mental imagery and external representations in the sketching process. The findings of this research are: (1) For any design process procedural components are subsets in solving the design problem for synchronic concept development or implementation of the predefined conceptual idea, whereas contextual components relate to a comprehensive view to solve the design problem through concept synthesis of back- and forelinking between the diachronic stages of the design process. (2) This study introduces a new method of looking at evolving design moves and critical actions by considering the time of emergence in the structure of the reasoning process. Directed linkography compares two different situations: the first is synchronous, looking at relations back to preceding events, and the second is diachronic, looking at the design state after completion. Accordingly, creative insights can be categorised into those emerging in incremental reasoning to reframe the solution, and sudden mental insights emerging in non-incremental reasoning to restructure the design problem and reformulate the entire design configuration. (3) Two architectural designing styles are identified: some architects define the design concept early, set goals and persevere in framing and reframing this until the end, whereas others initiate the concept by designing independent conceptual elements and then proceed to form syntheses for the design configuration. Sudden mental insights are most likely to emerge from the unexpected combination of synthesis, particularly in the latter style. In its contribution to design research and creative cognition this dissertation paves the way for a better understanding of the role of reflective practices in design creativity and cognitive processes and presents new insights into what it means to think and design as an architect.
APA, Harvard, Vancouver, ISO, and other styles
39

Sakariya, Sohanraj Mishrimul. "Empirical tests of some equilibrium stochastic processes of the term structure of interest rates." Diss., The University of Arizona, 1991. http://hdl.handle.net/10150/185533.

Full text
Abstract:
In duration-based immunization models, the measure of duration is dependent on the assumed stochastic process of the term structure of interest rates. Most tests of immunization have been based on single factor duration models. However, specifications of term structure motions in earlier models permit the possibility of riskless arbitrage profits, as demonstrated by Ingersoll, Skelton and Weil (1978). This paper draws upon the work of Bierwag (1987b) and presents some discrete equilibrium two-state one-factor models of the return generating process. These models are based on processes that give rise to some familiar measures of duration, namely the Fisher-Weil (FW) duration from the Fisher-Weil Equilibrium Process (FWEP), the special additive duration from the Special Additive Process (SAP), and the additive duration measure from the Additive Discrete Equilibrium Stochastic Process (ADEP). All of these durations were initially based on disequilibrium processes. Therefore, measures of duration associated with disequilibrium processes can also be associated with discrete equilibrium stochastic processes, and hence the criticism of these duration measures as being inconsistent with equilibrium is without merit in this discrete time framework. The focus in this work is to test empirically the discrete equilibrium stochastic processes which give rise to the FW, SAP, and ADEP durations. The empirical approach consists of directly testing the implications of the models using Treasury Bill data, not the immunization efficacy approach utilized in most studies of term structure modelling. Initial tests indicate no support for the SAP and the ADEP and partial support for the FWEP. Tests also indicate that the tax treatment of Treasury Bills has some effect on the sensitivity measures and may partly explain, from initial regression tests, the weak support for the FWEP model. The support for the FWEP varies with the choice of independent variables, and errors in variables is explored as a possible explanation. Various techniques, including two non-parametric techniques, which attempt to overcome the bias induced by errors in variables are explored. Overall, additional tests using these techniques strongly support the FWEP.
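For readers unfamiliar with the duration measures named above, the Fisher-Weil (FW) duration is the standard present-value-weighted mean payment time computed from the full term structure of spot rates rather than a single flat yield. This is the textbook definition, not a formula specific to this thesis:

```latex
% Fisher-Weil duration of a bond paying cash flows CF_t at times t = 1..T,
% discounted with spot rates r_t from the term structure (not a flat yield).
\[
  P \;=\; \sum_{t=1}^{T} \frac{CF_t}{(1+r_t)^{t}},
  \qquad
  D_{FW} \;=\; \frac{1}{P}\sum_{t=1}^{T} \frac{t\,CF_t}{(1+r_t)^{t}}.
\]
```

Setting all spot rates equal, \(r_t = y\), recovers the familiar Macaulay duration as a special case.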
APA, Harvard, Vancouver, ISO, and other styles
40

Barcus, Anka. "Exploring decision making processes in-situ, in-actu, in-toto : an empirical study of decision-making processes in medium software development projects." Thesis, London School of Economics and Political Science (University of London), 2014. http://etheses.lse.ac.uk/1064/.

Full text
Abstract:
Organisational projects, a multifaceted socio-technical phenomenon that evolves in plural contexts often characterised by a high degree of interconnectedness, have become ubiquitous in strategy delivery. The traditional project management literature emphasises the significance of project and organisational objectives to project success, yet it is not clear how these objectives guide action at project level. Aiming to fill the gap, this empirical research studied project decision-making in two organisations with strong rational orientation that communicate strategic direction through objectives hierarchies, and define and manage projects by objectives. To study decision-making practices in project praxis, this thesis introduces the concept of a “decision site” as an area shaped by a triad of mutually constituting practitioners, sociomaterial context and decision-making practices, as well as the concept of “praxis domains” used to analyse entwinement between decision-making practices and sociomaterial context. The environment and participants’ perception was analysed based on semi-structured interviews with practitioners and review of existing organisational documentation, and daily project meetings were audio recorded through silent observation. Twenty-eight decision episodes were identified and described in their organisational project context. Two process representations aided analyses of decision episodes, one tracing discursive reference to praxis domains, and the other diagramming decision-making activities which manage a decision site. Decision-making practices of “Neguesstimation” and “Querying Praxis Domains” were defined and differentiated by schemes and degree of entwinement with praxis domains. The thesis findings do not support the notion of project and corporate objectives as being instrumental in project decision-making. Instead, one of the observed practices queries praxis domains as proxies for complex hierarchies of organisational objectives and constructs a decision site imbued with local practical logic. The thesis argues that practical logic could be successfully employed in aligning project level activities to complex and dynamic organisational context and suggests potential for development of practice based decision-making approaches.
APA, Harvard, Vancouver, ISO, and other styles
41

Stewart, Michael Ian. "Asymptotic methods for tests of homogeneity for finite mixture models." Thesis, The University of Sydney, 2002. http://hdl.handle.net/2123/855.

Full text
Abstract:
We present limit theory for tests of homogeneity for finite mixture models. More specifically, we derive the asymptotic distribution of certain random quantities used for testing that a mixture of two distributions is in fact just a single distribution. Our methods apply to cases where the mixture component distributions come from one of a wide class of one-parameter exponential families, both continuous and discrete. We consider two random quantities, one related to testing simple hypotheses, the other composite hypotheses. For simple hypotheses we consider the maximum of the standardised score process, which is itself a test statistic. For composite hypotheses we consider the maximum of the efficient score process, which is itself not a statistic (it depends on the unknown true distribution) but is asymptotically equivalent to certain common test statistics in a certain sense. We show that we can approximate both quantities with the maximum of a certain Gaussian process depending on the sample size and the true distribution of the observations, which when suitably normalised has a limiting distribution of the Gumbel extreme value type. Although the limit theory is not practically useful for computing approximate p-values, we use Monte-Carlo simulations to show that another method suggested by the theory, involving using a Studentised version of the maximum-score statistic and simulating a Gaussian process to compute approximate p-values, is remarkably accurate and uses a fraction of the computing resources that a straight Monte-Carlo approximation would.
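The simulation device mentioned in the last sentence can be sketched generically: draw paths of a mean-zero Gaussian process on a grid of parameter values, record the maximum of each path, and compare with the observed maximum-score statistic. The squared-exponential covariance below is a placeholder, not the covariance derived in the thesis:

```python
# Hypothetical sketch: Monte-Carlo p-value for the maximum of a Gaussian
# process over a parameter grid, approximating the null distribution of a
# maximum-score-type statistic.
import numpy as np

def mc_pvalue_max_gp(observed_max, grid, cov_fn, n_sims=10000, seed=0):
    rng = np.random.default_rng(seed)
    # Covariance matrix of the process on the grid, with jitter for stability
    K = np.array([[cov_fn(s, t) for t in grid] for s in grid])
    K += 1e-8 * np.eye(len(grid))
    L = np.linalg.cholesky(K)
    z = rng.standard_normal((n_sims, len(grid)))
    paths = z @ L.T                      # each row is one simulated path
    sup = paths.max(axis=1)              # maximum over the grid
    return float(np.mean(sup >= observed_max))

# Example with a placeholder squared-exponential covariance:
grid = np.linspace(0.1, 2.0, 50)
p = mc_pvalue_max_gp(3.1, grid, lambda s, t: np.exp(-0.5 * (s - t) ** 2))
print(p)
```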
APA, Harvard, Vancouver, ISO, and other styles
42

Stewart, Michael Ian. "Asymptotic methods for tests of homogeneity for finite mixture models." University of Sydney. Mathematics and Statistics, 2002. http://hdl.handle.net/2123/855.

Full text
Abstract:
We present limit theory for tests of homogeneity for finite mixture models. More specifically, we derive the asymptotic distribution of certain random quantities used for testing that a mixture of two distributions is in fact just a single distribution. Our methods apply to cases where the mixture component distributions come from one of a wide class of one-parameter exponential families, both continuous and discrete. We consider two random quantities, one related to testing simple hypotheses, the other composite hypotheses. For simple hypotheses we consider the maximum of the standardised score process, which is itself a test statistic. For composite hypotheses we consider the maximum of the efficient score process, which is itself not a statistic (it depends on the unknown true distribution) but is asymptotically equivalent to certain common test statistics in a certain sense. We show that we can approximate both quantities with the maximum of a certain Gaussian process depending on the sample size and the true distribution of the observations, which when suitably normalised has a limiting distribution of the Gumbel extreme value type. Although the limit theory is not practically useful for computing approximate p-values, we use Monte-Carlo simulations to show that another method suggested by the theory, involving using a Studentised version of the maximum-score statistic and simulating a Gaussian process to compute approximate p-values, is remarkably accurate and uses a fraction of the computing resources that a straight Monte-Carlo approximation would.
APA, Harvard, Vancouver, ISO, and other styles
43

SILVA, Rodrigo Bernardo da. "A Bayesian approach for modeling stochastic deterioration." Universidade Federal de Pernambuco, 2010. https://repositorio.ufpe.br/handle/123456789/5610.

Full text
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico
Deterioration modelling has been at the forefront of Bayesian reliability analyses. The best-known approaches in the literature for this purpose assess the behaviour of the reliability measure over time in the light of empirical data alone. In the context of reliability engineering, these approaches have limited applicability, since one frequently deals with situations characterised by a scarcity of empirical data. Inspired by Bayesian strategies that aggregate empirical data and expert opinion in the modelling of time-independent reliability measures, this work proposes a methodology for dealing with time-dependent reliability. The proposed methodology encapsulates well-known Bayesian approaches, such as Bayesian methods for combining empirical data and expert opinion and time-indexed Bayesian models, improving upon them in order to arrive at a more realistic model to describe the deterioration process of a given component or system. The cases discussed are those typically found in reliability practice (by means of simulation): assessment of runtime data for failure rates and the amount of deterioration; demand-based data for failure probability; and expert opinion for the analysis of failure rate, amount of deterioration and failure probability. These case studies show that the use of expert information can lead to a reduction of uncertainty about the distributions of reliability measures, especially in situations where few or no failures are observed.
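As a minimal illustration of the kind of Bayesian aggregation described above (and not the thesis's own model), a conjugate Gamma-Poisson update can combine a failure-rate prior elicited from expert opinion with sparse failure data; all numbers below are hypothetical:

```python
# Illustrative sketch: conjugate Gamma-Poisson updating of a failure rate,
# with the Gamma prior elicited from a hypothetical expert mean/sd.
from scipy import stats

def elicit_gamma_prior(m, s):
    # Match expert mean m and sd s to Gamma(shape a, rate b): m = a/b, s^2 = a/b^2
    b = m / s**2
    a = m * b
    return a, b

a0, b0 = elicit_gamma_prior(m=2e-3, s=1e-3)    # expert: ~2e-3 failures/hour
n_failures, exposure_hours = 1, 5000            # scarce empirical data
a1, b1 = a0 + n_failures, b0 + exposure_hours   # conjugate posterior update

posterior = stats.gamma(a=a1, scale=1 / b1)
print(posterior.mean(), posterior.interval(0.90))
```

The posterior mean sits between the expert's estimate and the raw empirical rate, which is exactly the uncertainty-reduction effect the abstract describes when failures are rare.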
APA, Harvard, Vancouver, ISO, and other styles
44

Gong, Yun. "Empirical likelihood and extremes." Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/43581.

Full text
Abstract:
In 1988, Owen introduced empirical likelihood as a nonparametric method for constructing confidence intervals and regions. Since then, empirical likelihood has been studied extensively in the literature due to its generality and effectiveness. It is well known that empirical likelihood has several attractive advantages compared to its competitors such as the bootstrap: determining the shape of confidence regions automatically using only the data; straightforwardly incorporating side information expressed through constraints; and being Bartlett correctable. The main part of this thesis extends the empirical likelihood method to several interesting and important statistical inference situations. This thesis has four components. The first component (Chapter II) proposes a smoothed jackknife empirical likelihood method to construct confidence intervals for the receiver operating characteristic (ROC) curve in order to overcome the computational difficulty when we have nonlinear constraints in the maximization problem. The second component (Chapters III and IV) proposes smoothed empirical likelihood methods to obtain interval estimation for the conditional Value-at-Risk with the volatility model being an ARCH/GARCH model and a nonparametric regression respectively, which have applications in financial risk management. The third component (Chapter V) derives the empirical likelihood for intermediate quantiles, which play an important role in the statistics of extremes. Finally, the fourth component (Chapters VI and VII) presents two additional results: in Chapter VI, we present an interesting result by showing that, when the third moment is infinite, we may prefer the Student's t-statistic to the sample mean standardized by the true standard deviation; in Chapter VII, we present a method for testing a subset of parameters for a given parametric model of stationary processes.
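For context, Owen's original empirical likelihood for a scalar mean reduces to a one-dimensional root search for a Lagrange multiplier. The sketch below shows that baseline construction only, not the smoothed or jackknife variants developed in the thesis:

```python
# Minimal sketch of Owen-style empirical likelihood for a scalar mean.
# log R(mu) = -sum log(1 + lam*(x_i - mu)), where lam solves the profile
# equation sum (x_i - mu) / (1 + lam*(x_i - mu)) = 0.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_log_ratio(x, mu):
    d = x - mu
    if d.min() >= 0 or d.max() <= 0:
        return -np.inf                   # mu outside the convex hull of data
    # lam must keep all implied weights positive: 1 + lam*d_i > 0
    lo = -1.0 / d.max() + 1e-9
    hi = -1.0 / d.min() - 1e-9
    lam = brentq(lambda l: np.sum(d / (1 + l * d)), lo, hi)
    return -np.sum(np.log(1 + lam * d))

x = np.random.default_rng(1).normal(0.0, 1.0, 100)
stat = -2 * el_log_ratio(x, mu=0.0)      # ~ chi-square(1) under H0
print(stat, chi2.sf(stat, df=1))         # test statistic and p-value
```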
APA, Harvard, Vancouver, ISO, and other styles
45

Guyonvarch, Yannick. "Essays in robust estimation and inference in semi- and nonparametric econometrics." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLG007/document.

Full text
Abstract:
In the introductory chapter, we compare views on estimation and inference in the econometric and statistical learning disciplines. In the second chapter, our interest lies in a generic class of nonparametric instrumental models. We extend the estimation procedure in Otsu (2011) by adding a regularisation term to it. We prove the consistency of our estimator under Lebesgue's L2 norm. In the third chapter, we show that when observations are jointly exchangeable rather than independent and identically distributed (i.i.d), a modified version of the empirical process converges weakly towards a Gaussian process under the same conditions as in the i.i.d case. We obtain a similar result for a modified version of the bootstrapped empirical process. We apply our results to get the asymptotic normality of several nonlinear estimators and the validity of bootstrap-based inference. Finally, we revisit the empirical work of Santos Silva and Tenreyro (2006). In the fourth chapter, we address the issue of conducting inference on ratios of expectations. We find that when the denominator tends to zero slowly enough as the number of observations n increases, bootstrap-based inference is asymptotically valid. Secondly, we complement an impossibility result of Dufour (1997) by showing that whenever n is finite it is possible to construct confidence intervals which are not pathological under some conditions on the denominator. In the fifth chapter, we present a Stata command which implements estimators proposed in de Chaisemartin and d'Haultfoeuille (2018) to measure several types of treatment effects widely studied in practice.
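A minimal sketch of the bootstrap procedure whose validity the fourth chapter studies, assuming i.i.d. pairs and a denominator bounded away from zero, might look as follows (illustrative only; not the thesis's code):

```python
# Percentile-bootstrap confidence interval for a ratio of expectations
# E[Y]/E[X]; pairs (y_i, x_i) are resampled jointly via a shared index.
import numpy as np

def bootstrap_ratio_ci(y, x, level=0.95, n_boot=9999, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y)
    idx = rng.integers(0, n, size=(n_boot, n))       # bootstrap resamples
    ratios = y[idx].mean(axis=1) / x[idx].mean(axis=1)
    alpha = 1 - level
    return np.quantile(ratios, [alpha / 2, 1 - alpha / 2])

rng = np.random.default_rng(42)
x = rng.exponential(1.0, 500)            # denominator bounded away from zero
y = 2.0 * x + rng.normal(0, 0.5, 500)
print(bootstrap_ratio_ci(y, x))          # should cover E[Y]/E[X] = 2
```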
APA, Harvard, Vancouver, ISO, and other styles
46

Sexton, Nicholas J. "Human task switching and the role of inhibitory processes : a computational modelling and empirical approach." Thesis, Birkbeck (University of London), 2018. http://bbktheses.da.ulcc.ac.uk/356/.

Full text
Abstract:
Task switching is a behavioural paradigm within cognitive psychology that has been claimed to reflect the activity of high-level cognitive control processes. However, classic behavioural markers such as the (n-1) switch cost have also been shown to reflect a multitude of other cognitive processes. The n-2 repetition paradigm has proven more successful, with a behavioural measure (the n-2 repetition cost) agreed to be reflective of a cognitive inhibition mechanism (‘backward inhibition’). The present thesis develops computational models of task switching, including a backward inhibition model. The models are developed within the interactive-activation and competition (IAC) framework, as a development of an existing task switching model. Modelling is constrained by the general computational principles of the IAC framework and default parameter settings where these are shared with earlier models. The effect of specific novel parameter settings on behaviour is explored systematically. The backward inhibition model predicts a range of empirically observed behavioural phenomena including both n-1 switch and n-2 repetition costs, and the modulation of the n-2 repetition cost under certain circumstances, including the manipulation of intertrial intervals. A specific prediction of the model, the modulation of n-2 repetition costs according to switch direction when tasks are of different difficulties, is tested empirically, with results confirming and providing validation of the model. Finally, consideration is given to how such a backward inhibition model could be adapted to maximise performance benefits in different task switching contexts, via a process of parameter tuning.
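The generic IAC activation update (after McClelland and Rumelhart) on which such models build can be sketched as follows; the parameter values and the two-unit mutual-inhibition network below are purely illustrative, not the connectivity or backward-inhibition mechanism of the thesis's model:

```python
# Generic interactive-activation-and-competition (IAC) update step.
import numpy as np

def iac_step(a, W, ext, a_max=1.0, a_min=-0.2, rest=-0.1, decay=0.1, dt=1.0):
    net = W @ np.maximum(a, 0.0) + ext           # only active units send output
    up = net > 0
    # Excitation drives activation toward a_max, inhibition toward a_min
    delta = np.where(up, (a_max - a) * net, (a - a_min) * net)
    delta -= decay * (a - rest)                  # passive decay toward rest
    return np.clip(a + dt * delta, a_min, a_max)

# Two task-demand units with mutual inhibition; external input favours task 0.
W = np.array([[0.0, -0.8],
              [-0.8, 0.0]])
a = np.full(2, -0.1)
for _ in range(50):
    a = iac_step(a, W, ext=np.array([0.4, 0.0]))
print(a)                                         # unit 0 active, unit 1 suppressed
```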
APA, Harvard, Vancouver, ISO, and other styles
47

Stuck, Jérôme Jürgen [Verfasser]. "Innovation processes in regional innovation systems : classification, network structures, and empirical determinants / Jérôme Jürgen Stuck." Hannover : Technische Informationsbibliothek und Universitätsbibliothek Hannover (TIB), 2015. http://d-nb.info/1074265130/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Burrill, David Michael. "Third party intervention in industrial disputes : an empirical study of the processes and effectiveness of ACAS conciliation in British collective bargaining." Thesis, University of Bradford, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.235614.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Abou, Khalil Zeinab. "Understanding the impact of release policies on software development processes." Thesis, Lille, 2021. http://www.theses.fr/2021LILUI011.

Full text
Abstract:
The advent of delivering new features faster has led many software projects to change their development processes towards more rapid release models, where releases are shipped using release cycles of weeks or days. The adoption of rapid release practices has significantly reduced the amount of stabilization time (the time it takes for a software product's failure rate to reach close to the steady state) available for new features. This forces organizations to change their development processes and tools in order to release to the public in a timely manner and with good quality. Rapid releases are claimed to offer a reduced time-to-market and faster user feedback; end-users benefit from faster access to functionality improvements and security updates, and turnaround time for fixing severe bugs improves. Despite these benefits, previous research has shown that rapid releases often come at the expense of reduced software reliability. Moreover, despite the increasing adoption of rapid releases in open-source and commercial software, the effects of this practice on the software development process are not well understood. The goal of this thesis is to provide a deeper understanding of how rapid releases impact different parts of the open-source software development process. We present empirical evidence about the short- and long-term impact of rapid releases on the bug handling and testing processes in open source organizations, and about the plans and tools that are needed for successful adoption of rapid releases. This thesis presents an empirical case study of rapid releases in the Eclipse and Mozilla Firefox projects. We follow a mixed-methods approach in which we analyze software repositories containing different types of data (source code, testing data and software issues) and conduct a survey with Eclipse developers. This helps in understanding the evolution and changes of the software development process and the plans and practices needed for successful adoption of rapid releases, and we identify several future research directions calling for further investigation.
APA, Harvard, Vancouver, ISO, and other styles
50

Fernández, Martínez Adrián. "A Usability Inspection Method for Model-driven Web Development Processes." Doctoral thesis, Universitat Politècnica de València, 2012. http://hdl.handle.net/10251/17845.

Full text
Abstract:
Web applications are currently considered an essential and indispensable element in all business activity and information exchange, and the engine of social networks. Usability, in this type of application, is recognised as one of the most important key factors, since the ease or difficulty that users experience with these applications largely determines their success or failure. However, there are several limitations in current Web usability evaluation approaches: the usability concept is only partially supported; usability evaluations are mainly performed once the Web application has already been developed; there is a lack of guidance on how to properly integrate usability into Web development; and there is also a lack of Web usability evaluation methods that have been empirically validated. In addition, most Web development processes do not take advantage of the artifacts produced in the design phases. These intermediate software artifacts are mainly used to guide developers and to document the Web application, but not to perform usability evaluations. Since the traceability between these artifacts and the final Web application is not well defined, performing usability evaluations of these artifacts is difficult. This problem is mitigated in model-driven Web development (MDWD), where the intermediate artifacts (models), which represent different perspectives of a Web application, are used in all stages of the development process, and the final source code is automatically generated from these models. Taking into account the traceability between these models, evaluating them makes it possible to detect usability problems that would be experienced by the end-users of the final Web application, and to provide recommendations for correcting these problems during the early phases of the Web development process. Addressing the limitations identified above, this thesis aims to propose a usability inspection method that can be integrated into different model-driven Web development processes. The method is composed of a Web usability model that decomposes the usability concept into sub-characteristics, attributes and generic metrics, and a Web Usability Evaluation Process (WUEP) that provides guidelines on how the usability model can be used to carry out specific evaluations. The generic metrics of the usability model must be operationalised in order to be applicable to the software artifacts of different Web development methods and at different abstraction levels, which makes it possible to evaluate usability at several stages of the Web development process, especially the early ones. Both the usability model and the evaluation process are aligned with the latest ISO/IEC 25000 standard for software product quality evaluation (SQuaRE). The proposed usability inspection method (WUEP) has been instantiated in two different model-driven Web development processes (OO-H and WebML) in order to demonstrate the feasibility of our proposal. In addition, WUEP was empirically validated by conducting a family of experiments in OO-H and a controlled experiment in WebML. The aim of our empirical studies was to evaluate the effectiveness, efficiency, perceived ease of use and perceived satisfaction of the participants when using WUEP, in comparison with a widely used industrial inspection method: Heuristic Evaluation (HE). The statistical analysis and meta-analysis of the data obtained separately from each experiment indicated that WUEP is more effective and efficient than HE in detecting usability problems. The evaluators also perceived more satisfaction when applying WUEP.
Fernández Martínez, A. (2012). A Usability Inspection Method for Model-driven Web Development Processes [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/17845
APA, Harvard, Vancouver, ISO, and other styles