Dissertations / Theses on the topic 'Research models'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Research models.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.
Benedetti, Andrea. "Generalized models in epidemiology research." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=84472.
In the final simulation study, parametric multiple logistic regression was compared with its nonparametric GAM extension in terms of their ability to control for a continuous confounding variable, and several issues related to the implementation of GAMs in this context were investigated.
The results of these simulations will help researchers make optimal use of the potential advantages of flexible assumption-free modelling.
Lambert, Paul Christopher. "Hierarchical models in medical research." Thesis, University of Leicester, 2000. http://hdl.handle.net/2381/29361.
Spencer, Neil Hardy. "Longitudinal multilevel models in educational research." Thesis, Lancaster University, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.306918.
Full textNEVES, ANTONIO BERNARDO FERREIRA. "STATISTICAL MODELS IN ADVERTISING MARKET RESEARCH." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 1991. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=9046@1.
Advertising is without doubt one of the most important weapons of marketing. However, measuring its short-term efficiency as an increase in sales can be a difficult task, especially when compared with the results of promotions. For that reason, statistical models are being developed that use measures other than sales volumes. At the same time, advertising has come to be seen in a more scientific way and has taken a remarkable place within marketing research, creating different schools of thought about the best way of guaranteeing the return on investment. This work attempts to link the theory with the most important and best-confirmed results of advertising research, presenting a methodology that secures this return with some guarantee.
Messmacher, Eduardo B. (Eduardo Bernhart) 1972. "Models for project management." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/9217.
Also available online at the DSpace at MIT website.
Includes bibliographical references (p. 119-122).
Organizations perform work essentially through operations and projects. The characteristics of projects make them extremely difficult to manage: their non-repetitive nature rules out trial-and-error learning, while their short life span is particularly unforgiving of misjudgments. Some authors have found that effective scheduling is an important contributor to the success of research and development (R&D), as well as construction projects. The widely used critical path method for scheduling projects and identifying important activities fails to capture two important dimensions of the problem: the availability of different technologies (or options) to perform the activities, and the inherent problem of limited availability of resources that most managers face. Nevertheless, when one tries to account for such additional constraints, the problems become very hard to solve. In this thesis we propose an approach to the scheduling problem using a genetic algorithm, and compare its performance to more traditional approaches, such as an extension of a very innovative Lagrangian relaxation approach proposed recently. The purpose of using genetic algorithms is twofold: first, to obtain good approximations to very hard problems, and second, to understand the limitations and virtues of this search technique. The purpose of this thesis is not only to develop the algorithms, but also to obtain insight into the implications of the additional constraints from the perspective of a project manager.
by Eduardo B. Messmacher.
S.M.
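A rough illustration of the genetic-algorithm approach this thesis describes: the sketch below assumes a toy project (invented durations, precedences, and one renewable resource of capacity 3) and uses the standard priority-list encoding decoded by a serial schedule-generation scheme; it is not Messmacher's actual algorithm.

```python
# Minimal GA for resource-constrained project scheduling (illustrative only).
import random

DUR = {1: 3, 2: 2, 3: 4, 4: 2}             # activity durations (invented)
PRED = {1: [], 2: [1], 3: [1], 4: [2, 3]}  # precedence constraints
REQ = {1: 2, 2: 1, 3: 2, 4: 1}             # resource use per activity
CAP = 3                                    # renewable resource capacity

def makespan(order):
    """Decode a priority list with a serial schedule-generation scheme."""
    start, usage = {}, {}
    remaining = list(order)
    while remaining:
        # schedule the highest-priority activity whose predecessors are done
        act = next(a for a in remaining if all(p in start for p in PRED[a]))
        t = max((start[p] + DUR[p] for p in PRED[act]), default=0)
        while any(usage.get(t + d, 0) + REQ[act] > CAP for d in range(DUR[act])):
            t += 1  # delay until the resource is free for the whole duration
        for d in range(DUR[act]):
            usage[t + d] = usage.get(t + d, 0) + REQ[act]
        start[act] = t
        remaining.remove(act)
    return max(start[a] + DUR[a] for a in start)

def crossover(p1, p2):
    """One-point crossover that keeps each child a valid permutation."""
    cut = random.randrange(1, len(p1))
    return p1[:cut] + [a for a in p2 if a not in p1[:cut]]

pop = [random.sample(sorted(DUR), len(DUR)) for _ in range(20)]
for _ in range(50):                      # evolve: keep elites, breed the rest
    pop.sort(key=makespan)
    pop = pop[:10] + [crossover(*random.sample(pop[:10], 2)) for _ in range(10)]
print("best makespan found:", makespan(min(pop, key=makespan)))
```

The priority-list encoding is a common choice here because every chromosome decodes to a feasible schedule, so crossover never produces invalid offspring.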
Brus, Linda. "Recursive black-box identification of nonlinear state-space ODE models." Licentiate thesis, Uppsala : Department of Information Technology, Uppsala University, 2006. http://www.it.uu.se/research/publications/lic/2006-001/.
Wiedemann, Michael. "Robust parameter design for agent-based simulation models with application in a cultural geography model." Thesis, Monterey, California : Naval Postgraduate School, 2010. http://edocs.nps.edu/npspubs/scholarly/theses/2010/Jun/10Jun%5FWiedemann.pdf.
Thesis Advisor(s): Johnson, Rachel T.; Second Reader: Baez, Francisco R. "June 2010." Description based on title screen as viewed on July 15, 2010. Author(s) subject terms: Cultural Geography, Agent-Based Model (ABM), Irregular Warfare (IW), Theory of Planned Behavior (TpB), Bayesian Belief Nets (BBN), Counterinsurgency Operations (COIN), Stability Operations, Discrete Event Simulation (DES), Design of Experiments (DOX), Robust Parameter Design (RPD). Includes bibliographical references (p. 69-70). Also available in print.
Chandler, James D. "Estimating reliability with discrete growth models." Thesis, Monterey, Calif. : Naval Postgraduate School, 1988. http://edocs.nps.edu/npspubs/scholarly/theses/2008/Dec/08Dec%5FNAME.pdf.
Monsch, Matthieu (Matthieu Frederic). "Large scale prediction models and algorithms." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/84398.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 129-132).
Over 90% of the data available across the world has been produced over the last two years, and the trend is increasing. It has therefore become paramount to develop algorithms which are able to scale to very high dimensions. In this thesis we are interested in showing how we can use structural properties of a given problem to come up with models applicable in practice, while keeping most of the value of a large data set. Our first application provides a provably near-optimal pricing strategy under large-scale competition, and our second focuses on capturing the interactions between extreme weather and damage to the power grid from large historical logs. The first part of this thesis is focused on modeling competition in Revenue Management (RM) problems. RM is used extensively across a swathe of industries, ranging from airlines to the hospitality industry to retail, and the internet has, by reducing search costs for customers, potentially added a new challenge to the design and practice of RM strategies: accounting for competition. This work considers a novel approach to dynamic pricing in the face of competition that is intuitive, tractable and leads to asymptotically optimal equilibria. We also provide empirical support for the notion of equilibrium we posit. The second part of this thesis was done in collaboration with a utility company in the Northeast of the United States. In recent years, there have been a number of powerful storms that led to extensive power outages. We provide a unified framework to help power companies reduce the duration of such outages. We first train a data-driven model to predict the extent and location of damage from weather forecasts. This information is then used in a robust optimization model to optimally dispatch repair crews ahead of time. Finally, we build an algorithm that uses incoming customer calls to compute the likelihood of damage at any point in the electrical network.
by Matthieu Monsch.
Ph.D.
McLure, Stewart William Douglas. "Improving models for translational research in osteoarthritis." Thesis, University of Leeds, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.590471.
Selén, Yngve. "Model selection." Uppsala: Department of Information Technology, Uppsala University, 2004. http://www.it.uu.se/research/reports/lic/2004-003/.
Full textWeimar, Jörg Richard. "Cellular automata models for excitable media." Thesis, Virginia Tech, 1991. http://hdl.handle.net/10919/41365.
A cellular automaton is developed for simulating excitable media. First, general "masks" as discrete approximations to the diffusion equation are examined, showing how to calculate the diffusion coefficient from the elements of the mask. The mask is then combined with a thresholding operation to simulate the propagation of waves (shock fronts) in excitable media, showing that (for well-chosen masks) the waves obey a linear "speed-curvature" relation with slope given by the predicted diffusion coefficient. The utility of different masks in terms of computational efficiency and adherence to a linear speed-curvature relation is assessed. Then, a cellular automaton model for wave propagation in reaction-diffusion systems is constructed based on these "masks" for the diffusion component and on singular perturbation analysis for the reaction component. The cellular automaton is used to model spiral waves in the Belousov-Zhabotinskii reaction. The behavior of the spiral waves and the movement of the spiral tip are analyzed. By comparing these results to solutions of the Oregonator PDE model, the automaton is shown to be a useful and efficient replacement for the standard numerical solution of the PDEs.
Master of Science
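The mask-plus-threshold construction in this abstract is easy to demonstrate. A minimal sketch follows, assuming an invented 3x3 diffusion mask and simple excitation/refractory rules in the Greenberg-Hastings style, rather than Weimar's exact automaton:

```python
# Cellular automaton for an excitable medium: diffusion mask + threshold.
import numpy as np

N, STEPS = 100, 200
THRESH, REFRACTORY = 0.3, 5
mask = np.array([[0.05, 0.1, 0.05],   # discrete diffusion mask; its weights
                 [0.1,  0.4, 0.1 ],   # determine the effective diffusion
                 [0.05, 0.1, 0.05]])  # coefficient (invented values)

u = np.zeros((N, N))            # excitation field
clock = np.zeros((N, N), int)   # refractory countdown per cell
u[N // 2, : N // 2] = 1.0       # a half-line of excitation seeds the waves

for _ in range(STEPS):
    # diffusion step: convolve u with the mask (periodic boundaries)
    padded = np.pad(u, 1, mode="wrap")
    diff = sum(mask[i, j] * padded[i:i + N, j:j + N]
               for i in range(3) for j in range(3))
    fire = (diff > THRESH) & (clock == 0)   # thresholding operation
    u = np.where(fire, 1.0, 0.0)
    clock = np.where(fire, REFRACTORY, np.maximum(clock - 1, 0))
print("excited cells at end:", int(u.sum()))
```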
Cerrón-Palomino, Rodolfo, and Peter Kaulicke. "Research in Andean Linguistics." Pontificia Universidad Católica del Perú, 2012. http://repositorio.pucp.edu.pe/index/handle/123456789/113289.
Imam, Md Kaisar. "Improvements to the complex question answering models." Thesis, Lethbridge, Alta. : University of Lethbridge, c2011, 2011. http://hdl.handle.net/10133/3214.
x, 128 leaves : ill. ; 29 cm
Cha, Jin Seob. "Obtaining information from stochastic Lanchester-type combat models." The Ohio State University, 1989. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487673114113213.
Wong, Chun-mei May. "Multilevel models for survival analysis in dental research." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://sunzi.lib.hku.hk/hkuto/record/B3637216X.
Duncan, Warwick John. "Sheep mandibular animal models for dental implantology research." University of Otago. School of Dentistry, 2005. http://adt.otago.ac.nz./public/adt-NZDU20060707.144214.
Wong, Chun-mei May, and 王春美. "Multilevel models for survival analysis in dental research." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2005. http://hub.hku.hk/bib/B3637216X.
Lu, Jun. "Bayesian hierarchical models and applications in psychology research." Free to MU campus, to others for purchase, 2004. http://wwwlib.umi.com/cr/mo/fullcit?p3144437.
Pan, Huiqi. "Multilevel models in human growth and development research." Thesis, University College London (University of London), 1995. http://discovery.ucl.ac.uk/10020243/.
Kight, William D. "An analysis of reasonableness models for research assessments." ScholarWorks, 2010. https://scholarworks.waldenu.edu/dissertations/719.
Sidumo, Bonelwa. "Generalized linear models, with applications in fisheries research." Thesis, Rhodes University, 2018. http://hdl.handle.net/10962/61102.
Gupta, Vishal Ph D. Massachusetts Institute of Technology. "Data-driven models for uncertainty and behavior." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/91301.
Full textThis electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 173-180).
The last decade has seen an explosion in the availability of data. In this thesis, we propose new techniques to leverage these data to tractably model uncertainty and behavior. Specifically, this thesis consists of three parts: In the first part, we propose a novel schema for utilizing data to design uncertainty sets for robust optimization using hypothesis testing. The approach is flexible and widely applicable, and robust optimization problems built from our new data-driven sets are computationally tractable, both theoretically and practically. Optimal solutions to these problems enjoy a strong, finite-sample probabilistic guarantee. Computational evidence from classical applications of robust optimization (queueing and portfolio management) confirms that our new data-driven sets significantly outperform traditional robust optimization techniques whenever data is available. In the second part, we examine in detail an application of the above technique to the unit commitment problem. Unit commitment is a large-scale, multistage optimization problem under uncertainty that is critical to power system operations. Using real data from the New England market, we illustrate how our proposed data-driven uncertainty sets can be used to build high-fidelity models of the demand for electricity, and that the resulting large-scale, mixed-integer adaptive optimization problems can be solved efficiently. With respect to this second contribution, we propose new data-driven solution techniques for this class of problems inspired by ideas from machine learning. Extensive historical back-testing confirms that our proposed approach generates high-quality solutions that are comparable with state-of-the-art methods. In the third part, we focus on behavioral modeling. Utility maximization (single-agent case) and equilibrium modeling (multi-agent case) are by far the most common behavioral models in operations research. By combining ideas from inverse optimization with the theory of variational inequalities, we develop an efficient, data-driven technique for estimating the primitives of these models. Our approach supports both parametric and nonparametric estimation through kernel learning. We prove that our estimators enjoy a strong generalization guarantee even when the model is misspecified. Finally, we present computational evidence from applications in economics and transportation science illustrating the effectiveness of our approach and its scalability to large-scale instances.
by Vishal Gupta.
Ph. D.
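To make the first part concrete, here is a minimal sketch of a data-driven uncertainty set, assuming invented returns data and an ellipsoid sized by a chi-square quantile at level alpha; the thesis's hypothesis-testing constructions are more general than this:

```python
# Build an ellipsoidal uncertainty set from data and price the worst case.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
returns = rng.normal([0.05, 0.03, 0.08], 0.02, size=(500, 3))  # fake data

mu = returns.mean(axis=0)
Sigma = np.cov(returns.T)
alpha = 0.05
radius = np.sqrt(chi2.ppf(1 - alpha, df=3))   # set size from a test quantile

x = np.array([0.3, 0.3, 0.4])  # candidate portfolio weights
# worst-case return over U = {r : (r - mu)' Sigma^{-1} (r - mu) <= radius^2},
# which is mu'x - radius * sqrt(x' Sigma x) in closed form
L = np.linalg.cholesky(Sigma)
worst_case = mu @ x - radius * np.linalg.norm(L.T @ x)
print(f"worst-case portfolio return: {worst_case:.4f}")
```

The closed-form worst case is what makes robust counterparts over such sets computationally tractable.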
Ng, Yee Sian. "Advances in data-driven models for transportation." Thesis, Massachusetts Institute of Technology, 2019. https://hdl.handle.net/1721.1/122100.
Thesis: Ph. D., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2019
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 163-176).
With the rising popularity of ride-sharing and alternative modes of transportation, there has been a renewed interest in transit planning to improve service quality and stem declining ridership. However, it often takes months of manual planning for operators to redesign and reschedule services in response to changing needs. To this end, we provide four models of transportation planning that are based on data and driven by optimization. A key aspect is the ability to provide certificates of optimality, while being practical in generating high-quality solutions in a short amount of time. We provide approaches to combinatorial problems in transit planning that scale up to city-sized networks. In transit network design, current tractable approaches only consider edges that exist, resulting in proposals that are closely tethered to the original network. We allow new transit links to be proposed and account for commuters transferring between different services. In integrated transit scheduling, we provide a way for transit providers to synchronize the timing of services in multimodal networks while ensuring regularity in the timetables of the individual services. This is made possible by taking the characteristics of transit demand patterns into account when designing tractable formulations. We also advance the state of the art in demand models for transportation optimization. In emergency medical services, we provide data-driven formulations that outperform their probabilistic counterparts in ensuring coverage. This is achieved by replacing independence assumptions in probabilistic models and capturing the interactions of services in overlapping regions. In transit planning, we provide a unified framework that allows us to optimize frequencies and prices jointly in transit networks for minimizing total waiting time.
by Yee Sian Ng.
Ph. D.
Li, Kevin Bozhe. "Multiperiod Optimization Models in Operations Management." Thesis, University of California, Berkeley, 2019. http://pqdtopen.proquest.com/#viewpdf?dispub=13423656.
In the past two decades, retailers have witnessed rapid changes in markets due to an increase in competition, the rise of e-commerce, and ever-changing consumer behavior. As a result, retailers have become increasingly aware of the need to better coordinate inventory control with pricing in order to maximize their profitability. This dissertation was motivated by two such problems facing retailers at the interface between pricing and inventory control. One considers inventory control decisions for settings in which planned prices fluctuate over time, and the other considers pricing of multiple substitutable products for settings in which customers hold inventory as a consequence of stockpiling when promotional prices are offered.
In Chapter 1, we provide a brief motivation for each problem. In Chapter 2, we consider optimization of procurement and inventory allocation decisions by a retailer that sells a product with a long production lead time and a short selling season. The retailer orders most products months before the selling season, and places only one order for each product due to short product life cycles and long delivery lead times. Goods are initially stored at the warehouse and then sent to stores over the course of the season. The stores are in high-rent locations, necessitating efficient use of space, so there is no backroom space and it is uneconomical to send goods back to the warehouse; thus, all inventory at each store is available for sale. Due to marketing and logistics considerations, the planned trajectory of prices is determined in advance and may be non-monotonic. Demand is stochastic and price-dependent, and independent across time periods. We begin our analysis with the case of a single store. We first formulate the inventory allocation problem given a fixed initial order quantity with the objective of maximizing expected profit as a dynamic program and explain both technical and computational challenges in identifying the optimal policy. We then present two variants of a heuristic based on the notion of equalizing the marginal value of inventory across the time periods. Results from a numerical study indicate that the more sophisticated variant of the heuristic performs well when compared with both an upper bound and an industry benchmark, and even the simpler variant performs fairly well for realistic settings. We then generalize our approaches to the case of multiple stores, where we allow the stores to have different price trajectories. Our numerical results suggest that the performance of both heuristics is still robust in the multiple store setting, and does not suffer from the same performance deterioration observed for the industry benchmark as the number of stores increases or as price differences increase across stores and time periods. For the pre-season procurement problem, we develop a heuristic based on a generalization of the newsvendor problem that accounts for the two-tiered salvage values in our setting, specifically, a low price during end-of-season markdown periods and a very low or zero salvage value after the season has concluded. Results for numerical examples indicate that our modified newsvendor heuristic provides solutions that are as good as those obtained via grid search.
In Chapter 3, we address a retailer's problem of setting prices, including promotional prices, over a multi-period horizon for multiple substitutable products in the same product category. We consider the problem in a setting in which customers anticipate the retailer's pricing strategy and the retailer anticipates the customers' purchasing decisions. We formulate the problem as a two-stage game in which the profit maximizing retailer chooses prices and the utility maximizing customers respond by making explicit decisions regarding purchasing and consumption, and thus also implicit decisions regarding stockpiling. We incorporate a fairly general reference price formation process that allows for cross-product effects of prices on reference prices. We initially focus on a single customer segment. The representative customer's utility function accounts for the value of consumption of the products, psychological benefit (for deal-seekers) from purchasing at a price below his/her reference price but with diminishing marginal returns, costs of purchases, penalties for both shortages and holding inventory, and disutility for deviating from a consumption target in each period (where applicable). We are the first to develop a model that simultaneously accounts for this combination of realistic factors for the customer, and we also separate the customer's purchasing and consumption decisions. We develop a methodology for solving the customer's problem for arbitrary price trajectories based on a linear quadratic control formulation of an approximation of the customer's utility maximization problem. We derive analytical representations for the customer's optimal decisions as simple linear functions of prices, reference prices, inventory levels (as state variables), and the cumulative aggregate consumption level (as a state variable). (Abstract shortened by ProQuest.)
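The modified newsvendor heuristic mentioned for Chapter 2 can be illustrated numerically. The sketch below uses invented demand distributions and prices and a plain simulation with grid search, not the chapter's exact heuristic: leftovers sell first at a markdown price, up to markdown-period demand, and the remainder earns a near-zero salvage value.

```python
# Newsvendor variant with two-tiered salvage values (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
p, c, s1, s2 = 10.0, 4.0, 5.0, 0.5   # full price, unit cost, markdown, scrap
n = 20000
d_season = rng.gamma(20, 5, n)       # full-price demand scenarios (invented)
d_markdown = rng.gamma(4, 5, n)      # markdown-period demand scenarios

def expected_profit(q):
    sold = np.minimum(q, d_season)              # full-price sales
    left = q - sold
    marked_down = np.minimum(left, d_markdown)  # tier 1: markdown sales
    scrapped = left - marked_down               # tier 2: near-zero salvage
    return (p * sold + s1 * marked_down + s2 * scrapped - c * q).mean()

grid = np.arange(50, 200)
q_star = grid[np.argmax([expected_profit(q) for q in grid])]
print("near-optimal order quantity:", q_star)
```

Using common demand scenarios across all candidate order quantities keeps the grid-search comparison free of simulation noise.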
Closson, Taunia Lydia Lynn, and University of Lethbridge Faculty of Arts and Science. "Biological models with a square wave driving force." Thesis, Lethbridge, Alta. : University of Lethbridge, Faculty of Arts and Science, 2002, 2002. http://hdl.handle.net/10133/146.
Full textx, 105 leaves : ill. (some col.) ; 29 cm.
Koutroumpis, Panagiotis. "Research on futures-commodities, macroeconomic volatility and financial development." Thesis, Brunel University, 2016. http://bura.brunel.ac.uk/handle/2438/13989.
Full textMüller, Werner, and Michaela Nettekoven. "A Panel Data Analysis: Research & Development Spillover." Department of Statistics and Mathematics, WU Vienna University of Economics and Business, 1998. http://epub.wu.ac.at/620/1/document.pdf.
Full textSeries: Forschungsberichte / Institut für Statistik
Zarybnisky, Eric J. (Eric Jack) 1979. "Maintenance scheduling for modular systems-models and algorithms." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/68972.
Full textCataloged from PDF version of thesis.
Includes bibliographical references (p. 185-188).
Maintenance scheduling is an integral part of many complex systems. For instance, without effective maintenance scheduling, the combined effects of preventative and corrective maintenance can have severe impacts on the availability of those systems. Based on current Air Force trends including maintenance manpower, dispersed aircraft basing, and increased complexity, there has been a renewed focus on preventative maintenance. To address these concerns, this thesis develops two models for preventative maintenance scheduling for complex systems, the first of interest in the system concept development and design phase, and the second of interest during operations. Both models are highly complex and intractable to solve in their original forms. For the first model, we develop approximation algorithms that yield high-quality and easily implementable solutions. To address the second model, we propose a decomposition strategy that produces submodels that can be solved via existing algorithms or via specialized algorithms we develop. While much of the literature has examined stochastically failing systems, preventative maintenance of usage-limited systems has received less attention. Of particular interest is the design of modular systems whose components must be repaired/replaced to prevent a failure. By making cost tradeoffs early in development, program managers, designers, engineers, and test conductors can better balance the up-front costs associated with system design and testing with the long-term cost of maintenance. To facilitate such a tradeoff, the Modular Maintenance Scheduling Problem provides a framework for design teams to evaluate different design and operations concepts and then evaluate the long-term costs. While the general Modular Maintenance Scheduling Problem does not require maintenance schedules with specific structure, operational considerations push us to consider cyclic schedules in which components are maintained at a fixed frequency. In order to efficiently find cyclic schedules, we propose the Cycle Rounding algorithm, which has an approximation guarantee of 2, and a family of Shifted Power-of-Two algorithms, which have an approximation guarantee of 1/ln(2) ≈ 1.4427. Computational results indicate that both algorithms perform much better than their associated performance guarantees, providing solutions within 15%-25% of a lower bound. Once a modular system has moved into operations, manpower and transportation scheduling become important considerations when developing maintenance schedules. To address the operations phase, we develop the Modular Maintenance and System Assembly Model to balance the tradeoffs between inventory, maintenance capacity, and transportation resources. This model explicitly captures the risk-pooling effects of a central repair facility while also modeling the interaction between repair actions at such a facility. The full model is intractable for all but the smallest instances. Accordingly, we decompose the problem into two parts, the system assembly portion and the module repair portion. Finally, we tie together the Modular Maintenance and System Assembly Model with key concepts from the Modular Maintenance Scheduling Problem to propose an integrated methodology for design and operation.
by Eric Jack Zarybnisky.
Ph.D.
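The 1/ln(2) figure quoted in this abstract comes from the classic power-of-two rounding idea, sketched below with invented maintenance intervals; this is the textbook construction, not the thesis's Shifted Power-of-Two algorithm itself:

```python
# Round ideal maintenance intervals to powers of two times a base period.
import math

base = 1.0  # base period (e.g., one week)
ideal = {"engine": 5.3, "pump": 13.0, "filter": 2.2}  # ideal intervals

def power_of_two_round(t):
    """Pick base * 2**k minimizing the EOQ-style cost ratio t/x + x/t."""
    k = round(math.log2(t / base))
    candidates = [base * 2 ** (k - 1), base * 2 ** k, base * 2 ** (k + 1)]
    return min(candidates, key=lambda x: t / x + x / t)

for name, t in ideal.items():
    cycle = power_of_two_round(t)
    ratio = 0.5 * (t / cycle + cycle / t)  # cost relative to the ideal interval
    print(f"{name}: ideal {t:.1f} -> cycle {cycle:g} (cost ratio {ratio:.3f})")
```

Because all cycle lengths are powers of two times a common base, maintenance events nest neatly, which is what makes such schedules easy to synchronize.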
Chhaochhria, Pallav. "Forecast-driven tactical planning models for manufacturing systems." Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/68700.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student submitted PDF version of thesis.
Includes bibliographical references (p. 243-247).
Our work is motivated by real-world planning challenges faced by a manufacturer of industrial products. In the first part of the thesis, we study a multi-product serial-flow production line that operates in a low-volume, long lead-time environment. The objective is to minimize variable operating costs, in the face of forecast uncertainty, raw material arrival uncertainty and in-process failure. We develop a dynamic-programming-based tactical model to capture the key uncertainties and trade-offs, and to determine the minimum-cost operating tactics. The tactics include smoothing production to reduce production-related costs, and segmenting the serial-flow line with decoupling buffers to protect against variance propagation. For each segment, we specify a work release policy and a production control policy to manage the work-in-process inventory within the segment and to maintain the inventory targets in the downstream buffer. We also optimize the raw material ordering policy with fixed ordering times, long lead-times and staggered deliveries. In the second part of the thesis, we examine a multi-product assembly system that operates in a high-volume, short lead-time environment. The operating tactics used here include determining a fixed-length cyclic schedule to control production, in addition to smoothing production and segmenting the system with decoupling buffers. We develop another dynamic-programming-based tactical model that determines optimal policies for production planning and scheduling, inventory, and raw material ordering; these policies minimize the operating cost for the system in the face of forecast and raw material arrival uncertainty. We tested these models on both hypothetical and actual factory scenarios. The results confirmed our intuition and also helped develop new managerial insights on the application of these operating tactics. Moreover, the tactical model's factory performance predictions were found to be within 10% of simulation results for the testbed systems, thus validating the models.
by Pallav Chhaochhria.
Ph.D.
Zhou, Xinfeng. "Application of robust statistics to asset allocation models." Thesis, Massachusetts Institute of Technology, 2006. http://hdl.handle.net/1721.1/36231.
Includes bibliographical references (p. 105-107).
Many strategies for asset allocation involve the computation of expected returns and the covariance or correlation matrix of financial instruments' returns. How much of each instrument to own is determined by an attempt to minimize risk (the variance of linear combinations of investments in these financial assets) subject to various constraints such as a given level of return, concentration limits, etc. The expected returns and the covariance matrix contain many parameters to estimate, and two main problems arise. First, the data will very likely have outliers that will seriously affect the covariance matrix. Second, with so many parameters to estimate, a large number of observations are required, and the nature of markets may change substantially over such a long period. In this thesis we use robust covariance procedures, such as FAST-MCD, quadrant-correlation-based covariance and 2D-Huber-based covariance, to address the first problem, and regularization (Bayesian) methods that fully utilize the market weights of all assets for the second. High-breakdown affine-equivariant robust methods are effective, but tend to be costly when cross-validation is required to determine regularization parameters.
(cont.) We, therefore, also consider non-affine invariant robust covariance estimation. When back-tested on market data, these methods appear to be effective in improving portfolio performance. In conclusion, robust asset allocation methods have great potential to improve risk-adjusted portfolio returns and therefore deserve further exploration in investment management research.
by Xinfeng Zhou.
S.M.
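Among the robust estimators this abstract lists, the quadrant-correlation-based covariance is simple enough to sketch. Below is the textbook version (signs relative to medians, MAD scales, a Gaussian consistency correction) on invented data, not necessarily the author's exact variant:

```python
# Quadrant-correlation covariance: outliers barely move the estimate.
import numpy as np

def quadrant_cov(X):
    med = np.median(X, axis=0)
    S = np.sign(X - med)                     # quadrant indicators
    q = (S.T @ S) / len(X)                   # pairwise sign agreement
    rho = np.sin(np.pi * q / 2)              # consistency at the Gaussian
    scale = 1.4826 * np.median(np.abs(X - med), axis=0)  # MAD scales
    return rho * np.outer(scale, scale)

rng = np.random.default_rng(2)
clean = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], 500)
X = np.vstack([clean, [[50, -50]] * 10])     # add gross outliers
print("classical:\n", np.cov(X.T).round(2))
print("quadrant-based:\n", quadrant_cov(X).round(2))
```

Running it shows the classical covariance blown up by ten outliers while the quadrant-based estimate stays near the true matrix.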
Keller, Philipp W. (Philipp Wilhelm) 1982. "Tractable multi-product pricing under discrete choice models." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/82871.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 199-204).
We consider a retailer offering an assortment of differentiated substitutable products to price-sensitive customers. Prices are chosen to maximize profit, subject to inventory/capacity constraints, as well as more general constraints. The profit is not even a quasi-concave function of the prices under the basic multinomial logit (MNL) demand model. Linear constraints can induce a non-convex feasible region. Nevertheless, we show how to efficiently solve the pricing problem under three important, more general families of demand models. Generalized attraction (GA) models broaden the range of nonlinear responses to changes in price. We propose a reformulation of the pricing problem over demands (instead of prices) which is convex. We show that the constrained problem under MNL models can be solved in a polynomial number of Newton iterations. In experiments, our reformulation is solved in seconds rather than days by commercial software. For nested-logit (NL) demand models, we show that the profit is concave in the demands (market shares) when all the price-sensitivity parameters are sufficiently close. The closed-form expressions for the Hessian of the profit that we derive can be used with general-purpose nonlinear solvers. For the special (unconstrained) case already considered in the literature, we devise an algorithm that requires no assumptions on the problem parameters. The class of generalized extreme value (GEV) models includes the NL as well as the cross-nested logit (CNL) model. There is generally no closed-form expression for the profit in terms of the demands. We nevertheless show how the gradient and Hessian can be computed for use with general-purpose solvers. We show that the objective of a transformed problem is nearly concave when all the price sensitivities are close. For the unconstrained case, we develop a simple and surprisingly efficient first-order method. Our experiments suggest that it always finds a global optimum, for any model parameters. We apply the method to mixed logit (MMNL) models, by showing that they can be approximated with CNL models. With an appropriate sequence of parameter scalings, we conjecture that the solution found is also globally optimal.
by Philipp Wilhelm Keller.
Ph.D.
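For reference, a minimal numeric illustration (invented parameters) of the multinomial logit demand model underlying this pricing problem: product i's share is exp(a_i - b_i p_i) / (1 + sum_j exp(a_j - b_j p_j)), and the thesis's key move is to optimize over market shares instead of prices.

```python
# MNL shares and profit; brute-force search stands in for the convex
# demand-space reformulation the thesis solves exactly.
import numpy as np
from itertools import product

a = np.array([2.0, 1.5, 1.0])    # product attractiveness (invented)
b = np.array([0.8, 0.6, 0.5])    # price sensitivities
cost = np.array([1.0, 0.8, 0.6])

def shares(p):
    w = np.exp(a - b * p)
    return w / (1 + w.sum())     # the outside option has weight 1

def profit(p):
    return float((p - cost) @ shares(p))

# profit is not quasi-concave in prices, hence the crude grid search here
grid = np.linspace(0.5, 8, 25)
best = max(product(grid, repeat=3), key=lambda p: profit(np.array(p)))
print("grid-search prices:", np.round(best, 2),
      "profit:", round(profit(np.array(best)), 3))
```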
Weber, Theophane. "Correlation decay and decentralized optimization in graphical models." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/58079.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 213-229) and index.
Many models of optimization, statistics, social organizations and machine learning capture local dependencies by means of a network that describes the interconnections and interactions of different components. However, in most cases, optimization or inference on these models is hard due to the dimensionality of the networks. This is so even when using algorithms that take advantage of the underlying graphical structure. Approximate methods are therefore needed. The aim of this thesis is to study such large-scale systems, focusing on the question of how randomness affects the complexity of optimizing in a graph; of particular interest is the study of a phenomenon known as correlation decay, namely, the phenomenon where the influence of a node on another node of the network decreases quickly as the distance between them grows. In the first part of this thesis, we develop a new message-passing algorithm for optimization in graphical models. We formally prove a connection between the correlation decay property and (i) the near-optimality of this algorithm, as well as (ii) the decentralized nature of optimal solutions. In the context of discrete optimization with random costs, we develop a technique for establishing that a system exhibits correlation decay. We illustrate the applicability of the method by giving concrete results for the cases of uniform and Gaussian distributed cost coefficients in networks with bounded connectivity. In the second part, we pursue similar questions in a combinatorial optimization setting: we consider the problem of finding a maximum weight independent set in a bounded degree graph, when the node weights are i.i.d. random variables.
(cont.) Surprisingly, we discover that the problem becomes tractable for certain distributions. Specifically, we construct a PTAS for the case of exponentially distributed weights and arbitrary graphs with degree at most 3, and obtain generalizations for higher degrees and different distributions. At the same time we prove that no PTAS exists for the case of exponentially distributed weights for graphs with sufficiently large but bounded degree, unless P=NP. Next, we shift our focus to graphical games, which are a game-theoretic analog of graphical models. We establish a connection between the problem of finding an approximate Nash equilibrium in a graphical game and the problem of optimization in graphical models. We use this connection to re-derive NashProp, a message-passing algorithm which computes Nash equilibria for graphical games on trees; we also suggest several new search algorithms for graphical games in general networks. Finally, we propose a definition of correlation decay in graphical games, and establish that the property holds in a restricted family of graphical games. The last part of the thesis is devoted to a particular application of graphical models and message-passing algorithms to the problem of early prediction of Alzheimer's disease. To this end, we develop a new measure of synchronicity between different parts of the brain, and apply it to electroencephalogram data. We show that the resulting prediction method outperforms a vast number of other EEG-based measures in the task of predicting the onset of Alzheimer's disease.
by Théophane Weber.
Ph.D.
Dyachenko, Tatiana L. "Bayesian Models for Studying Consumer Behavior." The Ohio State University, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=osu1403017394.
Full textDelgado, San Martin Juan A. "Mathematical models for preclinical heterogeneous cancers." Thesis, University of Aberdeen, 2016. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=230139.
Full textMcEwan, J. A. "Methodology and new applications in food acceptance research." Thesis, University of Reading, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.380836.
Full textHan, Sung Ho. "Integrated empirical models based on a sequential research strategy." Diss., This resource online, 1991. http://scholar.lib.vt.edu/theses/available/etd-07282008-134026/.
Full textZaretsky, M. (Marina). "Essays on variational inequalities and competitive supply chain models." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/28859.
Full textIncludes bibliographical references (p. 103-107).
In the first part of the thesis we combine ideas from cutting plane and interior point methods to solve variational inequality problems efficiently. In particular, we introduce "smarter" cuts into two general methods for solving these problems. These cuts utilize second order information on the problem through the use of a gap function. We establish convergence results for both methods, as well as complexity results for one of the methods. Finally, we compare the performance of an approach that combines affine scaling and cutting plane methods with other methods for solving variational inequalities. The second part of the thesis considers a supply chain setting where several capacitated suppliers compete for orders from a single retailer in a multi-period environment. At each period the retailer places orders to the suppliers in response to the prices and capacities they announce. Our model allows the retailer to carry inventory. Furthermore, suppliers can expand their capacity at an additional cost; the retailer faces exogenous, price-dependent, stochastic demand. We analyze discrete as well as continuous time versions of the model: (i) we illustrate the existence of equilibrium policies; (ii) we characterize the structure of these policies; (iii) we consider coordination mechanisms; and (iv) we present some computational results. We also consider a modified model that uses option contracts and finally present some extensions.
by Marina Zaretsky.
Ph.D.
Anderson, Ross Michael. "Stochastic models and data driven simulations for healthcare operations." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/92055.
Full textThis electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 251-257).
This thesis considers problems in two areas of healthcare operations: Kidney Paired Donation (KPD) and scheduling medical residents in hospitals. In both areas, we explore the implications of policy change through high-fidelity simulations. We then build stochastic models to provide strategic insight into how policy decisions affect the operations of these healthcare systems. KPD programs enable patients with living but incompatible donors (referred to as patient-donor pairs) to exchange kidneys with other such pairs in a centrally organized clearing house. Exchanges involving two or more pairs are performed by arranging the pairs in a cycle, where the donor from each pair gives to the patient from the next pair. Alternatively, a so-called altruistic donor can be used to initiate a chain of transplants through many pairs, ending on a patient without a willing donor. In recent years, the use of chains has become pervasive in KPD, with chains now accounting for the majority of KPD transplants performed in the United States. A major focus of our work is to understand why long chains have become the dominant method of exchange in KPD, and how to best integrate their use into exchange programs. In particular, we are interested in policies that KPD programs use to determine which exchanges to perform, which we refer to as matching policies. First, we devise a new algorithm using integer programming to maximize the number of transplants performed on a fixed pool of patients, demonstrating that matching policies which must solve this problem are implementable. Second, we evaluate the long-run implications of various matching policies, both through high-fidelity simulations and analytic models. Most importantly, we find that: (1) using long chains results in more transplants and reduced waiting time, and (2) the policy of maximizing the number of transplants performed each day is as good as any batching policy. Our theoretical results are based on introducing a novel model of a dynamically evolving random graph. The analysis of this model uses classical techniques from Erdos-Renyi random graph theory as well as tools from queueing theory including Lyapunov functions and Little's Law. In the second half of this thesis, we consider the problem of how hospitals should design schedules for their medical residents. These schedules must have capacity to treat all incoming patients, provide quality care, and comply with regulations restricting shift lengths. In 2011, the Accreditation Council for Graduate Medical Education (ACGME) instituted a new set of regulations on duty hours that restrict shift lengths for medical residents. We consider two operational questions for hospitals in light of these new regulations: will there be sufficient staff to admit all incoming patients, and how will the continuity of patient care be affected, particularly on the first day of a patient's hospital stay, when such continuity is critical? To address these questions, we built a discrete event simulation tool using historical data from a major academic hospital, and compared several policies relying on both long and short shifts. The simulation tool was used to inform staffing level decisions at the hospital, which was transitioning away from long shifts. Use of the tool led to the following strategic insights. We found that schedules based on shorter, more frequent shifts actually led to a larger admitting capacity.
At the same time, such schedules generally reduce the continuity of care by most metrics when the departments operate at normal loads. However, in departments which operate in the critical capacity regime, we found that the continuity of care even improved on some metrics for schedules based on shorter shifts, due to a reduction in the use of overtime doctors. We develop an analytically tractable queueing model to capture these insights. The analysis of this model requires analyzing the steady-state behavior of the fluid limit of a queueing system, and proving a so-called "interchange of limits" result.
by Ross Michael Anderson.
Ph. D.
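The cycle-selection problem at the heart of KPD is compact enough to show directly. Below is a minimal sketch on a tiny invented compatibility digraph, using brute-force enumeration in place of the thesis's integer-programming formulation: choose vertex-disjoint short cycles so as to maximize the number of transplants.

```python
# Kidney-exchange matching on a toy compatibility digraph (illustrative).
from itertools import combinations, permutations

pairs = range(5)
compatible = {(0, 1), (1, 0), (1, 2), (2, 3), (3, 1), (3, 4), (4, 2)}

def cycles(max_len=3):
    """Enumerate vertex sets that admit a directed cycle of length <= max_len."""
    found = []
    for k in range(2, max_len + 1):
        for combo in combinations(pairs, k):
            for perm in permutations(combo):
                if perm[0] == min(perm) and all(
                        (perm[i], perm[(i + 1) % k]) in compatible
                        for i in range(k)):
                    found.append(frozenset(perm))
                    break
    return found

best, cyc = 0, cycles()
for r in range(len(cyc) + 1):            # pick the best disjoint combination
    for chosen in combinations(cyc, r):
        if all(a.isdisjoint(b) for a, b in combinations(chosen, 2)):
            best = max(best, sum(len(c) for c in chosen))
print("max transplants:", best)
```

Real pools are far too large for enumeration, which is why the thesis turns to integer programming for this step.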
Harsha, Pavithra. "Mitigating airport congestion : market mechanisms and airline response models." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/46387.
Full textThis electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Includes bibliographical references (leaves 157-165).
Efficient allocation of scarce resources in networks is an important problem worldwide. In this thesis, we focus on resource allocation problems in a network of congested airports. The increasing demand for access to the world's major commercial airports, combined with the limited operational capacity at many of these airports, has led to growing air traffic congestion resulting in several billion dollars of delay cost every year. In this thesis, we study two demand-management techniques -- strategic and operational approaches -- to mitigate airport congestion. As a strategic initiative, auctions have been proposed to allocate runway slot capacity. We focus on two elements in the design of such slot auctions -- airline valuations and activity rules. An aspect of airport slot market environments, which we argue must be considered in auction design, is the fact that the participating airlines are budget-constrained. The problem of finding the best bundle of slots on which to bid in an iterative combinatorial auction, also called the preference elicitation problem, is a particularly hard problem, even more so in the case of airlines in a slot auction. We propose a valuation model, called the Aggregated Integrated Airline Scheduling and Fleet Assignment Model, to help airlines understand the true value of the different bundles of slots in the auction. This model is efficient and was found to be robust to data uncertainty in our experimental simulations.
(cont.) Activity rules are checks made by the auctioneer at the end of every round to suppress strategic behavior by bidders and to promote consistent, continual preference elicitation. These rules find applications in several real-world scenarios including slot auctions. We show that the commonly used activity rules are not applicable for slot auctions as they prevent straightforward behavior by budget-constrained bidders. We propose the notion of a strong activity rule which characterizes straightforward bidding strategies. We then show how a strong activity rule in the context of budget-constrained bidders (and quasilinear bidders) can be expressed as a linear feasibility problem. This work on activity rules also applies to more general iterative combinatorial auctions. We also study operational (real-time) demand-management initiatives that are used when there are sudden drops in capacity at airports due to various uncertainties, such as bad weather. We propose a system design that integrates the capacity allocation, airline recovery and inter-airline slot exchange procedures, and suggest metrics to evaluate the different approaches to fair allocations.
by Pavithra Harsha.
Ph.D.
Menjoge, Rajiv (Rajiv Shailendra). "New procedures for visualizing data and diagnosing regression models." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/61190.
Full textCataloged from PDF version of thesis.
Includes bibliographical references (p. 97-103).
This thesis presents new methods for exploring data using visualization techniques. The first part of the thesis develops a procedure for visualizing the sampling variability of a plot. The motivation behind this development is that reporting a single plot of a sample of data without a description of its sampling variability can be uninformative and misleading in the same way that reporting a sample mean without a confidence interval can be. Next, the thesis develops a method for simplifying large scatter plot matrices, using similar techniques as the above procedure. The second part of the thesis introduces a new diagnostic method for regression called backward selection search. Backward selection search identifies a relevant feature set and a set of influential observations with good accuracy, given the difficulty of the problem, and additionally provides a description, in the form of a set of plots, of how the regression inferences would be affected with other model choices, which are close to optimal. This description is useful, because an observation, that one analyst identifies as an outlier, could be identified as the most important observation in the data set by another analyst. The key idea behind backward selection search has implications for methodology improvements beyond the realm of visualization. This is described following the presentation of backward selection search. Real and simulated examples, provided throughout the thesis, demonstrate that the methods developed in the first part of the thesis will improve the effectiveness and validity of data visualization, while the methods developed in the second half of the thesis will improve analysts' abilities to select robust models.
by Rajiv Menjoge.
Ph.D.
Arkhipov, Dmitri I. "Computational Models for Scheduling in Online Advertising." Thesis, University of California, Irvine, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10168557.
Programmatic advertising is an actively developing industry and research area. Some of the research in this area concerns the development of optimal or approximately optimal contracts and policies between publishers, advertisers and intermediaries such as ad networks and ad exchanges. Both the development of contracts and the construction of policies governing their implementation are difficult challenges, and different models take different features of the problem into account. In programmatic advertising, decisions are made in real time, and time is a scarce resource, particularly for publishers who are concerned with content load times. Policies for advertisement placement must execute very quickly once content is requested; this requires policies either to be pre-computed and accessed as needed, or to be very efficient to execute. We formulate a stochastic optimization problem for per-publisher ad sequencing with binding latency constraints. Within our context, an ad request lifecycle is modeled as a sequence of one-by-one solicitation (OBOS) subprocesses/lifecycle stages. From the viewpoint of a supply side platform (SSP) (an entity acting in proxy for a collection of publishers), the duration/span of a given lifecycle stage/subprocess is a stochastic variable. This stochasticity is due both to the stochasticity inherent in Internet delay times and to the lack of information regarding the decision processes of independent entities. In our work we model the problem facing the SSP, namely the problem of optimally or near-optimally choosing the next lifecycle stage of a given ad request lifecycle at any given time. We solve this problem to optimality (subject to the granularity of time) using a classic application of Richard Bellman's dynamic programming approach to the 0/1 Knapsack Problem. The DP approach does not scale to a large number of lifecycle stages/subprocesses, so a sub-optimal approach is needed. We use our DP formulation to derive a focused real-time dynamic programming (FRTDP) implementation, a heuristic method with optimality guarantees for solving our problem. We empirically evaluate (through simulation) the performance of our FRTDP implementation relative both to the DP implementation (for tractable instances) and to several alternative heuristics for intractable instances. Finally, we make the case that our work is usefully applicable to problems outside the domain of online advertising.
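The abstract's own model is not reproduced here, but the Bellman dynamic program for the 0/1 knapsack problem that it builds on is short enough to show (values and weights invented):

```python
# Classic 0/1 knapsack DP: dp[c] = best value achievable with capacity c.
def knapsack(values, weights, capacity):
    dp = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):  # descending: each item used once
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

print(knapsack(values=[6, 10, 12], weights=[1, 2, 3], capacity=5))  # -> 22
```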
Bhuiyan, Farina. "Dynamic models of concurrent engineering processes and performance." Thesis, McGill University, 2001. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=38153.
Full textMonaghan, Paul Francis. "Model misspecification in survival analysis : applications to cancer research." Thesis, University of Liverpool, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.366244.
Full textMišić, Velibor V. "Data, models and decisions for large-scale stochastic optimization problems." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/105003.
Full textThis electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student-submitted PDF version of thesis.
Includes bibliographical references (pages 204-209).
Modern business decisions exceed human decision-making ability: often, they are of a large scale, their outcomes are uncertain, and they are made in multiple stages. At the same time, firms have increasing access to data and models. Faced with such complex decisions and increasing access to data and models, how do we transform data and models into effective decisions? In this thesis, we address this question in the context of four important problems: the dynamic control of large-scale stochastic systems, the design of product lines under uncertainty, the selection of an assortment from historical transaction data and the design of a personalized assortment policy from data. In the first chapter, we propose a new solution method for a general class of Markov decision processes (MDPs) called decomposable MDPs. We propose a novel linear optimization formulation that exploits the decomposable nature of the problem data to obtain a heuristic for the true problem. We show that the formulation is theoretically stronger than alternative proposals and provide numerical evidence for its strength in multi-armed bandit problems. In the second chapter, we consider how to make strategic product line decisions under uncertainty in the underlying choice model. We propose a method based on robust optimization for addressing both parameter uncertainty and structural uncertainty. We show, using a real conjoint data set, the benefits of our approach over the traditional approach that assumes both the model structure and the model parameters are known precisely. In the third chapter, we propose a new two-step method for transforming limited customer transaction data into effective assortment decisions. The approach involves estimating a ranking-based choice model by solving a large-scale linear optimization problem, and solving a mixed-integer optimization problem to obtain a decision. Using synthetic data, we show that the approach is scalable, leads to accurate predictions and effective decisions that outperform alternative parametric and non-parametric approaches. In the last chapter, we consider how to leverage auxiliary customer data to make personalized assortment decisions. We develop a simple method based on recursive partitioning that segments customers using their attributes and show that it improves on a "uniform" approach that ignores auxiliary customer information.
by Velibor V. Mišić.
Ph. D.
Jernigan, Nicholas R. (Nicholas Richard). "Multi-modal, multi-period, multi-commodity transportation : models and algorithms." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/91399.
"June 2014." Cataloged from PDF version of thesis.
Includes bibliographical references (pages 51-54).
In this paper we present a mixed integer optimization framework for modeling the shipment of goods between origin-destination (O-D) pairs by vehicles of different types over a time-space network. The output of the model is an optimal schedule and routing of vehicle movements and an assignment of goods to vehicles. Specifically, this framework allows for: multiple vehicles of differing characteristics (including speed, cost of travel, and capacity); transshipment locations where goods can be transferred between vehicles; and availability times for goods at their origins and delivery time windows for goods at their destinations. The model is composed of three stages: In the first, vehicle quantities, by type, and goods are allocated to routes in order to minimize late deliveries and vehicle movement costs. In the second stage, individual vehicles, specified by vehicle identification numbers, are assigned routes, and goods are assigned to those vehicles based on the results of the first stage and a minimization of costs involved with the transfer of goods between vehicles. In the third stage we reallocate the idle time of vehicles in order to satisfy crew rest constraints. Computational results show that provably optimal or near-optimal solutions are possible for realistic instance sizes.
by Nicholas R. Jernigan.
S.M.
Doan, Xuan Vinh. "Optimization under moment, robust, and data-driven models of uncertainty." Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/57538.
Full textThis electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student submitted PDF version of thesis.
Includes bibliographical references (p. 151-156).
We study the problem of moments and present two diverse applications that apply both the hierarchy of moment relaxation and the moment duality theory. We then propose a moment-based uncertainty model for stochastic optimization problems, which addresses the ambiguity of probability distributions of random parameters with a minimax decision rule. We establish the model tractability and are able to construct explicitly the extremal distributions. The quality of minimax solutions is compared with that of solutions obtained from other approaches such as data-driven and robust optimization approach. Our approach shows that minimax solutions hedge against worst-case distributions and usually provide low cost variability. We also extend the moment-based framework for multi-stage stochastic optimization problems, which yields a tractable model for exogenous random parameters and affine decision rules. Finally, we investigate the application of data-driven approach with risk aversion and robust optimization approach to solve staffing and routing problem for large-scale call centers. Computational results with real data of a call center show that a simple robust optimization approach can be more efficient than the data-driven approach with risk aversion.
by Xuan Vinh Doan.
Ph.D.
Sun, Peng 1974. "Constructing learning models from data : the dynamic catalog mailing problem." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/16927.
Full textIncludes bibliographical references (p. 105-107).
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
The catalog industry is a large and important industry in the US economy. One of the most important and challenging business decisions in the industry is to decide who should receive catalogs, due to the significant mailing cost and the low response rate. The problem is a dynamic one - when a customer is ready to purchase, s/he may order from a previous catalog if s/he does not have the most recent one. In this sense, customers' purchasing behavior depends not only on the firm's most recent mailing decision, but also on prior mailing decisions. From the firm's perspective, in order to maximize its long-term profit it should make a series of optimal mailing decisions to each customer over time. Contrary to the traditional myopic catalog mailing decision process that is generally implemented in the catalog industry, we propose a model that allows firms to design optimal dynamic mailing policies using their own business data. We constructed the model from a large data set provided by a catalog mailing company. The computational results from the historical data show great potential profit improvement. This application differs from many other applications of (approximate) dynamic programming in that an underlying Markov model is not a priori available, nor can it be derived in a principled manner. Instead, it has to be estimated or "learned" from available data. The thesis furthers the discussion on issues related to constructing learning models from data. More specifically, we discuss the so called "endogeneity problem" and the effects of inaccuracy in model parameter estimation. The fact that the model parameter estimation depends on data collected according to a specific policy introduces an endogeneity problem. As a result, the derived optimal policy depends on the original policy used to collect the data.
(cont.) In the thesis we discuss a specific endogeneity problem, "attribution error." We also investigate whether online learning can solve this problem. More specifically, we discuss the existence of fixed point policies for potential on-line learning algorithms. Imprecision in model parameter estimation also creates the potential for bias. We illustrate this problem and offer a method for detecting it. Finally, we report preliminary results from a large scale field test that tests the effectiveness of the proposed approach in a real business decision setting.
by Peng Sun.
Ph.D.
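As a toy illustration of why the dynamic view beats myopic mailing, here is a minimal sketch with an invented two-state customer model solved by value iteration; the thesis, by contrast, estimates its model from a large real data set:

```python
# Dynamic catalog mailing as a tiny MDP solved by value iteration.
import numpy as np

# states: 0 = recently active customer, 1 = lapsed customer
p_buy = np.array([[0.12, 0.20],   # P(purchase | state, mail decision 0/1)
                  [0.02, 0.06]])  # (all parameters invented)
profit_per_order, mail_cost, gamma = 40.0, 0.7, 0.95

V = np.zeros(2)
for _ in range(1000):
    Q = np.empty((2, 2))
    for s in range(2):
        for mail in range(2):
            buy = p_buy[s, mail]
            reward = buy * profit_per_order - mail * mail_cost
            # crude transition model: purchasers become active, others lapse
            Q[s, mail] = reward + gamma * (buy * V[0] + (1 - buy) * V[1])
    V = Q.max(axis=1)
print("mail decision by state:", Q.argmax(axis=1))
```

A myopic rule would compare only the immediate reward term; the value-iteration policy also credits mailing for moving customers toward the more profitable active state.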
蘇美子 and Mee-chi Meko So. "An operations research model and algorithm for a production planning application." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2002. http://hub.hku.hk/bib/B31226681.
Full textMcGill, Trevor, and University of Lethbridge Faculty of Arts and Science. "Functionally non-adaptive retinal plasticity in rat models of human retinal degenerative disease." Thesis, Lethbridge, Alta. : University of Lethbridge, Faculty of Arts and Science, 2008, 2008. http://hdl.handle.net/10133/726.
Full textxvii, 205 leaves : ill. ; 29 cm. --