To see the other types of publications on this topic, follow the link: Information management – Econometric models.

Dissertations / Theses on the topic 'Information management – Econometric models'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Information management – Econometric models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Lee, Daesik. "Essays on coalition formation under asymmetric information." Diss., Virginia Polytechnic Institute and State University, 1988. http://hdl.handle.net/10919/53567.

Full text
Abstract:
We consider the applicability of the Revelation Principle under the possibility of collusive behavior among players in a Bayesian framework. Since coalition formation itself suffers from information asymmetry problems, we assume that a coalition is formed if the colluding parties can successfully find some coalitional mechanism whose outcome is a set of messages in the original mechanism. Recently, Cremer (1986) proposed a coalitional mechanism in the framework of the well-known Vickrey-Clarke-Groves mechanism. We assume that the agents successfully collude if they can find a coalitional mechanism such that (i) the coalitional mechanism is incentive-compatible and (ii) the payoff of this mechanism is strictly Pareto-improving in terms of the agents' expected utility. Our analysis is undertaken in a one-principal/two-agent framework. We first find that the Revelation Principle is still applicable in the pure adverse selection model. We then extend this result to a model with both adverse selection and moral hazard aspects. Finally, we consider a three-tier principal/supervisor/agent hierarchical organization, as in Tirole (1986). We explicitly present the coalitional mechanism as a side-contract between the supervisor and the agent. We apply the previous result on the applicability of the Revelation Principle and characterize the coalition-proof mechanism. We find that the principal can design an optimal collusion-free contract at some additional cost by specifying proper individual and coalitional incentive-compatibility conditions and individual rationality conditions. Moreover, we find that the results of Tirole's (1986) paper hinge on the fact that he considers only "hard," verifiable, information.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
2

Wei, Xiangjing. "House Prices and Mortgage Defaults: Econometric Models and Risk Management Applications." Digital Archive @ GSU, 2010. http://digitalarchive.gsu.edu/rmi_diss/24.

Full text
Abstract:
This dissertation first investigates the possible house price trend and its relationship with the mortgage market from the perspective of risk management; it then takes the perspective of bond insurers and identifies possible methods to avoid capital procyclicality. In Chapter I, we apply vector autoregression (VAR) and simultaneous equations models (SEM) to estimate the dynamic relations among house price returns, mortgage rates and mortgage default rates, using historical data from 1979 through the second quarter of 2008. We find that house prices are better estimated and predicted when the mortgage market is taken into consideration. In Chapter II, following the methodology of cointegration, we first construct several succinct measures to display the possible intrinsic values of house prices. In the short run, house price return dynamics are investigated through dynamic adjustments following Capozza et al. (2002) and error correction models. We examine the possible overshooting problem of house price returns. By analytical derivations and simulations, we demonstrate the effects of the coefficients on overshooting. In Chapter III, we adopt a structural model with time-varying correlations for bond insurers. We consider losses due to bond insurers' downgrading as well as losses from both insurance contracts and the investment portfolio. On that basis, we propose forward-looking rules for smoothing capital over a full business cycle, rather than over a short-term horizon, to avoid procyclicality. With the smoothed capital, a bond insurer can establish a capital buffer in good times to support potential losses in a crisis.
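As a concrete illustration of the Chapter I setup, here is a minimal sketch of fitting a VAR to the three series with statsmodels; the file name, column names and lag search are assumptions for illustration, not the dissertation's actual data or specification.

```python
import pandas as pd
from statsmodels.tsa.api import VAR

# Hypothetical quarterly file covering 1979Q1-2008Q2; column names invented.
data = pd.read_csv("housing_mortgage.csv",
                   parse_dates=["quarter"], index_col="quarter")
data = data[["hp_return", "mortgage_rate", "default_rate"]].dropna()

model = VAR(data)
p = max(model.select_order(maxlags=8).aic, 1)  # lag order chosen by AIC
results = model.fit(p)
print(results.summary())

# Joint forecast: house price returns conditioned on mortgage-market dynamics.
forecast = results.forecast(data.values[-p:], steps=4)
print(forecast)
```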
APA, Harvard, Vancouver, ISO, and other styles
3

Quigley, Daniel Hugh. "Essays in the economics of information disclosure." Thesis, University of Cambridge, 2014. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.648766.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Sundali, James Arnold. "An experimental investigation of market entry problems." Diss., The University of Arizona, 1995. http://hdl.handle.net/10150/187079.

Full text
Abstract:
This dissertation considers organizational problems of market entry. The research follows the experimental path: game-theoretic models are combined with laboratory experiments to produce a set of empirical findings. Two market entry problems are studied. The first considers the chain store paradox developed by Selten (1978). This game considers an established chain store with locations in numerous towns. In each of these towns a different competitor decides whether to enter and compete with the chain store. When entry occurs, the chain store can respond cooperatively or aggressively. The game proceeds sequentially, the players are not symmetric, and the critical solution concept is the subgame perfect equilibrium. Three experiments are conducted for a total of 550 trials of the game. Experiments differ in the size of payoffs, the number of entrants, the anonymity of the chain store, and whether subjects play both the role of the chain store and that of an entrant or just one role. There is qualified support for the game-theoretic prediction that a chain store cannot deter the sequential entry of competitors. Entry occurred on 459 of 550 trials; while some chain stores pursue deterrence, it is largely ineffective in these specific experimental environments. It is suggested that deterrence might be effective if the number of entrants or the payoffs were increased. The results have implications for discussions of predatory pricing, reputation, and the value of backwards induction as a solution concept. The second market entry problem is based on a simultaneous market entry game developed by Rapoport (1994). In this game, symmetric players decide simultaneously whether to enter a market with a specified capacity. The game-theoretic prediction for the number of entrants is based on a Nash equilibrium (in pure or mixed strategies). Again, experimental results support game-theoretic predictions: across three experiments the correlation between the number of entrants and the size of the market capacity is consistently above 0.90. Taken together, these experiments on market entry problems provide strong support for the conceptual use of game theory and the methodological use of controlled laboratory experiments in the field of strategic management.
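For illustration, a minimal sketch of how the symmetric mixed-strategy Nash benchmark in a Rapoport-style entry game can be computed numerically. The payoff specification (stay out: v; enter: v + r(c - m), with m the total number of entrants) and all parameter values are assumptions for this sketch, not the experiments' actual parameters.

```python
from math import comb

def entry_payoff_gap(p, n=20, c=8, r=2.0):
    """Expected payoff of entering minus staying out, when each of the
    other n-1 players enters independently with probability p."""
    gap = 0.0
    for k in range(n):  # k = number of *other* entrants
        prob = comb(n - 1, k) * p**k * (1 - p)**(n - 1 - k)
        gap += prob * r * (c - (k + 1))  # m = k + 1 entrants including self
    return gap

def mixed_equilibrium(n=20, c=8, tol=1e-9):
    lo, hi = 0.0, 1.0  # the gap is decreasing in p, so bisection works
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if entry_payoff_gap(mid, n, c) > 0:
            lo = mid
        else:
            hi = mid
    return lo

p_star = mixed_equilibrium()
print(f"equilibrium entry probability ~ {p_star:.3f}")
# Expected entrants n * p_star tracks the capacity c, which mirrors the
# high observed correlation between entrants and market capacity.
```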
APA, Harvard, Vancouver, ISO, and other styles
5

Eadie, Edward Norman. "Small resource stock share price behaviour and prediction." Title page, contents and abstract only, 2002. http://web4.library.adelaide.edu.au/theses/09CM/09cme11.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Borah, Bijan Jyoti. "Econometric models of provider choice and health care use in India." [Bloomington, Ind.] : Indiana University, 2006. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3240038.

Full text
Abstract:
Thesis (Ph.D.)--Indiana University, Dept. of Economics, 2006.
"Title from dissertation home page (viewed July 16, 2007)." Source: Dissertation Abstracts International, Volume: 67-10, Section: A, page: 3907. Adviser: Pravin Trivedi.
APA, Harvard, Vancouver, ISO, and other styles
7

Huang, Tao. "Forecasting retailer product sales at the UPC level using econometric models with promotional information." Thesis, Lancaster University, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.619278.

Full text
Abstract:
Retailers need accurate forecasts for inventory planning. Poor forecasts lead to either out-of-stock or over-stock conditions. If products are regularly out of stock, customers may become frustrated and eventually switch their loyalty to other supermarkets. Retailers also do not want to overstock items because doing so is costly. To promote the various products in their stores, retailers employ marketing mix activities such as price reductions and promotions. There is an established marketing literature that focuses on identifying and estimating the effects of marketing mix activities. However, little research effort has been devoted to incorporating information on marketing mix activities into forecasts of retailer sales at the Universal Product Code (UPC) level. The forecasting models in previous studies only take into account the effects of the marketing mix activities for the focal product. This thesis proposes econometric forecasting methods that also take into account the effects of competitive marketing mix activities. The selection of competitive marketing mix variables becomes important because it is not obvious which UPCs compete against each other. The relationship between marketing mix activities and product sales can change permanently: consumer taste for a particular product may shift, or a new close substitute may become available. However, traditional econometric models with fixed parameters assume that the relationship is time-invariant. As a result, the model may be subject to structural breaks and thus produce biased forecasts. This thesis implements various recently developed techniques to adjust the fixed-parameter model for the forecast bias caused by structural breaks, which may potentially improve forecasting accuracy. The empirical analysis suggests that the inclusion of competitive marketing mix variables offers worthwhile benefits and that the adjusted econometric models which take structural changes into account can produce more accurate forecasts than traditional econometric models.
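To make the modelling idea concrete, a minimal sketch (with invented column names) of a sales-response regression that includes both the focal UPC's marketing mix and competitive mix variables, plus a crude rolling re-estimation as one simple guard against structural breaks; this is an illustration, not the thesis's actual models.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("upc_weekly.csv")   # hypothetical weekly store-level data

y = np.log(df["units_focal"])        # log unit sales of the focal UPC
X = sm.add_constant(df[["log_price_focal", "display_focal", "feature_focal",
                        "log_price_comp", "display_comp", "feature_comp"]])

fit = sm.OLS(y, X).fit()
print(fit.summary())

# One crude guard against structural breaks: re-estimate on a moving
# window so the promotional-response parameters can drift over time.
window = 104                         # two years of weekly data (assumed)
fit_recent = sm.OLS(y.iloc[-window:], X.iloc[-window:]).fit()
print(fit_recent.params)
```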
APA, Harvard, Vancouver, ISO, and other styles
8

Nyqvist, Olof. "Information Management for Cutting Tools : Information Models and Ontologies." Doctoral thesis, Stockholm : Industriell produktion, Production Engineering, Kungliga Tekniska högskolan, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4763.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Dronkert, Max, and Terry Damink. "Improving sustainability performance with management information models." Thesis, Karlstads universitet, Handelshögskolan, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-55346.

Full text
Abstract:
Purpose: The purpose of this paper is to discover how management information models can provide organizations that have the will to perform sustainably with a tool that gives them knowledge and practical guidance to reach sustainability and to avoid the practice of greenwashing. The research question is therefore: How can management information models serve as a tool to improve the sustainability performance and reduce the practice of greenwashing of an organization? Methods: The authors approached this research from a balanced and pragmatic view. The primary data were collected with a qualitative approach in the form of semi-structured interviews. The interviews were conducted with respondents from Swedish organizations in different sectors in order to increase the reliability of the study. The respondents are responsible for sustainability and management information models in their organizations. Findings: The results show the need to enhance management information models by including the sustainability elements economic, social and environmental, also called the Triple Bottom Line (Elkington 1999). An evolution of the Balanced Scorecard (Kaplan & Norton 1996) is needed to reach a management information model that improves sustainability performance. In addition, this study shows the importance of including a knowledge section in such models. It is also of high importance to place the measured outcomes of sustainability in context in order to provide insight into the true impact of the organization's sustainability performance. Implications: This research has three theoretical and three practical implications. The theoretical implications concern the three core elements of a sustainability model, the simplification of complex knowledge (especially knowledge performance), and prescribing only the essential elements of a model. The practical implications are to divide the implementation into phases, to ensure responsibility for sustainability at a high management level, and to integrate sustainability into the culture. Keywords: sustainability; greenwashing; management information models; context sustainability; knowledge.
APA, Harvard, Vancouver, ISO, and other styles
10

Bengtsson, Jonas, and Mikael Grönkvist. "Performing Geographic Information System Analyses on Building Information Management Models." Thesis, KTH, Geodesi och satellitpositionering, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-208922.

Full text
Abstract:
As the usage of both BIM (Building Information Modelling) and 3D-GIS (Three-Dimensional Geographic Information Systems) has increased within the field of urban development and construction, so has the interest in connecting these two tools. One possibility of integration is the potential of visualising BIM models together with other spatial data in 3D. Another is to be able to perform spatial 3D analyses on the models. Both of these can be achieved through the use of GIS software. This study explores what an integration of BIM and GIS could look like. The goal was to perform typical GIS analyses in 3D on BIM models. Previous research points towards some success within the field through use of the indicated standard format for each tool – IFC (Industry Foundation Classes) for BIM and CityGML (City Geography Markup Language) for GIS. Transformation between the formats took place through use of the BIM software Revit, the transformation tool FME and the GIS software ArcGIS. A couple of reviewed applications of GIS analyses were chosen for testing on the converted models – indoor network analysis, visibility analysis and spatial analysis for 3D buildings. The input data in the study comprised several BIM models, both models created for real-life usage and others that only function as sample data within the different software. From the results of the practical work it can be concluded that a simple, automated and full-scale integration does not seem to be within reach quite yet. Most transformations between IFC and CityGML failed to some extent, especially the more detailed and complex ones. In some test cases, the file could not be imported into ArcGIS; in others, geometries were missing or present even though they should not have been. There were also examples where geometries had been moved during the process. As a consequence of these problems, most analyses failed or did not give meaningful results. A few of the original analyses did give positive results. Combining (flawed) CityGML models with other spatial data for visualisation purposes worked rather well. Both the shadow volume and sightline analyses also gave reasonable results, which indicates that there might be a future for those applications. The obstacles to a full-scale integration identified during the work were divided into four categories. The first is BIM usage and routines, where created models need to be of high quality if the final results are to be correct. The second comprises problems concerning the level of detail, especially the lack of common definitions for the amount of detail and information. The third category concerns the connection between local and global coordinate systems, where a solution in the form of updates to IFC might already be in place. The fourth, and largest, category contains the obstacles surrounding the different formats and software used. Here, focus should lie on the transformation between IFC and CityGML. There is plenty of possible future work concerning these different problems. There is also potential in developing custom tools for integration or in performing analyses other than those chosen for this thesis.
APA, Harvard, Vancouver, ISO, and other styles
11

Hu, Jiangxia S. M. Massachusetts Institute of Technology. "Business models of information aggregators." Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/43171.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, System Design and Management Program, 2008.
Includes bibliographical references.
This thesis identifies the specific characteristics of information aggregators and proposes nine business models appropriate for them: advertising, brokerage, subscription, licensing, infomediary (information intermediary), referral/click-through, customized/personalized service, professional service/consulting, and application service provider. The thesis then looks into various companies that base their businesses on information aggregation and analyzes the development of their business models in the context of competition. The financial and social performances of these companies are studied and the underlying reasons explored. In the end, the thesis summarizes findings from the case studies, lists the widely used business models and the rarely used ones, and explores reasons for this pattern. The conclusion of this research is that information aggregation is a starting point for a company to develop differentiated products or services. Companies can develop into independent information aggregators; they can use information aggregation as a platform; they can partner with aggregatees or customers to provide customized information. Eventually, many will be integrated into end-to-end solutions, or will penetrate traditional businesses by leveraging information aggregation. The research can be used by companies that develop information aggregation products or services. It can also be used to evaluate the viability of information aggregation initiatives.
by Jiangxia Hu.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
12

Limkriangkrai, Manapon. "An empirical investigation of asset-pricing models in Australia." University of Western Australia. Faculty of Business, 2007. http://theses.library.uwa.edu.au/adt-WU2007.0197.

Full text
Abstract:
[Truncated abstract] This thesis examines competing asset-pricing models in Australia with the goal of establishing the model which best explains cross-sectional stock returns. The research employs Australian equity data over the period 1980-2001, with the major analyses covering the more recent period 1990-2001. The study first documents that the existing asset-pricing models, namely the capital asset pricing model (CAPM) and the domestic Fama-French three-factor model, fail to meet the widely applied zero-intercept criterion of Merton for a well-specified pricing model. This study instead documents that the US three-factor model provides the best description of Australian stock returns. The three US Fama-French factors are statistically significant for the majority of portfolios consisting of large stocks. However, no significant coefficients are found for portfolios in the smallest size quintile. This result initially suggests that the largest firms in the Australian market are globally integrated with the US market while the smallest firms are not. Therefore, the evidence at this point implies domestic segmentation in the Australian market. This is an unsatisfying outcome, considering that the goal of this research is to establish the pricing model that best describes portfolio returns. Given pervasive evidence that liquidity is strongly related to stock returns, the second part of the major analyses derives and incorporates this potentially priced factor into the specified pricing models ... This study also introduces a methodology for individual security analysis, which implements the portfolio analysis, in this part of the analyses. The technique makes use of visual impressions conveyed by histogram plots of coefficients' p-values. A statistically significant coefficient will have its p-values concentrated below the 5% level of significance; a histogram of p-values will not have a uniform distribution ... The final stage of this study employs daily return data as an examination of what is indeed the best pricing model as well as to provide a robustness check on the monthly return results. The daily results indicate that all three US Fama-French factors, namely the US market, size and book-to-market factors, as well as LIQT, are statistically significant, while the Australian three-factor model exhibits only one significant market factor. This study has discovered that it is in fact the US three-factor model with LIQT, and not the domestic model, which qualifies under the criterion of a well-specified asset-pricing model and best describes Australian stock returns.
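The zero-intercept criterion referred to above can be made concrete with a short sketch: regress a portfolio's excess returns on the three US Fama-French factors and inspect the intercept. The data file and column names are assumptions for illustration.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("ff_monthly.csv")            # hypothetical factor file

y = df["portfolio_ret"] - df["rf"]            # excess portfolio return
X = sm.add_constant(df[["mkt_rf", "smb", "hml"]])

fit = sm.OLS(y, X).fit()
print(f"alpha = {fit.params['const']:.4f} (t = {fit.tvalues['const']:.2f})")
# A well-specified pricing model should leave the intercept (alpha)
# statistically indistinguishable from zero across test portfolios.
```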
APA, Harvard, Vancouver, ISO, and other styles
13

Malherbe, Frédéric. "Essays on the macroeconomic implications of information asymmetries." Doctoral thesis, Universite Libre de Bruxelles, 2010. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210085.

Full text
Abstract:
Along this dissertation I propose to walk the reader through several macroeconomic implications of information asymmetries, with a special focus on financial issues. This exercise is mainly theoretical: I develop stylized models that aim at capturing macroeconomic phenomena such as self-fulfilling liquidity dry-ups, the rise and fall of securitization markets, and the creation of systemic risk.

The dissertation consists of three chapters. The first proposes an explanation for self-fulfilling liquidity dry-ups. The second proposes a formalization of the concept of market discipline and an application to securitization markets as risk-sharing mechanisms. The third offers a complementary analysis to the second, as the rise of securitization is presented as the banker's optimal response to strict capital constraints.

Two concepts that do not have unique acceptations in economics play a central role in these models: liquidity and market discipline. The liquidity of an asset refers to the ability of its owner to transform it into current consumption goods. Secondary markets for long-term assets thus play an important role in that respect. However, such markets might be illiquid due to adverse selection.

In the first chapter, I show that: (1) when agents expect a liquidity dry-up on such markets, they optimally choose to self-insure through the hoarding of non-productive but liquid assets; (2) this hoarding behavior worsens adverse selection and dries up market liquidity; (3) such liquidity dry-ups are Pareto-inefficient equilibria; (4) the government can rule them out. Additionally, I show that idiosyncratic liquidity shocks à la Diamond and Dybvig have stabilizing effects, which is at odds with the banking literature. The main contribution of the chapter is to show that market breakdowns due to adverse selection are highly endogenous to past balance-sheet decisions.

I consider that agents are under market discipline when their current behavior is influenced by future market outcomes. A key ingredient for market discipline to be at play is that the market outcome depends on information that is observable but not verifiable (that is, information that cannot be proved in court and, consequently, upon which enforceable contracts cannot be based). In the second chapter, after introducing this novel formalization of market discipline, I ask whether securitization really contributes to better risk-sharing: I compare it with other mechanisms that differ in the timing of risk transfer. I find that for securitization to be an efficient risk-sharing mechanism, it requires market discipline to be strong and adverse selection not to be severe. This seems to seriously restrict the set of assets that should be securitized for risk-sharing motives. Additionally, I show how ex-ante leverage may mitigate interim adverse selection in securitization markets and therefore enhance ex-post risk-sharing. This is interesting because high leverage is usually associated with "excessive" risk-taking.

In the third chapter, I consider risk-neutral bankers facing strict capital constraints; their capital is indeed required to cover worst-case-scenario losses. In such a set-up, I find that: (1) the banker's optimal autarky response is to diversify lower-tail risk and maximize leverage; (2) securitization helps to free up capital and to increase leverage, but distorts incentives to screen loan applicants properly; (3) market discipline mitigates this problem, but if it is overestimated by the supervisor, it leads to excess leverage, which creates systemic risk. Finally, I consider opaque securitization and show that the supervisor (4) faces uncertainty about the trade-off between the size of the economy and the probability and severity of a systemic crisis, and (5) can generally not set capital constraints at the socially efficient level.
Doctorate in Economic and Management Sciences

APA, Harvard, Vancouver, ISO, and other styles
14

Foote, Alan Richard. "Exploring Knowledge Management Models on Information Technology Projects." ScholarWorks, 2016. https://scholarworks.waldenu.edu/dissertations/2028.

Full text
Abstract:
One way an organization manages the knowledge of its people is in information technology (IT) projects. Organizations develop IT projects for many socially responsible reasons, including improved health care services and better community services. IT projects do not always achieve the goals of the organization when the knowledge of the stakeholders is not managed toward these objectives. The purpose of this study was to address the use of knowledge management (KM) in project management (PM) to improve the success of IT projects in achieving organizational goals. The research questions focused on KM, including its tools and techniques, to improve the success rate of IT projects. The conceptual framework included the project knowledge management (PKM) model, which helped identify knowledge sharing in IT software projects at a local insurance company in Baltimore, Maryland. Interview data were collected from 26 IT project stakeholders about KM in PM. Analysis revealed 4 themes of managing knowledge in the requirements process, the code development process, the testing process, and the helpdesk process for the success of the IT project. Each of the 4 processes used different KM repositories and face-to-face tools. Improving the rate of successful IT projects benefits organizations and society with better products and services at lower costs. This study may effect social change by providing information for managers of other organizations about achieving success in their IT projects.
APA, Harvard, Vancouver, ISO, and other styles
15

Fei, Qi. "Operation models for information systems /." View abstract or full-text, 2009. http://library.ust.hk/cgi/db/thesis.pl?IELM%202009%20FEI.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Choy, Hung-tat Lennon, and 蔡鴻達. "Pricing under information asymmetry: an analysis of the housing presale market from the new institutional economics perspective." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2007. http://hub.hku.hk/bib/B37908133.

Full text
Abstract:
The Best PhD Thesis in the Faculties of Architecture, Arts, Business & Economics, Education, Law and Social Sciences (University of Hong Kong), Li Ka Shing Prize, 2006-2007.
Real Estate and Construction
Doctoral
Doctor of Philosophy
APA, Harvard, Vancouver, ISO, and other styles
17

Faulk, David Philip. "Cost models and the Corporate Information Management (CIM) initiative." Thesis, Monterey, California. Naval Postgraduate School, 1991. http://hdl.handle.net/10945/30969.

Full text
Abstract:
Approved for public release, distribution unlimited
This thesis provides a brief history of the Corporate Information Management (CIM) initiative and includes a summary of the methodology being employed to complete the initiative. The focus of this thesis is on the alternative cost models that are available to the Department of Defense (DoD), and the information requirements for each of them. The cost models reviewed include: actual, normal, standard, variable, cost-volume-profit analysis, and job order. Advantages and disadvantages of each of these models are discussed. In addition, the current DoD implementation of unit costing is also discussed and compared and contrasted with the alternative models that exist.
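Of the cost models listed, cost-volume-profit analysis is the easiest to make concrete: it reduces to a breakeven identity. A minimal sketch follows, with invented figures used purely for illustration.

```python
def breakeven_units(fixed_cost: float, price: float, variable_cost: float) -> float:
    """Volume at which revenue covers fixed plus variable cost:
    Q* = F / (p - v), where (p - v) is the unit contribution margin."""
    margin = price - variable_cost
    if margin <= 0:
        raise ValueError("price must exceed unit variable cost")
    return fixed_cost / margin

# Invented figures: $120,000 fixed cost, $50 price, $30 unit variable cost.
print(breakeven_units(fixed_cost=120_000, price=50.0, variable_cost=30.0))
# -> 6000.0 units; above this volume the activity recovers its costs.
```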
APA, Harvard, Vancouver, ISO, and other styles
18

Johansson, Henrik. "Conceptual information models to integrate data management in engineering simulation /." Luleå, 2002. http://epubl.luth.se/1402-1544/2002/33.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Bharadwaj, Ragu. "Business models for information commons in the pharmaceutical industry." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/47865.

Full text
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, System Design and Management Program, 2009.
Includes bibliographical references (leaves 57-59).
The pharmaceutical industry needs new modes of innovation. The industry's innovation system - based on massive investments in R&D protected by intellectual property rights - has worked well for many years, providing incentives for pharmaceutical firms to invest in developing drugs across a wide variety of major medical needs. However, this traditional drug development process is subject to decreasing productivity and increasing costs. In addition, it encourages pharmaceutical firms to focus on "blockbuster" drugs and to neglect needs in small potential markets such as "orphan" diseases and diseases primarily found in third world countries. This thesis focuses on new modes of innovation, specifically the sharing of safety information prior to clinical trials. To inform this analysis, I first discuss the data showing why the industry is in need of new modes of innovation. I then outline the potential promise of some emerging modes of pharmaceutical development. Finally, I explore a specific novel innovation mode in more detail: the sharing of non-competitive safety information prior to clinical trials, leading to significant reductions in both costs and chances of failure in drug discovery and development. I propose that this new innovation mode offers the potential of significant benefit to both drug developers and medical patients.
by Ragu Bharadwaj.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
20

Caria, Antonio Stefano. "Efficiency and other-regarding preferences in information and job-referral networks." Thesis, University of Oxford, 2015. http://ora.ox.ac.uk/objects/uuid:4c243348-af82-4cdc-b402-e75997e4a599.

Full text
Abstract:
In this thesis I study how networks are formed and I analyse the strategies that well-connected individuals adopt in public good games on a network. In chapter one I study an artefactual field experiment in rural India which tests whether farmers can create efficient networks in a repeated link formation game, and whether group categorisation increases the frequency of in-group links and reduces network efficiency. I find that the efficiency of the networks formed in the experiment is significantly lower than the efficiency which could be achieved under selfish, rational play. When information about group membership is disclosed, in-group links are chosen more frequently, while the efficiency of network structure is not significantly affected. Using a job-referral network experiment in an urban area of Ethiopia, I investigate in chapter two whether individuals create new links with the least connected players in the network. In a first treatment, competition for job-referrals makes it in the player's interest to link with the least connected partners. In this treatment, links to the least connected players are significantly more likely than links to better connected individuals. In a second treatment, connections only affect the welfare of the new partner. Choosing the least connected player minimises inequality and maximises aggregate efficiency. This may motivate other-regarding players. In this treatment, however, links to least connected partners are not significantly more likely than links to other players. In chapter three I explore the characteristics that individuals value in the people they approach for advice. Using cross-sectional data on cocoa farmers in Ghanaian villages and a matched lottery experiment, I find an association between the difference in the aversion to risk of two farmers and the probability that one farmer is interested in the advice of the other farmer. In chapter four I study a one-shot public good game in rural India between farmers connected by a star network. Contributions by the centre of the star have a larger impact on aggregate payoffs than contributions by the spoke players. I use the strategy method to study whether the centre of the star contributes more than the average of the spokes. In selected sessions, I disclose participants' expectations about the choices of the centre of star. I find that the centre player contributes just as much as the average of the spokes, and that he is influenced by the expectations that other players hold about his decisions.
APA, Harvard, Vancouver, ISO, and other styles
21

Chen, Qian. "An Object Model Framework for Interface Management in Building Information Models." Diss., Virginia Tech, 2007. http://hdl.handle.net/10919/28410.

Full text
Abstract:
The construction industry's overall project performance is significantly reduced by numerous interface issues that also hinder its industrialization. Interface Management (IM) is becoming critical to the success of multidisciplinary construction projects. This research deals with three challenging problems associated with IM: 1) how to build a holistic understanding of interface issues for developing all-around IM solutions; 2) how to define and present interface information in a unified, accurate, and efficient way to improve information sharing, coordination, and implementation; and 3) how to resolve interface issues as a whole to optimize IM performance. Comprehensive cause factors of interface issues are investigated from different yet interrelated perspectives. These cause factors allow for the development of an object data model and a systematic IM strategy. The findings of this multi-perspective approach not only add a holistic view of interface issues to the existing body of knowledge but also provide a theoretical base for researchers and practitioners to seek all-around IM solutions. As a key innovation, an object view of interfaces is defined, resulting in a unified way of presenting interface information. This new technique of modeling interfaces as knowledgeable, intelligent, and active objects is far superior to the traditional use of simple relationships. The proposed Interface Object Model (IOM) framework is the first in the literature to present a comprehensive data structure and its dependencies of interface information for object modeling. This can greatly improve the quality and interoperability of modeled interface information. When integrated into a Building Information Modeling (BIM) approach, this technique can significantly enhance BIM capabilities for interface-related coordination, decision-making, operation, and management. As a first application, a systematic model-based IM strategy is conceptually developed, which provides a good foundation for creating an implementation environment for the developed interface model. This strategy aims to resolve interface issues as a whole throughout a complete project process. The multi-perspective approach, the generically structured IOM, and the conceptual, systematic IM strategy all target broad applications. Individually or jointly, they can also be applied to other domains beyond construction.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
22

Calhoun, Karen. "Redesign of Library Workflows: Experimental Models for Electronic Resource Description." the Library of Congress, 2000. http://hdl.handle.net/10150/105094.

Full text
Abstract:
This paper explores the potential for and progress of a gradual transition from a highly centralized model for cataloging to an iterative, collaborative, and broadly distributed model for electronic resource description. The author's purpose is to alert library managers to some experiments underway and to help them conceptualize new methods for defining, planning, and leading the e-resource description process under moderate to severe time and staffing constraints. To build a coherent library system for discovery and retrieval of networked resources, librarians and technologists are experimenting with team-based efforts and new workflows for metadata creation. In an emerging new service model for e-resource description, metadata can come from selectors, public service librarians, information technology staff, authors, vendors, publishers, and catalogers. Arguing that e-resource description demands a level of cross-functional collaboration and creative problem-solving that is often constrained by libraries' functional organizational structures, the author calls for reuniting functional groups into virtual teams that can integrate the e-resource description process, speed up operations, and provide better service. The paper includes an examination of the traditional division of labor for producing catalogs and bibliographies, a discussion of experiments that deploy a widely distributed e-resource description process (e.g., the use of CORC at Cornell and Brown), and an exploration of the results of a brief study of selected ARL libraries' e-resource discovery systems.
APA, Harvard, Vancouver, ISO, and other styles
23

Swaminathan, Raji. "Contingency planning models for government agencies /." Electronic version, 1996. http://adt.lib.uts.edu.au/public/adt-NTSM20030707.112749/index.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Ahmed, Heba Saleh. "The use of System Dynamics simulation models in project management education." Thesis, University of Sunderland, 2016. http://sure.sunderland.ac.uk/8551/.

Full text
Abstract:
This thesis explores the impact of using System Dynamics (SD) as a simulation tool to help learners understand complex, dynamic concepts in project management education, specifically the theory associated with Earned Value Management (EVM). SD simulation models have been used widely, but mainly in business contexts to support managers in the decision-making process. The application of SD in the field of project management education has been limited, particularly in terms of assessing its potential to improve learners' skills and understanding of project management concepts. 'Projects' are considered to be complex information feedback systems, characterized by causality and underlying dynamic relations between multiple variables, and the ability of junior project managers to apply and experience higher practical skills in the management of these complex systems presents a real challenge in the higher education context. The ability of SD to simulate the behaviour of a system, to reveal the underlying relationships, and to help visualize its dynamic changes over time makes SD a promising modelling tool for supporting learners in project management education. This study sets out to evaluate the use of SD in an instructional context to help postgraduate project management students visualize and better understand the complex dynamic relationships in EVM, a topic that features significantly in project management education. In this study, SD was deployed to teach EVM through a series of computer-based models that visualize changes of multiple interacting variables over time. The SD simulations were evaluated and improved in a series of pilot and formal studies. In an experimentally controlled study involving 46 students, EVM content was delivered with SD simulations and using traditional methods respectively. Results, both quantitative and qualitative, demonstrated a positive impact of SD on the learning of the EVM concept. Recommendations for further work to deploy SD in the delivery of complex project management content and other challenging topics, with a wider pool of learners, are discussed.
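For readers unfamiliar with EVM, the dynamic quantities that SD models animate over time reduce to a handful of standard identities; a minimal sketch follows, with illustrative input figures only.

```python
def evm_metrics(pv: float, ev: float, ac: float, bac: float) -> dict:
    """Standard EVM identities. PV: planned value, EV: earned value,
    AC: actual cost, BAC: budget at completion."""
    cpi = ev / ac               # cost performance index
    spi = ev / pv               # schedule performance index
    return {
        "CV": ev - ac,          # cost variance
        "SV": ev - pv,          # schedule variance
        "CPI": cpi,
        "SPI": spi,
        "EAC": bac / cpi,       # estimate at completion (CPI method)
    }

# Invented project snapshot: behind schedule and over cost.
print(evm_metrics(pv=500.0, ev=450.0, ac=480.0, bac=2_000.0))
```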
APA, Harvard, Vancouver, ISO, and other styles
25

Chen, Danfang. "Information management for the factory planning process." Licentiate thesis, Stockholm : Skolan för industriell teknik och management, Kungliga Tekniska högskolan, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-11418.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Schneider, Viktoria. "A bioeconomic analysis of marine reserves for Paua (Abalone) management at Stewart Island, New Zealand." University of Otago. Department of Economics, 2006. http://adt.otago.ac.nz./public/adt-NZDU20060823.160930.

Full text
Abstract:
Marine reserves have increasingly been recognised for their potential to address the pervasive problem of unsustainable harvest of fisheries worldwide. Biologists advocate the benefits of increased spawning biomass, larger modal sizes and greater densities of fish within marine reserves, and the possibility of spillover to adjacent fishable areas. Bioeconomic studies, however, find that pay-offs from stand-alone marine reserves rarely compete with sustainable yield management schemes, but that they can be beneficial when stocks are heavily exploited. Most of these bioeconomic models are analytical and deterministic in nature, and therefore ignore the redistribution of effort in response to closure and the inherent uncertainty of the marine environment. We present a bioeconomic analysis of a network of no-take areas around Stewart Island in New Zealand applied to the shellfish species paua (abalone) that incorporates both predicted redistribution and reduction in effort, as well as stochastic recruitment. A nested logit model is applied to spatially recorded catch and effort data by the Ministry of Fisheries between 1998 and 2003 to capture the two level decision-making process of divers. On any given day, divers decide whether to go diving at all, and if so, which of the 16 statistical areas around Stewart Island to visit. Weather conditions, spatially varying levels of catch per unit of effort and distance are used as explanatory variables to select areas for closure according to the �least economic impact� in terms of loss of diving trips. An age-structured biological model is developed with parameters specifically applied to paua stocks around Stewart Island. Virgin paua biomass as of 1974 is estimated on the basis of growth, survival, post-larval recruitment and egg production in the absence of fishing. Historic catch rates are then applied to find overall and area-specific levels of exploitation rates, spawning biomass, egg production, legal biomass and numbers of paua. In a final step, the economic model is linked to the biological model to simulate the imposition of no-take areas when taking account of the initial disproportional shift of harvest to fished areas in the first year, and the increase in overall pressure on legal biomass in the years thereafter. We contribute to the marine reserve debate by showing that in the very long run, the overall yield under closure of a relatively small area approaches and even slightly surpasses the yield under no closure for an assumed spillover gradient of 40% despite the redistribution of effort. The most important benefits of marine reserves emerge when stochastic recruitment is included in the recruitment function. In practice, predictions about the stock status and the impact of different harvest levels become much more difficult when acknowledging the inherent variability of the marine environment. The likelihood of stock collapse depends on the assumed value of two recruitment parameters, which highlights the effects of parameter uncertainty and emphasizes the role of marine reserves for population persistence. We also show that under uncertainty average yields under a management regime of a network of no-take areas in addition to the quota system can equal yields under no closure for an assumed spillover gradient of 40%, despite the increased pressure on areas adjacent to the closed areas. Our findings have significant implications for the management of the paua fishery at Stewart Island. 
For a heterogeneously abundant species, such as paua, spatial management in addition to quota limits could be vital in ensuring the long-term sustainability of the fishery given the inherent variability of the marine environment.
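A heavily simplified, purely illustrative sketch of the kind of stochastic projection described above: logistic-style growth with lognormal recruitment noise and a fraction of the stock closed to harvest. The functional form and every parameter value are assumptions, not the thesis's calibrated age-structured model.

```python
import numpy as np

rng = np.random.default_rng(0)

def project(years=50, b0=1.0, k=1.0, r=0.3, harvest=0.25,
            reserve=0.2, sigma=0.4):
    """Biomass path; `reserve` is the unfished share of the stock and
    `sigma` the lognormal recruitment noise the thesis highlights."""
    b, path = b0, []
    for _ in range(years):
        recruit = r * b * (1 - b / k) * rng.lognormal(0.0, sigma)
        catch = harvest * (1 - reserve) * b   # only the open area is fished
        b = max(b + recruit - catch, 0.0)
        path.append(b)
    return np.array(path)

runs = np.stack([project() for _ in range(1000)])
print("mean final biomass:", runs[:, -1].mean())
print("collapse frequency:", (runs[:, -1] < 0.05).mean())
# Re-running with reserve=0.0 gives a rough feel for how closures trade
# short-run catch against the risk of collapse under recruitment noise.
```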
APA, Harvard, Vancouver, ISO, and other styles
27

Guo, Chenhui. "Empirical Studies on Incentives, Information Disclosure, and Social Interactions in Online Platforms." Diss., The University of Arizona, 2016. http://hdl.handle.net/10150/621773.

Full text
Abstract:
Nowadays, people have many business activities and entertainments on a variety of online platforms. Despite their various functionalities, online platforms have a fundamental administrative problem: How do platform designers or administrators create proper online environments, including mechanisms and policies, to better manage user behaviors, in order to reach the goals of the platforms? Starting with a taxonomy of online platforms, I introduce three critical dimensions that help to characterize such platforms, including revenue model, heterogeneity in the role of users and level of user interaction. Then, I choose three online platforms as research contexts and conduct empirical studies, trying to identify and understand the impact of the incentive program, quality information disclosure, and social influence, on users' decision-making in online platforms. The first essay investigates the effectiveness of incentive hierarchies, where users achieve increasingly higher status in the community after achieving increasingly more challenging goals, in motivating user contribution in the same platform. The findings have important implications for crowd-based online applications, such as knowledge exchange and crowdsourcing. The second essay focuses on online consumer review sites, and studies whether and how consumer-generated word-of-mouth of restaurants-both volume and valence-is influenced by the disclosure of quality information from health inspectors, by conducting analytical modeling and econometric analyses using data from a leading consumer review site. The third essay examines how social interactions matter in a large-scale online social game that adopts an increasingly popular freemium revenue model. The study leverages an econometric model to quantify the effect of peer consumption on players' repeated decisions for the consumption of both free services and premium services. Finally, I conclude the dissertation by highlighting the three fundamental issues of design and management of online platforms.
APA, Harvard, Vancouver, ISO, and other styles
28

Fang, Xiazhi. "Development of distress and performance models of composite pavements for pavement management." Thesis, The University of North Carolina at Charlotte, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10269558.

Full text
Abstract:

Roadway systems in the United States have become huge assets that need massive resources to maintain and operate. To meet long-term performance goals, government agencies developed pavement management systems (PMSs) to help them manage roadway assets effectively with limited resources. Currently, some PMSs in the United States have been designed for two types of pavements: asphalt and concrete. Composite pavement, a third pavement type that results from concrete pavement rehabilitation and is constructed with an asphalt surface layer over a concrete base, was treated as asphalt. However, the literature review indicates that, compared to asphalt pavements, composite pavements perform differently and have different dominant distresses. In addition, as the amount of composite pavement increases, it is necessary to investigate it independently to incorporate more accurate information into the PMS. Therefore, the goal of this research is to improve and expand the PMS with an additional pavement type: composite pavements. To achieve this goal, the PMS managed by the North Carolina Department of Transportation (NCDOT) was used as a case study, and several objectives were accomplished in this research: 1) to identify composite pavements and generate the raw data based on the construction history; 2) to clean the raw data and mitigate errors using statistical methods and engineers' experience; 3) to develop nonlinear models to describe dominant distresses and pavement performance; 4) to propose quantile regression (QR) models to predict pavement performance; and 5) to investigate pavement treatment effectiveness by exploring performance index jumps.

Based on the findings of this research, it was concluded that the automated data were more consistent with engineers' experience and revealed more information than the windshield data; longitudinal cracking and transverse cracking were found to be the dominant distresses in composite pavements, followed by alligator cracking and raveling; Interstate composite pavements deteriorated faster than both US and NC composite pavements, and NC composite pavements had the slowest deterioration rate; QR models can be used as a new method of predicting pavement performance at both the project and the network levels; in general, the "Resurfacing" treatment was more effective than the "Chip Seal" treatment; and the average service lives of asphalt and composite pavements were similar, but composite pavements have a smaller variation in service lives than asphalt pavements.

It was recommended that the automated data be used in future PMS-related research projects due to their better quality and that, given the robust performance of QR models at both the network and project levels, QR models be incorporated in the future PMS.

In summary, this research expanded the existing NCDOT PMS with composite pavements, proposed systematic methods to improve the quality of performance data, enriched the diversity of prediction models by exploring the potential of QR models, and investigated the effectiveness of pavement treatments. Essentially, transportation agencies can use the findings of this research to make informed investment decisions.
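A minimal sketch of the quantile-regression performance models the research proposes, using statsmodels; the data file and variable names are assumptions for illustration.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("composite_sections.csv")    # hypothetical PMS extract

# Condition index as a function of pavement age at several quantiles,
# tracing optimistic, median, and pessimistic deterioration curves.
for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("condition_index ~ age + I(age ** 2)", df).fit(q=q)
    print(q, fit.params.to_dict())
```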

APA, Harvard, Vancouver, ISO, and other styles
29

Sharma, Luv. "Examining the impact of hospital technology and administrative innovation on performance: An Econometric investigation." The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu1466436879.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Chen, Hongqing. "An Empirical Study on the Jump-diffusion Two-beta Asset Pricing Model." PDXScholar, 1996. https://pdxscholar.library.pdx.edu/open_access_etds/1325.

Full text
Abstract:
This dissertation focuses on testing and exploring the usage of the jump-diffusion two-beta asset pricing model. Daily and monthly security returns from both the NYSE and AMEX are employed to form various samples for the empirical study. Maximum likelihood estimation is employed to estimate the parameters of the jump-diffusion processes. A thorough study of the existence of jump-diffusion processes is carried out with the likelihood ratio test. The probability of existence of the jump process is introduced as an indicator of "switching" between the diffusion process and the jump process. This new empirical method marks a contribution to future studies on the jump-diffusion process. It also makes the jump-diffusion two-beta asset pricing model operational for financial analyses. Hypothesis tests focus on the specifications of the new model as well as the distinction between it and the conventional capital asset pricing model. Both parametric and non-parametric tests are carried out in this study. Compared with previous models of the risk-return relationship, such as the capital asset pricing model, the arbitrage pricing theory and various multi-factor models, the jump-diffusion two-beta asset pricing model is simple and intuitive, and it possesses more explanatory power when the jump process is dominant. This characteristic makes it a better model for explaining the January effect. Extra effort is devoted to the study of the January effect due to the importance of the phenomenon. Empirical findings from this study agree with the model in that the systematic risk of an asset is the weighted average of both jump and diffusion betas. It is also found that the systematic risk of the conventional CAPM does not equal the weighted average of jump and diffusion betas.
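A compact sketch of how the maximum likelihood estimation described above can be set up for a Merton-style jump-diffusion, writing the return density as a Poisson mixture of normals; the data file, starting values, and the truncation of the Poisson sum are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm, poisson

def neg_loglik(theta, r, kmax=10):
    """Negative log-likelihood of returns r under a jump-diffusion with
    drift mu, diffusion sigma, jump intensity lam, and normal jump sizes
    (mu_j, sigma_j); the Poisson sum is truncated at kmax jumps."""
    mu, sigma, lam, mu_j, sigma_j = theta
    if sigma <= 0 or sigma_j <= 0 or lam < 0:
        return np.inf
    dens = np.zeros_like(r)
    for k in range(kmax + 1):
        dens += poisson.pmf(k, lam) * norm.pdf(
            r, mu + k * mu_j, np.sqrt(sigma**2 + k * sigma_j**2))
    return -np.log(dens).sum()

r = np.loadtxt("daily_returns.txt")       # hypothetical return series
theta0 = [0.0005, 0.01, 0.1, 0.0, 0.02]   # mu, sigma, lam, mu_j, sigma_j
fit = minimize(neg_loglik, theta0, args=(r,), method="Nelder-Mead")
print(fit.x)
# A likelihood-ratio test against the pure-diffusion restriction
# (lam = 0) parallels the existence test described in the abstract.
```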
APA, Harvard, Vancouver, ISO, and other styles
31

Milunovich, George (Economics, Australian School of Business, UNSW). "Modelling and valuing multivariate interdependencies in financial time series." Awarded by: University of New South Wales, School of Economics, 2006. http://handle.unsw.edu.au/1959.4/25162.

Full text
Abstract:
This thesis investigates implications of interdependence between stock market prices in the context of several financial applications including: portfolio selection, tests of market efficiency and measuring the extent of integration among national stock markets. In Chapter 2, I note that volatility spillovers (transmissions of risk) have been found in numerous empirical studies but that no one, to my knowledge, has evaluated their effects in the general portfolio framework. I dynamically forecast two multivariate GARCH models, one that accounts for volatility spillovers and one that does not, and construct optimal mean-variance portfolios using these two alternative models. I show that accounting for volatility spillovers lowers portfolio risk with statistical significance and that risk-averse investors would prefer realised returns from portfolios based on the volatility spillover model. In Chapter 3, I develop a structural MGARCH model that parsimoniously specifies the conditional covariance matrix and provides an identification framework. Using the model to investigate interdependencies between size-sorted portfolios from the Australian Stock Exchange, I gain new insights into the issue of asymmetric dependence. My findings not only confirm the observation that small stocks partially adjust to market-wide news embedded in the returns to large firms but also present evidence that suggests that small firms in Australia fail to even partially adjust (with statistical significance) to large firms' shocks contemporaneously. All adjustments in small capitalisation stocks occur with a lag. Chapter 4 uses intra-daily data and develops a new method for measuring the extent of stock market integration that takes into account non-instantaneous adjustments to overnight news. This approach establishes the amounts of time that the New York, Tokyo and London stock markets take to fully adjust to overnight news and then uses this
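For the portfolio exercise in Chapter 2, the optimal mean-variance weights follow from the forecast conditional covariance matrix. The Python sketch below computes global minimum-variance weights from a placeholder two-asset covariance forecast; in the thesis this matrix would come from the MGARCH models (with and without spillovers), and the numbers here are arbitrary assumptions.

import numpy as np

# One-step-ahead conditional covariance forecast (placeholder values)
sigma = np.array([[0.040, 0.012],
                  [0.012, 0.025]])

ones = np.ones(sigma.shape[0])
w = np.linalg.solve(sigma, ones)  # proportional to inverse-covariance times ones
w /= w.sum()                      # global minimum-variance weights
print(w, w @ sigma @ w)           # weights and resulting portfolio variance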
APA, Harvard, Vancouver, ISO, and other styles
32

Flovén, Karl Fredrik. "State management models impact on run-time performance in Single Page Applications." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-277758.

Full text
Abstract:
The choice of state management model, i.e. how state is managed in an application, is an important part of application development. In this thesis, we investigate how run-time performance, more specifically scripting performance, is affected when a shared state is managed using a relative state management model compared to a global state management model, in a component-based Single Page Application. To compare the two state management models, two implementations of a simple application were made with the same functionality but managing application state in two different ways. Two experiments, each focusing on one dimension of the application structure, were done to compare the scripting performance of the implementations during state updates. The results showed that the state management models do affect scripting performance, but that the effect depends on the structure of the application. In the first experiment, the global state management model showed a much larger performance decrease than the relative state management model as the number of components dependent on the updating state increased. Observations of the reconciliation processes showed that the performance differences were caused by unoptimized synchronous DOM mutations introduced by the components using the global state management framework. In the second experiment, the relative state management model showed a larger performance decrease than the global state management model as the component tree depth increased, although the difference was smaller than in the first experiment. Taken together, the findings from this study suggest that scripting performance may be a factor to consider when choosing a state management model for an application. However, the application structure, including the complexity of the application, affects which state management model is preferable and to what degree it affects scripting performance, and thus run-time performance.
APA, Harvard, Vancouver, ISO, and other styles
33

Tian, Xiaoguang. "Hybrid Models in Automobile Insurance: Technology Adoption and Customer Relations." Thesis, University of North Texas, 2019. https://digital.library.unt.edu/ark:/67531/metadc1538717/.

Full text
Abstract:
Customer relationship management (CRM), a primary activity in the business value chain for relating to the customer, involves the solicitation, analysis, and use of knowledge about the customer to provide goods and services through effective and efficient methods. It is a sound strategy and a source of competitive advantage for understanding customer behavior and managing business performance. The use of information technology (IT) in CRM allows companies to simplify their processes, to integrate product- or service-related decision making with their business strategies, and to optimize their operations by embracing analytical techniques. The insurance industry is facing unprecedented challenges and decisions in this data-driven business paradigm. It is a strategic necessity for customer-centric insurers to utilize emerging IT capabilities to support interactions between customers and business operations. The research in this dissertation seeks to provide insights into the application of early technology innovation and data-driven strategies by investigating two groups of CRM technology issues: technology adoption and data-driven technology application. Through three essays, the dissertation explores the use of information technology and data analytical tools to provide insight into how automobile insurance companies make decisions regarding their relationships with their customers. The results from these studies provide a framework for managers to devise effective approaches to enhancing the performance of their business.
APA, Harvard, Vancouver, ISO, and other styles
34

Litvinov, А. "Development and Research of Probabilistic Models of Quality Assessment of Management Information Systems Operation." Thesis, Sumy State University, 2017. http://essuir.sumdu.edu.ua/handle/123456789/55760.

Full text
Abstract:
The purpose of the research is the development of stochastic models of management information systems (MIS) operation based on queueing systems. It is shown that single-line queueing systems with a generalized Erlang flow of random events can be used. The study of such systems is carried out and the main operating characteristics are obtained. This enables the planning of procedures for MIS operation.
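As a hedged illustration of such a model, the following Python sketch simulates a single-line (single-server) queue whose arrivals follow an Erlang flow — a special case of the generalized Erlang flow studied. The shape parameter and rates are arbitrary assumptions; the mean waiting time printed at the end is one of the operating characteristics such models are meant to yield.

import random

random.seed(1)
k, lam, mu = 3, 1.0, 0.5   # Erlang shape, rate per stage, service rate
t_arrival, t_free, waits = 0.0, 0.0, []
for _ in range(100_000):
    # Erlang(k) inter-arrival time = sum of k exponential stages = Gamma(k, scale 1/lam)
    t_arrival += random.gammavariate(k, 1.0 / lam)
    start = max(t_arrival, t_free)       # service starts when the server is free
    waits.append(start - t_arrival)      # time spent waiting in the queue
    t_free = start + random.expovariate(mu)
print(sum(waits) / len(waits))           # estimated mean waiting time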
APA, Harvard, Vancouver, ISO, and other styles
35

Xiong, Li. "Resilient Reputation and Trust Management: Models and Techniques." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/7483.

Full text
Abstract:
The continued advances in service-oriented computing and global communications have created a strong technology push for online information sharing and business transactions among enterprises, organizations and individuals. While these communities offer enormous opportunities, they also present potential threats due to a lack of trust. Reputation systems provide a way of building trust through social control by harnessing community knowledge in the form of feedback. Although feedback-based reputation systems help community participants decide whom to trust and encourage trustworthy behavior, they also introduce vulnerabilities due to potential manipulations by dishonest or malicious players. Building an effective and resilient reputation system therefore remains a major challenge for the wide deployment of service-oriented computing. This dissertation proposes a decentralized reputation-based trust-supporting framework called PeerTrust, focusing on models and techniques for resilient reputation management against feedback-aggregation vulnerabilities, especially feedback sparsity with potential feedback manipulation, feedback oscillation, and loss of feedback privacy. This dissertation research makes three unique contributions to building a resilient decentralized reputation system. First, we develop a core reputation model with important trust parameters and a coherent trust metric for quantifying and comparing the trustworthiness of participants, and we develop decentralized strategies for implementing the trust model in an efficient and secure manner. Second, we develop techniques countering potential vulnerabilities associated with feedback aggregation, including a similarity inference scheme to counter feedback sparsity with potential feedback manipulations, and a novel metric based on a Proportional, Integral, and Derivative (PID) model to handle strategic oscillating behavior of participants. Third but not least, we develop privacy-conscious trust management models and techniques to address the loss of feedback privacy, including a set of novel probabilistic decentralized privacy-preserving computation protocols for important primitive operations. We show how feedback aggregation can be divided into individual steps that utilize the above primitive protocols through an example reputation algorithm based on kNN classification. We perform experimental evaluations for each of the schemes we proposed and show the feasibility, effectiveness, and cost of our approach. The PeerTrust framework presents an important step forward with respect to developing attack-resilient reputation trust systems.
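The core idea of a feedback-based trust metric can be sketched as a credibility-weighted average of ratings. The Python below is a deliberately simplified stand-in for the PeerTrust metric, whose actual formulation includes additional trust parameters; the rating scale and fallback prior are assumptions made for illustration.

def trust_score(feedback, neutral=0.5):
    # feedback: list of (rating in [0, 1], rater_credibility in [0, 1]);
    # weighting by credibility dampens the influence of dishonest raters
    total = sum(cred for _, cred in feedback)
    if total == 0:
        return neutral  # no usable feedback: fall back to a neutral prior
    return sum(rating * cred for rating, cred in feedback) / total

# A highly credible positive rating outweighs a low-credibility negative one
print(trust_score([(1.0, 0.9), (0.2, 0.3)]))  # 0.8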
APA, Harvard, Vancouver, ISO, and other styles
36

Power, Bernadette. "Factors which foster the survival of long-lived small firms." Thesis, University of St Andrews, 2004. http://hdl.handle.net/10023/14113.

Full text
Abstract:
This thesis focuses on the factors which foster the long-run survival, or continued existence, of the small firm. Using fieldwork methods, new data were gathered in face-to-face interviews with 63 owner-managers of mature small firms in Scotland (average age of 25.5 years). An instrument incorporating novel ways of calibrating organisational change and performance was designed specifically for this study. The unique body of data enabled a number of new hypotheses to be tested in structural econometric models of small firm performance and growth. A mix of quantitative and qualitative data was also used to construct illustrative case studies of seven enterprise profiles. New measures of flexibility and firm-specific turbulence are used to explain the performance of mature small firms, and Heckman sample-selection estimation of this performance equation is undertaken. Performance was measured using an index constructed from Likert scales over 28 distinct attributes. It was found that firm-specific turbulence had a large negative effect on performance. Measures of flexibility (viz. agility and speed) enhanced the long-run prospects of the mature small firm. Evidence of a trade-off relationship was found between measures of flexibility. Real options logic was found to be useful in interpreting the results. This evidence indicated that entrepreneurs should be alert to precipitators of organisational change, but should not act impulsively in responding to them. The tendency of the long-lived small firm to remain small is considered using structural modelling techniques. In a three-equation simultaneous model, performance, size and a third variable (viz. market extent and size of competitive strategy space) are jointly determined. An array of system estimation techniques (e.g. 2SLS, 3SLS, H3SLS) was employed to estimate the behavioural models. A trade-off is found between firm size and performance, thus embedding this result in a larger structural model. It is found that small firms need to adjust downwards in size, and to cultivate a varied competitive strategy in niche or localised markets, to attain higher equilibrium values of performance and to promote longevity.
APA, Harvard, Vancouver, ISO, and other styles
37

Corres, Stelios. "Essays on the dynamics of qualitive aspects of firms' behavior." Diss., Virginia Tech, 1993. http://hdl.handle.net/10919/40187.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Choi, Kin Ying. "Ethical belief and behavior in using information systems : in search of predictive models." HKBU Institutional Repository, 1997. http://repository.hkbu.edu.hk/etd_ra/397.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Yang, Wenling. "M-GARCH Hedge Ratios And Hedging Effectiveness In Australian Futures Markets." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2000. https://ro.ecu.edu.au/theses/1530.

Full text
Abstract:
This study deals with the estimation of optimal hedge ratios using various econometric models. Most recent papers have demonstrated that the conventional ordinary least squares (OLS) method of estimating constant hedge ratios is inappropriate; other, more complicated models, however, seem to produce no more efficient hedge ratios. Using daily AOI and SPI futures data from the Australian market, optimal hedge ratios are calculated from four different models: the OLS regression model, the bivariate vector autoregressive model (BVAR), the error-correction model (ECM) and the multivariate diagonal Vech GARCH model. The performance of each hedge ratio is then compared. Hedging effectiveness is measured in terms of ex-post and ex-ante risk-return trade-offs at various forecasting horizons. It is generally found that the GARCH time-varying hedge ratios provide the greatest portfolio risk reduction, particularly for longer hedging horizons, but they do not generate the highest portfolio returns.
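The conventional constant hedge ratio the study starts from is the OLS slope of spot returns on futures returns, h* = Cov(s, f) / Var(f). The Python sketch below computes it on simulated stand-in data; the actual study uses daily AOI and SPI futures series.

import numpy as np

rng = np.random.default_rng(42)
f = rng.normal(0, 0.010, 500)               # futures returns (simulated)
s = 0.9 * f + rng.normal(0, 0.004, 500)     # spot index returns, correlated with futures

# OLS hedge ratio: slope of the regression of spot on futures returns
h = np.cov(s, f)[0, 1] / np.var(f, ddof=1)
print(h)  # close to the true hedging coefficient of 0.9

The GARCH alternative replaces these unconditional moments with time-varying conditional ones, producing a hedge ratio that changes each period.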
APA, Harvard, Vancouver, ISO, and other styles
40

Nour, Mohamed [author], and Karl [academic supervisor] Beucke. "A Flexible Model for Incorporating Construction Product Data into Building Information Models / Mohamed Nour ; Supervisor: Karl Beucke." Weimar : Professur Informatik im Bauwesen, 2006. http://d-nb.info/1115806297/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Blatt, Sharon L. "An in-depth look at the information ratio." Link to electronic thesis, 2004. http://www.wpi.edu/Pubs/ETD/Available/etd-0824104-155216/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Sperling, Brian Keith. "Information Sharing Strategies To Improve Team Mental Models In Complex Systems." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/6975.

Full text
Abstract:
This thesis hypothesizes that providing task-specific information to individual team members will improve coordination and decision-making, and therefore team performance, in time-critical tasks. Major themes addressed in this research include teams and team processes, mental models, team mental models, work domain analysis, and hierarchical task analysis. Furthermore, the theory behind the development of complementary models is introduced. A unique method to identify the information sources and requirements in a complex team environment is first discussed in general and then applied in two domains. The findings are presented of two experiments examining the effects of imposing different information distribution strategies that range from no complementariness to full complementariness of information. Team communication, team and individual task performance, workload, and the timeliness and effectiveness of team decision making were assessed in nominal and off-nominal conditions. The first experiment used an automobile simulator and examined team navigation while driving. A second experiment was designed to incorporate additional measures to more specifically investigate individual performance, team workload, and clarity of information requirements using a UH-60 Black Hawk helicopter simulator. The procedures used for both experiments provided dynamic yet controlled environments through which critical factors that influence team process and performance could be evaluated accurately. Results of these experiments provide empirical evidence that providing task-relevant information to individual team members in a time-critical environment, while limiting their access to non-relevant information, improves individual and team performance. Furthermore, the observed increases in individual performance indicate that this method of distributing information among team members may provide individual crewmembers with a more accurate task-relevant mental model of their own environment. This research provides new insight into how the distribution of information among team members affects the development of mental models, information requirements, team and individual performance, and communications, and highlights several directions for future research. The information distribution design principles presented in this thesis address the heterogeneity of teams; teams cannot be thought of as groups of identical individuals. The results concerning communication, workload, performance, and team mental models were consistent across the domains in this research.
APA, Harvard, Vancouver, ISO, and other styles
43

Van, Eeden Johannes Gerhardus. "An in-depth literary study of Tobin's Q ratio, free cash flow and the relationship that exists between Q and free cash flow." Thesis, Stellenbosch : University of Stellenbosch, 2009. http://hdl.handle.net/10019.1/5047.

Full text
Abstract:
Thesis (MBA (Business Management))--University of Stellenbosch, 2009.
ENGLISH ABSTRACT: Tobin's q value is widely used by financial analysts as a performance indicator. The market value of a firm over the replacement cost of its fixed assets and inventory serves as an indication of whether value is created by investing internally in the firm, or whether value is destroyed by investing in negative net present value projects. Where Tobin's q is greater than one (q > 1), the market value of the firm is greater than what it would cost to replace fixed assets and inventory; value is therefore created. Firms that have a Tobin's q value of less than one are advised to pay dividends rather than invest in negative net present value projects. Over 200 different methods of calculating Tobin's q exist, yet increasing the complexity of the algorithm used to determine q does little to improve measurement quality. A strong link exists between excess market returns, free cash flow spending announcements and the firm's Tobin's q value. Firms with a high Tobin's q value should ensure that good investment possibilities are pursued. The use of internal funds to fund new investment is viewed in a positive light by the market, and above-average returns are generated. Firms with a high Tobin's q value and high free cash flow show lower returns. These lower returns result from the market recognising the firm's failure to capitalise on favourable internal investment opportunities.
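Using the definition given in the abstract, Tobin's q reduces to a one-line calculation; the Python sketch below uses purely hypothetical figures.

def tobins_q(market_value, fixed_assets_replacement, inventory_replacement):
    # q = market value of the firm / replacement cost of fixed assets and inventory
    return market_value / (fixed_assets_replacement + inventory_replacement)

q = tobins_q(1_200_000, 700_000, 300_000)
print(q)  # 1.2 -> q > 1: internal investment is expected to create value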
APA, Harvard, Vancouver, ISO, and other styles
44

RICCIARDI, RITA I. "Analise dos conhecimentos criticos de uma organizacao baseada em mapeamento de processos e cartografia de dominios de conhecimento - O estudo do Centro de Radiofarmacia do IPEN." reponame:Repositório Institucional do IPEN, 2003. http://repositorio.ipen.br:8080/xmlui/handle/123456789/11122.

Full text
Abstract:
Master's dissertation (Mestrado)
IPEN/D
Instituto de Pesquisas Energeticas e Nucleares - IPEN/CNEN-SP
APA, Harvard, Vancouver, ISO, and other styles
45

Swartbooi, Andile A. "The role of knowledge management in offshore outsourced software development." Thesis, Stellenbosch : University of Stellenbosch, 2010. http://hdl.handle.net/10019.1/5352.

Full text
Abstract:
Thesis (MPhil (Information Science))--University of Stellenbosch, 2010.
ENGLISH ABSTRACT: In an effort to streamline operations and focus on what they regard as core activities, a growing number of organizations from both developed and developing countries are looking to outsource their software development and maintenance activities to lower-cost countries such as India and China; this is evidenced by the phenomenal growth in India's software industry and the number of major overseas IT companies establishing subsidiaries and relocating their research and development operations to India's high-tech cities such as Hyderabad, Chennai and Pune. With populations of over a billion people each, and supported by their governments, Indian and Chinese businesses have been able to leverage this advantage, producing a large pool of software engineers, technical specialists and back-office workers to cater for the world's talent demands. While the actual software development process might be non-core to many organizations, it nevertheless yields software applications that drive critical business processes and embed valuable organizational knowledge. The handing over of software development operations by an organization to a third party poses the risk of creating a dependency and exposing vital business knowledge to competition, thereby compromising its competitive edge. Both the people who participate in software development projects and the software products they develop possess knowledge which needs to be secured and leveraged to enable the continued success of an organization. Securing these knowledge artefacts and the knowledge created by the software development lifecycle cannot be left to chance; the success of an organization's software development activities therefore needs to be measured largely on its ability to secure the knowledge assets that derive from such processes and to leverage that knowledge to drive organizational strategy and yield new knowledge. This thesis is premised on the fact that knowledge is the one competitive advantage that separates successful nations from failed states and the one dominant force that prevails across all successful economies in the 21st century; hence the notion of a knowledge economy. The study seeks to understand the importance of the role played by knowledge in an outsourced software development engagement and how knowledge management affects the success of this engagement. By exploring the business drivers that spur organizations to outsource their IT activities, the software development lifecycle, the different outsourcing models available to organizations and the inherent risks surrounding knowledge loss, the thesis seeks an understanding of the criticality of managing knowledge within an outsourced software development context and of the strategies that organizations can use to deliver on outsourcing promises with minimal risk.
APA, Harvard, Vancouver, ISO, and other styles
46

Chimhini, Joseline. "International portfolio diversification with special reference to emerging markets." Thesis, Edith Cowan University, Research Online, Perth, Western Australia, 2001. https://ro.ecu.edu.au/theses/1076.

Full text
Abstract:
This study evaluates the potential benefits that investors obtain from diversifying their portfolios into emerging markets when the time-varying behavior of assets is considered. It also tests whether the existing asset-pricing models developed in the context of developed markets, which assume complete integration, can explain expected returns in emerging markets, and it determines the risk of investing in these markets using cross-section and time-series data. An international capital asset pricing model (ICAPM) with time-varying moments, developed by Harvey (1991), is adopted. The conditional asset-pricing model, which takes into account prevailing world economic factors, was used. The Generalized Method of Moments (GMM) is used to test the model. Results indicate that some markets have become more integrated with world markets than they were in the 1980s, while others, which failed to open their economies fully, have become more segmented. The thesis examines the regional markets of Latin America, Sub-Saharan Africa, the Middle East and North Africa, Eastern Europe and Asia. A number of authors have looked at the emerging markets of Asia and Latin America, but little is known about the African, Middle Eastern and Eastern European markets. The innovation of this research is that it examines the behavior of assets in all regional global markets and whether they behave differently.
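To make the estimation strategy concrete, the sketch below shows a just-identified GMM (instrumental-variables) estimate of a simple unconditional beta from the moment condition E[(r - a - b·rm)·z] = 0. The instruments, simulated data, and single-factor form are illustrative assumptions, far simpler than the conditional ICAPM moments actually tested in the thesis.

import numpy as np

rng = np.random.default_rng(7)
n = 500
z = np.column_stack([np.ones(n), rng.normal(0, 1, n)])  # instruments known at time t
rm = 0.3 * z[:, 1] + rng.normal(0, 1, n)                # world market return
r = 0.01 + 0.8 * rm + rng.normal(0, 1, n)               # emerging-market asset return
x = np.column_stack([np.ones(n), rm])

# Just-identified GMM: set the sample moments Z'(r - X theta) to zero
theta = np.linalg.solve(z.T @ x, z.T @ r)
print(theta)  # approximately [0.01, 0.8]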
APA, Harvard, Vancouver, ISO, and other styles
47

Malings, Carl Albert. "Optimal Sensor Placement for Infrastructure System Monitoring using Probabilistic Graphical Models and Value of Information." Research Showcase @ CMU, 2017. http://repository.cmu.edu/dissertations/869.

Full text
Abstract:
Civil infrastructure systems form the backbone of modern civilization, providing the basic services that allow society to function. Effective management of these systems requires decision-making about the allocation of limited resources to maintain and repair infrastructure components and to replace failed or obsolete components. Making informed decisions requires an understanding of the state of the system; such an understanding can be achieved through a computational or conceptual system model combined with information gathered on the system via inspections or sensors. Gathering of this information, referred to generally as sensing, should be optimized to best support the decision-making and system management processes, in order to reduce long-term operational costs and improve infrastructure performance. In this work, an approach to optimal sensing in infrastructure systems is developed by combining probabilistic graphical models of infrastructure system behavior with the value of information (VoI) metric, which quantifies the utility of information gathering efforts (referred to generally as sensor placements) in supporting decision-making in uncertain systems. Computational methods are presented for the efficient evaluation and optimization of the VoI metric based on the probabilistic model structure. Various case studies on the application of this approach to managing infrastructure systems are presented, illustrating the flexibility of the basic method as well as various special cases for its practical implementation. Three main contributions are presented in this work. First, while the computational complexity of the VoI metric generally grows exponentially with the number of components, growth can be greatly reduced in systems with certain topologies (designated as cumulative topologies). Following from this, an efficient approach to VoI computation based on a cumulative topology and Gaussian random field model is developed and presented. Second, in systems with non-cumulative topologies, approximate techniques may be used to evaluate the VoI metric. This work presents extensive investigations of such systems and draws some general conclusions about the behavior of this metric. Third, this work presents several complete application cases for probabilistic modeling techniques and the VoI metric in supporting infrastructure system management. Case studies are presented in structural health monitoring, seismic risk mitigation, and extreme temperature response in urban areas. Other minor contributions included in this work are theoretical and empirical comparisons of the VoI with other sensor placement metrics and an extension of the developed sensor placement method to systems that evolve in time. Overall, this work illustrates how probabilistic graphical models and the VoI metric can allow for efficient sensor placement optimization to support infrastructure system management. Areas of future work to expand on the results presented here include the development of approximate, heuristic methods to support efficient sensor placement in non-cumulative system topologies, as well as further validation of the efficient sensing optimization approaches used in this work.
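A minimal worked example of the VoI metric, for a single binary component state observed by one noisy sensor, is sketched below in Python. The prior, costs, and sensor accuracy are illustrative assumptions; the full framework generalizes this preposterior calculation to probabilistic graphical models over many components.

# State: component is damaged with prior probability 0.2
p_damaged = 0.2
accuracy = 0.9  # P(sensor reading matches the true state)
cost = {("repair", "damaged"): 1, ("repair", "intact"): 1,
        ("ignore", "damaged"): 10, ("ignore", "intact"): 0}

def best_expected_cost(p):
    # expected cost of the best action under belief p = P(damaged)
    repair = p * cost[("repair", "damaged")] + (1 - p) * cost[("repair", "intact")]
    ignore = p * cost[("ignore", "damaged")] + (1 - p) * cost[("ignore", "intact")]
    return min(repair, ignore)

prior_cost = best_expected_cost(p_damaged)  # decide without sensing

# Average the optimal posterior cost over the two possible sensor readings
p_reads_damaged = accuracy * p_damaged + (1 - accuracy) * (1 - p_damaged)
post_if_damaged = accuracy * p_damaged / p_reads_damaged
post_if_intact = (1 - accuracy) * p_damaged / (1 - p_reads_damaged)
preposterior_cost = (p_reads_damaged * best_expected_cost(post_if_damaged)
                     + (1 - p_reads_damaged) * best_expected_cost(post_if_intact))

voi = prior_cost - preposterior_cost  # expected cost reduction from sensing
print(voi)  # positive: the sensor is worth placing if it costs less than this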
APA, Harvard, Vancouver, ISO, and other styles
48

Drexler, Michael. "Evaluating the use of larval connectivity information in fisheries models and management in the Gulf of Mexico." Scholar Commons, 2018. https://scholarcommons.usf.edu/etd/7499.

Full text
Abstract:
Connectivity is a major contributor to the overall dynamics of marine populations. However, it remains challenging to describe connectivity on ecologically meaningful scales of time and space, which is a major impediment to evaluating the impacts of marine protected areas with respect to fisheries management objectives. This dissertation brings together a wide array of spatial and connectivity information in the Gulf of Mexico (GOM) with the goals of 1) understanding the spatial distribution of fish populations and source-sink dynamics and 2) evaluating whether this information can be integrated, through a modeling framework, to identify closed areas that could benefit fisheries management in the Gulf of Mexico. First, a generalized additive modelling (GAM) approach is used to describe the distribution of a large number of species groups (i.e. functional groups) across the GOM using a large fisheries-independent data set (SEAMAP) and climate-scale (decades) oceanographic conditions. Next, a numerical Lagrangian particle transport model was developed that incorporates two major connectivity processes, site-specific larval production and oceanographic transport, for an entire large marine ecosystem and over multiple years. The two components are then combined to develop larval dispersal patterns for the entire GOM and identify areas operating as larval sources and sinks. Finally, this information is integrated into an end-to-end ecosystem model to evaluate the effectiveness of closing source and sink areas for the management of reef fish fisheries. Closed-area management simulations indicated that closing reef fish source areas, as opposed to sinks, is the most efficient method of increasing total biomass and yield in the GOM. However, the impacts across individual functional groups were site-specific. Ultimately, these simulations demonstrate that the inclusion of connectivity information could improve fishery management objectives in an ecosystem context.
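The source-sink classification at the heart of the first goal can be illustrated with a toy connectivity matrix. In the Python sketch below, C[i, j] is an assumed probability that larvae released at site i settle at site j, and a site is labeled a source when it exports more larvae than it imports — a simplification of the dissertation's model-based analysis.

import numpy as np

C = np.array([[0.1, 0.5, 0.2],    # toy larval connectivity matrix
              [0.0, 0.2, 0.1],
              [0.3, 0.1, 0.4]])

export = C.sum(axis=1) - np.diag(C)    # larvae sent to other sites
imported = C.sum(axis=0) - np.diag(C)  # larvae received from other sites
for i, (e, m) in enumerate(zip(export, imported)):
    label = "source" if e > m else "sink"
    print(f"site {i}: {label} (export {e:.2f}, import {m:.2f})")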
APA, Harvard, Vancouver, ISO, and other styles
49

Al, Jlailaty Diana. "Mining Business Process Information from Emails Logs for Process Models Discovery." Thesis, Paris Sciences et Lettres (ComUE), 2019. http://www.theses.fr/2019PSLED028.

Full text
Abstract:
Exchanged information in emails' texts usually concerns complex events or business processes in which the entities exchanging emails collaborate to achieve the processes' final goals. Thus, the flow of information in the sent and received emails constitutes an essential part of such processes, i.e. the tasks or business activities. Extracting information about business processes from emails can help enhance email management for users. It can also be used to find rich answers to several analytical queries about the employees and organizations enacting these business processes. None of the previous works has fully dealt with the problem of automatically transforming email logs into event logs to eventually deduce the undocumented business processes. Towards this aim, we work in this thesis on a framework that induces business process information from emails. We introduce approaches that contribute the following: (1) discovering for each email the process topic it concerns, (2) finding the business process instance that each email belongs to, (3) extracting business process activities from emails and associating these activities with metadata describing them, (4) improving the performance of business process instance discovery and business activity discovery from emails by making use of the relation between these two problems, and finally (5) producing a preliminary estimate of the real timestamp of a business process activity instead of using the email timestamp. Using the results of these approaches, an event log is generated that can be used to deduce the business process models of an email log. The efficiency of all of the above approaches is demonstrated through several experiments on the open Enron email dataset.
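As a rough intuition for step (2), a naive baseline groups emails into candidate process instances by normalized subject line. This heuristic is an assumption made for illustration only and is far weaker than the approaches developed in the thesis.

import re
from collections import defaultdict

def normalize(subject):
    # strip reply/forward prefixes so emails of one thread share a key
    return re.sub(r"^(?:(?:re|fwd?):\s*)+", "", subject.strip().lower())

emails = ["Purchase order 42", "RE: Purchase order 42",
          "Invoice 17", "Fwd: Re: Invoice 17"]
instances = defaultdict(list)
for subject in emails:
    instances[normalize(subject)].append(subject)
print(dict(instances))  # two candidate instances, one per thread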
APA, Harvard, Vancouver, ISO, and other styles
50

Tsu, Maria E. "Dynamic analysis of an open economy and foreign exchange risk management using path-dependent options." Thesis, This resource online, 1994. http://scholar.lib.vt.edu/theses/available/etd-06112009-063829/.

Full text
APA, Harvard, Vancouver, ISO, and other styles