
Dissertations / Theses on the topic 'Estimating value'



Consult the top 50 dissertations / theses for your research on the topic 'Estimating value.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Yan, Yang. "Essays in modelling and estimating Value-at-Risk." Thesis, London School of Economics and Political Science (University of London), 2014. http://etheses.lse.ac.uk/1033/.

Abstract:
The thesis concerns semiparametric modelling and forecasting of Value-at-Risk (VaR), and the application of these models to financial data. Two general classes of semiparametric VaR models are proposed. The first defines efficient estimators of the risk measures in a semiparametric GARCH model through moment constraints, together with a quantile estimator based on inverting an empirical-likelihood-weighted distribution. The new quantile estimator is found to be uniformly more efficient than both the simple empirical quantile and a quantile estimator based on normalized residuals; at the same time, the efficiency gain in error quantile estimation hinges on the efficiency of the estimators of the variance parameters. We show that the same conclusion applies to the estimation of conditional Expected Shortfall. The second model proposes a new method to forecast one-period-ahead Value-at-Risk in general ARCH(1) models with possibly heavy-tailed errors. The proposed method is based on least-squares estimation of the log-transformed model and imposes only weak moment conditions on the errors. The asymptotic distribution also accounts for the parameter uncertainty in volatility estimation. We test our models against some conventional VaR forecasting methods, and the results demonstrate that our models are among the best at forecasting VaR.
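The residual-quantile idea behind such filtered VaR estimators can be sketched in a few lines. This is a generic illustration on toy data, not the thesis's semiparametric estimator: a fixed rolling-window volatility stands in for a fitted GARCH model, and a plain empirical quantile replaces the empirical-likelihood-weighted one.

```python
import numpy as np

def one_step_var(returns, alpha=0.01, window=50):
    """One-step-ahead VaR from the empirical quantile of
    volatility-standardized residuals (illustrative sketch only)."""
    r = np.asarray(returns, dtype=float)
    # Rolling-window standard deviation as a crude stand-in for a GARCH fit.
    sigma = np.array([r[i - window:i].std() for i in range(window, len(r))])
    resid = r[window:] / sigma          # standardized residuals
    q = np.quantile(resid, alpha)       # empirical error quantile (negative)
    sigma_next = r[-window:].std()      # naive forecast of next-period volatility
    return -sigma_next * q              # VaR reported as a positive loss

rng = np.random.default_rng(0)
var_1pct = one_step_var(rng.standard_normal(1000) * 0.01, alpha=0.01)
```

The efficiency results in the abstract concern precisely the two plug-ins above: how the residual quantile `q` and the volatility estimate are computed.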
2

Pagliari, Romano Italo. "Estimating the monetary value of airport runway departure slots." Thesis, Cranfield University, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.245446.

3

Norrman, Henrik. "Estimating p-values for outlier detection." Thesis, Högskolan i Halmstad, Sektionen för Informationsvetenskap, Data– och Elektroteknik (IDE), 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-25662.

Abstract:
Outlier detection is useful in a vast number of domains, wherever there is data and a need for analysis. The research area related to outlier detection is large and the number of available approaches is constantly growing. Most approaches produce a binary result: outlier or not. This work investigates approaches that detect outliers by producing a p-value estimate. Such approaches are interesting because their results can easily be compared against each other, followed over time, or used with a variable threshold. Four approaches are subjected to a variety of tests to measure their suitability when the data is distributed in a number of ways. The first approach, the R2S, was developed at Halmstad University and is based on finding the mid-point of the data. The second approach is based on one-class support vector machines (OCSVM). The third and fourth approaches are both based on conformal anomaly detection (CAD), but use different nonconformity measures (NCM): the Mahalanobis distance to the mean and a variation of k-NN, respectively. The R2S and the CAD Mahalanobis approaches are both good at estimating p-values from data generated by unimodal and symmetrical distributions. The CAD k-NN approach is good at estimating p-values when the data is generated by a bimodal or extremely asymmetric distribution. The OCSVM does not excel in any scenario, but produces good average results in most of the tests. The approaches are also applied to real data, where they all produce comparable results.
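The CAD construction with a k-NN nonconformity measure can be sketched compactly. This is a generic illustration, not the thesis's exact implementation; the NCM here is the sum of distances to the k nearest neighbours, and the p-value is the fraction of points at least as nonconforming as the test point.

```python
import numpy as np

def cad_p_value(train, x, k=3):
    """Conformal anomaly detection p-value with a k-NN
    nonconformity measure (sum of distances to k nearest neighbours)."""
    data = np.vstack([train, x])

    def ncm(i):
        d = np.linalg.norm(data - data[i], axis=1)
        d[i] = np.inf                    # exclude the point itself
        return np.sort(d)[:k].sum()

    scores = np.array([ncm(i) for i in range(len(data))])
    # p-value: fraction of points whose score is >= the test point's score
    return np.mean(scores >= scores[-1])

rng = np.random.default_rng(1)
train = rng.standard_normal((100, 2))
p_typical = cad_p_value(train, np.zeros((1, 2)))        # point in the bulk
p_outlier = cad_p_value(train, np.array([[10.0, 10.0]]))  # far-away point
```

A small p-value for the far-away point and a large one for the central point is exactly the graded output the abstract contrasts with binary detectors.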
4

Vallaud, Thierry. "Estimating potential customer value using customer data : using a classification technique to determine customer value /." Abstract and full text available, 2009. http://149.152.10.1/record=b3077978~S16.

Abstract:
Thesis (M.S.) -- Central Connecticut State University, 2009.
Thesis advisor: Daniel Larose. "... in partial fulfillment of the requirements for the degree of Master of Science in Data Mining." Includes bibliographical references (leaves 37-39). Also available via the World Wide Web.
5

Klacar, Dorde. "Estimating Expected Exposure for the Credit Value Adjustment risk measure." Thesis, Umeå universitet, Nationalekonomi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-73104.

6

Kwon, Youngho. "Estimating the value of financial transmission rights for transmission expansion." Thesis, University of Manchester, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.497600.

Abstract:
The current situation of insufficient investment in transmission tends to weaken the advantages of a deregulated power industry. These shortages mainly arise from the risk of not recovering the cost of investing in new lines. Locational Marginal Prices (LMPs) and Financial Transmission Rights (FTRs) are key elements in reducing these investment risks. Market participants are able to hedge against price fluctuations caused by transmission congestion through the purchase or sale of FTRs. The value of FTRs, which is tied to the difference in prices between locations in the network, would indicate where transmission expansion should be implemented.
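The hedge described above reduces to a simple payoff identity: a point-to-point FTR pays its holder the sink-minus-source LMP difference times the contracted megawatts, offsetting the congestion charge the holder pays in the energy market. A toy calculation with hypothetical prices:

```python
def ftr_payoff(lmp_source, lmp_sink, mw):
    """Hourly payoff of a point-to-point FTR: the holder collects the
    congestion rent between the source and sink buses."""
    return (lmp_sink - lmp_source) * mw

# e.g. a 100 MW right from a 30 $/MWh bus to a congested 45 $/MWh bus
payoff = ftr_payoff(30.0, 45.0, 100.0)
```

A persistently large payoff between two buses signals valuable (congested) transmission capacity, which is why FTR prices can guide expansion decisions.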
7

Hu, Youxin. "Auction Behavior: Essays on Externalities and Estimating Value Distributions from EBay." The Ohio State University, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=osu1250677684.

8

Boateng, F. (Forster). "Estimating value at risk using extreme value theory: is the two-dimensional inhomogeneous Poisson model better than the others." Master's thesis, University of Oulu, 2015. http://urn.fi/URN:NBN:fi:oulu-201510152084.

Abstract:
This research presents an application of extreme value theory to estimating the value at risk of a market position, in particular of the OMX Hex Index. There are many approaches to computing value at risk alongside the extreme value method, and a fundamental problem risk managers face is choosing the value at risk estimator that will best predict the risk. This study proposes a method of estimating value at risk using the two-dimensional inhomogeneous Poisson model, an extreme value theory method based on the peaks-over-threshold (POT) approach. The method accommodates time-varying parameters through explanatory variables. The study also shows how well the theoretical model fits real financial data. The data used are the daily log returns of the OMX Hex Index from 1990 to 2014, from which we show empirically that the index obeys a Fréchet distribution. The explanatory variables are GARCH volatilities, an annual trend, and quarterly dummies, all available at time t-1. With the help of the fitted models, we adopt the two-dimensional inhomogeneous Poisson approach over the two-dimensional homogeneous Poisson model and other classical methods, and find that it yields better value at risk predictions.
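A minimal peaks-over-threshold sketch conveys the core of the approach. This is an unconditional, homogeneous illustration on toy data using moment estimators for the generalized Pareto parameters; the thesis's two-dimensional inhomogeneous model with GARCH and trend covariates requires maximum likelihood and is considerably richer.

```python
import numpy as np

def pot_var(losses, q=0.99, u_quantile=0.90):
    """Value at risk from a peaks-over-threshold fit of the generalized
    Pareto distribution to threshold exceedances (moment estimators)."""
    losses = np.asarray(losses, dtype=float)
    u = np.quantile(losses, u_quantile)      # threshold
    y = losses[losses > u] - u               # exceedances over u
    m, s2 = y.mean(), y.var()
    xi = 0.5 * (1.0 - m * m / s2)            # GPD shape (moment estimate)
    beta = 0.5 * m * (m * m / s2 + 1.0)      # GPD scale (moment estimate)
    n, n_u = len(losses), len(y)
    # Standard POT quantile formula for the q-level VaR
    return u + beta / xi * ((n / n_u * (1.0 - q)) ** (-xi) - 1.0)

rng = np.random.default_rng(2)
losses = rng.pareto(3.0, 5000)               # heavy-tailed toy losses
var_99 = pot_var(losses, q=0.99)
```

The inhomogeneous version replaces the constant GPD parameters with functions of the time-t-1 covariates, which is what lets the VaR forecast react to volatility regimes.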
9

Li, Qiuyuan Jimmy. "The value of field experiments in estimating demand elasticities and maximizing profit." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/91105.

Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2014.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 99-101).
In many situations, the capabilities of firms are better suited to conducting and analyzing field experiments than to analyzing sophisticated demand models. However, the practical value of using field experiments to optimize marketing decisions remains relatively unstudied. We investigate category pricing decisions that require estimating a large matrix of cross-product demand elasticities and ask: how many experiments are required as the number of products in the category grows? Our main result demonstrates that if the categories have a favorable structure, then we can learn faster and reduce the number of experiments that are required: the number of experiments required may grow just logarithmically with the number of products. These findings potentially have important implications for the application of field experiments. Firms may be able to obtain meaningful estimates using a practically feasible number of experiments, even in categories with a large number of products. We also provide a relatively simple mechanism that firms can use to evaluate whether a category has a structure that makes it feasible to use field experiments to set prices. We illustrate how to accomplish this using either a sample of historical data or a pilot set of experiments. Historical data often suffer from the problem of endogeneity bias, but we show that our estimation method is robust to the presence of endogeneity. Besides estimating demand elasticities, firms are also interested in using these elasticities to choose an optimal set of prices in order to maximize profits. We formulate the profit maximization problem and demonstrate that substantial profit gains can also be achieved using a relatively small number of experiments. In addition, we discuss how to evaluate whether field experiments can help optimize other marketing decisions, such as selecting which products to advertise or promote. 
We adapt our models and methodologies to this setting and show that the main result that relatively few experiments are needed to estimate elasticities and to increase profits continues to hold.
by Jimmy Qiuyuan Li.
Ph. D.
10

Zhang, Xiaoyi. "Estimating Peaking Factors with Poisson Rectangular Pulse Model and Extreme Value Theory." University of Cincinnati / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1123875661.

11

Tilling, Carl. "Testing a new method for estimating the monetary value of a QALY." Thesis, University of Sheffield, 2012. http://etheses.whiterose.ac.uk/14603/.

Abstract:
Objectives: The objective of this thesis is to develop and test a new method, based on Time Trade-Off (TTO), for estimating the monetary value of a QALY (MVQ) informed by public preferences. Methods: Two new questions are developed to estimate an MVQ, which ask respondents to trade off length of life either to increase their income or to avoid a decrease in their income. These questions are initially tested through an online survey of 321 members of the Dutch general public. The questions are further tested through a small-scale pilot study, followed by a UK-based interview study with 100 members of the general public. In addition, two further questions are developed and tested in the UK study, which are more closely aligned with the concepts of Willingness to Pay and Willingness to Accept. Results: In the Dutch online survey, a large number of respondents were not prepared to trade any time to increase their income (or avoid a decrease). Furthermore, some respondents traded too much time, which led to negative MVQ estimates. The prevalence of these responses was reduced in the UK-based interview study, but they remained problematic. Despite this, the questions did appear feasible for respondents to complete and were sensitive to scale, particularly in the UK study. Conclusion: The evidence tentatively suggests that at least some of the respondents stating an infinite preference for length of life over income were giving a true statement of preference. The questions could potentially be improved by decreasing the total value of what is being given up, or by increasing the total value of what is being gained; the latter could be achieved by extending the time horizon in the exercise.
12

Censullo, Alex. "Estimating the Relative Value of Individual Strokes Gained on the PGA Tour." Scholarship @ Claremont, 2017. http://scholarship.claremont.edu/cmc_theses/1695.

Abstract:
This study compares the predictive ability of newly introduced strokes gained measures on PGA Tour earnings performance with that of more conventional golf statistics. It is found that the strokes gained measures explain slightly more of the variability in the distribution of earnings. Strokes gained on the approach shot are determined to be the most valuable relative to the other strokes gained metrics.
13

Backhouse, Lötman Leo. "Estimating return levels for weather events with GAMLSS and extreme value distributions." Thesis, Uppsala universitet, Tillämpad matematik och statistik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-453790.

14

Leonard, Jerry L. Jr. "Estimating the value of hunting licenses: a case study of Colorado elk permits." Thesis, Montana State University, 1998. http://etd.lib.montana.edu/etd/1998/leonard/LeonardJ1998.pdf.

Abstract:
The purpose of this thesis was to develop a method of assessing the marginal value of limited elk hunting licenses in Colorado. The measure of marginal value was constructed from hunters' opportunity costs of securing these licenses, and it does not rely on travel costs or surveys. The statistical inquiry used both ordinary least squares and the ordered probit procedure to measure hunters' marginal willingness to pay for various hunting characteristics. The results imply that higher demand for licenses is closely linked to unmeasured quality variables, and that efficiency and revenue to the Colorado Division of Wildlife could be increased by reallocating hunting licenses from marginally lower-valued characteristics to marginally higher-valued ones, provided reallocation is not costly.
15

Le, Thi Van Trinh. "Estimating the monetary value of the stock of human capital for New Zealand." Thesis, University of Canterbury. Economics, 2006. http://hdl.handle.net/10092/870.

Abstract:
Human capital is increasingly believed to play an indispensable role in the growth process; however, adequately measuring its stock remains controversial. Because the estimated impact that human capital has on economic growth is sensitive to the measure of human capital, accurate and consistent measures are desirable. While many measures have been developed, most rely on some proxy of educational experience and are thus plagued with limitations. In this study, I adopt a lifetime earnings approach to estimate the monetary value of the human capital stock for New Zealand. I find that the country's working human capital increased by half between 1981 and 2001, mainly due to rising employment level. This stock was well over double that of physical capital. I also model human capital as a latent variable using a Partial Least Squares approach. Exploratory analyses on a number of countries show that age, gender and education combined can capture 65-97 percent of the explained variation in human capital. JEL Classifications: J24, O47.
16

Routh, Pallav. "A Framework for Estimating Customer Worth Under Competing Risks." Bowling Green State University / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1525688483331787.

17

Mead, Clay. "Estimating the value of carcass DNA and performance EPD'S for Gelbvieh bulls at auction." Manhattan, Kan. : Kansas State University, 2008. http://hdl.handle.net/2097/836.

18

Klocek, Christopher A. "Estimating the economic value of Canaan Valley National Wildlife Refuge a contingent valuation approach /." Morgantown, W. Va. : [West Virginia University Libraries], 2004. https://etd.wvu.edu/etd/controller.jsp?moduleName=documentdata&jsp%5FetdId=3585.

Abstract:
Thesis (Ph. D.)--West Virginia University, 2004.
Title from document title page. Document formatted into pages; contains vii, 125 p. : ill., map. Includes abstract. Includes bibliographical references (p. 90-96).
19

Mead, Clay. "Estimating the value of carcass DNA and performance EPD’S for Gelbvieh bulls at auction." Thesis, Kansas State University, 2008. http://hdl.handle.net/2097/836.

Abstract:
Master of Agribusiness
Department of Agricultural Economics
Ted C. Schroeder
For the industry to produce a higher-performing and consistently high-quality product, performance information needs to be collected and made available so that producers can make more informed beef cattle production management decisions. In recent history, the cattle industry has taken on the complex job of maintaining and recording performance records through programs and efforts such as breed association databases and herd health databases. The constant evaluation of performance and genetic records has supplied producers with performance, maternal, and carcass statistics such as Expected Progeny Differences (EPDs). Additionally, developing technology is helping the industry through selection and decision tools such as carcass DNA marker identification. This study evaluates how the selection tools of EPDs and DNA affect the value of Gelbvieh / Balancer bulls at auction. Data for this study come from various Gelbvieh / Balancer bull sales throughout Nebraska in the spring of 2008. The variables evaluated were data and information provided to potential buyers before the auctions, allowing observation of how this information affected the price each buyer paid. They included the Igenity Profile carcass DNA values for ribeye area, marbling, and tenderness, as well as performance EPDs for calving ease direct, birth weight, weaning weight, yearling weight, ribeye area, and marbling. The only actual measurement observed was scrotal circumference. The hedonic models developed for this study suggest that the bull data provided to potential buyers before sale are not the only significant determinants of price. Statistical measurements and developing technologies are having a profound and positive effect on production and serve as selection tools; however, they are not the only variables affecting the value of a sire at auction.
Other variables affecting auction value can include evaluation of phenotype, pedigree, and buyer benefits. The data and variables evaluated in this study should still be used as valuable additions to other selection tools and observations when selecting a future beef sire.
20

Hosseini, Seyed Ali, Seydeh Zohreh Mirdeilami, Fatemeh Ghilishli, and Mohammad Pessarakli. "Estimating nutritive values of Jasminum fruticans L. plant species in northern rangelands of Golestan province." Taylor & Francis, 2017. http://hdl.handle.net/10150/626137.

Abstract:
Information on the nutritive values of different rangeland plants, across plant parts and habitats, is important in rangeland management. The effects of different plant parts (stems and leaves) of the Jasminum fruticans L. plant species on forage quality indicators were investigated in two regions in 2015. Plant samples were collected from Sharlogh Rangelands and cultivated at the Research and Education Center of Agricultural and Natural Resources (RECANR) in Iran, using a completely randomized design with three replications per plant sample. The leaf and stem samples were oven-dried at 80°C for 24 hours, then analyzed for crude protein (CP), crude fiber (CF), dry matter (DM), ether extract (EE), crude ash (CA), metabolizable energy (ME), and mineral elements, including calcium (Ca) and phosphorus (P). Results showed that differences in the forage quality indicators between regions were statistically significant, except for CA. Nutritive values also differed significantly (P<0.01) between plant parts, except for the ME parameter. The results further indicated that J. fruticans, owing to its high tissue CP content, is a valuable source of forage for livestock.
21

Ekengren, Jens. "Large and rare: An extreme values approach to estimating the distribution of large defects in high-performance steels." Doctoral thesis, Karlstads universitet, Avdelningen för maskin- och materialteknik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-8226.

Abstract:
The presence of different types of defects is an important reality for manufacturers and users of engineering materials. Generally, the defects are either considered to be the unwanted products of impurities in the raw materials or to have been introduced during the manufacturing process. In high-quality steel materials, such as tool steel, the defects are usually non-metallic inclusions such as oxides or sulfides. Traditional methods for purity control during standard manufacturing practice are usually based on the light optical microscopy scanning of polished surfaces and some statistical evaluation of the results. Yet, as the steel manufacturing process has improved, large defects have become increasingly rare. A major disadvantage of the traditional quality control methods is that the accuracy decreases proportionally to the increased rarity of the largest defects unless large areas are examined. However, the use of very high cycle fatigue to 109 cycles has been shown to be a powerful method to locate the largest defects in steel samples. The distribution of the located defects may then be modelled using extreme value statistics. This work presents new methods for determining the volume distribution of large defects in high-quality steels, based on ultrasonic fatigue and the Generalized Extreme Value (GEV) distribution. The methods have been developed and verified by extensive experimental testing, including over 400 fatigue test specimens. Further, a method for reducing the distributions into one single ranking variable has been proposed, as well as a way to estimate an ideal endurance strength at different life lengths using the observed defects and endurance limits. The methods can not only be used to discriminate between different materials made by different process routes, but also to differentiate between different batches of the same material. 
It is also shown that all modes of the GEV are to be found in different steel materials, thereby challenging a common assumption that the Gumbel distribution, a special case of the GEV, is the appropriate distribution choice when determining the distribution of defects. The new methods have been compared to traditional quality control methods used in common practice (surface scanning using LOM/SEM and ultrasound C-scan), and suggest a greater number of large defects present in the steel than could otherwise be detected.
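As a baseline illustration of this kind of extreme-value analysis, here is a moment fit of the Gumbel distribution, the xi = 0 special case of the GEV, to hypothetical block maxima (e.g. the largest defect found per inspected specimen volume). Note the thesis's point is precisely that the full three-parameter GEV, not just Gumbel, may be required; this sketch shows only the simplest case.

```python
import math
import numpy as np

def gumbel_fit(maxima):
    """Moment fit of the Gumbel distribution (xi = 0 case of the GEV)
    to a sample of block maxima."""
    m = np.asarray(maxima, dtype=float)
    beta = m.std(ddof=1) * math.sqrt(6.0) / math.pi   # scale from the variance
    mu = m.mean() - 0.5772156649 * beta               # location (Euler-Mascheroni)
    return mu, beta

def return_level(mu, beta, T):
    """Size exceeded on average once in T blocks under the fitted Gumbel."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

rng = np.random.default_rng(3)
# Toy data: largest of 200 exponential "defect sizes" per block
maxima = rng.exponential(10.0, (500, 200)).max(axis=1)
mu, beta = gumbel_fit(maxima)
x_1000 = return_level(mu, beta, 1000)   # size exceeded in ~1 of 1000 blocks
```

Replacing the Gumbel with the full GEV adds a shape parameter whose sign distinguishes the three modes the abstract says are all observed in real steels.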
22

Ling, Daphne. "Alternative approaches to tuberculosis diagnostics research: methods for estimating the incremental value of new tests." Thesis, McGill University, 2013. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=114458.

Abstract:
Tuberculosis (TB) remains a global health problem. Diagnosis is the critical first step for control of TB, and several promising tests have been developed, including the interferon-gamma release assay (IGRA). Unfortunately, TB diagnostics research is still focused on measures of test accuracy (i.e. sensitivity and specificity). There are limited data on the incremental value of new tests over and above conventional tests and their impact on clinical management. Test accuracy data, while necessary, are only surrogates for patient-important outcomes and cannot provide high quality evidence for policy-making. This manuscript-based PhD thesis focused on alternative approaches to evaluate the incremental value of new tests, with and without a gold standard. In the first manuscript, we performed a secondary data analysis of a study on 528 children evaluated for active TB in Cape Town, South Africa. Using TB culture as the gold standard, we measured the incremental value of the IGRA beyond patient demographics, clinical signs and conventional TB tests using the area under the receiver operating characteristic curve (AUC) as well as two newly-described measures based on risk probability: net reclassification improvement (NRI) and integrated discrimination improvement (IDI). All analyses showed that the IGRA did not have added value beyond clinical data and conventional tests for the diagnosis of active TB in hospitalized, smear-negative children. The use of multivariable analysis provided a useful approach to evaluate the incremental value of this new test as part of the diagnostic algorithm, rather than in isolation. In the second manuscript, we developed a methodology for estimating the incremental value of a test when no gold standard exists and true disease status is unknown, such as in the case of latent TB infection (LTBI).
Using a Bayesian framework for latent class model estimation, we validated our proposed methods in a series of simulations and then applied these methods to calculate the AUC, NRI and IDI to measure the added value of the IGRA over the tuberculin skin test (TST) for diagnosis of LTBI in different settings. We showed that the magnitude of the AUC and IDI behaved as expected when we changed the true accuracy of the new test using simulated data. Furthermore, we showed that the added value of the new test decreased when conditional dependence between the new and standard tests was taken into account. Finally, the third manuscript was a primary data collection study at the Montreal Children's Hospital (MCH), which recently began implementing the IGRA for children with specific clinical indications. The aim of this study was to assess the impact of the IGRA on clinical management by asking pediatric respirologists to document how the IGRA result changed, if at all, their initial diagnostic and treatment decisions based on the TST and other available data in clinically-relevant subgroups. Our study of 399 children showed that pediatric respirologists used negative IGRA results to withhold preventive therapy in most low-risk children who were found through targeted screening programs and referred for a positive TST result. In contrast, in almost all TST-positive children who were evaluated as TB contacts, negative IGRA results did not change clinical management. While new technologies in the diagnostics pipeline offer great promise for TB control, limited resources mandate that we evaluate them in clinically-meaningful ways before their implementation into routine practice. This PhD thesis addressed the need for incremental value and clinical impact studies and offers insights into the comparative benefits and limitations of the various methods used.
23

Svensson, Mikael. "What is a Life Worth? : methodological Issues in Estimating the Value of a Statistical Life /." Örebro : Institutionen för ekonomi, statistik och informatik Department of Business, Economics, Statistics and Informatics, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:oru:diva-1523.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Hoffman, Joel Christopher. "Natal-river to estuary migration of American shad: estimating the value of essential rearing habitat." W&M ScholarWorks, 2006. https://scholarworks.wm.edu/etd/1539791562.

Full text
Abstract:
The objective of this study was to identify important river and estuary habitats of young American shad by estimating their value to fish production. American shad populations across the Atlantic coast have been in decline since the 1800s due, in part, to restricted access to habitat and habitat loss. The study demonstrates that production and year-class strength of Mattaponi River American shad are influenced by allochthonous subsidies from riparian and terrestrial ecosystems, thus habitat quality is related to the watershed's health, and that Chesapeake Bay is an important overwintering habitat for juveniles. Specifically, I characterized production dynamics and biogeochemical processes in the Mattaponi River, a tributary of the York River and the most productive American shad nursery in Virginia's portion of Chesapeake Bay; quantified the contribution of autochthonous and allochthonous organic matter (OM) to zooplankton, macroinvertebrates, and young American shad in the Mattaponi River; identified the trophic pathways that support American shad production within river and estuary habitats during ontogeny; and determined the main habitats within the York River system, from the larval stage to their ocean migration. Mattaponi River production dynamics were strongly influenced by river discharge; during periods of high discharge, primary production was suppressed and greater than 60% of zooplankton, macroinvertebrates and larval American shad production was supported by allochthonous OM. Further, OM concentration, plankton density, and juvenile American shad indices were elevated, demonstrating that allochthonous OM subsidizes the metazoan food web and fish production. Spatial segregation of juveniles rearing in the freshwater nursery zone was identified through a novel application of a stable isotope turnover model; juveniles were shown to reside in habitats of 5-10 river km for a month or longer. 
These fish emigrate from the nursery zone in November and December, residing and feeding in the York River estuary and Chesapeake Bay before migrating to the ocean in February through April. Variable emigration strategies were observed; most American shad likely emigrated at 2-5 g and spent weeks to months in the estuary, however a few emigrated at habitat.
APA, Harvard, Vancouver, ISO, and other styles
25

Stephens, Daren. "Hedonic bull pricing models: estimating the value of traits of bulls sold following performance testing." Thesis, Kansas State University, 2015. http://hdl.handle.net/2097/20575.

Full text
Abstract:
Master of Agribusiness
Department of Agricultural Economics
Ted Schroeder
Selection of a herd sire has always been of paramount importance given the initial financial investment and the sire's contribution to the genetic make-up of a beef herd. Data were collected from the nation's longest consecutively run bull test, conducted at the University Farm of Oklahoma Panhandle State University (OPSU). The bull test and bull sale data utilized were collected from 2008-2013. Performance data were collected over a 112-day test period, with data collection occurring at 28-day intervals. The top seventy bulls from each year's test were selected based upon a performance index of ½ ADG and ½ weight per day of age (WDA) and a semen quality and motility score of excellent, and were sold at auction. Angus bulls were the focus of the study as they represented the vast majority of individuals sold. Three hedonic pricing models were created to determine which attributes buyers at the OPSU bull test sale were placing emphasis on. The initial hedonic model contained production data that included BW, ADG, WDA, Julian age, final test weight, ultrasound data, and a dummy variable for sale year. The second model utilized all production data and added genetic variables in the form of production EPDs (Calving Ease Direct (CED), BW, Weaning Weight and Yearling Weight) and maternal EPDs (Calving Ease Maternal, Maternal Milk). The third model included the variables from the first and second models with the addition of carcass EPDs (Marbling, Ribeye Area (REA) and FAT). Year was significant in all three models; however, its effect on price weakened as more variables were included. In model one, the production factors that were significant were ADG (P<0.01), BW (P<0.01) and final test weight (P<0.01). In the second model, ADG, BW and final test weight retained their significance at the P<0.01 level. The only production EPD that was significant (P<0.05) was CED. In the third model, year, ADG and BW were still significant (P<0.01).
Final test weight (P = 0.70) and CED (P = 0.132) lost their significance. The carcass EPD ribeye area had a P value of 0.057. Producers who are placing bulls on test can use this information to assist with their selection. While single-trait selection can be very detrimental, it is worth noting that ADG was significant across all models. The study indicates that performance and growth are of utmost importance to buyers, followed by birth weight considerations.
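A hedonic pricing model of this kind is linear in the traits, so each trait's implicit price is its OLS coefficient. The sketch below uses hypothetical traits and implicit prices (not the study's estimates) purely to show the mechanics:

```python
import numpy as np

# hypothetical traits: average daily gain (ADG) and birth weight (BW);
# sale price generated from assumed implicit prices with no noise
rng = np.random.default_rng(0)
n = 50
adg = rng.uniform(2.5, 4.5, n)      # lb/day
bw = rng.uniform(60, 95, n)         # lb
price = 1500 + 800 * adg - 12 * bw  # assumed implicit prices

X = np.column_stack([np.ones(n), adg, bw])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
print(beta)  # approximately [1500, 800, -12]
```

With real sale data the coefficients would be estimated with error, and dummy variables for sale year would be appended as extra columns of X.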
APA, Harvard, Vancouver, ISO, and other styles
26

Wahlström, Rikard. "Estimating expected shortfall using an unconditional peaks-over-threshold method under an extreme value approach." Thesis, Uppsala universitet, Statistiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-445122.

Full text
Abstract:
Value-at-Risk (VaR) has long been the standard risk measure in financial risk management. However, VaR suffers from critical shortcomings as a risk measure when it comes to quantifying the most severe risks, which was made especially apparent during the financial crisis of 2007–2008. An alternative risk measure addressing the shortcomings of VaR, known as expected shortfall (ES), is gaining popularity and is set to replace VaR as the standard measure of financial risk. This thesis introduces how extreme value theory can be applied to estimating ES using an unconditional peaks-over-threshold method, including an introduction to the theoretical foundations of the method. The method is then applied to five different assets, chosen to serve as proxies for the broader asset classes of equity, fixed income, currencies, commodities and cryptocurrencies. In terms of ES, we find that cryptocurrencies are the riskiest asset class and fixed income the safest.
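A minimal unconditional peaks-over-threshold sketch, assuming SciPy's generalized Pareto fit and the standard textbook tail formulas for VaR and ES (synthetic heavy-tailed losses, not the thesis's data):

```python
import numpy as np
from scipy.stats import genpareto

def pot_var_es(losses, u, p):
    """Unconditional POT estimates of VaR_p and ES_p for losses: fit a GPD
    to exceedances over threshold u, then apply the tail formulas."""
    exc = losses[losses > u] - u
    xi, _, beta = genpareto.fit(exc, floc=0.0)   # shape, loc (fixed at 0), scale
    n, n_u = len(losses), len(exc)
    var = u + beta / xi * ((n / n_u * (1 - p)) ** (-xi) - 1)
    es = var / (1 - xi) + (beta - xi * u) / (1 - xi)  # valid for xi < 1
    return var, es

rng = np.random.default_rng(1)
losses = rng.pareto(3.0, 5000)                   # heavy-tailed toy losses
var99, es99 = pot_var_es(losses, u=np.quantile(losses, 0.9), p=0.99)
print(var99, es99)                               # ES exceeds VaR at the same level
```

The threshold choice (here the 90th percentile) is the key tuning decision in practice; mean-excess plots and parameter-stability plots are the usual diagnostics.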
APA, Harvard, Vancouver, ISO, and other styles
27

Alshatshati, Salahaldin Faraj. "Estimating Envelope Thermal Characteristics from Single Point in Time Thermal Images." University of Dayton / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1512648630005333.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Pečeliūnas, Valdas. "The value of plasma cell immunophenotypic analysis estimating response to treatment and risk of multiple myeloma." Doctoral thesis, Lithuanian Academic Libraries Network (LABT), 2011. http://vddb.laba.lt/obj/LT-eLABa-0001:E.02~2011~D_20111102_111523-80198.

Full text
Abstract:
The investigations presented in this dissertation were initiated with the intention to evaluate the prognostic value of plasma cell immunophenotypic analysis in multiple myeloma patients. We tested the hypothesis that the kinetics of peripheral blood circulating plasma cells in response to the first chemotherapy cycle could identify patients refractory to the given treatment. We employed a novel methodology for plasma cell immunophenotyping: cells were stained in two tubes with the antibody combinations CD56/CD138/CD45/CD19/CD38/CD20 and cLambda/cKappa/CD138/CD19/CD38/CD56. We found that ~30% of all plasma cells in the bone marrow of healthy donors are immunophenotypically aberrant by CD56 and/or CD19 marker expression. We optimized the immunophenotypic differentiation between malignant and normal plasma cells. Non-reduction of malignant circulating plasma cells in response to the first chemotherapy cycle predicted early progression with a sensitivity and specificity of 91.7% and 93.3%, respectively. Time to progression and overall survival were significantly shorter in these patients compared to patients with undetectable or reduced malignant circulating plasma cells. We also evaluated the clinical value of normal plasma cell subpopulation detection in the peripheral blood and bone marrow of multiple myeloma patients. In summary, we demonstrated that immunophenotyping of plasma cells using multiparameter flow cytometry provides important prognostic information. The major finding was that the... [to full text]
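The reported sensitivity and specificity are simple ratios of confusion-matrix counts. The counts below are assumptions chosen only to reproduce the quoted percentages (e.g. 11 of 12 progressors flagged, 14 of 15 non-progressors cleared); they are not taken from the study:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical counts consistent with 91.7% sensitivity and 93.3% specificity
sens, spec = sens_spec(tp=11, fn=1, tn=14, fp=1)
print(round(sens * 100, 1), round(spec * 100, 1))  # 91.7 93.3
```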
APA, Harvard, Vancouver, ISO, and other styles
29

Gwebu, Nomonde Nomfundo. "Estimating the value and economic contribution of agricultural production in the former homelands of South Africa." Diss., University of Pretoria, 2017. http://hdl.handle.net/2263/60810.

Full text
Abstract:
The value and economic contribution of agricultural production in the former homelands of South Africa has become increasingly important to measure because it is critical to our understanding of the role agriculture plays in household food security in these regions and of the contribution this section of the agricultural sector makes to the economy. Yet, two decades into the democratic South Africa, we still fail to consistently provide accurate estimates of this sector's value. The fundamental premise of this dissertation is to estimate the value and economic contribution of agricultural production in the former homelands of South Africa so that the subsistence agricultural sector can be well understood in terms of its characteristics and its value. The main focus of this study is therefore placed on black subsistence farmers in the former homelands of South Africa, mostly because these areas are under great pressure to maintain food self-sufficiency. The main hypothesis of this study is that the value and economic contribution of agricultural production in the former homelands is significant when compared with the contribution of the commercial agricultural sector in South Africa. In order to test this hypothesis, three different data sets were analysed, because none of them individually provides exhaustive information for the purposes of this study. These data sets include primary data, such as the Agricultural Research Council (ARC) sample survey data from the OR Tambo District municipality conducted in 2015. The secondary data used in this study include the ARC sample survey 2013, the Income and Expenditure Survey (IES) 2010/2011 conducted by Statistics South Africa (Stats SA), and the National Income Dynamics Study (NIDS) waves 1 to 3 conducted by the Southern African Labour Development Research Unit (SALDRU). The Gross Margin (GM) analysis approach was used in this study to estimate the economic contribution of agricultural production.
In interrogating the NIDS waves and IES 2010/2011 data sets, two types of variables which can be used to estimate the economic contribution of agricultural production are provided. The first type comprises the self-reported values of agricultural goods consumed from home production, which are found in both the NIDS and IES datasets. The second type comprises the quantities of agricultural goods harvested and the value of sales from home production, found in the NIDS datasets. The most readily usable variables for estimating the economic contribution of agricultural production are the self-reported values of agricultural goods consumed from home production. Using the NIDS data, the estimated value of consumption from home production in current prices was R207 million based on wave 1 data, R80.5 million based on wave 2 data, and R529 million based on wave 3 data. Using the IES data, the estimated value of production for home consumption in current prices was R359 million in 2010/2011. In investigating the 2010/2011 figures estimated in this study, several issues arise with regard to the number of agriculturally active households and the value of agricultural goods consumed from home production. The most important issue is that self-reported values of agricultural goods consumed by households introduce an added source of inaccuracy into the measurement of output. According to the UNSD (2005), households can inaccurately assign values to self-produced goods because of a lack of information about local market prices. In order to avoid this source of inaccuracy in the measurement of the agricultural sector's contribution, estimates of the economic contribution of agricultural production were pursued based on local market prices. It was determined that only the NIDS and the ARC data sets have variables to directly estimate the economic contribution of agricultural production based on the GM approach.
The variables include quantities of crop and livestock goods harvested and the value of sales from own production. Using the ARC's data, it was estimated that the annual GM per household in 2012 prices was R1 985.32 based on the 2013 data and R8 892.85 based on the 2015 data. Using the NIDS waves 1 and 3 data, it was estimated that the annual GM per household in 2012 prices was R1 017.85 based on wave 1 data and R3 535.42 based on wave 3 data. The NIDS wave 2 data set does not provide farm input cost and livestock production variables; as a result, it was only possible to estimate the annual Gross Farm Income (GFI) per household, which was R1 973 for 2010/2011 in 2012 prices. The latter results are broadly consistent with the ARC 2013 and 2015 figures, although not directly comparable. The Agricultural Research Council-Department of Rural Development and Land Reform (ARC-DRDLR) project introduced in the OR Tambo District municipality has played a key role in changing the mind-set of farmers. Therefore, programmes such as the ARC-DRDLR project should be introduced with more vigour. Such programmes should, however, not undermine subsistence households' consumption-type activities.
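The GM approach used here reduces to gross farm income, valuing harvested quantities at local market prices, minus variable input costs. A sketch with hypothetical quantities and prices (not the study's survey data):

```python
def gross_margin(crop_qty, price, variable_costs):
    """Gross farm income (quantities valued at local market prices)
    minus variable input costs = gross margin per household."""
    gfi = sum(q * p for q, p in zip(crop_qty, price))
    return gfi, gfi - variable_costs

# hypothetical household: 300 kg maize at R4/kg, 50 kg beans at R12/kg
gfi, gm = gross_margin([300, 50], [4.0, 12.0], variable_costs=700.0)
print(gfi, gm)  # 1800.0 1100.0
```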
Dissertation (MSc (Agric))--University of Pretoria, 2017.
Agricultural Economics, Extension and Rural Development
MSc (Agric)
Unrestricted
APA, Harvard, Vancouver, ISO, and other styles
30

Garcia-Yi, Jaqueline. "Estimating the Economic Recreational Value of Paracas National Reserve in Ica Peru: A Fair Fee Implementation Approach." Fogler Library, University of Maine, 2004. http://www.library.umaine.edu/theses/pdf/Garcia-YiJ2004.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Jaen, Celada Jaeljattin R. "Estimating the potential returns to research and development from sorghum value added products in El Salvador and Nicaragua." Thesis, Kansas State University, 2011. http://hdl.handle.net/2097/13179.

Full text
Abstract:
Master of Science
Department of Agricultural Economics
Timothy J. Dalton
Sorghum bicolor (L.) Moench is a drought-tolerant crop able to adapt to hot and dry weather. It has excellent chemical and physical properties, which make it a grain of good quality for processing into different types of products. This research is an impact assessment study that estimated the potential impacts of new uses of sorghum using an equilibrium displacement model. The data used were drawn from interviews conducted in July 2011. Using total production quantities, prices, price elasticities and cost shares, eight potential market scenarios were simulated. Results were similar between the two countries, so the analysis applies to both. When the sorghum flour demand curve shifted, producers gained between $6,000 and $30,000. When the feed demand curve shifted, the producer benefit was between $3 million and $13 million. In the scenario where the sorghum grain supply curve shifted together with the demand curves for feed and sorghum flour, the producer net benefit was between $300,000 and $2.5 million. These results suggest that increasing yield and promoting sorghum as a substitute for maize in feed and for wheat in flour can benefit producers while helping them to increase yields.
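In a single-market equilibrium displacement model, a demand shift is propagated through log-linear supply and demand equations to a relative price change and a surplus change. The form below is one common textbook version (in the spirit of Alston, Norton and Pardey); the prices, quantities, elasticities and shift size are assumptions for illustration, not the study's data:

```python
def edm_demand_shift(p0, q0, delta, eps_s, eta_d):
    """Relative price change and producer surplus change from a parallel
    relative demand shift delta; eps_s is the supply elasticity and
    eta_d the (negative) demand elasticity."""
    z = delta / (eps_s - eta_d)                 # relative price change
    d_ps = p0 * q0 * z * (1 + 0.5 * z * eps_s)  # change in producer surplus
    return z, d_ps

# hypothetical sorghum market: $200/t, 50,000 t, 5% demand shift
z, d_ps = edm_demand_shift(p0=200.0, q0=50_000, delta=0.05, eps_s=1.0, eta_d=-0.5)
print(z, d_ps)
```

Shifting the supply curve instead (e.g. from a yield gain) follows the same logic with the shift entering the supply equation.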
APA, Harvard, Vancouver, ISO, and other styles
32

Muller, Jacob. "Estimating the marginal value of agricultural irrigation water: A methodology and empirical application to the Berg River Catchment." Master's thesis, University of Cape Town, 2017. http://hdl.handle.net/11427/25409.

Full text
Abstract:
This study aims to facilitate effective and efficient intersectoral water allocation policy in South Africa, where limited water supplies are increasingly constraining necessary economic development. The study develops an economic model of irrigated agricultural production that recognises the multi-output nature of irrigated agriculture as well as the institutional setting in which commercial irrigation water is allocated in South Africa. The model is then used to econometrically estimate the marginal value of commercial irrigation water in the Berg Water Management Area (WMA), using a Translog functional form and a Tobit censored regression model, with controls for heterogeneity and corrections for heteroscedasticity. The estimates are obtained for 16 irrigated crops in the region and range from an overall mean of 4.84 R/m³ for peaches to 0.14 R/m³ for wheat, but vary significantly between sub-regions and according to soil productivity, as well as between crops. Furthermore, the estimates differ substantially from the average value of production per m³ of irrigation water, reflecting a revenue-water elasticity that differs from unity for all crops. The results imply that potential efficiency gains are possible from the intersectoral reallocation of water away from agriculture. A further implication is that reallocation within the agricultural sector would be most efficiently undertaken by farmers themselves, due to the large number of factors that affect irrigation water productivity but are unobservable by policymakers or difficult to account for in the formulation of policy.
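The link between a revenue-water elasticity and a marginal value in R/m³ is the identity dR/dW = ε · R/W evaluated at observed revenue and water use, which is also why the marginal value differs from the average value R/W whenever ε differs from unity. The numbers below are hypothetical:

```python
def marginal_value(revenue, water_m3, elasticity):
    """Marginal value of water (R/m^3) implied by a revenue-water
    elasticity: dR/dW = elasticity * R / W."""
    return elasticity * revenue / water_m3

# hypothetical orchard: R500,000 revenue, 80,000 m^3 applied, elasticity 0.6
print(marginal_value(500_000, 80_000, 0.6))  # about 3.75 R/m^3,
# versus an average value of 500_000 / 80_000 = 6.25 R/m^3
```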
APA, Harvard, Vancouver, ISO, and other styles
33

Fethers, A. V., and n/a. "Valuing public goods." University of Canberra. Management, 1991. http://erl.canberra.edu.au./public/adt-AUC20060710.105721.

Full text
Abstract:
There are three broad areas of public administration that require valuation for public goods. One of these areas is concerned with value for cost benefit analysis. The concept here is quantitative, in money terms, and the purpose is to aid decision making. Planners and economists either calculate, or estimate total costs and total benefits of programs or projects as an aid to decision making. The second broad area involves justifying, or allocating public resources. Benefits bestowed by intangibles such as the arts, or questions that affect the environment are difficult to quantify as value may involve concepts the beneficiaries find difficult to identify or describe. The concept of value involves total costs, but also may involve perceptions of the community about value. Valuation costs may be calculated from the aggregate demand, but estimating demand can be difficult. The third broad area involves estimating demand for government services such as those provided by the Bureau of Statistics, and the Department of Administrative Services, as well as many others, who are being required to charge fees for services previously provided without direct charge. This development is part of the trend called corporatisation now occurring in many countries, including Australia. Economists and planners have a range of approaches available to assist them in the estimation of value, whether it be for the purpose of comparing costs with benefits, or for estimating the demand for tangible or intangible items like the arts or statistics. Surveys have been used for many years to assist a wide range of decisions by private enterprise. The use of surveys by government in Australia has been limited, but is increasing. US and European governments have used surveys to value both more and less tangible public goods since 1970. 
Surveys have also proved useful to assist many other decisions, including policy making, developing the means for implementing policies, monitoring and adjusting programs, and evaluation. This paper is primarily concerned with surveys. A particular type of survey, known as contingent valuation (CV), has been developed to assist the estimation of value for intangible public goods. Also discussed are other applications of surveys for government decision making, and other ways of imputing or estimating values, largely developed by economists and planners to assist cost benefit analysis. Three examples of surveys used to estimate values are discussed. These include a survey of Sydney households to help estimate the value of clean water; an Australia wide survey to help estimate the value of the arts; and a survey of Australians to help estimate the value of Coronation Hill without mining development. While the paper suggests that surveys have potential to assist a range of government decisions, examples also demonstrate the care required to obtain results that are reasonably precise and reliable.
APA, Harvard, Vancouver, ISO, and other styles
34

Kang, Kingston. "ESTIMATING THE RESPIRATORY LUNG MOTION MODEL USING TENSOR DECOMPOSITION ON DISPLACEMENT VECTOR FIELD." VCU Scholars Compass, 2018. https://scholarscompass.vcu.edu/etd/5254.

Full text
Abstract:
Modern big data often emerge as tensors. Standard statistical methods are inadequate for datasets of large volume, high dimensionality, and complex structure. It is therefore important to develop algorithms such as low-rank tensor decomposition for data compression, dimensionality reduction, and approximation. With advances in technology, high-dimensional images are becoming ubiquitous in the medical field. In lung radiation therapy, the respiratory motion of the lung introduces variability during treatment as the tumor inside the lung moves, which challenges the precise delivery of radiation to the tumor. Several approaches to quantifying this uncertainty propose using a model to formulate the motion as a mathematical function over time. [Li et al., 2011] uses principal component analysis (PCA) to propose one such model, treating each image as a long vector. However, the images come in multidimensional arrays, and vectorization breaks the spatial structure. Driven by the need to develop low-rank tensor decompositions, and provided with the 4DCT and Displacement Vector Field (DVF) data, we introduce two tensor decompositions, Population Value Decomposition (PVD) and Population Tucker Decomposition (PTD), to estimate respiratory lung motion with high accuracy and strong data compression. The first algorithm generalizes PVD [Crainiceanu et al., 2011] to higher-order tensors. The second generalizes the concept of PVD using the Tucker decomposition. Both algorithms are tested on clinical and phantom DVFs. New metrics for measuring model performance are developed in our research. Results of the two new algorithms are compared to those of the PCA algorithm.
APA, Harvard, Vancouver, ISO, and other styles
35

Chong, Brian. "Estimating the Effects of International Basketball Players on the NBA: Do NBA Coaches, Executives, and Coaches Value International Players Equally Compared to Domestic Players?" Scholarship @ Claremont, 2012. http://scholarship.claremont.edu/cmc_theses/385.

Full text
Abstract:
Each year the NBA draft helps determine the future success of NBA teams, and during the 1990s and 2000s international players were drafted at a high rate. Why was this happening, and were international players more successful than domestic players throughout their careers? Through my study I wanted to examine what determines success for NBA players and whether certain statistical or award performances affect their careers. Furthermore, I wanted to see the effect that international players had on team attendance throughout their NBA careers. Ultimately, I wanted to see how NBA coaches, executives, and fans value international players. This study aims to provide insight regarding international players and their success in the NBA.
APA, Harvard, Vancouver, ISO, and other styles
36

Van, Heerden Petrus Marthinus Stephanus. "Estimating efficiency of a South African bank using data envelopment analysis / by P.M.S. van Heerden." Thesis, North-West University, 2007. http://hdl.handle.net/10394/1854.

Full text
Abstract:
Greater competition and concentration in South Africa's financial sector have put South African banks under more constraints and led to questioning of their present performance. With greater demand for financial services and more complaints about the low quality of financial services and excessive charges, there has been increasing debate about how efficient South African banks really are. This study discusses performance evaluation, the traditional financial and non-financial measures used, and their limitations. The concept of bank efficiency is also briefly discussed, including scale efficiency, scope efficiency, X-efficiency, cost efficiency, standard profit efficiency, alternative profit efficiency and the risk component of bank efficiency. Data Envelopment Analysis (DEA) was chosen as the most appropriate method to estimate the scale efficiency and technical efficiency of 37 districts (and 10 provinces) of one of the largest banks in South Africa. 'DEA involves solving linear programming problems that generate a non-parametric, piecewise linear convex frontier that envelops the input and output data relative to which cost is minimized' (Fare et al., 1985b:193). The intermediation approach was used, incorporating both the input- and output-orientated approaches under variable returns to scale. The analyses indicated that 19 of the 37 districts were never fully technically efficient during the 22 months (input- and output-orientated). Similar results were found for scale efficiency: 17 of the 37 districts were never fully scale efficient (input-orientated) and 19 of the 37 districts were never fully scale efficient (output-orientated) during the 22 months. Synergy was found in 6 of the 10 provinces (input- and output-orientated).
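The input-orientated variable-returns-to-scale (BCC) model described here solves one small linear program per decision-making unit (DMU). A sketch with `scipy.optimize.linprog` and toy data (not the bank's districts):

```python
import numpy as np
from scipy.optimize import linprog

def dea_vrs_input(x, y, j0):
    """Input-orientated BCC (VRS) efficiency of DMU j0.
    x: (n, m) inputs, y: (n, s) outputs; variables are [theta, lambda_1..n]."""
    n = x.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                            # minimise theta
    a_in = np.hstack([-x[[j0]].T, x.T])                   # sum lam*x <= theta*x0
    a_out = np.hstack([np.zeros((y.shape[1], 1)), -y.T])  # sum lam*y >= y0
    a_eq = np.hstack([[0.0], np.ones(n)]).reshape(1, -1)  # convexity (VRS)
    res = linprog(c, A_ub=np.vstack([a_in, a_out]),
                  b_ub=np.concatenate([np.zeros(x.shape[1]), -y[j0]]),
                  A_eq=a_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

x = np.array([[2.0], [4.0], [8.0], [6.0]])  # one input per DMU
y = np.array([[1.0], [3.0], [4.0], [2.0]])  # one output per DMU
print([round(dea_vrs_input(x, y, j), 3) for j in range(4)])  # [1.0, 1.0, 1.0, 0.5]
```

The last DMU uses 6 units of input to produce 2 units of output, while a convex combination of the first two DMUs produces the same output from 3 units, hence an efficiency score of 0.5.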
Thesis (M.Com. (Risk Management))--North-West University, Potchefstroom Campus, 2008.
APA, Harvard, Vancouver, ISO, and other styles
37

Diafas, Iason [Verfasser], Jan [Akademischer Betreuer] Barkmann, Achim [Akademischer Betreuer] Spiller, and Micha [Akademischer Betreuer] Strack. "Estimating the Economic Value of forest ecosystem services using stated preference methods: the case of Kakamega forest, Kenya / Iason Diafas. Betreuer: Jan Barkmann. Gutachter: Achim Spiller ; Micha Strack." Göttingen : Niedersächsische Staats- und Universitätsbibliothek Göttingen, 2016. http://d-nb.info/1083255533/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Homolková, Kristýna. "Stanovení hodnoty podniku." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2019. http://www.nusl.cz/ntk/nusl-400074.

Full text
Abstract:
This master's thesis focuses on determining the value of the company Dieffenbacher CZ hydraulic presses using income-based methods as at 31 December 2017. The thesis is divided into a theoretical part, a practical part and the author's own proposed solution. The theoretical part describes the basic concepts and procedures. The practical part presents the company, the strategic and financial analyses, the value generators and the financial plan. The final part of the thesis determines the value of the company.
APA, Harvard, Vancouver, ISO, and other styles
39

Sparks, Douglas Frederick. "Estimating design values for extreme events." Thesis, University of British Columbia, 1985. http://hdl.handle.net/2429/25109.

Full text
Abstract:
Extreme event populations are encountered in all domains of civil engineering. The classical and Bayesian statistical approaches for describing these populations are described and compared. Bayesian frameworks applied to such populations are reviewed and critiqued. The present Bayesian framework is explained from both theoretical and computational points of view. Engineering judgement and regional analyses can be used to yield a distribution on a parameter set describing a population of extremes. Extraordinary order events, as well as known data, can be used to update the prior parameter distribution through Bayes theorem. The resulting posterior distribution is used to form a compound distribution, the basis for estimation. Quantile distributions are developed as are linear transformations of the parameters. Examples from several domains of civil engineering illustrate the flexibility of the computer program which implements the present method. Suggestions are made for further research.
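The compound (posterior-predictive) distribution described above can be sketched with a grid approximation: mix the sampling model over the posterior, then read design values off the compound CDF. The sketch assumes a Gumbel model for annual maxima with a known scale and a flat prior on the location, and uses synthetic data:

```python
import numpy as np
from scipy.stats import gumbel_r

# synthetic record of 30 annual maxima
data = gumbel_r.rvs(loc=10.0, scale=2.0, size=30, random_state=4)

mu = np.linspace(5.0, 15.0, 401)                  # grid over the location parameter
like = np.prod(gumbel_r.pdf(data[:, None], loc=mu, scale=2.0), axis=0)
post = like / like.sum()                          # flat prior -> normalised posterior

def compound_cdf(z):
    """Compound CDF: the sampling model mixed over the posterior."""
    return float(np.sum(post * gumbel_r.cdf(z, loc=mu, scale=2.0)))

# 100-year design value: solve compound_cdf(z) = 0.99 by bisection
lo, hi = 0.0, 100.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if compound_cdf(mid) < 0.99 else (lo, mid)
design = 0.5 * (lo + hi)
print(round(design, 2))
```

Because parameter uncertainty widens the compound distribution, this design value sits slightly above the plug-in quantile at the posterior mode, which is the point of basing estimation on the compound distribution.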
Applied Science, Faculty of
Civil Engineering, Department of
Graduate
APA, Harvard, Vancouver, ISO, and other styles
40

Wang, Zhihua 1970. "Value estimation for software development processes." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=81576.

Full text
Abstract:
The management of software development processes is a continual challenge facing software development organizations. Previous studies used "flexible models" and empirical methods to optimize software development processes. In this thesis, the expected payoff is used to quantitatively evaluate processes. Payoff can be defined as the value of a team member's action, and the expected payoff combines the value of a team member's action with the probability of taking that action. The mathematical models of a waterfall process and two flexible processes are evaluated in terms of total maximum expected payoff. The results show under which conditions each process is more valuable. An overview of this work and its results is presented in the thesis.
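The expected-payoff idea reduces to a probability-weighted sum over possible actions. The processes and numbers below are hypothetical, chosen only to show how a flexible process can score higher than a rigid one:

```python
def expected_payoff(actions):
    """Expected payoff of a process step: sum over possible team-member
    actions of (probability of action) * (value of action)."""
    return sum(p * v for p, v in actions)

# hypothetical comparison: a rigid process forces a single action, while a
# flexible process lets the member take the higher-value action more often
waterfall = [(1.0, 5.0)]
flexible = [(0.7, 8.0), (0.3, 2.0)]
print(expected_payoff(waterfall), expected_payoff(flexible))  # flexible wins here
```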
APA, Harvard, Vancouver, ISO, and other styles
41

Pickering, John David. "Process value estimation at Pacific Bell." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1995. http://handle.dtic.mil/100.2/ADA305649.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Tolikas, Konstantinos. "An application of extreme value theory in value-at-risk estimation." Thesis, University of Dundee, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.491268.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Schenler, Warren William. "Full value estimation of electric utility options." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/46063.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Černý, Matěj. "Value estimation of the company Metrostav a.s." Master's thesis, Vysoká škola ekonomická v Praze, 2009. http://www.nusl.cz/ntk/nusl-16627.

Full text
Abstract:
The aim of the diploma thesis is to determine the market value of the company Metrostav a.s. as of 1 January 2009. The final market value represents the potential market price for which the owners could sell the company at the valuation date. The thesis starts with a brief introduction of the company. A strategic analysis follows, focused on the company's profit potential; its result is a prediction of sales for the near future. The financial analysis examines the financial condition of the company on the basis of absolute, ratio and global indicators. The financial plan is based on the sales prediction estimated in the strategic analysis. The company is valued using the two-phase discounted cash flow method in the FCFF version. The thesis closes with a risk analysis of the estimated company value using the Crystal Ball software.
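A two-phase FCFF valuation, an explicit forecast period plus a Gordon-growth continuing value, can be sketched as below. The cash flows, WACC and growth rate are illustrative assumptions, not Metrostav's actual figures.

```python
def dcf_two_phase(fcff, wacc, g):
    """Enterprise value: discounted explicit-phase FCFF plus a Gordon-growth
    terminal value discounted back from the end of the explicit phase."""
    pv_phase1 = sum(cf / (1 + wacc) ** t for t, cf in enumerate(fcff, start=1))
    terminal = fcff[-1] * (1 + g) / (wacc - g)   # continuing value at end of phase 1
    pv_phase2 = terminal / (1 + wacc) ** len(fcff)
    return pv_phase1 + pv_phase2

# Illustrative: five forecast years of FCFF (hypothetical units), 9% WACC, 2% growth
value = dcf_two_phase([120, 130, 135, 140, 145], 0.09, 0.02)
```

Note how the terminal value typically dominates the total, which is why a risk analysis of the key inputs (as done here with Crystal Ball) matters.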
APA, Harvard, Vancouver, ISO, and other styles
45

陳昱如. "Estimating Value at Risk with Realized Range." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/94875527315874435451.

Full text
Abstract:
Master's thesis
National Chiao Tung University
Institute of Business and Management
97
This paper incorporates the concept of the realized range into Value-at-Risk estimation. We follow the bias-correction method of Martens and van Dijk (2007) and use the MEM (Multiplicative Error Model) to forecast volatility and estimate VaR. In addition, we apply two other VaR methods for comparison: the variance-covariance method and extreme value theory. In the empirical research, we use intraday data on the S&P 500 and Nasdaq indices to compare the VaR forecasting ability of realized-range, daily-return and daily-range data. The comparison shows that the realized-range-based VaR model outperforms the other models.
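The two building blocks, a realized-range volatility estimate from intraday highs and lows, fed into a variance-covariance VaR, can be sketched as follows. The Parkinson-style scaling, the normal z-value and the toy intraday prices are illustrative assumptions, not the thesis's MEM specification or bias-correction.

```python
import math

def realized_range(highs, lows):
    """Realized range: sum of squared log high-low ranges over intraday
    intervals, scaled by 1/(4 ln 2) (Parkinson-type scaling)."""
    return sum(math.log(h / l) ** 2 for h, l in zip(highs, lows)) / (4 * math.log(2))

def var_normal(volatility, z=1.645):
    """One-day variance-covariance VaR (~95%) for a zero-mean return,
    expressed as a positive loss fraction."""
    return z * volatility

# Toy intraday high/low prices for one trading day (hypothetical numbers)
highs = [100.5, 101.0, 100.8, 101.2]
lows  = [99.8, 100.2, 100.1, 100.6]
rr = realized_range(highs, lows)
var_95 = var_normal(math.sqrt(rr))
```

The realized range uses the full intraday price path, which is why it can forecast volatility (and hence VaR) better than daily returns or the daily range alone.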
APA, Harvard, Vancouver, ISO, and other styles
46

Castillo, Arleen Nicole Diaz, and 狄愛林. "Estimating Trust Value: A Social Network Perspective." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/08317977448433756784.

Full text
Abstract:
Master's thesis
Tamkang University
Master's Program, Department of Business Administration
100
Information overload is a growing problem: as the volume of available information increases, current filtering techniques are proving inefficient. Social network users, and people in general, tend to prioritize recommendations coming from people they are acquainted with. The purpose of this study was to investigate whether trust between individuals in a social network can be measured, and whether data clustering methods could help achieve that goal. A further aim was to develop a trust model that estimates a trust value for content creators in an online rating system with social network capabilities. This research introduces the concept of social distance, derived from clustering methods applied to the social network's user base, and incorporates this distance, together with user-generated ratings, into the estimation of trust. The estimated trust value serves as a metric for filtering and sorting content of any kind based on the trustworthiness of its creator. The results revealed that it is possible to estimate a measure of trust between individuals in a social network, and that clustering methods contributed significantly to this evaluation, as did the integration of other variables affecting the building of trust. The proposed model was able to integrate various variables and provide a more complete, multidimensional estimate of trust. Results also showed that higher rating scores combined with shorter social distances yield satisfactory trust values, while the opposite held for subjects with lower rating scores and longer distances.
The principal conclusion was that the model provides a multidimensional estimated trust value for Internet content that integrates several of the variables necessary for building trust in an online setting: social distance, relationship weight, time, and ratings from an online rating system, as well as trust levels between individuals within a social network. This study contributes to the current literature on trust estimation and the role social networks play in it, and also offers an alternative approach to current information overload problems.
APA, Harvard, Vancouver, ISO, and other styles
47

Hsu, Po-Han, and 許博涵. "Estimating Value-at-Risk of Natural Gas Price." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/56075529647655311447.

Full text
Abstract:
Master's thesis
National Chiao Tung University
Graduate Institute of Finance
95
This paper uses the historical simulation approach and the GARCH, EGARCH, GJR-GARCH and CARR models, under different error-distribution assumptions, to estimate the Value at Risk of the NYMEX natural gas price. We vary the number of days and the significance level to test the models and observe their performance. To validate the models, we backtest them, computing failure frequencies and failure ratios, and apply Christoffersen's likelihood ratio tests for the unconditional and conditional test statistics. Our analysis suggests that the error term should be assumed normally distributed, which fits the properties of NYMEX natural gas. The models perform well on the failure ratio and become more stable as the significance level decreases; the CARR and EGARCH models perform best on the LR tests.
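The backtesting step, comparing the observed number of VaR violations with the number implied by the coverage level, can be sketched with the unconditional-coverage likelihood-ratio test; the sample counts below are hypothetical.

```python
import math

def kupiec_lr(n_obs, n_violations, p):
    """Kupiec (1995) unconditional coverage likelihood-ratio statistic.
    Under correct coverage it is asymptotically chi-square with 1 df."""
    x, n = n_violations, n_obs
    phat = x / n
    # Log-likelihood under the null (violation prob = p) vs. at the MLE (phat)
    ll_null = (n - x) * math.log(1 - p) + x * math.log(p)
    ll_alt = (n - x) * math.log(1 - phat) + x * math.log(phat)
    return -2 * (ll_null - ll_alt)

# Hypothetical backtest: 250 trading days, 9 violations of a 99% VaR (2.5 expected)
lr = kupiec_lr(250, 9, 0.01)
```

Comparing `lr` with the chi-square(1) critical value 3.84 rejects correct coverage here, i.e. the model understates risk; Christoffersen's conditional test additionally checks that violations do not cluster.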
APA, Harvard, Vancouver, ISO, and other styles
48

Wang, Jia-Wen, and 王佳文. "Progressive Estimating Null Value Approach in Relational Database." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/75022549412252606343.

Full text
Abstract:
Doctoral dissertation
National Yunlin University of Science and Technology
Doctoral Program, Graduate School of Management
95
A database system generally cannot operate properly if attributes in the system contain null values. This study proposes efficient and systematic approaches to estimating null values in a relational database, which progressively improve null-value estimation performance. The proposed approaches have three advantages: (1) attribute selection; (2) consideration of the database pattern; and (3) situation-based, normalized units of attribute measurement. The approaches comprise two phases: (1) data preprocessing and (2) model construction. First, the data-preprocessing phase uses a progressive approach to estimating null values, comprising non-partition, hard-partition and fuzzy-partition approaches. Second, the model-construction phase uses the beta/correlation coefficient and the partition approach to calculate the relative influence of different attributes. Two databases are used to verify the proposed approaches: (1) a human resource database and (2) Waugh's database. Furthermore, this study uses the mean of absolute error rate (MAER) as the evaluation criterion for comparison with other methods. The results demonstrate that the proposed approaches are superior to existing methods for estimating null values in relational database systems.
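Two ingredients of the approach, ranking attributes by their correlation with the target (attribute selection) and scoring estimates with the MAER criterion, can be sketched as below; the human-resource-style data are hypothetical.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient, used here to rank candidate
    predictor attributes for estimating a null-valued attribute."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def maer(estimates, actuals):
    """Mean of absolute error rate: average of |estimate - actual| / actual."""
    return sum(abs(e - a) / a for e, a in zip(estimates, actuals)) / len(estimates)

# Hypothetical human-resource data: years of experience vs. salary
experience = [2, 4, 6, 8]
salary = [30, 38, 47, 55]
r = pearson(experience, salary)       # high correlation -> useful predictor
err = maer([31, 37, 48, 53], salary)  # score estimated values against actuals
```

A strongly correlated attribute gets a larger weight when estimating the null value; the partitioning variants then apply this within subsets (hard or fuzzy clusters) of the tuples.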
APA, Harvard, Vancouver, ISO, and other styles
49

Chang, Kuan-Chi, and 張寬棋. "Co-Branding Value: A Method of Estimating the Value of Strategic Brand Alliances." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/55634322357042500377.

Full text
Abstract:
Master's thesis
Tamkang University
Master's Program, Department of Business Administration
97
In recent years, many global corporations have placed great emphasis on brand power, so finding ways to enhance brand value and thereby improve competitive advantage has become an important goal for enterprises in a highly competitive market. Since brand alliance strategies are now widely used in business, this research focuses on corporate brand value. Some scholars argue that no revolutionary marketing principles are needed to re-evaluate brand value; in contrast, Balmer and Grey (2003) held that marketing scholars and others had largely ignored the challenges presented by corporate brand management, and that the traditional marketing framework was inadequate and required a radical reappraisal. This research focuses on corporate co-branding value and creates a model for evaluating it. The model considers the compatibility of strategic partners along two dimensions, strategic alliance compatibility and brand alliance compatibility, and uses them to estimate corporate co-branding value and assess the future effect of co-branding. The study verifies the proposed model with three hypothetical cases and one real case (Sony Ericsson), analyzes the model from different perspectives, and observes the variation across different combinations to obtain managerial implications for corporate managers. The research concludes: (1) brand alliance compatibility has a limited effect on corporate co-branding value; (2) strategic alliance compatibility is the major force driving the direction of corporate co-branding value; and (3) the trend of co-branding value is an important indicator for business managers. In summary, this research successfully creates a co-branding value model that presents the result of an alliance.
Additionally, the model is not based on financial indicators alone, making it a useful tool for enterprises interested in cooperating with others to evaluate the performance of a strategic or brand alliance; allied enterprises can then address the weaknesses the model reveals. Finally, enterprises can understand the importance of the link between brand and customers, and attempt to enhance co-branding value in order to obtain sustainable competitive advantage.
APA, Harvard, Vancouver, ISO, and other styles
50

Chiao, Huang Yi, and 黃益喬. "Estimating Value at Risk of PIIGS Countries -An Application of Extreme Value Theory." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/90485227123249192397.

Full text
Abstract:
碩士
國立高雄應用科技大學
金融資訊研究所
100
Europe's debt crisis continued to affect the global economy and stock markets in 2011. At the center of the storm were five European countries, Portugal, Italy, Ireland, Greece and Spain, whose initials form the acronym PIIGS; to this day, European debt still has a great influence on the world economy. Extreme value theory models describe tail characteristics without assumptions on the entire distribution, which reduces model-selection risk. This thesis takes the stock indices of the five PIIGS countries as data, computes index returns by first-order differencing of the log index series, and finds serial correlation in the data using the Ljung-Box test. Following McNeil and Frey (2000), a GARCH model is therefore used to filter the index data; the standardized residuals are iid but non-normally distributed, so extreme value theory models are then used to estimate the tail distribution of the GARCH standardized residuals. Applying the Hill estimator and the GPD distribution, at three confidence levels and three different threshold values, the thesis estimates for each PIIGS index the shape parameter, the scale parameter, the theoretical and actual numbers of violations, the expected shortfall, and the Kupiec (1995) likelihood ratio statistic (LR test). Judging from the shape parameters, Ireland has a thin-tailed distribution while the remaining four are thick-tailed. The LR tests compare the Hill-estimator model with the GPD model: at the 99.5% confidence level with a 5% threshold, both models perform best, indicating good predictive ability. Under these best-fitting assumptions, the thesis further estimates the VaR of both models and, taking the recent debt events in Italy and Greece as flashpoints, compares the actual returns of the two countries with the models' VaR.
The empirical results once again show that both the Hill estimator and the GPD distribution have good predictive ability.
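The Hill step of such a procedure can be sketched as follows. The simulated Pareto losses, the choice of k and the Weissman-type quantile formula are illustrative assumptions; the thesis applies these estimators to GARCH-standardized residuals of the PIIGS indices, not to raw simulated data.

```python
import math
import random

def hill_estimator(losses, k):
    """Hill estimate of the tail index xi from the k largest losses."""
    ordered = sorted(losses, reverse=True)
    u = ordered[k]  # (k+1)-th largest loss serves as the tail threshold
    return sum(math.log(x / u) for x in ordered[:k]) / k

def hill_var(losses, k, p):
    """Weissman-type tail quantile: VaR at level 1 - p from the Hill fit."""
    n = len(losses)
    xi = hill_estimator(losses, k)
    u = sorted(losses, reverse=True)[k]
    return u * (k / (n * p)) ** xi

# Simulated heavy-tailed losses: Pareto with tail index 3, so xi should be near 1/3
random.seed(1)
losses = [random.paretovariate(3.0) for _ in range(2000)]
xi_hat = hill_estimator(losses, 100)
var_99 = hill_var(losses, 100, 0.01)
```

A positive estimated shape parameter (thick tail) pushes the extreme quantile well beyond the threshold, which is exactly the behavior found for four of the five PIIGS indices.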
APA, Harvard, Vancouver, ISO, and other styles