Doctoral dissertations on the topic "Estimation de la fonction de production"
Create a correct reference in APA, MLA, Chicago, Harvard, and many other citation styles
Consult the top 50 doctoral dissertations on the topic "Estimation de la fonction de production".
An "Add to bibliography" button is available next to each work in the bibliography. Use it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of a publication as a ".pdf" file and read its abstract online, whenever these are available in the metadata.
Browse doctoral dissertations from a wide range of disciplines and assemble the corresponding bibliography.
Bauer, Arthur. "Essays on Firms Production Function, Markups, and the Share of their Income Going to Workers". Electronic Thesis or Diss., Institut polytechnique de Paris, 2020. http://www.theses.fr/2020IPPAG006.
Firms' production functions link their use of input factors to their level of production. Production function estimates are at once important and untrustworthy: important, because key indicators for policy design, such as measures of aggregate markups, are derived from them; untrustworthy, because they rest on identification assumptions. This thesis studies the assumptions underlying the usual techniques for estimating production functions, both their functional forms and the flexibility commonly assumed for inputs. It then leverages production function estimates to assess how firms' ability to price above marginal cost, and the share of their income going to workers, have evolved over the last 30 years.
In Chapter 1 we provide evidence on the input-flexibility assumptions grounding production function estimation: the quasi-instantaneous adjustment of materials or labor, and the delayed adjustment of capital. We rely on the existence of notches in the French tax code: values at which after-tax profits decrease in before-tax sales. We identify which types of firms adjust their size in response to a transient notch by studying the ex-ante characteristics of firms below the tax cutoff. We find that firms that bunch tend to have a larger elasticity of output with respect to materials and a lower elasticity of output with respect to capital. Consistently, we also show that to adjust their production, firms tend primarily to reduce spending on materials.
In Chapter 2 we leverage this evidence on input flexibility to recover firm-level markups for the universe of French firms over the 1984-2016 period. De Loecker and Warzynski (2012) show that a firm's markup is proportional to the inverse of the revenue share of one of its flexible inputs. We analyze the evolution of aggregate markups in France and document that the rise of concentration correlates with a reallocation of market share towards high-markup firms. We also show that the evolution of the labor share mirrors that of markups: reallocation tends to decrease the labor share, while within firms the labor share rises.
The choice of a functional form to describe firms' production processes is a compromise between theory and empirics. The workhorse Cobb-Douglas production function imposes an elasticity of substitution constant and equal to one, yet recent empirical work estimates a micro-elasticity of substitution significantly below one. CES production functions allow for a non-unit elasticity of substitution, but they assume a constant elasticity within each industry and imply that the ratio of input use does not depend on firm size. In Chapter 3, we show that such a production function cannot account for firms' use of IT inputs. With detailed data on software and hardware investments by French firms, we document that firm-level demand for IT inputs relative to other inputs grows with the firm's scale of operation. Theoretically, a non-homothetic CES production function rationalizes this empirical fact. We then analyze how the fall in IT prices interacts with the non-homothetic character of IT inputs to rationalize the facts documented in Chapter 2. First, since larger firms are more IT-intensive in the cross-section, they benefit disproportionately from the fall in IT prices, rationalizing the rise in concentration. Similarly, since larger firms are more IT-intensive, they operate at lower returns to scale and therefore have higher profit shares and lower labor shares, which explains how rising concentration drives a decline in the aggregate labor share. Finally, the comparative statics of the model predict that, as IT prices fall and firms substitute toward IT, they operate at higher returns to scale and therefore tend to have a larger labor share, explaining the positive within-firm contribution to the aggregate labor share.
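The De Loecker and Warzynski (2012) relation invoked in Chapter 2, where a markup is recovered as the output elasticity of a flexible input divided by that input's share of revenue, can be illustrated in a few lines. A minimal sketch, assuming a Cobb-Douglas technology with a known materials elasticity; all figures are invented for illustration:

```python
# Markup recovery in the spirit of De Loecker & Warzynski (2012).
# Assumption: Cobb-Douglas technology, so the output elasticity of the
# flexible input (materials) is a known constant; numbers are illustrative.

def markup(theta_m: float, materials_cost: float, revenue: float) -> float:
    """Markup = output elasticity of the flexible input / its revenue share."""
    revenue_share = materials_cost / revenue
    return theta_m / revenue_share

# Illustrative firm: materials elasticity 0.5, materials bill 40, revenue 100.
mu = markup(theta_m=0.5, materials_cost=40.0, revenue=100.0)
print(mu)  # 1.25: price is 25% above marginal cost
```

A markup above one indicates pricing above marginal cost; aggregating such firm-level ratios (revenue-weighted) gives the aggregate markup series the abstract refers to.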
Auclair, Vincent. "Estimation des fonctions d'offre des principaux pays producteurs de pétrole". Thesis, Université Laval, 2011. http://www.theses.ulaval.ca/2011/28252/28252.pdf.
Thenon, Arthur. "Utilisation de méta-modèles multi-fidélité pour l'optimisation de la production des réservoirs". Electronic Thesis or Diss., Paris 6, 2017. https://accesdistant.sorbonne-universite.fr/login?url=https://theses-intra.sorbonne-universite.fr/2017PA066100.pdf.
Performing flow simulations on numerical models representative of oil deposits is usually a time-consuming task in reservoir engineering. Substituting a meta-model, a mathematical approximation, for the flow simulator is thus a common practice to reduce the number of calls to the simulator. It makes applications such as sensitivity analysis, history matching, production estimation and optimization tractable. This thesis studies meta-models able to integrate simulations performed at different levels of accuracy, for instance on reservoir models with various grid resolutions. The goal is to speed up the building of a predictive meta-model by balancing a few expensive but accurate simulations with numerous cheap but approximate ones. Multi-fidelity meta-models, based on co-kriging, are compared with kriging meta-models for approximating different flow-simulation outputs. To deal with vectorial outputs without building a meta-model for each component of the vector, the outputs can be projected on a reduced basis obtained by principal component analysis; only a few meta-models are then needed to approximate the main coefficients in the new basis. An extension of this approach to the multi-fidelity context is proposed. In addition, it can provide an efficient meta-model of the objective function when used to approximate each production response involved in the objective-function definition. The proposed methods are tested on two synthetic cases derived from the PUNQ-S3 and Brugge benchmark cases. Finally, sequential design algorithms are introduced to speed up the meta-modelling process and exploit the multi-fidelity approach.
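The reduced-basis idea, projecting vectorial simulation outputs on a principal-component basis and emulating only the leading coefficients, can be sketched compactly. This is a simplification: plain Gaussian-covariance simple kriging stands in for the co-kriging of the thesis, and the "flow responses" are synthetic curves, not reservoir simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# 20 "simulations": each design point x in [0, 1] yields a 50-point response profile.
X = rng.uniform(0, 1, 20)
t = np.linspace(0, 1, 50)
Y = np.sin(4 * X[:, None] + t[None, :]) + 0.01 * rng.standard_normal((20, 50))

# Principal component analysis of the centred outputs: keep 3 leading components.
mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - mean, full_matrices=False)
coeffs = U[:, :3] * s[:3]                     # coefficients in the reduced basis

def krige(x_train, z, x_new, ell=0.2, nugget=1e-6):
    """Simple kriging of a scalar coefficient with a Gaussian covariance."""
    K = np.exp(-((x_train[:, None] - x_train[None, :]) / ell) ** 2)
    k = np.exp(-((x_train - x_new) / ell) ** 2)
    return k @ np.linalg.solve(K + nugget * np.eye(len(x_train)), z)

# Meta-model prediction of the full 50-point profile at an unseen design point:
# only 3 scalar emulators are needed instead of 50.
x_new = 0.37
pred = mean + sum(krige(X, coeffs[:, j], x_new) * Vt[j] for j in range(3))
```

The multi-fidelity extension would replace `krige` with a co-kriging model linking coarse-grid and fine-grid runs; the reduced-basis bookkeeping stays the same.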
Thenon, Arthur. "Utilisation de méta-modèles multi-fidélité pour l'optimisation de la production des réservoirs". Thesis, Paris 6, 2017. http://www.theses.fr/2017PA066100/document.
Gérault, Raynald. "Etude des systèmes dynamiques pour l'analyse, la prédiction et la surveillance des systèmes de production". Montpellier 2, 1998. http://www.theses.fr/1998MON20242.
Gardes, Laurent. "Estimation d'une fonction quantile extrême". Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2003. http://tel.archives-ouvertes.fr/tel-00005185.
AKOUCHE, MAHIEDINE. "Estimation de la fonction caracteristique". Paris 6, 1993. http://www.theses.fr/1993PA066005.
Rohrbach, Bouhdiba Olfa. "Pour une fonction de production multidimensionnelle". Paris 1, 1995. http://www.theses.fr/1995PA010017.
Companies are experiencing an upheaval in their manufacturing organization and in the organization of their manufacturing human resources. They have to face the evolution of both the manufacturing environment and their products, with the requirements these entail. Companies must integrate the ongoing interactions between the manufacturing activity and the company's other activities, as well as new standards of performance, quality and flexibility. In this environment, a new manufacturing paradigm is taking shape, characterized by a networked organization relying increasingly on information technologies. The company structure is interdependent and requires excellence from all its members. Work organization is based on multi-product production teams that are autonomous and empowered. Relations with customers and suppliers are evolving towards partnerships. Broadly speaking, companies are rediscovering the manufacturing function in its multiple dimensions, and managers are rediscovering it through new management and organization methodologies.
Birichinaga, Edurne. "Estimation spline de la moyenne d'une fonction aléatoire". Pau, 1987. http://www.theses.fr/1987PAUU3021.
Birichinaga, Edurne. "Estimation spline de la moyenne d'une fonction aléatoire". Grenoble 2 : ANRT, 1987. http://catalogue.bnf.fr/ark:/12148/cb37603023r.
Demirer, Mert. "Essays on production function estimation". Thesis, Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/127028.
Cataloged from the official PDF of thesis.
Includes bibliographical references (pages 193-201).
The first chapter develops a new method for estimating production functions with factor-augmenting technology and assesses its economic implications. The method does not impose parametric restrictions and generalizes prior approaches that rely on the CES production function. I first extend the canonical Olley-Pakes framework to accommodate factor-augmenting technology. Then, I show how to identify output elasticities based on a novel control-variable approach and the optimality of input expenditures. I use this method to estimate output elasticities and markups in manufacturing industries in the US and four developing countries. Neglecting labor-augmenting productivity and imposing parametric restrictions mismeasures output elasticities and the heterogeneity of the production function. My estimates suggest that standard models (i) underestimate capital elasticity by up to 70 percent and (ii) overestimate labor elasticity by up to 80 percent.
These biases propagate into markup estimates inferred from output elasticities: markups are overestimated by 20 percentage points. Finally, heterogeneity in output elasticities also affects estimated trends in markups: my estimates point to much more muted markup growth (about half) in the US manufacturing sector than recent estimates. The second chapter develops partial-identification results that are robust to deviations from the commonly used control-function assumptions and to measurement errors in inputs. In particular, the model (i) allows for multi-dimensional unobserved heterogeneity, (ii) relaxes strict monotonicity to weak monotonicity, and (iii) accommodates a more flexible timing assumption for capital. I show that under these assumptions production function parameters are partially identified by an 'imperfect proxy' variable via moment inequalities. Using these moment inequalities, I derive bounds on the parameters and propose an estimator.
An empirical application is presented to quantify the informativeness of the identified set. The third chapter develops an approach in which endogenous networks are a source of identification in estimation with network data. In particular, I study a linear model in which network data can be used to control for unobserved heterogeneity and partially identify the parameters of the linear model. My method does not rely on a parametric model of network formation. Instead, identification is achieved by assuming that the network satisfies latent homophily, the tendency of individuals to be linked with others who are similar to themselves. I first provide two definitions of homophily, weak and strong, and then, based on these definitions, characterize the identified sets and show that they are bounded under weak conditions.
Finally, to illustrate the method in an empirical setting, I estimate the effects of education on risk preferences and peer effects using social network data from 150 Chinese villages.
by Mert Demirer. Ph.D., Massachusetts Institute of Technology, Department of Economics.
Turcotte, Jean-Philippe. "Estimation par densités prédictives". Mémoire, Université de Sherbrooke, 2013. http://hdl.handle.net/11143/6606.
Poulin, Nicolas. "Estimation de la fonction des quantiles pour des données tronquées". Littoral, 2006. http://www.theses.fr/2006DUNK0159.
In the left-truncation model, the pair of random variables (Y, T), with respective distribution functions F and G, is observed only if Y ≥ T. Let (Yi, Ti), 1 ≤ i ≤ n, be an observed sample of this pair. The quantile function of F is estimated by the quantile function of the Lynden-Bell (1971) estimator. After recalling results from the literature in the case of independent data, we consider the α-mixing framework. We obtain strong consistency with rates, a strong representation of the quantile estimator as a mean of random variables with a negligible remainder, and asymptotic normality. For the second topic of this thesis, we consider a multidimensional explanatory random variable X for the response Y. We establish strong consistency and asymptotic normality of the conditional distribution function and of the conditional quantile function of Y given X when Y is subject to truncation. Simulations illustrate the results for finite samples.
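The Lynden-Bell (1971) product-limit estimator at the heart of this thesis has a compact form. A minimal sketch on synthetic left-truncated data; the α-mixing and quantile machinery of the thesis is not reproduced, and the data-generating choices are purely illustrative:

```python
import numpy as np

def lynden_bell_cdf(y_obs, t_obs, y):
    """Lynden-Bell estimator of F(y) under left truncation (Y observed iff Y >= T):
    F_n(y) = 1 - prod_{i: Y_i <= y} (1 - 1 / (n * C_n(Y_i))),
    with C_n(z) = (1/n) * #{i : T_i <= z <= Y_i} (size of the risk set at z)."""
    n = len(y_obs)
    C = np.array([np.sum((t_obs <= z) & (z <= y_obs)) for z in y_obs]) / n
    mask = y_obs <= y
    return 1.0 - np.prod(1.0 - 1.0 / (n * C[mask])) if mask.any() else 0.0

# Synthetic sample: Y ~ Exp(1), truncation T ~ U(-0.5, 1.5), keep pairs with Y >= T.
rng = np.random.default_rng(1)
y_all = rng.exponential(1.0, 5000)
t_all = rng.uniform(-0.5, 1.5, 5000)
keep = y_all >= t_all
y_obs, t_obs = y_all[keep], t_all[keep]

est = lynden_bell_cdf(y_obs, t_obs, 1.0)   # target: F(1) = 1 - exp(-1) ≈ 0.632
```

Note the estimator is computed from the truncated sample only: the naive empirical c.d.f. of `y_obs` would be biased upward, since small Y values are under-sampled.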
Degerine, Serge. "Fonction d'autocorrélation partielle et estimation autorégressive dans le domaine temporel". Grenoble 1, 1988. http://tel.archives-ouvertes.fr/tel-00243761.
Degerine, Serge. "Fonction d'autocorrélation partielle et estimation autorégressive dans le domaine temporel". Grenoble 2 : ANRT, 1988. http://catalogue.bnf.fr/ark:/12148/cb37613056x.
Degerine, Serge Le Breton Alain Van Cutsem Bernard. "Fonction d'autocorrélation partielle et estimation autorégressive dans le domaine temporel". S.l. : Université Grenoble 1, 2008. http://tel.archives-ouvertes.fr/tel-00243761.
Kamanzi, Pierre Canisius. "Les déterminants de la fonction de production en éducation". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0016/MQ48932.pdf.
Nembé, Jocelyn. "Estimation de la fonction d'intensité d'un processus ponctuel par complexité minimale". Phd thesis, Université Joseph Fourier (Grenoble), 1996. http://tel.archives-ouvertes.fr/tel-00346118.
Attouch, Mohammed Kadi. "Estimation robuste de la fonction de régression pour des variables fonctionnelles". Littoral, 2009. http://www.theses.fr/2009DUNK0227.
Robust regression is a regression analysis designed to be relatively insensitive to large deviations caused by outlying observations. Within this framework, this thesis studies the robust estimation of the regression function when the observations are either independent or strongly mixing and the covariate is functional. First, we consider a sequence of independent, identically distributed observations. In this context, we establish the asymptotic normality of a robust family of estimators based on the kernel method. As illustrations, our result is applied to the discrimination of curves, the forecasting of time series, and the construction of confidence intervals. Second, we suppose that the observations are strongly mixing, and we establish pointwise and uniform rates of almost complete convergence for this family of estimators, as well as asymptotic normality. The structural axes of the subject, namely dimensionality and the correlation of the observations, and dimensionality and the robustness of the model, are fully exploited in this study. Moreover, the concentration of the probability measure of the functional variable on small balls is used; under some assumptions, this concentration property yields an original solution to the curse of dimensionality and thus generalizes results already obtained in the multivariate framework. To illustrate the scope and contribution of our work, we show through examples how our results can be applied to nonstandard problems of nonparametric statistics, such as the forecasting of functional time series. Our methods are applied to real data from economics and astronomy.
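A kernel-based robust estimator with a functional covariate can be sketched as a kernel-weighted M-estimate. Everything here is an illustrative stand-in, not the thesis's exact construction: the L2 curve distance, the Gaussian kernel, the Huber psi with its default tuning constant, and the synthetic data are all assumptions.

```python
import numpy as np

def huber_psi(u, c=1.345):
    """Huber's bounded psi function: linear near zero, clipped beyond c."""
    return np.clip(u, -c, c)

def robust_functional_regression(curves, y, new_curve, h, n_iter=100):
    """Kernel-weighted M-estimate: solve sum_i K(d_i/h) * psi(y_i - m) = 0 in m."""
    d = np.sqrt(np.mean((curves - new_curve) ** 2, axis=1))   # L2 distance between curves
    w = np.exp(-0.5 * (d / h) ** 2)                           # Gaussian kernel weights
    m = np.median(y)                                          # robust starting value
    for _ in range(n_iter):                                   # fixed-point iteration
        m = m + np.sum(w * huber_psi(y - m)) / np.sum(w)
    return m

# Synthetic functional data: X_i(t) = a_i * sin(2*pi*t), Y_i = a_i^2 + noise + outliers.
rng = np.random.default_rng(5)
t = np.linspace(0, 1, 30)
a = rng.uniform(0, 1, 300)
curves = a[:, None] * np.sin(2 * np.pi * t)[None, :]
y = a ** 2 + 0.1 * rng.standard_normal(300)
y[::25] += 5.0                                 # a few gross outliers

new_curve = 0.6 * np.sin(2 * np.pi * t)
est = robust_functional_regression(curves, y, new_curve, h=0.05)  # target ≈ 0.36
```

The bounded psi keeps the handful of contaminated responses from dragging the estimate away from the true regression value, which an ordinary kernel mean would not.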
Nembé, Jocelyn Grègoire Gérard. "Estimation de la fonction d'intensité d'un processus ponctuel par complexité minimale". S.l. : Université Grenoble 1, 2008. http://tel.archives-ouvertes.fr/tel-00346118.
Brunel, Dominique. "La fonction phatique dans la production romanesque de Jean Giono". Aix-Marseille 1, 1990. http://www.theses.fr/1990AIX10002.
As it reinforces the circuit of communication by means of specific linguistic signs, the phatic function of language denotes a wish to communicate. When transferred to literature, it also requires taking into account, on the one hand, the social and cultural life of signs and, on the other, the dynamic forces of the text. An analysis of Giono's novelistic output does indeed reveal the importance of these two factors in establishing and maintaining a close relationship between author and reader: by means of the symbolism of texts, different narrative techniques and, paradoxically, the use of what is left unstated, Giono makes his presence known in the stories he writes and engages us implicitly. What is more, the writer achieves the prodigious recreation of a new phatic communion based, in modern times, on the loss of natural communication between people.
Pambo, Bello Kowir. "Estimation à noyau de la fonction de hasard pour des variables censurées". Versailles-St Quentin en Yvelines, 2011. http://www.theses.fr/2011VERS0030.
This thesis deals with kernel estimates of the hazard function (denoted by λ) when the variables are censored. It is composed of three chapters. In each of the first two chapters, we construct a kernel estimator by using a doubly recursive stochastic algorithm (a two-time-scale stochastic algorithm) and establish its convergence in law. We compare these estimates with a nonrecursive kernel estimator. We show that the asymptotic rate of the recursive estimator λn and that of the nonrecursive version are similar. However, for confidence-interval estimation, we show that it is better to use the estimator λn than the nonrecursive version: for the same asymptotic level, the width of the interval obtained with the recursive estimator is smaller than the one obtained with the nonrecursive version. In the third chapter, we first recall the notions of large deviations and moderate deviations, then establish pointwise and uniform moderate-deviation principles for the sequence (λ̃n − λ), where λ̃n is a nonrecursive estimator.
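The nonrecursive benchmark against which the thesis compares its recursive estimator can be sketched as a kernel smoothing of the Nelson-Aalen increments for right-censored data. This form and all the distributional choices below are assumptions for illustration; the two-time-scale recursive algorithm itself is not reproduced.

```python
import numpy as np

def kernel_hazard(z, delta, t, h):
    """Kernel hazard estimate at time t: smooth the Nelson-Aalen jumps
    delta_(i) / (n - i + 1) with a Gaussian kernel of bandwidth h.
    z: observed times (min of lifetime and censoring), delta: 1 if uncensored."""
    order = np.argsort(z)
    z, delta = z[order], delta[order]
    n = len(z)
    at_risk = n - np.arange(n)              # risk-set size just before each z_(i)
    increments = delta / at_risk            # Nelson-Aalen jump at each event time
    K = np.exp(-0.5 * ((t - z) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return np.sum(K * increments)

# Synthetic censored sample: lifetimes Exp(1) (true hazard = 1), censoring Exp(1/2).
rng = np.random.default_rng(2)
n = 5000
lifetimes = rng.exponential(1.0, n)
censor = rng.exponential(2.0, n)
z = np.minimum(lifetimes, censor)
delta = (lifetimes <= censor).astype(float)

est = kernel_hazard(z, delta, t=0.8, h=0.15)   # should be near the true hazard, 1
```

A recursive variant would update the estimate observation by observation with two step-size sequences instead of re-smoothing the whole sample.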
Ferrigno, Sandie. "Un test d'adéquation global pour la fonction de répartition conditionnelle". Montpellier 2, 2004. http://www.theses.fr/2004MON20110.
Bahamonde, Natalia. "Estimation de séries chronologiques avec données manquantes". Paris 11, 2007. http://www.theses.fr/2007PA112115.
Srihari, Krishnaswami. "Leadtime estimation in a manufacturing shop environment". Thesis, Virginia Tech, 1985. http://hdl.handle.net/10919/45684.
This research examines the relationships between the independent variables (lot size, sequencing rule, and shop type/product mix) and the dependent variables (work-in-process, machine utilization, flowtime, and leadtime). These relationships have seldom been investigated by academicians, and no satisfactory method available in the literature today can be used to accurately estimate leadtime. This research develops methods for leadtime estimation. A leadtime estimation method would be invaluable to the production planner, for it would enable the planner to decide when a job should be released to the shop. The experimental system considered in this research is an eight-machine shop. Three different shop types (a jobshop, a modified flowshop, and a flowshop) were modelled, each with eight different product types being manufactured. Two different sequencing rules, SPT and FIFO, were used. The entire system was analyzed via simulation on an IBM PC using SIMAN system-simulation concepts.
Master of Science
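The effect of a sequencing rule on flowtime, the kind of relationship the SIMAN experiments above measure, can be reproduced in miniature. A toy single-machine stand-in for the eight-machine shop, with invented arrival and service rates; it compares mean flowtime under FIFO and SPT:

```python
import heapq
import random

def simulate(rule, n_jobs=20000, seed=6):
    """Mean flowtime on a single machine under FIFO or SPT sequencing."""
    rng = random.Random(seed)
    jobs, arrive = [], 0.0
    for _ in range(n_jobs):
        arrive += rng.expovariate(1.0)               # Poisson arrivals, rate 1
        jobs.append((arrive, rng.expovariate(1.25))) # service times, mean 0.8
    queue, flow, t, i = [], [], 0.0, 0
    while len(flow) < n_jobs:
        if not queue and jobs[i][0] > t:
            t = jobs[i][0]                           # machine idles until next arrival
        while i < n_jobs and jobs[i][0] <= t:        # admit every job arrived by t
            a, s = jobs[i]
            heapq.heappush(queue, (s if rule == "SPT" else a, a, s))
            i += 1
        _, a, s = heapq.heappop(queue)               # pick next job by the rule
        t += s                                       # non-preemptive service
        flow.append(t - a)                           # flowtime = completion - arrival
    return sum(flow) / n_jobs

fifo = simulate("FIFO")   # M/M/1 theory: mean flowtime = 1/(mu - lambda) = 4
spt = simulate("SPT")     # shortest-processing-time cuts the mean flowtime
```

The queueing-theory benchmark for FIFO (mean flowtime 4 at 80% utilization) gives a check on the simulation, exactly as analytical results would be used to validate a SIMAN model.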
Latte, Okon Wesley. "La fonction législative en Côte d'Ivoire : processus législatif et production législative". Bordeaux 4, 1996. http://www.theses.fr/1996BOR40024.
From a legal point of view, the functioning of the parliamentary legislative body responds first and foremost to parliamentary rationalization. This technique can be seen in the demarcation of a particular domain of the law and the review of the constitutionality of statutes, as well as in the observance of a pre-established procedure. On the whole, however, none of these techniques has had any practical consequences. The political factor, on the other hand, has had a huge influence on the functioning of the legislative function. In the early years of independence, members of parliament, although recruited by President Houphouët-Boigny with the assistance of the PDCI-RDA, participated rather pertinently in the adoption of laws, as much in taking initiatives as in taking part in parliamentary debates in committees and plenary sessions. This situation evolved considerably after the political purges of 1963-1964, which followed the discovery of political plots. From 1964 to 1980, the year of the internal democratization of the regime, Ivorian MPs remained rather timorous: they took no legislative initiatives and took practically no part in parliamentary debates. The National Assembly during this period can no doubt be described simply as a rubber-stamp chamber, with one or two exceptions.
Mint, El Mouvid Mariem. "Sur l'estimateur linéaire local de la fonction de répartition conditionnelle". Montpellier 2, 2000. http://www.theses.fr/2000MON20162.
Belalia, Mohamed. "Estimation et inférence non paramétriques basées sur les polynômes de Bernstein". Thèse, Université de Sherbrooke, 2016. http://hdl.handle.net/11143/8558.
Pinson, Pierre. "Estimation de l'incertitude des prédictions de production éolienne". Phd thesis, École Nationale Supérieure des Mines de Paris, 2006. http://pastel.archives-ouvertes.fr/pastel-00002187.
Green, Stuart. "Estimation of tritium production in fusion reactor blankets". Thesis, University of Birmingham, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.538947.
Pinson, Pierre Patrick. "Estimation de l'incertitude des prédictions de production eolienne". Paris, ENMP, 2006. http://www.theses.fr/2006ENMP1432.
Wind power is experiencing a tremendous development of its installed capacities in Europe. The intermittence of wind generation, however, causes difficulties in the management of power systems, and in the context of deregulated electricity markets wind energy is penalized by its intermittent nature. It is recognized today that forecasting wind power for horizons up to 2-3 days ahead eases the integration of wind generation. Wind power forecasts are traditionally provided as point predictions, corresponding to the most likely power production for a given horizon. That information alone is not sufficient for developing optimal management or trading strategies, so we investigate ways of estimating the uncertainty of wind power forecasts. The characteristics of prediction uncertainty are described through a thorough study of the performance of state-of-the-art approaches, underlining the influence of certain variables, e.g. the level of predicted power, on the distributions of prediction errors. A generic method for the estimation of prediction intervals is then introduced. This statistical method is nonparametric and uses fuzzy-logic concepts to integrate expertise on the characteristics of prediction uncertainty. By estimating several prediction intervals at once, one obtains predictive distributions of wind power output. The proposed method is evaluated in terms of reliability, sharpness and resolution. In parallel, we explore the potential use of ensemble predictions for skill forecasting. Wind power ensemble forecasts are obtained either by converting meteorological ensembles (from ECMWF and NCEP) to power or by applying a poor man's temporal approach. We propose a definition of prediction risk indices reflecting the disagreement between ensemble members over a set of successive look-ahead times.
Such prediction risk indices may provide a more comprehensive signal on the expected level of uncertainty in an operational environment. A probabilistic relation between classes of risk indices and the level of forecast error is shown. In a final part, the trading application is considered to demonstrate the value of uncertainty estimation when predicting wind generation. It is explained how to integrate that uncertainty information in a decision-making process accounting for the sensitivity of end-users to regulation costs. The benefits of a probabilistic view of wind power forecasting are clearly shown.
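The core of a nonparametric prediction interval, conditioning the empirical error quantiles on the level of predicted power, can be sketched in a few lines. This is a bare-bones illustration on synthetic data: the fuzzy-logic weighting of the thesis is omitted, and the five power-level classes and error model are invented.

```python
import numpy as np

# Synthetic history: normalized power forecasts with level-dependent error spread.
rng = np.random.default_rng(3)
point_fcst = rng.uniform(0, 1, 5000)
errors = 0.1 * (1 + point_fcst) * rng.standard_normal(5000)
actual = np.clip(point_fcst + errors, 0, 1)

bins = np.linspace(0, 1, 6)                            # 5 power-level classes

def interval(new_fcst, coverage=0.8):
    """Central prediction interval from past errors in the forecast's power class."""
    b = int(np.clip(np.digitize(new_fcst, bins) - 1, 0, len(bins) - 2))
    mask = (point_fcst >= bins[b]) & (point_fcst < bins[b + 1])
    past_errors = actual[mask] - point_fcst[mask]
    lo_q, hi_q = np.quantile(past_errors, [(1 - coverage) / 2, (1 + coverage) / 2])
    # Wind power is bounded, so clip the interval to [0, 1].
    return max(new_fcst + lo_q, 0.0), min(new_fcst + hi_q, 1.0)

lo, hi = interval(0.55)    # 80% interval around a 0.55 point forecast
```

Estimating such intervals at several coverage rates at once yields the predictive distributions mentioned in the abstract, which can then be scored for reliability and sharpness.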
Buba, Ibrahim Muhammad. "Direct estimation of gas reserves using production data". Texas A&M University, 2003. http://hdl.handle.net/1969/153.
Ben, Cheikh Larbi. "Théorie de la production : une approche paramétrique par la fonction de distance". Paris 2, 1997. http://www.theses.fr/1997PA020031.
We introduce a parametric approach that makes it possible to take technical inefficiency into account in models of production. This approach is based on the concept of the input distance function introduced by Shephard (1953, 1970). We propose measures of the productive performance of firms in terms of the distance function in the contexts of static and temporary equilibrium. We also present the advantages of the distance function with respect to the cost and production functions. Our empirical studies show that the parametric distance-function approach may improve on results obtained in terms of the cost and production functions.
Ouassou, Idir. "Estimation sous contraintes pour des lois à symétrie sphérique". Rouen, 1999. http://www.theses.fr/1999ROUES031.
Barbier, Franck. "Une approche objet dédiée à la fonction production d'une entreprise manufacturière : application à la gestion de production". Chambéry, 1991. http://www.theses.fr/1991CHAMS001.
TOETSCH, ANDRE-CHARLES. "Influence de la production endogene de monoxyde d'azote sur la fonction cardiaque". Strasbourg 1, 1995. http://www.theses.fr/1995STR15100.
Bourgey, Mathieu. "Estimation des risques de développer la maladie coeliaque en fonction d'informations génétiques et familiales". Paris 11, 2007. http://www.theses.fr/2007PA11T034.
Srivastava, Mayank. "Estimation of coalbed methane production potential through reservoir simulation /". Available to subscribers only, 2005. http://proquest.umi.com/pqdweb?did=1079667111&sid=4&Fmt=2&clientId=1509&RQT=309&VName=PQD.
Mahiou, Ramdane. "Sur l'estimation d'une fonction-frontière et l'enveloppe convexe d'un échantillon". Paris 6, 1988. http://www.theses.fr/1988PA066379.
FERRIGNO, Sandie. "Un test d'adéquation global pour la fonction de répartition conditionnelle". Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2004. http://tel.archives-ouvertes.fr/tel-00008559.
Pełny tekst źródłal'on doit valider pour justifier son utilisation. Dans ce travail, on propose une approche globale où toutes les hypothèses faites pour asseoir ce modèle sont testées simultanément.
Plus précisément, on construit un test basé sur une quantité qui permet de canaliser toute l'information liant X à Y : la fonction de répartition conditionnelle de Y sachant (X = x) définie par F(y|x)=P(Y<=y|X=x). Notre test compare la valeur prise par l'estimateur polynômial local de F(y|x) à une estimation paramétrique du modèle supposé et rejette sa
validité si la distance entre ces deux quantités est trop grande. Dans un premier temps, on considère le cas où la fonction de répartition supposée est entièrement spécifiée et, dans
ce contexte, on établit le comportement asymptotique du test. Dans la deuxième partie du travail, on généralise ce résultat au cas plus courant en pratique où le modèle supposé contient un certain nombre de paramètres inconnus. On étudie ensuite la puissance locale du test en déterminant son comportement asymptotique local sous des suites d'hypothèses contigües. Enfin, on propose un critère de choix de la fenêtre d'ajustement qui intervient lors de l'étape d'estimation polynômiale locale de la fonction de répartition conditionnelle.
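The nonparametric side of such a test, a local polynomial estimator of F(y|x), is easy to sketch in its degree-0 (local constant, Nadaraya-Watson) form: kernel weights in x applied to the indicators 1{Y_i <= y}. The data-generating model below is an invented example chosen so the true conditional c.d.f. is known.

```python
import numpy as np

def cond_cdf(x_obs, y_obs, x, y, h):
    """Local-constant estimator of F(y|x) = P(Y <= y | X = x):
    Nadaraya-Watson weights applied to the indicators 1{Y_i <= y}."""
    w = np.exp(-0.5 * ((x - x_obs) / h) ** 2)   # Gaussian kernel in x
    return np.sum(w * (y_obs <= y)) / np.sum(w)

# Synthetic model: Y = 2X + N(0, 0.3^2), so F(y|x) is known exactly.
rng = np.random.default_rng(4)
x_obs = rng.uniform(0, 1, 4000)
y_obs = 2 * x_obs + 0.3 * rng.standard_normal(4000)

# At x = 0.5, Y ~ N(1, 0.3^2), hence F(1 | 0.5) = 0.5.
est = cond_cdf(x_obs, y_obs, x=0.5, y=1.0, h=0.05)
```

A goodness-of-fit statistic in this spirit would integrate the squared gap between this estimator and the parametric F(y|x) over (x, y), and reject when the gap is too large.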
Peypoch, Nicolas. "Théorie et applications de la fonction distance directionnelle en économie de la production". Perpignan, 2006. http://www.theses.fr/2006PERP0732.
This work is about production microeconomics. More precisely, the purpose of this thesis is to introduce new tools for analyzing the efficiency and productivity of production units, using the directional distance function. The contributions are the following. First, we introduce a new aggregate measure of efficiency and productivity. Second, we propose a notion of parallel-neutral technical change in inputs and outputs. Third, a new decomposition of productivity is proposed. Fourth, we propose a notion of alpha returns to scale, derived from that of a technology homogeneous of degree alpha. Finally, we show how our new aggregate Luenberger productivity indicator can be implemented in two empirical applications: the first in the economics of education, the second in the economics of tourism.
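The directional distance function measures how far a unit can move its production plan in a chosen direction while remaining feasible. Under a free-disposal hull (FDH) technology it has a closed form that needs no linear programming, which makes the idea easy to illustrate; the data below are invented, and DEA computations behind a Luenberger indicator would replace this FDH search with an LP.

```python
import numpy as np

def fdh_directional(X, Y, x0, y0, gy=1.0):
    """Directional distance D(x0, y0; g=(0, gy)) under an FDH technology:
    the largest beta such that some observed unit produces at least y0 + beta*gy
    while using no more of every input than x0 (free disposal)."""
    feasible = np.all(X <= x0, axis=1)          # units dominating x0 in inputs
    if not np.any(feasible):
        return 0.0
    return max((Y[feasible].max() - y0) / gy, 0.0)

# Toy data: 5 production units, 2 inputs, 1 output.
X = np.array([[2, 3], [4, 2], [3, 3], [5, 5], [2, 2]], dtype=float)
Y = np.array([4.0, 5.0, 6.0, 9.0, 3.0])

# Units using no more than (3, 3) of each input: rows 0, 2, 4, best output 6,
# so the evaluated unit (inputs (3, 3), output 4) could add beta = 2 of output.
beta = fdh_directional(X, Y, x0=np.array([3.0, 3.0]), y0=4.0)
```

A value of beta = 0 marks an efficient unit; differences of such distances over time are what a Luenberger-type productivity indicator aggregates.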
Thomas, Corinne. "Contribution à la définition des communications industrielles pour la fonction gestion de production". Nancy 1, 1999. http://www.theses.fr/1999NAN10274.
Pełny tekst źródłaThe advanced communication techniques now deployed within workshops, characterized by industrial local area networks and the communication tools embedded in equipment, are often not used for the interface between the production management system and the workshop's workstations. Implementing this communication is difficult because of the heterogeneity of the equipment. Manufacturing management within the workshop is therefore often carried out through the operators, so neither the reliability of the information nor the dynamics of the exchanges is guaranteed. The research presented in this work contributes to defining an integrated communication architecture for the control and monitoring of industrial processes using these communication tools. In this context, several points of view are explored. The study of industrial communication systems leads us to observe that the MMS (Manufacturing Message Specification) industrial messaging standard provides a communication language that makes it possible to define an integrated communication architecture within an automated production system. The CAMM (Computer Aided Manufacturing Management) point of view, together with previous work on information systems for production management, gives us a general, information-centric view of workshop management. Finally, after considering different modelling methods for the analysis of information systems, we chose the object-oriented method OMT (Object Modeling Technique) for our work. Placing ourselves in a specific context, we propose a methodological approach for integrating production management activities at the equipment level in the workshop. Starting from an OMT model describing a CAMM system and a model describing a workstation from a communication point of view, the approach consists in deriving a communication architecture based on MMS messaging.
To validate our approach, we experimented with our method by building a prototype application in which a numerically controlled workstation communicates with a CAMM system using MMS.
Jouvie, Camille. "Estimation de la fonction d'entrée en tomographie par émission de positons dynamique : application au fluorodesoxyglucose". Phd thesis, Université Paris Sud - Paris XI, 2013. http://tel.archives-ouvertes.fr/tel-00966453.
Pełny tekst źródłaJouvie, Camille. "Estimation de la fonction d’entrée en tomographie par émission de positons dynamique : application au fluorodesoxyglucose". Thesis, Paris 11, 2013. http://www.theses.fr/2013PA112303/document.
Pełny tekst źródłaPositron Emission Tomography (PET) is a method of functional imaging, used in particular for drug development and tumor imaging. In PET, the estimation of the arterial plasma activity concentration of the non-metabolized compound (the "input function") is necessary for the extraction of the pharmacokinetic parameters. These parameters enable the quantification of the compound dynamics in the tissues. This PhD thesis contributes to the study of the input function by developing a minimally invasive method to estimate it. This method uses the PET image and a few blood samples. In this work, the example of the FDG tracer is chosen. The proposed method relies on compartmental modeling: it deconvolves the three-compartment model. The originality of the method consists in using a large number of regions of interest (ROIs), a large number of sets of three ROIs, and an iterative process. To validate the method, simulations of PET images of increasing complexity have been performed, from a simple image simulated with an analytic simulator to a complex image simulated with a Monte-Carlo simulator. After simulation of the acquisition, reconstruction and corrections, the images were segmented (through segmentation of an MRI image and registration between PET and MRI images) and corrected for partial volume effect by a variant of Rousset's method, to obtain the kinetics in the ROIs, which are the input data of the estimation method. The evaluation of the method on simulated and real data is presented, as well as a study of the method's robustness to different error sources, for example in the segmentation, in the registration, or in the activity of the blood samples used.
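The three-compartment (two-tissue) FDG model that such methods deconvolve can be written as a pair of linear ODEs driven by the plasma input function. The sketch below shows only the forward model, under a hypothetical input function and assumed rate constants (the values are illustrative, not fitted); the thesis's contribution runs the problem in the other direction, recovering Cp from many ROI kinetics.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative kinetic constants for FDG (assumed values, min^-1);
# k4 is commonly taken as ~0 for FDG because phosphorylation traps the tracer.
K1, k2, k3, k4 = 0.1, 0.15, 0.05, 0.0

def Cp(t):
    """Hypothetical arterial input function: fast bolus plus slow washout."""
    return 10.0 * t * np.exp(-t) + 0.5 * np.exp(-0.01 * t)

def rhs(t, C):
    """Two-tissue compartment model: free (C1) and phosphorylated (C2)
    FDG in tissue, driven by the plasma concentration Cp(t)."""
    C1, C2 = C
    dC1 = K1 * Cp(t) - (k2 + k3) * C1 + k4 * C2
    dC2 = k3 * C1 - k4 * C2
    return [dC1, dC2]

t = np.linspace(0, 60, 241)                      # 60-minute acquisition
sol = solve_ivp(rhs, (0, 60), [0.0, 0.0], t_eval=t, rtol=1e-8)
C_tissue = sol.y.sum(axis=0)                     # what a PET ROI measures
print(f"tissue activity at 60 min: {C_tissue[-1]:.3f}")
```

Estimating the input function amounts to inverting this convolution from several measured C_tissue curves, which is what makes using many ROIs with distinct kinetics informative.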
Mohamed, Zain-Eddine. "Étude économétrique de la fonction d'investissement : estimation d'un modèle à plusieurs régimes sur données sectorielles". Toulouse 1, 1987. http://www.theses.fr/1987TOU10027.
Pełny tekst źródłaInvestment decisions involve four markets (the product, capital, financial, and labor markets). Disequilibrium in these markets affects investment demand. One can distinguish the Walrasian demand model from the effective demand models, which incorporate constraints arising from particular markets. Our aim is to identify the intensity of those constraints by estimating a switching-regime model. Our framework is a mildly disaggregated economy, namely the different branches of manufacturing in France.
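The logic of such disequilibrium switching models is that observed investment is the minimum of several latent demands, each binding in a different regime. The simulation below is a toy illustration of that min condition under invented coefficients and regime names; the actual econometric estimation (maximum likelihood over the latent regimes) is beyond the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)  # a hypothetical sectoral explanatory variable

# Latent demands: the Walrasian (unconstrained) demand and two
# "effective" demands reflecting constraints from particular markets
I_walras = 2.0 + 1.0 * x + rng.normal(0, 0.3, n)
I_credit = 2.5 + 0.2 * x + rng.normal(0, 0.3, n)   # financial constraint
I_demand = 1.8 + 0.5 * x + rng.normal(0, 0.3, n)   # product-market constraint

# Observed investment is the minimum of the latent regimes
latent = np.array([I_walras, I_credit, I_demand])
I = latent.min(axis=0)
regime = latent.argmin(axis=0)

# Regime intensities: the share of observations bound by each constraint
shares = np.bincount(regime, minlength=3) / n
print(dict(zip(["walrasian", "credit", "demand"], shares.round(2))))
```

The regime shares are what "the intensity of those constraints" refers to: how often each market's constraint is the binding one across observations.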
Mohamed, Zain-Eddine. "Etude économétrique de la fonction d'investissement : estimation d'un modèle à plusieurs régimes sur données sectorielles /". Grenoble 2 : ANRT, 1987. http://catalogue.bnf.fr/ark:/12148/cb37608115m.
Pełny tekst źródłaSchmitz, Morgan A. "Euclid weak lensing : PSF field estimation". Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLS359/document.
Pełny tekst źródłaAs light propagates through the Universe, its path is altered by the presence of massive objects. This causes a distortion of the images of distant galaxies. Measuring this effect, called weak gravitational lensing, allows us to probe the large scale structure of the Universe. This makes it a powerful source of cosmological insight, and it can in particular be used to study the distribution of dark matter and the nature of Dark Energy. The European Space Agency’s upcoming Euclid mission is a spaceborne telescope with weak lensing as one of its primary science objectives. In practice, the weak lensing signal is recovered from the measurement of the shapes of galaxies. The images obtained by any optical instrument are altered by its Point Spread Function (PSF), caused by various effects: diffraction, imperfect optics, atmospheric turbulence (for ground-based telescopes)… Since the PSF also alters galaxy shapes, it is crucial to correct for it when performing weak lensing measurements. This, in turn, requires precise knowledge of the PSF itself. The PSF varies depending on the position of objects within the instrument’s focal plane. Unresolved stars in the field provide a measurement of the PSF at given positions, from which a PSF model can be built. In the case of Euclid, star images will suffer from undersampling. The PSF model will thus need to perform a super-resolution step. In addition, because of the very wide band of its visible instrument, variations of the PSF with the wavelength of incoming light will also need to be accounted for. The main contribution of this thesis is the development of novel PSF modelling approaches. These rely on sparsity and numerical optimal transport. The latter enables us to propose the first method capable of building a polychromatic PSF model, using no information other than undersampled star images, their positions and spectra. We also study the propagation of errors in the PSF to the measurement of galaxy shapes.
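The core spatial problem here, predicting the PSF at galaxy positions from measurements at star positions, can be illustrated with generic scattered-data interpolation. The sketch below interpolates a single hypothetical PSF summary parameter (an ellipticity component) across the focal plane; it is only a baseline illustration, not the sparsity- and optimal-transport-based models the thesis develops, and it ignores undersampling and wavelength dependence entirely.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)

def true_e1(pos):
    """Hypothetical smooth PSF ellipticity field over the focal plane."""
    return 0.02 * pos[:, 0] + 0.01 * np.sin(3 * pos[:, 1])

stars = rng.uniform(-1, 1, size=(80, 2))            # star positions
e1_obs = true_e1(stars) + rng.normal(0, 1e-3, 80)   # noisy star measurements

# Spatial PSF model: interpolate star measurements to galaxy positions
model = RBFInterpolator(stars, e1_obs, smoothing=1e-6)
galaxies = rng.uniform(-0.9, 0.9, size=(5, 2))
residual = np.abs(model(galaxies) - true_e1(galaxies)).max()
print(f"max interpolation error at galaxy positions: {residual:.4f}")
```

Real PSF models interpolate full PSF images (or compressed representations of them) rather than a single scalar, which is where the super-resolution and polychromatic aspects enter.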
Hamrouni, Zouhir. "Inférence statistique par lissage linéaire local pour une fonction de régression présentant des discontinuités". Université Joseph Fourier (Grenoble), 1999. http://tel.archives-ouvertes.fr/tel-00004840.
Pełny tekst źródłaDe, Lamotte Frédéric. "Étude des relations structure - fonction des protéines végétales". Habilitation à diriger des recherches, Université Montpellier II - Sciences et Techniques du Languedoc, 2006. http://tel.archives-ouvertes.fr/tel-00375496.
Pełny tekst źródłaI read the two volumes of the "Heller" with great interest, captivated by the ingenuity of our predecessors in elucidating how plants function. Even then, however, I felt a certain frustration, because the analysis stopped at a "descriptive" level. I dreamed of being able to go down to an "explanatory" level (I later learned this was called "molecular"). Imagining that one day we might visualize the cascade of molecular interactions leading from the application of abscisic acid to the shedding of leaves filled me with delight.
Understanding this chain of events requires grasping how each of the proteins involved functions, and these functions are carried by the proteins' three-dimensional structures. This is where everything becomes more complicated! Indeed, one need only consult the databases and compare the relative abundance of protein primary structures with that of tertiary structures to appreciate the difficulty of tackling the third dimension. Even with the progress made at the turn of the millennium, determining the 3D structures of all the proteins under study seems out of reach. One then realizes the great value of building a "Rosetta Stone" that maps primary structures, tertiary structures, and functions onto one another. Just as we already know how to translate a nucleotide sequence into an amino-acid sequence, it is essential to uncover the laws that will allow us to fold an amino-acid sequence into a three-dimensional structure. We could then obtain 3D structures in silico at little cost, and analyse their function and interactions. This is all the more important in this post-genomic era, which is seeing thousands of unknown sequences accumulate from systematic sequencing programmes.
Peterson, Brooke Ashley. "Estimation of rumen microbial protein production and ruminal protein degradation". College Park, Md. : University of Maryland, 2006. http://hdl.handle.net/1903/3865.
Pełny tekst źródłaThesis research directed by: Animal Sciences. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.