Dissertations / Theses on the topic 'Théorie a priori d’objet'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 17 dissertations / theses for your research on the topic 'Théorie a priori d’objet.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Erten, Nur. "De l'ontologie formelle à la phénoménologie. Une lecture husserlienne de la mathématisation de la connaissance." Electronic Thesis or Diss., Normandie, 2024. http://www.theses.fr/2024NORMC011.
The aim of this work is to situate formal ontology within Husserlian phenomenology. To this end, we propose a reading of formal ontology as distinct from a purely theoretical or mathematical approach, by placing it in the history of philosophy as conceived by Edmund Husserl. We begin with an analysis of the mathematization of logic, which leads us, under the dominance of mathematics, to the genesis of the idea of the theory of theories, a theory of object without object. Next, we broaden our perspective to examine the mathematization of the sciences, especially of physics. By analyzing the crisis of the sciences, we highlight the sciences' obligatory relationship with philosophy, according to Husserl. Through a historical reading from Plato to Galileo, centered on formal ontology, we show the essential criticisms Husserl addressed to the modern sciences, and why, despite their development and mathematical rigor, the empirical sciences, particularly physics, cannot found a theory of everything. Finally, our study investigates the relationship between formal ontology and material ontologies. Our analyses justify the necessity of phenomenology as the rigorous science.
Moussa, Ali Abdouramane. "Diagnostic sans modèle a priori." Electronic Thesis or Diss., Nancy 1, 2011. http://www.theses.fr/2011NAN10016.
Fault diagnosis in automatic control systems consists in tracing perceived symptoms back to their causes. Most model-based methods rest on the concept of analytical redundancy: analytical redundancy relations are equations derived from an analytical model which admit only the measured variables as inputs. Other classes of model-based methods are based directly on parameter estimation. Modern syntheses of residual generation require very detailed knowledge of the system to be controlled or diagnosed. Writing the most accurate models requires a deep understanding of the underlying mechanisms and relies on the laws of physics. The models obtained this way are called knowledge-based models. They involve physical parameters which, by definition, are measurable through experiments that are not necessarily related to how the system is used. In some practical cases, however, these parameters can be evaluated a priori. Moreover, the reliability provided by knowledge-based models usually comes with the disadvantage of excessive complexity: such models may be unusable in practice, and their complexity must often be reduced. This work contributes a new algebraic and deterministic approach to fault diagnosis which does not rely on an a priori explicit model of the process, and presents a new perspective based on the theory of distributions and the pseudospectral analysis of matrix pencils. Our approach builds on tools and developments from the theory of algebraic estimation that are usual in the automatic control community but very unconventional in signal processing.
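The analytical-redundancy idea behind model-based diagnosis can be illustrated in a few lines: when two relations (here, two sensors observing the same variable) must agree in the fault-free case, their residual should stay near zero, and a threshold on it flags a fault. This is a minimal sketch of the general concept only, not of the thesis's algebraic, distribution-theoretic method; the threshold and data are invented.

```python
# Minimal illustration of residual-based fault detection via analytical
# redundancy: two sensors measure the same variable, so their difference
# should stay near zero in the fault-free case.

def residual(y1, y2):
    """Analytical redundancy relation: r = y1 - y2 (ideally 0)."""
    return y1 - y2

def detect_fault(measurements, threshold=0.5):
    """Flag samples whose residual magnitude exceeds the threshold."""
    return [abs(residual(a, b)) > threshold for a, b in measurements]

# Two fault-free samples, then a faulty one where sensor 2 drifts.
data = [(1.0, 1.1), (2.0, 1.9), (3.0, 5.0)]
flags = detect_fault(data)
print(flags)  # only the third sample is flagged
```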
Moussa, Ali Abdouramane. "Diagnostic sans modèle a priori." Phd thesis, Université Henri Poincaré - Nancy I, 2011. http://tel.archives-ouvertes.fr/tel-00604090.
Deroy, Xavier. "A priori des modèles d'innovation et contingence massive de l'innovation en sciences de gestion." Paris, CNAM, 2003. http://www.theses.fr/2003CNAM0434.
This research first aims to clarify the literature on innovation by combining sociological and economic approaches. One of the main questions disclosed is the potential conflict between innovation and organization. It is often thought that innovation can be organized; management models of innovation rely on this assumption. Three types of models can be distinguished: integrative, statistical, and monographic. The core argument of this thesis, however, is to show that models of innovation are rooted in three implicit and questionable a priori: mimetism, the modelization of innovating action, and the scientific neutrality of these models. Hence, we argue that innovation is the site of massive contingency, despite some common but insufficient organizing principles of explanation. Innovation implies specific knowledge and the ability to create new relationships. To understand it better, we recommend that research on innovation aim to elaborate local theories adapted to specific contexts described by monographs.
Jacob, Laurent. "A priori structurés pour l'apprentissage supervisé en biologie computationnelle." Phd thesis, École Nationale Supérieure des Mines de Paris, 2009. http://pastel.archives-ouvertes.fr/pastel-00005743.
Cucu Graindorge, Tatiana. "Contribution à une méthodologie d'évaluation à priori des projets de transport urbain durable." Thesis, Bordeaux 1, 2012. http://www.theses.fr/2012BOR14488/document.
The objective of this research is to provide local authorities with a decision-aid tool to formalize a participatory approach during the conception of a sustainable urban transport project, in a multi-criteria and multi-actor context. The methodology is based on the a priori evaluation of the impacts of a local project, involving stakeholders from the diagnosis phase onward. This phase aims at identifying groups of actors according to their perception of urban phenomena, their interactions, and their stated preferences for evolution, and leads to the establishment of a list of common indicators to be evaluated. The choice of alternatives to be studied results from transferability techniques - based on projects developed in other cities - and the stated preferences of local users. The probability of using the service is evaluated with an aggregated behavioral model that takes into account the fuzzy perception and the indecision of users in a new situation. Changes in user behaviour are captured by a robustness indicator that tests the impact of exogenous parameters on the evolution of the probability of using a service. A traffic micro-simulator assesses the impacts of the various scenarios on traffic, the environment, and the welfare of citizens, which is monetized; it illustrates the costs and indirect benefits expected from the implementation of the project. A compromise solution is proposed: it aims at identifying the alternative that would best satisfy the representatives of the stakeholder groups, and not necessarily the solution that is optimal in terms of impacts.
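The "probability of using the service" produced by such behavioral models is typically a function of the utility difference between alternatives. As a purely illustrative stand-in for the thesis's aggregated model with fuzzy perception, a binary logit gives the basic shape; the utilities below are invented.

```python
import math

# Binary logit sketch: probability of adopting a new transport service
# as a function of the utility gap with the current option.

def use_probability(utility_new, utility_current):
    """Logistic choice probability P(new) = 1 / (1 + exp(-(U_new - U_cur)))."""
    return 1.0 / (1.0 + math.exp(-(utility_new - utility_current)))

print(use_probability(1.5, 0.5))  # new option preferred: about 0.73
print(use_probability(0.0, 0.0))  # indifference: exactly 0.5
```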
Grosgeorge, Damien. "Segmentation par coupes de graphe avec a priori de forme Application à l'IRM cardiaque." Phd thesis, Université de Rouen, 2014. http://tel.archives-ouvertes.fr/tel-01006467.
Zhou, Hao. "La chute du "triangle d'or" : apriorité, analyticité, nécessité : de l'équivalence à l'indépendance." Thesis, Paris 1, 2020. http://www.theses.fr/2020PA01H204.
The three concepts of apriority, analyticity and necessity, which have long been considered equivalent, constitute what could be called the "golden triangle" or "triangle of equivalence". Yet the Kantian conception of the synthetic a priori and the Kripkean conceptions of the contingent a priori and the necessary a posteriori represent decisive criticisms against this triangle of equivalence. Critically inheriting these revolutionary thoughts from Kant and Kripke, a new epistemological schema entitled "subject-knowledge-world" is here systematically constructed. This schema renders the golden triangle totally obsolete: the concepts of apriority, analyticity and necessity become independent of each other. This leads to a new space of knowledge categories, resulting from the free intersecting of the three distinctions a priori-a posteriori, analytic-synthetic and necessary-contingent. These knowledge categories, some of which are new, apply to science exclusively and exhaustively.
Righi, Ali. "Sur l'estimation de densités prédictives et l'estimation d'un coût." Rouen, 2011. http://www.theses.fr/2011ROUES002.
This thesis is divided into two parts. In the first part, we investigate predictive density estimation for a multivariate Gaussian model under Kullback-Leibler loss, focusing on the link with the problem of estimating the mean under quadratic loss. We obtain several parallel results, proving minimaxity and improved estimation results under restrictions on the unknown mean. In particular, we show, via two different paths, that the Bayesian predictive density associated with the uniform prior on a convex set C dominates the best invariant predictive density when μ ∈ C. This parallels Hartigan's 2004 result for the estimation of the mean under quadratic loss. At the end of this part, we give numerical simulations to visualize the gain obtained by some of our newly proposed estimators. In the second part, for the Gaussian model of dimension p, we treat the problem of estimating the loss of the standard estimator of the mean (that is, δ0(X) = X). We give generalized Bayes estimators which dominate the unbiased estimator of loss (that is, δ0(X) = p), through sufficient conditions for p ≥ 5. Examples illustrate the theory. We then carry out a technical study and numerical simulations on the gain reached by one of our proposed minimax generalized Bayes estimators of loss.
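The dominance phenomenon described above can be checked by hand in one dimension. Observing X ~ N(μ, 1) and predicting Y ~ N(μ, 1), the Kullback-Leibler risk of the predictive density N(X, v) has the closed form R(v) = (1/2) log v + 1/v − 1/2, independent of μ, so the flat-prior Bayes predictive N(X, 2) beats the plug-in N(X, 1). A minimal sketch, not taken from the thesis:

```python
import math

# KL risk of the predictive density N(X, v) for a future Y ~ N(mu, 1),
# after observing X ~ N(mu, 1).  Averaging the Kullback-Leibler
# divergence over X gives R(v) = (1/2) log v + 1/v - 1/2, which does
# not depend on mu and is minimized at v = 2.

def kl_risk(v):
    return 0.5 * math.log(v) + 1.0 / v - 0.5

plug_in = kl_risk(1.0)     # plug-in density N(X, 1): risk 0.5
bayes_flat = kl_risk(2.0)  # flat-prior Bayes predictive N(X, 2)
print(plug_in, bayes_flat)  # the flat-prior predictive has smaller risk
```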
Garrido, Myriam. "Modélisation des évènements rares et estimation des quantiles extrêmes, méthodes de sélection de modèles pour les queues de distribution." Phd thesis, Université Joseph Fourier (Grenoble), 2002. http://tel.archives-ouvertes.fr/tel-00004666.
Bonnaillie, Virginie. "Analyse mathématique de la supraconductivité dans un domaine à coins : méthodes semi-classiques et numériques." Phd thesis, Université Paris Sud - Paris XI, 2003. http://tel.archives-ouvertes.fr/tel-00005430.
Sui, Liqi. "Uncertainty management in parameter identification." Thesis, Compiègne, 2017. http://www.theses.fr/2017COMP2330/document.
In order to obtain more predictive and accurate simulations of mechanical behaviour in practical environments, more and more complex material models have been developed. The characterization of material properties remains a top-priority objective; it requires dedicated identification methods and tests in conditions as close as possible to the real ones. This thesis aims at developing an effective identification methodology for finding material property parameters, taking advantage of all available information. The information used for identification is theoretical, experimental, and empirical: the theoretical information is linked to the mechanical models, whose uncertainty is epistemic; the experimental information consists of full-field measurements, whose uncertainty is aleatory; the empirical information is related to the prior information, whose uncertainty is epistemic as well. The main difficulty is that the available information is not always reliable and its corresponding uncertainty is heterogeneous. This difficulty is overcome by introducing the theory of belief functions: by offering a general framework to represent and quantify the heterogeneous uncertainties, it improves the performance of the identification. A strategy based on belief functions is proposed to identify the macro and micro elastic properties of multi-structure materials; in this strategy, model and measurement uncertainties are analysed and quantified. The strategy is subsequently extended to take prior information into consideration and quantify its corresponding uncertainty.
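The basic operation of belief-function fusion is Dempster's rule of combination, which multiplies masses on intersecting focal sets and renormalizes by the conflict. A minimal sketch on a two-hypothesis frame (illustrative only; the masses are invented and this is not the identification strategy of the thesis):

```python
# Minimal sketch of Dempster's rule of combination on a two-element
# frame, the basic fusion operation of the theory of belief functions.

def dempster(m1, m2):
    """Combine two mass functions whose focal sets are frozensets."""
    combined = {}
    conflict = 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:  # non-empty intersection: transfer the product mass
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:      # empty intersection: conflicting evidence
                conflict += v1 * v2
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

A, B, AB = frozenset('A'), frozenset('B'), frozenset('AB')
m1 = {A: 0.6, AB: 0.4}          # partially committed source favouring A
m2 = {A: 0.5, B: 0.3, AB: 0.2}  # second, more committed source
print(dempster(m1, m2))  # fused masses, renormalized over the conflict
```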
Clérin, Jean-Marc. "Problèmes de contrôle optimal du type bilinéaire gouvernés par des équations aux dérivées partielles d’évolution." Thesis, Avignon, 2009. http://www.theses.fr/2009AVIG0405/document.
This thesis is devoted to the analysis of nonlinear optimal control problems governed by an evolution state equation involving a term which is bilinear in state and control. The difficulties due to nonlinearity remain, but bilinearity adds a lot of structure to the control problem under consideration. In Section 2, using Willett and Wong inequalities, we establish a priori estimates for the solutions of the state equation. These estimates allow us to prove that the state equation is well posed in the sense of Hadamard. In the case of a feedback constraint on the control, the state equation becomes a differential inclusion; under mild assumptions, such a differential inclusion is solvable. In Section 3, we prove the existence of solutions to the optimal control problem. Section 4 is devoted to the sensitivity analysis of the optimal control problem: we obtain a formula for the directional derivative of the optimal value function, which is worked out in detail for particular examples.
Grazian, Clara. "Contributions aux méthodes bayésiennes approchées pour modèles complexes." Thesis, Paris Sciences et Lettres (ComUE), 2016. http://www.theses.fr/2016PSLED001.
Recently, the great complexity of modern applications, for instance in genetics, computer science, finance, climate science, etc., has led to the proposal of new models which may realistically describe reality. In these cases, classical MCMC methods fail to approximate the posterior distribution because they are too slow to explore the full parameter space. New algorithms have been proposed to handle these situations, where the likelihood function is unavailable. We investigate several features of complex models: how to eliminate the nuisance parameters from the analysis and make inference on key quantities of interest, in both Bayesian and non-Bayesian settings, and how to build a reference prior.
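When the likelihood is unavailable but the model can be simulated, the simplest likelihood-free scheme is ABC rejection: draw a parameter from the prior, simulate data, and keep the draw if a summary statistic falls within a tolerance of the observed one. A toy sketch (not from the thesis; the model, summary statistic, and tolerance are invented):

```python
import random

# Toy ABC rejection sampler: infer the mean of a Gaussian with known
# scale without ever evaluating the likelihood, by simulating data and
# keeping parameters whose sample mean is close to the observed one.

random.seed(0)

def simulate(theta, n=50):
    """Simulate n points from N(theta, 1) and return their sample mean."""
    return sum(random.gauss(theta, 1.0) for _ in range(n)) / n

observed_mean = 2.0  # assumed summary statistic of the observed data
accepted = []
while len(accepted) < 200:
    theta = random.uniform(-10, 10)                  # draw from a flat prior
    if abs(simulate(theta) - observed_mean) < 0.2:   # tolerance epsilon
        accepted.append(theta)

posterior_mean = sum(accepted) / len(accepted)
print(posterior_mean)  # close to the observed mean
```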
Capellier, Édouard. "Application of machine learning techniques for evidential 3D perception, in the context of autonomous driving." Thesis, Compiègne, 2020. http://www.theses.fr/2020COMP2534.
The perception task is paramount for self-driving vehicles: being able to extract accurate and significant information from sensor inputs is mandatory to ensure safe operation. Recent progress in machine-learning techniques has revolutionized the way perception modules for autonomous driving are developed and evaluated, vastly surpassing previous state-of-the-art results in practically all perception-related tasks. Efficient and accurate ways to model the knowledge used by a self-driving vehicle are therefore needed; indeed, self-awareness and an appropriate modeling of doubt are desirable properties for such a system. In this work, we assumed that evidence theory is an efficient way to finely model the information extracted from deep neural networks. Based on these intuitions, we developed three perception modules that rely on machine learning and evidence theory, and tested them on real-life data. First, we proposed an asynchronous evidential occupancy grid mapping algorithm that fuses semantic segmentation results obtained from RGB images with LIDAR scans. Its asynchronous nature makes it particularly efficient at handling sensor failures. The semantic information is used to define decay rates at the cell level and to handle potentially moving objects. Then, we proposed an evidential classifier of LIDAR objects. This system is trained to distinguish between vehicles and vulnerable road users, which are detected via a clustering algorithm. The classifier can be reinterpreted as fusing simple evidential mass functions. Moreover, a simple statistical filtering scheme can be used to reject outputs of the classifier that are incoherent with regard to the training set, allowing the classifier to work in an open world and reject other types of objects. Finally, we investigated the possibility of performing road detection in LIDAR scans with deep neural networks.
We proposed two architectures inspired by recent state-of-the-art LIDAR processing systems. A training dataset was acquired and labeled in a semi-automatic fashion from road maps. A set of fused neural networks reaches satisfactory results, which allowed us to use them in an evidential road mapping and object detection algorithm that runs at 10 Hz.
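The cell-level decay rates mentioned above correspond, in belief-function terms, to discounting: at every step a fraction of each cell's mass is transferred back to ignorance, faster for cells carrying a "moving object" semantic label. A hedged sketch of that one operation (the rates, labels, and masses are invented, not taken from the thesis):

```python
# Sketch of evidential discounting, the kind of operation used to make
# occupancy-grid cells "forget" old evidence at a semantics-dependent
# decay rate.

FRAME = frozenset({'free', 'occupied'})  # frame of discernment

def discount(mass, alpha, frame=FRAME):
    """Keep a fraction alpha of each mass; send the rest to ignorance."""
    out = {s: alpha * v for s, v in mass.items() if s != frame}
    out[frame] = (1.0 - alpha) + alpha * mass.get(frame, 0.0)
    return out

cell = {frozenset({'occupied'}): 0.9, FRAME: 0.1}
# A cell labeled as a moving object decays faster than a static one.
print(discount(cell, alpha=0.5))   # moving object: fast decay
print(discount(cell, alpha=0.95))  # static structure: slow decay
```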
Vaiter, Samuel. "Régularisations de Faible Complexité pour les Problèmes Inverses." Phd thesis, Université Paris Dauphine - Paris IX, 2014. http://tel.archives-ouvertes.fr/tel-01026398.
Luu, Duy Tung. "Exponential weighted aggregation: oracle inequalities and algorithms." Thesis, Normandie, 2017. http://www.theses.fr/2017NORMC234/document.
In many areas of statistics, including signal and image processing, high-dimensional estimation is an important task for recovering an object of interest. However, in the overwhelming majority of cases, the recovery problem is ill-posed. Fortunately, even if the ambient dimension of the object to be restored (signal, image, video) is very large, its intrinsic "complexity" is generally small, and this prior information can be introduced through two approaches: (i) penalization (very popular) and (ii) aggregation by exponential weighting (EWA). The penalized approach seeks an estimator that minimizes a data-loss function penalized by a term promoting objects of low (simple) complexity. EWA combines a family of pre-estimators, each associated with a weight that exponentially promotes the same objects of low complexity. This manuscript consists of two parts, one theoretical and one algorithmic. In the theoretical part, we first propose EWA with a new family of priors promoting analysis-group sparse signals, whose performance is guaranteed by oracle inequalities. Next, we analyse the penalized estimator and EWA, with a general prior promoting simple objects, in a unified framework establishing two types of theoretical guarantees: (i) prediction oracle inequalities, and (ii) estimation bounds. We exemplify them for particular cases, some of which have been studied in the literature. In the algorithmic part, we propose an implementation of these estimators that combines Monte-Carlo simulation (Langevin diffusion) with proximal splitting algorithms, and prove its convergence guarantees. Several numerical experiments illustrate our theoretical guarantees and our algorithms.
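The exponential weighting in EWA can be sketched in a few lines: each pre-estimator receives a weight proportional to exp(−loss/β), so the aggregate concentrates on low-loss members of the family. The estimators, losses, and temperature below are invented for illustration; the thesis's estimator uses a structured sparsity prior rather than raw losses.

```python
import math

# Toy exponential weighted aggregation (EWA): combine pre-estimators
# with weights proportional to exp(-loss / beta), so low-loss
# estimators dominate the aggregate.

def ewa(estimates, losses, beta=1.0):
    """Return the weighted aggregate and the normalized weights."""
    weights = [math.exp(-l / beta) for l in losses]
    total = sum(weights)
    weights = [w / total for w in weights]
    return sum(w * e for w, e in zip(weights, estimates)), weights

# Three pre-estimators of the same quantity with their empirical losses.
aggregate, weights = ewa([1.0, 2.0, 10.0], losses=[0.1, 0.2, 5.0])
print(aggregate, weights)  # dominated by the two low-loss estimators
```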