Dissertations on the topic "Technique de référence"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 36 dissertations for research on the topic "Technique de référence".
Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference for the selected work will be formatted automatically in the citation style you need (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the academic publication in PDF format and read an online annotation of the work if the relevant parameters are available in the metadata.
Browse dissertations from a wide range of disciplines and compile your bibliography correctly.
Scache, Daniel. „Référence technique et classe-laboratoire de sciences physique en lycée professionnel : Contribution à la didactique en formation professionnelle“. Lille 1, 1993. http://www.theses.fr/1993LIL12007.
Noël, Françoise. „Famille. . . Je vous aime! : la référence familiale dans l'éducation en internat des enfants cas sociaux“. Paris 10, 1985. http://www.theses.fr/1985PA100224.
Dagnan-Laville, Agnès. „Aide à la conception intégrée basée sur un modèle de référence et la coopération de plusieurs modes de raisonnement“. Châtenay-Malabry, Ecole centrale de Paris, 1999. http://www.theses.fr/1999ECAP0645.
Laroche, Florent. „Contribution à la sauvegarde des objets techniques anciens par l'archéologie industrielle avancée : proposition d'un modèle d'information de référence muséologique et d'une méthode inter-disciplinaire pour la capitalisation des connaissances du patrimoine technique et industriel“. Nantes, 2007. http://www.theses.fr/2007NANT2120.
For optimizing its added-value creation, an enterprise continuously adjusts its operating modes and its production tools. Machines considered obsolete as soon as they no longer meet demand are stopped, stored and very often dismantled. As a consequence, industrial sites disappear and the workers, creators of knowledge, leave the industrial world with their know-how. Safeguarding, analyzing and understanding these objects belonging to a past heritage can convert them into a present capital; sometimes they can even become a source of innovation, helping to prepare our future and assisting industry in creating the objects of tomorrow. In addition, concerning their conservation and popularization in museums and on sites, the aging of technical information requires a new kind of museology for the third millennium. The methodology developed consists in reversing the design time axis: it is the Reverse Process of Heritage Design. Advanced Industrial Archaeology allows the elaboration of the Technical Heritage File by capitalizing knowledge from the past into a digital medium and a virtual simulation state for museography and valorization. As the complexity of the objects requires a multiplicity of competences, the Heritage Cooperative Engineering Process requires the collaboration of experts who usually do not work together. Therefore, a new kind of interdisciplinary team is appearing, which needs a common base allowing different semantics to coexist. Consequently, the proposed Information System encapsulates this taxonomy of people, also taking into account the actors of the past. The model defines the Heritage Technical Object in its internal and external aspects. The Digital Heritage Reference Model, DHRM, can then be considered a new tool for museology. In this research, numerous technical case studies are analyzed and conceptualized so as to build the ontologies and consequently validate the applicability of the DHRM.
Laroche, Florent. „Contribution à la sauvegarde des Objets techniques anciens par l'Archéologie industrielle avancée. Proposition d'un Modèle d'information de référence muséologique et d'une Méthode inter-disciplinaire pour la Capitalisation des connaissances du Patrimoine technique et industriel“. Phd thesis, Ecole centrale de nantes - ECN, 2007. http://tel.archives-ouvertes.fr/tel-00382703.
The methodology developed consists in reversing the design time axis: it is the Reverse Process of Heritage Design. Advanced Industrial Archaeology allows the Technical Heritage File to be built by capitalizing knowledge from the past in digital form and by virtually placing it back in its situation of use, for museography and valorization purposes. Since the complexity of the objects calls for a multiplicity of skills, the Cooperative Heritage Engineering Process requires the collaboration of professions that, until now, collaborated little or not at all. A new kind of interdisciplinary team therefore emerges, for which a common frame of reference allows the different semantics to coexist. The proposed Information System thus encapsulates this taxonomy of people, also taking the actors of the past into account. The model defines the technical object of heritage value through its internalist and externalist aspects. The Digital Heritage Reference Model (DHRM), or Dossier d'Oeuvre Patrimonial Numérique de Référence, can then be regarded as a new working tool for museology.
In this research work, numerous case studies are analyzed and conceptualized in order to build the ontologies, thereby validating the applicability of the DHRM.
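As a very rough illustration of the kind of reference model described in the two entries above, the following Python sketch represents a heritage technical object through internal and external aspects; all field names here are hypothetical assumptions and are not taken from the DHRM itself.

    from dataclasses import dataclass, field

    @dataclass
    class HeritageTechnicalObject:
        """Toy stand-in for a DHRM-like record: internal aspects describe what the
        object is and how it works, external aspects describe who built and used it
        and in which context. Field names are illustrative assumptions."""
        name: str
        internal_aspects: dict = field(default_factory=dict)   # e.g. geometry, kinematics, materials
        external_aspects: dict = field(default_factory=dict)   # e.g. actors, usages, historical context
        sources: list = field(default_factory=list)            # e.g. archives, 3-D scans, testimonies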
Razafiarison, Tahiry Anja. „La responsabilité médicale à Madagascar : Réalités internes et proposition d'actualisation en référence au droit médical français“. Thesis, Poitiers, 2013. http://www.theses.fr/2013POIT3003/document.
To meet the expectations of Malagasy society, a legal act on medical liability should be consistent with the local conception of medical practice. In Madagascar, public opinion is convinced that medical practice involves a divine action beyond the physician's control, so the legal concept of therapeutic risk is readily accepted. The physician is only required to ensure the continuity of care and to provide his or her best care to patients. At the same time, Malagasy culture shows great compassion towards people suffering from bodily injuries, especially when these injuries result from malpractice. To ward off divine punishment, doctors usually make a donation to their injured patient, which helps preserve the physician's reputation and his or her relationship with the patient. Medical duty in Madagascar is thus more a matter of acknowledging the patient's pain than a compensation procedure. The French legal system is different in that it promotes full compensation in cases of malpractice. However, both legal systems are similar in proposing alternative processes to resolve conflicts.
Neunreuther, Éric. „Contribution à la modélisation des systèmes intégrés de production à intelligence distribuée : application à la distribution du contrôle et de la gestion technique sur les équipements de terrain“. Nancy 1, 1998. http://www.theses.fr/1998NAN10183.
Zhang, Xin. „Contribution à l’ingénierie du changement dans les projets de développement de produits : modèle de référence et simulation par système multi-agents“. Thesis, Bordeaux 1, 2013. http://www.theses.fr/2013BOR14892/document.
Der volle Inhalt der QuelleThe overall goal of this Ph.D. research is to provide reference models, support me- thods and tools that simulate change propagations in a Product Development (PD) project to assist decision-makings. We firstly establish a change analysis framework of modeling the context of change occurrence and propagation by taking into account the multiple knowledge areas of PD project simultaneously. Under the framework, we propose the conceptual models of change occurrence and change propagation that pro- vide a qualitative method to identify change and change propagation and imply some characteristics of change propagations. Relying on that, we suggest the procedures of building up the change propagation networks. Within the network, we propose the methodology of simulating change propagations and then present the process of im- plementing the methodologies and the models as a software prototype by using multi- agent based technology
Albezzawy, Muhammad Nabil Mustafa. „Advanced signal processing methods for source identification using references“. Electronic Thesis or Diss., Lyon, INSA, 2024. http://www.theses.fr/2024ISAL0074.
Rank-reduced reference/coherence techniques based on the use of references, i.e. fixed sensors, are widely used to solve the two equivalent problems of source extraction and resynchronization encountered during remote sensing of physical fields, when the number of references surpasses the number of incoherent sources. In such a case, the cross-spectral matrix (CSM) becomes ill-conditioned, which invalidates the least squares (LS) solution. Although the truncated singular value decomposition (TSVD) has been successfully applied in the literature to solve this problem, its validity is limited to the case of scalar noise on the references. It is also very difficult to define a truncation threshold when the singular values decrease gradually. This thesis proposes a solution based on finding a set of virtual references that is maximally correlated with the field measurements, called the maximally-coherent reference (MCR) technique. This solution is optimal, in particular, in the case of correlated noise on the references, where the TSVD fails. However, the technique also includes an eigenvalue truncation step, similar to the one required for the TSVD, which necessitates a priori knowledge or estimation of the number of incoherent sources, i.e. source enumeration, an ill-posed inverse problem insufficiently investigated in the literature within the framework of reference techniques. In this thesis, after providing a unified formalism for all the reference techniques in the literature, three alternative source enumeration methods, applicable to all reference techniques, are presented: a direct likelihood ratio test (LRT) against the saturated model, a parametric bootstrap technique and a cross-validation approach. A comparative study of the three methods is performed on simulated numerical data, real sound experimental data and real electric motor data. The results showed two important outcomes. The first is that the number of snapshots (spectral windows) used in the spectral analysis greatly affects the performance of the three methods, and that they behave differently for the same number of snapshots. The second is that parametric bootstrapping turned out to be the best method in terms of both estimation accuracy and robustness with regard to the number of snapshots used. Finally, the MCR technique combined with bootstrapping was employed for source extraction and resynchronization of real data from laboratory experiments and an e-motor, and it returned better results than the LS solution and the TSVD employed for the same purpose.
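As a generic illustration of the rank-reduced reference idea discussed above, the NumPy sketch below extracts the part of a measurement that is coherent with a set of reference signals by truncating the decomposition of the reference matrix; the time-domain covariances stand in for the cross-spectral matrices of the frequency-domain formulation, and this is not the MCR technique or the enumeration methods proposed in the thesis.

    import numpy as np

    def coherent_part(refs, meas, n_sources):
        """Estimate the part of `meas` (1-D array) coherent with the reference
        signals `refs` (n_refs x n_samples) via a rank-truncated least-squares
        fit; `n_sources` is the assumed number of incoherent sources."""
        refs = np.asarray(refs)
        meas = np.asarray(meas)
        n = refs.shape[1]
        Srr = refs @ refs.conj().T / n              # reference covariance ("CSM") estimate
        Sry = refs @ meas.conj() / n                # cross terms between references and measurement
        U, s, _ = np.linalg.svd(Srr)
        Uk, sk = U[:, :n_sources], s[:n_sources]
        h = Uk @ ((Uk.conj().T @ Sry) / sk)         # truncated pseudo-inverse applied to Sry
        return h.conj() @ refs                      # coherent part resynthesised from the references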
Heristchi, Vincent. „Neige électronique - L'effet vidéo“. Thesis, Paris 3, 2010. http://www.theses.fr/2010PA030105.
Video is currently used on television and as an amateur format, and has been for a few decades. Its mixing with feature films shot on traditional 16 or 35 mm is more occasional, even though it can offer new aesthetic possibilities. By keeping the particular texture of video, in a mix and a confrontation of formats, an effect can occur. The creation of that “video effect” is the subject of this study. What can a video image [defined and readable as video] create when the reference format of a fiction film or documentary is celluloid? To that end, films directed by Robert Kramer, William Klein, Michael Haneke, Fritz Lang, Jean-Luc Godard, Atom Egoyan, David Cronenberg, David Lynch, Abbas Kiarostami, Abel Ferrara, Wim Wenders, Chris Marker or Jean-Daniel Pollet are investigated. These images determine a particular video effect [television, CCTV or amateur-intimate], and its constituent elements can be altered. Video can also lose all narrative justification and influence the audience's emotions through its texture alone, in order eventually to « make the image » [in reference to Samuel Beckett's short story]. Finally, by creating instability in the nature of its constitution, the video effect can contain the openness of a poetic whisper.
Lombard, Grégori Muriel. „Contribution au génie productique : prototypage d'une architecture d'ingénierie concourante des systèmes intégrés de fabrication manufacturière“. Nancy 1, 1994. http://www.theses.fr/1994NAN10054.
Carnec, Mathieu. „Critères de qualité d'images couleur avec référence réduite perceptuelle générique“. Nantes, 2004. http://www.theses.fr/2004NANT2044.
Digital images are widely used in information media. For images intended for human observers, it is necessary to have methods that can assess image quality, in order to adapt image processing (image compression, image enhancement) and to measure the quality of transmission services. To be effective, these methods (called "quality criteria") must produce quality scores closely correlated with the subjective quality scores given by human observers during subjective quality evaluation tests. We propose new quality criteria based on a functional and organisational model of the human visual system. This model describes the different stages of vision, from the eye to the visual cortex, which contains the V1 and V2 areas, the ventral pathway and the dorsal pathway. The novelty of these quality criteria resides in the extraction, from an image represented in a perceptual space, of features comparable to the ones used by the human visual system. A similarity measure between features from an original image and features from a test image (whose quality is to be assessed) is used to produce the quality score. Several similarity measures have been tested, each one using different features and thus leading to a particular quality criterion. These features constitute a reduced reference of the image which can, in a transmission context, be transmitted with the distorted image so that the quality of the latter can be assessed. Results on two databases of scored images (containing both the images and their subjective quality scores) show a strong correlation between some of the tested similarity measures and the subjective scores. The size of the reduced reference is flexible, and quantising it lowers its size further while preserving good performance. Our work has been integrated into two applications, one to assess image quality and one to compress images while controlling the quality of the output images.
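The reduced-reference principle, a few features transmitted alongside the image and compared at the receiver, can be illustrated with a deliberately crude sketch; the histogram feature and the similarity mapping below are placeholders, not the perceptual features used in the thesis.

    import numpy as np

    def reduced_reference(img, bins=16):
        """Toy reduced reference for an 8-bit grayscale image: a coarse luminance
        histogram, i.e. a handful of numbers that can travel with the image."""
        hist, _ = np.histogram(img, bins=bins, range=(0, 255), density=True)
        return hist

    def similarity(ref_features, test_features):
        """Map the distance between reduced references to a score in (0, 1]."""
        return 1.0 / (1.0 + np.abs(np.asarray(ref_features) - np.asarray(test_features)).sum())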
Barland, Rémi. „Évaluation objective sans référence de la qualité perçue : applications aux images et vidéos compressées“. Nantes, 2007. http://www.theses.fr/2007NANT2028.
The conversion to all-digital technology and the development of multimedia communications produce an ever-increasing flow of information. This massive increase in the quantity of data exchanged generates a progressive saturation of the transmission networks. To deal with this situation, compression standards seek to exploit spatial and/or temporal correlation more and more in order to reduce the bit rate. The resulting reduction of information creates visual artefacts which can deteriorate the visual content of the scene and thus cause trouble for the end-user. In order to offer the best broadcasting service, the assessment of perceived quality is therefore necessary. Subjective tests, which are the reference method for quantifying the perception of distortions, are expensive, difficult to implement and inappropriate for on-line quality assessment. In this thesis, we are interested in the most widely used compression standards (image or video) and have designed no-reference quality metrics based on the most annoying visual artefacts, such as the blocking, blurring and ringing effects. The proposed approach is modular and adapts to the considered coder and to the required trade-off between computational cost and performance. For low complexity, the metric quantifies the distortions specific to the considered coder, exploiting only the properties of the image signal. To improve performance, at the cost of some additional complexity, it also integrates cognitive models simulating the mechanisms of visual attention. The generated saliency maps are then used to refine the distortion measures that are purely based on the image signal.
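A minimal example of a no-reference artefact measure of the kind mentioned above is a blockiness estimate comparing luminance discontinuities on 8x8 block boundaries with those inside the blocks; this sketch is purely indicative and is not the metric developed in the thesis.

    import numpy as np

    def blockiness(img, block=8):
        """Crude no-reference blocking measure for a grayscale image (2-D array):
        ratio of the mean horizontal gradient across block boundaries to the mean
        gradient inside blocks; values well above 1 suggest visible blocking."""
        img = np.asarray(img, dtype=float)
        dh = np.abs(np.diff(img, axis=1))                 # horizontal luminance gradients
        at_boundaries = dh[:, block - 1::block].mean()    # gradients across block boundaries
        inside = np.delete(dh, np.s_[block - 1::block], axis=1).mean()
        return at_boundaries / (inside + 1e-12)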
Yao, Kunliang. „Estimation de la nutation de la Terre par les techniques VLBI et GPS“. Paris 6, 2013. http://www.theses.fr/2013PA066206.
The purpose of this thesis is to use, independently, observations obtained by the VLBI and GPS techniques to determine the nutation of the Earth’s rotation axis at all frequencies with the best possible accuracy. After presenting the reference systems, timescales, parameters and models, the manuscript describes the methodology of this thesis, the analysis of the observations and the results. In the case of VLBI, we have studied the effects of the reference system and of the precession-nutation model in order to optimize the precession-nutation determination; we have analyzed 5069 observations covering a period of more than 30 years and then estimated corrections to the amplitudes of the nutation terms. In the case of GPS, we have first estimated the polar motion and nutation corrections using the IERS C04 solution as a priori. Then, new software has been developed in Matlab to estimate the time derivative of the nutation. The numerical integration of the satellite orbits is computed in a reference system defined by the position of the pole and the origin on the equator at the beginning of the arc, which helps to minimize the influence of the precession-nutation a priori values. Analyzing 315 360 observations obtained by about 110 stations every 300 s, we have determined the time derivatives of the coordinates of the celestial pole every 6 h for 3 years from 1 January 2009. We have then derived the corrections to nutation terms with periods shorter than 20 days, for the first time by the GPS technique alone, with a precision of the order of 10 μas.
Appert, Damien. „Conception et évaluation de techniques d'interaction non visuelle optimisées pour de la transmission d'information“. Thesis, Toulouse 3, 2016. http://www.theses.fr/2016TOU30095/document.
In situations where visual perception is strongly constrained or deficient, it is necessary to make information perceptible in a non-visual form while taking human sensory and memory capacities into account. For example, a blind person wishing to learn an itinerary must read it in a non-visual form and memorize it. However, beyond the hardware aspect, the implementation of non-visual alternatives still runs up against the cognitive abilities of the user (comprehension, memorization, integration of various pieces of information, etc.). The purpose of this thesis is to contribute to the design of interaction techniques that optimize the non-visual transmission of information. To this end, I explored multimodality as a means of optimization, making it possible to go beyond memorization limits. I focused on the study of interaction techniques based on auditory and tactile modalities, minimizing the use of speech, in order to develop techniques suited to different environments (flexibility), optimize the use of perceptual channels (exploiting the properties of sound in audio messages to transmit more information, for example), avoid limiting the techniques by language or comprehension barriers and, finally, explore alternatives to the synthesised voice alone. The work of this thesis led to the design, implementation and evaluation of non-visual and multiform interaction techniques in response to different contexts, in particular the transmission of information of type (pair of coordinates) and (sequence of direction-distance couples). To design these interactions, I carried out a literature review in order to extract the main design factors of interaction techniques dedicated to the non-visual transmission of information. I then organized these factors into an analytical framework on which I relied to design each of the techniques. Three separate experiments were conducted to evaluate the influence of the design factors on the effectiveness of the interactions and on user satisfaction, among them the involvement of users (active or passive), the presence of explicit help, the transmission of several pieces of information in parallel, the main modality used and the type of coding in which the information is encoded.
Wöppelmann, Guy. „Rattachement géodésique des marégraphes dans un système de référence mondial par techniques de géodésie spatiale“. Observatoire de Paris (1667-....), 1997. https://hal.science/tel-02071389.
Etame Etame, Thierry. „Conception de signaux de référence pour l'évaluation de la qualité perçue des codeurs de la parole et du son“. Rennes 1, 2008. http://www.theses.fr/2008REN1S112.
Subjective assessment is the most reliable way to determine the overall perceived voice quality of network equipment, such as digital codecs. Reference conditions are useful in subjective tests to provide anchors so that results from different tests can be compared. The Modulated Noise Reference Unit (MNRU) provides a simulated and calibrated degradation qualitatively similar to the quantization distortion of waveform codecs. The introduction of new technologies for telecommunication services brings new types of distortions, so the MNRU is no longer representative of current degradations. The purpose of our work is to produce a reference system that can simulate and calibrate the current degradations of speech and audio codecs. The first step consists in producing the multidimensional perceptual space underlying the perception of current degradations. The characterization of these perceptual dimensions should then help to simulate and calibrate similar degradations.
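For reference, the MNRU mentioned above adds speech-correlated noise at a prescribed signal-to-modulated-noise ratio Q; a simplified, unfiltered Python version might look as follows (the standardized unit, ITU-T P.810, also band-limits the result, which is omitted here).

    import numpy as np

    def mnru(speech, q_db, seed=0):
        """Simplified Modulated Noise Reference Unit: returns the input speech plus
        speech-modulated Gaussian noise scaled for a signal-to-modulated-noise
        ratio of about q_db decibels. The filtering steps of the standard are omitted."""
        rng = np.random.default_rng(seed)
        noise = rng.standard_normal(len(speech))
        return np.asarray(speech, dtype=float) * (1.0 + 10.0 ** (-q_db / 20.0) * noise)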
Ramin, Nicolas. „Vers une métrique sans référence de la qualité spatiale d'un signal vidéo dans un contexte multimédia“. Nantes, 2009. http://www.theses.fr/2009NANT2070.
The various services of real-time video communication over packet networks still do not guarantee the quality of the delivered signals. Quality evaluation therefore proves necessary in the design, optimization and control of robust communication chains. In this respect, subjective evaluation is naturally acknowledged as the most reliable approach. However, subjective testing cannot be resorted to when it comes to real-time measurement anywhere in the data flow. We can then infer the need for automatic quality evaluation tools and especially the necessity of relying on the delivered signal only: « no-reference quality metrics following a signo-perceptual approach ». The analysis of existing methods shows that predicting the spatial quality of video signals still represents a substantial effort. Our contribution to this effort takes the form of several psychophysical studies, models and quality metrics.
Altamimi, Zuheir. „Combinaison de techniques spatiales pour la détermination et la maintenance d'un système de référence terrestre centimétrique“. Observatoire de Paris, 1990. https://tel.archives-ouvertes.fr/tel-01958572.
Sossa, Dorothé. „Techniques et moyens juridiques internationaux de lutte contre la corruption politique : (avec référence spéciale à l'Afrique subsaharienne)“. Thesis, University of Ottawa (Canada), 1991. http://hdl.handle.net/10393/7622.
Coulot, David. „Télémétrie laser sur satellites et combinaison de techniques géodésiques : contributions aux systèmes de référence terrestres et applications“. Phd thesis, Observatoire de Paris, 2005. http://tel.archives-ouvertes.fr/tel-00069016.
Coulot, David. „Télémétrie laser sur satellites et combinaison de techniques géodésiques : contributions aux systèmes de référence terrestres et applications“. Phd thesis, Observatoire de Paris (1667-....), 2005. https://theses.hal.science/tel-00069016.
Munoz-Olivas, Riansares. „Spéciation du sélénium par techniques de couplage et détection par ICP/MS : application à la production des matériaux de référence“. Bordeaux 1, 1996. http://www.theses.fr/1996BOR10553.
Ouni, Sonia. „Evaluation de la qualité des images couleur. Application à la recherche & à l'amélioration des images“. Thesis, Reims, 2012. http://www.theses.fr/2012REIMS034.
Research on the objective quality assessment of color images has seen renewed interest in recent years. The work is primarily driven by the advent of digital pictures and by additional needs in image coding (compression, transmission, recovery, indexing, ...). So far the best evaluation remains visual, hence subjective, whether through psychophysical techniques or expert evaluation. It is therefore useful, even necessary, to establish objective criteria that automatically produce quality scores as close as possible to those given by subjective evaluation. We first propose a new full-reference metric for assessing the quality of color images, called overall Delta E, which is based on color appearance and incorporates features of the human visual system (HVS). Its performance was measured in two application areas, compression and restoration, and the experiments show a significant correlation between its results and subjective assessment. We then propose a new no-reference quality assessment approach for color images based on neural networks: given the multidimensional nature of image quality, quality is quantified through a set of attributes forming the UN (Utility, Naturalness) descriptor. Utility reflects sharpness and clarity, while naturalness reflects brightness and color. To model the color criterion, three no-reference metrics were defined to detect the dominant color in the image, the proportion of that color and its spatial dispersion. The approach relies on neural networks to mimic HVS perception; two variants (direct and progressive) were tried, and the results showed the better performance of the progressive variant. The proposed approach was applied in two areas. In the context of restoration, it served as a stopping criterion for automatic restoration algorithms and was used in a quality estimation system to automatically detect the type of degradation contained in an image. In the context of indexing and image retrieval, it was used to introduce the quality of the images in the database as an index; the experimental results showed that the performance of content-based image retrieval improves when this index is used or when the results are refined with the quality criterion.
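As a baseline for the overall Delta E idea, a plain mean CIE76 color difference over an image pair already converted to CIELAB can be computed as below; the thesis metric additionally models color appearance and HVS properties, which this sketch ignores.

    import numpy as np

    def mean_delta_e(lab_ref, lab_test):
        """Mean CIE76 color difference between two images given as H x W x 3
        CIELAB arrays (the RGB-to-Lab conversion is assumed done elsewhere)."""
        diff = np.asarray(lab_ref, dtype=float) - np.asarray(lab_test, dtype=float)
        return np.sqrt((diff ** 2).sum(axis=-1)).mean()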
Pollet, Arnaud. „Combinaison des techniques de géodésie spatiale : contributions aux réalisations des systèmes de référence et à la détermination de la rotation de la Terre“. Observatoire de Paris (1667-....), 2011. https://hal.science/tel-02094987.
This PhD thesis deals with the combination of observations provided by the space geodetic techniques DORIS, GPS, SLR and VLBI. Such combinations are currently under investigation, especially in the framework of the IERS working group COL (Combination at the Observation Level). In order to obtain the best possible results with this approach, a homogeneous combined terrestrial frame is needed, and effort has been made here to obtain the best possible realization of the terrestrial reference system. To achieve this goal, I have tested several approaches to combination at the observation level. A new combination model is proposed, which allows a homogeneous frame to be obtained. The contribution of the local ties between co-located stations and their impact on the homogeneity of the weekly combined frames are also analysed. To do so, I have adapted a GPS data processing scheme based on sub-networks in order to obtain a dense GPS network and a large number of co-located stations. To strengthen the links between the techniques, the use of common zenithal tropospheric delays and of space ties via multi-technique satellites is studied, and their relevance is demonstrated. Finally, a combination at the observation level is performed for the year 2005. This work has also produced EOP and station position time series which take advantage of the consistency of the processing and of the best qualities, in terms of temporal resolution and accuracy, of each technique used in the combination.
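For readers unfamiliar with combination ideas, the textbook inverse-variance weighting of independent estimates of the same parameter is sketched below; an actual combination at the observation level, as studied in the thesis, works on the raw observations and normal equations rather than on pre-computed estimates.

    import numpy as np

    def combine_estimates(values, sigmas):
        """Inverse-variance weighted mean of independent estimates of one parameter
        (e.g. a station coordinate or an Earth orientation parameter obtained by
        different techniques), with the formal uncertainty of the combination."""
        w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
        values = np.asarray(values, dtype=float)
        return float(np.sum(w * values) / np.sum(w)), float(np.sqrt(1.0 / np.sum(w)))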
Salido-Ruiz, Ricardo Antonio. „Problèmes inverses contraints en EEG : applications aux potentiels absolus et à l'influence du signal de référence dans l'analyse de l'EEG“. Electronic Thesis or Diss., Université de Lorraine, 2012. http://www.theses.fr/2012LORR0403.
This thesis concerns the pre-processing of scalp EEG signals and focuses on the signal disturbances caused by non-zero reference measurements. These perturbations, induced by electrical fluctuations of the reference signal, can lead to misinterpretation in certain analyses; this is easily seen in inter-signal synchronization measurements such as coherence studies. The ideal reference is therefore a null reference. In this work, we focused on estimating the absolute (zero-reference) potentials through an inverse problem reformulation. Two cases are treated: in the first, the reference signal is sufficiently distant from the electrophysiological brain sources to be considered an independent signal; otherwise, it is modelled as a linear combination of sources. Thanks to this modelling, it is shown explicitly that, without any a priori information, the best estimates of the absolute potentials are the average-reference potentials. The source-independent reference inverse problem is solved in a source separation context; for this case, it is shown that the best estimate of the absolute potentials without any a priori information is equivalent to the Minimum Power Distortionless Response/Minimum Variance Distortionless Response (MPDR/MVDR) estimators. Regarding the pre-processing of EEG data, we show on simulated and real signals that transforming the measured potentials to the average reference improves certain analysis methods used in EEG, such as blind source separation (BSS) and brain source localization. Beyond reference problems, this method can be applied as a constrained source estimation algorithm in order to estimate particular sources, such as artifacts or deterministic exogenous electrical stimulation, in a more robust way.
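The practical consequence highlighted above, namely that without prior information the average reference is the best estimate of the absolute potentials, amounts to a one-line operation on a channels-by-samples EEG array, as this small sketch shows.

    import numpy as np

    def to_average_reference(eeg):
        """Re-reference EEG data (channels x samples) to the common average by
        subtracting, at every sample, the mean over all channels."""
        eeg = np.asarray(eeg, dtype=float)
        return eeg - eeg.mean(axis=0, keepdims=True)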
Roussel, Bénédicte. „Détection des anticorps anti-jo-1, anti-mitochondries de type 2 et anti-ribosomes : comparaison du dot-blot avec les techniques de références“. Paris 5, 1996. http://www.theses.fr/1996PA05P012.
Salido-Ruiz, Ricardo Antonio. „Problèmes inverses contraints en EEG : applications aux potentiels absolus et à l'influence du signal de référence dans l'analyse de l'EEG“. Thesis, Université de Lorraine, 2012. http://www.theses.fr/2012LORR0403/document.
Gao, Bo. „Contribution à la synthèse de commandes référencées vision 2D multi-critères“. Phd thesis, Université Paul Sabatier - Toulouse III, 2006. http://tel.archives-ouvertes.fr/tel-00119789.
Borgetto, Manon. „Contribution à la construction de mosaïques d'images sous-marines géo-référencées par l'introduction de méthodes de localisation“. Phd thesis, Université du Sud Toulon Var, 2005. http://tel.archives-ouvertes.fr/tel-00009564.
Folio, David. „Stratégies de commande référencées multi-capteurs et gestion de la perte du signal visuel pour la navigation d’un robot mobile“. Toulouse 3, 2007. http://www.theses.fr/2007TOU30253.
Recent improvements in sensors have given rise to sensor-based control, which allows various and accurate navigation tasks to be performed. This thesis aims at developing sensor-based control laws allowing a mobile robot to perform vision-based tasks amidst possibly occluding obstacles. Indeed, it is necessary to preserve not only the robot's safety (i.e. non-collision) but also the visibility of the visual features. We therefore first proposed techniques able to fulfil these two objectives simultaneously. However, avoiding both collisions and occlusions often over-constrains the navigation task, reducing the range of realisable missions. This is why we developed a second approach which allows the visual features to be lost if this is necessary for the task realisation. Using the link between vision and motion, we proposed different methods (analytical and numerical) to compute the visual signal as soon as it becomes totally unavailable.
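The link between vision and motion exploited above is classically written s_dot = L v, where L is the interaction matrix and v the camera velocity; a generic Euler-integration sketch for predicting the features while they are not visible is given below, and it is not the specific analytical or numerical estimators developed in the thesis.

    import numpy as np

    def predict_features(s, L, v, dt, steps):
        """Propagate image features s during a loss of visibility by integrating
        s_dot = L @ v with explicit Euler steps; L (interaction matrix) and the
        camera velocity v are assumed constant over the prediction horizon."""
        s = np.asarray(s, dtype=float).copy()
        L = np.asarray(L, dtype=float)
        v = np.asarray(v, dtype=float)
        for _ in range(steps):
            s += dt * (L @ v)
        return s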
Monnin, Alexandre. „Vers une philosophie du Web : le Web comme devenir-artefact de la philosophie (entre URIs, tags, ontologie (s) et ressources)“. Phd thesis, Université Panthéon-Sorbonne - Paris I, 2013. http://tel.archives-ouvertes.fr/tel-00879147.
Mathé, Anne-Cécile. „Jeux et enjeux de langage dans la construction d'un vocabulaire de géométrie spécifique et partagé en cycle 3 (Analyse de la portée des jeux de langage dans un Atelier de géométrie en cycle 3 et modélisation des gestes de l'enseignant en situation)“. Phd thesis, Université Claude Bernard - Lyon I, 2006. http://tel.archives-ouvertes.fr/tel-00345659.
Der volle Inhalt der QuelleGrâce à l'élaboration d'une méthodologie originale d'analyse des gestes de l'enseignant, cette recherche propose également de rendre compte du rôle et de la place de l'enseignant dans de telles interactions langagières. Pour ce faire, nous proposons d'expliciter ses modalités effectives d'intervention puis de modéliser son action en situation en termes de tutelle et de médiation.
Kane, Khardiatou. „Documentation numérique en Afrique francophone subsaharienne : évaluation de l'offre et des usages en sciences humaines à l'Université Cheikh Anta Diop de Dakar“. Thesis, Paris, CNAM, 2018. http://www.theses.fr/2018CNAM1185/document.
University libraries in French-speaking African countries face a documentary supply challenge in a context of scarce financial resources, rising documentation costs and sometimes inefficient forms of organisation. Based on surveys and the collection of diverse data, this thesis first aims to establish the state of the print and digital documentary offer in the humanities and social sciences at Université Cheikh Anta Diop (UCAD) in Dakar, highlighting points of comparison with other sub-Saharan francophone universities. The results quantify and qualify this offer and point out new dynamics in digital documentation involving different types of actors. In addition, librarians seek to rely fully on Open Access, both for access to resources and for the promotion of local collections. Digital information is increasingly seen as the best way to meet the information needs of the university community at UCAD, yet users are often confronted with problems of access to information and of use of documentary resources. The second part of this research therefore assesses the uses of digital resources, comparing several disciplines at UCAD as well as teachers and students. Recommendations are made to help improve the documentary services of this university.
Gastaud, Muriel. „Modèles de contours actifs pour la segmentation d'images et de vidéos“. Phd thesis, Université de Nice Sophia-Antipolis, 2005. http://tel.archives-ouvertes.fr/tel-00089384.
The contribution of this thesis lies in the design and study of different region descriptors. For each criterion, we compute its derivative using shape gradients and deduce from it the evolution equation of the active contour.
The first descriptor defines a geometric prior without any parametric constraint: it minimises the distance between the active contour and a reference contour. We applied it to curve deformation, segmentation and target tracking.
The second descriptor characterises the motion of the object with a motion model. The associated criterion jointly defines a region and its motion over several consecutive frames. We applied this criterion to joint motion estimation and segmentation and to the tracking of moving objects.
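The first descriptor, the distance between the active contour and a reference contour, can be illustrated with a toy discrete evolution step in Python; the thesis derives the exact evolution equation from shape gradients, which this nearest-point sketch does not attempt.

    import numpy as np

    def evolve_towards_reference(contour, reference, step=0.1):
        """Move each sampled contour point (arrays of shape N x 2 and M x 2) a
        fraction `step` of the way towards its nearest point on the reference
        contour, i.e. one crude descent step on the point-to-contour distance."""
        contour = np.asarray(contour, dtype=float)
        reference = np.asarray(reference, dtype=float)
        offsets = contour[:, None, :] - reference[None, :, :]
        nearest = reference[np.argmin((offsets ** 2).sum(axis=-1), axis=1)]
        return contour + step * (nearest - contour)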
Zeloufi, Mohamed. „Développement d’un convertisseur analogique-numérique innovant dans le cadre des projets d’amélioration des systèmes d’acquisition de l’expérience ATLAS au LHC“. Thesis, Université Grenoble Alpes (ComUE), 2016. http://www.theses.fr/2016GREAT115.
By 2024, the ATLAS experiment plans to operate at luminosities ten times the current configuration. Therefore, much of the readout electronics must be upgraded. This upgrade is also made necessary by the damage caused by years of accumulated radiation and by device aging. A new Front-End Board (FEB) will be designed for the LAr calorimeter readout electronics. A key device of this board is a radiation-hard Analog-to-Digital Converter (ADC) featuring a resolution of 12 bits at a 40 MS/s sampling rate. Given the large number of readout channels, this ADC must display low power consumption and a small area to ease a multichannel design. The goal of this thesis is to design an innovative ADC that meets these specifications. A Successive Approximation Register (SAR) architecture has been selected for our ADC. This architecture has low power consumption, and much recent work has shown its high compatibility with modern scaled CMOS technologies. However, the SAR has some limitations related to decision errors and mismatches in the capacitor array. Using Matlab, we created models of two prototypes of 12-bit SAR ADCs, which were then used to study their limitations carefully, to evaluate their robustness and to see how it could be improved in the digital domain. The designs were then made in an IBM 130 nm CMOS technology that was validated by the ATLAS collaboration for its radiation hardness. The prototypes use a redundant search algorithm with 14 conversion steps, allowing some margin for the comparator's decision errors and opening the way to a digital calibration that compensates for the effects of capacitor mismatch. The digital part of our ADCs is greatly simplified to reduce command generation delays and to save dynamic power. This logic follows a monotonic switching algorithm which saves about 70% of the dynamic power consumption compared to the conventional switching algorithm. Using this algorithm, a 50% reduction of the total capacitance is achieved when our first prototype, which uses a one-segment capacitive DAC, is compared with a classic SAR architecture. To further improve area and consumption, a second prototype was made by introducing a two-segment DAC array. This brought several additional benefits: compared to the first prototype, the area is reduced by a factor of 7.6, the total equivalent capacitance is divided by 12, and the power consumption is improved by a factor of 1.58. The ADCs respectively consume ~10.3 mW and ~6.5 mW, and they respectively occupy ~2.63 mm2 and ~0.344 mm2. A foreground digital calibration algorithm has been used to compensate for the effects of capacitor mismatch. High-frequency open-loop reference-voltage buffers have been designed to allow high-speed, high-accuracy charging and discharging of the DAC capacitor array. In electrical simulations, both prototypes reach an ENOB better than 11 bits while operating at 40 MS/s. The simulated INL was respectively +1.14/-1.1 LSB and +1.66/-1.72 LSB. The preliminary test results of the first prototype are very close to those of a commercial 12-bit ADC on our test board: after calibration, we measured an ENOB of 10.5 bits and an INL of +1/-2.18 LSB. However, due to a test-board failure, the test results of the second prototype are less accurate; in these circumstances, it reached an ENOB of 9.77 bits and an INL of +7.61/-1.26 LSB. Furthermore, the current test board limits the operating speed to ~9 MS/s. An improved board has been designed to achieve a better ENOB at the targeted 40 MS/s speed, and the new test results will be published in the future.
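For readers unfamiliar with the SAR principle underlying these prototypes, a behavioural Python model of a plain successive-approximation conversion (no redundancy, ideal DAC and comparator) is sketched below; the thesis designs add redundant steps, monotonic switching and digital calibration on top of this basic binary search.

    def sar_adc(vin, vref=1.0, bits=12):
        """Ideal SAR conversion: test each bit from MSB to LSB, keeping it when the
        trial DAC level stays below the input voltage. Returns the output code."""
        code = 0
        lsb = vref / (1 << bits)
        for i in reversed(range(bits)):
            trial = code | (1 << i)         # tentatively set bit i
            if vin >= trial * lsb:          # comparator decision against the DAC level
                code = trial
        return code

For example, sar_adc(0.5, vref=1.0) returns 2048, i.e. the mid-scale code of a 12-bit converter.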