Doctoral dissertations on the topic "Planification du traitement"
Consult the top 50 doctoral dissertations on the topic "Planification du traitement".
Baussé, Jérôme. "Recalage et planification du traitement en radiothérapie et protonthérapie". PhD thesis, Télécom ParisTech, 2010. http://pastel.archives-ouvertes.fr/pastel-00566712.
Full text source
Bausse, Jérôme. "Recalage et planification du traitement en radiothérapie et protonthérapie". Paris, Télécom ParisTech, 2010. http://pastel.archives-ouvertes.fr/pastel-00566712.
Full text source
As part of an important and ambitious project to renew its treatment centre installations, ICPO (Institut Curie – Centre de Protonthérapie d'Orsay) is developing new software dedicated to patient treatment with proton beams. The high energies used during treatment, and the precision achievable with proton beams, require more precise patient positioning than in classical radiotherapy. This thesis grew out of these considerations, with two goals: using only anatomical data in positioning images, and renewing the software used for dosimetric planning. The second goal has been achieved: Isogray (the main product of the thesis' industrial partner) is in clinical use, and the first patients planned with it have already been treated. The first goal, although it progressed considerably during this thesis, could not be completed before its end; tests and tuning are still necessary. The first results are promising, and they revealed the problems that remain to be solved in order to finalise the developments. This thesis was carried out in a partnership between ICPO and the DOSIsoft company, European leader in treatment planning systems and provider of the software used at ICPO. The TSI (Signal and Image Processing) laboratory of Télécom ParisTech contributes to this partnership as a scientific reference, with a strong background in medical research.
Ndiaye, Amadou. "Planification de traitement physico-chimique par niveaux d'integration differents. Modele act". Paris 7, 1991. http://www.theses.fr/1991PA077250.
Full text source
Lavallée, Marie-Claude. "Irradiation corporelle totale dynamique à vitesse variable : de la planification au traitement". Thesis, Université Laval, 2010. http://www.theses.ulaval.ca/2010/27688/27688.pdf.
Pełny tekst źródłaStriegnitz, Kristina. "Génération d'expressions anaphoriques : Raisonnement contextuel et planification de phrases". Nancy 1, 2004. http://www.theses.fr/2004NAN10186.
Full text source
This thesis investigates the contextual reasoning involved in the production of anaphoric expressions in natural language generation systems. More specifically, I propose generation strategies for two types of discourse anaphora which have not been treated in generation before: bridging descriptions and additive particles. To this end, the contextual conditions that govern the use of these expressions have to be formalized. The formalization that I propose is based on notions from linguistics and extends previous approaches to the generation of co-referential anaphora. I then specify the reasoning tasks that have to be carried out in order to check the contextual conditions. I describe how they can be implemented using a state-of-the-art reasoning system for description logics, and I compare my proposal to alternative approaches using other kinds of reasoning tools. Finally, I describe an experimental implementation of the proposed approach.
Shen, Wei. "Planification de trajectoires en présence d'obstacles à partir d'images de l'environnement". Poitiers, 1996. http://www.theses.fr/1996POIT2306.
Full text source
Le Maitre, Amandine. "Optimisation de l'utilisation de l'imagerie TEP pour la planification de traitement en radiothérapie". Thesis, Brest, 2012. http://www.theses.fr/2012BRES0029.
Full text source
There has been increasing interest in the use of Positron Emission Tomography (PET) combined with Computed Tomography for radiotherapy treatment planning. It improves target volume delineation by reducing inter- and intra-observer variability and allows biological heterogeneities to be visualized. A plethora of segmentation algorithms have been proposed, but there is no consensus regarding which one to use. Monte Carlo simulations are interesting for validating these algorithms since they allow creating datasets with a known ground truth and for which all acquisition parameters are controlled. We proposed several methodologies for improving the realism of simulations. Several datasets were created incorporating patient-specific variability in anatomy and activity distributions, realistic tumor shape and activity modeling, and respiratory motion. These data were used in a first study concerning target volume definition, in which several algorithms were compared for radiotherapy treatment planning. The accuracy of segmentation was related to the quality of ground-truth volume coverage, and we also studied the impact of respiratory motion on segmentation accuracy. We then investigated the use of an advanced segmentation method able to define high-uptake sub-volumes for heterogeneous dose prescriptions. Several prescription scenarios were compared in terms of Tumor Control Probability (TCP) computed on PET images. The variability of this TCP due to acquisition parameters was quantified, and the impact of the contrast and size of the sub-volume was studied. Finally, we studied the usefulness of adding compartments to such heterogeneous prescriptions.
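The segmentation studies summarised above compare delineated volumes against a known ground truth. Overlap of that kind is commonly quantified with the Dice coefficient; a minimal sketch (function name and toy voxel sets are illustrative, not taken from the thesis):

```python
def dice(seg, truth):
    """Dice similarity coefficient between two binary masks given as sets of voxel coordinates."""
    seg, truth = set(seg), set(truth)
    if not seg and not truth:
        return 1.0  # two empty masks agree perfectly
    return 2 * len(seg & truth) / (len(seg) + len(truth))

# Toy 2-D example: the segmentation covers half of a 4x4 ground-truth region.
truth = {(x, y) for x in range(4) for y in range(4)}  # 16 voxels
seg = {(x, y) for x in range(4) for y in range(2)}    # 8 voxels, all inside truth
print(round(dice(seg, truth), 3))  # 2*8 / (8+16) = 0.667
```

The same set-based formula extends directly to 3-D voxel grids; real evaluations simply use arrays instead of coordinate sets.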
Gaborit, Paul. "Planification distribuée pour la coopération multi-agents". PhD thesis, Université Paul Sabatier - Toulouse III, 1996. http://tel.archives-ouvertes.fr/tel-00142562.
Full text source
Madakat, Dalal. "Approches multicritères pour le traitement des débris spatiaux". Thesis, Paris 9, 2014. http://www.theses.fr/2014PA090019.
Full text source
Space debris are a threat to space exploitation and exploration. Their number will continue to increase even if all space activities stop, making collisions between debris and operational satellites ever more likely. Debris removal is therefore necessary to protect active satellites. Since the number of space debris is very high, the most dangerous ones should be dealt with first. In the first part of this thesis, we developed a multicriteria approach to categorize debris according to their removal priority. Debris belonging to the most urgent category will be dealt with during a space mission. The planning of such a mission is studied in the second part of this thesis; the plan should be designed while optimizing two criteria, mission cost and mission duration.
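Planning a mission while optimizing both cost and duration is a bi-objective problem, and its solutions form a Pareto front. A minimal sketch of Pareto dominance and front extraction (the cost/duration values are purely illustrative):

```python
def dominates(p, q):
    """p dominates q if it is no worse on every criterion and strictly better on one (minimisation)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def pareto_front(plans):
    """Keep the plans that no other plan dominates."""
    return [p for p in plans if not any(dominates(q, p) for q in plans if q != p)]

# (mission_cost, mission_duration) for candidate removal missions -- illustrative values.
plans = [(10, 5), (8, 7), (12, 4), (9, 6), (11, 8)]
print(sorted(pareto_front(plans)))  # [(8, 7), (9, 6), (10, 5), (12, 4)]
```

Plan (11, 8) is dropped because (10, 5) is both cheaper and shorter; the remaining four are mutually incomparable trade-offs.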
Schleifer, Jacques. "Traitement de l'information et des connaissances dans la planification des mines à ciel ouvert". Paris, ENMP, 1986. http://www.theses.fr/1986ENMP0139.
Full text source
Schleifer, Jacques. "Traitement de l'information et des connaissances dans la planification des mines à ciel ouvert". Grenoble 2 : ANRT, 1986. http://catalogue.bnf.fr/ark:/12148/cb37601075h.
Full text source
Acosta-Tamayo, Oscar Dario. "De la navigation exploratoire virtuelle à la planification d'interventions endovasculaires". Rennes 1, 2004. https://tel.archives-ouvertes.fr/tel-00007555v2.
Full text source
Capdevielle, Olivier. "Formulation d'objectifs de traitement d'images : une exploitation interactive d'un système de planification automatique de chaînes d'opérateurs". Toulouse 3, 1995. http://www.theses.fr/1995TOU30003.
Full text source
Clerc, Xavier. "Planification dans un espace de buts par stratégie de type meilleur d'abord". Grenoble INPG, 2007. http://www.theses.fr/2007INPG0059.
Full text source
Most distributed planning systems are based on models that were developed for centralized planning and later adapted to distribution and its specific constraints. Our goal is, on the contrary, to design a planning model that takes these constraints as premises. We developed a planning model that uses best-first search (as an adaptation of the proof-number search algorithm). We applied this model to planning over task structures (from multiagent notations) as well as to HTN planning. In this latter case, we showed how a best-first search allows the planner to rapidly gather constraints that can prune branches from the search space. We also defined plan robustness in order to mitigate the consequences of an agent failure or a resource unavailability.
Full text source
Chové, Étienne. "Contributions à l'ordonnancement réactif des installations de traitement de surface : application industrielle". Nantes, 2010. https://archive.bu.univ-nantes.fr/pollux/show/show?id=95fa6cc4-e88b-48be-8455-ec2d4ae6d1b8.
Full text source
This thesis deals with the reactive hoist scheduling problem. The emergence on the industrial market of companies located in countries with low production costs, together with the shortening of production planning horizons, requires a revision of scheduling methods. Over the past 10 years, researchers have been working on reactive scheduling, a method in which no decision is taken in advance. This scheduling method offers the advantage of responding to variations in production and to variable requests, while ensuring good performance. It is opposed to predictive scheduling, in which the allocation of tasks to the different resources is done a priori. Surface treatment is a step in the part production cycle that modifies the physicochemical structure of the surface of parts by immersion in different chemical baths. The parts are transported by crane (or hoist), which is generally the critical resource. The processing times, starting with immersion in the tanks and ending with pick-up by the hoist, are generally bounded, which imposes strong temporal constraints not present in other scheduling problems. This thesis attempts to apply reactive scheduling under these facility constraints. Having demonstrated the impossibility of a purely reactive approach, we solve the problem by coupling reactive scheduling (providing flexibility) with predictive scheduling (ensuring quality). The thesis concludes with two applications related to this problem: the definition of the topology of a surface treatment facility using a genetic algorithm, and support for product ordering in such a facility. The industrial deployment of this work is the best proof of the scientific concept developed.
Cedilnik, Nicolas. "Personnalisation basée sur l'imagerie de modèles cardiaques électrophysiologiques pour la planification du traitement de la tachycardie ventriculaire". Thesis, Université Côte d'Azur, 2020. http://www.theses.fr/2020COAZ4097.
Pełny tekst źródłaAcute infarct survival rates have drastically improved over the last decades, mechanically increasing chronic infarct related affections.Among these affections, ischaemic ventricular tachycardia (VT) is a particularly serious arrhythmia that can lead to the often lethal ventricular fibrillation. VT can be treated by radio frequency ablation of the arrhythmogenic substrate.The first phase of this long and risky interventional cardiology procedure is an electrophysiological (EP) exploration of the heart.This phase aims at localising the ablation targets, notably by inducing the arrhythmia in a controlled setting. In this work we propose to re-create this exploration phase in silico, by personalising cardiac EP models.We show that key information about infarct scar location and heterogeneity can be automatically obtained by a deep learning-based automated segmentation of the myocardium on computed tomography (CT) images.Our goal is to use this information to run patient-specific simulations of depolarisation wave propagation in the myocardium, mimicking the interventional cardiology exploration phase.We start by studying the relationship between the depolarisation wave propagation velocity and the left ventricular wall thickness to personalise an Eikonal model, an approach that can successfully reproduce periodic activation maps of the left ventricle recorded during VT.We then propose efficient algorithms to detect the repolarisation wave on unipolar electrograms (UEG), that we use to analyse the UEGs embedded in such intra-cardiac recordings.Thanks to a multimodal registration between these recordings and CT images, we establish relationships between action potential durations/restitution properties and left ventricular wall thickness.These relationships are finally used to parametrise a reaction-diffusion model able to reproduce interventional cardiologists' induction protocols that trigger realistic and documented VTs. 
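The Eikonal personalisation described above computes activation times from a conduction velocity derived from wall thickness. A toy 2-D analogue using Dijkstra's algorithm on a grid; the thickness-to-velocity law and all values here are hypothetical, not the relationship fitted in the thesis:

```python
import heapq

def activation_map(thickness, source, dx=1.0):
    """Dijkstra approximation of an Eikonal model: activation time (ms) of each grid
    cell from a pacing site, with velocity assumed to grow with wall thickness (mm)."""
    velocity = {c: 0.1 + 0.2 * t for c, t in thickness.items()}  # mm/ms, hypothetical law
    times = {c: float("inf") for c in thickness}
    times[source] = 0.0
    heap = [(0.0, source)]
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if t > times[(i, j)]:
            continue  # stale heap entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            n = (i + di, j + dj)
            if n in times:
                # edge travel time = distance / mean local velocity
                nt = t + dx / ((velocity[(i, j)] + velocity[n]) / 2)
                if nt < times[n]:
                    times[n] = nt
                    heapq.heappush(heap, (nt, n))
    return times

thickness = {(i, j): 5.0 for i in range(3) for j in range(3)}  # uniform 5 mm wall
tmap = activation_map(thickness, source=(0, 0))
print(round(tmap[(2, 2)], 2))  # 4 grid steps at 1.1 mm/ms -> 3.64 ms
```

A thinner (scarred) region would get a lower velocity and show up as delayed activation, which is the behaviour the personalised model exploits.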
Adnan, Hashmi Muhammad. "Un langage de programmation agent intégrant la planification temporelle et les mécanismes de coordination de plans". Paris 6, 2012. http://www.theses.fr/2012PA066312.
Full text source
Largent, Axel. "Planification de radiothérapie externe à partir d'imagerie par résonance magnétique". Thesis, Rennes 1, 2018. http://www.theses.fr/2018REN1S085/document.
Full text source
In external beam radiotherapy, X-ray imaging (CT scan and CBCT) is the main imaging modality for treatment planning and dose delivery. The CT scan provides the electron density information required for dose calculation. CBCT allows fast imaging for patient positioning and for tracking and gating of the tumor. However, X-ray imaging has poor soft-tissue contrast and is ionizing, in contrast to MRI. Thanks to its better soft-tissue contrast, MRI could improve patient positioning, tumor and organ-at-risk delineation, and dose targeting. The introduction of MRI into the radiotherapy workflow is therefore a topical issue. This thesis firstly aims to optimize an MRI protocol with the patient in head-and-neck radiotherapy treatment position; this protocol was endorsed by our clinical center. The second aim of this thesis was to perform dose calculation from MRI. However, this imaging, unlike CT, lacks the electron density information required for dose calculation. To address this issue, an original non-local-means patch-based method (PBM) and a deep learning method (DLM) were used to generate pseudo-CTs from MRIs and compute the dose. The DLM was a generative adversarial network, and the PBM was performed using an approximate nearest neighbor search with MR feature images. Both methods were evaluated and compared to an atlas-based method (ABM) and a bulk density method (BDM), using image and dosimetric endpoints. The DLM and PBM appeared to be the most accurate methods; the DLM was faster and more robust to anatomical variations than the PBM.
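The patch-based method mentioned above assigns CT numbers by matching MR patches against a database. A deliberately tiny 1-D analogue of that idea; real implementations work on 3-D patches with approximate nearest-neighbour search, and every value below is illustrative:

```python
def patches(img, size=3):
    """All overlapping 1-D patches; real pseudo-CT methods use 3-D patches."""
    return [img[i:i + size] for i in range(len(img) - size + 1)]

def pseudo_ct(mr, atlas_mr, atlas_ct, size=3):
    """For each MR patch, find the closest atlas MR patch (squared L2 distance)
    and borrow the CT number of its centre voxel."""
    atlas_patches = patches(atlas_mr, size)
    out = []
    for p in patches(mr, size):
        dists = [sum((a - b) ** 2 for a, b in zip(p, q)) for q in atlas_patches]
        best = dists.index(min(dists))
        out.append(atlas_ct[best + size // 2])
    return out

# Atlas pair: low MR intensity maps to air-like HU, high intensity to soft tissue.
atlas_mr = [0, 0, 0, 10, 10, 10]
atlas_ct = [-1000, -1000, -1000, 40, 40, 40]
print(pseudo_ct([0, 0, 0, 10, 10, 10], atlas_mr, atlas_ct))  # [-1000, -1000, 40, 40]
```

The non-local-means variant of the thesis additionally weights several neighbours instead of taking a single best match; this sketch keeps only the core patch-matching step.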
Barbeau, Richard. "Un modèle intégré de planification de production et de gestion des résidus pour le traitement des sables bitumineux". Thesis, Université Laval, 2009. http://www.theses.ulaval.ca/2009/26220/26220.pdf.
Full text source
Desmarais, Julie. "La planification du traitement auprès des détenus fédéraux incarcérés en centre de traitement psychiatrique : situation du Centre régional de santé mentale de la région du Québec". Thèse, Université de Sherbrooke, 2013. http://hdl.handle.net/11143/6453.
Full text source
Elst, Johannes van den. "Modélisation de connaissances pour le pilotage de programmes de traitement d'images". Nice, 1996. http://www.theses.fr/1996NICE4995.
Full text source
Lopez, Thomas. "Planification de Chemin et adaptation de posture en environnement dynamique". PhD thesis, INSA de Rennes, 2012. http://tel.archives-ouvertes.fr/tel-00767784.
Full text source
Yousfi, Fouad. "Placo : modélisation par workflow et conception d'un système de planification coopérative : application aux unités de soins". Lille 1, 1996. https://pepite-depot.univ-lille.fr/LIBRE/Th_Num/1996/50376-1996-111.pdf.
Full text source
Brown, Richard. "Microbrachytherapy treatment planning". Thesis, Toulouse 3, 2017. http://www.theses.fr/2017TOU30180/document.
Full text source
An innovative form of radiotherapy, microbrachytherapy, is under development. This therapy targets solid, inoperable tumours by injecting a liquid containing radioactive microspheres in suspension. Many injections are required to sufficiently cover the tumoural volume, and so, to determine the positions of these injections, a method of treatment planning has been developed and validated throughout this research. Three main questions are addressed: • How to perform the dosimetry for microbrachytherapy? • How to perform treatment planning for this modality? • What are the optimal injection properties to deliver the most efficient treatment? Microbrachytherapy dosimetry was performed by calculating the absorbed dose distribution for a single injection. This distribution was then convolved at each injection position within the tumour to calculate the patient's absorbed dose distribution. Dosimetry of the tumour and the organs at risk was performed by extracting and analysing dose-volume histograms (DVHs). Once a method of dosimetry was in place, optimisation algorithms were developed to generate patient-specific treatment plans. Three algorithms were tested and compared: the Nelder-Mead simplex, the Bees algorithm, and the non-dominated sorting genetic algorithm II. Thanks to its multi-objective optimisation, the non-dominated sorting genetic algorithm II was the most flexible and was used preferentially. Lastly, a comparison of injection parameters was performed. Among 90Y, 166Ho, 131I and 177Lu, optimal injections consisted of microspheres of 90Y. Injection volumes of 5, 10 and 20 µL and initial activities of 5, 10 and 20 MBq were tested; 20 µL injections with 20 MBq were optimal because they minimise the number of injections required.
This new technology, combined with the developments shown in this work, demonstrates the feasibility, validated on animals, of injecting a liquid containing radioactive microspheres in suspension to efficiently treat inoperable tumours whilst protecting the surrounding healthy tissue. Such tumours, despite still having a poor prognosis, will surely benefit from better treatment options in the near future.
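The dosimetry step described above superposes a single-injection dose distribution at each injection position and then extracts dose-volume histograms. A 1-D sketch of both steps; the kernel values and grid are illustrative, not measured data:

```python
def plan_dose(grid_len, kernel, injections):
    """Superpose a single-injection dose kernel at each injection position
    (1-D analogue of convolving the kernel over the 3-D injection map)."""
    dose = [0.0] * grid_len
    for centre, weight in injections:
        for k, d in enumerate(kernel):
            i = centre + k - len(kernel) // 2
            if 0 <= i < grid_len:
                dose[i] += weight * d
    return dose

def dvh(dose, levels):
    """Cumulative DVH: fraction of voxels receiving at least each dose level."""
    return {lv: sum(d >= lv for d in dose) / len(dose) for lv in levels}

kernel = [0.1, 0.5, 1.0, 0.5, 0.1]  # Gy per unit activity, hypothetical fall-off
dose = plan_dose(11, kernel, [(3, 1.0), (7, 1.0)])  # two injections
print(dvh(dose, [0.5, 1.0]))
```

A treatment planner would then compare such DVH points for the tumour and organs at risk against prescription constraints while optimising the injection positions.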
Belhaoua, Abdelkrim. "Planification et automatisation d’une reconstruction 3D par stéréovision : prise en compte des incertitudes et optimisation de l’illumination". Strasbourg, 2011. https://publication-theses.unistra.fr/public/theses_doctorat/2011/BELHAOUA_Abdelkrim_2011.pdf.
Full text source
Automation of the digitization process is an essential step for the development of three-dimensional measurement in different application areas, such as the evaluation of manufactured objects. The aim of this thesis is to assess the conformity of 3D manufactured objects taking into account the geometric tolerances and the uncertainties related to various sources of error. 3D reconstruction is performed using a stereoscopic sensor. The quality of the 3D reconstruction can be significantly affected by the illumination properties; the determination of favorable illumination parameters is therefore a crucial problem that we have addressed by optimizing the placement of the illumination sources. The quantification of 3D measurement errors is also a purpose of this thesis: estimating the 3D errors allows us to assess the 3D reconstruction quality and thus to improve the performance of the reconstruction. A dimensional evaluation system based on computer vision has been developed in our laboratory. This system consists of two modules: the first one uses Situation Graph Trees, which enable automation and management of the different procedures leading to the 3D reconstruction. The second module, called HTP (Hierarchical Task Plan), is in charge of controlling the acquisition sequence, the full 3D reconstruction of the object, and its dimensional evaluation. All algorithms developed within the framework of this dissertation have been validated and integrated into this fully operational system, including a mechanism for dynamic replanning that automatically adapts the system to the actual acquisition conditions.
Valdenaire, Simon. "Mise en place et utilisation des faisceaux FFF en radiothérapie : radiobiologie, caractérisation physique, contrôles qualité, modélisation et planification de traitement". Thesis, Aix-Marseille, 2017. http://www.theses.fr/2017AIXM0037/document.
Full text source
In medical linear electron accelerators, photon beam profiles are homogenised using flattening filters. Technologies have evolved and the presence of this filter is no longer necessary. Flattening-filter-free (FFF) beams exhibit higher dose rates, heterogeneous dose profiles, modified energy spectra and lower out-of-field dose. This PhD aimed at studying the characteristics of unflattened beams, as well as their impact in clinical use. Several subjects were thoroughly investigated: radiobiology, dosimetry, quality controls, modelling and treatment planning. In vitro experiments confirmed that the high dose rate of FFF beams has no radiobiological impact, and a wide review of the literature was conducted to corroborate these results. To understand thoroughly the characteristics of FFF beams, measurements were conducted using several detectors. The effects of the spectra and dose rates of unflattened beams on dose calibration were also studied. FFF beams were modelled in two TPSs; the methods, results and model parameters were compared between the available beam qualities as well as between both TPSs. Furthermore, the implementation of stereotactic treatment techniques was the occasion to investigate small-beam dosimetry. Prostate cancer cases treated with VMAT and pulmonary tumors treated with stereotactic 3D beams were also studied; the comparison of dose distributions and treatment metrics gives the advantage to FFF beams. Mastering the physical and biological aspects of flattening-filter-free beams allowed the IPC to start FFF treatments. Comparative studies have since resulted in a deeper understanding of the pertinent use of these beams.
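What "heterogeneous dose profiles" means for FFF beams can be seen by applying a flatness metric to a flattened versus an unflattened profile. The sketch below uses one common definition, (Dmax - Dmin)/(Dmax + Dmin) over the central 80% of the field; exact clinical definitions vary, and both profiles are synthetic:

```python
def flatness(profile):
    """Flatness over the central 80% of the field: (Dmax - Dmin) / (Dmax + Dmin).
    One common convention; clinical protocols define this in several ways."""
    n = len(profile)
    core = profile[round(0.1 * n):round(0.9 * n)]
    return (max(core) - min(core)) / (max(core) + min(core))

flat = [100.0] * 20                                   # flattened beam: uniform profile
fff = [100.0 - 4 * abs(i - 9.5) for i in range(20)]   # FFF: peaked at the central axis
print(round(flatness(flat), 3), round(flatness(fff), 3))  # 0.0 0.167
```

The larger value for the peaked profile is expected: without the filter the profile falls off away from the axis, which TPS beam models must describe explicitly.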
Reuzé, Sylvain. "Extraction et analyse de biomarqueurs issus des imageries TEP et IRM pour l'amélioration de la planification de traitement en radiothérapie". Thesis, Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLS341/document.
Full text source
Beyond conventional techniques for the diagnosis and follow-up of cancer, radiomic analysis makes it possible to personalize radiotherapy treatments by providing a non-invasive characterization of tumor heterogeneity. Based on the extraction of advanced quantitative parameters (intensity histograms, texture, shape) from multimodal imaging, this technique has notably proved its interest in determining predictive signatures of treatment response. During this thesis, signatures of cervical cancer recurrence were developed, based on radiomic analysis alone or in combination with conventional biomarkers, providing major perspectives for the stratification of patients that can lead to dosimetric treatment plan adaptation. However, various methodological barriers were identified, notably related to the great variability of image acquisition protocols and technologies, which leads to major biases in multicentric radiomic studies. These biases were assessed using phantom acquisitions and multicenter patient images for PET imaging, and two methods enabling a correction of the stratification effect were proposed. In MRI, a method of standardization of images by histogram harmonization was evaluated in brain tumors. To go further in the characterization of intra-tumor heterogeneity and to allow the implementation of personalized radiotherapy, a method for local texture analysis was developed. Specifically adapted to brain MRI, its ability to differentiate sub-regions of radionecrosis or tumor recurrence was evaluated. For this purpose, parametric heterogeneity maps were proposed to experts as additional MRI sequences. In the future, validation of the predictive models in external centers, as well as clinical trials integrating these methods to personalize radiotherapy treatments, will be mandatory steps for the integration of radiomics into clinical routine.
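Texture parameters of the kind extracted in radiomic analysis are often derived from a grey-level co-occurrence matrix (GLCM). A minimal sketch computing the GLCM for one spatial offset and a homogeneity feature; the 2x2 images are toy inputs, purely illustrative:

```python
def glcm(img, dx=1, dy=0, levels=4):
    """Grey-level co-occurrence matrix for one offset, normalised to probabilities."""
    h, w = len(img), len(img[0])
    m = [[0] * levels for _ in range(levels)]
    n = 0
    for y in range(h):
        for x in range(w):
            if 0 <= x + dx < w and 0 <= y + dy < h:
                m[img[y][x]][img[y + dy][x + dx]] += 1
                n += 1
    return [[c / n for c in row] for row in m]

def homogeneity(p):
    """Inverse-difference feature: high for uniform textures, low for contrasted ones."""
    k = len(p)
    return sum(p[i][j] / (1 + abs(i - j)) for i in range(k) for j in range(k))

uniform = [[1, 1], [1, 1]]   # flat texture
checker = [[0, 3], [3, 0]]   # strongly contrasted texture
print(homogeneity(glcm(uniform)), homogeneity(glcm(checker)))  # 1.0 0.25
```

Full radiomic pipelines average such features over several offsets and directions, and the local texture maps mentioned above compute them in a sliding window rather than over the whole volume.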
Cauhapé, Damien. "Modélisation et traitement des connaissances sur le temps et les tâches médicales pour les systèmes experts en cancérologie". Bordeaux 1, 1991. http://www.theses.fr/1991BOR10534.
Full text source
St-Amand, Pascale. "Attention, planification exécutive et problèmes d'apprentissage chez une population d'enfants nés très prématurés". Thesis, Université Laval, 2006. http://www.theses.ulaval.ca/2006/23736/23736.pdf.
Full text source
Gerardy, Isabelle Yvonne Joséphine. "Evaluation d'un système de planification pour un traitement de brachythérapie gynécologique en utilisant des techniques Monte Carlo et des mesures expérimentales". Doctoral thesis, Universitat Politècnica de València, 2012. http://hdl.handle.net/10251/14272.
Full text source
Gerardy, IYJ. (2011). Evaluation d'un système de planification pour un traitement de brachythérapie gynécologique en utilisant des techniques Monte Carlo et des mesures expérimentales [Unpublished doctoral thesis]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/14272
Madakat, D. "Approches multicritères pour le traitement des débris spatiaux". PhD thesis, Université Paris Dauphine - Paris IX, 2014. http://tel.archives-ouvertes.fr/tel-01059180.
Full text source
Bucari, Miriam Claudia. "Le traitement des enfants handicapés mentaux dans le système scolaire argentin". Bordeaux 2, 1991. http://www.theses.fr/1991BOR21001.
Full text source
Together with the introduction of mass education, the need appeared in Argentina to create an alternative structure adapted to those children who were unable to follow normal lessons. Starting from a purely medical viewpoint at the beginning of this century, the handicap was in the sixties recognized as surmountable with early identification and rational orientation. Later, research carried out in Argentina proved the influence of the social environment on the development of children, as well as the effect of the educational isolation of the rural regions in a nation with a surface area approximately five times that of France. It appears that special education is mainly developed in the urban areas, and practically no professional openings can be found for the young handicapped; this is why we have called it a "short-term education". The slow-acting public sector must also be considered, as well as the importance of the benevolent associations, especially in the capital. The province of Buenos Aires has the advantage of a more complete special education system which, however, is linked to certain predominant ideologies.
Vautrin, Mathias. "Planification de traitement en radiothérapie stéréotaxique par rayonnement synchrotron. Développement et validation d'un module de calcul de dose par simulations Monte Carlo". PhD thesis, Université de Grenoble, 2011. http://tel.archives-ouvertes.fr/tel-00641325.
Full text source
Leclere, Pascal. "Les polygones de Voronoi en traitement d'images : application a la fusion de donnees, et a la planification de trajectoires en architecture parallele". Reims, 1996. http://www.theses.fr/1996REIMS019.
Full text source
Ben Daya, Bechir. "Planification soutenable des investissements bioénergétiques : intégration des bioraffineries aux pâtes et papiers". Doctoral thesis, Université Laval, 2018. http://hdl.handle.net/20.500.11794/30335.
Full text source
The Canadian pulp and paper (P&P) sector has played a major socio-economic role over the last two centuries. In addition to the advantage of their geographical position, P&P companies have accumulated proven experience in the forest industry, including the treatment of wood biomass. Over the last three decades, these entities have faced difficult environmental constraints, compounded by a chronic market crisis. This latest crisis has had unprecedented social consequences, leading to a crisis of sustainability. Over the last decade, the green energy industry has become a basic component of the energy transition strategies of developed countries, and biomass has always been at the heart of such a strategy for Canada. For the P&P sector, this orientation is an opportunity to resolve its growing environmental and economic crisis. Decision-makers need a road map to transform P&P factories into Integrated Forest Biorefineries (IFBR). The choice of technologies, the sizing of production capacity and the choice of bioenergy investment are major concerns for decision-makers, and assessing the sustainability of this transformation remains a major challenge. Our contribution focuses on developing decision-support approaches and tools for an effective, robust and sustainable transformation of Canada's P&P industry. The objective is to assess the sustainability of IFBR integration and to present a new business model to decision-makers, which can strengthen their ability to negotiate an incentive policy favourable to bioenergy investments within the framework of public-private partnership. To achieve this goal, our methodology combines decision-support tools and mathematical optimization models with financial and economic analysis.
Our first contribution proposes the design and application of a sustainability evaluation method integrating the life cycle approach and the optimization of the value creation network in a multi-objective mathematical model. The proposed model provides a roadmap for sustainable bioenergy investments, minimizing GHG emissions and maximizing the financial value of the biorefinery over a long-term planning horizon while ensuring optimal management of the incubator activity. In the second contribution, we present a sensitivity analysis of the proposed mathematical model according to well-selected scenarios, with the development of a framework for communicating the model to decision-makers. The purpose of this analysis is to assess the robustness of the model, to communicate to stakeholders the implications of investment choices in bioenergy production in an uncertain environment, and to identify opportunities for improving the effectiveness of the proposed model. In the third contribution, we propose an in-depth tax analysis using accelerated depreciation methods applied to investments in bioenergy. This analysis deals with the impact of the type of depreciation on the choice of bioenergy investment and on sustainability. Our goal is to provide decision-makers with a set of decision-support tools while strengthening their power to negotiate a tax policy favourable to bioenergy investment. In this part, it was highlighted that the choice of an investment, coupled with the choice of its depreciation method, offers the investor more complete visibility on the practical consequences of investing in the bioenergy field under the prevailing tax legislation. This reinforces the public-private partnership and determines the level of public intervention needed for the success of the expected transformation of the P&P sector.
The social impact analysis and stochastic programming approaches for a robustness study were not addressed in this work; they are presented as research perspectives.
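The tax analysis in the third contribution compares depreciation schedules. The contrast between straight-line and an accelerated method can be sketched as follows; the rate and the final-year adjustment are illustrative textbook conventions, not Canadian tax rules:

```python
def straight_line(cost, life):
    """Equal deduction every year."""
    return [cost / life] * life

def declining_balance(cost, life, rate=2.0):
    """Double-declining balance, an accelerated method: a fixed rate applied to the
    remaining book value, with the residual written off in the final year."""
    schedule, book = [], cost
    r = rate / life
    for year in range(life):
        d = book * r if year < life - 1 else book  # final year clears the book value
        schedule.append(d)
        book -= d
    return schedule

print(straight_line(1000, 5))                               # [200.0, 200.0, 200.0, 200.0, 200.0]
print([round(d, 1) for d in declining_balance(1000, 5)])    # [400.0, 240.0, 144.0, 86.4, 129.6]
```

Both schedules deduct the same total, but the accelerated one front-loads deductions, which improves the early cash flows of a bioenergy investment and is why the choice of method matters to the analysis above.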
Pastorelli, Mario. "Disciplines basées sur la taille pour la planification des jobs dans data-intensif scalable computing systems". Electronic Thesis or Diss., Paris, ENST, 2014. http://www.theses.fr/2014ENST0048.
Pełny tekst źródłaThe past decade has seen the rise of data-intensive scalable computing (DISC) systems, such as Hadoop, and the consequent demand for scheduling policies to manage their resources so that they can provide quick response times as well as fairness. Schedulers for DISC systems usually focus on fairness without optimizing response times. The best practices to overcome this problem involve manual, ad-hoc control of the scheduling policy, which is error-prone and difficult to adapt to changes. In this thesis we focus on size-based scheduling for DISC systems. The main contribution of this work is the Hadoop Fair Sojourn Protocol (HFSP) scheduler, a size-based preemptive scheduler with aging; it provides fairness and achieves reduced response times thanks to its size-based nature. In DISC systems, job sizes are not known a priori: HFSP therefore includes a job-size estimation module, which computes approximate job sizes and refines these estimates as jobs progress. We show that the impact of estimation errors on size-based policies is not significant under conditions that hold in a system such as Hadoop. Because of this, and by virtue of being designed around the idea of working with estimated sizes, HFSP is largely tolerant to job-size estimation errors. Our experimental results show that, in a real Hadoop deployment and with realistic workloads, HFSP performs better than the built-in scheduling policies, achieving both fairness and small mean response times. Moreover, HFSP maintains its good performance even when the cluster is heavily loaded, by focusing resources on a few selected jobs with the smallest sizes. HFSP is a preemptive policy, and preemption in a DISC system can be implemented with different techniques; the approaches currently available in Hadoop have shortcomings that affect system performance.
We have therefore implemented a new preemption technique, called suspension, that exploits operating-system primitives to implement preemption in a way that guarantees low latency without penalizing low-priority jobs.
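The core idea of size-based preemptive scheduling with aging can be sketched in a few lines. This is a toy model, not the actual HFSP implementation (which builds on the Fair Sojourn Protocol's virtual time inside Hadoop): at each unit time-step the job with the smallest estimated remaining size runs, softened by an aging credit that prevents large jobs from starving:

```python
def size_based_schedule(jobs, aging_rate=0.1):
    """Toy preemptive scheduler.  `jobs` maps job id -> estimated size.
    At every unit step, run the job minimizing
        estimated remaining size - aging_rate * time spent waiting,
    so small jobs finish quickly while waiting jobs slowly gain priority.
    Returns the order in which jobs complete."""
    remaining = dict(jobs)
    waited = {j: 0.0 for j in jobs}
    order = []
    while remaining:
        # preemption point: re-pick the running job at every step
        run = min(remaining, key=lambda j: remaining[j] - aging_rate * waited[j])
        for j in remaining:
            if j != run:
                waited[j] += 1
        remaining[run] -= 1
        if remaining[run] <= 0:
            del remaining[run]
            order.append(run)
    return order

print(size_based_schedule({"big": 10, "small": 2}))  # small finishes first
```

Because sizes here would be estimates in a real DISC system, the `remaining` values would be refined as jobs progress, which is precisely the role of HFSP's estimation module.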
Poulin, Éric. "Conception et validation d'un système pour la planification et le guidage en temps réel des traitements de curiethérapie à haut débit de dose du sein". Doctoral thesis, Université Laval, 2015. http://hdl.handle.net/20.500.11794/26351.
Pełny tekst źródłaThis thesis focuses on the development and validation of different tools to increase the efficacy of high-dose-rate (HDR) breast brachytherapy treatments. The project aims at designing and validating a new system for real-time guidance and planning of HDR breast brachytherapy treatments, based mainly on 3D ultrasound (3DUS). As a first step, a clinical study was performed using the first linear 3DUS prototype developed by our group. This study showed the limitations of the current system (e.g., small acquisition volume, no catheter-tracking capability) and that 3DUS volumes are three times smaller than computed tomography volumes. As a second step, a catheter optimization algorithm was developed. The algorithm was shown to be robust to catheter implantation errors, and it was possible to significantly reduce the number of catheters without a significant negative impact on dosimetry. As a third step, a study was designed to compare this catheter optimization algorithm to the only commercially available one, HIPO. The results demonstrated that the HIPO algorithm produces significantly worse plans, in terms of dosimetry, than the algorithm developed in the present thesis. As a fourth step, two methods were developed for personalized, real-time planning of breast HDR brachytherapy treatments. Both methods were efficient and able to reduce the number of catheters. A proof of concept was validated, demonstrating the potential of a personalized, real-time planning approach for breast HDR brachytherapy. Using the experience acquired during the clinical study, a new 3DUS system was developed. The system includes a new hybrid acquisition approach and a module for catheter tracking. The results presented in this study show the ability of the hybrid 3DUS system to accurately measure linear dimensions and volumes.
Furthermore, it allows accurate reconstruction of catheter trajectories as well as real-time tracking. Finally, in order to reconstruct catheters dynamically, an electromagnetic tracking system was validated. This study showed that the reconstruction of catheters in HDR brachytherapy is significantly more accurate and precise with an electromagnetic tracking system than with conventional methods.
Perret, Cyril. "La Syllabe comme unité de traitement en production verbale orale et écrite". Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2007. http://tel.archives-ouvertes.fr/tel-00270161.
Pełny tekst źródłaFollowing an attempt to define "what exactly a syllable is" in the Introduction, we report arguments in favor of a functional role for this unit in visual and auditory perception.
Chapter I presents a review of the literature on conceptually driven spoken and written language production (Caramazza & Miceli, 1990; Dell, 1986; Levelt, Roelofs, & Meyer, 1999). The different processing levels (conceptual, syntactic, lexical and motor) and the proposed processing mechanisms involved at each of these stages are presented. Particular attention is paid to the main models' proposals concerning the role of the syllable in lexical access and motor planning.
Chapter II is devoted to an effect at the heart of a lively debate: the syllabic priming effect. Ferrand, Segui and Grainger (1996) showed that presenting a group of segments corresponding to the first syllable of a word (e.g., ba-baleine; bal-balcon) facilitates naming more than a shorter (e.g., ba-balcon) or longer (e.g., bal-baleine) group of segments. We attempted to replicate this result in spoken picture naming (Experiments 2a, 2b and 3) and report data in favor of the segmental overlap hypothesis (Schiller, 1998, 1999, 2000). We then tested whether the presentation time of the prime (Experiment 4) and the moment at which the group of segments is presented (Experiment 5) could explain the absence of a syllabic priming effect. Here again, the data agree with the segmental overlap hypothesis (Schiller, 1998, 1999, 2000). We also explored the possibility of obtaining this effect in written production (Experiments 1a and 1b).
In Chapter III, we tested the hypothesis that, if the syllable plays a functional role in spoken and written production, initialization latencies for monosyllabic words should be shorter than those for bisyllabic words. Studies in both modalities have answered in the negative (Bachoud-Lévi et al., 1998; Lambert, 1999; Lambert et al., in press; Roelofs, 2002b). However, Meyer, Roelofs and Levelt (2003) proposed that a temporal response criterion (Lupker et al., 1997) influences the moment at which the response is initiated; as a consequence, an effect of the number of syllables can appear. We attempted to replicate this result in spoken production (Experiment 6) and to extend it to written production (Experiment 7). However, we obtained no data in favor of Meyer and colleagues' (2003) hypothesis in either modality.
Chapter IV summarizes the results obtained and proposes directions for future research.
Habib, Bouchra. "Etude numérique et expérimentale d'un système de planification de traitement pour la radiothérapie intégrant un calcul Monte Carlo : applications aux hétérogénéités et petits faisceaux". Paris 11, 2009. http://www.theses.fr/2009PA112089.
Pełny tekst źródłaImprovements in MC dose-calculation speed have been made within the European project MAESTRO, through the development of the fast MC code PENFAST, and within the TELEDOS project, through the parallelization of this code. This PhD work, based on these two projects, focuses on the evaluation of the technical and dosimetric performance of the MC code; such an evaluation is crucial before the code can be used in clinical applications. First, the variance-reduction techniques included in the MC code, as well as the parallelization of the calculation, were validated and evaluated in terms of the gain in computing time. The second part of this work presents a new, fast and accurate method to determine the initial energy spectrum of the accelerator, which is required for the MC dose calculation. Dose calculations with the fast MC code PENFAST were then evaluated under metrological and clinical conditions. The results showed the ability of the MC code to quickly calculate an accurate dose in both photon and electron modes, even in situations of electronic disequilibrium. However, this study revealed an uncertainty in the MC TPS in the conversion of the CT image into the voxelized geometry used for the MC dose calculation. The quality of this voxelization may be improved through artefact-correction software and by including additional materials in the code's database.
Petitguillaume, Alice. "Dosimétrie Monte Carlo personnalisée pour la planification et l’évaluation des traitements de radiothérapie interne : développement et application à la radiothérapie interne sélective (SIRT)". Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112190/document.
Pełny tekst źródłaTargeted radionuclide therapies (TRT), rapidly expanding techniques that raise high therapeutic expectations, consist in administering a radiopharmaceutical to selectively treat tumors. Nowadays, the activity injected into the patient is generally standardized. However, in order to establish robust dose-effect relationships and to optimize treatments while sparing healthy tissues at best, a personalized dosimetry must be performed, as is current clinical practice in external beam radiotherapy. In that context, the main objective of this PhD was to develop, using the OEDIPE software, a methodology for personalized dosimetry based on direct Monte Carlo calculations. The developed method calculates the three-dimensional distribution of absorbed doses depending on the patient anatomy, defined from CT or MRI data, and on the patient-specific activity biodistribution, defined from SPECT or PET data. Radiobiological aspects, such as differences in radiosensitivity and repair time constants between tumoral and healthy tissues, have also been integrated through the linear-quadratic model. This methodology has been applied to selective internal radiation therapy (SIRT), which consists in injecting 90Y-microspheres to selectively treat unresectable hepatic cancers. Distributions of absorbed doses and biologically effective doses (BED), along with the equivalent uniform biologically effective doses (EUD) to hepatic lesions, have been calculated from the 99mTc-MAA activity distributions obtained during the evaluation step for 18 patients treated at the Hôpital Européen Georges-Pompidou. These results have been compared to the classical methods used in clinics, and the interest of accurate, personalized dosimetry for treatment planning has been investigated.
On the one hand, the possibility of increasing the activity in a personalized way has been highlighted by calculating the maximal activity that could be injected into the patient while meeting tolerance criteria for organs at risk. On the other hand, the use of radiobiological quantities has also made it possible to evaluate the potential added value of fractionated protocols in SIRT. The developed tool can thus be used as an aid for the optimization of treatment plans. Moreover, a study has been initiated to improve the reconstruction of post-treatment data from 90Y-SPECT. Estimating the doses delivered during treatment from those data could make it possible to predict tumoral control, to anticipate healthy-tissue toxicity, and to establish precise dose-effect relationships for these treatments.
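The linear-quadratic quantities mentioned above can be illustrated with a minimal sketch. This is not the OEDIPE implementation: it uses the textbook BED formula and an exponential-survival form of the equivalent uniform dose, ignores dose-rate and repair effects, and the radiobiological parameters in the example are assumed values, not those of the thesis:

```python
import numpy as np

def bed(total_dose, dose_per_fraction, alpha_beta):
    """Linear-quadratic biologically effective dose (Gy):
    BED = D * (1 + d / (alpha/beta))."""
    return total_dose * (1.0 + dose_per_fraction / alpha_beta)

def eud_from_bed(bed_voxels, alpha):
    """Equivalent uniform (biologically effective) dose over a structure:
    the uniform dose yielding the same mean cell survival as the
    heterogeneous voxel-wise BED distribution.  Dominated by cold spots."""
    return -np.log(np.mean(np.exp(-alpha * np.asarray(bed_voxels)))) / alpha

# Assumed illustrative parameters: alpha/beta = 10 Gy, alpha = 0.35 / Gy.
voxel_doses = np.array([40.0, 60.0, 80.0])   # heterogeneous absorbed doses
beds = bed(voxel_doses, voxel_doses, 10.0)   # single-administration delivery
print(eud_from_bed(beds, 0.35))
```

The EUD falls well below the mean BED whenever the dose distribution is heterogeneous, which is why it is a useful scalar for predicting tumor control from non-uniform microsphere deposition.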
Aziz, Fatima. "Approche géométrique couleur pour le traitement des images catadioptriques". Thesis, Limoges, 2018. http://www.theses.fr/2018LIMO0080/document.
Pełny tekst źródłaThis manuscript investigates omnidirectional catadioptric color images as Riemannian manifolds. This geometric representation offers insights into the resolution of problems related to the distortions introduced by the catadioptric system, in the context of color perception for autonomous systems. The report starts with an overview of omnidirectional vision, the different systems used, and the geometric projection models. We then present the basic notions and tools of Riemannian geometry and its use in image processing, which leads us to introduce some useful differential operators on Riemannian manifolds. We develop a method for constructing a hybrid metric tensor adapted to color catadioptric images. This tensor has the dual characteristic of depending on both the geometric position of the image points and their photometric coordinates. In this work, we mostly deal with the exploitation of this hybrid metric tensor in catadioptric image processing. Indeed, the Gaussian function is at the core of several filters and operators for various applications, such as noise reduction or the extraction of low-level features from the Gaussian scale-space representation. We thus build a new Gaussian kernel that depends on the Riemannian metric tensor. It has the advantage of being applicable directly in the catadioptric image plane, while being spatially variable and dependent on local image information. In the final part of this thesis, we discuss some possible robotic applications of the hybrid metric tensor. We propose to define free space and distance transforms in the omni-image, then to extract the geodesic medial axis. The latter is a relevant topological representation for autonomous navigation, which we use to define an optimal trajectory-planning method.
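The idea of a metric tensor mixing geometric position and photometric coordinates can be sketched in the Beltrami style. This is a generic sketch under stated assumptions, not the thesis' tensor: the spatial part is left as the identity (where a catadioptric model would inject the mirror geometry), and only the photometric part varies with the image:

```python
import numpy as np

def hybrid_metric(image, beta=1.0):
    """Per-pixel 2x2 metric tensor g = I + beta * sum_c grad(I_c) grad(I_c)^T,
    a Beltrami-style mix of a spatial term (identity here) and a photometric
    term built from the color-channel gradients.  Any Gaussian kernel defined
    through g, e.g. w = exp(-0.5 * d^T g d / sigma^2) for a displacement d,
    then adapts to local color structure."""
    h, w, c = image.shape
    g = np.tile(np.eye(2), (h, w, 1, 1))
    for ch in range(c):
        gy, gx = np.gradient(image[..., ch].astype(float))
        g[..., 0, 0] += beta * gx * gx
        g[..., 0, 1] += beta * gx * gy
        g[..., 1, 0] += beta * gy * gx
        g[..., 1, 1] += beta * gy * gy
    return g
```

On a flat image the tensor reduces to the Euclidean metric, so the associated Gaussian kernel degenerates to the ordinary isotropic one; near edges it shrinks distances across the gradient, which is the smoothing-while-preserving-edges behavior exploited in the thesis.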
Morlot, Frédéric. "Processus spatio-temporels en géométrie stochastique et application à la modélisation de réseaux de télécommunication". Phd thesis, Télécom ParisTech, 2012. http://pastel.archives-ouvertes.fr/pastel-00931407.
Pełny tekst źródłaArib, Souhila. "Mécanismes de formation de coalitions d’agents dans les processus de planification". Thesis, Paris 9, 2015. http://www.theses.fr/2015PA090027.
Pełny tekst źródłaThe work presented in this thesis focuses on the coalition formation problem for self-interested agents that plan their activities in multi-agent systems. As a first step, we proposed a mechanism based on the analysis of the actions in the agents' plans and on reasoning about the plans of others. We then addressed the problem of coalition formation with dynamic constraints and preferences that agents reveal and communicate to others during their negotiations. Finally, we refined our coalition formation mechanism to allow a guided search of the coalitions by building a tree of constraints and a tree of coalitions, each explored by means of a Monte-Carlo algorithm.
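The Monte-Carlo exploration of a coalition space can be illustrated with a generic sketch. This is not the constraint-guided algorithm of the thesis: it simply samples random partitions of the agents, scores each with a user-supplied coalition value function, and keeps the best one found:

```python
import random

def mc_coalition_search(agents, value, n_samples=1000, seed=0):
    """Monte-Carlo exploration of the coalition-structure space: repeatedly
    sample a random partition of `agents`, score it as the sum of `value`
    over its coalitions, and keep the best partition seen."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(n_samples):
        partition = []
        for a in agents:  # sequentially place each agent
            if partition and rng.random() < 0.5:
                rng.choice(partition).append(a)   # join an existing coalition
            else:
                partition.append([a])             # or open a new one
        score = sum(value(frozenset(c)) for c in partition)
        if score > best_score:
            best, best_score = [set(c) for c in partition], score
    return best, best_score

# Example: synergy only when agents 1 and 2 cooperate.
v = lambda c: 3.0 if c >= {1, 2} else float(len(c))
print(mc_coalition_search([1, 2, 3], v))
```

In the thesis' setting, the sampled structures would additionally be pruned against the tree of constraints revealed during negotiation, rather than drawn uniformly.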
Hannachi, Ammar. "Imagerie multimodale et planification interactive pour la reconstruction 3D et la métrologie dimensionnelle". Thesis, Strasbourg, 2015. http://www.theses.fr/2015STRAD024/document.
Pełny tekst źródłaProducing industrially manufactured parts generates a very large number of data of various types, defining the manufacturing geometries as well as the quality of production. This PhD work was carried out within the framework of a cognitive vision system dedicated to the 3D evaluation of manufactured objects, possibly including free-form surfaces, taking geometric tolerances and uncertainties into account. This system allows comprehensive control of manufactured parts and provides the means for their automated 3D dimensional inspection. The implementation of a multi-sensor (passive and active) measuring system significantly improved the assessment quality through an enriched three-dimensional reconstruction of the object to be evaluated. Specifically, we simultaneously used a stereoscopic vision system and a structured-light system in order to reconstruct the edges and surfaces of various 3D objects.
Guérin, Clément. "Gestion de contraintes et expertise dans les stratégies d'ordonnancement". Thesis, Rennes 2, 2012. http://www.theses.fr/2012REN20025/document.
Pełny tekst źródłaOnly a few research works in psychology are devoted to scheduling, for example the planning of tasks and the occupation of workers and machines on the shop floor. In the literature, schedulers are mainly described from the procedural viewpoint. To describe scheduling activity, we adopted the complementary representational viewpoint in terms of constraint management. Two scheduling situations were studied: timetabling and industrial scheduling. By comparing novices and experts, we observed that the latter used constraints visible on the timetable or on the Gantt chart to solve the scheduling problem. Moreover, experts used a higher level of abstraction than novices in the control of processing. Finally, we highlighted the similarities and differences between industrial scheduling and timetabling. In addition, we conducted a multidisciplinary study, building on previous work in operational research, by evaluating a scheduling tool: we investigated the effect of the mutual-control modality on human scheduling decisions and on schedulers' management of breakdown risks in a shop.
Moignier, Cyril. "Dosimétrie des faisceaux de photons de petites dimensions utilisés en radiothérapie stéréotaxique : détermination des données dosimétriques de base et évaluation des systèmes de planification de traitement". Thesis, Paris 11, 2014. http://www.theses.fr/2014PA112258/document.
Pełny tekst źródłaThe dosimetry of small beams is challenging given their small size compared to the detectors, their high dose gradients and the lack of lateral electronic equilibrium. This PhD thesis aims to improve the accuracy of the dose delivered to the patient in stereotactic radiotherapy. On the one hand, the dosimetric data used to calibrate the treatment planning system (TPS) were determined using numerical simulations. To achieve this, two CyberKnife radiotherapy facilities were modelled using the PENELOPE Monte Carlo code. Output ratio measurements were performed with several active detectors and with two passive dosimeters (radiochromic film and micro-LiF) and compared with output factors calculated by simulation. Six detectors were modelled in order to study detector response in small beams. Among the detectors studied, only the radiochromic films were in agreement with the simulations; they can be used without correction factors. The disturbance of the detector response in small beams was explained either by the volume effect, induced by an active volume too large compared to the beam size, or by the mass-density effect, induced by detector body materials whose density is too far from that of water. The correction factors required to correct the disturbance caused by the non-water-equivalence and/or the low spatial resolution of each detector were calculated for the two CyberKnife systems. On the other hand, a 2D dose measurement protocol using radiographic films and a MATLAB program were developed. Stereotactic treatment plans were then performed for a phantom in order to assess the calculation algorithms implemented in the MultiPlan TPS (associated with the CyberKnife system).
The analysis of the 2D dose distributions for the stereotactic treatment plans showed that the "Pencil Beam"-based algorithm implemented in MultiPlan is suitable for dose calculation in homogeneous water-equivalent media but not in media of low electronic density such as the lung. Indeed, the dose is overestimated (by up to 40%) inside the field, which may reduce the efficiency of the tumor treatment, while it is underestimated outside the field, which can underestimate the dose to critical organs in the proximity of the tumor. Regarding the Monte Carlo algorithm implemented in MultiPlan, calculated and measured dose distributions are consistent; it is therefore the most suitable algorithm available in MultiPlan to estimate the dose received by a patient when low-density media are involved.
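The role of the small-field correction factors discussed above can be sketched with the widely used small-field formalism (Alfonso et al.): the field output factor is the ratio of detector readings in the clinical and reference fields, multiplied by a correction factor accounting for the detector's perturbation. The numbers below are illustrative, not from the thesis:

```python
def field_output_factor(m_clin, m_ref, k_correction):
    """Small-field output factor: ratio of detector readings in the clinical
    and reference fields, scaled by an output correction factor that accounts
    for the detector's perturbation (volume averaging, density effects)."""
    return (m_clin / m_ref) * k_correction

def output_correction_factor(dose_ratio_mc, reading_ratio):
    """Monte-Carlo determination of the correction factor: the true
    water-dose ratio (from simulation) divided by the measured reading
    ratio for the same pair of fields."""
    return dose_ratio_mc / reading_ratio

# Illustrative (hypothetical) numbers: a diode over-responds by ~4 % in a
# very small field, so its reading ratio must be scaled down accordingly.
k = output_correction_factor(dose_ratio_mc=0.68, reading_ratio=0.708)
print(round(field_output_factor(0.708, 1.0, k), 3))
```

A film dosimeter with a correction factor of 1, as found in the thesis for radiochromic films, would by this definition report the water-dose ratio directly.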
De Conto, Céline. "Evaluation dosimétrique des algorithmes implémentés dans les systèmes de planification de traitement en présence d'hétérogénéités de forte densité : cas de la sphère ORL en radiothérapie externe". Thesis, Besançon, 2014. http://www.theses.fr/2014BESA2063/document.
Pełny tekst źródłaOver the last few years, cancer treatment techniques in radiation therapy have become more complex, to better target the tumor while protecting the organs at risk. Treatment planning systems (TPS) perform a predictive calculation of the distribution of the dose absorbed by the patient (via CT images). In order to obtain an accurate dose within a reasonable time, the calculation is performed with simplified algorithms. In the presence of medical devices made of high-density metal (hip or dental prostheses), these algorithms reach their limits. Moreover, such devices disrupt the computed tomography reconstruction, creating artifacts on the images and thus making the delineation of organs difficult. The aim of this work is to evaluate the algorithms of the TPS in the presence of high-density heterogeneities, using experimental measurements and the Monte Carlo BEAMnrc code in an anthropomorphic phantom: on the one hand with natural samples, and on the other hand with calibrated samples. Then, a retrospective evaluation of the clinical algorithms against Monte Carlo is performed using patients treated with conformal radiotherapy and with intensity-modulated radiation therapy (IMRT). The measurements show an attenuation of up to 17% for dental amalgam compared with the clinical algorithm on CT images with artifacts, creating an under-dosage area in the target volume. All these results lead to recommendations for clinical treatments (use corrected CT images if the target volume is closer than 3 cm to the prosthesis, favor the AAA algorithm rather than Pencil Beam …)
Zhu, Wenwu. "Segmentation et recalage d'images TDM multi-phases de l'abdomen pour la planification chirurgicale". Thesis, Strasbourg, 2015. http://www.theses.fr/2015STRAD011/document.
Pełny tekst źródłaThe fusion of arterial- and venous-phase CT images of the entire abdominal viscera is critical for better diagnosis, surgical planning and treatment, since these two phases contain complementary information. However, non-rigid registration of abdominal images remains a big challenge because of the breathing motion, which causes sliding between the abdominal viscera and the abdomino-thoracic wall. The purpose of this thesis is to provide an accurate registration method for the abdominal viscera between venous- and arterial-phase CT images. In order to remove the effect of the sliding motion, we separate the image into large-motion and small-motion regions and perform the registration on new images from which the abdomino-thoracic wall and thoracic viscera are removed. The segmentation of these sliding interfaces is completed with our fast interactive tools within 10 minutes. Two state-of-the-art non-rigid registration algorithms are then applied to these new images and compared to the registration obtained with the original images. The evaluation, using four abdominal organs (liver, kidneys, spleen) and several vessel bifurcations, shows that our approach provides a much higher accuracy, within 1 mm.