Follow this link to see other types of publications on the topic: Extraction des sources.

Theses / dissertations on the topic "Extraction des sources"

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Below are the 50 best theses / dissertations for research on the topic "Extraction des sources".

Next to every source in the list of references there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic citation of the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a .pdf and read the abstract of the work online, if it is present in the metadata.

Browse theses / dissertations from many different scientific disciplines and compile a correct bibliography.

1

González-Valentín, Karen M. (Karen Mercedes) 1978. "Extraction of variation sources due to layout practices". Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/87206.

Full text of the source
Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2002.
Includes bibliographical references (p. 97).
by Karen M. González-Valentín.
S.M.
2

Yankova-Doseva, Milena. "TERMS - Text Extraction from Redundant and Multiple Sources". Thesis, University of Sheffield, 2010. http://etheses.whiterose.ac.uk/933/.

Full text of the source
Abstract:
In this work we present our approach to the identity resolution problem: discovering references to one and the same object that come from different sources. Solving this problem is important for a number of different communities (e.g. Database, NLP and Semantic Web) that process heterogeneous data where variations of the same objects are referenced in different formats (e.g. textual documents, web pages, database records, ontologies, etc.). Identity resolution aims at creating a single view into the data where different facts are interlinked and incompleteness is remedied. We propose a four-step approach that starts with schema alignment of incoming data sources. As a second step, candidate selection, we discard those entities that are totally different from the ones they are compared with. Next, the main evidence for the identity of two entities comes from applying similarity measures to their attribute values. The last step in the identity resolution process is data fusion, i.e. merging entities found to be identical into a single object. The principal novel contribution of our solution is the use of a rich semantic knowledge representation that allows for flexible and unified interpretation during the resolution process. Thus we are not restricted in the type of information that can be processed (although we have focussed our work on problems relating to information extracted from text). We report the implementation of these four steps in an IDentity Resolution Framework (IDRF) and their application to two use-cases. We propose a rule-based approach for customisation in each step and introduce logical operators and their interpretation during the process. Our final evaluation shows that this approach facilitates high accuracy in resolving identity.
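
The four-step pipeline sketched in this abstract (schema alignment, candidate selection, attribute similarity, fusion) is easy to illustrate in miniature. Below is a minimal Python sketch under stated assumptions: the record layout, the token-overlap blocking rule, the Jaccard measure and the 0.3 threshold are illustrative choices, not the IDRF implementation.

```python
from itertools import combinations

def jaccard(a: str, b: str) -> float:
    """Token-level Jaccard similarity between two attribute values."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def candidate_pairs(entities):
    """Candidate selection: keep only pairs that share at least one name token."""
    for e1, e2 in combinations(entities, 2):
        if set(e1["name"].lower().split()) & set(e2["name"].lower().split()):
            yield e1, e2

def resolve(entities, threshold=0.3):
    """Score candidate pairs on attribute similarity, then fuse identical ones."""
    merged, used = [], set()
    for e1, e2 in candidate_pairs(entities):
        if jaccard(e1["name"], e2["name"]) >= threshold:
            merged.append({**e1, **e2})      # data fusion: merge attribute sets
            used.update((id(e1), id(e2)))
    merged.extend(e for e in entities if id(e) not in used)
    return merged

records = [
    {"name": "Barack Obama", "born": "1961"},
    {"name": "B. Obama", "office": "President"},  # same real-world entity
    {"name": "Angela Merkel"},
]
print(resolve(records))
```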
3

Leouffre, Marc. "Extraction de sources d'électromyogrammes et évaluation des tensions musculaires". Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENT009/document.

Full text of the source
Abstract:
Evaluation of muscle tensions in movement and gait sciences is of great interest in the fields of sports, health and ergonomics. Biomechanics in particular addresses these problems and has developed the use of inverse kinematics to compute internal muscle tensions from external physical measures. Muscular redundancy remains, however, a complex issue: there are more muscles than degrees of freedom, and thus more unknowns than equations, which makes inverse kinematics an under-determined problem requiring optimization techniques. In this context, electromyography (EMG), an electro-physiological signal that can be measured on the skin surface, gives an idea of the underlying muscle activities. Knowing muscle activities could provide additional information to feed the optimization procedures and could help improve the accuracy of estimated muscle tensions during real gestures or gait situations. There are even situations in which measuring external physical variables such as forces, positions or accelerations is not feasible, because it would require equipment incompatible with the object of the study. This is often the case in ergonomics, where equipping the object of the study with sensors is either too expensive or physically too cumbersome. In such cases EMG is very handy as a non-invasive measure that does not require the environment to be equipped with other sensors. EMG, however, has its own limits: surface EMG on small and closely located muscles, such as the muscles of the forearm, can be subject to "cross-talk". Cross-talk is the cross-contamination of several sensors; it results from the propagation of the signals of more than one muscle onto one sensor. In the presence of cross-talk it is not possible to associate an EMG sensor with a given muscle. Signal processing offers techniques for dealing with this kind of problem: source separation techniques allow the estimation of unknown sources from several sensors recording mixtures of these sources. Applying source separation to EMG can provide source estimations that reflect individual muscle activities without the effect of cross-talk. First, the benefits of using surface EMG during an ergonomics study of an innovative human-computer interface are shown. EMG pointed out a relatively high level of muscle co-contraction, which can be explained by the need to stabilize the joints for a more accurate control of the device. It seems legitimate to think that source separation techniques would provide signals that better represent single muscle activities, and that these would improve the quality of this study. Then, the precise experimental conditions under which linear instantaneous source separation techniques work are studied. The validity of the instantaneity hypothesis in particular is tested on real surface EMG signals, and its strong dependency on relative sensor locations is shown. Finally, a method to improve the robustness of linear instantaneous source separation with respect to the instantaneity hypothesis is proposed. This method relies on non-negative matrix factorization (NMF) of the EMG signal envelopes.
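
The closing idea, NMF applied to EMG envelopes, can be sketched compactly. The snippet below is a minimal illustration, assuming rectified-and-smoothed envelopes stacked into a non-negative electrodes-by-samples matrix; the synthetic envelopes, the mixing matrix simulating cross-talk and the choice of two components are assumptions for the example, not the thesis protocol.

```python
import numpy as np
from sklearn.decomposition import NMF

# Two synthetic non-negative "muscle activity" envelopes (the unknown sources).
t = np.linspace(0, 1, 500)
s1 = np.clip(np.sin(2 * np.pi * 3 * t), 0, None)         # bursts of muscle 1
s2 = np.clip(np.sin(2 * np.pi * 5 * t + 1.0), 0, None)   # bursts of muscle 2
S = np.vstack([s1, s2])                                   # (2 sources, 500 samples)

# Non-negative mixing simulates cross-talk: every electrode sees both muscles.
A = np.array([[1.0, 0.4],
              [0.3, 1.0],
              [0.6, 0.6]])
X = A @ S                                                 # (3 electrodes, 500 samples)

# Factorize X ~ W H: rows of H estimate source envelopes, W the mixing weights.
model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)
H = model.components_
print(W.shape, H.shape)   # (3, 2) (2, 500)
```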
4

Dumitrescu, Stefan Daniel. "L'extraction d'information des sources de données non structurées et semi-structurées". Toulouse 3, 2011. http://thesesups.ups-tlse.fr/1555/.

Full text of the source
Abstract:
Thesis objective: in the context of recently developed large-scale knowledge sources (general ontologies), investigate possible new approaches to major areas of Information Extraction (IE) and related fields. The thesis overviews the field of Information Extraction and focuses on the task of entity recognition in natural language texts, a required step for any IE system. Given the availability of large knowledge resources in the form of semantic graphs, an approach that treats the sub-tasks of Word Sense Disambiguation and Named Entity Recognition in a unified manner becomes possible. The first implemented system using this approach recognizes entities (words, both common and proper nouns) in free text and assigns them ontological classes, effectively disambiguating them. A second implemented system, inspired by the semantic information contained in the ontologies, also attempts a new approach to the classic problem of text classification, with good results.
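
The unified recognition-plus-disambiguation idea (map every known noun, common or proper, to an ontological class) can be caricatured in a few lines. A toy sketch, assuming a hand-made mini-ontology and naive whitespace tokenization; both are invented for illustration and bear no relation to the thesis's actual resources.

```python
# Toy ontology: surface form -> ontological class.
ONTOLOGY = {
    "paris": "City",
    "france": "Country",
    "seine": "River",
    "louvre": "Museum",
}

def recognize(text: str):
    """Assign an ontological class to every known token (common or proper noun)."""
    tokens = text.lower().replace(",", " ").split()
    return [(tok, ONTOLOGY[tok]) for tok in tokens if tok in ONTOLOGY]

print(recognize("The Louvre in Paris, France, sits on the Seine"))
# [('louvre', 'Museum'), ('paris', 'City'), ('france', 'Country'), ('seine', 'River')]
```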
5

Horton, Bryan. "Rotational motion of pendula systems for wave energy extraction". Thesis, Available from the University of Aberdeen Library and Historic Collections Digital Resources, 2009. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?application=DIGITOOL-3&owner=resourcediscovery&custom_att_2=simple_viewer&pid=25873.

Full text of the source
6

Ferreira, Lage Sandra. "The neurotoxin β-N-methylamino-L-alanine (BMAA) : Sources, bioaccumulation and extraction procedures". Doctoral thesis, Stockholms universitet, Institutionen för ekologi, miljö och botanik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-132142.

Full text of the source
Abstract:
β-methylamino-L-alanine (BMAA) is a neurotoxin linked to neurodegeneration, which is manifested in the devastating human diseases amyotrophic lateral sclerosis, Alzheimer's and Parkinson's disease. This neurotoxin is known to be produced by almost all tested species within the cyanobacterial phylum, including free-living as well as symbiotic strains. The global distribution of the BMAA producers ranges from a terrestrial ecosystem on the Island of Guam in the Pacific Ocean to an aquatic ecosystem in Northern Europe, the Baltic Sea, where massive surface blooms occur annually. BMAA has been shown to accumulate in the Baltic Sea food web, with the highest levels in bottom-dwelling fish species as well as in mollusks. One of the aims of this thesis was to test the bottom-dwelling bioaccumulation hypothesis using a larger number of samples allowing a statistical evaluation. Hence, a large set of fish individuals from the lake Finjasjön were caught, and the BMAA concentrations in different tissues were related to the season of catching, fish gender, total weight and species. The results reveal that fish total weight and fish species were positively correlated with the BMAA concentration in the fish brain: significantly higher concentrations of BMAA were detected in the brains of plankti-benthivorous fish species and of heavier (potentially older) individuals. Another goal was to investigate the potential production of BMAA by other phytoplankton organisms. Diatom cultures were therefore investigated and confirmed to produce BMAA, even at higher concentrations than cyanobacteria. All diatom cultures studied during this thesis work were shown to contain BMAA, as was one dinoflagellate species. This might imply that the environmental spread of BMAA in aquatic ecosystems is even wider than previously thought. Earlier reports on the concentration of BMAA in different organisms have shown highly variable results, and the methods used for quantification have been intensively discussed in the scientific community. In the most recent studies, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has become the instrument of choice, due to its high sensitivity and selectivity. Even so, different studies show quite variable concentrations of BMAA. In this thesis, three of the most common BMAA extraction protocols were evaluated in order to find out whether the extraction could be one of the sources of variability. The method involving precipitation of proteins using trichloroacetic acid gave the best performance, complying with all in-house validation criteria. However, extractions of diatom and cyanobacteria cultures with this validated method, quantified using LC-MS/MS, still resulted in variable BMAA concentrations, which suggests that biological factors also contribute to the discrepancies. The current knowledge on the environmental factors that can induce or reduce BMAA production is still limited. In cyanobacteria, production of BMAA was earlier shown to be negatively correlated with nitrogen availability, both in laboratory cultures and in natural populations. Based on this observation, it was suggested that in unicellular non-diazotrophic cyanobacteria BMAA might take part in nitrogen metabolism. In order to find out whether BMAA has a similar role in diatoms, BMAA was added to two diatom species in culture, at concentrations corresponding to those earlier found in the diatoms.
The results suggest that BMAA might induce a nitrogen starvation signal in diatoms, as was earlier observed in cyanobacteria. However, diatoms recover shortly by the extracellular presence of excreted ammonia. Thus, also in diatoms, BMAA might be involved in the nitrogen balance in the cell.

At the time of the doctoral defense, the following paper was unpublished and had a status as follows: Paper 4: Manuscript.

7

Xu, Xu. "Nonlinear dynamics of parametric pendulum for wave energy extraction". Thesis, University of Aberdeen, 2005. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=189414.

Full text of the source
Abstract:
A new concept, extracting energy from sea waves by a parametric pendulor, has been explored in this project. It is based on the conversion of vertical oscillations to rotational motion by means of a parametrically-excited pendulor, i.e. a pendulum operating in rotational mode. The main advantage of this concept lies in a direct conversion from vertical oscillations to rotations of the pendulum pivot. This thesis firstly reviews a number of well-established linear and nonlinear theories of sea waves; Airy's sea wave model has been used in the modelling of the sea waves and of a parametric pendulum excited by them, while the third- or fifth-order Stokes models can potentially be implemented in future studies. The equation of motion obtained for a parametric pendulum excited by sea waves has the same form as for a simple parametrically-excited pendulum. Then, to deepen the fundamental understanding, an extensive theoretical analysis has been conducted on a parametrically-excited pendulum using both numerical and analytical methods. The numerical investigations focus on the bifurcation scenarios and resonance structures, particularly for the rotational motions. Analytical analysis of the system has been performed by applying perturbation techniques. The approximate solutions, the resonance boundary and the existence boundary of rotations have been obtained, with a good correspondence to numerical results. The experimental study explored oscillations, rotations and chaotic motions of the pendulum.
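
For reference, the equation of motion of a damped, parametrically excited pendulum is commonly written in the nondimensional form

\[ \ddot{\theta} + \gamma\,\dot{\theta} + (1 + p\cos\omega t)\,\sin\theta = 0, \]

where γ is the damping coefficient, p the parametric forcing amplitude and ω the forcing frequency; rotational solutions are those in which θ advances by 2π per forcing period. This normalized form is a standard one for such systems and is given here as background, not quoted from the thesis.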
8

Albezzawy, Muhammad Nabil Mustafa. "Advanced signal processing methods for source identification using references". Electronic Thesis or Diss., Lyon, INSA, 2024. http://www.theses.fr/2024ISAL0074.

Full text of the source
Abstract:
Rank-reduced reference/coherence techniques based on the use of references, i.e. fixed sensors, are widely used to solve the two equivalent problems of source extraction and resynchronization encountered during remote sensing of physical fields when the number of references surpasses the number of incoherent sources. In such a case the cross-spectral matrix (CSM) becomes ill-conditioned, which invalidates the least squares (LS) solution. Although the truncated singular value decomposition (TSVD) has been successfully applied in the literature to solve this problem, its validity is limited to the case of scalar noise on the references, and it is very difficult to define a truncation threshold when the singular values decrease gradually. This thesis proposes a solution based on finding a set of virtual references that is maximally correlated with the field measurements, named the maximally-coherent reference (MCR) technique. This solution is optimal, especially in the case of correlated noise on the references, where TSVD fails. However, the technique also includes an eigenvalue truncation step, similar to the one required for the TSVD, which necessitates a priori knowledge or an estimate of the number of incoherent sources, i.e. source enumeration, an ill-posed inverse problem insufficiently investigated in the literature within the framework of reference techniques. In this thesis, after providing a unified formalism for all the reference techniques in the literature, three alternative source enumeration methods, applicable to all the reference techniques, are presented: a direct likelihood ratio test (LRT) against the saturated model, a parametric bootstrap technique and a cross-validation approach. A comparative study is performed among the three methods, based on simulated numerical data, real sound experimental data and real electrical motor data. The results showed two important outcomes. The first is that the number of snapshots (spectral windows) used in the spectral analysis greatly affects the performance of the three methods, which behave differently for the same number of snapshots. The second is that parametric bootstrapping turned out to be the best method in terms of both estimation accuracy and robustness with regard to the number of snapshots used. Finally, the MCR technique accompanied by bootstrapping was employed for source extraction and resynchronization of real data from laboratory experiments and from an e-motor, and it returned better results than the LS solution and the TSVD employed for the same purpose.
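
The TSVD baseline that the MCR technique improves upon can be sketched directly. A minimal Python illustration under stated assumptions: the synthetic snapshot model, the sensor counts and the hand-picked truncation rank are invented for the example, and the rank-k pseudo-inverse of the reference CSM stands in for the plain least-squares inverse.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic setup: 2 incoherent sources seen by 4 references and 6 field sensors.
n_src, n_ref, n_field, n_snap = 2, 4, 6, 200
S = rng.standard_normal((n_src, n_snap)) + 1j * rng.standard_normal((n_src, n_snap))
R = rng.standard_normal((n_ref, n_src)) @ S + 0.01 * rng.standard_normal((n_ref, n_snap))
F = rng.standard_normal((n_field, n_src)) @ S

# Cross-spectral matrices estimated by snapshot averaging.
Srr = R @ R.conj().T / n_snap   # reference CSM, ill-conditioned (rank ~ 2)
Srf = R @ F.conj().T / n_snap   # reference-to-field cross-spectra

# Source-coherent part of the field: Sff_coh = Srf^H pinv_k(Srr) Srf,
# where pinv_k is the rank-k truncated pseudo-inverse (k = number of sources).
k = n_src
U, s, Vh = np.linalg.svd(Srr)
Srr_pinv_k = (Vh[:k].conj().T / s[:k]) @ U[:, :k].conj().T
Sff_coherent = Srf.conj().T @ Srr_pinv_k @ Srf
print(np.round(np.diag(Sff_coherent).real, 2))
```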
9

Syed, Ali Asgher. "Hole extraction layer/perovskite interfacial modification for high performing inverted planar perovskite solar cells". HKBU Institutional Repository, 2018. https://repository.hkbu.edu.hk/etd_oa/553.

Full text of the source
Abstract:
Organo-metallic halide perovskite solar cells (PSCs) are considered a promising alternative photovoltaic technology due to the advantages of low-cost solution fabrication and high power conversion efficiency (PCE). PSCs can be made in a conventional (n-i-p) structure or an inverted (p-i-n) configuration. The PCE of conventional n-i-p type PSCs is slightly higher than that of inverted p-i-n type PSCs. However, the TiO2 electron transporting layer adopted in conventional PSCs is formed at a high sintering temperature of >450 °C, which limits the application of conventional PSCs on flexible substrates that are not compatible with such a processing temperature. The hole extraction layer (HEL) in inverted p-i-n type PSCs can be prepared by low-temperature solution fabrication processes, which can be adopted for achieving high-performance, large-area flexible solar cells at low cost. Inverted PSCs with PCEs ranging from 10 to 20% have been reported over the past few years. In comparison with the progress of other photovoltaic technologies, the rapid enhancement in the PCE of PSCs offers an attractive option for commercial viability. The aim of this PhD project is to study the origin of the improvement in the performance of solution-processable inverted PSCs. The surface morphological and electronic properties of the HEL are crucial for the growth of the perovskite active layer and hence for the performance of the inverted PSCs. Enhancement in short-circuit current density (Jsc), reduced loss in open-circuit voltage (Voc), and improvement in charge collection efficiency (ηcc) through suppression of charge recombination were investigated systematically via controlled growth of the perovskite active layer in solution-processed inverted PSCs.
Poly(3,4-ethylenedioxythiophene):poly(4-styrenesulfonate) (PEDOT:PSS) is one of the most widely used solution-processable conductive materials for hole transport in optoelectronic devices. A PEDOT:PSS HEL is also a good electron blocking layer due to its high LUMO level. However, it has been reported that the PEDOT:PSS HEL contributes to the deterioration of the stability of PSCs due to its acidic and hygroscopic nature. Modification of PEDOT:PSS using solvent additives, or the incorporation of metallic oxide nanoparticles, has been reported to improve the processability and performance of inverted PSCs. This work focuses primarily on realizing the controlled growth of the perovskite active layer via HEL/perovskite interfacial modification, using a sodium citrate-treated PEDOT:PSS HEL and a WO3-PEDOT:PSS composite HEL. Apart from investigating the properties of the modified PEDOT:PSS HELs, the purpose of the work is to improve the understanding of the effect of the modified HEL on the growth of the perovskite layer, to reveal the charge recombination processes under different operation conditions, and to analyze the charge extraction probability, thereby improving the overall performance of the PSCs. A PCE of >11.30% was achieved for PSCs with a sodium citrate-modified PEDOT:PSS HEL, which is >20% higher than that of a structurally identical control device having a pristine PEDOT:PSS HEL (9.16%). The incident photon-to-current efficiency (IPCE) and light intensity-dependent J-V measurements reveal that the sodium citrate-modified PEDOT:PSS HEL helps to boost the performance of the inverted PSCs in two ways: (1) it improves the processability of the perovskite active layer on the HEL, and (2) it enhances the charge extraction efficiency at the HEL/perovskite interface. The suppression of charge recombination in the PSCs with a modified HEL was also examined using photocurrent-effective voltage (Jph-Veff) and transient photocurrent (TPC) measurements. The morphological and structural properties of the perovskite layers were investigated using scanning electron microscopy (SEM) and X-ray diffraction (XRD). The results reveal that a high-quality perovskite active layer forming a complete perovskite phase was attained on the modified HEL. The surface electronic properties of the modified and pristine PEDOT:PSS layers were studied using X-ray photoelectron spectroscopy (XPS) and ultraviolet photoelectron spectroscopy (UPS). The XPS results reveal that the sodium citrate treatment partially removes the PSS component of the PEDOT:PSS, increasing the ratio of PEDOT to PSS from 0.108 for the pristine PEDOT:PSS HEL to 0.197 for the treated PEDOT:PSS HEL. UPS measurements also show an observable reduction in the work function of the modified HEL, implying that the sodium citrate-modified PEDOT:PSS HEL possesses an improved electron blocking capability, which is beneficial for the efficient operation of the inverted PSCs. The performance enhancement in MAPbI3-based PSCs with a tungsten oxide (WO3)-PEDOT:PSS composite HEL was also analyzed. The uniform composite WO3-PEDOT:PSS HEL was formed on an indium tin oxide (ITO) surface by a solution fabrication process. The morphological and surface electronic properties of the WO3-PEDOT:PSS composite film were examined using AFM, XPS, UPS and Raman spectroscopy. SEM images reveal that the perovskite films grown on the composite HEL had full coverage without observable pinholes.
The XRD results show clearly that no residual lead iodide phase was observed, suggesting that a complete perovskite phase was obtained for the perovskite active layer grown on the composite HEL. A volume ratio of WO3 to PEDOT:PSS of 1:0.25 was found optimal for achieving enhanced current density and Voc in the PSCs. It is clearly demonstrated that the WO3-PEDOT:PSS composite HEL helps to improve the charge collection probability through suppression of charge recombination at the MAPbI3/composite HEL interface. The charge extraction efficiency at the perovskite/PEDOT:PSS and perovskite/composite HEL interfaces was investigated by analyzing the PL quenching efficiency of the MAPbI3 active layer. The PL quenching efficiency of the MAPbI3/composite HEL samples is one order of magnitude higher than that measured for the perovskite/pristine PEDOT:PSS sample, suggesting an enhanced hole extraction probability at the MAPbI3/composite HEL interface. The combined effects of improved perovskite crystal growth and enhanced charge extraction capability result in inverted PSCs with a PCE of 12.65%, which is 22% higher than that of a structurally identical control device (10.39%). The WO3-PEDOT:PSS composite HEL also benefits the efficient operation of the PSCs, as demonstrated in the stability test in comparison with the control cell under the same aging conditions. With the progress made in improving the performance of MAPbI3-based PSCs, the research was extended to study the performance of efficient PSCs with the mixed halide MA0.7FA0.3Pb(I0.9Br0.1)3. The effect of the annealing temperature on the growth of the mixed MA0.7FA0.3Pb(I0.9Br0.1)3 perovskite active layer was analyzed, and the optimal growth of the mixed perovskite active layer occurred at an annealing temperature of 100°C. UPS results reveal that the ionization potential of 5.76 eV measured for the mixed-cation perovskite is lower than that of the MAPbI3-based single-cation perovskite layer (5.85 eV), while the corresponding electron affinities were 4.28 eV for the mixed perovskite and 4.18 eV for the MAPbI3 layer. The changes in the bandgap and the energy levels of the MA0.7FA0.3Pb(I0.9Br0.1)3 and MAPbI3 active layers were examined using UV-vis absorption spectroscopy and UPS measurements. Compared to the MAPbI3-based control cell, a 23% increase in Jsc, a 15% increase in Voc and an overall 25% increase in PCE were achieved for the MA0.7FA0.3Pb(I0.9Br0.1)3-based PSCs. An obvious improvement in charge collection efficiency in MA0.7FA0.3Pb(I0.9Br0.1)3-based PSCs operated at different Veff was clearly manifested in the light intensity-dependent J-V characteristic measurements. The PL quenching efficiency also shows that the charge transfer between MA0.7FA0.3Pb(I0.9Br0.1)3 and the PEDOT:PSS HEL is one order of magnitude higher than in the MAPbI3-based PSCs, suggesting improved interfacial properties at the MA0.7FA0.3Pb(I0.9Br0.1)3/HEL interface. The impact of incorporating the mixed MA0.7FA0.3Pb(I0.9Br0.1)3 perovskite active layer on the PCE and stability of the PSCs was further studied using a combination of TPC measurements and aging tests. The stability of the MA0.7FA0.3Pb(I0.9Br0.1)3- and MAPbI3-based PSCs was monitored over a period of >2 months: the MA0.7FA0.3Pb(I0.9Br0.1)3-based PSCs are more stable than the MAPbI3-based PSCs aged under the same conditions.
The aging test supports the findings made with the TPC and light intensity-dependent J-V measurements: the improved interfacial quality at the perovskite/HEL interface and the enhanced charge extraction capability are favorable for the efficient and stable operation of MA0.7FA0.3Pb(I0.9Br0.1)3-based PSCs.
10

Zaman, Tauhid R. "Information extraction with network centralities : finding rumor sources, measuring influence, and learning community structure". Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/70410.

Full text of the source
Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2011.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 193-197).
Network centrality is a function that takes a network graph as input and assigns a score to each node. In this thesis, we investigate the potential of network centralities for addressing inference questions arising in the context of large-scale networked data. These questions are particularly challenging because they require algorithms which are extremely fast and simple so as to be scalable, while at the same time they must perform well. It is this tension between scalability and performance that this thesis aims to resolve by using appropriate network centralities. Specifically, we solve three important network inference problems using network centrality: finding rumor sources, measuring influence, and learning community structure. We develop a new network centrality called rumor centrality to find rumor sources in networks. We give a linear time algorithm for calculating rumor centrality, demonstrating its practicality for large networks. Rumor centrality is proven to be an exact maximum likelihood rumor source estimator for random regular graphs (under an appropriate probabilistic rumor spreading model). For a wide class of networks and rumor spreading models, we prove that it is an accurate estimator. To establish the universality of rumor centrality as a source estimator, we utilize techniques from the classical theory of generalized Polya's urns and branching processes. Next we use rumor centrality to measure influence in Twitter. We develop an influence score based on rumor centrality which can be calculated in linear time. To justify the use of rumor centrality as the influence score, we use it to develop a new network growth model called topological network growth. We find that this model accurately reproduces two important features observed empirically in Twitter retweet networks: a power-law degree distribution and a superstar node with very high degree. Using these results, we argue that rumor centrality is correctly quantifying the influence of users on Twitter. These scores form the basis of a dynamic influence tracking engine called Trumor which allows one to measure the influence of users in Twitter or more generally in any networked data. Finally we investigate learning the community structure of a network. Using arguments based on social interactions, we determine that the network centrality known as degree centrality can be used to detect communities. We use this to develop the leader-follower algorithm (LFA) which can learn the overlapping community structure in networks. The LFA runtime is linear in the network size. It is also non-parametric, in the sense that it can learn both the number and size of communities naturally from the network structure without requiring any input parameters. We prove that it is very robust and learns accurate community structure for a broad class of networks. We find that the LFA does a better job of learning community structure on real social and biological networks than more common algorithms such as spectral clustering.
by Tauhid R. Zaman.
Ph.D.
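
On trees, the rumor centrality defined in this thesis has a well-known closed form, R(v) = N! / ∏u T_u^v, where T_u^v is the size of the subtree rooted at u when the tree is rooted at v; the node maximizing R is the maximum-likelihood source estimate. A minimal sketch of that formula (the five-node path graph is an arbitrary example, not data from the thesis):

```python
import math
import networkx as nx

def rumor_centrality(tree: nx.Graph, v) -> float:
    """R(v) = N! divided by the product of all subtree sizes when rooted at v."""
    sizes = {}

    def subtree_size(node, parent):
        size = 1
        for nb in tree.neighbors(node):
            if nb != parent:
                size += subtree_size(nb, node)
        sizes[node] = size
        return size

    subtree_size(v, None)
    return math.factorial(tree.number_of_nodes()) / math.prod(sizes.values())

G = nx.path_graph(5)                                 # 0 - 1 - 2 - 3 - 4
scores = {v: rumor_centrality(G, v) for v in G}
print(max(scores, key=scores.get))                   # the centre node 2 wins
```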
11

Cooper, Erica L. "Automatic repair and recovery for Omnibase : robust extraction of data from diverse Web sources". Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/61157.

Full text of the source
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2010.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 31).
In order to make the best use of the multitude of diverse, semi-structured sources of data available on the internet, information retrieval systems need to reliably access the data on these different sites in a manner that is robust to the changes in format or structure that these sites might undergo. An interface that gives a system uniform, programmatic access to the data on some web site is called a web wrapper, and the process of inferring a wrapper for a given website based on a few examples of its pages is known as wrapper induction. A challenge of using wrappers for online information extraction arises from the dynamic nature of the web: even the slightest change to the format of a web page may be enough to invalidate a wrapper. Thus, it is important to be able to detect when a wrapper no longer extracts the correct information, and for the system to be able to recover from this type of failure. This thesis demonstrates improved error detection as well as methods of recovery and repair for broken wrappers in START, a natural-language question-answering system developed by Infolab at MIT.
by Erica L. Cooper.
M.Eng.
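
The error-detection idea (notice that a wrapper no longer extracts valid data, then trigger repair) can be illustrated in miniature. A toy sketch in which the wrapper is a regular expression and validity is a range check; the pattern, page snippets and sanity rule are invented for the example and are not Omnibase's actual wrappers.

```python
import re

# A "wrapper": extract a film's release year from pages with a known layout.
WRAPPER = re.compile(r"Released:\s*(\d{4})")

def extract_year(page_html: str):
    """Apply the wrapper and validate the result; None signals a broken wrapper."""
    match = WRAPPER.search(page_html)
    if match is None:
        return None                       # layout changed: pattern not found
    year = int(match.group(1))
    if not 1880 <= year <= 2100:          # semantic sanity check on the value
        return None
    return year

old_page = "Title: The Matrix<br>Released: 1999"
new_page = "Title: The Matrix<br>Premiere year: 1999"  # redesign breaks the wrapper
print(extract_year(old_page))   # 1999
print(extract_year(new_page))   # None -> trigger recovery / re-induction
```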
12

Miranda, Ackerman Eduardo Jacobo. "Extracting Causal Relations between News Topics from Distributed Sources". Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-130066.

Full text of the source
Abstract:
The overwhelming amount of online news presents a challenge called news information overload. To mitigate this challenge we propose a system to generate a causal network of news topics. To extract this information from distributed news sources, a system called Forest was developed. Forest retrieves documents that potentially contain causal information regarding a news topic. The documents are processed at the sentence level to extract causal relations and news topic references, i.e. the phrases used to refer to a news topic, such as "The World Cup" or "The Financial Meltdown". Forest uses a machine learning approach to classify causal sentences and then renders the potential cause and effect of each sentence; the potential cause and effect are in turn classified as news topic references. Both classifiers use an algorithm developed within our working group, which performs better than several well-known classification algorithms on the aforementioned tasks. In our evaluations we found that participants consider causal information useful for understanding the news, and that, while we cannot extract causal information for all news topics, it is highly likely that we can extract causal relations for the most popular news topics. To evaluate the accuracy of the extractions made by Forest, we completed a user survey and found that, by providing the top-ranked results, we obtained a high accuracy in extracting causal relations between news topics.
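
The causal-sentence classification step can be approximated with a standard text-classification pipeline. A minimal scikit-learn sketch; the tiny labelled set is fabricated for illustration, and a linear SVM merely stands in for the working group's own algorithm mentioned above.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy labelled data: 1 = causal sentence, 0 = non-causal.
sentences = [
    "The financial meltdown caused a spike in unemployment",
    "Flooding led to the evacuation of the city",
    "The world cup resulted in a tourism boom",
    "The president visited the stadium yesterday",
    "Officials announced the schedule for next week",
    "The team trained in the morning",
]
labels = [1, 1, 1, 0, 0, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(sentences, labels)
print(clf.predict(["The earthquake caused widespread damage"]))  # expect [1]
```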
13

Lawrie, Scott. "Understanding the plasma and improving extraction of the ISIS Penning H⁻ ion source". Thesis, University of Oxford, 2017. https://ora.ox.ac.uk/objects/uuid:1648761a-57b1-4d6f-8281-9d1c36ccd46a.

Full text of the source
Abstract:
A Penning-type surface-plasma negative hydrogen (H⁻) ion source has been delivering beam at the ISIS pulsed spallation neutron and muon facility for over thirty years. It is one of the most powerful and well-renowned H⁻ sources in the world. Although long-term experience has allowed the source to be operated reliably and set up in a repeatable way, it is treated as something of a 'black box': the detailed plasma physics of why it works has always been unclear. A vacuum Vessel for Extraction and Source Plasma Analyses (VESPA) has been developed to understand the ISIS ion source plasma and improve the beam extracted from it. The VESPA ion source is operated in a completely new regime whereby the analysing sector dipole magnet housed inside a refrigerated 'cold box', presently used on ISIS, is replaced by an on-axis extraction system. The new extraction system incorporates a novel einzel lens with an elliptical aperture. This is the first demonstration of an elliptical einzel being used to focus an asymmetric H⁻ ion beam. With the dipole magnet removed, the ion source has been shown to produce 85 mA of H⁻ beam current at normal settings, of which 80 mA is transported through the new einzel lens system, with a normalised RMS emittance of 0.2 π mm mrad. Optical emission spectroscopy measurements have shown a plasma density of 10¹⁹ m⁻³, an H₂ dissociation rate of 70%, an almost constant electron temperature of 3.5 eV and an atomic temperature which increases linearly above the electron temperature. In support of these principal measurements, rigorous particle tracking, electrostatic and thermal simulations were performed. In addition, a suite of new equipment was manufactured by the author, including a fast pressure gauge, a temperature controller, a high voltage einzel lens circuit, a fast beam chopper and a caesium detection system.
14

Lindqvist, Max. "Insights into the plasma and beam physics close to the extraction surface in H⁻/D⁻ sources for fusion based on 3D-PIC MCC modeling". Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASP133.

Full text of the source
Abstract:
Negative hydrogen or deuterium ions for the ITER Neutral Beam Injection system are produced in radio-frequency ion sources, mainly by surface production, and accelerated through a multi-aperture grid system. One of the main limiting factors during the operation of such ion sources is the amount of co-extracted electrons, particularly during operation with deuterium. For a correct description of the particle dynamics close to the Plasma Grid (PG), where a 3D magnetic field is present, self-consistent 3D-PIC MCC modeling is needed. The 3D-PIC MCC code ONIX has been used to simulate one PG aperture in the ELISE ion source, a half-size ITER-like prototype ion source at IPP Garching. The original code was improved by adding a plasma generation module that allows modeling the biasing of the PG with respect to the source walls. ONIX was coupled with the beam code IBSimu to correlate particle properties from the plasma to the beam and to study the extraction probability and beam divergence of negative ions for different PG biases and PG geometries, providing insights into grid optimization. The impact of the plasma parameters, the PG bias and the grid geometry on the co-extraction of electrons is presented. By increasing the PG bias above the floating potential, the amount and temporal instability of co-extracted electrons are strongly decreased, in agreement with experiments. Additionally, a lower electron temperature of around 1 eV can reduce the amount of co-extracted electrons by a factor of 4 compared to 2 eV. While the PG geometries studied did not have a significant impact on the co-extraction of electrons, the plasma-facing angle of the PG affects the extraction probability and the accumulation of surface-produced negative ions, due to the acceleration by the Debye sheath. With a shallow angle, more ions are transported to the central region near the meniscus, resulting in a lower core divergence. For future work, developing a self-consistent physical process for simulating the accumulation of surface-produced negative ions in the plasma volume is important; this is challenging due to the time-scale and domain-size constraints inherent in 3D-PIC MCC modeling. As of now, the experimentally measured density of surface-produced negative ions has not been reproduced in 3D-PIC MCC simulations.
15

Serrano, Laurie. "Vers une capitalisation des connaissances orientée utilisateur : extraction et structuration automatiques de l'information issue de sources ouvertes". Caen, 2014. http://www.theses.fr/2014CAEN2011.

Full text of the source
Abstract:
Due to the considerable increase in freely available data (especially on the Web), discovering relevant information in textual content is a critical challenge. Open Source Intelligence (OSINT) specialists are particularly concerned by this phenomenon, as they try to mine large amounts of heterogeneous information to acquire actionable intelligence. This collection process is still largely done by hand in order to build knowledge sheets summarizing the knowledge acquired about a specific entity. Given this context, the main goal of this thesis is to reduce and facilitate the daily work of intelligence analysts. To this end, our research revolves around three main axes: knowledge modeling, text mining and knowledge gathering. We explored the literature of these different domains to develop a global knowledge-gathering system. Our first contribution is a domain ontology dedicated to knowledge representation for OSINT purposes, which comprises a specific definition and modeling of the event concept for this domain. Secondly, we developed and evaluated an event recognition system based on two different extraction approaches: the first relies on hand-crafted rules, the second on a frequent-pattern learning technique. As our third contribution, we propose a semantic aggregation process as a necessary post-processing step to enhance the quality of the extracted events and to convert extraction results into actionable knowledge. This is achieved by means of multiple similarity measures between events, expressed on a qualitative scale designed according to our end users' needs.
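
The multidimensional event similarity driving the aggregation step can be sketched as a weighted combination over event dimensions, mapped onto a qualitative scale. The dimensions, weights, 30-day date window and scale cut-offs below are illustrative assumptions, not the thesis parameters.

```python
from datetime import date

def dim_similarity(a, b) -> float:
    """Per-dimension similarity: exact match on symbols, linear decay on dates."""
    if isinstance(a, date) and isinstance(b, date):
        return max(0.0, 1.0 - abs((a - b).days) / 30.0)   # 30-day window
    return 1.0 if a == b else 0.0

def event_similarity(e1, e2):
    """Weighted multidimensional similarity, reported on a qualitative scale."""
    weights = {"type": 0.4, "place": 0.4, "date": 0.2}
    score = sum(w * dim_similarity(e1[k], e2[k]) for k, w in weights.items())
    if score >= 0.8:
        return score, "same event"
    if score >= 0.5:
        return score, "possibly related"
    return score, "distinct"

e1 = {"type": "attack", "place": "Paris", "date": date(2014, 3, 1)}
e2 = {"type": "attack", "place": "Paris", "date": date(2014, 3, 10)}
print(event_similarity(e1, e2))   # high score -> "same event"
```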
16

Valentin, Sarah. "Extraction et combinaison d’informations épidémiologiques à partir de sources informelles pour la veille des maladies infectieuses animales". Thesis, Montpellier, 2020. http://www.theses.fr/2020MONTS067.

Full text of the source
Abstract:
Epidemic intelligence aims to detect, investigate and monitor potential health threats while relying on formal (e.g. official health authorities) and informal (e.g. media) information sources. Monitoring unofficial sources, so-called event-based surveillance (EBS), requires the development of systems designed to retrieve and process unstructured textual data published online. This manuscript focuses on the extraction and combination of epidemiological information from informal sources (i.e. online news) in the context of the international surveillance of animal infectious diseases. The first objective of this thesis is to propose and compare approaches to enhance the identification and extraction of relevant epidemiological information from the content of online news. The second objective is to study the use of the epidemiological entities extracted from news articles (i.e. diseases, hosts, locations and dates) in the context of event extraction and the retrieval of related online news. This manuscript proposes new textual representation approaches based on selecting, expanding and combining relevant epidemiological features. We show that adapting and extending text mining and classification methods improves the added value of online news sources for event-based surveillance. We stress the role of domain expert knowledge in the relevance and interpretability of the methods proposed in this thesis. While our research is conducted in the context of animal disease surveillance, we discuss the generic aspects of our approaches with regard to unknown threats and One Health surveillance.
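
The selection-and-combination of epidemiological features can be caricatured as building a bag-of-entities representation of an article. A toy sketch assuming entities have already been extracted; the entity list, the retained types and the expansion feature are invented for the example, not taken from the thesis.

```python
from collections import Counter

EPI_TYPES = {"disease", "host", "location", "date"}

def epidemiological_representation(entities):
    """Bag-of-entities features restricted to epidemiological entity types."""
    features = Counter()
    for etype, value in entities:
        if etype in EPI_TYPES:
            features[f"{etype}={value.lower()}"] += 1
            if etype == "disease":
                features["has_disease_mention"] += 1  # a simple expansion feature
    return features

article_entities = [
    ("disease", "African swine fever"),
    ("host", "wild boar"),
    ("location", "Belgium"),
    ("date", "2018-09-13"),
    ("person", "a ministry spokesperson"),  # filtered out: not epidemiological
]
print(epidemiological_representation(article_entities))
```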
Estilos ABNT, Harvard, Vancouver, APA, etc.
17

Gutierrez, Alejandro. "Extraction et manipulation d'information structurée sous la forme de graphe à partir de sources de données existantes". Versailles-St Quentin en Yvelines, 1997. http://www.theses.fr/1997VERS0015.

Texto completo da fonte
Resumo:
Technical applications such as the management of road and power networks require handling large volumes of information structured as graphs. Typical problems involve path traversals (for example, computing the shortest path, the path with maximal capacity, or the number of subcomponents of a component). Three main aspects make classical algorithms hard to apply to these problems: the size of the networks, the complexity of their structure, and the diversity of the data sources. In this thesis we propose a framework based on the notion of a graph view, which supports the definition of graph operations and accommodates the diversity of graph representations. A graph view defines a specific graph from data stored in different sources. Defining a graph view consists of establishing a correspondence between the elements of a graph and the underlying data, through functions that specify the graph elements and the way the graph can be traversed. Derivation operators are proposed to define new graph views from existing ones. These operators allow the composition, within a single graph view, of graphs with different node and edge labels and different implementations. Graph operations such as path traversals can then be applied to both base and derived graph views. By specializing a view mechanism to the specific problem of graph management, we can propose strategies adapted to the processing of path-traversal operations. We validate the proposed framework by supporting the generalized transitive closure operation, which solves a variety of path-traversal problems. An analytical evaluation of the strategies was carried out, allowing us to identify the main parameters that influence their behavior for this operation. The results of this evaluation were partially validated by a prototype implementing the main ideas of the proposed framework.
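One path-traversal operation covered by the generalized transitive closure, the maximal-capacity (bottleneck) path, can be sketched with a Floyd-Warshall-style sweep. The toy graph below is illustrative only and is unrelated to the thesis' actual data model:

```python
def max_capacity_closure(nodes, edges):
    """edges: dict (u, v) -> capacity. Returns the best bottleneck capacity
    achievable between every ordered pair of nodes (0.0 if unreachable)."""
    cap = {(u, v): 0.0 for u in nodes for v in nodes}
    cap.update(edges)
    for k in nodes:                      # Floyd-Warshall sweep on a max-min semiring
        for u in nodes:
            for v in nodes:
                via_k = min(cap[(u, k)], cap[(k, v)])
                if via_k > cap[(u, v)]:
                    cap[(u, v)] = via_k
    return cap

nodes = ["a", "b", "c"]
edges = {("a", "b"): 10.0, ("b", "c"): 4.0, ("a", "c"): 2.0}
print(max_capacity_closure(nodes, edges)[("a", "c")])  # 4.0: a->b->c beats the direct 2.0 edge
```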
Estilos ABNT, Harvard, Vancouver, APA, etc.
18

Verzeroli, Elodie. "Source NAPIS et Spectromètre PSI-TOF dans le projet ANDROMEDE". Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS221/document.

Texto completo da fonte
Resumo:
The goal of the ANDROMEDE project is to create a new instrument for sub-micrometric ion imaging and analysis by mass spectrometry, using ion impacts on nano-objects present at the surface of solid samples, and more particularly on biological samples. In-vitro and in-vivo analysis of such samples mostly requires complex preparation, and even experimentation at atmospheric pressure. This unique instrument opens a new path for surface analysis, complementary to the standard methods and techniques used today. Within the ANDROMEDE project, two elements were developed in this work: the NAPIS source, which delivers the nanoparticles that increase the secondary-ion yield, and the PSI-TOF mass spectrometer for the chemical analysis of the elements emitted from the sample surface. The primary nanoparticle beam from the NAPIS source is accelerated in a 4 MeV Pelletron accelerator and driven onto a target. The NAPIS nanoparticle source was developed and validated independently at the ORSAY PHYSICS company before being coupled to the accelerator. A new extraction optics called ExOTOF, as well as the PSI-TOF orthogonal-extraction mass spectrometer, were developed to analyze the secondary ions and increase the mass resolution of the system. These assemblies were designed specifically for this project. They will allow efficient extraction and analysis of the secondary ions emitted from the sample surface using continuous primary beams, with applications to analyses at atmospheric pressure. The assembly has been fully validated and the first tests of the primary output beam were carried out successfully.
Estilos ABNT, Harvard, Vancouver, APA, etc.
19

Yeh, Chunghsin. "Extraction de fréquences fondamentales multiples dans des enregistrements polyphoniques". Paris 6, 2008. http://www.theses.fr/2008PA066261.

Texto completo da fonte
Resumo:
The fundamental frequency, or F0, is an essential descriptor of musical audio signals. Although single-F0 estimation algorithms have progressed considerably, their application to music signals remains limited because most such signals contain not one but several simultaneous harmonic sources. Multiple-F0 estimation is therefore a more appropriate analysis, and it broadens the field of application to tasks such as source separation, music information retrieval and automatic music transcription. The difficulty of estimating multiple F0s in an audio signal lies in the fact that the sound sources often overlap in both the time and the frequency domain, so the extracted information is partly ambiguous. In particular, when musical notes in harmonic relation are played together, the partials of the higher notes may overlap those of the lower notes. Moreover, the spectral characteristics of musical instruments vary widely, which increases the uncertainty of the estimated partial amplitudes. The resulting complexity also creates octave ambiguity, and estimating the number of sources is itself difficult. This thesis addresses these problems in three stages: noise estimation, joint evaluation of F0 hypotheses, and polyphony inference. The observed signal is modeled as the sum of several harmonic sources and noise, where each harmonic source is modeled as a sum of sinusoids. When estimating the F0s, the number of sources must also be estimated: if the noise part is not estimated beforehand, the number of sources risks being overestimated, with the extra sources serving to explain the noise. A noise-level estimation algorithm is therefore developed to distinguish noise peaks from the sinusoidal peaks corresponding to partials of the harmonic sources. Once the spectral components are identified as sinusoids or noise, the partials of a set of hypothetical sources should fit most of the sinusoidal peaks. To evaluate their plausibility, a joint estimation algorithm is proposed, which handles the problem of overlapping partials. The proposed algorithm relies on three hypotheses related to the characteristics of musical instruments: harmonicity, spectral-envelope smoothness, and synchronous evolution of the partial amplitudes. When the number of sources is known, the estimated F0s are given by the most probable combination; in this case the algorithm yields promising results that compare favorably with the state of the art. Joint estimation deals satisfactorily with overlapping partials, but its computation time is high because the number of hypothetical combinations grows exponentially with the number of F0 candidates. Conversely, iterative estimation is faster but less well suited to overlapping partials. To obtain both efficiency and robustness, the two approaches are combined, and an iterative algorithm is proposed to reduce the set of F0 candidates.
Compared with two polyphonic salience functions, this iterative algorithm reduces the number of candidates a hundredfold while losing only 1 to 2% of multiple-F0 estimation accuracy. The results also show that increasing the number of F0 candidates does not guarantee better performance of the joint estimation algorithm. Estimating the number of sources, known as polyphony inference, is the hardest problem. The proposed approach assumes a maximal number of sources and then selects the best F0 estimates: the F0 candidates appearing in the best combinations under the maximal-polyphony hypothesis are retained, and the final estimate is obtained by iteratively verifying the selected F0 combinations in order of the probability of each F0. An F0 hypothesis is considered valid if it explains significant energy peaks or if it improves the smoothness of the spectral envelope of the full set of estimated F0s. The proposed system is evaluated on a music database built specifically for this purpose, reaching an accuracy of about 65%. In the MIREX (Music Information Retrieval Evaluation eXchange) 2007 multiple-F0 estimation task, the proposed system was rated among the best of the 16 submitted systems.
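A deliberately simplified sketch of the joint evaluation step: score combinations of F0 candidates by how much sinusoidal peak energy their harmonics explain. Only the harmonicity criterion is implemented here (the thesis also uses spectral-envelope smoothness and synchronous amplitude evolution), and all names, thresholds and data are illustrative:

```python
from itertools import combinations

def explained_energy(f0s, peaks, n_harmonics=10, tol=0.03):
    """Sum the amplitude of sinusoidal peaks explained by any harmonic of any F0.

    peaks: list of (frequency_hz, amplitude) tuples for detected sinusoids.
    tol: relative frequency tolerance for matching a peak to a harmonic.
    """
    explained = 0.0
    for freq, amp in peaks:
        for f0 in f0s:
            h = round(freq / f0)  # nearest harmonic index
            if 1 <= h <= n_harmonics and abs(freq - h * f0) <= tol * h * f0:
                explained += amp
                break
    return explained

def best_combination(candidates, peaks, polyphony):
    """Jointly evaluate all F0-candidate combinations of a given size."""
    return max(combinations(candidates, polyphony),
               key=lambda f0s: explained_energy(f0s, peaks))

# toy example: two harmonic sources at 220 Hz and 330 Hz
peaks = [(220, 1.0), (440, 0.6), (660, 0.9), (330, 0.8), (990, 0.4)]
print(best_combination([220, 330, 440, 550], peaks, polyphony=2))  # -> (220, 330)
```

Note that a sub-harmonic candidate such as 110 Hz would explain the same peaks, which is exactly the octave ambiguity the thesis' additional criteria are designed to resolve.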
Estilos ABNT, Harvard, Vancouver, APA, etc.
20

Arman, Molood. "Machine Learning Approaches for Sub-surface Geological Heterogeneous Sources". Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG014.

Texto completo da fonte
Resumo:
In oil and gas exploration and production, understanding subsurface geological structures, such as well logs and rock samples, is essential for providing predictive and decision-support tools. Gathering and exploiting data from a variety of sources, both structured and unstructured, such as relational databases and digitized reports on subsurface geology, is critical. The main challenge for structured data is the lack of a global schema to cross-reference all attributes from the different sources. The challenges are different for unstructured data: most subsurface geological reports are scanned versions of documents. This dissertation aims to provide a structured representation of the different data sources and to build domain-specific language models for learning named entities related to subsurface geology.
Estilos ABNT, Harvard, Vancouver, APA, etc.
21

Yang, Seungwon. "Automatic Identification of Topic Tags from Texts Based on Expansion-Extraction Approach". Diss., Virginia Tech, 2014. http://hdl.handle.net/10919/25111.

Texto completo da fonte
Resumo:
Identifying the topics of a textual document is useful for many purposes. We can organize documents by topic in digital libraries and then browse and search for documents with specific topics; by examining the topics of a document, we can quickly understand what it is about. To augment the traditional manual approach to topic tagging, which is labor-intensive, computer-based solutions have been developed. This dissertation describes the design and development of a topic identification approach, in this case applied to disaster events. In a sense, this study represents the marriage of research analysis with an engineering effort, in that it combines inspiration from Cognitive Informatics with a practical model from Information Retrieval. One of the design constraints was that the Web be used as a universal knowledge source, which was essential for accessing the information required to infer topics from texts. Retrieving specific information of interest from such a vast source was achieved by querying a search engine's application programming interface. Specifically, the gathered information was processed mainly by incorporating the Vector Space Model from the Information Retrieval field. As a proof of concept, we subsequently developed and evaluated a prototype tool, Xpantrac, which is able to run in batch mode to automatically process text documents. A user interface for Xpantrac was also constructed to support an interactive, semi-automatic topic tagging application, which was subsequently assessed via a usability study. Throughout the design, development, and evaluation of these study components, we detail how the hypotheses and research questions of this dissertation have been supported and answered. We also show that our overarching goal, identifying topics in a human-comparable way without depending on a large training set or corpus, has been achieved.
Ph. D.
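The Vector Space Model step mentioned above can be illustrated with a small sketch. This is not Xpantrac's code: the `candidate_corpora` dictionary stands in for text that the tool gathers through a search engine API, and all names and data are invented:

```python
# Rank candidate topic tags by cosine similarity between a document vector
# and vectors built from externally gathered text (mocked here).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

document = "flood waters rose after the hurricane hit the coastal town"
candidate_corpora = {            # in a real system these come from web search results
    "hurricane": "tropical cyclone storm wind hurricane landfall coast",
    "earthquake": "seismic magnitude epicenter quake aftershock fault",
}

vec = TfidfVectorizer().fit([document, *candidate_corpora.values()])
doc_v = vec.transform([document])
for tag, text in candidate_corpora.items():
    score = cosine_similarity(doc_v, vec.transform([text]))[0, 0]
    print(tag, round(score, 3))   # "hurricane" scores highest for this document
```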
Estilos ABNT, Harvard, Vancouver, APA, etc.
22

Fargeas, Aureline. "Classification, feature extraction and prediction of side effects in prostate cancer radiotherapy". Thesis, Rennes 1, 2016. http://www.theses.fr/2016REN1S022/document.

Texto completo da fonte
Resumo:
Prostate cancer is among the most common cancers worldwide. One of the standard treatments is external radiotherapy, which involves delivering ionizing radiation to a clinical target, in this instance the prostate and seminal vesicles. The goal of radiotherapy is to achieve maximal local control while sparing neighboring organs (mainly the rectum and the bladder) to avoid normal tissue complications. Understanding the dose/toxicity relationship is a central question for improving treatment reliability at the inverse planning step. Normal tissue complication probability (NTCP) models have been developed to predict toxicity events from dosimetric data. The main information considered is the dose-volume histogram (DVH), which provides an overall representation of the dose distribution as the dose delivered per percentage of organ volume. Nevertheless, current dose-based models have limitations, as they are not fully optimized: most of them do not include non-dosimetric information (patient, tumor and treatment characteristics), and they provide no understanding of the local relationship between dose and effect (the dose-space/effect relationship), as they do not exploit the rich information in the 3D planning dose distributions. In the context of predicting rectal bleeding after prostate cancer external beam radiotherapy, the objectives of this thesis are: i) to extract relevant information from DVH and non-dosimetric variables in order to improve existing NTCP models, and ii) to analyze the spatial correlations between local dose and side effects, allowing a characterization of the 3D dose distribution at a sub-organ level. Strategies aimed at exploiting the information from radiotherapy planning (DVH and 3D planned dose distributions) are thus proposed. First, based on independent component analysis, a new model for rectal bleeding prediction is proposed that combines dosimetric and non-dosimetric information in an original manner. Second, we developed new approaches that jointly exploit the 3D planning dose distributions to unravel the subtle correlation between local dose and side effects, in order to classify and/or predict patients at risk of rectal bleeding and to identify the regions that may be at the origin of this adverse event. More precisely, we proposed three stochastic methods based on principal component analysis, independent component analysis and discriminant non-negative matrix factorization, and one deterministic method based on the canonical polyadic decomposition of a fourth-order array containing the planned dose. The results show that these new approaches generally perform better than state-of-the-art predictive methods.
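As a rough illustration of the stochastic branch described above, the sketch below applies independent component analysis to DVH-like curves and combines the resulting components with non-dosimetric covariates in a simple classifier. It is not the authors' pipeline; all data, dimensions and model choices are synthetic placeholders:

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_patients, n_dose_bins = 120, 50
dvh = rng.random((n_patients, n_dose_bins)).cumsum(axis=1)[:, ::-1]  # fake decreasing DVHs
clinical = rng.random((n_patients, 3))              # e.g. age, anticoagulants, diabetes
y = rng.integers(0, 2, n_patients)                  # bleeding event (toy labels)

ica = FastICA(n_components=5, random_state=0)
dvh_components = ica.fit_transform(dvh)             # dosimetric ICA features
X = np.hstack([dvh_components, clinical])           # combined representation

clf = make_pipeline(StandardScaler(), LogisticRegression())
clf.fit(X, y)
print("training accuracy (toy data):", clf.score(X, y))
```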
Estilos ABNT, Harvard, Vancouver, APA, etc.
23

Pruvost, L. "Extraction du bruit de combustion d'un moteur Diesel. Développement et application d'un spectrofiltre". Phd thesis, INSA de Lyon, 2009. http://tel.archives-ouvertes.fr/tel-00429987.

Texto completo da fonte
Resumo:
This thesis concerns the validation and application of a spectrofilter to separate the combustion noise and the mechanical noise of a Diesel engine. The spectrofilter is a transfer-function estimator; in the context of this study, it estimates the vibroacoustic transfers of the combustion forces between the engine cylinders and the listening point, and is obtained from the cylinder pressures and the noise radiated by the engine. The method examined here is an application of the theory of cyclostationarity: it consists of computing the spectrofilter from the random part of the signals. Its benefit is a strong reduction of an error arising from the coherence between combustion and the mechanical noise sources. The method is first justified on theoretical grounds and formalized so as to highlight its advantages and drawbacks. It is then validated by two numerical experiments and one experiment under real conditions. It is finally applied to the study of the vibroacoustic transfers of the combustion forces, and of the composition of engine noise, as a function of the operating point (speed, load, and GTL content of the fuel).
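In the usual H1 sense, a spectrofilter can be sketched as the ratio of the input/output cross-spectrum to the input auto-spectrum; the thesis additionally computes it on the random (cyclostationary) part of the signals, which this toy example omits. The signals and the simulated transfer path below are synthetic:

```python
import numpy as np
from scipy.signal import csd, welch, lfilter

fs = 8192
rng = np.random.default_rng(0)
pressure = rng.standard_normal(fs * 10)                 # stand-in excitation
h = np.array([0.0, 0.5, 0.3, -0.2])                     # unknown "vibroacoustic path"
noise_at_mic = lfilter(h, [1.0], pressure) + 0.05 * rng.standard_normal(fs * 10)

f, s_xy = csd(pressure, noise_at_mic, fs=fs, nperseg=1024)
_, s_xx = welch(pressure, fs=fs, nperseg=1024)
spectrofilter = s_xy / s_xx                             # H1 transfer estimate
print(spectrofilter[:4])                                # complex FRF samples
```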
Estilos ABNT, Harvard, Vancouver, APA, etc.
24

Mao, Jin, Lisa R. Moore, Carrine E. Blank, Elvis Hsin-Hui Wu, Marcia Ackerman, Sonali Ranade e Hong Cui. "Microbial phenomics information extractor (MicroPIE): a natural language processing tool for the automated acquisition of prokaryotic phenotypic characters from text sources". BIOMED CENTRAL LTD, 2016. http://hdl.handle.net/10150/622562.

Texto completo da fonte
Resumo:
Background: The large-scale analysis of phenomic data (i.e., the full phenotypic traits of an organism, such as shape, metabolic substrates, and growth conditions) in microbial bioinformatics has been hampered by the lack of tools to rapidly and accurately extract phenotypic data from existing legacy text in the field of microbiology. To quickly obtain knowledge on the distribution and evolution of microbial traits, an information extraction system needed to be developed to extract phenotypic characters from large numbers of taxonomic descriptions so they can be used as input to existing phylogenetic analysis software packages. Results: We report the development and evaluation of Microbial Phenomics Information Extractor (MicroPIE, version 0.1.0). MicroPIE is a natural language processing application that uses a robust supervised classification algorithm (Support Vector Machine) to identify characters from sentences in prokaryotic taxonomic descriptions, followed by a combination of algorithms applying linguistic rules with groups of known terms to extract characters as well as character states. The input to MicroPIE is a set of taxonomic descriptions (clean text). The output is a taxon-by-character matrix, with taxa in the rows and a set of 42 pre-defined characters (e.g., optimum growth temperature) in the columns. The performance of MicroPIE was evaluated against a gold standard matrix and another student-made matrix. Results show that, compared to the gold standard, MicroPIE extracted 21 characters (50%) with a Relaxed F1 score > 0.80 and 16 characters (38%) with Relaxed F1 scores ranging between 0.50 and 0.80. Inclusion of a character prediction component (SVM) improved the overall performance of MicroPIE, notably the precision. Evaluated against the same gold standard, MicroPIE performed significantly better than the undergraduate students. Conclusion: MicroPIE is a promising new tool for the rapid and efficient extraction of phenotypic character information from prokaryotic taxonomic descriptions. However, further development, including incorporation of ontologies, will be necessary to improve the performance of the extraction for some character types.
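A minimal stand-in for MicroPIE's character-prediction step (an SVM assigning sentences to character categories before rule-based extraction) might look as follows. The training sentences, labels and model choices are invented for illustration and do not reproduce the published system:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

sentences = [
    "Optimal growth occurs at 37 degrees C.",
    "Cells are rod-shaped, 2-4 micrometers long.",
    "Growth is observed at pH 6.5 to 8.0.",
    "Colonies are circular and smooth.",
]
labels = ["growth_temperature", "cell_shape", "ph_range", "colony_morphology"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(sentences, labels)
print(model.predict(["Grows best at 28 degrees C."]))  # expected: ['growth_temperature']
```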
Estilos ABNT, Harvard, Vancouver, APA, etc.
25

Kalledat, Tobias. "Tracking domain knowledge based on segmented textual sources". Doctoral thesis, Humboldt-Universität zu Berlin, Wirtschaftswissenschaftliche Fakultät, 2009. http://dx.doi.org/10.18452/15925.

Texto completo da fonte
Resumo:
This research work analyses the influence of pre-processing on the results of knowledge generation and gives concrete recommendations for the suitable pre-processing of text corpora in text data mining (TDM). It focuses on the extraction and tracking of concepts within certain knowledge domains, using an approach based on horizontal (timeline) and vertical (term persistence) segmentation of corpora. The result is a set of sub-corpora segmented along the timeline. Within each time segment, clusters of terms can be built according to their persistence with respect to each single time-based segment and to the whole corpus: one cluster contains the terms that are not persistent over the whole corpus, the other those that occur in every time segment. Based on simple frequency measures, it can be shown that the statistical quality of a single corpus alone allows the pre-processing quality to be measured; comparison corpora are not necessary. The time series of the frequency measures show significant negative correlations between the cluster of permanently occurring terms and the cluster of terms that are not persistent across all time segments. This holds only for the optimally pre-processed corpus and is not found in the other test sets, whose pre-processing quality was low. When the most frequent terms are grouped into concepts using domain-specific taxonomies, a significant negative correlation appears between the number of distinct terms per time segment and the terms assigned to a taxonomy; again, this holds only for the corpus with a high level of pre-processing quality. A semantic analysis performed on data prepared with a simple threshold-based TDM method yielded significantly different amounts of generated knowledge depending on the quality of the pre-processing. With the methods and measures introduced in this work, both the quality of the source corpora and the quality of the applied taxonomies become measurable. Based on these findings, indicators for measuring and assessing corpora and taxonomies are developed, and recommendations are given for pre-processing that is adequate to the goal of the subsequent analysis process.
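The horizontal/vertical segmentation idea lends itself to a compact sketch: split a corpus by year, separate the terms occurring in every yearly segment (persistent) from the rest, and compare the two clusters' frequency series. The toy corpus and tokenization below are invented:

```python
from collections import Counter

corpus = {  # year -> list of tokenized documents
    2001: [["merger", "bank", "profit"], ["profit", "ipo"]],
    2002: [["bank", "profit"], ["merger", "dotcom"]],
    2003: [["profit", "bank", "merger"], ["bank", "crisis"]],
}

per_year = {y: Counter(t for doc in docs for t in doc) for y, docs in corpus.items()}
vocabulary = set().union(*per_year.values())
persistent = {t for t in vocabulary if all(t in c for c in per_year.values())}
transient = vocabulary - persistent

for year, counts in sorted(per_year.items()):
    p = sum(n for t, n in counts.items() if t in persistent)
    v = sum(n for t, n in counts.items() if t in transient)
    print(year, "persistent:", p, "transient:", v)
```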
Estilos ABNT, Harvard, Vancouver, APA, etc.
26

Coly, Arona. "Etudes expérimentales de sources d'ions RCE à 2,45GHz pour la production de courants intenses". Phd thesis, Université de Grenoble, 2010. http://tel.archives-ouvertes.fr/tel-00607679.

Texto completo da fonte
Resumo:
This thesis was carried out within an industrial collaboration with the Pantechnik company. It consisted of developing a new test bench dedicated to the characterization of ECR ion sources operating at 2.45 GHz, with a view to producing intense beams for industrial applications. Two ECR sources with different magnetic structures were tested on the beam line with the same RF injection system. A new permanent-magnet source with a dipolar extraction field (SPEED) was designed and tested, and a study of ion beam extraction in the dipolar magnetic field is proposed. The first experiments showed a total ion current density of ~10 mA/cm2 at 20 kV and an RF power of 900 W. Tests with a hydrogen plasma showed that the dominant species is the H2+ molecule. Modifications of the magnetic structure of the ion source are proposed to improve its performance and produce mainly H+ ions. The MONO1000 ion source was tested at high RF power with a waveguide coupling system. Very intense hydrogen ion current densities were measured (~95 mA/cm2) with a diode-type extraction and a transport efficiency of about 70%. The first tests using an improved five-electrode extraction system are presented.
Estilos ABNT, Harvard, Vancouver, APA, etc.
27

Klein, Philipp. "Non-Intrusive Information Sources for Activity Analysis in Ambient Assisted Living Scenarios". Thesis, Mulhouse, 2015. http://www.theses.fr/2015MULH8932/document.

Texto completo da fonte
Resumo:
As people grow older, they are often faced with some degree of decreasing cognitive ability or physical strength. Isolation from social life, poor quality of life, and an increased risk of injuries are the consequences. Ambient Assisted Living (AAL) is a vision of the way people live in their own homes as they grow older: disabilities or limitations are compensated for by technology where care-giving personnel is scarce or relatives are unable to help. The term "ambient" in AAL expresses what this technology needs to be, beyond assistive: it needs to integrate into the living environment in such a way that it is no longer recognized as such. Interaction with residents needs to be intuitive and natural, and technical equipment should be unobtrusive and well integrated. The areas of application targeted in this thesis are activity monitoring and activity pattern discovery in apartments or small houses. The acquisition of information regarding the residents' activity is vital to the success of any assistive technology. In many areas of daily life this is already routine: state-of-the-art sensing technology includes cameras, light barriers, RFID sensors, radio-signal localization using transponders, and pressure-sensitive floors. Due to their operating principles, however, these have a big impact on home and living environments. This thesis is therefore dedicated to research on non-intrusive technology for acquiring activity information with minimal impact on daily life. Two base technologies are considered: passive device-free presence detection and non-intrusive load monitoring.
Estilos ABNT, Harvard, Vancouver, APA, etc.
28

Solana, Ciprés Miriam. "Supercritical CO2 technologies for the production of bioactive compounds from natural sources". Doctoral thesis, Università degli studi di Padova, 2016. http://hdl.handle.net/11577/3424461.

Texto completo da fonte
Resumo:
This Ph.D. project aimed to evaluate the potential of supercritical CO2 technologies for obtaining natural extracts rich in bioactive compounds from residual agricultural biomass and microalgae. Three different supercritical CO2 techniques were investigated with the purpose of optimizing the operating conditions and assessing the possibility of obtaining the desired products in a safe, green and efficient way. First, attention was focused on supercritical CO2 extraction (SFE). Experimental work on the extraction of essential fatty acids from microalgae is reported, together with the application of solubility and kinetic models to the measured data. Another application concerns the SFE of different bioactive compounds from rocket salad. In this case, a sequential extraction approach is proposed, consisting of obtaining two products with different co-solvents: the first rich in phenols and glucosinolates, with the second step aimed at extracting lipids. A profitability analysis of the SFE process is also reported, in which experimental results were used together with large-scale process simulations. SFE is then applied to asparagus for the effective recovery of phenolic compounds and compared with Soxhlet and pressurized liquid extraction methods. A further investigation was carried out to obtain a dried natural extract from cherries. In this case, the supercritical anti-solvent technique (SAS) was investigated, which allows compounds that are not soluble in CO2, previously extracted by a traditional solvent method, to be precipitated. The operating conditions were optimized to obtain a precipitate rich in polyphenols and anthocyanins from cherry extracts. Finally, a study on supercritical counter-current fractionation for the extraction of fat from soy skim, the aqueous residue of enzyme-assisted aqueous extraction of soybeans, is presented. In the conclusion, the potential of CO2 in natural-extract technology is summarized and perspectives for industrial applications in the near future are discussed.
The demand for natural extracts and compounds of plant origin is increasing because of their growing use in functional foods, natural medicine, health additives, cosmetic products and pharmaceutical applications. These sectors require ultra-clean, verifiable, high-quality products, reliable equipment and competitive prices. Numerous plant sources have so far been studied as raw materials for the production of natural extracts. The idea of transforming food waste is increasingly being considered, as it is advantageous with respect to contamination issues, process management and economics. Moreover, emerging natural sources such as microalgae contain large amounts of high-value compounds, whose extraction could be environmentally and economically advantageous. In this context, an efficient and green extraction and/or separation method is needed, not only to process unsaleable food and waste and to exploit emerging natural sources, but also to create a product in high demand on the current market. Supercritical CO2 technology has emerged as an important alternative to traditional organic-solvent and mechanical processes, thanks to its moderate critical pressure, which keeps compression costs down, while its low critical temperature allows the extraction of thermosensitive compounds without degradation. Other advantages over classical solvent extraction methods include the fact that CO2 is inert, non-toxic and environmentally friendly; in supercritical processes, CO2 is easily removed after depressurization and allows faster cycles. On the other hand, the polar nature of most natural compounds makes it necessary to add co-solvents to supercritical CO2 in order to improve the affinity of the fluid for polar compounds; their effect on the composition of the extract, and consequently on the economics of the process, is significant. In this case it is important to use green solvents, such as water and ethanol, to preserve the advantages of supercritical processing. Supercritical CO2 extraction is the most studied and applied among the processes using pressurized CO2; however, a number of other supercritical technologies are being studied and developed for further interesting applications, such as the precipitation of polar compounds (supercritical anti-solvent precipitation, SAS), which yields dry natural precipitates, or the separation of compounds in liquid mixtures (supercritical counter-current fractionation) for obtaining purer bioactive compounds. Optimizing the processes and operating variables to extract compounds of interest from new natural sources is important to guarantee maximum yields of high quality and to make the final product suitable for use in the food, cosmetic and pharmaceutical industries. It is therefore essential to continue research on supercritical CO2 technologies for different materials and to generate new data useful for the potential scale-up of the proposed processes. For all these reasons, the objective of this research project was to evaluate the potential of supercritical CO2 technologies for obtaining, in a safe, green and efficient way, new natural extracts rich in bioactive compounds from agricultural products and microalgae.
The topics addressed in this thesis are organized into chapters as follows. Chapter 1 is an introductory discussion of the natural-extract market, the state of the different extraction methods, and the latest research results reported for supercritical CO2 technologies. Chapters 2, 3, 4 and 5 present the experimental results and the modeling performed on supercritical CO2 extraction from different natural sources, in order to verify the effectiveness of this method for competitively obtaining natural extracts rich in various bioactive compounds. Chapter 2 presents the supercritical CO2 extraction of essential fatty acids from three different microalgae species: the effect of the operating variables on total extraction yield and solubility is studied, and the mathematical models developed by Sovová are applied to describe the experimental extraction curves. Chapter 3 reports the extraction of fractions enriched in different classes of bioactive compounds; based on the results, a sequential extraction method is proposed, using first CO2 + ethanol for lipid extraction and then water as co-solvent to obtain extracts rich in phenolic compounds and glucosinolates. Chapter 4 describes the economic evaluation of an industrial-scale plant for the production of natural extracts rich in glucosinolates and phenolic compounds from rocket salad; the Aspen Plus™ V8.2 software was employed to simulate the large-scale process, based on laboratory experimental measurements, and the effect of the operating parameters on process costs is evaluated. Chapter 5 focuses on the recovery of phenolic compounds from asparagus; the effect of different co-solvent mixtures on supercritical CO2 extraction for the selective extraction of polyphenols is examined, and the results are compared with pressurized liquid extraction (PLE) and the Soxhlet method. In addition, the supercritical anti-solvent (SAS) process was applied with the aim of obtaining a dried precipitate rich in antioxidant compounds. Chapter 6 focuses on the SAS process with CO2 to obtain precipitates rich in polyphenols and anthocyanins from cherries; continuous and batch operating modes are compared, and the effect of pressure and CO2 composition on the precipitation yields of polyphenols and anthocyanins is discussed. The third method studied is counter-current fractionation for the separation of compounds of interest from a liquid mixture. Chapter 7 reports the verification of the CO2 fractionation plant in a continuous packed column; for this purpose, the recovery of butanol from aqueous solutions was performed, and the influence of the operating variables, such as the solvent-to-solution flow-rate ratio, temperature, pressure and solution composition, was studied experimentally in terms of separation efficiency, butanol removal rate, total removal and butanol concentration in the extract at the end of the continuous cycle. Chapter 8 presents the use of counter-current CO2 as a means of reducing the residual fat in soy skim after enzyme-assisted aqueous extraction of soybeans; in particular, the effects of the solvent-to-feed ratio, the addition of ethanol as a modifier and the introduction of packing into the column are analyzed, with statistical interpretation of the results by ANOVA. Finally, the conclusions summarize the thesis and discuss the aspects to focus on to secure the future of this technology.
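Extraction curves such as those modeled in Chapter 2 are often summarized by fitting a kinetic form. The sketch below fits a simple first-order stand-in, whereas the thesis applies Sovová's broken-and-intact-cells models, which are piecewise and more detailed; all data here are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0, 30, 60, 90, 150, 240])             # extraction time, min
yield_g = np.array([0.0, 1.2, 2.0, 2.5, 3.0, 3.2])  # cumulative extract, g (toy data)

def first_order(t, y_inf, k):
    """Asymptotic yield y_inf approached with rate constant k."""
    return y_inf * (1.0 - np.exp(-k * t))

(y_inf, k), _ = curve_fit(first_order, t, yield_g, p0=(3.0, 0.01))
print(f"asymptotic yield ~ {y_inf:.2f} g, rate constant ~ {k:.4f} 1/min")
```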
Estilos ABNT, Harvard, Vancouver, APA, etc.
29

Addo, Douglas Kweku. "OPERATION AND PROCESS CONTROL DEVELOPMENT FOR A PILOT-SCALE LEACHING AND SOLVENT EXTRACTION CIRCUIT RECOVERING RARE EARTH ELEMENTS FROM COAL-BASED SOURCES". UKnowledge, 2019. https://uknowledge.uky.edu/mng_etds/50.

Texto completo da fonte
Resumo:
In 2010, the US Department of Energy identified several rare earth elements (REEs) as critical materials for enabling clean technologies. As part of ongoing research on REE recovery from coal sources, the University of Kentucky has designed, developed and is demonstrating a ¼ ton/hour pilot-scale processing plant to produce high-grade REEs from coal sources. Because critical variables (e.g. pH, tank level) must be controlled, process control is required. To ensure adequate process control, a study of leaching and solvent extraction control was conducted to evaluate the potential for low-cost REE recovery, in addition to developing a PLC-based process control system. The overall operational design and the use of Six Sigma methodologies are discussed, as is the application of the control design, both procedural and electronic, to the control of process variables such as pH. Variations in output parameters were quantified as a function of time, and data trends show that the mean process variable was maintained within prescribed limits. Future work on the use of data analysis and integration for data-based decision-making is discussed.
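As a generic illustration of the kind of loop a process-control PLC runs, the sketch below holds a tank variable at a setpoint with a proportional-integral controller. The plant response and gains are invented and do not describe the pilot plant:

```python
def simulate_pi_control(setpoint=4.5, steps=60, kp=0.8, ki=0.2, dt=1.0):
    """Hold a tank's pH at `setpoint` by dosing acid (toy first-order plant)."""
    ph, integral = 6.0, 0.0
    for _ in range(steps):
        error = ph - setpoint                         # positive when pH is too high
        integral += error * dt
        acid = max(0.0, kp * error + ki * integral)   # non-negative dosing rate
        ph += dt * (-0.3 * acid + 0.05 * (6.0 - ph))  # dosing lowers pH; feed drifts it up
    return ph

print(round(simulate_pi_control(), 2))  # settles near the 4.5 setpoint
```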
Estilos ABNT, Harvard, Vancouver, APA, etc.
30

Herrault, Pierre-Alexis. "Extraction de fragments forestiers et caractérisation de leurs évolutions spatio-temporelles pour évaluer l'effet de l'histoire sur la biodiversité : une approche multi-sources". Thesis, Toulouse 2, 2015. http://www.theses.fr/2015TOU20018/document.

Texto completo da fonte
Resumo:
Biodiversity in landscapes depends on landscape spatial patterns but can also be influenced by landscape history: some species respond to habitat disturbances only over the long term. Landscape dynamics have therefore become, in recent years, a recognized factor for explaining current biodiversity. This thesis in GIS is set in this historical-ecology context. It deals with the automatic extraction of forest patches and the characterization of their spatio-temporal evolution since the mid-19th century, with the objective of evaluating the effect of forest dynamics on the current diversity of forest hoverflies (Diptera: Syrphidae) in the agri-forestry landscape of the Coteaux de Gascogne. The proposed approach consists of three main steps: (1) building the forest spatial database from heterogeneous sources, (2) matching forest patches across dates and characterizing their spatio-temporal evolution, and (3) species-habitat modeling that integrates history as one of the factors likely to explain hoverfly diversity. Several methodological contributions were made. We proposed a new geometric correction approach based on kernel ridge regression to make the selected past and present data sources consistent. We also developed an approach and a tool for the automatic extraction of forests from the 19th-century Etat-Major map of France. Finally, a first assessment of the effect of spatial uncertainty on the responses of species-habitat models was initiated. From an ecological viewpoint, a significant effect of the historical continuity of patches on forest hoverfly diversity was revealed. The most isolated fragments presented an extinction debt or a colonization credit, depending on the type of area dynamics that occurred in the last time period (1979-2010). As it turns out, 30 years was not sufficient for forest hoverflies to reach a new equilibrium after spatial changes of their isolated habitat.
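The kernel-ridge-based geometric correction can be sketched as learning a smooth mapping from control points digitized on the historical map to their modern coordinates, then applying it to every forest-patch vertex. The control points and parameters below are fabricated, and the thesis' actual formulation may differ:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

old_xy = np.array([[0.0, 0.0], [1.0, 0.1], [0.2, 1.0], [1.1, 1.2], [0.5, 0.6]])
# synthetic "modern" positions: a mild affine distortion plus an offset
new_xy = old_xy @ np.array([[0.98, 0.05], [-0.04, 1.01]]) + np.array([120.0, 340.0])

model = KernelRidge(kernel="rbf", gamma=0.5, alpha=1e-3)
model.fit(old_xy, new_xy)                 # one model, two output coordinates

patch_vertices = np.array([[0.3, 0.4], [0.9, 0.9]])
print(model.predict(patch_vertices))      # corrected (modern) coordinates
```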
Estilos ABNT, Harvard, Vancouver, APA, etc.
31

Rodrigues, Dina Maria Ferreira. "Functional foods with innovative ingredients from seaweeds and mushrooms sources". Doctoral thesis, Universidade de Aveiro, 2016. http://hdl.handle.net/10773/17075.

Texto completo da fonte
Resumo:
Doctoral thesis in Chemistry
Natural resources such as seaweeds and mushrooms can be used to obtain new products, providing an alternative and sustainable way to produce new functional foods or ingredients with biological properties that may help develop new strategies based on preventive health. The main driving force of this thesis was to find, in seaweeds and mushrooms, extracts with natural bioactive compounds to be used in the development of new functional foods. To this end, an integrated approach was established and its various steps were followed chronologically. First, a set of edible seaweed species and a set of edible mushroom species were characterized for their proximate composition and potential bioactive compounds. Once this characterization was achieved, the need for suitable, fast, cost-effective and environmentally friendly extraction methodologies capable of extracting the biologically active natural ingredients of interest from both the seaweeds and the mushrooms led to the second stage. This stage involved the study of several extraction techniques (hot water extraction, enzyme- and ultrasound-assisted extraction, and high hydrostatic pressure) in order to fully characterize the potential of the different natural sources, introducing different extraction selectivities and efficiencies while aiming at maximum preservation of bioactivity. The next stage concerned the deeper chemical characterization of the bioactive components present in the four selected enzymatic extracts from seaweeds (the S. muticum extract obtained with Alcalase and the O. pinnatifida extract obtained with Viscozyme) and mushrooms (the Ph. nameko extracts obtained with Cellulase and with Flavourzyme), for which the target biological properties were confirmed or found to be the most promising. Following the chemical characterization, the selected extracts went through an evaluation of their in vitro stability to confirm and consolidate their biological potential, to be further explored from the functional-food perspective. The last stage of this thesis involved the development of a new functional food by incorporating the two most promising and validated extracts (O. pinnatifida obtained with Viscozyme and Ph. nameko obtained with Flavourzyme) into a spreadable dairy cream, with assessment of its biological and technological potential. A functional spreadable dairy cream combining whey cheese and Greek-type yoghurt with the selected extracts was successfully formulated and explored. The development of functional foods, or even nutraceuticals, from edible seaweed and mushroom extracts is feasible and could be extended by studying the incorporation of these extracts into other types of food, such as beverages or ice cream.
Estilos ABNT, Harvard, Vancouver, APA, etc.
32

Montellano, Duran Ivar Mauricio [Verfasser], e Ursel [Akademischer Betreuer] Fantz. "Application of a 3D Monte Carlo PIC code for modeling the particle extraction from negative ion sources / Ivar Mauricio Montellano Duran ; Betreuer: Ursel Fantz". Augsburg : Universität Augsburg, 2019. http://d-nb.info/1203542690/34.

Texto completo da fonte
Estilos ABNT, Harvard, Vancouver, APA, etc.
33

Lane, Marshalle. "Dispersive liquid-liquid micro-extraction coupled with gas chromatography for the detection of trihalomethanes in different water sources in the Western Cape, South Africa". Thesis, Cape Peninsula University of Technology, 2018. http://hdl.handle.net/20.500.11838/2852.

Texto completo da fonte
Resumo:
Thesis (MTech (Chemistry))--Cape Peninsula University of Technology, 2018.
Trihalomethanes (THMs) are a group of four compounds that are formed, along with other disinfection by-products, when chlorine or other disinfectants used to control microbial contamination in drinking water react with natural organic or inorganic substances in the water. The four THMs are better known by their common names: chloroform, bromodichloromethane, chlorodibromomethane and bromoform. These compounds are classified as Group B carcinogens (shown to cause cancer in laboratory animals). Trihalomethane levels tend to increase with pH, temperature, time and the level of "precursors" present. Precursors are organic substances that react with chlorine to form THMs, so one significant way of reducing the amount of THMs in water is to eliminate or reduce chlorination before filtration and to reduce precursors. There are guideline limits for THMs in the SANS 241:2015 document, but they are not continuously monitored and their levels in natural water are not known. The aim of this study is to develop a rapid and reliable dispersive liquid-liquid microextraction technique to determine the presence of THMs in natural water sources. The study particularly focuses on different water sources, e.g. river, underground, borehole and chlorinated water. Chlorinated water is water that has presumably been treated against bacterial and fungal growth. The results obtained for chlorinated water from the validation data were as follows: 10.120 μg/L − 11.654 μg/L for chloroform, 2.214 μg/L − 2.666 μg/L for bromodichloromethane, 0.819 μg/L − 0.895 μg/L for chlorodibromomethane and 0.103 μg/L − 0.135 μg/L for bromoform. All these THM concentrations were found to be below the SANS 241:2015 limits. Natural water shows a very high affinity for chloroform, which is expected under normal conditions, as chloroform is the most abundant of all THMs present in natural water. The dispersive liquid-liquid microextraction technique optimised and used for the determination of THMs in this study is a rapid, simple and inexpensive technique that provides low limits of detection (LOD), e.g. 0.1999 μg/L for chlorodibromomethane and 0.2056 μg/L for bromoform, and limits of quantification (LOQ) of 0.6664 μg/L for chlorodibromomethane and 0.6854 μg/L for bromoform.
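The LOD and LOQ values quoted above are conventionally derived from a calibration curve. A minimal sketch of that calculation (assuming the common 3.3*sigma/slope and 10*sigma/slope definitions, which may differ from the exact procedure used in the thesis, and with made-up calibration data) follows:

    import numpy as np

    # Hypothetical calibration data: spiked concentration (ug/L) vs. GC peak area.
    conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
    area = np.array([12.1, 24.8, 49.5, 124.0, 250.3])

    # Least-squares calibration line: area = slope * conc + intercept.
    slope, intercept = np.polyfit(conc, area, 1)
    residuals = area - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)  # approximate standard error of the regression

    # IUPAC-style detection and quantification limits from the calibration curve.
    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    print(f"LOD = {lod:.4f} ug/L, LOQ = {loq:.4f} ug/L")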
Estilos ABNT, Harvard, Vancouver, APA, etc.
34

Baron, Valentin. "Méthodes d’identification de sources acoustiques paramétriques par mesures d’antennerie". Thesis, Lyon, 2020. http://www.theses.fr/2020LYSEI121.

Texto completo da fonte
Resumo:
Acoustic source characterisation aims to describe sound emitters through parameters such as their localisation in space, the sound level they produce, or their identification from their acoustic signature. The objective of this thesis is to obtain some of these parameters in two industrial application cases, for sources located in the far field, using acoustic arrays. The first application concerns the acoustic impact of deep-sea mining in the context of the Abysound FUI project. Within this project, the thesis seeks to characterise excavation machines located on the seabed by estimating their localisation and their sound level. A first design phase led to the construction of a 3 m conical array. Then, using data from two experimental campaigns conducted in the Mediterranean Sea with this array, the high-resolution MUSIC method accurately localises the deployed acoustic sources, whether mobile and more than 600 m away from the array or immersed at 700 m depth. Their sound level is then estimated by beamforming, and the expected levels are recovered for monochromatic as well as wideband signals. In the second application, a complete procedure for the localisation and identification of drones is proposed for the protection of sensitive sites. It combines array processing and machine learning through three key steps: localisation, focalisation and identification. MUSIC again localises the acoustic sources present around the industrial array used, focalisation then reconstructs the temporal signal of each source, and an SVM model identifies each as drone or not. Experimental validations, both indoors and outdoors, constitute an important contribution of this thesis work. Among other results, the acquired data show that the procedure localises drones to within 3° outdoors, detects them with 99 % accuracy, and identifies them in the presence of a more powerful interfering source.
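As an illustration of the MUSIC step used in both applications, a minimal narrowband sketch for a uniform linear array is given below. This is a generic textbook formulation, not the thesis's conical-array implementation; the array geometry, wavelength, and source count are assumptions:

    import numpy as np

    def music_spectrum(R, n_sources, mics, wavelength, angles_deg):
        """Narrowband MUSIC pseudo-spectrum for a uniform linear array.

        R: (M, M) sample covariance of the microphone signals.
        mics: sensor positions along the array axis (metres).
        """
        # Noise subspace: eigenvectors of the M - n_sources smallest eigenvalues.
        eigvals, eigvecs = np.linalg.eigh(R)
        En = eigvecs[:, : len(mics) - n_sources]
        spectrum = []
        for theta in np.deg2rad(angles_deg):
            # Steering vector of a plane wave arriving from angle theta.
            a = np.exp(-2j * np.pi * mics * np.sin(theta) / wavelength)
            denom = np.linalg.norm(En.conj().T @ a) ** 2
            spectrum.append(1.0 / denom)
        return np.array(spectrum)  # peaks indicate source directions

    # Usage sketch: 8 mics at half-wavelength spacing, 2 sources,
    # with R averaged over many snapshots of the array signals.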
Estilos ABNT, Harvard, Vancouver, APA, etc.
35

Camargo, Adriano Costa de. "Hurdles and potentials in value-added use of peanut and grape by-products as sources of phenolic compounds". Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/11/11141/tde-09112016-171820/.

Texto completo da fonte
Resumo:
Recent studies have demonstrated that peanut and grape processing by-products may be richer sources of bioactive compounds than their original raw material and feedstock; however, before their application as a source of nutraceuticals or in the prevention of lipid oxidation in food systems, certain technological challenges have to be addressed. This study discusses recent advances in the application of plant food processing by-products as sources of phenolic compounds, with special emphasis on the profiling and screening of phenolics using high-performance liquid chromatography-mass spectrometry, their potential health benefits, and microbiological safety. The major findings are summarised in chapters 2, 3 and 4. The first of these deals with phenolics from grape by-products: the insoluble-bound fraction was predominant and, in general, insoluble-bound phenolics were more effective in inhibiting copper-induced human LDL-cholesterol oxidation in vitro than free and esterified phenolics, while phenolic extracts from all fractions inhibited peroxyl radical-induced DNA strand breakage. The third chapter addresses the effects of gamma-irradiation on the microbial growth, phenolic composition, and antioxidant properties of peanut skin. Gamma-irradiation at 5.0 kGy decreased the microbiological count of the product. Total phenolic and proanthocyanidin contents; ABTS radical cation, DPPH radical, hydrogen peroxide, and hydroxyl radical scavenging capacities; and the reducing power of the sample all increased upon gamma-irradiation in both the free and insoluble-bound phenolic fractions. The bioactivity of the free phenolics against in vitro human LDL-cholesterol oxidation and copper-induced DNA strand breakage was also improved upon gamma-irradiation. Phenolic compounds were positively or tentatively identified, and their distribution followed the decreasing order free > esterified > insoluble-bound forms. Procyanidin dimer A increased in all phenolic fractions, whereas procyanidin dimer B decreased; these gamma-irradiation-induced changes may be explained by molecular conversion, depolymerisation, and cross-linking. In the fourth chapter, the ability of selected enzymes to improve the extraction of insoluble-bound phenolics from the starting material (experiment I) or from the residues containing insoluble-bound phenolics (experiment II) was evaluated. Pronase and Viscozyme improved the extraction of insoluble-bound phenolics. Viscozyme released higher amounts of gallic acid, catechin, and prodelphinidin dimer A than the Pronase treatment; furthermore, p-coumaric and caffeic acids, as well as procyanidin dimer B, were extracted with Viscozyme but not with Pronase. Solubility plays an important role in the bioavailability of phenolic compounds, hence this study may assist in better exploitation of phenolics from winemaking by-products as functional food ingredients or supplements.
Estilos ABNT, Harvard, Vancouver, APA, etc.
36

Boustany, Roger. "Séparation aveugle à l'ordre deux de sources cyclostationnaires : Application aux mesures vibroacoustiques". Compiègne, 2005. http://www.theses.fr/2005COMP1596.

Texto completo da fonte
Resumo:
The objective of this thesis is the blind separation of vibroacoustic sources. The fundamental assumption is the cyclostationarity of rotating-machine signals. The tools used are founded on second-order cyclic statistics, and in particular on the cyclic power spectrum. Blind source separation methods that explicitly use these tools are developed. We favour the frequency-domain approach for separating convolutive mixtures involving very long impulse responses, and we propose a solution to the permutation problem based on cyclostationarity. Approaches aiming at extracting a single cyclostationary source of interest drowned in an unknown number of interferences and in the presence of very strong noise are then proposed. A method based on reduced-rank cyclic regression is finally favoured. Applications to real vibroacoustic signals are detailed to illustrate the very good performance of the methods.
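For readers unfamiliar with the second-order cyclic statistics mentioned here, a minimal sketch of a cyclic autocorrelation estimate at cyclic frequency alpha follows (a plain textbook estimator, not the estimator developed in the thesis; the test signal and parameter values are assumptions):

    import numpy as np

    def cyclic_autocorrelation(x, alpha, lag, fs):
        """Estimate R_x^alpha(lag) = < x(t + lag) x*(t) exp(-j 2 pi alpha t) >."""
        n = np.arange(len(x) - lag)
        t = n / fs
        # Time-average of the lag product, demodulated at the cyclic frequency.
        return np.mean(x[n + lag] * np.conj(x[n]) * np.exp(-2j * np.pi * alpha * t))

    # Usage sketch: a rotating machine with shaft frequency f0 should show
    # non-zero cyclic correlation at alpha = k * f0, and only noise elsewhere.
    fs, f0 = 1000.0, 50.0
    t = np.arange(0, 2.0, 1 / fs)
    rng = np.random.default_rng(0)
    x = (1 + 0.5 * np.cos(2 * np.pi * f0 * t)) * rng.standard_normal(t.size)
    print(abs(cyclic_autocorrelation(x, alpha=f0, lag=0, fs=fs)))        # large
    print(abs(cyclic_autocorrelation(x, alpha=1.37 * f0, lag=0, fs=fs))) # near zero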
Estilos ABNT, Harvard, Vancouver, APA, etc.
37

Thiebaut, Carole. "Caractérisation multidimensionnelle des images astronomiques : application aux images TAROT". Toulouse, INPT, 2003. http://www.theses.fr/2003INPT022H.

Texto completo da fonte
Resumo:
Automatic observatories produce a considerable volume of data, which requires robust and fast processing. The objective of this thesis is to develop processing algorithms for astronomical images, and in particular for those of the TAROT telescope. We first integrated the source-extraction steps; this work allowed us to build catalogues of the objects in each image. We then developed an automatic classifier to identify the nature of the extracted sources. The second part of this study is the construction of signals describing the evolution of the luminosity of astronomical sources over time. We developed analysis methods for irregularly sampled astronomical time series. These analyses allowed us to extract the main characteristics of these signals, with a view to the automatic identification of variable stars.
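A standard tool for the irregularly sampled time series mentioned above is the Lomb-Scargle periodogram. The following sketch is illustrative only, using synthetic data rather than TAROT light curves, and does not reproduce the thesis's own methods:

    import numpy as np
    from scipy.signal import lombscargle

    # Synthetic, irregularly sampled light curve of a periodic variable star.
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0.0, 30.0, 200))          # observation epochs (days)
    period = 2.7
    mag = 0.3 * np.sin(2 * np.pi * t / period) + 0.05 * rng.standard_normal(t.size)

    # Lomb-Scargle periodogram over a grid of trial frequencies (cycles per day);
    # lombscargle expects angular frequencies, hence the factor 2*pi.
    freqs = np.linspace(0.05, 5.0, 5000)
    power = lombscargle(t, mag - mag.mean(), 2 * np.pi * freqs)
    print("best period ~", 1.0 / freqs[np.argmax(power)], "days")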
Estilos ABNT, Harvard, Vancouver, APA, etc.
38

Vena, Phumla Faith. "Integration of xylan extraction prior to kraft and sodaAQ pulping from South African grown Eucalyptus grandis, giant bamboo and sugarcane bagasse to produce paper pulps, value added biopolymers and fermentable sugars". Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019.1/80116.

Texto completo da fonte
Resumo:
Thesis (PhD)--Stellenbosch University, 2013.
The extraction, prior to pulping, of hemicelluloses that would otherwise be dissolved in black liquor during the pulping process is an attractive alternative for pulp and paper mills: in addition to their core products, they can increase their revenue by producing biofuels, biopolymers, paper additives and other chemicals. However, the amount of hemicelluloses extracted is limited by the requirement to maintain pulp yield and pulp quality compared with existing pulping processes. In the present study, mild alkaline (NaOH) and dilute sulphuric acid conditions were used to extract hemicelluloses from Eucalyptus grandis, giant bamboo (Bambusa balcooa) and sugarcane (Saccharum officinarum) bagasse (SCB) prior to kraft or sodaAQ pulping. The effects of catalyst concentration, temperature and reaction time on hemicellulose pre-extraction were studied using a statistical experimental design, to identify conditions under which hemicelluloses could be extracted prior to alkaline pulping with minimal interference with the cellulose (glucan) content. Selected pre-extracted materials were subsequently subjected to kraft or sodaAQ pulping to evaluate the effect of hemicellulose pre-extraction on cooking chemicals, pulp yield and pulp properties. The study also included the evaluation of hot-water hemicellulose pre-extraction of SCB, as part of the dilute sulphuric acid experimental design. Pulp yield, cooking chemicals and handsheet strength properties were compared with those obtained from kraft or sodaAQ pulping of non-extracted raw materials. The results showed that the alkaline pre-extraction options investigated preserve pulp yield with minimal effect on handsheet strength properties, depending on the choice of the subsequent pulping method, while a fraction of the xylan is extracted in polymeric form. In addition, less active alkali was required to delignify the xylan-extracted materials. Integrating alkaline hemicellulose pre-extraction into a kraft pulping process was preferred for giant bamboo and E. grandis, since it maintained pulp yields at the desired industrial level of 50% and produced pulps within a bleachable kappa number range. Another advantage was the reduction of the total cooking active alkali required to delignify alkaline-extracted giant bamboo or E. grandis by 8 or 3 percentage points, respectively. However, maintaining the pulp yield required limiting solubilisation to only 13.6% or 12.4% of polymeric xylan from giant bamboo or E. grandis, respectively. A slight improvement in handsheet burst index was observed for extracted giant bamboo, while for pulps produced from extracted E. grandis the pulp viscosity increased by 13%, owing to the removal of low-molecular-weight hemicelluloses, and the handsheet breaking strength increased by 8.9%. In the case of sugarcane bagasse, alkaline hemicellulose pre-extraction integrated well with the sodaAQ pulping process: it enabled a xylan recovery of 69.1% while providing a higher screened pulp yield (45.0%) with an advantageous decrease in kappa number (15.5), and the handsheet tear index was superior, without reduction in viscosity, compared with pulp produced from non-extracted SCB. In contrast, optimised dilute sulphuric acid pre-extraction of all the tested feedstocks negatively affected the subsequent kraft or sodaAQ pulping processes, resulting in lower pulp yields and poorer strength properties, although the losses were smaller with sodaAQ pulping than with kraft pulping, since sodaAQ protects the carbohydrates against the peeling reaction in alkaline medium. Pre-extraction of SCB with hot water resulted in a low concentration of xylooligomers (5.7%), while the subsequent sodaAQ pulping showed no reduction in pulp yield; the tear index and optical brightness of the handsheet papers produced from hot-water-extracted SCB were slightly improved, while the breaking length, tensile and burst indexes were similar to those of pulps produced from non-extracted SCB fibres. Equally important were the higher tear and burst indexes of handsheets produced from giant bamboo compared with E. grandis, for both extracted and non-extracted materials prepared under similar pulping conditions; the advantage of bamboo is due to its greater fibre length and morphological properties that differ from those of hardwoods. However, the pulps produced from giant bamboo showed higher kappa numbers than those produced from E. grandis, owing to the strong condensation behaviour of bamboo lignins under alkaline conditions; the higher kappa numbers explain the higher demand for subsequent bleaching chemicals. In conclusion, the pulp mill biorefinery concept based on hemicellulose pre-extraction with NaOH can be realised with modified kraft or sodaAQ pulping processes, but it depends on the type of raw material, the extraction method, and the quality and performance requirements of the particular paper. Lower demand for pulping chemicals, comparable pulp yields and improvements in some physico-chemical properties of the pulps from pre-extracted materials were observed. Furthermore, owing to xylan pre-extraction, a larger amount of (extracted) material could be loaded into the digester than when non-extracted materials were used.
Estilos ABNT, Harvard, Vancouver, APA, etc.
39

Barbu, Eduard. "Extracting conceptual structures from multiple sources". Doctoral thesis, Università degli studi di Trento, 2010. https://hdl.handle.net/11572/368347.

Texto completo da fonte
Resumo:
This thesis extracts conceptual structures from multiple sources: Wordnet, Web corpora and Wikipedia. The conceptual structures extracted from Wordnet and Web corpora are inspired by the feature-norm effort in cognitive psychology, while the conceptual structure extracted from Wikipedia makes the transition between feature-norm structures and theory-like structures. The main contributions of this thesis fall into two categories: 1. Novel methods for the extraction of conceptual structures. More precisely, we developed three new methods: (a) Conceptual structure extraction from Wordnet. We devise a procedure for property extraction from Wordnet using the notion of semantic neighbourhood. The procedure exploits the main relations organising the nouns, the information in glosses, and the inheritance-of-properties principle. (b) Feature-norm-like extraction from corpora. We propose a method to acquire feature-norm-like structures from corpora using weakly supervised methods. (c) Conceptual structure from Wikipedia. A novel unsupervised method for the extraction of conceptual structures from the Wikipedia entries of similar concepts is put forward. The main idea we follow is that similar concepts (i.e. those classified under the same node in a taxonomy) are described in a comparable way in Wikipedia. Moreover, to understand the kind of information extracted from Wikipedia, we annotate this knowledge with a set of property types. 2. Evaluation. We evaluate Wordnet as a model of semantic memory and suggest the addition of new semantic relations. We also assess the properties extracted from all sources on a unified test set, in a clustering experiment.
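As a minimal sketch of property harvesting in the spirit of point (a), the following uses NLTK's WordNet interface, with the semantic neighbourhood reduced to the hypernym chain and its glosses for brevity; the thesis's actual procedure is richer:

    from nltk.corpus import wordnet as wn  # requires: nltk.download('wordnet')

    def neighbourhood_properties(word, depth=2):
        """Collect gloss text from a synset and its hypernym chain.

        Glosses of the semantic neighbourhood are one source of candidate
        properties; walking up the hypernyms crudely implements the
        inheritance-of-properties principle.
        """
        synset = wn.synsets(word, pos=wn.NOUN)[0]   # most frequent noun sense
        props = {synset.name(): synset.definition()}
        frontier = [synset]
        for _ in range(depth):
            frontier = [h for s in frontier for h in s.hypernyms()]
            for h in frontier:
                props[h.name()] = h.definition()
        return props

    for name, gloss in neighbourhood_properties("dog").items():
        print(f"{name}: {gloss}")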
Estilos ABNT, Harvard, Vancouver, APA, etc.
40

Barbu, Eduard. "Extracting conceptual structures from multiple sources". Doctoral thesis, University of Trento, 2010. http://eprints-phd.biblio.unitn.it/180/2/eduardThesis.pdf.

Texto completo da fonte
Resumo:
This thesis extracts conceptual structures from multiple sources: Wordnet, Web corpora and Wikipedia. The conceptual structures extracted from Wordnet and Web corpora are inspired by the feature-norm effort in cognitive psychology, while the conceptual structure extracted from Wikipedia makes the transition between feature-norm structures and theory-like structures. The main contributions of this thesis fall into two categories: 1. Novel methods for the extraction of conceptual structures. More precisely, we developed three new methods: (a) Conceptual structure extraction from Wordnet. We devise a procedure for property extraction from Wordnet using the notion of semantic neighbourhood. The procedure exploits the main relations organising the nouns, the information in glosses, and the inheritance-of-properties principle. (b) Feature-norm-like extraction from corpora. We propose a method to acquire feature-norm-like structures from corpora using weakly supervised methods. (c) Conceptual structure from Wikipedia. A novel unsupervised method for the extraction of conceptual structures from the Wikipedia entries of similar concepts is put forward. The main idea we follow is that similar concepts (i.e. those classified under the same node in a taxonomy) are described in a comparable way in Wikipedia. Moreover, to understand the kind of information extracted from Wikipedia, we annotate this knowledge with a set of property types. 2. Evaluation. We evaluate Wordnet as a model of semantic memory and suggest the addition of new semantic relations. We also assess the properties extracted from all sources on a unified test set, in a clustering experiment.
Estilos ABNT, Harvard, Vancouver, APA, etc.
41

Sofianos, Stratis. "Singing voice extraction from stereophonic recordings". Thesis, University of Hertfordshire, 2013. http://hdl.handle.net/2299/10054.

Texto completo da fonte
Resumo:
Singing voice separation (SVS) can be defined as the process of extracting the vocal element from a given song recording. The impetus for research in this area is mainly that of facilitating important applications of music information retrieval (MIR) such as lyrics recognition, singer identification, and melody extraction. To date, research in the field of SVS has been relatively limited and mainly focused on the extraction of vocals from monophonic sources. The general approach in this scenario has been to treat SVS as a blind source separation (BSS) problem. Given the inherent diversity of music, such an approach is motivated by the quest for a generic solution; however, it does not allow the exploitation of prior information regarding the way in which commercial music is produced. To this end, investigations are conducted into effective methods for the unsupervised separation of the singing voice from stereophonic studio recordings. The work involves an extensive literature review of existing methods related to SVS, as well as of commercial approaches. Following the identification of shortcomings of the conventional methods, two novel approaches, termed SEMANICS and SEMANTICS, are developed for the purpose of SVS. They draw their motivation from statistical as well as spectral properties of the target signal and focus on the separation of the voice in the frequency domain. In addition, a third method, named Hybrid SEMANTICS, is introduced that addresses time-domain as well as frequency-domain separation. As there is a lack of a standardised music database that includes a large number of songs, a dataset was created using conventional stereophonic mixing methods. Using this database, and based on widely adopted objective metrics, the effectiveness of the proposed methods has been evaluated through thorough experimental investigations.
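Stereo studio mixes typically pan the lead vocal to the centre, which is exactly the kind of production knowledge such methods can exploit. The following is a crude frequency-domain illustration of that idea (the classic centre-channel extraction trick, not the SEMANICS or SEMANTICS algorithms themselves; the STFT parameters and threshold are assumptions):

    import numpy as np
    from scipy.signal import stft, istft

    def extract_centre(left, right, fs, threshold=0.4):
        """Keep time-frequency bins where the two channels agree (centre-panned)."""
        _, _, L = stft(left, fs, nperseg=2048)
        _, _, R = stft(right, fs, nperseg=2048)
        # Normalised inter-channel difference: close to 0 for centre-panned sources.
        diff = np.abs(L - R) / (np.abs(L) + np.abs(R) + 1e-12)
        mask = diff < threshold
        _, voice = istft(0.5 * (L + R) * mask, fs, nperseg=2048)
        return voice

    # Usage sketch: voice = extract_centre(x[:, 0], x[:, 1], fs) for a stereo array x.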
Estilos ABNT, Harvard, Vancouver, APA, etc.
42

Rougier, Simon. "Apport des images satellites à très haute résolution spatiale couplées à des données géographiques multi-sources pour l’analyse des espaces urbains". Thesis, Strasbourg, 2016. http://www.theses.fr/2016STRAH019/document.

Texto completo da fonte
Resumo:
Climate change presents cities with significant environmental challenges, and urban planners need decision-making tools and a better knowledge of their territory. A first objective is to better understand the link between the grey and green infrastructures in order to analyse and represent them. The second objective is to propose a methodology for mapping the urban structure at the urban-fabric scale, taking the grey and green infrastructures into account. In current databases, vegetation is not mapped exhaustively, so the first step is to extract tree and grass vegetation from Pléiades satellite images using object-based image analysis and an active-learning classification. Based on those classifications and on multi-source data, an approach based on knowledge discovery in databases is proposed, focused on a set of indicators drawn mostly from urbanism and landscape ecology. The methodology is built on Strasbourg and then applied to Rennes to validate it and check its reproducibility.
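The active-learning classification mentioned above can be sketched as an uncertainty-sampling loop. The following is a toy illustration with scikit-learn, not the thesis's pipeline; query_labels is a hypothetical stand-in for the human photo-interpreter:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def uncertainty_sampling(X_labeled, y_labeled, X_pool, rounds=10, batch=20):
        """Toy active-learning loop: repeatedly query the least confident samples."""
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        for _ in range(rounds):
            clf.fit(X_labeled, y_labeled)
            proba = clf.predict_proba(X_pool)
            # Least-confident criterion: smallest maximum class probability.
            query = np.argsort(proba.max(axis=1))[:batch]
            y_new = query_labels(X_pool[query])   # hypothetical human annotator
            X_labeled = np.vstack([X_labeled, X_pool[query]])
            y_labeled = np.concatenate([y_labeled, y_new])
            X_pool = np.delete(X_pool, query, axis=0)
        return clf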
Estilos ABNT, Harvard, Vancouver, APA, etc.
43

Soubercaze-Pun, Geoffroy. "De l’étude de bruit basse fréquence à la conception d’oscillateur en bande X à partir de transistor AlGaN/GaN HEMT". Toulouse 3, 2007. http://www.theses.fr/2007TOU30081.

Texto completo da fonte
Resumo:
This work is dedicated to the low-frequency noise characterisation of Gallium Nitride High Electron Mobility Transistors (HEMTs) and to the design of an X-band low-phase-noise oscillator. First, we describe the intrinsic properties of Gallium Nitride, the HEMT structure, and the associated noise sources that can occur in such devices; the low-frequency noise (LFN) measurement methodology and bench are also presented. A comparative study is then presented, using low-frequency noise measurements, between devices grown on different substrates (Si, SiC, Al2O3). Finally, an investigation of the 1/f^g noise and the frequency index g is performed, indicating a correlation between the frequency index g and the transport mechanism of the carriers in the two-dimensional electron gas (2DEG) or in a parasitic AlGaN channel between drain and gate; this study makes use of both LFN measurements and physical simulations. The second part focuses on HEMTs grown on a SiC substrate: low-frequency noise spectra are investigated, and a procedure for extracting their mathematical components is presented and validated. A detailed study is then made, thanks to the extraction of the noise sources versus bias and under different thermal stress conditions, to find the origin of the G-R centres and to localise them; a correlation between this study and SIMS measurements is presented. The last section of this work deals with large-signal modelling and the X-band oscillator: an original, accurate and fast modelling technique is proposed as an alternative to the usually time-consuming traditional techniques. The oscillator is then designed, and its state-of-the-art performance is discussed (POUT = 20 dBm, Lf(100 kHz) = -105 dBc/Hz at 10 GHz).
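Estimating the frequency index g of a 1/f^g spectrum is, in its simplest form, a straight-line fit in log-log coordinates. The following sketch uses synthetic data and is illustrative only; it is not the extraction procedure developed in the thesis:

    import numpy as np

    rng = np.random.default_rng(1)
    f = np.logspace(0, 4, 200)                                # 1 Hz .. 10 kHz
    g_true = 1.2
    S = 1e-9 / f**g_true * rng.lognormal(0.0, 0.1, f.size)    # noisy 1/f^g PSD

    # The slope of log10(S) versus log10(f) gives -g.
    slope, intercept = np.polyfit(np.log10(f), np.log10(S), 1)
    print("estimated frequency index g =", -slope)            # ~1.2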
Estilos ABNT, Harvard, Vancouver, APA, etc.
44

Rougier, Simon. "Apport des images satellites à très haute résolution spatiale couplées à des données géographiques multi-sources pour l’analyse des espaces urbains". Electronic Thesis or Diss., Strasbourg, 2016. http://www.theses.fr/2016STRAH019.

Texto completo da fonte
Resumo:
Climate change presents cities with significant environmental challenges, and urban planners need decision-making tools and a better knowledge of their territory. A first objective is to better understand the link between the grey and green infrastructures in order to analyse and represent them. The second objective is to propose a methodology for mapping the urban structure at the urban-fabric scale, taking the grey and green infrastructures into account. In current databases, vegetation is not mapped exhaustively, so the first step is to extract tree and grass vegetation from Pléiades satellite images using object-based image analysis and an active-learning classification. Based on those classifications and on multi-source data, an approach based on knowledge discovery in databases is proposed, focused on a set of indicators drawn mostly from urbanism and landscape ecology. The methodology is built on Strasbourg and then applied to Rennes to validate it and check its reproducibility.
Estilos ABNT, Harvard, Vancouver, APA, etc.
45

Bryś, Tomasz. "Extraction of ultracold neutrons from a solid Deuterium source /". Zürich : ETH, 2007. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=17350.

Texto completo da fonte
Estilos ABNT, Harvard, Vancouver, APA, etc.
46

Wang, Zongkang. "A source-extraction based coupling method for computational aeroacoustics". Thesis, University of Greenwich, 2004. http://gala.gre.ac.uk/6339/.

Texto completo da fonte
Resumo:
This thesis involves the computation of aerodynamically generated sound using a source-extraction based coupling approach. In the present coupling method, the unsteady aerodynamic calculation and the calculation of sound propagation are artificially separated. A set of acoustic perturbation equations is derived by decomposing all flow variables into a dominant part and a fluctuating part and neglecting some small-magnitude terms; it is further simplified into a set of isentropic perturbation equations. Accompanying the derivation of the acoustic perturbation equations, a new formulation for extracting the acoustic source terms contained in the unsteady flow field is proposed. The acoustic source terms required for solving the acoustic perturbation equations are computed numerically from the time-dependent solutions of the unsteady flow field. The simulation of the unsteady flow mainly uses a cell-centred finite volume method based on the unsteady Reynolds-averaged Navier-Stokes (RANS) equations; a large eddy simulation (LES) technique is also employed in the investigation of one application case. A powerful and efficient high-order dispersion-relation-preserving (DRP) finite difference scheme with fully staggered-grid variable arrangements is implemented for the solution of the acoustic perturbation equations, and the performance of a set of radiation boundary conditions is examined for various background flows. A suitable and efficient coupling procedure, in conjunction with the source-extraction formulation, is designed between the cell-centred finite-volume CFD solver and the fully staggered finite-difference acoustic solver. A range of acoustic model problems is investigated to assess the feasibility and accuracy of the source-extraction formulation and the associated coupling procedure; these model problems include the propagation, reflection, interaction and scattering of an acoustic pulse with and without a background mean flow. The accuracy of the computational results for these model problems is very encouraging when reasonable computational mesh sizes and time steps are used in both the CFD solver and the acoustic solver. Several applications of the source-extraction based coupling method to more complex cases have also been examined: 1) generation and propagation of sound by a series of vortices impinging on a finite thin flat plate; 2) generation and propagation of sound from a subsonic flow past a finite thin flat plate at a small angle of attack; 3) generation and near-field radiation of aerodynamic sound from a low-speed, laminar flow over a two-dimensional automobile door cavity; and 4) flow-induced noise from an open-cavity turbulent flow. These application calculations have preliminarily demonstrated the capability and potential of the new source-extraction formulation for solving more realistic aeroacoustic problems.
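The decomposition underlying such perturbation equations can be written compactly. For illustration, a generic linearised, isentropic form about a slowly varying mean flow is shown below in LaTeX (a standard textbook system with a generic source term S; the thesis derives its own variant with numerically extracted source terms):

    q(\mathbf{x},t) = \bar{q}(\mathbf{x}) + q'(\mathbf{x},t), \qquad q \in \{\rho,\ \mathbf{u},\ p\},

    \partial_t \rho' + \nabla\cdot\left(\bar{\rho}\,\mathbf{u}' + \rho'\,\bar{\mathbf{u}}\right) = 0,

    \partial_t \mathbf{u}' + (\bar{\mathbf{u}}\cdot\nabla)\,\mathbf{u}' + (\mathbf{u}'\cdot\nabla)\,\bar{\mathbf{u}} + \frac{1}{\bar{\rho}}\,\nabla p' = \mathbf{S},

    p' = \bar{c}^{\,2}\rho' \quad \text{(isentropic closure)}.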
Estilos ABNT, Harvard, Vancouver, APA, etc.
47

Acharya, Rupesh. "Object Oriented Design Pattern Extraction From Java Source Code". Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-207394.

Texto completo da fonte
Resumo:
In software architecture reconstruction, design pattern detection plays a vital role, since the presence of a pattern reflects a point of design decision. Most of the approaches studied so far focus only on the Gang of Four (GOF) design patterns, so those tools are not flexible enough to identify other, proprietary pattern instances. Moreover, a GOF design pattern can be implemented in various ways, which many tools fail to detect. Apart from that, design patterns are not the only structures of vital importance for software architecture reconstruction: other patterns, such as anti-patterns and the presence of bad-smell code, are equally important. The approach discussed here is therefore a solution for detecting any pattern instance (not only GOF patterns) in source code, provided that the relevant information is extracted during the static analysis phase. Our approach is based on a graph pattern matching technique, where the source code is modelled as a graph and the pattern to search for is provided as a graph query pattern. For the detection of patterns we focus on structural and behavioural analysis of the source code, as in the tool PINOT. The novelty of our approach compared to PINOT is that the choice of behavioural analysers can be provided as a constraint in the graph query pattern, rather than being hardcoded as in PINOT. Moreover, we can provide more than one constraint in the graph query pattern at node, edge or whole-graph level; we can thus compose the query pattern as we want, which helps us specify new kinds of patterns and handle varying implementations of design patterns as well.
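The graph-query idea can be sketched with a generic subgraph-isomorphism search. The following toy example uses networkx rather than the tooling of the thesis, and the node and edge labels are invented for illustration:

    import networkx as nx
    from networkx.algorithms import isomorphism

    # Toy code graph: classes as nodes, labelled edges for relations.
    code = nx.DiGraph()
    code.add_node("Button", kind="class")
    code.add_node("Widget", kind="class")
    code.add_node("WidgetFactory", kind="class")
    code.add_edge("Button", "Widget", rel="extends")
    code.add_edge("WidgetFactory", "Button", rel="creates")

    # Query pattern: some class F that creates an instance C of a subclass of P.
    query = nx.DiGraph()
    query.add_node("F", kind="class")
    query.add_node("P", kind="class")
    query.add_node("C", kind="class")
    query.add_edge("C", "P", rel="extends")
    query.add_edge("F", "C", rel="creates")

    matcher = isomorphism.DiGraphMatcher(
        code, query,
        node_match=isomorphism.categorical_node_match("kind", None),
        edge_match=isomorphism.categorical_edge_match("rel", None),
    )
    for mapping in matcher.subgraph_isomorphisms_iter():
        print(mapping)  # e.g. {'WidgetFactory': 'F', 'Button': 'C', 'Widget': 'P'}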
Estilos ABNT, Harvard, Vancouver, APA, etc.
48

Dmour, Mohammad A. "Mixture of beamformers for speech separation and extraction". Thesis, University of Edinburgh, 2010. http://hdl.handle.net/1842/4685.

Texto completo da fonte
Resumo:
In many audio applications, the signal of interest is corrupted by acoustic background noise, interference, and reverberation. The presence of these contaminations can significantly degrade the quality and intelligibility of the audio signal, which makes it important to develop signal processing methods that can separate the competing sources and extract a source of interest. The estimated signals may then be directly listened to, transmitted, or further processed, giving rise to a wide range of applications such as hearing aids, noise-cancelling headphones, human-computer interaction, surveillance, and hands-free telephony. Many of the existing approaches to speech separation and extraction rely on beamforming techniques. These techniques approach the problem from a spatial point of view: a microphone array is used to form a spatial filter which can extract a signal from a specific direction and reduce the contamination of signals from other directions. However, when there are fewer microphones than sources (the underdetermined case), perfect attenuation of all interferers becomes impossible and only partial interference attenuation is possible. In this thesis, we present a framework which extends the use of beamforming techniques to underdetermined speech mixtures. We describe frequency-domain non-linear mixtures of beamformers that can extract a speech source from a known direction. Our approach models the data in each frequency bin via Gaussian mixture distributions, which can be learned using the expectation-maximization algorithm. The model learning is performed using the observed mixture signals only, and no prior training is required. The signal estimator comprises a set of minimum mean square error (MMSE), minimum variance distortionless response (MVDR), or minimum power distortionless response (MPDR) beamformers. In order to estimate the signal, all beamformers are concurrently applied to the observed signal, and the weighted sum of the beamformers' outputs is used as the signal estimate, where the weights are the estimated posterior probabilities of the Gaussian mixture states. These weights are specific to each time-frequency point. The resulting non-linear beamformers do not need to know or estimate the number of sources, and can be applied to microphone arrays with two or more microphones in an arbitrary array configuration. We test and evaluate the described methods on underdetermined speech mixtures. Experimental results for the non-linear beamformers on underdetermined mixtures with room reverberation confirm their capability to successfully extract speech sources.
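The estimator described above is a posterior-weighted sum of beamformer outputs per time-frequency point. A minimal single-bin sketch follows (generic MVDR weights and a simplified state posterior; the EM training of the Gaussian mixture is omitted and all mixture parameters are assumed given):

    import numpy as np

    def mvdr_weights(R, a):
        # MVDR beamformer weights: w = R^{-1} a / (a^H R^{-1} a).
        Ri_a = np.linalg.solve(R, a)
        return Ri_a / (a.conj() @ Ri_a)

    def mixture_of_beamformers(x, states, a):
        """Single frequency bin: posterior-weighted sum of MVDR outputs.

        x: (M,) complex observation vector; a: (M,) steering vector of the
        target direction; states: list of (prior, R) mixture components,
        assumed already learned.
        """
        # Log-likelihood of x under each zero-mean complex Gaussian state
        # (additive constants cancel in the posterior).
        logliks = np.array([
            np.log(prior) - np.linalg.slogdet(R)[1]
            - np.real(x.conj() @ np.linalg.solve(R, x))
            for prior, R in states
        ])
        post = np.exp(logliks - logliks.max())
        post /= post.sum()                      # P(state | x)
        outputs = np.array([mvdr_weights(R, a).conj() @ x for _, R in states])
        return np.dot(post, outputs)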
Estilos ABNT, Harvard, Vancouver, APA, etc.
49

Gueroult, Renaud. "Étude d'une source d'ions obtenue par extraction et accélération à partir d'une source plasma filaire". Phd thesis, Palaiseau, Ecole polytechnique, 2011. https://pastel.hal.science/docs/00/64/68/21/PDF/these.pdf.

Texto completo da fonte
Resumo:
This work first models a DC low-pressure wire plasma source and then characterises the properties of an ion gun derived from it. A particle-in-cell code suited to modelling the operation of the wire plasma source is developed and validated by comparison with the results of an experimental study. In light of the simulation results, an analysis of the wire discharge in terms of a collisional Child-Langmuir ion flow in cylindrical geometry is proposed; the mode transition is interpreted as a natural reorganisation of the discharge when the current is increased above a threshold value that is a function of the discharge voltage, the pressure and the inter-electrode distance. In addition, the analysis of the energy distribution function of the ions impacting the cathode demonstrates the ability to extract an ion beam with a low energy spread around the discharge voltage, provided that the discharge is operated in its high-pressure mode. An ion source prototype allowing the extraction and acceleration of ions from the wire source is then proposed. The experimental study of this device confirms that, apart from a shift corresponding to the accelerating voltage, the acceleration scheme does not spread the ion velocity distribution function along the beam axis. It is therefore possible to produce tunable-energy (0-5 keV) ion beams of various ionic species with limited energy dispersion (~ 10 eV). Typical beam currents are a few tens of micro-amperes, and the divergence of such a beam is of the order of one degree. A numerical model of the ion source is finally developed in order to identify potential optimisations of the concept.
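For reference, the classical planar, collisionless Child-Langmuir law that the thesis's cylindrical, collisional analysis generalises can be written in LaTeX as:

    J = \frac{4\,\varepsilon_0}{9}\,\sqrt{\frac{2e}{M}}\;\frac{V^{3/2}}{d^{2}}

for ions of charge e and mass M accelerated through a gap d under voltage V. For fixed geometry, the space-charge-limited current density scales as V^{3/2}, which is consistent with the threshold-current picture described above; the cylindrical, collisional version derived in the thesis differs in its geometric factors.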
Estilos ABNT, Harvard, Vancouver, APA, etc.
50

Gueroult, Renaud. "Étude d'une source d'ions obtenue par extraction et accélération à partir d'une source plasma filaire". Phd thesis, Ecole Polytechnique X, 2011. http://pastel.archives-ouvertes.fr/pastel-00646821.

Texto completo da fonte
Resumo:
This work first models a DC low-pressure wire plasma source and then characterises the properties of an ion gun derived from it. A particle-in-cell code suited to modelling the operation of the wire plasma source is developed and validated by comparison with the results of an experimental study. In light of the simulation results, an analysis of the wire discharge in terms of a collisional Child-Langmuir ion flow in cylindrical geometry is proposed; the mode transition is interpreted as a natural reorganisation of the discharge when the current is increased above a threshold value that is a function of the discharge voltage, the pressure and the inter-electrode distance. In addition, the analysis of the energy distribution function of the ions impacting the cathode demonstrates the ability to extract an ion beam with a low energy spread around the discharge voltage, provided that the discharge is operated in its high-pressure mode. An ion source prototype allowing the extraction and acceleration of ions from the wire source is then proposed. The experimental study of this device confirms that, apart from a shift corresponding to the accelerating voltage, the acceleration scheme does not spread the ion velocity distribution function along the beam axis. It is therefore possible to produce tunable-energy (0-5 keV) ion beams of various ionic species with limited energy dispersion (~ 10 eV). Typical beam currents are a few tens of micro-amperes, and the divergence of such a beam is of the order of one degree. A numerical model of the ion source is finally developed in order to identify potential optimisations of the concept.
Estilos ABNT, Harvard, Vancouver, APA, etc.