Dissertations on the topic "Extraction des sources"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 50 dissertations for research on the topic "Extraction des sources".
Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference to the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication in PDF format and read its online annotation, provided the relevant parameters are available in the metadata.
Browse dissertations from a wide range of disciplines and compile your bibliography correctly.
González-Valentín, Karen M. (Karen Mercedes) 1978. „Extraction of variation sources due to layout practices“. Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/87206.
Includes bibliographical references (p. 97).
by Karen M. González-Valentín.
S.M.
Yankova-Doseva, Milena. „TERMS - Text Extraction from Redundant and Multiple Sources“. Thesis, University of Sheffield, 2010. http://etheses.whiterose.ac.uk/933/.
Leouffre, Marc. „Extraction de sources d'électromyogrammes et évaluation des tensions musculaires“. Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENT009/document.
Evaluation of muscle tensions in movement and gait sciences is of great interest in the fields of sports, health or ergonomics. Biomechanics in particular has addressed these problems and developed the use of inverse kinematics to compute internal muscle tensions from external physical measures. Muscular redundancy remains however a complex issue: there are more muscles than degrees of freedom and thus more unknown variables, which makes inverse kinematics an under-determined problem needing optimization techniques to be solved. In this context, using electromyography (EMG), an electro-physiological signal that can be measured on the skin surface, gives an idea of the underlying muscle activities. Knowledge of muscle activities could provide additional information to feed into the optimization procedures and could help improve the accuracy of estimated muscle tensions during real gestures or gait situations. There are even situations in which measuring external physical variables like forces, positions or accelerations is not feasible, because it might require equipment incompatible with the object of the study. This is often the case in ergonomics, when equipping the object of the study with sensors is either too expensive or physically too cumbersome. In such cases EMG can become very handy as a non-invasive measure that does not require the environment to be equipped with other sensors. EMG, however, has its own limits: surface EMG on small and closely located muscles, like the muscles of the forearm, can be subject to "cross-talk". Cross-talk is the cross-contamination of several sensors; it is the result of the signals of more than one muscle propagating to a single sensor. In the presence of cross-talk it is not possible to associate an EMG sensor with a given muscle. There are signal processing techniques dealing with this kind of problem. Source separation techniques allow estimation of unknown sources from several sensors recording mixtures of these sources. Applying source separation techniques to EMG can provide EMG source estimations reflecting individual muscle activities without the effect of cross-talk. First, the benefits of using surface EMG during an ergonomics study of an innovative human-computer interface are shown. EMG pointed out a relatively high level of muscle co-contraction that can be explained by the need to stabilize the joints for a more accurate control of the device. It seems legitimate to think that using source separation techniques would provide signals that better represent single muscle activities, and that these would improve the quality of this study. Then, the precise experimental conditions under which linear instantaneous source separation techniques work are studied. The validity of the instantaneity hypothesis in particular is tested on real surface EMG signals, and its strong dependency on relative sensor locations is shown. Finally, a method to improve the robustness of linear instantaneous source separation with respect to the instantaneity hypothesis is proposed. This method relies on non-negative matrix factorization of EMG signal envelopes.
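As an illustrative aside, and not code from the thesis above, the closing idea (separating muscle activities by non-negative matrix factorization of EMG envelopes) can be sketched as follows; the recording array, sampling rate and number of sources are assumed placeholders.

```python
# Minimal sketch: non-negative matrix factorization of surface-EMG envelopes,
# in the spirit of the approach summarised above. The array `emg`
# (channels x samples), the sampling rate and the number of sources are assumed.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
fs = 2000                                   # assumed sampling rate in Hz
emg = rng.standard_normal((8, 10 * fs))     # placeholder for 8 recorded channels

# Build non-negative envelopes: rectify, then smooth with a moving average.
window = int(0.1 * fs)
kernel = np.ones(window) / window
envelopes = np.array([np.convolve(np.abs(ch), kernel, mode="same") for ch in emg])

# Factorise the envelope matrix into mixing weights W and source envelopes H.
model = NMF(n_components=3, init="nndsvda", max_iter=500)
W = model.fit_transform(envelopes)   # (channels x sources) mixing estimate
H = model.components_                # (sources x samples) estimated source envelopes
print(W.shape, H.shape)
```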
Dumitrescu, Stefan Daniel. „L' extraction d'information des sources de données non structurées et semi-structurées“. Toulouse 3, 2011. http://thesesups.ups-tlse.fr/1555/.
Thesis objective: In the context of recently developed large-scale knowledge sources (general ontologies), investigate possible new approaches to major areas of Information Extraction (IE) and related fields. The thesis overviews the field of Information Extraction and focuses on the task of entity recognition in natural language texts, a required step for any IE system. Given the availability of large knowledge resources in the form of semantic graphs, an approach that treats the sub-tasks of Word Sense Disambiguation and Named Entity Recognition in a unified manner is possible. The first implemented system using this approach recognizes entities (words, both common and proper nouns) from free text and assigns them ontological classes, effectively disambiguating them. A second implemented system, inspired by the semantic information contained in the ontologies, also attempts a new approach to the classic problem of text classification, showing good results.
Horton, Bryan. „Rotational motion of pendula systems for wave energy extraction“. Thesis, Available from the University of Aberdeen Library and Historic Collections Digital Resources, 2009. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?application=DIGITOOL-3&owner=resourcediscovery&custom_att_2=simple_viewer&pid=25873.
Der volle Inhalt der QuelleFerreira, Lage Sandra. „The neurotoxin β-N-methylamino-L-alanine (BMAA) : Sources, bioaccumulation and extraction procedures“. Doctoral thesis, Stockholms universitet, Institutionen för ekologi, miljö och botanik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-132142.
At the time of the doctoral defense, the following paper was unpublished and had a status as follows: Paper 4: Manuscript.
Xu, Xu. „Nonlinear dynamics of parametric pendulum for wave energy extraction“. Thesis, University of Aberdeen, 2005. http://digitool.abdn.ac.uk:80/webclient/DeliveryManager?pid=189414.
Albezzawy, Muhammad Nabil Mustafa. „Advanced signal processing methods for source identification using references“. Electronic Thesis or Diss., Lyon, INSA, 2024. http://www.theses.fr/2024ISAL0074.
Rank-reduced reference/coherence techniques based on the use of references, i.e. fixed sensors, are widely used to solve the two equivalent problems of source extraction and resynchronization encountered during remote sensing of physical fields, when the number of references surpasses the number of incoherent sources. In such a case, the cross-spectral matrix (CSM) becomes ill-conditioned, resulting in the invalidity of the least squares (LS) solution. Although the truncated singular value decomposition (TSVD) was successfully applied in the literature to solve this problem, its validity is limited to the case of scalar noise on the references. It is also very difficult to define a truncation threshold when the singular values are gradually decreasing. This thesis proposes a solution based on finding a set of virtual references that is maximally correlated with the field measurements, named the maximally-coherent reference (MCR) technique. This solution is optimal, especially in the case of correlated noise on the references, where the TSVD fails. However, the technique also includes an eigenvalue truncation step, similar to the one required for the TSVD, which necessitates a priori knowledge or the estimation of the number of incoherent sources, i.e. source enumeration, an ill-posed inverse problem insufficiently investigated in the literature within the framework of reference techniques. In this thesis, after providing a unified formalism for all the reference techniques in the literature, three alternative source enumeration methods applicable to all the reference techniques are presented, namely a direct likelihood ratio test (LRT) against the saturated model, a parametric bootstrap technique and a cross-validation approach. A comparative study is performed among the three methods, based on simulated numerical data, real sound experimental data, and real electrical motor data. The results showed two important outcomes. The first is that the number of snapshots (spectral windows) used in the spectral analysis greatly affects the performance of the three methods, and that they behave differently for the same number of snapshots. The second is that parametric bootstrapping turned out to be the best method in terms of both estimation accuracy and robustness with regard to the number of snapshots used. Finally, the MCR technique accompanied by bootstrapping was employed for source extraction and resynchronization of real data from laboratory experiments and an e-motor, and it returned better results than the LS solution and the TSVD when employed for the same purpose.
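Purely as an illustration of the truncated-SVD step discussed above, and not the author's implementation, the sketch below estimates a reference-to-field transfer matrix at one frequency with a rank-truncated pseudo-inverse; the snapshot matrices and the number of incoherent sources are assumed.

```python
# Minimal sketch: truncated-SVD (TSVD) solution of  Y ≈ H R  at one frequency,
# built from spectral snapshots. R (references x snapshots) and
# Y (field sensors x snapshots) are assumed inputs.
import numpy as np

def tsvd_transfer(R, Y, n_sources):
    """Transfer-matrix estimate using a rank-truncated pseudo-inverse of the
    reference cross-spectral matrix."""
    n_snap = R.shape[1]
    Srr = R @ R.conj().T / n_snap        # reference cross-spectral matrix
    Syr = Y @ R.conj().T / n_snap        # field/reference cross-spectra
    U, s, Vh = np.linalg.svd(Srr)
    Srr_pinv = (Vh[:n_sources].conj().T
                @ np.diag(1.0 / s[:n_sources])
                @ U[:, :n_sources].conj().T)
    return Syr @ Srr_pinv

rng = np.random.default_rng(1)
R = rng.standard_normal((4, 200)) + 1j * rng.standard_normal((4, 200))
Y = rng.standard_normal((16, 200)) + 1j * rng.standard_normal((16, 200))

H = tsvd_transfer(R, Y, n_sources=2)
Y_coherent = H @ R                       # source-related (coherent) part of the field
print(H.shape, Y_coherent.shape)
```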
Syed, Ali Asgher. „Hole extraction layer/perovskite interfacial modification for high performing inverted planar perovskite solar cells“. HKBU Institutional Repository, 2018. https://repository.hkbu.edu.hk/etd_oa/553.
Der volle Inhalt der QuelleZaman, Tauhid R. „Information extraction with network centralities : finding rumor sources, measuring influence, and learning community structure“. Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/70410.
Der volle Inhalt der QuelleCataloged from PDF version of thesis.
Includes bibliographical references (p. 193-197).
Network centrality is a function that takes a network graph as input and assigns a score to each node. In this thesis, we investigate the potential of network centralities for addressing inference questions arising in the context of large-scale networked data. These questions are particularly challenging because they require algorithms which are extremely fast and simple so as to be scalable, while at the same time they must perform well. It is this tension between scalability and performance that this thesis aims to resolve by using appropriate network centralities. Specifically, we solve three important network inference problems using network centrality: finding rumor sources, measuring influence, and learning community structure. We develop a new network centrality called rumor centrality to find rumor sources in networks. We give a linear time algorithm for calculating rumor centrality, demonstrating its practicality for large networks. Rumor centrality is proven to be an exact maximum likelihood rumor source estimator for random regular graphs (under an appropriate probabilistic rumor spreading model). For a wide class of networks and rumor spreading models, we prove that it is an accurate estimator. To establish the universality of rumor centrality as a source estimator, we utilize techniques from the classical theory of generalized Polya's urns and branching processes. Next we use rumor centrality to measure influence in Twitter. We develop an influence score based on rumor centrality which can be calculated in linear time. To justify the use of rumor centrality as the influence score, we use it to develop a new network growth model called topological network growth. We find that this model accurately reproduces two important features observed empirically in Twitter retweet networks: a power-law degree distribution and a superstar node with very high degree. Using these results, we argue that rumor centrality is correctly quantifying the influence of users on Twitter. These scores form the basis of a dynamic influence tracking engine called Trumor which allows one to measure the influence of users in Twitter or more generally in any networked data. Finally we investigate learning the community structure of a network. Using arguments based on social interactions, we determine that the network centrality known as degree centrality can be used to detect communities. We use this to develop the leader-follower algorithm (LFA) which can learn the overlapping community structure in networks. The LFA runtime is linear in the network size. It is also non-parametric, in the sense that it can learn both the number and size of communities naturally from the network structure without requiring any input parameters. We prove that it is very robust and learns accurate community structure for a broad class of networks. We find that the LFA does a better job of learning community structure on real social and biological networks than more common algorithms such as spectral clustering.
by Tauhid R. Zaman.
Ph.D.
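As an illustrative aside and not the thesis code, the rumor-centrality idea summarised in the abstract above can be sketched on a tree; this naive version runs in quadratic time rather than the linear time reported in the thesis, and the example tree is assumed.

```python
# Minimal sketch: rumor centrality of every node v in a tree,
# R(v) = n! / (product of the subtree sizes when the tree is rooted at v).
import math
import networkx as nx

def rumor_centrality(tree):
    n = tree.number_of_nodes()
    scores = {}
    for root in tree.nodes:
        sizes = {}

        def subtree_size(node, parent):
            size = 1
            for nb in tree.neighbors(node):
                if nb != parent:
                    size += subtree_size(nb, node)
            sizes[node] = size
            return size

        subtree_size(root, None)
        # Work in logs to avoid overflowing n! for larger trees.
        log_r = math.lgamma(n + 1) - sum(math.log(s) for s in sizes.values())
        scores[root] = math.exp(log_r)
    return scores

# Toy example: on a small assumed tree, the node with the highest score is the
# maximum-likelihood rumor source under the spreading model sketched above.
T = nx.balanced_tree(r=2, h=3)
scores = rumor_centrality(T)
print(max(scores, key=scores.get))
```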
Cooper, Erica L. „Automatic repair and recovery for Omnibase : robust extraction of data from diverse Web sources“. Thesis, Massachusetts Institute of Technology, 2010. http://hdl.handle.net/1721.1/61157.
Der volle Inhalt der QuelleCataloged from PDF version of thesis.
Includes bibliographical references (p. 31).
In order to make the best use of the multitude of diverse, semi-structured sources of data available on the internet, information retrieval systems need to reliably access the data on these different sites in a manner that is robust to changes in format or structure that these sites might undergo. An interface that gives a system uniform, programmatic access to the data on some web site is called a web wrapper, and the process of inferring a wrapper for a given website based on a few examples of its pages is known as wrapper induction. A challenge of using wrappers for online information extraction arises from the dynamic nature of the web: even the slightest of changes to the format of a web page may be enough to invalidate a wrapper. Thus, it is important to be able to detect when a wrapper no longer extracts the correct information, and also for the system to be able to recover from this type of failure. This thesis demonstrates improved error detection as well as methods of recovery and repair for broken wrappers for START, a natural-language question-answering system developed by Infolab at MIT.
by Erica L. Cooper.
M.Eng.
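As a toy illustration of the breakage-detection problem described in the abstract above, and not the Omnibase/START code, a wrapper can be paired with simple sanity checks on what it extracts; the page snippets, extracted field and plausibility range are invented for the example.

```python
# Minimal sketch: flag a broken web wrapper by checking that it still extracts
# a plausible value. The page layout, field and bounds are assumptions.
import re

def extract_temperature(html):
    """Toy wrapper: pull a temperature value out of one known page layout."""
    match = re.search(r'<span class="temp">(-?\d+)\s*&deg;C</span>', html)
    return int(match.group(1)) if match else None

def wrapper_ok(html):
    value = extract_temperature(html)
    # Breakage is flagged when nothing is extracted or the value is implausible.
    return value is not None and -90 <= value <= 60

old_page = '<span class="temp">21 &deg;C</span>'
new_page = '<div class="temperature">21 °C</div>'   # the site changed its layout
print(wrapper_ok(old_page), wrapper_ok(new_page))    # True False
```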
Miranda, Ackerman Eduardo Jacobo. „Extracting Causal Relations between News Topics from Distributed Sources“. Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-130066.
Lawrie, Scott. „Understanding the plasma and improving extraction of the ISIS Penning H⁻ ions source“. Thesis, University of Oxford, 2017. https://ora.ox.ac.uk/objects/uuid:1648761a-57b1-4d6f-8281-9d1c36ccd46a.
Der volle Inhalt der QuelleLindqvist, Max. „Insights into the plasma and beam physics close to the extraction surface in H⁻/D⁻ sources for fusion based on 3D-PIC MCC modeling“. Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASP133.
Negative hydrogen or deuterium ions for the ITER Neutral Beam Injection system are produced in radio-frequency ion sources, mainly by surface production, and accelerated through a multi-aperture grid system. One of the main limiting factors during the operation of such ion sources is the amount of co-extracted electrons, particularly during operation with D. For a correct description of the particle dynamics close to the Plasma Grid (PG), where a 3D magnetic field is present, self-consistent 3D-PIC MCC modeling is needed. The 3D-PIC MCC code ONIX has been used to simulate one PG aperture in the ELISE ion source, a half-size ITER-like prototype ion source at IPP Garching. The original code was improved by adding a plasma generation module that allows modeling the biasing of the PG with respect to the source walls. ONIX was coupled with the beam code IBSimu to allow the correlation of particle properties from the plasma to the beam and to study the extraction probability and beam divergence of negative ions for different configurations of PG biases and PG geometries, providing insights into grid optimization. The impact of plasma parameters, PG bias, and grid geometry on the co-extraction of electrons is presented. By increasing the PG bias above the floating potential, the amount and temporal instability of co-extracted electrons are strongly decreased, in agreement with experiments. Additionally, a lower electron temperature of around 1 eV can reduce the amount of co-extracted electrons by a factor of 4 compared to 2 eV. While the PG geometries studied did not have a significant impact on the co-extraction of electrons, the plasma-facing angle of the PG affects the extraction probability and accumulation of surface-produced negative ions due to the acceleration by the Debye sheath. With a shallow angle, more ions are transported to the central region near the meniscus, resulting in a lower core divergence. As for future work, developing a self-consistent physical process for simulating the accumulation of surface-produced negative ions in the plasma volume is important. This is challenging due to the time-scale and domain-size constraints inherent in 3D-PIC MCC modeling. As of now, achieving the experimentally measured density of surface-produced negative ions has not been realized in 3D-PIC MCC simulations.
Serrano, Laurie. „Vers une capitalisation des connaissances orientée utilisateur : extraction et structuration automatiques de l'information issue de sources ouvertes“. Caen, 2014. http://www.theses.fr/2014CAEN2011.
Due to the considerable increase of freely available data (especially on the Web), the discovery of relevant information from textual content is a critical challenge. Open Source Intelligence (OSINT) specialists are particularly concerned by this phenomenon as they try to mine large amounts of heterogeneous information to acquire actionable intelligence. This collection process is still largely done by hand in order to build knowledge sheets summarizing all the knowledge acquired about a specific entity. Given this context, the main goal of this thesis work is to reduce and facilitate the daily work of intelligence analysts. To this end, our research revolves around three main axes: knowledge modeling, text mining and knowledge gathering. We explored the literature related to these different domains to develop a global knowledge gathering system. Our first contribution is the building of a domain ontology dedicated to knowledge representation for OSINT purposes, which comprises a specific definition and modeling of the event concept for this domain. Secondly, we have developed and evaluated an event recognition system which is based on two different extraction approaches: the first one is based on hand-crafted rules and the second one on a frequent pattern learning technique. As our third contribution, we proposed a semantic aggregation process as a necessary post-processing step to enhance the quality of the events extracted and to convert extraction results into actionable knowledge. This is achieved by means of multiple similarity measures between events, expressed according to a qualitative scale designed following our end users' needs.
Valentin, Sarah. „Extraction et combinaison d’informations épidémiologiques à partir de sources informelles pour la veille des maladies infectieuses animales“. Thesis, Montpellier, 2020. http://www.theses.fr/2020MONTS067.
Epidemic intelligence aims to detect, investigate and monitor potential health threats while relying on formal (e.g. official health authorities) and informal (e.g. media) information sources. Monitoring of unofficial sources, or so-called event-based surveillance (EBS), requires the development of systems designed to retrieve and process unstructured textual data published online. This manuscript focuses on the extraction and combination of epidemiological information from informal sources (i.e. online news), in the context of the international surveillance of animal infectious diseases. The first objective of this thesis is to propose and compare approaches to enhance the identification and extraction of relevant epidemiological information from the content of online news. The second objective is to study the use of epidemiological entities extracted from the news articles (i.e. diseases, hosts, locations and dates) in the context of event extraction and retrieval of related online news. This manuscript proposes new textual representation approaches by selecting, expanding, and combining relevant epidemiological features. We show that adapting and extending text mining and classification methods improves the added value of online news sources for event-based surveillance. We stress the role of domain expert knowledge regarding the relevance and the interpretability of the methods proposed in this thesis. While our research is conducted in the context of animal disease surveillance, we discuss the generic aspects of our approaches regarding unknown threats and One Health surveillance.
Gutierrez, Alejandro. „Extraction et manipulation d'information structurée sous la forme de graphe à partir de sources de données existantes“. Versailles-St Quentin en Yvelines, 1997. http://www.theses.fr/1997VERS0015.
Verzeroli, Elodie. „Source NAPIS et Spectromètre PSI-TOF dans le projet ANDROMEDE“. Thesis, Université Paris-Saclay (ComUE), 2017. http://www.theses.fr/2017SACLS221/document.
The goal of the ANDROMEDE project is to create a new instrument for sub-micrometric ion imaging and analysis by mass spectrometry, using ion impacts on nano-objects present on the solid sample surface and, more particularly, on biological samples. In-vitro and in-vivo analysis of these types of samples mostly requires complex preparation and even atmospheric-pressure experimentation. This unique instrument opens a new path for surface analysis and characterization, which is complementary to the standard methods and techniques used today. In the ANDROMEDE project, two elements have been developed in our study: the NAPIS source, which delivers the nanoparticles allowing the increase of the secondary ion yield, and the PSI-TOF mass spectrometer for the chemical analysis of the elements emitted from the sample surface. The NAPIS source delivers a primary beam of nanoparticles accelerated in a 4 MeV Pelletron accelerator and driven to a target. The NAPIS nanoparticle source was first developed and validated independently at the ORSAY PHYSICS company before its coupling to the accelerator. The new extraction optics called ExOTOF, as well as the PSI-TOF orthogonal-extraction mass spectrometer, have been developed for reliable secondary ion studies and increased mass resolution. These instruments have been specially designed for this project. This development will allow an efficient extraction and analysis of the secondary ions emitted from the sample surface using continuous primary beams and will have applications for atmospheric-pressure studies. The assembly has been completely validated and the first tests of the output beam have been successfully carried out.
Yeh, Chunghsin. „Extraction de fréquences fondamentales multiples dans des enregistrements polyphoniques“. Paris 6, 2008. http://www.theses.fr/2008PA066261.
Arman, Molood. „Machine Learning Approaches for Sub-surface Geological Heterogeneous Sources“. Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG014.
In oil and gas exploration and production, understanding subsurface geological data, such as well logs and rock samples, is essential to provide predictive and decision-support tools. Gathering and using data from a variety of sources, both structured and unstructured, such as relational databases and digitized reports on the subsurface geology, is critical. The main challenge for the structured data is the lack of a global schema to cross-reference all attributes from the different sources. The challenges are different for unstructured data: most subsurface geological reports are scanned versions of documents. Our dissertation aims to provide a structured representation of the different data sources and to build domain-specific language models for learning named entities related to subsurface geology.
Yang, Seungwon. „Automatic Identification of Topic Tags from Texts Based on Expansion-Extraction Approach“. Diss., Virginia Tech, 2014. http://hdl.handle.net/10919/25111.
Ph. D.
Fargeas, Aureline. „Classification, feature extraction and prediction of side effects in prostate cancer radiotherapy“. Thesis, Rennes 1, 2016. http://www.theses.fr/2016REN1S022/document.
Prostate cancer is among the most common types of cancer worldwide. One of the standard treatments is external radiotherapy, which involves delivering ionizing radiation to a clinical target, in this instance the prostate and seminal vesicles. The goal of radiotherapy is to achieve maximal local control while sparing neighboring organs (mainly the rectum and the bladder) to avoid normal tissue complications. Understanding the dose/toxicity relationships is a central question for improving treatment reliability at the inverse planning step. Normal tissue complication probability (NTCP) toxicity prediction models have been developed in order to predict toxicity events using dosimetric data. The main information considered is the dose-volume histogram (DVH), which provides an overall representation of the dose distribution based on the dose delivered per percentage of organ volume. Nevertheless, current dose-based models display limitations as they are not fully optimized; most of them do not include additional non-dosimetric information (patient, tumor and treatment characteristics). Furthermore, they do not provide any understanding of local relationships between dose and effect (dose-space/effect relationship), as they do not exploit the rich information from the 3D planning dose distributions. In the context of rectal bleeding prediction after prostate cancer external beam radiotherapy, the objectives of this thesis are: i) to extract relevant information from DVH and non-dosimetric variables, in order to improve existing NTCP models, and ii) to analyze the spatial correlations between local dose and side effects, allowing a characterization of the 3D dose distribution at a sub-organ level. Thus, strategies aimed at exploiting the information from the radiotherapy planning (DVH and 3D planned dose distributions) were proposed. Firstly, based on independent component analysis, a new model for rectal bleeding prediction combining dosimetric and non-dosimetric information in an original manner was proposed. Secondly, we developed new approaches aimed at jointly taking advantage of the 3D planning dose distributions, which may unravel the subtle correlation between local dose and side effects, to classify and/or predict patients at risk of suffering from rectal bleeding and to identify regions which may be at the origin of this adverse event. More precisely, we proposed three stochastic methods based on principal component analysis, independent component analysis and discriminant non-negative matrix factorization, and one deterministic method based on the canonical polyadic decomposition of a fourth-order array containing the planned dose. The results obtained show that our new approaches in general exhibit better performance than state-of-the-art predictive methods.
Pruvost, L. „Extraction du bruit de combustion d'un moteur Diesel. Développement et application d'un spectrofiltre“. Phd thesis, INSA de Lyon, 2009. http://tel.archives-ouvertes.fr/tel-00429987.
Der volle Inhalt der QuelleMao, Jin, Lisa R. Moore, Carrine E. Blank, Elvis Hsin-Hui Wu, Marcia Ackerman, Sonali Ranade und Hong Cui. „Microbial phenomics information extractor (MicroPIE): a natural language processing tool for the automated acquisition of prokaryotic phenotypic characters from text sources“. BIOMED CENTRAL LTD, 2016. http://hdl.handle.net/10150/622562.
Kalledat, Tobias. „Tracking domain knowledge based on segmented textual sources“. Doctoral thesis, Humboldt-Universität zu Berlin, Wirtschaftswissenschaftliche Fakultät, 2009. http://dx.doi.org/10.18452/15925.
The research work available here has the goal of analysing the influence of pre-processing on the results of the generation of knowledge and of giving concrete recommendations for suitable pre-processing of text corpora in TDM. The research introduced here focuses on the extraction and tracking of concepts within certain knowledge domains using an approach of horizontally (timeline) and vertically (persistence of terms) segmenting corpora. The result is a set of corpora segmented according to the timeline. Within each timeline segment, clusters of concepts can be built according to their persistence quality in relation to each single time-based corpus segment and to the whole corpus. Based on a simple frequency measure, it can be shown that the statistical quality of a single corpus alone allows measuring the pre-processing quality; it is not necessary to use comparison corpora. Within an optimally pre-processed corpus, the time series of the frequency measure show significant negative correlations between the two clusters of concepts, those that occur permanently and those that vary. The opposite was found in every other test set that was pre-processed with lower quality. The most frequent terms were grouped into concepts by the use of domain-specific taxonomies. For corpora with a high quality level of pre-processing, a significant negative correlation was found between the time series of different terms per yearly corpus segment and the terms assigned to the taxonomy. A semantic analysis based on a simple TDM method with significant frequency threshold measures resulted in significantly different knowledge being extracted from corpora with different qualities of pre-processing. With the measures introduced in this research it is possible to measure the quality of the applied taxonomy. Rules for measuring corpus as well as taxonomy quality were derived from these results, and advice is suggested for the appropriate level of pre-processing.
Coly, Arona. „Etudes expérimentales de sources d'ions RCE à 2,45GHz pour la production de courants intenses“. Phd thesis, Université de Grenoble, 2010. http://tel.archives-ouvertes.fr/tel-00607679.
Klein, Philipp. „Non-Intrusive Information Sources for Activity Analysis in Ambient Assisted Living Scenarios“. Thesis, Mulhouse, 2015. http://www.theses.fr/2015MULH8932/document.
As people grow older, they are often faced with some degree of decreasing cognitive abilities or physical strength. Isolation from social life, poor quality of life, and an increased risk of injuries are the consequence. Ambient Assisted Living (AAL) is a vision for the way people live their life in their own home as they grow older: disabilities or limitations are compensated for by technology where care-giving personnel is scarce or relatives are unable to help. Affected people are assisted by technology. The term "Ambient" in AAL expresses what this technology needs to be, beyond assistive. It needs to integrate into the living environment in such a way that it is not recognized as such any more. Interaction with residents needs to be intuitive and natural. Technical equipment should be unobtrusive and well integrated. The areas of application targeted in this thesis are activity monitoring and activity pattern discovery in apartments or small houses. The acquisition of information regarding the residents' activity is vital for the success of any assistive technology. In many areas of daily life, this is routine already. State-of-the-art sensing technology includes cameras, light barriers, RFID sensors, radio signal localization using transponders, and pressure-sensitive floors. Due to their operating principles, they have a big impact on home and living environments. Therefore, this thesis is dedicated to research on non-intrusive activity information acquisition technology that has minimal impact on daily life. Two base technologies are taken into account in this thesis.
Solana, Ciprés Miriam. „Supercritical CO2 technologies for the production of bioactive compounds from natural sources“. Doctoral thesis, Università degli studi di Padova, 2016. http://hdl.handle.net/11577/3424461.
The demand for natural extracts and plant-derived compounds is increasing because of their growing use in functional foods, natural medicine, health additives, cosmetic products and pharmaceutical applications. These sectors demand ultra-clean, traceable, high-quality products, reliable equipment and competitive prices. So far, numerous plant sources have been studied as raw materials for the production of natural extracts. The idea of converting food waste is increasingly being considered, also because it is advantageous with respect to contamination issues, process management and economic routes. In addition, emerging natural sources such as microalgae contain high amounts of high-value compounds, whose extraction could be environmentally and economically advantageous. In this context, an efficient and "green" extraction and/or separation method is needed, not only to process unsaleable foods and waste and to exploit emerging natural sources, but also to create a product in high demand on the current market. Supercritical CO2 technology has emerged as an important alternative to traditional processes based on organic solvents or mechanical means, thanks to its moderate critical pressure, which keeps compression costs down, while its low critical temperature allows the extraction of heat-sensitive compounds without degradation. Further advantages over classical solvent extraction methods include the fact that CO2 is inert, non-toxic and environmentally friendly. Moreover, in supercritical processes CO2 is easily removed after depressurisation and allows faster cycles. On the other hand, the polar nature of most natural compounds makes it necessary to add co-solvents to the supercritical CO2 in order to improve the affinity of the fluid for polar compounds, and their effect is relevant to the composition of the extract and consequently to the economics of the process. In this case, it is important to use environmentally friendly solvents, such as water and ethanol, in order to retain the advantages of supercritical processes. Supercritical CO2 extraction is the most studied and applied of the processes using CO2 under pressure. However, a number of supercritical technologies are being studied and developed for other interesting applications, such as the precipitation of polar compounds (supercritical anti-solvent precipitation, SAS), which yields dry natural precipitates, or the separation of compounds in liquid mixtures (supercritical counter-current fractionation) to obtain purer bioactive compounds. The optimisation of the processes and operating variables used to extract compounds of interest from new natural sources is important to guarantee maximum yields of high quality and to make the final product suitable for use in the food, cosmetic and pharmaceutical industries. It is therefore essential to continue research on supercritical CO2 technologies for different materials and to generate new data that may be useful for the potential scale-up of the proposed processes.
For all these reasons, the objective of this research project was to assess the potential of supercritical CO2 technologies to obtain, in a safe, green and efficient way, new natural extracts rich in bioactive compounds from agricultural products and microalgae. The topics addressed in this thesis are organised into chapters as follows. Chapter 1 is an introductory discussion of the natural extracts market, the state of the different extraction methods, and the research and latest results reported for supercritical CO2 technologies. Chapters 2, 3, 4 and 5 present the experimental results and the modelling carried out on supercritical CO2 extraction from different natural sources, with the aim of verifying the effectiveness of this method for obtaining, in a competitive way, natural extracts rich in various bioactive compounds. Chapter 2 presents the supercritical CO2 extraction of essential fatty acids from three different microalgae species. The effect of the operating variables on total extraction yield and on solubility is studied, and the mathematical models developed by Sovová are applied to describe the experimental extraction curves. Chapter 3 reports the extraction of fractions enriched in different classes of bioactive compounds. Based on the results, a sequential extraction method is proposed, first using CO2 + ethanol for lipid extraction and then water as a co-solvent to obtain extracts rich in phenolic compounds and glucosinolates. Chapter 4 describes the economic evaluation of an industrial-scale plant for the production of natural extracts rich in glucosinolates and phenolic compounds from rocket. The Aspen Plus™ V8.2 software was used to simulate the large-scale process, based on the laboratory experimental measurements, and the effect of the operating parameters on process costs is evaluated. Chapter 5 focuses on the recovery of phenolic compounds from asparagus. Furthermore, the effect of different co-solvent mixtures on supercritical CO2 extraction is examined in order to selectively extract polyphenol molecules. The results are compared with pressurised liquid extraction (PLE) and with the Soxhlet method. In addition, the supercritical anti-solvent (SAS) process was applied with the aim of obtaining a dried precipitate rich in antioxidant compounds. Chapter 6 focuses on the SAS process with CO2 to obtain precipitates rich in polyphenol and anthocyanin compounds from cherries. Continuous and batch modes of operation are compared, and the effect of pressure and CO2 composition on the precipitation yields of polyphenols and anthocyanins is discussed. The third method studied is counter-current fractionation for the separation of compounds of interest from a liquid mixture. Chapter 7 reports the validation of the CO2 fractionation plant with a continuous packed column. To this end, the recovery of butanol from aqueous solutions was carried out. The influence of the operating variables, such as the solvent-to-solution flow-rate ratio, temperature, pressure and solution composition, was studied experimentally in terms of separation efficiency, percentage butanol removal rate, total removal and butanol concentration in the extract at the end of the continuous run.
Chapter 8 presents the use of counter-current CO2 as a means to reduce the residual fat in soybeans after enzyme-assisted aqueous extraction of soybeans. In particular, the effects of the solvent-to-feed ratio, of the addition of ethanol as a modifier and of the introduction of packing into the column are analysed. The results were interpreted by means of ANOVA statistical analysis. Finally, the conclusions discuss a synthesis of the thesis and the aspects that should be focused on to ensure the future of this technology.
Addo, Douglas Kweku. „OPERATION AND PROCESS CONTROL DEVELOPMENT FOR A PILOT-SCALE LEACHING AND SOLVENT EXTRACTION CIRCUIT RECOVERING RARE EARTH ELEMENTS FROM COAL-BASED SOURCES“. UKnowledge, 2019. https://uknowledge.uky.edu/mng_etds/50.
Herrault, Pierre-Alexis. „Extraction de fragments forestiers et caractérisation de leurs évolutions spatio-temporelles pour évaluer l'effet de l'histoire sur la biodiversité : une approche multi-sources“. Thesis, Toulouse 2, 2015. http://www.theses.fr/2015TOU20018/document.
Biodiversity in landscapes depends on landscape spatial patterns but can also be influenced by landscape history. Indeed, some species are likely to respond in the longer term to habitat disturbances. Therefore, in recent years, landscape dynamics have become a possible factor to explain current biodiversity. This GIS thesis is part of this historical-ecology context. We deal with the automatic extraction of forest patches and the characterization of their spatiotemporal evolution. The objective is to evaluate the effects of forest dynamics on the current diversity of forest hoverflies (Diptera: Syrphidae) in the agri-forestry landscape of the Coteaux de Gascogne. The proposed general approach consists of three main steps: (1) the production of the forest spatial database from heterogeneous sources, (2) the matching of forest patches and the characterization of their spatiotemporal evolution, (3) species-habitat modeling while integrating history as one of the factors likely to explain hoverfly diversity. Several methodological contributions were made. We proposed a new geometric correction approach based on kernel ridge regression to make the selected past and present data sources consistent. We also developed an automatic approach for extracting forest from the Historical Map of France of the 19th century. Finally, the effects of spatial uncertainty on ecological model responses were assessed. From an ecological viewpoint, a significant effect of the historical continuity of patches on forest hoverfly diversity was revealed. The most isolated fragments presented an extinction debt or a colonization credit according to the area dynamics that occurred in the last time period (1970-2010). As it turns out, 30 years were not sufficient for forest hoverflies to reach a new equilibrium after changes in isolated habitats.
Rodrigues, Dina Maria Ferreira. „Functional foods with innovative ingredients from seaweeds and mushrooms sources“. Doctoral thesis, Universidade de Aveiro, 2016. http://hdl.handle.net/10773/17075.
Natural resources such as seaweed and mushrooms can be used to obtain new products, providing an alternative and sustainable way to deliver new functional foods or ingredients with biological properties that may help to develop new strategies based on preventive health. The main driving force of this thesis was to find, in seaweeds and mushrooms, extracts with natural bioactive compounds to be used in the development of new functional foods. In order to do so, an integrated approach was established and the various steps were followed chronologically. Firstly, a set of edible seaweed species and a set of edible mushroom species were characterised for their proximate composition and potential bioactive compounds. Once the characterization was achieved, the need for suitable, fast, cost-effective and environmentally friendly extraction methodologies capable of extracting the biologically active natural ingredients of interest from both the seaweeds and the mushrooms led to the second stage. This stage involved the study of several extraction techniques (hot water extraction, enzyme- and ultrasound-assisted extraction and high hydrostatic pressure) in order to fully characterize the potential of the different natural sources, introducing different extraction selectivity and efficiency while aiming at maximum preservation of bioactivity. The next stage in this integrated research was related to the deeper chemical characterization of the bioactive components present in the four selected enzymatic extracts from seaweeds (S. muticum extract obtained with Alcalase and O. pinnatifida extract obtained with Viscozyme) and mushrooms (Ph. nameko extracts obtained with Cellulase and with Flavourzyme) where the target biological properties were confirmed or found to be most promising. The selected extracts with potential biological properties, following the chemical characterization, went through the evaluation of in vitro stability to confirm and consolidate their biological potential to be further explored within the functional food perspective. The last stage of this thesis involved the development of a new functional food by incorporating the two most promising and validated extracts (O. pinnatifida obtained with Viscozyme and Ph. nameko obtained with Flavourzyme) in a spreadable dairy cream, with assessment of their biological and technological potential. A functional spreadable dairy cream combining whey cheese and Greek-type yoghurt with incorporation of the selected extracts was successfully formulated and explored. The development of functional foods, or even nutraceuticals, from edible seaweed and mushroom extracts is feasible and could be extended by studying the incorporation of these extracts in other types of food such as beverages or ice cream.
Montellano, Duran Ivar Mauricio [Verfasser], und Ursel [Akademischer Betreuer] Fantz. „Application of a 3D Monte Carlo PIC code for modeling the particle extraction from negative ion sources / Ivar Mauricio Montellano Duran ; Betreuer: Ursel Fantz“. Augsburg : Universität Augsburg, 2019. http://d-nb.info/1203542690/34.
Der volle Inhalt der QuelleLane, Marshalle. „Dispersive liquid-liquid micro-extraction coupled with gas chromatography for the detection of trihalomethanes in different water sources in the Western Cape, South Africa“. Thesis, Cape Peninsula University of Technology, 2018. http://hdl.handle.net/20.500.11838/2852.
Trihalomethanes (THMs) are a group of four compounds that are formed, along with other disinfection by-products, when chlorine or other disinfectants used to control microbial contamination in drinking water react with natural organic or inorganic substances in water. Trihalomethanes are better known by their common names: chloroform, bromodichloromethane, chlorodibromomethane and bromoform. These four compounds are classified as group B carcinogens (shown to cause cancer in laboratory animals). Trihalomethane levels tend to increase with pH, temperature, time and the level of "precursors" present. Precursors are organic substances which react with chlorine to form THMs. One significant way of reducing the amount of THMs in water is to eliminate or reduce chlorination before filtration and to reduce precursors. There are guideline limits for THMs in the SANS 241:2015 document, but they are not continuously monitored and their levels in natural water are not known. The aim of this study is to develop a rapid and reliable liquid-liquid microextraction technique to determine the presence of THMs in natural water sources. This study particularly focuses on different water sources, e.g. river, underground, borehole and chlorinated water. Chlorinated water is water that has presumably been treated against bacterial and fungal growth. The results obtained for chlorinated water are as follows: 10.120 μg/L to 11.654 μg/L for chloroform, 2.214 μg/L to 2.666 μg/L for bromodichloromethane, 0.819 μg/L to 0.895 μg/L for chlorodibromomethane and 0.103 μg/L to 0.135 μg/L for bromoform, from validation data. All these THM concentrations were found to be below the SANS 241:2015 limits. Natural water shows a very high affinity for chloroform. This is what is expected under normal conditions, as chloroform is the most abundant of the THMs present in natural water. The liquid-liquid microextraction technique that was optimized and used for the determination of THMs in this study is a rapid, simple and inexpensive technique that provides low limits of detection (LOD), e.g. 0.1999 μg/L for chlorodibromomethane and 0.2056 μg/L for bromoform, and limits of quantification (LOQ) of 0.6664 μg/L for chlorodibromomethane and 0.6854 μg/L for bromoform, for the determination of THMs.
Baron, Valentin. „Méthodes d’identification de sources acoustiques paramétriques par mesures d’antennerie“. Thesis, Lyon, 2020. http://www.theses.fr/2020LYSEI121.
Acoustic source characterization aims to describe sound emitters through parameters such as their localization in space, the sound level they produce, or their identification thanks to their acoustic signature. In this thesis, the objective is to obtain some of these parameters in two industrial application cases, for sources located in the far field and by the use of acoustic arrays. The first application concerns the acoustic impact of deep-sea mining, in the context of the Abysound FUI project. Within it, the thesis seeks to characterize the excavation machines located on the seabed by assessing their localization and their sound level. First, a design phase led to the construction of a 3 m acoustic array. Then, using data from two experimental campaigns conducted in the Mediterranean Sea with this array, the high-resolution method MUSIC accurately localizes the acoustic sources used, either mobile and more than 600 m away from the array, or immersed at 700 m depth. Their sound level is then estimated by beamforming, and the expected levels are verified for monochromatic and wideband signals. In the second application, a complete procedure for the localization and identification of drones is proposed to protect sensitive areas. It combines array processing and machine learning through three key steps: localization, focalization, and identification. MUSIC again localizes nearby acoustic sources around the industrial array used, then focalization reconstructs each temporal signal, and an SVM model identifies them as drone or not. Experimental validations, indoors and outdoors, establish an important contribution of this thesis work. The acquired data show, for instance, that the procedure localizes drones with 3° accuracy outdoors, detects them at 99 %, and identifies them despite the presence of a more powerful source.
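To give a flavour of the MUSIC step mentioned above, here is a minimal sketch, not the processing chain developed in the thesis, of a narrowband MUSIC pseudo-spectrum; the uniform linear array geometry, frequency and source angle are assumed for the example.

```python
# Minimal sketch: narrowband MUSIC pseudo-spectrum for a uniform linear array.
# Sound speed, frequency, sensor spacing, array size and source angle are assumed.
import numpy as np

c, f, d, m = 343.0, 1000.0, 0.15, 8           # m/s, Hz, sensor spacing (m), sensors
steer = lambda theta: np.exp(-2j * np.pi * f * d * np.arange(m) * np.sin(theta) / c)

# Simulate snapshots for one source at 20 degrees plus sensor noise.
rng = np.random.default_rng(0)
theta0 = np.deg2rad(20.0)
s = rng.standard_normal(500) + 1j * rng.standard_normal(500)
noise = 0.1 * (rng.standard_normal((m, 500)) + 1j * rng.standard_normal((m, 500)))
X = np.outer(steer(theta0), s) + noise

R = X @ X.conj().T / X.shape[1]               # spatial covariance matrix
eigvals, eigvecs = np.linalg.eigh(R)          # eigenvalues in ascending order
En = eigvecs[:, :-1]                          # noise subspace (one source assumed)

angles = np.deg2rad(np.linspace(-90, 90, 361))
pseudo = [1.0 / np.linalg.norm(En.conj().T @ steer(a)) ** 2 for a in angles]
print(np.rad2deg(angles[int(np.argmax(pseudo))]))   # peaks near 20 degrees
```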
Camargo, Adriano Costa de. „Hurdles and potentials in value-added use of peanut and grape by-products as sources of phenolic compounds“. Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/11/11141/tde-09112016-171820/.
Recent studies have shown that by-products of the peanut and grape processing industries can be richer in bioactive compounds than their parent raw materials. However, some technological hurdles must be addressed before their application as a source of nutraceutical compounds or in the prevention of lipid oxidation in food systems. This study discusses recent advances in the use of peanut and grape processing by-products as sources of phenolic compounds. Special emphasis is given to their characterization by liquid chromatography coupled with mass spectrometry, to their potential health benefits and to microbiological safety. The main conclusions are presented in chapters 2, 3 and 4. The first chapter deals with bioactive compounds from by-products of the grape juice and winemaking industries. The fraction from which the cell-wall-bound phenolic compounds were extracted was predominant. In general, this fraction was also the most effective in inhibiting LDL-cholesterol oxidation in vitro when compared with the fraction containing the free and esterified phenolics. The phenolic compounds of all fractions inhibited peroxyl-radical-induced oxidative damage to DNA. The third chapter addresses the effects of gamma irradiation on the microbial load, phenolic composition and antioxidant properties of peanut skin. Gamma irradiation (5.0 kGy) decreased the microbial count of the product. Total phenolic compounds, proanthocyanidin content and the capacity of the extracts to scavenge radicals such as ABTS and DPPH and reactive oxygen species such as hydrogen peroxide and hydroxyl radicals, as well as the reducing power of the sample, increased with gamma irradiation in both fractions (containing free and cell-wall-bound phenolics). The bioactivity of the free phenolic compounds against LDL-cholesterol oxidation in vitro and against oxidative DNA damage increased with gamma irradiation. The phenolic compounds were positively or tentatively identified and were distributed as follows: free phenolics > esterified > bound. The concentration of procyanidin A dimers increased in all fractions, while the concentration of procyanidin B dimers decreased. These changes can be explained by molecular conversion, depolymerization and cross-linking. In the fourth and final chapter, selected enzymes were applied to the starting raw material (experiment I) or to residues containing only insoluble phenolic compounds (experiment II). Pronase and Viscozyme increased the extraction of insoluble (cell-wall-bound) phenolic compounds. Viscozyme released larger amounts of gallic acid, catechin and prodelphinidin A dimer than the Pronase treatment. In addition, p-coumaric and caffeic acids, as well as procyanidin B dimer, were extracted with Viscozyme but not with Pronase. Solubility plays an important role in the bioavailability of phenolic compounds. Thus, the third study offers an alternative for exploiting phenolic compounds from winemaking by-products as food ingredients with functional properties or as food supplements.
Boustany, Roger. „Séparation aveugle à l'ordre deux de sources cyclostationnaires : Application aux mesures vibroacoustiques“. Compiègne, 2005. http://www.theses.fr/2005COMP1596.
Der volle Inhalt der QuelleThe objective of this thesis is the blind separation of vibroacoustic sources. The fundamental assumption is the cyclostationarity of rotating-machine signals. The tools used are based on second-order cyclic statistics and in particular on the cyclic power spectrum. Blind source separation methods that explicitly exploit these tools are developed. We favour a frequency-domain approach to separate convolutive mixtures involving very long impulse responses, and we propose a cyclostationarity-based solution to the permutation problem. Approaches are then proposed for extracting a single cyclostationary source of interest buried under an unknown number of interferences and in the presence of very strong noise. A method based on reduced-rank cyclic regression is finally retained. Applications to real vibroacoustic signals are detailed to illustrate the very good performance of the methods
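As an aside on the tools named in this abstract: second-order cyclic statistics quantify how the autocorrelation of a signal varies periodically with time, and the cyclic power spectrum is the Fourier transform of the cyclic autocorrelation over the lag variable. The Python sketch below is only a minimal illustration of a naive cyclic-autocorrelation estimator on a synthetic amplitude-modulated noise signal; it is not the separation method developed in the thesis, and all names and parameter values are illustrative assumptions.

```python
import numpy as np

def cyclic_autocorrelation(x, alpha, max_lag, fs=1.0):
    """Naive estimate of R_x^alpha(tau) = <x[n] x[n+tau] exp(-j 2 pi alpha n / fs)>."""
    n = np.arange(len(x))
    demod = x * np.exp(-2j * np.pi * alpha * n / fs)
    r = np.empty(max_lag + 1, dtype=complex)
    for tau in range(max_lag + 1):
        r[tau] = np.mean(demod[: len(x) - tau] * np.conj(x[tau:]))
    return r

def cyclic_power_spectrum(x, alpha, max_lag, fs=1.0):
    """Cyclic power spectrum as the DFT of the cyclic autocorrelation."""
    return np.fft.fftshift(np.fft.fft(cyclic_autocorrelation(x, alpha, max_lag, fs)))

# Synthetic cyclostationary signal: white noise amplitude-modulated at f0 Hz.
# Its instantaneous power is periodic, so cyclic correlation appears at alpha = f0 and 2*f0.
fs, f0, N = 1000.0, 50.0, 200_000
t = np.arange(N) / fs
x = (1.0 + 0.8 * np.cos(2 * np.pi * f0 * t)) * np.random.default_rng(0).standard_normal(N)

print(abs(cyclic_autocorrelation(x, alpha=2 * f0, max_lag=0, fs=fs)[0]))  # clearly non-zero
print(abs(cyclic_autocorrelation(x, alpha=37.0, max_lag=0, fs=fs)[0]))    # close to zero
```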
Thiebaut, Carole. „Caractérisation multidimensionnelle des images astronomiques : application aux images TAROT“. Toulouse, INPT, 2003. http://www.theses.fr/2003INPT022H.
Der volle Inhalt der QuelleVena, Phumla Faith. „Integration of xylan extraction prior to kraft and sodaAQ pulping from South African grown Eucalyptus grandis, giant bamboo and sugarcane bagasse to produce paper pulps, value added biopolymers and fermentable sugars“. Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019.1/80116.
Der volle Inhalt der QuelleENGLISH ABSTRACT: Extracting, prior to pulping, the hemicelluloses that would otherwise dissolve in the black liquor during the pulping process is an attractive option for pulp and paper mills: in addition to their core products, they can increase revenue by producing biofuels, biopolymers, paper additives and other chemicals. However, the amount of hemicelluloses extracted is limited by the requirement to maintain pulp yield and pulp quality relative to existing pulping processes. In the present study, mild alkaline (NaOH) and dilute sulphuric acid conditions were used to extract hemicelluloses from Eucalyptus grandis, giant bamboo (Bambusa balcooa) and sugarcane (Saccharum officinarum) bagasse (SCB) prior to kraft or sodaAQ pulping. The effects of catalyst concentration, temperature and reaction time on hemicellulose pre-extraction were studied using a statistical experimental design, to identify conditions under which hemicelluloses could be extracted prior to alkaline pulping with minimal interference with the cellulose (glucan) content. Selected pre-extracted materials were then subjected to kraft or sodaAQ pulping to evaluate the effect of hemicellulose pre-extraction on cooking chemicals, pulp yield and pulp properties. The study also included hot-water pre-extraction of hemicelluloses from SCB, as it formed part of the dilute sulphuric acid experimental design. Pulp yield, cooking-chemical demand and handsheet strength properties were compared with those obtained from kraft or sodaAQ pulping of non-extracted raw materials. The results showed that the alkaline pre-extraction options investigated preserve pulp yield with minimal effect on handsheet strength properties, depending on the choice of the subsequent pulping method, while a fraction of the xylan was extracted in polymeric form. In addition, less active alkali was required to delignify the xylan-extracted materials. Integrating alkaline hemicellulose pre-extraction into a kraft pulping process was preferred for giant bamboo and E. grandis, since it maintained pulp yields at the desired industrial level of 50% and gave pulps within a bleachable kappa number range. A further advantage was the reduction in the total cooking active alkali required to delignify alkaline-extracted giant bamboo or E. grandis, by 8 or 3 percentage points respectively. Maintaining the pulp yield, however, required limiting solubilisation to only 13.6% or 12.4% of polymeric xylan from giant bamboo or E. grandis respectively. A slight improvement in handsheet burst index was observed for extracted giant bamboo. For pulps produced from extracted E. grandis, pulp viscosity increased by 13% owing to the removal of low-molecular-weight hemicelluloses, while the breaking strength of the handsheet increased by 8.9%. In the case of sugarcane bagasse, alkaline hemicellulose pre-extraction integrated well with the sodaAQ pulping process: it enabled a xylan recovery of 69.1% while providing a higher screened pulp yield (45.0%) and an advantageously lower kappa number (15.5). The handsheet tear index was superior, with no reduction in viscosity, compared with pulp produced from non-extracted SCB.
In contrast, optimised dilute sulphuric acid pre-extraction of all the tested feedstocks was found to impair the subsequent kraft or sodaAQ pulping processes, resulting in lower pulp yields and poorer strength properties. The losses were nonetheless smaller with sodaAQ pulping than with kraft pulping, as anthraquinone protects the carbohydrates against the peeling reaction in alkaline medium. Conversely, hot-water pre-extraction of SCB yielded only a low concentration of xylo-oligomers (5.7%), while the subsequent sodaAQ pulping showed no reduction in pulp yield. The tear index and optical brightness of handsheets produced from hot-water-extracted SCB were slightly improved, while breaking length, tensile index and burst index were similar to those of pulps produced from non-extracted SCB fibres. Equally important, handsheets produced from giant bamboo showed higher tear and burst indices than those from E. grandis, for both extracted and non-extracted materials prepared under similar pulping conditions. This advantage of bamboo is due to its greater fibre length and to morphological properties that differ from those of hardwoods. However, pulps produced from giant bamboo showed higher kappa numbers than pulps produced from E. grandis, owing to the strong condensation behaviour of bamboo lignins under alkaline conditions; the higher kappa numbers explain the higher demand for subsequent bleaching chemicals. In conclusion, the pulp mill biorefinery concept based on hemicellulose pre-extraction with NaOH can be realised with a modified kraft or sodaAQ pulping process, but its success depends on the type of raw material, the extraction method, and the quality and performance requirements of the particular paper. Lower pulping-chemical demand, comparable pulp yields and improvements in some physico-chemical properties of the pulps from pre-extracted materials were observed. Furthermore, owing to the xylan pre-extraction, a larger amount of (extracted) material could be loaded into the digester than when non-extracted materials were used.
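For readers unfamiliar with the statistical experimental designs mentioned in this abstract (catalyst concentration, temperature and reaction time as factors), the Python sketch below shows how a coded two-level factorial design and an ordinary least-squares estimate of the main effects are typically set up. All numbers are invented purely for illustration; they are not data from the thesis, and the factor names only mirror those listed in the abstract.

```python
import itertools
import numpy as np

# Coded 2^3 full-factorial design for three factors (e.g. NaOH concentration,
# temperature, reaction time), levels coded as -1 / +1.
design = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))

# Synthetic responses from an invented effects model, only to demonstrate the fitting
# step (NOT measurements from the thesis).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(len(design)), design])   # model matrix: intercept + 3 factors
y = X @ np.array([20.0, 4.0, 2.5, 1.0]) + rng.normal(0.0, 0.5, len(design))

# Ordinary least squares estimate of the intercept and main effects
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["intercept", "NaOH", "temperature", "time"], np.round(beta, 2))))
```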
Barbu, Eduard. „Extracting conceptual structures from multiple sources“. Doctoral thesis, Università degli studi di Trento, 2010. https://hdl.handle.net/11572/368347.
Der volle Inhalt der QuelleBarbu, Eduard. „Extracting conceptual structures from multiple sources“. Doctoral thesis, University of Trento, 2010. http://eprints-phd.biblio.unitn.it/180/2/eduardThesis.pdf.
Der volle Inhalt der QuelleSofianos, Stratis. „Singing voice extraction from stereophonic recordings“. Thesis, University of Hertfordshire, 2013. http://hdl.handle.net/2299/10054.
Der volle Inhalt der QuelleRougier, Simon. „Apport des images satellites à très haute résolution spatiale couplées à des données géographiques multi-sources pour l’analyse des espaces urbains“. Thesis, Strasbourg, 2016. http://www.theses.fr/2016STRAH019/document.
Der volle Inhalt der QuelleClimate change presents cities with significant environmental challenges. Urban planners need decision-making tools and better knowledge of their territory. A first objective is to better understand the link between the grey and green infrastructures in order to analyse and represent them. The second objective is to propose a methodology to map the urban structure at the urban-fabric scale, taking both grey and green infrastructures into account. In current databases, vegetation is not mapped exhaustively, so the first step is to extract tree and grass vegetation from Pléiades satellite images using object-based image analysis and an active-learning classification. Based on these classifications and multi-source data, an approach grounded in knowledge discovery in databases is proposed, focused on a set of indicators drawn mostly from urban planning and landscape ecology. The methodology is developed on Strasbourg and applied to Rennes to validate it and check its reproducibility
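The object-based classification with active learning described above can be summarised, independently of the thesis implementation, as a loop that repeatedly trains a classifier on the labelled segments and then queries the most uncertain unlabelled ones for annotation. The Python sketch below uses scikit-learn and randomly generated stand-in features; the feature meanings, pool size and query budget are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Stand-in per-segment features (e.g. spectral mean, NDVI, texture) and labels
# (0 = grass, 1 = tree, 2 = other); placeholders for real OBIA segment statistics.
X_pool = rng.normal(size=(5000, 3))
y_pool = rng.integers(0, 3, size=5000)

labelled = list(rng.choice(len(X_pool), size=30, replace=False))  # small initial training set

for _ in range(10):
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_pool[labelled], y_pool[labelled])

    # Uncertainty sampling: query the segments whose predicted class is least certain
    uncertainty = 1.0 - clf.predict_proba(X_pool).max(axis=1)
    uncertainty[labelled] = -1.0                 # never re-query already labelled segments
    queries = np.argsort(uncertainty)[-10:]      # ten most uncertain segments
    labelled.extend(int(i) for i in queries)     # "annotate" them (labels already known here)

print(f"labelled segments after active learning: {len(labelled)}")
```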
Soubercaze-Pun, Geoffroy. „De l’étude de bruit basse fréquence à la conception d’oscillateur en bande X à partir de transistor AlGaN/GaN HEMT“. Toulouse 3, 2007. http://www.theses.fr/2007TOU30081.
Der volle Inhalt der QuelleThis work is devoted to the low-frequency noise characterization of Gallium Nitride High Electron Mobility Transistors (HEMTs) and to the design of an X-band low-phase-noise oscillator. First, we describe the intrinsic properties of Gallium Nitride, the HEMT structure and the noise sources that can arise in such devices, and present the low-frequency noise (LFN) measurement methodology. A comparative LFN study of devices grown on different substrates (Si, SiC, Al2O3) is then presented. Finally, an investigation of the 1/f^γ noise and the frequency exponent γ is performed, indicating a correlation between γ and the transport mechanism of the carriers in the two-dimensional electron gas (2DEG) or in a parasitic AlGaN channel between drain and gate; this study combines LFN measurements with physical simulations. The second part focuses on HEMTs grown on SiC substrates: low-frequency noise spectra are investigated and a mathematical extraction procedure is presented. A detailed study is then carried out, based on the mathematical extraction of the noise sources versus bias and under different thermal stress conditions, to locate the origin of the generation-recombination (G-R) centers; a correlation between this study and SIMS measurements is presented. The last part of the work deals with large-signal modelling and the X-band oscillator: an original, accurate and fast modelling technique is proposed as an alternative to the usually time-consuming traditional techniques. The oscillator is then designed and its performance is discussed (Pout = 20 dBm, L(100 kHz) = -105 dBc/Hz at 10 GHz)
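The noise decomposition discussed in this abstract is commonly written as S(f) = A/f^γ + Σ_i B_i / (1 + (f/f_c,i)^2), i.e. a flicker term plus one Lorentzian per generation-recombination center. The Python sketch below fits such a model to a synthetic spectrum with scipy; it only illustrates the kind of extraction involved, not the procedure developed in the thesis, and every parameter value is invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def lfn_model(f, a, gamma, b, fc):
    """Flicker (1/f^gamma) noise plus a single generation-recombination Lorentzian."""
    return a / f**gamma + b / (1.0 + (f / fc) ** 2)

def log_model(f, log_a, gamma, log_b, log_fc):
    # Fit in log10 space so every frequency decade carries a similar weight
    return np.log10(lfn_model(f, 10.0**log_a, gamma, 10.0**log_b, 10.0**log_fc))

# Synthetic spectrum with invented parameters, only to demonstrate the extraction step
f = np.logspace(0, 5, 200)                              # 1 Hz .. 100 kHz
rng = np.random.default_rng(1)
s_meas = lfn_model(f, 1e-9, 1.1, 2e-12, 3e3) * (1.0 + 0.05 * rng.normal(size=f.size))

p0 = [-8.0, 1.0, -11.0, 3.5]                            # rough initial guess (log10 units)
popt, _ = curve_fit(log_model, f, np.log10(s_meas), p0=p0)
print(dict(zip(["log10(A)", "gamma", "log10(B)", "log10(fc)"], np.round(popt, 2))))
```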
Bryś, Tomasz. „Extraction of ultracold neutrons from a solid Deuterium source /“. Zürich : ETH, 2007. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=17350.
Der volle Inhalt der QuelleWang, Zongkang. „A source-extraction based coupling method for computational aeroacoustics“. Thesis, University of Greenwich, 2004. http://gala.gre.ac.uk/6339/.
Der volle Inhalt der QuelleAcharya, Rupesh. „Object Oriented Design Pattern Extraction From Java Source Code“. Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-207394.
Der volle Inhalt der QuelleDmour, Mohammad A. „Mixture of beamformers for speech separation and extraction“. Thesis, University of Edinburgh, 2010. http://hdl.handle.net/1842/4685.
Der volle Inhalt der QuelleGueroult, Renaud. „Étude d'une source d'ions obtenue par extraction et accélération à partir d'une source plasma filaire“. Phd thesis, Palaiseau, Ecole polytechnique, 2011. https://pastel.hal.science/docs/00/64/68/21/PDF/these.pdf.
Der volle Inhalt der QuelleIn this study we first model a DC low-pressure wire plasma source and then characterize the properties of an ion gun derived from it. To study the ion gun, we develop a particle-in-cell code suited to modelling the operation of the wire plasma source and validate it against the results of an experimental study. In light of the simulation results, an analysis of the wire discharge in terms of a collisional Child-Langmuir ion flow in cylindrical geometry is proposed. We interpret the mode transition as a natural reorganisation of the discharge when the current is increased above a threshold value that depends on the discharge voltage, the pressure and the inter-electrode distance. In addition, analysis of the energy distribution function of the ions impacting the cathode demonstrates the ability to extract an ion beam with a low energy spread around the discharge voltage, provided the discharge is operated in its high-pressure mode. An ion-source prototype allowing the extraction and acceleration of ions from the wire source is then proposed. The experimental study of this device confirms that, apart from a shift corresponding to the accelerating voltage, the acceleration scheme does not spread the ion velocity distribution function along the beam axis. It is therefore possible to produce tunable-energy (0-5 keV) ion beams of various ionic species with limited energy spread (~10 eV). Typical beam currents are a few tens of microamperes, and the beam divergence is on the order of one degree. A numerical model of the ion source is finally developed to identify potential optimizations of the concept
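As background for the Child-Langmuir analysis mentioned above, the collisionless planar form of the law reads J = (4 ε0/9) (2q/M)^{1/2} V^{3/2} / d². The short Python sketch below evaluates it for singly charged argon ions across a 1 cm gap; the ion species and the numerical values are illustrative assumptions only, since the thesis treats the more involved collisional, cylindrical case.

```python
import math

EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
E_CHARGE = 1.602176634e-19   # elementary charge, C
AMU = 1.66053906660e-27      # atomic mass unit, kg

def child_langmuir_current_density(voltage_v, gap_m, ion_mass_kg, charge_c=E_CHARGE):
    """Space-charge-limited current density (A/m^2) of the collisionless planar diode."""
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * charge_c / ion_mass_kg) \
           * voltage_v**1.5 / gap_m**2

# Example: singly charged argon ions, 1 kV across a 1 cm gap (illustrative numbers only)
j = child_langmuir_current_density(voltage_v=1000.0, gap_m=0.01, ion_mass_kg=40.0 * AMU)
print(f"space-charge-limited ion current density: {j:.3e} A/m^2")
```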
Gueroult, Renaud. „Étude d'une source d'ions obtenue par extraction et accélération à partir d'une source plasma filaire“. Phd thesis, Ecole Polytechnique X, 2011. http://pastel.archives-ouvertes.fr/pastel-00646821.
Der volle Inhalt der Quelle