
Theses on the topic "Techniques"

Cite a source in APA, MLA, Chicago, Harvard and many other citation styles


Consult the top 50 dissertations (degree or doctoral theses) on the research topic "Techniques".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in .pdf format and read its abstract (summary) online, if it is available in the metadata.

Browse theses from many different research areas and compile a correct bibliography.

1

Hassan, Alaa. "Proposition et développement d’une approche pour la maîtrise conjointe qualité/coût lors de la conception et de l’industrialisation du produit". Paris, ENSAM, 2010. http://www.theses.fr/2010ENAM0009.

Full text
Abstract:
The need for competitiveness today requires designing faster, better and cheaper than competitors. In this context, taking the product constraints into account depends on the feedback of performance indicators (cost, risk, quality, time, etc.). It is therefore important to evaluate these indicators in detail in order to return robust and coherent information. This information makes it possible to take good decisions for piloting the product development cycle so as to obtain a better quality/cost ratio. The interoperability of the QFD, FMEA and KCs approaches was studied in order to exploit them in a single quality management framework and to ensure the homogeneity and coherence of the quality indicators. Cost was taken into account by proposing the CbFMEA approach, based on classical FMEA, which estimates the cost of non-quality so as to assess the financial severity of product failures. This cost was added to the manufacturing cost of the product estimated by the ABC method. The cost approach (CbFMEA/ABC) was coupled with the quality approach (QFD/FMEA/KCs), resulting in a combined quality/cost approach illustrated by an activity diagram. An information model of the joint quality/cost approach was proposed and a software prototype was developed to validate the proposed concepts and model. An application of this approach was illustrated in the conceptual process planning phase via a case study. This application, named QCCPP, provides indicators of capability, risk and cost. The objective is to support decision-making during the multi-criteria selection of manufacturing alternatives for the joint improvement of the product quality/cost ratio.
APA, Harvard, Vancouver, ISO and other styles
2

Solomonov, Isak. "VaR Techniques". Thesis, Uppsala universitet, Analys och sannolikhetsteori, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-447476.

Full text
APA, Harvard, Vancouver, ISO and other styles
3

Baek, Kwang Ho. "Idea generation techniques : an analysis of three idea generating techniques". Virtual Press, 1998. http://liblink.bsu.edu/uhtbin/catkey/1100445.

Full text
Abstract:
This experiment was designed to give further understanding of the underlying factors which influence group idea generation. The first objective of this study was to compare the impact of using computer technology and traditional technologies for creating ideas. The effectiveness of three idea-generating techniques (original brainstorming, the nominal group technique, and electronic brainstorming) was considered. It was hypothesized that electronic brainstorming would outperform the nominal group technique and original brainstorming regardless of the length of time provided. The second objective of this study was to probe how subjects in different idea-generating conditions perceived their performance during and after the sessions. It was expected that subjects in the original brainstorming groups would perceive that they produced more ideas and that they would be more satisfied with the results and the process. An ANOVA with a 3x2 factorial design was planned for the study; the independent variables were type of group and type of session. However, on account of the small sample size an inferential analysis was precluded, and a descriptive analysis was carried out instead. The analysis of five dependent variables (quality, originality, practicality, number of non-overlapping ideas, and perceptions) showed no significant differences among the three idea generation techniques with respect to the length of time provided. However, the quantity variable showed that the number of non-overlapping ideas increased as the length of time was prolonged in the six idea-generating conditions.
Department of Counseling Psychology and Guidance Services
APA, Harvard, Vancouver, ISO and other styles
4

Sekkal, Rafiq. "Techniques visuelles pour la détection et le suivi d’objets 2D". Thesis, Rennes, INSA, 2014. http://www.theses.fr/2014ISAR0032/document.

Full text
Abstract:
Nowadays, image processing remains a very important step in many fields of application. For a navigation system on a mobile robot (an electric wheelchair) in an indoor environment, the detection and tracking of visual information is crucial to perform robotic tasks (localization, planning, etc.). In particular, when considering a door-passing task, it is essential to be able to detect and track automatically all the doors that belong to the environment. Door detection is not an obvious task: variations in the door status (open or closed), in appearance (e.g. the same colour as the walls) and in position relative to the camera all influence the results. On the other hand, tasks such as the detection of navigable areas or obstacle avoidance may involve a dedicated semantic representation to interpret the content of the scene. Segmentation techniques are then used to extract pseudo-semantic regions based on several criteria (colour, gradient, texture, etc.). When the temporal dimension is added, the regions are then tracked using spatio-temporal segmentation algorithms. In this thesis, we first present a joint door detection and tracking technique in a corridor environment: based on dedicated geometrical features, the proposed solution offers interesting results. Then, we present an original joint hierarchical and multiresolution segmentation framework able to extract a pseudo-semantic region representation. Finally, this technique is extended to video sequences to allow regions to be tracked along image sequences. Based on contour motion extraction, this solution has shown relevant results and can be successfully applied to corridor videos.
APA, Harvard, Vancouver, ISO and other styles
5

Kothari, Neeraj. "Novel Polarimetry Techniques". Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/19779.

Full text
Abstract:
Polarization-specific measurements are advancing the capabilities of scientific instruments looking for ever smaller effects and material parameters. For example, the magneto-optical nonlinear Faraday effect can be used to characterize various electric and magnetic polarizability parameters of an individual molecule. Another major application is the detection of desired particles in a highly scattering environment; the physical effect involved has been extensively researched and is being overcome by using time-gated and polarization techniques. Polarimeter sensitivity is limited by the extinction ratio obtained from polarizers. Of the available polarizer materials, naturally occurring calcite crystals provide the best extinction ratios because of their good optical homogeneity and high birefringence. However, there is a need for polarization determination with higher sensitivities, and thus a necessity to find better polarizing materials and methods. I developed a next-generation polarimeter in an attempt to sensitively detect the second-order Faraday effect, along with a substance's chirality and Verdet constant. I also developed a device uniquely able to sensitively detect chiral signatures in the presence of massive depolarizing scattering. In addition, I began developing a novel type of polarimeter based on the highly polarization-sensitive nonlinear-optical process of harmonic generation, whose required crystals can be grown with extremely high quality.
APA, Harvard, Vancouver, ISO and other styles
6

Faur, Andrei. "Memory Profiling Techniques". Thesis, Linköpings universitet, Programvara och system, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-79598.

Full text
Abstract:
Memory profiling is an important technique which aids program optimization and can even help track down bugs. The main problem with current memory profiling techniques and tools is that they slow down the target software considerably, therefore making them inadequate for mainline integration. Ideally, the user would be able to monitor memory consumption without having to worry about the rest of the software being affected in any way. This thesis provides a comparison of existing techniques and tools, along with the description of a memory profiler implementation which tries to provide a balance between the information it is able to retrieve and the influence it has on the target software.
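As a side illustration of the snapshot-style memory profiling the abstract refers to, here is a minimal Python sketch using the standard-library tracemalloc module; it is a generic example, not the profiler implemented in the thesis.

```python
import tracemalloc

def build_table(n):
    # Deliberately allocate some memory so the snapshot has something to show.
    return [str(i) * 10 for i in range(n)]

tracemalloc.start()                     # begin tracking Python memory allocations
table = build_table(100_000)
snapshot = tracemalloc.take_snapshot()

# Report the source lines responsible for the largest allocations.
for stat in snapshot.statistics("lineno")[:5]:
    print(stat)

current, peak = tracemalloc.get_traced_memory()
print(f"current: {current / 1e6:.1f} MB, peak: {peak / 1e6:.1f} MB")
tracemalloc.stop()
```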
APA, Harvard, Vancouver, ISO and other styles
7

Womack, Timothy Guy Christian. "Reverse compilation techniques". Thesis, Royal Holloway, University of London, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.270435.

Full text
APA, Harvard, Vancouver, ISO and other styles
8

Milton, Nicholas Ross. "Personal knowledge techniques". Thesis, University of Nottingham, 2003. http://eprints.nottingham.ac.uk/13122/.

Full text
Abstract:
Work towards the development of a new computer-assisted methodology for psychological study and intervention is described. This is referred to as the Personal Knowledge Methodology since it focuses on the elicitation and presentation of personal knowledge. Personal knowledge includes the knowledge individuals have of their life history, their behaviours, their moods, their relationships, their ambitions, and so on. Principles and techniques used in Knowledge Engineering form the basis of the design of the Personal Knowledge Methodology. At the heart of the methodology is the use of a suite of knowledge acquisition and modelling techniques. These are referred to as Personal Knowledge Techniques. Based on a review of a wide-range of literature, eight techniques were selected to be assessed for their possible use as Personal Knowledge Techniques. These included interview-based techniques, repertory grid techniques and diagram-based techniques. Two in-depth studies took place involving 18 participants and a total of 100 knowledge acquisition sessions. The results revealed that each of the eight techniques showed promise at efficiently capturing and structuring aspects of an individual’s personal knowledge. In addition, the techniques showed potential for providing help in allowing reflection and revealing insights. In particular, a technique based on the construction and use of a state transition network was found to be the most highly rated by the participants. A content analysis of the knowledge acquired formed the basis of an ontology of personal knowledge that would underpin many uses of the Personal Knowledge Methodology. The empirical work and analysis led to a number of ideas for future developments of the methodology and uses for the Personal Knowledge Techniques.
APA, Harvard, Vancouver, ISO and other styles
9

Hill, Stephen A. "Functional programming techniques". Thesis, University of Kent, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.236147.

Full text
APA, Harvard, Vancouver, ISO and other styles
10

Alston, D. "Combined Fourier techniques". Thesis, University of Westminster, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.383568.

Full text
APA, Harvard, Vancouver, ISO and other styles
11

Rubio, Pedro, Moises Gonzalez, Diego Roses and Rodrigo Lopez. "Advanced Monitoring Techniques". International Foundation for Telemetering, 2014. http://hdl.handle.net/10150/577390.

Full text
Abstract:
ITC/USA 2014 Conference Proceedings / The Fiftieth Annual International Telemetering Conference and Technical Exhibition / October 20-23, 2014 / Town and Country Resort & Convention Center, San Diego, CA
The state of the art in operating systems and new human-machine interfaces is moving forward quickly. The Flight Test Data Processing Department has developed new tools for monitoring flight tests using new computer technologies such as .NET virtual machines, "on-the-fly" compilation, intelligent behavior, multi-touch capabilities and high-performance vector graphics libraries. All these new techniques allow the user to optimize flight tests, reducing the time needed to take decisions, helping to make complex calculations in real time and adapting the visualization displays to flight test engineers' requirements in real time.
APA, Harvard, Vancouver, ISO and other styles
12

Cilke, Tom. "Video Compression Techniques". International Foundation for Telemetering, 1988. http://hdl.handle.net/10150/615075.

Full text
Abstract:
International Telemetering Conference Proceedings / October 17-20, 1988 / Riviera Hotel, Las Vegas, Nevada
This paper will attempt to present algorithms commonly used for video compression, and their effectiveness in aerospace applications where size, weight, and power are of prime importance. These techniques will include samples of one-, two-, and three-dimensional algorithms. Implementation of these algorithms into usable hardware is also explored but limited to monochrome video only.
APA, Harvard, Vancouver, ISO and other styles
13

Cropper, Nick I. "Effective termination techniques". Thesis, University of St Andrews, 1997. http://hdl.handle.net/10023/13453.

Full text
Abstract:
An important property of term rewriting systems is termination: the guarantee that every rewrite sequence is finite. This thesis is concerned with orderings used for proving termination, in particular the Knuth-Bendix and polynomial orderings. First, two methods for generating termination orderings are enhanced. The Knuth-Bendix ordering algorithm incrementally generates numeric and symbolic constraints that are sufficient for the termination of the rewrite system being constructed. The KB ordering algorithm requires an efficient linear constraint solver that detects the nature of degeneracy in the solution space, and for this a revised method of complete description is presented that eliminates the space redundancy that crippled previous implementations. Polynomial orderings are more powerful than Knuth-Bendix orderings, but are usually much harder to generate. Rewrite systems consisting of only a handful of rules can overwhelm existing search techniques due to the combinatorial complexity; a genetic algorithm is applied with some success. Second, a subset of the family of polynomial orderings is analysed. The polynomial orderings on terms in two unary function symbols are fully resolved into simpler orderings, and it is thus shown that most of the complexity of polynomial orderings is redundant. The order type (logical invariant), either r or A (numeric invariant), and precedence are calculated for each polynomial ordering. The invariants correspond in a natural way to the parameters of the orderings, and so the tabulated results can be used to convert easily between polynomial orderings and more tangible orderings. The orderings of order type are two of the recursive path orderings. All of the other polynomial orderings are of order type ω or ω² and each can be expressed as a lexicographic combination of r (weight), A (matrix), and lexicographic (dictionary) orderings. The thesis concludes by showing how the analysis extends to arbitrary monadic terms, and discusses possible developments for the future.
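To make the idea of a polynomial ordering on terms in two unary function symbols concrete, here is a small worked example; the rule and interpretation are illustrative and are not taken from the thesis.

```latex
% One-rule system over two unary symbols:  f(g(x)) -> g(f(x))
% Polynomial interpretation over the natural numbers:
\[
  [f](x) = 2x, \qquad [g](x) = x + 1 .
\]
% Interpreting both sides of the rule gives, for all x >= 0,
\[
  [f(g(x))] = 2(x + 1) = 2x + 2 \;>\; 2x + 1 = [g(f(x))] ,
\]
% so every rewrite step strictly decreases the interpreted value and the system terminates.
```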
APA, Harvard, Vancouver, ISO and other styles
14

Caignard, Rolland. "Poésie et techniques". Aix-Marseille 1, 2002. http://www.theses.fr/2002AIX10028.

Full text
Abstract:
The aim of this thesis is to analyse the relationships between contemporary poetry (visual, concrete, sound, computer-based, etc.) and various old and modern techniques (typography, photography, the typewriter, radio, the tape recorder, the computer, telematic networks, etc.), identifying technical elements (media and functionalities) in poetry and finding technical forms beneath poetic forms. It aims to note how the writing of poetry is transformed according to the technique used and to show that each technique corresponds to a specific poetry. Observing this link between poetry and techniques is a first step towards a mediological study of poetic creation, in which creation would be motivated by techniques and by their impact on human social and behavioural structures. Technical development would bring about a sensory upheaval which would in turn provoke an aesthetic upheaval. Two hypotheses are therefore considered: (1) there is a moment when the poet finds himself in a technical situation, with the arrival of a new technique that he will experiment with; (2) one of the major functions of poetic experimentation is the diversion of technique. The research attempts to identify these techno-poetic situations through significant examples of artists and poetic works.
APA, Harvard, Vancouver, ISO and other styles
15

Fulgham, Melanie L. "Multicomputer routing techniques /". Thesis, Connect to this title online; UW restricted, 1997. http://hdl.handle.net/1773/7021.

Full text
APA, Harvard, Vancouver, ISO and other styles
16

Cifuentes, Cristina. "Reverse compilation techniques". Thesis, Queensland University of Technology, 1994.

Find the full text
APA, Harvard, Vancouver, ISO and other styles
17

Laperche, Blandine. "Appropriabilité de l'information scientifique et technique, innovation et normalisation des techniques de production". Littoral, 1997. http://www.theses.fr/1997DUNK0011.

Full text
Abstract:
Scientific and technical information is an organized system of knowledge, learning and know-how. It is the principal input for the production of material or immaterial goods. As a means of production, it crystallizes the capitalist relations of production within which it has been built. Scientific and technical information cannot therefore be regarded as a public good, freely distributed; its appropriation is carried out through the work necessary for its formation, in the sphere of production. Capitalist competition, which requires the constant renewal of the means of production, is at the origin, within national systems of innovation, of the combination of scientific and technical information into informational pools systematically integrated in production processes. The erratic succession of periods of growth and crisis during the long waves of accumulation multiplies, for large firms, the ways of appropriating information and limits access to the externalities produced by technological innovation. In the current context of global technological competition, firms organize themselves into networks which facilitate the appropriation and protection of the scientific and technical information that is nodal to production processes. The standardization, upstream of production, of production techniques and informational pools gives innovative firms monopoly rents that are reinvested, according to profit opportunities, in processes of knowledge accumulation. This leads to the diffusion of protected informational pools and of the corresponding working methods. In industrialized countries, this diffusion is at the origin of industrial and wage stratifications that benefit the most competitive firms and the most qualified workers. Countries with weak national systems of innovation come up against the high cost of appropriating information and internalizing these standards.
APA, Harvard, Vancouver, ISO and other styles
18

Tsiamis, Andreas. "Electrical test structures and measurement techniques for the characterisation of advanced photomasks". Thesis, University of Edinburgh, 2010. http://hdl.handle.net/1842/4296.

Full text
Abstract:
Existing photomask metrology is struggling to keep pace with the rapid reduction of IC dimensions as traditional measurement techniques are being stretched to their limits. This thesis examines the use of on-mask probable electrical test structures and measurement techniques to meet this challenge and to accurately characterise the imaging capabilities of advanced binary and phase-shifting chrome-on-quartz photomasks. On-mask, electrical and optical linewidth measurement techniques have highlighted that the use of more than one measurement method, complementing each other, can prove valuable when characterising an advanced photomask process. Industry standard optical metrology test patterns have been adapted for the direct electrical equivalent measurement and the structures used to characterise different feature arrangements fabricated on standard and advanced photomasks with proximity correction techniques. The electrical measurements were compared to measurements from an optical mask metrology and verification tool and a state-of-the-art CD-AFM system and the results have demonstrated the capability and strengths of the on-mask electrical measurement. For example, electrical and AFM measurements on submicron features agreed within 10nm of each other while optical measurements were offset by up to 90nm. Hence, electrical techniques can prove valuable in providing feedback to the large number of metrology tools already supporting photomask manufacture, which in turn will help to develop CD standards for maskmaking. Electrical test structures have also been designed to enable the characterisation of optical proximity correction to characterise right angled corners in conducting tracks using a prototype design for both on-mask and wafer characterisation. Measurement results from the on-mask structures have shown that the electrical technique is sensitive enough to detect the effect of OPC on inner corners and to identify any defects in the fabricated features. For example less than 10 (5%) change in the expected resistance data trends indicated a deformed OPC feature. Results from on-wafer structures have shown that the correction technique has an impact on the final printed features and the measured resistance can be used to characterise the effects of different levels of correction. Overall the structures have shown their capability to characterise this type of optical proximity correction on both mask and wafer level. Test structures have also been designed for the characterisation of the dimensional mismatch between closely spaced photomask features. A number of photomasks were fabricated with these structures and the results from electrical measurements have been analysed to obtain information about the capability of the mask making process. The electrical test structures have demonstrated the capability of measuring tool and process induced dimensional mismatches in the nanometer range on masks which would otherwise prove difficult with standard optical metrology techniques. For example, electrical measurements detected mismatches of less than 15nm on 500nm wide features.
APA, Harvard, Vancouver, ISO and other styles
19

Clément, Sophie. "Les techniques de percussion : un reflet des changements techniques durant l’Acheuléen ?" Thesis, Paris 10, 2019. http://faraway.parisnanterre.fr/login?url=http://bdr.parisnanterre.fr/theses/intranet/2019/2019PA100033/2019PA100033.pdf.

Full text
Abstract:
During the Acheulean period, thousands of bifacial tools were knapped from a vast range of raw materials such as flint, volcanic rocks (e.g. basalt, phonolite), bone and limestone. The technical and morphological variability of these emblematic tools cannot be denied and has led to diverse interpretations of the means adopted to produce them. Percussion techniques have often been presented as having an impact on the quality of execution or the degree of finishing. This PhD aims at understanding the link between percussion techniques and technical change through the prism of fine-grained and tough raw materials, by renewing the methodological approach. An experiment carried out exclusively with African raw materials, for both the knapped products and the percussive tools, combined with a dedicated evaluation grid, allows us to observe the reactions of the materials and to address the question of scar recognition. Relating the structural analysis of bifacial pieces to the percussion techniques used provides elements of an answer to the question of the link between percussion technique and biface morphology, and therefore allows the resulting technical hierarchy to be understood. Further results, concerning both collections from sites in south-western France and an experiment on quartzites, underline the importance of the initial concept and of the biface's structure in relation to the possibilities offered by the percussion techniques. These perspectives allow a new reflection on the place of those techniques during the Acheulean period.
APA, Harvard, Vancouver, ISO and other styles
20

Higginbotham, Lee Ann M. "Teaching techniques : suggested techniques in teaching music through performance in choir /". View online, 2008. http://repository.eiu.edu/theses/docs/32211131459702.pdf.

Full text
APA, Harvard, Vancouver, ISO and other styles
21

Paolani, Giulia. "Brain perfusion imaging techniques". Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019.

Find the full text
Abstract:
This work analyses two different perfusion imaging techniques, implemented in Magnetic Resonance Imaging and Computed Tomography (CT). The first analysis concerns the Arterial Spin Labeling (ASL) technique, which provides perfusion information without administering a contrast agent. A complete pipeline was developed and tested, covering both the acquisition protocol and the post-processing. In particular, standard acquisition parameters were defined that yield good data quality; the data are then processed by a post-processing protocol that, starting from an ASL acquisition, computes a quantitative cerebral blood flow (CBF) map. During the work, an asymmetry in the perfusion estimates was noticed that was not justified by the data and was probably due to a suboptimal hardware configuration. Once this technical issue is resolved, the developed pipeline will be used as the standard for ASL acquisition and post-processing. The second analysis concerns data acquired in CT perfusion experiments, applied to cases of cerebral infarction in which thrombectomy proved ineffective. The goal of this work was to define a pipeline that allows the perfusion maps to be computed independently and the data handling to be standardised. In particular, the pipeline analyses perfusion data using only open-source software, in contrast to the workflow commonly used in the clinic, making the analyses reproducible. The proposed work is part of a wider project that includes future longitudinal analyses with larger patient cohorts to define and validate parameters predictive of patient outcomes.
APA, Harvard, Vancouver, ISO and other styles
22

Gunda, Sai Ganesh. "Requirements engineering : elicitation techniques". Thesis, University West, Department of Economics and IT, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:hv:diva-596.

Full text
Abstract:
Requirements engineering is the first and crucial phase in the development of software. The main aim of the requirements engineering process is the gathering of requirements, and it involves a set of activities such as system feasibility study, elicitation, analysis, validation and management of the requirements. Many methods already exist for the requirements gathering process and software developers apply them, but they still face many problems in gathering requirements because of a lack of knowledge about the results of the methods and about the selection of an appropriate method. This affects the quality of the software and increases its production cost. This paper presents a literature study and an experimental case study that analyse and compare different methods for the requirements gathering process. This gives requirements engineers the flexibility to know the characteristics and the effectiveness of each method, and it is useful for selecting a particular elicitation method depending on the type of application and the situation. The analysis may also be useful for the future development of new methods for requirements elicitation.
APA, Harvard, Vancouver, ISO and other styles
23

Crouch, Stephen Capdepon. "Synthetic aperture LADAR techniques". Thesis, Montana State University, 2012. http://etd.lib.montana.edu/etd/2012/crouch/CrouchS0812.pdf.

Full text
Abstract:
Synthetic Aperture Ladar (SAL) system performance is generally limited by the chirp ranging sub-system. Use of a high bandwidth linearized chirp laser centered at 1.55 microns enables high resolution ranging. Application of Phase Gradient Autofocus (PGA) to high resolution, stripmap mode SAL images and the first demonstration of Interferometric SAL (IFSAL) for topography mapping are shown in a laboratory setup with cross range resolution commensurate with the high range resolution. Projective SAL imaging is demonstrated as a proof of concept. Finally spotlight mode SAL in monostatic and bistatic configurations is explored.
APA, Harvard, Vancouver, ISO and other styles
24

Marchal, Frédéric. "Cancer et techniques innovantes". Habilitation à diriger des recherches, Université Henri Poincaré - Nancy I, 2006. http://tel.archives-ouvertes.fr/tel-00120947.

Full text
Abstract:
Every surgeon is confronted with cancer at one time or another, so knowledge of general oncology and of the foundations of oncological surgery is a necessity. Although this specificity is not recognised in France, surgical oncology exists as a specialty in several European countries. Oncological surgery evolves with our understanding of carcinogenesis, and non-surgical therapies modify the place and modalities of oncological surgery. Oncological surgery was recently redefined in a White Paper published by the Fédération Nationale des Centres de Lutte Contre le Cancer, and a somewhat different approach is taken by the Institut National du Cancer (INCA). The characteristics of oncological surgery can be stated in general terms and then applied to each organ concerned; in this sense, oncological surgery adds a specificity to organ surgery. The main characteristics that define oncological surgery (comprehensive care, a surgical indication integrated into pre- and post-operative multidisciplinary discussion, respect for the organ or its reconstruction, and the fundamental notions of the optimal choice of incisions, pre-operative diagnosis, prior staging, and resection margins adapted to the pathologies and organs concerned) are integrated into a medical practice based on the level of scientific evidence.
Oncological surgery contributes to the quality of the overall care of the cancer patient. It brings organ specialists complementary knowledge and skills without denying their obvious involvement in the management of cancer patients. Multidisciplinary team meetings illustrate my commitment to promoting this complementarity, through participation in the various meetings both at the cancer centre and at the Nancy University Hospital.
The experience acquired in the department of general and visceral surgery, as well as in various French-speaking visceral surgery departments, made it possible to develop clinical research on functional digestive and visceral surgery (the subject of my MD thesis), abdominal wall surgery, endocrine surgery and emergency surgery. My training in oncological surgery allowed me to direct fundamental research towards local treatments of liver metastases. The collaboration with the experimental surgery laboratory of the Nancy Faculty of Medicine contributed to the development of new animal experimental models. Through contacts with foreign teams, this clinical and experimental research has been strengthened and has enabled the development in Nancy of innovative techniques such as hyperthermic intraperitoneal chemotherapy and the sentinel lymph node technique.
APA, Harvard, Vancouver, ISO and other styles
25

Oguz, Pinar. "Implementation Of Northfinding Techniques". Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/12607271/index.pdf.

Full text
Abstract:
MS, Department of Electrical and Electronics Engineering; supervisor: Assoc. Prof. Dr. T. Engin Tuncer; June 2006, 131 pages.
The fundamental problem of navigation is to find the initial north angle of the body with respect to the reference frame. Determination of the north angle of the body frame is required in spacecraft, aircraft, sea-craft, land-craft and missile control and guidance. This thesis discusses the implementation and comparison of four northfinding techniques: GPS (Global Positioning System) based with integer search, GPS based with Kalman filter, accelerometer based and IMU (Inertial Measurement Unit) based techniques. In the GPS-based northfinding techniques, the north angle is determined by processing the difference of the GPS carrier-phase measurements between two antennas. Carrier-phase ambiguity resolution is the main problem in GPS-based techniques, since the GPS receiver measures only the fractional part of the carrier phase and the integer part remains unknown. Two distinct ideas are applied to solve the carrier-phase ambiguities in the two techniques. One is an integer search on the single phase difference: suitable integer sets are checked against a cost function constructed from the single phase difference between the two antennas. The other technique uses an integer estimator and an attitude estimator with a Kalman filter, relying on double-difference phase measurements obtained from the carrier-phase differences of two antennas and two satellites at one instant. To test the GPS-based techniques, a realistic GPS emulator is implemented; it provides typical GPS raw navigation data, including satellite positions, pseudoranges and carrier phases. The accelerometer-based northfinding technique consists of a vertically placed linear accelerometer on a rotating platform; the north angle is found from the Coriolis acceleration due to the Earth's rotation and the platform rotation, and the practical implementation problems of this technique are discussed. The IMU-based northfinding technique uses inertial sensors, gyroscopes and accelerometers, to sense the Earth's rotation rate and the gravitational force respectively, and the north angle is found by processing their outputs. A real set-up is established to test the IMU-based technique.
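For orientation, the single-difference carrier-phase observation underlying GPS-based northfinding can be written schematically as follows; the notation is the standard one and is not quoted from the thesis.

```latex
% Single-difference carrier phase between the two antennas (in cycles):
\[
  \Delta\phi \;=\; \frac{1}{\lambda}\,\mathbf{b}\cdot\hat{\mathbf{s}} \;+\; N \;+\; \varepsilon ,
\]
% with \lambda the carrier wavelength, \mathbf{b} the baseline vector between the
% antenna phase centres, \hat{\mathbf{s}} the unit line-of-sight vector to the
% satellite, N the unknown integer ambiguity and \varepsilon the measurement noise.
% Once \mathbf{b} is resolved in local east-north-up coordinates, the heading
% (north) angle follows as
\[
  \psi \;=\; \operatorname{atan2}(b_E,\, b_N).
\]
```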
APA, Harvard, Vancouver, ISO and other styles
26

Conradi, Carl Peter. "LINC transmitter linearization techniques". Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape4/PQDD_0018/MQ49794.pdf.

Full text
APA, Harvard, Vancouver, ISO and other styles
27

Birbil, Sevket Ilker. "Stochastic Global Optimization Techniques". NCSU, 2002. http://www.lib.ncsu.edu/theses/available/etd-20020403-171452.

Full text
Abstract:
In this research, a novel population-based global optimization method has been studied. The method is called the Electromagnetism-like Mechanism, or EM for short. The proposed method mimics the behavior of electrically charged particles: a set of points is sampled from the feasible region and these points imitate the role of the charged particles in basic electromagnetism. The underlying idea of the method is to direct sample points toward local optimizers, which point out attractive regions of the feasible space. The proposed method has been applied to different test problems from the literature, and its viability has been tested by comparing its results with other reported results. Without using higher-order information, EM converged rapidly (in terms of the number of function evaluations) to the global optimum and produced highly efficient results for problems of varying degrees of difficulty. After a systematic study of the underlying stochastic process, a proof of convergence to the global optimum is given for the proposed method; the thrust of the proof is to show that, in the limit, at least one of the points in the population moves to the neighborhood of the global optimum with probability one. The structure of the proposed method is very flexible, permitting the easy development of variations. Capitalizing on this, several variants of the proposed method have been developed and compared with other methods from the literature. These variants of EM provide accurate answers to selected problems and in many cases outperform other well-known methods.
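A minimal sketch of an electromagnetism-like mechanism for a box-constrained minimisation problem is given below; the charge and force formulas follow the commonly published EM heuristic and are a simplified stand-in, not the author's exact formulation.

```python
import numpy as np

def em_minimize(f, lower, upper, pop=20, iters=200, step=0.1, seed=None):
    """Simplified electromagnetism-like mechanism for minimising f on a box."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = lower.size
    x = rng.uniform(lower, upper, size=(pop, dim))   # sample the initial population
    for _ in range(iters):
        fx = np.array([f(p) for p in x])
        best = fx.argmin()
        # Charges: points with lower objective values get larger charges.
        spread = np.sum(fx - fx[best]) + 1e-12
        q = np.exp(-dim * (fx - fx[best]) / spread)
        # Total force on each point: attraction towards better points,
        # repulsion from worse ones, scaled by charges and squared distance.
        force = np.zeros_like(x)
        for i in range(pop):
            for j in range(pop):
                if i == j:
                    continue
                d = x[j] - x[i]
                coef = q[i] * q[j] / (np.dot(d, d) + 1e-12)
                force[i] += coef * d if fx[j] < fx[i] else -coef * d
        # Move every point except the current best along its normalised force.
        for i in range(pop):
            if i == best:
                continue
            norm = np.linalg.norm(force[i])
            if norm > 0:
                x[i] += step * rng.random() * (force[i] / norm) * (upper - lower)
        x = np.clip(x, lower, upper)
    fx = np.array([f(p) for p in x])
    return x[fx.argmin()], fx.min()

# Example: minimise a shifted sphere function on the box [-5, 5]^2.
sol, val = em_minimize(lambda p: np.sum((p - 1.5) ** 2), [-5, -5], [5, 5], seed=0)
print(sol, val)
```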

APA, Harvard, Vancouver, ISO and other styles
28

Jaganathan, Venkata Krishnan. "Robust motion estimation techniques". Diss., Columbia, Mo. : University of Missouri-Columbia, 2007. http://hdl.handle.net/10355/6032.

Full text
Abstract:
Thesis (M.S.)--University of Missouri-Columbia, 2007.
The entire dissertation/thesis text is included in the research.pdf file; the official abstract appears in the short.pdf file (which also appears in the research.pdf); a non-technical general description, or public abstract, appears in the public.pdf file. Title from title screen of research.pdf file (viewed on April 15, 2008). Includes bibliographical references.
APA, Harvard, Vancouver, ISO and other styles
29

Liang, Fang. "Hyperplane-based classification techniques". Connect to online resource, 2007. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3284447.

Full text
APA, Harvard, Vancouver, ISO and other styles
30

Popescu, Marcel. "Techniques d'observation spectroscopique d'astéroïdes". Phd thesis, Observatoire de Paris, 2012. http://tel.archives-ouvertes.fr/tel-00785991.

Full text
Abstract:
The fundamental objective of planetary science is to understand the formation and evolution of the Solar System. In this respect, asteroids are of particular interest to the scientific community: the asteroid population can be seen as a window onto the past, through which we look at the beginnings of the formation of the planetary system. Asteroids are witnesses of the first moments of planet formation, preserving in their structure the chemical complexity of the primordial nebula. For this reason, physical and dynamical studies of these bodies provide essential information on the history and evolution of our Solar System and, more generally, on the formation of planetary systems. During my thesis I developed the application Modelling for Asteroids (M4AST). M4AST is a freely accessible online service for modelling asteroid surfaces using several theoretical approaches. It consists of a database containing some 2,500 asteroid spectra and a library of routines for modelling and deriving several mineralogical parameters. The database is accessible both through the Virtual Observatory protocols (VO-Paris) and through its own interface; the service is available at http://cardamine.imcce.fr/m4ast. M4AST supports several types of analysis: taxonomic classification, space-weathering modelling, comparison with spectra of meteorites and terrestrial minerals, and computation of band centres and band areas. I took part in more than 10 observing campaigns for the physical and dynamical characterisation of asteroids. The spectroscopic observations were used for the mineralogical characterisation of asteroid surfaces, while the astrometry served mainly to confirm and secure newly discovered asteroids. During the thesis I observed and characterised the near-infrared spectra of eight near-Earth asteroids: 1917, 8567, 16960, 164400, 188452, 2010 TD54, 5620, and 2001 SG286. These observations were obtained with the IRTF telescope and the SpeX spectrograph, using the CODAM remote-observing facility of the Paris Observatory. For each asteroid I proposed mineralogical solutions, and the taxonomy of five asteroids in my sample was revised. Four of the objects are low delta-V objects, which are desirable/possible targets for space missions. Asteroid (5620) Jasonwheeler shows a spectrum similar to those of chondritic meteorites. I also observed and modelled six main-belt asteroids: (9147) Kourakuen, (854) Frostia, (10484) Hecht and (31569) 1999 FL18 show the characteristics of V-type asteroids, while (1333) Cevenola and (3623) Chaplin are of taxonomic type S. Several asteroids in this sample are peculiar: (854) Frostia is a binary asteroid, (10484) Hecht and (31569) 1999 FL18 have dynamical twins, and (1333) Cevenola and (3623) Chaplin are objects with large-amplitude light curves. Taxonomic classification and comparison with meteorites make it possible to establish interesting mineralogical solutions and resemblances with meteorites of the howardite, eucrite and diogenite class.
APA, Harvard, Vancouver, ISO and other styles
31

Barratt, Keith. "Code assisted synchronisation techniques". Thesis, Lancaster University, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.414940.

Full text
APA, Harvard, Vancouver, ISO and other styles
32

Chen, Rui. "Novel particle sizing techniques". Thesis, University of Nottingham, 2013. http://eprints.nottingham.ac.uk/13492/.

Full text
Abstract:
Two novel approaches to particle size measurement are investigated; these are designated as Particle Movement Displacement Distribution (PMDD) method and Separated Multiple Image Technique (SMIT). An advantage of these methods compared with the established particle sizing methods of Static Light Scattering (SLS) and Dynamic Light Scattering (DLS) is that PMDD and SMIT do not suffer from the intensity weighting problem that affects SLS and DLS. The performance of the PMDD method is examined through computer simulations and through analysis of pre-existing experimental data. The SMIT method is investigated through computer simulations and through the construction and use of an optical system. The ability of both methods was measured through the assessment of an ‘area error’ measure which gives an estimate of the accuracy of a recovered particle size distribution. This area error measure varies between 0 and 2; with 0 corresponding to a perfectly recovered distribution. Typically a good inversion of DLS data can achieve an area-error value of 0.32 to 0.34 and this figure (along with the recovered mean particle size and standard deviation of the distribution) was used to judge quantitatively the success of the methods. The PMDD method measures the centre of individual particles in each image. A vector histogram is formed based on the connection between the centres in the first image and the centres in the next image. This vector histogram contains information about the particle size distribution. A maximum likelihood data inversion procedure is used to yield a particle size distribution from this data. The SMIT method is similar to the Static Light Scattering (SLS) method, but it combines angular dependent intensity method and individual visualisation method together to recover individual particle sizes without an intensity weighting. A multi-aperture mask and wedge prisms are utilised in this method to capture particle images formed from light scattered into a number of selected directions. A look-up table is then used to recover the individual particle sizes, which are then formed into a histogram. For the PMDD method, computer simulation results established the optimum values for parameters such as the time interval between frames, the exposure time and the particle concentration and also investigated the effects of different noise sources. For mono-modal size distributions, the PMDD method was shown through computer simulation to be capable of recovering a particle size distribution with an area error of around 0.27 which compares well with the typical DLS result. PMDD results were also recovered from mono-modal experimental data with mean particle sizes close to the manufacturers quoted particle mean size. However, recovery of bi-modal distributions was found to be not so successful; for bi-modal distributions, the recovered distributions generally had only a single peak, which, of course gives a very poor area-error figure. This result compares poorly with the particle tracking method ‘Nano Particle Tracking Analysis’ which is able to recover bi-modal distributions. For this reason further research was concentrated on an image intensity method (SMIT). For the SMIT method, computer simulation results established the optimum values for parameters such as the particle concentration and also investigated the effects of different noise sources and of aberrations in the optical system. 
The SMIT method was shown through computer simulation to be capable of recovering particle size distributions for both mono-modal and bi-modal distributions. The area error values obtained were in the range 0.24 to 0.45, and most of the results are good compared to the DLS value. The major problem with the SMIT method was found to be the presence of a small number of recovered particle radii much larger (or smaller) than the true sizes. These errors were attributed to ambiguities in the look-up table system used to convert the relative intensity data values into particle sizes. Two potential methods to reduce the influence of these ambiguities were investigated: firstly, combining Brownian motion data from tracking individual particles over a few frames, and secondly, combining an estimate of the total scattered intensity from a particle with the normal SMIT data to constrain the look-up procedure. In computer simulations both approaches gave some improvement, but the use of the total scattered intensity method gave the better results; in a computer simulation this method improved the area error from 0.37 for SMIT alone to 0.25 for SMIT combined with this extra information. Based on the success of these computer simulation results, an experimental SMIT system was constructed and tested. It was found necessary to first calibrate the optical system to account for the different optical transmission coefficients of the different prisms/optical paths. Using a particle sample with particles of known size for calibration, other particle sizes were successfully recovered from the experimental data using the original SMIT data processing, and the majority of the recovered particle radii were close to the manufacturer's quoted mean particle radius. Attempts to implement the total intensity approach to enhance SMIT were found not to be successful, due to the difficulty of measuring the required small displacements in particle positions with sufficient accuracy. A possible alternative design to overcome this problem is suggested in the future work section 7.2.
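As a rough illustration of the displacement-histogram idea behind PMDD, here is a toy numpy sketch; the nearest-neighbour pairing, the hypothetical coordinates and the bin settings are illustrative and do not reproduce the maximum-likelihood inversion described in the thesis.

```python
import numpy as np

def displacement_histogram(centres_a, centres_b, max_disp, bins=50):
    """Histogram of particle displacements between two consecutive frames.

    centres_a, centres_b : (N, 2) and (M, 2) arrays of detected particle
    centres in frame 1 and frame 2 (same length units, e.g. micrometres).
    Each centre in frame 1 is paired with its nearest centre in frame 2;
    pairs further apart than max_disp are discarded as likely mismatches.
    """
    diffs = centres_b[None, :, :] - centres_a[:, None, :]       # (N, M, 2)
    dists = np.linalg.norm(diffs, axis=2)                        # (N, M)
    nearest = dists.argmin(axis=1)
    disp = dists[np.arange(len(centres_a)), nearest]
    disp = disp[disp <= max_disp]
    return np.histogram(disp, bins=bins, range=(0.0, max_disp))

# Toy example: 200 simulated Brownian steps (smaller particles diffuse further,
# so the histogram broadens as particle size decreases).
rng = np.random.default_rng(0)
frame1 = rng.uniform(0, 100, size=(200, 2))
frame2 = frame1 + rng.normal(scale=0.5, size=(200, 2))
hist, edges = displacement_histogram(frame1, frame2, max_disp=3.0)
print(hist)
```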
APA, Harvard, Vancouver, ISO and other styles
33

Wang, Yongsheng. "Advanced video encryption techniques". Thesis, Queen's University Belfast, 2013. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.602966.

Full text
Abstract:
Protecting video streams while incurring minimal impact on compression performance is very important for practical video distribution. Selective encryption is one of the most promising techniques: it can offer the required security while maintaining format compliance after encryption, with little or no impact on compression performance. Selective encryption techniques can also be employed in video surveillance systems to alleviate concerns over privacy invasion by applying the encryption to specific regions of interest. This thesis presents advanced selective encryption techniques for a range of video applications and new methods to effectively and efficiently protect privacy in video surveillance systems by applying selective encryption. Background knowledge on video encryption is introduced and previous work is reviewed. Two improved video encryption methods are first demonstrated: one randomly selects one of two equivalent zig-zag scan orders for video preview applications; the other is based on encrypting the sign bits of motion vectors to enhance the scrambling effect. Then, two recently proposed fast selective encryption methods for H.264/AVC are analyzed to show that they are not as efficient as encrypting only the sign bits of nonzero coefficients. A tunable selective encryption scheme for H.264/AVC is developed to provide a tunable scrambling effect by simply adjusting three parameters, so that for different scenarios the user can easily adjust the scrambling effect according to specific requirements. Finally, to protect privacy in video surveillance systems more effectively, it is proposed to encrypt intra prediction modes within regions of interest in addition to encrypting the sign bits of nonzero coefficients, as encrypting only the sign bits produces a relatively weak scrambling effect. A re-encoding method is presented to remove the drift error in the non-privacy region caused by the encryption, and a spiral binary mask mechanism is proposed to signal the position of the privacy region more efficiently.
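A toy sketch of the sign-bit selective-encryption idea is shown below; the SHA-256 counter-mode keystream is a stand-in for a proper stream cipher, and the snippet operates on a bare coefficient array rather than on an H.264/AVC bitstream as in the thesis.

```python
import hashlib
import numpy as np

def keystream_bits(key: bytes, n: int) -> np.ndarray:
    """Derive n pseudorandom bits from a key using SHA-256 in counter mode."""
    out = bytearray()
    counter = 0
    while len(out) * 8 < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    bits = np.unpackbits(np.frombuffer(bytes(out), dtype=np.uint8))[:n]
    return bits.astype(bool)

def encrypt_signs(coeffs: np.ndarray, key: bytes) -> np.ndarray:
    """Flip the sign of each nonzero coefficient where the keystream bit is 1.

    Only the signs change, so the positions and magnitudes of the coefficients
    are preserved, and applying the same operation twice with the same key
    restores the original values (the operation is its own inverse).
    """
    coeffs = coeffs.copy()
    nz = np.flatnonzero(coeffs)
    flip = keystream_bits(key, nz.size)
    coeffs[nz[flip]] *= -1
    return coeffs

# Round-trip demonstration on a block of hypothetical quantised coefficients.
block = np.array([12, -3, 0, 0, 5, -1, 0, 2, 0, 0, 0, -7])
scrambled = encrypt_signs(block, b"secret key")
assert np.array_equal(encrypt_signs(scrambled, b"secret key"), block)
print(scrambled)
```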
Gli stili APA, Harvard, Vancouver, ISO e altri
34

Barnier, Fabien. "Fibre Bragg grating techniques". Thesis, University of Hull, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.322570.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
35

鄭展朗 e Chin-long Cheng. "Reference code correlator techniques". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1999. http://hub.hku.hk/bib/B42574985.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
36

England, Janine V. "Digital filter design techniques/". Thesis, Monterey, California. Naval Postgraduate School, 1988. http://hdl.handle.net/10945/23177.

Testo completo
Abstract (sommario):
An overview and investigation of the more popular digital filter design techniques are presented, with the intent of providing the filter design engineer with a complete and concise source of information. Advantages and disadvantages of the various techniques are discussed, and extensive design examples are used to illustrate their application to specific design problems. Both IIR (Butterworth, Chebyshev and elliptic) and FIR (Fourier coefficient design, windows and frequency sampling) design methods are featured, as well as the Optimum FIR Filter Design Program of Parks and McClellan and the Minimum p-Error IIR Filter Design Method of Deczky. Keywords: Digital filter design, IIR, FIR, Butterworth, Chebyshev, Elliptic, Fourier coefficient, Windows, Frequency sampling, Remez exchange algorithm, Minimum p-error, IIR filter design
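For readers who want to experiment with these classical designs, the following sketch uses SciPy's standard routines to produce the same families of filters listed above; the sampling rate, band edges and ripple specifications are arbitrary illustrative choices, not values from the thesis:

from scipy import signal

fs = 8000.0                        # sampling rate in Hz (illustrative)
f_pass, f_stop = 1000.0, 1500.0    # passband / stopband edges in Hz

# IIR designs: order and cutoff chosen by the corresponding *ord helper
# for 1 dB passband ripple and 40 dB stopband attenuation.
n, wn = signal.buttord(f_pass, f_stop, gpass=1, gstop=40, fs=fs)
b_butter, a_butter = signal.butter(n, wn, fs=fs)

n, wn = signal.cheb1ord(f_pass, f_stop, gpass=1, gstop=40, fs=fs)
b_cheb, a_cheb = signal.cheby1(n, 1, wn, fs=fs)

n, wn = signal.ellipord(f_pass, f_stop, gpass=1, gstop=40, fs=fs)
b_ellip, a_ellip = signal.ellip(n, 1, 40, wn, fs=fs)

# FIR designs: windowed Fourier-coefficient method and the Parks-McClellan
# (Remez exchange) equiripple method.
b_window = signal.firwin(numtaps=63, cutoff=1250.0, window="hamming", fs=fs)
b_remez = signal.remez(numtaps=63, bands=[0, f_pass, f_stop, fs / 2],
                       desired=[1, 0], fs=fs)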
Gli stili APA, Harvard, Vancouver, ISO e altri
37

Waller, Marcus D. "3D rasterisarion hardware techniques". Thesis, University of Sussex, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.388702.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
38

Goh, Jason Y. L. "Interferometric confocal imaging techniques". Thesis, University of Nottingham, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.246410.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
39

Arani, Faramarz Shayan. "Trellis coded modulation techniques". Thesis, University of Warwick, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.387324.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
40

Hu, Zhihua. "Spectral fatigue analysis techniques". Thesis, University College London (University of London), 1994. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.362446.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
41

Alleston, S. B. "Ultrafast optoelectronic measurement techniques". Thesis, University of Essex, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.413127.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
42

Bharadia, Ketan R. "Network application detection techniques". Thesis, Loughborough University, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.247912.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
43

POLO, MARIA GUILLERMINA ALBARRACIN. "MODULATION TECHNIQUES IN EHF". PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2016. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=27198@1.

Testo completo
Abstract (sommario):
PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO
COORDENAÇÃO DE APERFEIÇOAMENTO DO PESSOAL DE ENSINO SUPERIOR
PROGRAMA DE EXCELENCIA ACADEMICA
Devido às exigências da largura de banda, especialmente nas comunicações sem fios que são cada dia maiores pelo aumento do número de usuários, é necessário estudar a banda de EHF (Extremely High Frequency). A transmissão e recepção de dados em EHF constitui uma possível solução para aliviar a escassez do espectro e satisfazer a crescente demanda de maiores velocidades, tentando resolver as limitações dos sistemas atuais. As ondas de rádio na banda EHF vão de 30 até 300 GHz e são chamadas ondas milimétricas, já que seus comprimentos de onda vão de 10 mm até 1 mm. Neste trabalho, a montagem de um sistema de geração e detecção de ondas de EHF a partir do batimento de dois lasers é apresentada. Técnicas de modulação e demodulação em fase, amplitude e frequência na faixa de 200-300 GHz são demonstradas.
The capacity of wireless communications is approaching its limits, and the relentless growth in the number of users demands ever more bandwidth, which has given rise to the study of the EHF (Extremely High Frequency) band. Transmission and reception of data in the EHF band is a possible solution to alleviate the scarcity of spectrum and to meet the demand for higher data rates, addressing the limitations of current systems. Radio waves in the EHF band range from 30 to 300 GHz and are called millimetre waves, since their wavelengths lie between 10 mm and 1 mm. This work presents a system capable of generating and detecting EHF waves from the beating of two lasers, and demonstrates phase, amplitude and frequency modulation and demodulation techniques in the 200-300 GHz range.
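As a quick numerical illustration of the photomixing principle described above, the beat frequency produced on a photodetector by two lasers is simply the difference of their optical frequencies; the wavelengths below are illustrative values, not taken from the thesis:

c = 299_792_458.0            # speed of light in m/s

lambda_1 = 1550.000e-9       # wavelength of laser 1 in metres (illustrative)
lambda_2 = 1551.934e-9       # wavelength of laser 2 in metres (illustrative)

f1, f2 = c / lambda_1, c / lambda_2
f_beat = abs(f1 - f2)        # the photodetector output oscillates at the difference frequency
print(f"beat frequency = {f_beat / 1e9:.0f} GHz")   # about 241 GHz, i.e. within the 200-300 GHz range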
Gli stili APA, Harvard, Vancouver, ISO e altri
44

Bazargan, K. "Techniques in display holography". Thesis, Imperial College London, 1986. http://hdl.handle.net/10044/1/37938.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
45

Drummond, Humphrey John Jardine. "Microprocessor controlled electroanalytical techniques". Thesis, Imperial College London, 1990. http://hdl.handle.net/10044/1/46285.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
46

Tsai, Pearl T. "Parallel network simulation techniques". Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/36033.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
47

Khisti, Ashish 1979. "Coding techniques for multicasting". Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/28546.

Testo completo
Abstract (sommario):
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2004.
Includes bibliographical references (leaves 74-77).
We study some fundamental limits of multicasting in wireless systems and propose practical architectures that perform close to these limits. In Chapter 2, we study the scenario in which one transmitter with multiple antennas distributes a common message to a large number of users. For a system with a fixed number (L) of transmit antennas, we show that, as the number of users (K) becomes large, the rate of the worst user decreases as O(K^(-1/L)). Thus having multiple antennas provides significant gains in the performance of a multicasting system with slow fading. We propose a robust architecture for multicasting over block fading channels, using rateless erasure codes at the application layer. This architecture provides new insights into the cross-layer interaction between the physical layer and the application layer. For systems with rich time diversity, we observe that it is better to exploit the time diversity using erasure codes at the application layer rather than be conservative and aim for high reliability at the physical layer. It is known that the spatial diversity gains are not significantly high in systems with rich time diversity. We take a step further and show that to realize these marginal gains one has to operate very close to the optimal operating point. Next, we study the problem of multicasting to multiple groups with a multiple antenna transmitter. The solution to this problem motivates us to study a multiuser generalization of the dirty paper coding problem. This generalization is interesting in its own right and is studied in detail in Chapter 3. The scenario we study is that of one sender and many receivers, all interested in a common message. There is additive interference on the channel of each receiver, which is known only to the sender. The sender has to encode the message in such a way that it is simultaneously 'good' for all the receivers. This scenario is a non-trivial generalization of the dirty paper coding result, since it requires that the sender deal with multiple interferences simultaneously. We prove a capacity theorem for the special case of the two-user binary channel and derive achievable rates for several other channel models, including the Gaussian channel and the memory-with-defects model. Our results are rather pessimistic, since the value of side information diminishes as the number of users increases.
by Ashish Khisti.
S.M.
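The O(K^(-1/L)) worst-user behaviour quoted in this abstract can be seen, at least heuristically, from the order statistics of L-fold diversity gains. The following Monte Carlo sketch illustrates that intuition under an assumed i.i.d. Rayleigh fading model; it is not the thesis's beamforming analysis:

import numpy as np

rng = np.random.default_rng(0)

def mean_worst_user_gain(K, L, trials=1000):
    # Each user's channel gain is the sum of L i.i.d. unit-mean exponential
    # branch gains (L-fold diversity under Rayleigh fading); take the minimum
    # over the K users and average over Monte Carlo trials.
    gains = rng.gamma(shape=L, scale=1.0, size=(trials, K))
    return gains.min(axis=1).mean()

L = 4
for K in (10, 100, 1000, 10000):
    g = mean_worst_user_gain(K, L)
    # If the worst-user gain scales as K**(-1/L), this product settles
    # towards a constant as K grows.
    print(f"K={K:6d}  worst-user gain={g:.3f}  gain * K**(1/{L})={g * K ** (1 / L):.3f}")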
Gli stili APA, Harvard, Vancouver, ISO e altri
48

Wong, Andrew (Andrew Yung-An) 1979. "Robotically reproducing turntable techniques". Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/87300.

Testo completo
Gli stili APA, Harvard, Vancouver, ISO e altri
49

Clark, Jerry. "Concurrent Telemetry Processing Techniques". International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/611423.

Testo completo
Abstract (sommario):
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California
Improved processing techniques, particularly with respect to parallel computing, are the underlying focus in computer science, engineering, and industry today. Semiconductor technology is fast approaching device physical limitations. Further advances in computing performance in the near future will be realized by improved problem-solving approaches. An important issue in parallel processing is how to effectively utilize parallel computers. It is estimated that many modern supercomputers and parallel processors deliver only ten percent or less of their peak performance potential in a variety of applications. Yet high performance is precisely why engineers build complex parallel machines. Cumulative performance losses occur due to mismatches between applications, software, and hardware. For instance, a communication system's network bandwidth may not correspond to the central processor speed or to module memory. Similarly, as Internet bandwidth is consumed by modern multimedia applications, network interconnection is becoming a major concern. Bottlenecks in a distributed environment are caused by network interconnections and can be minimized by intelligently assigning processing tasks to processing elements (PEs). Processing speeds are improved when architectures are customized for a given algorithm. Parallel processing techniques have been ineffective in most practical systems, as the coupling of algorithms to architectures has generally been problematic and inefficient. Specific architectures have evolved to address the prospective processing improvements promised by parallel processing. Real performance gains will be realized when sequential algorithms are efficiently mapped to parallel architectures. As one possible approach to improved algorithm/architecture symbiosis, this paper discusses transforming sequential algorithms into parallel representations using linear dependence vector mapping and subsequently configuring the interconnection network of a systolic array.
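The final point about mapping a loop nest onto a systolic array via a linear schedule can be illustrated with a toy example. The sketch below is a generic textbook-style illustration, not the mapping procedure developed in this paper: the matrix-multiply iteration (i, j, k) is assigned to PE (i, j) and scheduled at time step t = i + j + k (schedule vector (1, 1, 1), projection along k), so all work done at the same step is independent and could run in parallel:

import numpy as np

def systolic_matmul(A, B):
    # Simulate a 2-D array of PEs executing C[i, j] += A[i, k] * B[k, j]
    # in wavefronts of constant t = i + j + k; dependences along k are
    # respected because t increases with k for each fixed (i, j).
    n = A.shape[0]
    C = np.zeros((n, n))
    for t in range(3 * (n - 1) + 1):
        for i in range(n):
            for j in range(n):
                k = t - i - j
                if 0 <= k < n:
                    C[i, j] += A[i, k] * B[k, j]
    return C

A = np.arange(9.0).reshape(3, 3)
B = np.eye(3)
assert np.allclose(systolic_matmul(A, B), A @ B)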
Gli stili APA, Harvard, Vancouver, ISO e altri
50

Shah, Bansi C. "Novel fingerprint development techniques". Thesis, Loughborough University, 2013. https://dspace.lboro.ac.uk/2134/12533.

Testo completo
Abstract (sommario):
There are numerous pre-existing fingerprint development techniques; however, prints are often difficult to develop, depending on their age or the surface on which they have been deposited. Forensic scientists are constantly looking for new and better methods to enhance fingerprints. More recent technologies have higher sensitivity to the very low levels of constituents present in residues and so are able to reveal significant details from a person's fingerprints at the molecular level, e.g. DNA or drug metabolites. Research therefore continues in an attempt to develop novel, non-destructive processes that can enhance latent fingerprints. Exposing fingerprints to the p-block compounds selenium dioxide (SeO2), phosphorus sulfides (P4Sx) and phosphonitrilic chloride trimer (NPCl2)3 in the vapour phase resulted in latent prints being visualized on a range of media. Selenium dioxide revealed prints on metal surfaces (e.g. brass), which were enhanced further when exposure to moisture formed a dark brown copper selenide coating on the surface, giving better contrast. P4S3 vapour revealed a higher percentage of prints and the samples had greater stability in air, while (NPCl2)3 was able to develop fingerprints but at an undesirably low quality. It was initially thought that (NPCl2)3 had potential for further derivatisation, but it proved very difficult to react with other compounds, especially those with the potential to induce fluorescence. All three compounds are, however, commercially available, and the sublimation techniques are straightforward.
Gli stili APA, Harvard, Vancouver, ISO e altri