Dissertations / Theses on the topic 'Technique selection'




Consult the top 50 dissertations / theses for your research on the topic 'Technique selection.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Nejadmalayeri, Amir Hossein. "CDMA Channel Selection Using Switched Capacitor Technique." Thesis, University of Waterloo, 2001. http://hdl.handle.net/10012/782.

Full text
Abstract:
CDMA channel selection requires sharp as well as wide-band filtering. The SAW filters that have been used for this purpose are only available in the IF range; in direct conversion receivers, channel selection has to be done at low frequencies. A switched-capacitor technique has been employed to design a low-power, highly selective low-pass channel-select filter for CDMA wireless receivers. The chosen topology ensures low sensitivity of the filter response. The circuit has been designed in a mixed-mode 0.18 µm CMOS technology working from a single 1.8 V supply, while its current consumption is less than 10 mA.
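For readers unfamiliar with the technique, the core idea of switched-capacitor filtering is that a capacitor switched at a clock rate emulates a resistor, so the cutoff is set by capacitor ratios and the clock rather than by absolute RC values. A minimal first-order sketch follows; the clock frequency and capacitor values are illustrative assumptions, not figures from the thesis.

```python
import math

# A capacitor C_sw toggled at f_clk between two nodes transfers q = C_sw * V per cycle,
# so it behaves like a resistor R_eq = 1 / (f_clk * C_sw).
f_clk = 19.2e6    # assumed switching clock, Hz
C_sw = 1.0e-12    # assumed switched capacitor, F
C_int = 25.0e-12  # assumed integrating capacitor, F

R_eq = 1.0 / (f_clk * C_sw)                 # emulated resistance, ohms
f_c = 1.0 / (2.0 * math.pi * R_eq * C_int)  # -3 dB cutoff of the equivalent RC stage, Hz

print(f"R_eq = {R_eq / 1e3:.1f} kOhm, cutoff = {f_c / 1e3:.1f} kHz")
```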
APA, Harvard, Vancouver, ISO, and other styles
2

Manning, Michelle Louise. "Biomechanics of technique selection in women's artistic gymnastics." Thesis, Cardiff Metropolitan University, 2015. http://hdl.handle.net/10369/7568.

Full text
Abstract:
Technique selection is fundamental to Women’s Artistic Gymnastics, where difficulty and complexity are evolving rapidly as a result of changes in the scoring system and apparatus design. The aim of this research was to increase knowledge and understanding of the biomechanics underpinning female longswing techniques to determine effective technique selection. Five progressive themes addressed this aim: contemporary trend analysis, a biomechanical conceptual approach, method validation, a biomechanical musculoskeletal approach and a biomechanical energetic approach. Elite competition provided the basis for the thesis, with a strongly ecologically valid trend analysis reporting the straddle Tkachev as the most frequently performed release skill, preceded by three distinct longswing techniques: arch, straddle and pike. Quantifying each technique through a biomechanical conceptual approach enumerated the differences observed and examined their influence on key release parameters. Significant differences (p≤0.05) were reported in the initiation and joint angular kinematics within the functional phases, but not in the release parameters. Further examination of the joint kinetics and energetic demands on the gymnast was required to explain technique selection. Non-invasive methods of joint kinetic data collection are challenging within the elite competitive environment; therefore indirect methods were validated to provide confidence in the subsequent musculoskeletal approach. Inverse dynamic estimations were most sensitive to kinematic inputs, with field versus lab comparisons highlighting systematic differences in joint moments (0.8%RMSD in consistency). Joint kinetics provided new knowledge of the underlying biomechanics of the varying techniques, specifically greater shoulder joint moments and hip joint powers during the pike longswing. Examining the gymnast's energetic contribution to the total gymnast-high-bar energy system led to a novel effectiveness score highlighting the potential energy excess available to the arch (30%) and straddle (2%) techniques, indicating the potential to develop more complex versions of skills. This research provides coaches and scientists with specific physical preparation requirements for varying longswing techniques and insight into the need for customised technique selection.
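For readers curious how a field-versus-lab agreement such as the 0.8%RMSD quoted above might be computed, the sketch below expresses the RMS difference between two joint-moment time series as a percentage of the reference range; the synthetic curves and the normalisation by range are assumptions for illustration, not the thesis's exact protocol.

```python
import numpy as np

def percent_rmsd(reference: np.ndarray, test: np.ndarray) -> float:
    """RMS difference between two time series, as a percentage of the reference range."""
    rmsd = np.sqrt(np.mean((reference - test) ** 2))
    return 100.0 * rmsd / (reference.max() - reference.min())

# Synthetic shoulder-moment curves over one longswing (N*m), for illustration only.
t = np.linspace(0.0, 1.0, 200)
lab_moment = 120.0 * np.sin(2 * np.pi * t)                      # "laboratory" reconstruction
field_moment = lab_moment + np.random.normal(0.0, 1.0, t.size)  # "field" reconstruction

print(f"{percent_rmsd(lab_moment, field_moment):.2f} %RMSD")
```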
APA, Harvard, Vancouver, ISO, and other styles
3

Guan, Linhua. "Evaluation of a statistical infill candidate selection technique." Texas A&M University, 2003. http://hdl.handle.net/1969/207.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ergez, Merve. "Strategic scent selection : an application of Zaltman Metaphor Elicitation Technique." Thesis, Massachusetts Institute of Technology, 2014. http://hdl.handle.net/1721.1/90238.

Full text
Abstract:
Thesis: S.M. in Management Studies, Massachusetts Institute of Technology, Sloan School of Management, 2014.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 40-41).
This thesis aims to create a better branding message for the OLIVITA Artisan by finding specific scents that are closely aligned with its marketing strategy for its future product line, in order to reinforce the existing brand image and increase its competitiveness in the market. To find scents aligned with OLIVITA Artisan's existing brand image, an adapted version of the Zaltman Metaphor Elicitation Technique (ZMET), a patented market research tool that has proven successful in eliciting conscious and, more importantly, unconscious thoughts by exploring people's non-literal or metaphoric expressions, was applied to the "Mediterranean life" concept among a convenience sample. The transcriptions of the interviews were analyzed; relevant statements were extracted, and the strongest statements were selected and logically grouped to identify the key themes in consumers' minds through Voice of the Customer analysis. The themes were also used to create short stories. The results shaped the scent recommendations for the OLIVITA Artisan, comprising four pure and three conceptual scents that are closely aligned with its brand image.
by Merve Ergez.
S.M. in Management Studies
APA, Harvard, Vancouver, ISO, and other styles
5

Secker, J. "Branch labelling : a technique for disjunctive planning with selection interactions." Thesis, Keele University, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.293511.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Kuchimanchi, Sriram. "Towards a regression test selection technique for message-based software integration." ScholarWorks@UNO, 2004. http://louisdl.louislibraries.org/u?/NOD,164.

Full text
Abstract:
Thesis (M.S.)--University of New Orleans, 2004.
Title from electronic submission form. "A thesis ... in partial fulfillment of the requirements for the degree of Master of Science in the Department of Computer Science."--Thesis t.p. Vita. Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
7

Howard, Lee Mont. "Technique and Cue Selection for Graphical Presentation of Generic Hyperdimensional Data." BYU ScholarsArchive, 2012. https://scholarsarchive.byu.edu/etd/3271.

Full text
Abstract:
The process of visualizing n-D data presents the user with four problems: finding a hyperdimensional graphics package capable of rendering n-D data, finding a suitable presentation technique supported by the package that allows insight to be gained, using the provided user interface to interact with the presentation technique to explore the information in the data, and finding a way to share the information gained with others. Many graphics packages have been written to solve the first problem. However, existing packages do not sufficiently solve the other three problems. A hyperdimensional graphics package that sufficiently solves all these problems simplifies the user experience and allows the user to explore, interact with, and share the data. I have implemented a package that solves all four problems. The package is able to render n-D data through appropriate encapsulation of presentation techniques and their associated visual cues. Through the use of an extensible plugin system, presentation techniques can be easily added and accommodated. Desirable features are supported by the user interface to allow the user to interact easily with the data. Sharing of visualizations and annotations is included to allow users to share information with one another. By providing a hyperdimensional graphics package that easily accommodates presentation techniques and includes desirable features, including those that are rarely or never supported, the user benefits from tools that allow improved interaction with multivariate data to extract information and share it with others.
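The extensible plugin system described for presentation techniques can be pictured as a registry that maps technique names to rendering callables; the decorator, technique names and print-based "rendering" below are hypothetical stand-ins, not the package's actual API.

```python
from typing import Callable, Dict, Sequence

# Hypothetical registry: presentation techniques register themselves by name,
# and the viewer looks them up to render an n-D data set.
_TECHNIQUES: Dict[str, Callable[[Sequence[Sequence[float]]], None]] = {}

def presentation_technique(name: str):
    """Decorator that adds a rendering function to the plugin registry."""
    def register(func):
        _TECHNIQUES[name] = func
        return func
    return register

@presentation_technique("parallel_coordinates")
def parallel_coordinates(data):
    print(f"rendering {len(data)} records as parallel coordinates")

@presentation_technique("scatterplot_matrix")
def scatterplot_matrix(data):
    print(f"rendering {len(data)} records as a scatterplot matrix")

def render(name: str, data):
    _TECHNIQUES[name](data)   # dispatch to whichever plugin was registered

render("parallel_coordinates", [[0.1, 0.2, 0.3, 0.4]] * 10)
```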
APA, Harvard, Vancouver, ISO, and other styles
8

Uddin, Sheikh Fakhar, and Ismail Khan Khattak. "Spectrum Selection Technique to Satisfy the QoS Requirements in Cognitive Radio Network." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-1980.

Full text
Abstract:
The demand for wireless spectrum is increasing rapidly as the field of telecommunications advances. The spectrum has been underutilized because of the fixed spectrum assignment policy, and this valuable spectrum can be utilized efficiently by cognitive radio technology. In this thesis we have studied spectrum selection problems in cognitive radio networks. Channel sharing and channel contention problems arise when multiple secondary users tend to select the same channel. The thesis work is focused on the spectrum selection issue with the aim of minimizing the overall system time and solving the problems of channel contention and channel sharing. The overall system time of a secondary connection is an important performance measure for providing quality of service to secondary users in a cognitive radio network. We studied two spectrum selection schemes that considerably reduce the overall system time and resolve the problems of channel sharing and channel contention. An analytical model based on the Preemptive Resume Priority (PRP) M/G/1 queuing model has been provided to evaluate the studied spectrum selection schemes. This model also analyzes the effect of multiple handoffs due to arrivals of primary users. According to this scheme, the traffic load is distributed among multiple channels to balance the load. Secondary users select the operating channels based on the spectrum selection algorithm. They can intelligently adopt the better channel selection scheme by considering traffic statistics and overall transmission time. All simulation scenarios are developed in MATLAB. Based on our results we can conclude that both channel selection schemes considerably reduce the overall transmission time of secondary users in a cognitive radio network. The overall transmission time increases with the rise in the arrival rate of secondary users. The probability-based channel selection scheme performs better at lower arrival rates, and the sensing-based channel selection scheme performs better at higher arrival rates of secondary users. These channel selection schemes help distribute the traffic load of secondary users evenly among multiple channels, and hence increase channel utilization and resolve the channel contention problem.
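One way to picture the probability-based selection described above is to let each secondary user pick a channel with probability inversely proportional to that channel's expected overall system time (waiting plus transmission, including handoff interruptions). The expected-time figures below are invented, and the scheme shown is a simplification of the full PRP M/G/1 analysis.

```python
import random

# Assumed expected overall system time (s) per candidate channel, e.g. derived
# offline from a PRP M/G/1 analysis of primary-user traffic on each channel.
expected_system_time = {"ch1": 0.8, "ch2": 1.6, "ch3": 2.4}

# Selection probability inversely proportional to expected system time,
# so lightly loaded channels absorb more secondary-user connections.
inv = {ch: 1.0 / t for ch, t in expected_system_time.items()}
total = sum(inv.values())
probs = {ch: w / total for ch, w in inv.items()}

def select_channel() -> str:
    channels, weights = zip(*probs.items())
    return random.choices(channels, weights=weights, k=1)[0]

counts = {ch: 0 for ch in probs}
for _ in range(10_000):               # simulate many secondary-user arrivals
    counts[select_channel()] += 1
print(probs, counts)
```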
APA, Harvard, Vancouver, ISO, and other styles
9

Ojum, Victor Chimenim. "A techno-economic analysis of artificial lift technique selection in the petroleum industry." Thesis, University of Dundee, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.510641.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

SILVA, RODRIGO MARQUES ALMEIDA DA. "AN OPTIMIZED PHOTOREALISTIC RENDERING METHOD WITH STATISTIC DISTRIBUTION AND AUTOMATIC RENDERING TECHNIQUE SELECTION." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2015. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=25807@1.

Full text
Abstract:
PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO
CONSELHO NACIONAL DE DESENVOLVIMENTO CIENTÍFICO E TECNOLÓGICO
O processo de renderização fotoreal para cinema e TV demanda, cada vez mais, poder de processamento, necessitando não só de algoritmos paralelos, bem como sistema de distribuição de tarefas. No entanto, mesmo em sistemas de produção, o tempo necessário para avaliar uma animação pode chegar a vários dias, dificultando a melhoria da qualidade artística e limitando alterações. Neste trabalho foca-se na otimização de três processos inerentes à renderização, a renderização local, na qual o sistema trabalha para renderizar um conjunto de pixels de forma otimizada, aproveitando os recursos de hardware disponíveis e aproveitando dados de renderizações previamente realizadas, pelo nó ou teste. O processo de gerenciamento, extremamente crítico para o resultado, é alterado para não só distribuir, mas analisar toda a infraestrutura de renderização, otimizando o processo de distribuição e permitindo o estabelecimento de metas como prazo e custo. Além disso, o modelo de gerenciamento é expandido para a nuvem, utilizando-a como transbordo de processamento. Ainda, um novo processo foi criado para avaliar a renderização de forma colaborativa, onde cada nó comunica resultados parciais e assim otimiza a renderização de outros. Por fim, diversas técnicas inovadoras foram criadas para melhorar o processo como um todo, removendo desperdícios e reaproveitando trabalho.
The photorealistic rendering process for cinema and TV increasingly demands processing power, requiring fast parallel algorithms and effective task distribution systems. However, the processes currently used by academia and by industry still consume several days to evaluate an animation in super-resolution (typically 8K), which makes it difficult to improve artistic quality and limits the number of experiments with scene parameters. In this work, we focus on the optimization of three processes involved in photorealistic rendering, reducing the total rendering time substantially. Firstly, we optimize local rendering, in which the system works to render a set of pixels optimally, taking advantage of the available hardware resources and using previous rendering data. Secondly, we optimize the management process, which is changed not only to distribute frames but also to analyze the entire rendering infrastructure, optimizing the distribution process and allowing the establishment of goals such as time and cost. Furthermore, the management model is expanded to the cloud, using the cloud as a processing overflow. Thirdly, we propose a new optimized process to evaluate the rendering task collaboratively, where each node communicates partial results to other nodes, allowing the optimization of the rendering process on all nodes. Altogether, this thesis is an innovative proposal to improve the whole process of high-performance rendering, removing waste of resources and reducing rework.
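The deadline-and-cost goal of the management stage can be pictured as a simple capacity check that decides how many frames overflow to the cloud; every figure below (node speeds, per-frame cost, deadline) is an invented placeholder, not data from the thesis.

```python
# Illustrative capacity/deadline check for a render farm (all figures assumed).
frames = 2400                        # frames to render
cost_per_frame_h = 0.5               # average render hours per frame on a reference node
node_speeds = [1.0, 1.0, 0.8, 1.5]   # relative speed of each local node
deadline_h = 240.0                   # delivery deadline in hours

# Frames the local farm can finish before the deadline, in reference-node terms.
local_capacity = sum(node_speeds) * deadline_h / cost_per_frame_h
overflow = max(0, frames - int(local_capacity))   # frames to push to cloud instances

print(f"local farm covers {int(local_capacity)} frames, "
      f"{overflow} frames overflow to the cloud")
```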
APA, Harvard, Vancouver, ISO, and other styles
11

Liu, Xuanhui [Verfasser], and A. [Akademischer Betreuer] Mädche. "Designing Decision Aids for Digital Service Design Technique Selection / Xuanhui Liu ; Betreuer: A. Mädche." Karlsruhe : KIT-Bibliothek, 2019. http://d-nb.info/1193126703/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Dahlgren, Mikael. "Object Selection by Relative Hand to Cursor Mapping : Design and Evaluation of a 2D-Object Selection Technique for Hand Tracking in Virtual Environments." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-292107.

Full text
Abstract:
Hand and tracker jitter negatively affect performance with ray-casting in virtual environments, making it difficult to acquire small objects that require high levels of precision. This study evaluates an alternative target acquisition technique for 2D interfaces in virtual environments based on relative hand-to-cursor mapping. Trade-offs between relative hand-to-cursor mapping and ray-casting are explored in a comparative evaluation using Fitts's law, SUS surveys and interviews. The results demonstrate that relative hand-to-cursor mapping performs on par with ray-casting while also allowing higher precision in certain selection tasks.
Strålföljning påverkas negativt av darrning i händer och spårteknik i virtuella miljöer vilket gör det svårt att markera objekt som kräver hög precision. Denna studie utvärderar en alternativ teknik till strålföljning för objektmarkering i 2D-gränssnitt i virtuella miljöer kallad relativ hand till musmarkör mappning. Avvägning mellan dessa två tekniker utvärderas i en jämförande studie med hjälp av Fitts's lag, SUS-undersökning och intervjuer. Resultaten demonstrerar att relativ hand till musmarkör mappning presterar jämförbart med strålföljning och tillåter samtidigt högre precision i särskilda objektmarkeringsuppgifter.
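Since the comparative evaluation rests on Fitts's law, a minimal sketch of computing the index of difficulty (Shannon formulation) and throughput from selection trials is shown below; the trial data for the two techniques are invented for illustration.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation: ID = log2(D/W + 1), in bits."""
    return math.log2(distance / width + 1.0)

def throughput(trials) -> float:
    """Mean ID / movement-time ratio over (distance, width, time_s) trials, in bits/s."""
    return sum(index_of_difficulty(d, w) / t for d, w, t in trials) / len(trials)

# Hypothetical trials: (target distance, target width, selection time in seconds).
ray_casting = [(0.40, 0.02, 1.9), (0.25, 0.01, 2.1), (0.60, 0.04, 1.7)]
relative_mapping = [(0.40, 0.02, 1.8), (0.25, 0.01, 1.9), (0.60, 0.04, 1.8)]

print(f"ray-casting: {throughput(ray_casting):.2f} bit/s, "
      f"relative mapping: {throughput(relative_mapping):.2f} bit/s")
```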
APA, Harvard, Vancouver, ISO, and other styles
13

Anderson, Lindsey M. "Assessing job relatedness in an in-basket test using the critical incident technique." Menomonie, WI : University of Wisconsin--Stout, 2007. http://www.uwstout.edu/lib/thesis/2007/2007andersonl.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Fairley, Rachel Anne Charlotte. "An investigation of diet selection as a technique for determining the ideal protein for growing pigs." Thesis, Open University, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.295298.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Thirukoveluri, Padma Latha. "Planning and Simulating Observations for a Sounding Rocket Experiment to Measure Polar Night Nitric Oxide in the Lower Thermosphere by Stellar Occultation." Thesis, Virginia Tech, 2011. http://hdl.handle.net/10919/32969.

Full text
Abstract:
The objective of this thesis was to select a star for observation and determine the error in the retrieval technique for a rocket experiment to measure lower thermospheric nitric oxide in the polar night using the stellar occultation technique. These objectives are accomplished by planning the geometry, determining the requirements for observations and the window for launch, and discussing the retrieval technique. The planning is carried out using an approximated (no-drag) and a simulated rocket trajectory (provided by NSROC: NASA Sounding Rocket Operations Contract). The simulation for the retrievals is done using data from the Student Nitric Oxide Explorer. Stars were taken from a catalogue called TD1. Launch times were obtained from the planned geometry, which resulted from selecting a zenith angle after choosing a maximum occultation height and determining the rocket apogee. The window for observing Spica was found to be 20 minutes. The retrieval technique and simulations showed that column densities and volume densities should be retrievable to less than 5% and 20% error, respectively, when observing occultation heights of 90-120 km. The study suggests that choosing a star positioned north with respect to the observation location gives more poleward latitudes and a larger launch window. Future research can be carried out applying the stellar occultation and retrieval technique to a satellite.
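The heart of a stellar-occultation retrieval is inverting the measured attenuation of starlight into a line-of-sight column density via the Beer-Lambert law, N = -ln(I/I0) / sigma. The cross-section and transmission values in the sketch below are placeholders of roughly the right order of magnitude, not data from the thesis.

```python
import numpy as np

# NO absorption cross-section (placeholder order of magnitude, cm^2).
sigma_no = 2.0e-18

# Assumed transmissions I/I0 of the stellar signal at a few tangent heights (km).
tangent_height_km = np.array([120.0, 110.0, 100.0, 90.0])
transmission = np.array([0.98, 0.93, 0.82, 0.60])

# Beer-Lambert inversion: slant column density along each line of sight (cm^-2).
column_density = -np.log(transmission) / sigma_no

for z, n in zip(tangent_height_km, column_density):
    print(f"{z:5.1f} km : {n:.2e} cm^-2")
```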
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
16

Dennetiere, David. "Conception et réalisation d'un prototype d'imageur X durs à selection spectrale pour le Laser MégaJoule." Palaiseau, Ecole polytechnique, 2012. http://www.theses.fr/2012EPXX0102.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Gould, Richard Robert, and RichardGould@ozemail.com.au. "International market selection-screening technique: replacing intuition with a multidimensional framework to select a short-list of countries." RMIT University. Social Science & Planning, 2002. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20081125.145312.

Full text
Abstract:
The object of this research was to develop an international market screening methodology which selects highly attractive markets, allowing for the range of diversity amongst organisations, countries and products. Conventional business thought is that, every two to five years, dynamic organisations which conduct business internationally should decide which additional foreign market or markets to enter next. If they are internationally inexperienced, this will be their first market; if they are experienced, it might be, say, their 100th market. How should each organisation select its next international market? One previous attempt has been made to quantitatively test which decision variables, and what weights, should be used when choosing between the 230 countries of the world. The literature indicates that a well-informed selection decision could consider over 150 variables that measure aspects of each foreign market's economic, political, legal, cultural, technical and physical environments. Additionally, attributes of the organisation have not been considered when selecting the most attractive short-list of markets. The findings presented in the dissertation are that 30 criteria accounted for 95 per cent of variance at cross-classification rates of 95 per cent. The weights of each variable, and the markets selected statistically as being the most attractive, were found to vary with the capabilities, goals and values of the organisation. This frequently means that different countries will be best for different organisations selling the same product. A
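A screening framework of this kind ultimately reduces to scoring each candidate country on weighted criteria, with the weights reflecting the organisation's own capabilities, goals and values. The sketch below shows that scoring step only; the criteria, weights and scores are invented for illustration and are not the 30 criteria identified in the dissertation.

```python
# Weighted-criteria screening of candidate markets (all figures illustrative).
weights = {"market_size": 0.35, "political_stability": 0.25,
           "tariff_barriers": 0.20, "cultural_distance": 0.20}   # organisation-specific

# Criterion scores per country on a 0-10 scale (higher = more attractive).
countries = {
    "Country A": {"market_size": 8, "political_stability": 6, "tariff_barriers": 5, "cultural_distance": 7},
    "Country B": {"market_size": 5, "political_stability": 9, "tariff_barriers": 8, "cultural_distance": 6},
    "Country C": {"market_size": 9, "political_stability": 4, "tariff_barriers": 3, "cultural_distance": 5},
}

def attractiveness(scores: dict) -> float:
    return sum(weights[c] * scores[c] for c in weights)

shortlist = sorted(countries, key=lambda c: attractiveness(countries[c]), reverse=True)
for country in shortlist:
    print(f"{country}: {attractiveness(countries[country]):.2f}")
```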
APA, Harvard, Vancouver, ISO, and other styles
18

Clack, Gregory Lionel. "Developing a capital project selection framework using a multi-criteria decision analysis technique in a group decision environment." Thesis, Stellenbosch : Stellenbosch University, 2004. http://hdl.handle.net/10019.1/49908.

Full text
Abstract:
Thesis (MBA)--Stellenbosch University, 2004.
ENGLISH ABSTRACT: Everyone, generally, would like to make good decisions, or to receive the greatest benefit from a decision made. Companies are no different in this respect, and the process of selecting an investment project portfolio has become an important activity. This is further complicated by the fact that companies have multiple, and often conflicting, objectives in a situation of capital rationing. This study project proposes a conceptual framework for project portfolio establishment, for application in an industrial manufacturing environment, by integrating project evaluation and selection, a multi-criteria decision analysis technique and group decision-making. The project issues, the selection of a multi-criteria decision analysis technique and group decision-making are dealt with sequentially and then integrated to develop this conceptual framework. The explorative part of this study project deals with project evaluation and selection issues, and the concept of the triple bottom line is proposed to capture the multiple objectives of the company's decision context. Further, decision analysis concepts are reviewed and three categories of multi-criteria decision analysis methods identified. Selected methods in these categories are described and examined, and the advantages and drawbacks of the different categories highlighted. The Analytic Hierarchy Process is proposed as the underlying multi-criteria decision analysis technique to support this conceptual framework. Group decision-making is investigated, and aggregation procedures and a method of consistency checking are suggested. Finally, the framework is applied to a hypothetical case and the results presented.
AFRIKAANSE OPSOMMING: Oor die algemeen wil almal goeie besluite neem, of maksimum voordeel uit die besluite trek. Maatskappye het dieselfde motivering en die aktiwiteit om 'n kapitaalinvesteringsportfolio saam te stel word as van groot belang beskou. Hierdie aktiwiteit is ook deur die werklikheid van vele, dikwels teenstrydige doelwitte sowel as beperkte fondse bemoeilik. In hierdie werkstuk word 'n begripsraamwerk vir die daarstelling van 'n investeringsportfolio vir kapitaalprojekte in 'n nywerheidsomgewing, wat projekevaluering en - keuring, veelvoudige kriterium besluitnemingstegnieke en groepsbesluitneming insluit, voorgestel. Projekverwante faktore, die keuse van 'n veelvoudige kriterium besluitnemingstegniek en derdens groepsbesluitneming word apart bespreek en daarna in die bogenoemde raamwerk geintegreer. Die navorsingsgedeelte van hierdie werkstuk verwys na die evaluering en keuse van projekte. Die begrip van 'n drievoudige maatstaf ('triple bottom line') om die kompleksiteit van 'n maatskappy se besluitnemingsdoelwitte te illustreer, word ondersteun. Daarna word besluitnemingsbegrippe bespreek en drie kategoriee van tegnieke vir veelvoudige kriterium besluitnemings uitgelig. Onder hierdie kategoriee is verkose metodes beskryf en ondersoek, en voor- en nadele van die kategoriee uitgewys. Die Analitiese Hierargie proses word voorgestel as basis van die begripsraamwerk. Groepsbesluitneming word ondersoek en versamelingsmetodes met kontrole vir konsekwentheid word voorgestel. Laastens word die besluitnemingsraamwerk op 'n denkbeeldige geval toegepas en word die resultate bespreek.
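A minimal sketch of the Analytic Hierarchy Process step that underlies the framework: deriving criterion weights from a pairwise-comparison matrix (here via the geometric-mean approximation of the principal eigenvector) and checking consistency. The 3x3 comparison matrix is illustrative, not taken from the hypothetical case in the thesis.

```python
import numpy as np

# Illustrative pairwise comparisons of three criteria (e.g. economic, social, environmental):
# A[i, j] = how much more important criterion i is than criterion j (Saaty's 1-9 scale).
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

# Geometric-mean approximation of the principal eigenvector -> priority weights.
gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Consistency ratio: lambda_max from A*w, CI = (lambda_max - n)/(n - 1), RI(n=3) = 0.58.
n = A.shape[0]
lambda_max = float(np.mean((A @ weights) / weights))
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58

print("weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))
```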
APA, Harvard, Vancouver, ISO, and other styles
19

Lee, Young Ki. "A fault diagnosis technique for complex systems using Bayesian data analysis." Diss., Georgia Institute of Technology, 2008. http://hdl.handle.net/1853/22539.

Full text
Abstract:
This research develops a fault diagnosis method for complex systems in the presence of uncertainties and the possibility of multiple solutions. Fault diagnosis is a challenging problem because the data used in diagnosis contain random errors and often systematic errors as well. Furthermore, fault diagnosis is basically an inverse problem, so it inherits the unfavorable characteristics of inverse problems: the existence and uniqueness of an inverse solution are not guaranteed and the solution may be unstable. The weighted least squares method and its variations are traditionally used for solving inverse problems. However, the existing algorithms often fail to identify multiple solutions if they are present. In addition, the existing algorithms are not capable of selecting variables systematically, so they generally use the full model, which may contain unnecessary variables as well as necessary variables. Ignoring this model uncertainty often gives rise to the so-called smearing effect in solutions, because of which unnecessary variables are overestimated and necessary variables are underestimated. The proposed method solves the inverse problem using Bayesian inference. An engineering system can be parameterized using state variables. The probability of each state variable is inferred from observations made on the system. A bias in an observation is treated as a variable, and the probability of the bias variable is inferred as well. To take the uncertainty of model structure into account, multiple Bayesian models are created with various combinations of the state variables and the bias variables. The results from all models are averaged according to how likely each model is. Gibbs sampling is used for approximating updated probabilities. The method is demonstrated for two applications: the status matching of a turbojet engine and the fault diagnosis of an industrial gas turbine. In the status matching application only physical faults in the components of a turbojet engine are considered, whereas in the fault diagnosis application sensor biases are considered as well as physical faults. The proposed method is tested in various faulty conditions using simulated measurements. Results show that the proposed method identifies physical faults and sensor biases simultaneously. It is also demonstrated that multiple solutions can be identified. Overall, there is a clear improvement in the ability to identify correct solutions over the full model that contains all state and bias variables.
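The model-averaging step described above (weighting each candidate fault model by how likely it is, then combining the resulting estimates) can be sketched with precomputed marginal likelihoods standing in for a full Gibbs-sampling treatment; the two-model setup and all numbers below are invented for illustration.

```python
# Two candidate fault models for the same measurement deltas (illustrative):
#   M1: compressor efficiency fault only; M2: efficiency fault plus a biased sensor.
# Assume each model's marginal likelihood p(data | M) has already been
# approximated (e.g. from its Gibbs-sampler output); equal prior model odds.
marginal_likelihood = {"M1": 3.1e-4, "M2": 8.7e-4}
posterior_fault = {"M1": -1.8, "M2": -1.2}   # posterior mean efficiency change, % (assumed)

evidence = sum(marginal_likelihood.values())
model_prob = {m: L / evidence for m, L in marginal_likelihood.items()}

# Model-averaged estimate of the fault magnitude.
averaged = sum(model_prob[m] * posterior_fault[m] for m in model_prob)

print("posterior model probabilities:", {m: round(p, 3) for m, p in model_prob.items()})
print(f"model-averaged efficiency change: {averaged:.2f} %")
```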
APA, Harvard, Vancouver, ISO, and other styles
20

Millet, Barbara. "Design and Evaluation of Three Alternative Keyboard Layouts for a Five-Key Text Entry Technique." Scholarly Repository, 2009. http://scholarlyrepository.miami.edu/oa_dissertations/513.

Full text
Abstract:
Despite the increase in popularity of handheld devices, text entry on such devices is becoming more difficult due to reduced form factors that limit display size, input modes, and interaction techniques. In an effort to circumvent these issues, research has found that five-key methods are effective for text entry on devices such as in-car navigation systems, television and gaming controllers, wrist watches, and other small devices. Five-key text entry methods use four directional keys to move a selector over an on-screen keyboard and an Enter key for selection. Although other researchers have described five-key character layouts using alphabetical order and predictive layouts based on digraph frequencies, there is considerable latitude in designing the rest of a comprehensive on-screen keyboard. Furthermore, it might be possible to capitalize on the relative strengths of the alphabetic and predictive layouts by combining them in a hybrid layout. Thus, this research examines the design of alternative keyboard layouts for five-key text entry techniques. Three keyboard layouts (Alphabetical, Predictive, and Hybrid) were selected to represent standard and less familiar arrangements. The analysis centered on a series of controlled experiments conducted on a research platform designed by the author. When the immediate usability of the three alternative keyboard layouts for supporting five-key text entry was investigated, results indicated no statistically significant differences in performance across the tested keyboards. Furthermore, experimental results show that following immediate usability, but still at the onset of learning, there was no overall difference in performance among the three keyboard layouts across four text types. However, the Alphabetical keyboard surpassed both the Predictive and Hybrid keyboards in text entry speed when typing Web addresses. The nonstandard keyboards performed better than the Alphabetical keyboard in typing Words/Spaces and Sentences, but performed no better than the Alphabetical in typing Address strings. Use of mixed effects modeling suggested that the longitudinal data were best fitted by a quadratic model. Text entry performance on all three layouts improved as a function of practice, demonstrating that participants could learn the unfamiliar layouts to complete text entry tasks. Overall, there was no indication that use of nonstandard layouts impedes performance. In fact, the trend in the time data suggests that the learning rates were greater for the nonstandard keyboards than for the standard layout. Overall, participants preferred the Hybrid layout. In summary, this dissertation focused on creating and validating novel and effective five-key text entry techniques for constrained devices.
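The five-key interaction itself, four arrows moving a selector over an on-screen grid and Enter committing a character, is easy to model in code. The alphabetical 5x6 layout and the Manhattan-distance keystroke count below are a simplified illustration, not one of the three layouts studied in the dissertation.

```python
# Minimal model of five-key text entry on an alphabetical 5x6 on-screen keyboard.
ROWS, COLS = 5, 6
LAYOUT = "abcdefghijklmnopqrstuvwxyz_.,!"   # '_' stands for space; 30 cells
POS = {ch: divmod(i, COLS) for i, ch in enumerate(LAYOUT)}

def keystrokes(text: str) -> int:
    """Arrow presses (Manhattan distance between cells) plus one Enter per character."""
    presses, row, col = 0, 0, 0                     # selector starts on 'a'
    for ch in text.lower().replace(" ", "_"):
        r, c = POS[ch]
        presses += abs(r - row) + abs(c - col) + 1  # moves + Enter
        row, col = r, c
    return presses

phrase = "the quick brown fox"
print(f"{keystrokes(phrase)} key presses, "
      f"{keystrokes(phrase) / len(phrase):.2f} keystrokes per character")
```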
APA, Harvard, Vancouver, ISO, and other styles
21

BANNARI-CHRISTIEANS, SOUAD. "Selection de ferments aromatiques pour saucisson sec. Controle d implantation des bacteries par technique d electrophorese en champ pulse." Clermont-Ferrand 2, 1998. http://www.theses.fr/1998CLF21012.

Full text
Abstract:
Sensory quality criteria are what consumers favour when choosing food products. This is particularly true for dry sausage, which is consumed as purchased and is chosen, at the point of sale, by its appearance and odour. The product must also meet consumer expectations in its taste and flavour. To stand out on the market, French manufacturers want products with a distinctive aromatic character. The objective of this study was therefore to select aromatic starter cultures for dry sausage and to monitor the establishment of the bacteria by classical microbiological enumeration and by molecular techniques such as pulsed-field gel electrophoresis (r-ECP), applied here for the first time to dry-sausage bacteria. A collection of bacteria (staphylococci, lactobacilli and yeasts) was assembled from industrial, artisanal and farm-produced sausages. After biochemical identification, only the species suitable for use as starter cultures were retained. The staphylococci were represented essentially by the species S. xylosus, the lactobacilli by L. curvatus, L. sake and L. plantarum, and the yeasts by C. famata and C. zeylanoides. The isolated bacteria were first selected according to their biochemical properties, namely lipolysis for the staphylococci and yeasts, proteolysis for the lactobacilli and staphylococci, and acidification for the lactobacilli. The genetic diversity of the selected bacteria was then assessed by RAPD and r-ECP. This step made it possible to eliminate genetically identical strains and to establish the profiles of the retained bacteria so that their establishment and aromatising potential could later be verified. The first manufacturing trials led to the selection of six staphylococci (lipolytic and/or proteolytic), five acidifying lactobacilli and two lipolytic yeasts. These strains were retained for their aromatic potential and for their ability to establish themselves. Indeed, during the experimental productions, monitoring of the inoculated bacteria revealed that classical microbiological enumeration techniques do not allow a judgement on whether the micro-organisms have become properly established: in cases where the strains were not actually present, the microbiological counts appeared correct even though the r-ECP profiles were identical to those of the uninoculated batter. The endogenous micro-organisms (from the batter) can therefore supplant the starters added at the beginning of manufacture. Trials with combinations of the selected lactobacilli and staphylococci led to six high-performing pairings being retained (L153 + S16, L153 + S19, L153 + S20, L10 + S19, L10 + S20 and L33 + S20). The three staphylococci S16, S19 and S20 belong to the species S. xylosus; S19 is proteolytic, S16 lipolytic and S20 both lipolytic and proteolytic. The lactobacilli L33 (L. plantarum) and L153 (L. sake) are not proteolytic and show a moderate acidifying capacity, whereas L10 (L. curvatus) is proteolytic and strongly acidifying. These results led us to conclude that bacterial lipolysis and proteolysis are partly responsible for the characteristic flavour of dry sausage.
The combinations tested by the seven manufacturers involved in this project highlighted in particular the aromatising effect of two pairings, S19+L10 and S20+L153, judged stronger than that of the control starter currently on the market.
APA, Harvard, Vancouver, ISO, and other styles
22

Chen, Jianyi. "A novel current based faulted phase selection and fault detection technique for EHV transmission systems with some penetration of wind generation." Thesis, University of Bath, 2015. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.675721.

Full text
Abstract:
In recent years, the capacity of electrical power systems (EPS) has been growing in order to match an increasing demand for electrical power. The expanding generation mix, especially the penetration of renewable energy, makes the systems more difficult to manage and operate. Thus the task of protecting these systems, especially the extra-high-voltage (EHV) transmission lines, can no longer be handled using the traditional protection schemes, which were designed for simple power system configurations and are therefore not suitable for modern power systems. The protection of EHV transmission lines should also take into account the increasing penetration of renewable energy generation, such as wind farms, and the effects of such generation on protection schemes. This thesis describes a novel phase selection and fault detection scheme using current signal data from one end only of a typical UK 400kV transmission system. Firstly, the measured current signals are decomposed using the wavelet transform to obtain the necessary frequency details, and then the spectral energy for a chosen number of wavelet coefficients is calculated using a moving short time window; this forms the feature extraction stage, which in turn defines the inputs for the artificial neural network used for classifying the type of fault. After the fault type is identified, the proposed scheme selects the specific neural network for that fault type to distinguish between internal and external faults, utilizing the same pattern features extracted in the previous stage. The input features comprise both high and low frequency components to enhance the performance of the scheme. An extensive series of studies for a whole variety of different system and fault conditions clearly shows that the performance of the scheme, both for phase selection and detection, is accurate and robust. To test the robustness of the scheme, and because this research project is a UK-China jointly funded EPSRC project, the designed scheme is also applied to a typical 500kV Chinese transmission system, first with only conventional generation and then with both conventional and renewable generation (wind farms). The effect of the penetration of wind farms on the performance of the proposed protection scheme is thus investigated. For both systems, promising results from the new protection scheme for phase selection and fault detection are achieved.
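The feature-extraction stage described (wavelet decomposition of the measured currents, then the spectral energy of each coefficient band over a moving short time window, fed to a neural network) might look like the sketch below. It uses PyWavelets with a db4 wavelet, a synthetic current signal and window settings chosen purely for illustration; none of these are the thesis's actual parameters.

```python
import numpy as np
import pywt

fs = 4000                                   # assumed sampling rate, Hz
t = np.arange(0, 0.2, 1 / fs)
current = np.sin(2 * np.pi * 50 * t)        # 50 Hz load current
current[400:] += 0.4 * np.random.randn(t.size - 400)   # crude "fault" transient

def wavelet_energies(window: np.ndarray, wavelet: str = "db4", level: int = 4):
    """Spectral energy of each wavelet coefficient band for one short window."""
    coeffs = pywt.wavedec(window, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])   # [cA4, cD4, cD3, cD2, cD1]

# Moving short-time window over the current signal -> feature vectors for an ANN.
win = 128
features = np.array([wavelet_energies(current[i:i + win])
                     for i in range(0, t.size - win, win // 2)])
print(features.shape)   # (n_windows, level + 1) feature matrix
```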
APA, Harvard, Vancouver, ISO, and other styles
23

Mustafa, Elmabrook B. M. "Some new localized quality of service models and algorithms for communication networks : the development and evaluation of new localized quality of service routing algorithms and path selection methods for both flat and hierarchical communication networks." Thesis, University of Bradford, 2009. http://hdl.handle.net/10454/4288.

Full text
Abstract:
The Quality of Service (QoS) routing approach is gaining increasing interest in the Internet community due to new emerging Internet applications such as real-time multimedia applications. These applications require better levels of quality of service than those supported by best-effort networks. Providing such services is therefore crucial to many real-time and multimedia applications which have strict quality of service requirements regarding bandwidth and timeliness of delivery. QoS routing is a major component in any QoS architecture and thus has been studied extensively in the literature. Scalability is considered one of the major issues in designing efficient QoS routing algorithms due to the high cost of QoS routing, both in terms of computational effort and communication overhead. Localized quality of service routing is a promising approach to overcoming the scalability problem of the conventional quality of service routing approach. The localized quality of service approach eliminates the communication overhead because it does not need global network state information. The main aim of this thesis is to contribute to the localised routing area by proposing and developing some new models and algorithms. Towards this goal we make the following major contributions. First, a scalable and efficient QoS routing algorithm based on a localised approach to QoS routing has been developed and evaluated. Second, we have developed a path selection technique that can be used with existing localized QoS routing algorithms to enhance their scalability and performance. Third, a scalable and efficient hierarchical QoS routing algorithm based on a localised approach to QoS routing has been developed and evaluated.
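A common way to realise localized QoS routing without global state is credit-based path selection at the source: each candidate path keeps a credit that rises when a flow setup succeeds and falls when it is blocked, and the highest-credit path is tried first. The sketch below is a generic illustration of that idea, not the specific algorithms proposed in the thesis.

```python
import random

class LocalizedPathSelector:
    """Source-based path choice driven only by locally observed setup successes/failures."""

    def __init__(self, paths, max_credit=10):
        self.credits = {p: max_credit // 2 for p in paths}
        self.max_credit = max_credit

    def choose(self) -> str:
        # Pick the path with the highest credit (ties broken at random).
        best = max(self.credits.values())
        return random.choice([p for p, c in self.credits.items() if c == best])

    def feedback(self, path: str, accepted: bool) -> None:
        delta = 1 if accepted else -1
        self.credits[path] = min(self.max_credit, max(0, self.credits[path] + delta))

selector = LocalizedPathSelector(["P1", "P2", "P3"])
for _ in range(50):                     # toy workload: P2 blocks flows most often
    path = selector.choose()
    accepted = random.random() > {"P1": 0.2, "P2": 0.6, "P3": 0.3}[path]
    selector.feedback(path, accepted)
print(selector.credits)
```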
APA, Harvard, Vancouver, ISO, and other styles
24

Pietrani, Elina Eunice Montechiari. "O Processo de seleção de pessoal em psicologia na era da técnica: reflexões sob a perspectiva fenomenológico-hermenêutica." Universidade do Estado do Rio de Janeiro, 2014. http://www.bdtd.uerj.br/tde_busca/arquivo.php?codArquivo=6916.

Full text
Abstract:
Neste trabalho, analisamos o modo como o processo de seleção de pessoal se estabeleceu em meio às determinações de sentido nesta que Martin Heidegger denominou a era da técnica. Esse filósofo descreve a época em que vivemos como uma era que se caracteriza essencialmente pela ênfase no pensamento técnico-calculante, em que todas as coisas são tomadas pelo caráter da mensuração e calculabilidade. Nesse sentido, podemos afirmar que a era moderna detém como verdades algumas características, como: fundo de reserva, funcionalidade (serventia) e a produtividade sem limites. Esse modelo de pensamento tem um impacto direto na realização do processo de seleção de pessoal pela Psicologia, haja vista ser esse o critério básico exigido para que o trabalhador seja aprovado. Ocorre que, ao tomar esse critério como a única e absoluta verdade da capacidade do trabalhador, outras capacidades e motivações, quando muito, ficam relegadas a um segundo plano. O homem, tomado como um estoque de matéria-prima, com funcionalidades específicas e pela determinação da produtividade incessante, passa a se comportar de modo autômato, tal como a máquina, cuja utilidade dura enquanto durar a necessidade de sua produção, sendo descartado quando outras necessidades se sobrepõem àquela. Através da análise da trajetória da organização do trabalho e sua interface com a Psicologia, procuramos esclarecer o domínio do caráter técnico instrumental que vem sustentando a Psicologia no modo de realização da seleção de pessoal, baseando-nos em autores como Sampaio, Chiavenato, Pontes e Leme. Apresentamos, também, a contribuição de outros autores, como Sennett, Dejours e Schwartz, que tentaram, a seu modo, construir uma análise crítica da relação homem-trabalho sob os parâmetros predominantes na atualidade. Por fim, por meio a uma visada fenomenológico-hermenêutica, pudemos refletir sobre o processo de seleção em Psicologia e compreender como esta, ao ser constantemente interpelada pela era da técnica, vem tomando os atributos dessa era como verdades absolutas e, assim, estabelecendo seu fazer em seleção de pessoal sob essas verdades. Ao orientar seu fazer por esse modo, a Psicologia, comprometida com o processo de seleção, compactua, sedimenta e fortalece essa forma de pensar em que o homem é tomado como objeto de produção tal qual a máquina, consolidando uma relação homem-trabalho em bases preponderantemente deterministas e, como tal, aprisionadoras. A proposta aqui desenvolvida consiste em evidenciar a possibilidade de outra posição da Psicologia frente ao modo de estabelecimento do processo de seleção, de forma a resistir à perspectiva de homem apenas como um fator produtivo.
In this work, we examine how the personnel selection process has been shaped by the determinations of meaning of what Martin Heidegger called the era of technology. Heidegger describes the time we live in as an era essentially characterized by the emphasis on technical, calculative thought, in which all things are grasped in terms of measurement and calculability. In this sense, the modern era holds certain characteristics as truths: standing reserve, functionality (usefulness) and limitless productivity. This mode of thinking has a direct impact on how Psychology conducts personnel selection, since productivity becomes the basic criterion a worker must satisfy to be approved. When this criterion is taken as the sole and absolute truth about the worker's capacity, other capabilities and motivations are relegated, at best, to a secondary plane. Man, taken as a stock of raw material with specific functions and driven by the demand for uninterrupted productivity, begins to behave like an automaton, like the machine, whose usefulness lasts only as long as its output is needed and which is discarded when other needs take precedence. Through an analysis of the trajectory of work organization and its interface with Psychology, we clarify the dominance of the instrumental technical model that has sustained Psychology's way of carrying out personnel selection, drawing on authors such as Sampaio, Chiavenato, Pontes and Leme. We also present the contribution of other authors, such as Sennett, Dejours and Schwartz, who have tried, each in their own way, to build a critical analysis of the man-work relationship under today's prevailing parameters. Finally, through a phenomenological-hermeneutic perspective, we reflect on the selection process in Psychology and come to understand how Psychology, constantly challenged by the era of technology, has been taking the attributes of this era as absolute truths and grounding its practice of personnel selection on them. By orienting its practice in this way, Psychology, committed to the selection process, endorses, sediments and strengthens a way of thinking in which man is treated as an object of production, just like the machine, consolidating a man-work relationship on predominantly deterministic and therefore imprisoning foundations. The proposal developed here is to point out the possibility of another stance for Psychology towards the way the selection process is established, one that resists the view of man merely as a productive factor.
APA, Harvard, Vancouver, ISO, and other styles
25

Godoy, Rodrigo Juliani Corrêa de. "Plantwide control: a review and proposal of an augmented hierarchical plantwide control design technique." Universidade de São Paulo, 2017. http://www.teses.usp.br/teses/disponiveis/3/3139/tde-07112017-140120/.

Full text
Abstract:
The problem of designing control systems for entire plants is studied. A review of previous works, available techniques and current research challenges is presented, followed by the description of some theoretical tools to improve plantwide control, including the proposal of an augmented lexicographic multi-objective optimization procedure. With these, an augmented hierarchical plantwide control design technique and an optimal multi-objective technique for integrated control structure selection and controller tuning are proposed. The main contributions of the proposed techniques are the inclusion of system identification and optimal control tuning as part of the plantwide design procedure for improved results, support for multi-objective control specifications, and support for any type of plant and controller. Finally, the proposed techniques are applied to industrial benchmarks to demonstrate and validate their applicability.
O problema de projetar sistemas de controle para plantas inteiras é estudado. Uma revisão de trabalhos anteriores, técnicas disponíveis e atuais desafios de pesquisa é apresentada, seguida da descrição de algumas ferramentas teóricas para melhorar o controle plantwide, incluindo a proposta de um procedimento de otimização multi-objetivo lexicográfico aumentado. Com tais elementos, são propostas uma nova técnica hierárquica aumentada de projeto de sistemas de controle plantwide e uma técnica multi-objetivo para seleção de estrutura de controlador integrada à sintonia ótima do controlador. As principais contribuições das técnicas propostas são a inclusão de identificação de sistemas e sintonia ótima de controladores como parte do procedimento de projeto de controle plantwide para melhores resultados, suporte a especificações multi-objetivo e suporte a quaisquer tipos de plantas e controladores. Finalmente, as técnicas propostas são aplicadas a benchmarks industriais para demonstrar e validar sua aplicabilidade.
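The lexicographic multi-objective step can be sketched as optimising the objectives in priority order, each stage constraining the previously optimised objective to stay within a small tolerance of its best value; the two quadratic objectives and the tolerance below are illustrative only, not the thesis's formulation.

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def f1(x):  # highest-priority objective (e.g. a control-performance criterion)
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def f2(x):  # second-priority objective (e.g. control effort)
    return (x[0] + 1.0) ** 2 + x[1] ** 2

x0 = np.zeros(2)

# Stage 1: optimise the top-priority objective alone.
stage1 = minimize(f1, x0)

# Stage 2: optimise f2 while keeping f1 within a small tolerance of its optimum.
tol = 1e-3
stage2 = minimize(f2, stage1.x,
                  constraints=[NonlinearConstraint(f1, -np.inf, stage1.fun + tol)])

print("stage 1 optimum:", np.round(stage1.x, 3))
print("lexicographic solution:", np.round(stage2.x, 3))
```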
APA, Harvard, Vancouver, ISO, and other styles
26

Parisi, Mariana Migliorini. "Padronização de técnica de purificação de monócitos como modelo de cultura celular para estudo da diferenciação in vitro de macrófagos." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2014. http://hdl.handle.net/10183/103310.

Full text
Abstract:
Monócitos são células hematopoiéticas com função na imunidade inata e adquirida. De acordo com o estímulo que recebem, podem se diferenciar em macrófagos e potencializar suas funções efetoras, modulando a resposta imune e participando de vários processos fisiológicos e patológicos. Os macrófagos são muito heterogêneos e capazes de assumir diferentes fenótipos em resposta aos estímulos que recebem do microambiente. Em um ambiente gorvernado por interferon gama (IFN-γ), se diferenciam em células com aumentada capacidade de apresentação de antígenos e síntese de citocinas pró-inflamatórias (macrófagos M1). Por outro lado, quando são estimulados por interleucina-4 (IL-4), eles se diferenciam em um fenótipo antagonista com atividade de reparo (macrófagos M2). Dada a importância destas células no sistema imune, é necessário desenvolver e otimizar técnicas que sirvam de ferramentas para estudar a diferenciação de macrófagos em seus perfis fenotípicos bem como seu papel em doenças humanas. Assim, o objetivo deste estudo foi padronizar e comparar dois diferentes protocolos de isolamento de monócitos descritos na literatura e analisar seu impacto sobre a diferenciação de macrófagos. Isolamos monócitos do sangue periférico de cinco indivíduos saudáveis pelas técnicas de aderência e seleção positiva. Os monócitos foram diferenciados a macrófagos com suplementação de M-CSF. Depois da indução aos perfis M1 e M2, avaliamos marcadores de superfície celular, expressão de mRNA de citocinas, secreção de quimiocinas e fagocitose. Observamos que os métodos utilizados para isolar monócitos possuem diferentes purezas, mas que monócitos isolados por ambos métodos foram capazes de ser diferenciados a macrófagos em seus perfis M1 e M2. Análises de citometria de fluxo mostraram que há uma diminuição da expressão de CD14, principalmente em macrófagos M2, e manutenção (M2) ou aumento (m1) da expressão de HLA-DR. Monócitos (CD80-CD86high) induzidos aos fenótipo M1 são caracterizados pela regulação positiva de CD80 e regulação negativa de CD86 (CD80++CD86+). O perfil M2 foi caracterizado pela expressão de CD206high e ausência de CD163- (CD206highCD163-). A expressão do mRNA revelou que IL-1β e TNF-α foram marcadores de M1 e TGF-β e CCL18 foram marcadores de M2. Além disso, quimiocinas inflamatórias como CXCL9, CXCL10 e CCL5 foram significativamente aumentadas em macrophagos M1. Macrófagos M1 e M2 são ativos e funcionais como demonstrado no ensaio de fagocitose. Embora ambos métodos utilizados para isolar monócitos tiveram purezas diferentes, ambas técnicas forneceram monócitos capazes de serem diferenciados a macrófagos.
Monocytes are hematopoietic cells with a major role in innate and adaptive immunity. According to the stimulus they receive, they can differentiate into macrophages and enhance effector functions by modulating the immune response and participating in various physiological and pathophysiological processes. Macrophages are very heterogeneous and are able to assume different phenotypes in response to the different stimuli they receive from the microenvironment. In a proinflammatory milieu ruled by interferon gamma (IFN-γ), they differentiate into cells with an increased capacity to present antigens and synthesize proinflammatory cytokines (M1 macrophages). On the other hand, when they are stimulated with interleukin 4 (IL-4), they differentiate into an antagonist phenotype with repair activity (M2 macrophages). Given the importance of these cells in the immune system, it is necessary to develop and optimize techniques that serve as useful tools for studying the differentiation of macrophages into their different phenotypic profiles as well as their roles in human diseases. In this regard, the aim of this study was to standardize and compare two different human monocyte isolation protocols described in the literature and analyze their impact on macrophage differentiation. We isolated peripheral blood monocytes from five healthy subjects by the adherence technique and by positive selection. Monocytes were differentiated into macrophages with M-CSF supplementation. After M1 or M2 induction, we evaluated cell surface markers, mRNA cytokine expression, chemokine secretion and phagocytosis. We found that the methods used to isolate monocytes have different purities, but monocytes isolated by both methods were able to differentiate into the M1 and M2 profiles. The monocyte and macrophage flow cytometry analysis demonstrated a decrease in CD14 expression, mainly in M2 macrophages, and maintenance (M2) or an increase (M1) of HLA-DR expression. Monocytes (CD80-CD86high) induced to an M1 phenotype were characterized by upregulation of CD80 and downregulation of CD86 (CD80++CD86+). The M2 profile was characterized by the expression of CD206high and the absence of CD163 (CD206highCD163-). The mRNA expression revealed that IL-1β and TNF-α were M1 markers and TGF-β and CCL18 were M2 markers. Furthermore, inflammatory chemokines such as CXCL9, CXCL10 and CCL5 were significantly increased in M1 macrophages. M1 and M2 macrophages were active and functional as shown in the phagocytic assay. Although the methods used for the isolation of monocytes yielded different purities, both techniques provided monocytes able to differentiate into macrophages.
APA, Harvard, Vancouver, ISO, and other styles
27

Green, Steven P. "An evaluation of the morphology, spontaneous acrosome reaction (AR) and fertilizing efficacy of human hyperactivated (HA) spermatozoa recovered individually, in vitro, using a new technique called Computer Image Sperm Selection (CISS)." Thesis, University of Nottingham, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.342066.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Abdullah, Arham. "Intelligent selection of demolition techniques." Thesis, Loughborough University, 2003. https://dspace.lboro.ac.uk/2134/12231.

Full text
Abstract:
There is a need to improve the current demolition technique selection process, which involves a multicriteria decision-making problem, because the decisions made by demolition engineers have been based on their knowledge and experience without any systematic procedure that can be followed to support the decision process. There is also a need to capture the experts' knowledge, since a significant proportion of senior, experienced demolition engineers are close to retirement, and unless their knowledge is captured in some form, it will be lost. In view of these needs, the research aim is to develop an intelligent decision support system that incorporates the demolition experts' knowledge in selecting the most appropriate demolition techniques for a given structure. Various research methodologies were adopted to achieve the aim. Literature on the demolition industry was first reviewed. Knowledge acquisition approaches were used to capture the demolition experts' knowledge, which included an industry survey through a postal questionnaire, semi-structured interviews, and protocol analysis. A rapid prototyping methodology was used in developing the prototype system. The proposed intelligent decision support system is called the 'Demolition Techniques Selection System' (DTSS). The prototype system consists of two stages. The first stage assists the decision maker in selecting the most appropriate demolition techniques in terms of technical aspects by using an Analytic Hierarchy Process (AHP) model. The second stage allows the decision maker to assess the demolition techniques in terms of cost by using the Demolition Cost Estimation model. The prototype was evaluated during and after the development process to verify, validate, and improve it. The evaluation revealed that the prototype system demonstrates many benefits and is applicable for use in the industry. It is concluded that the prototype provides a clear, systematic and structured framework that improves the current demolition technique selection process. It also serves as an information source that contains a considerable amount of information on demolition techniques. It can act as a teaching aid for young professionals coming into the demolition industry by giving them basic information and an understanding of demolition. Demolition contractors can use the system as a marketing aid to impress potential clients and win projects because of its ability to give rational and structured decisions, with the capability of generating graphical reports and sensitivity analyses.
APA, Harvard, Vancouver, ISO, and other styles
29

Buchholz, Timothy C. "Analysis and Categorization of Selected Musical Elements within Forty-three Solo Jazz Vocal "Standards" with Pedagogical Application to Repertoire Selection and the Teaching of Jazz Concepts in the Jazz Voice Lesson." Scholarly Repository, 2010. http://scholarlyrepository.miami.edu/oa_dissertations/386.

Full text
Abstract:
While the concept of teaching jazz style to vocal students is not a new one, previous materials written on the subject have not addressed two important aspects of this process. One is the concept of selecting jazz vocal solo repertoire that is both musically and vocally purposeful and appropriate for the student. The other is how to teach stylistic concepts that will apply to both current repertoire as well as songs the student will learn in the future. This doctoral essay provides both a categorized list of solo jazz vocal repertoire as well as strategies for introducing stylistic elements of jazz into the private-lesson setting. Through a systematic analysis of jazz vocal standards, a list of repertoire selections was categorized by rhythmic style, melodic range, melodic harmony, melodic rhythm, and harmonic content. In addition, the stylistic need to add syncopations to swing songs with non-syncopated melodies was addressed. Suggestions are included on how to implement this categorized list in the music selection process for students. Furthermore, this essay provides jazz voice teachers with strategies to efficiently incorporate important aspects of jazz styles such as rhythmic feel, song form, improvisation, and harmony into the lesson setting. By showing connections between these concepts and the literature that is being taught, students can become more competent and confident within the vocal and stylistic elements of the jazz idiom.
APA, Harvard, Vancouver, ISO, and other styles
30

CALDEIRA, ANDRE MACHADO. "PORTFOLIO SELECTION USING NON PARAMETRIC TECHNIQUES." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2005. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=6988@1.

Full text
Abstract:
In the 1950s, Harry Markowitz created a model that maximizes the ratio of mean return to standard deviation [Markowitz, 1952 & 1959]. This model is still widely used in the financial market. However, it assumes that the returns of the portfolio's assets are normally distributed, which is not always the case, limiting its use. This work proposes a model that is more robust in its risk measure, can be used without any distributional assumption, and reduces to an approximation of the Markowitz model when asset returns are normally distributed. To make this possible, the index maximized by the Markowitz model is rewritten as a function of the mean and the entropy. The selected portfolio is the one with the largest value of the proposed index within the chosen sample.
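As a rough illustration of the idea (the thesis's exact index and entropy estimator are not reproduced here), the sketch below scores candidate portfolios by a mean-to-exp(entropy) index, estimating differential entropy non-parametrically with a histogram; under Gaussian returns exp(entropy) is proportional to the standard deviation, so the index behaves like the Markowitz mean-to-risk ratio.

```python
# Minimal sketch of an entropy-based portfolio index (assumed form; not the
# thesis's exact estimator). Under Gaussian returns exp(H) is proportional to
# the standard deviation, so mean / exp(H) behaves like the Markowitz ratio.
import numpy as np

def differential_entropy(x, bins=30):
    """Histogram (plug-in) estimate of the differential entropy of a sample."""
    density, edges = np.histogram(x, bins=bins, density=True)
    widths = np.diff(edges)
    mask = density > 0
    p = density[mask] * widths[mask]          # probability mass per bin
    return -np.sum(p * np.log(density[mask]))

def entropy_index(weights, returns):
    portfolio = returns @ weights
    return portfolio.mean() / np.exp(differential_entropy(portfolio))

rng = np.random.default_rng(42)
returns = rng.normal(0.0005, 0.02, size=(500, 4))   # toy return history, 4 assets
candidates = rng.dirichlet(np.ones(4), size=200)    # long-only weight vectors
best = max(candidates, key=lambda w: entropy_index(w, returns))
print("selected weights:", np.round(best, 3))
```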
APA, Harvard, Vancouver, ISO, and other styles
31

Bal, Darbara Jay. "Materials selection using knowledge-based techniques." Thesis, University of Warwick, 1995. http://wrap.warwick.ac.uk/80317/.

Full text
Abstract:
A successful design is one of the most important elements for the commercial success of a product, and the selection of appropriate materials is a key step within the product design process. The task is not easy; a large number of interacting factors, both technical and economic, need to be taken into consideration and a vast amount of data investigated. Product designers can benefit from using computer systems which can emulate the reasoning processes of an expert in selecting materials and provide ready access to appropriate materials data. The knowledge-based system developed, Plassel, fulfils the key requirements identified for such a system. It can:
1. Emulate the reasoning processes of a plastics expert.
2. Allow a customised data search to be undertaken.
3. Access a range of data sources covering both embodiment and detail data.
4. Convert component functional requirements into property requirements.
5. Allow knowledge and experience to be stored in the system.
6. Allow cost to be fully considered.
Professor Ashby in 1993 [1] stated: "A full expert system for materials selection is decades away. Success has been achieved in specialised highly focused applications." Plassel is not such an application; it provides access to a full set of selection facilities. Novel aspects of Plassel include its ability to select on multi-dimensional criteria, automatically 'rate' materials and conduct customised searches. Professor Ashby concludes: "It is only a question of time before more fully developed systems become available. They are something to keep informed about." Plassel is a more fully developed system for plastic materials selection than those currently available.
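A highly simplified, hypothetical sketch of the screening-and-rating idea follows; the materials, property values, limits and weights are invented for illustration and do not come from Plassel.

```python
# Toy screening-and-rating sketch for plastics selection (illustrative only;
# the property values, limits, and weights below are invented, not Plassel's data).
candidates = {
    "ABS":  {"tensile_MPa": 40, "max_temp_C": 80,  "cost_per_kg": 2.1},
    "PC":   {"tensile_MPa": 60, "max_temp_C": 120, "cost_per_kg": 3.4},
    "PP":   {"tensile_MPa": 30, "max_temp_C": 100, "cost_per_kg": 1.5},
    "PEEK": {"tensile_MPa": 95, "max_temp_C": 250, "cost_per_kg": 60.0},
}

# Functional requirements translated into property limits (screening step).
limits = {"tensile_MPa": 35, "max_temp_C": 90}

# Relative importance of each property for the rating step (cost penalised).
weights = {"tensile_MPa": 0.5, "max_temp_C": 0.3, "cost_per_kg": -0.2}

def screen(props):
    return all(props[k] >= v for k, v in limits.items())

def rate(props):
    # Normalise each property by the best value among the screened candidates.
    return sum(w * props[k] / best_values[k] for k, w in weights.items())

screened = {name: p for name, p in candidates.items() if screen(p)}
best_values = {k: max(p[k] for p in screened.values()) for k in weights}
ranking = sorted(screened, key=lambda name: rate(screened[name]), reverse=True)
print("ranked candidates:", ranking)
```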
APA, Harvard, Vancouver, ISO, and other styles
32

Abdul, Rehman Bin Omer. "Mobile Edge Computing Clustering Algorithms for Pedestrian Mobility Scenarios." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018.

Find full text
Abstract:
The purpose of this study is to provide a general framework for the latest trends in mobile network architectures. Edge computing brings the computing capability of the cloud to the edge of the network to minimize latency. In the D2D mode, fog nodes interact with each other. With the help of clustering, fog nodes are categorized into two types: Fog Cluster Heads (FCHs) and Fog Cluster Members (FCMs). In each cluster, FCMs offload their tasks to the respective FCH for computation. The performance of the system model is characterized in terms of average energy consumption, average task delay, fairness, and packet loss. We provide results based on numerical simulations performed in Matlab to show how the performance of the network differs under different policies, clustering schemes and cluster update frequencies. In this thesis, a theoretical framework is presented that characterizes the performance of a fog network under pedestrian mobility, both with and without a priority approach, using clustering, and compares the results. The simulation results show that the priority approach has a profound impact on energy consumption, task delay, and packet loss, and addresses the coverage constraint problem.
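A minimal sketch of the FCH/FCM split is shown below; the selection rule used here (spatial k-means clustering, then the node with the highest residual energy becomes the cluster head) is an illustrative assumption, not necessarily the policy evaluated in the thesis.

```python
# Minimal sketch of splitting fog nodes into cluster heads (FCH) and members (FCM).
# The rule (spatial k-means, then the most energetic node per cluster becomes the
# FCH) is an illustrative assumption, not necessarily the thesis's policy.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
positions = rng.uniform(0, 100, size=(30, 2))   # pedestrian fog nodes in a 100x100 m area
energy = rng.uniform(0.2, 1.0, size=30)         # residual energy per node (normalised)

k = 5
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(positions)

clusters = {}
for c in range(k):
    members = np.flatnonzero(labels == c)
    fch = members[np.argmax(energy[members])]   # head = most energetic node in the cluster
    fcms = [int(m) for m in members if m != fch]
    clusters[c] = {"FCH": int(fch), "FCM": fcms}

for c, grp in clusters.items():
    print(f"cluster {c}: head={grp['FCH']}, members={grp['FCM']}")
```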
APA, Harvard, Vancouver, ISO, and other styles
33

Tan, Feng. "Improving Feature Selection Techniques for Machine Learning." Digital Archive @ GSU, 2007. http://digitalarchive.gsu.edu/cs_diss/27.

Full text
Abstract:
As a commonly used technique in data preprocessing for machine learning, feature selection identifies important features and removes irrelevant, redundant or noisy features to reduce the dimensionality of the feature space. It improves the efficiency, accuracy and comprehensibility of the models built by learning algorithms. Feature selection techniques have been widely employed in a variety of applications, such as genomic analysis, information retrieval, and text categorization. Researchers have introduced many feature selection algorithms with different selection criteria. However, it has been discovered that no single criterion is best for all applications. We propose a hybrid feature selection framework based on genetic algorithms (GAs) that employs a target learning algorithm to evaluate features (a wrapper method); we call it the hybrid genetic feature selection (HGFS) framework. The advantages of this approach include the ability to accommodate multiple feature selection criteria and to find small subsets of features that perform well for the target algorithm. The experiments on genomic data demonstrate that ours is a robust and effective approach that can find subsets of features with higher classification accuracy and/or smaller size compared to each individual feature selection algorithm. A common characteristic of text categorization tasks is multi-label classification with a great number of features, which makes wrapper methods time-consuming and impractical. We therefore also propose a simple filter (non-wrapper) approach called the Relation Strength and Frequency Variance (RSFV) measure. The basic idea is that informative features are those that are highly correlated with the class and distributed most differently among all classes. The approach is compared with two well-known feature selection methods in experiments on two standard text corpora. The experiments show that RSFV generates equal or better performance than the others in many cases.
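A rough filter-style scoring in the spirit of RSFV is sketched below (the exact RSFV formula is not reproduced): each feature is scored by a class-relevance term multiplied by how differently its values are distributed across the classes.

```python
# Rough filter-style scoring in the spirit of RSFV (the exact RSFV formula is not
# reproduced here): score each feature by a class-relevance term (mutual
# information) times the variance of its class-conditional means, so features
# that separate the classes strongly score highest.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif

X, y = load_iris(return_X_y=True)

relevance = mutual_info_classif(X, y, random_state=0)
class_means = np.array([X[y == c].mean(axis=0) for c in np.unique(y)])
spread = class_means.var(axis=0)                 # how differently classes are distributed

scores = relevance * spread
ranking = np.argsort(scores)[::-1]
print("features ranked best-first:", ranking)
```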
APA, Harvard, Vancouver, ISO, and other styles
34

Nejadmalayeri, Amir Hossein. "CDMA channel selection using switched capacitor techniques." Waterloo, Ont. : University of Waterloo, [Dept. of Electrical and Computer Engineering], 2001. http://etd.uwaterloo.ca/etd/anejadma2001.pdf.

Full text
Abstract:
Thesis (M.A.Sc.) - University of Waterloo, 2001.
"A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for the degree of Master of Applied Science in Electrical and Computer Engineering". Includes bibliographical references.
APA, Harvard, Vancouver, ISO, and other styles
35

Smith, Helen Elizabeth. "Systematic selection of down stream processing techniques." Thesis, University College London (University of London), 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.272270.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Stark, J. Alex. "Statistical model selection techniques for data analysis." Thesis, University of Cambridge, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.390190.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Zhang, Ping. "Forecasting financial outcomes using variable selection techniques." Thesis, University of Glasgow, 2019. http://theses.gla.ac.uk/41019/.

Full text
Abstract:
Since the activities of market participants can be influenced by financial outcomes, providing accurate forecasts of these outcomes can help participants to reduce the risk of adjusting to future market changes. Predictions of financial outcomes have usually been obtained from conventional statistical models based on researchers' knowledge. With the development of data collection and storage, extensive sets of explanatory variables can be extracted from big data, capturing more economic theory, and then applied in predictive methods; this can make models harder to interpret and produce biased estimates, which may further reduce predictive ability. To overcome these problems, variable selection techniques are frequently employed to simplify model selection and produce more accurate forecasts. In this PhD thesis, we aim to combine variable selection approaches with traditional reduced-form models to define and forecast the financial outcomes in question (market-implied ratings, Initial Public Offering (IPO) decisions and the failure of companies). This provides benefits for market participants in detecting potential investment opportunities and reducing credit risk. Making accurate predictions of corporate credit ratings is a crucial issue for both investors and rating agencies, since firms' credit ratings are associated with financial flexibility and debt or equity issuance. In Chapter 2, we attempt to determine market-implied credit ratings in relation to financial ratios, market-driven factors and macroeconomic indicators. We conclude that applying variable selection techniques, namely the least absolute shrinkage and selection operator (LASSO) and its extension (elastic net), can improve predictive power. Moreover, the predictive ability of LASSO-selected models is clearly better than that of the benchmark ordered probit model in all out-of-sample predictions. Finally, LASSO models whose tuning parameter is chosen by a BIC-type selector retain fewer predictors and produce more accurate out-of-sample predictions than their AIC-type counterparts. Next, the LASSO technique is applied to binary event prediction. The binary event in Chapter 3 is a bank's decision to go public by issuing an Initial Public Offering (IPO), which transforms the bank's operations and capital structure. Much of the empirical investigation in this area focuses on the determinants of the IPO decision, applying accounting ratios and other publicly available information in non-linear models. We mark a break with this literature by offering methodological extensions as well as an extensive and updated US dataset to predict bank IPOs. Combining the LASSO with a Cox proportional hazards model, we uncover value in several financial factors as well as market-driven and macroeconomic variables in predicting a bank's decision to go public. Importantly, we document a significant improvement in the model's predictive ability compared to standard frameworks used in the literature. Finally, we show that the sensitivity of a bank's IPO to financial characteristics is higher during periods of global financial crisis than in calmer times. Moving to another line of variable selection techniques, Bayesian Model Averaging (BMA) is combined with reduced-form models in Chapter 4. The failure of companies is closely related to the health of the whole economy; indeed, the most recent global crisis began with the bankruptcy of Lehman Brothers.
In this chapter, we forecast the failure of UK private firms, incorporating financial ratios and macroeconomic variables. Since our dataset covers two important financial crises and considerable firm heterogeneity, the predictive power of the candidate models is validated across different periods and cross-sections. We first find that applying BMA to the discrete hazard models can improve predictive performance in different sub-periods. However, compared with classification models, the Naive Bayes (NB) classifier provides slightly higher predictive accuracy than the BMA versions of the discrete hazard models. Moreover, the predictive performance of the discrete hazard model and its BMA version is more sensitive to adding time or industry dummy variables than that of the other competing models. Accounting for financial crises or firm heterogeneity can influence the predictive power of each candidate model in out-of-sample failure prediction.
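As a self-contained illustration of LASSO with an information-criterion tuning parameter of the kind discussed in Chapter 2 (synthetic data only; the thesis's credit-rating, IPO and failure datasets and its ordered-probit, Cox and BMA extensions are not reproduced), a minimal sketch follows.

```python
# Rough sketch of LASSO tuned by an information criterion, on synthetic data
# (illustrative only; not the thesis's datasets or model extensions).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoLarsIC
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=400, n_features=60, n_informative=8,
                       noise=10.0, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

for criterion in ("bic", "aic"):
    model = LassoLarsIC(criterion=criterion).fit(X_train, y_train)
    n_selected = int(np.sum(model.coef_ != 0))
    print(f"{criterion.upper()}: {n_selected} predictors kept, "
          f"out-of-sample R^2 = {model.score(X_test, y_test):.3f}")
```

On data like this, the BIC-type selector typically keeps a sparser model than the AIC-type one, mirroring the pattern reported in the abstract.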
APA, Harvard, Vancouver, ISO, and other styles
38

Nikolaidis, Konstantinos. "Techniques for data pattern selection and abstraction." Thesis, University of Liverpool, 2012. http://livrepository.liverpool.ac.uk/10535/.

Full text
Abstract:
This thesis concerns the problem of prototype reduction in instance-based learning. In order to deal with problems such as storage requirements, sensitivity to noise and computational complexity, various algorithms have been presented that condense the number of stored prototypes while maintaining competent classification accuracy. Instance selection, which recovers a smaller subset of the original training set, is the most widely used technique for instance reduction. However, prototype abstraction, which generates new prototypes to replace the initial ones, has also gained a lot of interest recently. The major contribution of this work is the proposal of four novel frameworks for performing prototype reduction: the Class Boundary Preserving algorithm (CBP), a hybrid method that uses both selection and generation of prototypes; Instance Seriation for Prototype Abstraction (ISPA), which is an abstraction algorithm; and two selective techniques, Spectral Instance Reduction (SIR) and Direct Weight Optimization (DWO). CBP is a multi-stage method based on a simple heuristic that is very effective in identifying samples close to class borders. Using a noise filter, harmful instances are removed, while the heuristic determines the geometrical distribution of patterns around every instance. Together with the concepts of nearest enemy pairs and mean shift clustering, this algorithm decides on the final set of retained prototypes. DWO is a selection model whose output set of prototypes is decided by a set of binary weights. These weights are computed according to an objective function composed of the ratio between the nearest friend and nearest enemy of every sample. In order to obtain good-quality results, DWO is optimized using a genetic algorithm. ISPA is an abstraction technique that employs the concept of data seriation to organize instances in an arrangement that favours merging between them. As a result, a new set of prototypes is created. The SIR algorithm presents a set of border discriminating features (BDFs) that depict the local distribution of friends and enemies of all samples. These are then used along with spectral graph theory to partition the training set into border and internal instances. Results show that CBP, SIR and DWO, the three major algorithms presented in this thesis, are competent and efficient in terms of at least one of the two basic objectives, classification accuracy and condensation ratio. The comparison against other successful condensation algorithms illustrates the competitiveness of the proposed models.
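The border-oriented flavour of these methods can be illustrated with a small generic instance-selection sketch (a nearest-enemy heuristic for keeping boundary instances; this is not an implementation of CBP, ISPA, SIR or DWO).

```python
# Generic border-keeping instance selection sketch (not CBP/ISPA/SIR/DWO):
# an instance is retained when its nearest enemy (closest sample of another
# class) is not much farther away than its nearest friend, i.e. it lies near
# a class boundary.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import pairwise_distances

X, y = make_classification(n_samples=300, n_features=2, n_informative=2,
                           n_redundant=0, random_state=3)
D = pairwise_distances(X)
np.fill_diagonal(D, np.inf)                     # ignore each point's distance to itself

keep = []
for i in range(len(X)):
    friends = D[i, y == y[i]]
    enemies = D[i, y != y[i]]
    if enemies.min() < 1.5 * friends.min():     # close to the boundary -> keep
        keep.append(i)

print(f"retained {len(keep)} of {len(X)} prototypes "
      f"({100 * len(keep) / len(X):.1f}% of the training set)")
```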
APA, Harvard, Vancouver, ISO, and other styles
39

Loscalzo, Steven. "Group based techniques for stable feature selection." Diss., Online access via UMI:, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
40

Van, Tonder Martin Stephen. "The development and evaluation of gaze selection techniques." Thesis, Nelson Mandela Metropolitan University, 2009. http://hdl.handle.net/10948/882.

Full text
Abstract:
Eye gaze interaction enables users to interact with computers using their eyes. A wide variety of eye gaze interaction techniques have been developed to support this type of interaction. Gaze selection techniques, a class of eye gaze interaction techniques which support target selection, are the subject of this research. Researchers developing these techniques face a number of challenges. The most significant challenge is the limited accuracy of eye tracking equipment (due to the properties of the human eye). The design of gaze selection techniques is dominated by this constraint. Despite decades of research, existing techniques are still significantly less accurate than the mouse. A recently developed technique, EyePoint, represents the state of the art in gaze selection techniques. EyePoint combines gaze input with keyboard input. Evaluation results for this technique are encouraging, but accuracy is still a concern. Early trigger errors, resulting from users triggering a selection before looking at the intended target, were found to be the most commonly occurring errors for this technique. The primary goal of this research was to improve the usability of gaze selection techniques. In order to achieve this goal, novel gaze selection techniques were developed. New techniques were developed by combining elements of existing techniques in novel ways. Seven novel gaze selection techniques were developed. Three of these techniques were selected for evaluation. A software framework was developed for implementing and evaluating gaze selection techniques. This framework was used to implement the gaze selection techniques developed during this research. Implementing and evaluating all of the techniques using a common framework ensured consistency when comparing the techniques. The novel techniques which were developed were evaluated against EyePoint and the mouse using the framework. The three novel techniques evaluated were named TargetPoint, StaggerPoint and ScanPoint. TargetPoint combines motor space expansion with a visual feedback highlight whereas the StaggerPoint and ScanPoint designs explore novel approaches to target selection disambiguation. A usability evaluation of the three novel techniques alongside EyePoint and the mouse revealed some interesting trends. TargetPoint was found to be more usable and accurate than EyePoint. This novel technique also proved more popular with test participants. One aspect of TargetPoint which proved particularly popular was the visual feedback highlight, a feature which was found to be a more effective method of combating early trigger errors than existing approaches. StaggerPoint was more efficient than EyePoint, but was less effective and satisfying. ScanPoint was the least popular technique. The benefits of providing a visual feedback highlight and test participants' positive views thereof contradict views expressed in existing research regarding the usability of visual feedback. These results have implications for the design of future gaze selection techniques. A set of design principles was developed for designing new gaze selection techniques. The designers of gaze selection techniques can benefit from these design principles by applying them to their techniques.
APA, Harvard, Vancouver, ISO, and other styles
41

Espy, John. "Data mining techniques for constructing jury selection models." Thesis, California State University, Long Beach, 2014. http://pqdtopen.proquest.com/#viewpdf?dispub=1527548.

Full text
Abstract:

Jury selection can determine a case before it even begins. The goal is to predict whether a juror rules for the plaintiff or the defense in medical malpractice trials, and which variables are significant in predicting this. The data for the analysis were obtained from mock trials that simulated actual trials, with arguments from both the defense and the plaintiff and ample discussion time. These mock trials were supplemented by surveys that attempted to capture the characteristics and attitudes of the mock juror and the case at hand. The data were modeled using logistic regression as well as decision tree and neural network techniques.
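An illustrative sketch of this kind of modelling is given below; the survey variables and the simulated outcome are hypothetical and not taken from the study's mock-trial data.

```python
# Illustrative sketch (hypothetical survey variables, not the study's data):
# fit a logistic regression to predict whether a mock juror sides with the
# plaintiff, and inspect which variables carry weight.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
jurors = pd.DataFrame({
    "age": rng.integers(20, 75, n),
    "trust_in_doctors": rng.integers(1, 8, n),         # 1-7 attitude scale
    "prior_jury_service": rng.integers(0, 2, n),
    "sympathy_for_plaintiff": rng.integers(1, 8, n),
})
# Toy outcome loosely driven by two of the attitude variables.
logit = 0.6 * (jurors.sympathy_for_plaintiff - 4) - 0.4 * (jurors.trust_in_doctors - 4)
verdict_for_plaintiff = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000)
acc = cross_val_score(model, jurors, verdict_for_plaintiff, cv=5).mean()
model.fit(jurors, verdict_for_plaintiff)
print("cross-validated accuracy:", round(acc, 3))
print(dict(zip(jurors.columns, np.round(model.coef_[0], 2))))
```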

APA, Harvard, Vancouver, ISO, and other styles
42

Zaman, Uzair khaleeq uz. "Intégration Produit-Process appliquée à la sélection de procédés de Fabrication Additive." Thesis, Paris, ENSAM, 2019. http://www.theses.fr/2019ENAM0006/document.

Full text
Abstract:
This doctoral research aims to build an integrated approach that can simultaneously handle the product and process parameters related to additive manufacturing (AM). Product development is changing profoundly, driven by mass customization strategies, ever-shorter product development cycles, demands for eco-responsible design, a large pool of candidate materials and an abundance of manufacturing processes. This shift makes it essential to choose the right compromise of materials, manufacturing processes and associated machines in the early stages of design, taking into account the Design for AM guidelines and the functional requirements of the part. As several criteria, material attributes and process functionality requirements are involved in industrial decision making, the thesis introduces a generic decision methodology, based on multi-criteria decision-making tools, that not only provides a compromise set of AM materials, processes and machines but also acts as a guideline for designers, offering practical, design-oriented and feasible material-machine combinations drawn from a database of 38 international AM machine vendors.
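A toy weighted-sum ranking of material-machine combinations is sketched below to illustrate the multi-criteria idea; the criteria, values and weights are invented, and the thesis's actual MCDM method and 38-vendor database are not reproduced.

```python
# Toy multi-criteria ranking of AM material-machine combinations (all values and
# weights are invented for illustration; not the thesis's MCDM method or database).
import numpy as np

combos = ["PLA / FDM printer A", "ABS / FDM printer B",
          "Ti6Al4V / SLM machine C", "PA12 / SLS machine D"]

# Criteria: tensile strength (MPa), dimensional accuracy (mm, lower is better),
# build rate (cm^3/h), cost per part (EUR, lower is better).
data = np.array([
    [50.0,  0.20, 20.0,  5.0],
    [40.0,  0.25, 18.0,  6.0],
    [950.0, 0.10,  8.0, 90.0],
    [48.0,  0.15, 30.0, 15.0],
])
benefit = np.array([True, False, True, False])   # which criteria are "higher is better"
weights = np.array([0.35, 0.25, 0.2, 0.2])

# Normalise each criterion to [0, 1], flipping the cost-type criteria.
norm = (data - data.min(axis=0)) / (data.max(axis=0) - data.min(axis=0))
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

scores = norm @ weights
for combo, s in sorted(zip(combos, scores), key=lambda t: -t[1]):
    print(f"{s:.3f}  {combo}")
```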
APA, Harvard, Vancouver, ISO, and other styles
43

Lövgren, Ulf. "Enzyme immunoassay in combination with liquid chromatography for sensitive and selective determination of drugs in biosamples." Lund : Dept. of Analytical Chemistry, Lund University, 1997. http://books.google.com/books?id=Ju1qAAAAMAAJ.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Badillo, Brian Elvis. "Migrating Three Dimensional Interaction Techniques." Thesis, Virginia Tech, 2007. http://hdl.handle.net/10919/32315.

Full text
Abstract:
Multiplatform virtual environment (VE) development is fast-becoming a realization for today’s developers. 3D user interfaces (3DUIs) can easily be ported to a variety of VE systems. However, few researchers have addressed the need to intelligently migrate 3DUIs across VE systems. We claim that the naïve migration of 3D interaction techniques (3DITs) to other VE systems could result in decreases in usability. We also claim that device specificity can be used to increase usability on these other VE systems. In this thesis, we have chosen three manipulation 3DITs to naively migrate across a set of four VE systems. We use an exploratory usability study to identify any usability issues stemming from our naïve migrations. After finding decreases in usability in select migrations, we redesigned two of the 3DITs for device specificity. We investigated the benefits of our redesigns with usability studies on the original, naïve, and redesigned implementations of both 3DITs. Results from our studies are mixed. In one case we demonstrate that device specificity can be used effectively to increase 3DIT migratability. As a result from our experience in this work, we have learned several lessons in device-specific design as well as 3DIT migration.
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
45

Askling, Kim. "Application of Topic Models for Test Case Selection : A comparison of similarity-based selection techniques." Thesis, Linköpings universitet, Programvara och system, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-159803.

Full text
Abstract:
Regression testing is just as important for the quality assurance of a system as it is time consuming. Several techniques exist with the purpose of lowering the execution times of test suites and providing faster feedback to developers; examples are ones based on transition models or string distances. These techniques are called test case selection (TCS) techniques and focus on selecting subsets of the test suite deemed relevant for the modifications made to the system under test. This thesis project focused on evaluating the use of a topic model, latent Dirichlet allocation, as a means to create a diverse selection of test cases for coverage of certain test characteristics. The model was tested on authentic data sets from two different companies, where the results were compared against prior work in which TCS was performed using similarity-based techniques. The model was also tuned and evaluated, using an algorithm based on differential evolution, to increase the model's stability in terms of inferred topics and topic diversity. The results indicate that the use of the model for test case selection purposes was not as efficient as the other similarity-based selection techniques studied in work prior to this thesis. In fact, the results show that the selection generated using the model performs similarly, in terms of coverage, to a randomly selected subset of the test suite. Tuning of the model does not improve these results; in fact, the tuned model performs worse than the other methods in most cases. However, the tuning process results in the model being more stable in terms of inferred latent topics and topic diversity. The performance of the model is believed to be strongly dependent on the characteristics of the underlying data used to train it, putting emphasis on word frequencies and the overall sizes of the training documents, and implying that these factors would affect the words' relevance scoring for the better.
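A minimal sketch of topic-based test case selection follows (toy test descriptions; this is not the thesis's pipeline or its differential-evolution tuning): fit LDA over the test case texts, then pick one representative test per dominant topic to obtain a diverse subset.

```python
# Minimal sketch of topic-based test case selection (toy test descriptions; not
# the thesis's pipeline or tuning): fit LDA over the test case texts, then pick
# the strongest member of each dominant topic to get a diverse subset.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

test_cases = [
    "verify login with valid credentials",
    "verify login fails with wrong password",
    "measure response time of search endpoint",
    "search returns results sorted by relevance",
    "export report as pdf and verify layout",
    "export report as csv with unicode characters",
]

counts = CountVectorizer().fit_transform(test_cases)
lda = LatentDirichletAllocation(n_components=3, random_state=0)
doc_topics = lda.fit_transform(counts)           # per-test topic distribution

selected = []
for topic in range(lda.n_components):
    candidates = np.flatnonzero(doc_topics.argmax(axis=1) == topic)
    if candidates.size:                          # strongest member of each topic
        selected.append(int(candidates[np.argmax(doc_topics[candidates, topic])]))

print("selected test cases:", [test_cases[i] for i in sorted(set(selected))])
```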
APA, Harvard, Vancouver, ISO, and other styles
46

Lee, Toby. "Optimizing CAT-ASVAB item selection using form assembly techniques." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2006. http://library.nps.navy.mil/uhtbin/hyperion/06Jun%5FLee.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Vege, Sri Harsha. "Ensemble of Feature Selection Techniques for High Dimensional Data." TopSCHOLAR®, 2012. http://digitalcommons.wku.edu/theses/1164.

Full text
Abstract:
Data mining involves the use of data analysis tools to discover previously unknown, valid patterns and relationships from large amounts of data stored in databases, data warehouses, or other information repositories. Feature selection is an important preprocessing step of data mining that helps increase the predictive performance of a model. The main aim of feature selection is to choose a subset of features with high predictive information and eliminate irrelevant features with little or no predictive information. Using a single feature selection technique may generate local optima. In this thesis we propose an ensemble approach for feature selection, where multiple feature selection techniques are combined to yield more robust and stable results. The ensemble of multiple feature ranking techniques is performed in two steps. The first step involves creating a set of different feature selectors, each providing its sorted order of features, while the second step aggregates the results of all feature ranking techniques. The ensemble method used in our study is frequency count, which is accompanied by the mean to resolve any frequency count collisions. Experiments conducted in this work are performed on datasets collected from the Kent Ridge bio-medical data repository. The Lung Cancer and Lymphoma datasets are selected from the repository for the experiments. The Lung Cancer dataset consists of 57 attributes and 32 instances, and the Lymphoma dataset consists of 4027 attributes and 96 instances. Experiments are performed on the reduced datasets obtained from feature ranking. These datasets are used to build the classification models. Model performance is evaluated in terms of the AUC (Area Under the Receiver Operating Characteristic Curve) metric. ANOVA tests are also performed on the AUC metric. Experimental results suggest that an ensemble of multiple feature selection techniques is more effective than an individual feature selection technique.
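The frequency-count ensemble idea can be sketched as follows; the particular rankers and the top-k cut-off are assumptions for illustration, not the thesis's exact configuration.

```python
# Sketch of the frequency-count ensemble idea (the choice of rankers and the
# top-k cut-off below are illustrative assumptions): count how often each
# feature appears in the top-k of several rankers, breaking ties by mean rank.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import chi2, f_classif, mutual_info_classif

X, y = load_breast_cancer(return_X_y=True)       # all features are non-negative
k = 10

rankings = []
for score in (f_classif(X, y)[0],
              chi2(X, y)[0],
              mutual_info_classif(X, y, random_state=0),
              RandomForestClassifier(random_state=0).fit(X, y).feature_importances_):
    rankings.append(np.argsort(score)[::-1])     # feature indices, best first

n_features = X.shape[1]
freq = np.zeros(n_features)
mean_rank = np.zeros(n_features)
for r in rankings:
    freq[r[:k]] += 1                             # frequency count in each ranker's top-k
    mean_rank[r] += np.arange(n_features)        # rank position per feature
mean_rank /= len(rankings)

# Higher frequency wins; collisions are resolved by smaller mean rank.
order = sorted(range(n_features), key=lambda f: (-freq[f], mean_rank[f]))
print("ensemble top-10 features:", order[:k])
```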
APA, Harvard, Vancouver, ISO, and other styles
48

Gustafsson, Robin. "Ordering Classifier Chains using filter model feature selection techniques." Thesis, Blekinge Tekniska Högskola, Institutionen för datalogi och datorsystemteknik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-14817.

Full text
Abstract:
Context: Multi-label classification concerns classification with multi-dimensional output. The Classifier Chain breaks the multi-label problem into multiple binary classification problems, chaining the classifiers to exploit dependencies between labels. Consequently, its performance is influenced by the chain's order. Approaches to finding advantageous chain orders have been proposed, though they are typically costly. Objectives: This study explored the use of filter model feature selection techniques to order Classifier Chains. It examined how feature selection techniques can be adapted to evaluate label dependence, how such information can be used to select a chain order and how this affects the classifier's performance and execution time. Methods: An experiment was performed to evaluate the proposed approach. The two proposed algorithms, Forward-Oriented Chain Selection (FOCS) and Backward-Oriented Chain Selection (BOCS), were tested with three different feature evaluators. 10-fold cross-validation was performed on ten benchmark datasets. Performance was measured in accuracy, 0/1 subset accuracy and Hamming loss. Execution time was measured during chain selection, classifier training and testing. Results: Both proposed algorithms led to improved accuracy and 0/1 subset accuracy (Friedman & Hochberg, p < 0.05). FOCS also improved the Hamming loss while BOCS did not. Measured effect sizes ranged from 0.20 to 1.85 percentage points. Execution time was increased by less than 3 % in most cases. Conclusions: The results showed that the proposed approach can improve the Classifier Chain's performance at a low cost. The improvements appear similar to comparable techniques in magnitude but at a lower cost. It shows that feature selection techniques can be applied to chain ordering, demonstrates the viability of the approach and establishes FOCS and BOCS as alternatives worthy of further consideration.
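The general idea of ordering a Classifier Chain with a filter criterion can be sketched as follows; the simple "most feature-relevant label first" rule used here is an illustration, not the FOCS or BOCS algorithm described in the thesis.

```python
# Sketch of ordering a Classifier Chain with a filter criterion (illustrative
# rule: chain the labels the features are most informative about first; this is
# not the FOCS or BOCS algorithm).
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.multioutput import ClassifierChain

X, Y = make_multilabel_classification(n_samples=400, n_features=20,
                                      n_classes=5, random_state=0)
X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)

# Score each label by how much information the features carry about it.
label_scores = [mutual_info_classif(X_tr, Y_tr[:, j], random_state=0).sum()
                for j in range(Y_tr.shape[1])]
order = list(np.argsort(label_scores)[::-1])     # easiest-to-predict labels first

chain = ClassifierChain(LogisticRegression(max_iter=1000), order=order, random_state=0)
chain.fit(X_tr, Y_tr)
print("chain order:", order)
print("0/1 subset accuracy:", round(chain.score(X_te, Y_te), 3))
```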
APA, Harvard, Vancouver, ISO, and other styles
49

Lau, Kevin Keung. "The use of graphical techniques in component selection systems." Thesis, University of Bath, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.334701.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Zhang, Fu. "Intelligent feature selection for neural regression : techniques and applications." Thesis, University of Warwick, 2012. http://wrap.warwick.ac.uk/49639/.

Full text
Abstract:
Feature Selection (FS) and regression are two important technique categories in Data Mining (DM). In general, DM refers to the analysis of observational datasets to extract useful information and to summarise the data so that it can be more understandable and used more efficiently in terms of storage and processing. FS is the technique of selecting a subset of features that are relevant to the development of learning models. Regression is the process of modelling and identifying the possible relationships between groups of features (variables). Compared with conventional techniques, Intelligent System Techniques (ISTs) are usually favoured due to their flexibility in handling real-life problems and their tolerance of data imprecision, uncertainty, partial truth, etc. This thesis introduces a novel hybrid intelligent technique, namely Sensitive Genetic Neural Optimisation (SGNO), which is capable of reducing the dimensionality of a dataset by identifying the most important group of features. The capability of SGNO is evaluated with four practical applications in three research areas: plant science, civil engineering and economics. SGNO is constructed from three key techniques, known as the core modules: Genetic Algorithm (GA), Neural Network (NN) and Sensitivity Analysis (SA). The GA module controls the progress of the algorithm and employs the NN module as its fitness function. The SA module quantifies the importance of each available variable using the results generated in the GA module. The global sensitivity scores of the variables are used to determine their importance: variables with higher sensitivity scores are considered more important than variables with lower sensitivity scores. After determining the variables' importance, the performance of SGNO is evaluated using the NN module, which takes various numbers of variables with the highest global sensitivity scores as inputs. In addition, the symbolic relationship between a group of variables with the highest global sensitivity scores and the model output is discovered using Multiple-Branch Encoded Genetic Programming (MBE-GP). A total of four datasets have been used to evaluate the performance of SGNO. These datasets involve the prediction of short-term greenhouse tomato yield, the prediction of longitudinal dispersion coefficients in natural rivers, the prediction of wave overtopping at coastal structures and the modelling of the relationship between the growth of industrial inputs and the growth of gross industrial output. SGNO was applied to all these datasets to explore its effectiveness in reducing their dimensionality. The performance of SGNO is benchmarked against four dimensionality reduction techniques: Backward Feature Selection (BFS), Forward Feature Selection (FFS), Principal Component Analysis (PCA) and the Genetic Neural Mathematical Method (GNMM). The applications of SGNO to these datasets showed that it is capable of identifying the most important feature groups in the datasets effectively, and its overall performance is better than that of the benchmark techniques. Furthermore, the symbolic relationships discovered using MBE-GP achieve performance competitive with that of NN models in terms of regression accuracy.
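A rough analogue of the SA module alone is sketched below (the GA search loop and the thesis's exact sensitivity measure are not reproduced): train a small neural network, then score each input variable by how much permuting it degrades test performance.

```python
# Rough analogue of a sensitivity-analysis module (not SGNO's GA loop or its
# exact sensitivity measure): train a small NN, then score each variable by how
# much shuffling it degrades held-out performance.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_diabetes(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0))
net.fit(X_tr, y_tr)
baseline = net.score(X_te, y_te)

rng = np.random.default_rng(0)
sensitivity = []
for j in range(X.shape[1]):
    X_perm = X_te.copy()
    X_perm[:, j] = rng.permutation(X_perm[:, j])   # break the variable's information
    sensitivity.append(baseline - net.score(X_perm, y_te))

ranking = np.argsort(sensitivity)[::-1]
print("variables ranked by sensitivity score:", ranking)
```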
APA, Harvard, Vancouver, ISO, and other styles