Dissertations / Theses on the topic 'Heuristic analyzer'
Consult the top 30 dissertations / theses for your research on the topic 'Heuristic analyzer.'
Semyonov, S. G., Svitlana Gavrylenko, and Viktor Chelak. "Processing information on the state of a computer system using probabilistic automata." Thesis, Institute of Electrical and Electronics Engineers, 2017. http://repository.kpi.kharkov.ua/handle/KhPI-Press/40752.
Goyal, Devendra. "Five new systems heuristics to analyze small group teamwork." Thesis, Massachusetts Institute of Technology, 2016. http://hdl.handle.net/1721.1/106248.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 69-72).
Small groups are the most common of all organizations. In this thesis we apply a systems-architecture approach to the study of small group teamwork. We propose five new systems heuristics, drawn from systems and complexity theories, and use them as lenses through which to view small groups. For each heuristic we review the literature from both the systems and the small-group perspectives. In many cases the literature is sufficient to support applying the heuristic to small groups; in other cases it is scant. We then apply the heuristics to the jury deliberations in the film "Twelve Angry Men" as a worked example. We find that each heuristic, within its scope, yields significant insight into the workings of the group. Finally, we provide guidelines for using these heuristics in the practice of leadership in small groups. Each heuristic covers a different aspect of group life, and together they can be used to analyze group work in detail. Understanding group work in real time opens opportunities for members to influence the work and thus to practice leadership.
by Devendra Goyal.
S.M. in Engineering and Management
Balakrishnan, Anantaram, Thomas L. Magnanti, and Prakash Mirchandani. "Heuristics, LPs, and Trees on Trees: Network Design Analyses." Massachusetts Institute of Technology, Operations Research Center, 1994. http://hdl.handle.net/1721.1/5277.
Watson, Jebulan Ryan. "Finding a representative day for simulation analyses." Thesis, Atlanta, Ga. : Georgia Institute of Technology, 2009. http://hdl.handle.net/1853/31762.
Committee Chair: John-Paul Clarke; Committee Member: Ellis Johnson; Committee Member: Eric Feron. Part of the SMARTech Electronic Thesis and Dissertation Collection.
Filho, Emilio Abud. "Application of Variational Methods and Heuristic Formulations for Analyses and Numerical Synthesis of Rectangular Waveguide Transformers." Pontifícia Universidade Católica do Rio de Janeiro, 2010. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=16456@1.
Waveguide transformers are widely used in antenna feeder chains and other microwave devices. Although the theory of quarter-wavelength transformers is well known, the electrical performance now required of such devices has been pushing waveguide transformer design to its limit. To meet such requirements, and given the number of variables in a waveguide transformer design, very accurate numerical analysis techniques (such as FDTD and mode matching) have been applied, together with optimization techniques. On the other hand, these numerical techniques are very memory- and CPU-time-consuming, at odds with the cost objective of these conceptually simple transformers. This work proposes an alternative, faster technique based on closed-form models derived from variational theory. A heuristic model is also proposed for the two-plane transformer case, which can easily be applied to both homogeneous and inhomogeneous structures.
Glöckner, Andreas. "Automatische Prozesse bei Entscheidungen : das dominierende Prinzip menschlicher Entscheidungen: Intuition, komplex-rationale Analyse oder Reduktion? /." Hamburg : Kovač, 2006. http://www.verlagdrkovac.de/3-8300-2625-0.htm.
Ahmadi Jah, Robert Roham, Daniel Chatten, and Ali Hesen Sabah. "Analysera eller gå på magkänsla? : Hur svenska chefer använder analys och intuition i sina beslut under Coronakrisen." Thesis, Linnéuniversitetet, Institutionen för organisation och entreprenörskap (OE), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-104827.
A crisis such as the Covid-19 pandemic is an extreme situation that differs from day-to-day situations and requires that the right decisions be made. Such extreme situations put pressure on managers in organizations to make decisions that are often improvised, partly because of time pressure and stress, and partly because each crisis is unique, which makes it harder to know what the right decision is. The decisions managers make during a crisis often differ from how those decisions would have been made in a normal situation. Should the manager analyse the situation before deciding, because the crisis is so complex, or instead follow his or her gut feeling, because the crisis is too complex to analyse fully? This question has received much attention in decision-making research, not least for extreme situations and crises such as a pandemic. The purpose of this study is to increase the understanding of how managers deal with the improvised decision-making that occurs during a crisis. The study contrasts analytical decisions with intuitive decisions, while remaining open to the possibility that the two styles can be combined. Interviews were conducted with managers from different industries throughout Sweden to better understand crisis decision-making during the Covid-19 pandemic. The study shows that most managers use analysis or combine analysis with intuition; few rely on intuition alone. Furthermore, how a manager views the crisis can affect the decisions he or she makes. A manager who views the pandemic merely as a threat will tend to focus on internal activities aimed at reducing its negative effects on the organisation and its members.
If the manager also chooses to view the pandemic as an opportunity, this can lead to external activities that take advantage of it, for example by expanding the business and its network. The study shows that most decisions were made through communication and interplay with other actors; only a few were made without any such interaction. That most decisions were made through communication with others seems to have reduced the effect of various biases: managers became less partial when other people's perspectives were included in the decisions. Finally, most managers believe that the pandemic has made them better decision makers, and some believe that prior stressful situations and crises helped them greatly during it.
Rambow, Angela Inge. "Sozialer Wert der Stadtbibliothek Wolgast." Doctoral thesis, Humboldt-Universität zu Berlin, Philosophische Fakultät I, 2006. http://dx.doi.org/10.18452/15525.
This thesis aims to prove, validly and reliably, the social effects (impact, outcome) of a selected library. It deals with the complex problem of surveying the context of activities that cause significant changes, and it examines the library's influence on the individual and on society, furnishing results that facilitate the assessment of a library's achievements. Starting with various definitions of outcomes in the context of their theoretical background (chapter one), different approaches to assessing performance, impact and outcome are compared (chapter two). On that basis, the selection of the social-process-audit approach is justified and the research design described (chapter three). In chapter four, the results of the survey are presented. The empirical study proves that the effects of the selected library are socially significant. The library's offer not only brings about a variety of benefits but also causes changes in its stakeholders' lives, long-term social effects that may be of strategic significance for the library. The special features of the thesis are (1) the valid proof of social effects, (2) the fact that the survey was not restricted to ways of use but was carried out in the context of the library's mission, and (3) the inclusion of the positions of all relevant interest groups (stakeholders), with regard both to determining the intended effects and to those actually achieved, as a decisive basis. The concluding remarks (chapter five) consider problems of the scientific-empirical interview method and generalize the thesis's results, along with selected topical questions.
Rumm, Stefanie [Verfasser], Klaus [Akademischer Betreuer] [Gutachter] Menrad, and Luisa [Gutachter] Menapace. "Verbrauchereinschätzungen zu Biokunststoffen: eine Analyse vor dem Hintergrund des heuristic-systematic model / Stefanie Rumm ; Gutachter: Klaus Menrad, Luisa Menapace ; Betreuer: Klaus Menrad." München : Universitätsbibliothek der TU München, 2016. http://d-nb.info/1125627026/34.
Kirner, Benedikt. "The ethical focal point of moral symmetry – a heuristic to analyse risk culture and misconduct in banking: Demonstrated on three recent misconduct cases." HHL Leipzig Graduate School of Management, 2020. https://slub.qucosa.de/id/qucosa%3A74391.
Thau, Sebastian Leonid [Verfasser], and A. [Akademischer Betreuer] Albers. "Heuristiken zur Analyse und Synthese technischer Systeme mit dem C&C²-Ansatz auf Basis von Entwicklungsprojekten im industriellen Umfeld [Elektronische Ressource] = Heuristics to analyze and design technical systems with the C&C² / Sebastian Leonid Thau. Betreuer: A. Albers." Karlsruhe : KIT-Bibliothek, 2013. http://d-nb.info/1043756256/34.
Janeček, David. "Sdružená EEG-fMRI analýza na základě heuristického modelu." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2015. http://www.nusl.cz/ntk/nusl-221334.
Namanya, Anitta P. "A Heuristic Featured Based Quantification Framework for Efficient Malware Detection. Measuring the Malicious intent of a file using anomaly probabilistic scoring and evidence combinational theory with fuzzy hashing for malware detection in Portable Executable files." Thesis, University of Bradford, 2016. http://hdl.handle.net/10454/15863.
Ameli, Mostafa. "Heuristic Methods for Calculating Dynamic Traffic Assignment: Simulation-based dynamic traffic assignment; meta-heuristic solution methods with parallel computing; non-unicity of day-to-day multimodal user equilibrium and the network design history effect; improving traffic network performance with a road banning strategy, a simulation approach comparing user equilibrium and system optimum." Thesis, Lyon, 2019. http://www.theses.fr/2019LYSET009.
Transport systems are characterized dynamically not only by nonlinear interactions between their components but also by feedback loops between the state of the network and users' decisions. In particular, network congestion affects both the distribution of local demand, by modifying route choices, and overall multimodal demand: depending on network conditions, users may decide, for example, to change transportation mode. Several equilibria can be defined for transportation systems. The user equilibrium corresponds to the situation where each user behaves selfishly and minimizes his or her own travel cost. The system optimum corresponds to the situation where the total transport cost of all users is minimal. In this context, the study aims to calculate route flow patterns in a network under different equilibrium conditions and to study network equilibrium in a dynamic setting. It focuses on traffic models capable of representing large-scale urban traffic dynamics. Three main issues are addressed. First, fast heuristic and meta-heuristic methods are developed to determine equilibria with different types of traffic patterns. Second, the existence and uniqueness of user equilibria are studied; where uniqueness fails, the relationship between multiple equilibria is examined, and the impact of network history is analyzed. Third, a new approach is developed to analyze network equilibrium as a function of the level of demand; it compares the user equilibrium and the system optimum and aims to design control strategies that move the user equilibrium toward the system optimum.
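The gap between user equilibrium and system optimum described in this abstract can be illustrated on Pigou's classic two-route example; the network and numbers below are a textbook toy, not taken from the thesis.

```python
def total_cost(x):
    """Total travel cost when a fraction x of demand takes route A,
    whose cost grows with flow (cost_A = x), while route B costs 1."""
    return x * x + (1.0 - x) * 1.0

# User equilibrium: selfish users shift toward the cheaper route until
# neither route is strictly cheaper.
x = 0.5
for _ in range(2000):
    cost_a, cost_b = x, 1.0
    x = min(1.0, max(0.0, x + 0.01 * (cost_b - cost_a)))
ue_split = x  # converges to 1.0: everyone crowds onto route A, total cost 1.0

# System optimum: the split a central planner would choose.
so_split = min((i / 1000.0 for i in range(1001)), key=total_cost)
# so_split = 0.5, total cost 0.75 -- cheaper in aggregate than the equilibrium.
```

Control strategies of the kind studied in the thesis aim precisely at closing this gap between the selfish and the planned flow pattern.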
Müller, Johannes. "Wertbasierte Portfolio-Optimierung bei Software-Produktlinien." Doctoral thesis, Universitätsbibliothek Leipzig, 2012. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-83047.
Bou Malham, Christelle. "Méthodologie d’optimisation hybride (Exergie/Pinch) et application aux procédés industriels." Thesis, Paris Sciences et Lettres (ComUE), 2018. http://www.theses.fr/2018PSLEM082.
In the context of today's alarming energy scene, this doctoral work puts forward a methodology that couples pinch analysis and exergy analysis in a way that surpasses their individual limitations, with the aim of generating optimal operating conditions and topology for industrial processes. A global methodology is proposed: a hybrid of the two thermodynamic methods, intertwined with heuristic rules and numerical optimization. Using new exergy-based optimization criteria, exergy analysis is used not only to assess exergy losses but also to guide potential improvements in process structure and operating conditions. And while pinch analysis considers only heat integration to satisfy existing needs, the proposed methodology includes other forms of recoverable exergy and explores new synergy pathways through conversion systems. After laying out the guidelines of the proposed methodology, the approach is demonstrated on two industrial systems: a vacuum gas oil hydrotreating process and a natural gas liquefaction process. Applying the methodological framework to realistic processes showed how to adjust the operating conditions of each process and how to implement conversion systems, yielding substantial energy savings.
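A minimal sketch of the exergy bookkeeping such a methodology relies on: the work potential of a heat stream is its quantity weighted by the Carnot factor. This is the standard textbook formula, not code from the thesis, and the temperatures are illustrative.

```python
def heat_exergy(q_kw, t_source_k, t0_k=298.15):
    """Exergy (maximum recoverable work) of heat q_kw delivered at
    temperature t_source_k, relative to dead state t0_k: Ex = Q * (1 - T0/T)."""
    return q_kw * (1.0 - t0_k / t_source_k)

# 100 kW of heat at 600 K carries far more work potential than at 320 K,
# which is why exergy analysis ranks recoverable streams by temperature level.
high_grade = heat_exergy(100.0, 600.0)   # about 50.3 kW of work potential
low_grade = heat_exergy(100.0, 320.0)    # about 6.8 kW
```

Pinch analysis alone would treat both streams only by their heat content; weighting by the Carnot factor is what lets the hybrid methodology trade them off against conversion systems.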
Chu, Liu. "Reliability and optimization, application to safety of aircraft structures." Thesis, Rouen, INSA, 2016. http://www.theses.fr/2016ISAM0008/document.
Researchers in aerodynamic design and aircraft production have made tremendous efforts to improve wing airfoils through optimization techniques. The development of computational fluid dynamics (CFD) in computer simulation cuts the expense of aerodynamic experiments while providing convincing results for complicated aircraft configurations. In this work we chose a special and important part of the aircraft, namely the wing structure. Reliability-based optimization is one of the most appropriate methods for structural design under uncertainty: it seeks the best compromise between cost and safety by incorporating reliability measures within the optimization. Despite its advantages, its application to practical engineering problems is still quite challenging. In our work, uncertainty in numerical simulation is introduced and expressed through probability theory. Monte Carlo simulation, an effective method for propagating uncertainties through the finite element model of the structure, is applied to simulate the complicated situations that may occur. To improve the efficiency of the Monte Carlo sampling process, Latin Hypercube sampling is performed. However, the huge sampling database makes an explicit evaluation of reliability difficult, so polynomial chaos expansion is presented and discussed, and a Kriging surrogate model plays an important role in the reliability analysis. Traditional optimization methods suffer either from unacceptable time complexity or from premature convergence to low-quality local optima. Simulated Annealing, a local-search-based heuristic, and Genetic Algorithms, which draw inspiration from the principles and mechanisms of natural selection, make it possible to escape such local optima.
In reliability-based design optimization, these two methods drive the optimization procedure, while the reliability-analysis loop runs on the surrogate model.
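The sampling scheme described in this abstract can be sketched in a few lines: Latin Hypercube sampling draws one point per probability stratum, and the failure probability is the fraction of samples violating the limit state. The resistance-minus-load limit state and its parameters below are invented for illustration, not taken from the thesis.

```python
import random
import statistics

def lhs_normal(n, mu, sigma, rng):
    """Latin Hypercube sample of a normal variable: one draw per equal
    probability stratum, then shuffled to decorrelate the dimensions."""
    dist = statistics.NormalDist(mu, sigma)
    samples = [dist.inv_cdf((i + rng.random()) / n) for i in range(n)]
    rng.shuffle(samples)
    return samples

def failure_probability(n=20000, seed=1):
    rng = random.Random(seed)
    resistance = lhs_normal(n, 30.0, 3.0, rng)  # R ~ N(30, 3)
    load = lhs_normal(n, 20.0, 4.0, rng)        # S ~ N(20, 4)
    # Limit state g = R - S; failure whenever g < 0.
    return sum(r - s < 0.0 for r, s in zip(resistance, load)) / n
```

For this toy case the exact answer is known (R - S is normal with mean 10 and standard deviation 5, so Pf = Φ(-2) ≈ 0.023), which makes it easy to check that the stratified estimator converges with far fewer samples than plain Monte Carlo.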
Connault, Pierre. "Calibration d'algorithmes de type Lasso et analyse statistique de données métallurgiques en aéronautique." Thesis, Paris 11, 2011. http://www.theses.fr/2011PA112041.
Our work contains a methodological part and an applied part. In the methodological part we study the Lasso and a variant of this algorithm, the projected Lasso, and develop slope heuristics to calibrate them. Our approach uses the sparsity properties of the Lasso, showing how to return to a model-selection framework; this involves both a penalized criterion and the tuning of a constant. To this end, we adopt the classical approaches of Birgé and Massart on slope heuristics, which leads to the notion of canonical penalty. Slope heuristics and (tenfold) cross-validation are then compared through simulation studies; the results suggest that the user consider both. In order to increase calculation speed, simplified penalties are (unsuccessfully) tried. The applied part concerns aeronautics, where the results of the methodological part apply to reliability: in classical approaches (without the Lasso), the large ratio of the number of variables to the number of data points leads to unstable linear models and huge computation times, and the Lasso provides a helpful solution. In aeronautics, dealing with reliability questions first requires studying the quality of the elaboration and forging processes. Four major axes must be considered: analysing the factors of the process, discriminating recipes, studying the impact of time on quality, and detecting outliers. This provides a global statistical strategy for process improvement.
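The sparsity mechanism that makes the Lasso attractive here is easiest to see in the orthonormal-design special case, where the estimator reduces to soft-thresholding the least-squares coefficients. This is a standard textbook reduction with made-up numbers, not material from the thesis.

```python
def soft_threshold(z, lam):
    """Lasso coefficient under an orthonormal design:
    argmin_b 0.5*(z - b)**2 + lam*|b|  =  sign(z) * max(|z| - lam, 0)."""
    if z > lam:
        return z - lam
    if z < -lam:
        return z + lam
    return 0.0

ols = [2.5, -0.25, 1.5, 0.125]                 # least-squares estimates
lasso = [soft_threshold(b, 0.5) for b in ols]  # [2.0, 0.0, 1.0, 0.0]
# Small coefficients are zeroed out (sparsity); large ones are shrunk by lam.
```

Calibrating the Lasso amounts to choosing `lam`: slope heuristics and cross-validation, the two procedures compared in this work, are competing ways to pick that threshold from data.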
Boussaa, Mohamed. "Automatic non-functional testing and tuning of configurable generators." Thesis, Rennes 1, 2017. http://www.theses.fr/2017REN1S011/document.
Generative software development has paved the way for the creation of multiple generators (code generators and compilers) that serve as a basis for automatically producing code for a broad range of software and hardware platforms. With fully automatic code generation, users can rapidly synthesize software artifacts for various software platforms. In addition, they can easily customize the generated code for the target hardware platform, since modern generators (e.g., C compilers) have become highly configurable, offering numerous configuration options. Consequently, the quality of generated software becomes highly correlated with the configuration settings as well as with the generator itself. In this context, it is crucial to verify the correct behavior of generators. Numerous approaches have been proposed to verify the functional outcome of generated code, but few evaluate its non-functional properties, namely performance and resource usage. This thesis addresses three problems. (1) Non-functional testing of generators: we benefit from the existence of multiple code generators with comparable functionality (i.e., code generator families) to automatically test the generated code. We leverage metamorphic testing to detect non-functional inconsistencies in code generator families, defining metamorphic relations as test oracles: a comparison between the variations in performance and resource usage of code generated from the same code generator family. We evaluate the approach by analyzing the performance of HAXE, a popular code generator family. Experimental results show that the approach automatically detects several inconsistencies that reveal real issues in this family of code generators.
(2) Generator auto-tuning: we exploit recent advances in search-based software engineering to provide an effective approach for tuning generators (through optimizations) according to the user's non-functional requirements (performance and resource usage). We also demonstrate that the approach can automatically construct optimization levels that represent optimal trade-offs between multiple non-functional properties, such as execution time and resource usage. We evaluate it by verifying the optimizations performed by the GCC compiler; experimental results show that the approach can auto-tune compilers and construct optimizations that yield better performance than the standard optimization levels. (3) Handling the diversity of software and hardware platforms in software testing: running tests and evaluating resource usage in heterogeneous environments is tedious. To address this, we benefit from recent advances in lightweight system virtualization, in particular container-based virtualization, to automatically deploy, execute, and monitor code in heterogeneous environments and to collect non-functional metrics (e.g., memory and CPU consumption). This testing infrastructure serves as a basis for the experiments conducted in the first two contributions.
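The metamorphic relation used in the first contribution can be caricatured as follows: run the same benchmark generated for each target, and flag any target whose cost deviates far from the rest. The timings and the factor-of-the-median rule are invented for illustration; they are not the thesis's actual oracle.

```python
# Hypothetical execution times (seconds) of one benchmark generated for
# different targets by a single code generator family.
timings = {"java": 1.00, "cpp": 0.80, "js": 1.15, "python": 9.40}

def inconsistencies(measurements, factor=3.0):
    """Metamorphic relation sketch: functionally equivalent generated
    programs should stay within `factor` of the median cost; an outlier
    signals a potential non-functional inconsistency in that generator."""
    ordered = sorted(measurements.values())
    median = ordered[len(ordered) // 2]
    return [target for target, v in measurements.items() if v > factor * median]

print(inconsistencies(timings))  # -> ['python']
```

The point of the metamorphic setup is that no per-target performance oracle is needed: the family members serve as each other's reference.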
Gurevsky, Evgeny. "Conception de lignes de fabrication sous incertitudes : analyse de sensibilité et approche robuste." Phd thesis, Ecole Nationale Supérieure des Mines de Saint-Etienne, 2011. http://tel.archives-ouvertes.fr/tel-00820619.
Full textLoubiere, Peio. "Amélioration des métaheuristiques d'optimisation à l'aide de l'analyse de sensibilité." Thesis, Paris Est, 2016. http://www.theses.fr/2016PESC1051/document.
Hard optimization designates a class of problems whose solutions cannot be found by an exact method of polynomial complexity; finding a solution in acceptable time requires compromises on its accuracy. Metaheuristics are high-level algorithms that solve this kind of problem. They are generic and efficient, i.e. they find an acceptable solution according to defined criteria such as time or error. The first chapter of this thesis is partially dedicated to the state of the art of these issues, especially the study of two families of population-based metaheuristics: evolutionary algorithms and swarm-intelligence-based algorithms. To propose an innovative approach in the metaheuristics research field, sensitivity analysis is presented in the second part of this chapter. Sensitivity analysis aims at evaluating the influence of each parameter on a function's response; it characterizes the global behavior of an objective function (linearity, non-linearity, influence, etc.) over its search space. Including a sensitivity analysis method in a metaheuristic enhances its search capabilities along the most promising dimensions. Two algorithms binding these two concepts are proposed in the second and third parts. In the first, ABC-Morris, the Morris method is included in the artificial bee colony algorithm, a natural pairing given the similarity of their underlying equations. With the aim of generalizing the approach, a new method is then developed and its generic integration is illustrated on two metaheuristics. The efficiency of the two methods is tested on the CEC 2013 benchmark. The study contains two steps: a usual performance analysis of the method on this benchmark against several state-of-the-art algorithms, and a comparison with the original version when influences are uneven, deactivating a subset of dimensions.
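The Morris method mentioned in this abstract screens inputs by averaging one-at-a-time perturbation effects. A deliberately naive sketch (random base points rather than the proper trajectory design, and a toy objective of my own choosing) conveys the idea:

```python
import random

def elementary_effects(f, dim, trajectories=50, delta=0.1, seed=0):
    """Morris-style screening: mean absolute elementary effect per input,
    estimated from one-at-a-time perturbations of random base points."""
    rng = random.Random(seed)
    mu_star = [0.0] * dim
    for _ in range(trajectories):
        x = [rng.random() for _ in range(dim)]
        fx = f(x)
        for i in range(dim):
            xi = list(x)
            xi[i] += delta               # perturb one coordinate at a time
            mu_star[i] += abs(f(xi) - fx) / delta
    return [m / trajectories for m in mu_star]

# Toy objective: x0 is strongly influential, x1 weakly, x2 not at all.
effects = elementary_effects(lambda x: 10 * x[0] + x[1], dim=3)
# effects is approximately [10.0, 1.0, 0.0]
```

A metaheuristic coupled with such a screen, as in ABC-Morris, can concentrate its search on the dimensions with large effects and deactivate the inert ones.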
Rojas, Jhojan Enrique. "Méthodologie d’analyse de fiabilité basée sur des techniques heuristiques d’optimisation et modèles sans maillage : applications aux systèmes mécaniques." Thesis, Rouen, INSA, 2008. http://www.theses.fr/2008ISAM0003/document.
Structural engineering designs must satisfy performance criteria such as safety, functionality and durability, generally established in the pre-design phase. Traditionally, engineering designs use deterministic information about dimensions, material properties and external loads. However, modeling the structural behavior of complex systems requires taking different kinds and levels of uncertainty into account, and this analysis is preferably made in terms of probabilities, since estimating the probability of failure is crucial in structural engineering. Reliability is the probability that a structural system operates perfectly throughout its functional lifetime under normal operating conditions, and a major interest of reliability analysis is to find the best compromise between cost and safety. Aiming to eliminate the main difficulties of traditional reliability methods such as the First- and Second-Order Reliability Methods (FORM and SORM), this work proposes the Heuristic-based Reliability Method (HBRM). The heuristic optimization techniques used in this method are Genetic Algorithms, Particle Swarm Optimization and Ant Colony Optimization. The HBRM requires no initial guess of the design solution, because it is based on multidirectional search, and it does not need to compute the partial derivatives of the limit state function with respect to the random variables. These functions are evaluated using analytical, semi-analytical and numerical models, implemented with the Ritz method (in MATLAB®), the finite element method (in MATLAB® and ANSYS®) and the element-free Galerkin method (in MATLAB®). The combination of these reliability analyses, optimization procedures and modeling methods constitutes the reliability-based design methodology proposed in this work.
The numerical tools cited above were used to evaluate their advantages and disadvantages for specific applications and to demonstrate the applicability and robustness of this alternative approach. Good agreement was observed between the results of two- and three-dimensional applications in statics, stability and dynamics. These numerical examples explore explicit and implicit multiple limit state functions of several random variables. Deterministic validation and stochastic analyses based on the Muscolino perturbation method provide the basis for reliability analysis of 2-D and 3-D fluid-structure interaction problems, and the methodology is applied to an industrial structure coupled with modal synthesis. Results for laminated composite plates modeled by the EFG method are compared with their counterparts obtained by finite elements. Finally, an extension to reliability-based design optimization is proposed using the optimal safety factors method, with numerical applications that minimize weight while meeting a target reliability index, using both mesh-based and meshless models.
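In the spirit of the HBRM's derivative-free, multidirectional search, here is a deliberately naive sketch: estimate the Hasofer-Lind reliability index by random search in standard normal space, with no gradients of the limit state function. The limit state and distributions are invented for illustration and chosen so the exact answer is known; the thesis's actual heuristics (GA, PSO, ACO) are far more efficient searchers than this.

```python
import math
import random

def beta_random_search(g, dim=2, iters=20000, seed=3):
    """Smallest distance from the origin to the failure region g(u) <= 0
    in standard normal space, found by derivative-free random search."""
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(iters):
        u = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
        if g(u) <= 0.0:  # candidate lies in the failure region
            best = min(best, math.sqrt(sum(c * c for c in u)))
    return best

# Limit state g = R - S with R ~ N(30, 3) and S ~ N(20, 4), written in
# standard normal coordinates; the exact index is 10 / sqrt(3**2 + 4**2) = 2.
beta = beta_random_search(lambda u: 10.0 + 3.0 * u[0] - 4.0 * u[1])
```

Because the search only ever evaluates `g`, the same loop works unchanged whether the limit state comes from a closed-form expression, a finite element model, or an EFG model, which is exactly the flexibility the HBRM exploits.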
Ireta Sánchez, Iván Tadeo. "La improvisación libre en solo : una aproximación fenomenológica y una proposición de análisis." Thesis, Toulouse 2, 2019. http://www.theses.fr/2019TOU20110.
Full textSolo free improvisation emerged as a musical practice in the 1970s, within what is known as free European improvisation. Not all musicians of this movement, however, ventured to perform improvisation without interaction. Even so, the Englishmen Derek Bailey, Evan Parker, Barry Guy, Paul Rutherford and Howard Riley; the Dutchmen Han Bennink and Maarten Altena; the Germans Peter Brötzmann and Alexander von Schlippenbach; the Belgian Fred Van Hove; and the Swiss Irène Schweizer were the precursors who recorded under these conditions. The first chapter aims to clarify the notion of improvisation in the field of music. The terms that surround this musical phenomenon and give rise to inappropriate interpretations are identified. Further, Matthias Rousselot's paradigm is explained, whose terms improviso and improvisum specify the two states of this musical phenomenon. The differences between improvisation in collective, idiomatic, and free settings are also discussed. The second chapter consists of a historical review of the influences that contributed to the emergence of free improvisation, and highlights the aesthetic characteristics that distinguish it from free jazz and indeterministic music. The third chapter offers a historical overview of solo free improvisation, drawing on information from the record productions devoted to this practice. In this section, the most significant LPs of the period are presented. An account of its conditions of development is also given, based on reflection and personal experience, in order to demonstrate its unpredictability, its immediacy, its decisive nature and its transience. The fourth chapter analyzes three audiovisual recordings of improvisations by the musicians Fred Van Hove, Barry Guy and Evan Parker. To this end, the notion of "formative unity" is proposed in order to establish a perspective whose focus is the process, or improviso.
The aim is thus to establish a different perspective of musical analysis for an ephemeral musical phenomenon. Finally, we wish to point out that this research will help to vindicate solo free improvisation, understanding it as an activity of authentic artistic creation
Solo free improvisation emerged as a musical practice in the 1970s, within what is known as free European improvisation. However, not all musicians of this musical movement ventured to experiment with improvisation free of interaction. Even so, the Englishmen Derek Bailey, Evan Parker, Barry Guy, Paul Rutherford and Howard Riley; the Dutchmen Han Bennink and Maarten Altena; the Germans Peter Brötzmann and Alexander von Schlippenbach; the Belgian Fred Van Hove; and the Swiss Irène Schweizer were the precursors who recorded under these conditions. The first chapter aims to clarify the notion of improvisation in the field of music. The terms that surround this musical phenomenon and give rise to inappropriate interpretations are identified. In addition, Matthias Rousselot's paradigm is explained, whose terms improviso and improvisum specify the two states of this musical phenomenon. The differences between improvisation in collective idiomatic and free settings are also discussed. The second chapter consists of a historical review of the influences that contributed to the emergence of free improvisation, and highlights the aesthetic characteristics that distinguish it from free jazz and indeterministic music. The third chapter offers a historical overview of solo free improvisation, based on information obtained from the record productions devoted to this practice. In this section, the most significant LPs of the period are presented. An account of its conditions of development is also given, based on reflection and personal experience, with the aim of demonstrating its unpredictability, its immediacy, its decisive nature and its transience. The fourth chapter analyzes three audiovisual recordings of improvisations by the musicians Fred Van Hove, Barry Guy and Evan Parker.
To achieve this, the notion of "formative unity" is proposed in order to establish a perspective whose focus of attention is the process, or improviso; that is, a different approach to the musical analysis of an ephemeral sound phenomenon. Finally, we wish to point out that this research will help to vindicate solo free improvisation, understanding it as an activity of authentic artistic creation
Maqrot, Sara. "Méthodes d'optimisation combinatoire en programmation mathématique : Application à la conception des systèmes de verger-maraîcher." Thesis, Toulouse 3, 2019. http://www.theses.fr/2019TOU30131.
Full textMixed fruit-vegetable cropping systems (MFVCS) are a promising way of ensuring environmentally sustainable agricultural production in response to the challenge of fulfilling local market requirements. They combine productions and make better use of biodiversity. These agroforestry systems rest on a complex set of interactions modifying the use of light, water and nutrients. Designing such systems therefore requires optimizing the use of these resources: maximizing positive interactions (facilitation) and minimizing negative ones (competition). To reach these objectives, the design must include the spatial and temporal dimensions, taking into account the evolution of above- and belowground interactions over a time horizon. To that end, we define the MFVCAP using a discrete representation of the land and of the interactions between vegetable crops and fruit trees. We formulate the problem as three models: a binary quadratic program (BQP), a mixed-integer linear program (MILP) and a binary linear program (01LP). We explore large models using exact solvers. The limits of exact methods on the MFVCS problem show the need for approximate methods able to solve large-scale systems with good-quality solutions in reasonable time, which could be used in interactive design with farmers and advisers. We have implemented an open-source C++ solver, called baryonyx, a parallel version of a (generalized) Wedelin heuristic. We used a sensitivity analysis method to identify the useful continuous parameters. Once these were found, we fixed the other parameters and let a derivative-free genetic optimization algorithm adjust the useful ones in order to obtain the best solutions within a given time limit. The optimized configuration can then be used to solve larger instances of the same problem type.
Baryonyx achieved competitive results compared with state-of-the-art exact and approximate solvers on crew and bus-driver scheduling problems expressed as set partitioning problems. The results are less convincing on MFVCS, but baryonyx still produces valid solutions on large instances
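The kind of binary allocation model described above can be illustrated on a toy grid. Every cell is assigned a tree or a vegetable crop, adjacent tree-vegetable pairs score a facilitation bonus, and adjacent tree-tree pairs pay a competition penalty. The weights, grid size, and brute-force enumeration are all invented for illustration; the thesis's real instances are far larger and are solved with MILP/01LP solvers and the baryonyx heuristic, not by enumeration.

```python
from itertools import product

# Toy 3x3 grid: x[r][c] = 1 plants a fruit tree in that cell, 0 a vegetable.
N = 3
FACILITATION = 2.0   # tree next to a vegetable: assumed shade/nutrient benefit
COMPETITION = -3.0   # tree next to a tree: assumed light/water competition

def neighbours(r, c):
    # Yield the right and down neighbours so each adjacent pair counts once.
    for dr, dc in ((1, 0), (0, 1)):
        if r + dr < N and c + dc < N:
            yield r + dr, c + dc

def score(x):
    total = 0.0
    for r, c in product(range(N), repeat=2):
        for r2, c2 in neighbours(r, c):
            a, b = x[r][c], x[r2][c2]
            if a != b:
                total += FACILITATION   # mixed pair: facilitation
            elif a == 1:
                total += COMPETITION    # tree-tree pair: competition
    return total

# Brute-force the 2^9 binary assignments (only viable at toy scale).
best = max(
    (tuple(tuple(bits[r * N:(r + 1) * N]) for r in range(N))
     for bits in product((0, 1), repeat=N * N)),
    key=score,
)
```

With these weights the optimum is a checkerboard-like layout in which no two trees are adjacent, which is exactly the facilitation-maximising, competition-minimising trade-off the abstract describes.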
Brassard, Serge. "Méthodologie et modélisation floues des connaissances dans l'activité de conception en électrotechnique : application à la réalisation d'un système expert d'aide à la conception de l'appareillage électrique." Grenoble INPG, 1989. http://www.theses.fr/1989INPG0093.
Full textGruza, Jan. "Modélisation de la flexibilité des glycannes complexes en solution et en interaction avec des protéines." Grenoble 1, 1998. http://www.theses.fr/1998GRE10158.
Full textInamdar, Jaydeep Vijay. "Performance evaluation of greedy heuristic for SIP analyzer in H.264/SVC." 2008. http://hdl.handle.net/10106/922.
Full textKang, Shih-Huang, and 康詩凰. "Back analyses of earthdam seepage problems using heuristic optimization method." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/97563619349681806887.
Full textNational Chiao Tung University
Department of Civil Engineering
99
Embankment dams are the most common type of dam in Taiwan. Besides dam failure caused by overflow erosion, piping due to seepage is one of the major causes of embankment dam breaches. Piping failure tends to occur unexpectedly and catastrophically. A proper program of seepage and water-pressure monitoring can help reduce the chance of piping failure. When unusual monitoring data appear, an appropriate diagnosis is essential to identify the real cause and resolve the problem in time. This thesis proposes a diagnosis procedure based on back analyses of earth-dam seepage problems using a heuristic optimization method, harmony search. The procedure incorporates the harmony search algorithm into MATLAB as the optimization server and the commercial numerical simulation tool FLAC as the simulation engine, so that the simulated results match the monitoring data. A common data file serves as the interface between MATLAB and FLAC. Two earth-dam seepage problems are used to demonstrate the feasibility and applicability of the proposed diagnosis procedure.
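The back-analysis loop can be sketched as follows: harmony search proposes a parameter value, a forward model computes hydraulic heads, and the misfit against monitored readings drives the search. Here a hypothetical one-parameter analytic head formula stands in for the FLAC simulation, and synthetic piezometer readings stand in for monitoring data; all names and values are assumptions, not the thesis's MATLAB-FLAC setup.

```python
import random

def simulate_head(k, x):
    # Stand-in for the FLAC seepage run: hydraulic head at distance x
    # along a dam section, parameterised by a conductivity ratio k.
    return 10.0 / (1.0 + k * x)

# "Monitored" piezometer readings, generated with a true k of 0.8.
XS = [0.5, 1.0, 1.5, 2.0]
OBS = [simulate_head(0.8, x) for x in XS]

def misfit(k):
    # Sum of squared differences between simulated and monitored heads.
    return sum((simulate_head(k, x) - h) ** 2 for x, h in zip(XS, OBS))

def harmony_search(lo=0.01, hi=5.0, hms=10, hmcr=0.9, par=0.3,
                   bw=0.05, iters=500, seed=3):
    rng = random.Random(seed)
    memory = [rng.uniform(lo, hi) for _ in range(hms)]   # harmony memory
    for _ in range(iters):
        if rng.random() < hmcr:                 # consider harmony memory
            k = rng.choice(memory)
            if rng.random() < par:              # pitch adjustment
                k = min(hi, max(lo, k + rng.uniform(-bw, bw)))
        else:                                   # random consideration
            k = rng.uniform(lo, hi)
        worst = max(memory, key=misfit)
        if misfit(k) < misfit(worst):           # replace the worst harmony
            memory[memory.index(worst)] = k
    return min(memory, key=misfit)

k_est = harmony_search()   # should recover a value close to the true k
```

In the thesis's actual procedure, `simulate_head` would be replaced by a FLAC run triggered through the shared data file, with MATLAB orchestrating the search.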
Pietrocatelli, Simon. "Analyse bayésienne et élicitation d’opinions d’experts en analyse de risques et particulièrement dans le cas de l’amiante chrysotile." Thèse, 2008. http://hdl.handle.net/1866/3345.
Full textCharacterizing the carcinogenic potency of chrysotile asbestos fibres relies a great deal on subjective and uncertain judgements by experts and analysts, given the heterogeneous and equivocal results of key epidemiological and toxicological studies. The probabilistic Bayesian approach to risk assessment quantifies these subjective judgements and their uncertainties, along with their impact on risk estimates, but it is rarely used in the public-health context. This report examines how the Bayesian approach could have been applied to a recent elicitation of experts' opinions intended to estimate the toxicity of chrysotile asbestos, as well as the degree of convergence, divergence, and uncertainty among the experts. The experts' estimates of the relative toxicity of chrysotile and amphibole asbestos were similar in the case of mesothelioma. In the case of lung cancer, however, the heterogeneity of the studies resulted in diverging and incompatible probabilistic evaluations. The experts' judgements appeared to be influenced by heuristic biases, particularly the affect and anchoring heuristics, associated with a controversial topic and heterogeneous data. Had the elicitation process been prepared following a rigorous methodology, these heuristics and biases could have been mitigated.
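As an illustration of how a Bayesian treatment can quantify elicited opinions, the sketch below pools hypothetical normal expert judgements on the log relative potency by precision weighting. This is one simple pooling convention that assumes independent experts; the numbers, and the choice of pooling rule, are invented for illustration and are not taken from the report.

```python
import math

# Hypothetical elicited opinions: each expert states a central estimate and
# a 95% interval half-width for the log10 relative potency of chrysotile
# versus amphiboles (all values invented for illustration).
experts = [
    {"mean": -1.0, "half_width": 0.5},   # ~10x less potent, fairly confident
    {"mean": -0.3, "half_width": 0.8},
    {"mean":  0.0, "half_width": 1.2},   # roughly equipotent, very uncertain
]

def pooled_normal(experts):
    # Precision-weighted (conjugate-style) pooling of normal opinions.
    # The sd is recovered from the 95% half-width: hw = 1.96 * sd.
    weights, means = [], []
    for e in experts:
        sd = e["half_width"] / 1.96
        weights.append(1.0 / sd ** 2)    # precision = 1 / variance
        means.append(e["mean"])
    precision = sum(weights)
    mean = sum(w * m for w, m in zip(weights, means)) / precision
    return mean, math.sqrt(1.0 / precision)

mean, sd = pooled_normal(experts)
# The pooled estimate leans toward the most confident expert, and the
# pooled uncertainty is narrower than any single expert's interval.
```

This makes visible exactly what the report argues: divergent opinions and their stated uncertainties can be combined into an explicit quantitative estimate rather than being reconciled informally.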
Müller, Johannes. "Wertbasierte Portfolio-Optimierung bei Software-Produktlinien: Value-based Portfolio-Optimization of Software Product Lines: Modell, Vorgehen, Umsetzung." Doctoral thesis, 2011. https://ul.qucosa.de/id/qucosa%3A11342.
Full text