
Dissertations / Theses on the topic 'Entropy maximization'


Listed below are the top 18 dissertations and theses on the topic 'Entropy maximization.'


1

Bæcklund, Anna. "Maximization of the Wehrl Entropy in Finite Dimensions." Thesis, KTH, Fysik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-120594.

2

Glocer, Karen A. "Entropy regularization and soft margin maximization." Diss., Digital Dissertations Database. Restricted to UC campuses, 2009. http://uclibs.org/PID/11984.

3

Garvey, Jennie Hill. "Independent component analysis by entropy maximization (infomax)." Thesis, Monterey, Calif. : Naval Postgraduate School, 2007. http://bosun.nps.edu/uhtbin/hyperion-image.exe/07Jun%5FGarvey.pdf.

Abstract:
Thesis (M.S. in Electrical Engineering), Naval Postgraduate School, June 2007. Thesis advisor: Frank E. Kragh. Includes bibliographical references (p. 103). Also available in print.
4

Zhao, Mansuo. "Image Thresholding Technique Based On Fuzzy Partition And Entropy Maximization." University of Sydney. School of Electrical and Information Engineering, 2005. http://hdl.handle.net/2123/699.

Abstract:
Thresholding is a commonly used technique in image segmentation because it is fast and easy to apply, which makes threshold selection an important issue. There are two general approaches to threshold selection: one is based on the histogram of the image, while the other is based on the gray-scale information in small local areas. The histogram of an image contains statistical data about its grayscale or color components.

In this thesis, an adaptive logical thresholding method is first proposed for the binarization of blueprint images. The new method exploits the geometric features of blueprint images. It is implemented through a robust window operation, based on the assumption that objects have a "C" shape within a small area. Multiple window sizes are used in the window operation, which not only reduces computation time but also effectively separates thin lines from wide lines. The method determines the threshold of an image automatically, and experiments show that it is effective for blueprint images and achieves good results over a wide range of images.

Second, fuzzy set theory, together with probability partition and maximum-entropy theory, is explored to compute the threshold from the histogram of the image. Fuzzy set theory has been widely used in fields where ambiguous phenomena exist since it was proposed by Zadeh in 1965, and many thresholding methods have been developed from it. The concept used here is the fuzzy partition: since the method is histogram-based, the histogram is divided into several groups by fuzzy sets that represent the fuzzy membership of each group. Probability partition is associated with fuzzy partition, and the probability distribution of each group is derived from the fuzzy partition. Entropy, which originates in thermodynamics, was introduced into communication theory as a criterion for measuring the information transmitted through a channel, and has been adopted in image processing as a measure of the information contained in an image. It is therefore applied here as the criterion for selecting the optimal fuzzy sets that partition the histogram. To find the threshold, the histogram of the image is partitioned by fuzzy sets that satisfy an entropy restriction. The search for the best possible fuzzy sets then becomes the central issue: without an efficient search procedure, extension to multilevel thresholding with fuzzy partition becomes extremely time-consuming or even impossible. In this thesis, the relationship between a probability partition (PP) and a fuzzy C-partition (FP) is studied. This relationship and the entropy approach are used to derive a thresholding technique that selects the optimal fuzzy C-partition, with selection quality measured by the entropy function defined by the PP and FP. A necessary condition for the entropy function to reach a maximum is derived; based on this condition, an efficient search procedure for two-level thresholding is obtained, making the search efficient enough that extension to multilevel thresholding becomes possible. A novel fuzzy membership function is proposed for three-level thresholding, which produces a better result because a new relationship among the fuzzy membership functions is introduced.

This new relationship gives more flexibility in the search for the optimal fuzzy sets, although it also complicates the search in multilevel thresholding. This complication is resolved by a new method called the "Onion-Peeling" method. Because the relationship between the fuzzy membership functions is so complicated, it is impossible to obtain all the membership functions at once; the search procedure is therefore decomposed into several layers of three-level partitions, except possibly the last layer, which may be a two-level one. The large problem is thus reduced to three-level partitions, so the two outermost membership functions can be obtained without worrying about the complicated intersections among the membership functions. The method is further revised for images with a dominant background area or object that distorts the appearance of the histogram. The histogram is the basis of this method, as of many others, and a badly shaped histogram leads to a badly thresholded image. A quadtree scheme is therefore adopted to decompose the image into homogeneous and heterogeneous areas, and a multi-resolution thresholding method based on the quadtree and fuzzy partition is devised to deal with such images. Extension of fuzzy partition methods to color images is also examined: an adaptive thresholding method for color images based on fuzzy partition is proposed that can determine the number of thresholding levels automatically.

This thesis concludes that the "C"-shape assumption and varying window sizes in the window operation contribute to better segmentation of blueprint images. The efficient search procedure for the optimal fuzzy sets in the fuzzy-2 partition of the histogram accelerates the process enough to enable its extension to multilevel thresholding. In the three-level fuzzy partition, the new relationship among the three fuzzy membership functions is better founded than the conventional assumption and, as a result, performs better. The "Onion-Peeling" method deals with the complexity at the intersections among multiple membership functions in the multilevel fuzzy partition: it decomposes the multilevel partition into fuzzy-3 and fuzzy-2 partitions by transposing the partition space in the histogram, and is thus efficient in multilevel thresholding. A multi-resolution method applying the quadtree scheme to distinguish heterogeneous from homogeneous areas is designed for images with large homogeneous areas, which usually distort the histogram; the new histogram, based only on the heterogeneous areas, is adopted for the partition and outperforms the old one. Validity checks filter out fragmented points, which form only a small portion of the whole image, yielding good thresholded images for human face images.
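To make the entropy criterion concrete, here is a minimal sketch of maximum-entropy threshold selection over a fuzzy 2-partition of a gray-level histogram. It assumes a simple linear (Z-shaped) membership with crossover parameters a < c, and substitutes a brute-force scan for the thesis's efficient search procedure.

```python
import numpy as np

def fuzzy2_threshold(hist):
    """Maximum-entropy threshold over a fuzzy 2-partition of a histogram.

    Brute-force scan over the membership parameters (a, c); the thesis's
    necessary condition is what removes the need for this scan.
    """
    p = hist.astype(float) / hist.sum()          # gray-level probabilities
    levels = np.arange(len(p))
    best_h, best_ac = -np.inf, None
    for a in range(len(p) - 1):
        for c in range(a + 1, len(p)):
            # Linear (Z-shaped) membership of the dark class: 1 below a, 0 above c
            mu = np.clip((c - levels) / (c - a), 0.0, 1.0)
            pd = float(np.sum(p * mu))           # probability of the dark class
            pb = 1.0 - pd                        # probability of the bright class
            if pd <= 0.0 or pb <= 0.0:
                continue
            h = -pd * np.log(pd) - pb * np.log(pb)   # partition entropy
            if h > best_h:
                best_h, best_ac = h, (a, c)
    a, c = best_ac
    return (a + c) // 2                          # crossover point as the threshold
```

With 256 gray levels this scan is quadratic in the number of levels; avoiding exactly this cost is what makes the thesis's search procedure extensible to multilevel thresholding.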
5

Zhao, Mansuo. "Image Thresholding Technique Based On Fuzzy Partition And Entropy Maximization." Thesis, The University of Sydney, 2004. http://hdl.handle.net/2123/699.

6

Khabou, Mohamed Ali. "Improving shared weight neural networks generalization using regularization theory and entropy maximization." Free to MU campus, to others for purchase, 1999. http://wwwlib.umi.com/cr/mo/fullcit?p9953870.

7

Fabro, Adriano Todorovic. "Análise estocástica do comportamento dinâmico de estruturas via métodos probabilísticos." [s.n.], 2010. http://repositorio.unicamp.br/jspui/handle/REPOSIP/265418.

Abstract:
Advisor: José Roberto de França Arruda. Master's dissertation, Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica.
This dissertation aims to provide the industrial community with tools for modeling and analyzing linear mechanical systems with variability, as well as computational methodologies for uncertainty quantification, for use in design. Theoretical studies of stochastic modeling and analysis of linear mechanical systems were carried out and applied first to simple, computationally inexpensive structures using MATLAB. A probabilistic modeling approach based on the Maximum Entropy Principle is proposed to treat the flexibility associated with an open, non-propagating crack in a rod modeled with the Spectral Element Method (SEM). An approach for treating random field problems with SEM, using analytical solutions of the Karhunen-Loève decomposition, is also addressed: an Euler-Bernoulli beam formulation is presented, with an example in which the flexural stiffness is modeled as a Gaussian random field. Finally, an approach for the stochastic analysis of the dynamic behaviour of a hermetic compressor cap is proposed. A finite element approximation obtained with the software Ansys is used to represent the deterministic dynamic behaviour of the cap, and two stochastic modeling approaches are compared. Experiments were performed on nominally identical cap samples, measuring natural frequencies under impact excitation for comparison with the theoretical results.
Master's degree in Mechanical Engineering, concentration in Solid Mechanics and Mechanical Design.
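As an illustration of the random-field machinery the abstract describes, the sketch below draws one sample of a beam's flexural stiffness modeled as a Gaussian random field via a truncated Karhunen-Loève expansion. It uses a numerical eigendecomposition of an assumed exponential covariance on a uniform grid rather than the dissertation's analytical SEM solutions, and all parameter values are illustrative.

```python
import numpy as np

def kl_gaussian_field(x, mean, std, corr_len, n_terms, rng):
    """Sample a 1-D Gaussian random field from a truncated Karhunen-Loeve
    expansion, using a discrete eigendecomposition of the covariance matrix
    as a numerical stand-in for the analytical eigenfunctions."""
    C = std**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    eigval, eigvec = np.linalg.eigh(C)
    order = np.argsort(eigval)[::-1][:n_terms]   # keep the dominant modes
    lam, phi = eigval[order], eigvec[:, order]
    xi = rng.standard_normal(n_terms)            # independent N(0, 1) weights
    return mean + phi @ (np.sqrt(np.maximum(lam, 0.0)) * xi)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)                   # beam axis [m]
# One realisation of the flexural stiffness EI(x) (illustrative values)
EI = kl_gaussian_field(x, mean=2.0e4, std=2.0e3, corr_len=0.3, n_terms=10, rng=rng)
```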
8

Medeiros, Richerland Pinto [UNESP]. "Inferência de emoções em fragmentos de textos obtidos do Facebook." Universidade Estadual Paulista (UNESP), 2017. http://hdl.handle.net/11449/150974.

Abstract:
This research analyzes the use of the maximum-entropy machine learning technique for natural language processing tasks, specifically the inference of emotions in short texts from the Facebook social network. The core concepts of natural language processing and information theory were studied, together with a deeper treatment of the entropy-based model as a text classifier. The data consist of short texts of at most 500 characters collected from the social network. The technique was applied in a supervised setting: part of the collected data was labeled with a set of predefined classes in order to train the learner to select the most probable emotion class for a given sample. The proposed method achieved a mean accuracy of 90% under cross-validation.
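A maximum-entropy classifier with feature-count constraints is equivalent to multinomial logistic regression, so a minimal stand-in for the pipeline described above can be sketched with scikit-learn; the toy snippets and emotion labels below are hypothetical, not the thesis corpus.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled snippets standing in for the Facebook corpus
texts = ["I love this so much", "this is terrible news",
         "what a wonderful day", "I am so angry right now",
         "feeling great today", "this makes me so sad"]
labels = ["joy", "anger", "joy", "anger", "joy", "sadness"]

# Multinomial logistic regression is the maximum-entropy classifier
model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, labels)
print(model.predict(["so happy with the result"]))   # e.g. ['joy']
```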
9

Hatefi, Armin. "Mixture model analysis with rank-based samples." Statistica Sinica, 2013. http://hdl.handle.net/1993/23849.

Abstract:
Simple random sampling (SRS) is the most commonly used sampling design in data collection. In many applications (e.g., in fisheries and medical research), quantifying the variable of interest is either time-consuming or expensive, but ranking a number of sampling units, without actually measuring them, can be done relatively easily and at low cost. In these situations, one may use rank-based sampling (RBS) designs to obtain more representative samples from the underlying population and improve the efficiency of the statistical inference. In this thesis, we study the theory and application of finite mixture models (FMMs) under RBS designs. In Chapter 2, we study the problems of maximum likelihood (ML) estimation and classification in a general class of FMMs under different ranked set sampling (RSS) designs. In Chapter 3, deriving the Fisher information (FI) content of different RSS data structures, including complete and incomplete RSS data, we show that the FI contained in each variation of RSS data about different features of FMMs is larger than the FI contained in their SRS counterparts. There are situations where it is difficult to rank all the sampling units in a set with high confidence; forcing rankers to assign unique ranks to the units (as in RSS) can lead to substantial ranking error and consequently to poor statistical inference. We hence focus on the partially rank-ordered set (PROS) sampling design, which aims to reduce the ranking error and the burden on rankers by allowing them to declare ties (partially ordered subsets) among the sampling units. Studying the information and uncertainty structures of PROS data in a general class of distributions, in Chapter 4 we show the superiority of the PROS design in data analysis over the RSS and SRS schemes. In Chapter 5, we also investigate the ML estimation and classification problems of FMMs under the PROS design. Finally, we apply our results to estimate the age structure of a short-lived fish species from length-frequency data, using the SRS, RSS and PROS designs.
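For readers unfamiliar with RSS, the following sketch shows balanced ranked set sampling under the idealized assumption of perfect ranking: each drawn set of k units contributes only the order statistic matching its set index.

```python
import numpy as np

def ranked_set_sample(population, k, cycles, rng):
    """Balanced RSS: in each cycle, set i contributes its i-th order statistic."""
    sample = []
    for _ in range(cycles):
        for i in range(k):
            s = rng.choice(population, size=k, replace=False)  # draw one set
            sample.append(np.sort(s)[i])   # perfect ranking: keep rank i + 1
    return np.array(sample)

rng = np.random.default_rng(1)
pop = rng.normal(10.0, 2.0, size=10_000)
rss = ranked_set_sample(pop, k=3, cycles=5, rng=rng)   # 15 measured units
```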
10

Moreau, Ludovic. "A Contribution in Stochastic Control Applied to Finance and Insurance." PhD thesis, Université Paris Dauphine - Paris IX, 2012. http://tel.archives-ouvertes.fr/tel-00737624.

Abstract:
The aim of this thesis is to contribute to the problem of pricing derivatives in incomplete markets. We first consider the stochastic targets introduced by Soner and Touzi (2002) to treat the super-replication problem, and recently extended to more general settings by Bouchard, Elie and Touzi (2009). We generalize the work of Bouchard et al. to a framework in which the diffusions are subject to jumps. In this case we must consider controls that take the form of unbounded functions, which has a non-trivial impact on the derivation of the corresponding PDEs. Our second contribution is to establish a version of stochastic targets that is robust to model uncertainty. In an abstract framework, we establish a weak version of the geometric dynamic programming principle of Soner and Touzi (2002), and for controlled SDEs we derive the corresponding partial differential equation in the viscosity sense. We then consider an example of partial hedging under Knightian uncertainty. Finally, we focus on the pricing of hybrid derivatives (products combining market finance and insurance). In particular, we seek a sufficient condition under which a pricing rule popular in industry, which combines the actuarial mutualization approach with an arbitrage approach, is valid.
11

Enqvist, Per. "Spectral Estimation by Geometric, Topological and Optimization Methods." Doctoral thesis, Stockholm, 2001. http://media.lib.kth.se:8080/kthdisseng.html.

12

Sharify, Meisam. "Algorithmes de mise à l'échelle et méthodes tropicales en analyse numérique matricielle." PhD thesis, Ecole Polytechnique X, 2011. http://pastel.archives-ouvertes.fr/pastel-00643836.

Abstract:
Tropical algebra can be considered a relatively new field of mathematics. It appears in several domains such as optimization, synchronization of production and transport, discrete event systems, optimal control, operations research, etc. The first part of this manuscript is devoted to applications of tropical algebra to numerical matrix analysis. We first consider the classical problem of estimating the roots of a univariate polynomial, and prove several new bounds on the absolute values of the roots by exploiting tropical methods. These results are particularly useful for polynomials whose coefficients have different orders of magnitude. We then examine the problem of computing the eigenvalues of a matrix polynomial. Here we introduce a general scaling technique, based on tropical algebra, which applies in particular to the companion form. This scaling rests on the construction of an auxiliary tropical polynomial function depending only on the norms of the matrices. The roots (points of non-differentiability) of this tropical polynomial provide a prior estimate of the absolute values of the eigenvalues. This is justified in particular by a new result showing that, under certain assumptions on the conditioning, there exists a group of eigenvalues bounded in norm, with the order of magnitude of the bounds given by the largest root of the auxiliary tropical polynomial; a similar result holds for a group of small eigenvalues. We show experimentally that this scaling improves numerical stability, in particular when the data have different orders of magnitude. We also study the computation of the tropical eigenvalues (the points of non-differentiability of the characteristic polynomial) of a tropical matrix polynomial. From the combinatorial point of view, this problem is equivalent to finding the value function of a maximum-weight matching in a bipartite graph whose arcs are weighted by convex piecewise-linear functions. We develop an algorithm that computes these tropical eigenvalues in polynomial time.

In the second part of this thesis, we are interested in solving very large optimal assignment problems, for which classical sequential algorithms are not efficient. We propose a new approach exploiting the link between the optimal assignment problem and the entropy maximization problem. This approach leads to a preprocessing algorithm for the optimal assignment problem, based on an iterative method that eliminates entries not belonging to any optimal assignment. We consider two iterative variants of the preprocessing algorithm, one using the Sinkhorn method and the other Newton's method. This preprocessing reduces the initial problem to a much smaller one in terms of memory requirements. We also introduce a new iterative method based on a modification of the Sinkhorn algorithm, in which a deformation parameter is slowly increased. We prove that this iterative method (the deformed Sinkhorn iteration) converges to a matrix whose nonzero entries are exactly those belonging to the optimal permutations. An estimate of the convergence rate is also given.
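The link between optimal assignment and entropy maximization can be illustrated with a small sketch of the deformed Sinkhorn idea: scaling exp(λC) to doubly stochastic form solves an entropy-penalized relaxation of the assignment problem, and as the deformation parameter λ grows, the scaled matrix concentrates on the support of the optimal permutations. Unlike the thesis's method, this sketch simply restarts at each λ rather than increasing it slowly.

```python
import numpy as np

def deformed_sinkhorn(C, lam, n_iter=500):
    """Sinkhorn-scale exp(lam * C) toward a doubly stochastic matrix."""
    M = np.exp(lam * (C - C.max()))            # shift for numerical stability
    for _ in range(n_iter):
        M /= M.sum(axis=1, keepdims=True)      # normalise rows
        M /= M.sum(axis=0, keepdims=True)      # normalise columns
    return M

rng = np.random.default_rng(2)
C = rng.random((5, 5))                         # assignment profits (maximisation)
for lam in (1.0, 10.0, 100.0):
    P = deformed_sinkhorn(C, lam)
# For large lam, the nonzero pattern of P singles out the optimal permutation(s)
print(np.round(P, 3))
```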
13

Tidlund, Matthias, and Stavros Angelis. "Optimal steering control input generation for vehicle's entry speed maximization in a double-lane change manoeuvre." Thesis, KTH, Fordonsdynamik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-140946.

Abstract:
In an effort to reduce physical testing during the development of a new vehicle, the automotive industry develops methods that recreate physical testing scenarios in virtual environments using simulation software. This thesis aims to develop a method for evaluating a vehicle's dynamic properties without subjecting it to physical testing. The goal is a tool that can be used by industry in an early development phase and that allows modifications and calibration to take place. A vehicle model and an electronic stability control (ESC) implementation are built, and the model's performance in an ISO 3888 part 2 double-lane-change test is evaluated. Since the handling potential of the vehicle is rated by its entry speed in that test, the model was subjected to an optimization process in which its steering action was controlled to achieve the highest possible entry speed, in order to isolate the vehicle's dynamic potential from the influence of a human driver. The vehicle modelling proceeds in steps, from a simple linear bicycle model to a more complex four-wheel model including roll, tire relaxation and suspension compliance, together with a simplified ESC implementation. The optimized steering inputs were then tested on a test track, where the correspondence of the model to the real vehicle was evaluated. By further developing the vehicle dynamics modelling, this tool can support more testing scenarios and options, and it can serve as a step toward reducing physical testing when a vehicle's dynamic and handling performance needs to be studied and evaluated.
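The thesis's models are not reproduced here, but the linear (single-track) bicycle model it starts from is standard; the sketch below integrates it under an assumed sinusoidal steering input, with illustrative parameter values.

```python
import numpy as np

# Illustrative parameters (not from the thesis)
m, Iz = 1500.0, 2500.0          # mass [kg], yaw inertia [kg m^2]
a, b = 1.2, 1.4                 # CG to front / rear axle [m]
Cf, Cr = 8.0e4, 8.0e4           # cornering stiffnesses [N/rad]
u = 20.0                        # forward speed [m/s]

def deriv(state, delta):
    """Linear single-track model: lateral velocity v and yaw rate r."""
    v, r = state
    alpha_f = (v + a * r) / u - delta          # front slip angle
    alpha_r = (v - b * r) / u                  # rear slip angle
    Fyf, Fyr = -Cf * alpha_f, -Cr * alpha_r    # linear tire forces
    return np.array([(Fyf + Fyr) / m - u * r,      # v_dot
                     (a * Fyf - b * Fyr) / Iz])    # r_dot

# Forward-Euler integration through a sinusoidal steer resembling a lane change
dt, state = 0.001, np.array([0.0, 0.0])
for k in range(int(3.0 / dt)):
    delta = 0.05 * np.sin(2.0 * np.pi * k * dt / 3.0)   # steer angle [rad]
    state = state + dt * deriv(state, delta)
```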
14

Krishna, Charu. "Image Enhancement Using Entropy Maximization." Thesis, 2016. http://dspace.dtu.ac.in:8080/jspui/handle/repository/15093.

Abstract:
Image enhancement is one of the most interesting domains of image processing, and contrast enhancement is one of its most important techniques. Among the most widely used contrast-enhancement techniques is histogram equalization (HE), which enhances the contrast of the input image by remapping intensity levels based on the image's probability distribution function; it finds application in many fields, such as medical image processing. In general, HE flattens the histogram of the image, enhancing contrast and stretching the dynamic range. Although HE has low computational cost and high performance, it is rarely used directly in electronic appliances, because straightforward application changes the original brightness of the image and can produce a distorted result; histogram-equalized images may also exhibit annoying artifacts and noise. Many contrast-enhancement techniques were therefore developed to overcome the defects of HE, including brightness-preserving bi-histogram equalization (BBHE), dualistic sub-image histogram equalization (DSIHE), minimum mean brightness error bi-histogram equalization (MMBEBHE), recursive mean-separate histogram equalization (RMSHE), and the entropy-maximization histogram modification technique (EMHM). The proposed algorithm shows better results than these algorithms.
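For reference, plain histogram equalization as described above (the baseline, not the proposed algorithm) amounts to remapping intensities through the empirical CDF; a minimal NumPy sketch:

```python
import numpy as np

def equalize(img):
    """Classical histogram equalisation of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size                  # empirical CDF
    lut = np.round(255.0 * cdf).astype(np.uint8)    # intensity mapping
    return lut[img]

img = (np.random.default_rng(3).random((64, 64)) ** 2 * 255).astype(np.uint8)
out = equalize(img)   # flatter histogram, stretched dynamic range
```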
15

Ilunga, Masengo. "Hydrological data interpolation using entropy." Thesis, 2006. http://hdl.handle.net/10539/1826.

Abstract:
The problem of missing data, insufficient length of hydrological data series, and poor quality is common in developing countries, much more so than in developed countries. This situation can severely affect the outcome of water systems managers' decisions (e.g. reliability of a design, establishment of operating policies for water supply), and numerous data interpolation (infilling) techniques have evolved in hydrology to deal with missing data. The current study presents a methodology for coping with missing (limited) hydrological data that combines the theories of entropy, artificial neural networks (ANN) and expectation-maximization (EM); the methodology is formulated into a model named ENANNEX. The study does not use any physical characteristics of the catchment areas, but deals only with the limited information (e.g. streamflow or rainfall) at the target gauge and its similar nearby base gauge(s). The entropy concept was confirmed to be a versatile tool: it was used first for quantifying the information content of hydrological variables (e.g. rainfall or streamflow); the same concept, through the directional information transfer (DIT) index, was used in the selection of base/subject gauges; and the DIT notion was finally extended to evaluating the performance of the data-infilling techniques (ANN and EM). The methodology was applied to annual total rainfall, annual mean flow series, annual maximum flows and 6-month flow series (means) of selected catchments in drainage region D "Orange" of South Africa; these data regimes can be regarded as useful for design-oriented studies, flood studies, water balance studies, etc. The results from the case studies showed that the DIT is as good an index for selecting a data-infilling technique as other, statistical and graphical, criteria, with the added feature of being a non-dimensional informational index. The data-interpolation techniques, viz. ANNs and EM (existing methods both applied and not yet applied in hydrology), and their new features are also presented. The study showed that the standard techniques (e.g. backpropagation, BP, and EM) as well as their respective variants can be selected in the missing hydrological-data estimation process; the capability of the different techniques to maintain the statistical characteristics (e.g. mean, variance) of the target gauge was not neglected. The relationship between the accuracy of the estimated series (obtained by applying a data-infilling technique) and the gap duration was then investigated through the DIT notion, and it was shown that a decay (power or exponential) function can describe that relationship: the amount of uncertainty removed from the target station in a station pair, via a given technique, can be known for a given gap duration. The performance of the different techniques depends on the gap duration at the target gauge, the station pair involved in the estimation, and the type of data regime. The study also showed that it is possible, through the entropy approach, to make a preliminary assessment of model performance for simulating runoff data at a site where absolutely no record exists: a case study was conducted at the Bedford site (South Africa), where two simulation models, RAFLER and WRSM2000, were assessed in this respect and both were found suitable for simulating flows at Bedford.
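A common definition of the directional information transfer index, and the one assumed in this sketch, is the transinformation (mutual information) normalized by the entropy of the receiving series, DIT = T(X;Y)/H(X), estimated from binned data:

```python
import numpy as np

def dit(x, y, bins=10):
    """Directional information transfer from y to x, taken here as
    T(x; y) / H(x) with entropies estimated from binned series."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log(pxy[pxy > 0]))
    return (hx + hy - hxy) / hx        # transinformation over entropy

rng = np.random.default_rng(4)
base = rng.gamma(2.0, 50.0, 500)                  # synthetic base-gauge flows
target = 0.8 * base + rng.normal(0.0, 10.0, 500)  # correlated target gauge
print(dit(target, base))   # fraction of the target's uncertainty removed
```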
16

鄭坤成. "Through the Study of Three Quantum System: To Investigate Time-Independent Schrodinger Equation and Information Entropy Maximization." Thesis, 1997. http://ndltd.ncl.edu.tw/handle/31361403337331581376.

Abstract:
Master's thesis, National Chung Hsing University, Department of Chemistry (ROC academic year 85, i.e. 1996-97).
This is an analysis of the relationships between the time-independent Schrödinger equation and information entropy maximization. We first compute entropies of various states of three simple quantum systems, namely the particle in a box, the harmonic oscillator, and the hydrogen atom, for re-examination, and find that continuous entropy is by nature a relative concept. Using the potential equivalent theorem as a constraint, the maximum-entropy procedure yields the ground-state Schrödinger equation of these systems. We also find that there exists a variational procedure, involving entropy maximization, for obtaining all excited-state solutions once one solution is known. In the light of information theory, the ensemble concept of statistical thermodynamics is helpful for understanding microscopic quantum systems and many quantum mechanical concepts.
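The relativity of continuous entropy is easy to reproduce numerically. The sketch below computes the position-space entropy S = -∫|ψ|² ln|ψ|² dx of the harmonic-oscillator ground state in assumed units ħ = m = ω = 1; rescaling the coordinate shifts S by a constant, so the value is meaningful only relative to a choice of scale.

```python
import numpy as np

# Harmonic-oscillator ground state in assumed units hbar = m = omega = 1
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
psi2 = np.exp(-x**2) / np.sqrt(np.pi)     # |psi_0(x)|^2, a N(0, 1/2) density
S = -np.sum(psi2 * np.log(psi2)) * dx     # position-space (differential) entropy
print(S, 0.5 * np.log(np.pi * np.e))      # numeric vs closed form, both ~1.0724

# Rescaling x -> s * x shifts S by ln(s): the continuous entropy is defined
# only up to the choice of coordinate scale, i.e. it is a relative quantity.
```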
17

Adelani, Titus Olufemi. "An Evaluation of Traffic Matrix Estimation Techniques for Large-Scale IP Networks." 2010. http://hdl.handle.net/1993/3869.

Abstract:
The information on the volume of traffic flowing between all possible origin and destination pairs in an IP network during a given period of time is generally referred to as the traffic matrix (TM). This information, which is very important for various traffic-engineering tasks, is costly and difficult to obtain on a large operational IP network; consequently, it is often inferred from readily available link-load measurements. In this thesis, we evaluated five TM estimation techniques, namely Tomogravity (TG), Entropy Maximization (EM), Quadratic Programming (QP), Linear Programming (LP) and Neural Network (NN), with gravity and worst-case-bound (WCB) initial estimates. We found that the EM technique consistently performed best in most of our simulations and that the gravity model yielded better initial estimates than the WCB model. A hybrid of these techniques did not result in a considerable decrease in estimation errors. We achieved the most significant reduction in errors, however, by combining iteratively proportionally-fitted estimates with the EM technique, and we therefore propose this combination as a viable approach for estimating the traffic matrix of large-scale IP networks.
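As a flavor of the estimators compared, the sketch below forms the gravity-model initial estimate and then applies iterative proportional fitting to measured per-node totals. The traffic figures are hypothetical, and the full techniques additionally enforce link-load (routing) constraints, which are omitted here.

```python
import numpy as np

def gravity_estimate(node_out, node_in):
    """Gravity model: traffic i -> j proportional to out_i * in_j."""
    return np.outer(node_out, node_in) / node_out.sum()

def ipf(T, row_targets, col_targets, n_iter=100):
    """Iterative proportional fitting to measured per-node totals."""
    for _ in range(n_iter):
        T *= (row_targets / T.sum(axis=1))[:, None]
        T *= (col_targets / T.sum(axis=0))[None, :]
    return T

out_bytes = np.array([120.0, 80.0, 50.0])   # hypothetical egress per node
in_bytes = np.array([100.0, 90.0, 60.0])    # hypothetical ingress per node
T = ipf(gravity_estimate(out_bytes, in_bytes), out_bytes, in_bytes)
```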
18

Miskeen, Guzlan M. A., Demetres D. Kouvatsos, and Zadeh Esmaeil Habib. "An Exposition of Performance-Security Trade-offs in RANETs Based on Quantitative Network Models." 2013. http://hdl.handle.net/10454/9692.

Abstract:
Security mechanisms, such as encryption and authentication protocols, require extra computing resources and therefore, have an adverse effect upon the performance of robotic mobile wireless ad hoc networks (RANETs). Thus, an optimal performance and security trade-off should be one of the main aspects that should be taken into consideration during the design, development, tuning and upgrading of such networks. In this context, an exposition is initially undertaken on the applicability of Petri nets (PNs) and queueing networks (QNs) in conjunction with their generalisations and hybrid integrations as robust quantitative modelling tools for the performance analysis of discrete flow systems, such as computer systems, communication networks and manufacturing systems. To overcome some of the inherent limitations of these models, a novel hybrid modelling framework is explored for the quantitative evaluation of RANETs, where each robotic node is represented by an abstract open hybrid G-GSPN_QN model with head-of-line priorities, subject to combined performance and security metrics (CPSMs). The proposed model focuses on security processing and state-based control and it is based on an open generalised stochastic PN (GSPN) with a gated multi-class 'On-Off' traffic and mobility model. Moreover, it employs a power consumption model and is linked in tandem with an arbitrary QN consisting of finite capacity channel queues with blocking for 'intra' robot component-to-component communication and 'inter' robot-to-robot transmission. Conclusions and future research directions are included.
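The G-GSPN_QN model itself is beyond a short sketch, but the performance-security trade-off it quantifies can be caricatured with an M/M/1 queue whose service time grows with security strength; all rates, overhead factors and weights below are assumed for illustration only.

```python
# Toy stand-in for a combined performance-security metric (CPSM): a robotic
# node as an M/M/1 queue whose service slows as security strength rises.
lam = 40.0                                   # packet arrival rate [1/s]
base_service = 0.015                         # service time w/o security [s]

for strength in (0, 1, 2, 3):                # hypothetical security levels
    s = base_service * (1 + 0.25 * strength)     # assumed crypto overhead
    rho = lam * s                            # utilisation
    if rho >= 1.0:
        print(f"level {strength}: unstable (rho = {rho:.2f})")
        continue
    resp = s / (1.0 - rho)                   # M/M/1 mean response time
    cpsm = strength - 50.0 * resp            # assumed trade-off weighting
    print(f"level {strength}: response {resp * 1000:.1f} ms, score {cpsm:.2f}")
```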