
Dissertations / Theses on the topic 'Simplification'


Consult the top 50 dissertations / theses for your research on the topic 'Simplification.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Li, Gong. "Semiautomatic simplification." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2000. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape3/PQDD_0001/MQ59831.pdf.

2

Huang, Zhiheng. "Rule model simplification." Thesis, University of Edinburgh, 2006. http://hdl.handle.net/1842/904.

Abstract:
Due to its high performance and comprehensibility, fuzzy modelling is becoming more and more popular in dealing with nonlinear, uncertain and complex systems for tasks such as signal processing, medical diagnosis and financial investment. However, there are no principal routine methods to obtain the optimum fuzzy rule base which is not only compact but also retains high prediction (or classification) performance. In order to achieve this, two major problems need to be addressed. First, as the number of input variables increases, the number of possible rules grows exponentially (termed curse of dimensionality). It inevitably deteriorates the transparency of the rule model and can lead to over-fitting, with the model obtaining high performance on the training data but failing to predict the unknown data successfully. Second, gaps may occur in the rule base if the problem is too compact (termed sparse rule base). As a result, it cannot be handled by conventional fuzzy inference such as Mamdani. This PhD work proposes a rule base simplification method and a family of fuzzy interpolation methods to solve the aforementioned two problems.

The proposed simplification method reduces the rule base complexity via Retrieving Data from Rules (RDFR). It first retrieves a collection of new data from an original rule base. Then the new data is used for re-training to build a more compact rule model. This method has four advantages: 1) It can simplify rule bases without using the original training data, but is capable of dealing with combinations of rules and data. 2) It can integrate with any rule induction or reduction schemes. 3) It implements the similarity merging and inconsistency removal approaches. 4) It can make use of rule weights. Illustrative examples have been given to demonstrate the potential of this work.

The second part of the work concerns the development of a family of transformation based fuzzy interpolation methods (termed HS methods). These methods first introduce the general concept of representative values (RVs), and then use this to interpolate fuzzy rules involving arbitrary polygonal fuzzy sets, by means of scale and move transformations. This family consists of two sub-categories: namely, the original HS methods and the enhanced HS methods. The HS methods not only inherit the common advantages of fuzzy interpolative reasoning -- helping reduce rule base complexity and allowing inferences to be performed within simple and sparse rule bases -- but also have two other advantages compared to the existing fuzzy interpolation methods. Firstly, they provide a degree of freedom to choose various RV definitions to meet different application requirements. Secondly, they can handle the interpolation of multiple rules, with each rule having multiple antecedent variables associated with arbitrary polygonal fuzzy membership functions. This makes the interpolation inference a practical solution for real world applications. The enhanced HS methods are the first proposed interpolation methods which preserve piece-wise linearity, which may provide a solution to the interpolation problem in a very high Cartesian space in the mathematics literature.

The RDFR-based simplification method has been applied to a variety of applications including nursery prediction, the Saturday morning problem and credit application. HS methods have been utilized in truck backer-upper control and computer hardware prediction. The former demonstrates the simplification potential of the HS methods, while the latter shows their capability in dealing with sparse rule bases. The RDFR-based simplification method and HS methods are further integrated into a novel model simplification framework, which has been applied to a scaled-up application (computer activity prediction). In the experimental studies, the proposed simplification framework leads to very good fuzzy rule base reductions whilst retaining, or improving, performance.
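The core idea behind the HS methods above is interpolation between representative values of fuzzy sets. Below is a minimal sketch of that representative-value step only, for triangular sets and plain linear interpolation; the scale and move transformations of the actual HS methods are not reproduced here.

```python
# Minimal sketch of representative-value based fuzzy rule interpolation for
# triangular fuzzy sets (a, b, c).  It only illustrates the first step shared by
# transformation-based methods; the scale and move transformations of the HS
# methods described in the thesis are not reproduced.

def rep(fs):
    """Representative value of a triangular fuzzy set (average of its points)."""
    a, b, c = fs
    return (a + b + c) / 3.0

def interpolate(rule1, rule2, observation):
    """Interpolate a conclusion from two flanking rules A1 -> B1 and A2 -> B2."""
    (a1, b1), (a2, b2) = rule1, rule2
    # Relative placement of the observation between the two antecedents.
    lam = (rep(observation) - rep(a1)) / (rep(a2) - rep(a1))
    # Point-wise linear interpolation of the consequent fuzzy sets.
    return tuple((1 - lam) * p + lam * q for p, q in zip(b1, b2))

# Sparse rule base: "if x is small then y is small", "if x is large then y is large".
small_x, large_x = (0, 1, 2), (8, 9, 10)
small_y, large_y = (0, 1, 2), (10, 11, 12)
observed = (4, 5, 6)                      # falls in the gap between the two rules
print(interpolate((small_x, small_y), (large_x, large_y), observed))
```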
3

Cardon, David L. "T-Spline Simplification." Diss., Brigham Young University, 2007. http://contentdm.lib.byu.edu/ETD/image/etd1813.pdf.

4

Zhang, Yanyan. "A Modified Douglas-Peucker Simplification Algorithm: A Consistent Displacement Line Simplification." The Ohio State University, 1996. http://rave.ohiolink.edu/etdc/view?acc_num=osu1392023228.
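The thesis above modifies the Douglas-Peucker line simplification algorithm. For reference, here is a minimal sketch of the classic, unmodified algorithm, not the consistent-displacement variant developed in the thesis.

```python
import math

def perpendicular_distance(p, a, b):
    """Distance from point p to the line through segment endpoints a and b."""
    (x, y), (x1, y1), (x2, y2) = p, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, tol):
    """Classic recursive Douglas-Peucker polyline simplification."""
    if len(points) < 3:
        return list(points)
    # Find the vertex farthest from the chord joining the endpoints.
    dists = [perpendicular_distance(p, points[0], points[-1]) for p in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= tol:
        return [points[0], points[-1]]          # all intermediate vertices dropped
    left = douglas_peucker(points[: i + 1], tol)
    right = douglas_peucker(points[i:], tol)
    return left[:-1] + right                    # avoid duplicating the split vertex

print(douglas_peucker([(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)], 1.0))
```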

5

Zhou, Zhang. "Simplification of triangulated meshes." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp03/MQ31384.pdf.

6

Langis, Christian. "Mesh simplification in parallel." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape7/PQDD_0020/MQ48438.pdf.

7

Walter, Jason David. "Appearance Preserving Data Simplification." NCSU, 2001. http://www.lib.ncsu.edu/theses/available/etd-20010402-174138.

Abstract:

Many visualization environments constantly face the issue of dealing with large, complex datasets. Often these datasets are so complex that rendering a visualization would seem impractical. Likewise, enormous amounts of data may overwhelm the human visual system; thereby rendering the data incomprehensible. Thus, the need arises to deal with these datasets in some arbitrary manner such that the resulting dataset represents the original whole --- while reducing the cost on the human and computer visual system.

A closely related problem can be found in geometric models, typically represented as a piecewise linear collection of connected polygons (a mesh). Meshes can be obtained from range scanners or created with a computer aided design package. However, these obtained meshes are often very dense and have high spatial frequency. An active area of computer graphics research is directed at the simplification of these dense meshes. Initially, mesh simplification research aimed at preserving only the topology, but the most recent research, appearance preserving mesh simplification, is aimed at simplification while preserving surface properties of the mesh, such as color or texture.

Our work addresses the use of appearance preserving mesh simplification in a data simplification environment, as well as the issues of doing so. As a result, we present and demonstrate a general method to simplify large multidimensional datasets using any appearance preserving mesh simplification algorithm. We add the use of principal components analysis to reduce the dimensionality of the data prior to simplification, which allows faster simplification on high dimensional data, and despite the reduction in dimensionality we have shown full preservation of key features in the dataset. In addition, we introduce spatial locks to preserve important data elements during the simplification process.
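A hedged sketch of the PCA pre-reduction step described above, using scikit-learn purely as an illustrative tool; the thesis's own pipeline and simplification algorithm are not reproduced.

```python
# Project a high-dimensional dataset onto its leading principal components
# before handing the reduced points to a simplification routine.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
data = rng.normal(size=(10_000, 12))          # 10k samples, 12 attributes

pca = PCA(n_components=0.95)                  # keep components explaining 95% of variance
reduced = pca.fit_transform(data)

print(data.shape, "->", reduced.shape)
print("explained variance:", pca.explained_variance_ratio_.sum())
# `reduced` would then be fed to an appearance-preserving simplification step,
# and pca.inverse_transform() can map simplified points back if needed.
```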

8

Ovreiu, Elena. "Accurate 3D mesh simplification." PhD thesis, INSA de Lyon, 2012. http://tel.archives-ouvertes.fr/tel-00838783.

Abstract:
Complex 3D digital objects are used in many domains such as animation films, scientific visualization, medical imaging and computer vision. These objects are usually represented by triangular meshes with many triangles. The simplification of those objects, in order to keep them as close as possible to the original, has received a lot of attention in recent years. In this context, we propose a simplification algorithm which is focused on the accuracy of the simplifications. The mesh simplification uses edge collapses with vertex relocation by minimizing an error metric. Accuracy is obtained with the two error metrics we use: the Accurate Measure of Quadratic Error (AMQE) and the Symmetric Measure of Quadratic Error (SMQE). AMQE is computed as the weighted sum of squared distances between the simplified mesh and the original one. The accuracy of the measure of the geometric deviation introduced in the mesh by an edge collapse is given by the distances between surfaces. The distances are computed between sample points of the simplified mesh and the faces of the original one. SMQE is similar to AMQE but is computed in both the direct and reverse directions, i.e. simplified to original and original to simplified meshes. The SMQE approach is computationally more expensive than AMQE, but the advantage of also computing the error in the reverse direction is the preservation of boundaries, sharp features and isolated regions of the mesh. For both measures we obtain better results than methods proposed in the literature.
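As a rough illustration of the symmetric error idea above, the sketch below computes a symmetric mean-squared distance between two point samplings; it uses vertex-to-vertex nearest neighbours as a stand-in for the point-to-face distances of the AMQE/SMQE metrics, so treat it only as an approximation.

```python
import numpy as np
from scipy.spatial import cKDTree

def one_sided_sq_error(src_pts, dst_pts):
    """Mean squared distance from each source point to its nearest destination point."""
    dists, _ = cKDTree(dst_pts).query(src_pts)
    return float(np.mean(dists ** 2))

def symmetric_sq_error(original_pts, simplified_pts):
    """Average of the forward (simplified -> original) and reverse errors."""
    return 0.5 * (one_sided_sq_error(simplified_pts, original_pts) +
                  one_sided_sq_error(original_pts, simplified_pts))

rng = np.random.default_rng(1)
original = rng.random((5000, 3))
simplified = original[::10]                   # crude "simplification": subsample vertices
print(symmetric_sq_error(original, simplified))
```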
9

Canning, Yvonne Margaret. "Syntactic simplification of text." Thesis, University of Sunderland, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.369911.

10

González, Ballester Carlos. "Simplification Techniques for Interactive Applications." Doctoral thesis, Universitat Jaume I, 2010. http://hdl.handle.net/10803/10492.

Abstract:
Interactive applications with 3D graphics are used every day in many different fields, such as games, teaching, learning environments and virtual reality. The scenarios shown in interactive applications tend to present detailed worlds and characters that are as realistic as possible. Detailed 3D models require a lot of geometric complexity, but the available graphics hardware cannot always handle and manage all this geometry while maintaining a realistic frame rate. Simplification methods attempt to solve this problem by generating simplified versions of the original 3D models. These simplified models present less geometry than the original ones. This simplification has to be done with a reasonable criterion in order to preserve the appearance of the original models as far as possible. But geometry is not the only important factor in 3D models. They are also composed of additional attributes that are important for the final aspect of the models for the viewer. In the literature we can find a lot of work on simplification. However, there are still several points without an efficient solution. Therefore, this thesis focuses on simplification techniques for the 3D models usually used in interactive applications.
11

Lepper, Ingo. "Simplification orders in term rewriting." [S.l. : s.n.], 2001. http://deposit.ddb.de/cgi-bin/dokserv?idn=967334136.

12

Siddharthan, Advaith. "Syntactic simplification and text cohesion." Thesis, University of Cambridge, 2003. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.407014.

13

Garland, Michael. "Quadric-Based Polygonal Surface Simplification." Research Showcase @ CMU, 1999. http://repository.cmu.edu/dissertations/282.

Abstract:
Many applications in computer graphics and related fields can benefit from automatic simplification of complex polygonal surface models. Applications are often confronted with either very densely over-sampled surfaces or models too complex for the limited available hardware capacity. An effective algorithm for rapidly producing high-quality approximations of the original model is a valuable tool for managing data complexity.

In this dissertation, I present my simplification algorithm, based on iterative vertex pair contraction. This technique provides an effective compromise between the fastest algorithms, which often produce poor quality results, and the highest-quality algorithms, which are generally very slow. For example, a 1000 face approximation of a 100,000 face model can be produced in about 10 seconds on a PentiumPro 200. The algorithm can simplify both the geometry and topology of manifold as well as non-manifold surfaces. In addition to producing single approximations, my algorithm can also be used to generate multiresolution representations such as progressive meshes and vertex hierarchies for view-dependent refinement.

The foundation of my simplification algorithm is the quadric error metric which I have developed. It provides a useful and economical characterization of local surface shape, and I have proven a direct mathematical connection between the quadric metric and surface curvature. A generalized form of this metric can accommodate surfaces with material properties, such as RGB color or texture coordinates. I have also developed a closely related technique for constructing a hierarchy of well-defined surface regions composed of disjoint sets of faces. This algorithm involves applying a dual form of my simplification algorithm to the dual graph of the input surface. The resulting structure is a hierarchy of face clusters which is an effective multiresolution representation for applications such as radiosity.
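A minimal sketch of the quadric error metric idea: each supporting plane contributes a 4x4 quadric, and the cost of placing a contracted vertex at v is the quadratic form of the summed quadrics. This shows only the metric itself, not the dissertation's full pair-contraction pipeline.

```python
# Every plane n.v + d = 0 incident on a vertex contributes a 4x4 quadric; the
# cost of moving a contracted vertex to position v is v^T (Q1 + Q2) v.
import numpy as np

def plane_quadric(n, d):
    """Fundamental error quadric K = p p^T for plane p = [nx, ny, nz, d], |n| = 1."""
    p = np.append(np.asarray(n, float), d)
    return np.outer(p, p)

def vertex_error(Q, v):
    """Squared-distance error of homogeneous point [x, y, z, 1] under quadric Q."""
    vh = np.append(np.asarray(v, float), 1.0)
    return float(vh @ Q @ vh)

# Two vertices, each carrying the summed quadrics of their incident triangle planes.
Q1 = plane_quadric([0, 0, 1], 0.0)                      # plane z = 0
Q2 = plane_quadric([0, 1, 0], -1.0)                     # plane y = 1
Q = Q1 + Q2                                             # quadric of the contracted pair

for candidate in ([0, 0, 0], [0, 1, 0], [0, 0.5, 0]):
    print(candidate, vertex_error(Q, candidate))        # pick the cheapest placement
```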
14

Cooper, D. J. "Realising flexibility through manufacturing simplification." Thesis, Cranfield University, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.379489.

15

Balazs, Marton E. "Design Simplification by Analogical Reasoning." Digital WPI, 2000. https://digitalcommons.wpi.edu/etd-dissertations/60.

Abstract:
Ever since artifacts have been produced, improving them has been a common human activity. Improving an artifact refers to modifying it such that it will be either easier to produce, or easier to use, or easier to fix, or easier to maintain, and so on. In all of these cases, 'easier' means fewer resources are required for those processes. While 'resources' is a general measure, which can ultimately be expressed by some measure of cost (such as time or money), we believe that at the core of many improvements is the notion of reduction of complexity, or in other words, simplification. This dissertation presents our research on performing design simplification using analogical reasoning. We first define the simplification problem as the problem of reducing the complexity of an artifact from a given point of view. We propose that a point of view from which the complexity of an artifact can be measured consists of a context, an aspect and a measure. Next, we describe an approach to solving simplification problems by goal-directed analogical reasoning, as well as our implementation of this approach. Finally, we present some experimental results obtained with the system. The research presented in this dissertation is significant as it focuses on the intersection of a number of important, active research areas - analogical reasoning, functional representation, functional reasoning, simplification, and the general area of AI in Design.
16

Okutan, Osman Berat. "Persistence, Metric Invariants, and Simplification." The Ohio State University, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=osu1559312147225384.

17

Yuan, Xing-Hong. "Simplification des réseaux moyenne tension." Paris 6, 1989. http://www.theses.fr/1989PA066736.

Abstract:
The nominal voltage of medium-voltage networks ranges from a few kilovolts to 35 kV. These networks contain so many elements that it is usually necessary to simplify them in order to carry out a study. Simplification means eliminating or grouping portions of the network and replacing it with an equivalent network. The equivalent network must match the original with respect to losses, voltage drops at the most important nodes, the estimate of non-distributed energy, and so on. But simplification implies approximations and errors; these must be minimized and, if possible, corrected. Simplification methods based on reduction of the admittance matrix, as well as other methods that modify the initial topology, are used. Strategies are proposed for conducting the simplification. Finally, software using the concepts developed was written, making it possible to test the methods on practical examples.
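The abstract mentions simplification by reduction of the admittance matrix. A standard technique of that kind is Kron reduction, sketched below; the thesis's exact procedure may differ.

```python
# Kron reduction eliminates a set of internal nodes from the nodal admittance
# matrix Y while preserving the terminal behaviour seen from the kept nodes:
#     Y_red = Y_kk - Y_ke @ inv(Y_ee) @ Y_ek
import numpy as np

def kron_reduce(Y, keep):
    keep = np.asarray(keep)
    elim = np.setdiff1d(np.arange(Y.shape[0]), keep)
    Ykk = Y[np.ix_(keep, keep)]
    Yke = Y[np.ix_(keep, elim)]
    Yek = Y[np.ix_(elim, keep)]
    Yee = Y[np.ix_(elim, elim)]
    return Ykk - Yke @ np.linalg.solve(Yee, Yek)

# 3-node example: eliminate node 2, keep nodes 0 and 1.
Y = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
print(kron_reduce(Y, keep=[0, 1]))
```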
18

Shardlow, Matthew. "Lexical simplification : optimising the pipeline." Thesis, University of Manchester, 2015. https://www.research.manchester.ac.uk/portal/en/theses/lexical-simplification-optimising-the-pipeline(8b56374a-bc5e-4d4d-80bc-2c2f01eed319).html.

Abstract:
Introduction: This thesis was submitted by Matthew Shardlow to the University of Manchester for the degree of Doctor of Philosophy (PhD) in the year 2015. Lexical simplification is the practice of automatically increasing the readability and understandability of a text by identifying problematic vocabulary and substituting easy to understand synonyms. This work describes the research undertaken during the course of a 4-year PhD. We have focused on the pipeline of operations which string together to produce lexical simplifications. We have identified key areas for research and allowed our results to influence the direction of our research. We have suggested new methods and ideas where appropriate. Objectives: We seek to further the field of lexical simplification as an assistive technology. Although the concept of fully-automated error-free lexical simplification is some way off, we seek to bring this dream closer to reality. Technology is ubiquitous in our information-based society. Ever-increasingly we consume news, correspondence and literature through an electronic device. E-reading gives us the opportunity to intervene when a text is too difficult. Simplification can act as an augmentative communication tool for those who find a text is above their reading level. Texts which would otherwise go unread would become accessible via simplification. Contributions: This PhD has focused on the lexical simplification pipeline. We have identified common sources of errors as well as the detrimental effects of these errors. We have looked at techniques to mitigate the errors at each stage of the pipeline. We have created the CW Corpus, a resource for evaluating the task of identifying complex words. We have also compared machine learning strategies for identifying complex words. We propose a new preprocessing step which yields a significant increase in identification performance. We have also tackled the related fields of word sense disambiguation and substitution generation. We evaluate the current state of the field and make recommendations for best practice in lexical simplification. Finally, we focus our attention on evaluating the effect of lexical simplification on the reading ability of people with aphasia. We find that in our small-scale preliminary study, lexical simplification has a negative effect, causing reading time to increase. We evaluate this result and use it to motivate further work into lexical simplification for people with aphasia.
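The pipeline discussed above (complex word identification, substitution generation, filtering, ranking) can be illustrated with a toy end-to-end version; the small frequency list and synonym lexicon below are hypothetical placeholders, not the thesis's resources.

```python
# Toy version of the lexical simplification pipeline: identify -> generate ->
# filter -> rank -> substitute.  Real systems use corpus frequencies and large
# synonym resources instead of these tiny hand-made dictionaries.
WORD_FREQ = {"use": 900_000, "utilise": 4_000, "big": 700_000, "gargantuan": 800,
             "house": 500_000, "residence": 20_000}
SYNONYMS = {"utilise": ["use", "employ"], "gargantuan": ["big", "huge"],
            "residence": ["house", "home"]}

def is_complex(word, threshold=10_000):
    return WORD_FREQ.get(word, 0) < threshold          # step 1: complex word identification

def generate(word):
    return SYNONYMS.get(word, [])                       # step 2: substitution generation

def rank(candidates):
    return sorted(candidates, key=lambda w: -WORD_FREQ.get(w, 0))  # step 4: simplicity ranking

def simplify(sentence):
    out = []
    for token in sentence.split():
        if is_complex(token):
            candidates = [c for c in generate(token) if c in WORD_FREQ]  # step 3: crude filter
            token = rank(candidates)[0] if candidates else token
        out.append(token)
    return " ".join(out)

print(simplify("they utilise a gargantuan residence"))
```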
19

Balazs, Marton E. "Design simplification by analogical reasoning." Worcester Polytechnic Institute, 1999. http://www.wpi.edu/Pubs/ETD/Available/etd-0209100-051108/.

20

Charrier, Emilie. "Simplification polyédrique optimale pour le rendu." PhD thesis, Université Paris-Est, 2009. http://tel.archives-ouvertes.fr/tel-00532792.

Abstract:
In computer science, images are digital and therefore composed of pixels in 2D and voxels in 3D. In a 3D virtual scene, it is impossible to manipulate objects directly as sets of voxels because of the sheer volume of data. The objects are therefore polyhedrized, that is, replaced by a collection of facets. To do this, it is essential to be able to decide whether a subset of voxels can be turned into a facet of the polyhedral representation. This problem is called digital plane recognition. To solve it, we introduce a new algorithm specially adapted to sets of voxels that are dense in a bounding box. Our method reaches quasi-linear complexity in this case and proves efficient in practice. In parallel, we study a related algorithmic problem that arises in our digital plane recognition method: computing the two convex hulls of the points of Z2 contained in a bounded vertical domain and lying on either side of an arbitrary straight line. We propose a method of optimal, adaptive complexity for computing these convex hulls. We also present the problem from another angle: determining the rational number with bounded denominator that best approximates a given real number. We establish the link between this numerical problem and its geometric interpretation in the plane. Finally, we independently propose a new algorithm for computing the thickness of a set of points in the lattice Zd. Our method is optimal in 2D, and greedy but efficient in higher dimensions.
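The bounded-denominator rational approximation mentioned above can be reproduced with Python's standard library; this is not the thesis's algorithm, just the same numerical problem solved off the shelf.

```python
# Best rational approximation of a real number under a denominator bound,
# via the fractions module (continued-fraction style search).
from fractions import Fraction
import math

x = math.pi
for bound in (10, 100, 1000):
    best = Fraction(x).limit_denominator(bound)
    print(f"denominator <= {bound}: {best} (error {abs(float(best) - x):.2e})")
```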
21

Keskisärkkä, Robin. "Automatic Text Simplification via Synonym Replacement." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-84637.

Abstract:
In this study automatic lexical simplification via synonym replacement in Swedish was investigated using three different strategies for choosing alternative synonyms: based on word frequency, based on word length, and based on level of synonymy. These strategies were evaluated in terms of standardized readability metrics for Swedish, average word length, proportion of long words, and in relation to the ratio of errors (type A) and number of replacements. The effect of replacements on different genres of texts was also examined. The results show that replacement based on word frequency and word length can improve readability in terms of established metrics for Swedish texts for all genres but that the risk of introducing errors is high. Attempts were made at identifying criteria thresholds that would decrease the ratio of errors but no general thresholds could be identified. In a final experiment word frequency and level of synonymy were combined using predefined thresholds. When more than one word passed the thresholds word frequency or level of synonymy was prioritized. The strategy was significantly better than word frequency alone when looking at all texts and prioritizing level of synonymy. Both prioritizing frequency and level of synonymy were significantly better for the newspaper texts. The results indicate that synonym replacement on a one-to-one word level is very likely to produce errors. Automatic lexical simplification should therefore not be regarded a trivial task, which is too often the case in research literature. In order to evaluate the true quality of the texts it would be valuable to take into account the specific reader. A simplified text that contains some errors but which fails to appreciate subtle differences in terminology can still be very useful if the original text is too difficult to comprehend to the unassisted reader.
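One of the standardized readability metrics for Swedish referred to above is LIX; assuming that metric purely for illustration, a minimal computation looks like this.

```python
# LIX readability measure:
#   LIX = words / sentences + 100 * long_words / words,  long word = more than 6 letters.
import re

def lix(text):
    words = re.findall(r"\w+", text)
    sentences = max(1, len(re.findall(r"[.!?:]+", text)))
    long_words = sum(1 for w in words if len(w) > 6)
    return len(words) / sentences + 100 * long_words / len(words)

print(lix("Det var en gång en katt. Den bodde i ett hus."))
print(lix("Automatisk lexikal förenkling utvärderades med standardiserade läsbarhetsmått."))
```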
22

Ardeshir, G. "Decision tree simplification for classifier ensembles." Thesis, University of Surrey, 2002. http://epubs.surrey.ac.uk/843022/.

Abstract:
Design of ensemble classifiers involves three factors: 1) a learning algorithm to produce a classifier (base classifier), 2) an ensemble method to generate diverse classifiers, and 3) a combining method to combine decisions made by base classifiers. With regard to the first factor, a good choice for constructing a classifier is a decision tree learning algorithm. However, a possible problem with this learning algorithm is its complexity, which has only been addressed previously in the context of pruning methods for individual trees. Furthermore, the ensemble method may require the learning algorithm to produce a complex classifier. Considering the fact that the performance of simplification methods as well as ensemble methods changes from one domain to another, our main contribution is to address a simplification method (post-pruning) in the context of ensemble methods including Bagging, Boosting and Error-Correcting Output Code (ECOC). Using a statistical test, the performance of ensembles made by Bagging, Boosting and ECOC as well as five pruning methods in the context of ensembles is compared. In addition to the implementation, a supporting theory called Margin is discussed and the relationship of pruning to bias and variance is explained. For ECOC, the effect of parameters such as code length and size of training set on the performance of pruning methods is also studied. Decomposition methods such as ECOC are considered as a solution to reduce the complexity of multi-class problems in many real problems such as face recognition. Focusing on the decomposition methods, AdaBoost.OC, which is a combination of Boosting and ECOC, is compared with the pseudo-loss based version of Boosting, AdaBoost.M2. In addition, the influence of pruning on the performance of ensembles is studied. Motivated by the result that both pruned and unpruned ensembles made by AdaBoost.OC have similar accuracy, pruned ensembles are compared with ensembles of single node decision trees. This results in the hypothesis that ensembles of simple classifiers may give better performance, as shown for AdaBoost.OC on the identification problem in face recognition. The implication is that in some problems, to achieve the best accuracy of an ensemble, it is necessary to select base classifier complexity.
23

Lagerkvist, Rebecca. "Mesh simplification of complex VRML models." Thesis, Linköpings universitet, Institutionen för teknik och naturvetenskap, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-97869.

Abstract:
In a large part of their work Volvo Cars uses digital models - in the design process, geometry simulation, safety tests, presentation material etc. The models may be used for many purposes but they are normally only produced in one level of detail; in general, the level that suits the most extreme demands. High resolution models challenge rendering performance, transmission bandwidth and storage capacities. At Volvo Cars there is time, money and energy to be saved by adapting the models' level of detail to their area of usage. The aim of this thesis is to investigate whether the Volvo Cars models can be reduced to contain less than 20% of their original triangles without compromising too much in quality. In the thesis, the mesh simplification field is researched and the simplification algorithm judged to best suit the needs of Volvo Cars is implemented in a C++ program. The program is used to test and analyze Volvo Cars' models. The results show that it is possible to take away more than 80% of a model's polygons with hardly any effect on its appearance.
24

Vieira, Antonio Wilson. "A Topological Approach for Mesh Simplification." Pontifícia Universidade Católica do Rio de Janeiro, 2003. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=4314@1.

Abstract:
Many applications, in mathematics, computer graphics, medical imaging, geophysics and other areas, use the representation of solids by their boundary surface, usually polygonal meshes. Those meshes can represent, with high precision, the geometric properties of the boundary surface of a solid and also store important topological surface properties such as genus, boundary and connected components. Because of the high complexity of such meshes, they are usually processed by computers using specific data structures. These structures store, beyond the mesh geometry, information about incidence and adjacency relations among the mesh elements. They require computational resources for storage and processing according to the mesh complexity. Even with the development of the computational resources available for handling such structures, very large meshes with millions of elements are hard to store, to process and to exchange through the web. Much recent research looks for mesh simplification processes that allow the same surface to be represented with fewer elements, and compression processes to encode it in compact ways for transmission and storage. In this work, we develop topological operators, in a concise data structure, for simplifying meshes by the decimation of their cells. One of our goals, with these operators, is to obtain a mesh of low complexity that preserves the topological properties of the original surface without losing control of geometric properties such as volume, area and visual aspect. We also present some applications of the simplification processes developed with these operators.
25

Tapkanova, Elmira. "Machine Translation and Text Simplification Evaluation." Scholarship @ Claremont, 2016. http://scholarship.claremont.edu/scripps_theses/790.

Abstract:
Machine translation translates a text from one language to another, while text simplification converts a text from its original form to a simpler one, usually in the same language. This survey paper discusses the evaluation (manual and automatic) of both fields, providing an overview of existing metrics along with their strengths and weaknesses. The first chapter takes an in-depth look at machine translation evaluation metrics, namely BLEU, NIST, AMBER, LEPOR, MP4IBM1, TER, MMS, METEOR, TESLA, RTE, and HTER. The second chapter focuses more generally on text simplification, starting with a discussion of the theoretical underpinnings of the field (i.e., what 'simple' means). Then, an overview of automatic evaluation metrics, namely BLEU and Flesch-Kincaid, is given, along with common approaches to text simplification. The paper concludes with a discussion of the future trajectory of both fields.
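For reference, the Flesch-Kincaid grade-level formula mentioned above, with a crude vowel-group syllable heuristic; real evaluations use proper syllabifiers, so treat the outputs as approximate.

```python
# Flesch-Kincaid grade level:
#   0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
import re

def count_syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

print(flesch_kincaid_grade("The cat sat on the mat."))
print(flesch_kincaid_grade("Considerable simplification facilitates comprehension substantially."))
```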
26

Chen, Chung-Yun. "Graphic legibility enhancement using simplification guidelines." Thesis, University of Leeds, 2016. http://etheses.whiterose.ac.uk/16408/.

Abstract:
This study explores an approach to app icon legibility enhancement. Four areas of research are included: (1) design process; (2) the trend of logo/app icon redesign; (3) graphic legibility and (4) graphic simplification. It presents the results of five experiments designed to capture and compare design principles. Firstly, the characteristics of simple shapes were categorised. Secondly, the agreement of simplification judgements was summarised based on the average score of participants. Thirdly, the impact of each simplification criterion was compared and represented as a ratio; a measurement template and simplification guidelines were also generated at this stage. Fourthly, how this design principle (the simplification guidelines) can be applied in practical use by student designers was examined. Finally, legibility enhancement was demonstrated by improvements in reaction time and accuracy. The findings of this study determined the impact of the simplification criteria with regard to component, open-closed, weight, form, symmetry, angles and straight-curved respectively. After identifying these design principles (simplification guidelines), graphic designers, user interface designers and other users will be able to design the more legible logo/app icons required for display on small devices.
27

Tarnoff, David. "Episode 4.09 - Simplification of Boolean Expressions." Digital Commons @ East Tennessee State University, 2020. https://dc.etsu.edu/computer-organization-design-oer/2.

Abstract:
In this episode, we take a break from proving identities of Boolean algebra and start applying them. Why? Well, so we can build our Boolean logic circuits with fewer gates. That means they’ll be cheaper, smaller, and faster. That’s why.
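A worked example of the kind of reduction discussed in the episode, checked here with sympy (an assumption about tooling, used only to confirm the identities): A·B + A·B' collapses to A, and a three-term sum of products reduces to two terms, i.e. fewer gates.

```python
from sympy import symbols
from sympy.logic import simplify_logic

A, B, C = symbols("A B C")
print(simplify_logic(A & B | A & ~B))              # -> A
print(simplify_logic((A & B & C) | (A & B & ~C) | (A & ~B & C), form="dnf"))
```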
28

Buzer, Lilian. "Reconnaissance des plans discrets : simplification polygonale." Clermont-Ferrand 1, 2002. http://www.theses.fr/2002CLF10255.

Abstract:
We study the recognition of digital lines and planes. Our method is based on a well-known linear programming method, namely Megiddo's algorithm, which is linear in the number of constraints. A new variant is given in order to preserve a linear complexity for the incremental algorithm. As this rather intricate method had never been programmed, we propose the first simplified and proved implementation. Various recognition techniques are studied and we then develop randomized algorithms that are more efficient on average. As Megiddo's algorithm requires median computations, we compare the different approaches theoretically and experimentally. We finally obtain an algorithm that is more efficient on average than current methods. We conclude this work by providing an application of our results to the construction of the first subquadratic algorithm for polygonal simplification.
29

Waters, Laura Jane. "Switch & simplification of antiretroviral therapy." Thesis, Imperial College London, 2013. http://hdl.handle.net/10044/1/12869.

Abstract:
The advent of combined antiretroviral therapy (cART) has transformed HIV care such that most individuals can expect a normal or near normal life expectancy. Managing HIV is increasingly focused on the prevention and treatment of non-infectious co-morbidities and dealing with drug-related side effects and toxicities, the main reasons for switching cART. Switching cART can be complicated by pharmacokinetic (PK) interactions, and modern, potent agents may allow us to move away from the standard 'three active drugs' mantra. This thesis examines the pharmacokinetic impact of switching directly from efavirenz, widely used for initial therapy, to etravirine and to maraviroc. The impact of the efavirenz to etravirine switch on central nervous system side effects was also investigated in a double-blind trial. In addition, the value of pre-treatment tropism testing to guide susceptibility to maraviroc is explored. Finally, a large cohort analysis explored the importance of active nucleoside analogues in a boosted protease inhibitor based, second line regimen for patients who had failed first line non-nucleoside reverse transcriptase inhibitor therapy. The pharmacokinetic studies confirmed a prolonged induction effect of efavirenz and a novel finding that this induction effect varies according to which agent is used. The tropism analysis showed that tropism switch during suppressive therapy is uncommon, thus supporting the use of pre-treatment testing as a guide to future switch. Individuals with ongoing central nervous system symptoms on efavirenz experienced significant improvement on switching to etravirine, the first time this has been proven in a randomised, blinded trial; furthermore, significant lipid improvements were observed. Finally, the cohort analysis suggested that the number of new or active nucleosides in a second line boosted PI-based combination does not affect virological outcomes; simpler second line regimens should be considered and could provide several advantages over more complex choices.
30

Osborn, H. B., C. L. Unkrich, and L. Frykman. "Problems of Simplification in Hydrologic Modeling." Arizona-Nevada Academy of Science, 1985. http://hdl.handle.net/10150/296361.

Abstract:
From the Proceedings of the 1985 Meetings of the Arizona Section - American Water Resources Association and the Hydrology Section - Arizona-Nevada Academy of Science - April 27, 1985, Las Vegas, Nevada
31

Cripwell, Liam. "Controllable and Document-Level Text Simplification." Electronic Thesis or Diss., Université de Lorraine, 2023. http://www.theses.fr/2023LORR0186.

Abstract:
Text simplification is a task that involves rewriting a text to make it easier to read and understand for a wider audience, while still expressing the same core meaning. This has potential benefits for disadvantaged end-users (e.g. non-native speakers, children, the reading impaired), while also showing promise as a preprocessing step for downstream NLP tasks. Recent advancements in neural generative models have led to the development of systems that are capable of producing highly fluent outputs. However, these end-to-end systems often rely on training corpora to implicitly learn how to perform the necessary rewrite operations. In the case of simplification, these datasets are lacking in both quantity and quality, with most corpora either being very small, automatically constructed, or subject to strict licensing agreements. As a result, many systems tend to be overly conservative, often making no changes to the original text or being limited to the paraphrasing of short word sequences without substantial structural modifications. Furthermore, most existing work on text simplification is limited to sentence-level inputs, with attempts to iteratively apply these approaches to document-level simplification failing to coherently preserve the discourse structure of the document. This is problematic, as most real-world applications of text simplification concern document-level texts. In this thesis, we investigate strategies for mitigating the conservativity of simplification systems while promoting a more diverse range of transformation types. This involves the creation of new datasets containing instances of under-represented operations and the implementation of controllable systems capable of being tailored towards specific transformations and simplicity levels. We later extend these strategies to document-level simplification, proposing systems that are able to consider surrounding document context and use similar controllability techniques to plan which sentence-level operations to perform ahead of time, allowing for both high performance and scalability. Finally, we analyze current evaluation processes and propose new strategies that can be used to better evaluate both controllable and document-level simplification systems.
32

Brackman, Daphné. "La simplification du droit de la commande publique." Thesis, Lyon 3, 2015. http://www.theses.fr/2015LYO30045.

Abstract:
The study of the various causes of the complexity of public procurement law makes it possible to determine the minimum unavoidable level of complexity that must be kept; the remaining complexity is useless and must therefore be removed or, at least, reworked. More specifically, the causes of the complexity of this law stem, objectively, from the difficulty of rationalising it. They are quantitative or qualitative. Above all, however, this complexity has subjective causes, resulting from the difficulty this law has in settling conflicts of interest. Indeed, first, the objectives of this law are debated. Then, society offers different points of view on this law. Finally, we note a diminished effectiveness of the right to a judge in matters of public procurement, which harms applicants. It is then necessary to analyse the multiple ways and means of simplifying public procurement law in order to find the maximum unavoidable level of simplification; beyond that, further simplification is unnecessary and infeasible. These ways and means should be used to better rationalise this law according to two approaches, one quantitative and the other qualitative. However, this simplification of public procurement law must above all be carried out from a subjective point of view. More specifically, the ways and means of this simplification must make it possible to better resolve conflicts of interest. Indeed, one can clarify the objectives of public procurement law, regulate in a measured way the diverse points of view of society on this law, and strengthen the effectiveness of the right to a judge for applicants.
33

Paradinas, Salsón Teresa. "Simplification, approximation and deformation of large models." Doctoral thesis, Universitat de Girona, 2011. http://hdl.handle.net/10803/51293.

Abstract:
The high level of realism and interaction in many computer graphic applications requires techniques for processing complex geometric models. First, we present a method that provides an accurate low-resolution approximation from a multi-chart textured model that guarantees geometric fidelity and correct preservation of the appearance attributes. Then, we introduce a mesh structure called Compact Model that approximates dense triangular meshes while preserving sharp features, allowing adaptive reconstructions and supporting textured models. Next, we design a new space deformation technique called *Cages based on a multi-level system of cages that preserves the smoothness of the mesh between neighbouring cages and is extremely versatile, allowing the use of heterogeneous sets of coordinates and different levels of deformation. Finally, we propose a hybrid method that allows to apply any deformation technique on large models obtaining high quality results with a reduced memory footprint and a high performance.
34

Rennes, Evelina. "Improved Automatic Text Simplification by Manual Training." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-120001.

Abstract:
The purpose of this thesis was the further development of a rule set used in an automatic text simplification system, and the exploration of whether it is possible to improve the performance of a rule based text simplification system by manual training. A first rule set was developed from a thorough literature review, and the rule refinement was performed by manually adapting the first rule set to a set of training texts. When there was no more change added to the set of rules, the training was considered to be completed, and the two sets were applied to a test set, for evaluation. This thesis evaluated the performance of a text simplification system as a classification task, by the use of objective metrics: precision and recall. The comparison of the rule sets revealed a clear improvement of the system, since precision increased from 45% to 82%, and recall increased from 37% to 53%. Both recall and precision was improved after training for the majority of the rules, with a few exceptions. All rule types resulted in a higher score on correctness for R2. Automatic text simplification systems targeting real life readers need to account for qualitative aspects, which has not been considered in this thesis. Future evaluation should, in addition to quantitative metrics such as precision, recall, and complexity metrics, also account for the experience of the reader.
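Precision and recall, as used in the evaluation above, can be computed from counts of correct, spurious and missed rule applications; the counts below are illustrative only (chosen so the output matches the reported 82% and 53%), not taken from the thesis.

```python
def precision_recall(true_pos, false_pos, false_neg):
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    return precision, recall

# e.g. 41 correct applications, 9 spurious, 36 missed
p, r = precision_recall(41, 9, 36)
print(f"precision = {p:.0%}, recall = {r:.0%}")
```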
35

Paetzold, Gustavo Henrique. "Lexical simplification for non-native English speakers." Thesis, University of Sheffield, 2016. http://etheses.whiterose.ac.uk/15332/.

Abstract:
Lexical Simplification is the process of replacing complex words in texts to create simpler, more easily comprehensible alternatives. It has proven very useful as an assistive tool for users who may find complex texts challenging. Those who suffer from Aphasia and Dyslexia are among the most common beneficiaries of such technology. In this thesis we focus on Lexical Simplification for English using non-native English speakers as the target audience. Even though they number in hundreds of millions, there are very few contributions that aim to address the needs of these users. Current work is unable to provide solutions for this audience due to lack of user studies, datasets and resources. Furthermore, existing work in Lexical Simplification is limited regardless of the target audience, as it tends to focus on certain steps of the simplification process and disregard others, such as the automatic detection of the words that require simplification. We introduce a series of contributions to the area of Lexical Simplification that range from user studies and resulting datasets to novel methods for all steps of the process and evaluation techniques. In order to understand the needs of non-native English speakers, we conducted three user studies with 1,000 users in total. These studies demonstrated that the number of words deemed complex by non-native speakers of English correlates with their level of English proficiency and appears to decrease with age. They also indicated that although words deemed complex tend to be much less ambiguous and less frequently found in corpora, the complexity of words also depends on the context in which they occur. Based on these findings, we propose an ensemble approach which achieves state-of-the-art performance in identifying words that challenge non-native speakers of English. Using the insight and data gathered, we created two new approaches to Lexical Simplification that address the needs of non-native English speakers: joint and pipelined. The joint approach employs resource-light neural language models to simplify words deemed complex in a single step. While its performance was unsatisfactory, it proved useful when paired with pipelined approaches. Our pipelined simplifier generates candidate replacements for complex words using new, context-aware word embedding models, filters them for grammaticality and meaning preservation using a novel unsupervised ranking approach, and finally ranks them for simplicity using a novel supervised ranker that learns a model based on the needs of non-native English speakers. In order to test these and previous approaches, we designed LEXenstein, a framework for Lexical Simplification, and compiled NNSeval, a dataset that accounts for the needs of non-native English speakers. Comparisons against hundreds of previous approaches as well as the variants we proposed showed that our pipelined approach outperforms all others. Finally, we introduce PLUMBErr, a new automatic error identification framework for Lexical Simplification. Using this framework, we assessed the type and number of errors made by our pipelined approach throughout the simplification process and found that combining our ensemble complex word identifier with our pipelined simplifier yields a system that makes up to 25% fewer mistakes compared to the previous state-of-the-art strategies during the simplification process.
36

Metzgen, Paul. "Decomposition and simplification with EXOR-based representations." Thesis, University of Oxford, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.365331.

37

DeHaemer, Michael Joseph. "Simplification of objects rendered by polygonal approximation." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/27265.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Kratt, Julian [Verfasser]. "Geometric Shape Abstraction and Simplification / Julian Kratt." Konstanz : Bibliothek der Universität Konstanz, 2018. http://d-nb.info/1162841052/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Vadhwana, V. A. "A model simplification technique for computer flowsheeting." Thesis, London South Bank University, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.382822.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Phisanbut, Nalina. "Practical simplification of elementary functions using CAD." Thesis, University of Bath, 2011. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.547926.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Štajner, Sanja. "New data-driven approaches to text simplification." Thesis, University of Wolverhampton, 2015. http://hdl.handle.net/2436/554413.

Full text
Abstract:
Many texts we encounter in our everyday lives are lexically and syntactically very complex. This makes them difficult to understand for people with intellectual or reading impairments, and difficult for various natural language processing systems to process. This motivated the need for text simplification (TS), which transforms texts into their simpler variants. Given that this is still a relatively new research area, many challenges remain. The focus of this thesis is on better understanding the current problems in automatic text simplification (ATS) and proposing new data-driven approaches to solving them. We propose methods for learning sentence splitting and deletion decisions, built upon parallel corpora of original and manually simplified Spanish texts, which outperform the existing similar systems. Our experiments in adapting those methods to different text genres and target populations report promising results, thus offering one possible solution for dealing with the scarcity of parallel corpora for text simplification aimed at specific target populations, which is currently one of the main issues in ATS. The results of our extensive analysis of the phrase-based statistical machine translation (PB-SMT) approach to ATS reject the widespread assumption that the success of that approach largely depends on the size of the training and development datasets. They indicate more influential factors for the success of the PB-SMT approach to ATS, and reveal some important differences between cross-lingual MT and the monolingual MT used in ATS. Our event-based system for simplifying news stories in English (EventSimplify) overcomes some of the main problems in ATS. It does not require a large number of handcrafted simplification rules nor parallel data, and it performs significant content reduction. The automatic and human evaluations conducted show that it produces grammatical text and increases readability, preserving and simplifying relevant content and reducing irrelevant content. Finally, this thesis addresses another important issue in TS, namely how to automatically evaluate the performance of TS systems given that access to the target users might be difficult. Our experiments indicate that existing readability metrics can successfully be used for this task when enriched with human evaluation of grammaticality and preservation of meaning.
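The last point of the abstract, using readability metrics to evaluate simplification output, can be illustrated with a small sketch of the standard Flesch Reading Ease score. The vowel-group syllable counter is a rough heuristic assumed for illustration; it is not a component of the thesis.

```python
# Minimal sketch of an automatic readability score (Flesch Reading Ease)
# of the kind the abstract says can be used to evaluate TS output.
# The syllable counter is a crude vowel-group heuristic, an assumption
# made for illustration rather than a linguistically exact method.

import re

def count_syllables(word: str) -> int:
    """Approximate syllables as groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / len(sentences)
    syllables_per_word = syllables / len(words)
    # Standard Flesch Reading Ease formula; higher scores mean easier text.
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word

if __name__ == "__main__":
    original = "The committee promulgated regulations of considerable complexity."
    simplified = "The committee made new rules that are hard to follow."
    print(flesch_reading_ease(original), flesch_reading_ease(simplified))
```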
APA, Harvard, Vancouver, ISO, and other styles
42

Štajner, Sanja. "New data-driven approaches to text simplification." Thesis, University of Wolverhampton, 2016. http://hdl.handle.net/2436/601113.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Imbert, Jean-Louis. "Simplification des systèmes de contraintes numériques linéaires." Aix-Marseille 2, 1989. http://www.theses.fr/1989AIX22022.

Full text
Abstract:
Presentation of an algorithm which, given a system of linear numerical constraints S and a set of variables V, computes a system of linear constraints S' involving only the variables in V, and whose solutions are exactly the projections onto V of the solutions of the initial system. This algorithm consists mainly of two sub-algorithms. The first, which is linear, simplifies the subsystem of equations. The second, based on Fourier-Motzkin elimination, simplifies the subsystem of inequalities.
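As a rough illustration of the step the second sub-algorithm builds on, here is a minimal sketch of one round of Fourier-Motzkin elimination over inequalities of the form a·x ≤ b. The dense-list representation and function names are illustrative choices, not Imbert's formulation.

```python
# Minimal sketch of Fourier-Motzkin elimination: project a system of
# linear inequalities a·x <= b onto the remaining variables by removing
# one variable x_k. Illustrative representation, not Imbert's algorithm.

from typing import List, Tuple

Inequality = Tuple[List[float], float]  # (coefficients a, bound b) meaning a·x <= b

def eliminate_variable(system: List[Inequality], k: int) -> List[Inequality]:
    """Return an equivalent system (on the other variables) without x_k."""
    lower, upper, rest = [], [], []
    for coeffs, bound in system:
        c = coeffs[k]
        if c > 0:
            upper.append((coeffs, bound))   # gives an upper bound on x_k
        elif c < 0:
            lower.append((coeffs, bound))   # gives a lower bound on x_k
        else:
            rest.append((coeffs, bound))    # x_k does not occur
    # Combine every lower bound with every upper bound to cancel x_k.
    for lc, lb in lower:
        for uc, ub in upper:
            scale_l, scale_u = uc[k], -lc[k]   # both positive scalars
            new_coeffs = [scale_l * a + scale_u * b for a, b in zip(lc, uc)]
            new_bound = scale_l * lb + scale_u * ub
            new_coeffs[k] = 0.0
            rest.append((new_coeffs, new_bound))
    return rest

if __name__ == "__main__":
    # x + y <= 4 and -x <= 0 (i.e. x >= 0); eliminating x leaves y <= 4.
    system = [([1.0, 1.0], 4.0), ([-1.0, 0.0], 0.0)]
    print(eliminate_variable(system, 0))  # [([0.0, 1.0], 4.0)]
```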
APA, Harvard, Vancouver, ISO, and other styles
44

Fišer, Petr. "Simplification of quantum circuits for modular exponentiation." Master's thesis, Vysoká škola ekonomická v Praze, 2015. http://www.nusl.cz/ntk/nusl-261826.

Full text
Abstract:
This thesis builds on the previous thesis "Security of modern encryption protocols", in which we introduced a new paradigm for constructing quantum circuits. We built circuits for modular arithmetic (addition, multiplication and exponentiation) in order to break the El-Gamal asymmetric cryptosystem. The current thesis reviews all of the proposed circuits and discusses possibilities for their further optimization, with the goal of lowering the number of qubits used by at least an order of magnitude. It also shows that this is not possible due to the existence of COPY gates, which make the design inherently unoptimizable. Getting rid of COPY gates is, however, not possible without a substantial rewrite of the whole paradigm. The overall estimate of the number of qubits used in the circuits thus remains O(log(m)log^2(N)), where m is the processed number and N is the modulus. The thesis also proposes an optimization of the modular multiplication circuit that, if the issues with COPY gates are resolved, would lower the number of qubits used by about O(log(m)) at the price of a longer execution time.
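For orientation, the sketch below shows the classical operation that such circuits implement reversibly, modular exponentiation by repeated squaring, together with a direct instantiation of the O(log(m)·log²(N)) qubit estimate quoted in the abstract. The unit constant factor in the estimate is an assumption made for illustration only.

```python
# Minimal sketch of the classical operation the circuits implement
# reversibly: modular exponentiation by square-and-multiply. The qubit
# estimate just instantiates the O(log(m) * log(N)^2) figure from the
# abstract with an assumed constant factor of 1 (illustration only).

import math

def mod_exp(base: int, exponent: int, modulus: int) -> int:
    """Compute base**exponent mod modulus by repeated squaring."""
    result = 1
    base %= modulus
    while exponent > 0:
        if exponent & 1:                 # multiply in the current square
            result = (result * base) % modulus
        base = (base * base) % modulus   # square for the next bit
        exponent >>= 1
    return result

def rough_qubit_estimate(m: int, modulus: int) -> float:
    """Order-of-magnitude qubit count O(log(m) * log(N)^2)."""
    return math.log2(m) * math.log2(modulus) ** 2

if __name__ == "__main__":
    print(mod_exp(7, 128, 561))                  # same result as pow(7, 128, 561)
    print(rough_qubit_estimate(2**64, 2**1024))  # ~ 64 * 1024^2
```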
APA, Harvard, Vancouver, ISO, and other styles
45

Bao, Yu Wu. "The generation and simplification of isosurface in ViSC." Thesis, University of Macau, 2005. http://umaclib3.umac.mo/record=b1636962.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Yirci, Murat. "A Comparative Study On Polygonal Mesh Simplification Algorithms." Master's thesis, METU, 2008. http://etd.lib.metu.edu.tr/upload/12610074/index.pdf.

Full text
Abstract:
Polygonal meshes are a common way of representing 3D surface models in many different areas of computer graphics and geometry processing. However, these models are becoming more and more complex, which increases the cost of processing them. In order to reduce this cost, mesh simplification algorithms have been developed. Another important property of a polygonal mesh model is whether or not it is regular. Regular meshes have many advantages over irregular ones in terms of memory requirements, efficient processing, rendering, etc. In this thesis work, both mesh simplification and regular remeshing algorithms are studied. Moreover, some of the popular mesh libraries are compared with respect to their approaches to, and performance on, mesh simplification. In addition, mesh models with disk topology are remeshed and converted to regular ones.
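As a minimal illustration of what a mesh simplification algorithm does, the sketch below implements simple uniform-grid vertex clustering: vertices falling into the same cell are merged and degenerate faces are dropped. It is a generic baseline for illustration, not one of the specific algorithms or libraries compared in the thesis.

```python
# Minimal sketch of mesh simplification by uniform-grid vertex clustering:
# merge all vertices that fall in the same grid cell (averaging their
# positions) and discard faces that collapse. Illustrative baseline only.

from collections import defaultdict
from typing import Dict, List, Tuple

Vertex = Tuple[float, float, float]
Face = Tuple[int, int, int]

def simplify_by_clustering(vertices: List[Vertex], faces: List[Face],
                           cell_size: float) -> Tuple[List[Vertex], List[Face]]:
    # Assign every vertex to a grid cell and collect the points per cell.
    cell_of_vertex: List[Tuple[int, int, int]] = []
    cell_points: Dict[Tuple[int, int, int], List[Vertex]] = defaultdict(list)
    for v in vertices:
        cell = tuple(int(c // cell_size) for c in v)
        cell_of_vertex.append(cell)
        cell_points[cell].append(v)
    new_index = {cell: i for i, cell in enumerate(cell_points)}
    # One representative vertex per cell: the average of its points.
    new_vertices = [
        tuple(sum(coord) / len(pts) for coord in zip(*pts))
        for pts in cell_points.values()
    ]
    # Remap faces and drop those collapsed to fewer than three distinct vertices.
    new_faces = []
    for a, b, c in faces:
        fa, fb, fc = (new_index[cell_of_vertex[i]] for i in (a, b, c))
        if len({fa, fb, fc}) == 3:
            new_faces.append((fa, fb, fc))
    return new_vertices, new_faces

if __name__ == "__main__":
    verts = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
    tris = [(0, 2, 3), (1, 2, 3)]
    print(simplify_by_clustering(verts, tris, cell_size=0.5))
```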
APA, Harvard, Vancouver, ISO, and other styles
47

Repke, Stefan [Verfasser]. "Simplification problems for automata and games / Stefan Repke." Aachen : Hochschulbibliothek der Rheinisch-Westfälischen Technischen Hochschule Aachen, 2014. http://d-nb.info/1058850857/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Batsos, Epameinondas, and Atta Rabbi. "Clustering and cartographic simplification of point data set." Thesis, KTH, Geodesi och geoinformatik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-79892.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Syed, Imtiaz Husain Electrical Engineering & Telecommunications Faculty of Engineering UNSW. "Channel shortening equalizers for UWB receiver design simplification." Publisher: University of New South Wales. Electrical Engineering & Telecommunications, 2008. http://handle.unsw.edu.au/1959.4/41473.

Full text
Abstract:
Ultra Wideband (UWB) communication systems occupy large bandwidths with very low power spectral densities. This feature makes UWB channels highly rich in multipaths. To exploit this temporal diversity, a UWB receiver usually incorporates Rake reception. Each multipath in the channel carries just a fraction of the signal energy. This phenomenon dictates a Rake receiver with a large number of fingers to achieve good energy capture and output signal-to-noise ratio (SNR). As a result, the Rake structure becomes very complex from analysis and design perspectives and incurs higher manufacturing costs. The first contribution of this thesis is to propose channel shortening, or time domain equalization, as a technique to reduce the complexity of the UWB Rake receiver. It is shown that most existing channel shortening equalizer (CSE) designs are either system-specific or optimize a parameter that is not critical, or even available, in UWB systems. The CSE designs that are more generic and use commonly critical cost functions may perform poorly due to particular UWB channel profiles and their related statistical properties. Consequently, the main contribution of the thesis is to propose several CSE designs that address the specific needs of UWB systems. These CSE designs exploit not only general but also UWB-specific features to perform the task more efficiently. The comparative analysis of the proposed CSEs, some existing designs and conventional Rake structures leads to the conclusion that the use of a CSE at the receiver front end greatly simplifies the Rake structure and the associated signal processing.
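The energy-capture argument in the abstract can be illustrated numerically: given a power-delay profile, count how many of the strongest taps (Rake fingers) are needed to collect a target share of the total channel energy. The exponentially decaying profile below is an illustrative assumption, not a measured UWB channel model from the thesis.

```python
# Minimal sketch of the energy-capture argument: in a dense multipath
# channel, how many Rake fingers (strongest taps) are needed to collect
# a target fraction of the total channel energy? The synthetic channel
# with exponentially decaying tap power is an illustrative assumption.

import random

def fingers_for_energy_capture(tap_gains, target_fraction=0.85):
    """Return the number of strongest taps needed to reach the target
    share of total energy, combining taps in decreasing-energy order."""
    energies = sorted((g * g for g in tap_gains), reverse=True)
    total = sum(energies)
    captured = 0.0
    for count, e in enumerate(energies, start=1):
        captured += e
        if captured >= target_fraction * total:
            return count
    return len(energies)

if __name__ == "__main__":
    random.seed(0)
    # Illustrative channel: 100 taps with exponentially decaying average power.
    taps = [random.gauss(0.0, 1.0) * (0.97 ** k) for k in range(100)]
    print(fingers_for_energy_capture(taps, target_fraction=0.85))
```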
APA, Harvard, Vancouver, ISO, and other styles
50

Lindstrom, Peter. "Model simplification using image and geometry-based metrics." Diss., Georgia Institute of Technology, 2000. http://hdl.handle.net/1853/8208.

Full text
APA, Harvard, Vancouver, ISO, and other styles