Dissertations / Theses on the topic 'Pattern posets'

1

Cervetti, Matteo. "Pattern posets: enumerative, algebraic and algorithmic issues." Doctoral thesis, Università degli studi di Trento, 2003. http://hdl.handle.net/11572/311140.

Abstract:
The study of patterns in combinatorial structures has grown over the past few decades into one of the most active research trends in combinatorics. Historically, the study of permutations constrained not to contain subsequences ordered in various prescribed ways was motivated by the problem of sorting permutations with certain devices. However, the richness of this notion became especially evident from its plentiful appearances in several very different disciplines, such as pure mathematics, mathematical physics, computer science, biology, and many others. In recent decades, similar notions of patterns have been considered on discrete structures other than permutations, such as integer sequences, lattice paths, graphs, matchings and set partitions. In the first part of this talk I will introduce the general framework of pattern posets and some classical problems about patterns. In the second part I will present some enumerative results obtained in my PhD thesis about patterns in permutations, lattice paths and matchings. In particular, I will describe a generating tree with a single label for permutations avoiding the vincular pattern 1-32-4, a finite-automata approach to enumerating lattice excursions avoiding a single pattern, and some results about matchings avoiding juxtapositions and liftings of patterns.
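To make the notion of pattern containment concrete: a permutation contains a classical pattern when some subsequence is order-isomorphic to it (vincular patterns such as 1-32-4 additionally require certain entries to be adjacent, which this sketch ignores). A minimal brute-force test, offered as an illustration rather than as code from the thesis:

```python
from itertools import combinations

def contains(perm, pattern):
    """True if perm contains pattern as a classical pattern, i.e. some
    subsequence of perm is order-isomorphic to pattern."""
    k = len(pattern)
    for sub in combinations(perm, k):  # subsequences keep left-to-right order
        if all((sub[i] < sub[j]) == (pattern[i] < pattern[j])
               for i in range(k) for j in range(i + 1, k)):
            return True
    return False
```

For example, 41532 contains 132 (via the subsequence 1, 5, 3), while 321 avoids 12 because it has no ascending pair.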
2

Cervetti, Matteo. "Pattern posets: enumerative, algebraic and algorithmic issues." Doctoral thesis, Università degli studi di Trento, 2021. http://hdl.handle.net/11572/311152.

Abstract:
The study of patterns in combinatorial structures has grown over the past few decades into one of the most active research trends in combinatorics. Historically, the study of permutations constrained not to contain subsequences ordered in various prescribed ways was motivated by the problem of sorting permutations with certain devices. However, the richness of this notion became especially evident from its plentiful appearances in several very different disciplines, such as pure mathematics, mathematical physics, computer science, biology, and many others. In recent decades, similar notions of patterns have been considered on discrete structures other than permutations, such as integer sequences, lattice paths, graphs, matchings and set partitions. In the first part of this talk I will introduce the general framework of pattern posets and some classical problems about patterns. In the second part I will present some enumerative results obtained in my PhD thesis about patterns in permutations, lattice paths and matchings. In particular, I will describe a generating tree with a single label for permutations avoiding the vincular pattern 1-32-4, a finite-automata approach to enumerating lattice excursions avoiding a single pattern, and some results about matchings avoiding juxtapositions and liftings of patterns.
4

Jung, JiYoon. "ANALYTIC AND TOPOLOGICAL COMBINATORICS OF PARTITION POSETS AND PERMUTATIONS." UKnowledge, 2012. http://uknowledge.uky.edu/math_etds/6.

Abstract:
In this dissertation we first study partition posets and their topology. For each composition c we show that the order complex of the poset of pointed set partitions is a wedge of spheres of the same dimension with the multiplicity given by the number of permutations with descent composition c. Furthermore, the action of the symmetric group on the top homology is isomorphic to the Specht module of a border strip associated to the composition. We also study the filter of pointed set partitions generated by knapsack integer partitions. In the second half of this dissertation we study descent avoidance in permutations. We extend the notion of consecutive pattern avoidance to considering sums over all permutations where each term is a product of weights depending on each consecutive pattern of a fixed length. We study the problem of finding the asymptotics of these sums. Our technique is to extend the spectral method of Ehrenborg, Kitaev and Perry. When the weight depends on the descent pattern, we show how to find the equation determining the spectrum. We give two length 4 applications, and a weighted pattern of length 3 where the associated operator only has one non-zero eigenvalue. Using generating functions we show that the error term in the asymptotic expression is the smallest possible.
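As a small aside on the combinatorics involved, the descent composition mentioned above is easy to compute: its parts are the lengths of the maximal ascending runs of the permutation, with run boundaries at the descents. An illustrative Python function (this edit's sketch, not code from the dissertation):

```python
def descent_composition(perm):
    """Composition of n whose parts are the lengths of the maximal
    ascending runs of perm; run boundaries sit at the descents,
    i.e. positions i with perm[i] > perm[i + 1]."""
    descents = [i for i in range(len(perm) - 1) if perm[i] > perm[i + 1]]
    bounds = [0] + [d + 1 for d in descents] + [len(perm)]
    return [bounds[i + 1] - bounds[i] for i in range(len(bounds) - 1)]
```

For instance, 2143 has descents after positions 1 and 3, giving the composition (1, 2, 1) of 4.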
5

Hainzl, Sebastian, Frank Scherbaum, and Gert Zöller. "Spatiotemporal earthquake patterns : [Poster]." Universität Potsdam, 2006. http://www.uni-potsdam.de/imaf/events/ge_work0602.html.

6

Kuhnert, Matthias, Andreas Güntner, Mechthild Klann, Garrido F. Martin, and Birgit Zillgens. "Methods for spatial pattern comparison in distributed hydrological modelling : [Poster]." Universität Potsdam, 2006. http://www.uni-potsdam.de/imaf/events/ge_work0602.html.

Abstract:
The rigorous development, application and validation of distributed hydrological models requires evaluating data in a spatially distributed way. In particular, spatial model predictions, such as the distribution of soil moisture, runoff-generating areas, nutrient-contributing areas or erosion rates, are to be assessed against spatially distributed observations. Model inputs, such as the distribution of modelling units derived from GIS and remote sensing analyses, should also be evaluated against ground-based observations of landscape characteristics. So far, however, quantitative methods of spatial field comparison have rarely been used in hydrology.

In this paper, we present algorithms for comparing observed and simulated spatial hydrological data. The methods can be applied to binary and categorical data on regular grids. They comprise cell-by-cell algorithms, cell-neighbourhood approaches that account for fuzziness of location, and multi-scale algorithms that evaluate the similarity of spatial fields with changing resolution. All methods provide a quantitative measure of the similarity of two maps.
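As a toy illustration of the simplest members of such a family (not the authors' actual algorithms), a cell-by-cell agreement score and a cell-neighbourhood hit rate for binary maps on regular grids might look like:

```python
def cell_by_cell(a, b):
    """Fraction of grid cells on which two binary maps agree."""
    n = sum(len(row) for row in a)
    same = sum(x == y for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    return same / n

def fuzzy_hit_rate(obs, sim, radius=1):
    """Fraction of observed '1' cells matched by a simulated '1'
    within `radius` cells, tolerating small location errors."""
    rows, cols = len(obs), len(obs[0])
    hits = total = 0
    for i in range(rows):
        for j in range(cols):
            if obs[i][j]:
                total += 1
                if any(sim[x][y]
                       for x in range(max(0, i - radius), min(rows, i + radius + 1))
                       for y in range(max(0, j - radius), min(cols, j + radius + 1))):
                    hits += 1
    return hits / total if total else 1.0
```

The fuzzy variant counts an observed cell as matched when a simulated cell of the same class lies anywhere in its neighbourhood, which is the intuition behind accounting for fuzziness of location.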

The comparison methods are applied in two mountainous catchments in southern Germany (Brugga, 40 km2) and Austria (Löhnersbach, 16 km2). As an example of binary hydrological data, the distribution of saturated areas is analyzed in both catchments. For categorical data, vegetation zones that are associated with different runoff generation mechanisms are analyzed in the Löhnersbach. Mapped spatial patterns are compared to simulated patterns from terrain index calculations and from satellite image analysis. It is discussed how particular features of visual similarity between the spatial fields are captured by the quantitative measures, leading to recommendations on suitable algorithms in the context of evaluating distributed hydrological models.



Interdisziplinäres Zentrum für Musterdynamik und Angewandte Fernerkundung
Workshop, 9-10 February 2006
7

Riedel, Michael R., Martin A. Ziemann, and Roland Oberhänsli. "Pattern dynamics applied to the kinetics of mineral phase transformations : [Poster]." Universität Potsdam, 2006. http://www.uni-potsdam.de/imaf/events/ge_work0602.html.

8

Doktor, Daniel, Franz-W. Badeck, Alberte Bondeau, Dirk Koslowsky, Jörg Schaber, and Murdock McAllister. "Using satellite imagery and ground observations to quantify the effect of intra-annually changing temperature patterns on spring time phenology : [Poster]." Universität Potsdam, 2006. http://www.uni-potsdam.de/imaf/events/ge_work0602.html.

9

Torreglosa, Camila Ragne. "Padrões alimentares e fatores de risco em indivíduos com doença cardiovascular." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/89/89131/tde-23022015-162319/.

Abstract:
Cardiovascular diseases (CVD) are the leading cause of mortality and disability, in both genders, in Brazil and worldwide. Dietary patterns are both positively and negatively associated with the main risk factors for CVD, among them diabetes, hypertension, obesity and hypertriglyceridemia, all components of the metabolic syndrome. This study aims to identify dietary patterns in individuals with CVD, considering the energy density, saturated fat, fiber, sodium and potassium consumed, and to investigate their association with CVD risk factors and the metabolic syndrome. It is a cross-sectional study using data from the "DICA Br" study. The sample consisted of individuals with CVD, older than 45 years, from all Brazilian regions. Food consumption was obtained by a 24-hour diet recall, and dietary patterns by reduced rank regression (RRR), with 28 food groups as predictors and dietary components as response variables. The Mann-Whitney test was used to test differences between factor score means. Data from 1,047 participants were analyzed; 95% had coronary artery disease, and most were elderly, belonged to economic classes C1 and C2, and had studied up to high school. The prevalence of the metabolic syndrome was 58%. Two dietary patterns were extracted. The first, marked by higher consumption of dietary fiber and potassium, was composed of rice and beans, fruits and natural juices with or without sugar, vegetables, beef or processed meat, greens, roots and tubers. The second, characterized by saturated fat consumption and higher energy density, was represented by savoury baked goods, fats, beef and processed meat, homemade sweets, pizza, packaged or party snacks, sandwiches and ready-to-eat salty foods.
There was a significant association between dietary pattern 1 and adequate waist circumference and HDL levels, and between pattern 2 and adequate HDL. Adoption of dietary pattern 1 may be associated with protection against some components of the metabolic syndrome.
10

Alberts, Stefan Francois. "Real-time Software Hand Pose Recognition using Single View Depth Images." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/86442.

Abstract:
Thesis (MEng)--Stellenbosch University, 2014.
ENGLISH ABSTRACT: The fairly recent introduction of low-cost depth sensors such as Microsoft’s Xbox Kinect has encouraged a large amount of research on the use of depth sensors for many common Computer Vision problems. Depth images are advantageous over normal colour images because of how easily objects in a scene can be segregated in real-time. Microsoft used the depth images from the Kinect to successfully separate multiple users and track various larger body joints, but has difficulty tracking smaller joints such as those of the fingers. This is a result of the low resolution and noisy nature of the depth images produced by the Kinect. The objective of this project is to use the depth images produced by the Kinect to remotely track the user’s hands and to recognise the static hand poses in real-time. Such a system would make it possible to control an electronic device from a distance without the use of a remote control. It can be used to control computer systems during computer aided presentations, translate sign language and to provide more hygienic control devices in clean rooms such as operating theatres and electronic laboratories. The proposed system uses the open-source OpenNI framework to retrieve the depth images from the Kinect and to track the user’s hands. Random Decision Forests are trained using computer generated depth images of various hand poses and used to classify the hand regions from a depth image. The region images are processed using a Mean-Shift based joint estimator to find the 3D joint coordinates. These coordinates are finally used to classify the static hand pose using a Support Vector Machine trained using the libSVM library. The system achieves a final accuracy of 95.61% when tested against synthetic data and 81.35% when tested against real world data.
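The Mean-Shift based joint estimation step can be illustrated in miniature. The sketch below is a generic flat-kernel mean-shift iteration on 3D points (an assumed simplification, not the thesis implementation): the estimate repeatedly moves to the centroid of the points within a bandwidth until it settles on a local density mode.

```python
import math

def mean_shift_mode(points, start, bandwidth, iters=30):
    """Flat-kernel mean shift in 3D: repeatedly replace the current
    estimate with the centroid of the points within `bandwidth`."""
    x = list(start)
    for _ in range(iters):
        near = [p for p in points if math.dist(p, x) <= bandwidth]
        if not near:
            break  # no support within the kernel window
        x = [sum(coords) / len(near) for coords in zip(*near)]
    return tuple(x)
```

Applied to the classified hand-region pixels in 3D, such an iteration drifts toward the densest cluster of points, which is why it is a natural joint-position estimator.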
11

Bourdis, Nicolas. "Détection de changements entre vidéos aériennes avec trajectoires arbitraires." Phd thesis, Telecom ParisTech, 2013. http://tel.archives-ouvertes.fr/tel-00834717.

Abstract:
Activities based on the exploitation of video data have developed at a staggering pace in recent years. Not only have some of these activities, such as video surveillance, become commonplace, but operational applications have also diversified considerably (e.g. natural resource monitoring, aerial and soon satellite reconnaissance). However, the volume of video data generated today is enormous, and the effectiveness of the corresponding activities is limited by the cost and time required for human interpretation of these data. Automatic analysis of video streams has therefore become a crucial problem for many applications. The work carried out in this thesis falls within this context and focuses on the automatic analysis of aerial videos. Beyond the data-volume problem, this type of video is particularly difficult for an image analyst to exploit, owing to viewpoint variations, narrow fields of view, poor image quality, and so on. To address these difficulties, we chose a semi-automatic system that assists the image analyst by suggesting zones of potential interest through change detection. More precisely, the approach developed in this thesis seeks to exploit the available data to their full potential, in order to minimise the effort required from the user and maximise detection performance. To this end, we build a three-dimensional model of the appearances observed in reference videos. This model then allows online detection of significant changes in a new video by identifying appearance deviations from the reference models.
Specific techniques were also proposed for estimating acquisition parameters and attenuating illumination effects. In addition, we developed several consolidation techniques that exploit prior knowledge about the changes to be detected. The value of our change-detection approach is demonstrated in this thesis through a careful and systematic evaluation, carried out on real and synthetic data to analyse, on the one hand, the robustness of the approach to realistic perturbations (e.g. noise, compression artefacts, complex appearances and effects, etc.) and, on the other, the accuracy of the results under controlled conditions.
12

Baumgarten, Lars. "Gesteinsmechanische Versuche und petrophysikalische Untersuchungen – Laborergebnisse und numerische Simulationen." Doctoral thesis, Technische Universitaet Bergakademie Freiberg Universitaetsbibliothek "Georgius Agricola", 2016. http://nbn-resolving.de/urn:nbn:de:bsz:105-qucosa-202612.

Abstract:
Triaxial compression tests can be carried out as single-stage tests, as multi-stage tests, or as tests with continuous failure states. When the multi-stage technique is applied, questions arise in particular regarding the correct choice of the switch-over point and the optimal course of the stress path between the individual test stages. For tests with continuous failure states, it remains questionable whether the stress states recorded during the test actually represent the peak strength of the investigated material. This dissertation takes up these questions, provides an introduction to the subject, and establishes the prerequisites needed to solve the problems described. On the basis of an extensive database of rock-mechanical and petrophysical parameters, a numerical model was developed that satisfactorily reproduces the stress-strain, strength and failure behaviour of a sandstone in direct tension and uniaxial compression tests as well as in triaxial compression tests. The strength behaviour of the developed model was analysed in multi-stage tests with different stress paths and compared with the corresponding laboratory findings.
13

Wu, Tien-Yun, and 吳天韻. "Continuous Pattern Design Applied to The Taipei Image Posters." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/95113018609209504598.

Abstract:
Master's thesis
National Taiwan Normal University
Department of Fine Arts
Academic year 101
The study starts by exploring how different types and compositions of patterns developed in prehistoric, Eastern, and Western cultures through inheritance and interaction, from a historical perspective. According to the literature review, there are four types of patterns (figural, abstract, text, and compound) and two compositions of continuous patterns: the two-sides continuous pattern and the square-faced continuous pattern. The composition of a picture that uses continuous patterns can therefore be categorized into four methods: symmetrical, grid, inlay, and scattered composition. This study discusses the effects of combining these four composition methods to create pattern designs on the theme of the Taipei image. In addition, in the hope of recalling viewers' memories, the poster is chosen as the medium to present the pattern designs and the rich imagery of Taipei. The literature review shows that pattern layouts are mostly created with a single composition method; applying several approaches at once is rare. The practices in this study therefore combine different composition methods to create patterns. The results suggest that symmetrical composition is the easiest method of all, but that mixing different composition techniques can produce richer visual effects and glamorize the pattern.
14

Walsh, Kimberly R. "Bullying on Teen Television: Patterns across Portrayals and Fan Forum Posts." 2012. https://scholarworks.umass.edu/theses/960.

Abstract:
The primary goal of this thesis was to provide a snapshot of the portrayal of bullying on teen television. Drawing from contextual factors studied in the National Television Violence Study (Smith et al., 1998), a content analysis of 82 episodes (representing 10 series) and 355 acts of bullying was conducted to examine portrayals of physical, verbal, indirect, and cyber bullying in terms of bully and victim social status, motivations, humor, punishments/rewards, character support for bullies, harm shown to victims, interventions by third parties, and anti-bullying episode themes. The analysis revealed significant differences across bullying types for all variables except third party intervention, with portrayals of physical and verbal bullying identified as most “high-risk” (i.e. depicting bullying in ways that research suggests increase the likelihood of negative effects), and portrayals of cyber bullying identified as least “high-risk” for the majority of contextual elements. More generally, the analysis demonstrated that a substantial amount of bullying on teen television sends some concerning messages to young viewers, including the notion that bullying can be funny, harmless, and go without punishment. Complementing the content analysis, an exploratory textual analysis of 294 online fan posts related to bullying portrayed on Glee was performed to capture a representation of potential audience interpretations and intertexts (consumed alongside the television text). The analysis pointed to four major themes across posts: categories of bullying, messages about bullying promoted by characters, contextual elements of bullying, and feelings about characters involved in bullying. In terms of audience responses, the themes highlighted how some fans think critically about bullying portrayals and their implications, distinguish between different types of bullying, and identify with characters. 
In terms of intertexts, the trends suggested that fans might be exposed to a variety of messages that both criticize and support high-risk depictions of bullying, and defend and rebuke bullying behavior (depending on the characters involved). Combined, the content analysis and textual analysis underlined the importance of media bullying as a topic of scholarly inquiry, revealing that teen bullying is a unique and complex media phenomenon that audiences respond to and interpret in a multitude of ways.
15

Chen, Cheng. "Battling the Internet water army: detection of hidden paid posters." Thesis, 2012. http://hdl.handle.net/1828/4044.

Abstract:
Online social media, such as news websites and community question answering (CQA) portals, have made useful information accessible to more people. However, many online comment areas and communities are flooded with fraudulent information. These messages come from a special group of online users called online paid posters, termed the "Internet water army" in China, who represent a new type of online job: paid posters are paid to post comments or articles on different online communities and websites for hidden purposes, e.g. to influence the opinion of other people towards certain social events or business markets. Though an interesting strategy in business marketing, paid posting can have a significant negative effect on online communities, since information from paid posters is usually not trustworthy. We thoroughly investigate the behavioral patterns of online paid posters based on real-world trace data from the social comments on a business conflict. We design and validate a new detection mechanism, including both non-semantic and semantic analysis, to identify potential online paid posters. Using supervised and unsupervised approaches, our tests with real-world datasets show very promising performance.
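For illustration, non-semantic behavioural features of the kind such a detector might analyse can be computed per account; the specific features below (posting rate, interval spread, duplicate ratio) are this sketch's assumptions, not the thesis's exact feature set.

```python
from statistics import mean, pstdev

def behavior_features(timestamps, messages):
    """Illustrative non-semantic features of one poster account:
    volume, regularity of posting intervals, and message duplication."""
    ts = sorted(timestamps)
    gaps = [b - a for a, b in zip(ts, ts[1:])] or [0.0]
    return {
        "posts": len(ts),
        "mean_gap": mean(gaps),     # average time between posts
        "gap_stdev": pstdev(gaps),  # burstiness: very low looks machine-like
        "duplicate_ratio": 1 - len(set(messages)) / len(messages),
    }
```

An account with a very regular posting cadence (low `gap_stdev`) and a high `duplicate_ratio` would be a natural candidate for closer semantic analysis.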
16

Rau, Jenchiang, and 饒真強. "Influences of the layout pattern and color of poster text on the span of comprehension." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/94043448831311722540.

Abstract:
Master's thesis
National Taiwan University of Science and Technology
Graduate Institute of Design
Academic year 93
This study investigated the influence of layout pattern and poster text color on the span of comprehension. Current styles of poster text layout were analyzed to understand the patterns of text layout, and a focus group, a survey, and card clustering were used to interpret poster text layout in terms of poster design trends. A survey on the span of comprehension was then conducted using the geometric patterns and text colors identified in that interpretation, and the collected data were analyzed qualitatively to investigate the correlations between geometric layout and text color on the one hand and span of comprehension on the other. The analysis of current styles indicated that geometric straight-line and free straight-line patterns, at 79% and 17% of the patterns surveyed respectively, are the most and second most common poster text layouts, while geometric curve and free curve patterns, at 3.5% and 0.5%, are the least common. Comparing geometric with free patterns, the former account for 82.5%, far more than the latter's 17.5%; geometric patterns are therefore the most common in poster text layout. The number of characters memorized was the criterion for the efficiency of the span of comprehension: the more characters remembered accurately, the greater the efficiency. The experimental results indicated that, of the two variables, layout pattern and text color, the latter is significantly correlated with the efficiency of the span of comprehension, with the best efficiency (number of characters remembered) observed for faded purple Middle Ming Font text against a blank background. Layout pattern combined with visual area, and visual area combined with text color, are also significantly correlated with the efficiency of the span of comprehension.
Among the visual areas, the greatest efficiency was observed in areas 2 and 5, and the most significant difference was observed between the square and pentagonal layout patterns, suggesting that the pentagonal pattern was the focus of respondents' observation among the geometric patterns.
APA, Harvard, Vancouver, ISO, and other styles
17

Chen, Yu-min, and 陳玉敏. "Auspiciousness-The Study and Application of Chinese Style on Poster Design-Take the Propitious Pattern for Example." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/74969638637286051242.

Full text
Abstract:
Master's thesis
Fu Jen Catholic University
Master's Program, Department of Applied Arts
95
Chinoiserie was originally a European cultural product of the 17th century, when exchange between the West and the East was intense. It refers to Europeans' interpretation of Chinese art and their imagination of Chinese customs and practices; besides Chinese and European cultures, influences of other Eastern countries were also absorbed. Because of the introduction of Japanese style and the autarky of China, the Chinese-style trend began to fade in the middle of the 19th century. However, with the economic boom in Asian countries and the awakening of local consciousness at the end of the 20th century, designers in Taiwan and China began to explore a series of Chinese design styles and elements to meet the demands of an extensive market. Encouragingly, these designers' efforts have begun to pay off internationally. However, in the effort to establish uniquely Chinese designs, some superficial and formulaic Chinoiserie designs are not recognized by Chinese people; worse, they fail to provide proper solutions for Chinese design. The broad concept of "Chinese style" is the theme of this research. The history of early Chinoiserie in Europe is discussed, as are the experiments, exploration, and inheritance of the new Chinese style in Asia. In addition to surveying the current situation of graphic design in Taiwan, Hong Kong and China, the study focuses on poster design to probe Chinese design ideas and the application of Chinese elements. For the creative research, "propitious patterns", which are closely tied to Chinese daily life, are combined with current affairs, and nine posters were created in a manner of satirical imitation. The research concludes that Chinese style is not a matter of piecing fragmentary symbols together but of distilling Chinese culture and thought.
It is also believed that Chinese style is the essence of the old culture: what is simplified is its outward appearance, while its inner spirit remains unchanged. The study also appeals to society and the current educational system in Taiwan to regain enthusiasm for and respect toward Chinese culture.
APA, Harvard, Vancouver, ISO, and other styles
18

Koehoorn, M., and C. Breslin. "Self-reported work patterns and work-related injuries among high school students in British Columbia [poster presentation]." 2003. http://hdl.handle.net/2429/835.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Wei, Sz-Ping, and 魏思萍. "A Study on the Design Principle of Layout Patterns for the Poster by Using Eyeball Tracking System." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/78302054020762369712.

Full text
Abstract:
Master's thesis
Kun Shan University
Institute of Digital Living Technology
100
Judging the quality of a poster layout is difficult because every person's aesthetic response and appreciation preferences differ. Using an eye-tracking system, this study recorded the eye movements of two groups, one with an art and design background and the other with an information technology background. The observed posters were sorted according to four design principles: contrast, balance, rhythm, and unity. The observers' visual preferences were analyzed according to total observation time, gaze time, average time, and the trajectory of eye movements. According to the questionnaire, personal learning background affects art appreciation: the group with an art and design background responded more to posters built on the balance principle, while the group with an information technology background responded more to posters built on the contrast principle. The study shows that the art and design group spent much more time observing the posters; this group preferred to begin with a general observation and then gaze at specific objects, especially pictures. The information technology group, on the other hand, spent less time observing the posters, preferring general observation and spending more time on words. These findings yield two major design principles for poster designers seeking to attract specific groups of customers. First, for customers with an art and design background, the design should focus on the balance of the poster; to attract them, the layout should express its design ideas more through pictures than through words. Second, for customers with an information technology background, the design should focus on contrast; to attract them, the posters should have clear content, and the layout should emphasize words over pictures.
The goal of this study is to provide poster designers with design principles that attract customers of different backgrounds and help each type of customer easily grasp the information in the posters.
APA, Harvard, Vancouver, ISO, and other styles
20

Valentina, Giorgetti. "Ill-Posed Problems in Computer Vision." Doctoral thesis, 2022. http://hdl.handle.net/2158/1274371.

Full text
Abstract:
Visual reconstruction problems often have an ill-posed nature. In this thesis we deal with analyzing and solving three kinds of visual reconstruction problems: Blind Source Separation, Demosaicing and Deblurring. The demosaicing problem is related to the acquisition of RGB color images by means of CCD digital cameras. In the RGB model, each pixel of a digital color image is associated with a triple of numbers, which indicate the light intensity of the red, green and blue channel, respectively. However, most cameras use a single sensor, associated with a color filter that allows the measurement at each pixel of the reflectance of the scene at only one of the three colors, according to a given scheme or pattern, called the Color Filter Array (CFA). For this reason, at each pixel, the other two missing colors have to be estimated. Different CFAs have been proposed for the acquisition. The most common is the Bayer pattern. In this scheme, the number of pixels at which the green color is sampled is double that of the red and blue channels, because of the higher sensitivity of the human eye to the green wavelengths. If we decompose the acquired image into three channels, we obtain three downsampled grayscale images, so that demosaicing can be interpreted as interpolating grayscale images from sparse data. In most cameras, demosaicing is part of the processing required to obtain a visible image. The camera's built-in firmware is substantially based on fast local interpolation algorithms. Heuristic approaches, which do not try to solve an optimization problem defined in mathematical terms, are widely used in the literature. These methods are in general very fast. Our proposed technique is of a heuristic kind. In general, heuristic techniques consist of filtering operations, which are formulated by means of suitable observations on color images.
Non-adaptive algorithms, such as bilinear and bicubic interpolation, yield satisfactory results in smooth regions of an image, but they can fail in textured or edge areas. Edge-directed interpolation is an adaptive approach in which, by analyzing the area around each pixel, we choose the interpolation direction. In practice, the interpolation direction is chosen to avoid interpolating across edges. The algorithm presented here consists of three steps: the first two are initialization steps, while the third is an iterative step. In the first step, the missing values in the green component are determined; in particular, a weighted average-type technique is used. The weights are determined in an edge-directed approach, in which we also consider the possible edges in the red and blue components. In the second step, we determine the missing values in the red and blue components. In this case we use two alternative techniques, according to the position of the involved pixel in the Bayer pattern. In the first technique, the missing value is determined by imposing that the second derivative of the intensity values of the red/blue channel be equal to the second derivative of the intensity values of the green channel. This is done according to the approaches proposed in the AP algorithm and the regularization algorithm. In particular, a constraint is imposed to make the derivatives of all channels as similar as possible. In the third step, all values of the three channels are recursively updated by means of a constant-hue-based technique; in particular, we assume a constant color difference. The technique we propose at this step is similar to that used by W. T. Freeman: here, too, a median filter is employed, in order to correct small spurious imperfections. We repeat the third step iteratively.
However, to avoid increasing the computational cost excessively, we experimentally estimate that only four iterations are necessary to obtain an accurate demosaicing. We call our technique the Local Edge Preserving (LEP) algorithm. The results related to this technique have been published in A. Boccuto, I. Gerace, V. Giorgetti and M. Rinaldi, A Fast Algorithm for the Demosaicing Problem Concerning the Bayer Pattern. The Open Signal Processing Journal 6 (2019), 1–14. In this thesis, we also propose an algorithm for image demosaicing that does not work within the framework of the regularization approaches and is suited, in a natural way, to dealing with noisy data. More precisely, we propose an algorithm for joint demosaicing and denoising. Regularization requires the adoption of constraints on the solution. The constraints we consider are intra-channel and inter-channel local correlation. With respect to the intra-channel correlation, we assume the intensity of each channel to be locally regular, i.e. piecewise smooth, so that noise can also be removed. We describe this constraint through stabilizers, i.e. functions discouraging intensity discontinuities of first, second and third order in a selective way, so that those associated with true edges in the scene are left to emerge. This makes it possible to describe even very complex scenes. Indeed, first order local smoothness characterizes images consisting of constant patches, second order local smoothness describes patches whose pixel values vary linearly, while third order local smoothness is used to represent images made up of quadratic-valued patches. As for the inter-channel correlation, we enforce it in correspondence with the intensity discontinuities, by means of constraints that promote their amplitude in the three channels to be equal almost everywhere.
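As a point of reference for the non-adaptive baselines discussed above (bilinear interpolation on the Bayer pattern), a minimal sketch follows. The RGGB tile arrangement, function names, and normalized-convolution formulation are illustrative assumptions for this sketch; this is the textbook baseline, not the thesis's LEP algorithm.

```python
import numpy as np

def conv2(img, k):
    """Zero-padded 2-D convolution (the kernels used here are symmetric)."""
    ph, pw = k.shape[0] // 2, k.shape[1] // 2
    p = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros(img.shape)
    for i in range(k.shape[0]):
        for j in range(k.shape[1]):
            out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def bayer_masks(h, w):
    """Boolean masks of sampled pixels for an assumed RGGB Bayer tile."""
    r = np.zeros((h, w), bool); r[0::2, 0::2] = True
    b = np.zeros((h, w), bool); b[1::2, 1::2] = True
    g = ~(r | b)  # green is sampled at twice the rate of red/blue
    return [r, g, b]

def bayer_mosaic(rgb):
    """Sample a full RGB image through the CFA: one color kept per pixel."""
    h, w, _ = rgb.shape
    masks = bayer_masks(h, w)
    cfa = np.zeros((h, w))
    for c in range(3):
        cfa[masks[c]] = rgb[..., c][masks[c]]
    return cfa

def bilinear_demosaic(cfa, masks):
    """Normalized bilinear interpolation of each channel from its sparse samples."""
    k = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])
    out = np.zeros(cfa.shape + (3,))
    for c in range(3):
        m = masks[c].astype(float)
        out[..., c] = conv2(cfa * m, k) / np.maximum(conv2(m, k), 1e-12)
    return out
```

On smooth regions this baseline is exact (a constant image is recovered perfectly), while at edges it produces the zipper and false-color artifacts that edge-directed schemes are designed to avoid.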
Note that none of these constraints is biased in favor of one of the three channels, nor is the geometry of the sampling pattern in any way exploited. Thus, the method we propose is completely independent of the CFA considered, although, in the experimental results section, we present its application to images mosaiced through the Bayer CFA. All the above constraints, including the data fidelity term, are merged into a non-convex energy function, whose minimizer is taken as our desired solution. The optimization is performed through an iterative deterministic algorithm entailing the minimization in sequence of a family of approximating functions that, starting with a first componentwise convex function, gradually converges to the original energy. Our regularization approach can produce image solutions that exhibit reliable discontinuities of both the intensity and the gradients, despite the necessary smoothness constraints. Therefore, we propose an edge-preserving regularization approach, which means that the significant discontinuities in the reconstructed image are geometrically consistent. In the very first works proposing edge-preserving regularization, the image discontinuities were often represented by means of extra, explicit variables, the so-called "line processes". In that way, it was relatively easy to formulate in terms of constraints the various properties required of significant discontinuities. Nevertheless, the use of explicit line variables entails large computational costs. Thus, so-called "duality theorems" were derived to demonstrate the edge-preserving properties of suitable stabilizers, without introducing extra variables. In particular, we developed duality theorems to determine the properties required for a stabilizer to implicitly manage lines with the desired regularity features.
In this work, we choose a suitable family of approximations with the peculiarity that each function satisfies the conditions required for an implicit treatment of geometrically significant edges, as expressed in the duality theorems. This allows the approximations to adhere better to the ideal energy function, and hence to cohere better with the properties required of the desired solution. In this thesis we also study a Blind Source Separation (BSS) problem. These topics have been widely investigated since the end of the last century, and have various applications. In particular, we analyze the digital reconstruction of degraded documents. We observe that weathering, dust, humidity, seeping of ink, mold and light transmission can cause the degradation of the paper and the ink of written text. Some of the consequences in damaged documents are, for instance, stains, noise, transparency of writing on the verso side and on close pages, unfocused or overlapping characters, and so on. Historically, the first restoration techniques for degraded documents were manual, and they led to a material restoration. Recently, thanks to the diffusion of scanners and of software for the reconstruction of images, videos, texts, photographs and films, several new techniques have been used in the recovery and restoration of deteriorated material, such as digital or virtual restoration. Digital imaging of documents is very important, because it provides digital archives that keep documents always accessible and readable. Digital Document Restoration consists of a set of processes aimed at the visual and aesthetic improvement of a virtual reconstruction of a corrupted document, without risk of deterioration. We deal with show-through and bleed-through effects.
Show-through is a front-to-back interference, caused by the transparency of the paper and the scanning process, through which the text on the recto side of the document can appear also on the verso side, and conversely. Bleed-through is an intrinsic front-to-back physical deterioration caused by ink seeping, and its effect is similar to that of show-through. The physical model of the show-through distortion is very complex, because it involves the spreading of light in the paper, the characteristics of the paper, the reflectance of the verso, and the transmittance parameters. Sharma gave a mathematical model, which was first analyzed and then further approximated so as to become easier to handle. This model describes the observed recto and verso images as mixtures of the two uncorrupted texts. Locally, we consider a classical linear and stationary recto-verso model developed for this purpose, and are concerned with the problem of estimating both the ideal source images of the recto and the verso of the document and the mixture matrix producing the bleed-through or show-through effects. This problem is ill-posed in the sense of Hadamard. In fact, as the estimated mixture matrix varies, the corresponding estimated sources are in general different, and thus infinitely many solutions exist. Many techniques to solve this ill-posed inverse problem have been proposed in the literature. Among them, the Independent Component Analysis (ICA) methods are based on the assumption that the sources are mutually independent. The best-known ICA technique is the so-called FastICA, which by means of a fixed point iteration finds an orthogonal rotation of the prewhitened data that maximizes a measure of non-Gaussianity of the rotated components. The FastICA algorithm is a parameter-free and extremely fast procedure, but ICA is not a viable approach in our setting, as for the problem we consider there is a clear correlation among the sources.
On the other hand, several techniques for ill-posed inverse problems require only that the estimated sources be mutually uncorrelated. In this case, the estimated sources are determined via a linear transformation of the data, which is obtained by imposing either an orthogonality condition, as in Principal Component Analysis (PCA), or an orthonormality condition, as in Whitening (W) and Symmetric Whitening (SW) techniques. These approaches all require only a single and very fast processing step. In [49, 156] it is observed that the results obtained by means of the SW method are substantially equivalent to those produced by an ICA technique in the symmetric mixing case. Here we assume that the sum of all rows of the mixing matrix is equal to one, since we expect the color of the background of the source to be the same as that of the data. In our setting, we change the variables of the data so that high and low light intensities correspond to presence and absence of text in the document, respectively, and we impose a nonnegativity constraint on the estimated sources. We define the overlapping matrix of both the observed data and the ideal sources, a quantity related to the cross-correlation between the signals. From the overlapping matrix we can deduce the overlapping level, which measures the similarity between the front and the back of the document. In order to obtain an accurate estimate of the sources, it is necessary to determine a correct source overlapping level. To this aim, we propose the following iterative procedure. At each iteration, given the current source overlapping level, we estimate the mixture matrix that produces the sources with the lowest possible source overlapping level among those having light intensity in the desired range. This mixture matrix is computed by means of a suitable symmetric factorization of the data overlapping matrix.
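The single-step decorrelation approaches mentioned above can be sketched as follows. This is the textbook symmetric-whitening step for a two-channel recto-verso mixture (decorrelating the data via the inverse symmetric square root of the sample covariance); the mixing matrix and random sources are hypothetical, and this is offered only as an illustration of the SW building block, not of the MATODS factorization itself.

```python
import numpy as np

def symmetric_whitening(X):
    """Given zero-mean observations X (2 x N: recto and verso signals),
    return Y = C^{-1/2} X with C = X X^T / N, so that Y Y^T / N = I.
    The symmetric square root is built from the eigendecomposition of C."""
    C = X @ X.T / X.shape[1]                 # sample covariance of the mixtures
    w, V = np.linalg.eigh(C)                 # C = V diag(w) V^T, with w > 0
    W = V @ np.diag(1.0 / np.sqrt(w)) @ V.T  # C^{-1/2}: the symmetric whitener
    return W @ X, W

# Two sources mixed front-to-back by a hypothetical mixing matrix A,
# whose off-diagonal entries play the role of the interference level.
rng = np.random.default_rng(0)
S = rng.standard_normal((2, 10_000))
A = np.array([[1.0, 0.4],
              [0.3, 1.0]])
Y, W = symmetric_whitening(A @ S)
```

By construction the whitened channels are exactly uncorrelated (their sample covariance is the identity). Recovering the actual sources additionally requires resolving the remaining rotation ambiguity, which is where assumptions such as the minimum source overlapping level come into play.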
We then use the estimated sources to update the source overlapping level, and iterate the procedure until a fixed point is reached. At the fixed point, the corresponding source overlapping level is the smallest one that allows the estimation of the ideal recto and verso sides with the desired properties. We consider this level an adequate estimate of the ideal source overlapping level. Thus, by means of this technique, we can estimate not only the ideal sources and the mixture matrix, but also the source overlapping level, a value that indicates the correlation between the ideal sources. Therefore, our method can be classified as a Correlated Component Analysis (CCA) technique. We refer to this method as the Minimum Amount of Text Overlapping in Document Separation (MATODS) algorithm. Like the FastICA technique, the MATODS algorithm is a parameter-free and extremely fast procedure. We use the MATODS algorithm to solve the non-stationary and locally linear model we propose, and in particular we present an extension of this technique that fits this model, which we call the Not Invariant for Translation MATODS (NIT-MATODS) algorithm. The related results have been published in A. Boccuto, I. Gerace and V. Giorgetti, A Blind Source Separation Technique for Document Restoration. SIAM J. Imaging Sci. 12 (2) (2019), 1135–1162. In this thesis we modify the MATODS algorithm to deal with the derivatives of the images of the original sources. In this case, we assume that the overlapping level is equal to zero. By means of our experimental results, we show that the proposed technique improves on the results obtained by MATODS in terms of both accuracy of the estimates and computational cost. We refer to this method as the Zero Edge Overlapping in Document Separation (ZEODS) algorithm. The obtained results are published in A. Boccuto, I. Gerace, V. Giorgetti and G. Valenti, A Blind Source Separation Technique for Document Restoration Based on Edge Estimation. 
http://viXra.org/abs/2201.0141 (2022). In [148], Sharma gave a mathematical model, which was first analyzed and then further approximated so as to become easier to handle. This model describes the observed recto and verso images as mixtures of the two uncorrupted texts. We now analyze in detail the iterative technique for solving such a model, in which the sources, the blur operators and the interference level are computed separately at every step, until a fixed point is found. In this work, in particular, we deal with determining the interference level, with the blur operators and the ideal sources held fixed. To this aim, we use a GNC-type technique. The steps of finding the blur operators and the ideal sources will be treated in forthcoming papers. The results concerning this technique have been published in A. Boccuto, I. Gerace and V. Giorgetti, Blind Source Separation in Document Restoration: an Interference Level Estimation. http://viXra.org/abs/2201.0050 (2022). The problem of restoring images consists of estimating the original image, starting from the observed image and the supposed blur. In our model, we assume the blur mask to be known. In general, this problem is ill-conditioned and/or ill-posed in the Hadamard sense. Thanks to known regularization techniques, it is possible to reduce this problem to a well-posed problem, whose solution is the minimum of the so-called primal energy function, which consists of the sum of two terms: the former indicates the faithfulness of the solution to the data, and the latter is connected with the regularity properties of the solution. In order to obtain more realistic restored images, the discontinuities in the intensity field are considered. Indeed, in images of real scenes, there are discontinuities in correspondence with the edges of the various objects. To deal with such discontinuities, we consider line variables.
It is possible to minimize the primal energy function a priori in these variables, so as to determine a dual energy function, which treats discontinuities implicitly. Indeed, minimizing the dual energy function is computationally more efficient than minimizing the primal energy function directly. In general, the dual energy function has a quadratic term, related to the faithfulness to the data, and a not necessarily convex addend, the regularization term. In order to link these two kinds of energy functions, suitable duality theorems are used. In order to improve the quality of the reconstructed images, it is possible to consider a dual energy function which implicitly treats Boolean line variables. The proposed duality theorems can be used even with such a function. However, the related dual energy function is not necessarily convex. So, to minimize it, we use a GNC-type technique, which takes as the first convex approximation the proposed convex dual energy function. It can be verified experimentally that the most expensive minimization is the first one, because the subsequent ones start from a good approximation of the solution. Hence, when we minimize the first convex approximation, we approximate every block of the blur operator by matrices whose product can be computed by a suitable fast discrete transform. As every block is a symmetric Toeplitz matrix, we deal with determining a class of matrices that is easy to handle from the computational point of view and yields a good approximation of Toeplitz matrices. Toeplitz-type linear systems arise from the numerical approximation of differential equations. Moreover, in the restoration of blurred images, one often deals with Toeplitz matrices. Thus, in this thesis we investigate a particular class, which is a sum of two families of simultaneously diagonalizable real matrices, whose elements we call β-matrices. This class includes both circulant and reverse circulant matrices.
Symmetric circulant matrices have several applications to ordinary and partial differential equations, image and signal restoration, and graph theory. Reverse circulant matrices have different applications, for instance in exponential data fitting and signal processing. The obtained results have been published in A. Boccuto, I. Gerace and V. Giorgetti, Image deblurring: a class of matrices approximating Toeplitz matrices, http://viXra.org/abs/2201.0155 (2022). The thesis is structured as follows. In Chapter 1 we deal with the demosaicing problem, proposing a fast technique which locally estimates the edges. In Chapter 2 we treat the same problem, giving a regularization technique for solving it. In Chapter 3 we consider the BSS problem for ancient documents, proposing a technique which uses symmetric factorizations. In Chapter 4 we modify the technique illustrated in the previous chapter, by introducing discontinuities. In Chapter 5 we deal with the BSS problem, giving a regularization technique, and in particular we study the estimates of the interference levels. In Chapter 6 we treat the problem of image deblurring, and in particular we analyze how symmetric Toeplitz operators can be approximated in the proposed GNC technique.
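To illustrate why circulant blocks are computationally convenient (they are simultaneously diagonalized by the discrete Fourier transform), here is a small sketch of an O(n log n) circulant matrix-vector product. It shows only the general FFT mechanism, not the β-matrix class developed in the thesis.

```python
import numpy as np

def circulant_matvec(c, x):
    """y = C x, where C is the circulant matrix with first column c:
    C[i, j] = c[(i - j) mod n].  Since C = F^{-1} diag(F c) F, with F
    the DFT matrix, the product reduces to one FFT/inverse-FFT pair."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# Dense check against the explicitly built circulant matrix.
n = 6
rng = np.random.default_rng(1)
c, x = rng.standard_normal(n), rng.standard_normal(n)
C = np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])
```

The same diagonalization idea underlies fast transforms for the other structured families mentioned above (reverse circulant matrices, for instance, are diagonalized by a closely related transform), which is what makes such classes attractive as computationally cheap approximations of symmetric Toeplitz blur blocks.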
APA, Harvard, Vancouver, ISO, and other styles
21

Baumgarten, Lars. "Gesteinsmechanische Versuche und petrophysikalische Untersuchungen – Laborergebnisse und numerische Simulationen." Doctoral thesis, 2015. https://tubaf.qucosa.de/id/qucosa%3A23031.

Full text
Abstract:
Triaxial compression tests can be carried out as single-stage tests, as multi-stage tests, or as tests with continuous failure states. When applying the multi-stage technique, questions arise in particular regarding the correct choice of the switching point and the optimal course of the stress path between the individual test stages. For tests with continuous failure states, it remains questionable whether the stress states recorded during the test actually represent the peak strength of the investigated material. The dissertation takes up these questions, provides an introduction to the subject, and establishes the prerequisites necessary for solving the problems listed. On the basis of an extensive database of rock-mechanical and petrophysical parameters, a numerical model was developed that satisfactorily reproduces the stress-strain, strength and failure behavior of a sandstone in direct tension and uniaxial compression tests as well as in triaxial compression tests. The strength behavior of the developed model was analyzed in multi-stage tests with different stress paths and compared with the corresponding laboratory findings.
APA, Harvard, Vancouver, ISO, and other styles