Dissertations / Theses on the topic 'Illumination analysis'

Consult the top 50 dissertations / theses for your research on the topic 'Illumination analysis.'

1

Bales, Michael Ryan. "Illumination compensation in video surveillance analysis." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/39535.

Abstract:
Problems in automated video surveillance analysis caused by illumination changes are explored, and solutions are presented. Controlled experiments are first conducted to measure the responses of color targets to changes in lighting intensity and spectrum. Surfaces of dissimilar color are found to respond significantly differently. Illumination compensation model error is reduced by 70% to 80% by individually optimizing model parameters for each distinct color region, and applying a model tuned for one region to a chromatically different region increases error by a factor of 15. A background model--called BigBackground--is presented to extract large, stable, chromatically self-similar background features by identifying the dominant colors in a scene. The stability and chromatic diversity of these features make them useful reference points for quantifying illumination changes. The model is observed to cover as much as 90% of a scene, and pixels belonging to the model are 20% more stable on average than non-member pixels. Several illumination compensation techniques are developed to exploit BigBackground, and are compared with several compensation techniques from the literature. Techniques are compared in terms of foreground / background classification, and are applied to an object tracking pipeline with kinematic and appearance-based correspondence mechanisms. Compared with other techniques, BigBackground-based techniques improve foreground classification by 25% to 43%, improve tracking accuracy by an average of 20%, and better preserve object appearance for appearance-based trackers. All algorithms are implemented in C or C++ to support the consideration of runtime performance. In terms of execution speed, the BigBackground-based illumination compensation technique is measured to run on par with the simplest compensation technique used for comparison, and consistently achieves twice the frame rate of the two next-fastest techniques.
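The core of the BigBackground idea, dominant scene colors as stable illumination references, can be sketched in a few lines of Python. The sketch below is our illustration, not the thesis's implementation: the bin count, the 90% coverage target, and the single global gain are illustrative stand-ins (the thesis fits compensation parameters per dominant-color region).

```python
import numpy as np

def dominant_color_mask(frame, n_bins=8, coverage=0.9):
    """Mark pixels whose quantized color falls in the most frequent bins that
    together cover `coverage` of the frame (illustrative parameters)."""
    q = (frame // (256 // n_bins)).reshape(-1, 3)       # coarse color quantization
    _, inv, counts = np.unique(q, axis=0, return_inverse=True,
                               return_counts=True)
    inv = inv.ravel()
    order = np.argsort(counts)[::-1]                    # most frequent bins first
    csum = np.cumsum(counts[order])
    k = int(np.searchsorted(csum, coverage * len(q))) + 1
    return np.isin(inv, order[:k]).reshape(frame.shape[:2])

def compensate(frame, reference, mask):
    """Scale the frame so its background (masked) mean matches the reference;
    the thesis instead tunes compensation per dominant-color region."""
    gain = reference[mask].mean() / max(frame[mask].mean(), 1e-6)
    return np.clip(frame.astype(float) * gain, 0, 255).astype(np.uint8)
```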
2

Noel, Laurent. "Discrete shape analysis for global illumination." Thesis, Paris Est, 2015. http://www.theses.fr/2015PESC1130/document.

Abstract:
Nowadays, computer-generated images can be found everywhere, in a wide range of applications such as video games, cinema, architecture, advertising, artistic design, virtual reality, scientific visualization, lighting engineering, etc. Consequently, the need for visual realism and fast rendering is constantly growing. Realistic rendering involves the estimation of global illumination through light transport simulation, a time-consuming process whose convergence rate generally decreases as the complexity of the input virtual 3D scene increases. In particular, occlusions and strong indirect illumination are global features of the scene that are difficult to handle efficiently with existing techniques. This thesis addresses this problem through the application of discrete shape analysis to rendering. Our main tool is a curvilinear skeleton of the empty space of the scene, a sparse graph containing important geometric and topological information about the structure of the scene. By taking advantage of this skeleton, we propose new methods to improve both real-time and off-line rendering. Concerning real-time rendering, we exploit the geometric information carried by the skeleton to approximate the shadows cast by a large set of virtual point lights representing the indirect illumination of the 3D scene. Regarding off-line rendering, our work focuses on algorithms based on path sampling, which constitute the main paradigm of state-of-the-art physically based rendering. Our skeleton leads to new efficient path sampling strategies guided by topological and geometric features. Addressing the same problem, we also propose a sampling strategy based on a second tool from discrete shape analysis: the opening function of the empty space of the scene, describing the local thickness of that space at each point. Our contributions demonstrate improvements over existing approaches and clearly indicate that discrete shape analysis offers many opportunities for the development of new rendering techniques.
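For readers unfamiliar with the many-lights setting the real-time part builds on, here is a minimal instant-radiosity-style gather in Python. The skeleton-based shadow approximation itself is not reproduced: the `visibility` argument is a stub standing in for it, and all names are ours.

```python
import numpy as np

def gather_vpls(p, n, vpls, visibility=lambda p, q: 1.0):
    """Diffuse radiance at surface point p (unit normal n) from a set of
    virtual point lights, each given as (position, normal, rgb_flux).
    `visibility` is a placeholder for the skeleton-based approximation."""
    total = np.zeros(3)
    for q, nq, flux in vpls:
        d = q - p
        r2 = max(float(d @ d), 1e-4)        # distance clamping tames VPL spikes
        w = d / np.sqrt(r2)
        g = max(float(n @ w), 0.0) * max(float(nq @ -w), 0.0) / r2
        total += flux * g * visibility(p, q)
    return total / np.pi                    # Lambertian BRDF factor
```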
3

Zhao, Shuyan. "Face analysis under near infrared illumination." Göttingen Cuvillier, 2008. http://d-nb.info/990811492/04.

4

Martinkauppi, B. (Birgitta). "Face colour under varying illumination - analysis and applications." Doctoral thesis, University of Oulu, 2002. http://urn.fi/urn:isbn:9514267885.

Abstract:
The colours of objects perceived by a colour camera depend on the illumination conditions. For example, when the prevailing illumination condition does not correspond to the one used in the white balancing of the camera, the object colours can change their appearance due to the lack of colour constancy capabilities. Many methods for colour constancy have been suggested, but so far their performance has been inadequate. Faces are common and important objects encountered in many applications. Therefore, this thesis is dedicated to studying face colours and their robust use under real-world illumination conditions. The main thesis statement is "knowledge about an object's colour, like skin colour changes under different illumination conditions, can be used to develop more robust techniques against illumination changes". Many face databases exist, and in some cases they contain colour images and even videos. However, from the point of view of this thesis these databases have several limitations: the unavailability of spectral data related to image acquisition, undefined illumination conditions during acquisition, and, where an illumination change is present at all, often only a change in illumination direction. To overcome these limitations, two databases, a Physics-Based Face Database and a Face Video Database, were created. In addition to the images, the Physics-Based Face Database includes a spectral data part comprising skin reflectances, the channel responsivities of the camera, and the spectral power distributions of the illumination. The images of faces were taken under four known light sources, with different white-balancing illumination conditions, for over 100 persons. In addition to videos, the Face Video Database has spectral reflectances of skin for selected persons and images taken with the same measurement arrangement as in the Physics-Based Face Database. The images and videos were taken with several cameras. The databases were used to gather information about skin chromaticities and to provide test material. The skin RGB values from the images were converted to different colour spaces, and the results showed that normalized colour coordinates were among the most usable colour spaces for skin chromaticity modelling. None of the colour spaces could eliminate the colour shifts in chromaticity. The obtained chromaticity constraint can be implemented as an adaptive skin colour modelling part of face tracking algorithms, like histogram backprojection or mean shift. The performance of these adaptive algorithms was superior to that of algorithms using a fixed skin colour model or model adaptation based on spatial pixel selection. Of course, there are cases where the colour cue alone is not enough and the use of other cues, like motion or edge data, would improve the result. It was also demonstrated that the skin colour model can be used to segment faces, and that the segmentation results depend on the background, owing to the method used. An application for colour correction using principal component analysis and a simplified dichromatic reflection model was also shown to improve the colour quality of severely clipped images. The results of the tracking, segmentation and colour correction experiments using the collected data validate the thesis statement.
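The normalized colour coordinates singled out above are simple to compute; the sketch below shows the conversion and a box-shaped skin-locus test. The numeric ranges are illustrative guesses, not the thesis's fitted model, which moreover adapts over time.

```python
import numpy as np

def to_ncc(rgb):
    """Normalized colour coordinates: r = R/(R+G+B), g = G/(R+G+B)."""
    s = rgb.sum(axis=-1, keepdims=True).clip(min=1.0)
    return rgb[..., :2] / s

def skin_mask(image, r_range=(0.35, 0.55), g_range=(0.25, 0.37)):
    """Label pixels whose chromaticity falls inside a skin-locus box
    (the ranges here are placeholders, not measured values)."""
    r, g = np.moveaxis(to_ncc(image.astype(float)), -1, 0)
    return (r_range[0] < r) & (r < r_range[1]) & \
           (g_range[0] < g) & (g < g_range[1])
```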
5

PAULA, MARCUS VINICIUS DE. "SHADOW OF ILLUMINATION: AN ICONOLOGICAL ANALYSIS OF ILLEGIBILITY." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2008. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=11785@1.

Abstract:
This work uses the iconological method of analysis developed by W.J.T. Mitchell to expand the notion of illumination and to dissolve the opposition between the legible and the illegible. In the first part of this thesis, the mechanisms of reading illumination and their relation to the conventions of legibility and illegibility are analysed and defined. In the second part, the same question is approached from another perspective, one involving the graphic transformations undergone by newspaper pages at the turn of the 19th to the 20th century.
6

Agrawal, Amit Kumar. "Scene analysis under variable illumination using gradient domain methods." College Park, Md. : University of Maryland, 2006. http://hdl.handle.net/1903/3624.

Abstract:
Thesis (Ph. D.) -- University of Maryland, College Park, 2006.
Thesis research directed by: Electrical Engineering. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
7

Nillius, Peter. "Image Analysis using the Physics of Light Scattering." Doctoral thesis, KTH, Numerical Analysis and Computer Science, NADA, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3780.

Abstract:

Any generic computer vision algorithm must be able to cope with the variations in appearance of objects due to different illumination conditions. While these variations in the shading of a surface may seem a nuisance, they in fact contain information about the world. This thesis tries to provide an understanding of what information can be extracted from the shading in a single image and how to achieve this. One of the challenges lies in finding accurate models for the wide variety of conditions that can occur.

Frequency space representations are powerful tools for analyzing shading theoretically. Surfaces act as low-pass filters on the illumination, making the reflected light band-limited. Hence, it can be represented by a finite number of components in the Fourier domain, despite having arbitrary illumination. This thesis derives a basis for shading by representing the illumination in spherical harmonics and the BRDF in a basis for isotropic reflectance. By analyzing the contributing variance of this basis it is shown how to create finite dimensional representations for any surface with isotropic reflectance.

The finite representation is used to analytically derive a principal component analysis (PCA) basis of the set of images due to the variations in the illumination and BRDF. The PCA is performed model-based, so that the variations in the images are described by the variations in the illumination and the BRDF. This has a number of advantages. The PCA can be performed over a wide variety of conditions, more than would be practically possible if the images were captured or rendered. Also, there is an explicit mapping between the principal components and the illumination and BRDF, so that the PCA basis can be used as a physical model.

By combining a database of captured illumination and a database of captured BRDFs, a general basis for shading is created. This basis is used to investigate material classification from a single image with known geometry but arbitrary unknown illumination. An image is classified by estimating the coefficients in this basis and comparing them to a database. Experiments on synthetic data show that material classification from reflectance properties is hard. There are mis-classifications, and the materials seem to cluster into groups. The materials are grouped using a greedy algorithm. Experiments on real images show promising results.

Keywords: computer vision, shading, illumination, reflectance, image irradiance, frequency space representations, spherical harmonics, analytic PCA, model-based PCA, material classification, illumination estimation
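The band-limiting argument above has a compact standard form in the spherical-harmonic domain. The relation below (Ramamoorthi-Hanrahan style, for a radially symmetric reflectance kernel) is consistent with, but not copied from, the thesis.

```latex
% Reflected radiance as a product in the spherical-harmonic domain:
% L_{lm} are illumination coefficients, \hat{\rho}_l BRDF-kernel coefficients.
B_{lm} = \Lambda_l \, \hat{\rho}_l \, L_{lm},
\qquad \Lambda_l = \sqrt{\tfrac{4\pi}{2l+1}}.
% Because \hat{\rho}_l decays with l (the surface low-pass filters the
% illumination), truncating at some l_{\max} yields a finite shading basis:
B(\omega) \approx \sum_{l=0}^{l_{\max}} \sum_{m=-l}^{l} B_{lm}\, Y_{lm}(\omega).
```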

8

Singh, Gurprit. "Sampling and Variance Analysis for Monte Carlo Integration in Spherical Domain." Thesis, Lyon 1, 2015. http://www.theses.fr/2015LYO10121/document.

Abstract:
This dissertation introduces a theoretical framework to study different sampling patterns in the spherical domain and their effects on the evaluation of global illumination integrals. Evaluating illumination (light transport) is one of the most essential aspects of image synthesis for achieving realism, and it involves solving multi-dimensional integrals. Monte Carlo based numerical integration schemes are heavily employed to solve these high-dimensional integrals. One of the most important aspects of any numerical integration method is sampling. The way samples are distributed on an integration domain can greatly affect the final result. For example, in images, the effects of various sampling patterns appear in the form of either structural artifacts or completely unstructured noise. In many cases, we may get completely false (biased) results due to the sampling pattern used in integration. The distribution of sampling patterns can be characterized using their Fourier power spectra. It is also possible to use the Fourier power spectrum as input to generate the corresponding sample distribution, which allows spectral control over the sample distributions. Since this spectral control allows tailoring new sampling patterns directly from the input Fourier power spectrum, it can be used to improve error in integration. However, a direct relation between the error in Monte Carlo integration and the sampling power spectrum has been missing. In this work, we propose a variance formulation that establishes a direct link between the variance in Monte Carlo integration and the power spectra of both the sampling pattern and the integrand involved. To derive our closed-form variance formulation, we use the notion of homogeneous sample distributions, which allows expressing the error of Monte Carlo integration purely as variance. Based on our variance formulation, we develop an analysis tool that can be used to derive theoretical variance convergence rates of various state-of-the-art sampling patterns. Our analysis gives insights into design principles that can be used to tailor new sampling patterns based on the integrand.
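In the published literature around this thesis, the variance formulation referred to above takes the following schematic shape for a homogeneous sampler with N samples; normalization conventions vary, and the DC peak of the sampling spectrum is excluded.

```latex
% P_S: power spectrum of the sampling pattern; P_f: of the integrand.
\mathrm{Var}\!\left(\hat{I}_N\right)
  = \frac{1}{N} \int_{\Omega} P_S(\nu)\, P_f(\nu)\, \mathrm{d}\nu ,
% so samplers whose spectral energy is low where the integrand's is high
% yield low variance -- the rationale for spectrally tailored patterns.
```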
9

Lee, Jinho. "Synthesis and analysis of human faces using multi-view, multi-illumination image ensembles." Columbus, Ohio : Ohio State University, 2005. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1133366279.

10

Stevenson, Brady Roos. "Analysis of Near-Infrared Phase Effects on Biometric Iris Data." BYU ScholarsArchive, 2006. https://scholarsarchive.byu.edu/etd/1299.

Abstract:
The purpose of this research is to ascertain potential iris scan data variations caused by near infrared waves derived from fluorescent illumination. Prior studies of iris data variances caused by the infrared wave interference of halogen, incandescent, and sunlight sources with iris cameras suggest that similar changes may exist under near infrared wavelengths from fluorescent light. The concern is that the fluorescent energy emission may interfere with the near infrared detection of an iris camera. An iris camera is used to measure human eye characteristics known as biometrics. If such infrared emission is statistically significant, then it can alter the validity of the iris scan data. The experiment utilized nine hundred forty-five (945) scans from sixty-three (63) subjects. Measured results showed that increased heat from ambient fluorescent illumination does not statistically alter the biometric readings of human eyes. The test results fail to reject the hypothesis that no data loss occurs as heat from the ambient fluorescent light source is increased.
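The conclusion is a classical failure-to-reject outcome. As an illustration only, the snippet below runs the kind of two-sample test that supports such a statement; the arrays are synthetic stand-ins for the 945 scans, and the thesis's exact statistical procedure may differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
baseline = rng.normal(0.32, 0.02, 500)  # placeholder match scores, normal heat
heated = rng.normal(0.32, 0.02, 445)    # placeholder scores, added fluorescent heat

t, p = stats.ttest_ind(baseline, heated, equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.3f}")  # a large p-value: fail to reject 'no change'
```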
11

Wallenberg, Marcus. "A Single-Camera Gaze Tracker using Controlled Infrared Illumination." Thesis, Linköping University, Department of Electrical Engineering, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-17398.

Abstract:

Gaze tracking is the estimation of the point in space a person is “looking at”. This is widely used in both diagnostic and interactive applications, such as visual attention studies and human-computer interaction. The most common commercial solutions used to track gaze today use a combination of infrared illumination and one or more cameras. These commercial solutions are reliable and accurate, but often expensive. The aim of this thesis is to construct a simple single-camera gaze tracker from off-the-shelf components. The method used for gaze tracking is based on infrared illumination and a schematic model of the human eye. The user’s gaze point is estimated from images of the reflections of specific light sources in the surfaces of the eye. Evaluation is performed on the software and hardware components separately, and on the system as a whole. Accuracy is measured as spatial and angular deviation, and the result is an average accuracy of approximately one degree on synthetic data and 0.24 to 1.5 degrees on real images at a range of 600 mm.
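A common single-camera baseline, pupil-centre minus corneal-reflection (PCCR) vectors mapped to screen coordinates by a calibrated polynomial, gives a feel for the problem. This least-squares sketch is ours; the thesis itself uses a schematic eye model rather than a pure regression.

```python
import numpy as np

def fit_gaze_map(pccr, targets):
    """Fit a quadratic map from pupil-minus-glint vectors (N, 2) to known
    screen points (N, 2) by least squares (illustrative baseline only)."""
    x, y = pccr.T
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coef, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return coef

def predict_gaze(coef, v):
    """Map one pupil-minus-glint vector v = (x, y) to a screen point."""
    x, y = v
    return np.array([1.0, x, y, x * y, x**2, y**2]) @ coef
```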

12

Kricke, Ralph [Verfasser]. "Lip Motion Analysis for a Person Authentication System under Near Infrared Illumination / Ralph Kricke." München : Verlag Dr. Hut, 2011. http://d-nb.info/1015608108/34.

13

Bari, Daniele. "Characterization and Reliability of Dye-sensitized Solar Cells: Temperature, Illumination, and Bias Effects." Doctoral thesis, Università degli studi di Padova, 2014. http://hdl.handle.net/11577/3423712.

Abstract:
Dye-sensitized solar cells (DSC) have recently proved to be a low-cost alternative to inorganic photovoltaics and could attract a considerable market share in the future. On the other hand, reliability issues must be solved to improve the competitiveness of this new solar energy technology. The present thesis deals with a characterization and reliability study of DSC, aiming at a comprehensive picture of the efficiency, stability, and degradation mechanisms of DSC, with the purpose of promoting these devices as an alternative energy source in agreement with the European Community directions. Since Michael Grätzel advanced the concept of sensitized materials and nanoporous semiconductors in 1991, dye-sensitized solar cells have attracted the interest of many academic and solar company researchers, opening the road to third-generation photovoltaics. Noticeable achievements in dye synthesis and wide band-gap semiconductor fabrication allow physicists, chemists, and engineers to produce more efficient and reliable DSC. At the time of this dissertation, DSC efficiency has reached 15%, allowing them to compete with conventional inorganic photovoltaic systems in terms of cost and material complexity, especially in those applications where the efficiency-to-production-cost ratio must be maximized. There are several applications where the performance of photo-electrochemical solar cells is already sufficient: outdoor applications such as building windows and greenhouse coverage, and indoor applications such as windows, decoration structures, and shop windows. In spite of the advantages, many technological and reliability issues still have to be solved. These include the stability of the electrical characteristics, weak or damaged sealing, environment-related factors (e.g. UV exposure for outdoor applications), humidity, high temperature, and the improvement of DSC lifetime. Intensive research is performed by researchers all around the world to understand the reliability and causes of instability in DSC: those efforts involve the study of many physical aspects, including the effects of the different layers and materials, morphology, dyes, electrolytes, counter electrodes, growth conditions, and the presence of oxygen and moisture. The characterization methods used to understand and monitor the electrical properties of silicon-based solar cells cannot be used "as is" for DSC without considering the completely different nature of DSC compared to silicon-based solar cells. Starting from the knowledge about the characterization of silicon-based solar cells and a background in electrochemistry, we carefully transposed the same characterization techniques to DSC. Access to full details of the devices and to "ad hoc" structures for the analysis was granted thanks to the collaboration with the University of Rome "Tor Vergata". We developed a measurement procedure which allows us to define the standards for the characterization of dye-sensitized solar cells. This procedure is based on DC measurements as well as electrochemical impedance spectroscopy (EIS), the latter coming from electrochemistry. This technique allowed us to characterize DSC interfaces and identify which interfaces degraded during accelerated tests. This measurement set gives a comprehensive description of the behavior of all analyzed devices. Our characterization and reliability study mostly involves the use of the AM1.5 solar simulator, whose spectrum spreads from the UV to the far IR.
As an alternative illumination source, we designed a LED-based monochromatic light source to illuminate solar cells during characterization, designing both the illuminator and the driver circuitry. We found that these monochromatic sources probe different portions of the solar cells' absorption spectrum as a function of the illumination wavelength, giving the possibility of gathering additional information about DSC efficiency and degradation. In addition, during accelerated stresses we found that the degradation kinetics of open-circuit voltage, short-circuit current, efficiency, and fill factor change if the characterization is performed with different illumination wavelengths. This points out that characterization performed under monochromatic light can give additional information about the mechanisms behind DSC degradation. To build a picture of DSC reliability, we carried out several accelerated stresses, stressing devices under different illumination sources. All these tests were carried out indoors. We analyzed the degradation of samples subjected to accelerated life tests under different illumination conditions, and the role of temperature in device degradation, by means of the AM1.5 solar simulator, white LEDs, UV exposure, and both thermal and electrical stresses. Since DSC gain heat during sunlight exposure, thereby increasing their temperature, we examined the role of temperature in DSC degradation. We showed that temperature alone may strongly impact the degradation rate of DSC, reducing overall DSC performance; in addition, we proved that temperature has a twofold impact on cell performance. A moderate temperature induces an annealing process: it enhances the performance of the dye material, likely recovering and rearranging some dangling or weak bonds at the transparent semiconductor interface or among the dye molecules. On the other hand, at high temperatures, or for longer storage times regardless of the temperature level, temperature strongly reduces DSC performance as well as lifetime. In order to understand the effects induced by sun exposure, we carried out accelerated optical stresses by means of the AM1.5 solar simulator, and we compared the degradation kinetics of the DC parameters gathered from optical and thermal stresses. The cause of the degradation during thermal or illumination stress is the formation of defects and chemical species at the transparent semiconductor/sensitizer/electrolyte interface, which reduces charge transfer at the interface and ion migration across the electrolyte. During optical stresses, we observed a main difference between the open-circuit and short-circuit degradation kinetics: the latter usually features a turnaround phase. The turnaround phase is strongly dependent on the illumination intensity used during the accelerated stress: the higher the illumination level, the shorter the turnaround phase. The device features faster degradation kinetics at higher illumination levels, likely due to the increase of the interface temperature, as also confirmed by purely thermal stresses. A high power-to-weight ratio allows DSC to be used as solar energy harvesters in space applications. To shed some light on the effects of high-energy photons (present even at ground level) on DSC, we performed accelerated UV illumination stresses, designing and assembling a UV illuminator as well as the driving circuitry.
We found that UV exposure has detrimental effects on DSC, and that the main cause of cell failure during UV exposure is electrolyte bleaching. It is worth remarking that the sensitizer seems to have a minor role in cell degradation, as we proved. Concerning the DSC studied in this thesis, we strongly recommend some solutions to prevent electrolyte bleaching: good UV filtering and encapsulation bring benefits for reliable operation over time, even though they potentially go against the low weight and transparency of DSC. High efficiency even at low illumination intensity, or under diffused light, allows DSC to be considered for indoor applications. We carried out accelerated stresses by means of high-power white LEDs and compared the DC parameter degradation kinetics, as well as the evolution of the EIS plots, measured with both the AM1.5 solar simulator and the white-LED illuminator. In addition, we proposed a white-LED illumination system as a cheap and versatile alternative to the expensive AM1.5 solar simulator, designing the illuminator as well as the driving circuit. We found that white-LED exposure leads to the degradation of DSC performance, even though the white spectrum has no UV component and the dye molecules cannot absorb wavelengths in the UV. Comparing white-LED and AM1.5 solar simulator characterizations, we showed that the latter provides more information than the former. From the solar panel point of view, some DSC could fail or be shaded during solar exposure. This realistic situation forces a single cell or a whole DSC string to work under certain bias conditions. To examine this non-trivial real condition, we designed and assembled current drivers and performed forward and reverse biased constant current stresses (CCS) on dye-sensitized solar cells kept in the dark. We showed that the DC parameters feature different degradation rates depending on the bias polarity and current intensity. Forward CCS leads to a modification of the electrolyte composition, lowering the dark current of the cell, while reverse CCS leads to degradation of the counter electrode, accelerating its corrosion by the electrolyte. In addition, we proved that most degradation occurs at those interfaces where electrons are emitted during stress.
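The DC parameters tracked throughout these stresses (Voc, Isc, fill factor, efficiency) all derive from the measured I-V sweep. A minimal extraction sketch, assuming an ascending voltage sweep, a monotonically decreasing photocurrent, and our own variable names:

```python
import numpy as np

def iv_parameters(v, i, p_in):
    """Voc, Isc, fill factor and efficiency from an I-V sweep:
    v ascending (V), i photocurrent (A), p_in incident optical power (W)."""
    isc = float(np.interp(0.0, v, i))     # current at V = 0
    voc = float(np.interp(0.0, -i, v))    # assumes i decreases monotonically
    p_max = float(np.max(v * i))          # maximum power point
    ff = p_max / (voc * isc)              # fill factor
    return voc, isc, ff, p_max / p_in     # last term: efficiency
```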
14

Luo, Ming. "Optical Analysis and Opto-Mechanical Design for Miniaturized Laser Illumination Module in 3D Areal Mapper." Thesis, Virginia Tech, 2000. http://hdl.handle.net/10919/33169.

Abstract:
A miniaturized spatial light modulator (SLM)-based structured-light illumination module with optical fiber input is designed to generate a coded 256 × 256 spot pattern for 3-D areal mapping applications. The projector uses light from a He-Ne laser coupled to a polarization-maintaining (PM) fiber to illuminate a specially made hologram, so that four virtual point sources are regenerated. The interference pattern of the four sources is filtered and modulated by an SLM. The output intensity can thus be encoded to form any arbitrary pattern, at high speed, through the electronic input applied to the SLM. In this thesis, a complete optical diffraction analysis of the system is presented to provide guidelines for the optimal design of the system parameters. Through the theoretical analysis of square beam array generation, the important parameters for fabricating a hologram are given. The final optical design and arrangement of the system, based on the optical analysis, are described. The detailed opto-mechanical construction of the LIM and the associated alignment, the computer simulation, and the preliminary test results of the developed LIM are also provided.
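The optical core, four mutually coherent virtual point sources interfering on the modulator plane, can be previewed with scalar diffraction. All geometry values below (source spacing, plane distance, window size) are illustrative, not the thesis's design numbers.

```python
import numpy as np

wavelength = 632.8e-9                   # He-Ne line
k = 2 * np.pi / wavelength
d = 0.5e-3                              # assumed half-spacing of the four sources
sources = [(sx * d, sy * d, 0.0) for sx in (-1, 1) for sy in (-1, 1)]

x = y = np.linspace(-2e-3, 2e-3, 512)   # observation window (assumed)
X, Y = np.meshgrid(x, y)
Z = 0.3                                 # observation plane distance in metres

# Sum of unit-amplitude spherical waves (amplitude falloff ignored):
field = sum(np.exp(1j * k * np.sqrt((X - sx)**2 + (Y - sy)**2 + (Z - sz)**2))
            for sx, sy, sz in sources)
intensity = np.abs(field)**2            # square lattice of bright spots
```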
Master of Science
15

Carvalho, Tiago José de 1985. "Illumination inconsistency sleuthing for exposing fauxtography and uncovering composition telltales in digital images." [s.n.], 2014. http://repositorio.unicamp.br/jspui/handle/REPOSIP/275519.

Abstract:
Advisors: Anderson de Rezende Rocha, Hélio Pedrini
Doctoral thesis - Universidade Estadual de Campinas, Instituto de Computação
Once taken for granted as genuine, photographs are no longer considered a piece of truth. With the advance of digital image processing and computer graphics techniques, it has become easier than ever to manipulate images and forge new realities within minutes. Unfortunately, most of the time these modifications seek to deceive viewers, change opinions or even affect how people perceive reality. Therefore, it is paramount to devise and deploy efficient and effective detection techniques. Of all types of image forgeries, composition images are especially interesting. This type of forgery uses parts of two or more images to construct a new reality from scenes that never happened. Among all the different telltales investigated for detecting image compositions, image illumination inconsistencies are considered the most promising, since a perfect light match in a forged image is still difficult to achieve. This thesis builds upon the hypothesis that image illumination inconsistencies are strong and powerful evidence of image composition, and presents four original and effective approaches to detect image forgeries. The first method explores eye specular highlight telltales to estimate the light source and viewer positions in an image. The second and third approaches explore metamerism, whereby the colors of two objects may appear to match under one light source but appear completely different under another. Finally, the last approach relies on the user's interaction to specify 3-D normals of suspect objects in an image, from which the 3-D light source position can be estimated. Together, these approaches bring important contributions to the forensic community and will certainly be a strong tool against image forgeries.
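The geometric core of the first method is the mirror-reflection law at the corneal highlight: with the surface normal and the direction to the camera known, the light direction follows directly. A small sketch with made-up vectors:

```python
import numpy as np

def light_direction(normal, view):
    """Reflection law at a specular highlight: l = 2 (n . v) n - v, with n the
    unit surface normal and v the unit direction toward the camera."""
    n = normal / np.linalg.norm(normal)
    v = view / np.linalg.norm(view)
    return 2.0 * float(n @ v) * n - v

# Hypothetical corneal normal at the highlight, camera along +z:
print(light_direction(np.array([0.2, 0.1, 1.0]), np.array([0.0, 0.0, 1.0])))
```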
Doctorate in Computer Science
16

Cui, Chen. "Adaptive weighted local textural features for illumination, expression and occlusion invariant face recognition." University of Dayton / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1374782158.

17

Hsu, Ken. "Stochastic analysis of lateral resolution and signal-to-noise ratio in fluorescence microscopy : application to structured illumination microscopy." Thesis, University of Nottingham, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.555413.

Abstract:
This thesis investigates ways of assessing the practical lateral resolution performance of imaging systems and explores the trade-off between lateral resolution and signal-to-noise ratio (SNR) in high resolution systems. Several authors have commented on the seemingly inherent inefficiency with which imaging systems use the available light when providing resolution enhancement beyond the Abbe limit, often due to the precise mechanism that enabled the resolution enhancement in the first place. Two methods for assessing the lateral resolution and noise performance of microscope systems were developed in this project: a probabilistic analysis based on two-point resolution, which has the novelty of being able to deal with synthetic images, and the stochastic transfer function (STF), which looks into the effect of noise in the Fourier domain. Results can be determined semi-analytically or by using Monte Carlo simulations. These methods were applied to several microscope systems, such as conventional widefield fluorescence microscopy (WFM) and structured illumination microscopy (SIM), and used to compare their noise performance. These techniques were also used to compare several post-reconstruction processing algorithms for SIM, showing the strengths and weaknesses of each strategy. SIM encodes high spatial frequency information into a series of images and uses a reconstruction algorithm to yield a high resolution image. Without further processing, the STF showed that SIM generates noise that is three times greater than that associated with a conventional fluorescence microscope, and based on the two-point resolution analysis, the SIM SNR performance in 2D was only comparable with WFM, even though SIM has twice the theoretical bandwidth. However, the reconstruction processing of SIM introduces many redundancies in the form of overlapping spatial frequency orders, which can be exploited to improve the SNR further. The STF showed a transition spatial frequency below which the WFM system outperforms SIM, and it was shown that a simple WFM-SIM hybrid algorithm based on this observation can indeed significantly improve the SIM result. Common strategies which provide more complete treatments include the Wiener filter and the weighted-average approach. Based on the assumption of uncorrelated noise and a pre-defined goal transfer function, it was shown that the weighted-average method gives the minimum resultant variance statistically. For SIM, this condition is only met when the maximum fringe spatial frequency allowed by the illumination optics is used, and the level of noise correlation increases with decreasing illumination grating frequency. The development of a general standardised metric which includes noise considerations would allow more realistic performance assessments of imaging systems and facilitate comparisons. Along with other considerations, such analysis can help practitioners select the most appropriate system for the intended tasks. Understanding the strengths and weaknesses of existing systems will reveal areas of possible enhancement and also help with the development of new techniques.
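The claim that the weighted average minimizes the resultant variance for uncorrelated noise is the classical inverse-variance weighting result, stated below; in SIM, the overlapping frequency orders play the role of the estimates x_i.

```latex
% For uncorrelated estimates x_i of a common quantity, with variances
% \sigma_i^2, the minimum-variance linear combination uses the weights
w_i = \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2},
\qquad
\mathrm{Var}\!\Bigl(\sum_i w_i x_i\Bigr)
  = \Bigl(\sum_i 1/\sigma_i^2\Bigr)^{-1} \le \min_i \sigma_i^2 .
```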
18

Pechacek, Christopher S. (Christopher Scott). "Space, light, and time : prospective analysis of Circadian illumination for health-based daylighting with applications to healthcare architecture." Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/44282.

Abstract:
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Architecture, 2008.
Includes bibliographical references (p. 120-129).
Light in architecture can be studied for its objective or perceptual effects. This thesis describes an objective link between human health and architectural design. Specifically, the link between daylight and human circadian rhythm (as a proxy for health) is explored. The purpose of this thesis is to increase understanding of the health effects of daylighting in architecture. Little in the way of rigorous analysis exists in the emerging field of "evidence-based" design; however, billions of dollars are committed to healthcare construction in the United States annually. The next generation of hospitals will certainly be guided by "evidence-based" findings, and so a better understanding of daylight's role in human physiology may influence future healthcare architecture. Therefore, the technical problem addressed here is the prospective analysis of architectural design for circadian stimulus potential, based on the state of the art in photobiology; this combines lighting intensity, timing, and spectrum. Included in this thesis are specific recommendations for architectural design, based on scientific application of biological findings. Guidelines for circadian illumination are developed and applied. Evaluation of lighting sources (i.e. daylighting, artificial lighting) reveals those elements of each that are necessary to meet the circadian illumination guidelines. Recommendations for architectural designers follow, describing how building design can maximize the application of daylighting to promote circadian organization, and thus improve the health potential of the built environment.
by Christopher S. Pechacek.
S.M.
APA, Harvard, Vancouver, ISO, and other styles
19

Pilleboue, Adrien. "Analyse spatiale et spectrale des motifs d'échantillonnage pour l'intégration Monte Carlo." Thesis, Lyon 1, 2015. http://www.theses.fr/2015LYO10225/document.

Full text
Abstract:
L’échantillonnage est une étape clé dans le rendu graphique. Il permet d’intégrer la lumière arrivant en un point de la scène pour en calculer sa couleur. Généralement, la méthode utilisée est l’intégration Monte Carlo qui approxime cette intégrale en choisissant un nombre fini d’échantillons. La réduction du biais et de la variance de l’intégration Monte Carlo est devenue une des grandes problématiques en rendu réaliste. Les techniques trouvées consistent à placer les points d’échantillonnage avec intelligence de façon à rendre la distribution la plus uniforme possible tout en évitant les régularités. Les années 80 ont été de ce point de vue un tournant dans ce domaine, avec l’apparition de nouvelles méthodes stochastiques. Ces méthodes ont, grâce à une meilleure compréhension des liens entre intégration Monte Carlo et échantillonnage, permis de réduire le bruit et la variance des images générées, et donc d’améliorer leur qualité. En parallèle, la complexité des méthodes d’échantillonnage s’est considérablement améliorée, permettant d’obtenir des méthodes à la fois rapides et efficaces en termes de qualité. Cependant, ces avancées ont jusqu’à là été faites par tâtonnement et se sont axées sur deux points majeurs : l’amélioration de l’uniformité du motif d’échantillonnage et la suppression des régularités. Bien que des théories permettant de borner l’erreur d’intégration existent, elles sont souvent limitées, voire inapplicables dans le domaine de l’informatique graphique. Cette thèse propose de rassembler les outils d’analyse des motifs d’échantillonnages et de les mettre en relation. Ces outils peuvent caractériser des propriétés spatiales, comme la distribution des distances entre points, ou bien spectrales à l’aide de la transformée de Fourier. Nous avons ensuite utilisé ces outils afin de donner une expression simple de la variance et du biais dans l’intégration Monte Carlo, en utilisant des prérequis compatibles avec le rendu d’image. Finalement, nous présentons une boite à outils théorique permettant de déterminer la vitesse de convergence d’une méthode d’échantillonnage à partir de son profil spectral. Cette boite à outils est notamment utilisée afin de classifier les méthodes d’échantillonnage existantes, mais aussi pour donner des indications sur les principes fondamentaux nécessaires à la conception de nouveaux algorithmes d’échantillonnage
Sampling is a key step in the rendering pipeline. It allows the integration of the light arriving at a point of the scene in order to calculate its color. Monte Carlo integration is generally the most used method to approximate that integral by choosing a finite number of samples. Reducing the bias and the variance of Monte Carlo integration has become one of the most important issues in realistic rendering. The solutions found are based on smartly positioning the sample points in a way that maximizes the uniformity of the distribution while avoiding regularities. From this point of view, the 80s were a turning point in this domain, as new stochastic methods appeared. With a better comprehension of the links between Monte Carlo integration and sampling, these methods allowed the reduction of noise and of variance in rendered images. In parallel, the complexity of sampling methods improved considerably, enabling methods that are both fast and of good quality. However, these improvements have been made by trial and error, focusing on two major points: the improvement of sampling pattern uniformity, and the suppression of regularities. Even though there exist theories allowing the error of the integration to be bounded, they are usually limited, and even inapplicable in computer graphics. This thesis proposes to gather the analysis tools of sampling patterns and to connect them together. These tools can characterize spatial properties, such as the distribution of distances between points, as well as spectral properties via the Fourier transform. Secondly, we have used these tools to give a simple expression of the bias and the variance for Monte Carlo integration; this is done using prerequisites compatible with image rendering. Finally, we present a theoretical toolbox allowing the convergence speed of a sampling method to be determined from its spectral profile. This toolbox is used in particular to classify existing sampling methods, but also to give indications about the design principles necessary for new sampling algorithms.
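As a concrete illustration of why sampling-pattern uniformity matters for Monte Carlo variance, the toy sketch below (not code from the thesis) compares the empirical variance of an estimator under purely random versus jittered (stratified) 1D sampling:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):                       # smooth test integrand on [0, 1]
    return np.sin(np.pi * x) ** 2

def random_samples(n):
    return rng.random(n)

def jittered_samples(n):
    # one sample per stratum: better uniformity, no regular grid
    return (np.arange(n) + rng.random(n)) / n

n, trials = 64, 2000
var_rand = np.var([f(random_samples(n)).mean() for _ in range(trials)])
var_jit = np.var([f(jittered_samples(n)).mean() for _ in range(trials)])
print(var_rand, var_jit)        # jittered variance is markedly lower
```

The thesis's spectral toolbox explains this gap: the jittered pattern suppresses low-frequency power, which for smooth integrands translates into a faster variance convergence rate.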
APA, Harvard, Vancouver, ISO, and other styles
20

Keresztes, Janos C., Koshel R. John, Karlien D’huys, Ketelaere Bart De, Jan Audenaert, Peter Goos, and Wouter Saeys. "Augmented design and analysis of computer experiments: a novel tolerance embedded global optimization approach applied to SWIR hyperspectral illumination design." OPTICAL SOC AMER, 2016. http://hdl.handle.net/10150/622951.

Full text
Abstract:
A novel meta-heuristic approach for minimizing nonlinear constrained problems is proposed, which offers tolerance information during the search for the global optimum. The method is based on the concept of design and analysis of computer experiments combined with a novel two-phase design augmentation (DACEDA), which models the entire merit space using a Gaussian process, with iteratively increased resolution around the optimum. The algorithm is introduced through a series of case studies of increasing complexity for optimizing the uniformity of a short-wave infrared (SWIR) hyperspectral imaging (HSI) illumination system (IS). The method is first demonstrated for a two-dimensional problem consisting of the positioning of analytical isotropic point sources. The method is further applied to two-dimensional (2D) and five-dimensional (5D) SWIR HSI IS versions using close- and far-field measured source models applied within the non-sequential ray-tracing software FRED, including inherent stochastic noise. The proposed method is compared to other heuristic approaches such as simplex and simulated annealing (SA). It is shown that DACEDA converges towards a minimum with 1% improvement compared to simplex and SA, and, more importantly, requires only half the number of simulations. Finally, a concurrent tolerance analysis is done within DACEDA for the five-dimensional case such that further simulations are not required. (C) 2016 Optical Society of America
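The core loop of a design-and-analysis-of-computer-experiments optimizer (fit a Gaussian process surrogate to the expensive merit evaluations, then augment the design where the model looks promising) can be sketched in a few lines. This is a generic surrogate loop under a toy merit function, not the authors' DACEDA; the acquisition rule, kernel, and two-phase augmentation are simplified assumptions:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def merit(x):                   # stand-in for a costly ray-tracing run
    return (x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

X = np.linspace(0.0, 1.0, 5).reshape(-1, 1)     # initial design
y = merit(X).ravel()
grid = np.linspace(0.0, 1.0, 201).reshape(-1, 1)
for _ in range(10):                             # design augmentation loop
    gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-6,
                                  normalize_y=True).fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    x_next = grid[np.argmin(mu - sigma)]        # low mean or high uncertainty
    X = np.vstack([X, [x_next]])
    y = np.append(y, merit(x_next))
print(X[np.argmin(y)], y.min())                 # best design found
```

Because the surrogate's predictive standard deviation is available everywhere, the same fitted model can also report how flat the merit space is around the optimum, which is the tolerance information the abstract refers to.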
APA, Harvard, Vancouver, ISO, and other styles
21

Kopřiva, Antonín. "Analýza tvorby třísky pomocí digitální vysokorychlostní kamery." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2011. http://www.nusl.cz/ntk/nusl-229651.

Full text
Abstract:
The aim of the diploma thesis is to present the main theory concerning high-speed cameras, their usefulness in industry, and to present the cameras available on the market. A few experiments were done focusing on the right choice of objective, lighting of the scene, and setting of the cameras. Special equipment was devised for a better measuring process, followed by an experiment aimed at boring with a cutter into materials chosen beforehand. In another experiment, the speed and the acceleration of the cutter were analyzed by means of the MotionMeasure software.
APA, Harvard, Vancouver, ISO, and other styles
22

Sunkavalli, Kalyan. "Models of Visual Appearance for Analyzing and Editing Images and Videos." Thesis, Harvard University, 2012. http://dissertations.umi.com/gsas.harvard:10285.

Full text
Abstract:
The visual appearance of an image is a complex function of factors such as scene geometry, material reflectances and textures, illumination, and the properties of the camera used to capture the image. Understanding how these factors interact to produce an image is a fundamental problem in computer vision and graphics. This dissertation examines two aspects of this problem: models of visual appearance that allow us to recover scene properties from images and videos, and tools that allow users to manipulate visual appearance in images and videos in intuitive ways. In particular, we look at these problems in three different applications. First, we propose techniques for compositing images that differ significantly in their appearance. Our framework transfers appearance between images by manipulating the different levels of a multi-scale decomposition of the image. This allows users to create realistic composites with minimal interaction in a number of different scenarios. We also discuss techniques for compositing and replacing facial performances in videos. Second, we look at the problem of creating high-quality still images from low-quality video clips. Traditional multi-image enhancement techniques accomplish this by inverting the camera’s imaging process. Our system incorporates feature weights into these image models to create results that have better resolution, noise, and blur characteristics, and summarize the activity in the video. Finally, we analyze variations in scene appearance caused by changes in lighting. We develop a model for outdoor scene appearance that allows us to recover radiometric and geometric information about the scene from images. We apply this model to a variety of visual tasks, including color-constancy, background subtraction, shadow detection, scene reconstruction, and camera geo-location. We also show that the appearance of a Lambertian scene can be modeled as a combination of distinct three-dimensional illumination subspaces — a result that leads to novel bounds on scene appearance, and a robust uncalibrated photometric stereo method.
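The compositing framework described here operates on levels of a multi-scale decomposition. A toy two-level version of the idea (borrow the coarse appearance of one image while keeping another's fine detail) might look like the following; the function name, the Gaussian blur, the single split level, and sigma are illustrative assumptions, since the dissertation manipulates full pyramids with per-level statistics:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def transfer_appearance(source, target, sigma=8.0):
    """Two-level stand-in for multi-scale appearance transfer on
    grayscale float arrays: keep the target's fine detail but borrow
    the source's coarse (low-frequency) appearance."""
    coarse_src = gaussian_filter(source, sigma)
    coarse_tgt = gaussian_filter(target, sigma)
    detail_tgt = target - coarse_tgt        # high-frequency layer
    return coarse_src + detail_tgt
```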
Engineering and Applied Sciences
APA, Harvard, Vancouver, ISO, and other styles
23

Ковальська, Вікторія Володимирівна. "Analysis of the light pollution in the city of Kyiv." Thesis, Національний авіаційний університет, 2020. https://er.nau.edu.ua/handle/NAU/49682.

Full text
Abstract:
The thesis is published in accordance with rector's order No. 008/od of 21.01.2020 "On checking qualification works for academic plagiarism in the 2019-2020 academic year". Supervisor: Marharyta Myroslavivna Radomska, Ph.D. (Engineering), Associate Professor of the Department of Ecology.
Object of research: the processes of pollution formation under the influence of artificial light sources. Aim of work: assessment of the light pollution level and its impact on the environment. Methods of research: analysis of the sky by the Bortle scale; comparative analysis of the obtained light-level values in the city of Kyiv.
Nowadays, artificial light sources are an integral part of the modern city. The problem, however, is that their number significantly exceeds what is needed to provide a sufficient level of illumination on city streets. The problem of urban light pollution is becoming ever more important, as the issue is poorly regulated at the legislative level. The impact of light pollution is felt both by the city's population and by biota as a whole.
APA, Harvard, Vancouver, ISO, and other styles
24

Ковальська, Вікторія Володимирівна. "Analysis of the light pollution in the city of Kyiv." Thesis, Національний авіаційний університет, 2020. https://er.nau.edu.ua/handle/NAU/44801.

Full text
Abstract:
The thesis is published in accordance with rector's order No. 008/od of 21.01.2020 "On checking qualification works for academic plagiarism in the 2019-2020 academic year". Supervisor: Marharyta Myroslavivna Radomska, Ph.D. (Engineering), Associate Professor of the Department of Ecology.
Object of research: the processes of pollution formation under the influence of artificial light sources. Aim of work: assessment of the light pollution level and its impact on the environment. Methods of research: analysis of the sky by the Bortle scale; comparative analysis of the obtained light-level values in the city of Kyiv.
Nowadays, artificial light sources are an integral part of the modern city. The problem, however, is that their number significantly exceeds what is needed to provide a sufficient level of illumination on city streets. The problem of urban light pollution is becoming ever more important, as the issue is poorly regulated at the legislative level. The impact of light pollution is felt both by the city's population and by biota as a whole.
APA, Harvard, Vancouver, ISO, and other styles
25

CASINI, ANDREA EMANUELE MARIA. "Multidisciplinary modelling and simulation for assisting the space mission design process using Virtual Reality." Doctoral thesis, Politecnico di Torino, 2018. http://hdl.handle.net/11583/2715849.

Full text
Abstract:
Space mission design is a complex discipline. Several research studies are currently investigating how to improve the process. Since the decisions taken during the early phases of a project are those which most affect the final solution of a system in terms of architecture, configuration, and cost, more effort is invested in these stages to avoid jeopardizing the later product life-cycle stages. As the stakeholders and the other actors involved in the design process face low levels of knowledge about the system in the conceptual stages, the decision-making process is intrinsically affected by uncertain results. Each choice made in this risky scenario affects the next design iterations, and therefore a suitable design approach is needed. Several methodologies have been proposed by both academia and industry in the field of Systems Engineering (SE). The current trend is to adopt a Model-Based Systems Engineering (MBSE) approach coupled with Concurrent Engineering (CE) paradigms. The model-based methodology overcomes the weaknesses of a document-based one, aggregating all the relevant information and engineering data into a system model, which evolves like the real system throughout all the product life-cycle phases. The systematic CE approach is able to involve several experts in a multidisciplinary working context, where data, ideas, and solutions are shared at the same time using a common platform. Both approaches help to shorten the time and cost of the overall design process and prevent possible mistakes which could worsen the final solution if not identified early enough, thus maximizing the efficiency of each design session. However, negotiation still turns out to be one of the most complicated and frustrating parts of the whole design process. Moreover, the recent space exploration scenarios proposed by national agencies are characterized by multiple actors of different extractions commonly participating in shaping future goals. The broader the international cooperation framework, the more complex the design of a space mission, especially considering the negotiation goals to be handled by the different experts involved. The present Ph.D. thesis aims to cast some light on the integration of Virtual Reality (VR) within the standard design tools to assist the space mission design process. The creation of a virtual model for simulating different features of a system allows the analysis of aspects which may otherwise be overlooked, especially in the early design phases, such as ergonomics, operations, and training. The intuitive interaction with human senses and the immersion into a 3D Virtual Environment (VE) guarantee fundamental improvements and the evaluation of different solutions updated in real time, benefitting the entire design process, especially the early phases. The visualization of different system features at a single glance permits direct data and information exchange, enabling more direct communication among the design team. The possibility to use a distributed and shared architecture, implemented in a standard Concurrent Design Facility (CDF) setup, enables in-depth analysis even in the product development phase. This unique VE can simulate functional and physical behaviours of the virtual replica, helping to optimize future space systems.
To test the VR-based methodology, a first proof of concept was generated following the recent incremental and evolutionary architecture strategy of considering the Moon as the next step for the human exploration of Mars and the Solar System. According to the exploration roadmaps, a permanent surface base is envisioned as an efficient test-bed for assessing critical technologies to be used in future deep-space endeavours. A preliminary mission scenario was generated which targets settling the outpost at the lunar south pole. The peculiar environmental conditions make the area rich in volatiles to examine and exploit, especially considering the permanently shadowed regions that are supposed to contain icy water deposits, which are of paramount importance for human missions. A closed-loop power system, comprising solar panels, batteries, fuel cells, and electrolysers, was sized according to the settlement's power needs. This research work presents an integrated simulation case study that was run using a VE to arrive at a preliminary estimate of the performance of both the power system and the VR tool. Virtues and vices of the proposed VR-based methodology are listed together with possible future improvements for this research field.
APA, Harvard, Vancouver, ISO, and other styles
26

Pettersson, Johan. "Real-time Object Recognition on a GPU." Thesis, Linköping University, Department of Electrical Engineering, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-10238.

Full text
Abstract:

Shape-Based Matching (SBM) is a known method for 2D object recognition that is rather robust against illumination variations, noise, clutter, and partial occlusion. The objects to be recognized can be translated, rotated, and scaled. The translation of an object is determined by evaluating a similarity measure for all possible positions (similar to cross correlation). The similarity measure is based on dot products between normalized gradient directions in edges. Rotation and scale are determined by evaluating all possible combinations, spanning a huge search space. A resolution pyramid is used to form a heuristic for the search, which then gains real-time performance. For SBM, a model consisting of normalized edge gradient directions is normally constructed for all possible combinations of rotation and scale. We have avoided this by using (bilinear) interpolation in the search gradient map, which greatly reduces the amount of storage required. SBM is highly parallelizable by nature, and with our suggested improvements it becomes well suited for running on a GPU. This has been implemented and tested, and the results clearly outperform those of our reference CPU implementation (by factors in the hundreds). It is also very scalable and easily benefits from future devices without effort. Extensive evaluation material and tools for evaluating object recognition algorithms have been developed, and the implementation is evaluated and compared to two commercial 2D object recognition solutions. The results show that the method is very powerful when dealing with the distortions listed above and competes well with its opponents.
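The similarity measure described above, an average of dot products between the model's unit gradient directions and the normalized image gradients under a candidate pose, can be sketched per position as follows. This is a naive scoring function under assumed array inputs, without the rotation/scale search, resolution pyramid, or GPU parallelization the thesis adds:

```python
import numpy as np

def sbm_score(model_dirs, model_offsets, grad_x, grad_y, q):
    """Similarity at image position q = (row, col).

    model_dirs:    (n, 2) unit gradient directions of the model edges
    model_offsets: (n, 2) integer edge offsets from the model origin
    grad_x/grad_y: image gradient maps; offsets must stay in bounds.
    """
    score = 0.0
    for (dr, dc), d in zip(model_offsets, model_dirs):
        r, c = q[0] + dr, q[1] + dc
        e = np.array([grad_x[r, c], grad_y[r, c]])
        norm = np.linalg.norm(e)
        if norm > 1e-12:                 # skip flat, gradient-free pixels
            score += float(np.dot(d, e)) / norm   # d already unit length
    return score / len(model_dirs)
```

Because each dot product is normalized, the score is insensitive to gradient magnitude and hence to illumination changes, which is the robustness property the abstract claims.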

APA, Harvard, Vancouver, ISO, and other styles
27

Kreuser, Carla Louise. "The meandering narrative : poetry and illustration engage in a moment of indiscipline : demonstrated in an analysis of Sara Fanelli’s illuminated poem - And all men kill the thing they love." Thesis, Stellenbosch : Stellenbosch University, 2014. http://hdl.handle.net/10019.1/86503.

Full text
Abstract:
Thesis (MPhil)--Stellenbosch University, 2014.
ENGLISH ABSTRACT: This is a study about the inner workings of an illuminated poem – about the dialogue that develops between poetry and illustration when they encounter each other on the page. However, the illuminated poem is more than just a relation between words and images; it is also a composite art in its own right. This study explores the dynamic of this particular type of imagetext by firstly claiming that the illuminated poem embodies a moment of indiscipline and secondly, by positing that illustration should contribute to this pairing by acting as a manifestation of illumination, instead of posturing as merely ‘illustrative’ or decorative. The inherent indisciplinarity of the illuminated poem as an imagetext is dissected – it is simultaneously two independent art forms and an integrated one; it can therefore be seen as both an interdisciplinary concern and a new art form. The illuminated poem as a visual art blurs the boundaries between words and images, upending the traditional, rigid boundaries of image-text discourse. Additionally, a meandering narrative is set in motion when poetry and illustration engage in an illuminated poem – a slower, involved, cross-pollinating reading that results in the activation of a reader’s imagination. The idea of Illumination is thus examined as both an orchestrated, visual choice and an active, conjuring process. Various strategies of illumination – with which illustration can open up a poem to new conceptual and narrative possibilities – are also discussed. These theories of interplay and interaction are then applied to an analysis of And all men kill the thing they love, an illuminated poem by Sara Fanelli and Oscar Wilde, revealing some of the ways in which illustration and poetry act as co-conspirators and collaborators when they engage in a moment of indiscipline.
AFRIKAANSE OPSOMMING: Hierdie is ‘n ondersoekende studie na die dieperliggende werking van ‘n “illuminated” gedig. Die studie fokus op die dialoog wat ontstaan wanneer ‘n gedig en illustrasies mekaar op papier ontmoet. Die “illuminated” gedig is egter soveel meer as net die saamgestelde som van woord en beeld – dit is ook ‘n verstrengelde nuwe kunswerk in eie reg. Hierdie studie verken die dinamiek van dié besondere soort beeldteks deur, eerstens, te verklaar dat “illumination” ‘n moment van ongedissiplineerdheid behels en, tweedens, deur te verwag dat die illustrasies bydra tot hierdie verhoudingsdinamika deur ‘n manifestasie van “illumination”, pleks van net ‘illustrerend’ of dekoratief, te wees. Die inherente ongedissiplineerdheid van die “open-ended” gedig as beeldteks word ondersoek – dit vorm tegelykertyd twee onafhanklike kunsvorms en ‘n geïntegreerde geheel; dit kan dus beskou word as beide ‘n interdissiplinêre kunswerk en ‘n nuwe kunsvorm. Die ‘mengsel’-gedig as visuele kunsvorm oorskry die bekende grense tussen woorde en beelde en gooi alle rigiede, streng-tradisionele riglyne van die beeldteks-geding omver. Die verhaaltrant volg kronkelpaaie wanneer digkuns en illustrasie slaags raak op papier of meedoen aan die “open-ended” gedig – ‘n stadiger, meer betrokke, kruisbestuiwende leestempo word afgedwing, wat sodoende die leser se verbeelding aktiveer. Die idee van “illumination” word dus ondersoek as beide ‘n georkestreerde, visuele keuse en ‘n meelewende (verwonderings)proses. Verskeie verhelderingsmoontlikhede – waardeur illustrasie ‘n gedig kan ontsluit om nuwe konseptuele en vertellingsmoontlikhede te ontgin – word ook bespreek. Hierdie teoretiese benadering van ‘n heen-en-weer-spel se wisselwerkende interaksie word dan toegepas op ‘n analise van And all men kill the thing they love, ‘n “illuminated” gedig deur Sara Fanelli en Oscar Wilde. Verskeie wyses waarop illustrasie en digkuns as samesweerders en samewerkers kan optree wanneer hulle hulself in ‘n oomblik van ongedissiplineerdheid bevind, word aangetoon.
APA, Harvard, Vancouver, ISO, and other styles
28

Tomeš, Martin. "Simulace LED náhrad v reálných podmínkách." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2014. http://www.nusl.cz/ntk/nusl-220695.

Full text
Abstract:
This Master’s thesis on the simulation of LED retrofits under operational conditions is divided into a theoretical and a practical part. The introductory section of the theoretical part deals with the concepts of lighting technology and presents some electric light sources, such as classic bulbs, halogen bulbs, and linear and compact fluorescent lamps. It also describes the principle of electroluminescent light sources (LEDs), discussing how white light is obtained, the main advantages of LEDs, etc. The reader is familiarized with the issue of replacing linear fluorescent lamps with tubular LED sources and with the working environment of the ReluxPro programme, which is used for the design of the lighting system. The practical part of the thesis is divided into two chapters. In the first chapter, the measured linear fluorescent lamps and LED modules are evaluated and compared with respect to light-technical parameters, luminous intensity, radiation spectrum, and settling time. The second chapter uses ReluxPro to explore the impact of various light sources on the illumination and luminance of room SA 5.10. In conclusion, the economic aspects of linear fluorescent lamps and LED modules are analyzed. Results from the ReluxPro program are continuously compared with real measured values.
APA, Harvard, Vancouver, ISO, and other styles
29

Zhang, Yuyao. "Non-linear dimensionality reduction and sparse representation models for facial analysis." Thesis, Lyon, INSA, 2014. http://www.theses.fr/2014ISAL0019/document.

Full text
Abstract:
Les techniques d'analyse du visage nécessitent généralement une représentation pertinente des images, notamment en passant par des techniques de réduction de la dimension, intégrées dans des schémas plus globaux, et qui visent à capturer les caractéristiques discriminantes des signaux. Dans cette thèse, nous fournissons d'abord une vue générale sur l'état de l'art de ces modèles, puis nous appliquons une nouvelle méthode intégrant une approche non-linéaire, Kernel Similarity Principle Component Analysis (KS-PCA), aux Modèles Actifs d'Apparence (AAMs), pour modéliser l'apparence d'un visage dans des conditions d'illumination variables. L'algorithme proposé améliore notablement les résultats obtenus par l'utilisation d'une transformation PCA linéaire traditionnelle, que ce soit pour la capture des caractéristiques saillantes, produites par les variations d'illumination, ou pour la reconstruction des visages. Nous considérons aussi le problème de la classification automatiquement des poses des visages pour différentes vues et différentes illumination, avec occlusion et bruit. Basé sur les méthodes des représentations parcimonieuses, nous proposons deux cadres d'apprentissage de dictionnaire pour ce problème. Une première méthode vise la classification de poses à l'aide d'une représentation parcimonieuse active (Active Sparse Representation ASRC). En fait, un dictionnaire est construit grâce à un modèle linéaire, l'Incremental Principle Component Analysis (Incremental PCA), qui a tendance à diminuer la redondance intra-classe qui peut affecter la performance de la classification, tout en gardant la redondance inter-classes, qui elle, est critique pour les représentations parcimonieuses. La seconde approche proposée est un modèle des représentations parcimonieuses basé sur le Dictionary-Learning Sparse Representation (DLSR), qui cherche à intégrer la prise en compte du critère de la classification dans le processus d'apprentissage du dictionnaire. Nous faisons appel dans cette partie à l'algorithme K-SVD. Nos résultats expérimentaux montrent la performance de ces deux méthodes d'apprentissage de dictionnaire. Enfin, nous proposons un nouveau schéma pour l'apprentissage de dictionnaire adapté à la normalisation de l'illumination (Dictionary Learning for Illumination Normalization: DLIN). L'approche ici consiste à construire une paire de dictionnaires avec une représentation parcimonieuse. Ces dictionnaires sont construits respectivement à partir de visages illuminées normalement et irrégulièrement, puis optimisés de manière conjointe. Nous utilisons un modèle de mixture de Gaussiennes (GMM) pour augmenter la capacité à modéliser des données avec des distributions plus complexes. Les résultats expérimentaux démontrent l'efficacité de notre approche pour la normalisation d'illumination
Face analysis techniques commonly require a proper representation of images by means of dimensionality reduction leading to embedded manifolds, which aims at capturing relevant characteristics of the signals. In this thesis, we first provide a comprehensive survey of the state of the art of embedded manifold models. Then, we introduce a novel non-linear embedding method, the Kernel Similarity Principal Component Analysis (KS-PCA), into Active Appearance Models, in order to model face appearances under variable illumination. The proposed algorithm successfully outperforms the traditional linear PCA transform in capturing the salient features generated by different illuminations, and reconstructs the illuminated faces with high accuracy. We also consider the problem of automatically classifying human face poses from face views with varying illumination, as well as occlusion and noise. Based on sparse representation methods, we propose two dictionary-learning frameworks for this pose classification problem. The first framework is the Adaptive Sparse Representation pose Classification (ASRC). It trains the dictionary via a linear model called Incremental Principal Component Analysis (Incremental PCA), tending to decrease the intra-class redundancy which may affect classification performance, while keeping the inter-class redundancy which is critical for sparse representation. The other proposed work is the Dictionary-Learning Sparse Representation model (DLSR), which learns the dictionary so that it coincides with the classification criterion. This training goal is achieved by the K-SVD algorithm. In a series of experiments, we show the performance of the two dictionary-learning methods, which are respectively based on a linear transform and a sparse representation model. Besides, we propose a novel Dictionary Learning framework for Illumination Normalization (DL-IN). DL-IN is based on sparse representation in terms of coupled dictionaries. The dictionary pairs are jointly optimized from normally illuminated and irregularly illuminated face image pairs. We further utilize a Gaussian Mixture Model (GMM) to enhance the framework's capability of modeling data under complex distributions. The GMM adapts each model to a part of the samples and then fuses them together. Experimental results demonstrate the effectiveness of sparsity as a prior for patch-based illumination normalization of face images.
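The dictionary-learning-plus-residual recipe underlying this kind of pose classification can be sketched with off-the-shelf tools. This is the generic sparse-representation classification scheme, not the thesis's ASRC/DLSR algorithms; the atom count and OMP sparsity level are illustrative choices:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

def learn_class_dictionaries(X_by_class, n_atoms=32):
    """Learn one dictionary per class from training samples (rows of X)."""
    dicts = {}
    for label, X in X_by_class.items():
        dicts[label] = MiniBatchDictionaryLearning(
            n_components=n_atoms,
            transform_algorithm="omp",
            transform_n_nonzero_coefs=5,
            random_state=0,
        ).fit(X)
    return dicts

def classify(x, dicts):
    """Sparse-representation classification: assign x to the class
    whose dictionary reconstructs it with the smallest residual."""
    residuals = {}
    for label, model in dicts.items():
        code = model.transform(x[None, :])      # sparse coefficients
        recon = code @ model.components_        # reconstruction
        residuals[label] = np.linalg.norm(x - recon)
    return min(residuals, key=residuals.get)
```

The thesis's contribution lies in how the dictionaries are trained (Incremental PCA versus K-SVD with a classification-aware objective); the decision rule above stays the same.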
APA, Harvard, Vancouver, ISO, and other styles
30

Khan, Zulfiqar A. "EMI/EMC analysis of electronic systems subject to near zone illuminations." Columbus, Ohio : Ohio State University, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1196207323.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Ponchia, Chiara. "Frammenti dell'Aldilà. Immagini nella Divina Commedia nell'Italia settentrionale del Trecento." Doctoral thesis, Università degli studi di Padova, 2014. http://hdl.handle.net/11577/3423513.

Full text
Abstract:
The aim of my thesis is to analyze the genesis and the development of Divine Comedy illustration in Northern Italy during the XIV century. In section 1 I present the most representative manuscripts, which I selected because of the great number of illuminations and the particular iconographic solutions they present. Section 2 is dedicated to the analysis of the different illuminated images created by the first Divine Comedy illuminators to represent the main characters of Dante’s poem and the landscapes of the three Realms. In my analysis, I pay particular attention to the sources, visual and textual, that inspired illuminators when they had to face the difficult challenge of illustrating a new text with no previous iconographic tradition. In section 3 I analyze the relationship between text and images in the manuscripts chosen for my thesis. Also in this case, I pay great attention to the possible sources that may have inspired illuminators. Concerning this, medieval novel illumination turned out to be a very fruitful field of investigation, as the spread of illuminated novels in Northern Italy at the beginning of the XIV century seems to have been one of the main incitements for the development of a narrative pictorial style.
Il mio lavoro ha come oggetto la genesi e lo sviluppo dell’illustrazione miniata della Divina Commedia nell’Italia nord-orientale nel XIV secolo. La tesi muove dall’individuazione di un corpus di manoscritti significativi per numero di immagini e tipologie illustrative impiegate. Ai manoscritti principali è dedicato un capitolo di approfondimento finalizzato soprattutto a rilevare le principali problematiche storico-artistiche ad essi correlate. Particolare attenzione è dedicata all’Egerton 943, per il quale si propone una nuova datazione. Il lavoro prosegue con un’attenta analisi delle soluzioni adottate dai miniatori della Divina Commedia per illustrare un testo nuovo e pertanto privo di una tradizione iconografica consolidata. Ampio spazio è dato allo studio delle fonti, visive e in alcuni casi testuali, cui si rivolsero i miniatori in cerca di ispirazione, partendo da alcuni affondi analitici sulle iconografie adottate per raffigurare i personaggi principali, quali ad esempio Caronte e Minosse, per poi passare allo studio dei modelli visivi scelti per ricreare le ambientazioni di inferno, purgatorio e paradiso. Allo studio della figurazione si aggiunge poi l’analisi delle principali forme di rapporto testo-immagine nei codici padani della Divina Commedia. Anche in questo caso vengono prese in esame le possibili fonti, attraverso un’approfondita ricognizione delle principali tipologie di racconto per immagine riscontrabili nell’Italia nord-orientale tra la fine del Duecento e l’inizio del secolo successivo. In particolare, si è rivelato proficuo lo studio dei codici cavallereschi, che paiono essere l’ambito privilegiato di elaborazione dei principali sistemi impaginativi che saranno poi impiegati nei più antichi testimoni settentrionali del poema.
APA, Harvard, Vancouver, ISO, and other styles
32

Bodnarova, Adriana. "Texture analysis for automatic visual inspection and flaw detection in textiles." Thesis, Queensland University of Technology, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
33

Leoputra, Wilson Suryajaya. "Video foreground extraction for mobile camera platforms." Thesis, Curtin University, 2009. http://hdl.handle.net/20.500.11937/1384.

Full text
Abstract:
Foreground object detection is a fundamental task in computer vision with many applications in areas such as object tracking, event identification, and behavior analysis. Most conventional foreground object detection methods work only in stable illumination environments using fixed cameras. In real-world applications, however, it is often the case that the algorithm needs to operate under the following challenging conditions: drastic lighting changes, object shape complexity, moving cameras, low frame capture rates, and low resolution images. This thesis presents four novel approaches for foreground object detection on real-world datasets using cameras deployed on moving vehicles. The first problem addresses passenger detection and tracking tasks for public transport buses, investigating the problem of changing illumination conditions and low frame capture rates. Our approach integrates a stable SIFT (Scale Invariant Feature Transform) background seat modelling method with a human shape model into a weighted Bayesian framework to detect passengers. To deal with the problem of tracking multiple targets, we employ the Reversible Jump Markov Chain Monte Carlo tracking algorithm. Using an SVM classifier, the appearance transformation models capture changes in the appearance of the foreground objects across two consecutive frames under low frame rate conditions. In the second problem, we present a system for pedestrian detection involving scenes captured by a mobile bus surveillance system. It integrates scene localization, foreground-background separation, and pedestrian detection modules into a unified detection framework. The scene localization module performs a two-stage clustering of the video data. In the first stage, SIFT homography is applied to cluster frames in terms of their structural similarity, and the second stage further clusters these aligned frames according to consistency in illumination. This produces clusters of images that are differential in viewpoint and lighting. A kernel density estimation (KDE) technique for colour and gradient is then used to construct background models for each image cluster, which are further used to detect candidate foreground pixels. Finally, using a hierarchical template matching approach, pedestrians can be detected. In addition to the second problem, we present three direct pedestrian detection methods that extend the HOG (Histogram of Oriented Gradients) technique (Dalal and Triggs, 2005) and provide a comparative evaluation of these approaches. The three approaches include: a) a new histogram feature, formed by the weighted sum of both the gradient magnitude and the filter responses from a set of elongated Gaussian filters (Leung and Malik, 2001) corresponding to the quantised orientation, which we refer to as the Histogram of Oriented Gradient Banks (HOGB) approach; b) the codebook-based HOG feature with a branch-and-bound (efficient subwindow search) algorithm (Lampert et al., 2008); and c) the codebook-based HOGB approach. In the third problem, a unified framework that combines 3D and 2D background modelling is proposed to detect scene changes using a camera mounted on a moving vehicle. The 3D scene is first reconstructed from a set of videos taken at different times. The 3D background modelling identifies inconsistent scene structures as foreground objects. For the 2D approach, foreground objects are detected using the spatio-temporal MRF algorithm. Finally, the 3D and 2D results are combined using morphological operations. The significance of this research is that it provides basic frameworks for automatic large-scale mobile surveillance applications and facilitates many higher-level applications such as object tracking and behaviour analysis.
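The Dalal-Triggs HOG + linear SVM detector that these variants extend ships with OpenCV; a baseline sketch for comparison purposes (this is the stock people detector, not the thesis's HOGB or codebook-based variants):

```python
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_pedestrians(frame):
    """Multi-scale sliding-window sweep with the stock HOG + linear SVM
    people model; returns (x1, y1, x2, y2) boxes."""
    rects, weights = hog.detectMultiScale(frame, winStride=(8, 8),
                                          padding=(8, 8), scale=1.05)
    return [(x, y, x + w, y + h) for (x, y, w, h) in rects]
```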
APA, Harvard, Vancouver, ISO, and other styles
34

Alderman, Gwendolyn. "From policy borrowing to implementation : an illuminative evaluation of learning and teaching in higher education in Australia (2002 to 2008)." Thesis, Queensland University of Technology, 2014. https://eprints.qut.edu.au/75865/1/Gwendolyn_Alderman_Thesis.pdf.

Full text
Abstract:
This study documents and theorises the consequences of the 2003 Australian Government Reform Package focussed on learning and teaching in Higher Education during the period 2002 to 2008. This is achieved through the perspective of program evaluation and the methodology of illuminative evaluation. The findings suggest that the three national initiatives of that time, Learning and Teaching Performance Fund (LTPF), Australian Learning and Teaching Council (ALTC), and Australian Universities Quality Agency (AUQA), were successful in repositioning learning and teaching as a core activity in universities. However, there were unintended consequences brought about by international policy borrowing, when the short-lived nature of LTPF suggests a legacy of quality compliance rather than one of quality enrichment.
APA, Harvard, Vancouver, ISO, and other styles
35

Lösche, Frank. "Investigating the moment when solutions emerge in problem solving." Thesis, University of Plymouth, 2018. http://hdl.handle.net/10026.1/12838.

Full text
Abstract:
At some point during a creative action something clicks, suddenly the prospective problem solver just knows the solution to a problem, and a feeling of joy and relief arises. This phenomenon, called Eureka experience, insight, Aha moment, hunch, epiphany, illumination, or serendipity, has been part of human narrations for thousands of years. It is the moment of a subjective experience, a surprising, and sometimes a life-changing event. In this thesis, I narrow down this moment 1. conceptually, 2. experientially, and 3. temporally. The concept of emerging solutions has a multidisciplinary background in Cognitive Science, Arts, Design, and Engineering. Through the discussion of previous terminology and comparative reviews of historical literature, I identify sources of ambiguity surrounding this phenomenon and suggest unifying terms as the basis for interdisciplinary exploration. Tracking the experience based on qualitative data from 11 creative practitioners, I identify conflicting aspects of existing models of creative production. To bridge this theoretical and disciplinary divide between iterative design thinking and sequential models of creativity, I suggest a novel multi-layered model. Empirical support for this proposal comes from Dira, a computer-based open-ended experimental paradigm. As part of this thesis I developed the task and 40 unique sets of stimuli and response items to collect dynamic measures of the creative process and evade known problems of insightful tasks. Using Dira, I identify the moment when solutions emerge from the number and duration of mouse-interactions with the on-screen elements and the 124 participants' self-reports. I provide an argument for the multi-layered model to explain a discrepancy between the timing observed in Dira and existing sequential models. Furthermore, I suggest that Eureka moments can be assessed on more than a dichotomous scale, as the empirical data from interviews and Dira demonstrates for this rich human experience. I conclude that the research on insight benefits from an interdisciplinary approach and suggest Dira as an instrument for future studies.
APA, Harvard, Vancouver, ISO, and other styles
36

Hubel, Philipp [Verfasser], and Matthias [Akademischer Betreuer] Mann. "Illuminating novel aspects in virus-host interactions by tailored quantitative proteomics analyses / Philipp Hubel ; Betreuer: Matthias Mann." München : Universitätsbibliothek der Ludwig-Maximilians-Universität, 2019. http://d-nb.info/121985221X/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Thorpe, Robert Nicholas. "Evaluating an English department: the use of illuminative evaluation procedures in descriptive and diagnostic analysis of English teaching programmes in high schools." Thesis, Rhodes University, 1991. http://hdl.handle.net/10962/d1003404.

Full text
Abstract:
To evaluate what is actually happening within a High School subject curriculum, the annual parade of marks, percentages and symbol distributions is not by itself adequate, especially in assessing progress towards such English syllabus goals as: That pupils expand their experience of life, gain empathetic understanding of people and develop moral awareness. (3.1.4 HG) How too, from examination results alone, can a subject head of English assess the success of his objective "to woo his pupils into the reading habit"? (School 1: Goals 1988) Decisions on English department policy and procedures are frequently based on personal hunches and examination results. Few subject departments engage in proper evaluations of their curricula to support decisions made, or to impart meaning upon the countless daily transactions between child and adult, individual and institution in the learning process. This study demonstrates the efficacy of "illuminative evaluation" techniques in opening out an educational innovation (the 1986 First Language English syllabi of the Cape Education Department) at two High Schools for comment and appraisal. The array of information gathered should be useful in planning and implementing further curricular initiatives. The inherent flexibility of illuminative evaluation procedures, and their freedom from the large-scale database requirements needed for 'scientific' models of evaluation, are advantageous in investigating the untidy complexities of English teaching. Both 'closed' and 'open' response questionnaires, interviews, and perusal of relevant documents informed the researcher of the views of pupils, parents, English teachers, other subject heads, the two school principals and the education authorities on what was and ought to be happening in English classes. From the considerable array of information generated, the distress of conscientious English teachers facing unreasonable work-loads emerged clearly. Such teachers are likely to occupy key roles in the non-racial state schools of the future and cannot be regarded as expendable. 'Open schools' present new challenges to existing curricula and the position of English may prove to be critical. Thus it is submitted that English subject heads should be concerned with evaluating their departments so that informed decisions can be taken on future directions. Illuminative evaluation is demonstrably useful in such analyses.
APA, Harvard, Vancouver, ISO, and other styles
38

Reid, Rhiannon Sara. "An activity theory analysis of how management of a private higher education institution interpret and engage with re-accreditation." Master's thesis, Faculty of Humanities, 2021. http://hdl.handle.net/11427/33905.

Full text
Abstract:
The aim of this study was to provide an in-depth understanding of how a single private provider conducted an application for re-accreditation in line with the recently revised accreditation framework set out by the Council on Higher Education. This framework aims to promote an integrated approach to accreditation and increased autonomy for higher education institutions with regard to the re-accreditation of programmes. The research unpacked how accreditation was understood and applied within the context of the institution, placing emphasis on understanding the elements that promoted or inhibited quality, as well as the tensions and contradictions that arose within this process. The driving question addressed by this research was: How does management within a South African private higher education institution engage with the re-accreditation process? The literature revealed that there is limited research on understanding quality assurance in private higher education in South Africa, and specifically on accreditation. Cultural-historical activity theory (CHAT) was considered the most effective lens for interpreting the findings of this study, as research indicates that it is suited to teasing out the historical and cultural contradictions within, as well as between, people, tools and the environment in complex educational systems. Multiple data-gathering techniques, including semi-structured in-depth interviews, participant observations and documentation reviews, were used. The findings of this study illuminate the critical role of management, and their respective interpretations of quality, in shaping an application for re-accreditation that balanced quality development and accountability requirements. The study highlighted contradictions and issues that inhibited meaningful engagement with accreditation as well as the enhancement of programme and institutional quality.
APA, Harvard, Vancouver, ISO, and other styles
39

Nichterwitz, Melanie [Verfasser], and Bernd [Akademischer Betreuer] Rech. "Charge carrier transport in Cu(In,Ga)Se2 thin-film solar-cells studied by electron beam induced current and temperature and illumination dependent current voltage analyses / Melanie Nichterwitz. Betreuer: Bernd Rech." Berlin : Universitätsbibliothek der Technischen Universität Berlin, 2012. http://d-nb.info/1023762145/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

"Image-based illumination analysis and applications." Thesis, 2010. http://library.cuhk.edu.hk/record=b6075248.

Full text
Abstract:
Applications using image-based illumination analysis are very limited in the current computer graphics literature. However, there are potentially many more applications based on such analysis and estimation. In this thesis, we show two applications in computer graphics that can directly benefit from such analysis and estimation: photo colorization and texture synthesis.
Illumination is a very common phenomenon. All the photographs that we casually take with cameras exhibit it. Computer graphicists usually simulate the illumination cast on objects based on physical models. While directly rendering such effects has been intensively studied in the field of computer graphics, the inverse estimation of the illumination contribution to each pixel in digital photographs, which we call image-based illumination estimation, remains a challenging problem. The lack of the underlying geometry as well as the light source and material properties usually makes such inverse estimation ill-posed and a very difficult problem to solve.
In this thesis, we target this image-based illumination estimation problem. We review the current state-of-the-art illumination estimation algorithms for solving intrinsic images, and demonstrate their benefits and drawbacks. While this is a fundamental research problem in the field of computer vision, we show that by decomposing the image into its intrinsic components, the reflectance and the illumination, many graphical applications can potentially be explored and benefit. We also introduce a new and novel algorithm to efficiently estimate the intrinsic components based on the statistics of the textured regions. The same algorithm can also be directly applied to non-textured regions in an image.
Texture synthesis is a fundamental problem in computer graphics. Current texture synthesis methods struggle to automatically take illumination and deformation into account during synthesis. By exploring the statistics of the texture, we propose a very efficient algorithm to estimate both the illumination and deformation fields on textures. The color of the illuminant is also taken into account so that the recovered reflectance has consistent color. By decomposing the illumination and deformation fields, we show that many texture-based applications, such as the preparation of texture exemplars from real photographs, the natural replacement of textured regions, the relighting of objects, as well as the manipulation of geometries in natural images, can be well achieved, with the success of texture synthesis guided by illumination and deformation.
Traditional example-based colorization of natural images usually suffers from illumination inconsistency. The color transfer from areas such as highlights and shadows may severely harm the colorization result. We propose to consider the illumination problem in colorization and perform colorization in an illumination-free domain. The decomposition of the intrinsic components from multiple example images, as well as the recombination and utilization of these intrinsic components in colorization, form the foundation of the proposed technique. Consistent colorization results are obtained even though the example images are from different lighting conditions and viewing directions.
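The intrinsic model these applications build on factors an image into reflectance and illumination, I = R * S. The thesis estimates the components from the statistics of textured regions; the toy Retinex-style split below only illustrates the model being inverted (the smoothing-based illumination estimate and sigma are assumptions, not the thesis's algorithm):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def intrinsic_decompose(image, sigma=15.0, eps=1e-6):
    """Toy single-image decomposition under I = R * S: treat the
    heavily smoothed log-image as illumination (S) and the residual
    as reflectance (R)."""
    log_i = np.log(np.asarray(image, dtype=float) + eps)
    log_s = gaussian_filter(log_i, sigma)   # smooth, large-scale shading
    log_r = log_i - log_s                   # detail kept as reflectance
    return np.exp(log_r), np.exp(log_s)
```

Colorizing in the reflectance layer and recombining with the original illumination afterwards is what makes the "illumination-free domain" colorization of the abstract consistent across highlights and shadows.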
Liu, Xiaopei.
Adviser: Wong Tien Tsin.
Source: Dissertation Abstracts International, Volume: 73-03, Section: B, page: .
Thesis (Ph.D.)--Chinese University of Hong Kong, 2010.
Includes bibliographical references (leaves 76-83).
Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Electronic reproduction. [Ann Arbor, MI] : ProQuest Information and Learning, [201-] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Abstract also in Chinese.
APA, Harvard, Vancouver, ISO, and other styles
41

Dai, Liang-Kuang, and 戴良光. "Scene Change Detection based on Illumination Analysis." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/81025030771805941378.

Full text
Abstract:
Master's thesis
Tamkang University
Master's Program, Department of Computer Science and Information Engineering
96
In recent years, owing to the rapid progress of computer science, the internet has been flooded with multimedia data, especially video, which carries the greatest variety of information. For video organization and indexing, shot boundary detection is essential work. In general, the shot is regarded as the basic unit of video, and shot boundary detection supports video summarization, retrieval, and browsing. In this system, we use illumination change data, based on the HSV color space, to detect shot boundaries and to determine the shot change type. In the experimental results, the most common shot change types, namely cut, fade, and dissolve, are detected with a high degree of accuracy.
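The abstract does not give the exact features or thresholds, so the sketch below uses a brightness-histogram distance on the HSV V channel as the illumination-change signal; the bin count and threshold are illustrative assumptions:

```python
import cv2

def detect_cuts(video_path, threshold=0.5):
    """Flag a cut when the brightness (V-channel) histogram changes
    abruptly between consecutive frames; gradual transitions such as
    fades and dissolves need a windowed test on the same signal."""
    cap = cv2.VideoCapture(video_path)
    cuts, prev_hist, idx = [], None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [2], None, [64], [0, 256])
        hist = cv2.normalize(hist, hist).flatten()
        if prev_hist is not None:
            # Bhattacharyya distance: ~0 similar, ~1 very different
            d = cv2.compareHist(prev_hist, hist,
                                cv2.HISTCMP_BHATTACHARYYA)
            if d > threshold:
                cuts.append(idx)
        prev_hist, idx = hist, idx + 1
    cap.release()
    return cuts
```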
APA, Harvard, Vancouver, ISO, and other styles
42

TSENG, CHI-YUAN, and 曾啟淵. "The Illumination Analysis of Indoor Table Lamp." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/nf8chv.

Full text
Abstract:
Master's thesis
National Pingtung University
Master's Program, Department of Applied Physics
107
Table lamps are important for reading, where the illumination distribution and glare influence the vision of the eyes; it is therefore important to understand the factors that influence the degree of glare. In this thesis, a table lamp located on a table inside a room was simulated, and its illumination distributions were discussed. In order to quantify the degree of glare, the unified glare rating (UGR) defined by the CIE was utilized. Besides, the table was divided into four regions for a convenient understanding of the distribution of glare.   The table lamp could consist of fluorescent lamps with a V-shaped reflector or of LEDs. The simulated results for these two kinds of table lamps showed that the illumination distributions and UGR are not affected by the type of lamp, but the additional diffuser of the LED table lamp efficiently reduces the degree of glare. The background illumination is also an important factor influencing the degree of glare, whether it comes from indoor lamps or from natural illumination. Besides, the orientations of the table lamp, the table, and the indoor lamps also affect the degree of glare.
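For reference, the unified glare rating used throughout this analysis is the CIE 117 formula:

\[ \mathrm{UGR} \;=\; 8\,\log_{10}\!\left(\frac{0.25}{L_b}\sum_{i}\frac{L_i^{2}\,\omega_i}{p_i^{2}}\right), \]

where \(L_b\) is the background luminance (cd/m²), \(L_i\) the luminance of the luminous parts of each luminaire in the direction of the observer's eye, \(\omega_i\) the solid angle those parts subtend at the eye, and \(p_i\) the Guth position index of each luminaire. The dependence on \(L_b\) in the denominator is why the background illumination is reported as a major glare factor.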
APA, Harvard, Vancouver, ISO, and other styles
43

Chen, C. W., and 陳智偉. "Calibration and Error Analysis for Stroboscopic Illumination Ellipsometry." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/88051764939774963581.

Full text
Abstract:
Master's thesis
National Chiao Tung University
Department of Photonics and Institute of Electro-Optical Engineering
95
Stroboscopic illumination ellipsometry is a fast imaging technique which operates by synchronizing ultra-stable short pulses to freeze the variation of the photoelastic modulator (PEM) signal. Four specific polarization states are used to deduce the ellipsometric parameters (Ψ, Δ). We postulate that the main deviation of the ellipsometric parameters in stroboscopic illumination ellipsometry is caused by deviations of the initial temporal phase and of the modulation amplitude. In this thesis, we derive a correction technique to optimize the ellipsometric parameters. Finally, an optical thin film of known thickness was measured to demonstrate the reliability of this correction technique. We can eliminate errors caused by the mispositioned initial time and the modulation amplitude deviation. This post-measurement correction technique can eliminate all the systematic errors and achieve dΨ ~ 0.03° and dΔ ~ 0.41° in 20 microseconds.
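As background to the error model, the detected intensity in PEM-based ellipsometry is commonly written in terms of the modulator's sinusoidal retardation; a sketch of the standard form (the exact coefficient relations depend on the optical configuration, so these expressions are indicative rather than the thesis's own derivation):

\[ \delta(t) = A\,\sin(\omega t + \varphi_0), \qquad I(t) = I_0\left[\,1 + I_S\,\sin\delta(t) + I_C\,\cos\delta(t)\,\right], \]

where \(A\) is the modulation amplitude, \(\varphi_0\) the initial temporal phase, and the coefficients \(I_S, I_C\) encode \((\Psi, \Delta)\). Stroboscopic sampling at four instants \(t_k\) fixes four retardation values \(\delta_k\); any deviation in \(\varphi_0\) or \(A\) shifts every \(\delta_k\) and thus biases the recovered \((\Psi, \Delta)\), which is what the proposed calibration corrects.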
APA, Harvard, Vancouver, ISO, and other styles
44

Chu, Fu-Wei, and 初福威. "Application and Analysis of Indoor Low Illumination Photometer." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/8s4rq2.

Full text
Abstract:
Master's thesis
National Formosa University
Graduate Institute of Electro-Optical and Materials Science
100
For the human eye, when the illumination intensity is too low, eye fatigue easily results; on the contrary, if the illumination intensity is too high, electricity is needlessly wasted. In this thesis, the Holtek microcontroller HT46R64 and the Solteamopto photoelectric light sensors JSA-1116 and JSA-3111B are used to convert the light intensity into a voltage; an analog-to-digital converter (ADC) is employed to transform the analog signals into digital ones. Finally, the ADC results are displayed with the help of the microcontroller. For various environments, users need appropriate illumination conditions to match the actual requirements; book-reading, for example, requires a range of 300 to 500 lux. Our motivation is to devise a compact photometer for indoor illumination with a detection range between 0 and 1000 lux. Because of the cost and wide measurement ranges of the photometer products available on the market, this thesis is intended to design a low-cost, portable photometer; real-time measurement can always be conducted under any desk lamp illumination scenario, and the estimated cost is around NT$399.
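The core conversion such a meter performs (ADC counts to volts to lux) is a one-line scaling once the sensor front-end is calibrated. A hedged sketch, where the resolution, reference voltage, and lux-per-volt slope are hypothetical placeholders rather than the HT46R64's actual firmware values:

```python
def adc_to_lux(adc_count, adc_bits=12, v_ref=5.0, lux_per_volt=200.0):
    """Map an ADC reading to illuminance for a sensor whose output
    voltage rises linearly with light level. All three constants are
    placeholders: the actual resolution, reference voltage, and slope
    must come from the datasheets and a calibration against a
    reference photometer."""
    volts = adc_count * v_ref / (2 ** adc_bits - 1)
    return volts * lux_per_volt

# e.g. a mid-scale reading:
print(adc_to_lux(2048))   # ~500 lux with the placeholder constants
```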
APA, Harvard, Vancouver, ISO, and other styles
45

"Compressing the illumination-adjustable images with principal component analysis." 2003. http://library.cuhk.edu.hk/record=b5891490.

Full text
Abstract:
Pun-Mo Ho.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2003.
Includes bibliographical references (leaves 90-95).
Abstracts in English and Chinese.
Chapter 1 --- Introduction --- p.1
Chapter 1.1 --- Background --- p.1
Chapter 1.2 --- Existing Approaches --- p.2
Chapter 1.3 --- Our Approach --- p.3
Chapter 1.4 --- Structure of the Thesis --- p.4
Chapter 2 --- Related Work --- p.5
Chapter 2.1 --- Compression for Navigation --- p.5
Chapter 2.1.1 --- Light Field/Lumigraph --- p.5
Chapter 2.1.2 --- Surface Light Field --- p.6
Chapter 2.1.3 --- Concentric Mosaics --- p.6
Chapter 2.1.4 --- On the Compression --- p.7
Chapter 2.2 --- Compression for Relighting --- p.7
Chapter 2.2.1 --- Previous Approaches --- p.7
Chapter 2.2.2 --- Our Approach --- p.8
Chapter 3 --- Image-Based Relighting --- p.9
Chapter 3.1 --- Plenoptic Illumination Function --- p.9
Chapter 3.2 --- Sampling and Relighting --- p.11
Chapter 3.3 --- Overview --- p.13
Chapter 3.3.1 --- Codec Overview --- p.13
Chapter 3.3.2 --- Image Acquisition --- p.15
Chapter 3.3.3 --- Experiment Data Sets --- p.16
Chapter 4 --- Data Preparation --- p.18
Chapter 4.1 --- Block Division --- p.18
Chapter 4.2 --- Color Model --- p.23
Chapter 4.3 --- Mean Extraction --- p.24
Chapter 5 --- Principal Component Analysis --- p.29
Chapter 5.1 --- Overview --- p.29
Chapter 5.2 --- Singular Value Decomposition --- p.30
Chapter 5.3 --- Dimensionality Reduction --- p.34
Chapter 5.4 --- Evaluation --- p.37
Chapter 6 --- Eigenimage Coding --- p.39
Chapter 6.1 --- Transform Coding --- p.39
Chapter 6.1.1 --- Discrete Cosine Transform --- p.40
Chapter 6.1.2 --- Discrete Wavelet Transform --- p.47
Chapter 6.2 --- Evaluation --- p.49
Chapter 6.2.1 --- Statistical Evaluation --- p.49
Chapter 6.2.2 --- Visual Evaluation --- p.52
Chapter 7 --- Relighting Coefficient Coding --- p.57
Chapter 7.1 --- Quantization and Bit Allocation --- p.57
Chapter 7.2 --- Evaluation --- p.62
Chapter 7.2.1 --- Statistical Evaluation --- p.62
Chapter 7.2.2 --- Visual Evaluation --- p.62
Chapter 8 --- Relighting --- p.65
Chapter 8.1 --- Overview --- p.66
Chapter 8.2 --- First-Phase Decoding --- p.66
Chapter 8.3 --- Second-Phase Decoding --- p.68
Chapter 8.3.1 --- Software Relighting --- p.68
Chapter 8.3.2 --- Hardware-Assisted Relighting --- p.71
Chapter 9 --- Overall Evaluation --- p.81
Chapter 9.1 --- Compression of IAIs --- p.81
Chapter 9.1.1 --- Statistical Evaluation --- p.81
Chapter 9.1.2 --- Visual Evaluation --- p.86
Chapter 9.2 --- Hardware-Assisted Relighting --- p.86
Chapter 10 --- Conclusion --- p.89
Bibliography --- p.90
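The record's Chapters 4 and 5 outline the core pipeline: divide the images into blocks, extract the mean, then reduce dimensionality with PCA via singular value decomposition. A minimal sketch of that step, assuming each row of the data matrix is one image block observed under one lighting direction; names and shapes are illustrative, not from the thesis.

    import numpy as np

    def pca_compress(blocks, k):
        # blocks: (n_lightings, n_pixels) matrix for one image block.
        mean = blocks.mean(axis=0)
        centered = blocks - mean                 # mean extraction (Ch. 4.3)
        u, s, vt = np.linalg.svd(centered, full_matrices=False)
        eigenimages = vt[:k]                     # principal components (Ch. 5)
        coefficients = u[:, :k] * s[:k]          # relighting coefficients (Ch. 7)
        return mean, eigenimages, coefficients

    def relight(mean, eigenimages, coeff_row):
        # Reconstruct the block for one lighting direction (Ch. 8).
        return mean + coeff_row @ eigenimages

    rng = np.random.default_rng(0)
    data = rng.random((64, 256))                 # 64 lightings, one 16x16 block
    m, eig, c = pca_compress(data, k=8)
    approx = relight(m, eig, c[0])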
APA, Harvard, Vancouver, ISO, and other styles
46

Chang, Ya-hui, and 張雅惠. "Comparison and analysis of global natural lighting illumination system." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/07865255196404984078.

Full text
Abstract:
Master's thesis
National Taiwan University of Science and Technology
Department of Electronic Engineering
101
In recent years, green energy has undergone a great deal of development and has been the subject of many applications. Many studies have focused on illumination with sunlight as a means of saving energy and creating healthy lighting [1]. How to save energy and develop new renewable energy sources has become one of the most important issues amid the growing shortage of energy [2-7]. Lighting accounts for about 10% to 20% of all electricity consumption every year [8]. Many kinds of natural-light systems have now been developed. Natural-light systems have collecting, transmitting, and lighting elements, and provide a great number of benefits for the people who use them [1]. With such a system, daylight can be brought into an interior for lighting use. This thesis discusses different constructions of natural-light systems and introduces a variety of environments that need them, providing a construction index for natural-light guiding systems.
APA, Harvard, Vancouver, ISO, and other styles
47

Huang, Shou-Tsung, and 黃紹宗. "Surface Design and Illumination Analysis of Reflector for Surgical Operating." Thesis, 2001. http://ndltd.ncl.edu.tw/handle/83239492270312982491.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Chen, Chi-Kang, and 陳紀鋼. "Design and Analysis of Modern LED Lens for Collimating Illumination." Thesis, 2014. http://ndltd.ncl.edu.tw/handle/24256849791166878138.

Full text
Abstract:
Master's thesis
Tamkang University
Master's Program, Department of Mechanical and Electro-Mechanical Engineering
102
This thesis explores the design and analysis of a secondary optical lens for generating collimating illumination from an LED source. In the first phase of this study, a comprehensive literature review provides valuable technical information. Three types of lens are then selected from the most recent developments with high optical performance. The lens curve is constructed by a free-form method, in which a series of computed points is obtained for generating a graphical solid model. The optical analysis is executed with LightTools. This thesis proposes four requirements for achieving collimating illumination with a modern LED lens: 1. The process of constructing the lens curve is simple and correct. 2. The output optical efficiency must be greater than 80% within the beam angle. 3. The lens is easy to manufacture, reducing cost. 4. Because of practical volume restrictions, the diameter and axial length of the lens may be constrained. According to these four requirements, several design improvements on published works are carried out and discussed. At the final stage, this thesis develops an innovative lens that achieves 99.28% optical efficiency. In addition, for the case where the axial length and diameter are predetermined, another design is introduced for comparison and discussion. Both new lenses satisfy the four critical requirements of a modern LED lens for collimating illumination.
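Requirement 2 amounts to counting the fraction of traced rays that exit within the target cone. A minimal Monte Carlo sketch of that bookkeeping, assuming the ray directions after the lens have already been produced by a ray tracer such as LightTools; the Gaussian test data here are synthetic.

    import numpy as np

    def optical_efficiency(directions, beam_half_angle_deg):
        # Fraction of output rays whose angle from the optical axis (+z)
        # is within the beam half-angle. directions: (n, 3) unit vectors.
        cos_angles = directions @ np.array([0.0, 0.0, 1.0])
        return np.mean(cos_angles >= np.cos(np.deg2rad(beam_half_angle_deg)))

    rng = np.random.default_rng(1)
    d = rng.normal([0.0, 0.0, 1.0], 0.05, size=(100_000, 3))  # synthetic rays
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    print(optical_efficiency(d, beam_half_angle_deg=10.0))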
APA, Harvard, Vancouver, ISO, and other styles
49

LI, CHIH-HSUAN, and 李治軒. "Visibility Dehazing based on Channel-Weighted Analysis and Illumination Enhancement." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/36250754204916969607.

Full text
Abstract:
Master's thesis
Feng Chia University
Department of Information Engineering
105
Air pollution and foggy weather often result in serious distortion when taking photos or recognizing patterns. He et al. introduced the dark channel prior to solve this dehazing problem. Unfortunately, it does not function well once the color variation of the target image is large; the dehazed result looks unnatural. We therefore aim to develop a new visibility-dehazing technique based on channel-weighted analysis and illumination tuning. The channel-weighted analysis is adopted to eliminate the unnatural effect, while the illumination tuning is applied to refine the details. Simulation results demonstrate that the new method can guarantee the readability of a hazed image after removing noise, including foggy and sandstorm photos.
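The baseline the abstract builds on is He et al.'s dark channel prior; a minimal sketch of that baseline (not the thesis's channel-weighted refinement), with the commonly used parameter values:

    import numpy as np
    from scipy.ndimage import minimum_filter

    def dark_channel(img, patch=15):
        # Per-pixel minimum over RGB, then a local minimum filter.
        return minimum_filter(img.min(axis=2), size=patch)

    def estimate_transmission(img, atmosphere, omega=0.95, patch=15):
        # t(x) = 1 - omega * dark_channel(I / A); omega < 1 keeps a
        # trace of haze so distant objects still look natural.
        return 1.0 - omega * dark_channel(img / atmosphere[None, None, :], patch)

    def dehaze(img, atmosphere, t, t_min=0.1):
        # Recover scene radiance J = (I - A) / max(t, t_min) + A.
        t = np.clip(t, t_min, 1.0)[..., None]
        return (img - atmosphere) / t + atmosphere

Here img is an H×W×3 float array in [0, 1] and atmosphere a length-3 vector (in He et al., estimated from the brightest dark-channel pixels).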
APA, Harvard, Vancouver, ISO, and other styles
50

Wu, Chien-chun, and 吳健君. "The analysis of different LED chip size for spotlight illumination." Thesis, 2013. http://ndltd.ncl.edu.tw/handle/27600292182274598071.

Full text
Abstract:
Master's thesis
National Central University
Department of Optics and Photonics
101
In this thesis, we focused on five high-power LEDs with similar packages and analyzed the effective exitance of each LED when driven at the same current density. Effective exitance depends on efficacy and emitting area; when the light passes through the lens encapsulation, the emitting area changes. Finally, we designed a miniaturized bike lamp, using the LED with the highest effective exitance as its light source, to meet the K-mark regulation. The prototype was fabricated by CNC machining, and the measured light pattern was similar to the designed one.
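The quantity compared across chips is straightforward once flux and apparent emitting area are known; a minimal sketch following the abstract's description (efficacy times drive power over emitting area), with hypothetical numbers, since the thesis's exact definition of effective exitance may differ:

    def effective_exitance(efficacy_lm_per_w, power_w, emitting_area_mm2):
        # Luminous flux (efficacy * electrical power) per apparent
        # emitting area, in lm/mm^2.
        return efficacy_lm_per_w * power_w / emitting_area_mm2

    # Same 1 W, 100 lm/W chip: bare die vs. area enlarged by the lens.
    print(effective_exitance(100.0, 1.0, 1.0))    # 100 lm/mm^2
    print(effective_exitance(100.0, 1.0, 2.25))   # ~44 lm/mm^2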
APA, Harvard, Vancouver, ISO, and other styles