Academic literature on the topic 'Blurriness'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Blurriness.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Blurriness"

1

Ye, Wei, and Kai-Kuang Ma. "Blurriness-Guided Unsharp Masking." IEEE Transactions on Image Processing 27, no. 9 (September 2018): 4465–77. http://dx.doi.org/10.1109/tip.2018.2838660.

2

Sorensen, Roy A. "Vagueness, measurement, and blurriness." Synthese 75, no. 1 (April 1988): 45–82. http://dx.doi.org/10.1007/bf00873274.

3

Burek, Henry. "Blur equivalence – precision in blurriness." Optician 2017, no. 8 (August 2017): 160003–1. http://dx.doi.org/10.12968/opti.2017.8.160003.

4

Khan, Aamir, Weidong Jin, Amir Haider, MuhibUr Rahman, and Desheng Wang. "Adversarial Gaussian Denoiser for Multiple-Level Image Denoising." Sensors 21, no. 9 (April 24, 2021): 2998. http://dx.doi.org/10.3390/s21092998.

Abstract:
Image denoising is a challenging task that is essential in numerous computer vision and image processing problems. This study proposes and applies a generative adversarial network-based image denoising training architecture to multiple-level Gaussian image denoising tasks. Convolutional neural network-based denoising approaches suffer from a blurriness issue that produces denoised images with blurred texture details. To resolve this issue, we first performed a theoretical study of the cause of the problem. Subsequently, we proposed an adversarial Gaussian denoiser network, which uses a generative adversarial network-based adversarial learning process for image denoising tasks. This framework resolves the blurriness problem by encouraging the denoiser network to find the distribution of sharp noise-free images instead of blurry images. Experimental results demonstrate that the proposed framework can effectively resolve the blurriness problem and achieves better denoising performance than state-of-the-art denoising methods.
(An illustrative code sketch of the adversarial denoising idea follows this entry.)
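A minimal PyTorch sketch of the adversarial denoising idea summarized in the abstract above. This is an illustrative reconstruction, not the authors' architecture or code: the tiny Denoiser and Critic networks, the L1 pixel loss and the adv_weight value are placeholder assumptions made for the example.

```python
# Hedged sketch: the generator maps noisy images to denoised ones, while the
# discriminator pushes outputs toward the distribution of sharp, noise-free
# images, which is the mechanism the abstract credits with reducing blurriness.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Denoiser(nn.Module):
    """Generator: predicts the noise residual and subtracts it from the input."""
    def __init__(self, ch: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, 1, 3, padding=1),
        )

    def forward(self, noisy):
        return noisy - self.body(noisy)

class Critic(nn.Module):
    """Discriminator: scores how much an image looks like a sharp, clean one."""
    def __init__(self, ch: int = 64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(ch, ch, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(ch, 1),
        )

    def forward(self, img):
        return self.body(img)

def train_step(gen, critic, opt_g, opt_d, noisy, clean, adv_weight=1e-3):
    """One adversarial step: a pixel loss keeps fidelity, an adversarial loss keeps sharpness."""
    bce = F.binary_cross_entropy_with_logits
    # Discriminator update: real = clean images, fake = current denoised outputs.
    opt_d.zero_grad()
    fake = gen(noisy).detach()
    real_score, fake_score = critic(clean), critic(fake)
    d_loss = (bce(real_score, torch.ones_like(real_score)) +
              bce(fake_score, torch.zeros_like(fake_score)))
    d_loss.backward()
    opt_d.step()
    # Generator update: reconstruct pixels and try to fool the discriminator.
    opt_g.zero_grad()
    denoised = gen(noisy)
    g_score = critic(denoised)
    g_loss = (F.l1_loss(denoised, clean) +
              adv_weight * bce(g_score, torch.ones_like(g_score)))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```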
5

Yang, Zhuorui, and Aura Ganz. "Egocentric Landmark-Based Indoor Guidance System for the Visually Impaired." International Journal of E-Health and Medical Communications 8, no. 3 (July 2017): 55–69. http://dx.doi.org/10.4018/ijehmc.2017070104.

Abstract:
In this paper, we introduce an egocentric landmark-based guidance system that enables visually impaired users to interact with indoor environments. The user, who wears Google Glasses, captures his surroundings within his field of view. Using this information, we provide the user with an accurate landmark-based description of the environment, including his relative distance and orientation to each landmark. To achieve this functionality, we developed a near-real-time, accurate vision-based localization algorithm. Since the users are visually impaired, our algorithm accounts for images captured using Google Glasses that have severe blurriness, motion blurriness, low illumination intensity and crowd obstruction. We tested the algorithm's performance in a 12,000 ft² open indoor environment. With mint query images, our algorithm obtains mean location accuracy within 5 ft, mean orientation accuracy of less than 2 degrees and reliability above 88%. After applying deformation effects such as blurriness, motion blurriness and illumination changes to the query images, we observe that the reliability remains above 75%.
6

Zhang, Haopeng, Bo Yuan, Bo Dong, and Zhiguo Jiang. "No-Reference Blurred Image Quality Assessment by Structural Similarity Index." Applied Sciences 8, no. 10 (October 22, 2018): 2003. http://dx.doi.org/10.3390/app8102003.

Abstract:
No-reference (NR) image quality assessment (IQA) objectively measures image quality, consistently with subjective evaluations, using only the distorted image. In this paper, we focus on the problem of NR IQA for blurred images and propose a new no-reference structural similarity (NSSIM) metric based on re-blur theory and the structural similarity index (SSIM). We extract blurriness features and define image blurriness by grayscale distribution. NSSIM scores image quality by calculating image luminance, contrast, structure and blurriness. The proposed NSSIM metric can evaluate image quality immediately, without prior training or learning. Experimental results on four popular datasets show that the proposed metric outperforms SSIM and is well matched to state-of-the-art NR IQA models. Furthermore, we apply NSSIM together with known IQA approaches to blurred image restoration and demonstrate that NSSIM is statistically superior to peak signal-to-noise ratio (PSNR) and SSIM, and consistent with state-of-the-art NR IQA models.
(An illustrative code sketch of the re-blur idea follows this entry.)
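The re-blur principle behind metrics of this kind can be shown with a short, hedged sketch. This is not the authors' NSSIM implementation: the function name, the Gaussian sigma and the gradient-energy proxy are assumptions made for the example. The idea is that a sharp image loses much of its gradient energy when deliberately re-blurred, while an already-blurry image barely changes.

```python
# Minimal numpy/scipy sketch of a re-blur-based no-reference blurriness cue.
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def reblur_blurriness(gray: np.ndarray, sigma: float = 2.0) -> float:
    """Return a blurriness score in (0, 1]; higher values mean a blurrier image."""
    gray = gray.astype(np.float64)
    reblurred = gaussian_filter(gray, sigma=sigma)   # deliberately blur the image again

    def grad_energy(img: np.ndarray) -> float:
        gx, gy = sobel(img, axis=1), sobel(img, axis=0)
        return float(np.sum(np.hypot(gx, gy)))

    e_orig, e_blur = grad_energy(gray), grad_energy(reblurred)
    if e_orig == 0.0:                                # flat image: treat as maximally blurry
        return 1.0
    # A sharp input loses a lot of gradient energy when re-blurred (low ratio);
    # an already-blurry input changes little (ratio close to 1).
    return e_blur / e_orig
```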
7

Krzysztof-Świderska, Agnieszka Małgorzata. "Histrionical Blurriness. A God Image Without Features." Horyzonty Wychowania 21, no. 59 (September 29, 2022): 111–21. http://dx.doi.org/10.35765/hw.2022.2159.12.

Abstract:
RESEARCH OBJECTIVE: Exploring the specificity of the God Image in histrionic personality organization. THE RESEARCH PROBLEM AND METHODS: The main research question is whether the God Image in the histrionic subtype of Personality Organization is similar to or different from the God Image in Borderline Personality Organization. A methodological strategy of Qualitative Secondary Analysis, with the use of CAQDAS, was applied to material gathered in a project concerning the God Image in Borderline Personality Organization (Krzysztof-Świderska, 2017). THE PROCESS OF ARGUMENTATION: The influence of the pre-oedipal roots of histrionic personality organization on the development of a specific God Image is argued. RESEARCH RESULTS: The outcomes shed light on certain characteristics of the histrionic God Image: idealization and blurriness. The main conclusion is that the characteristics of the God Image in histrionic patients correspond with Bollas's theoretical approach. CONCLUSIONS, INNOVATIONS, AND RECOMMENDATIONS: From a methodological point of view, it is recommended that future research standardize groups in terms of the phase of therapy. It is also indicated that acting out experienced problems in a religious space could be potentially dangerous for these patients.
8

Khosravi, Mohammad Hossein, and Hamid Hassanpour. "Model-based full reference image blurriness assessment." Multimedia Tools and Applications 76, no. 2 (January 21, 2016): 2733–47. http://dx.doi.org/10.1007/s11042-015-3149-5.

9

Yang, Chang-Mo, Dong-Hyeok Lee, Kyoung-Soo Park, Young-Tae Kim, and Choon-Woo Kim. "Comparison of Measures of Blurriness in Transparent Displays." Electronic Imaging 2017, no. 18 (January 29, 2017): 76–79. http://dx.doi.org/10.2352/issn.2470-1173.2017.18.color-038.

10

Ibrar-ul-Haque, Muhammad, Muhammad Tahir Qadri, Najeeb Siddiqui, and Talat Altaf. "Combined Blockiness, Blurriness and White Noise Distortion Meter." Wireless Personal Communications 103, no. 3 (June 5, 2018): 1927–39. http://dx.doi.org/10.1007/s11277-018-5888-x.


Dissertations / Theses on the topic "Blurriness"

1

Qadri, Muhammad Tahir. "Blockiness and blurriness measurement for video quality assessment." Thesis, University of Essex, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.574461.

Abstract:
The rapid growth of wired and wireless multimedia data challenges researchers to develop efficient objective assessment meters to estimate its quality. This research is focused on designing objective quality assessment meters for images and video sequences. In this work, we proposed blockiness and blurriness distortion meters using full-reference, reduced-reference and no-reference approaches. Both blockiness and blurriness distortions are calculated using frequency-domain analysis. Since blockiness is an abrupt, periodic luminance change at DCT block boundaries, it generates harmonics in the frequency domain, and we used the strength of these harmonics to estimate blockiness distortion: the more severe the blockiness, the stronger the harmonics in the frequency domain. Blurriness, in contrast, removes the sharpness of the image by suppressing its high-frequency components, so we studied the loss of high-frequency coefficients to measure the blurriness artefact. We also aimed to design a multi-artefact distortion meter that can estimate distortion without prior knowledge of the distortion type, and we developed a combined distortion meter for the full-, reduced- and no-reference approaches to estimate both blockiness and blurriness artefacts. We also studied the impact of spatial masking on image quality estimation. Due to the non-linear behaviour of the human visual system, which perceives different amounts of distortion at different spatial frequencies, we applied a masking process that weights the visibility of distortion according to the local spatial activity of the image, and we investigated the importance of spatial masking in the FR, RR and NR modes. Finally, for video sequences, the quality metric of each frame is calculated, and we explored methods to integrate these scores into a single value. Temporal masking is then applied to mask motion by using motion vectors and their standard deviations. The results are compared with the subjective scores provided by the LIVE image and video databases.
(An illustrative code sketch of the frequency-domain blurriness idea follows this entry.)
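A hedged numpy sketch of the frequency-domain intuition described in the abstract above, namely that blurring suppresses high-frequency content. The cutoff value and the magnitude-based energy share are assumptions made for illustration; this is not the meter developed in the thesis.

```python
# Minimal sketch: estimate blurriness from the share of spectral magnitude
# that lies above a chosen normalized frequency cutoff.
import numpy as np

def high_frequency_share(gray: np.ndarray, cutoff: float = 0.25) -> float:
    """Fraction of FFT magnitude beyond the cutoff frequency; small values suggest blur."""
    gray = gray.astype(np.float64)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray)))
    h, w = gray.shape
    fy = (np.arange(h) - h / 2) / h              # normalized vertical frequencies
    fx = (np.arange(w) - w / 2) / w              # normalized horizontal frequencies
    radius = np.hypot(fy[:, None], fx[None, :])  # radial frequency of each coefficient
    high_energy = spectrum[radius > cutoff].sum()
    return float(high_energy / spectrum.sum())
```

For example, a sharp frame and a Gaussian-blurred copy of it give noticeably different values of high_frequency_share, which is the kind of separation a no-reference blurriness meter relies on.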
2

Abate, Leonardo. "Detection and measurement of artefacts in digital video frames." Doctoral thesis, Università degli studi di Trieste, 2012. http://hdl.handle.net/10077/7414.

Abstract:
This work presents original algorithms for the measurement of artefacts impairing the quality of digital video sequences. The intended use of these algorithms is the control of the restoration processes performed on the video in advanced monitors for consumer applications. The problem of artefact measurement at this stage of the processing chain differs from the assessment of quality performed in other applications. Quality assessment aimed at improving the encoding operation, for example, can be done using the original sequence for comparison, based on pixel-by-pixel differences with it. Quality measurements in intermediate stages of the transmission chain, where the sequence is available in compressed form, can employ useful information contained in the bitstream, such as the frequency distribution of the frame content, the bitrate, the quantisation step and the error rate, all factors related to the global quality. In the proposed application, i.e. at the monitor, the measurement of frame degradation must instead take place on the decoded numerical values of the pixels of the altered sequence alone. In addition, these measurements should require a low computational cost, so that they can be used in real time. In the first part of this work some of the existing methods for quality assessment are briefly overviewed and classified based on the chosen approach to the problem. Three main classes of methods are identified: methods based on the measurement of specific frame and video artefacts, methods measuring the discrepancies between ideal models and some statistical properties of the pixel distribution or the sequence parameters, and methods processing highly generic measures with trained classifiers. The first strategy is deemed the most promising for the intended application, due to the good results achieved with relatively little computation and the possibility of avoiding a long and complex training phase. The proposed algorithms are therefore based on the measurement of specific video artefacts. A second part of the work is devoted to the identification of the main potential degradation factors in one of the most recent encoding standards, H.264. The main aspects of frame degradation, namely blockiness in smooth areas and texture, edge degradation, and blurriness, are identified, and their relationship to the encoding options is briefly examined. Based on this inspection, two of the most common artefacts of transmitted video, blurriness and blockiness, are chosen for measurements estimating picture quality degradation. The devised algorithms integrate measures of the inter-pixel relationships determined by the artefacts with models of human vision in order to quantify their subjective appearance. For the blurriness measurement two methods are proposed, the first acting selectively on object edges, the second uniformly on the frame surface. In the measurement of edge blurriness the hierarchical role of each edge is estimated, distinguishing between the marginal edges of detail and the edges of the main objects of the frame. The former have reduced contrast and short length compared to the edges of the surrounding shapes, and have little effect on the overall blurriness impression. Conversely, the state of the latter largely determines the perceived quality of the frame.
The edge blurriness measure is based on edge width and steepness, corrected with the edge length and the activity of the surrounding scene. This measure is further corrected with a measure of local scene clutter, accounting for the fact that in cluttered scenes the perception of the artefact is reduced. The resulting method yields blurriness measurements in local frame parts, and the correlation of these measurements with subjective impression is evaluated in experimental tests. The two metrics acting uniformly on the frame measure the decrement in perceived contrast and the lack of detail, respectively. Used together, they are effective in identifying special types of blurriness that generate large areas with few edges and little contrast. These forms of blurriness generally cause a milder degradation of perceived quality than the blurriness caused by encoding. The ability to distinguish among blurriness types and the corresponding quality ranges is verified in experimental tests. The artefacts resulting from block-based compression are likewise analysed with one method acting on the edges alone and another applied to the whole frame. Edge degradation, consisting of an unnatural geometric alteration of the main objects, is measured from the frequency and length of straight edge fractions and the incidence of square corners; a correction procedure is introduced to avoid false alarms caused by naturally polygonal objects and by the intrinsic nature of digital pictures. The measure of the blocking artefact on the frame surface, which appears altered by an unnatural grid, is performed with an original solution especially devised for video frames and aimed at detecting the displacement of the synthetic block edges caused by the motion compensation performed in video encoding. For this purpose very sensitive local blockiness indicators are devised and corrected with models of the human perception of luminance discontinuities in order to avoid false alarms. Vision models are further integrated in the computation of a global frame blockiness measure consisting of a weighted sum of local measures at detection points. The metric is tested with respect to its constancy over subsequent frames, its robustness to upscaling and its correlation with the quality ratings produced in experiments by a group of human observers.
(An illustrative code sketch of the edge-steepness blurriness cue follows this entry.)
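A simplified numpy/scipy illustration of the edge-steepness cue summarised in the abstract above: the gradient magnitude at strong edge pixels, compared with the local intensity range those edges bridge. The window size, edge-selection quantile and normalisation are assumptions for this sketch; the thesis method additionally weights edges by length, scene activity and clutter, which is omitted here.

```python
# Hedged sketch: blurred edges ramp up slowly, so their gradient is small
# relative to the contrast they bridge; sharp edges have high relative steepness.
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter, sobel

def edge_steepness(gray: np.ndarray, win: int = 7, edge_quantile: float = 0.95) -> float:
    """Mean steepness of the strongest edges; lower values indicate blurrier edges."""
    gray = gray.astype(np.float64)
    grad = np.hypot(sobel(gray, axis=0), sobel(gray, axis=1))
    local_range = maximum_filter(gray, size=win) - minimum_filter(gray, size=win)
    edges = grad > np.quantile(grad, edge_quantile)   # keep only the strongest edge pixels
    if not edges.any():
        return 0.0                                    # no edges found, e.g. a flat frame
    steepness = grad[edges] / (local_range[edges] + 1e-6)
    return float(steepness.mean())
```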

Books on the topic "Blurriness"

1

Cecchi, Alessandro. Forming Form through Force. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780199316090.003.0005.

Abstract:
This chapter identifies connections between the formal strategies used in both Bruckner’s and Mahler’s symphonies. Its point of departure is the ‘energetic’ theory of musical form developed by Ernst Kurth in his monograph on Bruckner (1925), particularly the idea of the ‘intensifying wave’. On that basis it confronts the formal strategies of the first movements of Bruckner’s Ninth and Mahler’s First symphonies, focusing on the relationship between the structural disposition of ‘intensification processes’ and the deliberate blurrings of traditional formal boundaries based on ‘sonata form’. In the mentioned symphonies of Mahler and Bruckner, composition emerges as a ‘force field’ where sonata form does have a role to play, provided it is viewed not as an abstract scheme but as a concrete spectrum of compositional choices in continuous interaction with other instances, particularly a structural principle based on the disposition of intensification processes and the reaching of climaxes.

Book chapters on the topic "Blurriness"

1

Savoldi, Antonio, and Paolo Gubian. "Blurriness in Live Forensics: An Introduction." In Advances in Information Security and Its Application, 119–26. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-02633-1_16.

2

De, K., and V. Masilamani. "Discrete Orthogonal Moments Based Framework for Assessing Blurriness of Camera Captured Document Images." In Proceedings of the 3rd International Symposium on Big Data and Cloud Computing Challenges (ISBCC – 16’), 227–36. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-30348-2_18.

3

Bhattacharya, Suman, Anirban Mitra, Moumita Chatterjee, Sudipta Roy, Saurabh Adhikari, and Soumya Sen. "Restoring of Fundus Retinal Image for Detection of Diabetic Retinopathy in Presence of Blurriness of Cataract." In Proceedings of 2nd International Conference on Mathematical Modeling and Computational Science, 231–39. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-0182-9_24.

4

Heister, Hanns-Werner. "The Principle of Blurring. Conscious, Artistically Produced Blurrings." In Music and Fuzzy Logic, 413–548. Berlin, Heidelberg: Springer Berlin Heidelberg, 2021. http://dx.doi.org/10.1007/978-3-662-62907-9_7.

5

Yang, Zhuorui, and Aura Ganz. "Egocentric Landmark-Based Indoor Guidance System for the Visually Impaired." In Computer Vision, 1483–99. IGI Global, 2018. http://dx.doi.org/10.4018/978-1-5225-5204-8.ch061.

6

Grochowski, Mateusz. "Effectiveness and EU consumer law: the blurriness in judicial dialogue." In The Practice of Judicial Interaction in the Field of Fundamental Rights, 235–56. Edward Elgar Publishing, 2022. http://dx.doi.org/10.4337/9781800371224.00025.

7

De Souza, Igor H. "Elenx de Céspedes." In Trans Historical, 42–67. Cornell University Press, 2021. http://dx.doi.org/10.7591/cornell/9781501759086.003.0003.

Abstract:
This chapter tackles indeterminate gender in the Spanish Inquisition through the case of Elenx de Céspedes, who had a husband and a wife in the course of their life. It combines Elenx's narratives on hermaphroditism and sodomy. Elenx claimed an identity as a hermaphrodite before the Inquisition. The Spanish Inquisition is infamous for its repression of non-Catholic practices. The criminalization of sodomy in the Inquisitorial context presupposed control over gender expression. On the other hand, literature dating to Spain's Golden Age often thematizes the conceptual blurriness of boundaries. The chapter notes scholars' doubts regarding Elenx's hermaphroditism.
8

Amos, Yukari Takimoto, and Nicole M. Kukar. "Teaching and Learning Simultaneously." In Handbook of Research on Teacher Education and Professional Development, 48–67. IGI Global, 2017. http://dx.doi.org/10.4018/978-1-5225-1067-3.ch003.

Abstract:
The purpose of this chapter is to describe a collaboration process between a teacher education program and a university ESL program that attempts to increase teacher candidates' exposure to ELLs, with “third space” as a theoretical framework. In third spaces, the boundaries of teacher and student get blurred, and new ways of thinking about teaching and learning emerge. In the collaboration project that this chapter describes, the two teacher candidates regularly volunteered in the university ESL classes and taught mini-lessons to the ELLs while taking a class about ELL teaching. The qualitative analysis of the participants indicates that in the collaboration project, a university-based class and a field-based class were in sync, providing the participants with opportunities to immediately implement what they learned in a traditional class with the ELLs. In this boundary blurriness, the ELLs went from abstract to concrete in the participants' minds, and the participants became reflective practitioners.
9

Amos, Yukari Takimoto, and Nicki Kukar. "Learning While Teaching." In Handbook of Research on Educator Preparation and Professional Learning, 100–130. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8583-1.ch007.

Abstract:
The purpose of this chapter is to describe a collaboration process between a teacher education program and a university ESL program that attempts to increase teacher candidate exposure to English learners (ELs), with “third space” as a theoretical framework. In third spaces, the boundaries of teacher and student get blurred, and new ways of thinking about teaching and learning emerge. In the collaboration project that this chapter describes, the three teacher candidates regularly volunteered in the university ESL classes and taught mini-lessons to the ELs while taking a class on EL teaching. The qualitative analysis of the participants indicates that in the collaboration project, a university-based class and a field-based class were in sync, providing the teacher candidates with opportunities to immediately implement what they learned in a traditional class with the ELs. In this boundary blurriness, the teacher candidates became owners of their own practitioner knowledge, rather than borrowers of existing academic knowledge.
10

"One Hundred Years of Low Definition." In Indefinite Visions, edited by Erika Balsom. Edinburgh University Press, 2017. http://dx.doi.org/10.3366/edinburgh/9781474407120.003.0005.

Abstract:
In the 1920s, filmmaker-theorists such as Germaine Dulac argued in favour of what today would be called low-definition images. Dulac, for instance, advocates leaving behind cinema’s ‘unquestionable accuracy’ to pursue blurriness and superimposition. In order to create a strong affective impact on the spectator, cinema would have to find ways to avoid simply copying physical reality. But it would also need to pursue this path to establish itself as an art: indefinite images are above all aligned with claims for specificity that seek to establish the autonomy of cinema vis-à-vis other art forms and physical reality alike. The notion that cinema might simply copy was thus rebutted by recourse to a very traditional definition of art, precisely as this definition was being challenged by the historical avant-gardes. Taking a bifocal perspective on two historical moments roughly a century apart, this chapter questions what relevance the 1920s’ endorsement of low definition might have for our time, when such images have become a notable fixture of filmic practices, from blockbusters to the avant-garde.

Conference papers on the topic "Blurriness"

1

Ye, Wei, and Kai-Kuang Ma. "Blurriness-guided unsharp masking." In 2017 IEEE International Conference on Image Processing (ICIP). IEEE, 2017. http://dx.doi.org/10.1109/icip.2017.8296987.

2

Chen, Chunhua, Wen Chen, and Jeffrey A. Bloom. "A universal reference-free blurriness measure." In IS&T/SPIE Electronic Imaging, edited by Susan P. Farnand and Frans Gaykema. SPIE, 2011. http://dx.doi.org/10.1117/12.872477.

3

Cherakhloo, Mahdi, Hatef Otroshi Shahreza, Mohammad Soltanian, Sina Ranjkeshzadeh, Hamid Azadegan, Alireza Shirkhodaee, Arash Amini, and Hamid Behroozi. "Measuring Image Blooming using Blurriness Metrics." In 2020 10th International Symposium on Telecommunications (IST). IEEE, 2020. http://dx.doi.org/10.1109/ist50524.2020.9345839.

4

Peng, Yan-Tsung, Yu-Cheng Lin, and Wen-Yi Peng. "Blurriness Guided Underwater Salient Object Detection." In OCEANS 2021: San Diego – Porto. IEEE, 2021. http://dx.doi.org/10.23919/oceans44145.2021.9705721.

5

Zeng, Yi-Chong, Shih-Jui Yang, and Wen-Tsung Chang. "Integer-based pixel blurriness measure and its applications." In 2014 IEEE 3rd Global Conference on Consumer Electronics (GCCE). IEEE, 2014. http://dx.doi.org/10.1109/gcce.2014.7031206.

6

Tahir, Qadri Muhammad, Mehmood S. Noman, and Aisha Tahir. "Blurriness measurement in frequency domain for image quality assessment." In 2011 International Conference on Graphic and Image Processing. SPIE, 2011. http://dx.doi.org/10.1117/12.914629.

7

Dardi, Francesca, Leonardo Abate, and Giovanni Ramponi. "A set of features for measuring blurriness in video frames." In Melecon 2010 - 2010 15th IEEE Mediterranean Electrotechnical Conference. IEEE, 2010. http://dx.doi.org/10.1109/melcon.2010.5476300.

8

Peng, Yan-Tsung, Xiangyun Zhao, and Pamela C. Cosman. "Single underwater image enhancement using depth estimation based on blurriness." In 2015 IEEE International Conference on Image Processing (ICIP). IEEE, 2015. http://dx.doi.org/10.1109/icip.2015.7351749.

9

Yeh, Chun-Hsiao, and Herng-Hua Chang. "Face liveness detection with feature discrimination between sharpness and blurriness." In 2017 Fifteenth IAPR International Conference on Machine Vision Applications (MVA). IEEE, 2017. http://dx.doi.org/10.23919/mva.2017.7986885.

10

Zeng, Yi-Chong. "A CNN-based Blurriness Measure Method for Focused Object Segmentation." In 2021 IEEE 10th Global Conference on Consumer Electronics (GCCE). IEEE, 2021. http://dx.doi.org/10.1109/gcce53005.2021.9621786.

