Ready-made bibliography on the topic "Near-infrared camera"

Create a correct reference in APA, MLA, Chicago, Harvard, and many other styles


See the lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Near-infrared camera".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically create a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a scholarly publication in .pdf format and read its abstract online, if the relevant parameters are available in the metadata.

Journal articles on the topic "Near-infrared camera"

1

Nadeau, Daniel, David C. Murphy, Rene Doyon, and Neil Rowlands. "The Montreal near-infrared camera". Publications of the Astronomical Society of the Pacific 106 (August 1994): 909. http://dx.doi.org/10.1086/133458.

2

Lisi, F., C. Baffa, V. Bilotti, D. Bonaccini, C. del Vecchio, S. Gennari, L. K. Hunt, G. Marcucci, and R. Stanga. "ARNICA, the Arcetri Near-Infrared Camera". Publications of the Astronomical Society of the Pacific 108 (April 1996): 364. http://dx.doi.org/10.1086/133731.

3

Hoegner, L., A. Hanel, M. Weinmann, B. Jutzi, S. Hinz, and U. Stilla. "Towards people detection from fused time-of-flight and thermal infrared images". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XL-3 (11 August 2014): 121–26. http://dx.doi.org/10.5194/isprsarchives-xl-3-121-2014.

Abstract:
Obtaining accurate 3d descriptions in the thermal infrared (TIR) is a quite challenging task due to the low geometric resolutions of TIR cameras and the low number of strong features in TIR images. Combining the radiometric information of the thermal infrared with 3d data from another sensor is able to overcome most of the limitations in the 3d geometric accuracy. In case of dynamic scenes with moving objects or a moving sensor system, a combination with RGB cameras or Time-of-Flight (TOF) cameras is suitable. As a TOF camera is an active sensor in the near infrared (NIR) and the thermal infrared camera captures the radiation emitted by the objects in the observed scene, the combination of these two sensors for close range applications is independent from external illumination or textures in the scene. This article is focused on the fusion of data acquired both with a time-of-flight (TOF) camera and a thermal infrared (TIR) camera. As the radiometric behaviour of many objects differs between the near infrared used by the TOF camera and the thermal infrared spectrum, a direct co-registration with feature points in both intensity images leads to a high number of outliers. A fully automatic workflow of the geometric calibration of both cameras and the relative orientation of the camera system with one calibration pattern usable for both spectral bands is presented. Based on the relative orientation, a fusion of the TOF depth image and the TIR image is used for scene segmentation and people detection. An adaptive histogram based depth level segmentation of the 3d point cloud is combined with a thermal intensity based segmentation. The feasibility of the proposed method is demonstrated in an experimental setup with different geometric and radiometric influences that show the benefit of the combination of TOF intensity and depth images and thermal infrared images.
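As a rough illustration of the segmentation step described in this abstract (not the authors' implementation; the array names, bin count, and temperature threshold below are hypothetical), a histogram-based depth-level mask can be combined with a thermal-intensity mask roughly as follows:

# Illustrative sketch only: combine a histogram-based depth-level segmentation
# of a TOF depth image with a simple thermal-intensity threshold, assuming the
# two images are already co-registered (names and thresholds are hypothetical).
import numpy as np

def segment_people(depth, thermal, n_bins=64, min_temp=305.0):
    """Label pixels that fall in the most populated depth level AND appear warm."""
    hist, edges = np.histogram(depth[depth > 0], bins=n_bins)
    peak = np.argmax(hist)                      # most populated depth level
    lo, hi = edges[peak], edges[peak + 1]
    depth_mask = (depth >= lo) & (depth < hi)   # pixels at that depth level
    thermal_mask = thermal > min_temp           # warm pixels (e.g. people)
    return depth_mask & thermal_mask

# Example with synthetic data
depth = np.random.uniform(0.5, 5.0, (120, 160))
thermal = np.random.uniform(295.0, 310.0, (120, 160))
mask = segment_people(depth, thermal)
print(mask.sum(), "candidate pixels")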
4

von Bueren, S. K., A. Burkart, A. Hueni, U. Rascher, M. P. Tuohy, and I. J. Yule. "Deploying four optical UAV-based sensors over grassland: challenges and limitations". Biogeosciences 12, no. 1 (9 January 2015): 163–75. http://dx.doi.org/10.5194/bg-12-163-2015.

Abstract:
Unmanned aerial vehicles (UAVs) equipped with lightweight spectral sensors facilitate non-destructive, near-real-time vegetation analysis. In order to guarantee robust scientific analysis, data acquisition protocols and processing methodologies need to be developed and new sensors must be compared with state-of-the-art instruments. Four different types of optical UAV-based sensors (RGB camera, converted near-infrared camera, six-band multispectral camera and high spectral resolution spectrometer) were deployed and compared in order to evaluate their applicability for vegetation monitoring with a focus on precision agricultural applications. Data were collected in New Zealand over ryegrass pastures of various conditions and compared to ground spectral measurements. The UAV STS spectrometer and the multispectral camera MCA6 (Multiple Camera Array) were found to deliver spectral data that can match the spectral measurements of an ASD at ground level when compared over all waypoints (UAV STS: R² = 0.98; MCA6: R² = 0.92). Variability was highest in the near-infrared bands for both sensors while the six-band multispectral camera also overestimated the green peak reflectance. Reflectance factors derived from the RGB (R² = 0.63) and converted near-infrared (R² = 0.65) cameras resulted in lower accordance with reference measurements. The UAV spectrometer system is capable of providing narrow-band information for crop and pasture management. The six-band multispectral camera has the potential to be deployed to target specific broad wavebands if shortcomings in radiometric limitations can be addressed. Large-scale imaging of pasture variability can be achieved by either using a true colour or a modified near-infrared camera. Data quality from UAV-based sensors can only be assured if field protocols are followed and environmental conditions allow for stable platform behaviour and illumination.
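For context, the R² figures quoted above are coefficients of determination between UAV-derived and ground (ASD) reflectance; a minimal sketch of such a comparison, using invented numbers rather than the study's data, might be:

# Illustrative sketch: coefficient of determination (R^2) between UAV-derived
# and ground (ASD) reflectance factors over a set of waypoints; data are synthetic.
import numpy as np

def r_squared(y_ref, y_est):
    ss_res = np.sum((y_ref - y_est) ** 2)
    ss_tot = np.sum((y_ref - np.mean(y_ref)) ** 2)
    return 1.0 - ss_res / ss_tot

asd = np.array([0.05, 0.12, 0.34, 0.41, 0.46])   # ground reference reflectance
uav = np.array([0.06, 0.11, 0.33, 0.44, 0.45])   # UAV sensor reflectance
print(round(r_squared(asd, uav), 3))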
5

Ye, Haishui, Zhishan Gao, Zhenyu Qin, and Qianwen Wang. "Near-infrared fundus camera based on polarization switch in stray light elimination". Chinese Optics Letters 11, no. 3 (2013): 031702–31705. http://dx.doi.org/10.3788/col201311.031702.

6

Choi, Janghoon, Jun-Geun Shin, Yoon-Oh Tak, Youngseok Seo, and Jonghyun Eom. "Single Camera-Based Dual-Channel Near-Infrared Fluorescence Imaging System". Sensors 22, no. 24 (13 December 2022): 9758. http://dx.doi.org/10.3390/s22249758.

Abstract:
In this study, we propose a single camera-based dual-channel near-infrared (NIR) fluorescence imaging system that produces color and dual-channel NIR fluorescence images in real time. To simultaneously acquire color and dual-channel NIR fluorescence images of two fluorescent agents, three cameras and additional optical parts are generally used. As a result, the volume of the image acquisition unit increases, interfering with movements during surgical procedures and increasing production costs. In the system herein proposed, instead of using three cameras, we set a single camera equipped with two image sensors that can simultaneously acquire color and single-channel NIR fluorescence images, thus reducing the volume of the image acquisition unit. The single-channel NIR fluorescence images were time-divided into two channels by synchronizing the camera and two excitation lasers, and the noise caused by the crosstalk effect between the two fluorescent agents was removed through image processing. To evaluate the performance of the system, experiments were conducted for the two fluorescent agents to measure the sensitivity, crosstalk effect, and signal-to-background ratio. The compactness of the resulting image acquisition unit alleviates the inconvenient movement obstruction of previous devices during clinical and animal surgery and reduces the complexity and costs of the manufacturing process, which may facilitate the dissemination of this type of system.
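The time-division scheme summarized here can be pictured with a small sketch; the interleaving pattern and crosstalk coefficient are assumptions for illustration only, not the published system's parameters:

# Illustrative sketch: demultiplex an interleaved NIR frame stream into two
# fluorescence channels (laser 1 active on even frames, laser 2 on odd frames)
# and subtract a fixed crosstalk fraction; all values are hypothetical.
import numpy as np

def demultiplex(frames, crosstalk=0.1):
    ch1_raw = frames[0::2].astype(float)   # frames captured with laser 1 active
    ch2_raw = frames[1::2].astype(float)   # frames captured with laser 2 active
    n = min(len(ch1_raw), len(ch2_raw))
    ch1 = np.clip(ch1_raw[:n] - crosstalk * ch2_raw[:n], 0, None)
    ch2 = np.clip(ch2_raw[:n] - crosstalk * ch1_raw[:n], 0, None)
    return ch1, ch2

frames = np.random.randint(0, 4096, size=(10, 64, 64))   # synthetic 12-bit NIR frames
ch1, ch2 = demultiplex(frames)
print(ch1.shape, ch2.shape)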
7

Baffa, C., G. Comoretto, S. Gennari, F. Lisi, E. Oliva, V. Biliotti, A. Checcucci, et al. "NICS: The TNG Near Infrared Camera Spectrometer". Astronomy & Astrophysics 378, no. 2 (November 2001): 722–28. http://dx.doi.org/10.1051/0004-6361:20011194.

8

Iwai, Yutaka. "ENG VTR combo near infrared camera system". Journal of the Institute of Television Engineers of Japan 43, no. 7 (1989): 731–32. http://dx.doi.org/10.3169/itej1978.43.731.

9

Yue, Wei, Li Jiang, Xiubin Yang, Suining Gao, Yunqiang Xie, and Tingting Xu. "Optical Design of a Common-Aperture Camera for Infrared Guided Polarization Imaging". Remote Sensing 14, no. 7 (28 March 2022): 1620. http://dx.doi.org/10.3390/rs14071620.

Abstract:
Polarization and infrared imaging technology have unique advantages for various applications ranging from biology to ocean remote sensing. However, conventional combined polarization camera and infrared camera have limitations because they are constrained to single-band imaging systems with rotating polarizers and cascaded optics. Therefore, we propose a common-aperture mode based on multi-band infrared guided polarization imaging system (IGPIS) in this paper, which consists of infrared wide-area sensing and polarization features acquisition for accurate detection of ship targets. The IGPIS can provide images in visible polarization (0.45–0.76 μm), near-infrared polarization (0.76–0.9 μm), and long-wave infrared (8–12 μm) bands. Satellite attitude parameters and camera optical parameters are accurately calculated by establishing a dynamic imaging model for guidance imaging. We illustrate the imaging principle, sensors specifications and imaging performance analysis and the experimental results show that the MTF is 0.24 for visible and near-infrared, and 0.13 for long-wave infrared. The obtained multi-band images have an average gradient of 12.77 after accurate fusion. These results provide theoretical guidance for the design of common-aperture cameras in remote sensing imaging field.
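The "average gradient" quoted as a fusion-quality figure is a standard sharpness metric; a small sketch of how such a metric is typically computed (on synthetic data, not the paper's imagery) could be:

# Illustrative sketch: average gradient of an image, a common sharpness/quality
# metric for fused remote-sensing imagery; the input here is synthetic.
import numpy as np

def average_gradient(img):
    img = img.astype(float)
    gx = np.diff(img, axis=1)[:-1, :]    # horizontal differences, trimmed to match
    gy = np.diff(img, axis=0)[:, :-1]    # vertical differences, trimmed to match
    return np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0))

fused = np.random.randint(0, 256, size=(256, 256))
print(round(average_gradient(fused), 2))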
10

Kang, You Sun, and Duk Shin. "Multiband Camera System Using Color and Near Infrared Images". Applied Mechanics and Materials 446-447 (November 2013): 922–26. http://dx.doi.org/10.4028/www.scientific.net/amm.446-447.922.

Abstract:
Various applications using a camera system have been developed and deployed commercially to improve our daily life. The performance of camera system is mainly dependent on image quality and illumination conditions. Multiband camera has been developed to provide a wealth of information for image acquisition. In this paper, we developed two applications about image segmentation and face detection using a multiband camera, which is available in four bands consisting of a near infrared and three color bands. We proposed a multiband camera system to utilize two different images i.e. color image extracted from Bayer filter and near infrared images. The experimental results showed the effectiveness of the proposed system.

Dissertations and theses on the topic "Near-infrared camera"

1

Molina, Keith M. (Keith Martin). "A battery powered near infrared (NIR) camera for the MIT HelioDome". Thesis, Massachusetts Institute of Technology, 2008. http://hdl.handle.net/1721.1/45325.

Abstract:
Thesis (S.B.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2008.
Includes bibliographical references (p. 99).
Research in advanced fenestration systems has led to the development of the Heliodome project at the Massachusetts Institute of Technology Daylighting Laboratory. The MIT Heliodome project is dedicated to goniophotometry and the assessment of bidirectional photometric properties of light- (or heat-)redirecting facade systems by using digital cameras as multiple-point photosensors that cover the visible and near infrared portions of the sunlight spectrum. Two cameras are used in this device: a charge-coupled device (CCD) camera using silicon detectors and a near infrared (NIR) camera using InGaAs sensors. Both cameras are mounted to a table which has two degrees of rotational freedom, altitude and azimuth. Using the rotating table and cameras in combination with an ellipsoidal dome, which is coated with a semi-transparent specular coating, allows a time-efficient and continuous measurement of bidirectional transmission (or reflection) density functions (BTDFs or BRDFs). This thesis seeks to enhance current Heliodome operations by developing a portable power source for the NIR camera. A portable power system has been designed and constructed to operate the NIR camera during measurement sessions. The portable power system allows the rotating table to rotate completely free of constraints caused by weight, imbalance, power and light path obstruction issues. This contribution to the Heliodome project provides the user with more reliable data and relief from disorderly setup.
by Keith M. Molina.
S.B.
2

Stevenson, Brady Roos. "Analysis of Near-Infrared Phase Effects on Biometric Iris Data". BYU ScholarsArchive, 2006. https://scholarsarchive.byu.edu/etd/1299.

Abstract:
The purpose of this research is to ascertain potential iris scan data variations from near infrared waves derived from fluorescent illumination. Prior studies of iris data variances from infrared wave interference of halogen, incandescent, and sunlight with iris cameras suggest that similar changes may exist under near infrared wavelengths from fluorescent light. The concern is that the fluorescent energy emission may interfere with the near infrared detection of an iris camera. An iris camera is used to measure human eye characteristics known as biometrics. If such infrared emission is statistically significant, then it can alter the validity of the iris scan data. The experiment utilized nine hundred forty-five (945) scans from sixty-three (63) subjects. Measured results showed increased heat from ambient fluorescent illumination does not statistically alter the biometric readings of human eyes. The test results fail to reject that data loss will not occur as heat is increased in the ambient fluorescent light source.
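As an illustration of this kind of significance testing (not necessarily the author's exact procedure), paired iris-scan scores under two illumination conditions could be compared as follows, with synthetic data standing in for the 63 subjects:

# Illustrative sketch: paired t-test on hypothetical iris-scan scores captured
# under low and high ambient fluorescent illumination; the data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
baseline = rng.normal(0.32, 0.02, size=63)        # e.g. match scores at low ambient heat
heated = baseline + rng.normal(0.0, 0.005, 63)    # same subjects, increased ambient heat

t_stat, p_value = stats.ttest_rel(baseline, heated)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")     # p > 0.05 -> no significant change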
3

Langri, Dharminder Singh. "Monitoring Cerebral Functional Response using sCMOS-based High Density Near Infrared Spectroscopic Imaging". Wright State University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=wright1558610822306817.

4

Riseby, Emil, i Alexander Svensson. "Multispectral Imaging for Surveillance Applications". Thesis, Linköpings universitet, Medie- och Informationsteknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-115731.

Abstract:
Silicon-based sensors are a commonly used technology in digital cameras today. That has made such cameras relatively cheap and widely used. Unfortunately, they are constructed to capture and represent image quality for humans. Several image applications work better without the restrictions of the visible spectrum. Human visual restrictions are often indirectly put on technology by using images showing only visible light. Thinking outside the box in this case is seeing beyond the visible spectrum.
5

Ekelund, Jonah. "Calibration and evaluation of the secondary sensors for the Mini-EUSO space instrument". Thesis, Luleå tekniska universitet, Rymdteknik, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-67118.

Abstract:
The Mini-EUSO (Mini - Extreme Universe Space Observatory) is an instrument for observation of ultra-high energy cosmic rays (UHECR) from space. It is designed to observe Earth from the International Space Station (ISS) in the ultra-violet (UV), visible (VIS) and near-infrared (NIR) light ranges. The UV sensor is the main sensor, designed and built by the EUSO collaboration. The visible and near-infrared sensors are secondary sensors. These are two cameras, FMVU-13S2C-CS and CMLN-13S2M-CV, from Point Grey Research Inc. The near-infrared light camera has a phosphor coating on the sensor to convert from near-infrared light to visible light, which is detectable by the camera's CCD. This thesis deals with the calibration and evaluation of the secondary sensors. This is done by first evaluating the bias and dark current for both cameras, after which a calibration is done using the light measurement sphere located at the National Institute of Polar Research (NIPR) in Midori-cho, Tachikawa-shi, Japan. Due to the low sensitivity of the near-infrared light camera, an evaluation of its ability to see celestial objects is also performed. It is found that the visible light camera has a high bias with values around 5 ADU (Analog-to-Digital unit), but almost non-existent dark current, with mean values below 1 ADU. The visible light camera has good sensitivity for all the colors: red, green and blue. However, it is most sensitive to green. Due to this, it is easy to saturate the pixels with too much light. Therefore, saturation intensity was also examined for the shutter times of the visible light camera. This is found to be between 900 μW m^-2 sr^-1 and 1·10^7 μW m^-2 sr^-1, depending on color and shutter time. The near-infrared light camera is the opposite; it has a low bias with values below 1 ADU and a high dark current. The values of the dark current for the near-infrared light camera are highly dependent on the temperature of the camera. Mean values are below 1 ADU for temperatures around 310 K, but mean values of almost 2 ADU at temperatures around 338 K. The sensitivity of the near-infrared light camera is very low; therefore, the only way to detect a difference between the light levels of the light measurement sphere was to use a high ADC amplification gain. With this it was found that there is a power-law behavior, with exponents between 1.33 and 1.50, in the relationship between pixel values and light intensity. This is likely due to the phosphor coating used to convert to visible light. When trying to detect celestial objects, the faintest object detected was Venus with a magnitude of less than -4.
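The reported power-law relationship between pixel value and light intensity (exponents between 1.33 and 1.50) corresponds to the kind of fit sketched below; the sample values are invented for illustration, not taken from the thesis:

# Illustrative sketch: fit a power law (pixel_value = a * intensity^k) by linear
# regression in log-log space; the intensity/pixel pairs below are synthetic.
import numpy as np

intensity = np.array([1.0, 2.0, 4.0, 8.0, 16.0])   # relative sphere intensity
pixel = np.array([2.1, 5.5, 14.9, 39.0, 101.0])    # mean pixel value (ADU)

k, log_a = np.polyfit(np.log(intensity), np.log(pixel), 1)
print(f"power-law exponent k = {k:.2f}, scale a = {np.exp(log_a):.2f}")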
6

Benselfelt, Tobias. "Flow Cytometry Sensor System Targeting Escherichia Coli as an Indicator of Faecal Contamination of Water Sources". Thesis, Linköpings universitet, Teknisk biologi, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-108004.

Abstract:
Poor water quality is a global health concern affecting one billion people around the world. It is important to monitor water sources in order to maintain the quality of our drinking water and to avoid disease outbreaks. Targeting Escherichia coli as a faecal indicator is a widely used procedure, but the current methods are time consuming and not adequate to prevent spreading of faecal influence. This Master thesis demonstrates the development of a near infrared fluorescence flow cytometer sensor system targeting Escherichia coli, using fluorescently labeled chicken IgY antibodies. The near infrared light was chosen to avoid fluorescence from blue-green algae that are present in the water source. The hardware was developed with a 785 nm laser line to detect Alexa Fluor 790 labeled antibodies, using a photomultiplier tube or two different CMOS cameras. The antibodies were labeled using a commercial labeling kit, and evaluated using antibody binding assays and the developed hardware. The IgY antibodies were successfully labeled with Alexa Fluor 790 and their function was maintained after the labeling process. The result demonstrates the principles of the sensor system and how it solves the problem of fluorescence from blue-green algae. An aperture was used to overcome the suboptimal laser and filter setup, and to increase the sensitivity of the system. However, only a small fraction of the cells could be detected, due to challenges with the focal depth and loss of sensitivity in the photomultiplier tube at near infrared wavelengths. Further development is required to create a working product.
7

Song, Yan-Bin, and 宋焱檳. "Research on Near-infrared-based Ego-positioning Algorithm with Wide-angle Camera". Thesis, 2018. http://ndltd.ncl.edu.tw/handle/gn6gdf.

Abstract:
Master's thesis
National Taiwan University
Graduate Institute of Networking and Multimedia
106
Simultaneous Localization and Mapping (SLAM) is a generic method used to solve robot ego-positioning problems. At present, popular SLAM methods are mainly divided into direct methods and feature-based methods. The direct method is more sensitive to brightness information, while the feature-based method has demonstrated more tolerance toward changes in brightness. In a nighttime environment, information from a normal camera becomes difficult to identify. Therefore, we propose to use near-infrared (NIR) cameras for the ego-positioning of night robots. Images obtained under NIR light have different light intensities depending on their distance from the infrared light sources, so direct SLAM methods are not suitable. We use a feature-based method, which is tolerant of varying light intensities, for ego-positioning. This method combines ego-positioning with multiple wide-angle NIR cameras to not only solve the problem of capturing feature points under low-light conditions, but also obtain more accurate ego-positioning results. This allows the robot to perform SLAM even at night in low-illumination indoor scenarios. As a result, the system produces more robust results in different challenging situations.
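A feature-based front end of the kind referred to in this abstract typically relies on brightness-tolerant descriptors; a minimal OpenCV sketch (not the thesis pipeline, using synthetic stand-in frames) might look like this:

# Illustrative sketch: ORB feature detection and matching between two NIR frames,
# the kind of brightness-tolerant front end a feature-based SLAM system can use.
# The frames here are synthetic stand-ins, not data from the thesis.
import cv2
import numpy as np

rng = np.random.default_rng(0)
img1 = (rng.random((240, 320)) * 255).astype(np.uint8)  # stand-in NIR frame
img2 = np.roll(img1, 5, axis=1)                          # shifted copy as the "next" frame

orb = cv2.ORB_create(nfeatures=500)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(len(matches), "matches found")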
8

Lin, Chao-chun, and 林昭俊. "Research that near infrared ray diffraction technology in the cell-phone camera focus adjustable and image evaluation". Thesis, 2007. http://ndltd.ncl.edu.tw/handle/32835138138249410622.

Abstract:
Master's thesis
Feng Chia University
In-service Master's Program of Information and Electrical Engineering
95
This paper demonstrates a new optical technique in which we apply moiré pattern theory to adjust the focal length of a cell-phone camera. To increase production capacity and reduce the production cost of cell-phone cameras, we develop a new adjustable-focus system using this technique. This method can test the resolution of cell-phone cameras reaching 1.3 million pixels.
9

Mudunuri, Sivaram Prasad. "Face Recognition in Unconstrained Environment". Thesis, 2019. https://etd.iisc.ac.in/handle/2005/5113.

Abstract:
The goal of computer vision is to provide the ability to machines to understand image data and infer the useful information from it. The inferences highly depend on the quality of the image data. But in many real-world applications, we encounter poor quality images which have low discriminative power which affects the performance of computer vision algorithms. In particular, in the field of Biometrics, the performance of face recognition systems is significantly affected when the face images have poor resolution and are captured under uncontrolled pose and illumination conditions as in surveillance settings. In this thesis, we propose algorithms to match the low-resolution probe images captured under non-frontal pose and poor illumination conditions with the high-resolution gallery faces captured in frontal pose and good illuminations which are often available during enrollment. Many of the standard metric learning and dictionary learning approaches perform quite well in matching faces across different domains but they require the locations of several landmark points like corners of eyes, nose and mouth etc. both during training and testing. This is a difficult task especially for low-resolution images under non-frontal pose. In the first algorithm of this thesis, we propose a multi-dimensional scaling based approach to learn a common transformation matrix for the entire face which simultaneously transforms the facial features of the low-resolution and the high-resolution training images such that the distance between them approximates the distance had both the images been captured under the same controlled imaging conditions. It is only during the training stage that we need locations of different fiducial points to learn the transformation matrix. To overcome the computational complexity of the algorithm, we further proposed a reference-based face recognition approach with a trade-off on recognition performance. In our second approach in this thesis, we propose a novel deep convolutional neural network architecture to address the low-resolution face recognition by systematically introducing different kinds of constraints at different layers of the architecture so that the approach can recognize low-resolution images as well as generalize well to images of unseen categories. Though coupled dictionary learning has emerged as a powerful technique for matching data samples of cross domains, most of the frameworks demand one-to-one paired training samples. In practical surveillance face recognition problems, there can be just one high-resolution image and many low resolution images of each subject for training in which there is no exact one-to-one correspondence in the images from two domains. The third algorithm proposes an orthogonal dictionary learning and alignment approach for handling this problem. In this part, we also address the heterogeneous face recognition problem where the gallery images are captured from an RGB camera and the probe images are captured from a near-infrared (NIR) camera. We further explored the more challenging problem of low-resolution heterogeneous face recognition where the probe faces are low-resolution NIR images since recently, NIR images are increasingly being captured for recognizing faces in low-light/night-time conditions. We developed a re-ranking framework to address the problem.
To further encourage the research in this field, we have also collected our own database HPR (Heterogeneous face recognition across Pose and Resolution), which has facial images captured from two surveillance-quality NIR cameras and one high-resolution visible camera, with significant variations in head pose and resolution. Extensive related experiments are conducted on each of the proposed approaches to demonstrate their effectiveness and usefulness.
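As a loose illustration of learning a cross-resolution feature mapping (a deliberate simplification of the multidimensional-scaling formulation described above, not the thesis method), one could fit a linear transform that brings low-resolution features close to their paired high-resolution counterparts:

# Illustrative sketch: learn a linear transform W that maps low-resolution face
# features toward their paired high-resolution features via least squares.
# This is a simplification for illustration, not the thesis's MDS formulation.
import numpy as np

rng = np.random.default_rng(1)
hr_feats = rng.normal(size=(200, 64))                 # paired training features (high resolution)
lr_feats = hr_feats @ rng.normal(size=(64, 64)) * 0.5 + rng.normal(scale=0.1, size=(200, 64))

W, *_ = np.linalg.lstsq(lr_feats, hr_feats, rcond=None)   # solve LR @ W ≈ HR
probe_lr = lr_feats[:5]
projected = probe_lr @ W                              # map probes into the HR feature space
print(np.mean((projected - hr_feats[:5]) ** 2))       # small residual on training samples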

Books on the topic "Near-infrared camera"

1

United States. National Aeronautics and Space Administration, ed. The Hubble Space Telescope second servicing mission (SM-2): Near infrared camera and multi-object spectrometer (NICMOS). [Washington, D.C.?]: National Aeronautics and Space Administration, 1996.

2

Space Telescope Science Institute (U.S.), ed. The 2002 HST calibration workshop: Space Telescope Science Institute, Baltimore, Maryland, October 17 and 18, 2002 : abstracts. Baltimore, Md.: Space Telescope Science Institute, 2002.

3

HST Calibration Workshop (4th 2002 Baltimore, Md.). The 2002 HST calibration workshop: Hubble after the installation of the ACS and the NICMOS cooling system : proceedings of a workshop held at the Space Telescope Science Institute, Baltimore, Maryland, October 17 and 18, 2002. Baltimore, MD: Published and distributed by the Space Telescope Science Institute, 2003.

4

Matthews, K., and United States. National Aeronautics and Space Administration, eds. The first diffraction-limited images from the W.M. Keck Telescope. [Washington, D.C.]: National Aeronautics and Space Administration, 1996.


Book chapters on the topic "Near-infrared camera"

1

Thompson, R. "NICMOS: Near Infrared Camera and Multi-Object Spectrometer". In Next Generation Infrared Space Observatory, 69–93. Dordrecht: Springer Netherlands, 1992. http://dx.doi.org/10.1007/978-94-011-2680-9_8.

2

Matthews, K., and B. T. Soifer. "The Near Infrared Camera on the W.M. Keck Telescope". In Infrared Astronomy with Arrays, 239–46. Dordrecht: Springer Netherlands, 1994. http://dx.doi.org/10.1007/978-94-011-1070-9_76.

3

Ueno, Munetaka, Takashi Ichikawa, Shuji Sato, Yasumasa Kasaba, and Masanao Ito. "Near Infrared Survey of the Galactic Center with 512 × 512 PtSi Camera". In Infrared Astronomy with Arrays, 119–20. Dordrecht: Springer Netherlands, 1994. http://dx.doi.org/10.1007/978-94-011-1070-9_33.

4

Sun, Jinghao. "The Near Infrared High Resolution Imaging Camera Program of Beijing Observatory". In Infrared and Submillimeter Space Missions in the Coming Decade, 171–73. Dordrecht: Springer Netherlands, 1995. http://dx.doi.org/10.1007/978-94-011-0363-3_22.

5

Zhang, Nana, Jun Huang, and Hui Zhang. "Interactive Face Liveness Detection Based on OpenVINO and Near Infrared Camera". In Communications in Computer and Information Science, 79–90. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-3341-9_7.

6

Gao, Renwu, Siting Zheng, Jia He, and Linlin Shen. "CycleGAN-Based Image Translation for Near-Infrared Camera-Trap Image Recognition". In Pattern Recognition and Artificial Intelligence, 453–64. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-59830-3_39.

7

Straubmeier, C., T. Bertram, A. Eckart, and T. Herbst. "LINC-NIRVANA — The Interferometric Near-Infrared Imaging Camera for the Large Binocular Telescope". In Springer Proceedings in Physics, 317–21. Berlin, Heidelberg: Springer Berlin Heidelberg, 1997. http://dx.doi.org/10.1007/978-3-642-18902-9_58.

8

Dellarre, Anthony, Maxime Limousin, and Nicolas Beraud. "Melt Pool Acquisition Using Near-Infrared Camera in Aluminum Wire Arc Additive Manufacturing". In Advances on Mechanics, Design Engineering and Manufacturing IV, 803–14. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-15928-2_70.

9

Ichikawa, Takashi, Yuka Katsuno, Ryuji Suzuki, Chihiro Tokoku, and Tetsuo Nishimura. "Performance of HAWAII-2 FPA for Subaru Multi-Object Near-Infrared Camera and Spectrograph". In Scientific Detectors for Astronomy, 529–34. Dordrecht: Springer Netherlands, 2004. http://dx.doi.org/10.1007/1-4020-2527-0_72.

10

Oonk, Rodney L. "Design of the Solid Cryogen Dewar for the Near-Infrared Camera and Multi-Object Spectrometer". In Advances in Cryogenic Engineering, 1385–91. Boston, MA: Springer US, 1992. http://dx.doi.org/10.1007/978-1-4615-3368-9_72.


Conference papers on the topic "Near-infrared camera"

1

Cromwell, Brian, Robert Wilson, and Robert Johnson. "Ultra high-speed near-infrared camera". In Defense and Security, edited by Bjorn F. Andresen and Gabor F. Fulop. SPIE, 2005. http://dx.doi.org/10.1117/12.603583.

2

Cha, Sang-Mok, Ho Jin, In-Soo Yuk, Sungho Lee, Uk-Won Nam, Bongkon Moon, Seungwon Mock, et al. "KASINICS: KASI Near-Infrared Camera System". In SPIE Astronomical Telescopes + Instrumentation, edited by Ian S. McLean and Masanori Iye. SPIE, 2006. http://dx.doi.org/10.1117/12.669903.

3

Younus, Othman Isam, Eleni Niarchou, Shivani Rajendra Teli, Zabih Ghassemlooy, Stanislav Zvanovec, and Hoa Le Minh. "Near-Infrared based Optical Camera Communications". In 2022 13th International Symposium on Communication Systems, Networks and Digital Signal Processing (CSNDSP). IEEE, 2022. http://dx.doi.org/10.1109/csndsp54353.2022.9907899.

4

Spillar, Earl J., Paul E. Johnson, Michael Wenz, and David W. Warren. "Wyoming prime-focus near-infrared camera". In Astronomy '90, Tucson AZ, 11-16 Feb 90, edited by David L. Crawford. SPIE, 1990. http://dx.doi.org/10.1117/12.19072.

5

Salas, Luis, Leonel Gutierrez, Mario Tapia, Irene Cruz-Gonzales, Elfego Ruiz Schneider, Esteban Luna-Aguilar, Jorge Valdez, et al. "Dual infrared camera for near and mid infrared observations". In Astronomical Telescopes and Instrumentation, edited by Masanori Iye and Alan F. M. Moorwood. SPIE, 2003. http://dx.doi.org/10.1117/12.461962.

6

Priest, Robert E., Isabella T. Lewis, Noel R. Sewall, Hye-Sook Park, Michael J. Shannon, Arno G. Ledebuhr, Lyn D. Pleasance, Mark A. Massie, and Karen Metschuleit. "Near-infrared camera for the Clementine mission". In SPIE's 1995 Symposium on OE/Aerospace Sensing and Dual Use Photonics, edited by Albert M. Fowler. SPIE, 1995. http://dx.doi.org/10.1117/12.211287.

7

Thompson, Rodger I. "Near-infrared camera and multiobject spectrometer (NICMOS): the near-infrared space mission on HST". In Garmisch - DL tentative, edited by Guy Cerutti-Maori and Philippe Roussel. SPIE, 1994. http://dx.doi.org/10.1117/12.185265.

8

Vanegas, Morris D., Stefan Carp, and Qianqian Fang. "Mobile Phone Camera Based Near-Infrared Spectroscopy Measurements". In Clinical and Translational Biophotonics. Washington, D.C.: OSA, 2018. http://dx.doi.org/10.1364/translational.2018.jtu3a.64.

9

Nagayama, Takahiro, Chie Nagashima, Yasushi Nakajima, Tetsuya Nagata, Shuji Sato, Hidehiko Nakaya, Tomoyasu Yamamuro, Koji Sugitani, and Motohide Tamura. "SIRIUS: a near infrared simultaneous three-band camera". In Astronomical Telescopes and Instrumentation, edited by Masanori Iye and Alan F. M. Moorwood. SPIE, 2003. http://dx.doi.org/10.1117/12.460770.

10

Nordt, Alison, and Derek Edinger. "Optical bench assembly for the near-infrared camera". In Optics & Photonics 2005, edited by James B. Heaney and Lawrence G. Burriesci. SPIE, 2005. http://dx.doi.org/10.1117/12.613797.


Reports on the topic "Near-infrared camera"

1

Lyons, B. C., S. J. Zweben, T. K. Gray, J. Hosea, R. Kaita, H. W. Kugel, R. J. Maqueda, A. G. McLean, A. L. Roquemore, and V. A. Soukhanovskii. Large Area Divertor Temperature Measurements Using A High-speed Camera With Near-infrared Filters in NSTX. Office of Scientific and Technical Information (OSTI), April 2011. http://dx.doi.org/10.2172/1010973.

2

Bhatt, Parth, Curtis Edson, and Ann MacLean. Image Processing in Dense Forest Areas using Unmanned Aerial System (UAS). Michigan Technological University, September 2022. http://dx.doi.org/10.37099/mtu.dc.michigantech-p/16366.

Abstract:
Imagery collected via Unmanned Aerial System (UAS) platforms has become popular in recent years due to improvements in Digital Single-Lens Reflex (DSLR) cameras (centimeter and sub-centimeter), lower operation costs as compared to human-piloted aircraft, and the ability to collect data over areas with limited ground access. Many different applications (e.g., forestry, agriculture, geology, archaeology) already utilize the advantages of UAS data. Although there are numerous UAS image processing workflows, the approach can differ for each application. In this study, we developed a processing workflow for UAS imagery collected in a dense forest (e.g., coniferous/deciduous forest and contiguous wetlands) area, allowing users to process large datasets with acceptable mosaicking and georeferencing errors. Imagery was acquired with near-infrared (NIR) and red, green, blue (RGB) cameras with no ground control points. Image quality of two different UAS collection platforms was observed. Agisoft Metashape, a photogrammetric suite which uses SfM (Structure from Motion) techniques, was used to process the imagery. The results showed that a UAS having a consumer-grade Global Navigation Satellite System (GNSS) onboard had better image alignment than a UAS with a lower-quality GNSS.
