To see the other types of publications on this topic, follow the link: Near-infrared camera.

Journal articles on the topic 'Near-infrared camera'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic 'Near-infrared camera.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Nadeau, Daniel, David C. Murphy, Rene Doyon, and Neil Rowlands. "The Montreal near-infrared camera." Publications of the Astronomical Society of the Pacific 106 (August 1994): 909. http://dx.doi.org/10.1086/133458.

2

Lisi, F., C. Baffa, V. Bilotti, D. Bonaccini, C. del Vecchio, S. Gennari, L. K. Hunt, G. Marcucci, and R. Stanga. "ARNICA, the Arcetri Near-Infrared Camera." Publications of the Astronomical Society of the Pacific 108 (April 1996): 364. http://dx.doi.org/10.1086/133731.

3

Hoegner, L., A. Hanel, M. Weinmann, B. Jutzi, S. Hinz, and U. Stilla. "Towards people detection from fused time-of-flight and thermal infrared images." ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XL-3 (August 11, 2014): 121–26. http://dx.doi.org/10.5194/isprsarchives-xl-3-121-2014.

Abstract:
Obtaining accurate 3d descriptions in the thermal infrared (TIR) is a quite challenging task due to the low geometric resolutions of TIR cameras and the low number of strong features in TIR images. Combining the radiometric information of the thermal infrared with 3d data from another sensor can overcome most of the limitations in 3d geometric accuracy. In the case of dynamic scenes with moving objects or a moving sensor system, a combination with RGB cameras or Time-of-Flight (TOF) cameras is suitable. As a TOF camera is an active sensor in the near infrared (NIR) and the thermal infrared camera captures the radiation emitted by the objects in the observed scene, the combination of these two sensors for close-range applications is independent of external illumination or textures in the scene. This article is focused on the fusion of data acquired with both a time-of-flight (TOF) camera and a thermal infrared (TIR) camera. As the radiometric behaviour of many objects differs between the near infrared used by the TOF camera and the thermal infrared spectrum, a direct co-registration with feature points in both intensity images leads to a high number of outliers. A fully automatic workflow for the geometric calibration of both cameras and the relative orientation of the camera system, using one calibration pattern usable for both spectral bands, is presented. Based on the relative orientation, a fusion of the TOF depth image and the TIR image is used for scene segmentation and people detection. An adaptive histogram-based depth-level segmentation of the 3d point cloud is combined with a thermal intensity-based segmentation. The feasibility of the proposed method is demonstrated in an experimental setup with different geometric and radiometric influences that shows the benefit of combining TOF intensity and depth images with thermal infrared images.
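The depth-plus-thermal fusion idea in this abstract translates into very little code. Below is a minimal illustrative sketch, not the authors' implementation: a histogram of the depth image picks the dominant depth level, and a body-temperature window on the co-registered TIR image confines the result to warm pixels. All array names, the depth slice width and the temperature thresholds are assumptions.

    import numpy as np

    def segment_people(depth, tir, t_min=305.0, t_max=315.0):
        """depth: HxW TOF depth image [m]; tir: HxW co-registered TIR image [K]."""
        # Histogram-based depth-level segmentation: the most populated bin
        # approximates the depth level of the dominant foreground object.
        hist, edges = np.histogram(depth[depth > 0], bins=64)
        level = edges[np.argmax(hist)]
        depth_mask = np.abs(depth - level) < 0.3       # +/- 30 cm slice (assumed)
        # Thermal-intensity segmentation: keep body-temperature pixels only.
        tir_mask = (tir > t_min) & (tir < t_max)
        return depth_mask & tir_mask                   # fused candidate person mask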
4

von Bueren, S. K., A. Burkart, A. Hueni, U. Rascher, M. P. Tuohy, and I. J. Yule. "Deploying four optical UAV-based sensors over grassland: challenges and limitations." Biogeosciences 12, no. 1 (January 9, 2015): 163–75. http://dx.doi.org/10.5194/bg-12-163-2015.

Abstract:
Unmanned aerial vehicles (UAVs) equipped with lightweight spectral sensors facilitate non-destructive, near-real-time vegetation analysis. In order to guarantee robust scientific analysis, data acquisition protocols and processing methodologies need to be developed and new sensors must be compared with state-of-the-art instruments. Four different types of optical UAV-based sensors (RGB camera, converted near-infrared camera, six-band multispectral camera and high spectral resolution spectrometer) were deployed and compared in order to evaluate their applicability for vegetation monitoring with a focus on precision agricultural applications. Data were collected in New Zealand over ryegrass pastures of various conditions and compared to ground spectral measurements. The UAV STS spectrometer and the multispectral camera MCA6 (Multiple Camera Array) were found to deliver spectral data that can match the spectral measurements of an ASD at ground level when compared over all waypoints (UAV STS: R2=0.98; MCA6: R2=0.92). Variability was highest in the near-infrared bands for both sensors, while the six-band multispectral camera also overestimated the green peak reflectance. Reflectance factors derived from the RGB (R2=0.63) and converted near-infrared (R2=0.65) cameras resulted in lower accordance with reference measurements. The UAV spectrometer system is capable of providing narrow-band information for crop and pasture management. The six-band multispectral camera has the potential to be deployed to target specific broad wavebands if its radiometric limitations can be addressed. Large-scale imaging of pasture variability can be achieved by using either a true colour or a modified near-infrared camera. Data quality from UAV-based sensors can only be assured if field protocols are followed and environmental conditions allow for stable platform behaviour and illumination.
5

Haishui Ye, Zhishan Gao, Zhenyu Qin, and Qianwen Wang. "Near-infrared fundus camera based on polarization switch in stray light elimination." Chinese Optics Letters 11, no. 3 (2013): 031702–31705. http://dx.doi.org/10.3788/col201311.031702.

6

Choi, Janghoon, Jun-Geun Shin, Yoon-Oh Tak, Youngseok Seo, and Jonghyun Eom. "Single Camera-Based Dual-Channel Near-Infrared Fluorescence Imaging System." Sensors 22, no. 24 (December 13, 2022): 9758. http://dx.doi.org/10.3390/s22249758.

Abstract:
In this study, we propose a single-camera-based dual-channel near-infrared (NIR) fluorescence imaging system that produces color and dual-channel NIR fluorescence images in real time. To simultaneously acquire color and dual-channel NIR fluorescence images of two fluorescent agents, three cameras and additional optical parts are generally used. As a result, the volume of the image acquisition unit increases, interfering with movements during surgical procedures and increasing production costs. In the proposed system, instead of using three cameras, we use a single camera equipped with two image sensors that can simultaneously acquire color and single-channel NIR fluorescence images, thus reducing the volume of the image acquisition unit. The single-channel NIR fluorescence images were time-divided into two channels by synchronizing the camera with the two excitation lasers, and the noise caused by the crosstalk effect between the two fluorescent agents was removed through image processing. To evaluate the performance of the system, experiments were conducted for the two fluorescent agents to measure the sensitivity, crosstalk effect, and signal-to-background ratio. The compact image acquisition unit avoids the movement obstruction caused by previous devices during clinical and animal surgery and reduces the complexity and cost of the manufacturing process, which may facilitate the dissemination of this type of system.
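Time-division acquisition of this sort can be demultiplexed in a few lines. The following is a hedged sketch, assuming even-numbered frames are exposed under laser A and odd-numbered frames under laser B, with first-order crosstalk removed by subtracting a scaled copy of the opposite channel; the crosstalk coefficients are hypothetical calibration values, not numbers from the paper.

    import numpy as np

    def demux_dual_channel(frames, k_ab=0.08, k_ba=0.05):
        """frames: sequence of NIR frames in laser-synchronized capture order."""
        ch_a = np.stack(frames[0::2]).astype(np.float32)   # frames lit by laser A
        ch_b = np.stack(frames[1::2]).astype(np.float32)   # frames lit by laser B
        n = min(len(ch_a), len(ch_b))
        ch_a, ch_b = ch_a[:n], ch_b[:n]
        # First-order crosstalk correction between the two fluorescence channels.
        a_clean = np.clip(ch_a - k_ab * ch_b, 0.0, None)
        b_clean = np.clip(ch_b - k_ba * ch_a, 0.0, None)
        return a_clean, b_clean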
7

Baffa, C., G. Comoretto, S. Gennari, F. Lisi, E. Oliva, V. Biliotti, A. Checcucci, et al. "NICS: The TNG Near Infrared Camera Spectrometer." Astronomy & Astrophysics 378, no. 2 (November 2001): 722–28. http://dx.doi.org/10.1051/0004-6361:20011194.

8

Iwai, Yutaka. "ENG VTR combo near infrared camera system." Journal of the Institute of Television Engineers of Japan 43, no. 7 (1989): 731–32. http://dx.doi.org/10.3169/itej1978.43.731.

9

Yue, Wei, Li Jiang, Xiubin Yang, Suining Gao, Yunqiang Xie, and Tingting Xu. "Optical Design of a Common-Aperture Camera for Infrared Guided Polarization Imaging." Remote Sensing 14, no. 7 (March 28, 2022): 1620. http://dx.doi.org/10.3390/rs14071620.

Abstract:
Polarization and infrared imaging technology have unique advantages for applications ranging from biology to ocean remote sensing. However, conventional combined polarization and infrared cameras have limitations because they are constrained to single-band imaging with rotating polarizers and cascaded optics. Therefore, in this paper we propose a common-aperture, multi-band infrared guided polarization imaging system (IGPIS), which combines infrared wide-area sensing with polarization feature acquisition for accurate detection of ship targets. The IGPIS provides images in the visible polarization (0.45–0.76 μm), near-infrared polarization (0.76–0.9 μm), and long-wave infrared (8–12 μm) bands. Satellite attitude parameters and camera optical parameters are accurately calculated by establishing a dynamic imaging model for guidance imaging. We illustrate the imaging principle, sensor specifications and imaging performance analysis; the experimental results show that the MTF is 0.24 for the visible and near-infrared bands and 0.13 for the long-wave infrared band. The obtained multi-band images have an average gradient of 12.77 after accurate fusion. These results provide theoretical guidance for the design of common-aperture cameras in the remote sensing imaging field.
10

Kang, You Sun, and Duk Shin. "Multiband Camera System Using Color and Near Infrared Images." Applied Mechanics and Materials 446-447 (November 2013): 922–26. http://dx.doi.org/10.4028/www.scientific.net/amm.446-447.922.

Abstract:
Various applications using camera systems have been developed and deployed commercially to improve our daily life. The performance of a camera system depends mainly on image quality and illumination conditions. Multiband cameras have been developed to provide a wealth of information for image acquisition. In this paper, we developed two applications, image segmentation and face detection, using a multiband camera that provides four bands consisting of one near-infrared and three color bands. We propose a multiband camera system that utilizes two different images, i.e., the color image extracted from the Bayer filter and the near-infrared image. The experimental results showed the effectiveness of the proposed system.
11

Turhan, Tuncer, and Yusuf Ersahin. "Near-infrared camera for intraventricular neuroendoscopic procedures: in vitro comparison of the efficiency of near-infrared camera and visual light camera during bleeding." Child's Nervous System 27, no. 3 (September 9, 2010): 439–44. http://dx.doi.org/10.1007/s00381-010-1273-0.

12

Ogawa, Mikako. "A handy camera for near-infrared fluorescence imaging." Drug Delivery System 30, no. 2 (2015): 145–48. http://dx.doi.org/10.2745/dds.30.145.

13

Escudero-Sanz, I. "An achromatized spectrograph camera for the near-infrared." Monthly Notices of the Royal Astronomical Society 248, no. 3 (February 1, 1991): 583–84. http://dx.doi.org/10.1093/mnras/248.3.583.

14

Persson, S. E., S. C. West, D. M. Carr, A. Sivaramakrishnan, and D. C. Murphy. "A near-infrared camera for Las Campanas Observatory." Publications of the Astronomical Society of the Pacific 104 (March 1992): 204. http://dx.doi.org/10.1086/132979.

15

Yakno, Marlina, Junita Mohamad-Saleh, Mohd Zamri Ibrahim, and W. N. A. W. Samsudin. "Camera-projector calibration for near infrared imaging system." Bulletin of Electrical Engineering and Informatics 9, no. 1 (February 1, 2020): 160–70. http://dx.doi.org/10.11591/eei.v9i1.1697.

Abstract:
Advanced biomedical engineering technologies are continuously changing medical practices to improve medical care for patients. Needle-insertion navigation during the intravenous catheterization process via near-infrared (NIR) imaging and a camera-projector is one solution. However, the central problem is that the image captured by the camera misaligns with the image projected back onto the object of interest, so the projected image is not overlaid perfectly in the real world. In this paper, a camera-projector calibration method is presented. A polynomial algorithm was used to remove the barrel distortion in captured images, and scaling and translation transformations are used to correct the geometric distortions introduced in the image acquisition process. Discrepancies between the captured and projected images are assessed; the agreement between the image and the projected image is 90.643%. This indicates the feasibility of the proposed approach for eliminating discrepancies between the projection and navigation images.
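The two correction steps named in the abstract, a polynomial model for barrel distortion followed by scaling and translation, can be sketched as below. This is an illustrative reading of the method, not the authors' code; the coefficients k1, k2, scale and t are placeholders that a calibration procedure would estimate.

    import numpy as np

    def undistort_points(pts, center, half_diag, k1=-0.25, k2=0.05):
        """pts: Nx2 pixel coords; center: (cx, cy); half_diag: normalization radius."""
        pts = np.asarray(pts, dtype=float)
        c = np.asarray(center, dtype=float)
        p = (pts - c) / half_diag                  # normalize so r is in [0, 1]
        r2 = (p ** 2).sum(axis=1, keepdims=True)
        # Radial polynomial model: k1 < 0 corrects barrel distortion.
        return c + half_diag * p * (1 + k1 * r2 + k2 * r2 ** 2)

    def camera_to_projector(pts, scale=(1.04, 1.04), t=(12.0, -7.5)):
        # Per-axis scaling plus translation aligning camera and projector frames.
        return np.asarray(pts, dtype=float) * np.asarray(scale) + np.asarray(t)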
16

Thompson, R. "NICMOS: Near Infrared Camera and Multi-Object Spectrometer." Space Science Reviews 61, no. 1-2 (July 1992): 69–93. http://dx.doi.org/10.1007/bf00212476.

17

Straubmeier, C., A. Eckart, T. Bertram, and T. Herbst. "LINC/NIRVANA - The LBT Near-Infrared Interferometric Camera." Astronomische Nachrichten 324, S1 (September 2003): 577–81. http://dx.doi.org/10.1002/asna.200385092.

18

Choi, Janghoon, Jun Geun Shin, Hyuk-Sang Kwon, Yoon Oh Tak, Hyeong Ju Park, Jin-Chul Ahn, Joo Beom Eom, et al. "Development of Intraoperative Near-Infrared Fluorescence Imaging System Using a Dual-CMOS Single Camera." Sensors 22, no. 15 (July 26, 2022): 5597. http://dx.doi.org/10.3390/s22155597.

Abstract:
We developed a single-camera-based near-infrared (NIR) fluorescence imaging device using the indocyanine green (ICG) NIR fluorescence contrast agent for image-guided surgery. In general, a fluorescence imaging system that simultaneously provides color and NIR images uses two cameras, which is disadvantageous because it enlarges the imaging head of the system. Recently, a single-camera-based NIR optical imaging device, with quantum efficiency partially extended into the NIR region, was developed to overcome this drawback. That system used RGB_NIR filters on the camera sensor to provide color and NIR images simultaneously; however, the sensitivity and resolution of the infrared images are reduced to a quarter, and the exposure time and gain cannot be set individually when acquiring color and NIR images. To overcome these shortcomings, this study developed a compact fluorescence imaging system that uses a single camera with two complementary metal-oxide-semiconductor (CMOS) image sensors. Sensitivity and signal-to-background ratio were measured according to the concentration of the ICG solution, exposure time, and camera gain to evaluate the performance of the imaging system. Finally, the clinical applicability of the system was confirmed through toxicity analysis of the light source and in vivo testing.
19

Ohyama, Youichi, Takashi Onaka, Hideo Matsuhara, Takehiko Wada, Woojung Kim, Naofumi Fujishiro, Kazunori Uemizu, et al. "Near-Infrared and Mid-Infrared Spectroscopy with the Infrared Camera (IRC) for AKARI." Publications of the Astronomical Society of Japan 59, sp2 (October 10, 2007): S411–S422. http://dx.doi.org/10.1093/pasj/59.sp2.s411.

20

Kimura, Yoshikatsu, Yusuke Kanzawa, Kiyosumi Kidono, and Masaru Ogawa. "7-band Camera – Near Infrared 4-band and Visible 3-band Camera." Journal of the Institute of Image Information and Television Engineers 69, no. 3 (2015): J86–J90. http://dx.doi.org/10.3169/itej.69.j86.

21

Jeni, Laszlo A., Hideki Hashimoto, and Takashi Kubota. "Robust Facial Expression Recognition Using Near Infrared Cameras." Journal of Advanced Computational Intelligence and Intelligent Informatics 16, no. 2 (March 20, 2012): 341–48. http://dx.doi.org/10.20965/jaciii.2012.p0341.

Abstract:
In human-human communication we use verbal, vocal and non-verbal signals to communicate with others. Facial expressions are a form of non-verbal communication, and recognizing them helps to improve human-machine interaction. This paper proposes a system for pose- and illumination-invariant recognition of facial expressions using near-infrared camera images and precise 3D shape registration. Precise 3D shape information of the human face can be computed by means of Constrained Local Models (CLM), which fit a dense model to an unseen image in an iterative manner. We used a multi-class SVM to classify the acquired 3D shape into different emotion categories. Results surpassed human performance and show pose-invariant performance. Varying lighting conditions can influence the fitting process and reduce the recognition precision. We built a near-infrared and visible light camera array to test the method with different illuminations. Results show that the near-infrared camera configuration is suitable for robust and reliable facial expression recognition under changing lighting conditions.
22

Park, Sung Ho, Hyo Sik Yoon, and Kang Ryoung Park. "Faster R-CNN and Geometric Transformation-Based Detection of Driver’s Eyes Using Multiple Near-Infrared Camera Sensors." Sensors 19, no. 1 (January 7, 2019): 197. http://dx.doi.org/10.3390/s19010197.

Abstract:
Studies are being actively conducted on camera-based driver gaze tracking in a vehicle environment for vehicle interfaces and for analyzing forward attention to judge driver inattention. In existing studies on the single-camera-based method, there are frequent situations in which the eye information necessary for gaze tracking cannot be observed well in the camera input image owing to the turning of the driver's head during driving. To solve this problem, existing studies have used multiple-camera-based methods to obtain images to track the driver's gaze. However, this method has the drawback of an excessive computation process and processing time, as it involves detecting the eyes and extracting the features of all images obtained from multiple cameras, which makes it difficult to implement in an actual vehicle environment. To overcome these limitations of existing studies, this study proposes a method that uses a shallow convolutional neural network (CNN) on the images of the driver's face acquired from two cameras to adaptively select the camera image more suitable for detecting eye position; Faster R-CNN is applied to the selected driver images, and after the driver's eyes are detected, the eye positions are mapped to the camera image of the other side through a geometric transformation matrix. Experiments were conducted using the self-built Dongguk Dual Camera-based Driver Database (DDCD-DB1), including the images of 26 participants acquired from inside a vehicle, and the open Columbia Gaze Data Set (CAVE-DB). The results confirmed that the performance of the proposed method is superior to those of the existing methods.
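The final mapping step is ordinary projective geometry. A minimal sketch, assuming the pre-computed geometric transformation matrix is a 3 × 3 homography H (the paper's exact parameterization may differ):

    import numpy as np

    def map_eyes_to_other_camera(eye_pts, H):
        """eye_pts: Nx2 eye centers detected in the selected camera's image;
        H: 3x3 transform into the other camera's image plane."""
        eye_pts = np.asarray(eye_pts, dtype=float)
        pts_h = np.hstack([eye_pts, np.ones((len(eye_pts), 1))])  # homogeneous
        mapped = (H @ pts_h.T).T
        return mapped[:, :2] / mapped[:, 2:3]   # back to pixel coordinates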
23

Ichikawa, T., K. Tarusawa, K. Yanagisawa, and N. Itoh. "Near-Infrared Imaging with the 105 cm Kiso Schmidt." International Astronomical Union Colloquium 148 (1995): 48–51. http://dx.doi.org/10.1017/s0252921100021679.

Abstract:
We have carried out imaging observations in the near-infrared (J, H and K’ band) with a large format array camera attached to the prime focus of the 105 cm Schmidt telescope at Kiso Observatory. The image resolution, limiting magnitudes and effect of thermal radiation are presented, based on observations of nearby galaxies. Considering the results, we are constructing a new larger near-infrared camera optimized for use with the Kiso Schmidt.
24

Kimura, Yoshikatsu, Yusuke Kanzawa, Kiyosumi Kidono, Takashi Naito, and Yoshiki Ninomiya. "6-band Camera System – Near Infrared 3-band and Visible 3-band Cameras." Journal of The Institute of Image Information and Television Engineers 65, no. 3 (2011): 342–48. http://dx.doi.org/10.3169/itej.65.342.

25

Blair, Steven, Missael Garcia, Tyler Davis, Zhongmin Zhu, Zuodong Liang, Christian Konopka, Kevin Kauffman, et al. "Hexachromatic bioinspired camera for image-guided cancer surgery." Science Translational Medicine 13, no. 592 (May 5, 2021): eaaw7067. http://dx.doi.org/10.1126/scitranslmed.aaw7067.

Abstract:
Cancer affects one in three people worldwide. Surgery remains the primary curative option for localized cancers, but good prognoses require complete removal of primary tumors and timely recognition of metastases. To expand surgical capabilities and enhance patient outcomes, we developed a six-channel color/near-infrared image sensor inspired by the mantis shrimp visual system that enabled near-infrared fluorescence image guidance during surgery. The mantis shrimp’s unique eye, which maximizes the number of photons contributing to and the amount of information contained in each glimpse of its surroundings, is recapitulated in our single-chip imaging system that integrates arrays of vertically stacked silicon photodetectors and pixelated spectral filters. To provide information about tumor location unavailable from a single instrument, we tuned three color channels to permit an intuitive perspective of the surgical procedure and three near-infrared channels to permit multifunctional imaging of optical probes highlighting cancerous tissue. In nude athymic mice bearing human prostate tumors, our image sensor enabled simultaneous detection of two tumor-targeted fluorophores, distinguishing diseased from healthy tissue in an estimated 92% of cases. It also permitted extraction of near-infrared structured illumination enabling the mapping of the three-dimensional topography of tumors and surgical sites to within 1.2-mm error. In the operating room, during surgical resection in 18 patients with breast cancer, our image sensor further enabled sentinel lymph node mapping using clinically approved near-infrared fluorophores. The flexibility and performance afforded by this simple and compact architecture highlights the benefits of biologically inspired sensors in image-guided surgery.
26

Yanagisawa, K., N. Itoh, T. Ichikawa, K. Tarusawa, and M. Ueno. "Near-Infrared Imaging by a Schmidt Telescope." Symposium - International Astronomical Union 161 (1994): 89–90. http://dx.doi.org/10.1017/s0074180900047136.

Abstract:
We have carried out wide field imaging observations in the near-infrared (J, H and K′ band) with a large format array camera attached to the prime focus of the 105 cm Schmidt telescope at Kiso Observatory. The image resolution, limiting magnitudes and the effect of thermal radiation are discussed.
27

Barnsley, Robert M., Helen E. Jermak, Iain A. Steele, Robert J. Smith, Stuart D. Bates, and Chris J. Mottram. "IO:I, a near-infrared camera for the Liverpool Telescope." Journal of Astronomical Telescopes, Instruments, and Systems 2, no. 1 (March 4, 2016): 015002. http://dx.doi.org/10.1117/1.jatis.2.1.015002.

28

Kim, Kisoo, Kyung-Won Jang, Sang-In Bae, Hyun-Kyung Kim, Younggil Cha, Jae-Kwan Ryu, Yong-Jin Jo, and Ki-Hun Jeong. "Ultrathin arrayed camera for high-contrast near-infrared imaging." Optics Express 29, no. 2 (January 8, 2021): 1333. http://dx.doi.org/10.1364/oe.409472.

29

Hodapp, Klaus-Werner, John Rayner, and Everett Irwin. "The University of Hawaii NICMOS-3 near-infrared camera." Publications of the Astronomical Society of the Pacific 104 (June 1992): 441. http://dx.doi.org/10.1086/133016.

30

Boeker, T., J. W. V. Storey, A. Krabbe, and T. Lehmann. "MANIAC – A New Mid- and Near-Infrared Array Camera." Publications of the Astronomical Society of the Pacific 109 (July 1997): 827. http://dx.doi.org/10.1086/133951.

31

Matthews, K., and B. T. Soifer. "The near infrared camera on the W.M. Keck telescope." Experimental Astronomy 3, no. 1-4 (1994): 77–84. http://dx.doi.org/10.1007/bf00430121.

32

Okumura, Shin-ichiro, Eiji Nishihara, Etsuji Watanabe, Atsushi Mori, Hirokazu Kataza, and Takuya Yamashita. "OASIS: A Multi-Purpose Near-Infrared Camera and Spectrograph." Publications of the Astronomical Society of Japan 52, no. 5 (October 1, 2000): 931–42. http://dx.doi.org/10.1093/pasj/52.5.931.

33

Yamamura, Issei, Takashi Tsuji, Toshihiko Tanabé, and Tadashi Nakajima. "AKARI near-infrared spectroscopy of brown dwarfs." Proceedings of the International Astronomical Union 5, H15 (November 2009): 545. http://dx.doi.org/10.1017/s1743921310010641.

Abstract:
Brown dwarfs (hereafter BDs) are of particular interest because of their extremely low-temperature atmospheres for comparison with atmospheres of giant planets. Aiming to obtain clues to understand the formation and disappearance of dust clouds and molecular abundances in BD photospheres, we conducted an observation programme of space-borne near-infrared spectroscopy of bright BDs with the Infrared Camera (IRC) on-board AKARI.
34

Žiljak Gršić, Jana, Lidija Tepeš Golubić, Denis Jurečić, and Maja Matas. "Hidden information in paintings that manifests itself in the near infrared spectrum." Informatologia 52, no. 1-2 (June 30, 2019): 9–16. http://dx.doi.org/10.32914/i.52.1-2.2.

Abstract:
For each pigment and dye, we associate information about its absorption of light in the near-infrared spectrum (NIR) at 1000 nm. Generally, the pigments red, yellow, cyan blue, orange, white and “drap” (sandy brown) do not absorb NIR light. The NIR camera does not distinguish, recognize or “see” them, so the NIR photo is white. Such colors are marked with the letter "V" (visual, VIS). Green pigments (in our language, “zeleno” = green) differ greatly in their absorption of NIR radiation: some absorb NIR radiation strongly, while others absorb very little. That is why we have introduced Z as numerical information on the absorption of NIR light for all colorants, with a range from zero to ten. Painters are trained to mix colors with respect to their Z values, and dual images are produced. The NIR camera separates the hidden drawing from the visible image depending on the amount of Z in each color. The painter arranges the colors so discreetly that two images occupy the same place: one is seen by the naked eye, while the other requires an NIR camera to be seen. The idea of a VIS/NIR painting is compelling given how many video surveillance (NIR) cameras surround us: on the streets, in restaurants, in banks, and at entrances to public and private buildings. The NIR design is also used on documents, diplomas and banknotes as a general security method in graphic technology.
35

Ishitsuka I., José K., Takehiko Wada, Fumihiko Ieda, Noritaka Tokimasa, Takehiko Kuroda, Masaki Morimoto, Takeshi Miyaji, et al. "A Near Infrared Camera Refrigerated by Two Stirling Machines – an Alternative to Robotic Telescopes." International Astronomical Union Colloquium 183 (2001): 219–20. http://dx.doi.org/10.1017/s0252921100078921.

Abstract:
We have developed and tested a new near infrared camera equipped with a 512 × 512 PtSi CCD and cooled by two independent Stirling-cycle refrigerators. The camera, installed on the 60 cm reflector telescope of the Nishi-Harima Astronomical Observatory (NHAO) since April 2000, has begun regular observations of infrared objects. The camera's reasonable cost and low maintenance needs make it attractive, and we introduce it as an alternative to robotic telescopes.
36

Manoliu, Mitica-Valentin. "Biometric security: Recognition according to the pattern of palm veins." Scientific Bulletin of Naval Academy XXIII, no. 1 (July 15, 2020): 257–62. http://dx.doi.org/10.21279/1454-864x-20-i1-036.

Abstract:
Palm vein recognition is a promising new biometric method, which has additional potential in the forensic field. This process is performed using near-infrared (NIR) LEDs and a camera that captures images of the veins. The obtained images contain noise as well as variations in rotation and translation, so the input image from the camera must be pre-processed using suitable techniques. A set of features is extracted from the images taken with the infrared-light camera and processed in order to make authentication possible. This whole process can be accomplished by several methods. The application can thus be used to improve the security of restricted areas on military ships, among other uses.
37

Rockett, Thomas B. O., Nicholas A. Boone, Robert D. Richards, and Jon R. Willmott. "Thermal Imaging Metrology Using High Dynamic Range Near-Infrared Photovoltaic-Mode Camera." Sensors 21, no. 18 (September 13, 2021): 6151. http://dx.doi.org/10.3390/s21186151.

Abstract:
The measurement of a wide temperature range in a scene requires hardware capable of high dynamic range imaging. We describe a novel near-infrared thermal imaging system operating at a wavelength of 940 nm based on a commercial photovoltaic-mode high dynamic range camera and analyse its measurement uncertainty. The system is capable of measuring over an unprecedentedly wide temperature range; however, this comes at the cost of reduced temperature resolution and increased uncertainty compared to a conventional CMOS camera operating in photodetective mode. Despite this, the photovoltaic-mode thermal camera has an acceptable level of uncertainty for most thermal imaging applications, with an NETD of 4–12 °C and a combined measurement uncertainty of approximately 1% K if a low pixel clock is used. We discuss the various sources of uncertainty and how they might be minimised to further improve the performance of the thermal camera. The thermal camera is a good choice for low-frame-rate imaging applications that have a wide inter-scene temperature range.
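For context, single-band radiation thermometry at 940 nm inverts a calibrated Planck-law signal to temperature. A sketch of the standard inversion (our illustration, not the authors' pipeline; the calibration constant K is hypothetical):

    import numpy as np

    C2 = 1.4388e-2      # second radiation constant [m K]
    LAM = 940e-9        # operating wavelength [m]

    def signal_to_temperature(signal, K=5.0e9):
        """Invert a single-band signal S = K / (exp(C2/(LAM*T)) - 1) for T [K]."""
        return C2 / (LAM * np.log(1.0 + K / signal))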
38

Tamura, M., T. Naoi, Y. Oasa, Y. Nakajima, C. Nagashima, T. Nagayama, D. Baba, et al. "Deep Near-Infrared Surveys and Young Brown Dwarf Populations in Star-Forming Regions." Symposium - International Astronomical Union 211 (2003): 87–90. http://dx.doi.org/10.1017/s0074180900210346.

Abstract:
We are currently conducting three kinds of IR surveys of star-forming regions (SFRs) in order to search for very low-mass young stellar populations. The first is a deep, simultaneous JHKs-band survey with the SIRIUS camera on the IRSF 1.4 m or the UH 2.2 m telescopes. The second is a very deep JHKs survey with the CISCO IR camera on the Subaru 8.2 m telescope. The third is a high-resolution companion search around nearby YSOs with the CIAO adaptive optics coronagraph IR camera on the Subaru. In this contribution, we describe our SIRIUS camera and present preliminary results of the ongoing surveys with this new instrument.
39

Peters, D. W., J. P. Mitchell, R. E. Plant, and B. R. Hanson. "401 Assessing Crop Canopy Development Using a Digital, Red/Near-infrared Band Ratioing Camera." HortScience 34, no. 3 (June 1999): 513A–513. http://dx.doi.org/10.21273/hortsci.34.3.513a.

Abstract:
Current methods of making crop cover estimates are time-consuming and tend to be highly variable. A low-cost, digital, red/near-infrared band ratioing camera (Dycam Inc., Chatsworth, Calif.) and accompanying software (S. Heinold, Woodland Hills, Calif.) were evaluated for estimating crop cover. The camera was tested using a set of images having leaf areas of known sizes with different crop, soil, and lighting conditions. In the field, camera-based crop cover estimates were compared to light bar measured estimates. Results indicate that the camera and image analysis software are capable of estimating percent crop cover over a range of soil, crop, and lighting environments. Camera-based crop cover estimates were highly correlated with light bar estimates (tomato r2 = 0.96, cotton r2 = 0.98). Under the conditions tested, the camera appears to be a useful tool for monitoring crop growth in the field.
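A band-ratioing camera of this kind supports a very simple cover estimator. A hedged sketch in the spirit of the abstract (the NDVI-style index and its threshold are our assumptions, not values from the paper):

    import numpy as np

    def percent_crop_cover(red, nir, thresh=0.4):
        """red, nir: HxW co-registered band images (reflectance or DN)."""
        ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)  # normalized ratio
        return 100.0 * float((ndvi > thresh).mean())          # % canopy pixels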
40

Kanneganti, Srikrishna, Chan Park, Michael F. Skrutskie, John C. Wilson, Matthew J. Nelson, Aaron W. Smith, and Charles R. Lam. "FanCam—A Near-Infrared Camera for the Fan Mountain Observatory." Publications of the Astronomical Society of the Pacific 121, no. 882 (August 2009): 885–96. http://dx.doi.org/10.1086/605451.

41

Packham, Christopher, Keith L. Thompson, Almudena Zurita, Johan H. Knapen, Ian Smail, Robert Greimel, Daniel F. M. Folha, et al. "INGRID: A near-infrared camera for the William Herschel Telescope." Monthly Notices of the Royal Astronomical Society 345, no. 2 (October 2003): 395–405. http://dx.doi.org/10.1046/j.1365-8711.2003.06982.x.

42

Devaraj, R., Y. D. Mayya, L. Carrasco, and A. Luna. "Characterization and Performance of the Cananea Near-infrared Camera (CANICA)." Publications of the Astronomical Society of the Pacific 130, no. 987 (March 20, 2018): 055001. http://dx.doi.org/10.1088/1538-3873/aaa9ab.

43

Cahyadi, Willy Anugrah, and Yeon Ho Chung. "Experimental demonstration of indoor uplink near-infrared LED camera communication." Optics Express 26, no. 15 (July 20, 2018): 19657. http://dx.doi.org/10.1364/oe.26.019657.

44

Czepita, Maciej, and Damian Czepita. "Compact digital camera method for near-infrared ophthalmic transillumination photography." Canadian Journal of Ophthalmology 53, no. 5 (October 2018): e170-e173. http://dx.doi.org/10.1016/j.jcjo.2017.12.008.

45

Baug, T., D. K. Ojha, S. K. Ghosh, S. Sharma, A. K. Pandey, Brijesh Kumar, Arpan Ghosh, et al. "TIFR Near Infrared Imaging Camera-II on the 3.6 m Devasthal Optical Telescope." Journal of Astronomical Instrumentation 07, no. 01 (March 2018): 1850003. http://dx.doi.org/10.1142/s2251171718500034.

Abstract:
Tata Institute of Fundamental Research (TIFR) Near Infrared Imaging Camera-II (TIRCAM2) is a closed-cycle helium cryo-cooled imaging camera equipped with a Raytheon 512 × 512-pixel InSb Aladdin III quadrant focal plane array (FPA) having sensitivity to photons in the 1–5 μm wavelength band. In this paper, we present the performance of the camera on the newly installed 3.6 m Devasthal Optical Telescope (DOT) based on the calibration observations carried out during 2017 May 11–14 and 2017 October 7–31. After the preliminary characterization, the camera has been released to the Indian and Belgian astronomical community for science observations since 2017 May. The camera offers a field of view of about 86.5″ × 86.5″ on the DOT with a pixel scale of 0.169″. The seeing at the telescope site in the near-infrared (NIR) bands is typically sub-arcsecond, with the best seeing realized in the NIR bands on 2017 October 16. The camera is found to be capable of deep observations in the J, H and K bands, comparable to other 4 m class telescopes available world-wide. Another highlight of this camera is its observational capability for sources up to Wide-field Infrared Survey Explorer (WISE) W1-band (3.4 μm) magnitudes of 9.2 in the narrow L-band (nbL; λcen ≈ 3.59 μm). Hence, the camera could be a good complementary instrument to observe the bright nbL-band sources that are saturated in the Spitzer Infrared Array Camera (IRAC) ([3.6] ≲ 7.92 mag) and the WISE W1-band ([3.4] ≲ 8.1 mag). Sources with strong polycyclic aromatic hydrocarbon (PAH) emission at 3.3 μm are also detected. Details of the observations and estimated parameters are presented in this paper.
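As a consistency check on the optics quoted above, the field of view follows directly from the array format and pixel scale (our arithmetic, not a quotation from the paper):

    \mathrm{FoV} \approx N_{\mathrm{pix}} \times \text{pixel scale} = 512 \times 0.169'' \approx 86.5'' \ \text{per side}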
46

Hughes, D. H., E. I. Robson, and M. J. Ward. "Optical & Near Infrared Imaging of NGC1275." Symposium - International Astronomical Union 134 (1989): 376–78. http://dx.doi.org/10.1017/s0074180900141373.

Abstract:
We are currently studying a selection of active galaxies using the new IR array camera IRCAM on UKIRT. Our aim is to separate the underlying stellar emission from that of the active galactic nucleus. Although the optical is the best wavelength region for discriminating between the different populations in the underlying spiral and elliptical galaxies, it is in the infrared that the contrast between the non-thermal central core and the surrounding galaxy is increased. We present reduced data from infrared images taken at 1.25, 1.65 and 2.2 μm with an image scale of 0.6 arcsec/pixel, together with optical 0.44 and 0.55 μm CCD images of the Seyfert galaxy NGC1275.
47

Fairley, Iain, Anouska Mendzil, Michael Togneri, and Dominic Reeve. "The Use of Unmanned Aerial Systems to Map Intertidal Sediment." Remote Sensing 10, no. 12 (November 30, 2018): 1918. http://dx.doi.org/10.3390/rs10121918.

Abstract:
This paper describes a new methodology to map intertidal sediment using a commercially available unmanned aerial system (UAS). A fixed-wing UAS was flown with both thermal and multispectral cameras over three study sites comprising sandy and muddy areas. Thermal signatures of sediment type were not observable in the recorded data, and therefore only the multispectral results were used in the sediment classification. The multispectral camera consisted of a Red–Green–Blue (RGB) camera and four multispectral sensors covering the green, red, red edge and near-infrared bands. Statistically significant correlations (>99% confidence) were noted between the multispectral reflectance and both moisture content and median grain size. The best correlation against median grain size was found with the near-infrared band. Three classification methodologies were tested to split the intertidal area into sand and mud: k-means clustering, artificial neural networks, and the random forest approach. Classification methodologies were tested with nine input subsets of the available data channels, including transforming the RGB colorspace to the Hue–Saturation–Value (HSV) colorspace. The classification approach that gave the best performance, based on the j-index, was an artificial neural network with near-infrared reflectance and HSV color as input data. Classification performance ranged from good to excellent, with values of Youden’s j-index ranging from 0.6 to 0.97 depending on flight date and site.
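The best-performing configuration, an artificial neural network fed near-infrared reflectance plus HSV colour, fits in a few lines of scikit-learn. An illustrative sketch under assumed data shapes (per-pixel features and binary sand/mud labels), not the authors' pipeline:

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def train_sediment_classifier(nir, hsv, labels):
        """nir: (N,) NIR reflectance; hsv: (N, 3) colour; labels: (N,) 0=sand, 1=mud."""
        X = np.column_stack([nir, hsv])                      # 4 features per pixel
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500)
        return clf.fit(X, labels)                            # fitted classifier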
48

von Bueren, S., A. Burkart, A. Hueni, U. Rascher, M. Tuohy, and I. Yule. "Comparative validation of UAV based sensors for the use in vegetation monitoring." Biogeosciences Discussions 11, no. 3 (March 7, 2014): 3837–64. http://dx.doi.org/10.5194/bgd-11-3837-2014.

Abstract:
Unmanned Aerial Vehicles (UAVs) equipped with lightweight spectral sensors facilitate non-destructive, near-real-time vegetation analysis. In order to guarantee quality scientific analysis, data acquisition protocols and processing methodologies need to be developed and new sensors must be trialed against state-of-the-art instruments. In the following study, four different types of optical UAV-based sensors (RGB camera, near-infrared camera, six-band multispectral camera, and a high-resolution spectrometer) were compared and validated in order to evaluate their applicability for vegetation monitoring with a focus on precision agricultural applications. Data were collected in New Zealand over ryegrass pastures of various conditions, and the UAV sensor data were validated with ground spectral measurements. It was found that large-scale imaging of pasture variability can be achieved by using either a true-color or a modified near-infrared camera. The six-band multispectral camera was used as an imaging spectrometer capable of identifying in-field variations of vegetation status that correlate with ground spectral measurements. The high-resolution spectrometer was validated and found to deliver spectral data that can match the quality of ground spectral measurements.
49

Onaka, T., I. Sakon, R. Ohsawa, T. I. Mori, H. Kaneda, M. Tanaka, Y. Okada, F. Boulanger, C. Joblin, and P. Pilleri. "Near-Infrared Spectroscopy of the Diffuse Galactic Emission." Proceedings of the International Astronomical Union 10, H16 (August 2012): 703–4. http://dx.doi.org/10.1017/s1743921314012976.

Abstract:
The near-infrared (NIR) spectral range (2–5 μm) contains a number of interesting features for the study of the interstellar medium. In particular, the aromatic and aliphatic components in carbonaceous dust can be investigated most efficiently with NIR spectroscopy. We analyze NIR spectra of the diffuse Galactic emission taken with the Infrared Camera onboard AKARI and find that the aliphatic-to-aromatic emission band ratio decreases toward the ionized gas, which suggests processing of the band carriers in the ionized region.
50

Laine, Seppo, Johan H. Knapen, Juan-Carlos Muñoz-Mateos, Taehyun Kim, Sébastien Comerón, Marie Martig, Benne W. Holwerda, et al. "Spitzer/Infrared Array Camera near-infrared features in the outer parts of S4G galaxies." Monthly Notices of the Royal Astronomical Society 444, no. 4 (September 13, 2014): 3015–39. http://dx.doi.org/10.1093/mnras/stu1642.

We offer discounts on all premium plans for authors whose works are included in thematic literature selections. Contact us to get a unique promo code!
