A selection of scholarly literature on the topic "Camera monitoring"

Format your source in APA, MLA, Chicago, Harvard, and other citation styles

Consult the lists of current articles, books, theses, conference papers, and other scholarly sources on the topic "Camera monitoring".

Next to every work in the list there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in .pdf format and read its abstract online, if these are available in the metadata.

Journal articles on the topic "Camera monitoring"

1

Garland, Laura, Andrew Crosby, Richard Hedley, Stan Boutin, and Erin Bayne. "Acoustic vs. photographic monitoring of gray wolves (Canis lupus): a methodological comparison of two passive monitoring techniques." Canadian Journal of Zoology 98, no. 3 (March 2020): 219–28. http://dx.doi.org/10.1139/cjz-2019-0081.

Abstract:
Remote camera traps are often used in large-mammal research and monitoring programs because they are cost-effective, allow for repeat surveys, and can be deployed for long time periods. Statistical advancements in calculating population densities from camera-trap data have increased the popularity of camera usage in mammal studies. However, drawbacks to camera traps include their limited sampling area and tendency for animals to notice the devices. In contrast, autonomous recording units (ARUs) record the sounds of animals with a much larger sampling area but are dependent on animals producing detectable vocalizations. In this study, we compared estimates of occupancy and detectability between ARUs and remote cameras for gray wolves (Canis lupus Linnaeus, 1758) in northern Alberta, Canada. We found ARUs to be comparable with cameras in their detectability and occupancy of wolves, despite only operating for 3% of the time that cameras were active. However, combining cameras and ARUs resulted in the highest detection probabilities for wolves. These advances in survey technology and statistical methods provide innovative avenues for large-mammal monitoring that, when combined, can be applied to a broad spectrum of conservation and management questions, provided assumptions for these methods are rigorously tested and met.
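As a purely illustrative aside (not the authors' occupancy model), the benefit of pairing the two sensor types can be seen from the probability that at least one sensor detects, assuming independent detections; the per-survey probabilities below are hypothetical.

```python
# Toy illustration of why combining cameras and ARUs raises detection probability.
# p_cam and p_aru are hypothetical per-survey detection probabilities, not values
# reported by Garland et al. (2020), and independence is assumed.

def combined_detection(p_cam: float, p_aru: float) -> float:
    """Probability that at least one of two independent sensors detects a wolf."""
    return 1.0 - (1.0 - p_cam) * (1.0 - p_aru)

print(combined_detection(0.30, 0.25))  # 0.475 -- higher than either sensor alone
```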
2

Zhang, Ying, Jitao Bai, Yu Diao, Zhonghao Chen, Chu Wang, Kun Yang, Zeng Gao, and Huajie Wei. "Risk and Energy Based Optimization for Fire Monitoring System in Utility Tunnel Using Cellular Automata." Sustainability 16, no. 11 (June 1, 2024): 4717. http://dx.doi.org/10.3390/su16114717.

Abstract:
Fire is one of the biggest threats to the safety of utility tunnels, and establishing camera-based monitoring systems is conducive to early fire finding and better understanding of the evolution of tunnel fires. However, conventional monitoring systems are being faced with the challenge of high energy consumption. In this paper, the camera operation in a utility tunnel was optimized considering both fire risk and energy consumption. Three design variables were investigated, namely the camera sight, the number of cameras in simultaneous operation, and the duration of camera operation. Cellular automata were used as a simple but effective method to simulate the spread of fire in a utility tunnel. Results show that as the number of cameras in simultaneous operation increases, the probability of fire capture also increases, but the energy consumption decreases. A shorter duration of camera operation can lead to a higher probability of fire capture, and meanwhile, lower energy consumption. For the duration of camera operation shorter than or equal to the allowable time, the probability of fire capture is significantly higher than that for the duration longer than the allowable time. Increasing the camera sight will significantly increase the probability of fire capture and lower the total energy consumption when a blind monitoring area exists. The total energy consumption of a camera-based monitoring system roughly satisfies hyperbolic correlation with the duration of camera operation, while the probability of fire capture can be predicted based on the number of cameras in simultaneous operation through a power model. The optimal design for the modeled tunnel section is two cameras in simultaneous operation with a tangent monitoring area. The duration of camera operation should be as short as possible, at least shorter than the allowable time. The study is expected to provide a reference for the sustainable design of energy-saving utility tunnels with lower fire risk.
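As a hedged illustration of the modelling idea (a cellular automaton for fire spread plus scheduled cameras), the sketch below simulates a toy one-dimensional tunnel; the cell count, spread probability, and camera positions are made-up parameters, not the paper's model.

```python
# Minimal 1-D cellular-automaton sketch of fire spread in a tunnel section, with a
# naive "probability of fire capture" by cameras watching fixed cells. This is only
# a toy stand-in for the model and parameters used by Zhang et al. (2024).
import random

N_CELLS = 100            # tunnel discretised into cells (assumed)
SPREAD_P = 0.6           # chance fire spreads to a neighbouring cell per step (assumed)
CAMERA_CELLS = {25, 75}  # cells watched by the two operating cameras (assumed)

def run_once(steps: int = 50) -> bool:
    fire = [False] * N_CELLS
    fire[random.randrange(N_CELLS)] = True        # random ignition point
    for _ in range(steps):
        new = fire[:]
        for i, burning in enumerate(fire):
            if burning:
                for j in (i - 1, i + 1):          # spread to neighbours
                    if 0 <= j < N_CELLS and random.random() < SPREAD_P:
                        new[j] = True
        fire = new
        if any(fire[c] for c in CAMERA_CELLS):    # fire entered a monitored cell
            return True
    return False

capture_rate = sum(run_once() for _ in range(1000)) / 1000
print(f"estimated probability of fire capture: {capture_rate:.2f}")
```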
3

Orisa, Mira, Karina Auliasari, and Rofila El Maghfiroh. "TEKNOLOGI MOTION-BASED TRACKING UNTUK PENGEMBANGAN APLIKASI KEAMANAN." Jurnal Teknologi Informasi dan Terapan 4, no. 2 (April 1, 2019): 119–24. http://dx.doi.org/10.25047/jtit.v4i2.69.

Abstract:
Surveillance camera systems are widely used by the public, usually in the form of CCTV cameras. In general, CCTV cameras can only record video, so security monitoring with them is effective only if an operator watches the footage directly on a monitor. A surveillance camera system can, however, be programmed to give a warning sign to the user. The surveillance camera used in this study is an IP camera, which can be programmed to send notifications to the user. By implementing motion-based tracking technology, the surveillance camera system can detect movement. The Kalman filter is one such motion-based tracking method and can predict the movements recorded by the IP camera. The results of this study show that the surveillance camera system can send notification messages to users via an Android device when the camera records the movement of a human object. A minimal tracking sketch is shown below.
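The sketch is a generic constant-velocity Kalman filter over (x, y) centroids using OpenCV, not the authors' implementation; in practice the centroids would come from motion detection on the IP-camera frames.

```python
# Generic constant-velocity Kalman filter for tracking an object's (x, y) centroid.
import numpy as np
import cv2

kf = cv2.KalmanFilter(4, 2)                      # state [x, y, vx, vy], measurement [x, y]
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
kf.errorCovPost = np.eye(4, dtype=np.float32)    # large initial uncertainty: first detection dominates

def track(detections):
    """detections: iterable of (x, y) centroids per frame, or None when the detector misses."""
    for det in detections:
        prediction = kf.predict()                # position is predicted even without a detection
        if det is not None:
            kf.correct(np.array(det, np.float32).reshape(2, 1))
        yield float(prediction[0, 0]), float(prediction[1, 0])

# e.g. list(track([(10, 10), (12, 11), None, (16, 13)]))
```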
4

Ali, S. Y., O. Al-Saleh, and P. A. Koushki. "Effectiveness of Automated Speed-Monitoring Cameras in Kuwait." Transportation Research Record: Journal of the Transportation Research Board 1595, no. 1 (January 1997): 20–26. http://dx.doi.org/10.3141/1595-04.

Abstract:
In 1994 the General Traffic Department installed automatic radar cameras to monitor traffic speed at a number of strategic roadway locations in Kuwait. The aim was to lower the number of high-speed violations and thus reduce road accidents. Recent traffic safety records point to an increase in both the number of violations and the occurrence of road accidents. It is argued in this paper that without live enforcement support and active follow-up of camera-recorded violations, the effectiveness of these cameras in improving road safety is insignificant at best, particularly in the undisciplined driving environment of the oil-rich nations in the Middle East. Traffic speed was measured simultaneously via radar instruments at the automatic camera site and at sections approximately 1 km before and/or after the cameras at eight camera locations. Measurements were recorded for six half-hour periods at each site, for a total of 72 hr over a period of 3 months, so that morning, afternoon, and after-dark hours, as well as different days of the week and roadway types, were covered. Analysis of the speed data showed that, for the three daily periods and the various roadway types, traffic speeds were consistently higher in the sections before and/or after the automatic camera than at the camera site itself. Statistical tests indicated that the difference between speeds measured at and away from the cameras was significant at the 99 percent level. The findings demonstrate that in a traffic environment characterized by poor driving behavior, inconsistent and piecemeal driver education programs, and insufficient presence of law enforcement officials, reliance on automatic cameras alone to reduce traffic violations is doomed to fail.
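For readers unfamiliar with the kind of test referred to above, a minimal sketch of comparing speeds at and away from a camera with a two-sample t-test at the 99 percent level follows; the speed values are fabricated for the example, not data from the study.

```python
# Hedged illustration: two-sample Welch t-test on speeds at the camera vs. ~1 km away.
# The numbers are made up for the example, not measurements from Ali et al. (1997).
import numpy as np
from scipy import stats

speeds_at_camera = np.array([78, 82, 80, 76, 79, 81, 77, 80], dtype=float)   # km/h, hypothetical
speeds_away      = np.array([95, 99, 92, 97, 101, 94, 96, 98], dtype=float)  # km/h, hypothetical

t_stat, p_value = stats.ttest_ind(speeds_away, speeds_at_camera, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.01:  # 99 percent confidence level
    print("speeds away from the camera are significantly higher")
```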
5

Hotař, Vlastimil. "Monitoring of Glass Production Using Vision Systems." Advanced Materials Research 39-40 (April 2008): 511–16. http://dx.doi.org/10.4028/www.scientific.net/amr.39-40.511.

Abstract:
Applications of vision systems for control and monitoring are becoming more widespread. However, there are still specific problems in the glass industry, especially the transparency of glass (often colourless), which requires the use of special illumination, lens adapters, filters, software filters, image analyses, etc. An important problem is the choice of an optimal analysis for the obtained images that corresponds with the character of the obtained data. Our research is developing the use of cameras (area scan cameras, line scan cameras, and intelligent cameras) for quality monitoring of glass production, connected to a PC or to compact vision systems, together with software solutions using appropriate image analyses (standard and new algorithms), in order to solve real problems in industrial applications. The first stage of the research uses a digital camera, an area scan camera, and a line scan camera for monitoring glass melt and glass products, to solve specific problems with lighting and to test standard and non-standard analyses, such as fractal geometry, for the evaluation of production and products. This article briefly presents basic information about our results and the possibilities of application in the glass industry.
6

Zhou, Chenchen, Shaoqi Wang, Yi Cao, Shuang-Hua Yang, and Bin Bai. "Online Pyrometry Calibration for Industrial Combustion Process Monitoring." Processes 10, no. 9 (August 26, 2022): 1694. http://dx.doi.org/10.3390/pr10091694.

Abstract:
Temperature and its distribution are crucial for combustion monitoring and control. For this application, digital camera-based pyrometers are becoming increasingly popular due to their relatively low cost. However, these pyrometers are not universally applicable because they depend on calibration. Compared with pyrometers, monitoring cameras exist in almost every combustion chamber. Although these cameras theoretically have the ability to measure temperature, due to the lack of calibration they are only used for visualization to support the decisions of operators. Almost all existing calibration methods are laboratory-based and hence cannot calibrate a camera in operation. This paper proposes an online calibration method. It uses a pre-calibrated camera as a standard pyrometer to calibrate another camera in operation. The calibration is based on a photo taken by the pyrometry camera at a position close to the camera in operation. Since the calibration does not affect the use of the camera in operation, it sharply reduces the cost and difficulty of pyrometer calibration. In this paper, a procedure for online calibration is proposed, and advice on how to set camera parameters is given. In addition, the ratio pyrometry is revised for a wider temperature range. The online calibration algorithm is developed based on two assumptions for images of the same flame taken in proximity: (1) there are common regions between the two images taken at close positions; (2) there are some constant characteristic temperatures between the two-dimensional temperature distributions of the same flame taken from different angles. These two assumptions are verified in a real industrial plant. Based on these two verified features, a temperature distribution matching algorithm is developed to calibrate pyrometers online. The method was tested and validated in an industrial-scale municipal solid waste incinerator. The accuracy of the calibrated pyrometer is sufficient for flame monitoring and control.
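For orientation only, the sketch below shows the classical two-colour (ratio) pyrometry relation under the Wien approximation and a grey-body assumption; the paper revises this relation for a wider temperature range, and the wavelengths and intensities used here are hypothetical.

```python
# Generic two-colour (ratio) pyrometry under the Wien approximation and a grey-body
# assumption -- shown only to illustrate the principle; not the revised relation or
# calibration procedure from Zhou et al. (2022).
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def ratio_pyrometry_temperature(i1, i2, lam1, lam2):
    """Temperature (K) from intensities i1, i2 measured at wavelengths lam1, lam2 (m)."""
    ratio = (i1 / i2) * (lam1 / lam2) ** 5          # grey body: emissivities cancel
    return C2 * (1.0 / lam2 - 1.0 / lam1) / math.log(ratio)

# e.g. red (620 nm) vs green (540 nm) channel intensities of a flame pixel (hypothetical):
print(ratio_pyrometry_temperature(i1=180.0, i2=90.0, lam1=620e-9, lam2=540e-9))  # ~2500 K
```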
7

Atya, Baidaa A., Abdul Monem S. Rahma, and Abdul Mohssen J. Abdul Hossen. "Design and Implementation of Secure Building Monitoring System using Programmable Wireless Mobile Camera." International Journal of Computer Network and Information Security 9, no. 3 (March 8, 2017): 29–35. http://dx.doi.org/10.5815/ijcnis.2017.03.04.

Abstract:
In recent decades, monitoring cameras have begun to play a vital role in securing sensitive systems such as government sites and establishments. Generally, these kinds of cameras are fixed-location (i.e., outdoor) cameras, so the viewpoint is limited to a small area and does not cover the whole place. In addition, there are drawbacks to using these kinds of cameras, such as being breakable (intentionally or not), which may lead to camera malfunction, or breaks in the connecting electrical wires, which may cause disconnection between the camera, the monitor, and its receiver. The main problem, however, is the lack of a secure protection system that prevents intruders from entering the system and disabling it or causing it to malfunction. In this research a new system is proposed to solve these problems by using a wireless mobile camera with an embedded programmable operating system, which enables the camera to be controlled remotely by sending wireless commands through an embedded Arduino card controller. This card allows the connection between the camera and the server to be programmed by the user or developer. The main goal of this research is to design a monitoring system that detects suspicious events and ensures that the monitoring data transferred from the camera to the server is not infiltrated by an unauthorized person, by applying a set of image detection, object tracking, and security algorithms to the instructions or program of the camera. Compared with other research, this work achieved the following goals: (1) using an Arduino card for programming the camera; (2) an IP camera that does not require a user name and password; (3) encryption of the images and other information sent to and from the computer; (4) use of a mobile wireless camera; (5) a key-exchange process between the camera and the computer. The results of this research are good and achieved the main goals of the newly developed technique.
8

Mahyan, Fariza Binti, Asrani Bin Lit, Nglai Anak Satu, Mark Sydney Anak Banyah, and Mcirvine Igoh Ak Gumis. "Raspberry Pi-based home security monitoring system." E3S Web of Conferences 479 (2024): 07014. http://dx.doi.org/10.1051/e3sconf/202447907014.

Abstract:
The era of technology has opened space to facilitate daily tasks. Security cameras have nowadays become a necessity for every expert’s safety environment. A buzzer, PIR sensor, PI camera, and Raspberry Pi are used to create home security systems. The PIR sensor detects motion, the PI camera snaps an image, and the buzzer beeps. A notification will be delivered immediately to the owner’s Telegram account when the camera captures the person’s face and enables the posting of live video of that moment. It enables the user to record the incidents that happen at home. The purpose of developing the Telegram application is mainly to provide the owner with an Android application because nowadays society is more dependent on mobile technology. PIR sensors will be active when they detect people or animals, but they will deactivate when they detect breezes. As a result, the PI Camera will only record images of things like people and animals, not the wind. The PIR sensor, PI camera, and buzzer will all have a linear relationship to one another. When the PIR sensor is turned on, the PI camera and buzzer will also turn on, and vice versa.
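A minimal sketch of the PIR-to-camera-to-Telegram pipeline described above is given below, assuming a Raspberry Pi with the gpiozero and picamera libraries; the GPIO pins, bot token, and chat id are placeholders, and this is not the authors' code.

```python
# Sketch of a PIR-triggered capture-and-notify loop on a Raspberry Pi.
# Pin numbers, the Telegram bot token and the chat id are placeholders.
import requests
from gpiozero import MotionSensor, Buzzer
from picamera import PiCamera

BOT_TOKEN = "<telegram-bot-token>"   # placeholder
CHAT_ID = "<chat-id>"                # placeholder

pir = MotionSensor(4)                # PIR sensor on GPIO4 (assumed wiring)
buzzer = Buzzer(17)                  # buzzer on GPIO17 (assumed wiring)
camera = PiCamera()

def send_alert(image_path: str) -> None:
    """Post the captured image to the owner's Telegram chat."""
    url = f"https://api.telegram.org/bot{BOT_TOKEN}/sendPhoto"
    with open(image_path, "rb") as photo:
        requests.post(url, data={"chat_id": CHAT_ID}, files={"photo": photo})

while True:
    pir.wait_for_motion()            # blocks until the PIR reports movement
    buzzer.on()
    camera.capture("/tmp/intruder.jpg")
    send_alert("/tmp/intruder.jpg")
    pir.wait_for_no_motion()
    buzzer.off()
```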
9

Li, Xiao Meng, Shu Ce Zhang, Yong Liu, Xue Heng Tao, Xue Jun Wang, and Jin Shi Lu. "Design of Control System for Monitoring Camera Cleaning Robot." Applied Mechanics and Materials 513-517 (February 2014): 4047–51. http://dx.doi.org/10.4028/www.scientific.net/amm.513-517.4047.

Abstract:
In order to solve the problem that manually cleaning high-altitude monitoring cameras is difficult and risky, a scheme in which a mobile knee-type robot with three degrees of freedom cleans the monitoring probe instead of a worker is proposed, and a control system based on an MCU is designed. The hardware and program design is completed, covering the movement of the manipulator, jet cleaning of the cameras with a high-pressure spray gun, drying of the camera surface, ultrasonic obstacle avoidance, camera monitoring, and the image processing module. Finally, experiments and tests of the cleaning robot prototype are carried out.
10

Zhang, Hua, Pengjie Tao, Xiaoliang Meng, Mengbiao Liu, and Xinxia Liu. "An Optimum Deployment Algorithm of Camera Networks for Open-Pit Mine Slope Monitoring." Sensors 21, no. 4 (February 6, 2021): 1148. http://dx.doi.org/10.3390/s21041148.

Abstract:
With the growth in demand for mineral resources and the increase in open-pit mine safety and production accidents, the intelligent monitoring of open-pit mine safety and production is becoming more and more important. In this paper, we elaborate on the idea of combining the technologies of photogrammetry and camera sensor networks to make full use of open-pit mine video camera resources. We propose the Optimum Camera Deployment algorithm for open-pit mine slope monitoring (OCD4M) to meet the requirements of a high overlap of photogrammetry and full coverage of monitoring. The OCD4M algorithm is validated and analyzed with the simulated conditions of quantity, view angle, and focal length of cameras, at different monitoring distances. To demonstrate the availability and effectiveness of the algorithm, we conducted field tests and developed the mine safety monitoring prototype system which can alert people to slope collapse risks. The simulation’s experimental results show that the algorithm can effectively calculate the optimum quantity of cameras and corresponding coordinates with an accuracy of 30 cm at 500 m (for a given camera). Additionally, the field tests show that the algorithm can effectively guide the deployment of mine cameras and carry out 3D inspection tasks.
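The OCD4M algorithm itself is not reproduced in the abstract; as a simplified stand-in, the sketch below shows a generic greedy coverage heuristic for choosing camera positions so that every monitored slope point is seen, without the photogrammetric-overlap constraint the paper adds.

```python
# Generic greedy set-cover heuristic for camera placement -- a simplified stand-in
# for OCD4M (Zhang et al., 2021), which also enforces photogrammetric overlap.
def greedy_camera_placement(candidate_views, points):
    """candidate_views: dict camera_position -> set of visible point ids.
    Returns a list of chosen camera positions that together cover all points."""
    uncovered = set(points)
    chosen = []
    while uncovered:
        best = max(candidate_views, key=lambda c: len(candidate_views[c] & uncovered))
        gained = candidate_views[best] & uncovered
        if not gained:
            raise ValueError("remaining points cannot be covered by any candidate")
        chosen.append(best)
        uncovered -= gained
    return chosen

# e.g. greedy_camera_placement({"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5}}, {1, 2, 3, 4, 5})
# -> ["A", "C"]
```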

Dissertations and theses on the topic "Camera monitoring"

1

Lundgren, Elida. "Evaluating camera monitoring of breeding seabirds." Thesis, Uppsala universitet, Institutionen för biologisk grundutbildning, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-412458.

Abstract:
This thesis was made in collaboration with the Baltic Seabird Project with the purpose of evaluating the possibilities of using cameras to monitor the fledging success of Common Guillemots on Stora Karlsö. Fledging success is measured from the time the egg is laid and hatched until the chick is 15 days old and considered ready to leave the nest. Camera monitoring means that the breeding area where the birds reside is photographed at a predetermined interval to attempt to capture the offspring on camera and determine its survival. The study was conducted on Stora Karlsö by installing cameras in the artificial breeding shelf, Auk Lab, and assessing how well the method works. Whether the offspring can be caught on camera was the main question, with a secondary question addressing the differences from the traditional monitoring method, which is carried out through daily checks by one person on site. The results show that camera monitoring can be a useful method. It is possible to observe the offspring, but distance and angle are important factors affecting image quality. Power supply and memory card size are important factors that decide whether time savings can be made compared to the traditional method. Further development of the routines for camera monitoring is necessary to achieve reliable data collection.
2

Karim, Kh Nafis. "INTELLIGENT SYSTEM FOR MONITORING PHYSIOLOGICAL PARAMETERS USING CAMERA." Thesis, Mälardalens högskola, Inbyggda system, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-29834.

Abstract:
Measuring physiological parameters or vital signs using a camera has become popular in recent years. Contact-less monitoring and extraction of vital signs can be an important source of information in situations such as medical care systems and safety control systems. This paper presents the implementation of a real-time, non-contact method for the extraction of vital signs, in this case heart rate. A better face tracking method is used for efficient face detection. This study extends some of the previous work and provides a comparison with several methods. The developed system applies windowed filtering over the green channel of the signal and then converts it to the frequency domain to analyze the signal and detect heart rate. The developed system achieved high correlation and showed small error when referenced against the actual heart signal from an ECG. The method delivers better results in good lighting conditions but gives fairly good results in lower light as well.
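A minimal sketch of the processing chain summarised above, averaging the green channel of the face region per frame, band-passing around plausible heart rates, and reading the dominant FFT peak, is given below; the frame rate and band limits are assumptions for the example, not values from the thesis.

```python
# Sketch: heart rate from a per-frame mean green-channel trace of the face region.
# Frame rate and band limits are assumed, not taken from the thesis.
import numpy as np
from scipy.signal import butter, filtfilt

def heart_rate_bpm(green_trace: np.ndarray, fps: float = 30.0) -> float:
    """green_trace: 1-D array of mean green-channel values, one per video frame."""
    detrended = green_trace - np.mean(green_trace)
    b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="band")  # ~42-240 bpm
    filtered = filtfilt(b, a, detrended)
    spectrum = np.abs(np.fft.rfft(filtered * np.hanning(len(filtered))))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    return 60.0 * freqs[np.argmax(spectrum)]

# e.g. heart_rate_bpm(np.random.rand(900))  # 30 s of frames at 30 fps
```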
3

Daly, Jonathan. "Video camera monitoring to detect changes in haemodynamics." Thesis, University of Oxford, 2016. https://ora.ox.ac.uk/objects/uuid:e84f2acf-f35c-4257-a4c3-209c5da9cbee.

Abstract:
Patients in hospital can be prone to sudden, life-threatening changes in their cardiovascular state. Haemodynamic parameters such as blood pressure, pulse transit time (PTT) and perfusion can be monitored in clinical situations to identify these changes as early as possible. Continuous blood pressure is usually monitored using a catheter placed into a major artery, but this is invasive and involves risk to the patient. In the last decade, the field of non-contact vital sign monitoring has emerged, with growing evidence that the remote photoplethysmogram (rPPG) signal can be used to estimate vital signs using video cameras. If the analysis of the rPPG signal can be expanded to include the estimation of haemodynamic parameters, it could result in methods for the continuous, non-contact monitoring of a subject's haemodynamic state. In a physiology study, a series of video recordings were made of 43 healthy volunteers. The subjects sat in a purpose-built chamber, and the composition of the air was carefully adjusted to cause the subjects to experience large, controlled changes in blood oxygen levels. To validate the video camera algorithms, reference data were also collected. Along with the volunteer study, a clinical study was performed to acquire data in a challenging clinical environment. Data were collected from patients on haemodialysis in the Renal Unit, a population likely to experience sudden changes in haemodynamics. The reference data from the Renal Unit study were analysed to determine the extent to which PTT and mean arterial pressure (MAP) are related. The correlation coefficients and linear fits were found on a global and a per-subject basis. In addition, the video recordings from the Physiology study were processed to derive rPPG signals, and these signals were analysed to obtain estimates for PTT. Local rPPG signals were also derived for different regions of interest, and the waveforms were analysed using a novel application of the technique of signal averaging to produce spatial maps of perfusion and blood flow. The correlation between conventionally measured PTT and MAP was found to be weaker in the haemodialysis population than has been shown elsewhere in the literature, except for a sub-set of patients. The results of the video analysis showed that PTT could be estimated robustly and consistently, although direct validation of these estimates was not possible because of the different method used to calculate the reference PTT. For most subjects, the spatial mapping methods produced robust maps that were consistent over time. These results suggest that it is possible to detect changes in haemodynamics using a video camera, and that this could have applications in healthcare, providing that challenges such as subject movement and clinical validation can be overcome.
4

Gang, Siqi. "Driver-Monitoring-Camera Based Threat Awareness for Collision Avoidance." Thesis, KTH, Skolan för industriell teknik och management (ITM), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-263926.

Abstract:
Since forward collision is one of the most common and dangerous types of traffic accidents, many studies have been conducted to develop forward collision avoidance systems. To facilitate the trade-off between comfort and safety in forward collision avoidance, the driver's state needs to be monitored and estimated. Such support is necessary for a Forward Collision Warning (FCW) system, given that a human is involved in control. Due to advances in Driver Monitoring Systems (DMS), the demand for camera-based estimation of the driver's state has increased. This master thesis project, conducted at Zenuity AB, investigates a method to estimate the driver's awareness based on a DMS. The estimation of a driver's awareness is expected to help adapt the FCW system based on visual attention when facing unpredictable braking of the leading vehicle. The project consists of three tasks: gaze estimation, Gaze-to-Object Mapping (GTOM), and awareness estimation. A combined Kalman filter was developed in gaze estimation to compensate for missing data and outliers and to reduce the difference from "ground truth" data. The uncertainty matrix from gaze estimation was utilized to extract a gaze-to-object probability signal in GTOM, while the corresponding fixation duration was also obtained in GTOM. The two extracted features were used in awareness estimation with two methods: logistic regression and a two-Hidden-Markov-Model approach. The comparison between the two methods reveals whether a more complex method is preferable. Based on the results of this project, logistic regression seems to perform better in estimating the driver's state, with 92.0% accuracy and a 76.3% true negative rate. However, further research and improvements on the two-Hidden-Markov-Model approach are needed to reach a more comprehensive conclusion. The main contribution of this project is an investigation of an end-to-end method for estimating driver awareness and thereby an identification of challenges for further studies.
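As a hedged sketch of the simpler of the two classifiers (logistic regression on the gaze-to-object probability and fixation-duration features), the example below trains on synthetic data; it only conveys the shape of the approach, not the thesis implementation.

```python
# Logistic regression on two gaze features -- synthetic data, illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# columns: [gaze-to-object probability, fixation duration (s)] -- synthetic
X = np.column_stack([rng.uniform(0, 1, 200), rng.uniform(0, 2, 200)])
y = (0.6 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 0.1, 200) > 0.5).astype(int)  # 1 = aware

clf = LogisticRegression().fit(X, y)
print("training accuracy:", clf.score(X, y))
print("P(aware) for prob=0.8, fixation=1.2 s:", clf.predict_proba([[0.8, 1.2]])[0, 1])
```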
5

Tian, Yi. "Self-Powered Intelligent Traffic Monitoring Using IR Lidar and Camera." Thesis, Virginia Tech, 2017. http://hdl.handle.net/10919/74949.

Abstract:
This thesis presents a novel self-powered infrastructural traffic monitoring approach that estimates traffic information by combining three detection techniques. The traffic information that can be obtained from the presented approach includes vehicle counts, speed estimation, and vehicle classification based on size. Two categories of sensors are used: IR Lidar and IR camera. With the two sensors, three detection techniques are used: Time of Flight (ToF) based, vision based, and laser spot flow based. Each technique outputs observations about vehicle location at each time step. By fusing the three observations in the framework of a Kalman filter, vehicle location is estimated, based on which the other traffic information of interest, including vehicle counts, speed, and class, is obtained. In this process, high reliability is achieved by combining the strengths of each technique. To achieve self-powering, a dynamic power management strategy is developed to reduce the system's total energy cost and optimize the power supply in traffic monitoring based on traffic pattern recognition. The power manager adjusts the power supply by reconfiguring the system setup according to its estimate of the current traffic condition. A system prototype has been built, and multiple field experiments and simulations were conducted to demonstrate traffic monitoring accuracy and power reduction efficacy.
6

Pozzi, Colakovic Emir. "Monitoring surface cleanliness of manufactured metal parts using camera technique." Thesis, KTH, Industriell produktion, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-232670.

Abstract:
Technical surface cleanliness is a mandatory requirement for many production lines. It is part of the quality control process to ensure that the surfaces of the manufactured components are free of contamination and are ready for the next step in the production line. The project has taken into account two different requirements related to component cleanliness: first, the cleanliness is measured as the level of contamination on a sample, and then the particle sizes are determined. An important factor in cleanliness is the presence of particles larger than a certain limit, called the critical particles, which have to be detected. This thesis is inspired by an issue SCANIA has (2018) in its production plant in Södertälje, Stockholm. SCANIA has a quality control system that analyzes the contamination level of a few components through a time-consuming process that takes hours and requires expensive microscopes, human intervention, and a dedicated measurement room in the plant. The aim of this thesis is to investigate an alternative technical cleanliness monitoring method based on image analysis of the contamination samples. With a simple digital camera, pictures of the contaminated samples are taken, processed, and analyzed in order to obtain the cleanliness level and the particle sizes of the samples. In contrast to the current solution, the proposed method has the possibility of being implemented in the production line, providing a larger sampling rate.
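A rough sketch of the image-analysis step described above, thresholding a photograph of the contamination sample, measuring the detected particles, and flagging those above a critical size, follows; the Otsu threshold choice, pixel-to-millimetre scale, and critical size are assumptions, not values from the thesis.

```python
# Sketch: count and size dark particles on a light sample photograph with OpenCV.
# The calibration and critical size below are assumed, not the thesis's values.
import cv2

MM_PER_PIXEL = 0.02        # assumed camera calibration
CRITICAL_MM = 0.6          # assumed critical particle size

def analyse_sample(image_path: str):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    n_labels, _, stats, _ = cv2.connectedComponentsWithStats(binary)
    sizes_mm = []
    for i in range(1, n_labels):                       # label 0 is the background
        w, h = stats[i, cv2.CC_STAT_WIDTH], stats[i, cv2.CC_STAT_HEIGHT]
        sizes_mm.append(max(w, h) * MM_PER_PIXEL)      # longest extent as particle size
    critical = [s for s in sizes_mm if s >= CRITICAL_MM]
    return len(sizes_mm), critical

# e.g. total, critical = analyse_sample("sample_photo.jpg")  # placeholder path
```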
7

Elzagzoug, Ezzaldeen. "Chromatic monitoring of transformer oil condition using CCD camera technology." Thesis, University of Liverpool, 2013. http://livrepository.liverpool.ac.uk/12233/.

Abstract:
Power transformers are essential components within the power distribution system, and transformer failures have a high economic impact on the distribution operators and on industrial and domestic customers. Dielectric mineral oil is used in transformers for electrical insulation between live parts, for cooling, and for protection of the insulation papers in the transformer. Oil contamination and changes in the chemical structure of the oil result in the decay of insulation paper and reduced insulation and cooling, which can lead to a transformer failure. The general approach to oil monitoring has been for an operator to examine the colour index (ASTM) of the oil, electrical strength, acidity, water content, and dissolved gas analysis results and form an opinion as to the extent of oil degradation. Chromatic techniques enable data from different sources to be combined to give an overall evaluation of the condition of the system being monitored. One of the main goals of this work was to use chromatic techniques for integrating the oil data from the different sources and sensors. In addition, the chromatic approach enables liquids to be monitored optically, so a second aim was to apply chromatic optical oil monitoring using a portable system by transmitting polychromatic light through the oil sample, which is contained in a transparent cuvette and imaged using a mobile phone camera. A number of oil samples were optically analysed with the portable chromatic system, and the optical data were compared with the colour index and chromatically combined with the dissolved gas and other oil data to give an overall evaluation of oil degradation. The chromatic optical result compared favourably with the colour index. It was also possible to classify the oil samples chromatically into categories of low, medium, and high degradation. This enabled the chromatic data combination approach to be implemented as a prototype system in Matlab software that an operator could use to obtain a classification of an oil sample. An essential experiment was introduced to monitor different oil particles by examining samples passed through filter paper. Besides the ability to analyse data and distinguish between fresh and contaminated oil samples, the chromatic technique can track the history of differently degraded oil samples, which can give an indication of failure faults and could provide a prediction of future faults. Therefore, a commercially viable, reliable system can be developed to extend the service life and the maintenance schedules. These monitoring systems could lead to extending the service life of transformers, making the electricity supply more reliable and giving the consumer a better quality of life.
8

Trumpp, Alexander, Johannes Lohr, Daniel Wedekind, Martin Schmidt, Matthias Burghardt, Axel R. Heller, Hagen Malberg, and Sebastian Zaunseder. "Camera-based photoplethysmography in an intraoperative setting." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2018. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-234950.

Abstract:
Background: Camera-based photoplethysmography (cbPPG) is a measurement technique which enables remote vital sign monitoring by using cameras. To obtain valid plethysmograms, proper regions of interest (ROIs) have to be selected in the video data. Most automated selection methods rely on specific spatial or temporal features, limiting a broader application. In this work, we present a new method which overcomes those drawbacks and, therefore, allows cbPPG to be applied in an intraoperative environment. Methods: We recorded 41 patients during surgery using an RGB and a near-infrared (NIR) camera. A Bayesian skin classifier was employed to detect suitable regions, and a level set segmentation approach to define and track ROIs based on spatial homogeneity. Results: The results show stable and homogeneously illuminated ROIs. We further evaluated their quality with regard to the extracted cbPPG signals. The green channel provided the best results, where heart rates could be correctly estimated in 95.6% of cases. The NIR channel yielded the highest contribution in compensating false estimations. Conclusions: The proposed method proved that cbPPG is applicable in intraoperative environments. It can be easily transferred to other settings regardless of which body site is considered.
9

Freeman, Marianne Sarah. "Development of camera trap methodology in monitoring deer distribution and abundance." Thesis, Queen's University Belfast, 2015. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.695342.

Abstract:
Camera traps have taken off as one of the most popular tools in ecology. This thesis aims to develop existing camera trap methodology in order to better assess the distribution and abundance of deer in the UK. Particular focus was placed on the invasion history of muntjac to help elucidate their invasion pattern. The number of founding females was estimated to be 4 or 5 individuals. The effect of covariates on camera detection zones was considered to help improve density estimates resulting from camera trap research. Flash type and individual passing speed proved to be two important covariates, adding weight to the recommendation that camera detection zones should be survey specific and that activity patterns should be considered when determining detection zones. Eight deer population densities were estimated from across the UK using both thermal imaging distance sampling and random encounter model (REM) techniques. A higher density was found with the REM, though the two methods appeared more comparable in open woodlands. A low-quality thermal imaging camera may have biased the results, but this study also emphasises the need to ensure that other parameters, such as daily travel distance, are site specific and as accurate as possible. Muntjac sightings within Northern Ireland were collated and verified using a scoring system and a combination of surveys. The REM was trialled at one site, finding a minimum population of 5 muntjac deer. This baseline result can be used in any future population monitoring. These verified sightings, alongside others from Ireland, were used to test a muntjac species distribution model with different sampling-bias approaches. The random background model was the most parsimonious model, suggesting that, in this case, the additional bias-controlling techniques may not always be necessary.
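For context, the standard random encounter model (REM) density estimator of Rowcliffe et al. (2008), which this line of work builds on, is sketched below; the parameter values are placeholders rather than results from the thesis.

```python
# Standard REM density estimator (Rowcliffe et al. 2008): trap rate scaled by the
# animal's day range and the camera detection zone. Parameter values are placeholders.
import math

def rem_density(photos, camera_days, speed_km_day, radius_km, angle_rad):
    """Animals per km^2 from trap rate, day range and camera detection zone."""
    trap_rate = photos / camera_days
    return trap_rate * math.pi / (speed_km_day * radius_km * (2 + angle_rad))

# e.g. 40 muntjac photos over 400 camera-days, 1.2 km/day range,
# 10 m detection radius, 40-degree detection arc (all hypothetical):
print(rem_density(40, 400, 1.2, 0.010, math.radians(40)))  # ~9.7 deer per km^2
```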
10

Lo Coco, Eleonora. "Monitoring SO2 degassing on Stromboli using a permanent UV Camera network." Doctoral thesis, Università degli Studi di Palermo, 2020. http://hdl.handle.net/10447/427103.


Books on the topic "Camera monitoring"

1

Raphael, Martin G., and Pacific Northwest Research Station (Portland, Or.), eds. Inexpensive camera systems for detecting martens, fishers, and other animals: Guidelines for use and standardization. Portland, Or.: U.S. Dept. of Agriculture, Forest Service, Pacific Northwest Research Station, 1993.

2

O'Connell, Allan F., James D. Nichols, and K. Ullas Karanth, eds. Camera traps in animal ecology: Methods and analyses. Tokyo: Springer, 2011.

3

WWF Nepal Program, ed. Status, distribution, and monitoring of tiger populations in Terai Arc Landscape (TAL)-Nepal: A photographic documentation of camera trapped tigers. Kathmandu: WWF Nepal Program, 2002.

4

Rieckoff, T. J. High-speed observer: Automated streak detection in SSME plumes. Marshall Space Flight Center, Ala: National Aeronautics and Space Administration, George C. Marshall Space Flight Center, 2001.

5

George C. Marshall Space Flight Center., ed. High-speed observer: Automated streak detection in SSME plumes. Marshall Space Flight Center, Ala: National Aeronautics and Space Administration, George C. Marshall Space Flight Center, 2001.

6

George C. Marshall Space Flight Center., ed. High-speed observer: Automated streak detection in SSME plumes. Marshall Space Flight Center, Ala: National Aeronautics and Space Administration, George C. Marshall Space Flight Center, 2001.

7

Hare, John. The lost camels of Tartary: A quest into forbidden China. London: Abacus, 1999.

8

Hare, John. The lost camels of Tartary: A quest into forbidden China. London: Little, Brown, 1998.

9

Zimmermann, Fridolin, Francesco Rovero, and Luigi Boitani. Camera Trapping for Wildlife Research. Pelagic Publishing Ltd., 2016.

10

Camera Trapping for Wildlife Research. Pelagic Publishing Ltd., 2016.


Book chapters on the topic "Camera monitoring"

1

Fujisawa, Tetsuya, Tadahito Egawa, Kazuhiko Taniguchi, Syoji Kobashi, and Yutaka Hata. "An Energy Visualization by Camera Monitoring." In Advanced Intelligent Systems, 51–64. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-05500-8_6.

2

Rawat, Chandan Singh, Tanya Dubey, Sagar Pujari, Yash Bhise, and Hridesh Kamal. "Heart Rate Monitoring Using External Camera." In ICT Infrastructure and Computing, 397–406. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-99-4932-8_36.

3

Yu, Meng-Chieh, Huan Wu, Jia-Ling Liou, Ming-Sui Lee, and Yi-Ping Hung. "Multiparameter Sleep Monitoring Using a Depth Camera." In Biomedical Engineering Systems and Technologies, 311–25. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38256-7_21.

4

Deng, Lijun, Wei Shen, Yi Lin, Wei Gao, and Jiayuan Lin. "Surveillance Camera-Based Monitoring of Plant Flowering Phenology." In Communications in Computer and Information Science, 273–83. Singapore: Springer Singapore, 2017. http://dx.doi.org/10.1007/978-981-10-3966-9_31.

5

Javh, Jaka, Janko Slavič, and Miha Boltežar. "Full-Field Modal Analysis Using a DSLR Camera." In Structural Health Monitoring, Photogrammetry & DIC, Volume 6, 27–30. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-74476-6_4.

6

Choiński, Mateusz, Mateusz Rogowski, Piotr Tynecki, Dries P. J. Kuijper, Marcin Churski, and Jakub W. Bubnicki. "A First Step Towards Automated Species Recognition from Camera Trap Images of Mammals Using AI in a European Temperate Forest." In Computer Information Systems and Industrial Management, 299–310. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-84340-3_24.

Abstract:
Camera traps are used worldwide to monitor wildlife. Despite the increasing availability of Deep Learning (DL) models, the effective usage of this technology to support wildlife monitoring is limited. This is mainly due to the complexity of DL technology and high computing requirements. This paper presents the implementation of the light-weight and state-of-the-art YOLOv5 architecture for automated labeling of camera trap images of mammals in the Białowieża Forest (BF), Poland. The camera trapping data were organized and harmonized using TRAPPER software, an open-source application for managing large-scale wildlife monitoring projects. The proposed image recognition pipeline achieved an average accuracy of 85% F1-score in the identification of the 12 most commonly occurring medium-size and large mammal species in BF, using a limited set of training and testing data (a total of 2659 images with animals). Based on the preliminary results, we have concluded that the YOLOv5 object detection and classification model is a fine and promising DL solution after the adoption of the transfer learning technique. It can be efficiently plugged in via an API into existing web-based camera trapping data processing platforms such as the TRAPPER system. Since TRAPPER is already used to manage and classify (manually) camera trapping datasets by many research groups in Europe, the implementation of AI-based automated species classification will significantly speed up the data processing workflow and thus better support data-driven wildlife monitoring and conservation. Moreover, YOLOv5 has been proven to perform well on edge devices, which may open a new chapter in animal population monitoring in real time, directly from camera trap devices.
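A minimal sketch of running a YOLOv5 detector on a camera-trap image via torch.hub is shown below; it loads the generic COCO-pretrained yolov5s weights, whereas the chapter fine-tuned the model (transfer learning) on labelled Białowieża Forest data, so this only illustrates the shape of the inference step.

```python
# Sketch: YOLOv5 inference on a single camera-trap image via torch.hub.
# Uses the generic COCO-pretrained weights, not the chapter's fine-tuned model.
import torch

model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
model.conf = 0.4                           # confidence threshold (assumed)

results = model("camera_trap_image.jpg")   # placeholder path; also accepts URLs or arrays
detections = results.pandas().xyxy[0]      # columns: xmin, ymin, xmax, ymax, confidence, class, name
print(detections[["name", "confidence"]])
```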
7

Rahman, Hamidur, Mobyen Uddin Ahmed, and Shahina Begum. "Vision-Based Remote Heart Rate Variability Monitoring Using Camera." In Internet of Things (IoT) Technologies for HealthCare, 10–18. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-76213-5_2.

8

Garbat, Piotr, and Agata Olszewska. "Remote Heart Rate Monitoring Using a Multi-band Camera." In Image Processing and Communications, 101–7. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-31254-1_13.

9

Gille, Max, and Daniel J. Rixen. "Toward Camera-Based Monitoring of Abdominal Aortic Aneurysms (AAAs)." In Computer Vision & Laser Vibrometry, Volume 6, 57–62. Cham: Springer Nature Switzerland, 2023. http://dx.doi.org/10.1007/978-3-031-34910-2_7.

10

Korobiichuk, Igor, Vitaliy Lysenko, Oleksiy Opryshko, Dmiyriy Komarchyk, Natalya Pasichnyk, and Andrzej Juś. "Crop Monitoring for Nitrogen Nutrition Level by Digital Camera." In Advances in Intelligent Systems and Computing, 595–603. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-77179-3_56.


Conference papers on the topic "Camera monitoring"

1

CHEN, JUSTIN, NEAL WADHWA, ABE DAVIS, FREDO DURAND, WILLIAM FREEMAN, and ORAL BUYUKOZTURK. "Long Distance Video Camera Measurements of Structures." In Structural Health Monitoring 2015. Destech Publications, 2015. http://dx.doi.org/10.12783/shm2015/385.

2

ALKADY, KHALID, ACHILLES G. RASQUINHA, JOSEF T. BRANDL, CHRISTINE E. WITTICH, and CARRICK DETWEILER. "TARGET-FREE, VISION-BASED SYSTEM IDENTIFICATION OF CIVIL STRUCTURES USING UNMANNED AERIAL VEHICLES." In Structural Health Monitoring 2023. Destech Publications, Inc., 2023. http://dx.doi.org/10.12783/shm2023/36877.

Abstract:
Vibration-based structural health monitoring (SHM) frameworks rely upon accurately identified natural frequencies and mode shapes of structures in the field, which is critical information for damage diagnostics and model updating. Vibrationbased techniques have traditionally relied on discrete contact-based sensors. Despite the success of traditional sensing modalities, challenges and limitations remain: 1) the sensors need to be placed at discrete locations, and 2) the structure needs to be accessed for instrumentation. Advancements in the fields of computer vision and robotics have facilitated the use of remote sensing technology, such as cameras and Unmanned Aerial Vehicles (UAVs), in vibration-based SHM applications to address the limitations of traditional sensors. Although UAVs have been used to monitor the dynamic response of structures, these applications have primarily relied on targets or GPS to track the structure’s motion, which is not always feasible due to the large scale of civil structures. To this end, the main objective of this study is to develop an end-to-end target-free, vision-based framework for system identification of civil structures using UAVs. The proposed framework incorporates a phase-based motion estimation approach to extract the structural vibration information from videos without placing targets in the scene. For videos collected by UAVs, a correction needs to be applied to account for the camera’s rigid body motion. To compensate for this motion, the framework extracts the power spectral density (PSD) plot of a static object in the scene and subtracts it from the PSDs of the structure-of-interest. To evaluate the efficacy and robustness of the developed framework, an experimental study was conducted to monitor the free vibration response of two single-degree-of-freedom structures using three different UAVs in a controlled laboratory environment. The analysis shows strong agreement between the results extracted from UAVs equipped with high-resolution cameras with those from a stationary camera and those from accelerometers. Furthermore, the results show that camera resolution, alignment, and motion can significantly impact the accuracy of the results. This study shows the potential of successfully incorporating UAVs into target-free vision-based dynamic monitoring frameworks.
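The camera-motion compensation idea described above can be sketched as a PSD subtraction; the example below uses synthetic signals and an assumed frame rate, and is not the authors' pipeline.

```python
# Sketch: subtract the PSD of a static object tracked in the UAV video from the PSD
# of the structure's tracked motion, leaving the structural frequencies.
# All signals and the frame rate below are synthetic placeholders.
import numpy as np
from scipy.signal import welch

fps = 60.0                                    # assumed video frame rate
t = np.arange(0, 30, 1 / fps)
uav_motion = 0.5 * np.sin(2 * np.pi * 0.3 * t)               # low-frequency drone sway (synthetic)
structure_track = uav_motion + 0.1 * np.sin(2 * np.pi * 4.0 * t) + 0.01 * np.random.randn(t.size)
static_track = uav_motion + 0.01 * np.random.randn(t.size)    # static object sees only camera motion

f, psd_structure = welch(structure_track, fs=fps, nperseg=512)
_, psd_static = welch(static_track, fs=fps, nperseg=512)
corrected = np.clip(psd_structure - psd_static, 0, None)      # remove camera-motion content

print("identified natural frequency ~", f[np.argmax(corrected)], "Hz")  # ~4 Hz
```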
3

Kenichi Yabuta and Hitoshi Kitazawa. "Optimum camera placement considering camera specification for security monitoring." In 2008 IEEE International Symposium on Circuits and Systems - ISCAS 2008. IEEE, 2008. http://dx.doi.org/10.1109/iscas.2008.4541867.

4

MOON, HOYEON, HWEEKWON JUNG, YESEUL KONG, and GYUHAE PARK. "Assessment of Pressed Panel Products using Camera Image Processing." In Structural Health Monitoring 2017. Lancaster, PA: DEStech Publications, Inc., 2017. http://dx.doi.org/10.12783/shm2017/14220.

5

LEE, SEUNGHWAN, YINAN MIAO, YESEUL KONG, HYEONWOO NAM, and GYUHAE PARK. "REAL TIME CONDITION MONITORING USING CAMERA AND PHASE-BASED MOTION ESTIMATION." In Structural Health Monitoring 2023. Destech Publications, Inc., 2023. http://dx.doi.org/10.12783/shm2023/36873.

Abstract:
Defects on manufacturing equipment, such as abrasion, corrosion, and deposition, may decrease production quality and even lead to the shutdown of the entire manufacturing line. Various condition monitoring technologies have been developed to predict machine health and safety through contact sensors. The non-contact camera has shown many advantages over conventional contact sensors, including high spatial resolution, low cost, and remote sensing. Moreover, phase-based motion amplification has proven to be an efficient tool for detecting subtle vibrations. However, motion amplification requires significant computational resources and does not provide a direct motion signal output. To detect abnormal vibrations during long-term inspections, a more efficient phase-based motion estimation technique is necessary. In this study, we propose a real-time vibration monitoring system that uses a camera and image phase. We use a single optimal Gabor filter with phase-based optical flow to extract the vibrational motion. The use of a single optimal filter significantly reduces computation costs and enables accurate measurement of vibration signals even in low-light conditions and/or with image noise. A parallel computing scheme is also introduced for real-time condition monitoring. We conducted validation experiments on a structure with multiple vibrating components that simulates a real factory line. Damage detection is performed on the structure with two damage cases. All the results show that the proposed technique can accurately measure displacement and provide a novel solution for camera-based real-time damage detection.
6

Feng, Cong, Yuanyuan Li, Xinjun Ma, and Chenchen Wu. "Fisheye camera around view monitoring system." In Ninth International Conference on Graphic and Image Processing, edited by Hui Yu and Junyu Dong. SPIE, 2018. http://dx.doi.org/10.1117/12.2302926.

7

Dyomin, Victor V., Igor G. Polovtsev, Alexandra Y. Davydova, and Alexey S. Olshukov. "Digital holographic camera for plankton monitoring." In Practical Holography XXXIII: Displays, Materials, and Applications, edited by Hans I. Bjelkhagen and V. Michael Bove. SPIE, 2019. http://dx.doi.org/10.1117/12.2512030.

8

Liu, Dongran, Marcos Paul Gerardo-Castro, Bruno Costa, and Yi Zhang. "Heart-Rate Monitoring Using Single Camera." In WCX™ 17: SAE World Congress Experience. 400 Commonwealth Drive, Warrendale, PA, United States: SAE International, 2017. http://dx.doi.org/10.4271/2017-01-1434.

9

Almeida, Miguel, Ganna Portyankina, Dmitri Titov, Richard Moissl, and Wojciech Markiewicz. "Venus Express Monitoring Camera Science Operations." In SpaceOps 2008 Conference. Reston, Virigina: American Institute of Aeronautics and Astronautics, 2008. http://dx.doi.org/10.2514/6.2008-3391.

10

Chang, Huan-Yu, and Fuh-Gwo Yuan. "Damage Visualization of Scattered Ultrasonic Wavefield via Integrated Highspeed Camera System." In Structural Health Monitoring 2019. Lancaster, PA: DEStech Publications, Inc., 2019. http://dx.doi.org/10.12783/shm2019/32468.


Reports of organizations on the topic "Camera monitoring"

1

Naqvi, Qaim, Patrick Wolff, Brenda Molano-Flores, and Jinelle Sperry. Camera traps are an effective tool for monitoring insect–plant interactions. Engineer Research and Development Center (U.S.), May 2024. http://dx.doi.org/10.21079/11681/48496.

Abstract:
Insect and pollinator populations are vitally important to the health of ecosystems, food production, and economic stability, but are declining worldwide. New, cheap, and simple monitoring methods are necessary to inform management actions and should be available to researchers around the world. Here, we evaluate the efficacy of a commercially available, close-focus automated camera trap to monitor insect–plant interactions and insect behavior. We compared two video settings—scheduled and motion-activated—to a traditional human observation method. Our results show that camera traps with scheduled video settings detected more insects overall than humans, but relative performance varied by insect order. Scheduled cameras significantly outperformed motion-activated cameras, detecting more insects of all orders and size classes. We conclude that scheduled camera traps are an effective and relatively inexpensive tool for monitoring interactions between plants and insects of all size classes, and their ease of accessibility and set-up allows for the potential of widespread use. The digital format of video also offers the benefits of recording, sharing, and verifying observations.
2

Griffioen, A. B., P. Deitelzweig, and M. J. Kroes. Alternatives for trap monitoring in large rivers and lakes: Camera monitoring and eDNA sampling as alternative for conventional trap monitoring. IJmuiden: Stichting Wageningen Research, Centre for Fisheries Research (CVO), 2019. http://dx.doi.org/10.18174/503595.

3

Kulhandjian, Hovannes. AI-based Pedestrian Detection and Avoidance at Night using an IR Camera, Radar, and a Video Camera. Mineta Transportation Institute, November 2022. http://dx.doi.org/10.31979/mti.2022.2127.

Abstract:
In 2019, the United States experienced more than 6,500 pedestrian fatalities involving motor vehicles, reflecting a 67% rise in nighttime pedestrian fatalities and only a 10% rise in daytime pedestrian fatalities. In an effort to reduce fatalities, this research developed a pedestrian detection and alert system through the application of a visual camera, an infrared camera, and radar sensors combined with machine learning. The research team designed the system concept to achieve a high level of accuracy in pedestrian detection and avoidance both during the day and at night, to avoid potentially fatal accidents involving pedestrians crossing a street. The working prototype of pedestrian detection and collision avoidance can be installed in present-day vehicles, with the visible camera used to detect pedestrians during the day and the infrared camera used to detect pedestrians primarily at night, as well as in high glare from the sun during the day. The radar sensor is also used to detect the presence of a pedestrian and calculate their range and direction of motion relative to the vehicle. Through data fusion and deep learning, the ability to quickly analyze and classify a pedestrian's presence at all times in a real-time monitoring system is achieved. The system can also be extended to cyclist and animal detection and avoidance, and could be deployed in an autonomous vehicle to assist automatic braking systems (ABS).
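As a purely illustrative aid, here is a hedged late-fusion sketch of the kind of decision logic such a system might use: a camera (or IR) detection is confirmed only when a radar track agrees in bearing and the time-to-collision is short. The class names, field names, and thresholds are assumptions for the example, not the report's design.

    from dataclasses import dataclass

    @dataclass
    class CameraDetection:
        bearing_deg: float   # direction of the detected pedestrian
        confidence: float    # classifier score in [0, 1]

    @dataclass
    class RadarTrack:
        bearing_deg: float
        range_m: float
        closing_speed_mps: float  # positive when approaching the vehicle

    def fuse_and_alert(cam_dets, radar_tracks,
                       bearing_tol_deg=5.0, min_conf=0.6, ttc_alert_s=2.0):
        # Return True if a camera detection is confirmed by a radar track
        # on a closing course with a short time-to-collision.
        for det in cam_dets:
            if det.confidence < min_conf:
                continue
            for trk in radar_tracks:
                if abs(det.bearing_deg - trk.bearing_deg) > bearing_tol_deg:
                    continue
                if trk.closing_speed_mps <= 0:
                    continue
                time_to_collision = trk.range_m / trk.closing_speed_mps
                if time_to_collision < ttc_alert_s:
                    return True
        return False

    print(fuse_and_alert([CameraDetection(2.0, 0.8)],
                         [RadarTrack(3.0, 12.0, 8.0)]))  # True: TTC = 1.5 s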
4

Bates, C. Richards, Melanie Chocholek, Clive Fox, John Howe, and Neil Jones. Scottish Inshore Fisheries Integrated Data System (SIFIDS): Work package (3) final report development of a novel, automated mechanism for the collection of scallop stock data. Edited by Mark James and Hannah Ladd-Jones. Marine Alliance for Science and Technology for Scotland (MASTS), 2019. http://dx.doi.org/10.15664/10023.23449.

Abstract:
[Extract from Executive Summary] This project, aimed at the development of a novel, automated mechanism for the collection of scallop stock data, was a sub-part of the Scottish Inshore Fisheries Integrated Data Systems (SIFIDS) project. The project reviewed the state-of-the-art remote sensing (geophysical and camera-based) technologies available from industry and compared these to inexpensive, off-the-shelf equipment. Sea trials were conducted on scallop dredge sites and also on hand-dived scallop sites. Data were analysed manually, and tests were conducted with automated processing methods. It was concluded that geophysical acoustic technologies cannot presently detect individual scallops, but the remote sensing technologies can be used for broad-scale habitat mapping of scallop harvest areas. Further, the techniques allow for monitoring these areas in terms of scallop dredging impact. Camera (video and still) imagery is effective for scallop counts and provides data that compare favourably with diver-based ground truth information for recording scallop density. Deployment of cameras is possible through inexpensive drop-down camera frames, which it is recommended be deployed on a wide-area basis for further trials. In addition, implementation of a ‘citizen science’ approach to wide-area recording is suggested to increase the stock assessment across the widest possible variety of seafloor types around Scotland. Armed with such data, a full statistical analysis could be completed and the data used with automated processing routines for future long-term monitoring of stock.
5

Dhillon, Nathan, Andrew Hannay, and Robin Workman. Next Generation Monitoring Systems. TRL, July 2022. http://dx.doi.org/10.58446/npwb2214.

Abstract:
Survey vehicles operating at traffic speed are deployed across the road network to assess the condition of road pavements. These apply high-quality (and high-cost) equipment to measure condition. However, significant progress has been made in the development of low-cost sensors and data collection units that may have potential for application in highways. This project has aimed to understand the capabilities of this emerging technology. The project explores the technologies and combines a Raspberry Pi-based Data Acquisition System, a compact camera, GPS, an inertial measurement system, Wi-Fi and 4G GSM comms, and a low-cost solid-state LiDAR into a prototype device. The total cost is a few hundred pounds. Trials characterise the prototype system. Although the solid-state LiDAR sensors are not found to be robust in this application, the remaining sensors show strong potential for use in road condition assessment. A wider trial of the prototype system in a potential application – the measurement of roughness (IRI) on developing-world road networks – was carried out in El Salvador. The prototype shows comparable performance with alternatives, combined with higher levels of practicality and capability, and the potential for higher levels of consistency through a common low-cost measurement platform. In the light of this research, it is felt that, following refinements to the prototype, the initial application for the device would be condition surveys in developing-world nations.
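For illustration only, and explicitly not the prototype's algorithm (which the abstract does not detail): a crude roughness proxy can be computed from logged vertical acceleration by aggregating its RMS over fixed-length road segments; a true IRI value would instead require a quarter-car simulation of the measured profile. The sample rate, segment length, and data below are assumptions.

    import numpy as np

    def roughness_proxy(accel_z, speed_mps, sample_rate_hz, segment_m=100.0):
        # RMS vertical acceleration per fixed-length road segment (a proxy, not IRI).
        accel_z = np.asarray(accel_z, dtype=float)
        speed_mps = np.asarray(speed_mps, dtype=float)
        distance = np.cumsum(speed_mps) / sample_rate_hz      # metres travelled
        segment_id = (distance // segment_m).astype(int)
        return [float(np.sqrt(np.mean(accel_z[segment_id == s] ** 2)))
                for s in np.unique(segment_id)]

    # 60 s of 100 Hz data at a constant 15 m/s (~900 m of road), synthetic values
    rng = np.random.default_rng(0)
    az = rng.normal(0.0, 0.8, 6000)
    v = np.full(6000, 15.0)
    print(roughness_proxy(az, v, 100.0))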
6

Sakhare, Rahul Suryakant, Jairaj Desai, Jijo K. Mathew, John McGregor, Mischa Kachler, and Darcy M. Bullock. Measuring and Visualizing Freeway Traffic Conditions: Using Connected Vehicle Data. Purdue University, 2024. http://dx.doi.org/10.5703/1288284317751.

Abstract:
Historically, a network of roadside sensors and cameras has been used to monitor freeway conditions. Although these systems are effective, they are typically not operational in and around work zones. Furthermore, it is often not financially viable to deploy in-road sensors and cameras in rural areas. Connected vehicle trajectory data has emerged as a viable source of data and provides a unique opportunity for monitoring freeways. This monograph describes how connected vehicle trajectories can be used to directly measure queue lengths and travel times, and how this information is summarized in a graphical format easily used by agencies to make management decisions. Approximately 50 use cases are described to demonstrate these techniques under diverse conditions, such as lane reductions, short-term closures, rolling slowdowns, work zone setup, work zone removal, and inclement weather. A number of the use cases were selected from Indiana locations that had good ITS camera coverage to provide context-sensitive information to help the reader understand the graphics. In addition, several case studies are presented from selected states around the country to demonstrate the scalability of these techniques.
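To make the idea concrete, below is a small illustrative sketch (not the monograph's code; the waypoint format, mile-marker convention, and 20 mph queue threshold are assumptions) showing how a segment travel time and a back-of-queue location could be pulled from trajectory waypoints.

    # Each waypoint is (timestamp_s, mile_marker, speed_mph) for one vehicle.
    QUEUE_SPEED_MPH = 20.0  # common slow-speed threshold; an assumption here

    def travel_time_s(waypoints, mm_start, mm_end):
        # Travel time across [mm_start, mm_end] for a single trajectory.
        inside = [w for w in waypoints if mm_start <= w[1] <= mm_end]
        return inside[-1][0] - inside[0][0] if len(inside) > 1 else None

    def back_of_queue_mm(trajectories, direction_increasing=True):
        # Furthest-upstream mile marker where any vehicle is at queue speed.
        slow = [w[1] for traj in trajectories for w in traj
                if w[2] < QUEUE_SPEED_MPH]
        if not slow:
            return None
        return min(slow) if direction_increasing else max(slow)

    traj = [(0, 10.0, 65), (60, 11.0, 55), (120, 11.4, 12), (300, 11.6, 8),
            (420, 12.0, 45)]
    print(travel_time_s(traj, 10.0, 12.0))   # 420 s across the two miles
    print(back_of_queue_mm([traj]))          # 11.4, where the slowdown begins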
7

Balali, Vahid, Arash Tavakoli, and Arsalan Heydarian. A Multimodal Approach for Monitoring Driving Behavior and Emotions. Mineta Transportation Institute, July 2020. http://dx.doi.org/10.31979/mti.2020.1928.

Abstract:
Studies have indicated that emotions can be significantly influenced by environmental factors; these factors can also significantly influence drivers' emotional state and, accordingly, their driving behavior. Furthermore, as the demand for autonomous vehicles is expected to increase significantly within the next decade, a proper understanding of drivers'/passengers' emotions, behavior, and preferences will be needed in order to create an acceptable level of trust with humans. This paper proposes a novel semi-automated approach for understanding the effect of environmental factors on drivers' emotions and behavioral changes through a naturalistic driving study. The setup includes a frontal road camera and a facial camera, a smart watch for tracking physiological measurements, and a Controller Area Network (CAN) serial data logger. The results suggest that the driver's affect is highly influenced by the type of road and the weather conditions, which have the potential to change driving behaviors. For instance, when emotional metrics are defined as valence and engagement, the results reveal significant differences in human emotion across weather conditions and road types. Participants' engagement was higher in rainy and clear weather compared to cloudy weather. Moreover, engagement was higher on city streets and highways compared to one-lane roads and two-lane highways.
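As a hedged illustration of the kind of statistical comparison behind such statements (synthetic numbers, not the study's data), a one-way ANOVA can test whether a mean engagement metric differs across weather conditions:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    clear  = rng.normal(0.62, 0.10, 40)   # hypothetical engagement scores
    rainy  = rng.normal(0.60, 0.10, 40)
    cloudy = rng.normal(0.48, 0.10, 40)

    f_stat, p_value = stats.f_oneway(clear, rainy, cloudy)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")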
8

Teng, Henry, and Khalid Mosalam. Long-Term Monitoring of Bridge Settlements using Vision-Based Embedded System. Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, December 2020. http://dx.doi.org/10.55461/apri8198.

Abstract:
The State of California is highly seismic, capable of generating large-magnitude earthquakes that could cripple the infrastructure of several large cities. Yet the annual maintenance of the State's bridges, such as highway overpasses, is not robust due to budget and staff constraints. Over 1000 bridges were not inspected according to the California Department of Transportation's (Caltrans) 2015 Maintenance Plan. To help engineers monitor infrastructure conditions, presented herein is a recently developed device that employs modern sensing, computing, and communication technologies to autonomously measure and remotely report vertical settlements of bridges, such as highway overpasses. Given the limitations of existing measurement devices, we propose a novel vision-based method that employs a camera to take a picture of a projected laser beam. This new device is referred to as the Projected Laser Target Method (PLTM). This report documents the embedded system design and development of two prototypes. The first prototype implements communication over a local Wi-Fi network using synchronous code to measure distance over time; this PLTM is deployed in a laboratory setting. The second device under study implements communication over a Bluetooth Low Energy system using asynchronous code and communication over 2G cellular networks using synchronous code, with the aim of determining its accuracy in the field. This report evaluates the performance of the field-suitable system in terms of its system reliability, measurement accuracy and precision, power consumption, and overall system performance.
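The vision step can be illustrated with a short sketch (an assumption-laden illustration, not the PEER prototype's code): locate the bright laser spot by an intensity-weighted centroid and convert its pixel drift between epochs into millimetres with a calibration scale factor. The threshold and mm-per-pixel value below are made up.

    import numpy as np

    def spot_centroid(gray, threshold=200):
        # Intensity-weighted centroid (row, col) of pixels above threshold.
        rows, cols = np.nonzero(gray >= threshold)
        w = gray[rows, cols].astype(float)
        return (np.sum(rows * w) / np.sum(w), np.sum(cols * w) / np.sum(w))

    def settlement_mm(gray_ref, gray_now, mm_per_pixel=0.5):
        # Vertical drift of the laser spot, positive downward (image rows).
        r_ref, _ = spot_centroid(gray_ref)
        r_now, _ = spot_centroid(gray_now)
        return (r_now - r_ref) * mm_per_pixel

    # Synthetic 100x100 frames with the spot moved down by 3 pixels (1.5 mm)
    ref = np.zeros((100, 100), dtype=np.uint8); ref[50:53, 50:53] = 255
    now = np.zeros((100, 100), dtype=np.uint8); now[53:56, 50:53] = 255
    print(settlement_mm(ref, now))  # ~1.5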
9

Jones, Landon R., Jared A. Elmore, B. S. Krishnan, Sathishkumar Samiappan, Kristine O. Evans, Morgan B. Pfeiffer, Bradley F. Blackwell, and Raymond B. Iglay. Dataset for Controllable factors affecting accuracy and precision of human identification of animals from drone imagery. Mississippi State University, July 2023. http://dx.doi.org/10.54718/xblo5500.

Abstract:
Dataset from the results of an experiment to determine how three controllable factors (flight altitude, camera angle, and time of day) affect human identification and counts of animals from drone images, to inform best practices for surveying animal communities with drones. We used a drone (unoccupied aircraft system, or UAS) to survey known numbers of eight animal decoy species, representing a range of body sizes and colors, at four GSD (ground sampling distance) values (0.35, 0.70, 1.06, 1.41 cm/pixel) representing equivalent flight altitudes (15.2, 30.5, 45.7, 61.0 m), at two camera angles (45° and 90°), and across a range of times of day (morning to late afternoon). Expert human observers identified and counted animals in drone images to determine how the three controllable factors affected accuracy and precision. Observer precision was high and unaffected by the tested factors. However, results for observer accuracy revealed an interaction among all three controllable factors. Increasing flight altitude resulted in decreased accuracy in animal counts overall; however, accuracy was best at midday compared to morning and afternoon hours, when decoy and structure shadows were present or more pronounced. Surprisingly, the 45° camera angle enhanced accuracy compared to 90°, but only when animals were most difficult to identify and count, such as at higher flight altitudes or during the early morning and late afternoon. We provide recommendations based on our results to design future surveys that improve human accuracy in identifying and counting animals from drone images for monitoring animal populations and communities.
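The altitude-to-GSD pairing quoted in the abstract follows the usual linear relationship (GSD = altitude x pixel pitch / focal length). The short check below backs the per-metre constant out of the reported 15.2 m / 0.35 cm-per-pixel pair rather than using the study's actual camera specifications, which are not given here.

    # Worked check of the altitude-to-GSD relationship implied in the abstract.
    def gsd_cm_per_px(altitude_m, gsd_per_m=0.35 / 15.2):
        return altitude_m * gsd_per_m

    for alt in (15.2, 30.5, 45.7, 61.0):
        print(f"{alt:5.1f} m -> {gsd_cm_per_px(alt):.2f} cm/pixel")
    # Reproduces roughly 0.35, 0.70, 1.05, 1.40 cm/pixel, in line with the
    # 0.35/0.70/1.06/1.41 values reported for the four flight altitudes.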
10

Author, Unknown. L52297 Technologies for In-Service Measurement of Seal Gaps in Internal Floating Roof Tanks. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), June 2009. http://dx.doi.org/10.55274/r0010683.

Abstract:
The industry need was to investigate current and potential technologies for the measurement of seal gaps in internal floating roof storage tanks, including methods of remote monitoring: a) identify existing seal gap measurement technologies and methodologies, as well as the individuals and/or companies who own the technologies, and b) identify other technologies that may prove useful to effectively measure seal gaps on tanks in service. Four technologies were identified as possible remote-inspection alternatives to the current general practice of inspecting and measuring internal floating roof seal gaps by placing personnel inside in-service tanks. At least two of these technologies, remote camera and x-ray imaging, are worthy of additional evaluation to assess the relative costs and reliability as alternatives to manual inspection and measurement. Remote camera technology has been applied specifically to seal gap measurement and shown to be a viable alternative to manual inspection. X-ray imaging technology appears to be a viable alternative but has not been applied specifically to seal gap measurement and would require further development to affirm its suitability for this purpose.