Academic literature on the topic 'Low background sensor'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Low background sensor.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Low background sensor"

1

Honeycutt, Wesley T., M. Tyler Ley, and Nicholas F. Materer. "Precision and Limits of Detection for Selected Commercially Available, Low-Cost Carbon Dioxide and Methane Gas Sensors." Sensors 19, no. 14 (July 18, 2019): 3157. http://dx.doi.org/10.3390/s19143157.

Full text
Abstract:
The performance of a sensor platform for environmental or industrial monitoring is sensitive to the cost and performance of the individual sensor elements. Thus, the detection limits, accuracy, and precision of commercially available, low-cost carbon dioxide and methane gas concentration sensors were evaluated by precise measurements at known gas concentrations. Sensors were selected based on market availability, cost, power consumption, detection range, and accuracy. A specially constructed gas mixing chamber, coupled to a precision bench-top analyzer, was used to characterize each sensor during a controlled exposure to known gas concentrations. For environmental monitoring, the selected carbon dioxide sensors were characterized around 400 ppm. For methane, the sensor response was first monitored at 0 ppm, close to the typical environmental background. The selected sensors were then evaluated at gas concentrations of several thousand ppm. The determined detection limits, accuracy, and precision provide a set of metrics that can be used to evaluate and select sensors for integration into a sensor platform for specific applications.
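Abstracts like this one report detection limits derived from repeated measurements at a known concentration. As a hedged illustration (the 3-sigma rule and the sample readings below are assumptions for the sketch, not data from the paper), a detection limit can be estimated like this:

```python
import statistics

def detection_limit(readings, slope):
    """Estimate a limit of detection as 3 * sigma of repeated readings
    at a known concentration, divided by the calibration slope."""
    sigma = statistics.stdev(readings)
    return 3 * sigma / slope

# Hypothetical CO2 sensor readings (ppm) at a constant 400 ppm reference gas
readings = [400.1, 399.8, 400.4, 400.0, 399.7, 400.2]
lod = detection_limit(readings, slope=1.0)  # slope 1.0: sensor already reports ppm
print(round(lod, 2))  # -> 0.77
```

The precision of the sensor at that concentration is simply `sigma` itself; the paper evaluates both across several sensors and gas concentrations.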
APA, Harvard, Vancouver, ISO, and other styles
2

Ju, Youngkwan, and Hyung Jin Mun. "The Research on Security Technology for Low-Performance Iot Sensor Node." International Journal of Engineering & Technology 7, no. 3.34 (September 1, 2018): 594. http://dx.doi.org/10.14419/ijet.v7i3.34.19389.

Full text
Abstract:
Background/Objectives: IoT, the key technology of the fourth industrial revolution, utilizes the Internet. In particular, convergence products that use it have grown steadily with the demand for IoT services over the Internet. Methods/Statistical analysis: An IoT network consists of products equipped with various sensors that communicate; in the network, sensor nodes are built from low-volume memory, a low-performance CPU, and a battery. Information measured by a sensor node must be transmitted securely to the IoT platform. We study how security can be improved in the IoT environment. Findings: Generally, sensor nodes rely on the basic security provided by the IoT communication protocol rather than their own encryption. Sensor nodes are therefore vulnerable, and an IoT platform that uses the information they collect may process distorted information. To derive a strategy for preventing security breaches, we analyze security threats and the types of attacks. Improvements/Applications: We suggest countermeasures against the security threats to sensor nodes and the situations in which sensor nodes are vulnerable in the IoT environment. To secure the integrity of communication and transactions between a sensor node and an IoT platform in the future, blockchain technology should be applied to the IoT environment.
3

Rivilla, Iván, Borja Aparicio, Juan M. Bueno, David Casanova, Claire Tonnelé, Zoraida Freixa, Pablo Herrero, et al. "Fluorescent bicolour sensor for low-background neutrinoless double β decay experiments." Nature 583, no. 7814 (June 22, 2020): 48–54. http://dx.doi.org/10.1038/s41586-020-2431-5.

Full text
4

Wang, Xiaoqing, Zhipeng Liu, Fang Qian, and Weijiang He. "A bezoimidazole-based highly selective and low-background fluorescent sensor for Zn2+." Inorganic Chemistry Communications 15 (January 2012): 176–79. http://dx.doi.org/10.1016/j.inoche.2011.10.018.

Full text
5

Sumanto, Joko. "PEMBACAAN NOMOR SAMPEL DALAM REFURBISHING ALAT LOW BACKGROUND COUNTER-LBC TENNELEC TYPE LB5100 SERIES II." Jurnal Forum Nuklir 3, no. 2 (November 1, 2009): 131. http://dx.doi.org/10.17146/jfn.2009.3.2.3298.

Full text
Abstract:
SAMPLE NUMBER READING IN THE REFURBISHING OF THE LBC TENNELEC TYPE LB5100 SERIES II. An interface for reading sample numbers automatically by computer over USB serial communication has been designed and built as part of refurbishing the LBC TENNELEC type LB5100. The refurbishing reuses the mechanical parts of the instrument and replaces the obsolete electronics with a personal-computer (PC) based system. The instrument measures a large number of samples, so the reading must be automated. Each sample is marked with holes in a specific arrangement that indicates the sample number, the group, and the 'send stack reader' signal. Phototransistor sensors are mounted on the mechanics at the sample positions; the sensors are connected to a data line and their readings are sent to the computer over USB serial communication. The instrument can measure 150 samples and 10 groups. The test results met expectations.
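The hole-pattern encoding described in this abstract amounts to reading a binary number from the phototransistor outputs. A minimal sketch, assuming a hypothetical least-significant-bit-first layout (the real LB5100 bit positions for sample number, group, and the 'send stack reader' flag are not specified here):

```python
def decode_sample_number(sensor_bits):
    """Decode a sample number from phototransistor readings.

    sensor_bits: list of 0/1 values, least significant bit first, where 1
    means a hole was detected (light reached the phototransistor).
    The bit layout is hypothetical, for illustration only.
    """
    return sum(bit << i for i, bit in enumerate(sensor_bits))

# A carrier with holes at positions 0, 2 and 4 reads as sample 21
print(decode_sample_number([1, 0, 1, 0, 1]))  # -> 21
```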
6

Scholz, Michael, Sudharshana Venkataraman Rmasubramanian, Andreas Blank, Sven Kreitlein, and Jörg Franke. "E|Flow II - Infrastructural Sensor Concepts to Digitize the Workspace for Sustainable and Resource Efficient Intralogistics." Applied Mechanics and Materials 871 (October 2017): 97–102. http://dx.doi.org/10.4028/www.scientific.net/amm.871.97.

Full text
Abstract:
Nowadays, material flow in factories is realized by different transport concepts. Each specific conveyor has pros and cons due to its concept. In general, state-of-the-art transport systems have low flexibility in path planning and are not suitable for dynamic transport requirements, because they are designed for a specific application. Generally, the common systems cover a specific transportation task and can fulfill a predefined maximum number of transport orders. Therefore, the next generations of production lines, especially intralogistics transport systems, have to be designed to be more adaptable and flexible. The object of the research in this paper is a cyber-physical material flow system with flexible, autonomous, and collaborative vehicles combined with centralized sensors to digitize the workspace. These sensors combine commonly used USB cameras with a single-board computer to realize an embedded sensor system. The whole device is mounted to the ceiling of a factory to digitize the workspace. The single-board computer processes the scene below the camera and provides the results via WLAN both to a central device and directly to the autonomous vehicles within the scene. The algorithm separates moving obstacles by calculating an adaptive background picture. The obstacles are marked with rotatable rectangles whose coordinates are transmitted to the vehicles. The adaptive background picture is provided as a service to the central device, where the background pictures of all sensors are merged. There, a morphological algorithm separates static obstacles and marks the space where the vehicles can operate. This approach leads to a performant, low-cost sensor architecture for an infrastructural sensor concept that realizes a digital twin of a workspace.
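The adaptive-background idea in this abstract can be sketched as a per-pixel exponential running average: pixels that deviate strongly from the slowly updated background are flagged as moving obstacles. The toy 1-D "image", the smoothing factor, and the threshold below are illustrative assumptions, not values from the paper:

```python
def update_background(background, frame, alpha=0.05):
    """Exponential running average: a slow-moving estimate of the static scene."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(background, frame)]

def moving_mask(background, frame, threshold=30):
    """Flag pixels that differ strongly from the background as moving obstacles."""
    return [abs(f - b) > threshold for b, f in zip(background, frame)]

# Toy 1-D "image": a static scene with one moving obstacle at index 2
background = [10.0, 10.0, 10.0, 10.0]
frame      = [10.0, 12.0, 90.0, 10.0]
background = update_background(background, frame)
print(moving_mask(background, frame))  # -> [False, False, True, False]
```

A static obstacle, by contrast, is gradually absorbed into the background estimate, which is why the paper merges the background pictures centrally before separating static obstacles morphologically.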
7

Bousiotis, Dimitrios, Ajit Singh, Molly Haugen, David C. S. Beddows, Sebastián Diez, Killian L. Murphy, Pete M. Edwards, Adam Boies, Roy M. Harrison, and Francis D. Pope. "Assessing the sources of particles at an urban background site using both regulatory instruments and low-cost sensors – a comparative study." Atmospheric Measurement Techniques 14, no. 6 (June 7, 2021): 4139–55. http://dx.doi.org/10.5194/amt-14-4139-2021.

Full text
Abstract:
Abstract. Measurement and source apportionment of atmospheric pollutants are crucial for the assessment of air quality and the implementation of policies for their improvement. In most cases, such measurements use expensive regulatory-grade instruments, which makes it difficult to achieve wide spatial coverage. Low-cost sensors may provide a more affordable alternative, but their capability and reliability in separating distinct sources of particles have not been tested extensively yet. The present study examines the ability of a low-cost optical particle counter (OPC) to identify the sources of particles and conditions that affect particle concentrations at an urban background site in Birmingham, UK. To help evaluate the results, the same analysis is performed on data from a regulatory-grade instrument (SMPS, scanning mobility particle sizer) and compared to the outcomes from the OPC analysis. The analysis of the low-cost sensor data manages to separate periods and atmospheric conditions according to the level of pollution at the site. It also successfully identifies a number of sources for the observed particles, which were also identified using the regulatory-grade instruments. The low-cost sensor, due to the particle size range measured (0.35 to 40 µm), performed rather well in differentiating sources of particles with sizes greater than 1 µm, though its ability to distinguish their diurnal variation, as well as to separate sources of smaller particles, at the site was limited. The current level of source identification demonstrated makes the technique useful for background site studies, where larger particles with smaller temporal variations are of significant importance. This study highlights the current capability of low-cost sensors in source identification and differentiation using clustering approaches. Future directions towards particulate matter source apportionment using low-cost OPCs are highlighted.
8

Fernandez, Marco, Kathy Burns, Beverly Calhoun, Saramma George, Beverly Martin, and Chris Weaver. "Evaluation of a New Pulse Oximeter Sensor." American Journal of Critical Care 16, no. 2 (March 1, 2007): 146–52. http://dx.doi.org/10.4037/ajcc2007.16.2.146.

Full text
Abstract:
• Background A new forehead noninvasive oxygen saturation sensor may improve signal quality in patients with low cardiac index. • Objectives To examine agreement between oxygen saturation values obtained by using digit-based and forehead pulse oximeters with arterial oxygen saturation in patients with low cardiac index. • Methods A method-comparison study was used to examine the agreement between 2 different pulse oximeters and arterial oxygen saturation in patients with low cardiac index. Readings were obtained from a finger and a forehead sensor and by analysis of a blood sample. Bias, precision, and root mean square differences were calculated for the digit and forehead sensors. Differences in bias and precision between the 2 noninvasive devices were evaluated with a t test (level of significance P<.05). • Results Nineteen patients with low cardiac index (calculated as cardiac output in liters per minute divided by body surface area in square meters; mean 1.98, SD 0.34) were studied for a total of 54 sampling periods. Mean (SD) oxygen saturations were 97% (2.4) for blood samples, 96% (3.2) for the finger sensor, and 97% (2.8) for the forehead sensor. By Bland-Altman analysis, bias ± precision was −1.16 ± 1.62% for the digit sensor and −0.36 ± 1.74% for the forehead sensor; root mean square differences were 1.93% and 1.70%, respectively. Bias and precision differed significantly between the 2 devices; the forehead sensor differed less from the blood sample. • Conclusions In patients with low cardiac index, the forehead sensor was better than the digit sensor for pulse oximetry.
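The bias and precision reported in this method-comparison study follow the usual Bland-Altman definitions: the mean and standard deviation of the paired differences between the test device and the reference. A minimal sketch with hypothetical paired readings, not the study's data:

```python
import statistics

def bland_altman(reference, test):
    """Bias (mean difference) and precision (SD of differences) between a
    reference measurement and a test device, as in a method-comparison study."""
    diffs = [t - r for r, t in zip(reference, test)]
    return statistics.mean(diffs), statistics.stdev(diffs)

# Hypothetical paired SpO2 readings: arterial blood sample vs. forehead sensor
sao2 = [97.0, 96.0, 98.0, 95.0]
spo2 = [96.5, 96.0, 97.0, 95.5]
bias, precision = bland_altman(sao2, spo2)
print(round(bias, 2), round(precision, 2))  # -> -0.25 0.65
```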
9

Fu, Hao, ChinYin Chen, Chongchong Wang, MinChiang Chao, Qiang Zhou, Guilin Yang, and Guozhi Wang. "Quartz crystal based sensor head design and analysis for robot torque sensor application." Cobot 1 (April 26, 2022): 11. http://dx.doi.org/10.12688/cobot.17474.1.

Full text
Abstract:
Background: In recent years, with the gradual development of human-robot interaction, robots need to meet the precise control of more complex motion, and torque sensors play an important role. The traditional strain gauge sensor uses a metal strain gauge as the sensitive element, which means the sensor has a slow response and low resolution and is easily affected by external signal noise. Aiming at these deficiencies of strain gauge sensors, a sensor with a cut square quartz sheet as the sensing head is proposed. Methods: In order to study the application of the square quartz sensing head in the sensor, COMSOL (5.6) simulation modeling is first used to obtain the stress relationship between a square quartz sheet and a circular quartz sheet. Then the calculation formula for the force-frequency coefficient of the circular quartz sheet is modified to obtain the formula for the square quartz sheet, and the feasibility of the formula is verified by practical experiments. Next, theoretical simulation and experimental research on the buckling limit force of the quartz wafer are carried out, and the formula for the buckling limit force during quartz wafer installation is modified. Finally, the designed sensing head is installed on an elastomer structure for verification. The frequency signal is collected by an SGS-THOMSON Microelectronics 32 with a sampling rate of 1000 Hz. Results: The main performance figures of the sensor are: range 150 nm, sensitivity 350 Hz/nm, linearity 98.14%, hysteresis 0.51%, repeatability 98.44%, resolution 0.02%. Conclusions: As the sensitive unit of the torque sensor, the designed quartz wafer achieves a fast response and high resolution, solving the problems of low resolution and slow response of traditional strain gauge torque sensors and reducing the cost of the sensor.

Dissertations / Theses on the topic "Low background sensor"

1

Lozza, Valentina. "Low energy low background photon counter for wisp search experiments." Doctoral thesis, Università degli studi di Trieste, 2010. http://hdl.handle.net/10077/3719.

Full text
Abstract:
2008/2009
Remarkable interest has recently arisen about the search for Weakly Interacting Sub-eV Particles (WISPs), such as axions, Axion Like Particles (ALPs), minicharged and chameleon particles, all of which are not included in the Standard Model. Precision experiments searching for WISPs probe energy scales as high as 10^6 TeV and are complementary to accelerator experiments, where the energy scale is a few TeV. The axion, in particular, is the oldest studied and has the strongest theoretical motivation, having its origin in Quantum Chromodynamics. It was introduced for the first time in 1973 by Peccei and Quinn to solve the strong CP problem, while later on the cosmological implications of its postulated existence also became clear: it is a good candidate for the cold dark matter, and it is necessary to fully explain the evolution of galaxies. Among the different interactions of axions, the most promising for its detection, from an experimental point of view, is the coupling to two photons (Primakoff effect). Using this coupling, several bounds on the axion mass and energy scale have been set by astrophysical observations, by laboratory experiments and by the direct observation of celestial bodies, such as the Sun. Most of these considerations, as was recently recognized, not only constrain the mass and coupling of the axion, but are more generally applicable to all ALPs. The current best limits on the coupling, over a wide range of ALP masses, come from the CAST (Cern Axion Solar Telescope) experiment at Cern, which looks for ALPs produced in the solar core. The experiment is based on the Primakoff effect in a high magnetic field, where solar ALPs can be reconverted into photons. The CAST magnet, a 10 T, 10 m long LHC superconducting dipole, is placed on a mobile platform in order to follow the Sun twice a day, during sunrise and sunset, and has two straight bores instrumented with X-ray detectors at each end.
The regenerated photon flux is, in fact, expected to peak at a few keV. On the other hand, there are suggestions that the problem of the anomalous temperature profile of the solar corona could be solved by a mechanism which would enhance the low-energy tail of the regenerated photon spectrum. A low-energy photon counter has, for this reason, been designed and built to cover one of the CAST ports, at least temporarily. Low-energy, low-background photon counters such as the one just mentioned are also crucial for most experiments searching for WISPs. The low-energy photon counting system initially developed to be coupled to CAST will be applicable, with proper upgrades, to other WISP search experiments. It consists of a Galilean telescope to match the CAST magnet bore cross section to an optical fiber leading photons to the sensors, passing first through an optical switch. This last device allows one to share input photons between two different detectors, and to acquire light and background data simultaneously. The sensors at the end of this chain are a photomultiplier tube and an avalanche photodiode operated in Geiger mode. Each detector was preliminarily characterized on a test bench, then coupled to the optical system. The final integrated setup was subsequently mounted on one of the CAST magnet bores. A set of measurements, including live Sun tracking, was carried out at Cern during 2007-2008. The background obtained there was the same as measured on the test bench, around 0.4 Hz, but it is clear that to progress from these preliminary measurements a lower background sensor is needed. Different types of detectors were considered and the final choice fell on a Geiger-mode avalanche photodiode (G-APD) cooled to liquid nitrogen temperature. The aim is to drastically reduce the dark count rate, although an increase in the afterpulsing phenomenon is expected.
Since the detector is designed to be operated in a scenario where a very low rate of signal photons is predicted, the afterpulsing effect can be accepted and corrected by an increase in the detector dead time. First results show that a reduction in background by a factor better than 10^4 is obtained, with no loss in quantum efficiency. In addition, an optical system based on a semitransparent mirror (transparent to X-rays and reflective for 1-2 eV photons) has been built. This setup, covering the low-energy spectrum of solar ALPs, will be installed permanently on the CAST beamline. Current work is centered on further tests of the liquid-nitrogen-cooled G-APD concept involving different types of sensors and different layouts of the front-end read-out electronics, with particular attention to the quenching circuit, whether active or passive. Once these detector studies are completed, the final low-background sensor will be installed on the CAST experiment. It is important to note that the use of a single-photon counter for low-energy photons with a good enough background (<1 Hz at least) is not limited to the CAST case, but is of great importance for most experimental WISP searches, with special regard for photon regeneration experiments, and, in general, for the field of precision experiments in particle physics.
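The strategy of trading afterpulsing for a longer dead time implies a standard count-rate correction. As an illustration only (the non-paralyzable dead-time model and the 10 ms value are assumptions for the sketch, not figures from the thesis), a measured rate can be corrected as follows:

```python
def true_rate(measured_rate, dead_time):
    """Correct a measured count rate for a non-paralyzable dead time:
    n = m / (1 - m * tau). Valid only while m * tau < 1."""
    loss = measured_rate * dead_time
    if loss >= 1.0:
        raise ValueError("measured rate saturates the detector")
    return measured_rate / (1.0 - loss)

# A 0.4 Hz background with a hypothetical 10 ms dead time is barely affected
print(round(true_rate(0.4, 0.010), 5))  # -> 0.40161
```

At such low rates the correction is tiny, which is why a longer dead time is an acceptable price for suppressing afterpulses.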

Books on the topic "Low background sensor"

1

New Jersey. Legislature. General Assembly. Committee on Senior Citizens. Committee meeting of Assembly Senior Issues Committee: Assembly bill no. 2023 (requires certain background checks for assisted living administrators and applicants for certificate of need) : Assembly concurrent resolution no. 92 (memorializes federal Office of Homeland Security to examine needs of senior citizens in event of terrorist attacks) : Assembly concurrent resolution no. 93 (urges Domestic Security Preparedness Task Force and Domestic Security Preparedness Planning Group to examine needs of senior citizens in event of terrorist attacks). Trenton, N.J: Office of Legislative Services, Public Information Office, Hearing Unit, 2002.

Find full text
2

Publication, Benn Macdona Goodwi. Password Log Book: Halloween Background Password Log Book, 105 Pages 6 X 9 ,halloween Books Adult,password Logbooks,password Logbook for Senior,password Logbook For. Independently Published, 2021.

Find full text
3

Alcovia Co Alcovia Co Publishing. Bible Verse Coloring Book with Mandala Background in Thick Bold Outline for Senior Adults: Large Print Great for Low Vision Elderly, Beginners, Easy Level, Relaxing and Stress Relief. Independently Published, 2020.

Find full text
4

Gorn, Elliott J., and Allison Lauterbach. The Voice of Los Angeles. University of Illinois Press, 2017. http://dx.doi.org/10.5406/illinois/9780252037610.003.0009.

Full text
Abstract:
This chapter pays homage to Los Angeles Dodgers broadcaster Vin Scully, who has provided the team's play-by-play for more than six decades, with “elegance and ease and seeming effortlessness.” Born in 1927, Vincent Edward Scully grew up in the Bronx listening to sportscasters on the radio. He took up broadcasting while a student at Fordham University. Scully joined the Dodgers at spring training in Vero Beach, Florida, in March 1950. More than sixty years later, he is still with the team, the longest tenured announcer in American sports history. With a strong sense of perspective—of history—Scully emphasizes to listeners that baseball is a special little world, fascinating to be sure, but not to be overvalued. This chapter first provides a background on Scully's career in radio broadcasting before considering him from different generational perspectives. It argues that Scully “is more than just a well-loved sportscaster. He is the voice of L.A.” Angelenos' sense of civic identity resonates with that voice.
5

Sarah, Nouwen. Part III Regimes and Doctrines, Ch.36 International Criminal Law: Theory All Over the Place. Oxford University Press, 2016. http://dx.doi.org/10.1093/law/9780198701958.003.0037.

Full text
Abstract:
This chapter discusses the different theories employed in the field of international criminal law, which is now increasingly supported by theory. Case theories were developed after events had taken place; operational theories were produced to match complex facts; foundational theories were created to justify existing practices; external theories tried to make sense of the phenomenon of international criminal law as it had been observed; and so did the popular theories based on everyday encounters. Ago, rather than cogito, ergo sum was the field’s implicit maxim. Against this background one still finds that factual, operational, foundational, external theories prove to be less coherent when they are considered in light of each other. Rich theories could thus emerge from more joint theorizing among those working on variably factual, operational, foundational, and external theories, between scholars and practitioners, and between scholar-theorists and quotidian theorists.
APA, Harvard, Vancouver, ISO, and other styles
6

Nevill, Penelope. Military Sanctions Enforcement in the Absence of Express Authorization? Edited by Marc Weller. Oxford University Press, 2016. http://dx.doi.org/10.1093/law/9780199673049.003.0013.

Full text
Abstract:
This chapter examines the use of force to enforce sanctions in the absence of express authorization by the UN Security Council. After reviewing the history and background to enforcement of sanctions which primarily takes place at sea, the chapter addresses the question of what amounts to a use of force in this context, paying particular attention to whether sanctions enforcement is ‘law enforcement’ or a use of force in the sense of Article 2(4) of the UN Charter by examining the jurisprudence of the International Court of Justice and under the United Nations Law of the Sea Convention concerning forcible measures used or threatened by state authorities against vessels or oil rigs and platforms. The chapter concludes by assessing the legal bases for the use of force to enforce sanctions, including those imposed by the United Nations.
APA, Harvard, Vancouver, ISO, and other styles
7

Frédéric, Mégret. Part III Regimes and Doctrines, Ch.37 Theorizing the Laws of War. Oxford University Press, 2016. http://dx.doi.org/10.1093/law/9780198701958.003.0038.

Full text
Abstract:
This chapter suggests two predominant modes of theorizing about the laws of war—one ‘internal’, the other ‘external’—both providing a useful shorthand for two relatively irreducible types of exercises. Internal theorizing makes sense of the discipline among its practitioners and within bounds that are taken for granted. It is minimal in that its ambition is largely instrumental: providing the practitioners of the laws of war with the background necessary for them to function. External theorizing is less interested in the laws of war as a system than as an object; it is less focused on explaining the operation of the laws of war than understanding what the laws of war mean generally and for international law specifically. It is more explicitly theoretical precisely in that it seeks to highlight some of the ultimate functioning or purpose of the laws of war behind its dominant implicit theories.
APA, Harvard, Vancouver, ISO, and other styles
8

Kershaw, David. Principles of Takeover Regulation. Oxford University Press, 2016. http://dx.doi.org/10.1093/oso/9780199659555.001.0001.

Full text
Abstract:
Providing a clear and comprehensive exposition of takeover law in the UK, this book analyses the principles behind the Takeover Code, explaining the origin, effect, and operation of the rules and regulation with reference to practice and theory. Set in an economic context, the book includes coverage of the jurisprudence of the Takeover Panel, and offers an in-depth understanding of takeover regulation while also providing a degree of context and background to make sense of the regulation. A thoughtful explanation of takeover law, this is a valuable resource for the field of takeover law.
APA, Harvard, Vancouver, ISO, and other styles
9

Ocean, Rainbow. Love Password Book Large Print: Gift for Valentine, Alphabetical with Tabs, Internet Passcode Keeper Log Book for Couple, Co-Worker, Senior, Grandma, Grandpa / Valentines Day Background with Congratulations Greeting Fresh Spring Tulips Flowers. Independently Published, 2019.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Low background sensor"

1

An, Ning, Peng Li, Xiaoming Wang, Xiaojun Wu, and Yuntong Dang. "A Novel Home Safety IoT Monitoring Method Based on ZigBee Networking." In Proceeding of 2021 International Conference on Wireless Communications, Networking and Applications, 387–98. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-2456-9_40.

Full text
Abstract:
This paper presents the design of a home safety early-warning system based on ZigBee and WiFi wireless networking and sensor communication technology, taking home safety monitoring as the application background. In this study, a CC2530 chip was used as the ZigBee wireless communication module. A novel home-security IoT monitoring method was proposed through sensor triggering, design of a human activity trajectory perception algorithm, and optimization of wireless networking and communication. The system provides safety early warning and remote monitoring of household members, safeguarding home safety. Experimental design and result analysis show that the system achieves home monitoring and early warning at low software and hardware cost. It can serve as a reference for the design of sensor communication systems and as a technical reference for responding to an aging society.
APA, Harvard, Vancouver, ISO, and other styles
2

Khan, Muhammad Umar Karim, and Chong-Min Kyung. "Poisson Mixture Model for High Speed and Low-Power Background Subtraction." In Smart Sensors and Systems, 1–23. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-42234-9_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Panigrahi, Muktikanta, and Basudam Adhikari. "Fundamentals on Polyaniline based Composites." In Polyaniline based Composite for Gas Sensors, 1–43. IOR PRESS, 2021. http://dx.doi.org/10.34256/ioriip2121.

Full text
Abstract:
This chapter presents the background of work on polyaniline, N-substituted polyaniline, and acid-doped polyaniline, discussing both the problems associated with this polymer and the promise it holds. It also introduces nanocomposites of polyaniline/nanoclay and polyaniline/polyacrylic acid, and describes polymer-stabilized intrinsically conducting polymer composites, reviewing the state of the art. Finally, it reviews CH4 gas sensing, since CH4 is recognized as an inflammable gas; the main challenges for CH4 gas sensors lie in room-temperature operation and detection at low ppm concentrations.
APA, Harvard, Vancouver, ISO, and other styles
4

Binder, Amy J., and Kate Wood. "Who Are Conservative Students?" In Becoming Right. Princeton University Press, 2013. http://dx.doi.org/10.23943/princeton/9780691145372.003.0002.

Full text
Abstract:
This chapter asks who conservative students are by drawing on two sources. First are the surveys administered by the University of California at Los Angeles's Higher Education Research Institute to thousands of incoming college freshmen and graduating seniors during the 2000s. The second source is the data collected on different campuses, designed to shed light on the formative years of the students and alumni/ae in their families and their schools, their early experiences with conservatism, and how they acquired the politics bug. Using this information, the chapter examines the students' demographics, political identifications, precollege political styles, ideological orientations, religious affiliation, and social class background as well as their families' political backgrounds.
APA, Harvard, Vancouver, ISO, and other styles
5

K., Jayashree, and Chithambaramani R. "Big Data and Clustering Techniques." In Handbook of Research on Big Data Clustering and Machine Learning, 1–9. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-0106-1.ch001.

Full text
Abstract:
Big data has become a chief strength of innovation across academics, governments, and corporates. Big data comprises massive sensor data, raw and semi-structured log data of IT industries, and the exploded quantity of data from social media. Big data needs big storage, and this volume makes operations such as analytical operations, process operations, retrieval operations very difficult and time consuming. One way to overcome these difficult problems is to have big data clustered in a compact format. Thus, this chapter discusses the background of big data and clustering. It also discusses the various application of big data in detail. The various related work, research challenges of big data, and the future direction are addressed in this chapter.
APA, Harvard, Vancouver, ISO, and other styles
6

França, Reinaldo Padilha, Ana Carolina Borges Monteiro, Rangel Arthur, and Yuzo Iano. "An Overview of Narrowband Internet of Things (NB-IoT) in the Modern Era." In Advances in Wireless Technologies and Telecommunication, 26–45. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-4775-5.ch002.

Full text
Abstract:
NB-IoT is the most suitable mobile network technology for IoT applications that require exceptionally extensive coverage combined with extremely low power consumption, since these applications are generally characterized by low data rates and moderate reaction times, usually a few seconds, enabling the creation and development of solutions aimed at smart cities and smart environments. NB-IoT is a cellular LPWAN technology operating in the downlink within a bandwidth of 180 kHz with a sub-carrier spacing of 15 kHz, and in the uplink generally with single-tone transmission at 3.75 kHz or 15 kHz spacing, using coverage-enhancement techniques; it offers battery life of more than a decade and specific battery-saving features. IoT solutions connected through intelligent sensors in traffic lights and parking lots, city pollution sensors, and meters for energy, water, and sewage make systems more efficient, with NB-IoT connectivity handling the information collected by devices and allowing applications to be developed to address market needs. This chapter therefore provides an updated discussion of narrowband technologies in the context of the IoT, with a concise bibliographic background, categorizing and synthesizing their technological potential.
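As a quick numerical sanity check of the bandwidth and sub-carrier figures quoted in this abstract (the tone counts below are simple arithmetic on those figures, not taken from the chapter):

```python
# Back-of-the-envelope check of the NB-IoT figures quoted above:
# a 180 kHz carrier divided by the sub-carrier spacing gives the tone count.

BANDWIDTH_HZ = 180_000  # NB-IoT carrier bandwidth

def num_subcarriers(spacing_hz: int) -> int:
    """Number of sub-carriers that fit in the NB-IoT carrier."""
    return BANDWIDTH_HZ // spacing_hz

downlink_tones = num_subcarriers(15_000)  # 15 kHz spacing -> 12 tones
uplink_tones = num_subcarriers(3_750)     # 3.75 kHz spacing -> 48 tones

print(downlink_tones, uplink_tones)  # 12 48
```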
APA, Harvard, Vancouver, ISO, and other styles
7

Manji, Hadi. "Neurosyphilis and neuro-AIDS." In Oxford Textbook of Medicine, edited by Christopher Kennard, 6100–6109. Oxford University Press, 2020. http://dx.doi.org/10.1093/med/9780198746690.003.0598.

Full text
Abstract:
Invasion of the central nervous system occurs early in the course of syphilis infection. Neurosyphilis causes a meningitis, a myeloradiculopathy due to pachymeningitis, gummatous (granulomatous) cord and brain lesions; endarteritis may cause infarction and a low-grade meningoencephalitis affecting the brain results in dementia (general paralysis of the insane) and in the spinal cord, a sensory ataxic syndrome (tabes dorsalis). The introduction of highly active antiretroviral therapies has greatly reduced the frequency of these complications in patients with access to these treatments. However, newer complications are now increasingly recognized such as neurological immune reconstitution inflammatory syndrome, a compartmentalization syndrome (cerebrospinal fluid escape). This chapter looks at these and other important issues regarding the background, diagnosis, treatment, and outlook for neurosyphilis and neuro-AIDS.
APA, Harvard, Vancouver, ISO, and other styles
8

Nisar, Kashif, Angela Amphawan, and Suhaidi B. Hassan. "Comprehensive Structure of Novel Voice Priority Queue Scheduling System Model for VoIP over WLANs." In Global Applications of Pervasive and Ubiquitous Computing, 257–78. IGI Global, 2013. http://dx.doi.org/10.4018/978-1-4666-2645-4.ch027.

Full text
Abstract:
Voice over Internet Protocol (VoIP) has grown quickly in the world of telecommunication. Wireless Local Area Networks (WLANs) are the most performance assuring technology for wireless networks, and WLANs have facilitated high-rate voice services at low cost and good flexibility. In a voice conversation, each client works as a sender or a receiver depending on the direction of traffic flow over the network. A VoIP application requires high throughput, low packet loss, and a high fairness index over the network. The packets of VoIP streaming may experience drops because of the competition among the different kinds of traffic flow over the network. A VoIP application is also sensitive to delay and requires the voice packets to arrive on time from the sender to the receiver side without any delay over WLAN. The scheduling system model for VoIP traffic is an unresolved problem. The objectives of this paper are to identify scheduler issues. This comprehensive structure of Novel Voice Priority Queue (VPQ) scheduling system model for VoIP over WLAN discusses the essential background of the VPQ schedulers and algorithms. This paper also identifies the importance of the scheduling techniques over WLANs.
APA, Harvard, Vancouver, ISO, and other styles
9

Nahin, Paul J. "What You Need to Know to Read This Book." In The Logician and the Engineer. Princeton University Press, 2017. http://dx.doi.org/10.23943/princeton/9780691176000.003.0001.

Full text
Abstract:
This chapter details the background knowledge needed to read this book. Specifically, it assumes some knowledge of mathematics and electrical physics and an appreciation for the value of analytical reasoning—but no more than a technically minded college-prep high school junior or senior would have. In particular, the math level is that of algebra, including knowing how matrices multiply. The electrical background is simple: knowing (1) that electricity comes in two polarities (positive and negative) and that electrical charges of like polarity repel and of opposite polarity attract; and (2) understanding Ohm's law for resistors (that the voltage drop across a resistor in volts is the current through the resistor in amperes times the resistance in ohms) and the circuit laws of Kirchhoff (that the sum of the voltage drops around any closed loop is zero, which is an expression of the conservation of energy; the sum of all the currents into any node is zero).
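The circuit laws summarized in this abstract can be illustrated numerically; a minimal sketch with hypothetical component values (the voltage divider below is an example, not from the book):

```python
# Ohm's law and Kirchhoff's voltage law for a two-resistor series circuit.
# Component values are hypothetical, chosen only to illustrate the laws.

V_SOURCE = 9.0           # volts
R1, R2 = 1000.0, 2000.0  # ohms, in series

# Ohm's law: one current flows through both series resistors
current = V_SOURCE / (R1 + R2)  # amperes

# Voltage drop across each resistor (V = I * R)
v1 = current * R1
v2 = current * R2

# Kirchhoff's voltage law: drops around the closed loop sum to the source voltage
assert abs((v1 + v2) - V_SOURCE) < 1e-9
print(round(v1, 2), round(v2, 2))  # 3.0 6.0
```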
APA, Harvard, Vancouver, ISO, and other styles
10

Konta, Carla. "Nice to meet you, President Tito …" In The Legacy of J. William Fulbright, 241–60. University Press of Kentucky, 2019. http://dx.doi.org/10.5810/kentucky/9780813177700.003.0013.

Full text
Abstract:
The chapter explores the political backgrounds, strategic interests, and diplomatic consequences of Senator J. William Fulbright’s visit to socialist Yugoslavia in November 1964 to chair the signing of the Yugoslav Fulbright agreement. The mission tackled two issues: as a US senator, Fulbright repaired misunderstandings and low points of previous US-Yugoslav bilateral relations; as a politician who was intellectually committed to liberal internationalism, he confirmed his support for Yugoslav independence from the Soviet Union and, by observing the Yugoslav Communist regime, convinced himself of a different solution for Vietnam’s emerging tangle. By examining Fulbright and Yugoslav papers, the chapter argues that Yugoslav experimentation with national communism and its possible bridge function between East and West framed the senator’s politics of dissent over Vietnam on the assumption that Communist movements were not as monolithic as most US policy makers viewed them. America’s soft approach to Yugoslav communism corroborated Fulbright’s convictions and persuaded him that Yugoslavia could serve as a case study for the impasse in Vietnam.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Low background sensor"

1

Harbaugh, Svetlana V., Molly E. Davidson, Yaroslav G. Chushak, Nancy Kelley-Loughnane, and Morley O. Stone. "Riboswitch-based sensor in low optical background." In NanoScience + Engineering, edited by Emily M. Heckman, Thokchom B. Singh, and Junichi Yoshida. SPIE, 2008. http://dx.doi.org/10.1117/12.796122.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Imran, Muhammad, Naeem Ahmad, Khursheed Khursheed, Mattias ONils, and Najeem Lawal. "Low Complexity Background Subtraction for Wireless Vision Sensor Node." In 2013 Euromicro Conference on Digital System Design (DSD). IEEE, 2013. http://dx.doi.org/10.1109/dsd.2013.77.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Zou, Yu, Massimo Gottardi, and Matteo Perenzoni. "Live Demostration: Low Power Vision Sensor with Robust Dynamic Background Rejection." In 2018 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE, 2018. http://dx.doi.org/10.1109/iscas.2018.8351881.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Escher, Lukas, Thomas Rück, Simon Jobst, Martin König, and Rudolf Bierl. "Design and Characterization of a Low-Cost Photoacoustic Sensor for NO2 Using Lateral Illumination and Background Suppression." In 3D Image Acquisition and Display: Technology, Perception and Applications. Washington, D.C.: Optica Publishing Group, 2022. http://dx.doi.org/10.1364/3d.2022.jtu2a.10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

An, Mangmang, Chaosong Gao, Guangming Huang, Jun Liu, Yuan Mei, Xiangming Sun, Ping Yang, and Lan Feng Xiao. "A Low-Noise CMOS Pixel Direct Charge Sensor Topmetal-IIa for Low Background and Low Rate-Density Experiments." In Topical Workshop on Electronics for Particle Physics. Trieste, Italy: Sissa Medialab, 2018. http://dx.doi.org/10.22323/1.313.0041.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Anderson, Mary L., Joshua D. Daniel, Andrei N. Zagrai, and David J. Westpfahl. "Electro-Mechanical Impedance Measurements in an Imitated Low Earth Orbit Radiation Environment." In ASME 2016 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/imece2016-66855.

Full text
Abstract:
Piezoelectric sensors are used in many structural health monitoring (SHM) methods to interrogate the condition of the structure to which the sensors are affixed or embedded. Among SHM methods utilizing thin wafer piezoelectric sensors (PWAS), electro-mechanical impedance monitoring is seen as a promising approach to assess structural condition in the vicinity of a sensor. Using the converse and direct piezoelectric effects, this health monitoring method utilizes mechanical actuation and electric voltage to determine the impedance signature of the structure. If there is damage to the structure, there will be a change in the impedance signature. It is important to discern between actual damage and environmental effects on the piezoelectric ceramic sensors and the structure. If structural health monitoring is to be implemented in space structures on orbit, it is imperative to determine the effects of the extreme space environment on piezoelectric sensors and the structures to which they are affixed. The space environment comprises extreme temperatures, vacuum, atomic oxygen, microgravity, micro-meteoroids and debris, and significant amounts of radiation. Radiation in space comes from three sources: solar events, background cosmic radiation, and trapped particles in the Van Allen Belts. Radiation exposure to structures on orbit will vary significantly depending on the duration of the flight and the altitude and inclination of the orbit. In this contribution, the effect of gamma radiation on piezoelectric ceramic sensors and space-grade aluminum is investigated for gamma radiation exposure equivalent to 3 months, 6 months, and 1 year in Low Earth Orbit (LEO). An experiment was conducted at the White Sands Missile Range Gamma Radiation Facility using Cobalt-60 as the source of radiation. A free PWAS and a PWAS bonded to a small aluminum beam were exposed to increasing levels of gamma radiation. Impedance data were collected for both sensors after each radiation exposure.
The total radiation absorbed dose was 200 kRad (Si) by the end of the experiment. The results show that piezoelectric ceramic material is affected by gamma radiation. Over the course of increasing exposure levels to Cobalt-60, the impedance frequency of the free sensor increased with each absorbed dose. The impedance measurements of the sensor bonded to the aluminum beam reflect both the structural and the sensor's impedance. The data for this sensor show an increase in impedance amplitude with each level of absorbed dose. The mechanism at work in these impedance changes is suggested and future experimental work is identified. A survey of previous results on radiation exposure of piezoelectric ceramic sensors and aluminum alloys is presented and compared with the present findings.
APA, Harvard, Vancouver, ISO, and other styles
7

Moran, Steven E., Robert L. Law, Peter N. Craig, and Warren M. Goldberg. "Optically phase-locked electronic speckle pattern interferometer." In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1986. http://dx.doi.org/10.1364/oam.1986.fu3.

Full text
Abstract:
The design, operation, and characteristics of an optically phase-locked electronic speckle pattern interferometer (OPL-ESPI) are described. The OPL-ESPI sensor generates real-time equal Doppler speckle contours of vibrating objects from unstable sensor platforms with a Doppler resolution of 30 Hz and a maximum tracking range of ±5 MHz. Laboratory measurements of the sensor’s characteristics indicate a velocity tracking range of at least 8 μm/s to 8 cm/s, or a dynamic range of 40 dB. The optical phase-locked loop (OPLL) not only compensates for the deleterious effects of ambient background vibration, but also provides the basis for a new ESPI video signal processing technique which produces high contrast speckle contours. The ability to locate the OPLL lock point anywhere within the field of view of the sensor also allows examination of the spatial detail of low contrast speckle contours whose low visibility would otherwise make interpretation difficult. Since the OPL-ESPI system has local oscillator phase modulation capability, it offers the potential for detection of vibrations with amplitudes of less than a hundredth of a wavelength.
APA, Harvard, Vancouver, ISO, and other styles
8

Heaney, J. B., K. P. Stewart, R. A. Boucarut, P. W. Alley, and A. R. Korb. "Optical Components For Infrared And Submillimeter Wave Cryogenic Remote Sensors." In Space Optics for Astrophysics and Earth and Planetary Remote Sensing. Washington, D.C.: Optica Publishing Group, 1988. http://dx.doi.org/10.1364/soa.1988.wb22.

Full text
Abstract:
The spectroscopic instruments that will go into space aboard the Cosmic Background Explorer (COBE) satellite cover the electromagnetic spectrum from 1μm to about 10cm in wavelength. Two of the instruments - the Far Infrared Absolute Spectrometer (FIRAS) and the Diffuse Infrared Background Experiment (DIRBE) - will operate at temperatures below 10K. The combined challenges of broad wavelength coverage and low temperature operation drove component design across several technology frontiers in both spectral and temperature regimes. This report will discuss the design and test of the filters, polarizers and dichroic beamsplitters that have been selected to satisfy the cryogenic sensor performance requirements in response to the broad wavelength and temperature demands.
APA, Harvard, Vancouver, ISO, and other styles
9

Zou, Yu, Massimo Gottardi, Michela Lecca, and Matteo Perenzoni. "A Low-Power VGA Vision Sensor with Event Detection through Motion Computation based on Pixel-Wise Double-Threshold Background Subtraction and Local Binary Pattern Coding." In ESSCIRC 2019 - IEEE 45th European Solid State Circuits Conference (ESSCIRC). IEEE, 2019. http://dx.doi.org/10.1109/esscirc.2019.8902926.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Zhang, Mingshao, Zhou Zhang, El-Sayed Aziz, Sven K. Esche, and Constantin Chassapis. "Kinect-Based Universal Range Sensor for Laboratory Experiments." In ASME 2013 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/imece2013-62979.

Full text
Abstract:
The Microsoft Kinect is part of a wave of new sensing technologies. Its RGB-D camera is capable of providing high quality synchronized video of both color and depth data. Compared to traditional 3-D tracking techniques that use two separate RGB cameras’ images to calculate depth data, the Kinect is able to produce more robust and reliable results in object recognition and motion tracking. Also, due to its low cost, the Kinect provides more opportunities for use in many areas compared to traditional more expensive 3-D scanners. In order to use the Kinect as a range sensor, algorithms must be designed to first recognize objects of interest and then track their motions. Although a large number of algorithms for both 2-D and 3-D object detection have been published, reliable and efficient algorithms for 3-D object motion tracking are rare, especially using Kinect as a range sensor. In this paper, algorithms for object recognition and tracking that can make use of both RGB and depth data in different scenarios are introduced. Subsequently, efficient methods for scene segmentation including background and noise filtering are discussed. Taking advantage of those two kinds of methods, a prototype system that is capable of working efficiently and stably in various applications related to educational laboratories is presented.
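As a sketch of the kind of depth-based scene segmentation this abstract describes (a toy illustration, not the authors' actual algorithm), background and dropout pixels in a depth frame can be filtered with a simple per-pixel threshold:

```python
import numpy as np

# Toy depth-based background segmentation (not the paper's algorithm):
# pixels farther than a depth threshold, or with invalid (zero) depth
# readings, are treated as background.

def segment_foreground(depth_mm: np.ndarray, max_depth_mm: float = 1500.0) -> np.ndarray:
    """Return a boolean mask of foreground pixels in a depth frame."""
    valid = depth_mm > 0                 # Kinect reports 0 for unknown depth
    return valid & (depth_mm < max_depth_mm)

# Synthetic 2x3 depth frame (millimetres): near object, far wall, one dropout
frame = np.array([[800, 2500, 0],
                  [900, 2600, 2700]], dtype=np.float32)
mask = segment_foreground(frame)
print(mask.astype(int))  # [[1 0 0]
                         #  [1 0 0]]
```

A real pipeline would follow this mask with noise filtering and connected-component analysis before tracking, as the paper's discussion of scene segmentation suggests.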
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Low background sensor"

1

Delwiche, Michael, Boaz Zion, Robert BonDurant, Judith Rishpon, Ephraim Maltz, and Miriam Rosenberg. Biosensors for On-Line Measurement of Reproductive Hormones and Milk Proteins to Improve Dairy Herd Management. United States Department of Agriculture, February 2001. http://dx.doi.org/10.32747/2001.7573998.bard.

Full text
Abstract:
The original objectives of this research project were to: (1) develop immunoassays, photometric sensors, and electrochemical sensors for real-time measurement of progesterone and estradiol in milk, (2) develop biosensors for measurement of caseins in milk, and (3) integrate and adapt these sensor technologies to create an automated electronic sensing system for operation in dairy parlors during milking. The overall direction of research was not changed, although the work was expanded to include other milk components such as urea and lactose. A second generation biosensor for on-line measurement of bovine progesterone was designed and tested. Anti-progesterone antibody was coated on small disks of nitrocellulose membrane, which were inserted in the reaction chamber prior to testing, and a real-time assay was developed. The biosensor was designed using micropumps and valves under computer control, and assayed fluid volumes on the order of 1 ml. An automated sampler was designed to draw a test volume of milk from the long milk tube using a 4-way pinch valve. The system could execute a measurement cycle in about 10 min. Progesterone could be measured at concentrations low enough to distinguish luteal-phase from follicular-phase cows. The potential of the sensor to detect actual ovulatory events was compared with standard methods of estrus detection, including human observation and an activity monitor. The biosensor correctly identified all ovulatory events during its test period, but the variability at low progesterone concentrations triggered some false positives. Direct on-line measurement and intelligent interpretation of reproductive hormone profiles offer the potential for substantial improvement in reproductive management. A simple potentiometric method for measurement of milk protein was developed and tested. The method was based on the fact that proteins bind iodine.
When proteins are added to a solution of the redox couple iodine/iodide (I-I2), the concentration of free iodine is changed and, as a consequence, the potential between two electrodes immersed in the solution is changed. The method worked well with analytical casein solutions and accurately measured concentrations of analytical caseins added to fresh milk. When tested with actual milk samples, the correlation between the sensor readings and the reference lab results (of both total proteins and casein content) was inferior to that of analytical casein. A number of different technologies were explored for the analysis of milk urea, and a manometric technique was selected for the final design. In the new sensor, urea in the sample was hydrolyzed to ammonium and carbonate by the enzyme urease, and subsequent shaking of the sample with citric acid in a sealed cell allowed urea to be estimated as a change in partial pressure of carbon dioxide. The pressure change in the cell was measured with a miniature piezoresistive pressure sensor, and effects of background dissolved gases and vapor pressures were corrected for by repeating the measurement of pressure developed in the sample without the addition of urease. Results were accurate in the physiological range of milk, the assay was faster than the typical milking period, and no toxic reagents were required. A sampling device was designed and built to passively draw milk from the long milk tube in the parlor. An electrochemical sensor for lactose was developed starting with a three-cascaded-enzyme sensor, evolving into two enzymes and CO2[Fe (CN)6] as a mediator, and then into a microflow injection system using poly-osmium modified screen-printed electrodes. The sensor was designed to serve multiple milking positions, using a manifold valve, a sampling valve, and two pumps. Disposable screen-printed electrodes with enzymatic membranes were used. 
The sensor was optimized for electrode coating components, flow rate, pH, and sample size, and the results correlated well (r² = 0.967) with known lactose concentrations.
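The manometric urea measurement described in this abstract amounts to a differential pressure reading; a hypothetical sketch of the correction arithmetic (the calibration constant is invented for illustration, not taken from the report):

```python
# Sketch of the background-corrected manometric urea estimate described above.
# Urea is hydrolyzed by urease and read out as a CO2 partial-pressure change;
# a blank run without urease removes background dissolved gases and vapor.
# The calibration slope below is hypothetical.

CAL_SLOPE_MM_PER_KPA = 2.5  # hypothetical calibration: mM urea per kPa of CO2

def urea_mM(p_with_urease_kpa: float, p_without_urease_kpa: float) -> float:
    """Estimate milk urea from paired pressure measurements (kPa)."""
    delta_p = p_with_urease_kpa - p_without_urease_kpa  # background-corrected
    return CAL_SLOPE_MM_PER_KPA * delta_p

# Paired readings: sample with urease vs. blank without urease
print(round(urea_mM(3.2, 1.2), 6))  # 5.0
```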
APA, Harvard, Vancouver, ISO, and other styles
2

Horwitz, Benjamin, and Nicole M. Donofrio. Identifying unique and overlapping roles of reactive oxygen species in rice blast and Southern corn leaf blight. United States Department of Agriculture, January 2017. http://dx.doi.org/10.32747/2017.7604290.bard.

Full text
Abstract:
Plants and their fungal pathogens both produce reactive oxygen species (ROS). Cytotoxic ROS act both as stressors and signals in the plant-fungal interaction. In biotrophs, a compatible interaction generates little ROS, but is followed by disease. An incompatible interaction results in a strong oxidative burst by the host, limiting infection. Necrotrophs, in contrast, thrive on dead and dying cells in an oxidant-rich local environment. Rice blast, Magnaporthe oryzae, a hemibiotroph, occurs worldwide on rice and related hosts and can decimate enough rice each year to feed sixty million people. Cochliobolus heterostrophus, a necrotroph, causes Southern corn leaf blight (SLB), responsible for a major epidemic in the 1970s. The objectives of our study of ROS signaling and response in these two cereal pathogens were: Confocal imaging of ROS production using genetically encoded redox sensor in two pathosystems over time. Forward genetic screening of HyPer sensor lines in two pathosystems for fungal genes involved in altered ROS phenotypes. RNA-seq for discovery of genes involved in ROS-related stress and signaling in two pathosystems. Revisions to the research plan: Library construction in SLB was limited by low transformation efficiency, compounded by a protoplasting enzyme being unavailable during most of year 3. Thus Objective 2 for SLB re-focused to construction of sensor lines carrying deletion mutations in known or candidate genes involved in ROS response. Imaging on rice proved extremely challenging, so mutant screening and imaging were done with a barley-infecting line, already from the first year. In this project, ROS imaging at unprecedented time and spatial resolution was achieved, using genetically-encoded ratio sensors in both pathogens. This technology is currently in use for a large library of rice blast mutants in the ROS sensor background, and Southern corn leaf blight mutants in final stages of construction.
The imaging methods developed here to follow the redox state of plant pathogens in the host tissue should be applicable to fungal pathogens in general. Upon completion of mutant construction for SCLB we hope to achieve our goal of comparison between intracellular ROS status and response in hemibiotroph and necrotroph cereal pathogens.
3

Busso, Matías, and Sebastián Montaño. Signaling Specific Skills and the Labor Market of College Graduates. Inter-American Development Bank, September 2022. http://dx.doi.org/10.18235/0004454.

Full text
Abstract:
We study how signaling skills specific to the major affects labor market outcomes of college graduates. We rely on census-like data and a regression discontinuity design to study the impacts of a well-known award given to top performers on a mandatory nationwide exam, which constitutes a graduation requirement for college seniors in Colombia. Students who can rely on the signal when searching for a job have a wage premium of 7 to 12 percent compared to otherwise identical students. This positive return persists even five years after graduation. The signal mostly benefits workers who graduate from low-reputation colleges, and allows workers to find jobs in more productive firms and in sectors that better use their skills. We rule out that the positive wage returns are explained by human capital. The signal favors mostly less advantaged groups, implying that fewer information frictions about students' skills could potentially reduce earnings gaps. Our results imply that information policies such as those that formally certify specific skills can potentially improve the efficiency of talent allocation in the economy and level the playing field for workers who come from disadvantaged backgrounds.
4

TANG, Denise Tse-Shang, Stefanie TENG, Celine TAN, Bonnie LAM, and Christina YUAN. Building inclusive workplaces for lesbians and bisexual women in Hong Kong’s financial services industry. Centre for Cultural Research and Development, Lingnan University, April 2021. http://dx.doi.org/10.14793/ccrd2021001.

Full text
Abstract:
Workplace inclusion is a core component of corporate social responsibility (CSR) in Hong Kong. Workplace inclusion points to the need for employers to recognize diversity among employees, to acknowledge their contributions to the work environment and to raise professional standards for the work force. Diversity within a workplace indicates inclusion of persons with different backgrounds, such as race, ethnicity, sex, health status, sexual orientation and gender identity. Women are already under-represented at senior levels across various business sectors in Hong Kong. Lesbians and bisexual women face a double glass ceiling in the workplace as a result of both their gender and sexual orientation. Funded by Lingnan University’s Innovation and Impact Fund, and in partnership with Interbank Forum and Lesbians in Finance, Prof. Denise Tse-Shang Tang conducted an online survey and two focus groups targeting lesbians and bisexual women working in Hong Kong’s financial and banking industry. The aim of the study is to examine the specific challenges and barriers faced by lesbians and bisexual women in Hong Kong’s financial services industry. We found that only 37% of survey respondents were out at work, with 23% partially out to close colleagues. In other words, there are still key concerns with being out at work. On the issue of a glass ceiling for LGBT+ corporate employees, 18% of the survey respondents agreed and 47% somewhat agreed that such a ceiling exists. When asked whether it is harder for lesbians and bisexual women to come out in the workplace than it is for gay men, 32% agreed and 46% somewhat agreed. 27% agreed and 39% somewhat agreed with the statement that it is difficult for lesbians and bisexual women to climb the corporate ladder. Other findings pointed to the low visibility of lesbians and bisexual women in corporate settings, lack of mentorship, increased levels of stress and anxiety, and the fear of being judged as both a woman and a lesbian.
Masculine-presenting employees face significantly more scrutiny than cisgender female employees. Therefore, even though diversity and inclusion have been on the agenda for a better corporate work environment in Hong Kong, gaps remain in raising awareness of lesbian and bisexual women’s issues.
5

Rankin, Nicole, Deborah McGregor, Candice Donnelly, Bethany Van Dort, Richard De Abreu Lourenco, Anne Cust, and Emily Stone. Lung cancer screening using low-dose computed tomography for high risk populations: Investigating effectiveness and screening program implementation considerations: An Evidence Check rapid review brokered by the Sax Institute (www.saxinstitute.org.au) for the Cancer Institute NSW. The Sax Institute, October 2019. http://dx.doi.org/10.57022/clzt5093.

Full text
Abstract:
Background
Lung cancer is the number one cause of cancer death worldwide.(1) It is the fifth most commonly diagnosed cancer in Australia (12,741 cases diagnosed in 2018) and the leading cause of cancer death.(2) The number of years of potential life lost to lung cancer in Australia is estimated to be 58,450, similar to that of colorectal and breast cancer combined.(3) While tobacco control strategies are most effective for disease prevention in the general population, early detection via low-dose computed tomography (LDCT) screening in high-risk populations is a viable option for detecting asymptomatic disease in current (13%) and former (24%) Australian smokers.(4) The purpose of this Evidence Check review is to identify and analyse existing and emerging evidence for LDCT lung cancer screening in high-risk individuals to guide future program and policy planning.
Evidence Check questions
This review aimed to address the following questions:
1. What is the evidence for the effectiveness of lung cancer screening for higher-risk individuals?
2. What is the evidence of potential harms from lung cancer screening for higher-risk individuals?
3. What are the main components of recent major lung cancer screening programs or trials?
4. What is the cost-effectiveness of lung cancer screening programs (including studies of cost–utility)?
Summary of methods
The authors searched the peer-reviewed literature across three databases (MEDLINE, PsycINFO and Embase) for existing systematic reviews and original studies published between 1 January 2009 and 8 August 2019. Fifteen systematic reviews (of which 8 were contemporary) and 64 original publications met the inclusion criteria set across the four questions.
Key findings
Question 1: What is the evidence for the effectiveness of lung cancer screening for higher-risk individuals?
There is sufficient evidence from systematic reviews and meta-analyses of combined (pooled) data from screening trials (of high-risk individuals) to indicate that LDCT examination is clinically effective in reducing lung cancer mortality. In 2011, the landmark National Lung Screening Trial (NLST, a large-scale randomised controlled trial [RCT] conducted in the US) reported a 20% (95% CI 6.8%–26.7%; P=0.004) relative reduction in mortality among long-term heavy smokers over three rounds of annual screening. High-risk eligibility criteria were defined as people aged 55–74 years with a smoking history of ≥30 pack-years (one pack-year equals smoking 20 cigarettes each day for one year) and, for former smokers, ≥30 pack-years and having quit within the past 15 years.(5) All-cause mortality was reduced by 6.7% (95% CI, 1.2%–13.6%; P=0.02). Initial data from the second landmark RCT, the NEderlands–Leuvens Longkanker Screenings ONderzoek (known as the NELSON trial), have found an even greater reduction of 26% (95% CI, 9%–41%) in lung cancer mortality, with full trial results yet to be published.(6, 7) Pooled analyses, including several smaller-scale European LDCT screening trials insufficiently powered in their own right, collectively demonstrate a statistically significant reduction in lung cancer mortality (RR 0.82, 95% CI 0.73–0.91).(8) Despite the reduction in all-cause mortality found in the NLST, pooled analyses of seven trials found no statistically significant difference in all-cause mortality (RR 0.95, 95% CI 0.90–1.00).(8) However, cancer-specific mortality is currently the most relevant outcome in cancer screening trials. These seven trials demonstrated a significantly greater proportion of early-stage cancers in LDCT groups compared with controls (RR 2.08, 95% CI 1.43–3.03). Thus, when considering results across mortality outcomes and early-stage cancers diagnosed, LDCT screening is considered to be clinically effective.
Question 2: What is the evidence of potential harms from lung cancer screening for higher-risk individuals?
The harms of LDCT lung cancer screening include false positive tests and the consequences of unnecessary invasive follow-up procedures for conditions that are eventually diagnosed as benign. While LDCT screening leads to an increased frequency of invasive procedures, it does not result in greater mortality soon after an invasive procedure (in trial settings when compared with the control arm).(8) Overdiagnosis, exposure to radiation, psychological distress and an impact on quality of life are other known harms. Systematic review evidence indicates the benefits of LDCT screening are likely to outweigh the harms. The potential harms are likely to be reduced as refinements are made to LDCT screening protocols through: i) the application of risk prediction models (e.g. the PLCOm2012), which enable a more accurate selection of the high-risk population through the use of specific criteria (beyond age and smoking history); ii) the use of nodule management algorithms (e.g. Lung-RADS, PanCan), which assist in the diagnostic evaluation of screen-detected nodules and cancers (e.g. more precise volumetric assessment of nodules); and iii) more judicious selection of patients for invasive procedures. Recent evidence suggests a positive LDCT result may transiently increase psychological distress but does not have long-term adverse effects on psychological distress or health-related quality of life (HRQoL). With regard to smoking cessation, there is no evidence to suggest screening participation invokes a false sense of assurance in smokers, nor a reduction in motivation to quit. The NELSON and Danish trials found no difference in smoking cessation rates between LDCT screening and control groups. Higher net cessation rates, compared with the general population, suggest those who participate in screening trials may already be motivated to quit.
Question 3: What are the main components of recent major lung cancer screening programs or trials?
There are no systematic reviews that capture the main components of recent major lung cancer screening trials and programs. We extracted evidence from original studies and clinical guidance documents and organised this into key groups to form a concise set of components for potential implementation of a national lung cancer screening program in Australia:
1. Identifying the high-risk population: recruitment, eligibility, selection and referral
2. Educating the public, people at high risk and healthcare providers; this includes creating awareness of lung cancer, the benefits and harms of LDCT screening, and shared decision-making
3. Components necessary for health services to deliver a screening program:
a. Planning phase: e.g. human resources to coordinate the program, electronic data systems that integrate medical records information and link to an established national registry
b. Implementation phase: e.g. human and technological resources required to conduct LDCT examinations, interpretation of reports and communication of results to participants
c. Monitoring and evaluation phase: e.g. monitoring outcomes across patients, radiological reporting, compliance with established standards and a quality assurance program
4. Data reporting and research, e.g. audit and feedback to multidisciplinary teams, reporting outcomes to enhance international research into LDCT screening
5. Incorporation of smoking cessation interventions, e.g. specific programs designed for LDCT screening or referral to existing community or hospital-based services that deliver cessation interventions.
Most original studies are single-institution evaluations that contain descriptive data about the processes required to establish and implement a high-risk population-based screening program.
Across all studies there is a consistent message as to the challenges and complexities of establishing LDCT screening programs to attract people at high risk who will receive the greatest benefits from participation. With regard to smoking cessation, evidence from one systematic review indicates the optimal strategy for incorporating smoking cessation interventions into an LDCT screening program is unclear. There is widespread agreement that LDCT screening attendance presents a ‘teachable moment’ for cessation advice, especially among those people who receive a positive scan result. Smoking cessation is an area of significant research investment; for instance, eight US-based clinical trials are now underway that aim to address how best to design and deliver cessation programs within large-scale LDCT screening programs.(9)
Question 4: What is the cost-effectiveness of lung cancer screening programs (including studies of cost–utility)?
Assessing the value or cost-effectiveness of LDCT screening involves a complex interplay of factors including data on effectiveness and costs, and institutional context. A key input is data about the effectiveness of potential and current screening programs with respect to case detection, and the likely outcomes of treating those cases sooner (in the presence of LDCT screening) as opposed to later (in the absence of LDCT screening). Evidence about the cost-effectiveness of LDCT screening programs has been summarised in two systematic reviews. We identified a further 13 studies—five modelling studies, one discrete choice experiment and seven articles—that used a variety of methods to assess cost-effectiveness. Three modelling studies indicated LDCT screening was cost-effective in the settings of the US and Europe. Two studies—one from Australia and one from New Zealand—reported LDCT screening would not be cost-effective using NLST-like protocols.
We anticipate that, following the full publication of the NELSON trial, cost-effectiveness studies will likely be updated with new data that reduce uncertainty about factors that influence modelling outcomes, including the findings of indeterminate nodules.
Gaps in the evidence
There is a large and accessible body of evidence as to the effectiveness (Q1) and harms (Q2) of LDCT screening for lung cancer. Nevertheless, there are significant gaps in the evidence about the program components that are required to implement an effective LDCT screening program (Q3). Questions about LDCT screening acceptability and feasibility were not explicitly included in the scope. However, as the evidence is based primarily on US programs and UK pilot studies, the relevance to the local setting requires careful consideration. The Queensland Lung Cancer Screening Study provides feasibility data about clinical aspects of LDCT screening but little about program design. The International Lung Screening Trial is still in the recruitment phase and findings are not yet available for inclusion in this Evidence Check. The Australian Population Based Screening Framework was developed to “inform decision-makers on the key issues to be considered when assessing potential screening programs in Australia”.(10) As the Framework is specific to population-based, rather than high-risk, screening programs, there is a lack of clarity about transferability of criteria. However, the Framework criteria do stipulate that a screening program must be acceptable to “important subgroups such as target participants who are from culturally and linguistically diverse backgrounds, Aboriginal and Torres Strait Islander people, people from disadvantaged groups and people with a disability”.(10) An extensive search of the literature highlighted that there is very little information about the acceptability of LDCT screening to these population groups in Australia.
Yet they are part of the high-risk population.(10) There are also considerable gaps in the evidence about the cost-effectiveness of LDCT screening in different settings, including Australia. The evidence base in this area is rapidly evolving and is likely to include new data from the NELSON trial and incorporate data about the costs of targeted- and immuno-therapies as these treatments become more widely available in Australia.
