Academic literature on the topic 'Instrumentation and techniques of general interest'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Instrumentation and techniques of general interest.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Journal articles on the topic "Instrumentation and techniques of general interest"

1

McLean, Ian S., Ding-Qiang Su, Thomas Armstrong, Noah Brosch, Martin Cullum, Michel Dennefeld, George Jacoby, et al. "Commission 9: Instrumentation and Techniques: (Instrumentation et Techniques)." Transactions of the International Astronomical Union 24, no. 1 (2000): 316–27. http://dx.doi.org/10.1017/s0251107x00003266.

Full text
Abstract:
The last triennium, and coincidentally the last few years of the 20th century, has been a most remarkable time for Commission 9, and for astronomy in general. Ground-based astronomy in particular has received an enormous boost due to the arrival of an astonishing array of new telescopes, novel instruments and innovative techniques. For those of us closely involved in developing new observatories, instrumentation or detectors, the last few years have been rather hectic! As an astronomer with a long-time interest in the development of new instruments, what amazes me is the breadth of technology and the visionary scope of all these incredible new achievements. Many of the very large 8-10 meter class telescopes are now coming into full operation – yet, just as this is happening, numerous smaller “survey” telescopes are providing a wealth of new sources. Adaptive optics is being practiced at many sites and diffraction-limited imaging from the ground is now a reality. Several optical-IR interferometers are now working and more are coming along very soon. Detectors continue to get bigger and better, especially for the infrared, and instrumentation is increasingly more sophisticated, complex and efficient. Remote observing, robotic telescopes and global networks of telescopes are common, and international collaborations are larger and stronger than ever before.
APA, Harvard, Vancouver, ISO, and other styles
2

Martín, Francisco Ferrero, Marta Valledor Llopis, Juan C. Campo Rodríguez, Alberto López Martínez, Ana Soldado Cabezuelo, María T. Fernández-Arguelles, and José M. Costa-Fernández. "Optoelectronic Instrumentation and Measurement Strategies for Optical Chemical (Bio)Sensing." Applied Sciences 11, no. 17 (August 26, 2021): 7849. http://dx.doi.org/10.3390/app11177849.

Full text
Abstract:
There is a growing interest in the development of sensitive, portable, and low-cost instrumentation for optical chemical (bio)sensing. Such instrumentation can allow real-time decision-making for industry, farmers, and researchers. The combination of optical fiber schemes, luminescence spectroscopy techniques, and new materials for sensor immobilization has allowed the growth of optical sensors. This article focuses on the development of low-cost optoelectronic instrumentation and measurement strategies for optical chemical (bio)sensing. Most of the articles in this field have focused on the chemical sensors themselves, although few have covered the design process for optoelectronic instrumentation. This article tries to fill this gap by presenting designs for real applications, as carried out by the authors. We also offer an introduction to the optical devices and optical measurement techniques used in this field to allow a full understanding of the applications.
APA, Harvard, Vancouver, ISO, and other styles
3

Mendenhall, Stephen, Dillon Mobasser, Katherine Relyea, and Andrew Jea. "Spinal instrumentation in infants, children, and adolescents: a review." Journal of Neurosurgery: Pediatrics 23, no. 1 (January 2019): 1–15. http://dx.doi.org/10.3171/2018.10.peds18327.

Full text
Abstract:
OBJECTIVE The evolution of pediatric spinal instrumentation has progressed in the last 70 years since the popularization of the Harrington rod showing the feasibility of placing spinal instrumentation into the pediatric spine. Although lacking in pediatric-specific spinal instrumentation, when possible, adult instrumentation techniques and tools have been adapted for the pediatric spine. A new generation of pediatric neurosurgeons with interest in complex spine disorders has pushed the field forward, while keeping the special nuances of the growing immature spine in mind. The authors sought to review their own experience with various types of spinal instrumentation in the pediatric spine and document the state of the art for pediatric spine surgery. METHODS The authors retrospectively reviewed patients in their practice who underwent complex spine surgery. Patient demographics, operative data, and perioperative complications were recorded. At the same time, the authors surveyed the literature for spinal instrumentation techniques that have been utilized in the pediatric spine. The authors chronicle the past and present of pediatric spinal instrumentation, and speculate about its future. RESULTS The medical records of the first 361 patients who underwent 384 procedures involving spinal instrumentation from July 1, 2007, to May 31, 2018, were analyzed. The mean age at surgery was 12 years and 6 months (range 3 months to 21 years and 4 months). The types of spinal instrumentation utilized included occipital screws (94 cases); C1 lateral mass screws (115 cases); C2 pars/translaminar screws (143 cases); subaxial cervical lateral mass screws (95 cases); thoracic and lumbar spine traditional-trajectory and cortical-trajectory pedicle screws (234 cases); thoracic and lumbar sublaminar, subtransverse, and subcostal polyester bands (65 cases); S1 pedicle screws (103 cases); and S2 alar-iliac/iliac screws (56 cases).
Complications related to spinal instrumentation included hardware-related skin breakdown (1.8%), infection (1.8%), proximal junctional kyphosis (1.0%), pseudarthroses (1.0%), screw malpositioning (0.5%), CSF leak (0.5%), hardware failure (0.5%), graft migration (0.3%), nerve root injury (0.3%), and vertebral artery injury (0.3%). CONCLUSIONS Pediatric neurosurgeons with an interest in complex spine disorders in children should develop a comprehensive armamentarium of safe techniques for placing rigid and nonrigid spinal instrumentation even in the smallest of children, with low complication rates. The authors’ review provides some benchmarks and outcomes for comparison, and furnishes a historical perspective of the past and future of pediatric spine surgery.
APA, Harvard, Vancouver, ISO, and other styles
4

Livas, Christos, Albert Cornelis Jongsma, and Yijin Ren. "Enamel Reduction Techniques in Orthodontics: A Literature Review." Open Dentistry Journal 7, no. 1 (October 31, 2013): 146–51. http://dx.doi.org/10.2174/1874210601307010146.

Full text
Abstract:
Artificial abrasion of interproximal surfaces has been described for almost seventy years as an orthodontic intervention for achieving and maintaining an ideal treatment outcome. A variety of terms and approaches have been introduced throughout this period, reflecting clinicians’ growing interest. Nevertheless, widespread recognition of the enamel stripping technique was initiated by the advent of bonded orthodontic attachments and a two-article series by Sheridan in the 1980s. Since then, experimental and clinical research has focused on the investigation of instrumentation efficacy and potential iatrogenic sequelae related to interproximal stripping. This review discusses the evolution, technical aspects and trends of enamel reduction procedures as documented in the literature.
APA, Harvard, Vancouver, ISO, and other styles
5

Hofmann, D., and Y. V. Tarbeyev. "Theoretical, physical and metrological problems of further development of measurement techniques and instrumentation in science and technology." ACTA IMEKO 3, no. 1 (May 7, 2014): 23. http://dx.doi.org/10.21014/acta_imeko.v3i1.193.

Full text
Abstract:
This is a reissue of a paper which appeared in ACTA IMEKO 1979, Proceedings of the 8th IMEKO Congress of the International Measurement Confederation, "Measurement for progress in science and technology", 21-27 May 1979, Moscow, vol. 3, pp. 607-626. The paper shows the common interest of both metrologists and representatives of science and technology in the constant improvement of measurements, as well as general trends in the development of research in the field of metrology, measurement technology and instrumentation at the present-day stage. Problems of general metrology and of improving systems of units and standards ("natural" standards in particular) are considered in detail.
APA, Harvard, Vancouver, ISO, and other styles
6

Lieu, David. "Ultrasound Physics and Instrumentation for Pathologists." Archives of Pathology & Laboratory Medicine 134, no. 10 (October 1, 2010): 1541–56. http://dx.doi.org/10.5858/2009-0730-ra.1.

Full text
Abstract:
Context.—Interest in pathologist-performed ultrasound-guided fine-needle aspiration is increasing. Educational courses discuss clinical ultrasound and biopsy techniques but not ultrasound physics and instrumentation. Objective.—To review modern ultrasound physics and instrumentation to help pathologists understand the basis of modern ultrasound. Data Sources.—A review of recent literature and textbooks was performed. Conclusions.—Ultrasound physics and instrumentation are the foundations of clinical ultrasound. The key physical principle is the piezoelectric effect. When stimulated by an electric current, certain crystals vibrate and produce ultrasound. A hand-held transducer converts electricity into ultrasound, transmits it into tissue, and listens for reflected ultrasound to return. The returning echoes are converted into electrical signals and used to create a 2-dimensional gray-scale image. Scanning at a high frequency improves axial resolution but has low tissue penetration. Electronic focusing moves the long-axis focus to the depth of the object of interest and improves lateral resolution. The short-axis focus in 1-dimensional transducers is fixed, which results in poor elevational resolution away from the focal zone. Using multiple foci improves lateral resolution but degrades temporal resolution. The sonographer can adjust the dynamic range to change contrast and bring out subtle masses. Contrast resolution is limited by processing speed, monitor resolution, and gray-scale perception of the human eye. Ultrasound is an evolving field. New technologies include miniaturization, spatial compound imaging, tissue harmonics, and multidimensional transducers. Clinical cytopathologists who understand ultrasound physics, instrumentation, and clinical ultrasound are ready for the challenges of cytopathologist-performed ultrasound-guided fine-needle aspiration and core-needle biopsy in the 21st century.
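The frequency trade-off this abstract describes (higher frequency sharpens axial resolution but penetrates less) follows directly from the wavelength. As a rough illustration, not taken from the cited review, here is a sketch assuming a nominal soft-tissue sound speed of 1540 m/s and a two-cycle pulse:

```python
C_TISSUE = 1540.0  # assumed speed of sound in soft tissue, m/s

def axial_resolution_mm(freq_mhz, cycles_per_pulse=2):
    """Axial resolution ~ half the spatial pulse length (cycles x wavelength)."""
    wavelength_mm = C_TISSUE / (freq_mhz * 1e6) * 1e3
    return cycles_per_pulse * wavelength_mm / 2.0

for f in (3.5, 7.5, 12.0):
    print(f"{f:4.1f} MHz -> ~{axial_resolution_mm(f):.2f} mm axial resolution")
```

At 7.5 MHz this gives roughly 0.2 mm of axial resolution; lowering the frequency coarsens resolution but improves penetration.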
APA, Harvard, Vancouver, ISO, and other styles
7

Davis, John. "Introduction to the Joint Commission Meeting on High Resolution Imaging from the Ground." Highlights of Astronomy 8 (1989): 545–46. http://dx.doi.org/10.1017/s1539299600008261.

Full text
Abstract:
As a result of advances in instrumentation and techniques, from radio through to optical wavelengths, we have before us the prospect of producing very high resolution images of a wide range of objects across this entire spectral range. This prospect, and the new knowledge and discoveries that may be anticipated from it, lie behind an upsurge in interest in high resolution imaging from the ground. Several new high angular resolution instruments for radio, infrared, and optical wavelengths are expected to come into operation before the 1991 IAU General Assembly.
APA, Harvard, Vancouver, ISO, and other styles
8

Kim, Junseok A., and Karen D. Davis. "Magnetoencephalography: physics, techniques, and applications in the basic and clinical neurosciences." Journal of Neurophysiology 125, no. 3 (March 1, 2021): 938–56. http://dx.doi.org/10.1152/jn.00530.2020.

Full text
Abstract:
Magnetoencephalography (MEG) is a technique used to measure the magnetic fields generated from neuronal activity in the brain. MEG has a high temporal resolution on the order of milliseconds and provides a more direct measure of brain activity when compared with hemodynamic-based neuroimaging methods such as magnetic resonance imaging and positron emission tomography. The current review focuses on basic features of MEG such as the instrumentation and the physics that are integral to the signals that can be measured, and the principles of source localization techniques, particularly the physics of beamforming and the techniques that are used to localize the signal of interest. In addition, we review several metrics that can be used to assess functional coupling in MEG and describe the advantages and disadvantages of each approach. Lastly, we discuss the current and future applications of MEG.
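To illustrate the beamforming physics mentioned above (a generic textbook construction, not code from the cited review), the widely used LCMV beamformer chooses sensor weights that pass the modelled source with unit gain while minimizing output variance; the covariance and lead field below are random stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_samples = 32, 1000

# Hypothetical sensor covariance, regularized so it is invertible
noise = rng.standard_normal((n_sensors, n_samples))
C = noise @ noise.T / n_samples + 1e-6 * np.eye(n_sensors)

# Hypothetical lead-field vector for one source location and orientation
L = rng.standard_normal((n_sensors, 1))

# LCMV weights: w = C^-1 L / (L^T C^-1 L)
Ci = np.linalg.inv(C)
w = (Ci @ L) / (L.T @ Ci @ L)

gain = (w.T @ L).item()  # unit-gain constraint at the modelled source
assert abs(gain - 1.0) < 1e-9
```

The unit-gain check follows algebraically from the weight formula, which is why it holds regardless of the random inputs.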
APA, Harvard, Vancouver, ISO, and other styles
9

Gilli, R., F. Mattea, G. Martin, and M. Valente. "X-RAY MICROTOMOGRAPHY TO CHARACTERIZE THE ROOT CANAL VOLUME EXTRACTED IN ENDODONTIC INSTRUMENTATION." Anales AFA 33, no. 3 (October 15, 2022): 70–76. http://dx.doi.org/10.31527/analesafa.2022.33.3.70.

Full text
Abstract:
During the last decades, the analytical techniques of X-ray absorption contrast imaging have systematically gained greater relevance, mainly due to the ability to attain non-destructive exploration of the sample interior. The significant improvement in spatial resolution offered by X-ray micro-tomography, as compared to conventional computed tomography, has motivated its adoption in many biomedical fields, among which dentistry stands out. Particularly for endodontics, microCT appears as a method of remarkable potential interest to study procedures involved in root canal treatments, where one of the main needs is the anatomical characterization of the root canal in the teeth. The present work reports on the adaptation of the microCT equipment of the LIIFAMIRx laboratory at the E. Gaviola Physics Institute, CONICET and UNC, thus allowing the acquisition of radiographic images of dental samples of interest, to be used later in algorithms for tomographic reconstruction and volume segmentation. As a result, radiographic images of premolar teeth were obtained with good contrast between the different materials present, along with three-dimensional representations whose visualization is comparable with the real samples. Moreover, it was possible to characterize the root canal volume of the tooth both in its natural form and after having undergone the instrumentation process in which the pulp tissue is extracted.
APA, Harvard, Vancouver, ISO, and other styles
10

Woike, Mark, Ali Abdul-Aziz, Nikunj Oza, and Bryan Matthews. "New Sensors and Techniques for the Structural Health Monitoring of Propulsion Systems." Scientific World Journal 2013 (2013): 1–10. http://dx.doi.org/10.1155/2013/596506.

Full text
Abstract:
The ability to monitor the structural health of rotating components, especially in the hot sections of turbine engines, is of major interest to the aero community for improving engine safety and reliability. The use of instrumentation for these applications remains very challenging. It requires sensors and techniques that are highly accurate, are able to operate in a high-temperature environment, and can detect minute changes and hidden flaws before catastrophic events occur. The National Aeronautics and Space Administration (NASA), through the Aviation Safety Program (AVSP), has taken a lead role in the development of new sensor technologies and techniques for the in situ structural health monitoring of gas turbine engines. This paper presents a summary of key results and findings obtained from three different structural health monitoring approaches that have been investigated. This includes evaluating the performance of a novel microwave blade tip clearance sensor; a vibration-based crack detection technique using an externally mounted capacitive blade tip clearance sensor; and lastly the results of using data-driven anomaly detection algorithms for detecting cracks in a rotating disk.
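The data-driven anomaly detection mentioned last reduces, in its simplest form, to testing new samples against recent baseline statistics. A toy sketch, not NASA's actual algorithm (the signal, window, and threshold are invented), flagging a sudden shift in a simulated blade-tip-clearance trace:

```python
import numpy as np

def zscore_anomalies(signal, window=50, thresh=4.0):
    """Flag samples deviating more than `thresh` sigmas from a trailing window."""
    flags = np.zeros(len(signal), dtype=bool)
    for i in range(window, len(signal)):
        ref = signal[i - window:i]
        z = abs(signal[i] - ref.mean()) / (ref.std() + 1e-12)
        flags[i] = z > thresh
    return flags

rng = np.random.default_rng(1)
clearance = rng.normal(2.0, 0.01, 500)  # steady tip clearance, mm
clearance[400] += 0.2                   # injected jump, e.g. a crack-induced shift

flags = zscore_anomalies(clearance)
assert flags[400]  # the injected ~20-sigma shift is flagged
```

Real engine monitoring replaces the trailing-window statistics with trained models, but the flag-on-deviation structure is the same.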
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Instrumentation and techniques of general interest"

1

Chippendale, Aaron Paul. "Detecting cosmological reionization on large scales through the 21 cm HI line." Thesis, The University of Sydney, 2009. http://hdl.handle.net/2123/6256.

Full text
Abstract:
This thesis presents the development of new techniques for measuring the mean redshifted 21 cm line of neutral hydrogen during reionization. This is called the 21 cm cosmological reionization monopole. Successful observations could identify the nature of the first stars and test theories of galaxy and large-scale structure formation. The goal was to specify, construct and calibrate a portable radio telescope to measure the 21 cm monopole in the frequency range 114 MHz to 228 MHz, which corresponds to the redshift range 11.5 > z > 5.2. The chosen approach combined a frequency independent antenna with a digital correlation spectrometer to form a correlation radiometer. The system was calibrated against injected noise and against a modelled galactic foreground. Components were specified for calibration of the sky spectrum to 1 mK/MHz relative accuracy. Comparing simulated and measured spectra showed that bandpass calibration is limited to 11 K, that is 1% of the foreground emission, due to larger than expected frequency dependence of the antenna pattern. Overall calibration, including additive contributions from the system and the radio foreground, is limited to 60 K. This is 160 times larger than the maximum possible monopole amplitude at redshift eight. Future work will refine and extend the system known as the Cosmological Reionization Experiment Mark I (CoRE Mk I).
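For a sense of scale, the sensitivity figures above can be related to integration time through the ideal radiometer equation ΔT = T_sys/√(Bτ). A back-of-the-envelope sketch, where the 300 K sky-dominated system temperature is our assumption rather than a figure from the thesis:

```python
def integration_time_s(t_sys_k, delta_t_k, bandwidth_hz):
    """Invert the ideal radiometer equation delta_T = T_sys / sqrt(B * tau)."""
    return (t_sys_k / delta_t_k) ** 2 / bandwidth_hz

# Assumed values: ~300 K sky-dominated system temperature near 150 MHz,
# a 1 MHz channel, and a 1 mK sensitivity target.
tau = integration_time_s(300.0, 1e-3, 1e6)
print(f"{tau:.0f} s (~{tau / 3600.0:.0f} h)")  # 90000 s (~25 h)
```

This is why monopole experiments are limited by calibration systematics rather than thermal noise: the required integration time is modest, but the signal sits under a foreground orders of magnitude brighter.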
APA, Harvard, Vancouver, ISO, and other styles
3

Mate, Sujay. "Développement d'un simulateur du ciel pour les instruments à grand champ de vue X-gamma en orbite terrestre basse : application à l'évaluation des performances du spectro-imageur SVOM-ECLAIRs." Thesis, Toulouse 3, 2021. http://www.theses.fr/2021TOU30031.

Full text
Abstract:
Gamma-Ray Bursts (GRBs) are the most luminous explosions in the universe. They are observed as bright flashes of gamma/X-rays (lasting a few milliseconds to a few tens of seconds) followed by an "afterglow" emission (usually at longer wavelengths). They are produced either by the merger of two compact objects (a pair of neutron stars, or a neutron star and a black hole) or by the core collapse of a massive star (>15 solar masses). GRBs are excellent candidates to study physics at extreme energies and densities. They also constitute important astrophysical tools to probe the history of the universe, as they are observed at all epochs. The upcoming (June 2022) Sino-French mission SVOM (Space-based multi-band astronomical Variable Objects Monitor) aims to detect and study GRBs using dedicated space- and ground-based instruments to obtain multi-wavelength coverage. The primary instrument onboard the SVOM spacecraft is ECLAIRs, a wide-field (~2 sr) coded-mask imager sensitive in the 4-150 keV energy range. ECLAIRs will detect and localise GRBs (and other high-energy transients) in near real time using an onboard trigger. ECLAIRs will encounter a high and variable background due to the wide field of view (FoV) and the pointing strategy of SVOM, which makes the Earth transit through the FoV. A new method (called the Particle Interaction Recycling Approach, or PIRA), based on Monte-Carlo simulations (GEANT4), was developed to estimate the variable background accurately and rapidly. The simulations of the background are complemented with simulations of X-ray sources and gamma-ray bursts to generate complete observation scenarios. The variable background of ECLAIRs poses challenges to detecting GRBs and affects the sensitivity of the instrument.
We use the simulated data to evaluate the performance of the onboard trigger, in particular the impact of the variable background and its sensitivity to GRB characteristics (duration, temporal profile, spectral shape, position in the FoV). ECLAIRs will send all detected photons to the ground. In addition, the availability of larger computational power and better knowledge of the context (e.g. background variations, sources in the FoV, etc.) on the ground motivates us to develop an "offline trigger" to overcome the challenges faced by the onboard trigger. An algorithm based on wavelet transforms is proposed to detect GRBs as part of the offline trigger. The work in this thesis, i.e. the development of PIRA, the evaluation of the instrument's performance, and the development of a trigger method, provides a sound basis for building an effective offline trigger that will complement the onboard trigger and improve the overall performance of the SVOM mission.
APA, Harvard, Vancouver, ISO, and other styles
4

Walter, Nicolas. "Détection de primitives par une approche discrète et non linéaire : application à la détection et la caractérisation de points d'intérêt dans les maillages 3D." Phd thesis, Université de Bourgogne, 2010. http://tel.archives-ouvertes.fr/tel-00808216.

Full text
Abstract:
This manuscript is dedicated to the detection and characterization of points of interest in meshes. We first show the limitations of curvature measurement on sharp edges, the measure usually used in the field of mesh analysis. We then present a generalization of the SUSAN operator to meshes, named SUSAN-3D. The proposed saliency measure quantifies local variations of the surface and directly classifies the analysed points into five categories: salient, crest, flat, valley and hollow. The meshes considered are uniform manifolds with or without boundaries, and may be regular or irregular, dense or sparse, and noisy or noise-free. We then study the performance of SUSAN-3D by comparing it with two curvature operators: Meyer's operator and Stokely's operator. Two methods for comparing the saliency and curvature measures are proposed and applied to two types of objects: spheres and cubes. Spheres allow the study of accuracy on differentiable surfaces, and cubes on two types of non-differentiable contours: edges and corners. Through these studies we show the advantages of our method, namely high repeatability of the measure, low sensitivity to noise, and the ability to analyse sparse surfaces. Finally, we present a multi-scale extension and an automatic determination of the analysis scales, which make SUSAN-3D a generic and autonomous operator for mesh analysis and characterization.
APA, Harvard, Vancouver, ISO, and other styles
5

Ferrara, G. "Banche dati per le biblioteche di scienze della terra: Georef, Web of Science, Scirus e Google Scholar." Thesis, 2009. http://hdl.handle.net/2122/5428.

Full text
Abstract:
Since their introduction, first in paper form, then on computer media, and finally in the electronic versions available via the web, databases have always been a fundamental, indispensable and irreplaceable research tool. They are no less so for librarians. This premise should push us to give ever greater value to reference and information-retrieval services, given that the tools at our disposal grow day by day, both freely accessible and in the form of annual subscriptions. Today, databases are moving beyond the scope for which they were originally created and are entering that of research evaluation. This work considers the databases and specialist search engines that are now in daily use in an earth-sciences library, in particular Georef, Web of Science, Scirus and Google Scholar. The work begins by analysing the history of both the databases and the specialist search engines, in order to understand their nature and the motivations behind their creation. It then addresses the quality of the information retrieval they provide, and ends with a comparative analysis of the citations received by the extracted records.
Università degli studi di Roma "La Sapienza", Scuola Speciale Archivisti e Bibliotecari
Unpublished
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Instrumentation and techniques of general interest"

1

Arthus-Bertrand, Yann. The Earth from the Air: 366 Days. London: Thames & Hudson, 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Hallewell, Rebecca, ed. Scotland's sailing fishermen: The history of the Scottish herring boom. Strathtummel, Perthshire: Hallewell Publications, 1991.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Gamble, Allan. Historic Sydney: Drawings. Roseville, NSW: Craftsman House, 1990.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Simmons, David, ed. Covered bridges: Ohio, Kentucky, West Virginia. Wooster, Ohio: Wooster Book Co., 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Creative sequencing techniques for music production: A practical guide to Pro Tools, Logic, Digital Performer, and Cubase. 2nd ed. Amsterdam: Focal Press, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Chancel, Philippe, ed. Paris in store. London: Thames & Hudson, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Nani, Leone, 1880-1935, and Clara Bulfoni, eds. Lost China: The Photographs of Leone Nani. Milan: Skira Editore S.p.A., 2003.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Gefen, Gérard. Sicily: Land of the leopard princes. London: New York, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Schiele, Bernard. Patrimoines et identités. Québec: Musée de la civilisation, 2002.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Thompson, John M. A., ed. Manual of curatorship: A guide to museum practice. 2nd ed. Oxford: Butterworth-Heinemann, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Instrumentation and techniques of general interest"

1

Hale, Robert C., Meredith E. Seeley, Ashley E. King, and Lehuan H. Yu. "Analytical Chemistry of Plastic Debris: Sampling, Methods, and Instrumentation." In Microplastic in the Environment: Pattern and Process, 17–67. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78627-4_2.

Abstract:
Approaches for the collection and analysis of plastic debris in environmental matrices are rapidly evolving. Such plastics span a continuum of sizes, encompassing large (macro-), medium (micro-, typically defined as particles between 1 μm and 5 mm), and smaller (nano-) plastics. All are of environmental relevance. Particle sizes are dynamic. Large plastics may fragment over time, while smaller particles may agglomerate in the field. The diverse morphologies (fragment, fiber, sphere) and chemical compositions of microplastics further complicate their characterization. Fibers are of growing interest and present particular analytical challenges due to their narrow profiles. Compositional classes of emerging concern include tire wear, paint chips, semisynthetics (e.g., rayon), and bioplastics. Plastics commonly contain chemical additives and fillers, which may alter their toxicological potency, behavior (e.g., buoyancy), or detector response (e.g., yield fluorescence) during analysis. Field sampling methods often focus on >20 μm and even >300 μm sized particles and will thus not capture smaller microplastics (which may be most abundant and bioavailable). Analysis of a limited subgroup (selected polymer types, particle sizes, or shapes) of microplastics, while often operationally necessary, can result in an underestimation of actual sample content. These shortcomings complicate calls for toxicological studies of microplastics to be based on “environmentally relevant concentrations.” Sample matrices of interest include water (including wastewater, ice, snow), sediment (soil, dust, wastewater sludge), air, and biota. Properties of the environment, and of the particles themselves, may concentrate plastic debris in select zones (e.g., gyres, shorelines, polar ice, wastewater sludge). Sampling designs should consider such patchy distributions. Episodic releases due to weather and anthropogenic discharges should also be considered.
While water grab samples and sieving are commonplace, novel techniques for microplastic isolation, such as continuous flow centrifugation, show promise. The abundance of nonplastic particulates (e.g., clay, detritus, biological material) in samples interferes with microplastic detection and characterization. Their removal is typically accomplished using a combination of gravity separation and oxidative digestion (including strong bases, peroxide, enzymes); unfortunately, aggressive treatments may damage more labile plastics. Microscope-based infrared or Raman detection is often applied to provide polymer chemistry and morphological data for individual microplastic particles. However, the sheer number of particles in many samples presents logistical hurdles. In response, instruments have been developed that employ detector arrays and rapid scanning lasers. The addition of dyes to stain particulates may facilitate spectroscopic detection of some polymer types. Most researchers provide microplastic data in the form of the abundances of polymer types within particle size, polymer, and morphology classes. Polymer mass data in samples remain rare but are essential to elucidating fate. Rather than characterizing individual particles in samples, solvent extraction (following initial sample prep, such as sediment size class sorting), combined with techniques such as thermoanalysis (e.g., pyrolysis), has been used to generate microplastic mass data. However, this may obviate the acquisition of individual particle morphology and compositional information. Alternatively, some techniques (e.g., electron and atomic force microscopy and matrix-assisted laser desorption mass spectrometry) are adept at providing highly detailed data on the size, morphology, composition, and surface chemistry of select particles. Ultimately, the analyst must select the approach best suited for their study goals. 
Robust quality control elements are also critical to evaluate the accuracy and precision of the sampling and analysis techniques. Further, improved efforts are required to assess and control possible sample contamination due to the ubiquitous distribution of microplastics, especially in indoor environments where samples are processed.
2

Smith, Yolanda R., Denise Murray, and Howard A. Zacur. "General Techniques and Instrumentation of Operative Hysteroscopy." In Practical Manual of Operative Laparoscopy and Hysteroscopy, 273–85. New York, NY: Springer New York, 1997. http://dx.doi.org/10.1007/978-1-4612-1886-9_26.

3

Torim, Ants, Innar Liiv, Chahinez Ounoughi, and Sadok Ben Yahia. "Pattern Based Software Architecture for Predictive Maintenance." In Communications in Computer and Information Science, 26–38. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-17030-0_3.

Abstract:
Many industrial sectors are moving toward the Fourth Industrial Revolution (IR 4.0). In this respect, the Internet of Things and predictive maintenance are considered the key pillars of IR 4.0. Predictive maintenance is one of the hottest trends in manufacturing: maintenance work is scheduled according to continuous health monitoring of processing equipment or instrumentation. It gives the maintenance team advance warning of failures and allows it to undertake timely corrective actions and decisions. The aim of this paper is to present a smart monitoring and diagnostics system, an expert system that can alert an operator before equipment failures to prevent material and environmental damage. The main novelty and contribution of this paper is a flexible architecture for the predictive maintenance system, based on software patterns: flexible solutions to general problems. The presented conceptual model enables the integration of expert knowledge of anticipated failures with matrix-profile-based anomaly detection. The results so far are encouraging.
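The matrix-profile anomaly detection mentioned in this abstract can be sketched in a few lines. Below is a naive O(n²) matrix profile; the injected-fault example and all variable names are illustrative assumptions for this sketch, not code from the paper:

```python
import numpy as np

def znorm(x):
    """Z-normalize a window; constant windows are only mean-centered."""
    s = x.std()
    return (x - x.mean()) / s if s > 0 else x - x.mean()

def matrix_profile(ts, m):
    """Naive O(n^2) matrix profile: distance from each length-m
    subsequence to its nearest neighbor outside an exclusion zone."""
    n = len(ts) - m + 1
    subs = np.array([znorm(ts[i:i + m]) for i in range(n)])
    excl = m // 2  # ignore trivial self-matches near each window
    profile = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(subs - subs[i], axis=1)
        d[max(0, i - excl):i + excl + 1] = np.inf
        profile[i] = d.min()
    return profile

# Discords (anomalies) are the subsequences farthest from any match.
rng = np.random.default_rng(0)
ts = np.sin(np.linspace(0, 20 * np.pi, 400)) + 0.05 * rng.standard_normal(400)
ts[200:210] += 2.0  # hypothetical injected fault signature
mp = matrix_profile(ts, 20)
anomaly_at = int(np.argmax(mp))  # lands near the injected segment
```

A large profile value marks a subsequence unlike any other in the signal (a discord), which is the kind of event a predictive-maintenance monitor would flag for the operator.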
4

Herr, Werner, and Etienne Forest. "Non-linear Dynamics in Accelerators." In Particle Physics Reference Library, 51–104. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-34245-6_3.

Abstract:
Non-linear effects in accelerator physics are important both during the design stage and for successful operation of accelerators. Since these two aspects are closely related, they will be treated together in this overview. Some of the most important aspects are well described by methods established in other areas of physics and mathematics. Given the scope of this handbook, the treatment will be focused on the problems in accelerators used for particle physics experiments. Although the main emphasis will be on accelerator physics issues, some aspects of more general interest will be discussed, in particular to demonstrate that in recent years a framework has been built to handle these complex problems in a consistent form, technically superior to and conceptually simpler than the traditional techniques. The need to understand the stability of particle beams has contributed substantially to the development of new techniques and is an important source of examples which can be verified experimentally. Unfortunately, the documentation of these developments is often poor or even unpublished; in many cases it is available only as lectures or conference proceedings.
5

Téot, Luc. "Facial Scars Reconstruction." In Textbook on Scar Management, 325–31. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-44766-3_38.

Abstract:
Facial postburn scars have always generated great interest in the general population and have inspired many film stories, but they remain a source of social exclusion in most countries of the world and are a challenge for advanced surgical solutions. Several strategic options have been proposed in the last two decades for patients suffering severe facial scars with a high psychological impact. The tissues of each subunit of the face are specific (eyebrows, forehead, cheeks, chin, etc.), which makes it difficult for a conventional flap to reproduce this specificity, the different subunits presenting different characteristics in terms of depth, dermal component, softness, and gliding possibilities. The choice between the different advanced techniques will be limited to a thin partial-thickness skin graft plus a dermal substitute (Integra or Matriderm), a pre-expanded flap coming from the surrounding areas (shoulder, back), or an allogeneic transplantation, which imposes permanent immunosuppression and whose number is regulated at national levels.
6

Heinemann, Moritz, Filip Sadlo, and Thomas Ertl. "Interactive Visualization of Droplet Dynamic Processes." In Fluid Mechanics and Its Applications, 29–46. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-09008-0_2.

Abstract:
This article presents an overview of visual analysis techniques specifically developed for high-resolution direct numerical multiphase simulations in the droplet dynamics context. Visual analysis of such data covers a large range of tasks, from observing physical phenomena such as energy transport or collisions for single droplets to the analysis of large-scale simulations such as sprays and jets. With an increasing number of features, coalescence and breakup events might happen, which need to be presented in an interactive, explorable way to gain deeper insight into the physics. The task of finding relevant structures, features of interest, or a general dataset overview also becomes non-trivial. We present an overview of new approaches developed in our SFB-TRR 75 project A1, covering work from the last decade to current work in progress. They are the basis for relevant contributions to visualization research as well as useful tools for close collaborations within the SFB.
7

Dodson, Alan, and Terry Moore. "Geodetic Techniques." In Continental Shelf Limits. Oxford University Press, 2000. http://dx.doi.org/10.1093/oso/9780195117820.003.0011.

Abstract:
Establishing a claim to the continental shelf is very much dependent on being able to establish baselines, locations, distances, and water depth with a high degree of accuracy. Although this has always been the case, it has become much more significant with the increasing accuracy of measurement instrumentation, the introduction of global (satellite) positioning systems, and the need for international collaboration and agreement. This chapter outlines the measurements or calculations relating to position on the surface of the Earth, the geodetic principles underlying the concepts of coordinates and their reference systems, and the level of accuracy with which positions can be determined. Until the advent of satellite positioning and navigation systems, and in particular the Global Positioning System (GPS), geodetic coordinate systems were of little interest to many of the users of coordinate position information. Indeed, many of today's problems stem from historical misunderstanding of the true complexity of systems of coordinates (Ashkenazi, 1986). The first section of this chapter describes a number of common types of coordinate representation, their implementation in the definition of coordinate systems, and the interpretation of these systems with reference datums. Examples of typical local, regional, and geocentric datums are outlined as illustration of the general principles. In order to combine coordinates based on differing systems, be it the combination of different national systems or of a national and a geocentric system, it is also necessary to understand the methods of transforming coordinates from one system to another. Details of these methods are presented. Coordinate values within the described systems will usually have been obtained through geodetic observation of national (or international) control networks, with subsequent measurement of detail referenced to those network stations.
The chapter therefore gives a brief description of geodetic networks before discussing in some detail the theory of errors and the related accuracy analysis required so that realistic accuracy estimates can be ascribed to positional information. The final section presents the definitions and calculation algorithms relating to geodetic distance determination on the Earth's surface, with emphasis on the geodesic: the shortest line between points on an ellipsoidal reference surface, and therefore the line to which all distances in Article 76 are referred.
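The geodesic computation that closes this chapter can be illustrated with Vincenty's classical inverse formula on the WGS-84 ellipsoid. This is a standard published algorithm, reproduced here as a sketch rather than code from the chapter:

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1 / 298.257223563):
    """Geodesic distance in meters on the WGS-84 ellipsoid
    (Vincenty's inverse formula; inputs in decimal degrees)."""
    b = a * (1 - f)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate lambda until convergence
        sinL, cosL = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinL,
                               cosU1 * sinU2 - sinU1 * cosU2 * cosL)
        if sin_sigma == 0.0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosL
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinL / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        if cos2_alpha != 0.0:
            cos_2sm = cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
        else:
            cos_2sm = 0.0  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

one_deg_equator = vincenty_inverse(0, 0, 0, 1)   # about 111,319.5 m
one_deg_meridian = vincenty_inverse(0, 0, 1, 0)  # about 110,574 m
```

The two check values reflect the ellipsoidal flattening: one degree of longitude along the equator is longer than one degree of latitude along a meridian, which is exactly why Article 76 distances must be referred to the geodesic on a stated datum rather than to a sphere.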
8

"General index." In Techniques and Instrumentation in Analytical Chemistry, 685–88. Elsevier, 2005. http://dx.doi.org/10.1016/s0167-9244(05)80021-6.

9

Graves, Steven W., and John P. Nolan. "Molecular Assemblies, Probes, and Proteomics in Flow Cytometry." In Flow Cytometry for Biotechnology. Oxford University Press, 2005. http://dx.doi.org/10.1093/oso/9780195183146.003.0013.

Abstract:
The many proteins and nucleic acids encoded in the genome predominantly perform their functions as macromolecular assemblies. In fact, modern biomedical research often targets the interactions of individual molecules of these assemblies, usually by disrupting or enhancing specific contacts, to provide treatment for many different diseases. Therefore, efficient pharmaceutical design requires knowledge of how macromolecular assemblies are built and function. To achieve this goal, sensitive and quantitative tools are essential. This chapter will discuss the use of flow cytometry as a general platform for sensitive measurement and quantification of molecular assemblies. First, this chapter will introduce general methods for analysis of molecular interactions along with a comparison of flow cytometry with these methods. Second, an overview of current flow cytometry instrumentation, assay technologies, and applications in molecular assembly analysis will be given. Third, the implementation of the above approaches in molecular assembly will be discussed. Finally, potential future directions of flow cytometry in molecular assembly analysis will be explored. At present, the analysis of macromolecular assemblies is performed by a wide variety of techniques that are chosen for the target molecules under study (proteins, DNA, lipids, etc.), the type of measurement required (kinetic or equilibrium), and whether the assembly of interest needs to be studied in vivo or in vitro. This continuum of techniques can be divided into the heterogeneous assays, which require a separation step to resolve products from reactants, and homogeneous assays, which can measure interactions without a separation step. Heterogeneous assays, in general, use radioisotopes, which are not perturbing; offer excellent sensitivity; and provide accurate quantification. The products are quantified after a separation step such as gel filtration, gel electrophoresis, or centrifugation. 
Rapid quench methods can provide subsecond kinetic resolution; however, the added separation steps are tedious and make collection of kinetic time courses difficult, as each time point must be separated and measured individually. Furthermore, in the time it takes the separation to occur, the interaction of interest can dissociate, which is a problem specific to low-affinity assemblies. Nonetheless, by using rapid chemical quench techniques, reaction times as short as a few milliseconds can be observed. Homogeneous assays can be separated into solution- or surface-based assays. Solution-based assays measure an optical signal generated by the assembly to quantify an interaction. High component concentrations (micromolar) allow changes in intrinsic molecular properties, such as protein fluorescence or circular dichroism, to be used to study molecular assemblies. For greater sensitivity (nanomolar component concentrations), resonance energy transfer or polarization assays using exogenous fluorescent labels can be used. In combination with stopped-flow spectroscopy methodologies, solution-based assays allow reactions to be monitored in a continuous fashion with submillisecond dead times.
10

"Symposium 15: General applications, techniques and instrumentation." In Microbeam Analysis, 493–518. CRC Press, 2000. http://dx.doi.org/10.1201/9781482289428-23.


Conference papers on the topic "Instrumentation and techniques of general interest"

1

Rathod, Mulchand S. "Improving Learning Outcomes of a Course in Instrumentation." In ASME 2006 International Mechanical Engineering Congress and Exposition. ASMEDC, 2006. http://dx.doi.org/10.1115/imece2006-13589.

Abstract:
Many engineering educators have become sensitive to improving the outcomes of student learning in their classes. This has long been true for our colleagues in colleges of education, where teachers are prepared in teaching pedagogy. In many cultures, as in ours, teaching is upheld as a noble profession, and university faculty are held in high esteem by the general population. Faculty teaching in undergraduate programs have begun to address the pedagogy of learning in recent years, and there is a national trend toward supporting this work. Besides funding initiatives by organizations such as the National Science Foundation, engineering professional societies continue to organize forums and awards to recognize and promote the teaching and learning of engineering subject matter. This paper addresses an experiment in improved learning by students of a subject matter that is laboratory based. The instrumentation course is a required course for engineering technology (ET) students pursuing mechanical, manufacturing/industrial, product design, and electromechanical majors at Wayne State University (WSU). Most engineering technology students are more comfortable with experimental techniques than with derivation of equations and formulas. The setting for this course was a multimedia distance-learning classroom and a set of lab experiments. The teacher had the important task not just of covering the material but of increasing student interest to optimize learning. Although all the teaching materials were prepared for presentation in PowerPoint, after discussion with the class it was decided to make the learning process different from traditional teaching. The class was divided into three groups, and each group was given a reading assignment covering one third of the material to be covered in each session.
Each team met on a regular basis, going over its assignment and breaking up the tasks so that each team member could lead presentation and discussion for the whole class. Learning objectives addressed in the course included teamwork, effective communication, system design and testing, continued student participation, and effective learning for long-term retention, besides the contents of the subject matter. Overall, students really felt they were learning a great deal of new material. This paper summarizes a very positive experience of students and faculty dealing with learning pedagogy.
2

Pons, B. Stanley. "Infrared Spectral Electrochemistry of Surface Reactions." In Microphysics of Surfaces, Beams, and Adsorbates. Washington, D.C.: Optica Publishing Group, 1985. http://dx.doi.org/10.1364/msba.1985.tua2.

Abstract:
In situ infrared vibrational spectroscopy of the electrode/solution interface has established itself as an efficient and informative method for studying the orientation and structure of adsorbed species. The method is growing rapidly in popularity due to its ease of implementation. Very high sensitivity (approximately 10⁻⁶ absorbance) is attained with relatively simple instrumentation and a single external specular reflection. A range of related experimental techniques and their applications has led to the study of problems of wide electrochemical and general interest, including: a) the detection and identification, in aqueous and non-aqueous solvents, of the potential-dependent populations of solvent, simple ions, and organic solutes, both in the adsorbed state and free in the double layer, using a wide range of metallic and non-metallic electrode materials including well-defined single-crystal surfaces; b) the investigation of the orientation, surface bonding, and intermolecular interaction of adsorbed species and the effects of electrode potential on these parameters; c) the identification and reaction kinetics of adsorbed and non-adsorbed reaction intermediates such as radical anions and cations.
3

Gillaugh, Daniel L., Alexander A. Kaszynksi, Trevor C. Tomlin, Jeffrey M. Brown, Joseph A. Beck, and Emily B. Carper. "Accurate Blade Tip Timing Placement on a Centrifugal Impeller Using As-Manufactured Modeling." In ASME Turbo Expo 2022: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/gt2022-83415.

Abstract:
Non-intrusive stress measurement systems (NSMS) are commonly used during rig and engine tests to ensure safe operation of the test asset. Blade tip timing (BTT) is one form of NSMS that estimates operational stresses of a bladed rotor using time-of-arrival (TOA) data coupled with finite element analysis (FEA) predictions. Traditional FEA techniques assume nominal airfoils to generate the stress-to-deflection ratios used with TOA data to predict blade stresses. Recent research has shown significant variability in stress-to-deflection ratios when accounting for geometric variations in the blade geometry. As-manufactured finite element modeling has been shown to be a prudent way to account for these geometric variations when developing instrumentation placements and safety limits for rotors. Literature on this topic tends to focus on integrally bladed rotors, where every blade is notionally identical. Centrifugal impellers, by contrast, have a main and a splitter blade with significantly different geometries, making blade tip timing placement more challenging on these components. Traditional BTT placement techniques will be tailored in this work to achieve optimal probe placement to detect specific vibratory modes of interest for both the main and splitter blades. BTT safety limits will be generated for each mode of interest, and optical scanning and mesh morphing approaches will be used to generate an as-manufactured model of the centrifugal impeller. The BTT safety limit variability due to geometric variations will be assessed for both the main and splitter blades. It will be shown that, due to geometric variations within the impeller, the BTT limits can vary between blades. This research further validates the importance of using as-manufactured modeling during component design, test, and throughout the life cycle.
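As a rough illustration of why probe placement matters in BTT, one generic heuristic (not necessarily the authors' method; function name, angles, and criterion are assumptions for this sketch) scores a candidate set of circumferential probe angles by the conditioning of the least-squares fit of a single synchronous sine response:

```python
import numpy as np

def placement_condition(probe_angles_deg, engine_order):
    """Condition number of the least-squares matrix used to fit a
    sin/cos-plus-offset response of one engine order from tip-timing
    deflections sampled at the given probe angles; lower is better."""
    th = engine_order * np.radians(np.asarray(probe_angles_deg, float))
    A = np.column_stack([np.sin(th), np.cos(th), np.ones_like(th)])
    return np.linalg.cond(A)

# Clustered probes barely sample the vibration cycle, so the fit is
# ill-conditioned; spread probes resolve the same mode far better.
clustered = placement_condition([0, 1, 2, 3], engine_order=1)
spread = placement_condition([0, 65, 140, 255], engine_order=1)
```

A placement optimizer in this spirit would search angle sets that keep the condition number low simultaneously for every engine order of interest, for both the main and splitter blade mode shapes.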
4

Geffert, Sawyer. "Animation general." In SIGGRAPH '18: Special Interest Group on Computer Graphics and Interactive Techniques Conference. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3209800.3232916.

5

Shimohira, Kazuhisa. "The basic and general idea of motion." In SIGGRAPH07: Special Interest Group on Computer Graphics and Interactive Techniques Conference. New York, NY, USA: ACM, 2007. http://dx.doi.org/10.1145/1280720.1280804.

6

Batra, Vineet, Ankit Phogat, and Mridul Kavidayal. "General primitives for smooth coloring of vector graphics." In SIGGRAPH '18: Special Interest Group on Computer Graphics and Interactive Techniques Conference. New York, NY, USA: ACM, 2018. http://dx.doi.org/10.1145/3230744.3230786.

7

Chen, Jiong, and Mathieu Desbrun. "Go Green: General Regularized Green’s Functions for Elasticity." In SIGGRAPH '22: Special Interest Group on Computer Graphics and Interactive Techniques Conference. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3528233.3530726.

8

"Session details: Course 24: GPGPU: general-purpose computation on graphics hardware." In SIGGRAPH07: Special Interest Group on Computer Graphics and Interactive Techniques Conference, edited by Mike Houston and Naga Govindaraju. New York, NY, USA: ACM, 2007. http://dx.doi.org/10.1145/3250714.

9

Wood, Christopher M. "General data compression algorithm for space images using fractal techniques." In 1994 Symposium on Astronomical Telescopes & Instrumentation for the 21st Century, edited by David L. Crawford and Eric R. Craine. SPIE, 1994. http://dx.doi.org/10.1117/12.176818.

10

Van der Laan, Jan E. "Low-Pressure Gain-Cell Laser-Detector Operation with a CO2 Transversely Excited Atmospheric (TEA) Laser." In Laser and Optical Remote Sensing: Instrumentation and Techniques. Washington, D.C.: Optica Publishing Group, 1987. http://dx.doi.org/10.1364/lors.1987.wc4.

Abstract:
The recent development of a low-pressure CO2 gain-cell [1] preamplifier for optical receivers has stimulated renewed interest in the technique in the lidar community. Early work in this area [2] indicated that a gain factor of only 2.5 could be achieved using a 72-cm-long gain cell. Because such low gain did not appear to justify the added cost and complexity to a lidar system, interest in the technique decreased.

Reports on the topic "Instrumentation and techniques of general interest"

1

Droby, Samir, Michael Wisniewski, Martin Goldway, Wojciech Janisiewicz, and Charles Wilson. Enhancement of Postharvest Biocontrol Activity of the Yeast Candida oleophila by Overexpression of Lytic Enzymes. United States Department of Agriculture, November 2003. http://dx.doi.org/10.32747/2003.7586481.bard.

Abstract:
Enhancing the activity of biocontrol agents could be the most important factor in their success in controlling fruit disease and their ultimate acceptance in commercial disease management. Direct manipulation of a biocontrol agent resulting in enhanced disease control could be achieved by using recent advances in molecular biology techniques. The objectives of this project were to isolate genes from yeast species that were used as postharvest biocontrol agents against postharvest diseases and to determine their role in biocontrol efficacy. The emphasis was placed on the yeast Candida oleophila, which was jointly discovered and developed in our laboratories and commercialized as the product Aspire. The general plan was to develop a transformation system for C. oleophila and either knock out or overexpress particular genes of interest. Additionally, biochemical characterization of the lytic peptides was conducted in the wild-type and transgenic isolates. In addition to developing a better understanding of the mode of action of the yeast biocontrol agents, it was also our intent to demonstrate the feasibility of enhancing biocontrol activity via genetic enhancement of yeast with genes known to code for proteins with antimicrobial activity. Major achievements are: 1) characterization of extracellular lytic enzymes produced by the yeast biocontrol agent Candida oleophila; 2) development of a transformation system for Candida oleophila; 3) cloning and analysis of the C. oleophila glucanase gene; 4) overexpression and knockout of the C. oleophila glucanase gene and evaluation of its role in the biocontrol activity of C. oleophila; 5) characterization of a defensin gene and its expression in the yeast Pichia pastoris; 6) cloning and analysis of chitinase and adhesin genes; 7) characterization of the RNase secreted by C. oleophila and its inhibitory activity against P. digitatum.
This project has resulted in information that enhanced our understanding of the mode of action of the yeast C. oleophila. This was an important step toward enhancing the biocontrol activity of the yeast. Fungal cell wall enzymes produced by the yeast antagonist were characterized, and different substrates were identified that enhance their production in vitro. Exo-β-1,3-glucanase, chitinase, and protease production was stimulated by the presence of cell-wall fragments of Penicillium digitatum in the growing medium, in addition to glucose. The transformation system developed was used to study the role of lytic enzymes in the biocontrol activity of the yeast antagonist and was essential for genetic manipulation of C. oleophila. After cloning and characterization of the exo-glucanase gene from the yeast, the transformation system was used efficiently to study the role of the enzyme in biocontrol activity by overexpressing or knocking out the activity of the enzyme. In the last phase of the research (still ongoing), the transformation system is being used to study the role of the chitinase gene in the mode of action. Knockout and overexpression experiments are underway.
2

Weller, Joel I., Derek M. Bickhart, Micha Ron, Eyal Seroussi, George Liu, and George R. Wiggans. Determination of actual polymorphisms responsible for economic trait variation in dairy cattle. United States Department of Agriculture, January 2015. http://dx.doi.org/10.32747/2015.7600017.bard.

Abstract:
The project's general objectives were to determine specific polymorphisms at the DNA level responsible for observed quantitative trait loci (QTLs) and to estimate their effects, frequencies, and selection potential in the Holstein dairy cattle breed. The specific objectives were to (1) localize the causative polymorphisms to small chromosomal segments based on analysis of 52 U.S. Holstein bulls, each with at least 100 sons with high-reliability genetic evaluations, using the a posteriori granddaughter design; (2) sequence the complete genomes of at least 40 of those bulls to 20× coverage; (3) determine causative polymorphisms based on concordance between the bulls' genotypes for specific polymorphisms and their status for a QTL; (4) validate putative quantitative trait variants by genotyping a sample of Israeli Holstein cows; (5) perform gene expression analysis using statistical methodologies, including determination of signatures of selection, based on somatic cells of cows that are homozygous for contrasting quantitative trait variants; and (6) analyze genes with putative quantitative trait variants using data mining techniques. Current methods for genomic evaluation are based on population-wide linkage disequilibrium between markers and actual alleles that affect traits of interest. Those methods have approximately doubled the rate of genetic gain for most traits in the U.S. Holstein population. With determination of causative polymorphisms, increasing the accuracy of genomic evaluations should be possible by including those genotypes as fixed effects in the analysis models. Determination of causative polymorphisms should also yield useful information on gene function and the genetic architecture of complex traits. Concordance between QTL genotype as determined by the a posteriori granddaughter design and marker genotype was determined for 30 trait-by-chromosomal-segment effects that are segregating in the U.S.
Holstein population; a probability of <10⁻²⁰ was used to reject the null hypothesis that no segregating gene within the chromosomal segment was affecting the trait. Genotypes for 83 grandsires and 17,217 sons were determined by either complete sequence or imputation for 3,148,506 polymorphisms across the entire genome. Variant sites were identified from previous studies (such as the 1000 Bull Genomes Project) and from DNA sequencing of bulls unique to this project, which is one of the largest marker variant surveys conducted for the Holstein breed of cattle. Effects for stature on chromosome 11, daughter pregnancy rate on chromosome 18, and protein percentage on chromosome 20 met 3 criteria: (1) complete or nearly complete concordance, (2) nominal significance of the polymorphism effect after correction for all other polymorphisms, and (3) a marker coefficient of determination >40% of the total multiple-regression coefficient of determination for the 30 polymorphisms with highest concordance. The missense polymorphism Phe279Tyr in GHR at 31,909,478 base pairs on chromosome 20 was confirmed as the causative mutation for fat and protein concentration. For the effect on fat percentage, 12 additional missense polymorphisms on chromosome 14 were found that had nearly complete concordance with the suggested causative polymorphism (missense mutation Ala232Glu in DGAT1). The markers used in routine U.S. genomic evaluations were increased from 60,000 to 80,000 by adding markers for known QTLs and markers detected in BARD and other research projects. Objectives 1 and 2 were completely accomplished, and objective 3 was partially accomplished. Because no new clear-cut causative polymorphisms were discovered, objectives 4 through 6 were not completed.
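The concordance criterion at the heart of objective 3 can be sketched with hypothetical toy data (all names and values below are invented for illustration): a candidate variant is consistent with being causative only when each sire's heterozygosity at the variant matches his QTL status inferred from the granddaughter design.

```python
def concordance_rate(qtl_heterozygous, marker_heterozygous):
    """Fraction of sires whose heterozygosity at a candidate variant
    matches their QTL status; 1.0 means complete concordance."""
    pairs = list(zip(qtl_heterozygous, marker_heterozygous))
    return sum(q == m for q, m in pairs) / len(pairs)

# Toy example: six sires with QTL status from progeny tests
qtl_status = [True, True, False, True, False, False]
candidate_1 = [True, True, False, True, False, False]  # fully concordant
candidate_2 = [True, False, False, True, True, False]  # two mismatches

r1 = concordance_rate(qtl_status, candidate_1)  # 1.0
r2 = concordance_rate(qtl_status, candidate_2)  # 4/6
```

In the real analysis the decision was of course not a bare match count: the probability of the observed concordance arising by chance had to fall below the stringent threshold quoted above, and the variant effect had to remain significant after correcting for all other polymorphisms.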