Academic literature on the topic 'LZ. None of these, but in this section'

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'LZ. None of these, but in this section.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "LZ. None of these, but in this section"

1

Kudryavtsev, Vitaly, for the LUX and LZ Collaborations. "Recent Results from LUX and Prospects for Dark Matter Searches with LZ." Universe 5, no. 3 (March 7, 2019): 73. http://dx.doi.org/10.3390/universe5030073.

Abstract:
The Weakly Interacting Massive Particle (WIMP) remains one of the most promising dark matter candidates. Many experiments around the world are searching for WIMPs, and the best current sensitivity to the WIMP-nucleon spin-independent cross-section is about 10⁻¹⁰ pb. LUX has been one of the world-leading experiments in the search for dark matter WIMPs. Results from the LUX experiment on WIMP searches for different WIMP masses are summarised in this paper. The LUX detector will be replaced by its successor, the LUX-ZEPLIN (LZ) detector. With a 50 times larger fiducial mass and increased background rejection power due to specially designed veto systems, the LZ experiment (due to take first data in 2020) will achieve a sensitivity to WIMPs exceeding the current best limits by more than an order of magnitude (for spin-independent interactions and for WIMP masses exceeding a few GeV). An overview of the LZ experiment is presented, and the LZ sensitivity is discussed based on the accurately modelled background and the high-sensitivity material screening campaign.
2

Contopoulos, G., G. Lukes-Gerakopoulos, and T. A. Apostolatos. "Orbits in a Non-Kerr Dynamical System." International Journal of Bifurcation and Chaos 21, no. 08 (August 2011): 2261–77. http://dx.doi.org/10.1142/s0218127411029768.

Abstract:
We study the orbits in a Manko–Novikov type metric (MN) which is a perturbed Kerr metric. There are periodic, quasi-periodic, and chaotic orbits, which are found in configuration space and on a surface of section for various values of the energy E and the z-component of the angular momentum Lz. For relatively large Lz there are two permissible regions of nonplunging motion bounded by two closed curves of zero velocity (CZV), while in the Kerr metric there is only one closed CZV of nonplunging motion. The inner permissible region of the MN metric contains mainly chaotic orbits, but it contains also a large island of stability. When Lz decreases, the two permissible regions join and chaos increases. Below a certain value of Lz, most orbits escape inwards and plunge through the horizon. On the other hand, as the energy E decreases (for fixed Lz) the outer permissible region shrinks and disappears. In the inner permissible region, chaos increases and for sufficiently small E most orbits are plunging. We find the positions of the main periodic orbits as functions of Lz and E, and their bifurcations. Around the main periodic orbit of the outer region, there are islands of stability that do not appear in the Kerr metric (integrable case). In a realistic binary system, because of the gravitational radiation, the energy E and the angular momentum Lz of an inspiraling compact object decrease and therefore the orbit of the object is nongeodesic. In fact, in an extreme mass ratio inspiraling (EMRI) system the energy E and the angular momentum Lz decrease adiabatically and therefore the motion of the inspiraling object is characterized by the fundamental frequencies which are drifting slowly in time. In the Kerr metric, the ratio of the fundamental frequencies changes strictly monotonically in time. However, in the MN metric when an orbit is trapped inside an island the ratio of the fundamental frequencies remains constant for some time. Hence, if such a phenomenon is observed this will indicate that the system is nonintegrable and therefore the central object is not a Kerr black hole.
3

Li, Peinan, Zeyu Dai, Xi Wang, Jun Liu, Yi Rui, Xiaojun Li, Jie Fan, and Peixin Chen. "A Study of the Segment Assembly Error and Quality Control Standard of Special-Shaped Shield Tunnels." Energies 15, no. 7 (March 29, 2022): 2512. http://dx.doi.org/10.3390/en15072512.

Abstract:
Large-section special-shaped shield tunnels feature many advantages, such as versatility and a large space utilization rate in energy transmission and public transport; however, guaranteeing the quality of segments’ assembly is difficult. Based on the quasi-rectangular shield tunnel project of Hangzhou Line 9 in China, this study investigated the formation mechanism and control measures of lining segment assembly defects. By quantifying the manufacturing error and positioning error, a simulation program for segment assembly is developed to calculate the error. Furthermore, considering the relative accumulative error between the upper and lower T blocks, the finite element model of key blocks (T-LZ block) is established to perform mechanical analysis, based on which the relative error control standard of the key block under the corresponding working conditions is proposed. The results show that the assembly quality can be effectively improved by assembling the LZ block first and applying corresponding error control measures, and the displacement of the segment along the rZ direction should be carefully controlled during the construction. The error caused by normal assembly will not damage the LZ block, and the corresponding control standard under the action of multi-degree-of-freedom error (extreme case) is 9.8 mm. Using this method to predict the assembly quality of segments can provide a basis for actual construction control of segment assembly.
4

LaFontaine, Caden, Bailey Tallman, Spencer Ellis, Trevor Croteau, Brandon Torres, Sabrina Hernandez, Diego Cristancho Guerrero, Jessica Jaksik, Drue Lubanski, and Roland Allen. "A Dark Matter WIMP That Can Be Detected and Definitively Identified with Currently Planned Experiments." Universe 7, no. 8 (July 27, 2021): 270. http://dx.doi.org/10.3390/universe7080270.

Abstract:
A recently proposed dark matter WIMP (weakly interacting massive particle) has only second-order couplings to gauge bosons and itself. As a result, it has small annihilation, scattering, and creation cross-sections, and is consequently consistent with all current experiments and the observed abundance of dark matter. These cross-sections are, however, still sufficiently large to enable detection in experiments that are planned for the near future, and definitive identification in experiments proposed on a longer time scale. The (multi-channel) cross-section for annihilation is consistent with thermal production and freeze-out in the early universe, and with current evidence for dark matter annihilation in analyses of the observations of gamma rays by Fermi-LAT and antiprotons by AMS-02, as well as the constraints from Planck and Fermi-LAT. The cross-section for direct detection via collision with xenon nuclei is estimated to be slightly below 10⁻⁴⁷ cm², which should be attainable by LZ and XENONnT and well within the reach of Darwin. The cross-section for collider detection via vector boson fusion is estimated to be ∼1 fb, and may be ultimately attainable by the high-luminosity LHC; definitive collider identification will probably require the more powerful facilities now being proposed.
5

Hankinson, R. J. "The Sceptical Inquirer." History of Philosophy and Logical Analysis 23, no. 1 (September 8, 2020): 74–99. http://dx.doi.org/10.30965/26664275-02301007.

Abstract:
This article treats of whether scepticism, in particular Pyrrhonian scepticism, can be said to deploy a method of any kind. I begin by distinguishing various different notions of method, and their relations to the concept of expertise (section 1). I then (section 2) consider Sextus’s account, in the prologue to Outlines of Pyrrhonism, of the Pyrrhonist approach, and how it supposedly differs from those of other groups, sceptical and otherwise. In particular, I consider the central claim that the Pyrrhonist is a continuing investigator (section 3), who in spite of refusing to be satisfied with any answer (or none), none the less still achieves tranquillity, and whether this can avoid being presented as a method for so doing, and hence as compromising the purity of sceptical suspension of commitment (section 4). In doing so, I relate—and contrast—the Pyrrhonists’ account of their practice to the ‘Socratic Method’ (section 5), as well as to the argumentative practice of various Academics (section 6), and assess their claim in so doing to be offering a way of instruction (section 7). I conclude (section 8) that there is a consistent and interesting sense in which Pyrrhonian scepticism can be absolved of the charge that it incoherently, and crypto-dogmatically, presents itself as offering a method for achieving an intrinsically desirable goal.
6

Grace, R. F., and V. J. Roach. "Caesarean Section in a Patient with Paramyotonia Congenita." Anaesthesia and Intensive Care 26, no. 5 (October 1998): 534–37. http://dx.doi.org/10.1177/0310057x9802600511.

Abstract:
This case report details spinal anaesthesia for an elective caesarean section in a patient with the rare condition of paramyotonia congenita. There are few case reports of anaesthesia in this condition and none in the Australian anaesthetic literature. This case highlights the need for the avoidance of hypothermia and depolarizing muscle relaxants, the safety of spinal anaesthesia and a conservative approach to the management of plasma potassium concentration. The subsequent review outlines the current literature and discusses other issues involved in the anaesthetic management of this disorder.
7

Begum, Tahera, M. Jalal Uddin, and Ishrat Jabin. "Caesarean Section Without Indwelling Catheter." Chattagram Maa-O-Shishu Hospital Medical College Journal 14, no. 1 (April 4, 2015): 34–35. http://dx.doi.org/10.3329/cmoshmcj.v14i1.22878.

Abstract:
Background: Many studies have been conducted worldwide on the subject, but there are none from Chittagong. We conducted this study to gain our own experience. Methods: It was an experimental study. Seventy cases were conveniently selected under defined inclusion and exclusion criteria. Cases were managed by obstetricians unrelated to the study. All cases were meticulously observed by the investigators and findings were recorded immediately. All data were managed manually. A discussion was made and a conclusion was drawn. Results: A total of 70 cases were studied: 35 primiparae and 35 multiparae. Mean age was 27 ± 2.3 years. All were literate and 86% were housewives. Cases were free of medical and surgical complications and were managed uneventfully. Mean first voiding time was 4.28 ± 0.45 hours. In 7% of cases the bladder was emptied with a plain catheter after 7 hours because of pain and bladder distention. Average hospital stay was 2.3 days. There was no occurrence of urinary tract infection. Conclusion: An indwelling catheter should not be used unless it is strongly indicated.
8

Chowdhury, Liza, and Ishrat Jahan. "Cesarean Section without Urethral Catheterization." Journal of Armed Forces Medical College, Bangladesh 11, no. 1 (December 15, 2016): 3–6. http://dx.doi.org/10.3329/jafmc.v11i1.30658.

Abstract:
Introduction: Urethral catheterization is performed as a routine procedure in cesarean section. It is associated with considerable discomfort, a high incidence of urinary tract infection, delayed ambulation and longer hospital stay. Objectives: To determine the feasibility and safety of cesarean section without urethral catheterization. Materials and Methods: A prospective, observational study was carried out from April 2012 to March 2013 in the Department of Obstetrics and Gynaecology, CMH Dhaka and IBN Sina Hospital Dhaka, among 65 patients who had undergone cesarean section without catheterization. The study had some limitations: patients with a previous cesarean were excluded, so the results cannot be generalized. Results: First-void discomfort was not significantly associated with the absence of an indwelling catheter. Hospital stay was shorter (94% were discharged on the 3rd postoperative day). None of the patients had a bladder injury. Mean duration of surgery was 45 minutes (44%) and ambulation time was 11-14 hours (60%). Average estimated blood loss was 500 ml (41%) and time of first voiding was 5-8 hours (58%). The need for catheterization was significantly low (3%). Conclusion: Cesarean section can be done safely without urethral catheterization, with reduced morbidity.
9

Gurung, Tara, Sangeeta Shrestha, Ujjwal Basnet, and Amirbabu Shrestha. "Awareness with Recall in General Anesthesia undergoing Cesarean Section." Nepal Journal of Obstetrics and Gynaecology 13, no. 1 (November 12, 2018): 18–22. http://dx.doi.org/10.3126/njog.v13i1.21611.

Abstract:
Aims: To determine the incidence of awareness with recall in parturients undergoing cesarean section under general anesthesia in Paropakar Maternity and Women’s Hospital. Methods: Retrospective observational cohort study of patients who underwent cesarean section under general anesthesia from mid-April 2014 to mid-April 2017 (Baishakh 2071 to Chaitra 2073 BS). Awareness questionnaires were completed using the modified Brice interview. Results: A total of 162 patients underwent cesarean section under general anesthesia, and 138 were included in the study. None of them had awareness, and six patients reported a dream. Conclusion: No awareness with recall was found; a prospective study is required to confirm this finding.
10

Varga, Gyorgy, and Ricardo Dias de Oliveira Brito. "The Cross-Section of Expected Stock Returns in Brazil." Brazilian Review of Finance 14, no. 2 (June 27, 2016): 151. http://dx.doi.org/10.12660/rbfin.v14n2.2016.60916.

Abstract:
In a sample of the Brazilian stock market from 1999 to 2015, this paper shows that the book-to-market and momentum of individual firms capture some of the cross-sectional variation in average stock returns, while the market β and size do not play a role. The positive relation of cross-section of returns with book-to-market is more evident earlier, while the positive relation with momentum is stronger later in the sample. However, because none of these characteristics show explanatory power for all the subsamples studied, we are not fully convinced that they capture fundamental risk factors.

Dissertations / Theses on the topic "LZ. None of these, but in this section"

1

Ebenezer, Catherine. "“Access denied”? Barriers for staff accessing, using and sharing published information online within the National Health Service (NHS) in England: technology, risk, culture, policy and practice." Thesis, University of Sheffield, 2017. http://eprints.rclis.org/32585/1/CME%20thesis%20v4.0%20White%20Rose%20final%20single%20volume.pdf.

Abstract:
The overall aim of the study was to investigate barriers to online professional information seeking, use and sharing occurring within the NHS in England, their possible effects (upon education, working practices, working lives and clinical and organisational effectiveness), and possible explanatory or causative factors. The investigation adopted a qualitative case study approach, using semi-structured interviews and documentary analysis as its methods, with three NHS Trusts of different types (acute - district general hospital, mental health / community, acute – teaching) as the nested sites of data collection. It aimed to be both exploratory and explanatory. A stratified sample of participants, including representatives of professions whose perspectives were deemed to be relevant, and clinicians with educational or staff development responsibilities, was recruited for each Trust. Three non-Trust specialists (the product manager of a secure web gateway vendor, an academic e-learning specialist, and the senior manager at NICE responsible for the NHS Evidence electronic content and web platform) were also interviewed. Policy documents, statistics, strategies, reports and quality accounts for the Trusts were obtained via public websites, from participants or via Freedom of Information requests. Thematic analysis following the approach of Braun and Clarke (2006) was adopted as the analytic method for both interviews and documents. The key themes of the results that emerged are presented: barriers to accessing and using information, education and training, professional cultures and norms, information governance and security, and communications policy. The findings are discussed under three main headings: power, culture, trust and risk in information security; use and regulation of Web 2.0 and social media, and the system of professions. It became evident that the roots of problems with access to and use of such information lay deep within the culture and organisational characteristics of the NHS and its use of IT. A possible model is presented to explain the interaction of the various technical and organisational factors that were identified as relevant. A number of policy recommendations are put forward to improve access to published information at Trust level, as well as recommendations for further research.
2

Moreira, Walter. "A construção de informações documentárias: aportes da linguística documentária, da terminologia e das ontologias." Thesis, Universidade de São Paulo, 2010. http://eprints.rclis.org/17437/1/TeseFinalRevisada_05Jul2010.pdf.

Abstract:
This thesis investigates theoretical and practical interfaces between terminology, philosophical ontology, computational ontology and documentary linguistics, and the contributions they offer to the construction of documentary information. Its specific objectives are the analysis of the production, development, implementation and use of ontologies in light of information science theories; research on the contribution of ontologies to the development of thesauri and vice versa; and discussion of the philosophical foundation of the application of ontologies, based on the study of ontological categories present in classical philosophy and in contemporary proposals. It argues that understanding ontologies through the communicative theory of terminology contributes to organizing access to information that is less quantitative (syntactic) and more qualitative (semantic). It notes that, despite sharing some common goals, there is little dialogue between information science (and, within it, documentary linguistics) and computer science. It argues that computational and philosophical ontologies are not completely independent phenomena that share only a name, and notes that the discussion of categories and categorization in computer science does not always receive the emphasis it receives in information science studies on knowledge representation. Deleuze and Guattari's rhizome is treated as an instigator of reflections on the validity of the hierarchical tree structure and the possibilities of its expansion. It concludes that the construction of ontologies cannot ignore the terminological and conceptual analysis, as understood by terminology and by information science, accumulated in the theoretical and methodological basis for the construction of indexing languages; conversely, the construction of flexible indexing languages cannot ignore the representational model of ontologies, which is more capable of formalization and interoperability.
3

Boyer, Sebastien. "Dans le cadre du nouveau cycle de combustible $^{232}$Th/$^{233}$U, détermination de la section efficace de capture radiative $^{233}$Pa(n,$\gamma$) pour des énergies de neutrons comprises entre 0 et 1 MeV." PhD thesis, Université Sciences et Technologies - Bordeaux I, 2004. http://tel.archives-ouvertes.fr/tel-00009062.

Abstract:
In the context of the sustainable development of nuclear energy, one of the CNRS research themes mandated by the 1991 Bataille law is the study of a new nuclear fuel cycle based on thorium ore ($^{232}$Th), in which the fissile nucleus is $^{233}$U. The main advantage of this type of fuel lies in its production of transuranic waste in much smaller quantities than current pressurized water reactors. However, some important nuclear data for this new fuel cycle are very poorly known, for example those relating to the pivotal nucleus protactinium-233 ($^{233}$Pa). Its 27-day half-life gives it a particular role in the cycle, but because of its very high activity the study of this nucleus is an experimental challenge. To circumvent this difficulty, the gamma-ray emission probability in the neutron-induced reaction $^{233}$Pa(n,$\gamma$) for neutron energies between 0 and 1 MeV was determined from the transfer reaction $^{232}$Th($^{3}$He,p)$^{234}$Pa*. The measurement set-up identified the outgoing particle, thereby tagging the reaction channel, while C$_6$D$_6$ scintillators detected the emitted gamma rays in coincidence. The analysis of the gamma events required weighting the photon spectra with calculated mathematical functions known as "weighting functions". Determining these functions nevertheless requires a thorough knowledge of the behaviour of the scintillators (efficiency, response functions) in the chosen geometry. To this end, a preliminary study was carried out using gamma sources and proton-induced reactions on light nuclei. Simulations using the MCNP transport code were validated against experimental results.
4

Park, Allison M. "Comparing the Cognitive Demand of Traditional and Reform Algebra 1 Textbooks." Scholarship @ Claremont, 2011. http://scholarship.claremont.edu/hmc_theses/9.

Abstract:
Research has shown that students achieved higher standardized test scores in mathematics and gained more positive attitudes towards mathematics after learning from reform curricula. Because these studies involve actual students and teachers, there are classroom variables that are involved in these findings (Silver and Stein, 1996; Stein et al., 1996). To understand how much these curricula by themselves contribute to higher test scores, I have studied the cognitive demand of tasks in two traditional and two reform curricula. This work required the creation of a scale to categorize tasks based on their level of cognitive demand. This scale relates to those by Stein, Schoenfeld, and Bloom. Based on this task analysis, I have found that more tasks in the reform curricula require higher cognitive demand than tasks in the traditional curricula. These findings confirm other results that posing tasks with higher cognitive demand to students can lead to higher student achievement.
5

Zurita, Sánchez Juan Manuel. "El paradigma otletiano como base de un modelo para la organización y difusión del conocimiento científico." Thesis, Universidad Nacional Autónoma de México, 2001. http://eprints.rclis.org/6752/1/tesina._juan_manuel_zurita.pdf.

6

Cho, Karina Elle. "Enhancing the Quandle Coloring Invariant for Knots and Links." Scholarship @ Claremont, 2019. https://scholarship.claremont.edu/hmc_theses/228.

Abstract:
Quandles, which are algebraic structures related to knots, can be used to color knot diagrams, and the number of these colorings is called the quandle coloring invariant. We strengthen the quandle coloring invariant by considering a graph structure on the space of quandle colorings of a knot, and we call our graph the quandle coloring quiver. This structure is a categorification of the quandle coloring invariant. Then, we strengthen the quiver by decorating it with Boltzmann weights. Explicit examples of links that show that our enhancements are proper are provided, as well as background information in quandle theory.
7

Mebane, Palmer. "Uniquely Solvable Puzzles and Fast Matrix Multiplication." Scholarship @ Claremont, 2012. https://scholarship.claremont.edu/hmc_theses/37.

Abstract:
In 2003 Cohn and Umans introduced a new group-theoretic framework for doing fast matrix multiplications, with several conjectures that would imply the matrix multiplication exponent $\omega$ is 2. Their methods have been used to match one of the fastest known algorithms by Coppersmith and Winograd, which runs in $O(n^{2.376})$ time and implies that $\omega \leq 2.376$. This thesis discusses the framework that Cohn and Umans came up with and presents some new results in constructing combinatorial objects called uniquely solvable puzzles that were introduced in a 2005 follow-up paper, and which play a crucial role in one of the $\omega = 2$ conjectures.
8

Lubisco, Nídia Maria Lienert. "La evaluación en la biblioteca universitaria brasileña: evolución y propuesta de mejora." Thesis, Universidad Carlos III De Madrid, 2007. http://eprints.rclis.org/12225/1/tesisnidia.pdf.

Abstract:
This thesis examines the evolution of the evaluation of the university library in Brazil. At present, the only applicable system is the evaluation of academic institutions by the Ministry of Education, through the National Institute for Educational Studies and Research Anísio Teixeira (INEP), the entity responsible for coordinating and implementing the process of evaluating the entire education system in Brazil. The starting point of this investigation was the results of a case study developed at the Federal University of Bahia in 2001, under the assumption that the Ministry did not have criteria and instruments appropriate to reflect the library's role as a resource for education. A review of the relevant Brazilian, British and Spanish literature on the subject, especially regarding the experience of the Spanish University Libraries Network (REBIUN), was essential to establish the trends and the state of the art. Field work was also carried out through questionnaires in 7 countries and 31 Latin American universities, in order to ascertain the current status of the evaluation systems used in that region. The low response rate, about 50% of the institutions surveyed, was offset by the work of the Symposium on Electronic Library Evaluation and Quality (Sociedad Argentina de Información, 2002), an event of great significance attended by over 450 librarians and teachers. This made it possible to develop a comparative study that served to situate the Brazilian reality and to take into account the most similar cases. All this made it possible to fulfil the overall objective of this thesis: to develop a model for evaluating university libraries in Brazil, based on the state of the art of the Brazilian university library and on experiences found in Latin American countries. That model is based on the INEP instrument (2006), but its content is expanded, taking into account different works: performance indicators for assessing the university library, REBIUN (2000); one of the PNBU documents, prepared by Maria Carmen Romcy de Carvalho (1995); Standards for university libraries in Chile (2001); and the methodological guide for evaluating the libraries of institutions of higher education (Mexico, 2000). The final claim is that the Brazilian Ministry would thus have a more comprehensive evaluation system, to be implemented and improved according to the data it yields, which in turn could be used to build an information management databank and a system of performance indicators.
9

Ollendorff, Christine. "Construction d'un diagnostic complexe d'une bibliothèque académique." Thesis, Ecole nationale supérieure d'arts et métiers - ENSAM, 1999. http://eprints.rclis.org/11682/1/These-co.pdf.

Abstract:
In the first part, we study academic libraries, whose founding principles are changing with the information society. The library manager looks for the best ways to lead a library. Existing evaluation and management tools for libraries are a source of dissatisfaction because of the lack of a holistic vision of the library that they offer. Constructivist approaches to management seem an alternative way to help library managers design a strategic vision of these organizations. In the second part, we build a modelling project using the complex systems modelling of Jean-Louis Le Moigne and the soft systems methodology of Peter Checkland. French academic libraries constitute our research field, in which we use participant observation and interviews with staff and managers. Our system consists of the academic library as seen by its leader. We build a three-phase diagnostic. In the first phase, the manager can observe the supply of services as a three-axis system that evolves with technological, material and human means and with librarianship knowledge. The second phase studies the informational system through environmental and flow observations. The third phase studies the decisional system, looking at its organization, its internal information circulation and its decision processes. This diagnostic is under continuous construction. It works within the continuity of library management and is not the first step of a strategic plan. We apply the diagnostic in five libraries. The results show what the modelling supplies through the creation of evolving, transposable and learning models.
10

Kulzhabayev, Alibek. "Abstract Unattended Workflow Interactions." Thesis, 2012. http://eprints.rclis.org/16791/1/maindoc.pdf.

Abstract:
Recent studies in the domain of digital preservation have demonstrated the feasibility in principle of the migration-by-emulation approach. Migration-by-emulation is a method that aims to recreate the original environments of obsolete digital objects and, using such environments, to automatically convert large numbers of digital artifacts from obsolete formats into up-to-date formats. However, some issues hinder the creation of fast and reliable migration workflows. We studied a large number of migration workflows, and we provide an abstract description of the migration workflow based on that study. This description also includes a mechanism to automatically handle unsuccessful migration workflows. Since we consider unattended migrations of numerous digital objects, executing the migration workflow once for each digital object being migrated is not time-efficient. The abstract description of the migration workflow provided in this thesis describes how to repeat a certain part of the migration workflow according to the number of digital objects to be migrated. This allows a large number of digital objects to be migrated within one migration workflow execution, which increases the speed of execution. Moreover, we implemented a feature to create stages, which serve here as identifiers of the repeatable part of the migration workflow. By using the abstract description of the migration workflow and the feature to create stages, a user can choose or rearrange certain parts of a multi-format migration workflow, allowing the user to select the output formats needed. Such a workflow migrates a digital object from one format into several different ones. In this thesis, the obstacles to creating fast and reliable migration workflows are analyzed, and methods to optimize them are designed and implemented. An execution speed-up of about 3.35 times is observed for such optimized migration workflows on 50 sample digital objects, in comparison with workflows created without our optimization method.

Books on the topic "LZ. None of these, but in this section"

1

Mathematics of probability. Providence, Rhode Island: American Mathematical Society, 2013.

2

Libgober, Anatoly, José Ignacio Cogolludo-Agustín, and Eriko Hironaka, eds. Topology of algebraic varieties and singularities: Conference in honor of Anatoly Libgober's 60th birthday, June 22-26, 2009, Jaca, Huesca, Spain. Providence, R.I: American Mathematical Society, 2011.

3

Center for Mathematics at Notre Dame and American Mathematical Society, eds. Topology and field theories: Center for Mathematics at Notre Dame summer school and conference, Topology and field theories, May 29-June 8, 2012, University of Notre Dame, Notre Dame, Indiana. Providence, Rhode Island: American Mathematical Society, 2014.

4

Ahmed, S. E. (Syed Ejaz), ed. Perspectives on big data analysis: Methodologies and applications: International Workshop on Perspectives on High-Dimensional Data Analysis II, May 30-June 1, 2012, Centre de Recherches Mathématiques, Université de Montréal, Montréal, Québec, Canada. Providence, Rhode Island: American Mathematical Society, 2014.

5

Riemann surfaces by way of complex analytic geometry. Providence, R.I: American Mathematical Society, 2011.

6

Conference Board of the Mathematical Sciences and National Science Foundation (U.S.), eds. Lectures on the energy critical nonlinear wave equation. Providence, Rhode Island: Published for the Conference Board of the Mathematical Sciences by the American Mathematical Society, with support from the National Science Foundation, 2015.

7

The ultimate challenge: The 3x+1 problem. Providence, R.I: American Mathematical Society, 2010.

8

From riches to raags: 3-manifolds, right-angled Artin groups, and cubical geometry. Providence, Rhode Island: Published for the Conference Board of the Mathematical Sciences by the American Mathematical Society, Providence, Rhode Island with support from the National Science Foundation, 2012.

9

Morse, Robert Fitzgerald, Daniela Nikolova-Popova, and Sarah J. Witherspoon, eds. Group theory, combinatorics and computing: International Conference in honor of Daniela Nikolova-Popova's 60th birthday on Group Theory, Combinatorics and Computing, October 3-8, 2012, Florida Atlantic University, Boca Raton, Florida. Providence, Rhode Island: American Mathematical Society, 2013.

10

Flannery, D. L. (Dane Laurence), 1965-, ed. Algebraic design theory. Providence, R.I: American Mathematical Society, 2011.


Book chapters on the topic "LZ. None of these, but in this section"

1

Smithies, Declan. "Luminosity." In The Epistemic Role of Consciousness, 345–79. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780199917662.003.0011.

Abstract:
Chapter 11 defends the thesis that some phenomenal and epistemic conditions are luminous in the sense that you’re always in a position to know whether or not they obtain. Section 11.1 draws a distinction between epistemic and doxastic senses of luminosity and argues that some conditions are epistemically luminous even if none are doxastically luminous. Section 11.2 uses this distinction in solving Ernest Sosa’s version of the problem of the speckled hen. The same distinction is applied to Timothy Williamson’s anti-luminosity argument in section 11.3, his argument against epistemic iteration principles in section 11.4, and his argument for improbable knowing in section 11.5. Section 11.6 concludes by explaining why this defense of luminosity is not merely a pointless compromise.
2

"Literary Selections." In Roots of the Black Chicago Renaissance, edited by Richard A. Courage and Christopher Robert Reed, 245. University of Illinois Press, 2020. http://dx.doi.org/10.5622/illinois/9780252043055.003.0013.

Abstract:
Editors’ Note: Our study concludes with a section comprising three literary selections that we intend to break new ground for a scholarly collection. None of the selections is a conventional academic essay, each belongs to a different genre of writing, and each amplifies the light already shone on the roots of the Black Chicago Renaissance....
3

Phillips, David. "The Critique of Common-Sense Morality (Methods III.I–III.XI)." In Sidgwick's The Methods of Ethics, 96–119. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780197539613.003.0006.

Abstract:
This chapter focuses on The Methods of Ethics, Book III, Chapters I through XI. Its topic is Sidgwick’s critique of the theory he calls “common-sense morality” or “dogmatic intuitionism.” The first section introduces this theory, noting problems with both Sidgwick’s names for it. The second section discusses the project of Chapters I through XI, making the principles of common-sense morality clear and precise. The third section introduces the four conditions Sidgwick uses to test putative axioms. The fourth section outlines his argument that none of the principles of common-sense morality satisfy the four conditions, focusing on the example of promissory obligation. The fifth section considers two responses: defending absolutist deontology, and moving to moderate deontology by introducing, as W. D. Ross does, the concept of prima facie duty. The final section raises questions about the fairness of Sidgwick’s critique.
4

Collins, Richard B., Dale A. Oesterle, and Lawrence Friedman. "Schedule." In The Colorado State Constitution, 457–62. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780190907723.003.0030.

Abstract:
This chapter discusses the “Schedule” of the Colorado Constitution. Transition to statehood required that territorial institutions and law be retained until expressly replaced. At the end of the original constitution, twenty-two sections under the heading of Schedule detailed how the transition should work. Although almost entirely obsolete, none has been repealed. Schedule Section 1 was invoked by enterprising defense lawyers in efforts to get their clients off on a technicality. At least one succeeded. Section 20, requiring that presidential electors “be chosen by direct vote of the people,” could be read as obsolete or as a continuing constitutional rule. It is the only section with possible relevance to a current dispute.
5

Knee, Jonathan A. "Lessons from Clown School." In Class Clowns, 202–32. Columbia University Press, 2016. http://dx.doi.org/10.7312/columbia/9780231179287.003.0007.

Abstract:
The final section of Class Clowns tries to draw some conclusions from the analyses and case studies. Ten lessons are drawn that should provide guardrails to anyone seeking to play in the education domain. Broadly, the failures highlighted fall into three broad buckets: misguided ambition, misunderstood markets and faulty execution. None of these phenomena are unique to education, but their application takes on a very particular shape in the educational context.
6

Anne Franks, Mary. "“Not Where Bodies Live”." In Free Speech in the Digital Age, 137–49. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780190883591.003.0009.

Abstract:
John Perry Barlow, one of the founders of the Electronic Frontier Foundation (EFF), famously claimed in 1996 that the internet “is a world that is both everywhere and nowhere, but it is not where bodies live.” The conception of cyberspace as a realm of pure expression has encouraged an aggressively anti-regulatory approach to the internet. This approach was essentially codified in U.S. federal law in Section 230 of the Communications Decency Act, which invokes free speech principles to provide broad immunity for online intermediaries against liability for the actions of those who use their services. The free speech frame has encouraged an abstract approach to online conduct that downplays its material conditions and impact. Online intermediaries use Section 230 as both a shield and a sword—simultaneously avoiding liability for the speech of others while benefiting from that speech. In the name of free expression, Section 230 allows powerful internet corporations to profit from harmful online conduct while absorbing none of its costs.
7

Schram, Frederick R., and Stefan Koenemann. "Mictacea." In Evolution and Phylogeny of Pancrustacea, 362–69. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780195365764.003.0028.

Abstract:
Mictacea demonstrates the confusion that occurs when animals share so many primitive features. Modest Guţu in fact erected a number of orders to accommodate them, but these do not seem entirely justified. Although the two orders erected by Guţu currently do not have much support, in fairness it must be recognized that molecules, or information from yet-to-be-recognized mictacean taxa, could change the situation. In this case, Guţu’s Bochusacea and Cosinzeneacea might yet prove useful. They inhabit anchialine caves and the deep sea. The animals are small, slender, and subcylindrical in outline. The diagnosis of Guţu to the contrary, none of these animals appear particularly “flattened” in cross-section.
8

Sarch, Alexander. "The Scope of the Willful Ignorance Doctrine (I)." In Criminally Ignorant, 85–108. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780190056575.003.0004.

Abstract:
Determining when the equal culpability thesis holds sets the boundaries in which the willful ignorance doctrine is to be applied. Chapter 3 thus considers the best existing attempts to specify the conditions in which the equal culpability thesis holds, but proceeds to argue that none succeeds. Still, each failure is instructive. First, the chapter argues against the unrestricted equal culpability thesis. Not all willful ignorance, it turns out, is as culpable as the analogous knowing misconduct. Then the chapter argues against the three leading attempts to restrict the thesis. Section II argues against a restriction that appeals to bad motives, while Section III argues against a common counterfactual restriction (according to which willful ignorance is as culpable as knowing misconduct when one would do the actus reus even with knowledge). The latter proposal fails since criminal culpability does not depend on considerations about counterfactual conduct or one’s willingness to misbehave. Finally, Section IV discusses a third restriction, offered by Deborah Hellman, which asks whether the decision to remain in ignorance was itself justified. This version of the thesis is on the right track, but still requires refinement in important ways.
9

Fabre, Cécile. "Building Blocks." In Spying Through a Glass Darkly, 12–36. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780198833765.003.0002.

Abstract:
This chapter sets the stage for the remainder of the book. It provides an overview of the ethics of spying in classical moral and political thought. It examines three possible approaches to espionage: (a) the dirty-hands approach, which sees espionage as a necessary evil; (b) the contractarian view that espionage is best defended by appeal to normative conventions endorsed by all; (c) the view that the most fruitful way to construct an ethics of espionage is through the lens of just-war theory. None of those approaches are fully satisfactory, though they provide building blocks for the approach taken in this book. The final section sets out the normative principles on which the book rests.
10

Adams, Robert Merrihew. "God and Possibilities." In What Is, and What Is In Itself, 194–212. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780192856135.003.0012.

Abstract:
Section 11.1 of this chapter discusses how (and whether!) God can know all possibilities without actualizing all of them. For if unactualized conscious states and processes are to be permanent objects of actual thoughts in an eternal mind, how different is that from their being actual conscious states and processes of that mind? It is argued that discursive knowledge of logic and mathematics is not problematic in this way—and in section 11.2 that “intuitive” knowledge of qualities raises additional issues, but none that God could not manage. Section 11.3 points out that God does not need to know nearly as much about possibilities if God need not and does not try to create a “best possible world,” but is motivated by grace and love for actual creatures. Section 11.4 addresses two questions about causal explanation: First, the teleological question: “What am I doing, or at least trying to do, and why am I trying to do it?” I think I often know an at least roughly right answer to that question. The second and much harder question is “Why is it that I can do what I am trying to do, if indeed I can?” Occasionalism implies that it depends on God. That is where this line of explanations stops. No causal explanation of God’s omnipotence is on offer.

Conference papers on the topic "LZ. None of these, but in this section"

1

Al-Shuwaikhat, Hisham I., Shaker A. Al-Buhassan, Turki F. Al-Saadoun, and Saad M. Al-Driweesh. "Optimizing Production Facilities Using None-Radio Active Source MPFM in Ghawar Field in Saudi Aramco." In SPE Saudi Arabia section Young Professionals Technical Symposium. Society of Petroleum Engineers, 2008. http://dx.doi.org/10.2118/117063-ms.

2

Weaver, Matthew M., Michael G. Dunn, and Tab Heffernan. "Experimental Determination of the Influence of Foreign Particle Ingestion on the Behavior of Hot-Section Components Including Lamilloy®." In ASME 1996 International Gas Turbine and Aeroengine Congress and Exhibition. American Society of Mechanical Engineers, 1996. http://dx.doi.org/10.1115/96-gt-337.

Abstract:
The effect of foreign particle ingestion on hot-section components has been investigated with a series of experiments performed using a one-quarter sector F100-PW-100 annular combustor. The combustor was operated so that the engine-corrected conditions were duplicated. The experiments were designed to determine the influence of cooling hole size, hole roughness (laser-drilled vs. EDM), combustor exit temperature and dust concentration on the cooling capability of the component in the presence of dust particles. Cylindrical Lamilloy® specimens, film-cooled Inconel cylinders, and F100-PW-220 first-stage turbine vanes were investigated. The size distribution of the foreign particles injected into the air stream just upstream of the combustor was consistent with distributions measured at the exit of high compressors for large turbofan engines. The foreign material was a soil composition representative of materials found around the world. Pre- and post-test bench flow measurements suggested that none of the test specimen’s cooling passages or holes were clogged by the foreign particles. Foreign particle deposition on the specimens was observed, and approximate values for the minimum combustor exit temperature and minimum specimen wall temperature at which deposition occurs were measured.
3

Urdaneta, Mario, Alfonso Ortega, and Russel V. Westphal. "Experiments and Modeling of the Hydraulic Resistance of In-Line Square Pin Fin Heat Sinks With Top By-Pass Flow." In ASME 2003 International Electronic Packaging Technical Conference and Exhibition. ASMEDC, 2003. http://dx.doi.org/10.1115/ipack2003-35268.

Abstract:
Extensive experiments were performed aimed at obtaining physical insight into the behavior of in-line pin fin heat sinks with pins of square cross-section. Detailed pressure measurements were made inside an array of square pins in order to isolate the inlet, developing, fully developed, and exit static pressure distributions as a function of row number. With this as background data, overall pressure drop was measured for a self-consistent set of aluminum heat sinks in side inlet side exit flow, with top clearance only. Pin heights of 12.5 mm, 17.5 mm, and 22.5 mm, pin pitch of 3.4 mm to 6.33 mm, and pin thickness of 1.5 mm, 2 mm and 2.5 mm were evaluated. Base dimensions were kept fixed at 25 × 25 mm. In total, 20 aluminum heat sinks were evaluated. A “two-branch by-pass model” was developed by allowing inviscid acceleration of the flow in the bypass section, and using pressure loss coefficients obtained under no bypass conditions in the heat sink section. The experimental data compared well to the proposed hydraulic models. Measurements in the array of pins showed that full development of the flow occurs after nine rows, thus indicating that none of the heat sinks tested could be characterized as fully-developed.
4

Hu, Kenny S. Y., and Tom I.-P. Shih. "Large-Eddy and RANS Simulations of Heat Transfer in a U-Duct With a High-Aspect Ratio Trapezoidal Cross Section." In ASME Turbo Expo 2018: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/gt2018-75535.

Abstract:
Large-eddy and RANS simulations were performed to examine the details of the heat-transfer mechanisms in a U-duct with a high-aspect ratio trapezoidal cross section at a Reynolds number of 20,000. ANSYS-Fluent was used to perform the simulations. For the large-eddy simulations (LES), the WALE subgrid-scale model was employed, and its inflow boundary condition was provided by a concurrent LES of incompressible fully-developed flow in a straight duct with the same cross section and flow conditions as the U-duct. The grid resolution required to obtain meaningful LES solutions was determined via a grid sensitivity study of incompressible fully-developed turbulent flow in a straight duct of square cross section, where data from direct numerical simulation (DNS) and experiments are available to validate and guide the simulation. In addition, the grid used satisfies Celik’s criterion and resolves the Kolmogorov −5/3 law. Results were also obtained for the U-duct by using RANS, and three widely used turbulence models were examined: the realizable k-ε with the two-layer model in the near-wall region, the shear-stress transport (SST) model, and the stress-omega full Reynolds stress model (RSM). Results obtained from LES showed unsteady flow separation to occur immediately after the turn region, which none of the RANS models could predict. By being able to capture this unsteady flow mechanism, LES was able to predict the measured heat transfer downstream of the U-duct. The maximum relative error in the predicted local heat-transfer coefficient was less than 10% in the LES results, but up to 80% in the RANS results.
5

Carroll, L. Blair, and Moe S. Madi. "Crack Detection Program on the Cromer to Gretna, Manitoba Section of Enbridge Pipelines Inc. Line 3." In 2000 3rd International Pipeline Conference. American Society of Mechanical Engineers, 2000. http://dx.doi.org/10.1115/ipc2000-186.

Abstract:
Enbridge Pipelines Inc. Line 3 is an 860 mm (34 inch), API 5LX Grade X52 pipeline with nominal wall thickness ranging from 7.14 mm to 12.5 mm. The Canadian portion of the line runs from Edmonton, Alberta to Gretna, Manitoba. It was constructed between 1963 and 1969 in a series of loops designed to increase the capacity of the Enbridge system. Until 1999 the pipeline operated in a looped configuration with neighboring 24 inch and 48 inch pipelines. Line 3 downstream of Kerrobert, Saskatchewan began operating in a straight 34 inch configuration in 1999, following completion of the first-phase expansion project, known as the Terrace Expansion Project, that connects the (48 inch) loops with a new (36 inch) pipeline. In 1997, the Pipetronix (now PII) Ultrascan CD in-line inspection tool was run for 283 km from Cromer to Gretna, Manitoba, to identify long seam cracking and pipe body stress corrosion cracking. This section of the line is composed primarily of pipe manufactured with a double submerged arc welded long seam, with short sections of pipe having electric resistance welded long seams. Two primary objectives were set forth in this inspection project. The first was to assess the integrity of this section of Line 3 and identify anomalies which might affect the future operation of the pipeline. The second objective was to evaluate the performance of the Ultrascan CD tool and determine its potential role in the Enbridge integrity program. A series of excavations was conducted based upon the analysis of this data, and none of the indications identified were considered to be an immediate concern to the integrity of the pipeline. Notably, two of the excavations resulted in the detection of the first two “significant” SCC colonies (based upon the CEPA definition of significant) [1] found on the Enbridge system. This paper focuses on the tool performance requirements established by Enbridge prior to the inspection run, which include specific defect types and sizes and defects at the maximum sensitivity of the tool, as well as the information obtained from the excavation program and the onsite inspections and assessments. The information gathered from this program was useful in better understanding the tool's tolerance in detecting such defects and in better differentiating between them.
6

Bouhairie, Salem, Siddharth Talapatra, and Kevin Farrell. "Turbulent Flow in a No-Tube-in-Window Shell-and-Tube Heat Exchanger: CFD vs PIV." In ASME 2014 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/imece2014-36902.

Abstract:
A research-scale shell-and-tube heat exchanger housing a no-tube-in-window (NTIW) arrangement of tubes is analyzed using ANSYS® FLUENT. Three-dimensional, computational fluid dynamic (CFD) simulations of adiabatic flow in a periodic section of the exchanger were conducted. The numerical results were compared to particle image velocimetry (PIV) measurements in the window region where tubes are not present. As part of the study, the k-epsilon with scalable wall function, k-omega with shear stress transport (SST), Reynolds Stress (RSM), and Scale Adaptive Simulation (SAS) turbulence models were assessed. Each turbulence model showed some similarities with the recorded phenomena, but none fully captured the complexity of the flow field outside of the tube bundle. Additional simulations of an entire NTIW exchanger model were performed to examine the flow behavior between the window and crossflow regions, as window momentum flux, ρu², limits are a concern for safe mechanical design.
APA, Harvard, Vancouver, ISO, and other styles
7

Bajwa, Christopher S., and Earl P. Easton. "Potential Effects of Historic Rail Accidents on the Integrity of Spent Nuclear Fuel Transportation Packages." In ASME 2009 Pressure Vessels and Piping Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/pvp2009-77811.

Full text
Abstract:
The US Nuclear Regulatory Commission (NRC) completed an analysis of historical rail accidents (from 1975 to 2005) involving hazardous materials and long-duration fires in the United States. The analysis was initiated to determine what types of accidents had occurred and what impact those types of accidents could have on the rail transport of spent nuclear fuel. The NRC found that almost 21 billion miles of freight rail shipments over a 30-year period had resulted in a small number of accidents involving the release of hazardous materials, eight of which involved long-duration fires. All eight of the accidents analyzed resulted in fires that were less severe than the “fully engulfing fire” described as a hypothetical accident condition in the NRC regulations for radioactive material transport found in Title 10 of the Code of Federal Regulations, Part 71, Section 73. None of the eight accidents involved a release of radioactive material. This paper describes the eight accidents in detail and examines the potential effects on spent nuclear fuel transportation packages exposed to the fires that resulted from these accidents.
APA, Harvard, Vancouver, ISO, and other styles
8

Kochan, P., and H. J. Carmichael. "Photon-statistics dependence of single-atom absorption." In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1993. http://dx.doi.org/10.1364/oam.1993.wxx.8.

Full text
Abstract:
Under usual laboratory conditions the absorption of a beam of light by a dilute two-level medium depends on the intensity and spectrum of the incident light. If the incident photon flux is small compared to the decay rate of the absorber’s excited state, saturation effects can be neglected and only the spectrum of the light is important; thus, weak-field absorption depends only on the first-order correlation function of the incident light. This result relies, however, on a requirement that the incident beam cross-section be much larger than the absorption cross-section of an atom. Then individual atoms act as low-efficiency scatterers and the statistics of the scattering process are governed by the law of large numbers applied to many scattering sites. In contrast, if it is arranged that just one atom significantly absorbs (scatters) a beam of photons, correlation functions of the incident light beyond the first order must be considered. For a beam that is focused within an absorption cross-section, we might naively predict that every photon will be scattered and none transmitted. In fact, even for a very weak beam there is a finite probability for two photons to arrive at the atom within the excited-state lifetime, and in this event at least one of them will be transmitted (statistical saturation). In this paper we calculate the transmitted photon flux for a weak beam of photons focused strongly onto a single, resonant two-state atom. We study the dependence of the transmitted flux on the statistics of the incident photons. Photon beams derived from sources of coherent, chaotic, squeezed, coherent-antibunched, and broadband-antibunched light are considered. Our calculations are based on a recently developed theory of cascaded open systems [1,2], and serve to illustrate the usefulness of this theory.
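The "statistical saturation" argument in the abstract rests on the finite chance that two photons arrive within one excited-state lifetime even when the beam is weak. For a coherent (Poissonian) beam that chance follows directly from Poisson statistics; the sketch below evaluates it for assumed values of photon flux and lifetime. It is an illustrative back-of-the-envelope calculation only, not the cascaded-open-systems treatment the authors use.

    import math

    def prob_two_or_more(photon_flux_hz, lifetime_s):
        """P(n >= 2) for a Poissonian (coherent) beam during one excited-state lifetime."""
        mu = photon_flux_hz * lifetime_s          # mean photon number in the window
        return 1.0 - math.exp(-mu) * (1.0 + mu)

    # Placeholder numbers: a weak beam of 1e5 photons/s and a 26 ns excited-state lifetime.
    print(prob_two_or_more(photon_flux_hz=1e5, lifetime_s=26e-9))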
APA, Harvard, Vancouver, ISO, and other styles
9

Tyson, Samuel, and Shiraz Tayabji. "Long-Life Pavement for Users of an International Roadway in New Mexico." In 12th International Conference on Concrete Pavements. International Society for Concrete Pavements, 2021. http://dx.doi.org/10.33593/v38reo2p.

Full text
Abstract:
A 36-lane-mile (60 lane-km) international roadway was rehabilitated in the United States of America (USA) during 2018 by the New Mexico Department of Transportation (NMDOT) to provide uninterrupted long-life pavement performance for commercial users of the roadway. The southern border of the USA with the country of Mexico marks the starting point of New Mexico State Road 136 (NM 136), a four-lane divided roadway that carries heavily loaded trucks associated with the United States–Mexico–Canada Agreement (USMCA), formerly the North American Free Trade Agreement (NAFTA). Truck traffic in the dual north- and south-bound lanes of this roadway is especially high on the 9-mile (15-km) section of NM 136 between the international border and an intermodal railway facility located in the USA state of New Mexico. Prior to this rehabilitation project, the structural cross-section of NM 136 consisted of 4.5 to 6.0 inches (110 to 150 mm) of asphalt on 5.0 to 6.0 inches (130 to 150 mm) of coarse-grained soils. Prior to this project on NM 136, NMDOT had very little experience with concrete pavements and none with continuously reinforced concrete pavements (CRCPs). The structural design for this rehabilitation project utilized the existing asphalt pavement as a satisfactory base for the CRCP by milling 1.5 inches (40 mm) of the existing asphalt concrete (AC) pavement and applying a 1.5-inch (40-mm) AC levelling course followed by the CRCP. This paper presents the design- and construction-related details of the NM 136 CRCP project.
APA, Harvard, Vancouver, ISO, and other styles
10

Abbaszadeh, Morteza, Parvin Nikpour Parizi, and Ramin Taheri. "A Novel Approach to Design Reversible Counter Rotating Propeller Fans." In ASME 2012 Gas Turbine India Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/gtindia2012-9657.

Full text
Abstract:
Because of their high performance and unique abilities, such as producing a non-rotating wake, Counter Rotating Propellers (C.R.P.) are being used in many advanced propulsion or ventilation systems. However, due to the complicated design procedure of C.R.P. fans, it has not previously been possible to apply the concept in reversible systems. For the first time, in this research a new method is presented to design a reversible counter rotating propeller system. This method is based on designing a basic C.R.P. using a reliable edition of blade element theory to achieve maximum performance in the main rotating course and then optimizing it to have almost the same performance in the reverse rotating course. After the concepts of the method are presented, a basic model is designed to demonstrate the capability of the proposed scheme. The design outcome was a Reversible C.R.P. (R.C.R.P.) system 0.5 meters in diameter with a NACA2412 airfoil cross-section. This model is evaluated using the R.S.M. turbulence method in the ANSYS Fluent commercial software package. Evaluation results showed that the system has an efficiency of 0.85 in the main course and 0.78 in the reverse course, a good performance for a small reversible system.
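The design method described in the abstract starts from blade element theory. As a heavily simplified illustration (not the authors' procedure), the Python sketch below integrates thrust and torque along a single propeller blade, neglecting induced velocities and assuming a thin-airfoil lift slope with constant chord and twist; every numerical value is a placeholder.

    import math

    def blade_element_thrust_torque(tip_r_m, hub_r_m, n_blades, chord_m, twist_deg,
                                    rpm, v_axial_m_s, rho=1.225, n_elem=40, cd0=0.02):
        """Integrate thrust (N) and torque (N*m) along one propeller, no induced velocities."""
        omega = rpm * 2.0 * math.pi / 60.0
        dr = (tip_r_m - hub_r_m) / n_elem
        thrust, torque = 0.0, 0.0
        for i in range(n_elem):
            r = hub_r_m + (i + 0.5) * dr
            phi = math.atan2(v_axial_m_s, omega * r)     # inflow angle
            alpha = math.radians(twist_deg) - phi        # angle of attack (constant twist assumed)
            cl = 2.0 * math.pi * alpha                   # thin-airfoil lift slope
            w2 = v_axial_m_s ** 2 + (omega * r) ** 2     # resultant velocity squared
            dL = 0.5 * rho * w2 * chord_m * cl * dr
            dD = 0.5 * rho * w2 * chord_m * cd0 * dr
            thrust += n_blades * (dL * math.cos(phi) - dD * math.sin(phi))
            torque += n_blades * (dL * math.sin(phi) + dD * math.cos(phi)) * r
        return thrust, torque

    # Placeholder geometry loosely matching a 0.5 m diameter fan.
    print(blade_element_thrust_torque(tip_r_m=0.25, hub_r_m=0.05, n_blades=4, chord_m=0.04,
                                      twist_deg=15.0, rpm=3000.0, v_axial_m_s=5.0))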
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "LZ. None of these, but in this section"

1

Adams, Sunny, Madison Story, and Adam Smith. Evaluation of 11 buildings in the Fort McCoy cantonment. Engineer Research and Development Center (U.S.), September 2022. http://dx.doi.org/10.21079/11681/45350.

Full text
Abstract:
The United States Congress codified the National Historic Preservation Act of 1966 (NHPA), the nation’s most effective cultural resources legislation to date, mostly through establishing the National Register of Historic Places (NRHP). The NHPA requires federal agencies to address their cultural resources, which are defined as any prehistoric or historic district, site, building, structure, or object. Section 110 of the NHPA requires federal agencies to inventory and evaluate their cultural resources, and Section 106 requires them to determine the effect of federal undertakings on those potentially eligible for the NRHP. Fort McCoy is in west-central Wisconsin, entirely within Monroe County. It was first established as the Sparta Maneuver Tract in 1909. The post was renamed Camp McCoy in 1926. Since 1974, it has been known as Fort McCoy. This report provides historic context and determinations of eligibility for buildings in the cantonment constructed between 1946 and 1975 and concludes that none are eligible for the NRHP. In consultation with the Wisconsin State Historic Preservation Officer (WISHPO), this work fulfills Section 110 requirements for these buildings.
APA, Harvard, Vancouver, ISO, and other styles
2

Price, Roz. Climate Adaptation: Lessons and Insights for Governance, Budgeting, and Accountability. Institute of Development Studies (IDS), December 2020. http://dx.doi.org/10.19088/k4d.2022.008.

Full text
Abstract:
This rapid review draws on literature from academic, policy and non-governmental organisation sources. There is a huge literature on climate governance issues in general, but less is known about effective support and the political economy of adaptation. The literature base and case studies on climate finance accountability and budgeting in governments are nascent and growing. Section 2 of this report briefly discusses governance of climate change issues, with a focus on the complexity and cross-cutting nature of climate change compared to the often static organisational landscape of government structured along sectoral lines. Section 3 explores green public financial management (PFM). Section 4 then brings together several principles and lessons learned on green PFM highlighted in the guidance notes. Transparency and accountability lessons are then highlighted in Section 5. The key findings are: 1) Engaging with the governance context and the political economy of climate governance and financing is crucial to climate objectives being realised. 2) More attention is needed on whether and how governments are prioritising adaptation and resilience in their own operations. 3) Countries in Africa further along in the green PFM agenda give accounts of reform approaches that are gradual, iterative and context-specific, building on existing PFM systems and their functionality. 4) A well-functioning “accountability ecosystem” is needed in which state and non-state accountability actors engage with one another. 5) Climate change finance accountability systems and ecosystems in countries are at best emerging. 6) Although case studies from Nepal, the Philippines and Bangladesh are commonly cited in the literature and are seen as some of the most advanced developing country examples of green PFM, none of the countries have had significant examples of collaboration and engagement between actors. 7) Lessons and guiding principles for green PFM reform include: use the existing budget cycle and legal frameworks; ensure that the basic elements of a functional PFM system are in place; strong leadership of the Ministry of Finance (MoF) and clear linkages with the overall PFM reform agenda are needed; smart sequencing of reforms; real political ownership and clearly defined roles and responsibilities; and good communication to stakeholders.
APA, Harvard, Vancouver, ISO, and other styles
3

Goeckeritz, Joel, Nathan Schank, Ryan L Wood, Beverly L Roeder, and Alonzo D Cook. Use of Urinary Bladder Matrix Conduits in a Rat Model of Sciatic Nerve Regeneration after Nerve Transection Injury. Science Repository, December 2022. http://dx.doi.org/10.31487/j.rgm.2022.03.01.

Full text
Abstract:
Previous research has demonstrated the use of single-channel porcine-derived urinary bladder matrix (UBM) conduits in segmental-loss, peripheral nerve repairs as comparable to criterion-standard nerve autografts. This study aimed to replicate and expand upon this research with additional novel UBM conduits and coupled therapies. Fifty-four Wistar Albino rats were divided into 6 groups, and each underwent a surgical neurectomy to remove a 7-millimeter section of the sciatic nerve. Bridging of this nerve gap and treatment for each group was as follows: i) reverse autograft—the segmented nerve was reversed 180 degrees and used to reconnect the proximal and distal nerve stumps; ii) the nerve gap was bridged via a silicone conduit; iii) a single-channel UBM conduit; iv) a multi-channel UBM conduit; v) a single-channel UBM conduit identical to group 3 coupled with fortnightly transcutaneous electrical nerve stimulation (TENS); vi) or, a multi-channel UBM conduit identical to group 4 coupled with fortnightly TENS. The extent of nerve recovery was assessed by behavioural parameters: foot fault asymmetry scoring measured weekly for six weeks; electrophysiological parameters: compound muscle action potential (CMAP) amplitudes, measured at weeks 0 and 6; and morphological parameters: total fascicle areas, myelinated fiber counts, fiber densities, and fiber sizes measured at week 6. All the above parameters demonstrated recovery of the test groups (3-6) as being either comparable or less than that of reverse autograft, but none were shown to outperform reverse autograft. As such, UBM conduits may yet prove to be an effective treatment to repair relatively short segmental peripheral nerve injuries, but further research is required to demonstrate greater efficacy over nerve autografts.
APA, Harvard, Vancouver, ISO, and other styles
4

Goeckeritz, Joel, Nathan Schank, Ryan L Wood, Beverly L Roeder, and Alonzo D Cook. Use of Urinary Bladder Matrix Conduits in a Rat Model of Sciatic Nerve Regeneration after Nerve Transection Injury. Science Repository, December 2022. http://dx.doi.org/10.31487/j.rgm.2022.03.01.sup.

Full text
Abstract:
Previous research has demonstrated the use of single-channel porcine-derived urinary bladder matrix (UBM) conduits in segmental-loss, peripheral nerve repairs as comparable to criterion-standard nerve autografts. This study aimed to replicate and expand upon this research with additional novel UBM conduits and coupled therapies. Fifty-four Wistar Albino rats were divided into 6 groups, and each underwent a surgical neurectomy to remove a 7-millimeter section of the sciatic nerve. Bridging of this nerve gap and treatment for each group was as follows: i) reverse autograft—the segmented nerve was reversed 180 degrees and used to reconnect the proximal and distal nerve stumps; ii) the nerve gap was bridged via a silicone conduit; iii) a single-channel UBM conduit; iv) a multi-channel UBM conduit; v) a single-channel UBM conduit identical to group 3 coupled with fortnightly transcutaneous electrical nerve stimulation (TENS); vi) or, a multi-channel UBM conduit identical to group 4 coupled with fortnightly TENS. The extent of nerve recovery was assessed by behavioural parameters: foot fault asymmetry scoring measured weekly for six weeks; electrophysiological parameters: compound muscle action potential (CMAP) amplitudes, measured at weeks 0 and 6; and morphological parameters: total fascicle areas, myelinated fiber counts, fiber densities, and fiber sizes measured at week 6. All the above parameters demonstrated recovery of the test groups (3-6) as being either comparable or less than that of reverse autograft, but none were shown to outperform reverse autograft. As such, UBM conduits may yet prove to be an effective treatment to repair relatively short segmental peripheral nerve injuries, but further research is required to demonstrate greater efficacy over nerve autografts.
APA, Harvard, Vancouver, ISO, and other styles
5

Friedler, Haley S., Michelle B. Leavy, Eric Bickelman, Barbara Casanova, Diana Clarke, Danielle Cooke, Andy DeMayo, et al. Outcome Measure Harmonization and Data Infrastructure for Patient-Centered Outcomes Research in Depression: Data Use and Governance Toolkit. Agency for Healthcare Research and Quality (AHRQ), October 2021. http://dx.doi.org/10.23970/ahrqepcwhitepaperdepressiontoolkit.

Full text
Abstract:
Executive Summary Patient registries are important tools for advancing research, improving healthcare quality, and supporting health policy. Registries contain vast amounts of data that could be used for new purposes when linked with other sources or shared with researchers. This toolkit was developed to summarize current best practices and provide information to assist registries interested in sharing data. The contents of this toolkit were developed based on review of the literature, existing registry practices, interviews with registries, and input from key stakeholders involved in the sharing of registry data. While some information in this toolkit may be relevant in other countries, this toolkit focuses on best practices for sharing data within the United States. Considerations related to data sharing differ across registries depending on the type of registry, registry purpose, funding source(s), and other factors; as such, this toolkit describes general best practices and considerations rather than providing specific recommendations. Finally, data sharing raises complex legal, regulatory, operational, and technical questions, and none of the information contained herein should be substituted for legal advice. The toolkit is organized into three sections: “Preparing to Share Data,” “Governance,” and “Procedures for Reviewing and Responding to Data Requests.” The section on “Preparing to Share Data” discusses the role of appropriate legal rights to further share the data and the need to follow all applicable ethical regulations. Registries should also prepare for data sharing activities by ensuring data are maintained appropriately and developing policies and procedures for governance and data sharing. The “Governance” section describes the role of governance in data sharing and outlines key governance tasks, including defining and staffing relevant oversight bodies; developing a data request process; reviewing data requests; and overseeing access to data by the requesting party. Governance structures vary based on the scope of data shared and registry resources. Lastly, the section on “Procedures for Reviewing and Responding to Data Requests” discusses the operational steps involved in sharing data. Policies and procedures for sharing data may depend on what types of data are available for sharing and with whom the data can be shared. Many registries develop a data request form for external researchers interested in using registry data. When reviewing requests, registries may consider whether the request aligns with the registry’s mission/purpose, the feasibility and merit of the proposed research, the qualifications of the requestor, and the necessary ethical and regulatory approvals, as well as administrative factors such as costs and timelines. Registries may require researchers to sign a data use agreement or other such contract to clearly define the terms and conditions of data use before providing access to the data in a secure manner. The toolkit concludes with a list of resources and appendices with supporting materials that registries may find helpful.
APA, Harvard, Vancouver, ISO, and other styles
6

Stavland, Arne, Siv Marie Åsen, Arild Lohne, Olav Aursjø, and Aksel Hiorth. Recommended polymer workflow: Lab (cm and m scale). University of Stavanger, November 2021. http://dx.doi.org/10.31265/usps.201.

Full text
Abstract:
Polymer flooding is one of the most promising EOR methods (Smalley et al. 2018). It is well known and has been used successfully (Pye 1964; Standnes & Skjevrak 2014; Sheng et al. 2015). From a technical perspective we recommend that polymer flooding should be considered as a viable EOR method on the Norwegian Continental Shelf for the following reasons: 1. More oil can be produced with less water injected; this is particularly important for the NCS, which is currently producing more water than oil. 2. Polymers will increase the areal sweep and improve the ultimate recovery, provided a proper injection strategy. 3. Many polymer systems are available, and it should be possible to tailor their chemical composition to a wide range of reservoir conditions (temperature and salinity). 4. Polymer systems can be used to block water from short-circuiting injection and production wells. 5. Polymer combined with low salinity injection water has many benefits: a lower polymer concentration can be used to reach target viscosity, less mechanical degradation, less adsorption, and a potential reduction in Sor due to a low salinity wettability effect. There are some hurdles that need to be considered when evaluating polymer flooding: 1. Many polymer systems are not at present considered green chemicals; thus, reinjection of produced water is needed. However, results from polymer degradation studies in the IORCentre indicate that a. High molecular weight polymers are quickly degraded to low molecular weight. In case of accidental release to the ocean, low molecular weight polymers are diluted and the lifetime of the spill might be quite short. According to Caulfield et al. (2002) HPAM is not toxic, and will not degrade to the more environmentally problematic acrylamide. b. In the DF report for environmental impact there are case studies using the DREAM model to predict the transport of chemical spills. This model is coupled with polymer (sun exposure) degradation data from the IORCentre to quantify the lifetime of polymer spills. This approach should be used for specific field cases to quantify the environmental risk factor. 2. Care must be taken to prepare the polymer solution offshore. Chokes and valves might be a challenge but can be mitigated according to the results from the large-scale testing done in the IORCentre (Stavland et al. 2021). None of the above-mentioned challenges is severe enough to rule out polymer flooding. HPAM is neither toxic, bio-accumulative, nor bio-persistent, and the CO2 footprint of a polymer flood may be significantly less than that of a water flood (Dupuis et al. 2021). There are at least two contributing factors to this statement, to which we will return in detail in the next section: i) during linear displacement, polymer injection will produce more oil for the same amount of water injected, hence the lifetime of the field can be shortened; ii) polymers increase the areal sweep, reducing the need for wells.
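The claim that polymer improves areal sweep is conventionally explained through the mobility ratio between the displacing water and the displaced oil: viscosifying the water lowers the ratio toward or below one, which stabilises the displacement front. The sketch below computes the ratio for a plain water flood and for water viscosified with polymer; the end-point relative permeabilities and viscosities are assumed values for illustration and are not taken from the report.

    def mobility_ratio(krw, mu_w_cp, kro, mu_o_cp):
        """Mobility ratio M = (krw/mu_w) / (kro/mu_o); M <= 1 favours a stable displacement front."""
        return (krw / mu_w_cp) / (kro / mu_o_cp)

    # Assumed end-point values, for illustration only.
    print(mobility_ratio(krw=0.3, mu_w_cp=0.5, kro=0.8, mu_o_cp=5.0))   # plain water flood
    print(mobility_ratio(krw=0.3, mu_w_cp=15.0, kro=0.8, mu_o_cp=5.0))  # water viscosified with polymer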
APA, Harvard, Vancouver, ISO, and other styles
7

Mazzoni, Silvia, Nicholas Gregor, Linda Al Atik, Yousef Bozorgnia, David Welch, and Gregory Deierlein. Probabilistic Seismic Hazard Analysis and Selecting and Scaling of Ground-Motion Records (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/zjdn7385.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 3 (WG3), Task 3.1: Selecting and Scaling Ground-motion records. The objective of Task 3.1 is to provide suites of ground motions to be used by other working groups (WGs), especially Working Group 5: Analytical Modeling (WG5) for Simulation Studies. The ground motions used in the numerical simulations are intended to represent seismic hazard at the building site. The seismic hazard is dependent on the location of the site relative to seismic sources, the characteristics of the seismic sources in the region and the local soil conditions at the site. To achieve a proper representation of hazard across the State of California, ten sites were selected, and a site-specific probabilistic seismic hazard analysis (PSHA) was performed at each of these sites for both a soft soil (Vs30 = 270 m/sec) and a stiff soil (Vs30 = 760 m/sec). The PSHA used the UCERF3 seismic source model, the latest seismic source model adopted by the USGS [2013], and the NGA-West2 ground-motion models. The PSHA was carried out for structural periods ranging from 0.01 to 10 sec. At each site and soil class, the results from the PSHA—hazard curves, hazard deaggregation, and uniform-hazard spectra (UHS)—were extracted for a series of ten return periods, prescribed by WG5 and WG6, ranging from 15.5 to 2500 years. For each case (site, soil class, and return period), the UHS was used as the target spectrum for selection and modification of a suite of ground motions. Additionally, another set of target spectra based on “Conditional Spectra” (CS), which are more realistic than UHS, was developed [Baker and Lee 2018]. The Conditional Spectra are defined by the median (Conditional Mean Spectrum) and a period-dependent variance. A suite of at least 40 record pairs (horizontal) was selected and modified for each return period and target-spectrum type.
Thus, for each ground-motion suite, 40 or more record pairs were selected using the deaggregation of the hazard, resulting in more than 200 record pairs per target-spectrum type at each site. The suites contained more than 40 records in case some were rejected by the modelers due to secondary characteristics; however, none were rejected, and the complete set was used. For the case of UHS as the target spectrum, the selected motions were modified (scaled) such that the average of the median spectrum (RotD50) [Boore 2010] of the ground-motion pairs follows the target spectrum closely within the period range of interest to the analysts. In communications with WG5 researchers, a period range of 0.01–2.0 sec was selected for ground-motion (time histories, or time series) selection and modification for this specific application of the project. The duration metrics and pulse characteristics of the records were also used in the final selection of ground motions. The damping ratio for the PSHA and ground-motion target spectra was set to 5%, which is standard practice in engineering applications. For the cases where the CS was used as the target spectrum, the ground-motion suites were selected and scaled using a modified version of the conditional spectrum ground-motion selection tool (CS-GMS tool) developed by Baker and Lee [2018]. This tool selects and scales a suite of ground motions to meet both the median and the user-defined variability. This variability is defined by the relationship developed by Baker and Jayaram [2008]. The computation of CS requires a structural period for the conditional model. In collaboration with WG5 researchers, a conditioning period of 0.25 sec was selected as representative of the fundamental mode of vibration of the buildings of interest in this study. Working Group 5 carried out a sensitivity analysis of using other conditioning periods, and the results and discussion of the selection of conditioning period are reported in Section 4 of the WG5 PEER report entitled Technical Background Report for Structural Analysis and Performance Assessment. The WG3.1 report presents a summary of the selected sites, the seismic-source characterization model, and the ground-motion characterization model used in the PSHA, followed by selection and modification of suites of ground motions. The Record Sequence Number (RSN) and the associated scale factors are tabulated in the Appendices of this report, and the actual time-series files can be downloaded from the PEER Ground-motion database Portal (https://ngawest2.berkeley.edu/).
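For the UHS-target case the abstract states that records were scaled so that the average RotD50 spectrum of the suite follows the target closely over the period range of interest. One simple per-record way to obtain such scale factors, shown below as a hedged sketch rather than the exact WG3 procedure, is to choose the factor that minimises the mean log-spectral misfit to the target over the 0.01–2.0 sec band; the spectral ordinates in the example are placeholders.

    import numpy as np

    def scale_factor_to_target(record_sa, target_sa, periods, t_min=0.01, t_max=2.0):
        """Scale factor minimising the mean log misfit to the target spectrum over [t_min, t_max]."""
        periods = np.asarray(periods)
        mask = (periods >= t_min) & (periods <= t_max)
        log_ratio = np.log(np.asarray(target_sa)[mask]) - np.log(np.asarray(record_sa)[mask])
        return float(np.exp(log_ratio.mean()))

    # Placeholder spectral ordinates (g) at a few periods (sec), for illustration only.
    periods = [0.01, 0.1, 0.25, 0.5, 1.0, 2.0]
    target  = [0.40, 0.80, 1.00, 0.70, 0.35, 0.15]   # hypothetical UHS
    record  = [0.20, 0.45, 0.60, 0.30, 0.20, 0.10]   # hypothetical record RotD50 spectrum
    print(scale_factor_to_target(record, target, periods))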
APA, Harvard, Vancouver, ISO, and other styles
8

Jorgensen, Frieda, Andre Charlett, Craig Swift, Anais Painset, and Nicolae Corcionivoschi. A survey of the levels of Campylobacter spp. contamination and prevalence of selected antimicrobial resistance determinants in fresh whole UK-produced chilled chickens at retail sale (non-major retailers). Food Standards Agency, June 2021. http://dx.doi.org/10.46756/sci.fsa.xls618.

Full text
Abstract:
Campylobacter spp. are the most common bacterial cause of foodborne illness in the UK, with chicken considered to be the most important vehicle for this organism. The UK Food Standards Agency (FSA) agreed with industry to reduce Campylobacter spp. contamination in raw chicken and issued a target to reduce the prevalence of the most contaminated chickens (those with more than 1000 cfu per g chicken neck skin) to below 10% at the end of the slaughter process, initially by 2016. To help monitor progress, a series of UK-wide surveys was undertaken to determine the levels of Campylobacter spp. on whole UK-produced, fresh chicken at retail sale in the UK. The data obtained for the first four years was reported in FSA projects FS241044 (2014/15) and FS102121 (2015 to 2018). The FSA has indicated that the retail proxy target for the percentage of highly contaminated raw whole retail chickens should be less than 7%, and while continued monitoring has demonstrated a sustained decline for chickens from major retailer stores, chicken on sale in other stores has yet to meet this target. This report presents results from testing chickens from non-major retailer stores (only) in a fifth survey year from 2018 to 2019. In line with previous practice, samples were collected from stores distributed throughout the UK (in proportion to the population size of each country). Testing was performed by two laboratories: a Public Health England (PHE) laboratory or the Agri-Food & Biosciences Institute (AFBI), Belfast. Enumeration of Campylobacter spp. was performed using the ISO 10272-2 standard enumeration method applied with a detection limit of 10 colony forming units (cfu) per gram (g) of neck skin. Antimicrobial resistance (AMR) to selected antimicrobials, in accordance with those advised in the EU harmonised monitoring protocol, was predicted from genome sequence data in Campylobacter jejuni and Campylobacter coli isolates. The percentage (10.8%) of fresh, whole chicken at retail sale in stores of smaller chains (for example, Iceland, McColl’s, Budgens, Nisa, Costcutter, One Stop), independents and butchers (collectively referred to as non-major retailer stores in this report) in the UK that are highly contaminated (at more than 1000 cfu per g) with Campylobacter spp. has decreased since the previous survey year but is still higher than that found in samples from major retailers. Whole fresh raw chickens from non-major retailer stores were collected from August 2018 to July 2019 (n = 1009). Campylobacter spp. were detected in 55.8% of the chicken skin samples obtained from non-major retailer shops, and 10.8% of the samples had counts above 1000 cfu per g chicken skin. Comparison among production plant approval codes showed significant differences in the percentages of chicken samples with more than 1000 cfu per g, ranging from 0% to 28.1%. The percentage of samples with more than 1000 cfu of Campylobacter spp. per g was significantly higher in the period May, June and July than in the period November to April. The percentage of highly contaminated samples was significantly higher for samples taken from larger compared to smaller chickens.
There was no statistical difference in the percentage of highly contaminated samples between those obtained from chicken reared with access to range (for example, free-range and organic birds) and those reared under a standard regime (for example, no access to range), but the small sample size for organic, and to a lesser extent free-range, chickens may have limited the ability to detect important differences should they exist. Campylobacter species was determined for isolates from 93.4% of the positive samples. C. jejuni was isolated from the majority (72.6%) of samples while C. coli was identified in 22.1% of samples. A combination of both species was found in 5.3% of samples. C. coli was more frequently isolated from samples obtained from chicken reared with access to range in comparison to those reared as standard birds. C. jejuni was less prevalent during the summer months of June, July and August compared to the remaining months of the year. Resistance to ciprofloxacin (fluoroquinolone), erythromycin (macrolide), tetracycline (tetracyclines), gentamicin and streptomycin (aminoglycosides) was predicted from WGS data by the detection of known antimicrobial resistance determinants. Resistance to ciprofloxacin was detected in 185 (51.7%) isolates of C. jejuni and 49 (42.1%) isolates of C. coli, while 220 (61.1%) isolates of C. jejuni and 73 (62.9%) isolates of C. coli were resistant to tetracycline. Three C. coli (2.6%) but none of the C. jejuni isolates harboured 23S mutations predicting reduced susceptibility to erythromycin. Multidrug resistance (MDR), defined as harbouring genetic determinants for resistance to at least three unrelated antimicrobial classes, was found in 10 (8.6%) C. coli isolates but not in any C. jejuni isolates. Co-resistance to ciprofloxacin and erythromycin was predicted in 1.7% of C. coli isolates. Overall, the percentages of isolates with genetic AMR determinants found in this study were similar to those reported in the previous survey year (August 2016 to July 2017), where testing was based on phenotypic break-point testing. Multi-drug resistance was similar to that found in the previous survey years. It is recommended that trends in AMR in Campylobacter spp. isolates from retail chickens continue to be monitored to realise any increasing resistance of concern, particularly to erythromycin (macrolide). Considering that the percentage of fresh, whole chicken from non-major retailer stores in the UK that are highly contaminated (at more than 1000 cfu per g) with Campylobacter spp. continues to be above that in samples from major retailers, more action, including consideration of interventions such as improved biosecurity and slaughterhouse measures, is needed to achieve better control of Campylobacter spp. for this section of the industry. The FSA has indicated that the retail proxy target for the percentage of highly contaminated retail chickens should be less than 7%, and while continued monitoring has demonstrated a sustained decline for chickens from major retailer stores, chicken on sale in other stores has yet to meet this target.
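The headline estimate of 10.8% highly contaminated samples out of 1009 is a survey proportion, and a quick way to attach sampling uncertainty to such a figure is a Wilson score interval, sketched below for approximately 109 of 1009 samples. This is an illustrative calculation only and is not part of the survey's published statistical analysis.

    import math

    def wilson_interval(successes, n, z=1.96):
        """95% Wilson score confidence interval for a binomial proportion."""
        p_hat = successes / n
        denom = 1.0 + z ** 2 / n
        centre = (p_hat + z ** 2 / (2 * n)) / denom
        half = z * math.sqrt(p_hat * (1 - p_hat) / n + z ** 2 / (4 * n ** 2)) / denom
        return centre - half, centre + half

    # Approximately 10.8% of 1009 samples above 1000 cfu per g: about 109 chickens.
    print(wilson_interval(successes=109, n=1009))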
APA, Harvard, Vancouver, ISO, and other styles