Academic literature on the topic 'EZ. None of these, but in this section'


Journal articles on the topic "EZ. None of these, but in this section"

1

Mete, Maurizio, Alessandro Alfano, Emilia Maggio, Massimo Guerriero, and Grazia Pertile. "Inverted ILM Flap for the Treatment of Myopic Macular Holes: Healing Processes and Morphological Changes in Comparison with Complete ILM Removal." Journal of Ophthalmology 2019 (June 2, 2019): 1–8. http://dx.doi.org/10.1155/2019/1314989.

Abstract:
Purpose. To investigate the microstructural changes after successful myopic macular hole (MMH) surgery, comparing the inverted ILM flap and complete ILM removal techniques, and their association with visual function. Methods. Spectral-domain optical coherence tomography (SD-OCT) was used to evaluate both external limiting membrane (ELM) and ellipsoid zone (EZ) recovery in 40 eyes of 39 patients who underwent pars plana vitrectomy with either the inverted internal limiting membrane flap technique (n=27) or complete ILM removal (n=13) to achieve MH closure. The association between ELM and EZ recovery and visual acuity was also investigated. The patients were followed up at 1 year. Results. ELM and EZ were recovered in 72% and 62% of cases, respectively, regardless of the surgical technique, 1 year after surgery. A strong positive association between ELM and EZ recovery and mean BCVA was found: regardless of the surgical technique, this was statistically significant at each time point (p<0.05). None of the baseline variables were found to act as predictive factors for either ELM or EZ recovery. Conclusion. The inverted ILM flap technique did not affect the MMH healing processes compared to complete ILM removal. Thus, the presence of the ILM plug did not interfere with the restoration of either the ELM or the EZ, which correlated with functional recovery.
2

Kessel, Adam, Taylor Ryan McFarland, Nicolas Sayegh, Kathryn Morton, Deepika Sirohi, Manish Kohli, Umang Swami, Roberto Nussenzveig, Neeraj Agarwal, and Benjamin Louis Maughan. "Randomized phase II trial of radium-223 (RA) plus enzalutamide (EZ) versus EZ alone in metastatic castration-refractory prostate cancer (mCRPC): Final efficacy and safety results." Journal of Clinical Oncology 39, no. 6_suppl (February 20, 2021): 135. http://dx.doi.org/10.1200/jco.2021.39.6_suppl.135.

Abstract:
Background: We previously reported that treatment with EZ+RA was associated with a decline in serum bone metabolism markers (BMM), which correlated with improved outcomes compared to EZ alone (Agarwal N et al., Clinical Cancer Research, 2020, PMID 31937614). Here we report the final efficacy and safety results for this trial. Methods: In this phase 2 trial (NCT02199197), patients (pts) with progressive mCRPC were treated with EZ (160 mg daily) ±RA (standard dose of 55 kBq/kg IV Q4 weeks x 6) until disease progression or unacceptable toxicities. The primary objectives of change in bone markers and safety have been reported previously. Secondary objectives included comparison of PSA progression-free survival (PFS), overall survival (OS), and long-term safety in all pts receiving RA+EZ vs EZ alone. Post hoc analyses included comparison of PSA-PFS2 (defined as the time from start of protocol therapy to PSA progression on subsequent therapy or death, whichever occurred earlier), time to subsequent/next therapy (TTNT), and long-term safety. Survival analysis and log-rank tests were performed using the R statistical package v.4.0.2 (https://www.r-project.org). Statistical significance was defined as P<0.05. Results: Between 08/2014 and 11/2017, 47 pts were eligible and enrolled. Median follow-up was 22 months (range 3.2-71.5). Thirty-five pts received RA+EZ and 12 pts received EZ alone. Receipt of prior abiraterone was allowed and was balanced between the two groups: 60% in RA+EZ vs. 64% in EZ pts. Final efficacy results: TTNT and PSA-PFS2 were significantly improved in the RA+EZ pts over EZ alone pts, and all other efficacy parameters were numerically improved in RA+EZ pts (Table). Final safety results: none of the 12 EZ alone pts had any fracture; two of 35 RA+EZ pts were found to have an incidental grade 1 asymptomatic fracture at the site of bone metastasis on routine imaging, at 15 and 31 months respectively after the last dose of RA, and did not require any intervention. No patients developed bone marrow disorders during the follow-up period. Efficacy and safety data will be elaborated during the meeting. Conclusions: In our study, EZ+RA resulted in significant long-term clinical benefit over EZ alone in pts with mCRPC without compromising safety. *NA & BLM: equal contribution. Clinical trial information: NCT02199197. [Table: see text]
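The log-rank comparison described in this abstract can be illustrated with a minimal sketch. This is not the trial's analysis code (which used R); it is a generic two-sample log-rank chi-square statistic in Python, written here for reference:

```python
def logrank(times1, events1, times2, events2):
    """Two-sample log-rank chi-square statistic (1 degree of freedom).
    times*: follow-up times; events*: 1 if the event occurred, 0 if censored."""
    data = ([(t, e, 0) for t, e in zip(times1, events1)]
            + [(t, e, 1) for t, e in zip(times2, events2)])
    event_times = sorted({t for t, e, _ in data if e})
    o_minus_e = 0.0   # observed minus expected events in group 1
    variance = 0.0
    for t in event_times:
        n1 = sum(1 for tt, _, g in data if tt >= t and g == 0)  # at risk, group 1
        n2 = sum(1 for tt, _, g in data if tt >= t and g == 1)  # at risk, group 2
        d1 = sum(1 for tt, e, g in data if tt == t and e and g == 0)
        d2 = sum(1 for tt, e, g in data if tt == t and e and g == 1)
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue
        o_minus_e += d1 - d * n1 / n
        variance += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / variance if variance > 0 else 0.0
```

The statistic exceeds 3.84 (the critical value of χ² with 1 df) exactly when the two survival curves differ at the P<0.05 threshold the abstract uses.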
3

Kato, Naoki, Ichiro Yuki, Toshihiro Ishibashi, Ayako Ikemura, Issei Kan, Kengo Nishimura, Tomonobu Kodama, et al. "Visualization of stent apposition after stent-assisted coiling of intracranial aneurysms using high resolution 3D fusion images acquired by C-arm CT." Journal of NeuroInterventional Surgery 12, no. 2 (August 12, 2019): 192–96. http://dx.doi.org/10.1136/neurintsurg-2019-014966.

Abstract:
Purpose: We used an imaging technique based on 3-dimensional (3D) C-arm CT to assess the apposition of three types of stents after coiling of intracranial aneurysms. Methods: We considered all patients with intracranial aneurysms who received stent-assisted coiling with Enterprise2, Neuroform EZ, or Neuroform Atlas stents, confirmed by C-arm CT imaging at our institution between June 2015 and November 2017. A 3D digital subtraction angiography (DSA) scan for vessel imaging followed by a high-resolution cone beam CT (HR-CBCT) scan for coil and stent imaging was performed. The images were fused to obtain dual volume 3D fusion images. We investigated malapposition of the stent trunk (crescent sign) and of the stent edges (edge malapposition) and used the χ2 statistic to test for an association with stent type. Inter-rater agreement between two raters was estimated using Cohen’s kappa statistic. Results: We evaluated 75 consecutive cases. Enterprise2 stents were used in 22 cases, Neuroform EZ in 26, and Neuroform Atlas in 27 cases. By stent type, crescent sign was detected in 27% of Enterprise2, 8% of Neuroform EZ, and none of the Neuroform Atlas stents (p=0.007), while edge malapposition was detected in 27% of Enterprise2, 58% of Neuroform EZ, and 30% of Neuroform Atlas stents (p=0.05). Excellent (κ=0.81) and good (κ=0.78) agreement between the raters was found for the detection of edge apposition and crescent sign, respectively. Conclusion: Stent malapposition was clearly visualized by dual volume 3D imaging. The Neuroform Atlas stents showed good apposition even in vessels with strong curvature.
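Cohen's kappa, the inter-rater agreement measure used in this study, is straightforward to compute from two raters' labels. A minimal illustrative sketch (not the authors' code):

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance.
    rater_a and rater_b are equal-length lists of category labels."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    # observed agreement: fraction of items both raters labelled identically
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement: product of each rater's marginal label frequencies
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e) if p_e < 1 else 1.0
```

Values of κ around 0.78–0.81, as reported above, indicate substantial-to-excellent agreement beyond chance.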
4

Firstenberg-Eden, Ruth, and Nadine M. Sullivan. "EZ Coli Rapid Detection System: A Rapid Method for the Detection of Escherichia coli O157 in Meat and Other Foods." Journal of Food Protection 60, no. 3 (March 1, 1997): 219–25. http://dx.doi.org/10.4315/0362-028x-60.3.219.

Abstract:
The EZ Coli™ Rapid Detection System consists of a selective enrichment medium and a rapid immunological detection kit. After being incubated for 15 to 24 h at 40 to 42°C, an Escherichia coli O157 culture was at a sufficient cell concentration (> 10⁶ CFU/ml) to be tested with the EZ Coli Detection Kit. In studies of foods seeded with E. coli O157, all 42 strains of E. coli O157 tested positive with the detection kit. None of the 29 strains of non-O157 E. coli tested positive with the kit. Species of Citrobacter, Hafnia, and Klebsiella grew in the medium but tested negative. Of the 47 strains of non-E. coli O157 tested, only two strains of Salmonella O Group N grew and tested positive with the kit. Several laboratories evaluated the EZ Coli System with 378 clean and naturally contaminated food samples (mainly raw beef), and 337 different food samples, including raw meats (beef, pork, turkey, and chicken), dairy products, spices, vegetables, and apple cider, spiked with 50 different strains of E. coli O157 (1 to 100 CFU/25 g). Of these samples, 44.6% were positive and 52.2% were negative. The false-positive rate was 1.7% and the false-negative rate was 1.5%. The data show that high levels of coliforms (> 10⁶ CFU/g) in food samples may impede the detection of low levels (1 to 10 CFU/25 g) of E. coli O157 organisms in broth, thereby causing false-negative reactions with most detection systems. The EZ Coli Rapid Detection System provides a rapid and specific means of detecting E. coli O157 in raw and processed foods.
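The percentages reported in this abstract are all shares of the total sample count (44.6 + 52.2 + 1.7 + 1.5 ≈ 100). A small illustrative helper, not part of the study, makes the relationships between the counts and the rates explicit:

```python
def detection_rates(tp, fp, tn, fn):
    """Summary rates for a presence/absence detection kit.
    Following the abstract's convention, false-positive and false-negative
    rates are expressed as fractions of ALL samples tested."""
    total = tp + fp + tn + fn
    return {
        "positive": (tp + fp) / total,
        "negative": (tn + fn) / total,
        "false_positive_rate": fp / total,
        "false_negative_rate": fn / total,
        "sensitivity": tp / (tp + fn),  # true positives among truly contaminated
        "specificity": tn / (tn + fp),  # true negatives among truly clean
    }
```

With made-up counts tp=3, fp=1, tn=4, fn=2, for example, the helper returns a false-positive rate of 0.1 and a sensitivity of 0.6.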
5

Hadriche, Abir, Ichrak Behy, Amal Necibi, Abdennaceur Kachouri, Chokri Ben Amar, and Nawel Jmail. "Assessment of Effective Network Connectivity among MEG None Contaminated Epileptic Transitory Events." Computational and Mathematical Methods in Medicine 2021 (December 28, 2021): 1–14. http://dx.doi.org/10.1155/2021/6406362.

Abstract:
Characterizing epileptogenic zones (EZ; the sources responsible for excessive discharges) would assist a neurologist during epilepsy diagnosis. Efficiently locating these abnormal sources among magnetoencephalography (MEG) biomarkers is achieved by several inverse problem techniques, which rest on different assumptions and yield particular epileptic network connectivity. Here, we propose to evaluate the performance of distributed inverse problem techniques in defining the EZ. First, we applied an advanced technique based on Singular Value Decomposition (SVD) to recover only pure transitory activities (interictal epileptiform discharges). We evaluated the technique's robustness in separating transitories from ripples versus frequency range, transitory shape, and signal-to-noise ratio on simulated data (depicting both epileptic biomarkers and respecting the time series and spectral properties of realistic data). We validated the technique on MEG signals using detector precision on 5 patients. Then, we applied four inverse problem methods to define the cortical areas and neural generators of excessive discharges, and computed the network connectivity of each technique. We then confronted the obtained noninvasive networks with intracerebral EEG (IEEG) transitory network connectivity using nodes in common, connection strength, distance metrics between concordant MEG and IEEG nodes, and average propagation delay. Coherent Maximum Entropy on the Mean (cMEM) showed the closest match between MEG network connectivity and IEEG based on the distance between active sources, followed by exact low-resolution brain electromagnetic tomography (eLORETA), dynamical statistical parametric mapping (dSPM), and minimum norm estimation (MNE). Clinical performance was interesting for all methods, with on average 73.5% of active sources detected in depth also seen in MEG and, vice versa, about 77.15% of active sources detected from MEG also seen in IEEG. The investigated inverse problem techniques succeed in finding at least one part of the seizure onset zone. dSPM and eLORETA depict the highest connection strength among all techniques. The propagation delay varies in the range [18, 25] ms, with eLORETA ensuring the lowest propagation delay (18 ms) and the closest one to the IEEG propagation delay.
6

Lam, J., P. Tomaszewski, G. Gilbert, JT Moreau, M. Guiot, J. Farmer, J. Atkinson, et al. "P.147 Evaluation of Arterial Spin Labeling (ASL) perfusion imaging in poorly-defined focal epilepsy in children." Canadian Journal of Neurological Sciences / Journal Canadien des Sciences Neurologiques 49, s1 (June 2022): S46. http://dx.doi.org/10.1017/cjn.2022.231.

Abstract:
Background: Poorly-defined cases (PDCs) of focal epilepsy are cases with no/subtle MRI abnormalities or have abnormalities extending beyond the lesion visible on MRI. Here, we evaluated the utility of Arterial Spin Labeling (ASL) MRI perfusion in PDCs of pediatric focal epilepsy. Methods: ASL MRI was obtained in 25 consecutive children presenting with poorly-defined focal epilepsy (20 MRI-positive, 5 MRI-negative). Qualitative visual inspection and quantitative analysis with asymmetry and Z-score maps were used to detect perfusion abnormalities. ASL results were compared to the hypothesized epileptogenic zone (EZ) derived from other clinical/imaging data and the resection zone in patients with Engel I/II outcome and >18-month follow-up. Results: Qualitative analysis revealed perfusion abnormalities in 17/25 total cases (68%), 17/20 MRI-positive cases (85%) and none of the MRI-negative cases. Quantitative analysis confirmed all cases with abnormalities on qualitative analysis, but found 1 additional true-positive and 4 false-positives. Concordance with the surgically-proven EZ was found in 10/11 cases qualitatively (sensitivity=91%, specificity=50%), and 11/11 cases quantitatively (sensitivity=100%, specificity=23%). Conclusions: ASL perfusion may support the hypothesized EZ, but has limited localization benefit in MRI-negative cases. Nevertheless, owing to its non-invasiveness and ease of acquisition, ASL could be a useful addition to the pre-surgical MRI evaluation of pediatric focal epilepsy.
7

Lam, J., P. Tomaszewski, G. Gilbert, JT Moreau, M. Guiot, S. Albrecht, J. Farmer, et al. "P.173 Evaluation of Arterial Spin Labeling (ASL) Perfusion Imaging in Poorly-Defined Focal Epilepsy in Children." Canadian Journal of Neurological Sciences / Journal Canadien des Sciences Neurologiques 48, s3 (November 2021): S69. http://dx.doi.org/10.1017/cjn.2021.449.

Abstract:
Background: Poorly-defined cases (PDCs) of focal epilepsy are cases with no/subtle MRI abnormalities or have abnormalities extending beyond the lesion visible on MRI. Here, we evaluated the utility of Arterial Spin Labeling (ASL) MRI perfusion in PDCs of pediatric focal epilepsy. Methods: ASL MRI was obtained in 25 consecutive children presenting with poorly-defined focal epilepsy (20 MRI-positive, 5 MRI-negative). Qualitative visual inspection and quantitative analysis with asymmetry and Z-score maps were used to detect perfusion abnormalities. ASL results were compared to the hypothesized epileptogenic zone (EZ) derived from other clinical/imaging data and the resection zone in patients with Engel I/II outcome and >18-month follow-up. Results: Qualitative analysis revealed perfusion abnormalities in 17/25 total cases (68%), 17/20 MRI-positive cases (85%) and none of the MRI-negative cases. Quantitative analysis confirmed all cases with abnormalities on qualitative analysis, but found 1 additional true-positive and 4 false-positives. Concordance with the surgically-proven EZ was found in 10/11 cases qualitatively (sensitivity=91%, specificity=50%), and 11/11 cases quantitatively (sensitivity=100%, specificity=23%). Conclusions: ASL perfusion may support the hypothesized EZ, but has limited localization benefit in MRI-negative cases. Nevertheless, owing to its non-invasiveness and ease of acquisition, ASL could be a useful addition to the pre-surgical MRI evaluation of pediatric focal epilepsy.
8

Agirreazkuenaga Zigorraga, Iñaki. "La Administración no puede beneficiarse de su silencio." RVAP 87-88, no. 87-88 (August 1, 2010): 25–45. http://dx.doi.org/10.47623/ivap-rvap.87.88.2010.01.

Abstract:
An Administration that creates legal uncertainty through clearly illegal conduct cannot invoke that uncertainty in its own favour, seeking to gain advantages over those who suffer the effects of the uncertainty created by its silence. To this effect, the six-month period established in section 46.1 in fine of the LJCA should not take effect against the principle of effective judicial protection, and if the Administration does not want the limitation period of the infringement to run, it need only fulfil its obligation to decide the administrative appeal, so that the penalty becomes final and, at the same time, there is no possibility of the infringement becoming time-barred.
9

Mutaqin, Kikin, Jana L. Comer, Astri C. Wayadande, Ulrich Melcher, and Jacqueline Fletcher. "Selection and characterization of Spiroplasma citri mutants by random transposome mutagenesis." Canadian Journal of Microbiology 57, no. 6 (June 2011): 525–32. http://dx.doi.org/10.1139/w11-026.

Abstract:
Phytopathogenic spiroplasmas can multiply in vascular plants and insects. A deeper understanding of this dual-host life could be furthered through the identification, by random mutagenesis, of the spiroplasma genes it requires. The ability of the EZ::TN™ <DHFR-1> Tnp transposome™ system to create random insertional mutations in the genome of Spiroplasma citri was evaluated. The efficiency of electroporation-mediated transformation of S. citri BR3-3X averaged 28.8 CFUs/ng transposome for 10⁹ spiroplasma cells. Many transformants appearing on the selection plates were growth impaired when transferred to broth. Altering the broth composition in various ways did not improve their growth. However, placing colonies into a small broth volume resulted in robust growth and successful subsequent passages of a subset of transformants. PCR using primers for the dihydrofolate reductase gene confirmed the transposon’s presence in the genomes of selected transformants. Southern blot hybridization and nucleotide sequencing suggested that insertion was random within the chromosome and usually at single sites. The insertions were stable. Growth rates of all transformants were lower than that of the wild-type S. citri, but none lost the ability to adhere to a Circulifer tenellus (CT-1) cell line. The EZ::TN™ <DHFR-1> Tnp transposome™ system represents an additional tool for genetic manipulation of the fastidious spiroplasmas.
10

Hankinson, R. J. "The Sceptical Inquirer." History of Philosophy and Logical Analysis 23, no. 1 (September 8, 2020): 74–99. http://dx.doi.org/10.30965/26664275-02301007.

Abstract:
This article treats of whether scepticism, in particular Pyrrhonian scepticism, can be said to deploy a method of any kind. I begin by distinguishing various different notions of method, and their relations to the concept of expertise (section 1). I then (section 2) consider Sextus’s account, in the prologue to Outlines of Pyrrhonism, of the Pyrrhonist approach, and how it supposedly differs from those of other groups, sceptical and otherwise. In particular, I consider the central claim that the Pyrrhonist is a continuing investigator (section 3), who in spite of refusing to be satisfied with any answer (or none), none the less still achieves tranquillity, and whether this can avoid being presented as a method for so doing, and hence as compromising the purity of sceptical suspension of commitment (section 4). In doing so, I relate—and contrast—the Pyrrhonists’ account of their practice to the ‘Socratic Method’ (section 5), as well as to the argumentative practice of various Academics (section 6), and assess their claim in so doing to be offering a way of instruction (section 7). I conclude (section 8) that there is a consistent and interesting sense in which Pyrrhonian scepticism can be absolved of the charge that it incoherently, and crypto-dogmatically, presents itself as offering a method for achieving an intrinsically desirable goal.

Dissertations / Theses on the topic "EZ. None of these, but in this section"

1

Ebenezer, Catherine. "“Access denied”? Barriers for staff accessing, using and sharing published information online within the National Health Service (NHS) in England: technology, risk, culture, policy and practice." Thesis, University of Sheffield, 2017. http://eprints.rclis.org/32585/1/CME%20thesis%20v4.0%20White%20Rose%20final%20single%20volume.pdf.

Abstract:
The overall aim of the study was to investigate barriers to online professional information seeking, use and sharing occurring within the NHS in England, their possible effects (upon education, working practices, working lives and clinical and organisational effectiveness), and possible explanatory or causative factors. The investigation adopted a qualitative case study approach, using semi-structured interviews and documentary analysis as its methods, with three NHS Trusts of different types (acute - district general hospital, mental health / community, acute – teaching) as the nested sites of data collection. It aimed to be both exploratory and explanatory. A stratified sample of participants, including representatives of professions whose perspectives were deemed to be relevant, and clinicians with educational or staff development responsibilities, was recruited for each Trust. Three non-Trust specialists (the product manager of a secure web gateway vendor, an academic e-learning specialist, and the senior manager at NICE responsible for the NHS Evidence electronic content and web platform) were also interviewed. Policy documents, statistics, strategies, reports and quality accounts for the Trusts were obtained via public websites, from participants or via Freedom of Information requests. Thematic analysis following the approach of Braun and Clarke (2006) was adopted as the analytic method for both interviews and documents. The key themes of the results that emerged are presented: barriers to accessing and using information, education and training, professional cultures and norms, information governance and security, and communications policy. The findings are discussed under three main headings: power, culture, trust and risk in information security; use and regulation of Web 2.0 and social media, and the system of professions. 
It became evident that the roots of problems with access to and use of such information lay deep within the culture and organisational characteristics of the NHS and its use of IT. A possible model is presented to explain the interaction of the various technical and organisational factors that were identified as relevant. A number of policy recommendations are put forward to improve access to published information at Trust level, as well as recommendations for further research.
2

Boyer, Sebastien. "Dans le cadre du nouveau cycle de combustible $^{232}$Th/$^{233}$U, détermination de la section efficace de capture radiative $^{233}$Pa(n,$\gamma$) pour des énergies de neutrons comprises entre 0 et 1 MeV." PhD thesis, Université Sciences et Technologies - Bordeaux I, 2004. http://tel.archives-ouvertes.fr/tel-00009062.

Abstract:
With a view to the sustainable development of nuclear energy, one of the research themes of the CNRS laid down by the 1991 Bataille law is the study of a new nuclear fuel cycle based on thorium ore ($^{232}$Th), in which the fissile nucleus is $^{233}$U. The main advantage of this type of fuel lies in its capacity to produce transuranic waste in much smaller quantities than current pressurized water reactors. However, some important nuclear data concerning this new cycle are very poorly known, for example those relating to the pivotal nucleus protactinium-233 ($^{233}$Pa). Its 27-day half-life gives it a particular role in the cycle, but because of its very high activity the study of this nucleus is an experimental challenge. To circumvent this difficulty, the gamma-ray emission probability in the neutron-induced reaction $^{233}$Pa(n,$\gamma$) for neutron energies between 0 and 1 MeV was determined from the transfer reaction $^{232}$Th($^{3}$He,p)$^{234}$Pa*. The measurement setup made it possible to identify the outgoing particle, thereby tagging the reaction channel, while C$_{6}$D$_{6}$ scintillators detected the emitted gamma rays in coincidence. The analysis of the gamma events required weighting the photon spectra with calculated mathematical functions known as "weighting functions". Determining them nevertheless requires a perfect knowledge of the behaviour of the scintillators (efficiency, response functions) in the chosen geometry. To this end, a preliminary study was carried out using gamma sources and proton-induced reactions on light nuclei. The simulations, performed with the MCNP transport code, were validated by experimental results.
3

Park, Allison M. "Comparing the Cognitive Demand of Traditional and Reform Algebra 1 Textbooks." Scholarship @ Claremont, 2011. http://scholarship.claremont.edu/hmc_theses/9.

Abstract:
Research has shown that students achieved higher standardized test scores in mathematics and gained more positive attitudes towards mathematics after learning from reform curricula. Because these studies involve actual students and teachers, there are classroom variables that are involved in these findings (Silver and Stein, 1996; Stein et al., 1996). To understand how much these curricula by themselves contribute to higher test scores, I have studied the cognitive demand of tasks in two traditional and two reform curricula. This work required the creation of a scale to categorize tasks based on their level of cognitive demand. This scale relates to those by Stein, Schoenfeld, and Bloom. Based on this task analysis, I have found that more tasks in the reform curricula require higher cognitive demand than tasks in the traditional curricula. These findings confirm other results that posing tasks with higher cognitive demand to students can lead to higher student achievement.
4

Zurita, Sánchez Juan Manuel. "El paradigma otletiano como base de un modelo para la organización y difusión del conocimiento científico." Thesis, Universidad Nacional Autónoma de México, 2001. http://eprints.rclis.org/6752/1/tesina._juan_manuel_zurita.pdf.

5

Cho, Karina Elle. "Enhancing the Quandle Coloring Invariant for Knots and Links." Scholarship @ Claremont, 2019. https://scholarship.claremont.edu/hmc_theses/228.

Abstract:
Quandles, which are algebraic structures related to knots, can be used to color knot diagrams, and the number of these colorings is called the quandle coloring invariant. We strengthen the quandle coloring invariant by considering a graph structure on the space of quandle colorings of a knot, and we call our graph the quandle coloring quiver. This structure is a categorification of the quandle coloring invariant. Then, we strengthen the quiver by decorating it with Boltzmann weights. Explicit examples of links that show that our enhancements are proper are provided, as well as background information in quandle theory.
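The quandle coloring invariant described in this abstract can be computed by brute force for small diagrams. A minimal illustrative sketch (not the thesis code) using the dihedral quandle, where the crossing relation is x ▷ y = 2y − x (mod n):

```python
from itertools import product

def quandle_colorings(n_arcs, crossings, n):
    """Count colorings of a knot diagram by the dihedral quandle R_n,
    in which x acted on by the over-arc color y gives 2y - x (mod n).
    Each crossing is a triple (over, under_in, under_out) of arc indices."""
    count = 0
    for colors in product(range(n), repeat=n_arcs):
        # every crossing must satisfy under_out = 2*over - under_in (mod n)
        if all((2 * colors[over] - colors[u_in]) % n == colors[u_out]
               for over, u_in, u_out in crossings):
            count += 1
    return count

# Standard trefoil diagram: 3 arcs, 3 crossings.
trefoil = [(0, 1, 2), (1, 2, 0), (2, 0, 1)]
```

For R_3 this recovers the classical count of nine Fox 3-colorings for the trefoil, versus three for the unknot (one arc, no crossings), so the invariant distinguishes the two.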
6

Mebane, Palmer. "Uniquely Solvable Puzzles and Fast Matrix Multiplication." Scholarship @ Claremont, 2012. https://scholarship.claremont.edu/hmc_theses/37.

Abstract:
In 2003 Cohn and Umans introduced a new group-theoretic framework for doing fast matrix multiplications, with several conjectures that would imply the matrix multiplication exponent $\omega$ is 2. Their methods have been used to match one of the fastest known algorithms by Coppersmith and Winograd, which runs in $O(n^{2.376})$ time and implies that $\omega \leq 2.376$. This thesis discusses the framework that Cohn and Umans came up with and presents some new results in constructing combinatorial objects called uniquely solvable puzzles that were introduced in a 2005 follow-up paper, and which play a crucial role in one of the $\omega = 2$ conjectures.
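The Cohn–Umans framework targets the same exponent ω that Strassen's classical scheme first pushed below 3. For context, a sketch of the standard 7-multiplication 2×2 base case (textbook formulas, not part of the thesis):

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices (nested lists) with Strassen's scheme:
    7 scalar multiplications instead of the naive 8."""
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]
```

Applied recursively to matrices split into 2×2 blocks, the 7 multiplications per level yield the $O(n^{\log_2 7}) \approx O(n^{2.807})$ bound, the first improvement over $\omega = 3$.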
7

Lubisco, Nídia Maria Lienert. "La evaluación en la biblioteca universitaria brasileña: evolución y propuesta de mejora." Thesis, Universidad Carlos III De Madrid, 2007. http://eprints.rclis.org/12225/1/tesisnidia.pdf.

Abstract:
This thesis examines the evolution of the evaluation of the university library in Brazil. At present, the only applicable system takes place in the evaluation of academic by the Ministry of Education, through the National Institute for Studies and Research Educacionais Anísio Teixeira (INEP), the entity responsible for coordinating and implementing the process of evaluating the entire education system in Brazil. The starting point for addressing this investigation were the results of a case study developed at the Federal University of Bahia in 2001, under the assumption that the Ministry did not have criteria and appropriate instruments to reflect the library's role as a resource for educational qualifications. The review of the literature relevant Brazilian, British and Spanish regarding the subject, especially regarding experience with the Spanish University Libraries Network (REBIUN), were essential to know the trends and the state of affairs. It also took out a field work through questionnaires in 7 countries and 31 Latin American universities, in order to ascertain the current status of evaluation systems used in that region. The low response rate, about 50% of institutions surveyed, this was offset by the work of the Symposium on Electronic Library evaluator and Quality (Sociedad Argentina de report, 2002), an event of great significance to who came over 450 librarians and teachers. This initiative helped develop a comparative study that served to place the Brazilian reality and take into account the most similar case mix. All this enabled comply with the overall objective of this thesis: develop a model to evaluate university libraries in Brazil, from the state of affairs of the university library and Brazilian experiences located in Latin American countries. 
The model is based on the INEP instrument (2006), but its content is expanded, taking into account several other works: the performance indicators for assessing the university library of REBIUN (2000); one of the PNBU documents, prepared by Maria Carmen Romcy de Carvalho (1995); the Standards for university libraries in Chile (2001); and the Methodological guide for evaluating the libraries of institutions of higher education (Mexico, 2000). The final aim is for the Brazilian Ministry to have a more comprehensive evaluation system, to be implemented and improved according to the data collected, which in turn could serve to build a management information bank and a system of performance indicators.
APA, Harvard, Vancouver, ISO, and other styles
8

Moreira, Walter. "A construção de informações documentárias: aportes da linguística documentária, da terminologia e das ontologias." Thesis, Universidade de São Paulo, 2010. http://eprints.rclis.org/17437/1/TeseFinalRevisada_05Jul2010.pdf.

Full text
Abstract:
This thesis investigates theoretical and practical interfaces between terminology, philosophical ontology, computational ontology, and documentary linguistics, and the subsidies they offer for the construction of documentary information. Its specific objectives are the analysis of the production, development, implementation, and use of ontologies based on information science theories; research on the contribution of ontologies to the development of thesauri, and vice versa; and a discussion of the philosophical foundations of the application of ontologies, based on the study of the ontological categories present in classical philosophy and in contemporary proposals. It argues that understanding ontologies through the communicative theory of terminology contributes to organizing access to information that is less quantitative (syntactic) and more qualitative (semantic). It notes that, in spite of sharing some common goals, there is little dialogue between information science (and, within it, documentary linguistics) and computer science. It argues that computational and philosophical ontologies are not completely independent phenomena that share only a name, and notes that the discussion of categories and categorization in computer science does not always receive the emphasis it does in information science studies on knowledge representation. Deleuze and Guattari's notion of the rhizome is treated as an instigator of reflections on the validity of the hierarchical tree model and the possibilities of its expansion.
It concludes that the construction of ontologies cannot ignore the terminological and conceptual analysis, as understood by terminology and by information science, accumulated in the theoretical and methodological basis for the construction of indexing languages; on the other hand, the construction of flexible indexing languages cannot ignore the representational model of ontologies, which is better suited to formalization and interoperability.
APA, Harvard, Vancouver, ISO, and other styles
9

Ollendorff, Christine. "Construction d'un diagnostic complexe d'une bibliothèque académique." Thesis, Ecole nationale supérieure d'arts et métiers - ENSAM, 1999. http://eprints.rclis.org/11682/1/These-co.pdf.

Full text
Abstract:
In the first part, we study academic libraries, whose founding principles are changing with the information society. The library manager looks for the best ways to lead a library. Existing evaluation and management tools for libraries are sources of dissatisfaction because of the lack of a holistic vision of libraries they provide. Constructivist approaches to management seem an alternative way to help library managers design a strategic vision of these organizations. In the second part, we build a modeling project drawing on Jean-Louis Le Moigne's complex systems modeling and Peter Checkland's soft systems methodology. French academic libraries constitute our research field, in which we use participant observation and interviews with staff and managers. Our system consists of the academic library as seen by its leader. We build a three-phase diagnostic. In the first phase, the manager can observe the supply of services as a three-axis system that evolves with technologies, material and human means, and library science knowledge. The second phase studies the informational system through observations of the environment and of flows. The third phase studies the decisional system, examining its organization, its internal information circulation, and its decisional processes. This diagnostic is in continuous construction: it operates within the continuity of library management and is not the first step of strategic planning. We applied the diagnostic in five libraries. Results show what modeling can supply through the creation of evolving, transposable, and learning models.
APA, Harvard, Vancouver, ISO, and other styles
10

Fachin, Juliana. "Acesso à Informação Pública nos Arquivos Públicos Estaduais." Thesis, 2014. http://eprints.rclis.org/23504/1/Disserta%C3%A7%C3%A3o-Juliana%20-%2017-07-2014.pdf.

Full text
Abstract:
This exploratory, descriptive, and qualitative study analyzes public information access on the websites of the Brazilian State Public Archives using the criteria of Law n. 12.527 of November 2011 (Lei de Acesso à Informação, LAI), focusing on the dissemination of information by electronic means. It considers whether the State Public Archives feature on their websites any indication of the law on access to public information; characterizes the mission of the State Public Archives; investigates their use of informational public policy; detects the managers' perception of the importance and need of using websites to provide information to users; and identifies the views of the managers of the State Public Archives on Law n. 12.527. The study was conducted from August to December 2013. The study population consists of the 26 State Public Archives; the survey sample was restricted to the 15 State Public Archives that have a working website. Two tools were used to collect data: an analysis of the websites of the 15 State Public Archives, and a structured questionnaire with open questions sent by e-mail to the managers of those archives. Tables and spreadsheets were used to synthesize the collected data, and Bardin's (1977) method was used for content analysis of the questionnaire responses. The analysis of the websites found that six State Archives indicated the law. In characterizing the mission of the Public Archives, two aspects of implementation were identified: one is document management, and the other is the provision of access to information. The website analysis also found that 10 archives indicated the use of informational public policy. The responses to the questionnaire highlighted the importance of using the website as a way to broaden the dissemination of information.
The managers' opinions indicated the importance and applicability of the LAI to archives, exposing points in the law that need to be revised, as well as the need for greater investment in its implementation. It was concluded that: a) the State Public Archives are undergoing transformations in a scenario pervaded by the law on information access, changing gradually to meet the requirements of its applicability; b) as for their profile, the archives are organs subordinate to the public administration, dependent on public funds, whose purpose is to organize information and make it accessible; c) trends relate to internal demands for staff skills (enforcement of the law) and for the use of interactive communication and information tools, such as websites, blogs, social networks, and dynamic databases. The State Public Archives are gradually adapting to the new informational demands required by the LAI.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "EZ. None of these, but in this section"

1

Mathematics of probability. Providence, Rhode Island: American Mathematical Society, 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

1949-, Libgober A. (Anatoly), Cogolludo-Agustín José Ignacio, and Hironaka Eriko 1962-, eds. Topology of algebraic varieties and singularities: Conference in honor of Anatoly Libgober's 60th birthday, June 22-26, 2009, Jaca, Huesca, Spain. Providence, R.I: American Mathematical Society, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Center for Mathematics at Notre Dame and American Mathematical Society, eds. Topology and field theories: Center for Mathematics at Notre Dame: summer school and conference, Topology and field theories, May 29-June 8, 2012, University of Notre Dame, Notre Dame, Indiana. Providence, Rhode Island: American Mathematical Society, 2014.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ahmed, S. E. (Syed Ejaz), 1957- editor of compilation, ed. Perspectives on big data analysis: Methodologies and applications : International Workshop on Perspectives on High-Dimensional Data Analysis II, May 30-June 1, 2012, Centre de Recherches Mathématiques, Université de Montréal, Montréal, Québec, Canada. Providence, Rhode Island: American Mathematical Society, 2014.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Riemann surfaces by way of complex analytic geometry. Providence, R.I: American Mathematical Society, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Conference Board of the Mathematical Sciences and National Science Foundation (U.S.), eds. Lectures on the energy critical nonlinear wave equation. Providence, Rhode Island: Published for the Conference Board of the Mathematical Sciences by the American Mathematical Society, with support from the National Science Foundation, 2015.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

The ultimate challenge: The 3x+1 problem. Providence, R.I: American Mathematical Society, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

From riches to raags: 3-manifolds, right-angled Artin groups, and cubical geometry. Providence, Rhode Island: Published for the Conference Board of the Mathematical Sciences by the American Mathematical Society, Providence, Rhode Island with support from the National Science Foundation, 2012.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Morse, Robert Fitzgerald, editor of compilation, Nikolova-Popova, Daniela, 1952- editor of compilation, and Witherspoon, Sarah J., 1966- editor of compilation, eds. Group theory, combinatorics and computing: International Conference in honor of Daniela Nikolova-Popova's 60th birthday on Group Theory, Combinatorics and Computing, October 3-8, 2012, Florida Atlantic University, Boca Raton, Florida. Providence, Rhode Island: American Mathematical Society, 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Flannery, D. L. (Dane Laurence), 1965-, ed. Algebraic design theory. Providence, R.I: American Mathematical Society, 2011.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "EZ. None of these, but in this section"

1

Smithies, Declan. "Luminosity." In The Epistemic Role of Consciousness, 345–79. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780199917662.003.0011.

Full text
Abstract:
Chapter 11 defends the thesis that some phenomenal and epistemic conditions are luminous in the sense that you’re always in a position to know whether or not they obtain. Section 11.1 draws a distinction between epistemic and doxastic senses of luminosity and argues that some conditions are epistemically luminous even if none are doxastically luminous. Section 11.2 uses this distinction in solving Ernest Sosa’s version of the problem of the speckled hen. The same distinction is applied to Timothy Williamson’s anti-luminosity argument in section 11.3, his argument against epistemic iteration principles in section 11.4, and his argument for improbable knowing in section 11.5. Section 11.6 concludes by explaining why this defense of luminosity is not merely a pointless compromise.
APA, Harvard, Vancouver, ISO, and other styles
2

"Literary Selections." In Roots of the Black Chicago Renaissance, edited by Richard A. Courage and Christopher Robert Reed, 245. University of Illinois Press, 2020. http://dx.doi.org/10.5622/illinois/9780252043055.003.0013.

Full text
Abstract:
Editors’ Note: Our study concludes with a section comprising three literary selections that we intend to break new ground for a scholarly collection. None of the selections is a conventional academic essay, each belongs to a different genre of writing, and each amplifies the light already shone on the roots of the Black Chicago Renaissance....
APA, Harvard, Vancouver, ISO, and other styles
3

Phillips, David. "The Critique of Common-Sense Morality (Methods III.I–III.XI)." In Sidgwick's The Methods of Ethics, 96–119. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780197539613.003.0006.

Full text
Abstract:
This chapter focuses on The Methods of Ethics, Book III, Chapters I through XI. Its topic is Sidgwick’s critique of the theory he calls “common-sense morality” or “dogmatic intuitionism.” The first section introduces this theory, noting problems with both Sidgwick’s names for it. The second section discusses the project of Chapters I through XI, making the principles of common-sense morality clear and precise. The third section introduces the four conditions Sidgwick uses to test putative axioms. The fourth section outlines his argument that none of the principles of common-sense morality satisfy the four conditions, focusing on the example of promissory obligation. The fifth section considers two responses: defending absolutist deontology, and moving to moderate deontology by introducing, as W. D. Ross does, the concept of prima facie duty. The final section raises questions about the fairness of Sidgwick’s critique.
APA, Harvard, Vancouver, ISO, and other styles
4

Collins, Richard B., Dale A. Oesterle, and Lawrence Friedman. "Schedule." In The Colorado State Constitution, 457–62. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780190907723.003.0030.

Full text
Abstract:
This chapter discusses the “Schedule” of the Colorado Constitution. Transition to statehood required that territorial institutions and law be retained until expressly replaced. At the end of the original constitution, twenty-two sections under the heading of Schedule detailed how the transition should work. Although almost entirely obsolete, none has been repealed. Schedule Section 1 was invoked by enterprising defense lawyers in efforts to get their clients off on a technicality. At least one succeeded. Section 20, requiring that presidential electors “be chosen by direct vote of the people,” could be read as obsolete or as a continuing constitutional rule. It is the only section with possible relevance to a current dispute.
APA, Harvard, Vancouver, ISO, and other styles
5

Knee, Jonathan A. "Lessons from Clown School." In Class Clowns, 202–32. Columbia University Press, 2016. http://dx.doi.org/10.7312/columbia/9780231179287.003.0007.

Full text
Abstract:
The final section of Class Clowns tries to draw some conclusions from the analyses and case studies. Ten lessons are drawn that should provide guardrails to anyone seeking to play in the education domain. Broadly, the failures highlighted fall into three broad buckets: misguided ambition, misunderstood markets and faulty execution. None of these phenomena are unique to education, but their application takes on a very particular shape in the educational context.
APA, Harvard, Vancouver, ISO, and other styles
6

Anne Franks, Mary. "“Not Where Bodies Live”." In Free Speech in the Digital Age, 137–49. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780190883591.003.0009.

Full text
Abstract:
John Perry Barlow, one of the founders of the Electronic Frontier Foundation (EFF), famously claimed in 1996 that the internet “is a world that is both everywhere and nowhere, but it is not where bodies live.” The conception of cyberspace as a realm of pure expression has encouraged an aggressively anti-regulatory approach to the internet. This approach was essentially codified in U.S. federal law in Section 230 of the Communications Decency Act, which invokes free speech principles to provide broad immunity for online intermediaries against liability for the actions of those who use their services. The free speech frame has encouraged an abstract approach to online conduct that downplays its material conditions and impact. Online intermediaries use Section 230 as both a shield and a sword—simultaneously avoiding liability for the speech of others while benefiting from that speech. In the name of free expression, Section 230 allows powerful internet corporations to profit from harmful online conduct while absorbing none of its costs.
APA, Harvard, Vancouver, ISO, and other styles
7

Schram, Frederick R., and Stefan Koenemann. "Mictacea." In Evolution and Phylogeny of Pancrustacea, 362–69. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780195365764.003.0028.

Full text
Abstract:
Mictacea demonstrates the confusion that occurs when animals share so many primitive features. Modest Guţu in fact erected a number of orders to accommodate them, but these do not seem entirely justified. Although the two orders erected by Guţu currently do not have much support, in fairness it must be recognized that molecules, or information from yet-to-be-recognized mictacean taxa, could change the situation. In this case, Guţu’s Bochusacea and Cosinzeneacea might yet prove useful. They inhabit anchialine caves and the deep sea. The animals are small, slender, and subcylindrical in outline. The diagnosis of Guţu to the contrary, none of these animals appear particularly “flattened” in cross-section.
APA, Harvard, Vancouver, ISO, and other styles
8

Sarch, Alexander. "The Scope of the Willful Ignorance Doctrine (I)." In Criminally Ignorant, 85–108. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780190056575.003.0004.

Full text
Abstract:
Determining when the equal culpability thesis holds sets the boundaries in which the willful ignorance doctrine is to be applied. Chapter 3 thus considers the best existing attempts to specify the conditions in which the equal culpability thesis holds, but proceeds to argue that none succeeds. Still, each failure is instructive. First, the chapter argues against the unrestricted equal culpability thesis. Not all willful ignorance, it turns out, is as culpable as the analogous knowing misconduct. Then the chapter argues against the three leading attempts to restrict the thesis. Section II argues against a restriction that appeals to bad motives, while Section III argues against a common counterfactual restriction (according to which willful ignorance is as culpable as knowing misconduct when one would do the actus reus even with knowledge). The latter proposal fails since criminal culpability does not depend on considerations about counterfactual conduct or one’s willingness to misbehave. Finally, Section IV discusses a third restriction, offered by Deborah Hellman, which asks whether the decision to remain in ignorance was itself justified. This version of the thesis is on the right track, but still requires refinement in important ways.
APA, Harvard, Vancouver, ISO, and other styles
9

Fabre, Cécile. "Building Blocks." In Spying Through a Glass Darkly, 12–36. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780198833765.003.0002.

Full text
Abstract:
This chapter sets the stage for the remainder of the book. It provides an overview of the ethics of spying in classical moral and political thought. It examines three possible approaches to espionage: (a) the dirty-hands approach, which sees espionage as a necessary evil; (b) the contractarian view that espionage is best defended by appeal to normative conventions endorsed by all; (c) the view that the most fruitful way to construct an ethics of espionage is through the lens of just-war theory. None of those approaches are fully satisfactory, though they provide building blocks for the approach taken in this book. The final section sets out the normative principles on which the book rests.
APA, Harvard, Vancouver, ISO, and other styles
10

Adams, Robert Merrihew. "God and Possibilities." In What Is, and What Is In Itself, 194–212. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780192856135.003.0012.

Full text
Abstract:
Section 11.1 of this chapter discusses how (and whether!) God can know all possibilities without actualizing all of them. For if unactualized conscious states and processes are to be permanent objects of actual thoughts in an eternal mind, how different is that from their being actual conscious states and processes of that mind? It is argued that discursive knowledge of logic and mathematics is not problematic in this way—and in section 11.2 that “intuitive” knowledge of qualities raises additional issues, but none that God could not manage. Section 11.3 points out that God does not need to know nearly as much about possibilities if God need not and does not try to create a “best possible world,” but is motivated by grace and love for actual creatures. Section 11.4 addresses two questions about causal explanation: First, the teleological question: “What am I doing, or at least trying to do, and why am I trying to do it?” I think I often know an at least roughly right answer to that question. The second and much harder question is “Why is it that I can do what I am trying to do, if indeed I can?” Occasionalism implies that it depends on God. That is where this line of explanations stops. No causal explanation of God’s omnipotence is on offer.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "EZ. None of these, but in this section"

1

Al-Shuwaikhat, Hisham I., Shaker A. Al-Buhassan, Turki F. Al-Saadoun, and Saad M. Al-Driweesh. "Optimizing Production Facilities Using None-Radio Active Source MPFM in Ghawar Field in Saudi Aramco." In SPE Saudi Arabia section Young Professionals Technical Symposium. Society of Petroleum Engineers, 2008. http://dx.doi.org/10.2118/117063-ms.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Siewiorek, Gail M., and Ender A. Finol. "The Effect of Embolic Protection Filters on Distal Embolization and Slow Flow Condition in Carotid Artery Stenting." In ASME 2007 Summer Bioengineering Conference. American Society of Mechanical Engineers, 2007. http://dx.doi.org/10.1115/sbc2007-176534.

Full text
Abstract:
Strokes are the third leading cause of death in the United States today. Carotid artery stenting (CAS) used in conjunction with a cerebral protection device (CPD) is a current alternative treatment for severe carotid artery disease. A type of CPD, an embolic protection filter (EPF), has received attention recently due to its allowance of distal perfusion during use. This investigation studied the effects of four EPFs (Spider RX, FilterWire EZ, RX Accunet, Emboshield) on both pressure gradient and flow rate in the internal carotid artery (ICA) in vitro. Dyed polymer microspheres larger than the pore size of the devices tested were injected into the ICA of a 70% stenosed carotid artery model. The percentage of particles missed was calculated. None of the devices tested were able to completely prevent embolization. Emboshield had the least desirable performance (missing 35.4% of particles) while Spider RX had the best (missing 0.06%). A decrease in flow rate and an increase in pressure drop were seen after the device was filled with particles. From these results, it is inferred that improper wall apposition is the primary cause for inadequate capture efficiency rates, which may lead to an increase in distal embolization and stroke.
APA, Harvard, Vancouver, ISO, and other styles
3

Colucci, José A., Agustín Irizarry-Rivera, and Efrain O'Neill-Carrilo. "Sustainable Energy @ Puerto Rico." In ASME 2007 Energy Sustainability Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/es2007-36010.

Full text
Abstract:
During the last 15 years, a renewed interest and growth in renewable energy (RE) processes has emerged, driven by strong environmental movements, concerns over oil dependence and depletion, and, lately, national security concerns. Several RE technologies, such as wind, niche photovoltaics, and biodiesel, are presently very competitive in certain applications versus their oil counterparts, especially in Europe and certain locations in the mainland United States. Others, such as fuel cells, are slowly penetrating certain markets. The discussion section gives an overview of the most mature RE technologies, focusing on their potential implementation in Puerto Rico. It also includes findings from an ongoing study of the municipality of Caguas, which is becoming the sustainable model for Puerto Rico, including energy. The overall analysis includes elements of social, technical, cultural, political, and economic criteria; for the latter, capital costs, operating costs, and footprint are considered. Sensitivity analyses are also performed regarding the energy generation potential of these processes. The technologies included are photovoltaics, wind energy, fuel cells, concentrated solar power, and solar thermal water heating; these are referred to as near-term implementation technologies. Other medium- and long-term ocean energy technologies are discussed, including tide, waves, and ocean thermal. The last discussion subsection briefly considers the area of transportation fuels (gasoline and diesel). The final section presents an implementation plan for these processes, including the capabilities and potential role of the University of Puerto Rico at Mayagüez (UPRM) in this Puerto Rican SAGA (Sol, Aire, Gente, and Agua).
APA, Harvard, Vancouver, ISO, and other styles
4

Weaver, Matthew M., Michael G. Dunn, and Tab Heffernan. "Experimental Determination of the Influence of Foreign Particle Ingestion on the Behavior of Hot-Section Components Including Lamilloy®." In ASME 1996 International Gas Turbine and Aeroengine Congress and Exhibition. American Society of Mechanical Engineers, 1996. http://dx.doi.org/10.1115/96-gt-337.

Full text
Abstract:
The effect of foreign particle ingestion on hot-section components has been investigated in a series of experiments performed using a one-quarter-sector F100-PW-100 annular combustor. The combustor was operated so that the engine corrected conditions were duplicated. The experiments were designed to determine the influence of cooling hole size, hole roughness (laser-drilled vs. EDM), combustor exit temperature, and dust concentration on the cooling capability of the component in the presence of dust particles. Cylindrical Lamilloy® specimens, film-cooled Inconel cylinders, and F100-PW-220 first-stage turbine vanes were investigated. The size distribution of the foreign particles injected into the air stream just upstream of the combustor was consistent with distributions measured at the exit of high compressors of large turbofan engines. The foreign material was a soil composition representative of materials found around the world. Pre- and post-test bench flow measurements suggested that none of the test specimens' cooling passages or holes were clogged by the foreign particles. Foreign particle deposition on the specimens was observed, and approximate values of the minimum combustor exit temperature and minimum specimen wall temperature at which deposition occurs were measured.
APA, Harvard, Vancouver, ISO, and other styles
5

Urdaneta, Mario, Alfonso Ortega, and Russel V. Westphal. "Experiments and Modeling of the Hydraulic Resistance of In-Line Square Pin Fin Heat Sinks With Top By-Pass Flow." In ASME 2003 International Electronic Packaging Technical Conference and Exhibition. ASMEDC, 2003. http://dx.doi.org/10.1115/ipack2003-35268.

Full text
Abstract:
Extensive experiments were performed aimed at obtaining physical insight into the behavior of in-line pin fin heat sinks with pins of square cross-section. Detailed pressure measurements were made inside an array of square pins in order to isolate the inlet, developing, fully developed, and exit static pressure distributions as a function of row number. With this as background data, overall pressure drop was measured for a self-consistent set of aluminum heat sinks in side-inlet, side-exit flow with top clearance only. Pin heights of 12.5 mm, 17.5 mm, and 22.5 mm, pin pitches of 3.4 mm to 6.33 mm, and pin thicknesses of 1.5 mm, 2 mm, and 2.5 mm were evaluated, with base dimensions kept fixed at 25 × 25 mm; in total, 20 aluminum heat sinks were evaluated. A "two-branch bypass model" was developed by allowing inviscid acceleration of the flow in the bypass section and using pressure loss coefficients obtained under no-bypass conditions in the heat sink section. The experimental data compared well with the proposed hydraulic model. Measurements in the array of pins showed that full development of the flow occurs after nine rows, thus indicating that none of the heat sinks tested could be characterized as fully developed.
APA, Harvard, Vancouver, ISO, and other styles
6

Hu, Kenny S. Y., and Tom I.-P. Shih. "Large-Eddy and RANS Simulations of Heat Transfer in a U-Duct With a High-Aspect Ratio Trapezoidal Cross Section." In ASME Turbo Expo 2018: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/gt2018-75535.

Full text
Abstract:
Large-eddy and RANS simulations were performed to examine the details of the heat-transfer mechanisms in a U-duct with a high-aspect-ratio trapezoidal cross section at a Reynolds number of 20,000. ANSYS-Fluent was used to perform the simulations. For the large-eddy simulations (LES), the WALE subgrid-scale model was employed, and its inflow boundary condition was provided by a concurrent LES of incompressible fully developed flow in a straight duct with the same cross section and flow conditions as the U-duct. The grid resolution required to obtain meaningful LES solutions was determined via a grid sensitivity study of incompressible fully developed turbulent flow in a straight duct of square cross section, where data from direct numerical simulation (DNS) and experiments are available to validate and guide the simulation. In addition, the grid used satisfies Celik's criterion and resolves Kolmogorov's −5/3 law. Results were also obtained for the U-duct by using RANS, and three widely used turbulence models were examined: realizable k-ε with the two-layer model in the near-wall region, the shear-stress transport (SST) model, and the stress-omega full Reynolds stress model (RSM). Results obtained from LES showed unsteady flow separation occurring immediately after the turn region, which none of the RANS models could predict. By capturing this unsteady flow mechanism, LES was able to predict the measured heat transfer downstream of the U-duct. The maximum relative error in the predicted local heat-transfer coefficient was less than 10% in the LES results, but up to 80% in the RANS results.
APA, Harvard, Vancouver, ISO, and other styles
7

Carroll, L. Blair, and Moe S. Madi. "Crack Detection Program on the Cromer to Gretna, Manitoba Section of Enbridge Pipelines Inc. Line 3." In 2000 3rd International Pipeline Conference. American Society of Mechanical Engineers, 2000. http://dx.doi.org/10.1115/ipc2000-186.

Full text
Abstract:
Enbridge Pipelines Inc. Line 3 is an 860mm (34 inch), API 5LX Grade X52 pipeline with nominal wall thickness ranging from 7.14 mm to 12.5 mm. The Canadian portion of the Line runs from Edmonton, Alberta to Gretna, Manitoba. It was constructed between 1963 and 1969 in a series of loops designed to increase the capacity of the Enbridge system. Until 1999 the pipeline operated in a looped configuration with neighboring 24 inch and 48 inch pipelines. Line 3 downstream of Kerrobert, Saskatchewan began operating in straight 34 inch configuration in 1999 following completion of the first phase expansion project known as Terrace Expansion Project that connects the (48 inch) loops with a new (36 inch) pipeline. In 1997, the Pipetronix (now PII) Ultrascan CD in-line inspection tool was run for 283 km from Cromer to Gretna, Manitoba, to identify long seam cracking and pipe body stress corrosion cracking. This section of the line is comprised primarily of pipe manufactured with a double submerged arc welded long seam with short sections of pipe having electric resistance welded long seams. There were two primary objectives set forth in this inspection project. The first was to assess the integrity of this section of Line 3 and identify anomalies, which might affect the future operation of the pipeline. The second objective was to evaluate the performance of the Ultrascan CD tool and determine its potential role in the Enbridge integrity program. A series of excavations have been conducted based upon the analysis of this data and none of the indications identified were considered to be an immediate concern to the integrity of the pipeline. Notably, two of the excavations resulted in the detection of the first two “significant” SCC colonies (based upon the CEPA definition of significant) [1] found on the Enbridge system. 
This paper will focus on the tool performance requirements established by Enbridge prior to the inspection run, which include specific defect types and sizes and defects at the maximum sensitivity of the tool. It will also present the information obtained as a result of the excavation program and the onsite inspections and assessments. The information gathered from this program was useful in better understanding the tool's tolerance in detecting such defects and in better differentiating between them.
APA, Harvard, Vancouver, ISO, and other styles
8

Bouhairie, Salem, Siddharth Talapatra, and Kevin Farrell. "Turbulent Flow in a No-Tube-in-Window Shell-and-Tube Heat Exchanger: CFD vs PIV." In ASME 2014 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/imece2014-36902.

Full text
Abstract:
A research-scale shell-and-tube heat exchanger housing a no-tube-in-window (NTIW) arrangement of tubes is analyzed using ANSYS® FLUENT. Three-dimensional, computational fluid dynamic (CFD) simulations of adiabatic flow in a periodic section of the exchanger were conducted. The numerical results were compared to particle image velocimetry (PIV) measurements in the window region where tubes are not present. As part of the study, the k-epsilon with scalable wall function, k-omega with shear stress transport (SST), Reynolds Stress (RSM), and Scale Adaptive Simulation (SAS) turbulence models were assessed. Each turbulence model showed some similarities with the recorded phenomena, but none fully captured the complexity of the flow field outside of the tube bundle. Additional simulations of an entire NTIW exchanger model were performed to examine the flow behavior between the window and crossflow regions, as window momentum flux, ρu², limits are a concern for safe mechanical design.
APA, Harvard, Vancouver, ISO, and other styles
9

Bajwa, Christopher S., and Earl P. Easton. "Potential Effects of Historic Rail Accidents on the Integrity of Spent Nuclear Fuel Transportation Packages." In ASME 2009 Pressure Vessels and Piping Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/pvp2009-77811.

Full text
Abstract:
The US Nuclear Regulatory Commission (NRC) completed an analysis of historical rail accidents (from 1975 to 2005) involving hazardous materials and long-duration fires in the United States. The analysis was initiated to determine what types of accidents had occurred and what impact those types of accidents could have on the rail transport of spent nuclear fuel. The NRC found that almost 21 billion miles of freight rail shipments over a 30-year period had resulted in a small number of accidents involving the release of hazardous materials, eight of which involved long-duration fires. All eight of the accidents analyzed resulted in fires that were less severe than the “fully engulfing fire” described as a hypothetical accident condition in the NRC regulations for radioactive material transport found in Title 10 of the Code of Federal Regulations, Part 71, Section 73. None of the eight accidents involved a release of radioactive material. This paper describes the eight accidents in detail and examines the potential effects on spent nuclear fuel transportation packages exposed to the fires that resulted from these accidents.
APA, Harvard, Vancouver, ISO, and other styles
10

Kochan, P., and H. J. Carmichael. "Photon-statistics dependence of single-atom absorption." In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1993. http://dx.doi.org/10.1364/oam.1993.wxx.8.

Full text
Abstract:
Under usual laboratory conditions the absorption of a beam of light by a dilute two-level medium depends on the intensity and spectrum of the incident light. If the incident photon flux is small compared to the decay rate of the absorber’s excited state, saturation effects can be neglected and only the spectrum of the light is important; thus, weak-field absorption depends only on the first-order correlation function of the incident light. This result relies, however, on a requirement that the incident beam cross-section be much larger than the absorption cross-section of an atom. Then individual atoms act as low-efficiency scatterers and the statistics of the scattering process is governed by the law of large numbers applied to many scattering sites. In contrast, if it is arranged that just one atom significantly absorbs (scatters) a beam of photons, correlation functions of the incident light beyond the first order must be considered. For a beam that is focused within an absorption cross-section, we might naively predict that every photon will be scattered and none transmitted. In fact, even for a very weak beam there is a finite probability for two photons to arrive at the atom within the excited-state lifetime, and in this event at least one of them will be transmitted (statistical saturation). In this paper we calculate the transmitted photon flux for a weak beam of photons focused strongly onto a single, resonant two-state atom. We study the dependence of the transmitted flux on the statistics of the incident photons. Photon beams derived from sources of coherent, chaotic, squeezed, coherent-antibunched, and broadband-antibunched light are considered. Our calculations are based on a recently developed theory of cascaded open systems [1, 2], and serve to illustrate the usefulness of this theory.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "EZ. None of these, but in this section"

1

Adams, Sunny, Madison Story, and Adam Smith. Evaluation of 11 buildings in the Fort McCoy cantonment. Engineer Research and Development Center (U.S.), September 2022. http://dx.doi.org/10.21079/11681/45350.

Full text
Abstract:
The United States Congress codified the National Historic Preservation Act of 1966 (NHPA), the nation’s most effective cultural resources legislation to date, mostly through establishing the National Register of Historic Places (NRHP). The NHPA requires federal agencies to address their cultural resources, which are defined as any prehistoric or historic district, site, building, structure, or object. Section 110 of the NHPA requires federal agencies to inventory and evaluate their cultural resources, and Section 106 requires them to determine the effect of federal undertakings on those potentially eligible for the NRHP. Fort McCoy is in west-central Wisconsin, entirely within Monroe County. It was first established as the Sparta Maneuver Tract in 1909. The post was renamed Camp McCoy in 1926. Since 1974, it has been known as Fort McCoy. This report provides historic context and determinations of eligibility for buildings in the cantonment constructed between 1946 and 1975 and concludes that none are eligible for the NRHP. In consultation with the Wisconsin State Historic Preservation Officer (WISHPO), this work fulfills Section 110 requirements for these buildings.
APA, Harvard, Vancouver, ISO, and other styles
2

Price, Roz. Climate Adaptation: Lessons and Insights for Governance, Budgeting, and Accountability. Institute of Development Studies (IDS), December 2020. http://dx.doi.org/10.19088/k4d.2022.008.

Full text
Abstract:
This rapid review draws on literature from academic, policy and non-governmental organisation sources. There is a huge literature on climate governance issues in general, but less is known about effective support and the political economy of adaptation. The literature base and case studies on climate finance accountability and budgeting in governments are nascent and growing. Section 2 of this report briefly discusses governance of climate change issues, with a focus on the complexity and cross-cutting nature of climate change compared to the often static organisational landscape of government structured along sectoral lines. Section 3 explores green public financial management (PFM). Section 4 then brings together several principles and lessons learned on green PFM highlighted in the guidance notes. Transparency and accountability lessons are then highlighted in Section 5. The key findings are: 1) Engaging with the governance context and the political economy of climate governance and financing is crucial to climate objectives being realised. 2) More attention is needed on whether and how governments are prioritising adaptation and resilience in their own operations. 3) Countries in Africa further along in the green PFM agenda give accounts of reform approaches that are gradual, iterative and context-specific, building on existing PFM systems and their functionality. 4) A well-functioning “accountability ecosystem” is needed in which state and non-state accountability actors engage with one another. 5) Climate change finance accountability systems and ecosystems in countries are at best emerging. 6) Although case studies from Nepal, the Philippines and Bangladesh are commonly cited in the literature and are seen as some of the most advanced developing country examples of green PFM, none of the countries have had significant examples of collaboration and engagement between actors.
7) Lessons and guiding principles for green PFM reform include: use the existing budget cycle and legal frameworks; ensure that the basic elements of a functional PFM system are in place; strong leadership of the Ministry of Finance (MoF) and clear linkages with the overall PFM reform agenda are needed; smart sequencing of reforms; real political ownership and clearly defined roles and responsibilities; and good communication to stakeholders.
APA, Harvard, Vancouver, ISO, and other styles
3

Goeckeritz, Joel, Nathan Schank, Ryan L Wood, Beverly L Roeder, and Alonzo D Cook. Use of Urinary Bladder Matrix Conduits in a Rat Model of Sciatic Nerve Regeneration after Nerve Transection Injury. Science Repository, December 2022. http://dx.doi.org/10.31487/j.rgm.2022.03.01.

Full text
Abstract:
Previous research has demonstrated the use of single-channel porcine-derived urinary bladder matrix (UBM) conduits in segmental-loss, peripheral nerve repairs as comparable to criterion-standard nerve autografts. This study aimed to replicate and expand upon this research with additional novel UBM conduits and coupled therapies. Fifty-four Wistar Albino rats were divided into 6 groups, and each underwent a surgical neurectomy to remove a 7-millimeter section of the sciatic nerve. Bridging of this nerve gap and treatment for each group was as follows: i) reverse autograft—the segmented nerve was reversed 180 degrees and used to reconnect the proximal and distal nerve stumps; ii) the nerve gap was bridged via a silicone conduit; iii) a single-channel UBM conduit; iv) a multi-channel UBM conduit; v) a single-channel UBM conduit identical to group 3 coupled with fortnightly transcutaneous electrical nerve stimulation (TENS); or vi) a multi-channel UBM conduit identical to group 4 coupled with fortnightly TENS. The extent of nerve recovery was assessed by behavioural parameters: foot fault asymmetry scoring, measured weekly for six weeks; electrophysiological parameters: compound muscle action potential (CMAP) amplitudes, measured at weeks 0 and 6; and morphological parameters: total fascicle areas, myelinated fiber counts, fiber densities, and fiber sizes, measured at week 6. All of the above parameters demonstrated recovery in the test groups (3-6) as either comparable to or less than that of the reverse autograft, but none was shown to outperform the reverse autograft. As such, UBM conduits may yet prove to be an effective treatment to repair relatively short segmental peripheral nerve injuries, but further research is required to demonstrate greater efficacy over nerve autografts.
APA, Harvard, Vancouver, ISO, and other styles
4

Goeckeritz, Joel, Nathan Schank, Ryan L Wood, Beverly L Roeder, and Alonzo D Cook. Use of Urinary Bladder Matrix Conduits in a Rat Model of Sciatic Nerve Regeneration after Nerve Transection Injury. Science Repository, December 2022. http://dx.doi.org/10.31487/j.rgm.2022.03.01.sup.

Full text
Abstract:
Previous research has demonstrated the use of single-channel porcine-derived urinary bladder matrix (UBM) conduits in segmental-loss, peripheral nerve repairs as comparable to criterion-standard nerve autografts. This study aimed to replicate and expand upon this research with additional novel UBM conduits and coupled therapies. Fifty-four Wistar Albino rats were divided into 6 groups, and each underwent a surgical neurectomy to remove a 7-millimeter section of the sciatic nerve. Bridging of this nerve gap and treatment for each group was as follows: i) reverse autograft—the segmented nerve was reversed 180 degrees and used to reconnect the proximal and distal nerve stumps; ii) the nerve gap was bridged via a silicone conduit; iii) a single-channel UBM conduit; iv) a multi-channel UBM conduit; v) a single-channel UBM conduit identical to group 3 coupled with fortnightly transcutaneous electrical nerve stimulation (TENS); or vi) a multi-channel UBM conduit identical to group 4 coupled with fortnightly TENS. The extent of nerve recovery was assessed by behavioural parameters: foot fault asymmetry scoring, measured weekly for six weeks; electrophysiological parameters: compound muscle action potential (CMAP) amplitudes, measured at weeks 0 and 6; and morphological parameters: total fascicle areas, myelinated fiber counts, fiber densities, and fiber sizes, measured at week 6. All of the above parameters demonstrated recovery in the test groups (3-6) as either comparable to or less than that of the reverse autograft, but none was shown to outperform the reverse autograft. As such, UBM conduits may yet prove to be an effective treatment to repair relatively short segmental peripheral nerve injuries, but further research is required to demonstrate greater efficacy over nerve autografts.
APA, Harvard, Vancouver, ISO, and other styles
5

Friedler, Haley S., Michelle B. Leavy, Eric Bickelman, Barbara Casanova, Diana Clarke, Danielle Cooke, Andy DeMayo, et al. Outcome Measure Harmonization and Data Infrastructure for Patient-Centered Outcomes Research in Depression: Data Use and Governance Toolkit. Agency for Healthcare Research and Quality (AHRQ), October 2021. http://dx.doi.org/10.23970/ahrqepcwhitepaperdepressiontoolkit.

Full text
Abstract:
Executive Summary: Patient registries are important tools for advancing research, improving healthcare quality, and supporting health policy. Registries contain vast amounts of data that could be used for new purposes when linked with other sources or shared with researchers. This toolkit was developed to summarize current best practices and provide information to assist registries interested in sharing data. The contents of this toolkit were developed based on review of the literature, existing registry practices, interviews with registries, and input from key stakeholders involved in the sharing of registry data. While some information in this toolkit may be relevant in other countries, this toolkit focuses on best practices for sharing data within the United States. Considerations related to data sharing differ across registries depending on the type of registry, registry purpose, funding source(s), and other factors; as such, this toolkit describes general best practices and considerations rather than providing specific recommendations. Finally, data sharing raises complex legal, regulatory, operational, and technical questions, and none of the information contained herein should be substituted for legal advice. The toolkit is organized into three sections: “Preparing to Share Data,” “Governance,” and “Procedures for Reviewing and Responding to Data Requests.” The section on “Preparing to Share Data” discusses the role of appropriate legal rights to further share the data and the need to follow all applicable ethical regulations. Registries should also prepare for data sharing activities by ensuring data are maintained appropriately and developing policies and procedures for governance and data sharing.
The “Governance” section describes the role of governance in data sharing and outlines key governance tasks, including defining and staffing relevant oversight bodies; developing a data request process; reviewing data requests; and overseeing access to data by the requesting party. Governance structures vary based on the scope of data shared and registry resources. Lastly, the section on “Procedures for Reviewing and Responding to Data Requests” discusses the operational steps involved in sharing data. Policies and procedures for sharing data may depend on what types of data are available for sharing and with whom the data can be shared. Many registries develop a data request form for external researchers interested in using registry data. When reviewing requests, registries may consider whether the request aligns with the registry’s mission/purpose, the feasibility and merit of the proposed research, the qualifications of the requestor, and the necessary ethical and regulatory approvals, as well as administrative factors such as costs and timelines. Registries may require researchers to sign a data use agreement or other such contract to clearly define the terms and conditions of data use before providing access to the data in a secure manner. The toolkit concludes with a list of resources and appendices with supporting materials that registries may find helpful.
APA, Harvard, Vancouver, ISO, and other styles
7

Stavland, Arne, Siv Marie Åsen, Arild Lohne, Olav Aursjø, and Aksel Hiorth. Recommended polymer workflow: Lab (cm and m scale). University of Stavanger, November 2021. http://dx.doi.org/10.31265/usps.201.

Full text
Abstract:
Polymer flooding is one of the most promising EOR methods (Smalley et al. 2018). It is well known and has been used successfully (Pye 1964; Standnes & Skjevrak 2014; Sheng et al. 2015). From a technical perspective we recommend that polymer flooding should be considered as a viable EOR method on the Norwegian Continental Shelf for the following reasons: 1. More oil can be produced with less water injected; this is particularly important for the NCS, which is currently producing more water than oil. 2. Polymers will increase the areal sweep and improve the ultimate recovery, provided a proper injection strategy. 3. Many polymer systems are available, and it should be possible to tailor their chemical composition to a wide range of reservoir conditions (temperature and salinity). 4. Polymer systems can be used to block water from short-circuiting between injection and production wells. 5. Polymer combined with low-salinity injection water has many benefits: a lower polymer concentration can be used to reach the target viscosity, less mechanical degradation, less adsorption, and a potential reduction in Sor due to a low-salinity wettability effect. There are some hurdles with polymer flooding that need to be considered: 1. Many polymer systems are not at present considered green chemicals; thus, reinjection of produced water is needed. However, results from polymer degradation studies in the IORCentre indicate that: a. High-molecular-weight polymers are quickly degraded to low molecular weight. In case of accidental release to the ocean, low-molecular-weight polymers are diluted and the lifetime of the spill might be quite short. According to Caulfield et al. (2002), HPAM is not toxic and will not degrade to the more environmentally problematic acrylamide. b. The DF report for environmental impact contains case studies using the DREAM model to predict the transport of chemical spills. 
This model is coupled with polymer (sun-exposure) degradation data from the IORCentre to quantify the lifetime of polymer spills. This approach should be used for specific field cases to quantify the environmental risk factor. 2. Care must be taken when preparing the polymer solution offshore. Chokes and valves might be a challenge but can be mitigated, according to the results from the large-scale testing done in the IORCentre (Stavland et al. 2021). None of the above-mentioned challenges is severe enough to rule out polymer flooding. HPAM is neither toxic, bio-accumulable, nor bio-persistent, and the CO2 footprint from a polymer flood may be significantly less than that of a water flood (Dupuis et al. 2021). There are at least two contributing factors to this statement, to which we will return in detail in the next section: i) during linear displacement, polymer injection will produce more oil for the same amount of water injected, hence the lifetime of the field can be shortened; ii) polymers increase the areal sweep, reducing the need for wells.
APA, Harvard, Vancouver, ISO, and other styles
8

Mazzoni, Silvia, Nicholas Gregor, Linda Al Atik, Yousef Bozorgnia, David Welch, and Gregory Deierlein. Probabilistic Seismic Hazard Analysis and Selecting and Scaling of Ground-Motion Records (PEER-CEA Project). Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, November 2020. http://dx.doi.org/10.55461/zjdn7385.

Full text
Abstract:
This report is one of a series of reports documenting the methods and findings of a multi-year, multi-disciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) and funded by the California Earthquake Authority (CEA). The overall project is titled “Quantifying the Performance of Retrofit of Cripple Walls and Sill Anchorage in Single-Family Wood-Frame Buildings,” henceforth referred to as the “PEER–CEA Project.” The overall objective of the PEER–CEA Project is to provide scientifically based information (e.g., testing, analysis, and resulting loss models) that measure and assess the effectiveness of seismic retrofit to reduce the risk of damage and associated losses (repair costs) of wood-frame houses with cripple wall and sill anchorage deficiencies as well as retrofitted conditions that address those deficiencies. Tasks that support and inform the loss-modeling effort are: (1) collecting and summarizing existing information and results of previous research on the performance of wood-frame houses; (2) identifying construction features to characterize alternative variants of wood-frame houses; (3) characterizing earthquake hazard and ground motions at representative sites in California; (4) developing cyclic loading protocols and conducting laboratory tests of cripple wall panels, wood-frame wall subassemblies, and sill anchorages to measure and document their response (strength and stiffness) under cyclic loading; and (5) the computer modeling, simulations, and the development of loss models as informed by a workshop with claims adjustors. This report is a product of Working Group 3 (WG3), Task 3.1: Selecting and Scaling Ground-motion records. The objective of Task 3.1 is to provide suites of ground motions to be used by other working groups (WGs), especially Working Group 5: Analytical Modeling (WG5) for Simulation Studies. 
The ground motions used in the numerical simulations are intended to represent seismic hazard at the building site. The seismic hazard is dependent on the location of the site relative to seismic sources, the characteristics of the seismic sources in the region and the local soil conditions at the site. To achieve a proper representation of hazard across the State of California, ten sites were selected, and a site-specific probabilistic seismic hazard analysis (PSHA) was performed at each of these sites for both a soft soil (Vs30 = 270 m/sec) and a stiff soil (Vs30=760 m/sec). The PSHA used the UCERF3 seismic source model, which represents the latest seismic source model adopted by the USGS [2013] and NGA-West2 ground-motion models. The PSHA was carried out for structural periods ranging from 0.01 to 10 sec. At each site and soil class, the results from the PSHA—hazard curves, hazard deaggregation, and uniform-hazard spectra (UHS)—were extracted for a series of ten return periods, prescribed by WG5 and WG6, ranging from 15.5–2500 years. For each case (site, soil class, and return period), the UHS was used as the target spectrum for selection and modification of a suite of ground motions. Additionally, another set of target spectra based on “Conditional Spectra” (CS), which are more realistic than UHS, was developed [Baker and Lee 2018]. The Conditional Spectra are defined by the median (Conditional Mean Spectrum) and a period-dependent variance. A suite of at least 40 record pairs (horizontal) were selected and modified for each return period and target-spectrum type. Thus, for each ground-motion suite, 40 or more record pairs were selected using the deaggregation of the hazard, resulting in more than 200 record pairs per target-spectrum type at each site. The suites contained more than 40 records in case some were rejected by the modelers due to secondary characteristics; however, none were rejected, and the complete set was used. 
For the case of UHS as the target spectrum, the selected motions were modified (scaled) such that the average of the median spectrum (RotD50) [Boore 2010] of the ground-motion pairs follow the target spectrum closely within the period range of interest to the analysts. In communications with WG5 researchers, for ground-motion (time histories, or time series) selection and modification, a period range between 0.01–2.0 sec was selected for this specific application for the project. The duration metrics and pulse characteristics of the records were also used in the final selection of ground motions. The damping ratio for the PSHA and ground-motion target spectra was set to 5%, which is standard practice in engineering applications. For the cases where the CS was used as the target spectrum, the ground-motion suites were selected and scaled using a modified version of the conditional spectrum ground-motion selection tool (CS-GMS tool) developed by Baker and Lee [2018]. This tool selects and scales a suite of ground motions to meet both the median and the user-defined variability. This variability is defined by the relationship developed by Baker and Jayaram [2008]. The computation of CS requires a structural period for the conditional model. In collaboration with WG5 researchers, a conditioning period of 0.25 sec was selected as a representative of the fundamental mode of vibration of the buildings of interest in this study. Working Group 5 carried out a sensitivity analysis of using other conditioning periods, and the results and discussion of selection of conditioning period are reported in Section 4 of the WG5 PEER report entitled Technical Background Report for Structural Analysis and Performance Assessment. The WG3.1 report presents a summary of the selected sites, the seismic-source characterization model, and the ground-motion characterization model used in the PSHA, followed by selection and modification of suites of ground motions. 
The Record Sequence Numbers (RSN) and the associated scale factors are tabulated in the Appendices of this report, and the actual time-series files can be downloaded from the PEER Ground Motion Database portal (https://ngawest2.berkeley.edu/).
9

Jorgensen, Frieda, Andre Charlett, Craig Swift, Anais Painset, and Nicolae Corcionivoschi. A survey of the levels of Campylobacter spp. contamination and prevalence of selected antimicrobial resistance determinants in fresh whole UK-produced chilled chickens at retail sale (non-major retailers). Food Standards Agency, June 2021. http://dx.doi.org/10.46756/sci.fsa.xls618.

Full text
Abstract:
Campylobacter spp. are the most common bacterial cause of foodborne illness in the UK, with chicken considered to be the most important vehicle for this organism. The UK Food Standards Agency (FSA) agreed with industry to reduce Campylobacter spp. contamination in raw chicken and issued a target to reduce the prevalence of the most contaminated chickens (those with more than 1000 cfu per g chicken neck skin) to below 10% at the end of the slaughter process, initially by 2016. To help monitor progress, a series of UK-wide surveys were undertaken to determine the levels of Campylobacter spp. on whole UK-produced, fresh chicken at retail sale in the UK. The data obtained for the first four years were reported in FSA projects FS241044 (2014/15) and FS102121 (2015 to 2018). The FSA has indicated that the retail proxy target for the percentage of highly contaminated raw whole retail chickens should be less than 7%, and while continued monitoring has demonstrated a sustained decline for chickens from major retailer stores, chickens on sale in other stores have yet to meet this target. This report presents results from testing chickens from non-major retailer stores (only) in a fifth survey year, from 2018 to 2019. In line with previous practice, samples were collected from stores distributed throughout the UK (in proportion to the population size of each country). Testing was performed by one of two laboratories: a Public Health England (PHE) laboratory or the Agri-Food & Biosciences Institute (AFBI), Belfast. Enumeration of Campylobacter spp. was performed using the ISO 10272-2 standard enumeration method with a detection limit of 10 colony-forming units (cfu) per gram (g) of neck skin.
Antimicrobial resistance (AMR) to selected antimicrobials, in accordance with those advised in the EU harmonised monitoring protocol, was predicted from genome sequence data in Campylobacter jejuni and Campylobacter coli isolates. The percentage (10.8%) of fresh, whole chicken at retail sale in stores of smaller chains (for example, Iceland, McColl’s, Budgens, Nisa, Costcutter, One Stop), independents and butchers (collectively referred to as non-major retailer stores in this report) in the UK that was highly contaminated (at more than 1000 cfu per g) with Campylobacter spp. has decreased since the previous survey year but is still higher than that found in samples from major retailers. Whole fresh raw chickens from non-major retailer stores were collected from August 2018 to July 2019 (n = 1009). Campylobacter spp. were detected in 55.8% of the chicken skin samples obtained from non-major retailer shops, and 10.8% of the samples had counts above 1000 cfu per g chicken skin. Comparison among production plant approval codes showed significant differences in the percentages of chicken samples with more than 1000 cfu per g, ranging from 0% to 28.1%. The percentage of samples with more than 1000 cfu of Campylobacter spp. per g was significantly higher in the period May, June and July than in the period November to April. The percentage of highly contaminated samples was significantly higher for samples taken from larger compared to smaller chickens. There was no statistical difference in the percentage of highly contaminated samples between those obtained from chickens reared with access to range (for example, free-range and organic birds) and those reared under a standard regime (for example, no access to range), but the small sample size for organic and, to a lesser extent, free-range chickens may have limited the ability to detect important differences should they exist. Campylobacter species was determined for isolates from 93.4% of the positive samples. C. jejuni was isolated from the majority (72.6%) of samples, while C. coli was identified in 22.1% of samples. A combination of both species was found in 5.3% of samples. C. coli was more frequently isolated from samples obtained from chickens reared with access to range in comparison to those reared as standard birds. C. jejuni was less prevalent during the summer months of June, July and August compared to the remaining months of the year.
Resistance to ciprofloxacin (fluoroquinolone), erythromycin (macrolide), tetracycline (tetracyclines), and gentamicin and streptomycin (aminoglycosides) was predicted from WGS data by the detection of known antimicrobial resistance determinants. Resistance to ciprofloxacin was detected in 185 (51.7%) isolates of C. jejuni and 49 (42.1%) isolates of C. coli, while 220 (61.1%) isolates of C. jejuni and 73 (62.9%) isolates of C. coli were resistant to tetracycline. Three C. coli (2.6%) but none of the C. jejuni isolates harboured 23S mutations predicting reduced susceptibility to erythromycin. Multidrug resistance (MDR), defined as harbouring genetic determinants for resistance to at least three unrelated antimicrobial classes, was found in 10 (8.6%) C. coli isolates but not in any C. jejuni isolates. Co-resistance to ciprofloxacin and erythromycin was predicted in 1.7% of C. coli isolates. Overall, the percentages of isolates with genetic AMR determinants found in this study were similar to those reported in the previous survey year (August 2016 to July 2017), where testing was based on phenotypic break-point testing. Multidrug resistance was similar to that found in the previous survey years. It is recommended that trends in AMR in Campylobacter spp. isolates from retail chickens continue to be monitored to identify any increasing resistance of concern, particularly to erythromycin (macrolide).
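The MDR definition used in the report (genetic determinants for at least three unrelated antimicrobial classes) can be expressed as a small sketch. The class mapping below is assembled from the antimicrobials and class groupings named in the abstract, and the function name is hypothetical, for illustration only:

```python
# Antimicrobial -> class, per the groupings in the abstract
# (fluoroquinolones, macrolides, tetracyclines, aminoglycosides).
CLASS_OF = {
    "ciprofloxacin": "fluoroquinolone",
    "erythromycin": "macrolide",
    "tetracycline": "tetracycline",
    "gentamicin": "aminoglycoside",
    "streptomycin": "aminoglycoside",
}

def is_mdr(resistance_determinants):
    """MDR = determinants for at least 3 unrelated antimicrobial classes."""
    classes = {CLASS_OF[d] for d in resistance_determinants}
    return len(classes) >= 3

# Gentamicin and streptomycin count as a single class (aminoglycosides),
# so this isolate spans only two classes and is not MDR:
print(is_mdr(["gentamicin", "streptomycin", "tetracycline"]))  # → False
```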
Considering that the percentage of fresh, whole chicken from non-major retailer stores in the UK that is highly contaminated (at more than 1000 cfu per g) with Campylobacter spp. continues to be above that in samples from major retailers, more action, including consideration of interventions such as improved biosecurity and slaughterhouse measures, is needed to achieve better control of Campylobacter spp. for this section of the industry. The FSA has indicated that the retail proxy target for the percentage of highly contaminated retail chickens should be less than 7%, and while continued monitoring has demonstrated a sustained decline for chickens from major retailer stores, chickens on sale in other stores have yet to meet this target.
