Academic literature on the topic 'Confounded design'

Below are lists of relevant journal articles, books, theses, book chapters, conference papers, reports, and other scholarly sources on the topic 'Confounded design.'

Journal articles on the topic "Confounded design"

1

Ghosh, D. K., and S. C. Bagui. "Identification of confounded design and its interactions." Journal of Applied Statistics 25, no. 3 (June 1998): 349–56. http://dx.doi.org/10.1080/02664769823089.

2

Weimer, Jon. "Confounded Experimental Designs and Human Factors Research." Proceedings of the Human Factors Society Annual Meeting 31, no. 2 (September 1987): 222–23. http://dx.doi.org/10.1177/154193128703100219.

Abstract:
The use of confounded factorial designs has been seriously neglected in the human factors literature. A confounded factorial is constructed by systematically confounding blocking variables with one or more interactions which are believed to be statistically insignificant or inconsequential to the researcher. These designs offer the advantages of increased economy and power. These designs are especially useful when research is being conducted on military personnel and subjects must be selected from different military facilities, which may result in heterogeneous subject populations. A concrete example illustrates how confounding of this type can be used to the researcher's advantage through the tailored construction of a confounded design.
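A minimal sketch of the construction idea summarized above, assuming a generic 2^3 factorial run in two blocks (the factor names, block interpretation, and the choice to sacrifice the three-way interaction are illustrative, not taken from the cited paper): assigning each treatment combination to a block by the modulo-2 level of the ABC contrast confounds ABC with the blocking variable while leaving main effects and two-way interactions estimable.

```python
# Illustrative only: confound the ABC interaction with blocks in a 2^3 factorial,
# so block-to-block differences (e.g., two test facilities) and ABC are
# inseparable, while main effects and two-way interactions stay clear of blocks.
from itertools import product

factors = ("a", "b", "c")
blocks = {0: [], 1: []}
for levels in product((0, 1), repeat=3):        # the 8 treatment combinations
    contrast = sum(levels) % 2                  # L_ABC = a + b + c (mod 2)
    label = "".join(f for f, lvl in zip(factors, levels) if lvl) or "(1)"
    blocks[contrast].append(label)

print("Block I (principal block):", blocks[0])  # ['(1)', 'bc', 'ac', 'ab']
print("Block II:", blocks[1])                   # ['c', 'b', 'a', 'abc']
```

Each block can then be run on a more homogeneous group (in the abstract's setting, subjects drawn from a single facility), which is where the gains in economy and power come from.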
3

Dülmer, Hermann. "The Factorial Survey." Sociological Methods & Research 45, no. 2 (April 27, 2015): 304–47. http://dx.doi.org/10.1177/0049124115582269.

Abstract:
The factorial survey is an experimental design consisting of varying situations (vignettes) that have to be judged by respondents. For more complex research questions, it quickly becomes impossible for an individual respondent to judge all vignettes. To overcome this problem, random designs are recommended most of the time, whereas quota designs are not discussed at all. First comparisons of random designs with fractional factorial and D-efficient designs are based on fictitious data, first comparisons with fractional factorial and confounded factorial designs are restricted to theoretical considerations. The aim of this contribution is to compare different designs regarding their reliability and their internal validity. The benchmark for the empirical comparison is established by the estimators from a parsimonious full factorial design, each answered by a sample of 132 students (real instead of fictitious data). Multilevel analyses confirm that, if they exist, balanced confounded factorial designs are ideal. A confounded D-efficient design, as proposed for the first time in this article, is also superior to simple random designs.
4

Moradian, Hanieh, Manfred Gossen, and Andreas Lendlein. "Co-delivery of genes can be confounded by bicistronic vector design." MRS Communications 12, no. 2 (February 18, 2022): 145–53. http://dx.doi.org/10.1557/s43579-021-00128-7.

Abstract:
Maximizing the efficiency of nanocarrier-mediated co-delivery of genes for co-expression in the same cell is critical for many applications. Strategies to maximize co-delivery of nucleic acids (NA) focused largely on carrier systems, with little attention towards payload composition itself. Here, we investigated the effects of different payload designs: co-delivery of two individual “monocistronic” NAs versus a single bicistronic NA comprising two genes separated by a 2A self-cleavage site. Unexpectedly, co-delivery via the monocistronic design resulted in a higher percentage of co-expressing cells, while predictive co-expression via the bicistronic design remained elusive. Our results will aid the application-dependent selection of the optimal methodology for co-delivery of genes.
5

Kjeldsen, Sverre E., Alexandre Persu, and Michel Azizi. "Design of renal denervation studies not confounded by antihypertensive drugs." Journal of the American Society of Hypertension 9, no. 5 (May 2015): 337–40. http://dx.doi.org/10.1016/j.jash.2015.02.015.

6

Green, David W., and David E. Over. "Valuing Intervention and Observation." Quarterly Journal of Experimental Psychology 62, no. 5 (May 2009): 1010–22. http://dx.doi.org/10.1080/17470210802305482.

Abstract:
Understanding causal relations is fundamental to effective action but causal data can be confounded. We examined the value that participants placed on data derived from a hypothetical intervention or observation. Our materials involved a possible cause (“bottled water”), a possible confound (“food”), and a context (“a restaurant”). We supposed that participants seek to draw as specific a causal inference as possible from presented data and value information sources more highly that allow them to do so. On this basis, we predicted that in circumstances where an intervention removed the confounding causal factor but observation did not, participants would prefer data derived from an intervention when the possible cause was present (the bottled water was drunk) but show the reverse preference when the possible cause was absent (the bottled water was not drunk). Experiment 1 confirmed this prediction. Using a between-subjects design, Experiment 2 tested for a difference in confidence in causal judgements given identical data, including data on the confound, as a function of method of data collection (intervention or observation). There was no significant difference in confidence ratings between the two methods but confidence ratings were sensitive to the probability of an effect (illness) given the cause. Using a within-subjects design, Experiment 3 revealed systematic individual differences in preference for the two methods. Participants were divided between those who considered intervention more confounded and those who considered observation more confounded. Our experiments point to the subtleties of participants’ evaluation of data from studies of human beings.
7

McCarty, Lynn S., and Christopher J. Borgert. "Are all current ecotoxicity test results confounded by design and implementation issues?" Integrated Environmental Assessment and Management 12, no. 2 (March 27, 2016): 397–98. http://dx.doi.org/10.1002/ieam.1749.

8

KUMAR, PRAKASH, KRISHAN LAL, ANIRBAN MUKHERJEE, UPENDRA KUMAR PRADHAN, MRINMOY RAY, and OM PRAKASH. "Advanced row-column designs for animal feed experiments." Indian Journal of Animal Sciences 88, no. 4 (January 5, 2023): 499–503. http://dx.doi.org/10.56093/ijans.v88i4.78895.

Abstract:
Inappropriate statistical designs may lead to misinterpretation of the results of animal feed experiments. Thus, complete statistical designs can make animal feed research more appropriate and cost effective. Usually factorial row-column designs are used when the heterogeneity in the experimental material is in two directions and the experimenter is interested in studying the effect of two or more factors simultaneously. Attempts have been made to develop the method of construction of balanced nested row-column designs under a factorial setup. Factorial experiments are used when two or more factors have the same or different numbers of levels. The designs that are balanced symmetric factorials nested in blocks are called block designs with nested row-column balanced symmetric factorial experiments. These designs were constructed by using confounding through equation methods. The construction of confounded asymmetrical factorial experiments in row-column settings and the efficiency factor of the confounded effects were worked out. The designs can be used in animal feed experiments with fewer resources without compromising test accuracy.
9

Murray, Eleanor J., Ellen C. Caniglia, and Lucia C. Petito. "Causal survival analysis: A guide to estimating intention-to-treat and per-protocol effects from randomized clinical trials with non-adherence." Research Methods in Medicine & Health Sciences 2, no. 1 (October 8, 2020): 39–49. http://dx.doi.org/10.1177/2632084320961043.

Abstract:
When reporting results from randomized experiments, researchers often choose to present a per-protocol effect in addition to an intention-to-treat effect. However, these per-protocol effects are often described retrospectively, for example, comparing outcomes among individuals who adhered to their assigned treatment strategy throughout the study. This retrospective definition of a per-protocol effect is often confounded and cannot be interpreted causally because it encounters treatment-confounder feedback loops, where past confounders affect future treatment, and current treatment affects future confounders. Per-protocol effects estimated using this method are highly susceptible to the placebo paradox, also called the “healthy adherers” bias, where individuals who adhere to placebo appear to have better survival than those who don’t. This result is generally not due to a benefit of placebo, but rather is most often the result of uncontrolled confounding. Here, we aim to provide an overview to causal inference for survival outcomes with time-varying exposures for static interventions using inverse probability weighting. The basic concepts described here can also apply to other types of exposure strategies, although these may require additional design or analytic considerations. We provide a workshop guide with solutions manual, fully reproducible R, SAS, and Stata code, and a simulated dataset on a GitHub repository for the reader to explore.
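The authors provide fully reproducible R, SAS, and Stata code on their GitHub repository; the sketch below is only a rough single-time-point illustration of the weighting idea (simulated data, invented variable names, and no time-varying treatment-confounder feedback, which the article's pooled, time-varying estimator is specifically designed to handle).

```python
# Hedged sketch: stabilized inverse probability weights for an adherence contrast
# at a single time point, using simulated data where adherence depends on a
# confounder L but adherence truly has no effect on the outcome Y.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 50_000
L = rng.binomial(1, 0.4, n)                       # baseline confounder (e.g., frailty)
A = rng.binomial(1, np.where(L == 1, 0.3, 0.7))   # adherence is less likely when L = 1
Y = rng.binomial(1, 0.05 + 0.10 * L)              # outcome depends on L only (null effect of A)

# Denominator model P(A = a | L); marginal P(A = a) as the stabilizing numerator
p_denom = LogisticRegression().fit(L.reshape(-1, 1), A).predict_proba(L.reshape(-1, 1))
p_num = A.mean()
sw = np.where(A == 1, p_num / p_denom[:, 1], (1 - p_num) / p_denom[:, 0])

naive = Y[A == 1].mean() - Y[A == 0].mean()       # confounded ("healthy adherer") contrast
ipw = (np.average(Y[A == 1], weights=sw[A == 1])
       - np.average(Y[A == 0], weights=sw[A == 0]))
print(f"naive risk difference: {naive:.3f}, IPW risk difference: {ipw:.3f}")  # IPW ~ 0
```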
10

C. Eze, Francis. "Choice of Confounding in the 2k Factorial Design in 2b Blocks." Academic Journal of Applied Mathematical Sciences, no. 55 (May 15, 2019): 50–56. http://dx.doi.org/10.32861/ajams.55.50.56.

Abstract:
In a 2k complete factorial experiment, the experiment must be carried out in a completely randomized design. When the number of factors increases, the number of treatment combinations increases and it is not possible to accommodate all these treatment combinations in one homogeneous block. In this case, confounding in more than one incomplete block becomes necessary. In this paper, we considered the choice of confounding when k > 2. Our findings show that the choice of confounding depends on the number of factors, the number of blocks and their sizes. When two or more interactions are to be confounded, their product modulo 2 should be considered and thereafter, a linear combination equation should be used in allocating the treatment combinations to the principal block. The contents of the other blocks are generated by multiplication modulo 2 of the effects not in the principal block. Partial confounding is recommended for the interactions that cannot be confounded.
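A small, hedged illustration of the allocation rule described above, under assumed choices (a 2^4 factorial in 2^2 = 4 blocks, confounding ABC and ACD; the factor names and confounding scheme are picked for the example, not taken from the paper): the product modulo 2 of the chosen interactions gives the additional confounded effect, and the linear combination equations assign treatment combinations to blocks, with the principal block at (0, 0).

```python
# Blocking a 2^4 factorial into 4 incomplete blocks by confounding ABC and ACD.
from itertools import product

factors = "abcd"
defining = [{"a", "b", "c"}, {"a", "c", "d"}]      # interactions chosen to confound

# Their product modulo 2 is the symmetric difference of the letters: BD is also lost.
generalized = defining[0] ^ defining[1]
print("also confounded:", "".join(sorted(generalized)))        # -> bd

blocks = {}
for run in product((0, 1), repeat=len(factors)):
    level = dict(zip(factors, run))
    # Linear combination L = sum of the levels of the factors in each defining word (mod 2)
    key = tuple(sum(level[f] for f in word) % 2 for word in defining)
    label = "".join(f for f in factors if level[f]) or "(1)"
    blocks.setdefault(key, []).append(label)

print("principal block:", blocks[(0, 0)])          # ['(1)', 'bcd', 'ac', 'abd']
```

In practice one avoids pairs of defining interactions whose generalized interaction is a main effect or an important low-order interaction.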

Dissertations / Theses on the topic "Confounded design"

1

Jerkert, Jesper. "Philosophical Issues in Medical Intervention Research." Licentiate thesis, KTH, Filosofi, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-163872.

Abstract:
The thesis consists of an introduction and two papers. In the introduction a brief historical survey of empirical investigations into the effectiveness of medicinal interventions is given. Also, the main ideas of the EBM (evidence-based medicine) movement are presented. Both included papers can be viewed as investigations into the reasonableness of EBM and its hierarchies of evidence. Paper I: Typically, in a clinical trial patients with specified symptoms are given either of two or more predetermined treatments. Health endpoints in these groups are then compared using statistical methods. Concerns have been raised, not least from adherents of so-called alternative medicine, that clinical trials do not offer reliable evidence for some types of treatment, in particular for highly individualized treatments, for example traditional homeopathy. It is argued that such concerns are unfounded. There are two minimal conditions related to the nature of the treatments that must be fulfilled for evaluability in a clinical trial, namely (1) the proper distinction of the two treatment groups and (2) the elimination of confounding variables or variations. These are delineated, and a few misunderstandings are corrected. It is concluded that the conditions do not preclude the testing of alternative medicine, whether individualized or not. Paper II: Traditionally, mechanistic reasoning has been assigned a negligible role in standard EBM literature, although some recent authors have argued for an upgrading. Even so, mechanistic reasoning that has received attention has almost exclusively been positive -- both in an epistemic sense of claiming that there is a mechanistic chain and in a health-related sense of there being claimed benefits for the patient. Negative mechanistic reasoning has been neglected, both in the epistemic and in the health-related sense. I distinguish three main types of negative mechanistic reasoning and subsume them under a new definition of mechanistic reasoning in the context of assessing medical interventions. Although this definition is wider than a previous suggestion in the literature, there are still other instances of reasoning that concern mechanisms but do not (and should not) count as mechanistic reasoning. One of the three distinguished types, which is negative only in the health-related sense, has a corresponding positive counterpart, whereas the other two, which are epistemically negative, do not have such counterparts, at least not that are particularly interesting as evidence. Accounting for negative mechanistic reasoning in EBM is therefore partly different from accounting for positive mechanistic reasoning. Each negative type corresponds to a range of evidential strengths, and it is argued that there are differences with respect to the typical strengths. The variety of negative mechanistic reasoning should be acknowledged in EBM, and presents a serious challenge to proponents of so-called medical hierarchies of evidence.

2

Görgen, Kai. "On Rules and Methods: Neural Representations of Complex Rule Sets and Related Methodological Contributions." Doctoral thesis, Humboldt-Universität zu Berlin, 2019. http://dx.doi.org/10.18452/20711.

Abstract:
Where and how does the brain represent complex rule sets? This thesis presents a series of three empirical studies that decompose representations of complex rule sets to directly address this question. An additional methodological study investigates the employed analysis method and the experimental design. The empirical studies employ multivariate pattern analysis (MVPA) of functional magnetic resonance imaging (fMRI) data from healthy human participants. The methodological study has been inspired by the empirical work. Its impact and application range, however, extend well beyond the empirical studies of this thesis. Questions of the empirical studies (Studies 1-3) include: Where are cues and rules represented, and are these represented independently? Where are compound rules (rules consisting of multiple rules) represented, and are these composed from their single rule representations? Where are rules from different hierarchical levels represented, and is there a hierarchy-dependent functional gradient along ventro-lateral prefrontal cortex (VLPFC)? Where is the order of rule-execution represented, and is it represented as a separate higher-level rule? All empirical studies employ information-based functional mapping ("searchlight" approach) to localise representations of rule set features brain-wide and spatially unbiased. Key findings include: compositional coding of compound rules in VLPFC; no order information in VLPFC, suggesting VLPFC is not a general controller for task set; evidence against the hypothesis of a hierarchy-dependent functional gradient along VLPFC. The methodological study (Study 4) introduces "The Same Analysis Approach (SAA)". SAA allows to detect, avoid, and eliminate confounds and other errors in experimental design and analysis, especially mistakes caused by malicious experiment-specific design-analysis interactions. SAA is relevant for MVPA, but can also be applied in other fields, both within and outside of neuroscience.

Books on the topic "Confounded design"

1

Publishing House of the Methodist Church. Immersionists Against the Bible; or, the Babel Builders Confounded, in an Exposition of the Origin, Design, Tactics, and Progress of the New Version Movement of Campbellites and Other Baptists. HardPress, 2020.

2

Cawthon, Stephanie W. Large-Scale Survey Design in Deaf Education Research. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190455651.003.0009.

Abstract:
Large-scale surveys are an appealing research design option for those wishing to collect data from many participants dispersed across different settings. This chapter describes several critical issues that must be considered when developing and conducting large-scale surveys in deaf education: aligning with a theoretical rationale, considering sample characteristics and potential confounds, piloting study measures, and developing an analysis plan. The chapter provides examples of ways to capture the heterogeneous demographics inherent within deaf education, ranging from individual characteristics such as identity, language use, and professional experience to educational setting characteristics such as program models and available accommodations. The chapter provides recommendations for how to instill trust and be mindful of participant fatigue during the recruitment process. The chapter ends with strategies for making survey recruitment materials, test directions, and items accessible for a diverse study population.
3

Casey, Patricia. Models, risks, and protections (DRAFT). Oxford University Press, 2018. http://dx.doi.org/10.1093/med/9780198786214.003.0004.

Abstract:
Several explanatory models have been proposed for AD. The stress model is the current model and the one on which the ICD-11 criteria will be based. Others include a crisis model, a biological model, and a transactional-cognitive model. The research on the risk and protective factors is sparse and some studies are poorly designed owing to inadequate confounder control. The presence of a stressor is essential, and it can be a common event such as relationship breakdown or more traumatic stressors that have come to be associated with PTSD. Personality disorder does not appear to be a specific risk factor but certain personality dimensions have been identified as increasing vulnerability. Maladaptive coping strategies, poor social supports, and childhood trauma have also been identified. Resilience is a protective factor. Thus, the variables that increase or decrease risk are similar to those identified for other common psychiatric disorders, but better-designed studies are required in the future.
4

Ferrari, Matthew. Using disease dynamics and modeling to inform control strategies in low-income countries. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198789833.003.0008.

Abstract:
The incidence of infectious disease is inherently dynamic in time and space. Mathematical models that account for the dynamic processes that give rise to fluctuations in disease incidence are powerful tools in disease management and control. We describe the use of dynamic models for surveillance, evaluation and prediction of disease control efforts in low-income countries. Dynamic models can help to anticipate trends owing to intrinsic (e.g., herd immunity) or extrinsic (e.g., seasonality) forces that may confound efforts to isolate the impact of specific interventions. Infectious disease dynamics are frequently nonlinear, meaning that future outcomes are difficult to predict through simple extrapolation of present conditions. Thus, dynamic models can help to explore the potential consequences of proposed interventions. These projections can alert managers to the potential for unintended consequences of control and help to define effect sizes for the design of conventional studies of the impact of interventions.
5

Leben, Derek. In Defense of ‘Ought Implies Can’. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198815259.003.0007.

Abstract:
Two recent papers have presented experimental evidence against the hypothesis that there is a semantic connection between OUGHT and CAN, rather than a pragmatic and defeasible one. However, there are two flaws with their designs. One is temporal ambiguity: just asking whether “x ought to A” is underspecified as to when the obligation exists. Another is failing to distinguish between prior obligations and all-things-considered obligations. To test these potential confounds, the chapter author ran two experiments. The first paired some of the original stories with a visual timeline specifying the time of the obligation. The second flipped the wording of the original “obligated but can’t” question into the reversed: “can’t, but still obligated.” In both experiments, there were large and significant differences between the original and modified conditions. These results undermine the conclusions of the previous experiments and remain consistent with the Semantic Hypothesis.
6

Ker-Lindsay, James. The Cyprus Problem. Oxford University Press, 2011. http://dx.doi.org/10.1093/wentk/9780199757169.001.0001.

Abstract:
For nearly 60 years--from its uprising against British rule in the 1950s, to the bloody civil war between Greek and Turkish Cypriots in the 1960s, the Turkish invasion of Cyprus in the 1970s, and the United Nation's ongoing 30-year effort to reunite the island--the tiny Mediterranean nation of Cyprus has taken a disproportionate share of the international spotlight. And while it has been often in the news, accurate and impartial information on the conflict has been nearly impossible to obtain. In The Cyprus Problem, James Ker-Lindsay offers an incisive, even-handed account of the conflict. Ker-Lindsay covers all aspects of the Cyprus problem, placing it in historical context, addressing the situation as it now stands, and looking toward its possible resolution. The book begins with the origins of the Greek and Turkish Cypriot communities as well as the other indigenous communities on the island (Maronites, Latin, Armenians, and Gypsies). Ker-Lindsay then examines the tensions that emerged between the Greek and Turkish Cypriots after independence in 1960 and the complex constitutional provisions and international treaties designed to safeguard the new state. He pays special attention to the Turkish invasion in 1974 and the subsequent efforts by the UN and the international community to reunite Cyprus. The book's final two chapters address a host of pressing issues that divide the two Cypriot communities, including key concerns over property, refugee returns, and the repatriation of settlers. Ker-Lindsay concludes by considering whether partition really is the best solution, as many observers increasingly suggest. Written by a leading expert, The Cyprus Problem brings much needed clarity and understanding to a conflict that has confounded observers and participants alike for decades.
7

Scolding, Neil. Vasculitis and collagen vascular diseases. Oxford University Press, 2011. http://dx.doi.org/10.1093/med/9780198569381.003.0862.

Abstract:
That part of the clinical interface between neurology and general medicine occupied by inflammatory and immunological diseases is neither small nor medically trivial. Neurologists readily accept the challenges of ‘primary’ immune diseases of the nervous system: these tend to be focussed on one particular target such as oligodendrocytes or the neuro-muscular junction, present in predictable ways, and are amenable as a rule to rational, methodological diagnosis, and occasionally even treatment. This is proper neurology. ‘Secondary’ neurological involvement in diseases mainly considered systemic inflammatory conditions—for example, SLE, sarcoidosis, vasculitis, and Behçet’s—is a rather different matter. It may be difficult enough to secure such a diagnosis even when systemic disease has previously been diagnosed and new neurological features need to be differentiated from iatrogenic disease, particularly drug side effects or the consequences of immune suppression. But all the diseases mentioned may present with and confine themselves wholly to the nervous system; they may mimic one another, and pursue erratic and unpredictable clinical courses. In central nervous system disease, diagnosis by tissue biopsy is potentially hazardous and unattractive. Few neurologists enjoy excesses of confidence or expertise when faced with such clinical problems: the cautious diagnostician is perplexed, and the evidence-based neuroprescriber confounded. Unsurprisingly, great variations in approaches to diagnosis and management are seen (Scolding et al. 2002b). But rheumatologically inclined general, renal or respiratory physicians, comfortable when managing inflammation affecting their system or indeed other parts of the body designed to support the nervous system, are generally also ill at ease when faced with neurological features whose differential diagnosis may be large, particularly given the near universal diagnostic non-specificity of either imaging or CSF analysis. Here then is the subject material for this chapter: the diagnosis and management of central nervous system involvement in inflammatory and immunological systemic diseases (Scolding 1999a). In not one of these neurological conditions has a single controlled therapeutic trial been reported, and much that is published on these conditions is misleading or inaccurate. And yet the frequency with which the diagnosis is only confirmed or even first emerges at autopsy bears stark witness to both the severity and evasiveness of these disorders.

Book chapters on the topic "Confounded design"

1

Mandal, Madhura, and Premadhis Das. "Confounded Factorial Design with Partial Balance and Orthogonal Sub-Factorial Structure." In Statistics and its Applications, 111–31. Singapore: Springer Singapore, 2018. http://dx.doi.org/10.1007/978-981-13-1223-6_11.

2

Roberts, Caroline, and Marieke Voorpostel. "Combining Data Collection Modes in Longitudinal Studies." In Withstanding Vulnerability throughout Adult Life, 359–73. Singapore: Springer Nature Singapore, 2023. http://dx.doi.org/10.1007/978-981-19-4567-0_22.

Abstract:
Technological advances over the past two decades have substantially changed the range of data collection methods available to survey researchers. Web-based surveys have gained in popularity as increasing Internet penetration rates improve their coverage potential for general population research. Nevertheless, they still systematically exclude certain subgroups—e.g., those without Internet access or those less able (or motivated) to complete a survey questionnaire on their own. A popular solution to this problem is to use other modes of data collection for those who cannot participate online. However, while mixed mode surveys can be effective at reducing selection errors, measurements obtained from different modes may not be comparable, particularly when it comes to sensitive topics. The fact that measurement and selection errors are confounded poses challenges for researchers analysing mixed mode data, and in a longitudinal setting, has implications for studying changes over time. In this chapter, we discuss these challenges in the context of longitudinal studies designed to measure indicators of vulnerability, and illustrate them with a synthesis of findings from our own research relating to (1) the effects of combining modes on response rates and the representativeness of survey samples and (2) effects for measurement comparability.
3

"Completely Confounded Design." In Handbook of Statistics for Teaching and Research in Plant and Crop Science, 531–44. CRC Press, 2005. http://dx.doi.org/10.1201/9781482277814-36.

4

Boniface, David R. "Unbalanced and confounded designs." In Experiment Design and Statistical Methods, 101–29. Routledge, 2019. http://dx.doi.org/10.1201/9780203756423-10.

5

"Incomplete and Confounded Block Designs." In Design and Analysis of Experiments with SAS, 269–314. Chapman and Hall/CRC, 2010. http://dx.doi.org/10.1201/9781439882740-11.

6

"Incomplete and Confounded Block Designs." In Design and Analysis of Experiments with R, 285–330. Chapman and Hall/CRC, 2014. http://dx.doi.org/10.1201/b17883-10.

7

Paine, Thomas. "Common Sense." In Rights of Man, Common Sense, and Other Political Writings. Oxford University Press, 2008. http://dx.doi.org/10.1093/owc/9780199538003.003.0002.

Abstract:
Of the Origin and Design of Government in General. With concise Remarks on the English Constitution. Some writers have so confounded society with government, as to leave little or no distinction between them; whereas they are not only different, but have different origins....
8

"Epidemiologic Design Bias, Confounders, and Interaction." In Epidemiology for the Advanced Practice Nurse. New York, NY: Springer Publishing Company, 2022. http://dx.doi.org/10.1891/9780826185143.0014.

9

Ashworth, Scott, Christopher R. Berry, and Ethan Bueno De Mesquita. "Modeling the Research Design." In Theory and Credibility, 216–33. Princeton University Press, 2021. http://dx.doi.org/10.23943/princeton/9780691213828.003.0010.

Abstract:
This chapter discusses modeling the research design that involves representing a research design itself within a model that embodies the mechanisms in question. The goal is a better understanding of which theoretical implications are commensurable with the estimand of the research design. The chapter reviews the approach to linking theory and empirics through all-else-equal claims that suggests that much theorizing already implicitly involves a model of a research design. It refers to a comparative static that can be thought of as a model of a just-controlling design with no omitted confounders. Modeling the research design becomes more interesting, and worth consideration as a distinct approach, when multiple mechanisms interact in potentially complicated ways or when a more complicated research design makes thinking about commensurability conceptually difficult.
10

Dickson-Deane, Camille, LeRoy Hill, and Laura E. Gray. "Modelling for Value Systems in a Diverse Online Program in the Caribbean." In Multicultural Instructional Design, 370–86. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-5225-9279-2.ch017.

Abstract:
The authors present a conceptual framework to guide the participation of students in an online instructional design program. The online program has socio-cultural influencing factors that confound the already diverse nature of the offering. The framework intends to encourage a value system for students that can be used to guide their knowledge and performance as they pursue the tenets of the field of instructional design. Elmore's mode of leadership, Bourdieu's theory of habitus and Hofstede's cultural dimensions theory are used to create a foundation for the framework whilst acknowledging the complexities of the diverse environment. The framework supports and acknowledges the knowledge expected of novice instructional designers through the use of guides whilst acknowledging the systemic and systematic individualistic change processes that will occur.

Conference papers on the topic "Confounded design"

1

Hsu, Shu-han, Ying-Yuan Huang, Kexin Yang, and Linda Milor. "Identification of Failure Modes for Circuit Samples with Confounded Causes of Failure." In 2019 IEEE 25th International Symposium on On-Line Testing And Robust System Design (IOLTS). IEEE, 2019. http://dx.doi.org/10.1109/iolts.2019.8854409.

2

Kim, Hyeji, Jing Chen, Euiyoung Kim, and Alice M. Agogino. "Scenario-Based Conjoint Analysis: Measuring Preferences for User Experiences in Early Stage Design." In ASME 2017 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/detc2017-67690.

Abstract:
Conjoint analysis has proven to be a useful method for decomposing and estimating consumer preference for each attribute of a product or service through evaluations of sets of different versions of the product with varying attribute levels. The predictive value of conjoint analysis is confounded, however, by increasing market uncertainties and changes in user expectations. We explore the use of scenario-based conjoint analysis in order to complement qualitative design research methods in the early stages of concept development. The proposed methodology focuses on quantitatively assessing user experiences rather than product features to create experience-driven products, especially in cases in which the technology is advancing beyond consumer familiarity. Rather than replace conventional conjoint analysis for feature selection near the end of the product development cycle, our method broadens the scope of conjoint analysis so that this powerful measurement technique can be applied in the early stage of design to complement qualitative research and drive strategic directions for developing product experiences. We illustrate on a new product development case study of a flexible wearable for parent-child communication and tracking as an example of scenario-based conjoint analysis implementation. The results, limitations, and findings are discussed in more depth followed by future research directions.
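For readers unfamiliar with the estimation step referenced above, here is a toy sketch of the core conjoint decomposition: recovering part-worth utilities for attribute levels from ratings of profiles via dummy-coded least squares. The attributes, levels, and ratings are invented, and the paper's scenario-based variant rates user-experience scenarios rather than feature bundles.

```python
# Toy part-worth estimation for one respondent using ordinary least squares.
import numpy as np

profiles = [  # (battery life, form factor) of each rated concept
    ("8h", "wristband"), ("8h", "clip-on"), ("24h", "wristband"), ("24h", "clip-on"),
]
ratings = np.array([4.0, 3.0, 8.0, 6.0])            # invented preference scores

# Dummy-code each attribute against a reference level ("8h", "wristband").
X = np.column_stack([
    np.ones(len(profiles)),                          # baseline utility (intercept)
    [p[0] == "24h" for p in profiles],               # part-worth of 24h battery
    [p[1] == "clip-on" for p in profiles],           # part-worth of clip-on form
]).astype(float)

beta, *_ = np.linalg.lstsq(X, ratings, rcond=None)
print(dict(zip(["baseline", "24h battery", "clip-on"], beta.round(2))))
```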
3

Kestner, Brian K., Christopher A. Perullo, Jonathan S. Sands, and Dimitri N. Mavris. "Bayesian Belief Network for Robust Engine Design and Architecture Selection." In ASME Turbo Expo 2014: Turbine Technical Conference and Exposition. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/gt2014-27017.

Abstract:
Designing propulsion system architectures to meet next generation requirements requires many tradeoffs be made. These trades are often between performance, risk, and cost. For example, the core of an engine is the most expensive and highest risk area of a propulsion system design. However, a new core design provides the greatest flexibility in meeting future performance requirements. The decision to upgrade or redesign the core must be justified by comparison with other lower risk options. Furthermore, for turboshaft applications, the choice of compressor, whether axial or centrifugal, is a major decision and trade with the choice being heavily driven by both current and projected weight and performance requirements. This problem is confounded by uncertainty in potential benefits of technologies or future performance of components. To address these issues this research proposes the use of a Bayesian belief network (BBN) to extend the more traditional robust engine design process. This is done by leveraging forward and backward inference to identify engine upgrade paths that are robust to uncertainty in requirements performance. Prior beliefs on the different scenarios and technology uncertainty can be used to quantify risk. Forward inference can be used to compare different scenarios. The problem will be demonstrated using a two-spool turboshaft architecture modeled using the Numerical Propulsion System Simulation (NPSS) program. Upgrade options will include off the shelf, derivative engine (fixed core) with no technologies, derivative engine with new technologies, a new engine with no technologies, and a new engine with new technologies. The robust design process with a BBN will be used to identify which engine cycle and upgrade scenario is needed to meet performance requirements while minimizing cost and risk. To demonstrate how the choice of upgrade and cycle change with changes in requirements, studies are performed at different horsepower, ESFC, and power density requirements.
4

Hinich, Melvin J., Elmer L. Hixson, and Jack H. Sheehan. "The Origin and Implications of Higher-Order Cumulant Spectral Signatures in Hypoid Gear Trains." In ASME 1996 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 1996. http://dx.doi.org/10.1115/imece1996-0517.

Abstract:
This paper considers the origin and implications of higher-order (non-linear) cumulant spectral signatures generated by a hypoid gear assembly in a light utility van. This analysis demonstrates that the hypoid gear silencing problem is qualitatively and quantitatively different than the problems in gear design, both nominal and detailed, in particular:
• The production of sound by final drive gear trains is an extremely inefficient process; therefore, the vibration components which dominate sound generation are negligible quantities in typical gear design trade-offs. The usual silencing adjustments often shift but do not remove the offending source.
• Human hearing is acutely sensitive to the phase-coherent sinusoidal oscillations with modulated sidebands produced by hypoid gear trains. Modifications which reduce signature energy in a specific narrow band component may have no useful effect on perceived acoustic annoyance.
• Final drive measurements capture transient, stochastically modulated hypoid gear signatures confounded by highly correlated, non-Gaussian interference. Experimental design to explicitly measure the transient drive signals and the sources of correlated interference is essential for useful results.
• Emerging higher-order spectral analysis methods require long measurement ensembles to achieve appropriate resolution bandwidth and statistical convergence. Efficient data handling architecture for long, replicated ensembles is a pragmatic necessity.
5

Teichert, Kendall B., and Brian D. Jensen. "Calibration Approach for Thermomechanical In-Plane Microactuator Self-Sensing." In ASME 2008 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/detc2008-49546.

Abstract:
Microelectromechanical system (MEMS) actuation is a growing area of research. One obstacle for use of actuation in MEMS applications is the difficulty of proper sensing. Recent work has been done that shows the potential for thermomechanical in-plane microactuators (TIMs) to act as self-sensors by using the piezoresistive characteristic of silicon. However, in order to implement this technology a calibration method needs to be devised to account for variations between TIMs. This work presents an approach for this calibration consisting of two parts that compensate for variation in fabrication and material properties. Test structures are presented that will enable this calibration to be done on-chip, and validation is given for the usability of this approach. Two validation approaches are used. For the first approach, data previously gathered was analyzed using the TIM itself for calibration. This approach showed significant correlation with the model; however, this approach confounds any sensing signal and therefore was used only for general model validation. The second approach uses a novel calibration structure that decouples the mechanical and electrical characteristics. This approach showed correlation with test data within the bounds of experimental uncertainty in nearly all cases. Suggestions are given concerning implementation.
6

Bowsher, Andrew, Cary A. Gloor, Bruce Griffiths, and Chris McMahon. "Design Based Failure Analysis of a Voltage Sensitive Memory Defect." In ISTFA 2011. ASM International, 2011. http://dx.doi.org/10.31399/asm.cp.istfa2011p0382.

Abstract:
The semiconductor failure analyst’s tool box is a vast and resourceful set of capabilities that more than ever needs meaningful Memory Failure Signature Analysis (Memory FSA) as an important part of that suite. Today, this is driven by advanced process technology nodes that are producing virtually invisible defects to confound manufacturing and reliability. This demands greater attention to characterizing memory failures in order to theorize causes for failure and to implement suitable FA approaches and corrective action plans. Design Based FA (DBFA) techniques aim to extend this philosophy by focusing on a deep understanding of the chip’s Intellectual Property (IP), in terms of both content and architecture. It uses this knowledge to gain important insights into the behavior of the failure that otherwise may have been hidden or unobservable. This disciplined methodology leads to quicker closure for problems through implementing improved test screens, providing recommendations under a closed-loop Design for Manufacturing (DFM) system, enacting process enhancements, or some combination of all these areas. Here we present a clever technique to further aid in the failure signature analysis process and use it as an example for this Design Based FA methodology.
7

Wood, Denise May, Greg Auhl, and Sally McCarthy. "Accreditation and quality in higher education curriculum design: does the tail wag the dog?" In Fifth International Conference on Higher Education Advances. Valencia: Universitat Politècnica València, 2019. http://dx.doi.org/10.4995/head19.2019.9365.

Abstract:
Increasingly, the higher education sector is driven by sets of standards that describe quality – internal institutional standards that consider curriculum, teaching and delivery to students and external standards from both the sector and the professions that describe expectations, content, skills and attitudes that curricula must address to support graduate outcomes. Quality is the focus of these requirements, and yet quality in higher education remains a messy problem, with no clear framework (Kundu, 2016) and numerous variables that confound the problem. We ask what comes first: the external standards that accredit a university to provide education for a profession, or internal standards that focus on quality teaching and learning opportunities. The paper presents a short case study that highlights the challenge for course leaders pressured to meet industry requirements, and the impact this has on their awareness and capacity to design a transformational curriculum for students. We conclude that it is the difference between an aspirational course, whereby quality is focussed on the learning design for student experience, and a compliant course, where quality is focussed on meeting static requirements.
8

Riha, D. S., M. L. Kirby, J. W. Cardinal, L. C. Domyancic, J. M. McFarland, and F. W. Brust. "Probabilistic Risk Assessment of Aging Layered Pressure Vessels." In ASME 2019 Pressure Vessels & Piping Conference. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/pvp2019-93720.

Abstract:
The National Aeronautics and Space Administration (NASA) operates approximately 300 aging layered pressure vessels that were designed and manufactured prior to ASME Boiler and Pressure Vessel (B&PV) code requirements. In order to make decisions regarding the continued fitness-for-service of these non-code carbon steel vessels, it is necessary to perform a relative risk of failure assessment for each vessel. However, risk assessment of these vessels is confounded by uncertainties and variabilities related to the use of proprietary materials in fabrication, missing construction records, geometric discontinuities, weld residual stresses, and complex service stress gradients in and around the welds. Therefore, a probabilistic framework that can capture these uncertainties and variabilities has been developed to assess the fracture risk of flaws in regions of interest, such as longitudinal and circumferential welds, using the NESSUS® probabilistic modeling software and NASGRO® fracture mechanics software. In this study, the probabilistic framework was used to predict variability in the stress intensity factor associated with different reference flaws located in the head-to-shell circumferential welds of a 4-layer and 14-layer pressure vessel. The probabilistic studies predict variability in flaw behavior and the important uncertain parameters for each reference flaw location.
9

Miller, Mark, and Sam Holley. "Assessing Human Factors and Cyber Attacks at the Human-Machine Interface: Threats to Safety and Pilot and Controller Performance." In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002204.

Abstract:
The current state of automated digital information in aviation continues to expand rapidly as NextGen ADS-B(In) systems become more common in the form of Electronic Flight Bag (EFB) pad devices brought onto the flight deck. Integrated systems including satellites, aircraft, and air traffic control (ATC) data currently are not effectively encrypted and invite exposure to cyber attacks targeting flight decks and ATC facilities. The NextGen ATC system was not designed from the outset to identify and nullify cyber threats or attempts at disruption, and the safety gap has enlarged. Performance error at digital human-machine interfaces (HMI) has been well documented in aviation and now presents a potentially significant threat where the HMI can be more susceptible to human error from cyber attacks. Examples of HMI errors arising from digital information produced by automated systems are evaluated by the authors using HMI flaws discovered in recent Boeing 737-Max accidents. SHELL computer diagrams for both the digital flight deck and ATC facilities illustrate how the system is now interconnected for potential cyber threats and identify how human factors consequences compromising HMI safety and operator performance present potential dangers. Aviation Safety and Reporting System (ASRS) data are examined and confirm HMI threats. The authors contrast various HMI errors with cyber attack effects on cognition, situational awareness, and decision making. A focused examination to assess cyber attack effects on cognitive metrics suggests cognitive clarity of operators is confounded when confronted with conflicting or confusing indications at the HMI. Difficulty in successfully identifying a cyber attack and the actions taken as human factors countermeasures are illustrated in the context of the HMI environment. The Human Factors Analysis and Classification System (HFACS) is used to show how cyber attacks could occur and be addressed along with a dual-path solution. Keywords: NextGen, Cyber attack, SHELL, HMI, Cognitive load, HFACS
10

Denton, Mark S., and William D. Bostick. "New Innovative Electrocoagulation (EC) Treatment Technology for BWR Colloidal Iron Utilizing the Seeding and Filtration Electronically (SAFE™) System." In The 11th International Conference on Environmental Remediation and Radioactive Waste Management. ASMEDC, 2007. http://dx.doi.org/10.1115/icem2007-7186.

Abstract:
The presence of iron (iron oxide from carbon steel piping) buildup in Boiling Water Reactor (BWR) circuits and wastewaters is decades old. In, perhaps the last decade, the advent of precoatless filters for condensate blow down has compounded this problem due to the lack of a solid substrate (e.g., powdex resin pre-coat) to help drop the iron out of solution. The presence and buildup of this iron in condensate phase separators (CPS) further confounds the problem when the tank is decanted back to the plant. Iron carryover here is unavoidable without further treatment steps. The form of iron in these tanks, which partially settles and is pumped to a de-waterable high integrity container (HIC), is particularly difficult and time consuming to dewater (low shear strength, high water content). The addition upstream from the condensate phase separator (CPS) of chemicals, such as polymers, to carry out the iron, only produces an iron form even more difficult to filter and dewater (even less shear strength, higher water content, and a gel/slime consistency). Typical, untreated colloidal material contains both sub-micron particles up to, let’s say 100 micron. It is believed that the sub-micron particles penetrate filters, or sheet filters, thus plugging the pores for what should have been the successful filtration of the larger micron particles. Like BWR iron wastewaters, fuel pools/storage basins (especially in the decon. phase) often contain colloids which make clarity and the resulting visibility nearly impossible. Likewise, miscellaneous, often high conductivity, wastesteams at various plants contain such colloids, iron, salts (sometimes seawater intrusion and referred to as Salt Water Collection Tanks), dirt/clay, surfactants, waxes, chelants, etc. Such wastestreams are not ideally suited for standard dead-end (cartridges) or cross-flow filtration (UF/RO) followed even by demineralizers. Filter and bed plugging are almost assured. The key to solving these dilemmas is 1) to break the colloid (i.e., break the outer radius repulsive charges of the similar charged colloidal particles), 2) allow these particles to now flocculate (floc), and 3) form a type of floc that is more readily filterable, and, thus, dewaterable. This task has been carried out with the innovative application of electronically seeding the feed stream with the metal of choice, and without the addition of chemicals common to ferri-floccing, or polymer addition. This patent-pending new system and technique is called Seeding And Filtration Electronically, or the SAFE™ System. Once the colloid has been broken and flocking has begun, removal of the resultant floc can be carried out by standard, backwashable (or, in simple cases, dead-end) filters; or simply in dewaterable HICs or liners. Such applications include low level radwaste (LLW) from both PWRs and BWRs, fuel pools, storage basins, salt water collection tanks, etc. For the removal of magnetic materials, such as some BWR irons, an ElectroMagnetic Filter (EMF) was developed to couple with the ElectroCoagulation (EC), (or metal-Floccing) Unit. In the advent that the wastestream primarily contains magnetic materials (e.g., boiler condensates and magnetite, and hemagnetite from BWRs), the material was simply filtered using the EMF. Bench-, pilot- and full-scale systems have been assembled and applied on actual plant waste samples quite successfully. 
The effects of initial feed pH and conductivity, as well as flocculation retention times, were examined prior to applying the production equipment in the field. Since the initial studies (Denton, et al, EPRI, 2006), the ultimate success of field applications is now being demonstrated as the next development phase. For such portable field demonstrations and demand systems, a fully self enclosed (secondary containment) EC system was first developed and assembled in a modified B 25 Box (Floc-In-A-Box) and is being deployed to a number of NPP sites. Finally, a full-scale SAFE™ System has been deployed to Exelon’s Dresden NPP as a vault cleanup demand system. This is a 30 gpm EC system to convert vault solids/sludges to a form capable of being collected and dewatered in a High Integrity Container (HIC). This initial vault work will be on-going for approximately three months, before being moved to additional vaults. During the past year, additional refinements to the patent pending SAFE™ System have included the SAFER™ System (Scalant and Foulant Electronic Removal) for the removal by EC of silica, calcium and magnesium. This has proven to be an effective enabler for RO, NF and UF as a pretreatment system. Advantages here include smaller, more efficiently designed systems and allowed lower removal efficiencies with the removal of the limiting factor of scalants. Similarly, the SAFE™ System has been applied in the form of a BAC-UP™ System (Boric Acid Clean-Up) as an alternative to more complex RO or boric acid recycle systems. Lastly, samples were received from two different DOE sites for the removal of totally soluble, TDS, species (e.g., cesium, Cs, Sr, Tc, etc.). For these applications, an ion-specific seed (an element of the SMART™ System) was coupled with the Cs prior to EC and subsequent filtration and dewatering, for the effective removal of the cesium complex and the segregation of low-level and high-level waste (LLW & HLW) streams.
APA, Harvard, Vancouver, ISO, and other styles
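The electrocoagulation step described in the abstract above works by electrolytically dissolving a sacrificial metal electrode (the "electronic seeding") into the feed stream so that the colloid's surface charges are neutralized and a filterable floc can form. As a rough, back-of-the-envelope illustration of the dosing involved, the sketch below applies Faraday's law; it is not taken from the cited work, and the current, flow-rate, and current-efficiency values are assumptions chosen only for demonstration.

```python
# Illustrative estimate of the iron "seed" dose delivered by an
# electrocoagulation (EC) cell, via Faraday's law. All numeric inputs are
# hypothetical; nothing here is taken from the SAFE(TM) System itself.

F = 96485.0    # Faraday constant, C/mol
M_FE = 55.845  # molar mass of iron, g/mol
Z_FE = 2       # electrons per dissolved Fe ion (assuming Fe -> Fe2+)

def iron_dose_mg_per_l(current_a, flow_l_per_min, current_efficiency=0.9):
    """Steady-state iron dose (mg/L) from sacrificial iron electrodes."""
    g_per_s = current_efficiency * current_a * M_FE / (Z_FE * F)
    l_per_s = flow_l_per_min / 60.0
    return 1000.0 * g_per_s / l_per_s

# Example: a 30 gpm (~114 L/min) stream treated at 50 A (assumed values).
print(f"{iron_dose_mg_per_l(50.0, 114.0):.1f} mg/L Fe")
```

Under these assumed values the cell would deliver on the order of 7 mg/L of iron; in practice the required dose would depend on the specific waste chemistry and would be set empirically.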

Reports on the topic "Confounded design"

1

Nieto-Castanon, Alfonso. CONN functional connectivity toolbox (RRID:SCR_009550), Version 18. Hilbert Press, 2018. http://dx.doi.org/10.56441/hilbertpress.1818.9585.

Full text
Abstract:
CONN is a Matlab-based cross-platform software for the computation, display, and analysis of functional connectivity in fMRI (fcMRI). Connectivity measures include seed-to-voxel connectivity maps, ROI-to-ROI connectivity matrices, graph properties of connectivity networks, generalized psychophysiological interaction models (gPPI), intrinsic connectivity, local correlation and other voxel-to-voxel measures, independent component analyses (ICA), and dynamic component analyses (dyn-ICA). CONN is available for resting state data (rsfMRI) as well as task-related designs. It covers the entire pipeline from raw fMRI data to hypothesis testing, including spatial coregistration, ART-based scrubbing, aCompCor strategy for control of physiological and movement confounds, first-level connectivity estimation, and second-level random-effect analyses and hypothesis testing.
APA, Harvard, Vancouver, ISO, and other styles
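CONN itself is a MATLAB-based toolbox, so the following is not its API; it is only a minimal Python sketch of the computation underlying one of the connectivity measures listed above, the seed-to-voxel correlation map, with the array shapes and toy data being assumptions for illustration.

```python
# Minimal sketch of a seed-to-voxel functional connectivity map.
# This is NOT the CONN toolbox API (CONN is MATLAB-based); it only illustrates
# the underlying computation on already-denoised time series. Shapes and the
# toy data are assumptions.
import numpy as np

def seed_to_voxel_map(seed_ts, voxel_ts):
    """Fisher z-transformed Pearson correlation of a seed with every voxel.

    seed_ts  : (n_timepoints,) preprocessed seed ROI time series
    voxel_ts : (n_timepoints, n_voxels) preprocessed voxel time series
    """
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    vox = (voxel_ts - voxel_ts.mean(axis=0)) / voxel_ts.std(axis=0)
    r = vox.T @ seed / seed.size        # Pearson r per voxel
    return np.arctanh(r)                # Fisher z, for second-level stats

# Toy usage: random data standing in for 200 timepoints x 5000 voxels.
rng = np.random.default_rng(0)
data = rng.standard_normal((200, 5000))
z_map = seed_to_voxel_map(data[:, 0], data[:, 1:])
print(z_map.shape)                      # (4999,)
```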
2

Nieto-Castanon, Alfonso. CONN functional connectivity toolbox (RRID:SCR_009550), Version 20. Hilbert Press, 2020. http://dx.doi.org/10.56441/hilbertpress.2048.3738.

Full text
Abstract:
CONN is a Matlab-based cross-platform software for the computation, display, and analysis of functional connectivity in fMRI (fcMRI). Connectivity measures include seed-to-voxel connectivity maps, ROI-to-ROI connectivity matrices, graph properties of connectivity networks, generalized psychophysiological interaction models (gPPI), intrinsic connectivity, local correlation and other voxel-to-voxel measures, independent component analyses (ICA), and dynamic component analyses (dyn-ICA). CONN is available for resting state data (rsfMRI) as well as task-related designs. It covers the entire pipeline from raw fMRI data to hypothesis testing, including spatial coregistration, ART-based scrubbing, aCompCor strategy for control of physiological and movement confounds, first-level connectivity estimation, and second-level random-effect analyses and hypothesis testing.
APA, Harvard, Vancouver, ISO, and other styles
3

Nieto-Castanon, Alfonso. CONN functional connectivity toolbox (RRID:SCR_009550), Version 19. Hilbert Press, 2019. http://dx.doi.org/10.56441/hilbertpress.1927.9364.

Full text
Abstract:
CONN is a Matlab-based cross-platform software for the computation, display, and analysis of functional connectivity in fMRI (fcMRI). Connectivity measures include seed-to-voxel connectivity maps, ROI-to-ROI connectivity matrices, graph properties of connectivity networks, generalized psychophysiological interaction models (gPPI), intrinsic connectivity, local correlation and other voxel-to-voxel measures, independent component analyses (ICA), and dynamic component analyses (dyn-ICA). CONN is available for resting state data (rsfMRI) as well as task-related designs. It covers the entire pipeline from raw fMRI data to hypothesis testing, including spatial coregistration, ART-based scrubbing, aCompCor strategy for control of physiological and movement confounds, first-level connectivity estimation, and second-level random-effect analyses and hypothesis testing.
APA, Harvard, Vancouver, ISO, and other styles
4

Clevenger, Anthony P., and Adam T. Ford. A before-after-control-impact study of wildlife fencing along a highway in the Canadian Rocky Mountains. Nevada Department of Transportation, February 2022. http://dx.doi.org/10.15788/ndot2022.02.

Full text
Abstract:
Wildlife exclusion fencing has become a standard component of highway mitigation systems designed to reduce collisions with large mammals. Past work on the effectiveness of exclusion fencing has relied heavily on control-impact (i.e., space-for-time substitution) and before-after study designs. These designs limit inference and may confound the effectiveness of mitigation with co-occurring processes that also change the rate of collisions. We used a replicated before-after-control-impact study design to assess fencing effectiveness along the Trans-Canada Highway in the Rocky Mountains of Canada. We found that collisions declined for common ungulate species (elk, mule deer, and white-tailed deer) by up to 96%, but not for large carnivores. The weak response of carnivores is likely due to a combination of fence intrusions and low sample sizes. When accounting for background changes in collision rates observed at control sites, naïve estimates of fencing effectiveness declined by 6% at one site, to 90%, and increased by 10% at another, to a realized effectiveness of 82%. When factoring in the cost of ungulate collisions to society as a whole, fencing provided a net economic gain within 1 year of construction. Over a 10-year period, fencing would provide a net economic gain of >$500,000 per km in reduced collisions. In contrast, control sites may take upwards of 90 years before background rates of collisions decline to a break-even point. Our study highlights the benefits of long-term monitoring of road mitigation projects and provides evidence of fencing effectiveness for reducing wildlife-vehicle collisions involving large mammals.
APA, Harvard, Vancouver, ISO, and other styles
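The distinction the abstract above draws between a naïve before-after estimate of fencing effectiveness and the BACI-adjusted (ratio-of-ratios) estimate can be made concrete with a small calculation. The sketch below uses invented collision counts, not the report's data, chosen so the outputs echo the 96% naïve versus 90% realized example described in the abstract.

```python
# Naive (before-after only) vs. BACI-adjusted estimates of fencing
# effectiveness. The collision counts are invented for illustration; they are
# not data from the report.

def naive_effectiveness(impact_before, impact_after):
    """Before-after estimate at the fenced (impact) site alone."""
    return 1.0 - impact_after / impact_before

def baci_effectiveness(impact_before, impact_after,
                       control_before, control_after):
    """Ratio-of-ratios estimate: change at the impact site relative to the
    background change at the unfenced control site."""
    return 1.0 - (impact_after / impact_before) / (control_after / control_before)

# Hypothetical collisions per monitoring period (assumed values).
print(naive_effectiveness(25, 1))          # 0.96 -> "96%" naive reduction
print(baci_effectiveness(25, 1, 20, 8))    # 0.90 -> ~90% realized effectiveness
```

The adjustment matters because collisions at the control site also declined (here from 20 to 8) for reasons unrelated to fencing, so part of the naïve reduction at the fenced site reflects that background trend rather than the mitigation itself.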
5

Splitter, Gary, and Menachem Banai. Microarray Analysis of Brucella melitensis Pathogenesis. United States Department of Agriculture, 2006. http://dx.doi.org/10.32747/2006.7709884.bard.

Full text
Abstract:
Original Objectives 1. To determine the Brucella genes that lead to chronic macrophage infection. 2. To identify Brucella genes that contribute to infection. 3. To confirm the importance of Brucella genes in macrophages and placental cells by mutational analysis. Background Brucella spp. is a Gram-negative facultative intracellular bacterium that infects ruminants, causing abortion or the birth of severely debilitated animals. Brucellosis, caused by B. melitensis, continues in Israel despite an intensive eradication campaign. Problems with the Rev1 vaccine emphasize the need for a greater understanding of Brucella pathogenesis that could improve vaccine designs. Virulent Brucella has developed a successful strategy for survival in its host and transmission to other hosts. To invade the host, virulent Brucella establishes an intracellular niche within macrophages, avoiding macrophage killing and ensuring its long-term survival. Then, to exit the host, Brucella uses the placenta, where it replicates to high numbers, resulting in abortion. Brucella also traffics to the mammary gland, where it is secreted in milk. Missing from our understanding of brucellosis is the surprisingly little basic information detailing the mechanisms that permit bacterial persistence in infected macrophages (chronic infection) and dissemination to other animals from infected placental cells and milk (acute infection). Microarray analysis is a powerful approach to determine global gene expression in bacteria. The close genomic similarities of Brucella species, and our recent comparative genomic studies of Brucella species using our B. melitensis microarray, suggest that the data obtained from studying B. melitensis 16M would enable understanding the pathogenicity of other Brucella organisms, particularly the diverse B. melitensis variants that confound Brucella eradication in Israel. Conclusions Results from our BARD studies have identified previously unknown mechanisms of Brucella melitensis pathogenesis, i.e., response to blue light, quorum sensing, second-messenger signaling by cyclic di-GMP, the importance of genomic island 2 for lipopolysaccharide in the outer bacterial membrane, and the role of a TIR-domain-containing protein that mimics a host intracellular signaling molecule. Each one of these pathogenic mechanisms offers major steps in our understanding of Brucella pathogenesis. Strikingly, our molecular results have correlated well with the pathognomonic profile of the disease. We have shown that infected cattle do not elicit antibodies to the organisms at the onset of infection, consistent with the stealth pathogenesis shown by the molecular approach. Moreover, our field studies have shown that Brucella exploit this time frame to transmit in nature by synchronizing their life cycle to the gestation cycle of their host, which succumbs to abortion in the last trimester of pregnancy, spreading massive numbers of organisms into the environment. Knowing the bacterial mechanisms that contribute to the virulence of Brucella in its host has opened agricultural opportunities for developing new vaccines and diagnostic assays as well as improving control and eradication campaigns based on herd management and linking diagnosis to the pregnancy status of the animals. Scientific and Agricultural Implications Our BARD-funded studies have revealed important Brucella virulence mechanisms of pathogenesis.
Our publication in Science identified a highly novel concept in which Brucella utilizes blue light to increase its virulence, similar to some plant bacterial pathogens. Further, our studies revealed bacterial second messengers that regulate virulence, quorum sensing mechanisms permitting bacteria to evaluate their environment, and a genomic island that controls synthesis of its lipopolysaccharide surface. Discussions are ongoing with a vaccine company for application of this genomic island knowledge in a Brucella vaccine by the U.S. lab. Also, our new technology of bioengineering bioluminescent Brucella has resulted in a spin-off application for the diagnosis of Brucella-infected animals by the Israeli lab, prioritizing bacterial diagnosis over serological diagnosis.
APA, Harvard, Vancouver, ISO, and other styles