Dissertations / Theses on the topic 'Baseline Approaches'

Consult the top 20 dissertations / theses for your research on the topic 'Baseline Approaches.'

1

Antrobus, Alexander Dennis. "Achieving baseline states in sparsely connected spiking-neural networks: stochastic and dynamic approaches in mathematical neuroscience." Master's thesis, University of Cape Town, 2015. http://hdl.handle.net/11427/19949.

Full text
Abstract:
Networks of simple spiking neurons provide abstract models for studying the dynamics of biological neural tissue. At the expense of cellular-level complexity, they are a framework in which we can gain a clearer understanding of network-level dynamics. Substantial insight can be gained analytically, using methods from stochastic calculus and dynamical systems theory. This can be complemented by data generated from computational simulations of these models, most of which benefit easily from parallelisation. One cubic millimetre of mammalian cortical tissue can contain between fifty and one hundred thousand neurons and display considerable homogeneity. Mammalian cortical tissue (or "grey matter") also displays several distinct firing patterns which are widely and regularly observed in several species. One such state is the "input-free" state of low-rate, stochastic firing. A key objective over the past two decades of modelling spiking-neuron networks has been to replicate this background activity state using "biologically plausible" parameters. Several models have produced dynamically and statistically reasonable activity (to varying degrees), but almost all of these have relied on some driving component in the network, such as endogenous cells (i.e. cells which fire spontaneously) or widespread, randomised external input (attributed to background noise from other brain regions). Perhaps it would be preferable to have a model where the system itself is capable of maintaining such a background state? This is a functionally important question, as it may help us understand how neural activity is generated internally and how memory works. There has also been some contention as to whether "driven" models produce statistically realistic results. Recent numerical results show that there are connectivity regimes in which Self-Sustained, Asynchronous, Irregular (SSAI) firing activity can be achieved. In this thesis, I discuss the history and analysis of the key spiking-network models proposed in the progression toward addressing this problem. I also discuss the underlying constructions and mathematical theory from measure theory and the theory of Markov processes which are used in the analysis of these models. I then present a small adjustment to a well-known model and provide some original work in analysing the resultant dynamics. I compare this analysis to data generated by simulations. I also discuss how this analysis can be improved and what the broader future is for this line of research.
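The kind of background state discussed here can be explored numerically. Below is a minimal sketch of a sparsely connected leaky integrate-and-fire network that is kicked briefly by external input and then left to run on recurrent input alone; all parameter values are illustrative assumptions, not those used in the thesis.

```python
import numpy as np

# Minimal sketch of a sparsely connected leaky integrate-and-fire (LIF) network.
# All parameter values are illustrative assumptions, not taken from the thesis.
rng = np.random.default_rng(0)

n_exc, n_inh = 800, 200                         # excitatory / inhibitory neurons
n = n_exc + n_inh
p_connect = 0.1                                 # sparse random connectivity
tau_m, v_rest, v_thresh, v_reset = 20.0, 0.0, 20.0, 10.0   # ms / mV
dt, t_sim, t_kick = 0.1, 400.0, 20.0            # ms

# Random synaptic weights: excitatory positive, inhibitory negative and stronger.
w = np.where(rng.random((n, n)) < p_connect, 0.2, 0.0)
w[:, n_exc:] *= -5.0

v = rng.uniform(v_rest, v_thresh, n)            # random initial membrane potentials
spike_counts = np.zeros(n)

for step in range(int(t_sim / dt)):
    spiked = v >= v_thresh
    spike_counts += spiked
    v[spiked] = v_reset
    i_ext = 1.2 if step * dt < t_kick else 0.0  # external drive only during the kick
    v += dt * (-(v - v_rest) / tau_m + i_ext) + w @ spiked.astype(float)

rate_hz = spike_counts / (t_sim / 1000.0)
print(f"mean firing rate over the run: {rate_hz.mean():.1f} Hz")
```

Whether activity persists after the external kick is switched off depends on the connectivity and weight regime, which is precisely the kind of question the thesis analyses.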
2

Bouraoui, Seyfallah. "Time series analysis of SAR images using persistent scatterer (PS), small baseline (SB) and merged approaches in regions with small surface deformation." PhD thesis, Université de Strasbourg, 2013. http://tel.archives-ouvertes.fr/tel-01019429.

Full text
Abstract:
This thesis aims at the study of small to large surface deformation that can be detected using remote-sensing interferometric synthetic aperture radar (InSAR) methods. New developments in InSAR processing techniques allow the monitoring of surface deformation with millimetre accuracy. Conventional InSAR uses a pair of SAR images ("master" and "slave" images) in order to measure the phase difference between two images taken at different times. The uncertainties in conventional InSAR measurements due to atmospheric delay, topographic changes and orbital artifacts are the handicaps of this method. The idea of the InSAR method is to measure the phase difference between two SAR acquisitions; this measurement relates to ground movement with respect to the satellite position. In an interferogram, the red-to-blue colours indicate pixel movement towards or away from the satellite in the line-of-sight (LOS) direction. In the 2000s, a large number of radar missions were launched, and SAR acquisitions and InSAR applications expanded across different geophysical studies thanks to the abundance of SAR data and the ease of data access. This SAR data mining requires a new generation of InSAR processing. In 2001, Ferretti and others introduced a new method called Permanent Scatterer InSAR (PS), based on using more than one slave image in InSAR processing with the same master image. This method enhances the LOS signal for each pixel (PS) by using the best time- and/or space-correlated signal (from amplitude and/or phase) for each pixel over the acquisitions. A large number of algorithms (variants) were developed for this purpose using the same principle. In 2002, Berardino et al. developed a new algorithm for monitoring surface deformation based on combining a stack of InSAR results from SAR pairs respecting a small baseline (SB) distance. Nowadays, these two methods represent the existing time-series (TS) approaches to the analysis of SAR images. In addition, the StaMPS software introduced by Hooper and others in 2008 is able to combine these two methods in order to take advantage of both TS approaches in terms of best signal correlation and reduced signal noise. In this thesis, the time-series study of surface changes associated with different geophysical phenomena has two aims: first, to present the PS and SBAS results and discuss the reliability of the obtained InSAR signal by comparison with previous studies of the same geophysical case or with field observations; second, the combined method is used to validate the results obtained separately with the different TS techniques. The validation of the obtained signal is ensured in two steps: the PS and SBAS methods should give essentially the same interferograms and LOS displacement signal (in terms of sign and values), and these results are then compared with previous results or with observations in the field. In this thesis, the InSAR techniques are applied to different case studies of small surface deformation [...]
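The core conventional-InSAR step described above (the phase difference between two acquisitions and its conversion to LOS displacement) can be sketched as follows; the synthetic image arrays and the C-band wavelength value are illustrative assumptions, not data from the thesis.

```python
import numpy as np

# Sketch: wrapped interferogram from two co-registered complex SAR images
# (master and slave) and conversion of phase to line-of-sight displacement.
rng = np.random.default_rng(1)
shape = (256, 256)

master = rng.normal(size=shape) + 1j * rng.normal(size=shape)
slave = master * np.exp(-1j * 0.05 * np.arange(shape[1]))   # synthetic phase ramp

# Interferometric phase: angle of the complex cross-product, wrapped to [-pi, pi].
phase = np.angle(master * np.conj(slave))

# One full phase cycle (2*pi) corresponds to half a wavelength of LOS change.
wavelength = 0.056            # metres (C-band, e.g. ERS/Envisat), an assumed value
los_displacement = phase * wavelength / (4 * np.pi)

print(f"phase range: [{phase.min():.2f}, {phase.max():.2f}] rad, "
      f"max |LOS| ~ {np.abs(los_displacement).max() * 1000:.1f} mm")
```

PS and SBAS processing build on this same observable, but combine many such interferograms to separate deformation from atmospheric and orbital contributions.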
3

Chun, Seokjoon. "Using MIMIC Methods to Detect and Identify Sources of DIF among Multiple Groups." Scholar Commons, 2014. https://scholarcommons.usf.edu/etd/5352.

Full text
Abstract:
This study investigated the efficacy of multiple indicators, multiple causes (MIMIC) methods in detecting uniform and nonuniform differential item functioning (DIF) among multiple groups, where the underlying causes of DIF were different. Three different implementations of MIMIC DIF detection were studied: sequential free baseline, free baseline, and constrained baseline. In addition, the robustness of the MIMIC methods against violation of their assumption of equal factor variance across comparison groups was investigated. We found that the sequential free baseline method provided Type I error and power rates similar to those of the free baseline method with a designated anchor, and much better Type I error and power rates than the constrained baseline method across four groups resulting from co-occurring background variables. However, when the equal factor variance assumption was violated, the MIMIC methods yielded inflated Type I error rates. Also, the MIMIC procedure had problems correctly identifying the sources of DIF, so further methodological developments are needed.
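For readers unfamiliar with the MIMIC approach, a generic single-factor MIMIC DIF model takes roughly the following form; this is a standard textbook formulation, not the exact specification used in the dissertation.

```latex
% Generic single-factor MIMIC model with a grouping covariate z:
\begin{aligned}
  \eta    &= \gamma z + \zeta ,
            && \text{(latent factor regressed on the covariate)} \\
  y_i^{*} &= \lambda_i \eta + \beta_i z + \varepsilon_i ,
            && \text{(item response with a direct covariate effect)}
\end{aligned}
% Uniform DIF on item i corresponds to \beta_i \neq 0; nonuniform DIF is
% typically captured by adding an interaction term such as \omega_i (\eta z).
% Free-baseline variants fix the DIF effects of anchor items to zero and test
% the remaining items; constrained-baseline variants start with all effects fixed.
```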
4

Uhde, Kristin Broome. "Bioterrorism Syndromic Surveillance: A Dual-Use Approach with Direct Application to the Detection of Infectious Disease Outbreaks." [Tampa, Fla.] : University of South Florida, 2003. http://purl.fcla.edu/fcla/etd/SFE0000623.

Full text
5

Romano, Elisa. "Evaluation of a multi-component individual treatment intervention for adult males with histories of sexual abuse : a multiple-baseline approach." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1999. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape10/PQDD_0006/NQ41624.pdf.

Full text
6

PALENA, Nicola (ORCID:0000-0002-2746-1208). "Educing Information and Deception Detection: Testing the Efficacy of the Baseline Approach and of Social Influence Tactics in HUMINT Interviewing." Doctoral thesis, Università degli studi di Bergamo, 2020. http://hdl.handle.net/10446/181498.

Full text
7

Fusaro, Jonathan L. "Estimating Baseline Population Parameters of Urban and Wildland Black Bear Populations Using a DNA-Based Capture-Mark-Recapture Approach in Mono County, California." DigitalCommons@USU, 2014. https://digitalcommons.usu.edu/etd/3706.

Full text
Abstract:
Prior to European settlement, black bears (Ursus americanus) were far less abundant in the state of California. Estimates from statewide harvest data indicate the California black bear population has tripled in the last 3 decades. Bears now inhabit areas where they formerly never occurred (e.g., urban environments), and populations that were at historically low densities are now at high densities. Though harvest data are useful and widely used as an index for black bear population size and population demographics statewide, they lack the ability to produce precise estimates of abundance and density at local scales or to account for the numerous bears living in non-hunted areas. As the human population continues to expand into wildlife habitat, we are being forced to confront controversial issues about wildlife management and conservation. Habituated bears living in non-hunted, urban areas have been and continue to be a major concern for wildlife managers and the general public. My objective was to develop DNA-based capture-mark-recapture (CMR) survey techniques in wildland and urban environments in Mono County, California to acquire population size and density at local scales from 2010 to 2012. I also compared population density between the urban and wildland environments. To my knowledge, DNA-based CMR surveys for bears have only been implemented in wildland or rural environments. I made numerous modifications to the techniques used during wildland DNA-based CMR surveys in order to survey bears in an urban environment. I used a higher density of hair snares than typically used in wildland studies, non-consumable lures, hair snares modified for public safety, included the public throughout the entire process, and surveyed the urban-wildland interface as well as the city center. These methods were efficient and accurate while maintaining human safety. I determined that there is likely a difference in population density between the urban and wildland environments. Population density was 1.6 to 2.5 times higher in the urban study area than in the wildland study area. Considering the negative impacts urban environments can have on wildland bear populations, this is a serious management concern. The densities I found were similar to those found in other urban and wildland black bear populations. The baseline data acquired from this study can be used as part of a long-term monitoring effort. By surveying additional years, population vital rates such as apparent survival, recruitment, movement, and the finite rate of population change can be estimated.
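As a rough illustration of the capture-mark-recapture logic behind such estimates, the sketch below applies the bias-corrected Lincoln-Petersen (Chapman) estimator to a two-occasion survey; the capture counts and survey area are made-up numbers, not data from this study.

```python
def chapman_estimate(n1: int, n2: int, m2: int) -> float:
    """Bias-corrected Lincoln-Petersen (Chapman) abundance estimate.

    n1 -- animals marked on the first occasion (e.g. unique genotypes in session 1)
    n2 -- animals captured on the second occasion
    m2 -- animals captured on the second occasion that were already marked
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical hair-snare survey: 30 genotypes in session 1, 25 in session 2,
# of which 10 were recaptures; hypothetical 100 km^2 effective survey area.
n_hat = chapman_estimate(30, 25, 10)
density = n_hat / 100.0
print(f"estimated abundance: {n_hat:.0f} bears, density: {density:.2f} bears/km^2")
```

Multi-occasion DNA-based surveys like the one described in the thesis use more elaborate closed-population models, but the underlying idea of estimating abundance from recapture proportions is the same.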
8

Mendoza, Guillermo I. "Exploring Gesturing as a Natural Approach to Impact Stages of Second Language Development: A Multiple Baseline, Single Case Study of a Head Start Child." Digital Commons @ East Tennessee State University, 2016. https://dc.etsu.edu/etd/3121.

Full text
Abstract:
There is an increasing number of Hispanic English Language Learners (ELLs). Poverty and a lack of teacher training can also be stacked against the ELL population. Gesturing is a teaching technique used in successful methods such as the Natural Approach (NA) and Total Physical Response (TPR) to help ELL students with English comprehension and output. This study examined the effect that increased teacher gesturing has on the number of words spoken by the child in multiple settings. Data were collected in the context of a multiple baseline design across three settings. The results indicate that there was an effect on the number of words spoken in two out of three settings. Suggestions are presented to expand on this effect.
9

Hayward, Joanna I. "A Latent Profile Analysis of Baseline Difficulties in Emotion Regulation and Experiential Avoidance on Depression and Anxiety in a Psychiatric Inpatient Sample: A Person Centered Approach." University of Toledo / OhioLINK, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1515238096646445.

Full text
10

Esch, Christina [Verfasser]. "Development of a one-step three dimensional approach for the phase unwrapping process in a differential InSAR stack based on Small BAseline Subset (SBAS) interferograms / Christina Esch." Bonn : Universitäts- und Landesbibliothek Bonn, 2020. http://d-nb.info/1219140562/34.

Full text
11

WANZIRA, HUMPHREY. "Supportive Supervision as an approach to improve the quality of care for children with acute malnutrition in Arua district, Uganda: Baseline systematic assessment, Cluster Randomised Controlled Trial and Cost-Effectiveness Analysis." Doctoral thesis, Università degli Studi di Trieste, 2019. http://hdl.handle.net/11368/2962380.

Full text
Abstract:
INTRODUCTION Moderate and severe acute malnutrition estimates among children in the West Nile region, in Uganda, are higher than the national level (10.4% and 5.6%, respectively, versus 3.6% and 1.3%). Additionally, the WHO estimates that in 2016, 6.6 million children and young adolescents died from causes attributed to the poor quality of care in similar settings. Supportive supervision (SS) has been proposed as one of the approaches to improve quality of care. The main objectives of this project were: to determine the baseline status of the quality of care of nutrition services and health outcomes among malnourished children at health facility level; to test the effectiveness of supportive supervision in improving health outcomes and quality of care; and to estimate its cost-effectiveness. METHODS Phase one: Six health centers with the highest burden of malnutrition in Arua district, West Nile region, were selected. Information on health outcomes (cured, defaulters, non-responders, transferred and died) and quality of case management was extracted from official records. Quality of care was assessed using the national Nutrition Service Delivery Assessment (NSDA) tool, with ten key areas scored as poor, fair, good or excellent. Phase two: The six facilities were randomized to receive either SS or control. SS was delivered for ten months in two equal five-month periods: to health center (HC) staff only (first period), and later extended to community health workers (CHWs) (second period). SS was delivered biweekly for the first three months and monthly thereafter. The package included: monitoring progress, provision of technical support, facilitating good team dynamics and a problem-solving attitude. The control facilities were assigned to receive the national routine quarterly supervisory visits. Main outcomes included health outcomes, quality of case management, quality of nutrition service delivery and access to care. Phase three: The incremental cost-effectiveness ratios (ICERs) for the first and second periods were estimated. RESULTS Phase one: A total of 1020 children were assessed at baseline. The cure and defaulter rates were 52.9% (95% CI: 49.7 – 56.1) and 38.3% (95% CI: 35.2 – 41.4) respectively. The NSDA revealed 33/60 (55%) areas scored as poor, 25/60 (41%) as fair, 2/60 (3.3%) as good and none as excellent. Main gaps included: lack of trained staff; disorganized patient flow; poor case management; stock-outs of essential nutrition supplies; and weak community linkage. Phase two: 737 children were enrolled, 430 in the intervention and 307 in the control. Significant findings for the intervention versus control included: a higher cure rate [83.8% (95% CI: 79.4 – 86.7) versus 44.9% (95% CI: 37.8 – 49.1), p=0.010], a lower defaulting rate [1.4% (95% CI 1.1% to 1.8%) versus 47.2% (95% CI 37.3% to 57.1%), p=0.001], higher correct complementary treatment (94.0% versus 58.8%, p=0.001) and more NSDA areas scored as either good or excellent [24/30 (80%) versus 14/30 (46.6%), OR = 4.6 (1.3 – 17.4), p=0.007]. Access to care was significantly higher during the second period compared with the first period [proportion difference = 28.4%, OR = 1.7 (1.3 – 2.3), p = 0.001]. Phase three: ICERs of € 9.7 (95% CI: 7.4 – 14.9) and € 6.8 (95% CI: 4.8 – 9.5) were estimated for the first and second periods respectively. CONCLUSION At baseline, the quality of care provided to children with malnutrition at health center level was greatly substandard.
The delivery of SS to HC staff and CHWs significantly improved the cure rate, the quality of case management, the overall quality of care and access to care. SS, especially that delivered to CHWs, was very cost-effective.
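The ICERs reported in phase three follow the standard definition: the difference in cost divided by the difference in effect between the intervention and the comparator. A minimal sketch with purely hypothetical cost and outcome figures is shown below.

```python
def icer(cost_intervention: float, cost_control: float,
         effect_intervention: float, effect_control: float) -> float:
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_intervention - cost_control) / (effect_intervention - effect_control)

# Hypothetical figures (not the study's data): total programme cost and number of
# children cured under supportive supervision versus routine supervision.
value = icer(cost_intervention=5000.0, cost_control=2000.0,
             effect_intervention=360, effect_control=140)
print(f"ICER: {value:.1f} EUR per additional child cured")
```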
12

Proença, Sara Isabel Azevedo. "Impact assessment of energy and climate policies: a hybrid bottom-up general equilibrium model (HyBGem) for Portugal." Doctoral thesis, Instituto Superior de Economia e Gestão, 2013. http://hdl.handle.net/10400.5/6126.

Full text
Abstract:
Doctorate in Economics
Climate change mitigation and the imperative of a new sustainable energy paradigm are among the greatest challenges facing the world today, and they are high on the priority list of policy makers as well as the scientific community. In this context significant efforts are being made in the design and implementation of energy and carbon mitigation policies at both European and national level. Evidence of this can be seen in the recent adoption by the EU of an integrated climate and energy policy that sets ambitious binding targets to be achieved by 2020 – known as the 20-20-20 targets of the EU Climate and Energy Package. Undoubtedly, the cost of these policies can be substantially reduced if a comprehensive impact assessment is made of the most efficient and cost-effective policy measures and technological options. Policy impact assessment therefore plays an important role in supporting the energy and climate decision-making process. This is the context of and motivation for the research presented in this thesis. The first part of the thesis, the conceptual framework, describes the development of the Hybrid Bottom-up General Equilibrium Model (HyBGEM) for Portugal, as a decision-support tool to assist national policy makers in conducting energy and climate policy analysis. HyBGEM is a single integrated, multi-sector, hybrid top-down/bottom-up general equilibrium E3 model formulated as a mixed complementarity problem. The second part of the thesis, the empirical analysis, provides an impact assessment of Portugal's 2020 energy-climate policy targets under the EU Climate and Energy Package commitments, based on the HyBGEM model and the baseline projections previously developed. Five policy scenarios have been modelled and simulated to evaluate the economic, environmental and technological impacts on Portugal of complying with its individual 2020 carbon emissions and renewable energy targets. Furthermore, insights are gained into how these targets interact with each other, what the most efficient and cost-effective policy options are, and how alternative pathways affect the extent of policy-induced effects. The numerical analysis reveals that Portugal's 2020 energy-climate targets can be achieved without significant compliance costs. A major challenge for policy makers is to promote an effective decarbonisation of the electricity generation sector through renewable-based technologies. There is evidence that the compliance costs of Portugal's low-carbon target in 2020 are significantly higher than the costs of achieving the national RES-E target, given that imposing carbon emissions constraints and subsidising renewable electricity generation via a feed-in tariff scheme both have a similar impact on economy-wide emissions. This result suggests that the most cost-effective policy option for achieving the national energy-climate targets is to promote renewable power generation technologies, recommending that policy makers proceed with the mechanisms that support them. The transition to a 'greener' economy is thus central to the ongoing fight against climate change. There is also evidence that emission market segmentation as imposed by the current EU-ETS creates substantial excess costs compared to uniform emissions pricing through a comprehensive cap-and-trade system. The economic argument on counterproductive overlapping regulation is not corroborated by the findings. Furthermore, there is no potential for a double dividend arising from environmental tax reforms.
To conclude, the results highlight the critical importance of market distortions and revenue-recycling schemes, together with baseline projections, in policy impact assessment.
Portuguese abstract (translated): Climate change mitigation and the imperative of a new sustainable energy paradigm are among the greatest challenges facing the world today, sitting at the top of the priority list of both policy makers and the scientific community. In this context, significant efforts have been made in the design and implementation of energy and carbon mitigation policies at both European and national level. The recent adoption of an integrated EU climate and energy policy, with ambitious targets to be achieved by 2020 – the so-called 20-20-20 targets of the EU Climate and Energy Package – is proof of this. There is no doubt that the cost of these policies can be substantially reduced if a comprehensive assessment is made of the most efficient and cost-effective measures and technological options. Policy impact assessment therefore plays an important role in supporting energy and climate decision-making. These are the context and motivation for the research presented in this thesis. The first part of the thesis, the conceptual framework, describes the development of the HyBGEM (Hybrid Bottom-up General Equilibrium Model) built for Portugal, a decision-support tool for energy-climate policy. HyBGEM is a general equilibrium E3 model with an integrated hybrid top-down/bottom-up structure, multi-sectoral and formulated as a mixed complementarity problem. The second part of the thesis, the empirical analysis, presents an impact assessment of energy-climate policies for Portugal within the framework of the commitments made under the EU Climate and Energy Package, based on the HyBGEM model and previously constructed baseline projections. Five policy scenarios were modelled and simulated to assess the economic, environmental and technological impacts of meeting the national 2020 targets for limiting carbon emissions and promoting renewable energy. It also assesses how these targets interact with each other, which policy options are the most efficient and cost-effective, and to what extent alternative options influence the magnitude of the impacts. The numerical analysis reveals that Portugal's 2020 energy-climate targets can be achieved without incurring significant compliance costs. The fundamental challenge facing policy makers is to drive the decarbonisation of the electricity generation sector through renewable energy technologies. There is evidence that the compliance costs of the carbon reduction target are significantly higher than those of the RES-E target, given that imposing emission constraints and subsidising electricity generation from renewable sources (feed-in tariff scheme) have a similar impact on total emissions. This result suggests that promoting renewable-based technologies in the national energy system is the most cost-effective option for achieving the national 2020 energy-climate objectives, urging policy makers to continue with the existing support mechanisms. The transition to a 'greener' economy is thus central to the ongoing fight against climate change.
The analysis also reveals that the emissions market segmentation imposed by the current EU ETS generates substantial additional costs compared with a uniform emissions trading system. The economic argument that overlapping regulation is counterproductive is not corroborated by the results. The expectation of a double dividend arising from environmental tax reforms was not confirmed. The results further highlight the critical importance of market distortions, revenue-recycling schemes and baseline projections for policy impact assessment.
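For readers unfamiliar with the mixed complementarity formulation mentioned above, the generic structure pairs each equilibrium condition with a non-negative variable; the following is a schematic textbook form, not the HyBGEM equations themselves.

```latex
% Generic mixed complementarity problem (MCP), a schematic sketch:
%   find z \ge 0 such that
F(z) \ge 0 , \qquad z \ge 0 , \qquad z^{\top} F(z) = 0 .
% In a CGE setting the complementary pairs are typically:
%   zero-profit conditions      \perp  activity levels,
%   market-clearance conditions \perp  prices,
%   income-balance conditions   \perp  household income.
```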
13

Upton, Janine-Lee. "Integrating spatial, temporal, referral problem and demographic approaches to establish systematic baseline data to inform future evaluations at the Pietermaritzburg Child and Family Centre." Thesis, 2013. http://hdl.handle.net/10413/10651.

Full text
Abstract:
This exploratory retrospective record review, extending from 1975 to 2013, of the Pietermaritzburg Child and Family Centre (hereafter referred to as the "CFC") aimed to create a database to electronically capture CFC records, to generate descriptive statistics, and to create CFC user profiles utilising multiple data analysis methodologies, in order to establish baseline data to inform future program evaluations. To date, no formal program evaluation has been conducted by the CFC, resulting in programs being launched without the backing of systematic and empirical data used to inform decisions. Data-driven decision making is imperative when deciding on resource allocation to ensure maximum derived benefits. The study sample totalled 1974 records from the past three decades since CFC inception in 1975. These records were captured electronically in a Microsoft Access database. SPSS and ArcGIS were used to analyse the data to create service user profiles and gather baseline data to inform future needs assessments and program evaluations. The study found temporal changes in CFC user demographics, referral problems, socio-economic standing, and referral schools since inception in 1975. Using geographic information software, the study explored the distribution of client demographics, residence, referral problems, and CFC service reach, and found that there are significant geographical variations in each of these constructs. The geographic variations, together with the statistical findings, highlight the importance of establishing monitoring and evaluation systems in order to stay relevant to the needs of CFC users. Further, the findings suggest a tailored approach to CFC program development and focus, depending on future CFC priorities.
Thesis (M.Soc.Sc.)-University of KwaZulu-Natal, Pietermaritzburg, 2013.
14

Shriwastav, Sachin. "Negotiator Initiated Connectivity Restoration in Partitioned Wireless Sensor Networks." Thesis, 2018. https://etd.iisc.ac.in/handle/2005/4112.

Full text
Abstract:
Wireless Sensor Networks (WSNs) are groups of sensors, connected by communication links and spread over an area of interest, that perform a specific task. The network may get partitioned into clusters due to unanticipated simultaneous multiple node failures, and in such an event will need to get reconnected to continue operation. These clusters are unaware of their own size, surviving nodes and links, as well as the size and location of other survivor clusters. In our work, we propose a distributed and autonomous approach, namely the Round Table Negotiation (RTN) approach, for reconnecting disjoint clusters in a short time. In this approach, each survivor cluster undergoes a self-discovery process, compiles its information, and then sends a negotiator to the round table around a pre-specified meeting point to participate in the negotiation and decision-making process. All such negotiators exchange information and decide upon reconnection paths between the clusters and the nodes to involve in the reconnection process. The negotiators then return to their respective clusters and the reconnection process is carried out. Simulation results on wireless networks of varying sizes are presented. We show, through a detailed comparison with existing methods, that the proposed approach achieves reconnection in significantly lower time and is favorably comparable with respect to other performance metrics as well. In the second part of the work, we introduce the notion of importance of coverage area, that is, WSNs deployed in areas of varying coverage importance, classified by Importance Ranks (IR). In addition to connectivity restoration, the lost coverage of higher importance needs to be recovered, using the survivor nodes from the less important coverage area. This is done in two ways: recovery followed by reconnection, and vice versa, using the RTN approach. At the round table, the negotiation includes the assignment of importance-based replacements for lost coverage along with reconnection paths and the nodes to involve in the process. Once the negotiators return to their clusters, the recovery and reconnection processes are carried out. Simulation of the application of the proposed modules on a randomly generated network is presented, along with a comparison of these methods with the original RTN approach over various random networks, to demonstrate the applicability and to further study the efficacy.
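As a rough illustration of the kind of decision the negotiators make, the sketch below (a simplified toy, not the thesis' RTN algorithm) selects the shortest inter-cluster bridges that reconnect three partitioned clusters; the node coordinates are made-up data.

```python
import numpy as np
from itertools import combinations

# Toy sketch of the joint decision step: each partitioned cluster is summarised by
# its surviving node positions, and the shortest bridges that merge all clusters
# into one connected component are chosen greedily (Kruskal-style).
rng = np.random.default_rng(2)
clusters = [rng.uniform(low, low + 20, size=(10, 2)) for low in (0, 40, 80)]

# Candidate bridges: for every pair of clusters, the closest pair of nodes.
candidates = []
for (i, a), (j, b) in combinations(enumerate(clusters), 2):
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    candidates.append((d.min(), i, j))

# Union-find over clusters; accept a bridge only if it merges two components.
parent = list(range(len(clusters)))
def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

for dist, i, j in sorted(candidates):
    ri, rj = find(i), find(j)
    if ri != rj:
        parent[ri] = rj
        print(f"bridge cluster {i} <-> cluster {j}, gap {dist:.1f} m")
```

In the actual RTN approach this decision is reached by negotiators physically meeting at the pre-specified point, which is what allows the clusters to coordinate without global knowledge of the damaged network.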
15

Chwialkowski, Natalia Ewa. "Novel approaches in determining baseline information on annual disposal rates and trace element content of U.S. coal combustion residues : a response to EPA’s June 2010 proposed disposal rule." Thesis, 2010. http://hdl.handle.net/2152/ETD-UT-2010-12-2386.

Full text
Abstract:
Although products of coal combustion (PCCs) such as coal ash are currently exempted from classification as a hazardous waste in the United States under the 1976 Resource Conservation and Recovery Act (RCRA), the U.S. Environmental Protection Agency (EPA) is now revising a proposed rule to modify disposal practices for these materials in order to prevent contamination of ground- and surface water sources by leached trace elements. This paper analyzes several aspects of EPA’s scientific reasoning for instating the rule, with the intent of answering the following questions: 1) Are EPA’s cited values for PCC production and disposal accurate estimates of annual totals?; 2) In what ways can EPA’s leaching risk modeling assessment be improved?; 3) What is the total quantity of trace elements contained within all PCCs disposed annually?; and 4) What would be the potential costs and feasibility of reclassifying PCCs not under RCRA, but under existing NRC regulations as low-level radioactive waste (LLRW)? Among the results of my calculations, I found that although EPA estimates for annual PCC disposal are 20% larger than industry statistics, these latter values appear to be closer to reality. Second, EPA appears to have significantly underestimated historical PCC disposal: my projections indicate that EPA’s maximum estimate for the quantity of fly ash landfilled within the past 90 years was likely met by production in the last 30 years alone, if not less. Finally, my analysis indicates that while PCCs may potentially meet the criteria for reclassification as low-level radioactive waste by NRC, the cost of such regulation would be many times that of the EPA June proposed disposal rule ($220-302 billion for PCCs disposed in 2008 alone, versus $1.47 billion per year for the Subtitle C option and $236-587 million for Subtitle D regulatory options).
16

Jabeen, Rukhshinda. "Automated Baseline Estimation for Analytical Signals." 2013. http://hdl.handle.net/10222/37440.

Full text
Abstract:
During the last decade, many baseline estimation methods have been proposed, but many of these approaches are either only useful for specific kinds of analytical signals or require the adjustment of many parameters. This complicates the selection of an appropriate approach for each kind of chemical signal, and the optimization of multiple parameters is itself not an easy task. In this work, an asymmetric least squares (ALS) approach is used with truncated and augmented Fourier basis functions to provide a universal basis space for baseline approximation for diverse analytical signals. The proposed method does not require extensive parameter adjustment or prior baseline information. The basis set used to model the baselines includes a Fourier series truncated to low-frequency sines and cosines (consistent with the number of channels), which is then augmented with lower frequencies. The number of basis functions employed depends mainly on the frequency characteristics of the baseline, which is the only parameter adjustment required for baseline estimation. The weighting factor for the asymmetric least squares in this case depends mainly on the level of the noise. The adjustment of these two parameters can be easily performed by visual inspection of the results. To estimate and eliminate the baseline from analytical signals, a novel algorithm, called Truncated Fourier Asymmetric Least Squares (TFALS), was successfully developed and optimized. It does not require baseline-representative signals or extensive parameter adjustments. Parameter optimization for the method is described using simulated signals. The results with simulated and experimental data sets having different baseline artefacts show that TFALS is a versatile, effective and easy-to-use baseline removal method.
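A minimal sketch of the asymmetric-least-squares idea with a truncated Fourier basis is shown below; the basis size, asymmetry parameter and synthetic test signal are illustrative assumptions rather than the settings used in the thesis.

```python
import numpy as np

def fourier_basis(n_points: int, n_harmonics: int) -> np.ndarray:
    """Constant term plus low-frequency sine/cosine pairs over the signal length."""
    t = np.linspace(0, 1, n_points)
    cols = [np.ones(n_points)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(2 * np.pi * k * t))
        cols.append(np.sin(2 * np.pi * k * t))
    return np.column_stack(cols)

def als_baseline(y: np.ndarray, n_harmonics: int = 3, p: float = 0.01,
                 n_iter: int = 20) -> np.ndarray:
    """Iteratively reweighted least squares: points above the current fit (peaks)
    get small weight p, points below get weight 1 - p, so the fit hugs the baseline."""
    basis = fourier_basis(len(y), n_harmonics)
    w = np.ones(len(y))
    for _ in range(n_iter):
        sw = np.sqrt(w)
        coef, *_ = np.linalg.lstsq(basis * sw[:, None], y * sw, rcond=None)
        baseline = basis @ coef
        w = np.where(y > baseline, p, 1.0 - p)
    return baseline

# Synthetic example: a narrow Gaussian peak on a slowly varying baseline.
x = np.linspace(0, 1, 500)
signal = np.exp(-((x - 0.5) ** 2) / 0.001) + 0.5 * np.sin(2 * np.pi * 0.7 * x)
corrected = signal - als_baseline(signal)
print(f"residual baseline level: {np.median(corrected):.3f}")
```

The asymmetry weight plays the role described in the abstract: it pushes the fitted curve under the analytical peaks so that only the smooth background is modelled by the truncated Fourier basis.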
17

"Baseline free approach for the semiparametric transformation models with missing covariates." 2003. http://library.cuhk.edu.hk/record=b5891462.

Full text
Abstract:
Leung Man-Kit.
Thesis (M.Phil.)--Chinese University of Hong Kong, 2003.
Includes bibliographical references (leaves 37-41).
Abstracts in English and Chinese.
Chapter 1 --- Introduction --- p.1
Chapter 1.1 --- Basic concepts of survival data --- p.3
Chapter 1.2 --- Missing Complete at Random (MCAR) --- p.8
Chapter 1.3 --- Missing at Random (MAR) --- p.9
Chapter 2 --- The maximization of the marginal likelihood --- p.11
Chapter 2.1 --- Survival function --- p.11
Chapter 2.2 --- Missing covariate pattern --- p.13
Chapter 2.3 --- Set of survival time with rank restrictions --- p.13
Chapter 2.4 --- Marginal likelihood --- p.14
Chapter 2.5 --- Score function --- p.15
Chapter 3 --- The MCMC stochastic approximation approach --- p.17
Chapter 4 --- Simulations Studies --- p.22
Chapter 4.1 --- MCAR : Simulation 1 --- p.23
Chapter 4.2 --- MCAR : Simulation 2 --- p.24
Chapter 4.3 --- MAR : Simulation 3 --- p.26
Chapter 4.4 --- MAR : Simulation 4 --- p.27
Chapter 5 --- Example --- p.30
Chapter 6 --- Discussion --- p.33
Appendix --- p.35
Bibliography --- p.37
18

Perez, Bolde Carlos Francisco Castellanos. "Evolution of latin american approaches to integrated coastal management (ICM): paths, outcomes, and governance baselines." Master's thesis, 2010. http://hdl.handle.net/10400.1/345.

Full text
Abstract:
Master's dissertation, Water and Coastal Management, Universidade do Algarve, 2010
Latin America is a mosaic of social, economic, political and environmental realities where thousands of ICM efforts have been implemented, yet only a few successful cases have been documented. This thesis focuses on governance of the coastal zone, identifies and synthesizes the evolution of approaches to Integrated Coastal Management (ICM) in Latin America, and places the Governance Baselines methodology in that context. In order to achieve this objective, and since, to the best of the author's understanding, no ICM classification framework has been developed, this thesis proposes the SALM ICM evolution path classification based on observations derived from its preparation process. Derived from LOICZ Priority Topic 3, "Linking Governance and Science in Coastal Regions", the Governance Baselines methodology has been successfully implemented in several contexts (i.e. protected areas, urbanized coasts, and rural, multiple-use estuaries) in both high- and low-income Latin American countries. This thesis found that the success of ICM management efforts tends to depend on the extent to which they are able to integrate all four GESAMP-defined institutional or 1st Order outcomes (unambiguous goals, constituencies, formal commitment, and institutional capacity) and turn them into outcomes of a higher order; in particular, the systematically neglected and critical changes in the conduct and resource use of institutions, individuals, groups, businesses and investments (2nd Order outcomes) are (i) the essence and drivers of environmental and socioeconomic benefits (3rd Order outcomes) and, even more, (ii) the cause of the threats to the coastal zone.
19

Bacaltchuk, Benami. "Baseline data for a coorientational approach to evaluation of changes produced by a sustainable agricultural demonstration program the Wisconsin integrated cropping systems trial." 1993. http://catalog.hathitrust.org/api/volumes/oclc/30611627.html.

Full text
Abstract:
Thesis (Ph. D.)--University of Wisconsin--Madison, 1993.
Typescript. Includes bibliographical references (leaves 189-201).
20

Freysen, Charlene. "Effek van gestaltspelterapie op die selfbeeld van die leergestremde leerder." Diss., 2005. http://hdl.handle.net/10500/1454.

Full text
Abstract:
Text in Afrikaans
The young learner is in the developmental phase in which he wants to master tasks successfully. When the learner experiences problems at school, this influences his motivation and how he views himself. Learning disabled learners are exposed to academic failures and form negative views about their abilities and functioning. The effect of Gestalt play therapy on the self-esteem of the learning disabled learner was explored. The study was done through a baseline consisting of an adjusted Rosenberg Self-Esteem Questionnaire that was completed by educators and learners before and after the therapeutic program. Because of the learners' learning disability, they used an aid, namely "Talking-Mats". Although learning disabilities substantially influenced the learners' self-evaluations, the learners' circumstances at home further influenced their self-esteem. It seems that Gestalt play therapy did have a positive effect on the self-esteem of learning disabled learners.
Social work
M. Diac.