To see the other types of publications on this topic, follow the link: Restriction of data.

Dissertations / Theses on the topic 'Restriction of data'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 31 dissertations / theses for your research on the topic 'Restriction of data.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Brass, Stefan. "Range restriction for general formulas." Universität Potsdam, 2010. http://opus.kobv.de/ubp/volltexte/2010/4152/.

Abstract:
Deductive databases need general formulas in rule bodies, not only conjunctions of literals. This has been well known since the work of Lloyd and Topor on extended logic programming. Of course, formulas must be restricted in such a way that they can be effectively evaluated in finite time, and produce only a finite number of new tuples (in each iteration of the TP-operator: the fixpoint can still be infinite). It is also necessary to respect the binding restrictions of built-in predicates: many of these predicates can be executed only when certain arguments are ground. Whereas for standard logic programming rules the questions of safety, allowedness, and range restriction are relatively easy and well understood, the situation for general formulas is a bit more complicated. We give a syntactic analysis of formulas that guarantees the necessary properties.
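The core condition discussed here can be made concrete with a small sketch. The following Python fragment (illustrative only, not from the thesis; the rule encoding is invented) checks the classic range-restriction/allowedness property for a Datalog-style rule: every head variable, and every variable under negation, must be bound by a positive body literal so that bottom-up evaluation stays finite.

```python
# Sketch of a range-restriction (allowedness) check for a Datalog-style
# rule: every head variable, and every variable under negation, must be
# bound by a positive body literal. Rule encoding is hypothetical.
def variables(args):
    """Variables are encoded as uppercase strings, e.g. 'X'."""
    return {a for a in args if isinstance(a, str) and a[:1].isupper()}

def is_range_restricted(head, positive_body, negative_body):
    bound = set()
    for _pred, args in positive_body:      # positive literals bind variables
        bound |= variables(args)
    return (variables(head[1]) <= bound    # head variables must be bound...
            and all(variables(args) <= bound  # ...and so must negated ones
                    for _pred, args in negative_body))

# p(X, Y) :- q(X), not r(Y).   Y occurs only negated -> not range-restricted
head = ("p", ["X", "Y"])
print(is_range_restricted(head, [("q", ["X"])], [("r", ["Y"])]))                # False
print(is_range_restricted(head, [("q", ["X"]), ("s", ["Y"])], [("r", ["Y"])]))  # True
```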
2

Moore, Page Casey Seaman John Weldon. "A restriction method for the analysis of discrete longitudinal missing data." Waco, Tex. : Baylor University, 2006. http://hdl.handle.net/2104/4880.

3

MARTELOTTE, MARCELA COHEN. "USING LINEAR MIXED MODELS ON DATA FROM EXPERIMENTS WITH RESTRICTION IN RANDOMIZATION." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2010. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=16422@1.

Abstract:
This dissertation presents an application of linear mixed models to data from an experiment with restriction in randomization. The experiment used in this study aimed to verify which controlling factors in the cold-rolling process most affected the thickness of the material used to manufacture cans for carbonated beverages. From the experiment, data were obtained to model the mean and the variance of the thickness of the material. The goal of the modeling was to identify which factors made the mean thickness reach the desired value (0.248 mm). Furthermore, it was necessary to identify which combination of the levels of these factors produced the minimum variance in the thickness of the material. There were replications in this experiment, but they were not performed randomly; in addition, the levels of the factors were not reset between the runs of the experiment. Due to these restrictions, mixed models were used to fit the mean and the variance of the thickness, since such models can handle autocorrelated and heteroscedastic data. The models showed a good fit to the data, indicating that for situations with restriction on randomization the use of mixed models is suitable.
4

Bourgeois, Adèle. "On the Restriction of Supercuspidal Representations: An In-Depth Exploration of the Data." Thesis, Université d'Ottawa / University of Ottawa, 2020. http://hdl.handle.net/10393/40901.

Abstract:
Let $\mathbb{G}$ be a connected reductive group defined over a p-adic field F which splits over a tamely ramified extension of F, and let G = $\mathbb{G}(F)$. We also assume that the residual characteristic of F does not divide the order of the Weyl group of $\mathbb{G}$. Following J.K. Yu's construction, the irreducible supercuspidal representation constructed from the G-datum $\Psi$ is denoted $\pi_G(\Psi)$. The datum $\Psi$ contains an irreducible depth-zero supercuspidal representation, which we refer to as the depth-zero part of the datum. Under our hypotheses, the J.K. Yu construction is exhaustive. Given a connected reductive F-subgroup $\mathbb{H}$ that contains the derived subgroup of $\mathbb{G}$, we study the restriction $\pi_G(\Psi)|_H$ and obtain a description of its decomposition into irreducible components along with their multiplicities. We achieve this by first describing a natural restriction process from which we construct H-data from the G-datum $\Psi$. We then show that the obtained H-data, and conjugates thereof, construct the components of $\pi_G(\Psi)|_H$, thus providing a very precise description of the restriction. Analogously, we also describe an extension process that allows us to construct G-data from an H-datum $\Psi_H$. Using Frobenius Reciprocity, we obtain a description of the components of $\mathrm{Ind}_H^G\pi_H(\Psi_H)$. From the obtained description of $\pi_G(\Psi)|_H$, we prove that the multiplicity in $\pi_G(\Psi)|_H$ is entirely determined by the multiplicity in the restriction of the depth-zero piece of the datum. Furthermore, we use Clifford theory to obtain a formula for the multiplicity of each component in $\pi_G(\Psi)|_H$. As a particular case, we take a look at the regular depth-zero supercuspidal representations and obtain a condition for a multiplicity-free restriction. Finally, we show that our methods can also be used to define a restriction of Kim-Yu types, allowing us to study the restriction of irreducible representations which are not supercuspidal.
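For orientation, the version of Frobenius Reciprocity invoked in such arguments relates restriction to induction. A standard schematic statement (given here only as a reminder; the thesis works with the appropriate smooth/compactly-induced variants) is:

```latex
% Frobenius Reciprocity for a subgroup H of G (schematic form):
% restriction is adjoint to induction.
\[
  \operatorname{Hom}_{H}\!\bigl(\pi|_{H},\,\sigma\bigr)
  \;\cong\;
  \operatorname{Hom}_{G}\!\bigl(\pi,\,\operatorname{Ind}_{H}^{G}\sigma\bigr)
\]
```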
5

Mugodo, James. "Plant species rarity and data restriction influence the prediction success of species distribution models." University of Canberra. Resource, Environmental & Heritage Sciences, 2002. http://erl.canberra.edu.au./public/adt-AUC20050530.112801.

Abstract:
There is a growing need for accurate distribution data for both common and rare plant species for conservation planning and ecological research purposes. A database of more than 500 observations for nine tree species with different ecological and geographical distributions and a range of frequencies of occurrence in south-eastern New South Wales (Australia) was used to compare the predictive performance of logistic regression models, generalised additive models (GAMs) and classification tree models (CTMs) using different data restriction regimes and several model-building strategies. Environmental variables (mean annual rainfall, mean summer rainfall, mean winter rainfall, mean annual temperature, mean maximum summer temperature, mean minimum winter temperature, mean daily radiation, mean daily summer radiation, mean daily June radiation, lithology and topography) were used to model the distribution of each of the plant species in the study area. Model predictive performance was measured as the area under the curve of a receiver operating characteristic (ROC) plot. The initial predictive performance of logistic regression models and generalised additive models (GAMs) using unrestricted, temperature restricted, major gradient restricted and climatic domain restricted data gave results that were contrary to current practice in species distribution modelling. Although climatic domain restriction has been used in other studies, it was found to produce models that had the lowest predictive performance. The performance of domain restricted models was significantly (p = 0.007) inferior to the performance of major gradient restricted models when the predictions of the models were confined to the climatic domain of the species. Furthermore, the effect of data restriction on model predictive performance was found to depend on the species, as shown by a significant interaction between species and data restriction treatment (p = 0.013). As found in other studies, however, the predictive performance of GAM was significantly (p = 0.003) better than that of logistic regression. The superiority of GAM over logistic regression was unaffected by different data restriction regimes and was not significantly different within species. The logistic regression models used in the initial performance comparisons were based on models developed using the forward selection procedure in a rigorous-fitting model-building framework that was designed to produce parsimonious models. The rigorous-fitting model-building framework involved testing for a significant reduction in model deviance (p = 0.05) and significance of the parameter estimates (p = 0.05). The size of the parameter estimates and their standard errors were inspected because large estimates and/or standard errors are an indication of model degradation from overfitting or effects such as multicollinearity. For additional variables to be included in a model, they had to contribute significantly (p = 0.025) to the model predictive performance. In an attempt to improve the performance of species distribution models using logistic regression in a rigorous-fitting model-building framework, the backward elimination procedure was employed for model selection, but it yielded models with reduced performance.
A liberal-fitting model-building framework that used significant model deviance reduction at the p = 0.05 (low significance models) and p = 0.00001 (high significance models) levels as the major criterion for variable selection was employed for the development of logistic regression models using the forward selection and backward elimination procedures. Liberal fitting yielded models that had a significantly greater predictive performance than the rigorous-fitting logistic regression models (p = 0.0006). The predictive performance of the former models was comparable to that of GAMs and classification tree models (CTMs). The low significance liberal-fitting models had a much larger number of variables than the high significance liberal-fitting models, but with no significant increase in predictive performance. To develop liberal-fitting CTMs, the tree shrinking program in S-PLUS was used to produce a number of trees of different sizes (subtrees) by optimally reducing the size of a full CTM for a given species. The 10-fold cross-validated model deviance for the subtrees was plotted against the size of the subtree as a means of selecting an appropriate tree size. In contrast to liberal-fitting logistic regression, liberal-fitting CTMs had poor predictive performance. Species geographical range and species prevalence within the study area were used to categorise the tree species into different distributional forms. These were then used to compare the effect of plant species rarity on the predictive performance of logistic regression models, GAMs and CTMs. The distributional forms included restricted and rare (RR) species (Eucalyptus paliformis and Eucalyptus kybeanensis), restricted and common (RC) species (Eucalyptus delegatensis, Eucryphia moorei and Eucalyptus fraxinoides), widespread and rare (WR) species (Eucalyptus data) and widespread and common (WC) species (Eucalyptus sieberi, Eucalyptus pauciflora and Eucalyptus fastigata). There were significant differences (p = 0.076) in predictive performance among the distributional forms for the logistic regression and GAM. The predictive performance for the WR distributional form was significantly lower than the performance for the other plant species distributional forms. The predictive performance for the RC and RR distributional forms was significantly greater than the performance for the WC distributional form. The trend in model predictive performance among plant species distributional forms was similar for CTMs, except that the CTMs had poor predictive performance for the RR distributional form. This study shows the importance of data restriction to model predictive performance, with major gradient data restriction being recommended for consistently high performance. Given the appropriate model selection strategy, logistic regression, GAM and CTM have similar predictive performance. Logistic regression requires a high significance liberal-fitting strategy both to maximise its predictive performance and to select a relatively small model that could be useful for framing future ecological hypotheses about the distribution of individual plant species. The results for the modelling of plant species for conservation purposes were encouraging, since logistic regression and GAM performed well for the restricted and rare species, which are usually of greater conservation concern.
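To make the comparison pipeline concrete, a minimal sketch of one arm of such an evaluation, fitting a logistic-regression distribution model and scoring it by the area under the ROC curve, might look as follows with scikit-learn. The file name and column names are hypothetical placeholders, not the thesis data.

```python
# Sketch: fit a presence/absence species distribution model and score it
# by ROC AUC. "species_records.csv" and the column names are placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("species_records.csv")          # one row per survey site
predictors = ["annual_rain", "summer_rain", "mean_temp", "radiation"]
X_train, X_test, y_train, y_test = train_test_split(
    df[predictors], df["present"], test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"ROC AUC: {auc:.3f}")                     # 0.5 = random, 1.0 = perfect
```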
6

Rae, Mary Nichols. "DISABILITY AND RESTRICTION OF OPPORTUNITIES IN THE WORKPLACE: DATA FROM THE NATIONAL HEALTH INTERVIEW SURVEY (NHIS)." Cincinnati, Ohio : University of Cincinnati, 2000. http://www.ohiolink.edu/etd/view.cgi?ucin971880292.

7

Garb, Leanna Rose. "Stroop Task as a Measure of Executive Functioning in Older Adults: Preliminary Data from a Multi-Site Study of Moderate Sleep Restriction." Thesis, The University of Arizona, 2013. http://hdl.handle.net/10150/297566.

Abstract:
The aim of the Multi-Site Sleep Study is to examine the effects of chronic moderate sleep restriction on adults. Participants must be between 60 and 80 years old and sleep 8-9 hours (long sleepers) or 6-7.25 hours (average sleepers) per night. For my thesis, I examine the first-year data from the Stroop task (pre- and post-test), looking at Stroop interference and Stroop time. My hypotheses are that long sleepers will benefit from moderate sleep restriction, but average sleepers will not. I predict no change will occur for the control group (both average and long sleepers). The study lasts fourteen weeks. Following baseline, participants are assigned to the sleep restriction treatment or the control treatment. The sleep restriction group gets an hour less of nightly sleep. Participants in the control group get the same amount of sleep as at baseline. Analysis revealed a main effect of pre-post for interference on the Stroop task. There was no significant main effect of group or interaction between pre-post and group. For part 1, 2, and 3 Stroop times, there was a main effect of pre-post. There was no significant main effect of group or interaction between pre-post and group.
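For a 2 x 2 mixed design like this (group between subjects, pre-post within subjects), the reported effects can be checked with two t-tests, since in a 2 x 2 design an independent t-test on pre-to-post change scores is equivalent to the mixed-ANOVA interaction test. A minimal sketch with invented data:

```python
# Sketch of the 2x2 mixed analysis with invented data: a paired t-test for
# the pre-post main effect, and an independent t-test on change scores,
# which in a 2x2 design matches the group-by-time interaction.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre_r = rng.normal(30, 5, 20)                    # sleep-restriction group
post_r = pre_r - rng.normal(2, 3, 20)
pre_c = rng.normal(30, 5, 20)                    # control group
post_c = pre_c - rng.normal(0, 3, 20)

t, p = stats.ttest_rel(np.r_[pre_r, pre_c], np.r_[post_r, post_c])
print(f"pre-post main effect: t = {t:.2f}, p = {p:.4f}")

t, p = stats.ttest_ind(post_r - pre_r, post_c - pre_c)
print(f"group x time interaction: t = {t:.2f}, p = {p:.4f}")
```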
8

Magnusson, Victor. "Cut off cross-border data flow and international investment law. : A legal analysis of a restriction with an effect equivalent of a ban on cross-border data flow and the fair and equitable treatment standard found in bilateral investment treaties." Thesis, Uppsala universitet, Juridiska institutionen, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-443840.

Abstract:
In the world we live in today, international trade and the economy are becoming more and more dependent on data that can be transferred across borders, and over the last couple of years there has been an observable trend of increasing cross-border data flows. The increase in cross-border data flows is a result of the vast boom in global digitalization. Businesses and enterprises can use the accessible data in multiple ways: to follow and keep control of production chains, follow consumer demand, and make alterations to products following the requests of consumers. This improves the efficiency and productivity of the businesses. The free flow of data across borders does not only have positive effects for businesses; from a larger perspective, it also contributes to the welfare of countries and provides new possibilities and opportunities. Despite the fact that the free flow of data has great effects on both businesses and the welfare of states, states are imposing restrictions on cross-border data flows. The restrictions in place are of different kinds: some make it mandatory to store or process data, while others are harsher and amount to a ban or cut-off of cross-border data flow. In the legal system of international investment law, the fair and equitable treatment standard is a standard found in treaties, bilateral and multilateral. The standard protects foreign investors. If a state enforces a restriction with an effect equivalent to a ban on cross-border data flow, what is the relation of that restriction to the fair and equitable treatment standard?
9

Sjöberg, Sofia. "Utilizing research in the practice of personnel selection : General mental ability, personality, and job performance." Doctoral thesis, Stockholms universitet, Psykologiska institutionen, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-101976.

Abstract:
Identifying and hiring the highest performers is essential for organizations to remain competitive. Research has provided effective guidelines for this but important aspects of these evidence-based processes have yet to gain acceptance among practitioners. The general aim of this thesis was to help narrowing the gap between research and practice concerning personnel selection decisions. The first study compared the validity estimates of general mental ability (GMA) and the five factor model of personality traits as predictors of job performance, finding that, when the recently developed indirect correction for range restriction was applied, GMA was an even stronger predictor of job performance than previously found, while the predictive validity of the personality traits remained at similar levels. The approach used for data collection and combination is crucial to forming an overall assessment of applicants for selection decisions and has a great impact on the validity of the decision. The second study compared the financial outcomes of applying a mechanical or clinical approach to combining predictor scores. The results showed that the mechanical approach can result in a substantial increase in overall utility. The third study examined the potential influences that practitioners’ cognitive decision-making style, accountability for the assessment process, and responsibility for the selection decision had on their hiring approach preferences. The results showed that practitioners scoring high on intuitive decision-making style preferred a clinical hiring approach, while the contextual aspects did not impact practitioners’ preferences. While more research may be needed on practitioner preferences for a particular approach, the overall results of this thesis support and strengthen the predictive validity of GMA and personality traits, and indicate that the mechanical approach to data combination provides increased utility for organizations.

At the time of the doctoral defense, the following papers were unpublished and had a status as follows: Paper 2: Manuscript. Paper 3: Manuscript.
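For context, range-restriction corrections rescale a correlation observed in a selected sample toward its applicant-population value. The thesis applies a recently developed indirect correction; as a simpler illustration of the general idea (not the procedure used in the thesis), the classic Thorndike Case II direct correction can be sketched as:

```python
# Thorndike Case II direct correction for range restriction: rescale a
# correlation observed in a selected sample using the ratio of predictor
# standard deviations. Numbers are illustrative only.
import math

def correct_for_range_restriction(r_obs, sd_applicants, sd_selected):
    u = sd_applicants / sd_selected              # u > 1 under restriction
    return r_obs * u / math.sqrt(1 + r_obs**2 * (u**2 - 1))

# observed validity .30 among hires, applicant SD twice the incumbent SD:
print(round(correct_for_range_restriction(0.30, 2.0, 1.0), 3))  # -> 0.532
```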

10

Vlatsa, Dimitra A. "Data envelopment analysis with intensity restrictions." Diss., Georgia Institute of Technology, 1995. http://hdl.handle.net/1853/24909.

11

Lee, Sang Han. "Estimating and testing of functional data with restrictions." [College Station, Tex.]: Texas A&M University, 2007. http://hdl.handle.net/1969.1/ETD-TAMU-1626.

12

Oldani, Isabella. "Exchanging and Protecting Personal Data across Borders: GDPR Restrictions on International Data Transfer." Doctoral thesis, Università degli studi di Trento, 2020. http://hdl.handle.net/11572/270518.

Abstract:
From the very outset of the EU data protection legislation, and hence from the 1995 Directive, international data transfer has been subject to strict requirements aimed at ensuring that protection travels with data. Although these rules have been widely criticized for their inability to deal with the complexity of modern international transactions, the GDPR has essentially inherited the same architecture of the Directive together with its structural limitations. This research aims to highlight the main weaknesses of the EU data export restrictions and identify what steps should be taken to enable a free, yet safe, data flow. This research first places EU data transfer rules in the broader debate about the challenges that the un-territorial cyberspace poses to States’ capabilities to exert their control over data. It then delves into the territorial scope of the GDPR to understand how far it goes in protecting data beyond the EU borders. The objectives underpinning data export restrictions (i.e., avoiding the circumvention of EU standards and protecting data from foreign public authorities) and their limitations in achieving such objectives are then identified. Lastly, three possible “solutions” for enabling data flow are tested. Firstly, it is shown that the adoption by an increasing number of non-EEA countries of GDPR-like laws and the implementation by many companies of GDPR-compliant policies is more likely to boost international data flow than internationally agreed standards. Secondly, the role that Article 3 GDPR may play in making data transfer rules “superfluous” is analysed, as well as the need to complement the direct applicability of the GDPR with cross-border cooperation between EU and non-EU regulators. Thirdly, the study finds that the principle of accountability, as an instrument of data governance, may boost international data flow by pushing most of the burden for ensuring GDPR compliance on organizations and away from resource-constrained regulators.
13

Pedrosa, Diogo de Carvalho. "Data input and content exploration in scenarios with restrictions." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-13042015-144651/.

Abstract:
As technology evolves, new devices and interaction techniques are developed. These transformations create several challenges in terms of usability and user experience. Our research faces some of these challenges for data input or content exploration in scenarios with restrictions. It is not our intention to investigate all possible scenarios, but we explore in depth a broad range of devices and restrictions. We start with a discussion about the use of an interactive coffee table for the exploration of personal photos and videos, also considering a TV set as an additional screen. In a second scenario, we present an architecture that offers interactive digital TV (iDTV) applications the possibility of receiving multimodal data from multiple devices. Our third scenario concentrates on supporting text input for iDTV applications using a remote control, and presents an interface model based on multiple input modes as a solution. In the last two scenarios, we continued investigating better ways to provide text entry; however, the restriction becomes not using the hands, which is the kind of challenge faced by severely motor-disabled individuals. First, we present a text entry method based on two input symbols and an interaction technique based on detecting internal and external heel rotations using an accelerometer, for those who retain at least partial movement of a leg and a foot. In the following scenario, only the eyes are required. We present an eye-typing technique that recognizes the intended word by weighting the length and frequency of all possible words formed by filtering extra letters from the sequence of letters gazed at by the user. Exploring each scenario in depth was important to achieve relevant results and contributions. On the other hand, the wide scope of this dissertation allowed the student to learn about several technologies and techniques.
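The word-recognition step of the eye-typing technique described above lends itself to a compact sketch: keep every lexicon word that survives as a subsequence of the gazed letter stream (i.e., that can be formed by filtering out extra letters), then rank candidates by a weight combining word length and corpus frequency. The toy lexicon and the weighting below are assumptions, not the dissertation's parameters.

```python
# Sketch of intended-word recognition for eye typing: candidates must be
# subsequences of the gazed letters; rank by length and frequency.
# The lexicon and weighting are invented.
def is_subsequence(word, gazed):
    it = iter(gazed)
    return all(ch in it for ch in word)          # consumes the iterator

def rank_candidates(gazed, lexicon):
    """lexicon maps word -> relative frequency."""
    scored = [(len(w) + 2.0 * freq, w)           # weight length + frequency
              for w, freq in lexicon.items() if is_subsequence(w, gazed)]
    return [w for _s, w in sorted(scored, reverse=True)]

lexicon = {"hello": 5.0, "help": 4.0, "hell": 2.0, "hop": 1.0}
print(rank_candidates("hqelrlop", lexicon))      # ['hello', 'help', 'hell', 'hop']
```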
14

Fachini, Juliana Betini. "Modelos de regressão com e sem fração de cura para dados bivariados em análise de sobrevivência." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/11/11134/tde-12092011-170753/.

Abstract:
This work brought together different models and techniques to represent experimental or observational situations in survival analysis. To model bivariate responses and covariates, the bivariate Kumaraswamy-Weibull regression model was proposed. The presence of cured individuals was considered under two different approaches, originating the regression model with a cured fraction for bivariate data through copulas and the bivariate log-linear regression model with a cured fraction. The parameters of the models were estimated by the maximum likelihood method, subject to restrictions on the parameters through the adapted barrier function. A sensitivity analysis was carried out considering the methodologies of Global Influence, Local Influence and Total Local Influence to check various aspects of the formulation and fit of the proposed models. A data set on renal failure and diabetic retinopathy is used to exemplify the application of the proposed models.
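The estimation device mentioned in the abstract, maximum likelihood under parameter restrictions enforced by a barrier function, can be sketched generically: a logarithmic barrier term keeps the optimizer away from the boundary of the feasible region. The example below fits an ordinary Weibull model with positivity constraints, a generic stand-in rather than the thesis's bivariate Kumaraswamy-Weibull model or adapted barrier.

```python
# Sketch of barrier-constrained maximum likelihood: maximise a Weibull
# log-likelihood with a log-barrier keeping shape and scale positive.
# Generic illustration of the technique, not the thesis's exact model.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
t = 2.0 * rng.weibull(1.5, 200)                  # simulated lifetimes

def neg_loglik_with_barrier(params, mu=1e-4):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf                            # outside feasible region
    loglik = np.sum(np.log(shape / scale) + (shape - 1) * np.log(t / scale)
                    - (t / scale) ** shape)
    barrier = np.log(shape) + np.log(scale)      # tends to -inf at boundary
    return -(loglik + mu * barrier)

fit = minimize(neg_loglik_with_barrier, x0=[1.0, 1.0], method="Nelder-Mead")
print("shape, scale:", np.round(fit.x, 3))       # should be near (1.5, 2.0)
```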
15

Ramirez, de Arellano Serna Antonio. "Incorporating preference information in Data Envelopment Analysis via external restrictions." Thesis, University of York, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.367467.

16

Kantu, Dieudonne Kabongo. "Robustness analysis based on weight restrictions in data envelopment analysis." Master's thesis, University of Cape Town, 2006. http://hdl.handle.net/11427/11829.

Abstract:
Includes bibliographical references.
Evaluating the performance of organisations is essential to good planning and control. Part of this process is monitoring the performance of organisations against their goals. Comparing the efficiency of organisations that use common inputs and outputs makes it possible for organisations to improve their performance so that they can operate like the most efficient ones. Resources and outputs can be very diversified in nature, and it is complex to assess organisations using such resources and outputs. Data Envelopment Analysis models are designed to facilitate this kind of assessment and aim to evaluate the relative efficiency of organisations. Chapter 2 is dedicated to basic Data Envelopment Analysis. We present the following:
* A review of the Data Envelopment Analysis models;
* The properties and particularities of each model.
In chapter 3, we present our literature survey on weight restrictions. Data Envelopment Analysis is a value-free frontier method which has the advantage of yielding more objective efficiency measures. However, the complete freedom in the determination of weights for the factors (resources and products) relevant to the assessment of organisations has led to some problems, such as zero weights and lack of discrimination between efficient organisations. Weight restriction methods were introduced in order to tackle these problems. The first part of chapter 3 details the motivations for weight restrictions, while the second part presents the actual weight restriction methods.
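The basic multiplier model referred to above, together with a weight restriction, fits in a few lines of linear programming: for each unit, choose input and output weights maximising its weighted output, normalise its weighted input to 1, forbid any unit an efficiency above 1, and add the weight restriction as one more linear constraint. A sketch with invented data (not from the thesis):

```python
# Sketch of the CCR multiplier model with one weight restriction, solved
# per unit with scipy's linprog. The data and the bound are invented.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0], [3.0], [4.0]])                  # one input per unit
Y = np.array([[4.0, 2.0], [6.0, 1.0], [5.0, 3.0]])   # two outputs per unit

def ccr_efficiency(k):
    # decision variables: [v1, u1, u2]; maximise u . Y[k]
    c = np.r_[0.0, -Y[k]]                            # linprog minimises
    A_ub = [np.r_[-X[j], Y[j]] for j in range(len(X))]  # u.Yj - v.Xj <= 0
    b_ub = [0.0] * len(X)
    A_ub.append([0.0, -1.0, 0.5]); b_ub.append(0.0)  # restriction: u1 >= 0.5*u2
    A_eq = [np.r_[X[k], 0.0, 0.0]]                   # normalisation v.Xk = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(1e-6, None)] * 3)         # epsilon-positive weights
    return -res.fun

for k in range(3):
    print(f"unit {k}: efficiency = {ccr_efficiency(k):.3f}")
```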
17

Kabnurkar, Amit. "Mathematical Modeling for Data Envelopment Analysis with Fuzzy Restrictions on Weights." Thesis, Virginia Tech, 2001. http://hdl.handle.net/10919/31992.

Abstract:
Data envelopment analysis (DEA) is a relative technical efficiency measurement tool, which uses operations research techniques to automatically calculate the weights assigned to the inputs and outputs of the production units being assessed. The actual input/output data values are then multiplied with the calculated weights to determine the efficiency scores. Recent variants of the DEA model impose upper and lower bounds on the weights to eliminate certain drawbacks associated with unrestricted weights. These variants are called weight restriction DEA models. Most weight restriction DEA models suffer from a drawback that the weight bound values are uncertain because they are determined based on either incomplete information or the subjective opinion of the decision-makers. Since the efficiency scores calculated by the DEA model are sensitive to the values of the bounds, the uncertainty of the bounds gets passed onto the efficiency scores. The uncertainty in the efficiency scores becomes unacceptable when we consider the fact that the DEA results are used for making important decisions like allocating funds and taking action against inefficient units. In order to minimize the effect of the uncertainty in bound values on the decision-making process, we propose to explicitly incorporate the uncertainty in the modeling process using the concepts of fuzzy set theory. Modeling the imprecision involves replacing the bound values by fuzzy numbers because fuzzy numbers can capture the intuitive conception of approximate numbers very well. Amongst the numerous types of weight restriction DEA models developed in the research, two are more commonly used in real-life applications compared to the others. Therefore, in this research, we focus on these two types of models for modeling the uncertainty in bound values. These are the absolute weight restriction DEA models and the Assurance Region (AR) DEA models. After developing the fuzzy models, we provide implementation roadmaps for illustrating the development and solution methodology of those models. We apply the fuzzy weight restriction models to the same data sets as those used by the corresponding crisp weight restriction models in the literature and compare the results using the two-sample paired t-test for means. We also use the fuzzy AR model developed in the research to measure the performance of a newspaper preprint insertion line.
Master of Science
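The core modelling move described above, replacing a crisp weight bound by a fuzzy number, can be illustrated with a triangular membership function: at each confidence level alpha, the fuzzy bound collapses to a crisp interval (its alpha-cut) that can be substituted into the weight-restriction constraints. The values below are invented for illustration.

```python
# Sketch of a triangular fuzzy weight bound (l, m, r) and its alpha-cut:
# at level alpha the bound becomes the crisp interval
# [l + alpha*(m - l), r - alpha*(r - m)]. Values are illustrative only.
def triangular_alpha_cut(l, m, r, alpha):
    assert 0.0 <= alpha <= 1.0
    return (l + alpha * (m - l), r - alpha * (r - m))

# "the lower bound on output weight u1 is about 0.2", as (0.1, 0.2, 0.3):
for alpha in (0.0, 0.5, 1.0):
    lo, hi = triangular_alpha_cut(0.1, 0.2, 0.3, alpha)
    print(f"alpha = {alpha}: bound lies in [{lo:.2f}, {hi:.2f}]")
# alpha = 1.0 recovers the crisp bound 0.20; lower alphas admit more spread
```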
18

Cheng, Xiaofeng. "Analysis of States Gun Control Restrictions." Scholar Commons, 2002. http://purl.fcla.edu/fcla/etd/SFE0000037.

19

Zacherl, Walter David. "Method for Registering Lidar Data in Restrictive, Tunnel-Like Environments." Diss., The University of Arizona, 2016. http://hdl.handle.net/10150/613145.

Abstract:
A new method of registering multiple range datasets collected in a GPS-denied, tunnel-like environment is presented. The method is designed to function with minimal user inputs and be effective over a wide range of changes in observation angle. The method is initially developed to operate on data in a general 2.5D coordinate system. Then, the general registration method is specifically tailored to a 2.5D spherical coordinate system. To apply the method, the range data is first filtered with a series of discrete Gaussian-based filters to construct a second-order Taylor series approximation to the surface about each sampled point. Finally, principal curvatures are calculated and compared across neighboring datasets to determine homologies and the best fit transfer matrix. The new method relaxes the minimum change in perspective requirement between neighboring datasets typical of other algorithms. Results from the application of the method on both synthetic and real-world data are shown. The real-world data comes from a series of high explosive tests performed in a tunnel environment. The tunnels were oriented horizontally in rock and constructed with boring equipment. The tunnel surfaces were surveyed with a Faro Focus3D terrestrial panorama scanning light detection and ranging (lidar) system both before and after a high explosive device was detonated inside the tunnel with the intent of documenting damage to the tunnel surface.
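The per-point descriptor used for matching, principal curvatures from a second-order Taylor fit, can be sketched in its simplest Monge-patch form: fit a quadratic height field to a neighbourhood by least squares and take the eigenvalues of the shape operator. This is a generic Cartesian illustration with synthetic data, not the dissertation's 2.5D spherical-coordinate formulation.

```python
# Sketch: principal curvatures from a local quadratic (Monge-patch) fit
# z = a x^2 + b x y + c y^2 + d x + e y + g, as eigenvalues of the shape
# operator S = I^{-1} II. The sphere-cap test data are synthetic.
import numpy as np

def principal_curvatures(points):
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.c_[x**2, x * y, y**2, x, y, np.ones_like(x)]
    a, b, c, d, e, _g = np.linalg.lstsq(A, z, rcond=None)[0]
    p, q = d, e                                       # surface gradient
    W = np.sqrt(1 + p**2 + q**2)
    I = np.array([[1 + p**2, p * q], [p * q, 1 + q**2]])   # 1st fund. form
    II = np.array([[2 * a, b], [b, 2 * c]]) / W            # 2nd fund. form
    return np.sort(np.real(np.linalg.eigvals(np.linalg.solve(I, II))))

t = np.linspace(-0.5, 0.5, 11)
gx, gy = np.meshgrid(t, t)
gz = 5.0 - np.sqrt(25.0 - gx**2 - gy**2)              # cap of a radius-5 sphere
pts = np.c_[gx.ravel(), gy.ravel(), gz.ravel()]
print(principal_curvatures(pts))                      # both approximately 0.2
```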
20

Larner, Andrew Gordon. "The legal and institutional restrictions on the handling of digital land related data in the United Kingdom." Thesis, University of East London, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.388135.

21

Villalba, Matamoros Martha. "Stochastic short-term production scheduling accounting for fleet allocation, operational considerations, blending restrictions and future multi-element ore control data." Thesis, McGill University, 2014. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=123287.

Abstract:
Mine production scheduling may be long-term or short-term, based on the time period considered and the final objective. The optimization goal of short-term production scheduling is to minimize the expected mining cost while satisfying operational constraints, such as mining slope, grade blending, metal production, mining capacity and processing capacity; however, some parameters may be uncertain, such as metal quality and fleet parameters. Traditional short-term production planning is carried out by two sequential optimizations: the production schedule is defined in the first step, and the available fleet is evaluated for this schedule in a second step; as a result, fleet availability, hauling time and mining considerations do not influence the scheduling decision. In addition, the fleet optimization algorithms do not consider uncertainty in their parameters and do not take into account the local mineralization of the deposit, because a single, possibly misleading, aggregated total block tonnage is linked to each sector to be mined. The local mineralization, or local-scale variability between blocks, assists in the blending process and metal quality control; however, traditional short-term production scheduling is based on exploration drilling or a sparse-data orebody model, while in practice grade control data from closely spaced blasthole drilling classify the material as ore and waste, because this short-scale information is not available at the time of monthly short-term planning. The local variability is relevant in short-term production scheduling to define the destination of the material. The short-term mine production scheduling in this thesis is developed as a single formulation where mining considerations, production constraints, uncertainty in the orebody metal quantity, as well as fleet parameters, are evaluated together to define a well-informed mining sequence that results in high performance at the mine operation. The formulation is implemented at a multi-element iron mine, and the resulting monthly schedules show lower cost, minable patterns and efficient fleet allocation, ensuring a higher and less variable utilization of the fleet than the conventional scheduling approach. Uninformed and ultimately costly decisions can be taken because of imperfect geological knowledge, or the information effect. The orebody uncertainty may be updated with simulated future ore control data to account for local-scale grade variability, the information used in practice to discriminate ore and waste. Multi-element orebody uncertainty models are updated based on the correlation of exploration data and past ore control data; this orebody uncertainty is then used to optimize the short-term production scheduling, which leads to better performance in terms of matching ore quality targets and delivering recoverable reserves.
22

Andersson, Nils, and Martin Marklund. "Adaptive Feature based Level of Detail for Memory Restrained Interactive Direct Volume Rendering." Thesis, Linköpings universitet, Medie- och Informationsteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-170319.

Abstract:
The purpose of this thesis was to find and implement an adaptive method, based on the given data and hardware, for selecting different levels of detail whilst preserving visual quality to the best extent possible. Another important aspect of the new method was that it had to be performance-effective, since the target platform was an interactive direct volume rendering application. The project was targeted towards memory-restricted systems on which it has previously been impossible to render large-scale volumetric datasets. The previous state of the target platform supported two detail levels, full and half, and did not implement any kind of prioritisation when selecting the level of detail of bricks. Apart from failing to render parts of the datasets, the old implementation was also lacking in that the top of the dataset always had the lowest prioritisation, which can prove problematic for certain datasets. An adaptive method which determines a suitable number of detail levels at run-time has been implemented. The new implementation has also reworked the way bricks are prioritised during rendering. The proposed algorithm prioritises bricks holding surface information as well as bricks that match the transfer-function configuration well. The results show that the proposed method is able to render large-scale datasets in limited environments whilst maintaining interactive frame rates. The new brick selection algorithm is a step in the right direction towards solving the issue of parts of the dataset not being prioritised.
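The prioritisation idea can be sketched as a greedy selection under a memory budget: score each brick by its surface content and transfer-function match, then repeatedly grant the best-scoring bricks one finer detail level while memory remains. Field names, the scoring weights and the 8x-per-level size growth are assumptions, not the thesis implementation.

```python
# Sketch of feature-based LOD selection: score bricks by surface content
# and transfer-function match, then greedily refine the best bricks while
# the memory budget lasts. Fields, weights and 8x growth are invented.
from dataclasses import dataclass

@dataclass
class Brick:
    name: str
    surface: float      # 0..1, e.g. a gradient-magnitude measure
    tf_match: float     # 0..1, overlap with the transfer function
    base_bytes: int     # size at the coarsest level

def select_lod(bricks, budget_bytes, levels=4):
    score = lambda b: 0.6 * b.surface + 0.4 * b.tf_match
    chosen = {b.name: 0 for b in bricks}             # coarsest level for all
    used = sum(b.base_bytes for b in bricks)
    for level in range(1, levels):                   # refine best bricks first
        for b in sorted(bricks, key=score, reverse=True):
            extra = b.base_bytes * (8**level - 8**(level - 1))  # 2x per axis
            if chosen[b.name] == level - 1 and used + extra <= budget_bytes:
                chosen[b.name] = level
                used += extra
    return chosen, used

bricks = [Brick("skin", 0.9, 0.8, 1024), Brick("air", 0.1, 0.0, 1024),
          Brick("bone", 0.7, 0.9, 1024)]
print(select_lod(bricks, budget_bytes=100 * 1024))   # skin is refined furthest
```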
23

Tuncer, Ceren. "A Dea-based Approach To Ranking Multi-criteria Alternatives." Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/12607476/index.pdf.

Abstract:
This thesis addresses the problem of ranking multi-criteria alternatives. A Data Envelopment Analysis (DEA)-based approach, the Method of the Area of the Efficiency Score Graph (AES), is proposed. Rather than assessing the alternatives with respect to the fixed original alternative set, as done in the existing DEA-based ranking methods, AES considers the change in the efficiency scores of the alternatives while reducing the size of the alternative set. Producing a final score for each alternative that accounts for the progress of its efficiency score, AES favors alternatives that manage to improve quickly and maintain high levels of efficiency. The preferences of the Decision Maker (DM) are incorporated into the analysis in the form of weight restrictions. The utilization of the AES scores of the alternatives in an incremental clustering algorithm is also proposed. The AES method is applied to rank MBA programs worldwide, and sorting of the programs is also performed using their AES scores. Results are compared to another DEA-based ranking method. Keywords: ranking, data envelopment analysis, weight restrictions.
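The AES score itself can be sketched numerically: record an alternative's DEA efficiency at each step of the set reduction and integrate the resulting curve, so alternatives that become efficient early (and stay efficient) score higher. The efficiency sequences below are invented stand-ins for actual DEA runs.

```python
# Sketch of the AES idea: integrate an alternative's efficiency scores over
# successive reductions of the alternative set; quick, sustained improvement
# yields a larger area. Efficiency sequences are invented stand-ins.
import numpy as np

def aes(efficiencies):
    """Trapezoidal area under the efficiency curve, normalised to [0, 1]."""
    e = np.asarray(efficiencies)
    return float(np.mean((e[1:] + e[:-1]) / 2))

eff_a = [0.82, 0.95, 1.00, 1.00, 1.00]    # becomes efficient quickly
eff_b = [0.90, 0.92, 0.93, 0.97, 1.00]    # improves slowly
print(f"AES(a) = {aes(eff_a):.3f}, AES(b) = {aes(eff_b):.3f}")  # a outranks b
```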
24

Kwon, Hyukje. "A Monte Carlo Study of Missing Data Treatments for an Incomplete Level-2 Variable in Hierarchical Linear Models." The Ohio State University, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=osu1303846627.

25

Flodin, Caroline. "Sjöräddning och obemannade autonoma farkoster, hur är det med uppgifterna? : En fallstudie om riktlinjer för datahantering i sjöräddning med obemannade autonoma farkoster." Thesis, Linköpings universitet, Informationssystem och digitalisering, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-177792.

Abstract:
Maritime rescue in Sweden is performed through cooperation between government agencies, municipalities and non-governmental organisations (NGOs) with the common goal of saving people in distress. Time is often a critical factor in rescue missions, but a fast and unplanned response may at the same time put the rescue workers in danger. The development of unmanned autonomous vehicles for SAR (Search And Rescue) is seen as a solution to the need to quickly send help and get eyes on the scene of the incident without exposing the rescue workers to unnecessary risks. However, the current communication systems in Swedish maritime rescue are unable to handle any type of information except verbal, meaning that rescue workers only know guidelines for handling verbal information. With a future implementation of autonomous vehicles, more information types will need to be handled in maritime rescue, so the uncertainty regarding what kind of information autonomous vehicles collect and which data management requirements exist is problematic. The uncertainty about the information types and their data management requirements is also problematic for the development and implementation of autonomous vehicles, as there is a risk that vehicles and technologies are developed but not allowed to be used because they are not adapted to the legal requirements on the management of different types of data. Therefore, in this study I examine which information types autonomous vehicles can collect in a maritime rescue, in order to find out which guidelines for data management apply during a maritime rescue with autonomous vehicles. The study also examines which kinds of information are critical for SAR maritime rescue cooperation and what information-sharing challenges exist in current maritime rescue. The study was performed as a qualitative case study and used a socio-technical systems perspective so as to better see the overall picture and answer the research questions. The results show that autonomous vehicles can collect information about their surroundings, which is the foundation for establishing the situation awareness that is critical for SAR operations, and that they can collect information about their own status. The main laws and regulations identified as constituting the principal restrictions are (translated from Swedish) the camera surveillance act, the act on the protection of geographical information, the public access to information and secrecy act, the GDPR and the data protection act. These contain guidelines for sharing information and processing personal data in SAR maritime rescue. The knowledge contributions of this study include the identification of the data types that can be collected by autonomous vehicles in SAR maritime rescue, and probably in other types of rescue operations, and the sharing and management requirements on those data types in rescue operations, and thus knowledge of which data types are the most regulated. Further contributions are knowledge about which information types are the most critical for SAR maritime rescue, and therefore should be prioritised for collection and sharing, and the identification of challenges for information sharing between government agencies and NGOs.
26

Liu, Yu-Ting, and 劉昱廷. "Credit Score under Data Restriction." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/2h4h3j.

27

Chou, Yu-Tsen, and 周宥岑. "Fourier Analysis for Time-Course Gene Expression Data in Caloric Restriction." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/10920679718032402859.

28

Santos, Sara Raquel Azevedo dos. "Paradigma da ponderação constitucional legal da criação de bases de dados genéticos para condenados." Master's thesis, 2013. http://hdl.handle.net/1822/28407.

Abstract:
Master's dissertation in Judicial Law (Direito Judiciário)
Genetic and forensic expertise was introduced to Law with all the seduction and irresistibility that a highly reliable and deterministic means of proof may comprise. The 20th century reversed the guarantee based trend of the criminal procedural legislations, such were the world renowned events that cause fear and terror, such as terrorism and highly organized crime. The safety interests subverted the postulates for humanistic guarantees brought by the post-wars Constitutional movements. At the time, the atrocities committed by the State to the dignity of the human person required the acknowledgement of Fundamental Rights in the Constitution of the Republic unattainable by the State. These rights must be recognized in it as guarantee of the protection of citizens before the authoritarian and tyrant power of the State. Even though the same Constitution allows restrictions to the rights, it submits to strict material preconditions – the appropriateness, necessity and proportionality - foreseen in article 18, paragraph 2 of the Constitution of the Portuguese Republic. The criminal procedure, often considered the Constitutional Law applied, shall achieve justice while entirely respecting the Fundamental Rights, an often complex conciliation which, however, will always be guided by the concrete interests in confrontation, bearing the dignity of the human person as its limit. What does not seem to find criminal legitimacy is the measure that provides for the insertion of genetic profiles of convicted offenders in which a prison sentence of 3 or more years was specifically applied, since the insertion and endurance of highly sensitive data and potentially revealing of many other information in State databases presumes future crimes, which have not yet happened, on a Lombrosian assumption that the guilty, because they have committed an offense, will fall upon criminal activity again and therefore should bear a permanent compression of their right to privacy and informational selfdetermination, to the presumption of innocence (which they should be once again guaranteed as soon as they are convicted) and to the privilege against self-incrimination. Such measure restricting Fundamental rights must necessarily fulfill the material criteria of appropriateness, necessity and proportionality foreseen in article 18, paragraph 2 of the Constitution of the Portuguese Republic, and also requires an interest which sufficiently counterbalances with the constitutionally guaranteed individual interests. Only thus will the legislative measure, which requires the insertion of genetic profiles for convicted offenders whose penalty was not less than 3 years, not be condemned. Hence the urgent need for a legislative amendment.
APA, Harvard, Vancouver, ISO, and other styles
29

Gonçalves, Francisca Fernandes. "A recolha de amostras biológicas em arguido condenado: análise do regime do artigo 8.º, n.º 2 da lei n.º 5/2008, de 12 de fevereiro, e consequente ponderação constitucional dos direitos lesados." Master's thesis, 2018. http://hdl.handle.net/1822/60754.

Full text
Abstract:
Master's dissertation in Judiciary Law (Direito Judiciário)
The creation of the DNA profile database through Law no. 5/2008, of February 12, for the purposes of civil identification and criminal investigation, has assumed growing importance and relevance in our legal system. By allowing the conservation and interconnection of a set of information on the genetic profiles of certain individuals in society, it has proved to be a fundamental instrument for producing evidence and for assisting in the resolution of criminal investigations. Its admissibility, however, is surrounded by a number of problems that need to be considered. For the present dissertation it is essential to frame and weigh several issues: how DNA analysis techniques are applied in criminal investigation and what type of biological information may be used, the nature and value such analysis assumes as a means of proof, and the fundamental rights potentially affected by the use of genetic evidence and by the creation of DNA profile databases. We therefore focus our research on the problems raised by Law no. 5/2008, of February 12, analyzing and weighing the assumptions and procedures for the collection and laboratory processing of biological samples, the constitution of the database, and the requirements authorizing the insertion and storage of genetic profiles in the DNA profile database. We concentrate, however, mainly on the criteria and problems arising from the collection and insertion of the genetic profiles of defendants convicted of an intentional crime and sentenced to actual imprisonment of three years or more, since what is at stake is sensitive information capable of revealing many other personal data: such data come to appear in the files that make up the database and become available for cross-checking against biological samples arising in other investigations, subjecting the individual to a constant compression of his or her fundamental rights.
APA, Harvard, Vancouver, ISO, and other styles
30

Washburn, Faith M. "The Ebola Virus Disease Outbreak in Guinea, Liberia and Sierra Leone - Data Management Implementation and Outcomes for Movement and Monitoring of Travelers at Points of Entry." 2015. http://scholarworks.gsu.edu/iph_theses/371.

Full text
Abstract:
Data management in resource-limited settings can be a mountainous problem if not approached with a thorough understanding of those limitations and a mindset prepared for rapid changes in the environment. Data management becomes even more challenging at multiple points of entry, where there are many interwoven parts working together in order to get a potential traveler from his/her first steps into an airport area to boarding a plane, all while ensuring that the traveler has been thoroughly screened for any signs or symptoms of a possible Ebola virus disease infection. This capstone describes the history of the International Health Regulations’ effects on control of disease spread and importation at points of entry, the Do Not Board/Lookout List’s role in disease control in the United States, and the CDC’s International Assistance Team’s unique task in creating and implementing country-specific databases to meet the needs of Ebola-affected countries. The most critical data management need at these countries’ points of entry is specifically to prevent the exportation of Ebola virus disease in order to keep each country’s airspace open and allow goods, personnel and services to continue to be imported into these countries during this sustained Ebola outbreak.
APA, Harvard, Vancouver, ISO, and other styles
31

Lu, Hsueh-Hao, and 呂學豪. "The Effect of Stock Opening Price Types on Stock Returns and Volatility in Taiwan: The Moderating Effects of Short-Sale Restrictions below Previous-Date Closing Prices." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/36782679556918154486.

Full text
Abstract:
Master's thesis
Aletheia University
Graduate Institute of Management Sciences
Academic year 96 (2007)
This research investigates the effects of stock opening price types on stock returns and volatility for TSE-listed companies, with the short-sale restriction below previous-date closing prices considered as a moderating variable. The sample covers observations from September 4, 1992 to September 4, 2004. After the effects of opening price types on stock returns and volatility before and after the short-sale restriction are extracted by an EGARCH model, one-sample t tests and signed-rank tests are used to assess their significance. The results show that stock returns decreased after the rule of short-sale restrictions below previous-date closing prices was put into effect, while return volatility increased. Opening price types have a significant effect on the stock returns and volatility of TSE-listed companies. After the policy of short-sale restrictions below previous closing prices was executed, the policy intervention did not significantly increase stock returns or reduce stock volatility. These results may offer the government and investors a reference for decision-making.
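For readers who want to see the shape of such an event study, the sketch below is a minimal illustration, not the thesis's actual code: it fits an EGARCH(1,1) model with the Python arch package and runs the one-sample t and Wilcoxon signed-rank tests from SciPy, as the abstract describes. The return series is synthetic and the restriction date is a hypothetical placeholder; real TSE data and the actual policy date would replace them.

import numpy as np
import pandas as pd
from arch import arch_model
from scipy import stats

# Synthetic daily percentage returns standing in for TSE stock data (hypothetical).
rng = np.random.default_rng(seed=0)
dates = pd.bdate_range("1992-09-04", "2004-09-04")
returns = pd.Series(rng.normal(loc=0.02, scale=1.2, size=len(dates)), index=dates)

# Hypothetical cutoff; the actual date of the short-sale restriction would go here.
policy_date = "1998-09-01"
pre = returns[returns.index < policy_date]
post = returns[returns.index >= policy_date]

def egarch_volatility(series: pd.Series) -> pd.Series:
    # Fit an EGARCH(1,1) model with a leverage (o) term and return the
    # estimated conditional volatility series.
    model = arch_model(series, vol="EGARCH", p=1, o=1, q=1)
    result = model.fit(disp="off")
    return result.conditional_volatility

vol_pre, vol_post = egarch_volatility(pre), egarch_volatility(post)
print(f"mean conditional volatility: pre={vol_pre.mean():.3f}, post={vol_post.mean():.3f}")

# One-sample t test and Wilcoxon signed-rank test of whether post-restriction
# returns differ from zero, mirroring the significance tests named in the abstract.
t_stat, t_p = stats.ttest_1samp(post, popmean=0.0)
w_stat, w_p = stats.wilcoxon(post)
print(f"t test: t={t_stat:.3f}, p={t_p:.3f}; signed-rank: W={w_stat:.1f}, p={w_p:.3f}")

On real data one would compare the pre- and post-period estimates directly; with the synthetic series above the two periods differ only by sampling noise.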
APA, Harvard, Vancouver, ISO, and other styles
