Dissertations / Theses on the topic 'ADHM data'

To see the other types of publications on this topic, follow the link: ADHM data.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 30 dissertations / theses for your research on the topic 'ADHM data.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Kirschner, Paolo. "Progetto ADaM - Archeological DAta Management Progetto per la creazione di una banca dati relazionale per la gestione dei dati di scavo." Doctoral thesis, Università degli studi di Padova, 2008. http://hdl.handle.net/11577/3425584.

Full text
Abstract:
The ADaM project stems from the need to create a database for managing all phases of an archaeological excavation that is as free as possible from any categorical-typological constraint and from rigidity in the structure of the archive itself. The idea of achieving a single, comprehensive instrument was at the same time an occasion to reflect on normalizing the reality of an excavation through a decomposition, or better a synthesis, of its various dynamics into a series of relationships. The standards now established in excavation management make it possible to apply a system of relational data-storage procedures to the various phases of archaeological work, speeding up the collection and further processing of information. Beyond the need to catalogue and archive the material data of an archaeological excavation in an effective and articulated way, it was of primary importance, and not only for bureaucratic and institutional reasons, to be able to produce a paper version of this documentation. The section covering printed output from the archives and export formats is therefore very rich (PDF, XLS, web queries through PHP instructions). The choice of FileMaker as the DBMS, together with its Advanced Server application, was fairly automatic: not only for its large potential, but also because of its previous use in other systems managed by the Department of Archaeology of the University of Padua, and because of the familiarity the archaeological world already has with these programs. The ability to manage a database through four communication protocols ("fmnet", "instant web publishing", "php" and "xml/xslt") was judged a decisive advantage.
The ADaM project tries to meet all these requirements in the "apparently" simplest way by combining current "proprietary" technologies, considered the most practical and effective ones ("fast and easy source"), with technologies of extremely wide circulation and tested reliability typical of the open-source software sector.
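The relational decomposition this abstract describes can be illustrated with a minimal sketch. The table and column names below (stratigraphic units and finds) are hypothetical illustrations of the approach, not the actual ADaM schema, and SQLite stands in for FileMaker:

```python
import sqlite3

# Hypothetical mini-schema: excavation records decomposed into related
# tables, with finds linked to their stratigraphic context by a foreign key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stratigraphic_unit (
    su_id INTEGER PRIMARY KEY,
    description TEXT,
    interpretation TEXT
);
CREATE TABLE find (
    find_id INTEGER PRIMARY KEY,
    su_id INTEGER REFERENCES stratigraphic_unit(su_id),
    material TEXT,
    notes TEXT
);
""")
conn.execute("INSERT INTO stratigraphic_unit VALUES (1, 'ash layer', 'hearth')")
conn.execute("INSERT INTO find VALUES (10, 1, 'ceramic', 'rim sherd')")

# Join finds back to their stratigraphic context.
row = conn.execute("""
    SELECT f.material, s.interpretation
    FROM find f JOIN stratigraphic_unit s ON f.su_id = s.su_id
""").fetchone()
print(row)  # ('ceramic', 'hearth')
```

Keeping interpretation in its own column rather than a fixed typology is one way a schema can avoid the "categorical coercion" the project set out to minimize.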
APA, Harvard, Vancouver, ISO, and other styles
2

Li, Shijian. "Scalable User Assignment in Power Grids: A Data Driven Approach." Digital WPI, 2017. https://digitalcommons.wpi.edu/etd-theses/1107.

Abstract:
"The fast pace of global urbanization is drastically changing population distributions over the world, which leads to significant changes in geographical population densities. Such changes in turn alter the underlying geographical power demand over time, and drive power substations to become over-supplied (demand ≪ capacity) or under-supplied (demand ≈ capacity). In this work, we make the first attempt to investigate the problem of power substation/user assignment by analyzing large-scale power grid data. We develop a Scalable Power User Assignment (SPUA) framework that takes large-scale spatial power user/substation distribution data and temporal user power consumption data as input, and assigns users to substations in a manner that minimizes the maximum substation utilization among all substations. To evaluate the performance of the SPUA framework, we conduct evaluations on real power consumption data and user/substation location data collected from Xinjiang Province in China over 35 days in 2015. The evaluation results demonstrate that our SPUA framework can achieve a 20%–65% reduction in maximum substation utilization and a 2 to 3.7 times reduction in total transmission loss over other baseline methods."
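As a rough illustration of the assignment objective the abstract describes (minimizing the maximum substation utilization), here is a toy greedy heuristic; it is not the SPUA algorithm itself, and all names and numbers are invented:

```python
# Toy min-max assignment sketch (hypothetical, not SPUA): place each user
# on the substation whose resulting utilization stays smallest.
def assign_users(user_demands, capacities):
    loads = [0.0] * len(capacities)
    assignment = []
    for demand in sorted(user_demands, reverse=True):  # largest demands first
        # pick the substation with the smallest post-assignment utilization
        best = min(range(len(capacities)),
                   key=lambda s: (loads[s] + demand) / capacities[s])
        loads[best] += demand
        assignment.append(best)
    return assignment, max(l / c for l, c in zip(loads, capacities))

_, peak = assign_users([5, 3, 3, 2, 1], [10, 10])
print(round(peak, 2))  # 0.7
```

Sorting demands in decreasing order is the usual longest-processing-time trick for balancing; a framework like the one described would additionally have to respect spatial constraints and time-varying demand.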
3

Anderson, William, and Eduardo Carro. "Data Acquisition System Central Multiplexer." International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/611651.

Abstract:
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California
The Central Multiplexer is a versatile data multiplexer designed to address emerging test requirements for recording data from many sources on digital rotary head recorders at high data rates. A modular design allows easy reconfiguration for airborne or laboratory use; simultaneous data input from 63 sources of data in any combination of PCM commutators, ARINC 429 buses, ARINC 629 buses, MIL-STD-1553 buses, and general-purpose high-speed serial data packets; simultaneous, independent programmable outputs to high-speed digital data recorders, quick-look displays, and engineering monitor and analysis systems; and setup and control from a remote panel, a dumb terminal, a laptop personal computer, a standalone test system, or a large control computer.
4

Tankielun, Adam [Verfasser]. "Data Post-Processing and Hardware Architecture of Electromagnetic Near-Field Scanner / Adam Tankielun." Aachen : Shaker, 2008. http://d-nb.info/1164342266/34.

5

Adams, Richard. "The Advanced Data Acquisition Model (ADAM): A process model for digital forensic practice." Thesis, Adams, Richard (2012) The Advanced Data Acquisition Model (ADAM): A process model for digital forensic practice. PhD thesis, Murdoch University, 2012. https://researchrepository.murdoch.edu.au/id/eprint/14422/.

Abstract:
Given the pervasive nature of information technology, the nature of evidence presented in court is now less likely to be paper-based and in most instances will be in electronic form. However, evidence relating to computer crime is significantly different from that associated with the more ‘traditional’ crimes for which, in contrast to digital forensics, there are well-established standards, procedures and models to which law courts can refer. The key problem is that, unlike some other areas of forensic practice, digital forensic practitioners work in a number of different environments, and existing process models have tended to focus on one particular area, such as law enforcement, failing to take into account the different needs of those working in other areas such as incident response or ‘commerce’. This thesis makes an original contribution to knowledge in the field of digital forensics by developing a new process model for digital data acquisition that addresses both the practical needs of practitioners working in different areas of the field and the expectation of law courts for a formal description of the process undertaken to acquire digital evidence. The methodology adopted for this research is design science, on the basis that it is particularly suited to the task of creating a new process model and an ‘ideal approach’ in the problem domain of digital forensic evidence. The process model employed is the Design Science Research Process (DSRP) (Peffers, Tuunanen, Gengler, Rossi, Hui, Virtanen and Bragge, 2006), which has been widely utilised within information systems research. A review of current process models involving the acquisition of digital data is followed by an assessment of each of the models from a theoretical perspective, drawing on the work of Carrier and Spafford (2003), and from a legal perspective by reference to the Daubert test.
The result of the model assessment is that none provides a description of a generic process for the acquisition of digital data, although a few models contain elements that could be considered for adaptation as part of a new model. Following the identification of key elements for a new model (based on the literature review and model assessment), the outcome of the design stage is a three-stage process model called the Advanced Data Acquisition Model (ADAM) that comprises three UML Activity diagrams, overriding Principles and an Operation Guide for each stage. Initial testing of the ADAM (the Demonstration stage from the DSRP) involves a ‘desk check’ using both in-house documentation relating to three digital forensic investigations and four narrative scenarios. The results of this exercise are fed back into the model design stage and alterations made as appropriate. The main testing of the model (the DSRP Evaluation stage) involves independent verification and validation of the ADAM utilising two groups of ‘knowledgeable people’. The first group, the Expert Panel, consists of international ‘subject matter experts’ from the domain of digital forensics. The second group, the Practitioner Panel, consists of peers from around Australia who are digital forensic practitioners and includes a representative from each of the areas of relevance for this research, namely law enforcement, commerce and incident response. Feedback from the two panels is considered and modifications applied to the ADAM as appropriate. This thesis builds on the work of previous researchers and demonstrates how the UML can be practically applied to produce a generic model of one of the fundamental digital forensic processes, paving the way for future work in this area that could include the creation of models for other activities undertaken by digital forensic practitioners.
It also includes the most comprehensive review and critique of process models incorporating the acquisition of digital data yet undertaken.
6

Gonzales, de Olarte Efraín. "La Matriz de Capacidades y Desempeños (MCD) y el Algoritmo del Desarrollo Humano (ADH)." Economía, 2012. http://repositorio.pucp.edu.pe/index/handle/123456789/117871.

Abstract:
Given that human development is a complex process involving multiple components and determining factors, multidimensional indicators are needed. On the basis of the extensive literature on the subject, we advance two new indicators: the Matrix of Capabilities and Functionings (MCF) and the Algorithm of Human Development (HDA). The MCF is composed of vectors of capabilities and functionings, based on Sen's idea of refined functionings. It rests on a matricial framework, both static and dynamic. The main purpose of constructing this index is to study how different sets of capabilities relate to alternative functionings to produce diverse outcomes. The HDA is a multidimensional index concerning the set of goods and services needed to complete a life cycle; it is a socio-economic context indicator. It is composed of the main «satisfactors», or basic goods and services needed (food, health, education, housing, social security, decent employment and retirement programs), that should be available to all throughout the life cycle. This indicator shows the material progress reached by each country or region, as well as the institutional organization, private and public, and the degree of social cohesion and solidarity.
7

Huang, Andrew S. (Andrew Shane). "ADAM : a decentralized parallel computer architecture featuring fast thread and data migration and a uniform hardware abstraction." Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/29905.

Abstract:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2002.
Includes bibliographical references (p. 247-256).
The furious pace of Moore's Law is driving computer architecture into a realm where the speed of light is the dominant factor in system latencies. The number of clock cycles needed to span a chip is increasing, while the number of bits that can be accessed within a clock cycle is decreasing. Hence, it is becoming more difficult to hide latency. One alternative is to reduce latency by migrating threads and data, but the overhead of existing implementations has previously made migration an unserviceable solution. I present an architecture, implementation, and mechanisms that reduce the overhead of migration to the point where migration is a viable supplement to other latency-hiding mechanisms, such as multithreading. The architecture is abstract, and presents programmers with a simple, uniform fine-grained multithreaded parallel programming model with implicit memory management. In other words, the spatial nature and implementation details (such as the number of processors) of a parallel machine are entirely hidden from the programmer. Compiler writers are encouraged to devise programming languages for the machine that guide a programmer to express their ideas in terms of objects, since objects exhibit an inherent physical locality of data and code. The machine implementation can then leverage this locality to automatically distribute data and threads across the physical machine by using a set of high-performance migration mechanisms.
An implementation of this architecture could migrate a null thread in 66 cycles, over a factor of 1000 improvement over previous work. Performance also scales well; the time required to move a typical thread is only 4 to 5 times that of a null thread. Data migration performance is similar, and scales linearly with data block size. Since the performance of the migration mechanism is on par with that of an L2 cache, the implementation simulated in my work has no data caches and relies instead on multithreading and the migration mechanism to hide and reduce access latencies.
by Andrew "bunnie" Huang.
Ph.D.
8

Wise, Barbara. "THE EFFECT OF CLOSURE ON THE RELATIONSHIP BETWEEN ADHD SYMPTOMS AND SMOKING INITIATION: A MODERATION MODEL USING ADD HEALTH DATA." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1448998688.

9

Mahmud, A. S. M. Hasan. "Sustainable Resource Management for Cloud Data Centers." FIU Digital Commons, 2016. http://digitalcommons.fiu.edu/etd/2634.

Abstract:
In recent years, the demand for data center computing has increased significantly due to the growing popularity of cloud applications and Internet-based services. Today's large data centers host hundreds of thousands of servers, and the peak power rating of a single data center may exceed 100 MW. The combined electricity consumption of global data centers accounts for about 3% of worldwide electricity production, raising serious concerns about their carbon footprint. Utility providers and governments are consistently pressuring data center operators to reduce their carbon footprint and energy consumption. While these operators (e.g., Apple, Facebook, and Google) have taken steps to reduce their carbon footprints (e.g., by installing on-site/off-site renewable energy facilities), they are aggressively looking for new approaches that do not require expensive hardware installation or modification. This dissertation focuses on developing algorithms and systems to improve sustainability in data centers without incurring significant additional operational or setup costs. In the first part, we propose a provably-efficient resource management solution for a self-managed data center to cap and reduce carbon emissions while maintaining satisfactory service performance. Our solution reduces the carbon emission of a self-managed data center to a net-zero level and achieves carbon neutrality. In the second part, we consider minimizing carbon emissions in a hybrid data center infrastructure that includes geographically distributed self-managed and colocation data centers. This segment identifies and addresses the challenges of resource management in a hybrid data center infrastructure and proposes an efficient distributed solution to optimize workload and resource allocation jointly in both self-managed and colocation data centers. In the final part, we explore sustainable resource management from the cloud service user's point of view.
A cloud service user purchases computing resources (e.g., virtual machines) from the service provider and does not have direct control over the carbon emission of the service provider's data center. Our proposed solution encourages a user to take part in sustainable (both economical and environmental) computing by limiting its spending on cloud resource purchase while satisfying its application performance requirements.
10

Gilander, Julia, and Jennifer Bodén. "Spelbaserat lärande som främjar engagemang : Ett digitalt läromedel för barn med ADHD och tidsuppfattningssvårigheter." Thesis, Linköpings universitet, Institutionen för datavetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-120617.

Abstract:
Children with ADHD often experience difficulties with time perception, which can cause them problems later in life. To help them in daily life, an analog tool called "Pussla med tid" has been developed with the purpose of teaching these children the concept of time by visualizing it. What does not yet exist is a digital tool designed for the end user that is perceived as fun. This study evaluates the possibility of digitalizing the "Pussla med tid" tool and how it could be done to engage and meet the needs of the end users. This is a qualitative study that uses a user-centered design (UCD) approach, where the needs of the target group have been taken into careful consideration in every design decision. To identify those needs, interviews with special educators were held. When developing the concept of the game, methods such as brainstorming, sketchbooks and moodboards were used to determine the interface and functions of the game. User tests were then conducted to evaluate the level of engagement and whether the game met the needs of the end users. According to the results, children with ADHD require a calm learning environment with few distractions and an overview showing the entire process. Using this information, a prototype was designed, developed and tested by the target group. The results showed that the game was generally engaging and user friendly, but that there was a lack of satisfaction. This low level of satisfaction was probably due to the difficulty level of the game, which was perceived as low and did not offer the testers any challenge, an important ingredient in creating engagement. This study generated results that could help determine how a digital tool should be designed to engage the end user. The prototype showed both strengths and weaknesses, data which could be useful when designing similar tools with a shared purpose of being engaging.
11

Collins, Peter. "The remediation of oculomotor and attentional deficits of children with ADHD : identifying and training control mechanisms based on ocular data." Thesis, University of Nottingham, 2016. http://eprints.nottingham.ac.uk/31898/.

Abstract:
This project set out to develop a cognitive training intervention for individuals with attention deficit hyperactivity disorder (ADHD). The thesis builds on research suggesting that reinforcement deficits in the ADHD population give rise to the underdevelopment of a number of cognitive abilities, in particular inhibitory control skills. Arguing that this skill is explicitly trainable and that training inhibitory gaze control is a means of training inhibitory control, this thesis set out to utilise eye-tracking technology to assess inhibitory gaze control performance in ADHD and to develop an engaging intervention in the form of a computer game capable of training the inhibitory gaze control system. Drawing on literature on inhibitory control in ADHD, the saccadic system, game development, and cognitive load theory a training intervention and battery of assessment tasks were developed iteratively across a number of pilot studies. The development process and resultant cognitive training interventions are described. The final proof-of-concept study was trialled for eight one-hour training sessions with an ADHD population (N = 8). Comparisons of pre- and post-training assessments produced strong effects for measures of gaze control, inhibitory control, timing, and attention. The results are interpreted and a number of limitations noted. The potential benefits of such interventions to aid clinicians to diagnose, to monitor, and to treat ADHD are considered. The relevance of cognitive interventions in contributing to research attempting to identify endophenotypes of ADHD is also discussed.
12

Tan, Lirong. "Identification of Disease Biomarkers from Brain fMRI Data using Machine Learning Techniques: Applications in Sensorineural Hearing Loss and Attention Deficit Hyperactivity Disorder." University of Cincinnati / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1447689755.

13

Al-Sari, Abdulkader Mohammed. "A geological investigation of multispectral remote sensing data for the Mahd Adh Dhahab and Jabal Said districts, western Saudi Arabia." Thesis, Durham University, 1989. http://etheses.dur.ac.uk/6305/.

Abstract:
This thesis examines the effect of spatial resolution on lithological and alteration mapping using remotely sensed multispectral data. The remotely sensed data were obtained by the Thematic Mapper (TM) and Airborne Thematic Mapper (ATM) over two areas in the Arabian Shield: the Mahd Adh Dhahab and Jabal Said areas. The ATM data had nominal spatial resolutions of 7.5 m, 5 m, and 2.5 m. In order to compare these data sets it was necessary to correct for sensor- and scene-related distortions. This was achieved by calibrating each data set and converting them to reflectance units using ground spectra of similar spectral resolution obtained with the Barringer Hand Held Ratioing Radiometer (HHRR). The ATM data were also corrected for across-track shading by normalising the brightness of each column to that of the centre column. The results of X-ray and laboratory spectral analysis of samples collected from the study areas support the presence of characteristic minerals associated with the alteration zones. The corrected data were analysed by a variety of techniques in order to enhance the geological information present in the data, including false colour compositing, decorrelation stretching and band ratioing. The latter two techniques proved most effective for discrimination, and several additional geological units and areas were identified which had not been mapped previously. Results further indicate that the increased spatial resolution of the ATM data did not permit greater discrimination than the TM data. This suggests TM data should prove a cost-effective way of mapping and detecting alteration zones in the Arabian Shield.
14

Cebik, James A., and William J. Connor. "AIRBORNE DATA ACQUISITION SYSTEM FOR THE RAH-66 COMANCHE AIRCRAFT." International Foundation for Telemetering, 1997. http://hdl.handle.net/10150/609828.

Abstract:
International Telemetering Conference Proceedings / October 27-30, 1997 / Riviera Hotel and Convention Center, Las Vegas, Nevada
The RAH-66 Comanche flight test program required a state-of-the-art Airborne Data Acquisition System consisting of: 1) A modular distributed system that uses a series of software-programmable building blocks capable of signal conditioning all types of sensors. 2) A digital multiplexing system capable of combining various types of digital streams at high rates, including Synchronous and Asynchronous PCM, MIL-STD-1553B, and RS-422 data streams. 3) A Data Combiner Unit that accepts synchronous PCM data streams from one to eight sources at 4 Mbps or less and a frame size of up to 8128 words each, and outputs four independent PCM streams at 8 Mbps or less with a frame size of up to 16384 words. 4) A Data System Control Unit that controls the tape recorder, serves as the interface to the Pilot's Control Unit and monitors/reports status of the data acquisition system to the Pilot's Control Unit. 5) An Airborne Computer that provides the control and interface to the pilot and copilot instrumentation displays. 6) A Cockpit Instrumentation Pilot Display System consisting of a Main Unit Multi-Function Display, a Load Factor/Hub Moment Display and a Right Wing Flight Control Position Display. The Main Unit Multi-Function Display has the capability to display multiple graphic pages generated by the Airborne Computer. 7) The ability to record high-speed avionics buses from the Mission Equipment Package (MEP), such as MIL-STD-1553B, the High Speed Data Bus (HSDB), the Processor Interconnect (PI) Bus, the Data Flow Network (DFN) and PCM, utilizing the Ampex DCRsi-107 Tape Recorder.
15

Riddle, Tara L. "Validation and Development of Adult Norms for the Contingency Naming Test." Ohio University / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1307638644.

16

Prasad, Vibhore. "The epidemiology of injuries in epilepsy and attention deficit-hyperactivity disorder (ADHD) in children and young people using the Clinical Practice Research Datalink (CPRD) and linked data." Thesis, University of Nottingham, 2016. http://eprints.nottingham.ac.uk/33216/.

Abstract:
Background: Injuries are a leading cause of morbidity and mortality in children and young people (CYP) throughout the world and in the UK. Detailed estimates of the risk of specific injuries, namely fractures, thermal injuries and poisonings, are not available for CYP with specific medical conditions, such as epilepsy or attention deficit-hyperactivity disorder (ADHD) in the English primary care population. To date there has been no description of the recording of ADHD by general practitioners (GPs) in English primary care according to people’s area-level social deprivation and strategic health authority (SHA) region. Objectives: 1. To define a cohort of CYP with epilepsy from the UK primary care population. 2. To estimate the risk of specific injuries, namely fractures, thermal injuries and poisonings in CYP with epilepsy compared to CYP without epilepsy. 3. To define and describe the cumulative administrative prevalence of ADHD in CYP in English primary care overall and by age, sex, SHA region, deprivation and calendar time. 4. To estimate the risk of specific injuries, namely fractures, thermal injuries and poisonings in CYP with ADHD compared to CYP without ADHD. Methods: This thesis describes work conducted using a large primary care dataset (the Clinical Practice Research Datalink (CPRD)) containing GP medical records and, for a proportion, linked hospital records from the hospital episodes statistics (HES) database. Firstly, the CPRD was used to define a cohort of CYP with epilepsy and CYP without epilepsy. The GP medical records for this cohort were used to estimate the risk of fractures, thermal injuries and poisonings, in CYP with epilepsy compared to CYP without epilepsy. The rates of injuries were estimated by age and sex. For a proportion of people in this study, the effect on estimates of using linked hospital medical records in addition to the GP medical records was evaluated. 
Secondly, the administrative prevalence of ADHD recorded by GPs was defined for CYP in England by identifying a cohort of CYP in the CPRD with GP medical records linked to hospital medical records. The cumulative administrative prevalence of ADHD was estimated overall and by age, sex, SHA region, deprivation and calendar time. Thirdly, the GP medical records and linked hospital medical records for the cohort of CYP with ADHD was used to estimate the risk of fractures, thermal injuries and poisonings, in CYP with ADHD compared to CYP without ADHD. The rates of injuries were estimated by age, sex and deprivation. Findings: CYP with epilepsy are at greater risk of fractures, thermal injuries and poisonings compared to CYP without epilepsy. In CYP with epilepsy the incidence of fractures is 18% higher, thermal injuries is 50% higher and poisonings 147% higher than in CYP without epilepsy, with the increased risk being restricted to medicinal poisonings. Among young adults with epilepsy, aged 19 to 24 years, the incidence rate of medicinal poisoning is four-fold that of the general population of the same age. Using GP medical records and linked hospital medical records may improve the ascertainment of injuries. For example, if hospital medical records are used in addition to GP medical records to ascertain femur fractures, a further 33% of fractures may be ascertained compared to using GP medical records alone. In comparison, if hospital medical records were used without GP medical records, 10% of femur fractures may not be ascertained. However, this increased ascertainment of injuries is unlikely to alter the estimates of risk of injuries in people with epilepsy when compared to people without epilepsy (e.g. risk of long bone fractures: using hospital and GP medical records, hazard ratio (HR)=1.25 (95% confidence interval (95%CI) 1.07 to 1.46) vs. using GP medical records alone, HR=1.23 (95%CI 1.10 to 1.38)). 
The administrative prevalence of ADHD in CYP aged 3 to 17 years old in English GP medical records is 0.88% (95% confidence interval (95%CI) 0.87 to 0.89). The prevalence of ADHD recorded by GPs is around five times greater in males than in females. The administrative prevalence of ADHD appears to increase with age, with the lowest prevalence in 3 to 4 year-olds (0.02% (95%CI 0.02 to 0.03)) and the highest prevalence in 15 to 17 year-olds (1.38% (95%CI 1.36 to 1.40)). The administrative prevalence of ADHD is twice as high in CYP from the most deprived areas compared to CYP from the least deprived areas (1.14% (95%CI 1.12 to 1.16) in the most deprived areas vs. 0.64% (95%CI 0.63 to 0.65) in the least deprived areas). CYP with ADHD are at greater risk of fractures, thermal injuries and poisonings compared to CYP without ADHD. In CYP with ADHD the incidence of fractures is 28% higher, that of thermal injuries 104% higher and that of poisonings 300% higher than in CYP without ADHD. Conclusions: CYP with epilepsy and ADHD have an increased risk of fracture, thermal injury and poisoning compared to CYP without these conditions. For both conditions the risk of poisoning is higher than the risk of fractures or thermal injuries. The administrative prevalence of ADHD is lower than estimates of community prevalence ascertained from studies not using primary care data. The prevalence of ADHD varied with deprivation, being almost twice as high in CYP from the most deprived areas compared to CYP from the least deprived areas. Future research is required to explore the circumstances surrounding injuries in CYP with and without epilepsy and ADHD. Future research is also required to explore the effect of treating epilepsy and ADHD with medication on injury risk. Research is required to explore the effect of the severity of epilepsy and ADHD on estimated risks of injuries. Future research exploring potential under-diagnosis or under-recording of diagnosis of ADHD in CYP in primary care is needed. 
CYP with epilepsy and ADHD and their parents should be provided with evidence-based injury prevention interventions because work in this thesis has demonstrated they are at higher risk of injury than the general population of CYP. Health care professionals working with CYP; child and adolescent mental health services; child education or care practitioners; and other agencies and organisations with an injury prevention role, should be made aware of the increased risk of injury in CYP with epilepsy and ADHD. Commissioners of health services for CYP should ensure service specifications include injury prevention training and provision for evidence-based injury prevention interventions.
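The prevalence estimates in the abstract above are reported with 95% confidence intervals. For a proportion p estimated from n records, the standard normal-approximation interval is p ± 1.96·sqrt(p(1−p)/n). A minimal sketch of that arithmetic follows; the denominator is hypothetical, since the thesis cohort size is not stated in this abstract:

```python
import math

def prevalence_ci(cases: int, n: int, z: float = 1.96):
    """Normal-approximation 95% CI for a prevalence proportion."""
    p = cases / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# Hypothetical denominator chosen only to illustrate the calculation.
p, lo, hi = prevalence_ci(cases=8_800, n=1_000_000)
print(f"{p:.2%} (95% CI {lo:.2%} to {hi:.2%})")
```

With a denominator of one million, a prevalence of 0.88% yields an interval of roughly ±0.02 percentage points, the same order of precision as the intervals quoted in the abstract.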
APA, Harvard, Vancouver, ISO, and other styles
17

Kubicki, Adam [Verfasser]. "Significance of sidescan sonar data in morphodynamics investigations on shelf seas : case studies on subaqueous dunes migration, refilling of extraction pits and sorted bedforms stability / Adam Kubicki." Kiel : Universitätsbibliothek Kiel, 2008. http://d-nb.info/1019660244/34.

Full text
18

Khader, Osama [Verfasser], Adam [Akademischer Betreuer] Wolisz, Thomas [Akademischer Betreuer] Sikora, Pedro [Akademischer Betreuer] Marron, and Thiemo [Akademischer Betreuer] Voigt. "Autonomous framework for supporting energy efficiency and communication reliability for periodic data flows in wireless sensor networks / Osama Khader. Gutachter: Thomas Sikora ; Pedro Marron ; Thiemo Voigt. Betreuer: Adam Wolisz." Berlin : Technische Universität Berlin, 2014. http://d-nb.info/1066162808/34.

Full text
19

Buratti, Luca. "Visualisation of Convolutional Neural Networks." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018.

Find full text
Abstract:
Neural networks, and convolutional neural networks in particular, have recently demonstrated extraordinary results in a variety of fields. Unfortunately, there is still no clear understanding of why these architectures work so well, and it is especially hard to explain their behaviour when they fail. This lack of clarity is what separates these models from being applied in concrete, safety-critical real-life scenarios such as healthcare or self-driving cars. For this reason, several studies have been carried out in recent years to create methods capable of explaining what is happening inside a neural network, or of showing where the network is looking when it makes a given prediction. These techniques are the focus of this thesis and the bridge between the two case studies presented below. The aim of this work is therefore twofold: first, to use these methods to analyse, and thus understand how to improve, applications based on convolutional neural networks; and second, to investigate the generalisation ability of these architectures, again by means of these methods.
20

Ammanouil, Rita. "Contributions au démélange non-supervisé et non-linéaire de données hyperspectrales." Thesis, Université Côte d'Azur (ComUE), 2016. http://www.theses.fr/2016AZUR4079/document.

Full text
Abstract:
Spectral unmixing has been an active field of research since the earliest days of hyperspectral remote sensing. It is concerned with the case where various materials are found in the spatial extent of a pixel, resulting in a spectrum that is a mixture of the signatures of those materials. Unmixing then reduces to estimating the pure spectral signatures and their corresponding proportions in every pixel. In the hyperspectral unmixing jargon, the pure signatures are known as the endmembers and their proportions as the abundances. This thesis focuses on spectral unmixing of remotely sensed hyperspectral data. In particular, it is aimed at improving the accuracy of the extraction of compositional information from hyperspectral data. This is done through the development of new unmixing techniques in two main contexts, namely the unsupervised and nonlinear cases. In particular, we propose a new technique for blind unmixing, we incorporate spatial information in (linear and nonlinear) unmixing, and we finally propose a new nonlinear mixing model. More precisely, first, an unsupervised unmixing approach based on collaborative sparse regularization is proposed, where the library of endmember candidates is built from the observations themselves. This approach is then extended in order to take into account the presence of noise among the endmember candidates. Second, within the unsupervised unmixing framework, two graph-based regularizations are used in order to incorporate prior local and nonlocal contextual information. Next, within a supervised nonlinear unmixing framework, a new nonlinear mixing model based on vector-valued functions in a reproducing kernel Hilbert space (RKHS) is proposed. This model allows different nonlinear functions to be considered at different bands, regularizes the discrepancies between these functions, and accounts for neighboring nonlinear contributions. Finally, the vector-valued kernel framework is used in order to promote spatial smoothness of the nonlinear part in a kernel-based nonlinear mixing model. Simulations on synthetic and real data show the effectiveness of all the proposed techniques.
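The graph-based regularization mentioned in the abstract above penalizes differences in abundances between pixels that are connected in the image graph, through the graph Laplacian L = D − W, using the identity aᵀLa = ½·Σᵢⱼ wᵢⱼ(aᵢ − aⱼ)². A toy sketch of that penalty, with invented weights and abundances purely for illustration:

```python
def graph_laplacian(w):
    """L = D - W for a symmetric weight matrix given as nested lists."""
    n = len(w)
    return [[(sum(w[i]) if i == j else 0.0) - w[i][j] for j in range(n)]
            for i in range(n)]

def laplacian_penalty(L, a):
    """a^T L a, i.e. 1/2 * sum_ij w_ij (a_i - a_j)^2 for one abundance map."""
    return sum(a[i] * sum(L[i][j] * a[j] for j in range(len(a)))
               for i in range(len(a)))

# Three pixels; pixels 0 and 1 are deemed similar (weight 1), pixel 2 is not.
W = [[0, 1, 0], [1, 0, 0], [0, 0, 0]]
L = graph_laplacian(W)
print(laplacian_penalty(L, [0.9, 0.8, 0.1]))  # small: similar pixels agree
print(laplacian_penalty(L, [0.9, 0.1, 0.1]))  # large: similar pixels disagree
```

Adding this term to the unmixing objective therefore pushes similar or neighboring pixels toward similar abundance estimates without constraining dissimilar ones.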
21

Brown, Steven Richard. "A design of experiments approach for engineering carbon metabolism in the yeast Saccharomyces cerevisiae." Thesis, University of Exeter, 2016. http://hdl.handle.net/10871/26158.

Full text
Abstract:
The proven ability to ferment Saccharomyces cerevisiae on a large scale presents an attractive target for producing chemicals and fuels from sustainable sources. Efficient and predominant carbon flux through to ethanol is a significant engineering issue in the development of this yeast as a multi-product cell chassis used in biorefineries. In order to evaluate diversion of carbon flux away from ethanol, combinatorial deletions were investigated in genes encoding the six isozymes of alcohol dehydrogenase (ADH), which catalyse the terminal step in ethanol production. The scarless, dominant and counter-selectable amdSYM gene deletion method was optimised for generation of a combinatorial ADH knockout library in an industrially relevant strain of S. cerevisiae. Current understanding of the individual ADH genes fails to fully evaluate genotype-by-genotype and genotype-by-environment interactions: rather, further research of such a complex biological process requires a multivariate mathematical modelling approach. Application of such an approach using the Design of Experiments (DoE) methodology is appraised here as essential for detailed empirical evaluation of complex systems. DoE provided empirical evidence that in S. cerevisiae: i) the ADH2 gene is not associated with producing ethanol under anaerobic culture conditions in combination with 25 g l-1 glucose substrate concentrations; ii) ADH4 is associated with increased ethanol production when the cell is confronted with a zinc-limited [1 μM] environment; and iii) ADH5 is linked with the production of ethanol, predominantly at pH 4.5. A successful metabolic engineering strategy is detailed which increases the product portfolio of S. cerevisiae, currently used for large-scale production of bioethanol. Heterologous expression of the cytochrome P450 fatty acid peroxygenase from Jeotgalicoccus sp., OleTJE, fused to the RhFRED reductase from Rhodococcus sp. NCIMB 978, converted free fatty acid precursors to C13, C15 and C17 alkenes (3.81 ng μl-1 total alkene concentration).
22

Hsien, Tsai Ming, and 蔡明憲. "An Effective Amount-Driven Encoding/Decoding Method (ADEM) for Low-Power Data Bus with Coupling." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/37690855267536619029.

Full text
Abstract:
Master's thesis
National Chiao Tung University
Institute of Computer Science and Engineering
94 (ROC academic year)
As technology advances, increased bus lengths and the narrower geometrical proximity of adjacent bus lines create non-negligible coupling capacitances between adjacent lines. More power is therefore dissipated by charging and discharging these coupling capacitances, so the effects of both line-to-ground and coupling capacitances play an important role in low-power bus design. In this thesis, we propose an integrated method, named the amount-driven encoding method (ADEM), which minimizes the power dissipation of on-chip data buses by combining bus encoding with a spacing mechanism. In our bus model, the bus lines are treated as a set of non-intersecting adjacent pairs. The spacing mechanism is applied to decrease the coupling capacitances between pairs. For the coupling capacitance between the two adjacent lines within a pair, we reduce its charge and discharge activity by applying one of four encoding methods in each bus cycle. Our method saves more than 25% of bus power on average compared to the un-encoded case when transferring a large set of commonly used multimedia files on the bus. Compared to previous work, ADEM saves more power with only a small overhead in circuit complexity and delay time.
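The abstract above does not specify ADEM's four encodings, so as an illustration of the general idea of cutting switching activity by per-cycle encoding, here is the classic bus-invert code (a well-known technique, not ADEM itself): the sender transmits the complement of a word, plus an invert signal, whenever that flips fewer lines than sending the word directly.

```python
def toggles(prev: int, cur: int) -> int:
    """Number of bus lines that switch state between two consecutive cycles."""
    return bin(prev ^ cur).count("1")

def bus_invert(prev: int, cur: int, width: int):
    """Classic bus-invert coding: transmit ~cur (raising an extra invert
    line) whenever sending cur directly would flip more than half the lines."""
    if toggles(prev, cur) > width // 2:
        return (~cur) & ((1 << width) - 1), 1
    return cur, 0

prev, cur, width = 0b0000_0000, 0b1111_1110, 8
encoded, inv = bus_invert(prev, cur, width)
# 7 line toggles un-encoded vs. 1 toggle plus the invert line when encoded.
print(toggles(prev, cur), toggles(prev, encoded) + inv)
```

Encoding schemes of this family trade one extra signal line and a small codec circuit for a bounded number of transitions per cycle, which is the same cost/benefit balance the abstract describes for ADEM.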
23

Huang, Andrew "bunnie." "ADAM: A Decentralized Parallel Computer Architecture Featuring Fast Thread and Data Migration and a Uniform Hardware Abstraction." 2002. http://hdl.handle.net/1721.1/7096.

Full text
Abstract:
The furious pace of Moore's Law is driving computer architecture into a realm where the speed of light is the dominant factor in system latencies. The number of clock cycles to span a chip is increasing, while the number of bits that can be accessed within a clock cycle is decreasing. Hence, it is becoming more difficult to hide latency. One alternative is to reduce latency by migrating threads and data, but the overhead of existing implementations has so far made migration an unserviceable solution. I present an architecture, implementation, and mechanisms that reduce the overhead of migration to the point where migration is a viable supplement to other latency-hiding mechanisms, such as multithreading. The architecture is abstract, and presents programmers with a simple, uniform, fine-grained multithreaded parallel programming model with implicit memory management. In other words, the spatial nature and implementation details (such as the number of processors) of a parallel machine are entirely hidden from the programmer. Compiler writers are encouraged to devise programming languages for the machine that guide a programmer to express their ideas in terms of objects, since objects exhibit an inherent physical locality of data and code. The machine implementation can then leverage this locality to automatically distribute data and threads across the physical machine by using a set of high-performance migration mechanisms. An implementation of this architecture could migrate a null thread in 66 cycles -- over a factor of 1000 improvement over previous work. Performance also scales well; the time required to move a typical thread is only 4 to 5 times that of a null thread. Data migration performance is similar, and scales linearly with data block size. 
Since the performance of the migration mechanism is on par with that of an L2 cache, the implementation simulated in my work has no data caches and relies instead on multithreading and the migration mechanism to hide and reduce access latencies.
24

Hassanin, Hanan. "Medication Adherence In Children & Adolescents With ADHD In Hawaii: A Secondary Data Analysis Of An Insured Population." Thesis, 2005. http://hdl.handle.net/10125/10398.

Full text
25

Pin-Tzu Chen and 陳品慈. "Differentiation between Resting-State fMRI data from ADHD and Normal Subjects Based on Functional Connectivity and Machine Learning." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/19182679332257666974.

Full text
26

Uhlířová, Jaromíra. "Antropologický profil dětí s diagnózou ADHD (attention deficit hyperactivity disorder)." Master's thesis, 2012. http://www.nusl.cz/ntk/nusl-305371.

Full text
Abstract:
This final thesis deals with the anthropological profile of children with ADHD (attention deficit hyperactivity disorder), which is one of the most common psychiatric diagnoses in childhood. Many studies have demonstrated a relationship between ADHD and differences in physical growth, mostly in terms of growth retardation and lower weight parameters. However, these differences are often associated with the use of pharmacological treatment. Some authors point to the possibility of the influence of ADHD itself. In this paper we compare the anthropometric parameters of 40 boys with ADHD, aged 6.00 to 10.99 years, who are treated with medication (methylphenidate) and 172 boys in a control group. The compilation of the control group for use in clinical research on ADHD was one of the objectives of the thesis. We also compared anamnestic data obtained using questionnaires, which provided information such as birth parameters, duration of breastfeeding, eating habits, amount of time spent in physical activity, and time spent watching television. The control group of healthy individuals was established to better reflect the somatic profile of the recent child population and also provided anamnestic data. Compared with currently used growth standards, the control group showed some significant differences, which could be...
27

Xu, Hong. "Efficient Workload and Resource Management in Datacenters." Thesis, 2013. http://hdl.handle.net/1807/36071.

Full text
Abstract:
This dissertation focuses on developing algorithms and systems to improve the efficiency of operating mega datacenters with hundreds of thousands of servers. In particular, it seeks to address two challenges: First, how to distribute the workload among the set of datacenters geographically deployed across the wide area? Second, how to manage the server resources of datacenters using virtualization technology? In the first part, we consider the workload management problem in geo-distributed datacenters. We first present a novel distributed workload management algorithm that jointly considers request mapping, which determines how to direct user requests to an appropriate datacenter for processing, and response routing, which decides how to select a path among the set of ISP links of a datacenter to route the response packets back to a user. In the next chapter, we study some key aspects of cost and workload in geo-distributed datacenters that have not been fully understood before. Through extensive empirical studies of climate data and cooling systems, we make a case for temperature aware workload management, where the geographical diversity of temperature and its impact on cooling energy efficiency can be used to reduce the overall cooling energy. Moreover, we advocate for holistic workload management for both interactive and batch jobs, where the delay-tolerant elastic nature of batch jobs can be exploited to further reduce the energy cost. A consistent 15% to 20% cooling energy reduction, and a 5% to 20% overall cost reduction are observed from extensive trace-driven simulations. In the second part of the thesis, we consider the resource management problem in virtualized datacenters. We design Anchor, a scalable and flexible architecture that efficiently supports a variety of resource management policies. We implement a prototype of Anchor on a small-scale in-house datacenter with 20 servers. 
Experimental results and trace-driven simulations show that Anchor is effective in realizing various resource management policies, and its simple algorithms are practical to solve virtual machine allocation with thousands of VMs and servers in just ten seconds.
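The request-mapping step described in the abstract above directs user requests to an appropriate datacenter; the actual distributed algorithm is in the thesis, not the abstract. As a purely illustrative sketch under strong assumptions (linear per-unit costs, hard capacities, hypothetical site names), a greedy mapper might look like:

```python
def map_requests(demand, datacenters):
    """Greedily send load to the cheapest datacenter with spare capacity.
    `datacenters` is a list of (name, unit_cost, capacity) tuples."""
    plan = {}
    for name, cost, cap in sorted(datacenters, key=lambda d: d[1]):
        if demand <= 0:
            break
        take = min(cap, demand)
        plan[name] = take
        demand -= take
    if demand > 0:
        raise ValueError("insufficient total capacity")
    return plan

# Hypothetical sites: (name, cost per unit of load, capacity).
sites = [("oregon", 0.04, 60), ("virginia", 0.06, 80), ("dublin", 0.09, 100)]
print(map_requests(100, sites))  # {'oregon': 60, 'virginia': 40}
```

Temperature-aware scheduling, as the abstract describes, would amount to making the unit costs vary with each site's cooling efficiency rather than keeping them fixed.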
28

Jensen, Martin, and Mattias Ross. "Kognitiv tillgänglighet på webben." Thesis, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:hb:diva-22248.

Full text
Abstract:
Since the start of 2019 a new law has been in force in Sweden; it regulates web accessibility conformity for all publicly funded organisations. At the same time, many services and businesses are moving their physical fronts to the web. With more digitalization comes a greater need for web accessibility, and there is a risk that users with cognitive disabilities are neglected or forgotten when new and advanced tools and technologies are developed. Accessibility often receives low or no priority in the private sector due to a lack of time and resources. Cognitive disabilities do not always show, and there is a risk that the difficulties are not taken into account when services or webpages are developed. These disabilities can affect areas such as the ability to learn new things, memory, and the ability to plan one's actions. The purpose of this study is to investigate what knowledge and experience Swedish developers have concerning accessibility, cognitive disabilities and the needs those individuals might have. To perform this study the needs had to be identified and connected to the standard that acts as a base for the new Swedish law. For this purpose, theory was gathered through a literature study as a foundation for a qualitative study consisting of a series of interviews with developers from six different organisations in the Swedish private sector. The interview subjects were selected from different organisations so that their views and opinions would not be affected by one another, for the purpose of increasing the study's generalizability. The interviews were conducted by two different interviewers at locations selected by the interviewees. The results of the literature study showed that there is no direct support for cognitive disabilities in the standard, but there are guidelines that could be applied to some of the needs identified in the study. 
Furthermore, the results of the interviews showed that the general knowledge and experience concerning accessibility and specifically cognitive accessibility is very low among Swedish developers. This study is aimed towards people working with development but is also suitable for students and teachers who wish to know more about accessibility and cognitive accessibility.
29

Wolverton, Cheryl Lynn. "Staff nurse perceptions' of nurse manager caring behaviors: psychometric testing of the Caring Assessment Tool-Administration (CAT-adm©)." Diss., 2016. http://hdl.handle.net/1805/10462.

Full text
Abstract:
Indiana University-Purdue University Indianapolis (IUPUI)
Caring relationships established between nurse managers and staff nurses promote positive work environments. However, research about staff nurses' perceptions of nurse manager caring behaviors is limited. A 94-item Caring Assessment Tool-Administration (CAT-adm©) was developed to measure staff nurses' perceptions of nurse managers' caring behaviors; however, it lacked robust psychometric testing. This study was undertaken to establish the CAT-adm© survey as a reliable and valid tool to measure staff nurses' perceptions of nurse managers' caring behaviors. The Quality-Caring Model® (QCM®) served as the theoretical framework. Specific aims were to 1) evaluate construct validity of the CAT-adm© survey by describing factors that account for variance in staff nurses' perceptions of nurse manager caring, 2) estimate internal consistency, and 3) conduct item reduction analysis. Four research questions were: 1) Will the factor structure of observed data fit an 8-factor solution? 2) What is the internal consistency reliability of the CAT- adm©? 3) What items can be reduced while maintaining an acceptable factor structure? and 4) What are staff nurses' perceptions of nurse manager caring behaviors? A cross-sectional descriptive design was used. A sample of 703 staff nurses from Midwestern, Midatlantic and Southern Regions of the U.S. completed the CAT-adm© survey electronically. Analysis included Confirmatory Factor Analysis (CFA), Exploratory Factor Analysis (EFA), univariate analysis, and descriptive statistics. CFA did not support an 8-factor solution. EFA supported a two-factor solution and demonstrated significant shared variance between the two factors. This shared variance supported a one-factor solution that could conceptually be labeled Caring Behaviors. Random selection reduced the scale to 25-items while maintaining a Cronbach's Alpha of .98. 
Using the new 25-item scale, the composite score mean of staff nurses' perceptions of nurse manager caring behaviors indicated a moderately high level of caring. Suggestions for nursing administration, nurse manager practice, leadership, education and for future research were given. The new 25-item CAT-adm© survey has acceptable reliability and validity. The 25-item CAT-adm© survey provides hospital administrators, nurse managers, and researchers with an instrument to collect valuable information about the caring behaviors used by nurse managers in relationship with staff nurses.
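The Cronbach's alpha of .98 reported above is the standard internal-consistency statistic α = k/(k−1)·(1 − Σσ²ᵢ/σ²ₜ), where σ²ᵢ are item variances and σ²ₜ is the variance of respondents' total scores. A minimal sketch of that formula with invented item scores (not the thesis data):

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """`items` is a list of per-item score lists (one list per survey item),
    all answered by the same respondents in the same order."""
    k = len(items)
    totals = [sum(col) for col in zip(*items)]          # per-respondent totals
    item_var = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three hypothetical items answered by four respondents.
scores = [[4, 5, 3, 5], [4, 4, 3, 5], [5, 5, 3, 4]]
print(round(cronbach_alpha(scores), 3))
```

Values near 1, such as the .98 retained after reducing the CAT-adm© to 25 items, indicate that the items vary together rather than independently.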
30

Garcia, Ilicínio António Afonso. "Relatório do estágio profissional realizado no ISCPSI: Retroversão do conteúdo da página Web do ISCPSI à data de 20/10/2009: Tradução do estudo comparado El agente infiltrado en España y Portugal de Adám Carrizo Gonzalez-Castell: Tradução dos três primeiros capítulos do livro Introducción a la criminología de Alfonso Serrano Maíllo." Master's thesis, 2010. http://hdl.handle.net/10451/4240.

Full text
Abstract:
Master's traineeship report, Translation, Universidade de Lisboa, Faculdade de Letras, 2011
The training report concerning the Master's Degree in Translation aims to put into practice the knowledge, experience and translation practices acquired. With this professional traineeship one intended to enhance the competences obtained during the degree in Languages, Literatures and Modern Cultures and the first year of the Master's in Translation. In order to do so, one proposed to make the retroversion of the Instituto Superior de Ciências Policiais e Segurança Interna (ISCPSI) Web page from Portuguese to Spanish, the translation of the compared study El agente infiltrado en España y Portugal, by Adán Carrizo González-Castell, and also the first two chapters of Introducción a la Criminología, by Alfonso Serrano Maíllo. These works are diversified and intended for different targets. The first work, the retroversion of the ISCPSI's Web page, is aimed at students from Hispanic countries who wish to participate in the Erasmus Programme; the second, since it is a compared study, is aimed at Portuguese-speaking students and specialists in Penal Law; and the third and last work is intended for Portuguese-speaking students and specialists in Criminology. The main purpose of the present report, written within the professional traineeship of the Master's Degree in Translation, was to give an account of the translation process of the above-mentioned works. The report includes a preliminary analysis of the texts, a description of the translation didactics applied, the difficulties and problems found, as well as translation proposals. The referred works are attached to this report, together with their respective originals.