Dissertations on the topic "Data harm"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles.
Browse the top 50 dissertations for research on the topic "Data harm".
Next to every work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication in PDF format and read its online abstract, provided the relevant parameters are included in the metadata.
Browse dissertations from a wide range of disciplines and compile your bibliography correctly.
Buffenbarger, Lauren. „Ethics in Data Science: Implementing a Harm Prevention Framework“. University of Cincinnati / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1623166419961692.
Andersson, Erica, and Ida Knutsson. „Immigration - Benefit or harm for native-born workers?“ Thesis, Linnéuniversitetet, Institutionen för nationalekonomi och statistik (NS), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-53829.
Chang, David C. „A comparison of computed and measured transmission data for the AGM-88 HARM radome“. Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1993. http://handle.dtic.mil/100.2/ADA274868.
McCullagh, Karen. „The social, cultural, epistemological and technical basis of the concept of 'private' data“. Thesis, University of Manchester, 2012. https://www.research.manchester.ac.uk/portal/en/theses/the-social-cultural-epistemological-and-technical-basis-of-the-concept-of-private-data(e2ea538a-8e5b-43e3-8dc2-4cdf602a19d3).html.
Steeg, Sarah. „Estimating effects of self-harm treatment from observational data in England : the use of propensity scores to estimate associations between clinical management in general hospitals and patient outcomes“. Thesis, University of Manchester, 2017. https://www.research.manchester.ac.uk/portal/en/theses/estimating-effects-of-selfharm-treatment-from-observational-data-in-england-the-use-of-propensity-scores-to-estimate-associations-between-clinical-management-in-general-hospitals-and-patient-outcomes(ab6f96b1-f326-43ea-9999-0c410e4c517d).html.
Mpame, Mario Egbe [Verfasser]. „The General Data Protection Regulation and the effective protection of data subjects’ rights in the online environment : To what extent are these rights enforced during mass harm situations? / Mario Egbe Mpame“. Baden-Baden : Nomos Verlagsgesellschaft mbH & Co. KG, 2021. http://d-nb.info/1237168708/34.
Gkaravella, Antigoni. „A study of patients referred following an episode of self-harm, a suicide attempt, or in a suicidal crisis using routinely collected data“. Thesis, University of East London, 2014. http://roar.uel.ac.uk/4593/.
Wojda, Magdalena A. „A focus on the risk of harm : applying a risk-centered purposive approach to the interpretation of "personal information" under Canadian data protection laws“. Thesis, University of British Columbia, 2015. http://hdl.handle.net/2429/55133.
Der volle Inhalt der QuelleLaw, Peter A. Allard School of
Graduate
Berto, Hedda. „Sharing is Caring : An Examination of the Essential Facilities Doctrine and its Applicability to Big Data“. Thesis, Uppsala universitet, Juridiska institutionen, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-411945.
Lee, Amra. „Why do some civilian lives matter more than others? Exploring how the quality, timeliness and consistency of data on civilian harm affects the conduct of hostilities for civilians caught in conflict“. Thesis, Uppsala universitet, Institutionen för freds- och konfliktforskning, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-387653.
Gratton, Eloïse. „Redéfinir la notion de donnée personnelle dans le contexte des nouvelles technologies de l'Internet“. Thesis, Paris 2, 2012. http://www.theses.fr/2012PA020061.
In the late sixties, with the growing use of computers by organizations, a very broad definition of personal information as "information about an identifiable individual" was elaborated and incorporated into data protection laws ("DPLs"). More recently, with the Internet and the circulation of new types of information (IP addresses, location information, etc.), the efficiency of this definition may be challenged. This thesis proposes a new way of interpreting personal information. Instead of a literal interpretation, an interpretation that takes into account the purpose behind DPLs is proposed, in order to ensure that DPLs do what they are supposed to do: address or avoid the risk of harm to individuals triggered by organizations handling their personal information. While the collection or disclosure of information may trigger a more subjective kind of harm (for the collection, a feeling of being observed; for the disclosure, embarrassment and humiliation), the use of information will trigger a more objective kind of harm (financial, physical, discrimination, etc.). Various criteria useful for evaluating this risk of harm are proposed. The thesis aims to provide a guide that may be used to determine whether certain information should qualify as personal information, and offers a framework under which DPLs remain efficient in light of modern technologies and the Internet.
Ramanayaka, Mudiyanselage Asanga. „Data Engineering and Failure Prediction for Hard Drive S.M.A.R.T. Data“. Bowling Green State University / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1594957948648404.
Acuna, Stamp Annabelen. „Design Study for Variable Data Printing“. University of Cincinnati / OhioLINK, 2000. http://rave.ohiolink.edu/etdc/view?acc_num=ucin962378632.
Troska, Jan Kevin. „Radiation-hard optoelectronic data transfer for the CMS tracker“. Thesis, Imperial College London, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.313621.
Zhang, Shuang Nan. „Instrumentation and data analysis for hard X-ray astronomy“. Thesis, University of Southampton, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.252689.
Schmedding, Anna. „Epidemic Spread Modeling For Covid-19 Using Hard Data“. W&M ScholarWorks, 2021. https://scholarworks.wm.edu/etd/1627047844.
Yip, Yuk-Lap Kevin, and 葉旭立. „HARP: a practical projected clustering algorithm for mining gene expression data“. Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2003. http://hub.hku.hk/bib/B29634568.
Craig, David W. (David William). „Light traffic loss of random hard real-time tasks in a network“. Dissertation, Carleton University, Electrical Engineering, Ottawa, 1988.
Laclau, Charlotte. „Hard and fuzzy block clustering algorithms for high dimensional data“. Thesis, Sorbonne Paris Cité, 2016. http://www.theses.fr/2016USPCB014.
With the increasing amount of data available, unsupervised learning has become an important tool used to discover underlying patterns without the need to label instances manually. Among the different approaches proposed to tackle this problem, clustering is arguably the most popular one. Clustering is usually based on the assumption that each group, also called a cluster, is distributed around a center defined in terms of all features, while in some real-world applications dealing with high-dimensional data this assumption may be false. To this end, co-clustering algorithms were proposed to describe clusters by the subsets of features that are most relevant to them. The obtained latent structure of the data is composed of blocks usually called co-clusters. In the first two chapters, we describe two co-clustering methods that proceed by differentiating the relevance of features, calculated with respect to their capability of revealing the latent structure of the data, in both a probabilistic and a distance-based framework. The probabilistic approach uses the mixture-model framework, where the irrelevant features are assumed to have a different probability distribution that is independent of the co-clustering structure. The distance-based (also called metric-based) approach relies on an adaptive metric where each variable is assigned a weight that defines its contribution to the resulting co-clustering. From the theoretical point of view, we show the global convergence of the proposed algorithms using Zangwill's convergence theorem. In the last two chapters, we consider a special case of co-clustering where, contrary to the original setting, each subset of instances is described by a unique subset of features, resulting in a diagonal structure of the initial data matrix. As for the first two contributions, we consider both probabilistic and metric-based approaches. The main idea of the proposed contributions is to impose two different kinds of constraints: (1) we fix the number of row clusters to the number of column clusters; (2) we seek a structure of the original data matrix that has the maximum values on its diagonal (for instance, for binary data, we look for diagonal blocks composed of ones with zeros outside the main diagonal). The proposed approaches enjoy the convergence guarantees derived from the results of the previous chapters. Finally, we present both hard and fuzzy versions of the proposed algorithms. We evaluate our contributions on a wide variety of synthetic and real-world benchmark binary and continuous data sets related to text-mining applications and analyze the advantages and drawbacks of each approach. To conclude, we believe that this thesis covers explicitly a vast majority of possible scenarios arising in hard and fuzzy co-clustering and can be seen as a generalization of some popular biclustering approaches.
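As a rough illustration of the adaptive-metric idea described in this abstract, the Python sketch below implements a feature-weighted k-means loop in which each variable receives a weight inversely related to its within-cluster dispersion, so that the relevant features dominate the distance. This is an assumed, simplified stand-in for illustration only, not the co-clustering algorithms developed in the thesis; the function name and the weight-update rule are hypothetical.

```python
import numpy as np

def weighted_kmeans(X, k, n_iter=20, beta=2.0, seed=None):
    """Toy feature-weighted k-means (illustrative only): features whose
    within-cluster dispersion is small receive larger weights, so the
    adaptive distance emphasizes the variables that reveal the clusters."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    centers = X[rng.choice(n, size=k, replace=False)].copy()
    w = np.full(d, 1.0 / d)                          # feature weights, sum to 1
    labels = np.zeros(n, dtype=int)
    for _ in range(n_iter):
        # assign each point using the weighted squared Euclidean distance
        dist = (((X[:, None, :] - centers[None, :, :]) ** 2) * w).sum(axis=2)
        labels = dist.argmin(axis=1)
        # update cluster centers
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
        # update feature weights from per-feature within-cluster dispersion
        disp = sum(((X[labels == j] - centers[j]) ** 2).sum(axis=0) for j in range(k))
        disp = np.asarray(disp) + 1e-12
        w = disp ** (-1.0 / (beta - 1.0))
        w /= w.sum()
    return labels, centers, w
```

With beta = 2 a feature's weight is proportional to the inverse of its within-cluster dispersion, which is one common choice for this kind of weighting scheme.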
Joyce, Robert. „Dynamic optimisation of NP-hard combinatorial problems of engineering data sets“. Thesis, Coventry University, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.261170.
Darragh, Neil. „An adaptive partial response data channel for hard disk magnetic recording“. Thesis, University of Plymouth, 1994. http://hdl.handle.net/10026.1/2594.
Yasuda, Takeo. „Circuit Technologies for High Performance Hard Disk Drive Data Channel LSI“. 京都大学 (Kyoto University), 2001. http://hdl.handle.net/2433/150621.
Puchol, Carlos Miguel. „An automation-based design methodolgy [sic] for distributed, hard real-time systems“. 1998. Digital version accessible at: http://wwwlib.umi.com/cr/utexas/main.
Svensson, Karin. „Har kvinnor förändrade pendlingsmönster? : En kvantitativ studie om kvinnors pendlingsmönster har påverkats av att deras utbildningsnivå ökat“. Thesis, Uppsala universitet, Kulturgeografiska institutionen, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-413694.
Li, Guijun, and 李桂君. „Development of recording technology with FePt recording media and magnetic tunnel junction sensors with conetic alloy“. Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2013. http://hub.hku.hk/bib/B50899776.
Chen, Tao. „Development and simulation of hard real-time switched-ethernet avionics data network“. Thesis, Cranfield University, 2011. http://dspace.lib.cranfield.ac.uk/handle/1826/6995.
Meister, Eric. „Using hard cost data on resource consumption to measure green building performance“. [Gainesville, Fla.] : University of Florida, 2005. http://purl.fcla.edu/fcla/etd/UFE0010531.
Deng, Jiantao. „Adaptation of A TruckSim Model to Experimental Heavy Truck Hard Braking Data“. The Ohio State University, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=osu1259633762.
Tunstall, Glen Alan. „Dynamic characterisation of the head-media interface in hard disk drives using novel sensor systems“. Thesis, University of Plymouth, 2002. http://hdl.handle.net/10026.1/1643.
Harrison, Christopher Bernard. „Feasibility of rock characterization for mineral exploration using seismic data“. Curtin University of Technology, Western Australia School of Mines, Department of Exploration Geophysics, 2009. http://espace.library.curtin.edu.au:80/R/?func=dbin-jump-full&object_id=129417.
In 2002, two high-resolution seismic lines, the East Victory and Intrepid, were acquired along with sonic logging to assess the feasibility of seismic imaging and rock characterisation at the St. Ives gold camp in Western Australia. An innovative research project was undertaken combining seismic processing, rock characterisation, reflection calibration, seismic inversion and seismic attribute analysis to show that volumetric predictions of rock type and gold content may be viable in hard-rock environments. Accurate seismic imaging and reflection identification proved to be a challenging but achievable task in the hard-rock environment of the Yilgarn craton. Accurate results were confounded by crooked seismic line acquisition, low signal-to-noise ratio, regolith distortions, small elastic-property variations in the rock, and a limited volume of sonic logging. Each of these challenges, however, had a systematic solution which allowed accurate results to be achieved.
Seismic imaging was successfully completed on both the East Victory and Intrepid data sets, revealing complex structures in the Earth from as shallow as 100 metres to as deep as 3000 metres. The successful imaging required homogenization of the regolith to eliminate regolith travel-time distortions and accurate constant-velocity analysis for reflection focusing using migration. Verification of the high-amplitude reflections within each image was achieved through integration of surface geological and underground mine data as well as calibration with log-derived synthetic seismograms. The most accurate imaging results were ultimately achieved on the East Victory line, which had a good signal-to-noise ratio and a close-to-straight acquisition direction compared to the more crooked Intrepid seismic line.
The sonic logs from both the East Victory and Intrepid seismic lines were comprehensively analysed by re-sampling and separating the data based on rock type, structure type, alteration type, and Au assay. Cross-plotting of the log data revealed that statistically accurate separation between harder and softer rocks, as well as between sheared and un-sheared rock, was possible based solely on compressional-wave velocity, shear-wave velocity, density, and acoustic and elastic impedance. These results were used successfully to derive empirical relationships between seismic attributes and geology. Calibration of the logs and seismic data provided proof that reflections, especially high-amplitude reflections, correlated well with certain rock properties as expected from the sonic data, including high-gold-content sheared zones. The correlation value, however, varied with the signal-to-noise ratio and crookedness of the seismic line. Subsequent numerical modelling confirmed that separating soft from hard rocks can be based on both the general reflectivity pattern and impedance contrasts.
Indeed, impedance inversions on the calibrated seismic and sonic data produced reliable volumetric separations between harder rocks (basalt and dolerite) and softer rocks (intermediate intrusive, mafic, and volcaniclastic). Acoustic impedance inversions produced the most statistically valid volumetric predictions, with the simultaneous use of acoustic and elastic inversions producing stable separation of softer and harder rock zones. Similarly, Lambda-Mu-Rho inversions showed good separation between softer and harder rock zones. With high-gold-content rock associated more with "softer" hard rocks and sheared zones, these volumetric inversions provide valuable information for targeted mining. The geostatistical method applied to attribute analysis, however, was highly ambiguous due to low correlations and thus produced overly generalized predictions. The overall reliability of the seismic inversion results was based on the quality and quantity of sonic data, again leaving the East Victory data set with superior results compared to the Intrepid data set.
In general, detailed processing and analysis of the 2D seismic data, and the study of the relationship between the recorded wave-field and rock properties measured from borehole logs, core samples and open-cut mining, revealed that positive correlations can be developed between the two. The results of this rigorous research show that rock characterisation using seismic methodology will greatly benefit the mineral industry.
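As a small worked example of the impedance quantities this abstract leans on, the Python sketch below (with made-up log values, not data from the thesis) computes acoustic impedance from velocity and density and the normal-incidence reflection coefficient at each layer interface.

```python
import numpy as np

def acoustic_impedance(vp, rho):
    """Acoustic impedance Z = rho * Vp (kg/m^3 * m/s)."""
    return np.asarray(rho) * np.asarray(vp)

def reflection_coefficients(z):
    """Normal-incidence reflection coefficient at each interface:
    R = (Z2 - Z1) / (Z2 + Z1)."""
    z = np.asarray(z, dtype=float)
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])

# illustrative three-layer log: a softer unit between two harder units
vp  = np.array([6200.0, 5400.0, 6500.0])   # P-wave velocity, m/s (assumed)
rho = np.array([2900.0, 2650.0, 3000.0])   # density, kg/m^3 (assumed)
z = acoustic_impedance(vp, rho)
print(reflection_coefficients(z))           # contrasts at both interfaces
```

Larger impedance contrasts produce stronger reflections, which is why the high-amplitude reflections discussed above track boundaries between harder and softer rock.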
Li, Hai. „Storage Physics and Noise Mechanism in Heat-Assisted Magnetic Recording“. Research Showcase @ CMU, 2016. http://repository.cmu.edu/dissertations/706.
Asseburg, Christian. „A Bayesian approach to modelling field data on multi-species predator prey-interactions“. Thesis, St Andrews, 2006. https://research-repository.st-andrews.ac.uk/handle/10023/174.
Ryman, Jonatan, and Felicia Torbjörnsson. „Hur har digitaliseringen och Big data påverkat revisionsbranschen? : - Hur ser framtiden ut?“ Thesis, Högskolan i Halmstad, Akademin för företagande, innovation och hållbarhet, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-45310.
Information and data are something that is of value in most organizations. Today, there is the opportunity to collect and analyze large amounts of data, which usually goes by the term Big data. If the essential material is handled correctly, it may result in improved decision making and increased insights about clients and the market. The use and processing of Big data has been made possible in recent years through new technology, but it is associated with high costs, which has meant that small and medium-sized companies do not have the resources to implement Big data technologies. This means that larger and more established companies have an advantage. Big data can also be widely applied to auditing firms. The study's problem discussion shows that despite the many benefits that Big data technologies and digitization create for the auditing industry, its development is lagging behind other industries. Therefore, the purpose of the study was designed to see how digitalization and Big data technologies have affected the Swedish auditing industry. The purpose of this study is to investigate, analyze and describe how the implementation of Big data has affected, and will affect, the auditors' work process. The purpose is also to investigate which advantages and disadvantages the implementation has brought auditing firms, which is examined through an empirical study. The empirical study is a case study consisting of seven interviews with representatives from various auditing firms in western Sweden. By using interviews, the respondents had the opportunity to describe their own experiences. The study is based on a qualitative method and has a deductive approach, as its starting point is previous research and theory about digitization, Big data and the auditing industry. The results of the study show that the respondents gave somewhat similar answers to the questions, mostly about their auditing process and how it is designed today, but also about their digital development and Big data. Three of the interviewed firms use Big data today, but only to a limited extent, and one of the respondents had no idea what the concept Big data represented. However, all respondents believe that digitalization in the auditing industry will become even more important and play a greater role in the near future. Automation and standardization are processes that the respondents believe will become more extensive within the auditing firms in the future. The study's conclusion shows that digitalization in the auditing industry has been somewhat slow compared with other industries, but that there has been great development in recent years. The conclusions presented, which are based on the study's research questions, are that the auditors' work process has not yet been affected by Big data, as it is still a relatively unknown concept. However, auditing in the future will not look like it does today; there will be some major changes in the industry in the coming years. The work processes will be more efficient than today and the audit will be of higher quality.
Smeding, Gideon. „Verification of Weakly-Hard Requirements on Quasi-Synchronous Systems“. Thesis, Grenoble, 2013. http://www.theses.fr/2013GRENM073/document.
The synchronous approach to reactive systems, where time evolves by globally synchronized discrete steps, has proven successful for the design of safety-critical embedded systems. Synchronous systems are often distributed over asynchronous architectures for reasons of performance or physical constraints of the application. Such distributions typically require communication and synchronization protocols to preserve the synchronous semantics. In practice, protocols often have a significant overhead that may conflict with design constraints such as maximum available buffer space, minimum reaction time, and robustness. The quasi-synchronous approach considers independently clocked, synchronous components that interact via communication-by-sampling or FIFO channels. In such systems we can move from total synchrony, where all clocks tick simultaneously, to global asynchrony by relaxing constraints on the clocks and without additional protocols. Relaxing the constraints adds different behaviors depending on the interleavings of clock ticks. In the case of data-flow systems, one behavior is different from another when the values and timing of items in a flow of one behavior differ from the values and timing of items in the same flow of the other behavior. In many systems, such as distributed control systems, the occasional difference is acceptable as long as the frequency of such differences is bounded. We suppose hard bounds on the frequency of deviating items in a flow with, what we call, weakly-hard requirements, e.g., the maximum number of deviations out of a given number of consecutive items. We define relative drift bounds on pairs of recurring events such as clock ticks, the occurrence of a difference or the arrival of a message. Drift bounds express constraints on the stability of clocks, e.g., at least two ticks of one per three consecutive ticks of the other. Drift bounds also describe weakly-hard requirements. This thesis presents analyses to verify weakly-hard requirements and infer weakly-hard properties of basic synchronous data-flow programs with asynchronous communication-by-sampling when executed with clocks described by drift bounds. Moreover, we use drift bounds as an abstraction in a performance analysis of stream processing systems based on FIFO channels.
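The weakly-hard requirement sketched in this abstract (at most m deviating items in any window of k consecutive items) can be checked mechanically. The Python snippet below is a minimal, hypothetical checker written for illustration; it is not code from the thesis, and the (m, k) formulation is only one common way of writing such constraints.

```python
from collections import deque

def satisfies_weakly_hard(deviations, m, k):
    """True iff every window of k consecutive items contains at most m
    deviations. `deviations` is an iterable of booleans, True meaning the
    item deviated from the synchronous reference behavior."""
    window = deque(maxlen=k)
    for dev in deviations:
        window.append(bool(dev))
        if len(window) == k and sum(window) > m:
            return False
    return True

# at most 1 deviation in any 3 consecutive items
print(satisfies_weakly_hard([0, 1, 0, 0, 1, 0], m=1, k=3))  # True
print(satisfies_weakly_hard([0, 1, 1, 0], m=1, k=3))        # False
```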
Byrnes, Denise Dianne. „Static scheduling of hard real-time control software using an asynchronous data-driven execution model /“. The Ohio State University, 1992. http://rave.ohiolink.edu/etdc/view?acc_num=osu14877799148243.
Hore, Prodip. „Scalable frameworks and algorithms for cluster ensembles and clustering data streams“. [Tampa, Fla.] : University of South Florida, 2007. http://purl.fcla.edu/usf/dc/et/SFE0002135.
Hsieh, Jane W. „Asking questions is easy, asking great questions is hard: Constructing Effective Stack Overflow Questions“. Oberlin College Honors Theses / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=oberlin1589722602631253.
Der volle Inhalt der QuellePukitis, Furhoff Hampus. „Efficient search of an underwater area based on probability“. Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-254568.
Today, more and more types of autonomous robots and vehicles are being developed. Most of them depend on the global positioning system and/or communication with other robots and vehicles to determine their global position. These are, however, not realistic options for autonomous underwater vehicles (AUVs) today, since radio waves do not travel well in water. Instead, various techniques are used to determine the AUV's position, techniques that often come with a margin of error. This report examines the problem of efficiently performing a local search within this margin of error, with the goal of finding a docking station or a buoy. To address this problem, a literature study was conducted on the subject of search theory and how it has previously been applied in this context. It was found that classical Bayesian search theory has not been used very often here, because it would require too much processing power to be a reasonable option for the embedded systems on an AUV. Instead, various heuristic methods have been used to obtain solutions that were still adequate for the situations in which they were applied, even if they were perhaps not optimal. Based on this, the search strategies Spiral, Greedy, Look-ahead and Quad-tree were developed and evaluated in a simulator. Their mean time to detection (MTTD) was compared, as was the average time it took each strategy to process a search. Look-ahead was the best of the four strategies with respect to MTTD, and on this basis it is proposed that it be implemented and evaluated in a real AUV.
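As a rough illustration of the probability-driven strategies compared in the thesis, the Python sketch below orders the cells of a prior probability grid into a greedy visit sequence. The grid, its values and the function name are assumptions made for illustration; the thesis's Look-ahead strategy additionally accounts for travel time between cells, which this sketch ignores.

```python
import numpy as np

def greedy_search_order(prior):
    """Visit grid cells in decreasing order of prior detection probability
    (a crude stand-in for the 'Greedy' strategy named in the abstract)."""
    order = np.argsort(-prior.ravel())          # highest probability first
    return [tuple(np.unravel_index(i, prior.shape)) for i in order]

# toy 3x3 prior over the target position inside the positioning error margin
prior = np.array([[0.02, 0.05, 0.03],
                  [0.10, 0.40, 0.15],
                  [0.05, 0.15, 0.05]])
print(greedy_search_order(prior)[:3])           # the most promising cells first
```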
Canzonieri, Massimiliano. „Dati digitali effimeri e permanenti“. Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amslaurea.unibo.it/3904/.
Wistrand, Henrik. „Vad har svenska företag för syn på sovande data och hur hanterar svenska företag sovande data med avseende på identifiering och lagring?“ Thesis, University of Skövde, School of Humanities and Informatics, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-853.
Far too many organizations have data warehouses containing large amounts of dormant data, that is, data that is rarely or never used. Dormant data affects an organization's data warehouse negatively because it degrades the warehouse's performance, costs money unnecessarily, and harms the warehouse's infrastructure. According to Inmon, Glassey and Welch (1997), purging dormant data from a data warehouse is a very difficult and complex process. The administrator must know which tables in the data warehouse are used, and which rows of data are used, in order to be able to remove data from the warehouse. According to Inmon et al. (1997), it is necessary to use some form of method to identify which data in the warehouse can be classified as dormant. The aim of this thesis is to investigate how Swedish companies handle dormant data, in order to find out which methods they use to identify dormant data and what they do with the data that is classified as dormant.
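To make the identification step concrete, here is a minimal, hypothetical Python sketch that flags warehouse tables as dormant when their most recent recorded access is older than a chosen threshold. The table names, the threshold and the idea of reading access timestamps from query or audit logs are illustrative assumptions, not the methods surveyed in the thesis.

```python
from datetime import datetime, timedelta

def dormant_tables(last_access, threshold_days=180, now=None):
    """Return the tables whose last recorded access is older than the
    threshold. `last_access` maps table name -> datetime of last access,
    e.g. gathered from the warehouse's query or audit logs."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=threshold_days)
    return sorted(name for name, ts in last_access.items() if ts < cutoff)

access_log = {
    "sales_2019":   datetime(2020, 1, 5),
    "sales_2024":   datetime(2024, 11, 30),
    "customer_dim": datetime(2024, 12, 2),
}
print(dormant_tables(access_log, now=datetime(2024, 12, 31)))  # ['sales_2019']
```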
Erlandsson, Emil. „Har sociala medier förkortat vår koncentrationsförmåga? : Upplever KTH-studenter att deras koncentrationsförmåga har försämrats och finns det något samband med deras användande av sociala medier?“ Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-237252.
In this paper, we examine how students at KTH perceive that their social media habits have affected their attention span related to deep reading and their ability to focus on reading longer texts. We begin by exploring the background to the commonly held assumption that social media is ruining our ability to concentrate. The working assumption and hypothesis of the study was that students who were active social media users would not have a worse attention span than those who were not active users. The study was performed using a survey. After analysing the data from the 70 participants, it was clear that the data supported the hypothesis. However, there are a number of sources of error that are discussed at the end, and the study concludes with possible improvements and considerations for future research.
Eriksson, Claes. „PERMADEATH MED PERMANENTA OCH TRANSIENTA MÅL : Effekten permanenta och transienta mål har på permadeath“. Thesis, Högskolan i Skövde, Institutionen för kommunikation och information, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-11099.
Copete, Antonio Julio. „BAT Slew Survey (BATSS): Slew Data Analysis for the Swift-BAT Coded Aperture Imaging Telescope“. Thesis, Harvard University, 2012. http://dissertations.umi.com/gsas.harvard:10681.
Zábojník, Jakub. „Využití knihovny HAM-Tools pro simulaci tepelného chování rodinného domu“. Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2015. http://www.nusl.cz/ntk/nusl-231126.
Poluri, Kaushik. „Bounding the Worst-Case Response Times of Hard-Real-Time Tasks under the Priority Ceiling Protocol in Cache-Based Architectures“. OpenSIUC, 2013. https://opensiuc.lib.siu.edu/theses/1213.
Imbert, Julie. „Fine-tuning of Fully Convolutional Networks for Vehicle Detection in Satellite Images: Data Augmentation and Hard Examples Mining“. Thesis, KTH, Geoinformatik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-254946.
Gächter, Sundbäck Dominic. „Analysis of the Hard Spectrum BL Lac Source 1H 1914-194 with Fermi-LAT Data and Multiwavelength Modelling“. Thesis, Linnéuniversitetet, Institutionen för fysik och elektroteknik (IFE), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-76510.
Amos, Nissim. „Media fabrication and characterization systems for three dimensional-multilevel magnetic recording“. Diss., [Riverside, Calif.] : University of California, Riverside, 2008. http://proquest.umi.com/pqdweb?index=0&did=1663077871&SrchMode=2&sid=2&Fmt=2&VInst=PROD&VType=PQD&RQT=309&VName=PQD&TS=1268244498&clientId=48051.
Includes abstract. Available via ProQuest Digital Dissertations. Title from first page of PDF file (viewed March 10, 2010). Includes bibliographical references (p. 96-104). Also issued in print.
Lannergård, Joakim, and Mikael Larsson. „Hur stor betydelse har bakåtblickande respektive framåtblickande förväntningar i Phillipskurvan? - en empirisk studie på svenska data“. Thesis, Uppsala University, Department of Economics, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-5957.
Andersson, Emanuel. „Hur företag inom dagligvaruhandeln kan uppnå ett framgångsrikt arbetssätt med CRM-system : Vilken roll har insamlade data?“ Thesis, Karlstads universitet, Handelshögskolan (from 2013), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-78886.