Dissertations on the topic "Informazione tecnica"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Browse the top 18 dissertations for your research on the topic "Informazione tecnica".
Next to each work in the reference list you will find an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen work in your preferred citation style: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its abstract online, whenever these are available in the metadata.
Browse dissertations across a wide range of disciplines and compile your bibliography correctly.
Cancellieri, Andrea. "Analisi di tecniche per l'estrazione di informazioni da documenti testuali e non strutturati." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2014. http://amslaurea.unibo.it/7773/.
Martini, Francesco <1971>. ""Tracciature Digitali": la conoscenza nell'era informazionale." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4311/1/Martini_Francesco_tesi.pdf.
The specificity of content acquisition through digital interfaces condemns the epistemic agent to a fragmented interaction with the enormous body of information now available through any standard implementation of the human-computer relationship. This invalidates the applicability of the standard model of knowledge as justified true belief, because it undermines the concept of rationally founded belief: forming such a belief would require the agent to possess precisely the conceptual resources and computational time that are inaccessible to him. Bound by the ontological limitations inherent in cultural interfaces, the agent is thus forced to fall back on processes of information selection and management that are ambiguous, arbitrary, and often more casual than he realizes, and that produce genuine epistemological hybrids (in Latour's sense) made of feelings, program outputs, unfounded beliefs, fragments of indirect testimony, and a series of human-digital relationships that open an escape into a transcendent dimension belonging to the anthropological domain of the sacred. Starting from this analysis, the work constructs a new epistemological paradigm for propositional knowledge acquired through digital content, based on the new concept of Digital Tracings, defined as a process of digitally capturing a set of tracks, i.e. meta-information of a testimonial kind. This device, once recognized as a communication process for digital contents, is based on the search for and selection of meta-information, i.e. tracks, that allow the application of approaches derived from the analysis of decision-making under bounded rationality. These approaches, besides being almost never used in this context, are ontologically suited to dealing with uncertainty such as that found in the informational hybrid, and can inform the agent about the epistemic goodness of the acquired content.
Rava, Annalisa. "Sviluppo di una piattaforma software di Content Management System per la condivisone delle informazioni tecniche: il caso Bonfiglioli Riduttori S.p.A." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2019.
TONON, ANDREA. "Tecniche Rigorose e Scalabili per il Mining di Pattern da Dati Sequenziali." Doctoral thesis, Università degli studi di Padova, 2022. http://hdl.handle.net/11577/3445089.
Massive amounts of data are generated continuously in every field, such as finance, social networks, and medicine. Data mining is the task of discovering interesting patterns by exploring large amounts of data, making them useful and understandable. In recent years, the data mining community has tackled a vast set of problems, but many others remain open. Now more than ever, it is of crucial importance to be able to analyze and extract reliable knowledge from massive datasets. This fundamental task, however, poses some challenges. The first is to design algorithms that scale to the analysis of massive datasets; in such a scenario, methods that return rigorous, high-quality approximations are very often the only viable option. The second is to develop strategies that extract useful knowledge while providing statistical guarantees on the analysis and filtering out spurious discoveries. Finally, the abundance of data is opening new scenarios, with many sources generating data continuously and requiring analyses that take the sequential nature of the data into account. The objective of this thesis is to design novel scalable and rigorous techniques to mine patterns from sequential data, in three scenarios. The first scenario is mining frequent sequential patterns through sampling. Sequential pattern mining is a fundamental task in data mining and knowledge discovery, with applications in several areas (e.g., biology), that has been extensively studied in the literature through the definition of several exact methods. For today's massive datasets, however, the execution of exact methods is computationally very demanding. In this direction, we develop an algorithm that uses sampling to mine rigorous approximations of the frequent sequential patterns, defined in terms of false positives or false negatives.
The second scenario is mining patterns from samples drawn from unknown probability distributions. In many real-life applications (e.g., market basket analysis), a dataset is analyzed to gain insight into the underlying generative process of the data. By analyzing only a sample, however, one cannot solve the problem exactly and has to resort to approximations. In this setting, we tackle two problems: mining, from a single dataset, true frequent sequential patterns, i.e., sequential patterns frequently generated by the underlying process generating the data; and mining statistically robust patterns from a sequence of datasets, i.e., patterns whose probabilities of being generated by the underlying generative processes behind the sequence of datasets follow well-specified trends. For both problems, we develop novel algorithms that return rigorous approximations, defined in terms of false positives or false negatives. The last scenario is mining significant patterns. In significant pattern mining, the dataset is seen as a sample from an unknown probability distribution, and the aim is to extract patterns that deviate significantly from an assumed null hypothesis, with rigorous guarantees on the statistical significance of the output and control over the retrieval of false discoveries. In this setting, we tackle two problems: mining statistically significant sequential patterns, and mining statistically significant paths in time series data from an unknown network. For both problems, we develop novel algorithms that provide rigorous guarantees in terms of false discoveries, employing the statistical hypothesis testing framework and techniques based on permutation testing.
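The sampling idea in the first scenario above can be illustrated with a minimal sketch. This is my own simplified illustration, not the thesis's actual algorithm: the function names, the candidate-set interface, and the epsilon/2 threshold correction are assumptions. The core idea is that pattern frequencies are estimated on a random sample of the sequence dataset, and the frequency threshold is lowered slightly so that truly frequent patterns are unlikely to be missed, yielding a false-negative-controlling approximation.

```python
import random

def is_subsequence(pattern, sequence):
    """Check whether `pattern` occurs as a (non-contiguous) subsequence of `sequence`."""
    it = iter(sequence)
    # `item in it` advances the iterator, so order is respected.
    return all(item in it for item in pattern)

def sample_frequent_patterns(dataset, candidates, theta, epsilon, sample_size, seed=0):
    """Estimate frequent sequential patterns on a random sample.

    The frequency threshold theta is lowered by epsilon/2 so that patterns
    whose true frequency is at least theta are unlikely to be missed
    (an approximation controlling false negatives, at the cost of
    possibly admitting patterns with frequency slightly below theta)."""
    rng = random.Random(seed)
    sample = [rng.choice(dataset) for _ in range(sample_size)]
    result = []
    for pattern in candidates:
        freq = sum(is_subsequence(pattern, s) for s in sample) / sample_size
        if freq >= theta - epsilon / 2:
            result.append(pattern)
    return result
```

With a sample size chosen (e.g., via concentration bounds) so that every estimate is within epsilon/2 of the true frequency with high probability, the output contains all truly frequent patterns and no pattern with true frequency below theta minus epsilon.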
Parrinello, Angelo. "Studio e sviluppo di tecniche di password cracking targettizzato." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2021. http://amslaurea.unibo.it/24315/.
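The topic of the thesis above, targeted (as opposed to generic) password guessing, can be sketched in a few lines. The function below is a hypothetical illustration of the general idea only; the token-combination and mangling rules are my assumptions, not the techniques developed in the thesis. The point is that candidate passwords are generated from a victim's personal tokens (names, dates) rather than from a generic wordlist.

```python
from itertools import permutations

def targeted_wordlist(tokens, years, max_combo=2):
    """Generate candidate passwords by combining personal tokens
    (names, nicknames) with simple mangling rules (case variants,
    year suffixes), the core idea behind targeted guessing."""
    bases = set()
    for r in range(1, max_combo + 1):
        for combo in permutations(tokens, r):
            bases.add("".join(combo))
    candidates = set()
    for base in bases:
        for variant in (base, base.capitalize(), base.upper()):
            candidates.add(variant)
            for year in years:
                candidates.add(variant + str(year))       # e.g. name + "1990"
                candidates.add(variant + str(year)[-2:])  # e.g. name + "90"
    return sorted(candidates)
```

Real targeted attacks layer many more mangling rules and rank candidates by estimated probability; this sketch only shows why personal context shrinks the search space compared with exhaustive or dictionary-based guessing.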
BERGAMINI, LUCA. "Tecniche di Deep Learning applicate all'allevamento." Doctoral thesis, Università degli studi di Modena e Reggio Emilia, 2021. http://hdl.handle.net/11380/1239977.
Although Deep Learning (DL) is increasingly being adopted in many sectors, farming is still an almost untouched niche. This is mainly due to the enormous distance in knowledge between experts in DL and experts in farming itself: it is first of all a communication issue, and only in the second place a matter of reluctance to change. Once these issues are tackled, this sector too can greatly benefit from the application of these new technologies. This thesis is therefore a collection of applications of DL to different topics in farming. It has been made possible by the key role of figures placed right in the middle, who act as intermediaries between the experts to identify targets and measures of success. In our case, this role is played by the Farm4Trade startup, which is also the main funder of this PhD. The first topic covered is automatic cattle re-identification from images and videos: we show how DL methods designed for humans can be adapted to work in a completely different setting, and, as a feedback loop, methods developed for cattle have been reapplied to people and vehicles with successful results. The second topic is the automatic detection and tracking of pigs in the farm, where the target is to detect and classify individual behaviours and how they change through time. A collection of state-of-the-art DL techniques has been chained together, while each individual piece has been analysed on its own to ensure good final performance. Finally, we jump to the end of the production chain to study how to apply DL to images of slaughtered pigs' carcasses in order to detect and segment lung lesions, which are reliable indicators of a bacterial pathology affecting the animal prior to its death. The results achieved during this PhD show how the whole farming sector can benefit from the application of artificial intelligence algorithms.
PINI, STEFANO. "Tecniche di Visione Artificiale per l'Interazione Uomo-Veicolo." Doctoral thesis, Università degli studi di Modena e Reggio Emilia, 2022. http://hdl.handle.net/11380/1271181.
In recent years, the widespread adoption of digital devices in all aspects of everyday life has led to new research opportunities in the field of Human-Computer Interaction. In the automotive field, where infotainment systems are becoming more and more important to the final user, the availability of inexpensive miniaturized cameras has enabled the development of vision-based Natural User Interfaces, paving the way for novel approaches to Human-Vehicle Interaction. In this thesis, we investigate computer vision techniques, based on both the visible and the non-visible spectrum, that can form the foundation of the next generation of in-vehicle infotainment systems. As sensing technology, we focus on infrared-based devices, such as depth and thermal cameras: they provide reliable data under different illumination conditions, making them a good fit for the mutable automotive environment. Using these acquisition devices, we collect two novel datasets: a facial dataset, to investigate the impact of sensor resolution and quality in changing acquisition settings, and a dataset of dynamic hand gestures, collected with several synchronized sensors within a car simulator. As vision approaches, we adopt state-of-the-art deep learning techniques, focusing on efficient neural networks that can be easily deployed on edge computing devices. In this context, we study several computer vision tasks to cover the majority of human-car interactions. First, we investigate the usage of depth cameras for the face recognition task, focusing on how depth-map representations and deep neural models affect recognition performance. Secondly, we address the problem of real-time in-car dynamic hand gesture recognition, using depth and infrared sensors. Then, we focus on the analysis of the human body, both in terms of 3D human pose estimation and the contact-free estimation of anthropometric measurements.
Finally, focusing on the area surrounding the vehicle, we explore the 3D reconstruction of objects from 2D images, as a first step towards the 3D visualization of the external environment from controllable viewpoints.
GAGLIARDELLI, LUCA. "Tecniche per l’Integrazione di Sorgenti Big Data in Ambienti di Calcolo Distribuito." Doctoral thesis, Università degli studi di Modena e Reggio Emilia, 2020. http://hdl.handle.net/11380/1200610.
Data sources providing huge amounts of semi-structured data are available on the Web as tables, annotated content (e.g., RDF), and Linked Open Data. These sources can constitute a valuable source of information for companies, researchers, and government agencies, if properly manipulated and integrated with each other or with proprietary data. One of the main problems is that these sources are typically heterogeneous and do not come with keys for performing join operations and effortlessly linking their records. Finding a way to join data sources without keys is thus a fundamental and critical step of data integration. Moreover, for many applications execution time is a critical component (e.g., in finance or national security contexts), and distributed computing can be employed to significantly reduce it. In this dissertation, I present distributed data integration techniques that scale to large volumes of data (i.e., Big Data), in particular SparkER and GraphJoin. SparkER is an Entity Resolution tool that exploits distributed computing to identify records in data sources that refer to the same real-world entity, thus enabling the integration of the records. The tool introduces a novel algorithm to parallelize the indexing techniques that are currently state-of-the-art. SparkER is a working software prototype that I developed and employed to perform experiments over real datasets; the results show that the parallelization techniques I have developed are more efficient, in terms of execution time and memory usage, than those in the literature. GraphJoin is a novel technique that finds similar records by applying joining rules on one or more attributes. It combines similarity join techniques designed to work on a single rule, optimizing their execution over multiple joining rules and combining different similarity measures, both token-based and character-based (e.g., Jaccard similarity and edit distance).
For GraphJoin, I developed a working software prototype and employed it to experimentally demonstrate that the proposed technique is effective and outperforms the existing ones in terms of execution time.
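The multi-rule joining idea behind GraphJoin can be illustrated with a deliberately naive sketch. This is a nested-loop baseline of my own with assumed thresholds, not GraphJoin itself, which optimizes the execution of such rules rather than comparing all pairs. Two records match if either a token-based rule (Jaccard similarity over whitespace tokens) or a character-based rule (edit distance) fires.

```python
def jaccard(a, b):
    """Token-based similarity: |intersection| / |union| of whitespace tokens."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

def edit_distance(a, b):
    """Character-based dissimilarity: classic dynamic-programming Levenshtein."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,            # deletion
                            curr[j - 1] + 1,        # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def similarity_join(left, right, jaccard_threshold=0.5, edit_threshold=2):
    """Naive nested-loop similarity join with two joining rules:
    a pair of records matches if either rule fires."""
    matches = []
    for i, a in enumerate(left):
        for j, b in enumerate(right):
            if jaccard(a, b) >= jaccard_threshold or edit_distance(a, b) <= edit_threshold:
                matches.append((i, j))
    return matches
```

Note how the two rules catch different errors: the token rule matches reordered tokens ("john smith" vs "smith john"), while the character rule matches typos ("john" vs "jon"); a single-measure join would miss one of the two.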
RAIMONDI, ALESSIO. "La meteorologia: cammino storico-epistemologico e studio del suo stato attuale di diffusione." Doctoral thesis, Università degli Studi di Cagliari, 2011. http://hdl.handle.net/11584/266333.
Fornaciari, B. "LA DIRETTIVA 2012/13/UE SUL DIRITTO ALL'INFORMAZIONE. LA CONOSCENZA NEL PROCESSO PENALE FRA UNIONE EUROPEA E ORDINAMENTO INTERNO." Doctoral thesis, Università degli Studi di Milano, 2016. http://hdl.handle.net/2434/369477.
The present research examines the European Directive on the right to information in criminal proceedings (Directive 2012/13/EU, hereinafter 'the Directive'), assessing the impact that it is likely to have on the Italian legal system. Before analyzing the legislation, the thesis provides a historical overview of the status of human rights safeguards in the EU and a description of its multi-layered system of protection. Starting from the early ECJ case law setting out a 'human rights theory', the research moves on to consider the Charter of Nice and the development of a European Area of Criminal Justice, up to the Stockholm Programme and the entry into force of the Lisbon Treaty. In addition, it addresses the question whether, and to what extent, the directives 'of new generation' based on art. 82 par. 2 TFEU bring added value to the aforementioned human rights protection system. Chapters 2 and 3 focus on the analysis of the legislation and on the three meanings that the Directive attaches to the right to information in criminal proceedings, namely the right to information about rights, the right to information about the accusation, and the right to information about the case file. The aim is to shed light on the most innovative prescriptions, while highlighting how much the EU legislation owes to the ECtHR case law, which is used as a yardstick for the evaluation and interpretation of the Directive. Finally, Chapter 4 addresses the Italian implementing legislation (d. lgs. 101/2014) and the impact of the Directive on our legal system. It finds that the national implementing measure is highly unsatisfactory, as the Italian legislator has failed to comply with the most innovative EU standards. In this regard, the research illustrates the impact of EU prescriptions on the jurisdiction of national judges, in particular the impact of the 'new' right to information about the accusation.
It concludes that Italian judges can (in)directly apply ECtHR case-law standards due to the direct effect of the Directive (which can be regarded as an 'ECtHR case-law codification').
Dashdorj, Zolzaya. "Semantic Enrichment of Mobile Phone Data Records Exploiting Background Knowledge." Doctoral thesis, Università degli studi di Trento, 2015. https://hdl.handle.net/11572/367796.
Pisani, Federico. "Knowledge workers management. Concorrenza e invenzioni nel rapporto di lavoro subordinato: il modello statunitense." Doctoral thesis, Università degli studi di Padova, 2019. http://hdl.handle.net/11577/3425914.
This work addresses the issues of competition and inventions in U.S. employment relationships. The research was carried out in part at the Boston University School of Law, under the supervision of Michael C. Harper, professor of Labour Law. The selection of the topic is justified in light of its importance, given that in the new production organization, based largely on globalized knowledge, employees are increasingly asked for professionalism, innovation and creativity. The decision to examine this issue from the perspective of the "U.S. laboratory" is due to the primacy that this nation holds at the international level in the economy, in science and in the innovation of work processes, which bring out critical issues that in other countries have probably not yet been raised. In order to frame the above-mentioned topics, it was appropriate to give an account of the system of regulatory sources in the USA, with particular focus on the Restatement of Employment Law, i.e. the collection of fundamental principles developed over the years by common law in the field of employment relationships. The examination of the sources is followed by the definition of the concepts of employee and self-employed worker (independent contractor), necessary for assessing the application of the obligations arising from employment relationships, including the duty of loyalty involved in fiduciary law. In this context, the evolution of the case law has been observed, together with an examination of the criteria for distinguishing between employees and independent contractors, mainly concerning the judgement on the relevance of the factual elements determining the assessment of the existence of an employment relationship. Subsequently, this study addresses the typical form of the U.S. employment contract, the so-called employment-at-will.
This peculiarity originates from the principle that the parties are not bound by any obligation to provide reasons for termination. The third part of the work has as its object the discipline of competition carried out by the worker on the basis of the knowledge acquired, legally or illegally, during the relationship, and the related legal remedies available to the employer against the violation of the duty of loyalty, intended as an obligation of the employee to perform the work in the exclusive interest of the entrepreneur and, consequently, to refrain from engaging in conduct prejudicial to the company. As for the remedies available in the event of breach of the obligations examined, the legal and equitable remedies that U.S. law offers the employer are explained. The final part of this study deals with the rules governing the ownership of rights arising from inventions developed by employees in the course of their employment. The definitions of "invention" and "patent" and their relationship in the context of employment law are examined, and the difference between the invention as a work of genius and intellectual property protected by copyright is highlighted. In addition, the mechanisms underlying the basic rules governing the subject matter, their coexistence with the contractual freedom of the parties, and the parties' power to dispose of these rights have been observed.
PANTINI, SARA. "Analysis and modelling of leachate and gas generation at landfill sites focused on mechanically-biologically treated waste." Doctoral thesis, Università degli Studi di Roma "Tor Vergata", 2013. http://hdl.handle.net/2108/203393.
COSTANZO, Carlo. "Tecniche di visione artificiale per l'interazione uomo-macchina in applicazioni di didattica interattiva." Doctoral thesis, 2008. http://hdl.handle.net/11570/3120447.
SALVADORI, Ivan. "Il diritto penale dell'informatica nel passaggio dal computer crime al cybercrime. Tecniche di tutela e beni giuridici tutelati in prospettiva europea ed internazionale. Le nuove forme di attacco ai sistemi di informazione commesse nel cyberspace." Doctoral thesis, 2009. http://hdl.handle.net/11562/337362.
The doctoral thesis addresses the main threats to the confidentiality, integrity and availability of computer data and information systems. First of all, the thesis underlines how the progressive interconnection between computers has changed high-tech crime, determining the passage from computer crime to cybercrime. Secondly, it outlines the socio-criminological profile of cyber criminals, with special regard to hackers, crackers and insiders. It then deals with the main techniques used by cyber criminals to attack information systems, such as the use of malware spread over the Internet by means of spam or bulk e-mails and botnets, Denial of Service and Distributed Denial of Service attacks, mail-bombing and net-strikes. After this criminological analysis, it examines the main legal sources adopted at the regional and international levels in the fight against cybercrime, with special regard to the provisions and recommendations concerning confidentiality, integrity and availability. The aim is to evaluate whether these supranational legal instruments have favoured the harmonization of national legal systems, or whether they have only determined an overlapping of legal models without an effective standardization of national criminal legislations. The analysis of the juridical experience of some civil law systems (Germany and Spain) and common law systems (United States) has made it possible to evaluate the legal techniques used by the Italian legislator in formulating the offences of illegal access (art. 615-ter criminal code), data damage (art. 635-bis and 635-ter criminal code) and computer damage (art. 635-quater, 635-quinquies criminal code).
The analysis of the mens rea and actus reus elements of these offences has made it possible to underline their specific normative structure and to identify the new protected legal interests, such as the confidentiality, integrity and availability of computer data and systems. Finally, the thesis proposes some amendments to the illegal access and computer damage offences, drawing on examples of good practice from the Spanish, German and American legal systems.
FRATI, GIANMARIA. "I quotidiani on-line a carattere locale: forme, sostenibilità economica e interazione con l’opinione pubblica." Doctoral thesis, 2018. http://hdl.handle.net/11573/1079165.
Triggiani, Maurizio. "Integration of machine learning techniques in chemometrics practices." Doctoral thesis, 2022. http://hdl.handle.net/11589/237998.