Contents
Academic literature on the topic 'Analisi e progettazione dei sistemi di elaborazione'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Analisi e progettazione dei sistemi di elaborazione.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Analisi e progettazione dei sistemi di elaborazione"
Castellano, Nicola. "Modelli e misure di performance aziendale: analisi della letteratura e spunti di ricerca." MANAGEMENT CONTROL, no. 1 (April 2011): 41–63. http://dx.doi.org/10.3280/maco2011-001003.
Lorenzi, F. "Breve Storia del Metodo Gemellare 2 - Le Attuali Formulazioni del Metodo." Acta geneticae medicae et gemellologiae: twin research 47, no. 1 (January 1998): 57–71. http://dx.doi.org/10.1017/s0001566000000386.
Conti Puorger, Adriana, and Pierpaolo Napolitano. "Caratterizzazione socio-economica della regione Marche per sezioni di censimento." RIVISTA DI ECONOMIA E STATISTICA DEL TERRITORIO, no. 2 (September 2011): 30–59. http://dx.doi.org/10.3280/rest2011-002002.
Dissertations / Theses on the topic "Analisi e progettazione dei sistemi di elaborazione"
Lepore, Marco. "Impatto dei cambiamenti climatici in relazione alla progettazione di sistemi di drenaggio urbano." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018.
Laterza, Marco. "Strategie di ingegnerizzazione e di sviluppo di sistemi software per big data management." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2016.
Chiarelli, Rosamaria. "Elaborazione ed analisi di dati geomatici per il monitoraggio del territorio costiero in Emilia-Romagna." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/19783/.
Di Giovanni, Ferdinando. "Progettazione e sviluppo di un software per la simulazione dinamica della redazione del Master Production Schedule." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2020.
Rollo, Federica. "Verso soluzioni di sostenibilità e sicurezza per una città intelligente." Doctoral thesis, Università degli studi di Modena e Reggio Emilia, 2022. http://hdl.handle.net/11380/1271183.
A smart city is a place where technology is exploited to help public administrations make decisions. Technology can contribute to the management of many aspects of everyday life, offering more reliable services to citizens and improving their quality of life. However, technology alone is not enough to make a city smart: suitable methods are needed to analyze the data it collects and to turn them into useful information. Examples of smart services are apps that route users to a destination along the least congested roads, find the nearest parking spot, or suggest better walking paths based on air quality. This thesis focuses on two aspects of smart cities: sustainability and safety.

The sustainability aspect concerns the study of the impact of vehicular traffic on air quality, through the development of a network of traffic and air-quality sensors and the implementation of a chain of simulation models. This work is part of the TRAFAIR project, co-financed by the European Union, the first project aimed at monitoring and predicting air quality in real time at urban scale in 6 European cities, including Modena. The project required managing a large amount of heterogeneous data and integrating it into a complex, scalable data platform shared by all project partners. The data platform is a PostgreSQL database, well suited to spatio-temporal data, and contains more than 60 tables and 435 GB of data (for Modena alone). All the processes of the TRAFAIR pipeline, the dashboards, and the mobile apps use the database to read their input data and, eventually, store their output, generating big data streams. The simulation models, executed on HPC resources, use the sensor data and provide results in real time (as soon as the sensor data are stored in the database). The anomaly detection techniques applied to the sensor data therefore also need to run in real time. After a careful study of the distribution of the sensor data and of the correlation among measurements, several anomaly detection techniques were implemented and applied. A novel approach for traffic data was developed that employs a flow-speed correlation filter, STL decomposition, and IQR analysis. In addition, an innovative framework implementing 3 algorithms for anomaly detection in air-quality sensor data was created. The results of the experiments were compared with those of an LSTM autoencoder, and the performances were evaluated after the calibration process.

The safety aspect of the smart city is addressed by a crime analysis project: analytical processes aimed at providing timely and pertinent information to assist the police in crime reduction, prevention, and evaluation. Because official data were not available for the analysis, the project exploits news articles published in online newspapers. The goal is to categorize the news articles by crime category, geolocate the crime events, detect the date of each event, and identify some features (e.g., what was stolen during a theft). A Java application was developed for the analysis of news articles, the extraction of semantic information through NLP techniques, and the connection of entities to Linked Data.
Word Embeddings were employed for text categorization, while Question Answering with BERT was used to extract the 5W+1H. News articles referring to the same event are identified by applying cosine similarity to shingles of the articles' text. Finally, a tool was developed to show the geolocalized events and to provide statistics and annual reports. This is the only project in Italy that, starting from news articles, provides analyses of crimes and makes them available through a visualization tool.
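The abstract above mentions identifying news articles that report the same event by applying cosine similarity to shingles of the articles' text. The sketch below illustrates that technique in Python; the shingle length (k=5) and the 0.8 similarity threshold are illustrative assumptions, not values taken from the thesis.

```python
# Minimal sketch: near-duplicate news detection via cosine similarity over
# character shingles. k and the threshold are illustrative assumptions.
from collections import Counter
from math import sqrt

def shingles(text: str, k: int = 5) -> Counter:
    """Return a bag of k-character shingles of the normalized, lowercased text."""
    text = " ".join(text.lower().split())
    return Counter(text[i:i + k] for i in range(max(len(text) - k + 1, 1)))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bags of shingles."""
    dot = sum(count * b[s] for s, count in a.items())
    norm_a = sqrt(sum(c * c for c in a.values()))
    norm_b = sqrt(sum(c * c for c in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def same_event(article_1: str, article_2: str, threshold: float = 0.8) -> bool:
    """Heuristically flag two news articles as reporting the same event."""
    return cosine_similarity(shingles(article_1), shingles(article_2)) >= threshold
```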
Amiri, Amir Mohammad. "An intelligent diagnostic system for screening newborns." Doctoral thesis, Università degli Studi di Cagliari, 2015. http://hdl.handle.net/11584/266593.
Parmiggiani, Nicolò. "Metodi per l'analisi e la gestione dei dati dell'astrofisica gamma in tempo reale." Doctoral thesis, Università degli studi di Modena e Reggio Emilia, 2021. http://hdl.handle.net/11380/1239980.
The context of this Ph.D. is data analysis and management for gamma-ray astronomy, which involves the observation of gamma rays, the most energetic form of electromagnetic radiation. From the gamma-ray observations performed by telescopes or satellites, it is possible to study catastrophic events involving compact objects such as white dwarfs, neutron stars, and black holes. These events are called gamma-ray transients. To understand these phenomena, they must be observed during their evolution. Speed is therefore crucial, and automated data analysis pipelines are developed to detect gamma-ray transients and generate science alerts during the astrophysical observations or immediately after. A science alert is an immediate communication from one observatory to other observatories that an interesting astrophysical event is occurring in the sky. The astrophysical community is experiencing a new era called "multi-messenger astronomy", in which astronomical sources are observed by different instruments collecting different signals: gravitational waves, electromagnetic radiation, and neutrinos. In the multi-messenger era, astrophysical projects share science alerts through different communication networks. Coordinating different projects by sharing science alerts is essential to understand the nature of these physical phenomena, and observatories have to manage the follow-up of these external science alerts by developing dedicated software.

During this Ph.D., the research activity focused mainly on the AGILE space mission, currently in operation, and on the Cherenkov Telescope Array Observatory (CTA), currently in the construction phase. The follow-up of external science alerts received from Gamma-Ray Burst (GRB) and Gravitational Wave (GW) detectors is one of the AGILE Team's current major activities. Future generations of gamma-ray observatories, like the CTA or the ASTRI Mini-Array, can take advantage of the technologies developed for AGILE. This research aims to develop analysis and management software for gamma-ray data that fulfills these requirements.

The first chapter of this thesis describes the web platform used by AGILE researchers to prepare the Second AGILE Catalog of Gamma-ray Sources. The analyses performed for this catalog are stored in a dedicated database, which the web platform queries. This was preparatory work for understanding how to manage detections of gamma-ray sources and light curves for the subsequent phase: the development of a scientific pipeline to manage gamma-ray detections and science alerts in real time. The second chapter presents a framework designed to facilitate the development of real-time scientific analysis pipelines. The framework provides a common pipeline architecture and automatisms that observatories can use to develop their own pipelines; it was used to develop the pipelines of the AGILE space mission and a prototype of the scientific pipeline of the Science Alert Generation system of the CTA Observatory. The third chapter describes a new method to detect GRBs in AGILE-GRID data using a Convolutional Neural Network. With this Deep Learning technique it is possible to improve the detection capabilities of AGILE; the method was also integrated as a science tool in the AGILE pipelines. The last chapter of the thesis presents the scientific results obtained with the software developed during the Ph.D. research activities.
Part of the results has been published in refereed journals; the remaining part was shared with the scientific community through The Astronomer's Telegram or the Gamma-ray Coordinates Network.
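The third chapter summarized above detects GRBs in AGILE-GRID data with a Convolutional Neural Network. The following is a minimal PyTorch sketch of a CNN that classifies gamma-ray counts maps as background or GRB candidate; the architecture, the 100x100 input size, and all hyperparameters are illustrative assumptions and do not reproduce the network actually used in the thesis.

```python
# Minimal sketch: a CNN classifying counts maps as background vs GRB candidate.
# Architecture and input size are illustrative assumptions.
import torch
import torch.nn as nn

class GRBClassifier(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # single-channel counts map
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 25 * 25, 64),  # 100x100 input halved twice -> 25x25
            nn.ReLU(),
            nn.Linear(64, 2),  # two classes: background / GRB candidate
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: a batch of 8 simulated 100x100 counts maps.
maps = torch.rand(8, 1, 100, 100)
logits = GRBClassifier()(maps)
print(logits.shape)  # torch.Size([8, 2])
```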
Pantini, Sara. "Analysis and modelling of leachate and gas generation at landfill sites focused on mechanically-biologically treated waste." Doctoral thesis, Università degli Studi di Roma "Tor Vergata", 2013. http://hdl.handle.net/2108/203393.
Coppola, Fabrizio. "Le unità di controllo del supercalcolatore A.P.E." Thesis, 1987. http://eprints.bice.rm.cnr.it/3855/1/Coppola-Tesi-Fisica-Pisa-1987.pdf.
Ferri, Alessandro. "Progettazione e analisi di EHR territoriali ed eterogenei per analisi socio-sanitarie." Doctoral thesis, 2021. http://hdl.handle.net/11566/291125.
It is now well established that lifestyle has implications and consequences for a person's general health. Less consolidated and proven is the assumption that, in addition to lifestyle, mainly represented by nutrition and physical activity, variables from the socio-economic sphere also have an effect. Education, unemployment, income, inequality, poverty, crime, housing, and social cohesion, for example, are all elements that can affect health. From this point of view, it can be significant to explore variables and statistical relationships between a person's lifestyle, well-being and quality of life and economic, social and health factors. The purpose is, through the application of statistical algorithms and Machine Learning (ML), to search for patterns, relationships, correlations, and causality among the different socio-economic variables. In the current state of the art, ML algorithms make it possible to identify new predictive signals of the onset of certain diseases, or to identify previously unknown correlations among the results of commonly prescribed clinical tests, pathologies, and administered drugs. Predictive and preventive medicine also enables therapeutic and behavioral personalization: an aware patient can take part in changing their life conduct at an anticipatory stage, or in the initial stages of illness, sharing the treatment process with the doctor who assists them. The proposed analysis must be carried out, albeit in an experimental, prototypical and detailed manner, within the current Italian healthcare context, its technologies, organizations and existing regulations on data processing.

The project described in this thesis is the creation of a prototype system that predicts health risk factors by searching for causality and correlations between the different psycho-physical conditions and the individual socio-economic situation. Achieving this goal depends on some essential points:
• identification of a starting clinical data source (Electronic Health Record);
• organization and people;
• methods and technological tools for the collection of socio-economic data;
• pre-processing and alignment of clinical and socio-economic data;
• construction of a clinical and socio-economic dataset;
• statistical analysis and ML approaches to search for causality and correlations between the different psycho-physical conditions and the individual socio-economic situation.
As a first step, the thesis aims to grasp the difficulties and problems of collecting a clinical and socio-economic dataset. The preliminary results of the statistical analysis and ML approaches on this heterogeneous dataset can lay the foundations for the development of a clinical decision support system aimed at preventive medicine.

The contributions of this work to the state of the art are:
• the creation of a dataset at the intersection of the medical and socio-economic domains, on which to carry out longitudinal studies; the dataset was constructed by extracting information from the general-medicine EHRs of family doctors, combined with social variables inspired by and attributable to the indices for Fair and Sustainable Wellbeing (BES) defined by ISTAT;
• a statistical analysis of clinical and socio-economic variables (between the person's lifestyle, well-being and quality of life and economic, social and health factors) to identify correlations between the different psycho-physical conditions and the individual socio-economic situation;
• a preliminary estimate of a biological-age sub-index with an ML model, whose predictors are derived from lifestyle indicators and patient clinical characteristics extracted from the proposed EHR.

The simultaneous analysis and comparison of the correlations computed on the resulting dataset shows, for some socio-economic variables, a significant correlation with the clinical variables. The coefficients are particularly high when calculated for age groups over 50 years, especially in the 50-60 age cluster. Attendance at a sports club and economic satisfaction have a high inverse (negative) correlation with drug prescriptions: a high number of prescribed drugs is correlated with not attending a sports club and with low economic satisfaction. Higher numbers of outcomes and prescribed diagnostic tests correspond to a generally low level of trust in other people.
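The abstract above reports per-age-group correlations between socio-economic variables (sports-club attendance, economic satisfaction) and drug prescriptions. Below is a minimal pandas/SciPy sketch of that kind of analysis, using Spearman correlation as one reasonable choice; the column names (age, n_drugs_prescribed, sport_club_attendance, economic_satisfaction) are hypothetical and do not correspond to fields of the actual EHR/BES dataset.

```python
# Minimal sketch: per-age-group correlation of socio-economic variables with
# drug prescriptions. Column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import spearmanr

def correlations_by_age_group(df: pd.DataFrame) -> pd.DataFrame:
    """Spearman correlation of socio-economic variables with drug prescriptions,
    computed separately for each 10-year age group."""
    rows = []
    df = df.assign(age_group=(df["age"] // 10) * 10)
    for group, sub in df.groupby("age_group"):
        for var in ["sport_club_attendance", "economic_satisfaction"]:
            rho, p_value = spearmanr(sub[var], sub["n_drugs_prescribed"])
            rows.append({"age_group": f"{group}-{group + 9}",
                         "variable": var,
                         "spearman_rho": rho,
                         "p_value": p_value})
    return pd.DataFrame(rows)
```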
Books on the topic "Analisi e progettazione dei sistemi di elaborazione"
Magnaghi, Alberto, ed. La regola e il progetto. Florence: Firenze University Press, 2014. http://dx.doi.org/10.36253/978-88-6655-624-4.