Dissertations / Theses on the topic 'Gestione multimediale delle informazioni'

To see the other types of publications on this topic, follow the link: Gestione multimediale delle informazioni.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 22 dissertations / theses for your research on the topic 'Gestione multimediale delle informazioni.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Corsetti, Mauro. "Mediabuilding2. L'evoluzione degli edifici multimediali conseguente all'avanzamento delle modalità di informazione, gestione e controllo attraverso i sistemi tecnologici avanzati." Doctoral thesis, La Sapienza, 2006. http://hdl.handle.net/11573/917208.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Bonavita, Simone <1982>. "Il diritto all'oblio e la gestione delle informazioni della società iperconnessa." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amsdottorato.unibo.it/7690/1/Full_9_Giugno_2016_FINAL_01.pdf.

Full text
Abstract:
The doctoral thesis deals with the right to be forgotten in the hyperconnected society. The author examines the topic with a multidisciplinary approach, analysing the issues raised by the management of the storage processes that are intrinsic to the Internet.
APA, Harvard, Vancouver, ISO, and other styles
3

CESARETTI, MARGHERITA. "Sistemi domotici: gestione energetica e autodiagnosi." Doctoral thesis, Università Politecnica delle Marche, 2009. http://hdl.handle.net/11566/242334.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Paier, Stefania <1987>. "Il Marketing Virale e la diffusione delle informazioni all'interno di una rete sociale." Master's Degree Thesis, Università Ca' Foscari Venezia, 2013. http://hdl.handle.net/10579/2367.

Full text
Abstract:
This work aims to illustrate how one of the most recent forms of marketing works: Viral Marketing. After a brief introduction to the Web 2.0 technologies that have made it possible to amplify traditional word of mouth, the most significant stages in the evolution of marketing are briefly retraced, up to unconventional marketing. The second chapter examines all aspects of Viral Marketing and tries to explain why certain messages tend to spread rapidly among people like an epidemic. The focus then shifts to studying the propagation of opinions and information within a social network. To do so, the work draws on one author in particular, Noah Friedkin, whose Social Influence Network Theory made it possible to develop a viral diffusion model implemented in the Python programming language.
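The abstract does not reproduce the model itself; purely as a hypothetical illustration of the opinion-update rule on which Friedkin's Social Influence Network Theory is commonly based (the matrices and values below are invented, not the thesis code), a short Python sketch might look like this:

    import numpy as np

    def friedkin_influence(W, a, y0, iterations=100):
        # Iterate y(t) = A W y(t-1) + (I - A) y(0), where W is a row-stochastic
        # influence matrix, a holds each actor's susceptibility in [0, 1] and
        # y0 are the initial opinions.
        A = np.diag(a)
        I = np.eye(len(a))
        y = y0.copy()
        for _ in range(iterations):
            y = A @ W @ y + (I - A) @ y0
        return y

    # Toy network of three actors; actor 0 is stubborn (low susceptibility).
    W = np.array([[0.5, 0.3, 0.2],
                  [0.4, 0.4, 0.2],
                  [0.1, 0.4, 0.5]])
    a = np.array([0.2, 0.9, 0.9])
    y0 = np.array([1.0, 0.0, 0.0])
    print(friedkin_influence(W, a, y0))  # opinions after repeated social influence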
APA, Harvard, Vancouver, ISO, and other styles
5

Bordoni, Matteo <1996>. "Il valore delle informazioni ESG per gli investitori. Analisi qualitativa delle tematiche di sostenibilità nel rapporto tra aziende e analisti finanziari." Master's Degree Thesis, Università Ca' Foscari Venezia, 2020. http://hdl.handle.net/10579/18038.

Full text
Abstract:
Given the growing role that ESG issues are assuming within the financial world, the thesis investigates the relationship between sustainability and financial analysts, in order to explore the value of this type of information in the assessment of investment choices. After a brief introduction on the evolution of the concept of sustainability and of the relationship between companies and stakeholders, a literature review is presented to establish what the academic community has already produced on this specific research question. The question is then investigated through the qualitative analysis of several conference calls between corporate management and financial analysts, in order to identify the relevance of environmental, social and governance issues within them: the sample consists of 907 transcripts belonging to a pool of 109 European companies. Finally, in order to collect field information useful for commenting more accurately on the results of the desk analysis, a questionnaire was administered to the members of the Italian Association for Financial Analysis (AIAF), together with an interview with the current Sustainability Manager of Enel.
APA, Harvard, Vancouver, ISO, and other styles
6

BUCCI, UBALDO. "Gestione Risorse per architetture di rete di accesso 5G flessibili." Doctoral thesis, Università degli Studi dell'Aquila, 2022. https://hdl.handle.net/11697/198069.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Spagnolo, Fabio <1981>. "I SISTEMI INFORMATIVI NELLE GRANDI AZIENDE: LA GESTIONE DEL CICLO PASSIVO ALLA RIELLO." Master's Degree Thesis, Università Ca' Foscari Venezia, 2013. http://hdl.handle.net/10579/2636.

Full text
Abstract:
The thesis gives a brief account of the growing importance of information systems within the corporate context. After an initial overview of the concept of an information system and of the management processes it involves, the thesis continues with a short review of ERP systems and of the main extended modules, such as SCM, CRM and Business Intelligence. The final part presents a specific, focused analysis of the management of the purchasing (passive) cycle at Riello through the SAP ERP system.
APA, Harvard, Vancouver, ISO, and other styles
8

Giannelli, Carlo <1979>. "Middleware per la gestione di interfacce di comunicazione e di sorgenti di contesto in ambienti wireless eterogenei." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/925/1/Tesi_Giannelli_Carlo.pdf.

Full text
Abstract:
The full exploitation of multi-hop multi-path connectivity opportunities offered by heterogeneous wireless interfaces could enable innovative Always Best Served (ABS) deployment scenarios where mobile clients dynamically self-organize to offer/exploit Internet connectivity at best. Only novel middleware solutions based on heterogeneous context information can seamlessly enable this scenario: middleware solutions should i) provide a translucent access to low-level components, to achieve both fully aware and simplified pre-configured interactions, ii) permit to fully exploit communication interface capabilities, i.e., not only getting but also providing connectivity in a peer-to-peer fashion, thus relieving final users and application developers from the burden of directly managing wireless interface heterogeneity, and iii) consider user mobility as crucial context information evaluating at provision time the suitability of available Internet points of access differently when the mobile client is still or in motion. The novelty of this research work resides in three primary points. First of all, it proposes a novel model and taxonomy providing a common vocabulary to easily describe and position solutions in the area of context-aware autonomic management of preferred network opportunities. Secondly, it presents PoSIM, a context-aware middleware for the synergic exploitation and control of heterogeneous positioning systems that facilitates the development and portability of location-based services. PoSIM is translucent, i.e., it can provide application developers with differentiated visibility of data characteristics and control possibilities of available positioning solutions, thus dynamically adapting to application-specific deployment requirements and enabling cross-layer management decisions. Finally, it provides the MMHC solution for the self-organization of multi-hop multi-path heterogeneous connectivity. MMHC considers a limited set of practical indicators on node mobility and wireless network characteristics for a coarse-grained estimation of expected reliability/quality of multi-hop paths available at runtime. In particular, MMHC manages the durability/throughput-aware formation and selection of different multi-hop paths simultaneously. Furthermore, MMHC provides a novel solution based on adaptive buffers, proactively managed based on handover prediction, to support continuous services, especially by pre-fetching multimedia contents to avoid streaming interruptions.
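The middleware code is not reproduced here; purely as a hypothetical sketch of what a coarse-grained, durability/throughput-aware score for a multi-hop path of the kind MMHC computes might look like (structure, field names and weights are our assumptions, not the actual MMHC implementation):

    from dataclasses import dataclass

    @dataclass
    class Hop:
        rssi: float          # received signal strength of the link, dBm
        node_speed: float    # estimated speed of the relaying node, m/s
        nominal_rate: float  # nominal link throughput, Mbit/s

    def path_score(hops, w_durability=0.5, w_throughput=0.5):
        # Combine an estimated path durability (penalised by weak links and
        # node mobility) with the bottleneck throughput of the path, divided
        # by the hop count to account for the shared wireless medium.
        durability = 1.0
        for h in hops:
            link_quality = min(1.0, max(0.0, (h.rssi + 90.0) / 40.0))  # -90..-50 dBm -> 0..1
            durability *= link_quality / (1.0 + h.node_speed)
        bottleneck = min(h.nominal_rate for h in hops) / len(hops)
        return w_durability * durability + w_throughput * bottleneck / 54.0

    # Pick the best of the multi-hop paths currently available.
    paths = {
        "via_node_A": [Hop(-60, 0.0, 54.0), Hop(-70, 1.0, 54.0)],
        "via_node_B": [Hop(-55, 5.0, 11.0)],
    }
    print(max(paths, key=lambda name: path_score(paths[name])))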
APA, Harvard, Vancouver, ISO, and other styles
9

Giannelli, Carlo <1979>. "Middleware per la gestione di interfacce di comunicazione e di sorgenti di contesto in ambienti wireless eterogenei." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/925/.

Full text
Abstract:
The full exploitation of multi-hop multi-path connectivity opportunities offered by heterogeneous wireless interfaces could enable innovative Always Best Served (ABS) deployment scenarios where mobile clients dynamically self-organize to offer/exploit Internet connectivity at best. Only novel middleware solutions based on heterogeneous context information can seamlessly enable this scenario: middleware solutions should i) provide a translucent access to low-level components, to achieve both fully aware and simplified pre-configured interactions, ii) permit to fully exploit communication interface capabilities, i.e., not only getting but also providing connectivity in a peer-to-peer fashion, thus relieving final users and application developers from the burden of directly managing wireless interface heterogeneity, and iii) consider user mobility as crucial context information evaluating at provision time the suitability of available Internet points of access differently when the mobile client is still or in motion. The novelty of this research work resides in three primary points. First of all, it proposes a novel model and taxonomy providing a common vocabulary to easily describe and position solutions in the area of context-aware autonomic management of preferred network opportunities. Secondly, it presents PoSIM, a context-aware middleware for the synergic exploitation and control of heterogeneous positioning systems that facilitates the development and portability of location-based services. PoSIM is translucent, i.e., it can provide application developers with differentiated visibility of data characteristics and control possibilities of available positioning solutions, thus dynamically adapting to application-specific deployment requirements and enabling cross-layer management decisions. Finally, it provides the MMHC solution for the self-organization of multi-hop multi-path heterogeneous connectivity. MMHC considers a limited set of practical indicators on node mobility and wireless network characteristics for a coarse-grained estimation of expected reliability/quality of multi-hop paths available at runtime. In particular, MMHC manages the durability/throughput-aware formation and selection of different multi-hop paths simultaneously. Furthermore, MMHC provides a novel solution based on adaptive buffers, proactively managed based on handover prediction, to support continuous services, especially by pre-fetching multimedia contents to avoid streaming interruptions.
APA, Harvard, Vancouver, ISO, and other styles
10

Rossi, Jessica. "Analisi del sistema di gestione della produzione in Mase Generators S.p.A." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2013. http://amslaurea.unibo.it/6258/.

Full text
Abstract:
Analysis of the production management system of Mase Generators S.p.A. In particular, after introducing the importance of an accurate flow of information, data on the company's production were collected in order to make observations on warehouse value, service level and the production hours charged to each order, with the aim of identifying the strengths and weaknesses of the method adopted. There is in fact no management system able to meet every need: once the strategies have been defined, they must be optimised and kept consistent with the company's objectives.
APA, Harvard, Vancouver, ISO, and other styles
11

NOTARNICOLA, ELISABETTA. "Uso delle informazioni, decision making e strategie pubbliche: stato dell'arte e nuove traiettorie." Doctoral thesis, Università Cattolica del Sacro Cuore, 2019. http://hdl.handle.net/10280/57879.

Full text
Abstract:
The PhD thesis deals with the use of information in the public sector. Building on (bounded) rationality theories and on (public) management studies that focus on the mechanisms through which information is identified, gathered and used in decision-making processes, the thesis analyses what happens in Italian municipalities in the field of social care policies. Three situations are analysed: the public presentation of social care policies; social care strategic plans; and the resolution of wicked problems and complex trade-offs. The thesis shows that information is used to present public strategies through rationality mechanisms or by building consensus towards innovation in the contents, governance and context of social care policies. In the case of strategic plans, information is often merely collected or used instrumentally. In the case of trade-offs and wicked situations, accounting information serves both a managerial purpose and a political one, building consensus and motivation for action.
APA, Harvard, Vancouver, ISO, and other styles
12

NOTARNICOLA, ELISABETTA. "Uso delle informazioni, decision making e strategie pubbliche: stato dell'arte e nuove traiettorie." Doctoral thesis, Università Cattolica del Sacro Cuore, 2019. http://hdl.handle.net/10280/57879.

Full text
Abstract:
The PhD thesis deals with the use of information in the public sector. Building on (bounded) rationality theories and on (public) management studies that focus on the mechanisms through which information is identified, gathered and used in decision-making processes, the thesis analyses what happens in Italian municipalities in the field of social care policies. Three situations are analysed: the public presentation of social care policies; social care strategic plans; and the resolution of wicked problems and complex trade-offs. The thesis shows that information is used to present public strategies through rationality mechanisms or by building consensus towards innovation in the contents, governance and context of social care policies. In the case of strategic plans, information is often merely collected or used instrumentally. In the case of trade-offs and wicked situations, accounting information serves both a managerial purpose and a political one, building consensus and motivation for action.
APA, Harvard, Vancouver, ISO, and other styles
13

Avanzi, Elisabetta. "Antropologia ed organizzazione delle imprese creative dell'ICT - Analisi della nascita, ascesa e suicidio organizzativo di un micro-distretto di aziende di comunicazione multimediale." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2013. http://amslaurea.unibo.it/4958/.

Full text
Abstract:
Analisi "dall'interno" del settore della comunicazione multimediale, attraverso l'analisi di un caso aziendale complesso vissuto in prima persona dall'autrice. Il caso tratta di un micro-distretto di aziende della Grafica, del Web e del Multimedia nel suo ciclo di vita dal 1998 al 2003, analizzando le mutazioni del comportamento organizzativo nelle sue differenti fasi di nascita, ascesa e declino. L'obiettivo è quello di identificare i comportamenti organizzativi favorevoli e contrari alla sopravvivenza nel settore, fornendo indicazioni per un corretto comportamento manageriale in un ambito creativo relativo ad una rete di imprese co-specializzate.
APA, Harvard, Vancouver, ISO, and other styles
14

REVENAZ, Alfredo. "Nuovi protocolli per gestione dati e comunicazione Real-time in ambito agricolo." Doctoral thesis, Università degli studi di Ferrara, 2011. http://hdl.handle.net/11392/2388762.

Full text
Abstract:
In recent years, precision farming and farm automation in the agricultural field have required short-range wireless communication systems for machine synchronization and long-range wireless communication for fleet management and for real-time production and task control. Existing protocols meet these requirements only partially: differences in legislation and product availability across countries make it difficult to identify a solution for the global market. The TCP/IP protocol and GPRS/GSM networks meet some requirements for long-range communication, but not the real-time requirements of short-range communication. The synchronization specifications call for an affordable, safe, low-latency real-time wireless communication protocol that ensures the minimum guaranteed throughput needed to synchronize machine work and travel. Moreover, information security requirements must be fulfilled to guarantee safe data transmission. The work presents a comprehensive proposal based on the UDP and IP protocols with embedded controls, apt to create isochronous wireless communication for machine synchronization, together with other methods and protocols that fulfil the long-range data communication requirements of the agricultural field.
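As a purely illustrative sketch of an isochronous UDP exchange of the kind described above (the wire format, field names and units below are assumptions, not the protocol actually defined in the thesis):

    import socket
    import struct
    import time

    # Hypothetical datagram layout: sequence number (uint32), send timestamp in
    # microseconds (uint64) and a machine-state field (e.g. ground speed in mm/s
    # as int32).  Stale datagrams are recognised from the sequence number.
    SYNC_FORMAT = "!IQi"

    def send_sync(sock, addr, seq, speed_mm_s):
        payload = struct.pack(SYNC_FORMAT, seq, int(time.time() * 1e6), speed_mm_s)
        sock.sendto(payload, addr)

    def recv_sync(sock, last_seq):
        data, _ = sock.recvfrom(64)
        seq, sent_us, speed = struct.unpack(SYNC_FORMAT, data)
        if seq <= last_seq:
            return last_seq, None  # drop out-of-date state, keep only the freshest
        latency_ms = (time.time() * 1e6 - sent_us) / 1000.0  # meaningful only with synchronised clocks
        return seq, (latency_ms, speed)

    sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_sync(sender, ("127.0.0.1", 50000), seq=1, speed_mm_s=1500)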
APA, Harvard, Vancouver, ISO, and other styles
15

CURRELI, MARCO. "Integrazione tra digital intelligence e business intelligence: quali opportunità per le imprese della GDO. Il caso della Magazzini Gabrielli S.p.A." Doctoral thesis, Università Politecnica delle Marche, 2016. http://hdl.handle.net/11566/243092.

Full text
Abstract:
Retailing has undergone radical changes with the advent of online sales channels and the constant push towards digitalisation. To face these changes, many retail companies have begun to develop new strategies and adapt their business models in order to exploit the opportunities of the new "digital world", rethinking their strategies from a multichannel perspective. The research focuses on the level of multichannel integration reached by a small-to-medium retailer and provides a first analysis of the impacts on the organisation's overall performance. In particular, the effectiveness of a multichannel strategy is found to be maximised mainly by increasing the information technology competences within the company (Lih-Bin, Hock-Hai, & Vallabh, 2012). The empirical analysis also highlights that IT competence is a fundamental enabler for integrating information flows across different channels (or touchpoints) (Neslin et al. 2006), so as to provide an integrated multichannel customer experience. Small and medium retailers therefore face a major competitive challenge, one that requires specific innovation and management skills.
APA, Harvard, Vancouver, ISO, and other styles
16

PARMIGGIANI, Nicolò. "Metodi per l’analisi e la gestione dei dati dell’astrofisica gamma in tempo reale." Doctoral thesis, Università degli studi di Modena e Reggio Emilia, 2021. http://hdl.handle.net/11380/1239980.

Full text
Abstract:
The context of this Ph.D. is data analysis and management for gamma-ray astronomy, which involves the observation of gamma rays, the most energetic form of electromagnetic radiation. From the gamma-ray observations performed by telescopes or satellites, it is possible to study catastrophic events involving compact objects such as white dwarfs, neutron stars and black holes. These events are called gamma-ray transients. To understand these phenomena, they must be observed during their evolution. For this reason, speed is crucial, and automated data analysis pipelines are developed to detect gamma-ray transients and generate science alerts during the astrophysical observations or immediately after. A science alert is an immediate communication from one observatory to other observatories that an interesting astrophysical event is occurring in the sky. The astrophysical community is experiencing a new era called "multi-messenger astronomy", in which astronomical sources are observed by different instruments collecting different signals: gravitational waves, electromagnetic radiation and neutrinos. In the multi-messenger era, astrophysical projects share science alerts through dedicated communication networks, and coordinating different projects by sharing science alerts is mandatory to understand the nature of these physical phenomena. Observatories have to manage the follow-up of these external science alerts by developing dedicated software. During this Ph.D., the research activity focused mainly on the AGILE space mission, currently in operation, and on the Cherenkov Telescope Array Observatory (CTA), currently in the construction phase. The follow-up of external science alerts received from Gamma-Ray Burst (GRB) and Gravitational Wave (GW) detectors is one of the AGILE Team's current major activities. Future generations of gamma-ray observatories such as CTA or the ASTRI Mini-Array can take advantage of the technologies developed for AGILE. This research aims to develop analysis and management software for gamma-ray data that fulfils these requirements. The first chapter of the thesis describes the web platform used by AGILE researchers to prepare the Second AGILE Catalog of Gamma-ray Sources. The analyses performed for this catalog are stored in a dedicated database, which the web platform queries. This was preparatory work for understanding how to manage detections of gamma-ray sources and light curves for the subsequent phase: the development of a scientific pipeline to manage gamma-ray detections and science alerts in real time. The second chapter presents a framework designed to facilitate the development of real-time scientific analysis pipelines. The framework provides a common pipeline architecture and automatisms that observatories can use to develop their own pipelines. This framework was used to develop the pipelines of the AGILE space mission and a prototype of the Science Alert Generation (SAG) system of the CTA Observatory. The third chapter describes a new method to detect GRBs in AGILE-GRID data using a Convolutional Neural Network. With this Deep Learning technology, it is possible to improve the detection capabilities of AGILE. The method was also integrated as a science tool in the AGILE pipelines. The last chapter of the thesis shows the scientific results obtained with the software developed during the Ph.D. research activities. Part of the results was published in refereed journals; the remaining part was communicated to the scientific community through The Astronomer's Telegram or the Gamma-ray Coordination Network.
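The CNN-based GRB detection described in the third chapter can be pictured with a minimal, hypothetical sketch of a convolutional classifier for gamma-ray count maps; the input size, architecture and choice of PyTorch are illustrative assumptions, not the AGILE pipeline code:

    import torch
    import torch.nn as nn

    class GRBClassifier(nn.Module):
        # Tiny CNN that labels a counts map as background or GRB candidate.
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 25 * 25, 2)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    model = GRBClassifier()
    fake_map = torch.rand(1, 1, 100, 100)   # one simulated 100x100 counts map
    print(model(fake_map).softmax(dim=1))   # probabilities: (background, GRB)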
APA, Harvard, Vancouver, ISO, and other styles
17

PAGANELLI, MATTEO. "Gestione ed Analisi di Big Data: Sfide e Opportunità nell'Integrazione e nell'Estrazione di Conoscenza dai Dati." Doctoral thesis, Università degli studi di Modena e Reggio Emilia, 2021. http://hdl.handle.net/11380/1239979.

Full text
Abstract:
In the Big Data era, the adequate management and consumption of data is one of the most challenging activities, owing to a series of critical issues usually categorised under five key concepts: volume, velocity, variety, veracity and variability. In response to these needs, a large number of algorithms and technologies have been proposed in recent years; however, many open problems remain and new challenges have emerged. Among these, to name a few, are the need for annotated data to train machine learning techniques, the need to interpret the logic of the systems used, the need to reduce the impact of their management in production (the so-called technical debt) and the need for tools supporting human-machine interaction. This thesis examines in depth the challenges affecting data integration and the modern use (in terms of readjustment to new requirements) of relational DBMSs. The main problem affecting data integration is its evaluation in real contexts, which typically requires the costly and time-consuming involvement of domain experts. In this perspective, tools that support and automate this critical task, as well as its unsupervised resolution, would be very useful. In this context, my contribution can be summarised in the following points: 1) the realisation of techniques for the unsupervised evaluation of data integration tasks and 2) the development of automatic approaches for the configuration of rule-based matching models. As for relational DBMSs, over the last few decades they have proved to be the workhorse of many companies, thanks to their simplicity of governance, security, auditability and high performance. Today, however, we are witnessing a partial rethinking of their use compared with the original design: for example, they are used to solve more advanced tasks, such as classification, regression and clustering, typical of the machine learning field. Establishing a symbiotic relationship between these two research fields could be essential to solving some of the critical issues listed above. In this context, my main contribution was to verify the possibility of performing in-DBMS inference of machine learning pipelines at serving time.
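The last point, in-DBMS inference of a model at serving time, can be pictured with a small hypothetical sketch in which a logistic regression trained offline is translated into a SQL expression and evaluated inside the database engine; the table, columns and coefficients below are invented for illustration and are not the thesis implementation:

    import math
    import sqlite3

    # Coefficients of a logistic regression assumed to have been trained offline.
    coefficients = {"amount": 0.8, "n_items": -0.3}
    intercept = -1.2

    # Compile the model into a SQL expression so scoring happens in the DBMS.
    linear_part = " + ".join(f"({w}) * {col}" for col, w in coefficients.items())
    score_sql = f"1.0 / (1.0 + exp(-({intercept} + {linear_part})))"

    con = sqlite3.connect(":memory:")
    con.create_function("exp", 1, math.exp)  # provide exp() in case the build lacks it
    con.execute("CREATE TABLE orders (id INTEGER, amount REAL, n_items REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                    [(1, 2.0, 1.0), (2, 0.1, 5.0)])
    for row in con.execute(f"SELECT id, {score_sql} AS score FROM orders"):
        print(row)  # each row is scored by the database, not by the application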
APA, Harvard, Vancouver, ISO, and other styles
18

Di, Renzo Maria Rosaria. "L'efficacia della formazione ICT-based nello sviluppo delle competenze manageriali: evidenze da un caso aziendale." Doctoral thesis, Luiss Guido Carli, 2009. http://hdl.handle.net/11385/200792.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

CORONA, ERIKA. "Web Framework Points: an Effort Estimation Methodology for Web Application Development." Doctoral thesis, Università degli Studi di Cagliari, 2013. http://hdl.handle.net/11584/266242.

Full text
Abstract:
Software effort estimation is one of the most critical components of a successful software project: completing the project on time and within budget is the classic challenge for all project managers. However, predictions made by project managers about their projects are often inexact: software projects need, on average, 30-40% more effort than estimated. Research on software development effort and cost estimation has been abundant and diversified since the end of the Seventies, and the topic is still very much alive, as shown by the numerous works in the literature. During these three years of research activity, I had the opportunity to study in depth and experiment with some of the main software effort estimation methodologies in the literature. In particular, I focused my research on Web effort estimation. As stated by many authors, the existing models for classic software applications are not well suited to measuring the effort of Web applications, which, like traditional software projects, are unfortunately not exempt from cost and time overruns. Initially, I compared the effectiveness of Albrecht's classic Function Points (FP) and Reifer's Web Objects (WO) metrics in estimating development effort for Web applications, in the context of an Italian software company. I tested these metrics on a dataset of 24 projects provided by the software company between 2003 and 2010, comparing the estimates with the real effort of each completed project using the MRE (Magnitude of Relative Error) measure. The experimental results showed a high estimation error when using the WO metric, which proved more effective than the FP metric in only two cases. In the context of this first work, it became evident that effort estimation depends not only on functional size measures: other factors must also be considered, such as model accuracy and other challenges specific to Web applications, although functional size remains the input that most influences the final results. For this reason, I revised the WO methodology, creating the RWO methodology. I applied this methodology to the same dataset of projects, comparing the results to those gathered by applying the FP and WO methods. The experimental results showed that the RWO method reached effort prediction results that are comparable to – and in 4 cases even better than – the FP method. Motivated by the dominant use of Content Management Frameworks (CMF) in Web application development and by the inadequacy of the RWO method when used with the latest Web application development tools, I finally chose to focus my research on a new Web effort estimation methodology for Web applications developed with a CMF: the Web CMF Objects methodology. In this methodology, new key elements for analysis and planning were identified; they make it possible to define every important step in the development of a Web application using a CMF. Following the RWO approach, the estimated effort of a Web project stems from the sum of all elements, each weighted by its own complexity. I tested the whole methodology on 9 projects provided by three different Italian software companies, comparing the value of the effort estimate to the actual, final effort of each project, in man-days.
I then compared the effort estimates obtained with the Web CMF Objects methodology with those obtained from the respective effort estimation methodologies of the three companies, getting excellent results: a value of Pred(0.25) equal to 100% for the Web CMF Objects methodology. Recently, I completed the presentation and assessment of the Web CMF Objects methodology, upgrading the cost model used to calculate the effort estimate and renaming it the Web Framework Points methodology. I tested the updated methodology on 19 projects provided by three software companies, getting good results: a value of Pred(0.25) equal to 79%. The aim of my research is to contribute to reducing the estimation error in software projects developed through Content Management Frameworks, with the purpose of making the Web Framework Points methodology a useful tool for software companies.
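MRE and Pred(0.25), the accuracy measures used above, are standard in effort-estimation research; the following short sketch, with invented effort values rather than the thesis dataset, shows how they are typically computed:

    def mre(actual, estimated):
        # Magnitude of Relative Error of a single project.
        return abs(actual - estimated) / actual

    def pred(actuals, estimates, threshold=0.25):
        # Fraction of projects whose MRE is within the given threshold.
        errors = [mre(a, e) for a, e in zip(actuals, estimates)]
        return sum(err <= threshold for err in errors) / len(errors)

    # Invented effort values in man-days, for illustration only.
    actual_effort = [120, 80, 200, 45]
    estimated_effort = [100, 95, 190, 70]
    print([round(mre(a, e), 2) for a, e in zip(actual_effort, estimated_effort)])
    print(pred(actual_effort, estimated_effort))  # 0.75, i.e. Pred(0.25) = 75% here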
APA, Harvard, Vancouver, ISO, and other styles
20

PANTINI, SARA. "Analysis and modelling of leachate and gas generation at landfill sites focused on mechanically-biologically treated waste." Doctoral thesis, Università degli Studi di Roma "Tor Vergata", 2013. http://hdl.handle.net/2108/203393.

Full text
Abstract:
Although significant efforts have been directed toward reducing waste generation and encouraging alternative waste management strategies, landfills still remain the main option for Municipal Solid Waste (MSW) disposal in many countries. Hence, landfills and the related impacts on their surroundings are still current issues throughout the world. The major concerns relate to the potential emissions of leachate and landfill gas into the environment, which pose a threat to public health and lead to surface and groundwater pollution, soil contamination and global warming effects. To ensure environmental protection and enhance landfill sustainability, modern sanitary landfills are equipped with several engineered systems with different functions. For instance, the installation of containment systems, such as bottom liners and multi-layer capping systems, is aimed at reducing leachate seepage and water infiltration into the landfill body as well as gas migration, while methane emissions can be further mitigated through the placement of active oxidation layers (biocovers). Leachate collection and removal systems are designed to minimise the water head forming on the bottom section of the landfill and the consequent seepage through the liner system. Finally, gas extraction and utilisation systems make it possible to recover energy from landfill gas while reducing the explosion and fire risks associated with methane accumulation, even though much depends on the gas collection efficiency achieved in the field (range: 60-90%; Spokas et al., 2006; Huitric and Kong, 2006). Hence, impacts on the surrounding environment caused by the polluting substances released from the deposited waste through liquid and gas emissions can potentially be mitigated by a proper design of technical barriers and collection/extraction systems at the landfill site. Nevertheless, the long-term performance of containment systems in limiting landfill emissions is highly uncertain and strongly dependent on site-specific conditions such as climate, vegetative covers, containment systems, leachate quality and applied stress. Furthermore, the design and operation of leachate collection and treatment systems and of landfill gas extraction and utilisation projects, as well as the assessment of appropriate methane reduction strategies (biocovers), require reliable emission forecasts to assess system feasibility and ensure environmental compliance. To this end, landfill simulation models can be a useful supporting tool for a better design of leachate/gas collection and treatment systems and can provide valuable information for evaluating the best options for containment systems depending on their performance under site-specific conditions. The capability of predicting future emission levels at a landfill site can also be improved by combining simulation models with field observations at full-scale landfills and/or with experimental studies resembling landfill conditions. Indeed, this kind of data makes it possible to identify the main parameters and processes governing leachate and gas generation and can provide useful information for model refinement. In view of this need, the present research study was initially addressed to developing a new landfill screening model that, based on simplified mathematical and empirical equations, provides a quantitative estimation of leachate and gas production over time, taking into account site-specific conditions, waste properties and the main landfill characteristics and processes.
In order to evaluate the applicability of the developed model and the accuracy of its emission forecasts, several simulations were carried out on four full-scale landfills currently in the operative management stage. The results of these case studies showed a good correspondence between leachate estimations and the monthly trend observed in the field, and revealed that the reliability of model predictions is strongly influenced by the quality of the input data. In particular, the initial waste moisture content and the waste compression index, which are usually not available from a standard characterisation, were identified as the key unknown parameters affecting leachate production. Furthermore, the applicability of the model to closed landfills was evaluated by simulating different alternative capping systems and by comparing the results with those returned by the Hydrological Evaluation of Landfill Performance (HELP) model, which is the most widely used model worldwide for the comparative analysis of composite liner systems. Despite the simplified approach of the developed model, the simulated infiltration and leakage rates through the analysed cover systems were in line with those of HELP. However, it should be highlighted that the developed model provides an assessment of leachate and biogas production only from a quantitative point of view. The leachate and biogas composition was not included in the forecast model, as it is strongly linked to the type of waste, which would make such a prediction in a screening phase poorly representative of what could be expected in the field. Hence, for a qualitative analysis of leachate and gas emissions over time, a laboratory methodology including different types of lab-scale tests was applied to a particular waste material. Specifically, the research focused on mechanically-biologically treated (MBT) wastes which, after the introduction of the European Landfill Directive 1999/31/EC (European Commission, 1999), which requires member states to landfill only wastes that have been preliminarily subjected to treatment, are becoming the main waste flow landfilled in new Italian facilities. However, due to the relatively recent introduction of MBT plants within the waste management system, very few data on leachate and gas emissions from MBT waste in landfills are available and, hence, the current knowledge mainly results from laboratory studies. Nevertheless, the leaching characteristics of MBT materials and the way environmental conditions may affect heavy metal mobility are still poorly investigated in the literature. To gain deeper insight into the fundamental mechanisms governing the release of constituents from MBT wastes, several leaching experiments were performed on MBT samples collected from an Italian MBT plant, and the experimental results were modelled to obtain information on long-term leachate emissions. Namely, a combination of leaching tests was performed on fully characterised MBT waste samples, and the effect of different parameters, mainly pH and the liquid-to-solid ratio (L/S), on the release of compounds was investigated by combining pH-static batch tests, pH-dependent tests and dynamic up-flow column percolation experiments. The results showed that, even though MBT wastes were characterised by a relatively high heavy metal content, only a limited amount was actually soluble and thus bioavailable.
Furthermore, the information provided by the different tests highlighted a strong linear correlation between the release pattern of dissolved organic carbon (DOC) and several metals (Co, Cr, Cu, Ni, V, Zn), suggesting that complexation with DOC is the mechanism controlling the leaching of these elements. Thus, by combining the results of the batch and up-flow column percolation tests, partition coefficients between DOC and metal concentrations were derived. These data, coupled with a simplified screening model for DOC release, made it possible to obtain a very good prediction of metal release during the experiments and may provide useful indications for the evaluation of long-term emissions from this type of waste in a landfill disposal scenario. In order to complete the study of the environmental behaviour of MBT waste, gas emissions from MBT waste were examined by performing different anaerobic tests. The main purpose of this study was to evaluate the potential gas generation capacity of the wastes and to assess the possible implications for gas generation resulting from the different environmental conditions expected in the field. To this end, anaerobic batch tests were performed at a wide range of water contents (26-43 %w/w up to 75 %w/w on a wet weight basis) and temperatures (from 20-25 °C up to 55 °C) in order to simulate different landfill management options (dry tomb or bioreactor landfills). In nearly all test conditions, a rather long lag phase (several months) was observed, due to the inhibition effects resulting from high concentrations of volatile fatty acids (VFAs) and ammonia, which highlighted the poor degree of stability of the analysed material. Furthermore, the experimental results showed that the initial water content of the waste is the key factor limiting the anaerobic biological process: when the waste moisture was lower than 32 %w/w, the methanogenic microbial activity was completely inhibited. Overall, the results indicated that the operative conditions drastically affect gas generation from MBT waste, in terms of both gas yield and generation rate. This suggests that particular caution should be exercised when using the results of lab-scale tests to evaluate the long-term behaviour expected in the field, where the boundary conditions change continuously and vary significantly depending on the climate, the landfill operative management strategies in place (e.g. leachate recirculation, waste disposal methods), the hydraulic characteristics of the buried waste, and the presence and type of temporary and final cover systems.
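As a deliberately simplified, hypothetical illustration of the DOC-metal partition idea mentioned above (all coefficients and release values below are invented and are not the thesis data):

    # Predict metal release from DOC release through a linear partition
    # coefficient, C_metal = Kd * C_DOC, reflecting the strong DOC-metal
    # correlation reported in the abstract.
    kd = {"Cu": 0.012, "Ni": 0.008, "Zn": 0.020}   # invented coefficients, mg metal per mg DOC

    def predict_metal_release(doc_release_mg_per_kg, element):
        return kd[element] * doc_release_mg_per_kg

    doc_release = [350, 220, 140, 90, 60]          # invented DOC release at increasing L/S ratios, mg/kg
    cu_release = [predict_metal_release(doc, "Cu") for doc in doc_release]
    print(cu_release)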
APA, Harvard, Vancouver, ISO, and other styles
21

Terrosi, Giulia. "Guidelines for BIM information management at design stage." Master's thesis, 2020. http://hdl.handle.net/1822/74855.

Full text
Abstract:
Master's dissertation in the European Master in Building Information Modelling
Information Management in Building Information Modelling (BIM) deals with the quality and efficiency of the retrieval, storage, use and sharing of information throughout the project lifecycle. Detailed planning, the definition of roles and responsibilities, the assessment of capabilities and the statement of requirements should be established from the very beginning of the project to achieve effective management of the information of the whole process. However, even if many companies are using BIM tools, many of them have still not implemented standardised information processes. This work created a framework to support companies in BIM projects, specifically during the design stage, to lead them towards a structured management of the information generated during this stage and to prepare its handover. To do so, in-depth research was conducted on published papers and articles, BIM guides and applicable standards to gather the best practices in the field and propose guidelines that support a structured BIM adoption in a company's project. A specific workflow based on the Integrated Project Delivery methodology was created to support companies in BIM adoption. From there, several tools were proposed to assist the BIM Manager in establishing protocols during the project design stage. The proposed framework may be used to support companies in assessing and selecting the right team members' capabilities, in collecting information requirements efficiently, in establishing a proper BIM Execution Plan and in evaluating CDE platforms. This work was developed in collaboration with a BIM consulting company, allowing for a parallel validation of this workflow and of the documents throughout the process. These tools and the designed workflow are intended to help companies accelerate their BIM processes and foster collaboration between all stakeholders, assuring enhanced information management and therefore contributing to the quality of the output and to the general performance of the organisation.
APA, Harvard, Vancouver, ISO, and other styles
22

Triggiani, Maurizio. "Integration of machine learning techniques in chemometrics practices." Doctoral thesis, 2022. http://hdl.handle.net/11589/237998.

Full text
Abstract:
Food safety is a key objective in all the development plans of the European Union. To ensure the quality and sustainability of agricultural production (both intensive and extensive), a well-designed analysis strategy is needed. Climate change, precision agriculture, the green revolution and Industry 4.0 are areas of study that need innovative practices and approaches, which are not possible without precise and constant process monitoring. The need for product quality assessment throughout the whole supply chain is paramount, and cost reduction is another constant need. Non-targeted Nuclear Magnetic Resonance (NMR) analysis is still a second-choice approach for food analysis and monitoring; one of the problems of this approach is the large amount of information it returns, which calls for new and improved methods of handling and analysis. Classical chemometrics practices are not well suited to this new field of study. In this thesis, we approached the problem of food fingerprinting and discrimination by means of non-targeted NMR spectroscopy combined with modern machine learning algorithms and databases designed for correct and easy access to data. Alongside its clear benefits, the introduction of machine learning techniques adds a new layer of complexity regarding the need for trusted data sources for algorithm training and integrity: if this kind of approach proves its worth in the global market, we will need not only to create good datasets, but also to be prepared to defend against more sophisticated attacks, such as adversarial machine learning attacks. By comparing the machine learning results with the classic chemometric approach, we highlight the strengths and weaknesses of both approaches and use them to prepare the framework needed to tackle the challenges of future agricultural production.
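As a hypothetical illustration of the kind of workflow being compared (a classic chemometrics-style PCA step feeding a simple classifier, which a machine learning variant would replace with a more flexible model such as a random forest), the following sketch uses random stand-in data and scikit-learn; it is not the thesis code:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    # Stand-in for a non-targeted NMR dataset: each row is a binned spectrum,
    # each label a product class (e.g. geographical origin).  Random numbers
    # are used here only to keep the sketch self-contained.
    rng = np.random.default_rng(0)
    spectra = rng.normal(size=(60, 500))
    labels = rng.integers(0, 2, size=60)

    model = make_pipeline(PCA(n_components=10), LogisticRegression(max_iter=1000))
    print(cross_val_score(model, spectra, labels, cv=5).mean())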
APA, Harvard, Vancouver, ISO, and other styles
