Dissertations / Theses on the topic 'Settore FIS/07 - FISICA APPLICATA (A BENI CULTURALI, AMBIENTALI, BIOLOGIA E MEDICINA)'

To see the other types of publications on this topic, follow the link: Settore FIS/07 - FISICA APPLICATA (A BENI CULTURALI, AMBIENTALI, BIOLOGIA E MEDICINA).

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic 'Settore FIS/07 - FISICA APPLICATA (A BENI CULTURALI, AMBIENTALI, BIOLOGIA E MEDICINA).'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

MELLE, GIOVANNI. "Development of a Novel Platform for in vitro Electrophysiological Recording." Doctoral thesis, Università degli studi di Genova, 2020. http://hdl.handle.net/11567/1000590.

Full text
Abstract:
The accurate monitoring of cell electrical activity is of fundamental importance for pharmaceutical research and for the preclinical trials that require checking the cardiotoxicity of all new drugs. Traditional methods for preclinical evaluation of drug cardiotoxicity exploit animal models, which tend to be expensive, low throughput, and subject to species-specific differences in cardiac physiology (Mercola, Colas and Willems, 2013). Alternative approaches use heterologous expression of cardiac ion channels in non-cardiac cells transfected with genetic material. However, the use of these constructs and the inhibition of specific ionic currents alone are not predictive of cardiotoxicity. Drug toxicity evaluation based on the human ether-à-go-go-related gene (hERG) channel, for example, leads to a high rate of false-positive cardiotoxic compounds, increasing drug attrition at the preclinical stage. Consequently, since 2013 the Comprehensive in Vitro Proarrhythmia Assay (CiPA) initiative has focused on experimental methods that identify cardiotoxic drugs and improve upon prior models, which have largely relied on alterations in the hERG potassium ion channel. The most predictive models for drug cardiotoxicity must recapitulate the complex spatial distribution of the physiologically distinct myocytes of the intact adult human heart. However, intact human heart preparations are inherently too costly, too difficult to maintain, and hence too low throughput to be implemented early in the drug development pipeline. For these reasons, the optimization of methodologies to differentiate human induced pluripotent stem cells (hiPSCs) into cardiomyocytes (CMs) has enabled human CMs to be mass-produced in vitro for cardiovascular disease modeling and drug screening (Sharma, Wu and Wu, 2013). These hiPSC-CMs functionally express most of the ion channels and sarcomeric proteins found in adult human CMs and can contract spontaneously. Recent results from the CiPA initiative have confirmed that, if utilized appropriately, the hiPSC-CM platform can serve as a reliable alternative to existing hERG assays for evaluating arrhythmogenic compounds and can sensitively detect the action potential repolarization effects associated with ion channel–blocking drugs (Millard et al., 2018). Data on drug-induced toxicity in hiPSC-CMs have already been successfully collected using several functional readouts, such as field potential traces via multi-electrode array (MEA) technology (Clements, 2016), action potentials via voltage-sensitive dyes (VSD) (Blinova et al., 2017), and cellular impedance (Scott et al., 2014). Although still under discussion, scientists have reached a consensus on the value of using electrophysiological data from hiPSC-CMs for predicting cardiotoxicity and on how to further optimize hiPSC-CM-based in vitro assays for acute and chronic cardiotoxicity assessment. In line with CiPA, therefore, the use of hiPSCs coupled with MEA technology was selected as a promising readout for this kind of experiment. These platforms are used as an experimental model for studying cardiac action potential (AP) dynamics and for understanding some fundamental principles of AP propagation and synchronization in healthy heart tissue. MEA technology utilizes recordings from an array of electrodes embedded in the culture surface of a well.
When cardiomyocytes are grown on these surfaces, spontaneous action potentials from a cluster of cardiomyocytes, the so-called functional syncytium, can be detected as fluctuations in the extracellular field potential (FP). MEA measures the change in FP as the action potential propagates through the cell monolayer relative to the recording electrode; nevertheless, FPs recorded by MEA do not allow the action potential features to be properly recapitulated. It is clear, therefore, that MEA technology by itself is not enough to implement cardiotoxicity assays on hiPSC-CMs. To address this issue, researchers worldwide began looking for a platform able to work at the same time as a standard MEA and as a patch clamp, allowing the recording of extracellular signals as usual, with the opportunity to switch to intracellular-like signals from the cytosol. This strong interest stimulated the development of methods for intracellular recording of action potentials. Currently, the most promising results come from multi-electrode arrays (MEAs) decorated with 3D nanostructures, introduced in pioneering papers (Robinson et al., 2012; Xie et al., 2012) and culminating in the recent work of the groups of H. Park (Abbott et al., 2017) and F. De Angelis (Dipalo et al., 2017). These articles show intracellular recordings from different kinds of cells on electrodes refined with 3D nanopillars after electroporation or laser optoporation. However, the requirement of 3D nanostructures sets strong limitations on the practical spread of these techniques. Thus, although pioneering results have been obtained by exploiting laser optoporation, these technologies have neither been applied to practical cases nor reached the commercial phase. This PhD thesis introduces the concept of meta-electrodes coupled with laser optoporation for high-quality intracellular signals from hiPSC-CMs. These signals can be recorded on high-density commercial CMOS-MEAs from 3Brain, characterized by thousands of electrodes covered by a thin film of porous platinum, without any rework of the devices, 3D nanostructures, or circuitry for electroporation. Subsequently, I attempted to translate these unique features of low invasiveness and reliability to other commercial MEA platforms, in order to develop a new tool for accurate cardiac electrophysiological recordings. The thesis is organized as follows: a first chapter goes deeper into the scientific and technological background, including an explanation of the cell biology of hiPSC-CMs followed by a full overview of MEA technology and devices, and then moves on to state-of-the-art approaches to intracellular recording, discussing many works from the scientific literature. A second chapter describes the main objectives of the whole work, a third chapter presents the main results of the activity, and a final chapter summarizes the conclusions of the work.
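As a purely illustrative aside (not code from the thesis): a common first step in analysing an MEA field-potential trace is to locate beats by thresholding against a robust estimate of the baseline noise. The sketch below assumes a single-channel trace; all names and parameter values are hypothetical.

```python
import numpy as np

def detect_beats(fp, fs, thresh_sd=5.0, refractory_s=0.2):
    """Locate candidate beats in an extracellular field-potential trace.

    fp           -- 1-D array, field potential samples (e.g., in uV)
    fs           -- sampling rate in Hz
    thresh_sd    -- detection threshold in units of baseline noise SD
    refractory_s -- minimum spacing between detected beats, in seconds
    """
    baseline_sd = np.median(np.abs(fp)) / 0.6745          # robust noise estimate
    thresh = thresh_sd * baseline_sd
    # indices where |fp| crosses the threshold from below
    crossings = np.flatnonzero((np.abs(fp[1:]) >= thresh) & (np.abs(fp[:-1]) < thresh))
    beats, last = [], -np.inf
    for i in crossings:
        if (i - last) / fs >= refractory_s:               # enforce refractory period
            beats.append(i)
            last = i
    return np.asarray(beats)
```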
APA, Harvard, Vancouver, ISO, and other styles
2

Arosio, Daniele. "Imaging Chloride Homeostasis in Neurons." Doctoral thesis, Università degli studi di Trento, 2017. https://hdl.handle.net/11572/368512.

Full text
Abstract:
Intracellular chloride and pH are fundamental regulators of neuronal excitability, and they are often co-modulated during excitation-inhibition activity. The study of their homeostasis requires simultaneous measurements in vivo in multiple neurons. Combining random mutagenesis screening, protein engineering, and two-photon imaging, this thesis work led to the discovery of new chloride-sensitive GFP mutants and to the establishment of ratiometric imaging procedures for the quantitative combined imaging of intraneuronal pH and chloride. These achievements have been demonstrated in vivo in the mouse cortex, monitoring in real time the dynamic changes of ion concentrations during epileptic-like discharges, and in glioblastoma primary cells, measuring osmotic swelling responses to various drug treatments.
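For readers unfamiliar with the ratiometric readout mentioned above, a minimal sketch follows. It assumes a generic one-site binding calibration (with constants r_min, r_max, k_d measured separately); the thesis's actual calibration procedure may differ.

```python
import numpy as np

def ratiometric_chloride(f_sensitive, f_reference, r_min, r_max, k_d):
    """Convert two fluorescence channels into a chloride estimate.

    Assumes a generic one-site binding calibration:
        R = F_sensitive / F_reference
        [Cl-] = K_d * (R_max - R) / (R - R_min)
    r_min, r_max and k_d are calibration constants (hypothetical here).
    """
    r = np.asarray(f_sensitive) / np.asarray(f_reference)  # dye amount cancels
    return k_d * (r_max - r) / (r - r_min)
```

The point of taking a ratio is that the unknown local dye concentration and optical path cancel out, leaving a quantity that depends only on the ion concentration.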
APA, Harvard, Vancouver, ISO, and other styles
3

Filosi, Michele. "A network medicine approach on microarray and Next generation Sequencing data." Doctoral thesis, Università degli studi di Trento, 2014. https://hdl.handle.net/11572/367599.

Full text
Abstract:
The goal of this thesis is the development of a bioinformatics solution for network-based predictive analysis of NGS data, in which network structures can replace gene lists as a richer and more complex signature of disease. I have focused on methods for network stability, network inference, and network comparison, as additional components of the pipeline and as methods to detect outliers in high-throughput datasets. Besides a first work on GEO datasets, the main application of my pipeline has been to original data from the FDA SEQC (Sequencing Quality Control) project. Here I report some initial findings to which I have contributed with methods and analysis, as the corresponding papers are being submitted. My goal is to provide a comprehensive tool for network reconstruction and network comparison as an R package and a user-friendly web service interface, available online at https://renette.fbk.eu.
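As a toy illustration of the kind of network reconstruction and comparison such a pipeline performs (sketched here in Python rather than the package's R, and not the renette implementation): build a binary co-expression network by thresholding absolute Pearson correlation, then compare two networks by the fraction of differing edges.

```python
import numpy as np

def correlation_network(expr, threshold=0.7):
    """Binary adjacency matrix from absolute Pearson correlation.

    expr -- genes x samples expression matrix; the threshold is illustrative.
    """
    adj = (np.abs(np.corrcoef(expr)) >= threshold).astype(int)
    np.fill_diagonal(adj, 0)                  # no self-edges
    return adj

def network_distance(adj_a, adj_b):
    """Fraction of gene pairs whose edge status differs between two networks."""
    n = adj_a.shape[0]
    return np.sum(adj_a != adj_b) / (n * (n - 1))
```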
APA, Harvard, Vancouver, ISO, and other styles
4

DIACCI, CHIARA. "Dispositivi Bioelettronici Organici per Specifici Biomarcatori: Verso l'Integrazione con Sistemi Biologici." Doctoral thesis, Università degli studi di Modena e Reggio Emilia, 2021. http://hdl.handle.net/11380/1256782.

Full text
Abstract:
Inorganic materials have been the main players of the semiconductor industry for the past forty years. However, there has been continuous growth in the research and application of organic semiconductors (OSCs) as active materials in electronic devices, due to the possibility of processing these materials at low temperature, on flexible substrates, and over large areas. Because of these features, organic electronic devices are rapidly emerging as biosensors, with a high potential for becoming high-throughput point-of-care tools. One of the most studied platforms is the Organic Electrochemical Transistor (OECT). OECTs have been used as biosensors to transduce and amplify electrical signals or to detect biological analytes upon proper functionalization with specific biorecognition units. OECTs operate at low voltages, are easy to fabricate on different substrates, and are compatible with aqueous environments; they can therefore be interfaced with living systems. The OECT device configuration includes a gate electrode that modulates the current in the channel through an electrolyte, which can be a buffered solution or a complex biological fluid. When OECTs are operated as biosensors, the sensing mechanism relies on the current variation generated by specific reactions with the analyte of interest. During my PhD, I focused on the design, fabrication, and validation of different OECT-based biosensors for the detection of biomarkers for healthcare applications, showing their high potential as a sensing platform. We developed sensors towards different analytes, ranging from small molecules to proteins, with ad hoc strategies to endow the device with selectivity towards the species of interest. Most notably, I also demonstrated the possibility of integrating OECTs in plants, as an example of interfacing with living systems. In the first two papers, we developed screen-printed OECTs with PEDOT:PSS as the semiconducting material in the channel. In the first case, the device also featured a PEDOT:PSS gate electrode, which was further functionalized with biocompatible gelatin and the enzyme urease to ensure selectivity towards urea. The biosensor was able to monitor increasing urea concentrations with a limit of detection of 1 µM. In the second paper, the screen-printed carbon gate electrode was first modified with platinum, and selectivity towards uric acid, a relevant biomarker for wound infection, was then ensured by entrapping urate oxidase in a dual-ionic-layer hydrogel membrane. The biosensor exhibited a 4.5 µM limit of detection and retained selectivity even in artificial wound exudate. In the third paper, we designed an OECT-based interleukin-6 (IL6) biosensor able to detect cytokine levels down to the pM regime in PBS buffer. The mechanism of detection relies on the specific binding between an aptamer, used as the sensing unit on the gate electrode, and the IL6 in solution, allowing detection ranging from physiological to pathological levels. In the last two papers, we developed OECT-based biosensors interfaced with the plant world. In the fourth paper, we presented a sensor functionalized with glucose oxidase (GOx) to detect glucose export from chloroplasts; in particular, we demonstrated real-time glucose monitoring with a temporal resolution of 1 minute. In the fifth paper, we developed implantable OECT-based sugar sensors for in vivo, real-time monitoring of sugar transport in poplar trees.
The biosensors presented a multienzyme-functionalized gate for specificity towards glucose and sucrose. Most notably, the OECT sensors did not cause a significant wound response in the plant, demonstrating that OECT-based sensors are attractive tools for studying transport kinetics in plants.
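As context for the detection limits quoted above, here is a hedged sketch of how a limit of detection is conventionally estimated from a linear calibration (the 3-sigma criterion; the papers' exact procedure may differ).

```python
import numpy as np

def limit_of_detection(conc, signal, blank_signals):
    """Estimate LOD from a linear calibration via the 3-sigma criterion.

    conc          -- analyte concentrations used for calibration
    signal        -- corresponding sensor responses (e.g., drain-current change)
    blank_signals -- repeated measurements at zero analyte
    """
    slope, _ = np.polyfit(conc, signal, 1)    # linear fit: signal = a*conc + b
    return 3 * np.std(blank_signals, ddof=1) / abs(slope)
```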
APA, Harvard, Vancouver, ISO, and other styles
5

TODISCO, MARCO. "A LIQUID CRYSTAL WORLD AT THE ORIGIN OF LIFE." Doctoral thesis, Università degli Studi di Milano, 2020. http://hdl.handle.net/2434/708665.

Full text
Abstract:
The conundrum of the origin of Life has always fascinated humankind, given its intimate relation to a fundamental question: "where do we come from?". In order to answer this question, scholars of the field have for centuries studied the conditions present in our Universe and on our planet at the age when Life emerged, and tested whether Life could arise in such a complex scenario. A general pathway has been defined to recapitulate the origin of Life from simple inorganic chemicals up to complex systems, and it inevitably passes through the so-called "RNA World": a stage on our Earth when Life consisted of long linear polymers of RNA that both carried genetic information and catalyzed their very own replication while supporting a primitive metabolism. Still, to this day we lack a good explanation of how long biopolymers could have formed on the ancient Earth, given that RNA phosphodiester bonds are intrinsically unstable and that the most common models of non-enzymatic polymerization result in very short products. The necessity to define a complete pathway for the origin of Life has pushed us to develop a new model to drive the formation of a ribozyme, based on the self-assembly of nucleic acid molecules. Exploiting a wide set of tools belonging to the fields of molecular biology, chemistry, and soft matter physics, in this work we show that the self-assembly and supramolecular ordering of nucleic acids is a phenomenon with deep roots in their very physicochemical properties, leading to the emergence of liquid crystal ordering in mixtures of nucleosides with different chemical modifications and polymerization states. The formation of long physical polymers with discontinuities, or completely lacking a backbone, while still structured following Watson-Crick selectivity, raises questions about the origin of the famous double helix, which now seems to be rooted in the features of single-nucleoside interactions. In this thesis the effect of supramolecular organization and symmetry breaking on short RNA oligomers and chemically activated nucleosides is described and characterized, showing that their chemical reactivity is altered in favor of the formation of long linear polymers, and highlighting a new potential pathway for the origin of Life.
APA, Harvard, Vancouver, ISO, and other styles
6

Meneghello, Anna. "Surface plasmon resonance based platforms for clinical and environmental biosensing applications." Doctoral thesis, Università degli studi di Padova, 2016. http://hdl.handle.net/11577/3424512.

Full text
Abstract:
My PhD thesis work, carried out in the Veneto Nanotech laboratories (Nanofab in Marghera, LaNN in Padova and ECSIN in Rovigo), was aimed at exploiting the Surface Plasmon Resonance (SPR) phenomenon to set up biosensing platforms for clinical and environmental applications. In particular, two types of SPR-based platforms were set up and optimised: the first was an oligonucleotide-based platform for the detection of mutations causing Cystic Fibrosis (CF), while the second was an antibody-based platform for the detection of whole Legionella pneumophila cells. Both sensors are based on the same detection strategy, exploiting the advantages of a highly sensitive Grating Coupled Surface Plasmon Resonance (GC-SPR) enhanced spectroscopy method, designed with a conical illumination configuration for label-free molecular detection. Concerning the DNA platform for Cystic Fibrosis, a strategy for the detection of some of the most frequent mutations responsible for CF in the Italian population was investigated. For the detection of the CF mutations, gold sinusoidal gratings are used as sensing surfaces, and specific biodetection is achieved through allele-specific oligonucleotide (ASO) DNA hairpin probes, designed for single-nucleotide discrimination. The substrates were used to test unlabeled PCR-amplified homozygous wild-type (wt) and heterozygous (wt/mut) samples, derived from clinical samples, for the screened mutations. Hybridisation conditions were optimised to obtain the maximum discrimination ratio (DR) between the homozygous wild-type and the heterozygous samples. SPR signals obtained by hybridising wild-type and heterozygous samples showed DRs able to identify the correct genotypes unambiguously, as confirmed by fluorescence microarray experiments run in parallel. Furthermore, SPR genotyping was not impaired in samples containing unrelated DNA, allowing the platform to be used for the parallel discrimination of several alleles, and it is also scalable to a high-throughput screening setting. Concerning the antibody platform for the detection of Legionella pneumophila bacteria, a strategy exploiting the SPR phenomenon to develop a fully automated platform for fast optical detection of Legionella pneumophila pathogens was investigated. The legal limit for L. pneumophila in a high-risk hospital environment in Italy is 10² CFU/L, and the gold standard for its identification is a time-consuming microbiological culture method that requires up to 7 days. Starting from these considerations, a sensitive GC-SPR system was applied to the detection of L. pneumophila to test the detection limit of the developed sensing device in terms of detectable bacterial CFU. The detection was accurately set up and precisely optimised with a standard strain of the bacterium, first using functionalised flat gold slides and then translating the assay to sinusoidal gold gratings for label-free GC-SPR detection with an ellipsometer, in order to ensure reproducible and precise identification of the bacteria. Through azimuthally-controlled GC-SPR, 10 CFU were detected, whereas fluorescence analysis gave a negative readout when incubating fewer than 10⁴ CFU. Successful results were also obtained when incubating environmentally derived samples. This detection platform could be implemented as a prototype in which water and air samples are sequentially concentrated, injected into a microfluidic system, and delivered to the SPR sensor for analysis; these are goals of the ongoing POSEIDON (Horizon 2020) project.
The peculiar Grating Coupled Surface Plasmon Resonance method applied in this work has therefore proved to be an accurate and highly sensitive strategy, with multiplexing possibilities, for sensing and detecting different kinds of biomolecules, from DNA fragments to whole bacterial cells.
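To make the discrimination-ratio (DR) genotyping concrete, here is a hypothetical sketch (not from the thesis): each sample is hybridised to a wild-type-specific and a mutant-specific hairpin probe, and the ratio of the two SPR responses is compared against a calibrated cut-off.

```python
import numpy as np

def genotype_call(signal_wt_probe, signal_mut_probe, dr_cutoff=2.0):
    """Toy genotype call from SPR responses on an ASO hairpin probe pair.

    signal_wt_probe / signal_mut_probe -- replicate responses of the sample
    on the wild-type-specific and mutant-specific probes.
    dr_cutoff is a hypothetical threshold; a real assay would calibrate it
    against reference samples of known genotype.
    """
    dr = np.mean(signal_wt_probe) / np.mean(signal_mut_probe)
    if dr >= dr_cutoff:
        return "homozygous wild type", dr
    if dr <= 1.0 / dr_cutoff:
        return "homozygous mutant", dr
    return "heterozygous", dr
```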
APA, Harvard, Vancouver, ISO, and other styles
7

Verità, Simona <1977>. "Radioattività naturale nei prodotti ceramici: valutazioni teoriche e sperimentali degli effetti ambientali e sanitari." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1439/1/Simona_Verit%C3%A0_tesi.pdf.

Full text
Abstract:
Zirconium-bearing materials, in particular zirconium silicates and zirconium oxide, are widely used in a variety of industrial applications; among these, the ceramic tile and refractory industries consume substantial quantities. Gamma-ray spectrometry measurements carried out on several samples of zirconium-bearing material (zircon flours and zircon sands) used in the ceramic industry for the production of porcelain stoneware and glazed tiles revealed activity concentration values higher than the average values found in the Earth's crust (35 and 30 Bq kg⁻¹ for ²³⁸U and ²³²Th, respectively [UNSCEAR, 2000]). The addition of zirconium-bearing material in the preparation of ceramic tiles, in particular glazed tiles (10-20% by weight), porcelain stoneware (1-10% by weight) and "thin" porcelain stoneware slabs (around 30% by weight), gives the final product a high degree of whiteness, good mechanical properties and a strong opacifying effect, but it also entails an enrichment in natural radionuclides. The main objective of this thesis work was to develop a methodology (theoretical and experimental) to assess the additional exposure that may be received by a standard worker in the ceramic industry and by a member of the public who spends time in a room tiled with these products. On the one hand, the radioactivity in the zircon sands used as raw materials for tile production raises the problem of the additional exposure of a worker employed where large quantities of material are stored and handled; on the other hand, the radioactivity content of the finished product raises the question of the exposure of a member of the public staying in a room lined with ceramic material. The numerous measurements performed, together with the development of the models needed to assess the exposure doses of members of the public and of workers employed in the ceramic tile production process, made it possible to establish a procedure that provides the guarantees needed to declare working conditions in the ceramic industry acceptable and, more generally, to verify compliance with radiation protection standards for the occupants of rooms tiled with Italian tiles.
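For orientation, one standard screening quantity for gamma exposure from building materials is the activity concentration index of EU Radiation Protection 112, sketched below. The thesis develops its own, more detailed exposure models, so this is context rather than the thesis's method.

```python
def activity_concentration_index(c_ra226, c_th232, c_k40):
    """EU RP-112 screening index for gamma dose from building materials.

    Concentrations in Bq/kg. An index I <= 1 roughly corresponds to an
    annual gamma dose increment of 1 mSv for bulk use of the material.
    """
    return c_ra226 / 300.0 + c_th232 / 200.0 + c_k40 / 3000.0
```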
APA, Harvard, Vancouver, ISO, and other styles
8

Verità, Simona <1977>. "Radioattività naturale nei prodotti ceramici: valutazioni teoriche e sperimentali degli effetti ambientali e sanitari." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1439/.

Full text
Abstract:
Zirconium-bearing materials, in particular zirconium silicates and zirconium oxide, are widely used in a variety of industrial applications; among these, the ceramic tile and refractory industries consume substantial quantities. Gamma-ray spectrometry measurements carried out on several samples of zirconium-bearing material (zircon flours and zircon sands) used in the ceramic industry for the production of porcelain stoneware and glazed tiles revealed activity concentration values higher than the average values found in the Earth's crust (35 and 30 Bq kg⁻¹ for ²³⁸U and ²³²Th, respectively [UNSCEAR, 2000]). The addition of zirconium-bearing material in the preparation of ceramic tiles, in particular glazed tiles (10-20% by weight), porcelain stoneware (1-10% by weight) and "thin" porcelain stoneware slabs (around 30% by weight), gives the final product a high degree of whiteness, good mechanical properties and a strong opacifying effect, but it also entails an enrichment in natural radionuclides. The main objective of this thesis work was to develop a methodology (theoretical and experimental) to assess the additional exposure that may be received by a standard worker in the ceramic industry and by a member of the public who spends time in a room tiled with these products. On the one hand, the radioactivity in the zircon sands used as raw materials for tile production raises the problem of the additional exposure of a worker employed where large quantities of material are stored and handled; on the other hand, the radioactivity content of the finished product raises the question of the exposure of a member of the public staying in a room lined with ceramic material. The numerous measurements performed, together with the development of the models needed to assess the exposure doses of members of the public and of workers employed in the ceramic tile production process, made it possible to establish a procedure that provides the guarantees needed to declare working conditions in the ceramic industry acceptable and, more generally, to verify compliance with radiation protection standards for the occupants of rooms tiled with Italian tiles.
APA, Harvard, Vancouver, ISO, and other styles
9

Marazza, Diego <1970>. "Applicazione di sistemi di gestione ambientale alla scala locale." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2007. http://amsdottorato.unibo.it/293/1/marazza_XIX_3.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Marazza, Diego <1970>. "Applicazione di sistemi di gestione ambientale alla scala locale." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2007. http://amsdottorato.unibo.it/293/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Pancotti, Lorenzo <1977>. "Optical concentrators for photovoltaic use." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2007. http://amsdottorato.unibo.it/349/1/tesi_Pancotti.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Pancotti, Lorenzo <1977>. "Optical concentrators for photovoltaic use." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2007. http://amsdottorato.unibo.it/349/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Mesirca, Pietro <1972>. "Electrical activity in neurons exposed to low level electromagnetic fields: theory and experiments." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2007. http://amsdottorato.unibo.it/505/1/TesiPietroMesirca.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Mesirca, Pietro <1972>. "Electrical activity in neurons exposed to low level electromagnetic fields: theory and experiments." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2007. http://amsdottorato.unibo.it/505/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Masetti, Simone <1970>. "Sviluppo di un tomografo multi-energy per lo studio pre-clinico di nuove metodiche diagnostiche finalizzate al riconoscimento precoce della patologia tumorale." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/841/1/Tesi_Masetti_Simone.pdf.

Full text
Abstract:
A new multi-energy CT scanner for small animals is being developed at the Physics Department of the University of Bologna, Italy. The system makes use of a set of quasi-monochromatic X-ray beams, with energy tunable in a range from 26 keV to 72 keV. These beams are produced by Bragg diffraction on a Highly Oriented Pyrolytic Graphite crystal. With quasi-monochromatic sources it is possible to perform multi-energy investigations more effectively than with conventional X-ray tubes. Multi-energy techniques allow physical information to be extracted from the materials, such as effective atomic number, mass-thickness, and density, which can be used to distinguish and quantitatively characterize the irradiated tissues. The aim of the system is the investigation and development of new preclinical methods for the early detection of tumors in small animals. An innovative technique, Triple-Energy Radiography with Contrast Medium (TER), has been successfully implemented on our system. TER consists in combining a set of three quasi-monochromatic images of an object in order to obtain a corresponding set of three single-tissue images, which are the mass-thickness maps of three reference materials. TER can be applied to the quantitative reconstruction of the mass-thickness map of a contrast medium, because it is able to completely remove the signal due to other tissues (i.e., the structural background noise). The technique is very sensitive to the contrast medium and is insensitive to the superposition of different materials. The method is a good candidate for the early detection of tumor angiogenesis in mice. In this work we describe the tomographic system, with a particular focus on the quasi-monochromatic source. Moreover, the TER method is presented together with some preliminary results on small-animal imaging.
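The algebra behind TER can be sketched as follows (an illustrative reconstruction under the assumption of known mass attenuation coefficients, not the thesis's code): at each energy E_j, the per-pixel log-attenuation is a linear combination of the mass-thicknesses t_i of the three reference materials, -ln(I_j/I0_j) = sum_i mu_ji t_i, so three energies give a 3x3 linear system per pixel.

```python
import numpy as np

def ter_decomposition(intensities, i0, mu):
    """Per-pixel triple-energy decomposition into three mass-thickness maps.

    intensities -- shape (3, H, W): images acquired at the three energies
    i0          -- shape (3,): unattenuated (flat-field) intensities
    mu          -- 3x3 matrix, mu[j, i] = mass attenuation coefficient of
                   reference material i at energy j (assumed known)
    """
    logs = -np.log(intensities / i0[:, None, None])   # (3, H, W) log-attenuations
    h, w = logs.shape[1:]
    t = np.linalg.solve(mu, logs.reshape(3, -1))      # solve all pixels at once
    return t.reshape(3, h, w)                         # three mass-thickness maps
```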
APA, Harvard, Vancouver, ISO, and other styles
16

Masetti, Simone <1970>. "Sviluppo di un tomografo multi-energy per lo studio pre-clinico di nuove metodiche diagnostiche finalizzate al riconoscimento precoce della patologia tumorale." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/841/.

Full text
Abstract:
A new multi-energy CT scanner for small animals is being developed at the Physics Department of the University of Bologna, Italy. The system makes use of a set of quasi-monochromatic X-ray beams, with energy tunable in a range from 26 keV to 72 keV. These beams are produced by Bragg diffraction on a Highly Oriented Pyrolytic Graphite crystal. With quasi-monochromatic sources it is possible to perform multi-energy investigations more effectively than with conventional X-ray tubes. Multi-energy techniques allow physical information to be extracted from the materials, such as effective atomic number, mass-thickness, and density, which can be used to distinguish and quantitatively characterize the irradiated tissues. The aim of the system is the investigation and development of new preclinical methods for the early detection of tumors in small animals. An innovative technique, Triple-Energy Radiography with Contrast Medium (TER), has been successfully implemented on our system. TER consists in combining a set of three quasi-monochromatic images of an object in order to obtain a corresponding set of three single-tissue images, which are the mass-thickness maps of three reference materials. TER can be applied to the quantitative reconstruction of the mass-thickness map of a contrast medium, because it is able to completely remove the signal due to other tissues (i.e., the structural background noise). The technique is very sensitive to the contrast medium and is insensitive to the superposition of different materials. The method is a good candidate for the early detection of tumor angiogenesis in mice. In this work we describe the tomographic system, with a particular focus on the quasi-monochromatic source. Moreover, the TER method is presented together with some preliminary results on small-animal imaging.
APA, Harvard, Vancouver, ISO, and other styles
17

Marconi, Daniela <1979>. "New approaches to open problems in gene expression microarray data." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/842/1/Tesi_Marconi_Daniela.pdf.

Full text
Abstract:
In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization from a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes strongly correlated with the grouping of individuals. Even though methods to analyse the data are today well developed and close to reaching a standard organization (through the effort of international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to stumble upon a clinician's question for which no compelling statistical method is available. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. Chapter 1, starting from a necessary biological introduction, reviews microarray technologies and all the important steps of an experiment, from the production of the array through quality controls to the preprocessing steps used in the data analysis in the rest of the dissertation. Chapter 2 provides a critical review of standard analysis methods, stressing their main open problems. Chapter 3 introduces a method to address the issue of unbalanced design in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists in a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each single probe is given a "score" ranging from 0 to 1,000 based on its recurrence in the 1,000 lists as differentially expressed. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two simulated data sets generated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding chronic lymphocytic leukemia, LIMMA finds no significant probe, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score >300 and separates the data into two clusters by hierarchical clustering.
We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. Chapter 4 describes a method for evaluating similarities in a three-class problem by means of the Relevance Vector Machine [4]. In fact, looking at microarray data in a prognostic and diagnostic clinical framework, differences are not the only things that can play a crucial role. In some cases similarities can give useful and, sometimes, even more important information. Given three classes, the goal could be to establish, with a certain level of confidence, whether the third one is more similar to the first or to the second one. In this work we show that the Relevance Vector Machine (RVM) [2] could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option in any practical pattern recognition system. We focused on the three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3), and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3 and then evaluate the third class, G2, as a test set to obtain the probability for each G2 sample of belonging to class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. This result had been conjectured in the literature, but no measure of significance had been given before.
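The MultiSAM resampling scheme described above can be sketched as follows (a hedged illustration: a two-sample t-test stands in for the SAM statistic, and the names and thresholds are hypothetical):

```python
import numpy as np
from scipy import stats

def multisam_scores(expr_lpc, expr_mpc, n_iter=1000, alpha=0.05, seed=0):
    """Per-probe recurrence score, in the spirit of MultiSAM.

    expr_lpc -- probes x samples matrix, less populated class (LPC)
    expr_mpc -- probes x samples matrix, more populated class (MPC)
    Returns, for each probe, in how many of n_iter size-balanced
    comparisons it was flagged as differentially expressed (0..n_iter).
    """
    rng = np.random.default_rng(seed)
    n_small = expr_lpc.shape[1]
    scores = np.zeros(expr_lpc.shape[0], dtype=int)
    for _ in range(n_iter):
        # draw a random subsample of the MPC matching the LPC size
        cols = rng.choice(expr_mpc.shape[1], size=n_small, replace=False)
        _, p = stats.ttest_ind(expr_lpc, expr_mpc[:, cols], axis=1)
        scores += (p < alpha).astype(int)
    return scores
```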
APA, Harvard, Vancouver, ISO, and other styles
18

Marconi, Daniela <1979>. "New approaches to open problems in gene expression microarray data." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/842/.

Full text
Abstract:
In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization from a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes strongly correlated with the grouping of individuals. Even though methods to analyse the data are today well developed and close to reaching a standard organization (through the effort of international projects such as the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to stumble upon a clinician's question for which no compelling statistical method is available. The contribution of this dissertation to deciphering disease is the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. Chapter 1, starting from a necessary biological introduction, reviews microarray technologies and all the important steps of an experiment, from the production of the array through quality controls to the preprocessing steps used in the data analysis in the rest of the dissertation. Chapter 2 provides a critical review of standard analysis methods, stressing their main open problems. Chapter 3 introduces a method to address the issue of unbalanced design in microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists in a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each single probe is given a "score" ranging from 0 to 1,000 based on its recurrence in the 1,000 lists as differentially expressed. The performance of MultiSAM was compared to that of SAM and LIMMA [3] over two simulated data sets generated via beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding chronic lymphocytic leukemia, LIMMA finds no significant probe, SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score >300 and separates the data into two clusters by hierarchical clustering.
We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. Chapter 4 describes a method for evaluating similarities in a three-class problem by means of the Relevance Vector Machine [4]. In fact, looking at microarray data in a prognostic and diagnostic clinical framework, differences are not the only things that can play a crucial role. In some cases similarities can give useful and, sometimes, even more important information. Given three classes, the goal could be to establish, with a certain level of confidence, whether the third one is more similar to the first or to the second one. In this work we show that the Relevance Vector Machine (RVM) [2] could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor, the Support Vector Machine (SVM) [3]. Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option in any practical pattern recognition system. We focused on the three-class tumor-grade problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3), and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3 and then evaluate the third class, G2, as a test set to obtain the probability for each G2 sample of belonging to class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. This result had been conjectured in the literature, but no measure of significance had been given before.
APA, Harvard, Vancouver, ISO, and other styles
19

Montanucci, Ludovica <1978>. "Computational methods for genome screening." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/847/1/Tesi_Montanucci_Ludovica.pdf.

Full text
Abstract:
Motivation: A topical issue of great interest, both from a theoretical and from an applicative perspective, is the analysis of biological sequences for disclosing the information they encode. The development of new technologies for genome sequencing in recent years has opened new fundamental problems, since huge amounts of biological data still await interpretation. Indeed, sequencing is only the first step of the genome annotation process, which consists in the assignment of biological information to each sequence. Hence, given the large amount of available data, in silico methods have become useful and necessary for extracting relevant information from sequences. The availability of data from genome projects gave rise to new strategies for tackling the basic problems of computational biology, such as the determination of the three-dimensional structures of proteins, their biological function, and their reciprocal interactions. Results: The aim of this work has been the implementation of predictive methods that allow information on the properties of genomes and proteins to be extracted from the nucleotide and amino acid sequences, taking advantage of the information provided by the comparison of genome sequences from different species. In the first part of the work, a comprehensive large-scale genome comparison of 599 organisms is described. 2.6 million sequences coming from 551 prokaryotic and 48 eukaryotic genomes were aligned and clustered on the basis of their sequence identity. This procedure led to the identification of classes of proteins that are peculiar to the different groups of organisms. Moreover, the adopted similarity threshold produced clusters that are homogeneous from the structural point of view and that can be used for the structural annotation of uncharacterized sequences. The second part of the work focuses on the characterization of thermostable proteins and on the development of tools able to predict the thermostability of a protein from its sequence. By means of Principal Component Analysis, the codon composition of a non-redundant database comprising 116 prokaryotic genomes was analyzed, and it was shown that a cross-genomic approach allows the extraction of common determinants of thermostability at the genome level, leading to an overall accuracy of 95% in discriminating thermophilic coding sequences. This result outperforms those obtained in previous studies. Moreover, we investigated the effect of multiple mutations on protein thermostability. This issue is of great importance in the field of protein engineering, since thermostable proteins are generally more suitable than their mesostable counterparts for technological applications. A Support Vector Machine-based method was trained to predict whether a set of mutations can enhance the thermostability of a given protein sequence. The developed predictor achieves 88% accuracy.
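As a small illustration of the feature extraction underlying the thermostability analysis (a sketch, not the thesis's code): the codon composition of a coding sequence is a 64-dimensional vector of relative codon frequencies, which can then be fed to PCA or an SVM.

```python
from collections import Counter
from itertools import product

CODONS = ["".join(c) for c in product("ACGT", repeat=3)]  # all 64 codons

def codon_composition(cds):
    """Relative codon frequencies of a coding DNA sequence (length-64 vector)."""
    triplets = [cds[i:i + 3] for i in range(0, len(cds) - len(cds) % 3, 3)]
    counts = Counter(triplets)
    total = sum(counts[c] for c in CODONS) or 1   # ignores non-ACGT triplets
    return [counts[c] / total for c in CODONS]
```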
APA, Harvard, Vancouver, ISO, and other styles
20

Montanucci, Ludovica <1978>. "Computational methods for genome screening." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/847/.

Full text
Abstract:
Motivation: A topical issue of great interest, both from a theoretical and from an applicative perspective, is the analysis of biological sequences for disclosing the information they encode. The development of new technologies for genome sequencing in recent years has opened new fundamental problems, since huge amounts of biological data still await interpretation. Indeed, sequencing is only the first step of the genome annotation process, which consists in the assignment of biological information to each sequence. Hence, given the large amount of available data, in silico methods have become useful and necessary for extracting relevant information from sequences. The availability of data from genome projects gave rise to new strategies for tackling the basic problems of computational biology, such as the determination of the three-dimensional structures of proteins, their biological function, and their reciprocal interactions. Results: The aim of this work has been the implementation of predictive methods that allow information on the properties of genomes and proteins to be extracted from the nucleotide and amino acid sequences, taking advantage of the information provided by the comparison of genome sequences from different species. In the first part of the work, a comprehensive large-scale genome comparison of 599 organisms is described. 2.6 million sequences coming from 551 prokaryotic and 48 eukaryotic genomes were aligned and clustered on the basis of their sequence identity. This procedure led to the identification of classes of proteins that are peculiar to the different groups of organisms. Moreover, the adopted similarity threshold produced clusters that are homogeneous from the structural point of view and that can be used for the structural annotation of uncharacterized sequences. The second part of the work focuses on the characterization of thermostable proteins and on the development of tools able to predict the thermostability of a protein from its sequence. By means of Principal Component Analysis, the codon composition of a non-redundant database comprising 116 prokaryotic genomes was analyzed, and it was shown that a cross-genomic approach allows the extraction of common determinants of thermostability at the genome level, leading to an overall accuracy of 95% in discriminating thermophilic coding sequences. This result outperforms those obtained in previous studies. Moreover, we investigated the effect of multiple mutations on protein thermostability. This issue is of great importance in the field of protein engineering, since thermostable proteins are generally more suitable than their mesostable counterparts for technological applications. A Support Vector Machine-based method was trained to predict whether a set of mutations can enhance the thermostability of a given protein sequence. The developed predictor achieves 88% accuracy.
APA, Harvard, Vancouver, ISO, and other styles
21

Miceli, Alice <1978>. "An experimental and theoretical approach to correct for the scattered radiation in an X-ray computer tomography system for industrial applications." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/848/1/Tesi_Miceli_Alice.pdf.

Full text
Abstract:
The main problem of cone-beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the large amount of scattered radiation added to the primary radiation (the signal). This stray radiation leads to a significant degradation of image quality. A better understanding of the scattering, and methods to reduce its effects, are therefore necessary to improve image quality. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). In order to investigate the scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, which has been validated against experimental data, has enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the hardware components of the CT system. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The results for the environmental scatter showed that it is the major component of the scattering for aluminum box objects with a front size of 70 × 70 mm², and that it strongly depends on the thickness of the object and therefore on the projection. For that reason, its correction is one of the key factors for achieving high-quality images. The anti-scatter grid optimized by means of the developed MC model was found to reduce the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated by means of the simulation were used to improve the scatter correction algorithm, which Empa was able to patent. The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool to optimize the design of the CT system and to evaluate the contribution of the scattered radiation to the image. Besides, it has offered the basis for a new scatter correction approach with which it has been possible to achieve images with the same spatial resolution as state-of-the-art, well-collimated fan-beam CT, with a tenfold gain in reconstruction time. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.
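The correction principle the abstract relies on can be sketched in a few lines: if a Monte Carlo estimate of the scatter field is available, it is subtracted from the measured intensity before the log-transform that yields the line integrals. The toy example below (all numbers invented) compares naive and scatter-corrected line integrals.

```python
import numpy as np

rng = np.random.default_rng(0)
I0 = 1.0e5                        # unattenuated beam intensity (a.u.)
mu_t = np.linspace(0.2, 2.0, 64)  # true line integrals through the object
primary = I0 * np.exp(-mu_t)      # Beer-Lambert primary signal
scatter = 0.4 * primary.mean() * np.ones_like(primary)  # MC-estimated scatter
measured = primary + scatter + rng.normal(0.0, 50.0, primary.shape)

p_naive = -np.log(measured / I0)             # scatter left in: cupping-prone
p_corr = -np.log((measured - scatter) / I0)  # scatter subtracted first
print("max error, naive:    ", np.abs(p_naive - mu_t).max())
print("max error, corrected:", np.abs(p_corr - mu_t).max())
```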
APA, Harvard, Vancouver, ISO, and other styles
22

Miceli, Alice <1978>. "An experimental and theoretical approach to correct for the scattered radiation in an X-ray computer tomography system for industrial applications." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2008. http://amsdottorato.unibo.it/848/.

Full text
Abstract:
The main problem of cone-beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the large amount of scattered radiation added to the primary radiation (the signal). This stray radiation leads to a significant degradation of image quality. A better understanding of the scattering, and methods to reduce its effects, are therefore necessary to improve image quality. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). In order to investigate the scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, which has been validated against experimental data, has enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the hardware components of the CT system. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The results for the environmental scatter showed that it is the major component of the scattering for aluminum box objects with a front size of 70 × 70 mm², and that it strongly depends on the thickness of the object and therefore on the projection. For that reason, its correction is one of the key factors for achieving high-quality images. The anti-scatter grid optimized by means of the developed MC model was found to reduce the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated by means of the simulation were used to improve the scatter correction algorithm, which Empa was able to patent. The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool to optimize the design of the CT system and to evaluate the contribution of the scattered radiation to the image. Besides, it has offered the basis for a new scatter correction approach with which it has been possible to achieve images with the same spatial resolution as state-of-the-art, well-collimated fan-beam CT, with a tenfold gain in reconstruction time. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.
APA, Harvard, Vancouver, ISO, and other styles
23

Bartoli, Lisa <1980>. "Computational methods for the analysis of protein structure and function." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1225/1/Bartoli_Lisa_tesi.pdf.

Full text
Abstract:
The vast majority of known proteins have not yet been experimentally characterized, and little is known about their function. The design and implementation of computational tools can provide insight into the function of proteins based on their sequence, their structure, their evolutionary history and their association with other proteins. Knowledge of the three-dimensional (3D) structure of a protein can lead to a deep understanding of its mode of action and interaction, but currently the structures of <1% of sequences have been experimentally solved. For this reason, it has become urgent to develop new methods able to computationally extract relevant information from protein sequence and structure. The starting point of my work has been the study of the properties of contacts between protein residues, since they constrain protein folding and characterize different protein structures. The prediction of residue contacts in proteins is an interesting problem whose solution may be useful in protein fold recognition and de novo design. Predicting these contacts requires the study of the protein inter-residue distances, related to the specific type of amino acid pair, that are encoded in the so-called contact map. An interesting new way of analyzing these structures emerged when network studies were introduced, with pivotal papers demonstrating that protein contact networks also exhibit small-world behavior. In order to highlight constraints for the prediction of protein contact maps, and for applications in the field of protein structure prediction and/or reconstruction from experimentally determined contact maps, I studied to what extent the characteristic path length and clustering coefficient of the protein contact network reveal characteristic features of protein contact maps. Provided that residue contacts are known for a protein sequence, the major features of its 3D structure can be deduced by combining this knowledge with correctly predicted secondary structure motifs. In the second part of my work I focused on a particular protein structural motif, the coiled-coil, known to mediate a variety of fundamental biological interactions. Coiled-coils are found in a variety of structural forms and in a wide range of proteins including, for example, small units such as the leucine zippers that drive the dimerization of many transcription factors, or more complex structures such as the family of viral proteins responsible for virus-host membrane fusion. The coiled-coil structural motif is estimated to account for 5-10% of the protein sequences in the various genomes. Given their biological importance, in my work I introduced a Hidden Markov Model (HMM) that exploits the evolutionary information derived from multiple sequence alignments to predict coiled-coil regions and to discriminate coiled-coil sequences. The results indicate that the new HMM outperforms all existing programs and can be adopted for coiled-coil prediction and for large-scale genome annotation. Genome annotation is a key issue in modern computational biology, being the starting point towards the understanding of the complex processes involved in biological networks. The rapid growth in the number of available protein sequences and structures poses new fundamental problems that still await interpretation. Nevertheless, these data are at the basis of the design of new strategies for tackling problems such as the prediction of protein structure and function.
The experimental determination of the functions of all these proteins would be a hugely time-consuming and costly task and, in most instances, has not been carried out. As an example, currently only about 20% of the annotated proteins in the Homo sapiens genome have been experimentally characterized. A commonly adopted procedure for annotating protein sequences relies on "inheritance through homology", based on the notion that similar sequences share similar functions and structures. This procedure consists in assigning sequences to a specific group of functionally related sequences, previously grouped through clustering techniques. The clustering procedure is based on suitable similarity rules, since predicting protein structure and function from sequence largely depends on the value of sequence identity. However, additional levels of complexity are due to multi-domain proteins, to proteins that share common domains but do not necessarily share the same function, and to the finding that different combinations of shared domains can lead to different biological roles. In the last part of this study I developed and validated a system that contributes to sequence annotation by taking advantage of a validated transfer-through-inheritance procedure for molecular functions and structural templates. After a cross-genome comparison with the BLAST program, clusters were built on the basis of two stringent constraints on sequence identity and coverage of the alignment. The adopted measure explicitly addresses the problem of multi-domain protein annotation and allows a fine-grained division of the whole set of proteomes used, ensuring cluster homogeneity in terms of sequence length. A high coverage of structure templates over the length of the protein sequences within clusters ensures that multi-domain proteins, when present, can be templates for sequences of similar length. This annotation procedure includes the possibility of reliably transferring statistically validated functions and structures to sequences, considering the information available in the current databases of molecular functions and structures.
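The small-world analysis of contact networks mentioned above reduces, in essence, to thresholding an inter-residue distance matrix and computing graph statistics. A minimal sketch, assuming networkx and random stand-in coordinates rather than real C-alpha positions:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
coords = rng.uniform(0.0, 30.0, size=(80, 3))  # stand-in C-alpha coordinates

# Residues are "in contact" if closer than a cutoff (8 Angstrom is typical).
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
contact_map = (dist < 8.0) & ~np.eye(len(coords), dtype=bool)

G = nx.from_numpy_array(contact_map.astype(int))
giant = G.subgraph(max(nx.connected_components(G), key=len))
print("clustering coefficient:    ", nx.average_clustering(giant))
print("characteristic path length:", nx.average_shortest_path_length(giant))
```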
APA, Harvard, Vancouver, ISO, and other styles
24

Bartoli, Lisa <1980>. "Computational methods for the analysis of protein structure and function." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1225/.

Full text
Abstract:
The vast majority of known proteins have not yet been experimentally characterized, and little is known about their function. The design and implementation of computational tools can provide insight into the function of proteins based on their sequence, their structure, their evolutionary history and their association with other proteins. Knowledge of the three-dimensional (3D) structure of a protein can lead to a deep understanding of its mode of action and interaction, but currently the structures of <1% of sequences have been experimentally solved. For this reason, it has become urgent to develop new methods able to computationally extract relevant information from protein sequence and structure. The starting point of my work has been the study of the properties of contacts between protein residues, since they constrain protein folding and characterize different protein structures. The prediction of residue contacts in proteins is an interesting problem whose solution may be useful in protein fold recognition and de novo design. Predicting these contacts requires the study of the protein inter-residue distances, related to the specific type of amino acid pair, that are encoded in the so-called contact map. An interesting new way of analyzing these structures emerged when network studies were introduced, with pivotal papers demonstrating that protein contact networks also exhibit small-world behavior. In order to highlight constraints for the prediction of protein contact maps, and for applications in the field of protein structure prediction and/or reconstruction from experimentally determined contact maps, I studied to what extent the characteristic path length and clustering coefficient of the protein contact network reveal characteristic features of protein contact maps. Provided that residue contacts are known for a protein sequence, the major features of its 3D structure can be deduced by combining this knowledge with correctly predicted secondary structure motifs. In the second part of my work I focused on a particular protein structural motif, the coiled-coil, known to mediate a variety of fundamental biological interactions. Coiled-coils are found in a variety of structural forms and in a wide range of proteins including, for example, small units such as the leucine zippers that drive the dimerization of many transcription factors, or more complex structures such as the family of viral proteins responsible for virus-host membrane fusion. The coiled-coil structural motif is estimated to account for 5-10% of the protein sequences in the various genomes. Given their biological importance, in my work I introduced a Hidden Markov Model (HMM) that exploits the evolutionary information derived from multiple sequence alignments to predict coiled-coil regions and to discriminate coiled-coil sequences. The results indicate that the new HMM outperforms all existing programs and can be adopted for coiled-coil prediction and for large-scale genome annotation. Genome annotation is a key issue in modern computational biology, being the starting point towards the understanding of the complex processes involved in biological networks. The rapid growth in the number of available protein sequences and structures poses new fundamental problems that still await interpretation. Nevertheless, these data are at the basis of the design of new strategies for tackling problems such as the prediction of protein structure and function.
The experimental determination of the functions of all these proteins would be a hugely time-consuming and costly task and, in most instances, has not been carried out. As an example, currently only about 20% of the annotated proteins in the Homo sapiens genome have been experimentally characterized. A commonly adopted procedure for annotating protein sequences relies on "inheritance through homology", based on the notion that similar sequences share similar functions and structures. This procedure consists in assigning sequences to a specific group of functionally related sequences, previously grouped through clustering techniques. The clustering procedure is based on suitable similarity rules, since predicting protein structure and function from sequence largely depends on the value of sequence identity. However, additional levels of complexity are due to multi-domain proteins, to proteins that share common domains but do not necessarily share the same function, and to the finding that different combinations of shared domains can lead to different biological roles. In the last part of this study I developed and validated a system that contributes to sequence annotation by taking advantage of a validated transfer-through-inheritance procedure for molecular functions and structural templates. After a cross-genome comparison with the BLAST program, clusters were built on the basis of two stringent constraints on sequence identity and coverage of the alignment. The adopted measure explicitly addresses the problem of multi-domain protein annotation and allows a fine-grained division of the whole set of proteomes used, ensuring cluster homogeneity in terms of sequence length. A high coverage of structure templates over the length of the protein sequences within clusters ensures that multi-domain proteins, when present, can be templates for sequences of similar length. This annotation procedure includes the possibility of reliably transferring statistically validated functions and structures to sequences, considering the information available in the current databases of molecular functions and structures.
APA, Harvard, Vancouver, ISO, and other styles
25

Lo Meo, Sergio <1970>. "Monte Carlo Simulations of Novel Scintillator Detectors and Dosimetry Calculations." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1583/1/lo_meo_sergio_tesi.pdf.

Full text
Abstract:
Monte Carlo (MC) simulation techniques are becoming very common in the Medical Physics community. MC can be used for modeling Single Photon Emission Computed Tomography (SPECT) and for dosimetry calculations. 188Re is a promising candidate for radiotherapeutic applications, and understanding the mechanisms of the radioresponse of tumor cells in vitro is of crucial importance as a first step before in vivo studies. The dosimetry of 188Re, used to target different cancer cell lines, has been evaluated with the MC code GEANT4. The simulations estimate the average energy deposition per event in the biological samples. The development of prototypes for medical imaging, based on LaBr3:Ce scintillation crystals coupled with a position-sensitive photomultiplier, has been studied using GEANT4 simulations. Having tested, in the simulation, surface treatments different from the one applied to the crystal used in our experimental measurements, we found that the Energy Resolution (ER) and the Spatial Resolution (SR) could in principle be improved by machining the lateral surfaces of the crystal in a different way. We then studied a system able to acquire both echographic and scintigraphic images, to provide the medical operator with complete anatomical and functional information for tumor diagnosis. The scintigraphic part of the detector is simulated with GEANT4, and first attempts to reconstruct tomographic images have been made using a standard back-projection algorithm as the reconstruction method. The proposed camera is based on slant collimators and LaBr3:Ce crystals. Within the Field of View (FOV) of the camera, it is possible to distinguish point sources located in air at a distance of about 2 cm from each other. Under particular conditions of uptake, tumor depth and dimension, the preliminary results show that the Signal-to-Noise Ratio (SNR) values obtained are higher than the standard detection limit.
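A heavily simplified cartoon of the "average energy deposition per event" estimate, assuming an exponential interaction depth and full local energy deposition (the thesis uses full GEANT4 particle transport, not this toy model; the attenuation coefficient and sample thickness below are invented):

```python
import numpy as np

rng = np.random.default_rng(2)
n_events = 100_000
mu = 0.15        # 1/cm, invented attenuation coefficient of the sample
thickness = 1.0  # cm, invented sample thickness
e_gamma = 0.155  # MeV, the 155 keV gamma line of 188Re

# Each photon interacts at an exponentially distributed depth; if that depth
# falls inside the sample, the full photon energy is scored (toy physics).
depth = rng.exponential(1.0 / mu, n_events)
deposited = np.where(depth < thickness, e_gamma, 0.0)
print(f"average energy deposition per event: {deposited.mean():.4f} MeV")
```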
APA, Harvard, Vancouver, ISO, and other styles
26

Lo Meo, Sergio <1970>. "Monte Carlo Simulations of Novel Scintillator Detectors and Dosimetry Calculations." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1583/.

Full text
Abstract:
Monte Carlo (MC) simulation techniques are becoming very common in the Medical Physics community. MC can be used for modeling Single Photon Emission Computed Tomography (SPECT) and for dosimetry calculations. 188Re is a promising candidate for radiotherapeutic applications, and understanding the mechanisms of the radioresponse of tumor cells in vitro is of crucial importance as a first step before in vivo studies. The dosimetry of 188Re, used to target different cancer cell lines, has been evaluated with the MC code GEANT4. The simulations estimate the average energy deposition per event in the biological samples. The development of prototypes for medical imaging, based on LaBr3:Ce scintillation crystals coupled with a position-sensitive photomultiplier, has been studied using GEANT4 simulations. Having tested, in the simulation, surface treatments different from the one applied to the crystal used in our experimental measurements, we found that the Energy Resolution (ER) and the Spatial Resolution (SR) could in principle be improved by machining the lateral surfaces of the crystal in a different way. We then studied a system able to acquire both echographic and scintigraphic images, to provide the medical operator with complete anatomical and functional information for tumor diagnosis. The scintigraphic part of the detector is simulated with GEANT4, and first attempts to reconstruct tomographic images have been made using a standard back-projection algorithm as the reconstruction method. The proposed camera is based on slant collimators and LaBr3:Ce crystals. Within the Field of View (FOV) of the camera, it is possible to distinguish point sources located in air at a distance of about 2 cm from each other. Under particular conditions of uptake, tumor depth and dimension, the preliminary results show that the Signal-to-Noise Ratio (SNR) values obtained are higher than the standard detection limit.
APA, Harvard, Vancouver, ISO, and other styles
27

Gavoci, Entele <1976>. "Elf magnetic field influence on ION Channels studied by Patch Clamp Technique: exposure set up and "Whole Cell" measurements on Potassium currents." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1904/1/Gavoci_Entele_Tesi.pdf.

Full text
Abstract:
The aim of this thesis was to study the effects of extremely low frequency (ELF) magnetic fields on potassium currents in a neural cell line (neuroblastoma SK-N-BE), using the whole-cell patch-clamp technique. This technique is a sophisticated tool capable of investigating electrophysiological activity at the single-cell, and even single-channel, level. The total potassium ion current through the cell membrane was measured while exposing the cells to a combination of static (DC) and alternating (AC) magnetic fields, according to the prediction of the so-called ‘Ion Resonance Hypothesis’. For this purpose we designed and fabricated a magnetic field exposure system that achieves a good compromise between magnetic field homogeneity and accessibility to the biological sample under the microscope. The exposure system consists of three large orthogonal pairs of square coils surrounding the patch-clamp setup, connected to a signal generation unit able to produce different combinations of static and/or alternating magnetic fields. The system was characterized in terms of field distribution and uniformity through computation and direct field measurements. No statistically significant changes in the potassium ion currents through the cell membrane were revealed when the cells were exposed to the AC/DC magnetic field combination prescribed by the aforementioned ‘Ion Resonance Hypothesis’.
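The AC frequency prescribed by the ion resonance (cyclotron) condition f = qB/(2πm) is easy to compute; the sketch below, with an assumed 45 µT static field, shows why candidate ion resonances fall in the ELF range. It illustrates the hypothesis being tested, not the thesis exposure protocol.

```python
import math

Q_E = 1.602176634e-19    # elementary charge, C
AMU = 1.66053906660e-27  # atomic mass unit, kg

def resonance_frequency_hz(mass_amu: float, charge: int, b_dc_tesla: float) -> float:
    """Ion cyclotron resonance frequency f = q*B / (2*pi*m)."""
    return charge * Q_E * b_dc_tesla / (2 * math.pi * mass_amu * AMU)

# A K+ ion in an assumed 45 uT static field: ~17.7 Hz, squarely in the ELF range.
print(resonance_frequency_hz(39.0983, 1, 45e-6))
```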
APA, Harvard, Vancouver, ISO, and other styles
28

Gavoci, Entele <1976>. "Elf magnetic field influence on ION Channels studied by Patch Clamp Technique: exposure set up and "Whole Cell" measurements on Potassium currents." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1904/.

Full text
Abstract:
The aim of this thesis was to study the effects of extremely low frequency (ELF) magnetic fields on potassium currents in a neural cell line (neuroblastoma SK-N-BE), using the whole-cell patch-clamp technique. This technique is a sophisticated tool capable of investigating electrophysiological activity at the single-cell, and even single-channel, level. The total potassium ion current through the cell membrane was measured while exposing the cells to a combination of static (DC) and alternating (AC) magnetic fields, according to the prediction of the so-called ‘Ion Resonance Hypothesis’. For this purpose we designed and fabricated a magnetic field exposure system that achieves a good compromise between magnetic field homogeneity and accessibility to the biological sample under the microscope. The exposure system consists of three large orthogonal pairs of square coils surrounding the patch-clamp setup, connected to a signal generation unit able to produce different combinations of static and/or alternating magnetic fields. The system was characterized in terms of field distribution and uniformity through computation and direct field measurements. No statistically significant changes in the potassium ion currents through the cell membrane were revealed when the cells were exposed to the AC/DC magnetic field combination prescribed by the aforementioned ‘Ion Resonance Hypothesis’.
APA, Harvard, Vancouver, ISO, and other styles
29

Gonella, Marco <1965>. "Un piano territoriale per la riduzione della concentrazione di CO2 nell'atmosfera e la produzione energetica da fonti rinnovabili per mezzo della filiera delle biomasse." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2010. http://amsdottorato.unibo.it/2438/1/Marco_Gonella_tesi.pdf.

Full text
Abstract:
The project proposes a study to verify the feasibility of a territorial plan (conceived for the Po basin but in fact extendable to all river basins) for the creation of a chain of bioenergy crops (biomass) which, transported by river navigation (one of the transport modes with the lowest CO2 emissions), feeds one or more new-technology power plants combining the production of heat (district heating and cooling) and energy with the separation of the flue gases. The CO2 captured by the growing biomass and recovered from combustion can then be sequestered in the subsoil of subsiding coastal areas, counteracting the phenomenon of ground lowering. Since benefits are obtained at every step of the implementation of the territorial plan (launch of bioenergy agriculture, revival of free-flowing river navigation, start-up of an economy linked to the logistics of biomass transport and storage, generation of clean energy, fight against subsidence), the project in fact makes it possible to capture large amounts of CO2 from the atmosphere and sequester it underground, reducing the greenhouse effect. During the doctorate, a methodology was developed for evaluating the economic and environmental sustainability of applying the project to a river basin; it consists of a set of forms for collecting the basic data and a computerized analysis procedure.
The aim of the project is a feasibility study, from a territorial perspective, of the biomass renewable-energy production chain. The innovative solution is to link, for the first time, biomass renewable energy and river transport. Biomass produced within a chosen catchment area (e.g., the Po River Basin in Italy) will be transported via inland waterways to the biomass renewable-energy plants. The use of river transport decreases CO2 production, reduces costs by enlarging the area of biomass availability, and creates economic growth of the biomass market within the river basin. During the phase of energy and heat production, the resulting CO2 could be sequestered and stored underground, reducing the total CO2 content of the atmosphere. This means that the entire process not only produces renewable and clean energy but can also be considered a "negative CO2 production". The sequestered CO2 could be injected, at groundwater level, in those coastal areas exposed to subsidence, so that the ground-lowering phenomenon can be counteracted. During the PhD, a procedure for data collection in a river basin and for assessing the environmental and economic sustainability of the project has been developed and tested.
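The "negative CO2 production" claim amounts to a simple carbon balance: uptake by the growing biomass, times the capture efficiency of the plant, minus chain emissions. A toy version of that arithmetic, with invented figures rather than the thesis data:

```python
# All figures are illustrative assumptions, not the thesis data.
biomass_t = 10_000      # t/year of dry biomass delivered to the plant
co2_uptake = 1.8        # t CO2 fixed per t of dry biomass grown (assumed)
barge_emissions = 0.03  # t CO2 emitted per t of biomass moved by barge (assumed)
capture_eff = 0.85      # fraction of flue-gas CO2 sequestered underground (assumed)

captured = biomass_t * co2_uptake * capture_eff
emitted = biomass_t * barge_emissions
print(f"net CO2 removed from the atmosphere: {captured - emitted:,.0f} t/year")
```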
APA, Harvard, Vancouver, ISO, and other styles
30

Gonella, Marco <1965>. "Un piano territoriale per la riduzione della concentrazione di CO2 nell'atmosfera e la produzione energetica da fonti rinnovabili per mezzo della filiera delle biomasse." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2010. http://amsdottorato.unibo.it/2438/.

Full text
Abstract:
The project proposes a study to verify the feasibility of a territorial plan (conceived for the Po basin but in fact extendable to all river basins) for the creation of a chain of bioenergy crops (biomass) which, transported by river navigation (one of the transport modes with the lowest CO2 emissions), feeds one or more new-technology power plants combining the production of heat (district heating and cooling) and energy with the separation of the flue gases. The CO2 captured by the growing biomass and recovered from combustion can then be sequestered in the subsoil of subsiding coastal areas, counteracting the phenomenon of ground lowering. Since benefits are obtained at every step of the implementation of the territorial plan (launch of bioenergy agriculture, revival of free-flowing river navigation, start-up of an economy linked to the logistics of biomass transport and storage, generation of clean energy, fight against subsidence), the project in fact makes it possible to capture large amounts of CO2 from the atmosphere and sequester it underground, reducing the greenhouse effect. During the doctorate, a methodology was developed for evaluating the economic and environmental sustainability of applying the project to a river basin; it consists of a set of forms for collecting the basic data and a computerized analysis procedure.
The aim of the project is a feasibility study, from a territorial perspective, of the biomass renewable-energy production chain. The innovative solution is to link, for the first time, biomass renewable energy and river transport. Biomass produced within a chosen catchment area (e.g., the Po River Basin in Italy) will be transported via inland waterways to the biomass renewable-energy plants. The use of river transport decreases CO2 production, reduces costs by enlarging the area of biomass availability, and creates economic growth of the biomass market within the river basin. During the phase of energy and heat production, the resulting CO2 could be sequestered and stored underground, reducing the total CO2 content of the atmosphere. This means that the entire process not only produces renewable and clean energy but can also be considered a "negative CO2 production". The sequestered CO2 could be injected, at groundwater level, in those coastal areas exposed to subsidence, so that the ground-lowering phenomenon can be counteracted. During the PhD, a procedure for data collection in a river basin and for assessing the environmental and economic sustainability of the project has been developed and tested.
APA, Harvard, Vancouver, ISO, and other styles
31

Bandini, Vittoria <1982>. "Applicazione di strumenti e metodi per la gestione ambientale di un territorio vasto - Sviluppo di un sistema di gestione ambientale in un ente pubblico secondo lo schema europeo EMAS e analisi di un sistema energetico territoriale." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2010. http://amsdottorato.unibo.it/2620/1/Bandini_Vittoria_Tesi.pdf.

Full text
Abstract:
Environmental management includes many components, among which are Environmental Management Systems (EMS), environmental reporting and analysis, environmental information systems and environmental communication. In this work two applications are presented: the development and implementation of an Environmental Management System in local administrations, according to the European scheme EMAS, and the analysis of a territorial energy system through scenario building and environmental sustainability assessment. Both applications are linked by the same objective, which is the quest for more scientifically sound elements; in fact, both EMS and energy planning are often characterized by localism and poor comparability. eMergy synthesis, proposed by the ecologist H.T. Odum and described in his book "Environmental Accounting: Emergy and Environmental Decision Making" (1996), has been chosen and applied as an environmental evaluation tool, in order to complete the analysis with an assessment of the "global value" of goods and processes. In particular, eMergy synthesis has been applied in order to improve the evaluation of the significance of environmental aspects in an EMS, and to evaluate the environmental performance of three scenarios of future evolution of the energy system. Regarding the EMS, this work discusses the application of an EMS together with the CLEAR methodology for environmental accounting, in order to improve the identification of environmental aspects; data on the environmental aspects, and the significant ones, for 4 local authorities are also presented, together with a preliminary proposal for integrating the assessment of the significance of environmental aspects with eMergy synthesis. Regarding the analysis of an energy system, this work presents the characterization of the current situation together with the overall energy balance and the evaluation of greenhouse gas emissions; moreover, three scenarios of future evolution are described and discussed. The scenarios have been produced with the support of the LEAP software ("Long Term Energy Alternatives Planning System" by SEI, the Stockholm Environment Institute). Finally, the eMergy synthesis of the current situation and of the three scenarios is shown.
Environmental management comprises several elements, including Environmental Management Systems (EMS), environmental analysis and reporting, environmental data management, and environmental communication. This thesis deals with the development and application of an EMS in a public body according to the European EMAS scheme, and with the analysis of a territorial energy system through scenario building and the use of environmental sustainability indices. Both topics share the aim of introducing elements of scientific rigour into procedures (EMSs and energy planning) that are otherwise often characterized by localism and poor comparability. eMergy synthesis, proposed by the ecologist H.T. Odum and described in 1996 in the book "Environmental Accounting: Emergy and Environmental Decision Making", has been used to complete the analyses with an assessment of global environmental value. In particular, eMergy synthesis has been used to improve the methodology for evaluating the significance of environmental aspects in an EMS, and to evaluate the environmental performance of the evolution scenarios of the energy system studied. Regarding EMSs, the thesis reports a joint application of the CLEAR method to facilitate the identification of environmental aspects, the data of the 4 municipalities followed as regards environmental aspects and their significance, and a preliminary proposal for integrating the significance assessment through eMergy synthesis. Regarding the analysis of a territorial energy system, the thesis reports the characterization of the current situation and the energy balance, the greenhouse gas emission balance, 3 scenarios of system evolution defined with the support of the LEAP software ("Long Term Energy Alternatives Planning System" by SEI, the Stockholm Environment Institute), and the eMergy synthesis of the current situation and of the scenarios built.
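At its core, eMergy synthesis multiplies each input flow by its transformity (solar emjoules per unit) and sums the results. A minimal sketch with placeholder flows and transformities (only the sunlight value of 1 sej/J is definitional; the other figures are invented):

```python
# Minimal eMergy accounting: each input flow is multiplied by its
# transformity and the products are summed. All values are placeholders,
# except sunlight, whose transformity is 1 sej/J by definition.
inputs_sej_per_j = {
    # flow name: (annual flow in J/year, transformity in sej/J)
    "sunlight":    (4.0e15, 1.0),
    "fuel":        (2.0e12, 6.6e4),
    "electricity": (8.0e11, 1.6e5),
}

total_emergy = sum(flow * tau for flow, tau in inputs_sej_per_j.values())
print(f"total emergy: {total_emergy:.2e} sej/year")
```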
APA, Harvard, Vancouver, ISO, and other styles
32

Bandini, Vittoria <1982>. "Applicazione di strumenti e metodi per la gestione ambientale di un territorio vasto - Sviluppo di un sistema di gestione ambientale in un ente pubblico secondo lo schema europeo EMAS e analisi di un sistema energetico territoriale." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2010. http://amsdottorato.unibo.it/2620/.

Full text
Abstract:
Environmental management includes many components, among which are Environmental Management Systems (EMS), environmental reporting and analysis, environmental information systems and environmental communication. In this work two applications are presented: the development and implementation of an Environmental Management System in local administrations, according to the European scheme EMAS, and the analysis of a territorial energy system through scenario building and environmental sustainability assessment. Both applications are linked by the same objective, which is the quest for more scientifically sound elements; in fact, both EMS and energy planning are often characterized by localism and poor comparability. eMergy synthesis, proposed by the ecologist H.T. Odum and described in his book "Environmental Accounting: Emergy and Environmental Decision Making" (1996), has been chosen and applied as an environmental evaluation tool, in order to complete the analysis with an assessment of the "global value" of goods and processes. In particular, eMergy synthesis has been applied in order to improve the evaluation of the significance of environmental aspects in an EMS, and to evaluate the environmental performance of three scenarios of future evolution of the energy system. Regarding the EMS, this work discusses the application of an EMS together with the CLEAR methodology for environmental accounting, in order to improve the identification of environmental aspects; data on the environmental aspects, and the significant ones, for 4 local authorities are also presented, together with a preliminary proposal for integrating the assessment of the significance of environmental aspects with eMergy synthesis. Regarding the analysis of an energy system, this work presents the characterization of the current situation together with the overall energy balance and the evaluation of greenhouse gas emissions; moreover, three scenarios of future evolution are described and discussed. The scenarios have been produced with the support of the LEAP software ("Long Term Energy Alternatives Planning System" by SEI, the Stockholm Environment Institute). Finally, the eMergy synthesis of the current situation and of the three scenarios is shown.
Environmental management comprises several elements, including Environmental Management Systems (EMS), environmental analysis and reporting, environmental data management, and environmental communication. This thesis deals with the development and application of an EMS in a public body according to the European EMAS scheme, and with the analysis of a territorial energy system through scenario building and the use of environmental sustainability indices. Both topics share the aim of introducing elements of scientific rigour into procedures (EMSs and energy planning) that are otherwise often characterized by localism and poor comparability. eMergy synthesis, proposed by the ecologist H.T. Odum and described in 1996 in the book "Environmental Accounting: Emergy and Environmental Decision Making", has been used to complete the analyses with an assessment of global environmental value. In particular, eMergy synthesis has been used to improve the methodology for evaluating the significance of environmental aspects in an EMS, and to evaluate the environmental performance of the evolution scenarios of the energy system studied. Regarding EMSs, the thesis reports a joint application of the CLEAR method to facilitate the identification of environmental aspects, the data of the 4 municipalities followed as regards environmental aspects and their significance, and a preliminary proposal for integrating the significance assessment through eMergy synthesis. Regarding the analysis of a territorial energy system, the thesis reports the characterization of the current situation and the energy balance, the greenhouse gas emission balance, 3 scenarios of system evolution defined with the support of the LEAP software ("Long Term Energy Alternatives Planning System" by SEI, the Stockholm Environment Institute), and the eMergy synthesis of the current situation and of the scenarios built.
APA, Harvard, Vancouver, ISO, and other styles
33

Quatraro, Diego <1982>. "Collective effects for the LHC injectors: non-ultrarelativistic approaches." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3403/1/quatraro_diego_tesi.pdf.

Full text
Abstract:
The upgrade of the CERN accelerator complex has been planned in order to further increase the LHC performance in exploring new physics frontiers. One of the main limitations to the upgrade is represented by collective instabilities. These are intensity-dependent phenomena triggered by the electromagnetic fields excited by the interaction of the beam with its surroundings. These fields are represented via wake fields in the time domain or impedances in the frequency domain. Impedances are usually studied assuming ultrarelativistic bunches, while we mainly explored the low- and medium-energy regimes of the LHC injector chain. In a non-ultrarelativistic framework we carried out a complete study of the impedance structure of the PSB, which accelerates proton bunches up to 1.4 GeV. We measured the imaginary part of the impedance, which creates a betatron tune shift. We introduced a parabolic bunch model which, together with dedicated measurements, allowed us to point to the resistive wall impedance as the source of one of the main PSB instabilities. These results are particularly useful for the design of efficient transverse instability dampers. We developed a macroparticle code to study the effect of space charge on intensity-dependent instabilities. Carrying out the analysis of the bunch modes, we proved that the damping effects caused by space charge, which has been modelled with semi-analytical methods and using symplectic high-order schemes, can increase the bunch intensity threshold. Numerical libraries have also been developed in order to study, via numerical simulations of the bunches, the impedance of the whole CERN accelerator complex. On a different note, the CNGS experiment at CERN requires high-intensity beams. We calculated the interpolating Hamiltonian of the beam for highly non-linear lattices. These calculations provide the ground for theoretical and numerical studies aiming to improve the CNGS beam extraction from the PS to the SPS.
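The measurement logic behind extracting the imaginary impedance is that the coherent betatron tune shifts linearly with bunch intensity, so the impedance is inferred from a fitted slope. A toy version with invented numbers (not the PSB data):

```python
import numpy as np

rng = np.random.default_rng(3)
intensity = np.array([2.0, 4.0, 6.0, 8.0, 10.0]) * 1e11  # protons per bunch
tune = 4.23 - 1.5e-14 * intensity + rng.normal(0.0, 1e-4, intensity.size)

# Im(Z) is proportional to the fitted slope of tune vs. intensity.
slope, q0 = np.polyfit(intensity, tune, 1)
print(f"zero-intensity tune: {q0:.4f}, tune slope: {slope:.3e} per proton")
```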
APA, Harvard, Vancouver, ISO, and other styles
34

Quatraro, Diego <1982>. "Collective effects for the LHC injectors: non-ultrarelativistic approaches." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3403/.

Full text
Abstract:
The upgrade of the CERN accelerator complex has been planned in order to further increase the LHC performance in exploring new physics frontiers. One of the main limitations to the upgrade is represented by collective instabilities. These are intensity-dependent phenomena triggered by the electromagnetic fields excited by the interaction of the beam with its surroundings. These fields are represented via wake fields in the time domain or impedances in the frequency domain. Impedances are usually studied assuming ultrarelativistic bunches, while we mainly explored the low- and medium-energy regimes of the LHC injector chain. In a non-ultrarelativistic framework we carried out a complete study of the impedance structure of the PSB, which accelerates proton bunches up to 1.4 GeV. We measured the imaginary part of the impedance, which creates a betatron tune shift. We introduced a parabolic bunch model which, together with dedicated measurements, allowed us to point to the resistive wall impedance as the source of one of the main PSB instabilities. These results are particularly useful for the design of efficient transverse instability dampers. We developed a macroparticle code to study the effect of space charge on intensity-dependent instabilities. Carrying out the analysis of the bunch modes, we proved that the damping effects caused by space charge, which has been modelled with semi-analytical methods and using symplectic high-order schemes, can increase the bunch intensity threshold. Numerical libraries have also been developed in order to study, via numerical simulations of the bunches, the impedance of the whole CERN accelerator complex. On a different note, the CNGS experiment at CERN requires high-intensity beams. We calculated the interpolating Hamiltonian of the beam for highly non-linear lattices. These calculations provide the ground for theoretical and numerical studies aiming to improve the CNGS beam extraction from the PS to the SPS.
APA, Harvard, Vancouver, ISO, and other styles
35

Rivetti di Val Cervo, Stefano <1975>. "Performance evaluation of detectors for digital radiography." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3611/1/stefano_rivetti_tesi.pdf.

Full text
Abstract:
To date, the hospital radiological workflow is completing a transition from analog to digital technology. Since digital X-ray detection technologies have become mature, hospitals are taking advantage of the natural turnover of devices to replace conventional screen-film systems with digital ones. The transition process is complex and involves not just the equipment replacement but also new arrangements for image transmission, display (and reporting) and storage. This work is focused on the characterization of 2D digital detectors with regard to specific clinical applications; the system features linked to image quality are analyzed to assess the clinical performance, the conversion efficiency, and the minimum dose necessary to obtain an acceptable image. The first section overviews digital detector technologies, focusing on recent and promising technological developments. The second section contains a description of the characterization methods considered in this thesis, categorized as physical, psychophysical and clinical; theory, models and procedures are described as well. The third section contains a set of characterizations performed on new devices that appear to be among the most advanced technologies available to date. The fourth section deals with procedures and schemes employed in quality assurance programs.
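One of the standard physical characterizations referred to above is the presampled MTF, obtained as the normalized modulus of the Fourier transform of the line spread function. A minimal sketch with an invented Gaussian LSF (real measurements derive the LSF from an edge or slit image):

```python
import numpy as np

pixel_pitch = 0.1                     # mm, assumed detector pixel pitch
x = (np.arange(256) - 128) * pixel_pitch
lsf = np.exp(-0.5 * (x / 0.15) ** 2)  # Gaussian LSF with 0.15 mm blur (invented)

mtf = np.abs(np.fft.rfft(lsf))        # modulus is shift-invariant
mtf /= mtf[0]                         # normalize to 1 at zero frequency
freqs = np.fft.rfftfreq(x.size, d=pixel_pitch)  # spatial frequency, cycles/mm
print("MTF at 1 cycle/mm:", np.interp(1.0, freqs, mtf))
```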
APA, Harvard, Vancouver, ISO, and other styles
36

Rivetti di Val Cervo, Stefano <1975>. "Performance evaluation of detectors for digital radiography." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3611/.

Full text
Abstract:
To date, the hospital radiological workflow is completing a transition from analog to digital technology. Since digital X-ray detection technologies have become mature, hospitals are taking advantage of the natural turnover of devices to replace conventional screen-film systems with digital ones. The transition process is complex and involves not just the equipment replacement but also new arrangements for image transmission, display (and reporting) and storage. This work is focused on the characterization of 2D digital detectors with regard to specific clinical applications; the system features linked to image quality are analyzed to assess the clinical performance, the conversion efficiency, and the minimum dose necessary to obtain an acceptable image. The first section overviews digital detector technologies, focusing on recent and promising technological developments. The second section contains a description of the characterization methods considered in this thesis, categorized as physical, psychophysical and clinical; theory, models and procedures are described as well. The third section contains a set of characterizations performed on new devices that appear to be among the most advanced technologies available to date. The fourth section deals with procedures and schemes employed in quality assurance programs.
APA, Harvard, Vancouver, ISO, and other styles
37

Procopio, Maria <1975>. "Radical pair compass sensor model in avian magnetoreception." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3698/1/Procopio_Maria_tesi.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Procopio, Maria <1975>. "Radical pair compass sensor model in avian magnetoreception." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3698/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Benini, Lorenzo <1982>. "Sustainability indicators for environmental management and decision making." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3772/1/Benini_Lorenzo_tesi.pdf.

Full text
Abstract:
The general objective of this research is to explore the theories and methodologies of the sustainability indicators, environmental management and decision-making disciplines, with the operational purpose of producing scientific, robust and relevant information for supporting system understanding and decision making in real case studies. Several tools have been applied in order to increase the understanding of socio-ecological systems as well as to provide relevant information on the choice between alternatives. These tools have always been applied keeping in mind the complexity of the issues and the uncertainty tied to our partial knowledge of the systems under study. Two case studies with specific application to performance measurement (environmental performance in the case of the K8 approach and sustainable development performance in the case of the EU Sustainable Development Strategy), and a case study on the selection of sustainable development indicators among municipalities in Scotland, are discussed in the first part of the work. In the second part of the work, the common denominator among the subjects is the application of spatial indices and indicators to address operational problems in land use management within the territory of the province of Ravenna (Italy). The main conclusion of the thesis is that a ‘perfect’ methodological approach which always produces the best results in assessing sustainability performance does not exist. Rather, there is a pool of correct approaches answering different evaluation questions, to be used when the methodologies fit the purpose of the analysis. For this reason, methodological limits and conceptual assumptions, as well as the consistency and transparency of the assessment, become the key factors for assessing the quality of the analysis.
APA, Harvard, Vancouver, ISO, and other styles
40

Benini, Lorenzo <1982>. "Sustainability indicators for environmental management and decision making." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3772/.

Full text
Abstract:
The general objective of this research is to explore the theories and methodologies of the sustainability indicators, environmental management and decision-making disciplines, with the operational purpose of producing scientific, robust and relevant information for supporting system understanding and decision making in real case studies. Several tools have been applied in order to increase the understanding of socio-ecological systems as well as to provide relevant information on the choice between alternatives. These tools have always been applied keeping in mind the complexity of the issues and the uncertainty tied to our partial knowledge of the systems under study. Two case studies with specific application to performance measurement (environmental performance in the case of the K8 approach and sustainable development performance in the case of the EU Sustainable Development Strategy), and a case study on the selection of sustainable development indicators among municipalities in Scotland, are discussed in the first part of the work. In the second part of the work, the common denominator among the subjects is the application of spatial indices and indicators to address operational problems in land use management within the territory of the province of Ravenna (Italy). The main conclusion of the thesis is that a ‘perfect’ methodological approach which always produces the best results in assessing sustainability performance does not exist. Rather, there is a pool of correct approaches answering different evaluation questions, to be used when the methodologies fit the purpose of the analysis. For this reason, methodological limits and conceptual assumptions, as well as the consistency and transparency of the assessment, become the key factors for assessing the quality of the analysis.
APA, Harvard, Vancouver, ISO, and other styles
41

Giovannini, Luca <1982>. "A novel map-matching procedure for low-sampling GPS data with applications to traffic flow analysis." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3898/1/giovannini_luca_tesi.pdf.

Full text
Abstract:
An extensive sample (2%) of private vehicles in Italy is equipped with a GPS device that periodically measures their position and dynamical state for insurance purposes. Access to this type of data makes it possible to develop theoretical and practical applications of great interest: the real-time reconstruction of the traffic state in a given region, the development of accurate models of vehicle dynamics, and the study of the cognitive dynamics of drivers. For these applications to be possible, we first need the ability to reconstruct the paths taken by vehicles on the road network from the raw GPS data. In fact, these data are affected by positioning errors and are often very distant from each other (~2 km). For these reasons, the task of path identification is not straightforward. This thesis describes the approach we followed to reliably identify vehicle paths from this kind of low-sampling data. The problem of matching data points with roads is solved with a Bayesian maximum-likelihood approach, while the identification of the path taken between two consecutive GPS measurements is performed with a specifically developed optimal routing algorithm based on the A* algorithm. The procedure was applied to an off-line urban data sample and proved to be robust and accurate. Future developments will extend the procedure to real-time execution and nation-wide coverage.
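A minimal sketch of the routing step: A* search over a toy road graph, with the straight-line distance as an admissible heuristic. The graph, weights and coordinates are invented; the thesis algorithm adds domain-specific optimizations on the real road network.

```python
import heapq
import math

# Invented toy road graph: node -> {neighbor: edge length}, plus coordinates.
graph = {"A": {"B": 1.0, "C": 2.5}, "B": {"D": 2.0}, "C": {"D": 1.2}, "D": {}}
coords = {"A": (0, 0), "B": (1, 0), "C": (1, 1), "D": (2, 1)}

def heuristic(u, v):
    """Straight-line distance: never overestimates, so A* stays optimal."""
    (x1, y1), (x2, y2) = coords[u], coords[v]
    return math.hypot(x2 - x1, y2 - y1)

def a_star(start, goal):
    frontier = [(heuristic(start, goal), 0.0, start, [start])]  # (f, g, node, path)
    best_g = {start: 0.0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for nxt, w in graph[node].items():
            if g + w < best_g.get(nxt, math.inf):  # only keep improving routes
                best_g[nxt] = g + w
                heapq.heappush(frontier,
                               (g + w + heuristic(nxt, goal), g + w, nxt, path + [nxt]))
    return None, math.inf

print(a_star("A", "D"))  # -> (['A', 'B', 'D'], 3.0)
```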
APA, Harvard, Vancouver, ISO, and other styles
42

Giovannini, Luca <1982>. "A novel map-matching procedure for low-sampling GPS data with applications to traffic flow analysis." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2011. http://amsdottorato.unibo.it/3898/.

Full text
Abstract:
An extensive sample (2%) of private vehicles in Italy is equipped with a GPS device that periodically measures their position and dynamical state for insurance purposes. Access to this type of data makes it possible to develop theoretical and practical applications of great interest: the real-time reconstruction of the traffic state in a given region, the development of accurate models of vehicle dynamics, and the study of the cognitive dynamics of drivers. For these applications to be possible, we first need the ability to reconstruct the paths taken by vehicles on the road network from the raw GPS data. In fact, these data are affected by positioning errors and are often very distant from each other (~2 km). For these reasons, the task of path identification is not straightforward. This thesis describes the approach we followed to reliably identify vehicle paths from this kind of low-sampling data. The problem of matching data points with roads is solved with a Bayesian maximum-likelihood approach, while the identification of the path taken between two consecutive GPS measurements is performed with a specifically developed optimal routing algorithm based on the A* algorithm. The procedure was applied to an off-line urban data sample and proved to be robust and accurate. Future developments will extend the procedure to real-time execution and nation-wide coverage.
43

Di, Lucchio Laura <1982&gt. "Dynamic stabilization of Rayleigh-Taylor instability of ablation fronts in inertial confinement fusion." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4256/1/tesi_completa.pdf.

Full text
Abstract:
One of the most important problems in inertial confinement fusion is how to mitigate the onset of the Rayleigh-Taylor instability that arises in the ablation front during compression. This thesis studies in detail the possibility of using, for this purpose, the well-known mechanism of dynamic stabilization, already applied to other dynamical systems such as the inverted pendulum. In this context, a periodic acceleration superposed on the background gravity generates a vertical vibration of the ablation front itself. The effects of different driving modulations (Dirac deltas and square waves) are analyzed from a theoretical point of view, with a focus on the stabilization of ion-beam-driven ablation fronts, and the modulations are compared in search of an optimum.
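For intuition on the mechanism, here is a minimal numerical sketch (ours, not the thesis code) of dynamic stabilization for a single Rayleigh-Taylor mode. Under the classical dispersion relation the mode amplitude eta(t) obeys a Mathieu-type equation eta'' = A k g(t) eta, with A the Atwood number and k the wavenumber; a square-wave modulation of the acceleration, one of the cases analyzed in the thesis, can keep eta bounded when a Kapitza-style averaged criterion (roughly, modulation amplitude exceeding Omega * sqrt(2 g0 / (A k))) is met. All parameter values are illustrative.

def simulate_mode(k=1.0, atwood=1.0, g0=1.0, g_mod=60.0, period=0.2,
                  dt=1e-4, t_end=5.0):
    # Semi-implicit Euler integration of eta'' = atwood * k * g(t) * eta.
    eta, deta, t = 1e-6, 0.0, 0.0                     # tiny initial perturbation
    while t < t_end:
        phase = (t % period) / period
        g = g0 + (g_mod if phase < 0.5 else -g_mod)   # square-wave driving
        deta += atwood * k * g * eta * dt
        eta += deta * dt
        t += dt
    return eta

# Without modulation the mode grows like exp(sqrt(atwood*k*g0)*t);
# with a strong, fast square wave it stays near its initial amplitude.
print(simulate_mode(g_mod=0.0), simulate_mode())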
44

Di, Lucchio Laura <1982&gt. "Dynamic stabilization of Rayleigh-Taylor instability of ablation fronts in inertial confinement fusion." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4256/.

Full text
Abstract:
One of the most important problems in inertial confinement fusion is how to mitigate the onset of the Rayleigh-Taylor instability that arises in the ablation front during compression. This thesis studies in detail the possibility of using, for this purpose, the well-known mechanism of dynamic stabilization, already applied to other dynamical systems such as the inverted pendulum. In this context, a periodic acceleration superposed on the background gravity generates a vertical vibration of the ablation front itself. The effects of different driving modulations (Dirac deltas and square waves) are analyzed from a theoretical point of view, with a focus on the stabilization of ion-beam-driven ablation fronts, and the modulations are compared in search of an optimum.
45

Petazzi, Pierandrea <1981&gt. "Microscopic Modeling on complex networks." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4296/1/Petazzi_Pierandrea_tesi.pdf.

Full text
Abstract:
The field of complex systems is a growing body of knowledge that can be applied to countless different topics, from physics to computer science, biology, information theory and sociology. The main focus of this work is the use of microscopic models to study the behavior of urban mobility, whose characteristics make it a paradigmatic example of complexity. In particular, simulations are used to investigate phase changes in a finite-size open Manhattan-like urban road network under different traffic conditions, in search of the parameters that identify phase transitions and equilibrium and non-equilibrium conditions. It is shown how the flow-density macroscopic fundamental diagram of the simulation exhibits, like real traffic, hysteresis in the transition from the congested phase to the free-flow phase, and how the different regimes can be identified by studying the statistics of road occupancy.
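The flow-density ("fundamental") diagram mentioned in this abstract can be illustrated with a standard microscopic traffic model. The sketch below is a Nagel-Schreckenberg cellular automaton on a single ring road, not the Manhattan-like network of the thesis; it samples the mean flow at a few densities, and the rise-then-fall of flow with density separates the free-flow and congested regimes.

import random

def nasch_flow(density, length=1000, vmax=5, p_slow=0.3, steps=2000, seed=0):
    # Parallel-update Nagel-Schreckenberg automaton on a ring of `length` cells.
    rng = random.Random(seed)
    n_cars = int(density * length)
    pos = sorted(rng.sample(range(length), n_cars))
    vel = [0] * n_cars
    flow = 0
    for t in range(steps):
        for i in range(n_cars):
            gap = (pos[(i + 1) % n_cars] - pos[i] - 1) % length
            vel[i] = min(vel[i] + 1, vmax, gap)          # accelerate, keep headway
            if vel[i] > 0 and rng.random() < p_slow:     # random braking
                vel[i] -= 1
        for i in range(n_cars):
            pos[i] = (pos[i] + vel[i]) % length
        if t >= steps // 2:                              # discard the transient
            flow += sum(vel)
    return flow / (steps - steps // 2) / length          # mean flow per cell per step

for rho in (0.05, 0.10, 0.20, 0.40):
    print(rho, round(nasch_flow(rho), 4))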
46

Petazzi, Pierandrea <1981&gt. "Microscopic Modeling on complex networks." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4296/.

Full text
Abstract:
The field of complex systems is a growing body of knowledge that can be applied to countless different topics, from physics to computer science, biology, information theory and sociology. The main focus of this work is the use of microscopic models to study the behavior of urban mobility, whose characteristics make it a paradigmatic example of complexity. In particular, simulations are used to investigate phase changes in a finite-size open Manhattan-like urban road network under different traffic conditions, in search of the parameters that identify phase transitions and equilibrium and non-equilibrium conditions. It is shown how the flow-density macroscopic fundamental diagram of the simulation exhibits, like real traffic, hysteresis in the transition from the congested phase to the free-flow phase, and how the different regimes can be identified by studying the statistics of road occupancy.
47

Giampieri, Enrico <1983&gt. "Stochastic models and dynamic measures for the characterization of bistable circuits." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4298/1/tesi_phd.pdf.

Full text
Abstract:
During the last few years, a great deal of interest has arisen concerning the application of stochastic methods to several biochemical and biological phenomena. Phenomena like gene expression, cellular memory and bet-hedging strategies in bacterial growth, among many others, cannot be described by continuous stochastic models due to their intrinsic discreteness and randomness. In this thesis I have used the Chemical Master Equation (CME) technique to model some feedback cycles and analyze their properties, also in light of experimental data. In the first part of this work, the effect of stochastic stability is discussed on a toy model of the genetic switch that triggers cellular division, whose malfunctioning is known to be one of the hallmarks of cancer. The second system I have worked on is the so-called futile cycle, a closed cycle of two enzymatic reactions that adds and removes a chemical compound, called a phosphate group, on a specific substrate. I have investigated how adding noise to the enzyme (which is usually present in quantities of a few hundred molecules) modifies the probability of observing a specific number of phosphorylated substrate molecules, and confirmed the theoretical predictions with numerical simulations. In the third part, the results of the study of a chain of multiple phosphorylation-dephosphorylation cycles are presented. We discuss an approximation method for the exact solution in the bidimensional case and the relationship of this method to the thermodynamic properties of the system, which is an open system far from equilibrium. In the last section, the agreement between the theoretical prediction of the total protein quantity in a population of mouse cells and the quantity observed via fluorescence microscopy is shown.
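The futile cycle described here lends itself to a compact stochastic simulation. The Gillespie-algorithm sketch below (with illustrative rates and copy numbers, not the thesis parameters) samples exact trajectories of the number of phosphorylated substrate molecules, the same quantity whose distribution the Chemical Master Equation describes.

import random

def gillespie_futile_cycle(n_total=100, k_phos=1.0, k_deph=1.0,
                           t_end=100.0, seed=1):
    # Two opposing first-order reactions: S -> S_p (rate k_phos per molecule)
    # and S_p -> S (rate k_deph per molecule).
    rng = random.Random(seed)
    n_p, t = 0, 0.0                              # phosphorylated substrate count
    trajectory = [(t, n_p)]
    while t < t_end:
        a_phos = k_phos * (n_total - n_p)        # propensity of phosphorylation
        a_deph = k_deph * n_p                    # propensity of dephosphorylation
        t += rng.expovariate(a_phos + a_deph)    # exponential waiting time
        if rng.random() * (a_phos + a_deph) < a_phos:
            n_p += 1
        else:
            n_p -= 1
        trajectory.append((t, n_p))
    return trajectory

# Fluctuates around n_total * k_phos / (k_phos + k_deph) = 50 at stationarity.
print(gillespie_futile_cycle()[-1])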
48

Giampieri, Enrico <1983&gt. "Stochastic models and dynamic measures for the characterization of bistable circuits." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4298/.

Full text
Abstract:
During the last few years, a great deal of interest has arisen concerning the application of stochastic methods to several biochemical and biological phenomena. Phenomena like gene expression, cellular memory and bet-hedging strategies in bacterial growth, among many others, cannot be described by continuous stochastic models due to their intrinsic discreteness and randomness. In this thesis I have used the Chemical Master Equation (CME) technique to model some feedback cycles and analyze their properties, also in light of experimental data. In the first part of this work, the effect of stochastic stability is discussed on a toy model of the genetic switch that triggers cellular division, whose malfunctioning is known to be one of the hallmarks of cancer. The second system I have worked on is the so-called futile cycle, a closed cycle of two enzymatic reactions that adds and removes a chemical compound, called a phosphate group, on a specific substrate. I have investigated how adding noise to the enzyme (which is usually present in quantities of a few hundred molecules) modifies the probability of observing a specific number of phosphorylated substrate molecules, and confirmed the theoretical predictions with numerical simulations. In the third part, the results of the study of a chain of multiple phosphorylation-dephosphorylation cycles are presented. We discuss an approximation method for the exact solution in the bidimensional case and the relationship of this method to the thermodynamic properties of the system, which is an open system far from equilibrium. In the last section, the agreement between the theoretical prediction of the total protein quantity in a population of mouse cells and the quantity observed via fluorescence microscopy is shown.
49

Polettini, Matteo <1982&gt. "Geometric and Combinatorial Aspects of NonEquilibrium Statistical Mechanics." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4305/1/polettini_matteo_tesi.pdf.

Full text
Abstract:
Non-Equilibrium Statistical Mechanics is a broad subject. Roughly speaking, it deals with systems which have not yet relaxed to an equilibrium state, with systems which are in a steady non-equilibrium state, or with more general situations. Such systems are characterized by external forcing and internal fluxes, resulting in a net production of entropy which quantifies dissipation and the extent to which, by the Second Law of Thermodynamics, time-reversal invariance is broken. In this thesis we discuss some of the mathematical structures involved with generic discrete-state-space non-equilibrium systems, which we depict with networks entirely analogous to electrical networks. We define suitable observables and derive their linear-regime relationships; we discuss a duality between external and internal observables that reverses the roles of the system and of the environment; and we show that network observables serve as constraints for a derivation of the minimum entropy production principle. We dwell on deep combinatorial aspects regarding linear-response determinants, which are related to spanning-tree polynomials in graph theory, and we give a geometrical interpretation of observables in terms of Wilson loops of a connection and gauge degrees of freedom. We specialize the formalism to continuous-time Markov chains; we give a physical interpretation of observables in terms of locally detailed-balanced rates; we prove many variants of the fluctuation theorem; and we show that a well-known expression for the entropy production, due to Schnakenberg, descends from considerations of gauge invariance, where the gauge symmetry is related to the freedom in the choice of a prior probability distribution. As an additional topic of geometrical flavor related to continuous-time Markov chains, we discuss the Fisher-Rao geometry of non-equilibrium decay modes, showing that the Fisher matrix contains information about many aspects of non-equilibrium behavior, including non-equilibrium phase transitions and the superposition of modes. We establish a sort of statistical equivalence principle and discuss the behavior of the Fisher matrix under time reversal. To conclude, we propose that geometry and combinatorics might greatly increase our understanding of non-equilibrium phenomena.
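Schnakenberg's expression for the entropy production of a continuous-time Markov chain, mentioned above, is straightforward to evaluate numerically. The sketch below (with illustrative rates) computes the stationary distribution of a three-state chain from its generator and then the entropy production rate, which vanishes exactly when detailed balance w_ij p_j = w_ji p_i holds.

import numpy as np

# w[i, j] = transition rate from state j to state i (illustrative values)
w = np.array([[0.0, 2.0, 1.0],
              [1.0, 0.0, 3.0],
              [2.0, 1.0, 0.0]])

# Generator L = w - diag(column sums); the stationary p solves L p = 0.
L = w - np.diag(w.sum(axis=0))
eigvals, eigvecs = np.linalg.eig(L)
p = np.real(eigvecs[:, np.argmin(np.abs(eigvals))])
p /= p.sum()

# Schnakenberg entropy production: (1/2) sum_ij J_ij ln[(w_ij p_j)/(w_ji p_i)],
# with probability currents J_ij = w_ij p_j - w_ji p_i.
ep = 0.0
for i in range(3):
    for j in range(3):
        if i != j:
            flux = w[i, j] * p[j] - w[j, i] * p[i]
            ep += 0.5 * flux * np.log((w[i, j] * p[j]) / (w[j, i] * p[i]))
print("stationary p:", p, " entropy production rate:", ep)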
50

Polettini, Matteo <1982&gt. "Geometric and Combinatorial Aspects of NonEquilibrium Statistical Mechanics." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4305/.

Full text
Abstract:
Non-Equilibrium Statistical Mechanics is a broad subject. Roughly speaking, it deals with systems which have not yet relaxed to an equilibrium state, with systems which are in a steady non-equilibrium state, or with more general situations. Such systems are characterized by external forcing and internal fluxes, resulting in a net production of entropy which quantifies dissipation and the extent to which, by the Second Law of Thermodynamics, time-reversal invariance is broken. In this thesis we discuss some of the mathematical structures involved with generic discrete-state-space non-equilibrium systems, which we depict with networks entirely analogous to electrical networks. We define suitable observables and derive their linear-regime relationships; we discuss a duality between external and internal observables that reverses the roles of the system and of the environment; and we show that network observables serve as constraints for a derivation of the minimum entropy production principle. We dwell on deep combinatorial aspects regarding linear-response determinants, which are related to spanning-tree polynomials in graph theory, and we give a geometrical interpretation of observables in terms of Wilson loops of a connection and gauge degrees of freedom. We specialize the formalism to continuous-time Markov chains; we give a physical interpretation of observables in terms of locally detailed-balanced rates; we prove many variants of the fluctuation theorem; and we show that a well-known expression for the entropy production, due to Schnakenberg, descends from considerations of gauge invariance, where the gauge symmetry is related to the freedom in the choice of a prior probability distribution. As an additional topic of geometrical flavor related to continuous-time Markov chains, we discuss the Fisher-Rao geometry of non-equilibrium decay modes, showing that the Fisher matrix contains information about many aspects of non-equilibrium behavior, including non-equilibrium phase transitions and the superposition of modes. We establish a sort of statistical equivalence principle and discuss the behavior of the Fisher matrix under time reversal. To conclude, we propose that geometry and combinatorics might greatly increase our understanding of non-equilibrium phenomena.