Dissertations / Theses on the topic '1590s'

To see the other types of publications on this topic, follow the link: 1590s.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 dissertations / theses for your research on the topic '1590s.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Calore, Michela. "Elizabethan stage conventions and their textual verbalization in the drama of the 1580s and 1590s." Thesis, University of Reading, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.391266.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

O'Connor, Lara. "Shakespeare's search for tragic form in the 1590s." Thesis, Cardiff University, 2016. http://orca.cf.ac.uk/93366/.

Full text
Abstract:
Starting from David Bevington’s observation that ‘Shakespeare’s disparate ventures into tragic expression in the years prior to 1599 suggest that he had not yet found the model or models he was looking for’, the thesis explores four early tragedies in terms of their experimental nature: Titus Andronicus, Romeo and Juliet, Richard II and Julius Caesar. Central to the thesis is how metadrama in these plays points towards an exploration of tragedy that is marked by the intermingling of genres and sources. The thesis argues that each of the four tragedies under discussion is creatively imaginative in the way it adds a fresh perspective to tragedy and mixes genres. The thesis begins by arguing that in ‘Pyramus and Thisbe’, the play-within-the-play of A Midsummer Night’s Dream, Shakespeare mocks the amateur staging of tragedy which he seeks to revitalize, starting with Titus Andronicus as a bloody revenge tragedy uneasily mixed with poetic language. Chapter Two suggests that Romeo and Juliet explores some of the ways in which comedy collapses into tragedy, while Chapter Three asks questions about Shakespeare’s use of history as a source for tragedy in relation to Richard II. The final chapter concentrates on Julius Caesar as a ‘broken’ play, examining the role of the ‘hero’. Each chapter examines a number of features, including the blending of genre types, the creation of the tragic ‘hero’, unsatisfactory endings, and metadrama. Finally, Hamlet is seen to incorporate many of these experiments, and thus to lead on to the later tragedies.
APA, Harvard, Vancouver, ISO, and other styles
3

Dovolis, Petros. "Staging dissidence : the 1590s in the hands of the artists of the 1890s." Thesis, Lancaster University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.440385.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Bernard, Branka. "Huntington's disease." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2009. http://dx.doi.org/10.18452/15900.

Full text
Abstract:
Die Huntington'sche Krankheit (Huntington's disease, HD) ist eine tödliche neurodegenerative Erkrankung mit einem extensiven Verlust von Neuronen im Striatum. Die Ursache für HD ist eine genetische Mutation, bei der eine CAG-Wiederholungssequenz verlängert wird. Im resultierenden Protein, das Huntingtin (htt) genannt wurde, führt diese Mutation zur Missfaltung und Aggregation von htt. Ich habe untersucht, ob die Bildung von htt-Aggregaten die Transkription von Genen, die von HD-assoziierten Transkriptionsfaktoren kontrolliert werden, verändert. Zur Untersuchung der Transkription wurden die zu untersuchenden Gene auf cDNA-Mikroarrays aufgebracht und mit RNA, welche aus den Zellen nach der Induktion der Expression des mutierten htt gewonnen wurde, hybridisiert. Es wurden keine systematischen Veränderungen innerhalb der durch spezifische Transkriptionsfaktoren regulierten Gengruppen gefunden. Ich habe auch mehrere mathematische Modelle erstellt, welche die htt-Aggregation und den Zelltod beschreiben. Die Ergebnisse zeigten, dass eine transiente Dynamik im System und die nicht-monotone Reaktion auf Parameteränderungen zu den nicht-intuitiven Ergebnissen bei Behandlungsansätzen, welche die htt-Aggregation beeinflussen, führen könnten. Für den Fall, dass Aggregate die toxische Form von htt sind, zeigten die numerischen Simulationen, dass das Einsetzen der Aggregation, welches durch ein Überschießen der Aggregatkonzentration gekennzeichnet ist, am ehesten zum Zelltod führt. Dieses Phänomen wurde "one-shot"-Modell genannt. Es gibt, auch bei HD-Patienten mit gleicher Länge der CAG-Wiederholungssequenz, eine große Varianz des Alters bei Krankheitsausbruch (age of onset, AO). Ich habe ein stochastisches Modell für den neuronalen Zelltod im Striatum entwickelt. Das Modell zeigte, dass ein signifikanter Anteil der nicht erklärbaren Varianz des AO der intrinsischen Dynamik der Neurodegeneration zugeschrieben werden kann.
Huntington's disease (HD) is a fatal neurodegenerative disorder characterized by a progressive neuronal loss in the striatum of HD patients. HD is caused by a CAG repeat expansion which translates into a polyglutamine stretch at the N-terminus of the huntingtin protein (htt). The polyQ stretch induces misfolding, cleavage and aggregation of htt. To test the hypothesis that the sequestration of transcription factors into the htt aggregates causes the transcriptional changes observed in HD models, I compiled lists of genes controlled by the transcription factors associated with HD. These genes were spotted on cDNA microarrays that were later hybridized with RNA extracted from cells expressing a mutant htt fragment. In this study, no systematic changes related to specific transcription factors were observed. The formation and accumulation of htt aggregates cause neurotoxicity in different HD model systems. To investigate the consequences of therapeutic strategies targeting aggregation, I derived several mathematical models describing htt aggregation and cell death. The results showed that transient dynamics and the non-monotonic response of cell survival to a change of parameters might lead to non-intuitive outcomes of a treatment that targets htt aggregation. Also, the numerical simulations show that if aggregates are toxic, the onset of aggregation, marked by an overshoot in the concentration of aggregates, is the event most likely to kill the cell. This phenomenon was termed the 'one-shot' model. The principal cause of the variability of the age at onset (AO) is the length of the CAG repeat. Still, there is a great variance in the AO even for the same CAG repeat length. To study the variability of the AO, I developed a stochastic model for clustered neuronal death in the HD striatum. The model showed that a significant part of the unexplained variance can be attributed to the intrinsic stochastic dynamics of neurodegeneration.
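The abstract above describes two modelling strands: kinetic models of htt aggregation (the 'one-shot' overshoot) and a stochastic model of neuronal death that accounts for part of the variance in the age at onset (AO). As a purely illustrative sketch of the second idea, not the thesis's actual model, the following Python toy assumes neurons whose per-step death probability grows with the fraction already dead (all rates, couplings and thresholds are invented) and reads off the AO as the time when a fixed fraction has died; repeating the run shows how intrinsic stochasticity alone spreads the onset times.

```python
import random

def simulate_age_of_onset(n_neurons=500, base_hazard=0.001, coupling=0.02,
                          onset_fraction=0.3, rng=None):
    """Toy model: each surviving neuron dies in a time step with probability
    base_hazard + coupling * (fraction already dead), loosely mimicking
    'clustered' death. The age at onset is the step at which onset_fraction
    of the neurons has died. Illustrative only; all parameters are invented."""
    rng = rng or random.Random()
    dead, t = 0, 0
    while dead < onset_fraction * n_neurons:
        p = base_hazard + coupling * dead / n_neurons
        dead += sum(1 for _ in range(n_neurons - dead) if rng.random() < p)
        t += 1
    return t

if __name__ == "__main__":
    onsets = [simulate_age_of_onset(rng=random.Random(seed)) for seed in range(25)]
    mean = sum(onsets) / len(onsets)
    spread = max(onsets) - min(onsets)
    print(f"mean onset ~ {mean:.0f} steps, spread across runs ~ {spread} steps")
```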
APA, Harvard, Vancouver, ISO, and other styles
5

Siegismund, Christine. "Immunologische Grundlagen für den Schutz vor simianem AIDS in den natürlichen Wirten von SIV." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2009. http://dx.doi.org/10.18452/15907.

Full text
Abstract:
Das Humane Immundefizienzvirus (HIV) ist der Erreger von AIDS und als Zoonose von den Schimpansen (HIV-1) bzw. den Rauchmangaben (HIV-2) auf die menschliche Population übergesprungen. Diese Primatenspezies sind die natürlichen Wirte für die simianen verwand-ten Viren SIVcpz bzw. SIVsm. Die nicht-natürlichen Virus-Wirt-Beziehungen der Immundefizienzviren resultieren in einem pathogenen Verlauf, wie HIV im Menschen und SIVmac in Rhesusmakaken. Die natürlichen Wirte Rauchmangaben, Schimpansen, Afrikanische Grüne Meerkatzen (AGM) und viele mehr entwickeln hingegen kein simianes AIDS. Dies erfolgt trotz lebenslanger Infektion mit SIV und einer zur HIV-Infektion im Menschen äquivalenten Viruslast. Die natürlichen Wirte weisen darüber hinaus keine Immunantwort gegen das virale Kernprotein Gag (gruppen-spezifisches Antigen) auf, was ein früh und zahlreich gebildetes Protein während der Virusreplikation ist. Die fehlende humorale Immunantwort könnte die natürlichen Wirte vor Aktivierung des Immunsystems und dadurch auch vor sAIDS bewahren. Frühere Versuche in AGM mit injiziertem SIVagmGag-Protein zeigten zwar, dass die Induktion einer humoralen Immunantwort gegen SIVagmGag möglich ist, diese aber schon nach kurzer Zeit wieder absinkt. Des Weiteren bildete sich durch Infektion mit SIVagm in den Tieren keine anamnestische Immunantwort heraus, die für die Erkennung von gleichen Epitopen maßgeblich ist. Es scheint ein Unterschied in der Erkennung von exogenem injiziertem SIVagmGag-Protein zu endogenem, durch das Virus selbst gebildetem, Protein in den AGM zu bestehen. Folglich wurde die Hypothese aufgestellt, dass eine Immunisierung mit SIVagmGag-DNA unter Umgehung des Unterschieds in der Proteinprozessierung eine anamnestische Immunantwort in den AGM induziert. Um diese Hypothese zu testen und die Gag-Immunreaktion in Abwesenheit anderer viraler Gene bezüglich der Pathogenität zu evaluieren, wurden codonoptimierte SIVagmGag- und SIVmacGag-DNA Immunisierungsvektoren generiert. Die Proteinexpression wurde in vitro und die Immunogenität in Balb/c und C57Bl/6 Mäusen getestet. Jeweils eine Gruppe von vier AGM erhielt bioballistisch SIVagmGag-DNA bzw. SIVmacGag-DNA. Als Kontrollen dienten mit codonoptimierter DNA immunisierte Rhesusmakaken sowie mit Leervektor immunisierte Primaten beider Spezies. Als Kostimulanz wurde zusätzlich jeweils speziesspezifische gmcsf DNA verwendet. Im Gegensatz zu den Rhesusmakaken konnte in den AGM durch DNA-Immunisierung keine zelluläre und nur eine schwache, transiente humorale anamnestische Immunantwort nach Infektion induziert werden, obwohl beide Gag-Proteine endogen produziert wurden. Daher kann die fehlende anamnestische Immunantwort der AGM nach Immunisierung mit Gag-Protein vermutlich nicht auf Unterschiede in der Proteinprozessierung und -erkennung (endogen versus exogen) zurückgeführt werden. Die Ergebnisse dieser Studie deuten an, dass während der Infektion dieses natürlichen Wirtes eine aktive Unterdrückung der anti-Gag-Antikörperantwort stattfindet, möglicherweise induziert durch eine Anergie in Gag-spezifischen T-Helferzellen.
The causative agents of AIDS, the human immunodeficiency viruses (HIV-1 and HIV-2), were transmitted zoonotically to the human population from the natural hosts of the related simian immunodeficiency viruses SIVcpz (chimpanzees) and SIVsm (sooty mangabeys), respectively. In contrast to the outcome of infections in non-natural hosts (e.g. HIV in humans, SIVmac in rhesus macaques), the natural hosts of SIV do not develop simian AIDS-like symptoms despite life-long infection with virus loads matching those seen in HIV-infected humans. Many such natural host primates infected with SIV, such as SIVagm-infected African green monkeys (AGMs), fail to mount an antibody response to the intact viral core protein Gag (group-specific antigen), a protein produced extensively during infection. It has been postulated that this lack of an immune response to Gag could protect the natural hosts from immunopathological effects and therefore from simian AIDS. Previous studies in AGMs indicated that Gag protein produced endogenously during infection is 'seen' by the immune system differently than that introduced exogenously by protein immunisation, possibly through differences in processing or through specific tolerance at the T-cell level. To address these possibilities, primate studies were performed in which the responses in the natural and non-natural hosts to endogenously produced Gag protein in the absence of other viral genes were compared and the influence of such endogenous priming on the immune response to infection was evaluated. This was achieved by bioballistic immunisation of AGMs and rhesus macaques with codon-optimised DNA coding for SIVagm or SIVmac Gag protein delivered together with DNA coding for the species-specific GM-CSF cytokine. Prior to the primate studies, the DNA constructs were evaluated in BALB/c and C57Bl/6 mice for immunogenicity and protein expression. In contrast to the rhesus macaques, SIVagmGag DNA immunisation of African green monkeys generally failed to prime for an anamnestic cellular immune response and primed for only a weak, transient anamnestic humoral response to the protein upon infection, despite the proteins resulting from both immunisation and infection being produced endogenously. Differences in protein processing and recognition (endogenous versus exogenous) do not therefore appear to account for the lack of priming for an anamnestic immune response seen using a protein immunogen. Rather, the results seem to indicate that an active suppression of the anti-Gag immune response may occur during infection of this natural host of SIV, possibly by the induction of anergy in Gag-specific Th cells.
APA, Harvard, Vancouver, ISO, and other styles
6

Salzmann, Ingo. "Structural and energetic properties of pentacene derivatives and heterostructures." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2009. http://dx.doi.org/10.18452/15909.

Full text
Abstract:
Das Ziel der Arbeit ist die Herstellung und die Charakterisierung von Heterostrukturen des organischen Halbleiters Pentazen (PEN) mit diversen konjugierten organischen Materialien im Hinblick auf das Anwendungspotenzial im Bereich der organischen Elektronik. Für die Untersuchung von PEN-Heterostrukturen mit (i) Fulleren (C60), (ii) Perfluoropentazen (PFP) und (iii) 6,13-Pentazenchinon (PQ) wurden mehrere komplementäre experimentelle Techniken angewendet: Röntgenbeugung, Schwingungsspektroskopie, Rasterkraftmikroskopie und Photoelektronenspektroskopie. (i) Für PEN - Heterostrukturen mit C60 wurden die elektronischen, strukturellen und morphologischen Eigenschaften mit der Leistung von organischen Solarzellen (OSZ) für geschichtete und gemischte Systeme korreliert. Dabei wurde gezeigt, dass morphologische anstatt struktureller oder energetischer Ursachen die Leistungsunterschiede der beiden untersuchten Zelltypen erklären. (ii) Strukturuntersuchungen wurden an reinen PFP-Filmen, sowie an geschichteten und gemischten Heterostrukturen mit PEN durchgeführt. Es wurde die Struktur der PFP-Dünnfilmphase gelöst und das Wachstum von PEN+PFP Mischkristallen gezeigt, welche erfolgreich angewandt wurden, um die Ionisationsenergie (IE) des Films mit dem Mischungsverhältnis durchzustimmen. Dies wurde durch die Existenz von innermolekularen polaren Bindungen (C-H und C-F für PEN und PFP) erklärt. (iii) Für reine PQ-Filme wurde die Struktur der PQ-Dünnfilmphase gelöst (ein Molekül pro Einheitszelle). Es wurde eine stark orientierungsabhängige IE von PQ und PEN gefunden und gezeigt, dass die Energieniveaulagen für die Anwendung in OSZ geeignet sind. Die Untersuchung von Mischsystemen zeigte phasensepariertes Wachstum ohne Hinweise auf Interkalation, selbst bei PQ Konzentrationen von nur 2%. Weiters wurde gezeigt, dass O2 und Wasser keine nachhaltigen Auswirkungen auf PEN-Filme zeigen, wohingegen Singlett-Sauerstoff und Ozon diese angreifen und flüchtige Reaktionsprodukte liefern.
The scope of this work is the combination of the organic semiconductor pentacene (PEN) with different conjugated organic molecules to form application relevant heterostructures in vacuum sublimed films. Using x-ray diffraction (XRD), vibrational spectroscopy, atomic force microscopy and photoelectron spectroscopy, PEN heterostructures with (i) fullerene (C60), (ii) perfluoropentacene (PFP) and (iii) 6,13-pentacenequinone (PQ) were thoroughly characterized to judge on the respective application potential in organic electronics. (i) PEN heterostructures with C60 were investigated regarding the correlation of energetic, structural and morphological properties with the performance of organic photovoltaic cells (OPVCs) for both layered and mixed structures. Morphological rather than energetic or structural issues account for performance differences of bulk-heterojunction OPVCs compared to layered devices. (ii) XRD investigations were carried out on pure PFP films, on layered and mixed heterostructures with PEN. The thin-film polymorph of PFP was solved and it is shown that blended films form a mixed crystal structure, which led to the finding that the ionization energy (IE) of organic films composed of molecules with intramolecular polar bonds (like C-H and C-F for PEN and PFP, respectively) can be tuned through the mixing ratio. (iii) A so far unknown thin-film polymorph of PQ on SiO2 substrates was solved using XRD reciprocal space mapping evidencing a loss of the herringbone arrangement known from the PQ bulk structure. For PEN heterostructures with PQ a highly molecular-orientation dependent IE and energy level offsets interesting for the use in OPVCs were found. Mixed films of PEN and PQ exhibit phase separation and no intercalation was found even at PQ concentrations as low as 2%. Finally, it is shown that O2 and water do not react noticeably with PEN, whereas singlet oxygen and ozone readily oxidize PEN films producing volatile reaction products instead of PQ.
APA, Harvard, Vancouver, ISO, and other styles
7

Uebele, Martin. "Historical business cycles and market integration." Doctoral thesis, Humboldt-Universität zu Berlin, Wirtschaftswissenschaftliche Fakultät, 2009. http://dx.doi.org/10.18452/15902.

Full text
Abstract:
Diese Dissertation befasst sich mit europäischer und US-amerikanischer Konjunkturgeschichte und Marktintegration im 19. und 20. Jahrhundert. Zur Analyse von konjunkturellen Schwankungen stellt sie der weitverbreiteten Historischen Volkswirtschaftlichen Gesamtrechnung (VGR) die Methode dynamischer Faktoranalyse zur Seite, die dazu beiträgt, die begrenzten historischen Zeitreihen effizient zu nutzen. Die nationale und internationale Entwicklung von Weizenmärkten seit dem Ende der Napoleonischen Kriege wird mit einem multivariaten dynamischen Faktormodell untersucht. Spektralanalyse wird zur Berechnung frequenzspezifischer Kohärenz von historischen Börsenindizes und konkurrierenden Schätzungen des Nationalprodukts in Deutschland zwischen 1850 und 1913 herangezogen. Ein wichtiges Ergebnis ist, dass Finanzdaten die Datierung der Konjunktur im Deutschen Kaiserreich erleichtern, was auch durch die Ergebnisse der Faktoranalyse bestätigt wird. Der verwendete Aktienindex, einzelne reale Konjunkturindikatoren und der dynamische Faktor korrelieren eng miteinander. Die Bildung sektoraler Sub-Indizes zeigt, dass der Übergang von einer landwirtschaftlich zu einer industriell geprägten Volkswirtschaft vermutlich früher geschehen ist als Beschäftigungsanteile aus der Historischen VGR vermuten lassen. Die Untersuchung der U.S.-Konjunktur ergibt unter der Annahme zeitvariierender Strukturparameter eine Erhöhung der Konjunkturschwankungsbreite nach dem 2. Weltkrieg verglichen mit der Zeit vor dem 1. Weltkrieg. Für die Weizenmarktintegration in Europa zeigt sich, dass die Entwicklung vor der Mitte des 19. Jahrhunderts schneller voranging als danach, was eine Neuinterpretation der Rolle von Technologien wie dem Metallrumpf und dem Dampfschiff sowie dem Eintritt Amerikas als Weizenproduzenten nahelegt.
This thesis addresses historical business cycles and market integration in Europe and America in the 19th and 20th centuries. For the analysis of historical business cycles, the widely used methodology of historical national accounting is complemented with a dynamic factor model that allows for using scarce historical data efficiently. In order to investigate how national and international markets developed since the early 1800s, a multivariate dynamic factor model is used. Spectral analysis helps in measuring frequency specific correlation between financial indicators and rivaling national income estimates for Germany between 1850 and 1913. One result is that the historical stock market index used helps to discriminate between competing estimates of German national income. A dynamic factor estimated from a broad time series data set confirms this result. Sub-indices for agriculture and industry suggest that the German economy industrialized earlier than evidence from national accounting shows. The finding for the U.S. business cycle is that relaxing the assumption of constant structural parameters yields higher postwar aggregate volatility relative to the period before World War I. Concerning market integration, it is found that European wheat markets integrated faster before mid-19th century than after. Thus, the impact of the metal hull and steam ship as well as the relevance of American wheat for the world wheat market have perhaps been overstated.
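The abstract mentions spectral analysis used to measure frequency-specific correlation (coherence) between a stock market index and rival national income estimates. The following Python sketch illustrates that technique on two synthetic annual series that share a common cycle; the series, the cycle length and the scipy parameters are invented for the example and are not taken from the thesis.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)

# Two synthetic "annual" series (1850-1913 would give ~64 observations)
# sharing a common cycle of roughly 8 years plus independent noise.
years = np.arange(64)
common_cycle = np.sin(2 * np.pi * years / 8.0)
stock_index = common_cycle + 0.5 * rng.standard_normal(64)
income_estimate = 0.8 * common_cycle + 0.5 * rng.standard_normal(64)

# fs=1.0 means one observation per year, so frequency is in cycles per year.
f, cxy = coherence(stock_index, income_estimate, fs=1.0, nperseg=32)

for freq, c in zip(f, cxy):
    period = np.inf if freq == 0 else 1.0 / freq
    print(f"period {period:5.1f} years  coherence {c:.2f}")
```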
APA, Harvard, Vancouver, ISO, and other styles
8

Mayr, Philipp. "Re-Ranking auf Basis von Bradfordizing für die verteilte Suche in digitalen Bibliotheken." Doctoral thesis, Humboldt-Universität zu Berlin, Philosophische Fakultät I, 2009. http://dx.doi.org/10.18452/15906.

Full text
Abstract:
Trotz großer Dokumentmengen für datenbankübergreifende Literaturrecherchen erwarten akademische Nutzer einen möglichst hohen Anteil an relevanten und qualitativen Dokumenten in den Trefferergebnissen. Insbesondere die Reihenfolge und Struktur der gelisteten Ergebnisse (Ranking) spielt, neben dem direkten Volltextzugriff auf die Dokumente, inzwischen eine entscheidende Rolle beim Design von Suchsystemen. Nutzer erwarten weiterhin flexible Informationssysteme, die es unter anderem zulassen, Einfluss auf das Ranking der Dokumente zu nehmen bzw. alternative Rankingverfahren zu verwenden. In dieser Arbeit werden zwei Mehrwertverfahren für Suchsysteme vorgestellt, die die typischen Probleme bei der Recherche nach wissenschaftlicher Literatur behandeln und damit die Recherchesituation messbar verbessern können. Die beiden Mehrwertdienste semantische Heterogenitätsbehandlung am Beispiel Crosskonkordanzen und Re-Ranking auf Basis von Bradfordizing, die in unterschiedlichen Phasen der Suche zum Einsatz kommen, werden hier ausführlich beschrieben und im empirischen Teil der Arbeit bzgl. der Effektivität für typische fachbezogene Recherchen evaluiert. Vorrangiges Ziel der Promotion ist es, zu untersuchen, ob das hier vorgestellte alternative Re-Rankingverfahren Bradfordizing im Anwendungsbereich bibliographischer Datenbanken zum einen operabel ist und zum anderen voraussichtlich gewinnbringend in Informationssystemen eingesetzt und dem Nutzer angeboten werden kann. Für die Tests wurden Fragestellungen und Daten aus zwei Evaluationsprojekten (CLEF und KoMoHe) verwendet. Die intellektuell bewerteten Dokumente stammen aus insgesamt sieben wissenschaftlichen Fachdatenbanken der Fächer Sozialwissenschaften, Politikwissenschaft, Wirtschaftswissenschaften, Psychologie und Medizin. Die Evaluation der Crosskonkordanzen (insgesamt 82 Fragestellungen) zeigt, dass sich die Retrievalergebnisse signifikant für alle Crosskonkordanzen verbessern; es zeigt sich zudem, dass interdisziplinäre Crosskonkordanzen den stärksten (positiven) Effekt auf die Suchergebnisse haben. Die Evaluation des Re-Ranking nach Bradfordizing (insgesamt 164 Fragestellungen) zeigt, dass die Dokumente der Kernzone (Kernzeitschriften) für die meisten Testreihen eine signifikant höhere Precision als Dokumente der Zone 2 und Zone 3 (Peripheriezeitschriften) ergeben. Sowohl für Zeitschriften als auch für Monographien kann dieser Relevanzvorteil nach Bradfordizing auf einer sehr breiten Basis von Themen und Fragestellungen an zwei unabhängigen Dokumentkorpora empirisch nachgewiesen werden.
In spite of huge document sets for cross-database literature searches, academic users expect a high ratio of relevant and qualitative documents in result sets. It is particularly the order and structure of the listed results (ranking) that play an important role when designing search systems alongside the direct full text access for documents. Users also expect flexible information systems which allow influencing the ranking of documents and application of alternative ranking techniques. This thesis proposes two value-added approaches for search systems which treat typical problems in searching scientific literature and seek to improve the retrieval situation on a measurable level. The two value-added services, semantic treatment of heterogeneity (the example of cross-concordances) and re-ranking on Bradfordizing, which are applied in different search phases, are described in detail and their effectiveness in typical subject-specific searches is evaluated in the empirical part of the thesis. The preeminent goal of the thesis is to study if the proposed, alternative re-ranking approach Bradfordizing is operable in the domain of bibliographic databases, and if the approach is profitable, i.e. serves as a value added, for users in information systems. We used topics and data from two evaluation projects (CLEF and KoMoHe) for the tests. The intellectually assessed documents come from seven academic abstracting and indexing databases representing social science, political science, economics, psychology and medicine. The evaluation of the cross-concordances (82 topics altogether) shows that the retrieval results improve significantly for all cross-concordances, indicating that interdisciplinary cross-concordances have the strongest (positive) effect on the search results. The evaluation of Bradfordizing re-ranking (164 topics altogether) shows that core zone (core journals) documents display significantly higher precision than was seen for documents in zone 2 and zone 3 (periphery journals) for most test series. This post-Bradfordizing relevance advantage can be demonstrated empirically across a very broad basis of topics and two independent document corpora as well for journals and monographs.
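Bradfordizing re-ranks a result list by the productivity of the publishing journals: journals are sorted by how many of the retrieved documents they contribute, divided into a core zone and peripheral zones of roughly equal article counts, and documents from core journals are moved to the top. The Python sketch below is a minimal illustration of that idea with made-up records; the zone-splitting rule is a simplification and not the exact procedure evaluated in the thesis.

```python
from collections import Counter

def bradfordize(docs, n_zones=3):
    """Re-rank docs (dicts with a 'journal' key) so that documents from
    high-frequency 'core' journals come first. Simplified illustration."""
    counts = Counter(d["journal"] for d in docs)
    # journals sorted by how many of the retrieved documents they contribute
    ranked_journals = [j for j, _ in counts.most_common()]

    # walk down the journal ranking and cut into zones of ~equal article counts
    per_zone = len(docs) / n_zones
    zone_of = {}
    cumulated, zone = 0, 0
    for j in ranked_journals:
        zone_of[j] = zone
        cumulated += counts[j]
        if cumulated >= (zone + 1) * per_zone and zone < n_zones - 1:
            zone += 1

    # stable sort: zone first (core = 0), original rank as tie-breaker
    return sorted(docs, key=lambda d: zone_of[d["journal"]])

if __name__ == "__main__":
    hits = [{"title": f"doc{i}", "journal": j}
            for i, j in enumerate("AAAAABBBCCDDEEFG")]
    for d in bradfordize(hits):
        print(d["journal"], d["title"])
```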
APA, Harvard, Vancouver, ISO, and other styles
9

Litwinski, Christian. "Elektronische Eigenschaften von oligonuklearen Phthalocyaninen." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2009. http://dx.doi.org/10.18452/15901.

Full text
Abstract:
Phthalocyanine (Pc) sind aufgrund ihrer einfachen Herstellung, bekannten Eigenschaften und großen chemischen Stabilität viel versprechende organische Substanzen für verschiedene Anwendungen in der Wissenschaft und Industrie. Im Rahmen dieser Arbeit wurden die elektronischen Eigenschaften von annellierten dinuklearen (ZnPc-ZnPc, H2Pc-H2Pc) und trinuklearen Pc im Vergleich zu mononuklearen Pc (H2Pc, ZnPc) und einfach kovalent über eine Ethandiol-Brücke verknüpfte dimere Pc (DH2, DZn) untersucht. Die stationären Absorptions- und Fluoreszenzspektren von DZn sind vergleichbar mit denen eines unsymmetrisch substituierten mononuklearen ZnPc. Die elektronischen Eigenschaften von DH2 sind von drei verschiedenen Phänomenen beeinflusst: Aggregation, exzitonische Wechselwirkung und der Existenz unterschiedlicher NH-Tautomere. Die große bathochrome Verschiebung der Q-Bande von annellierten dinuklearen und trinuklearen Pc im Vergleich zu den mononuklearen Analogon zeigt die Expansion des pi-Elektronensystems in solchen annellierten Systemen. Zeitabhängige Dichtefunktionalrechnungen (TD-DFT) ergaben für das dinukleare H2Pc-H2Pc erstmals drei verschiedene NH-Tautomere mit unterschiedlichen Molekülorbitalen und Absorptionsspektren. Ein Vergleich der berechneten Resultate mit den experimentellen Ergebnissen aus stationärer und zeitaufgelöster Spektroskopie ermöglichte erstmals die Bestimmung der elektronischen Parameter der einzelnen NH-Tautomere des annellierten dinuklearen H2Pc-H2Pc in Lösung. In Übereinstimmung mit den TD-DFT-Rechnungen wurde nur eine mögliche Spezies für das dinukleare ZnPc-ZnPc im Experiment gefunden. Erste spektroskopische Untersuchungen an den neuen trinuklearen annellierten Pc ergaben einen Hinweis darauf, dass die NH-Tautomerie einen großen Einfluss auf deren elektronischen Eigenschaften selbst bei Raumtemperatur hat.
Phthalocyanines (Pcs) are auspicious molecules for a broad variety of scientific and industrial applications, because of their simple production, well-defined properties and high chemical stability. The photophysical properties of annulated dinuclear (ZnPc-ZnPc, H2Pc-H2Pc) and trinuclear Pcs in comparison to mononuclear Pcs (H2Pc, ZnPc) and dimeric Pcs, covalently linked by an ethandiol bond (DH2 and DZn), were investigated in this study. Absorption and fluorescence spectra of dimeric DZn resemble those of an asymmetrically substituted mononuclear ZnPc. The photophysical properties of DH2 are influenced by three different phenomena, namely aggregation, excitonic interaction and the existence of different NH-tautomers. Strong bathochromic Q-band shifts of annulated dinuclear and trinuclear Pcs in comparison to the mononuclear analogues show expansion of the pi-electron system in such annulated molecules. For the first time, the existence of three NH-tautomers with different molecular orbitals and absorption spectra was shown by time-dependent density functional theory (TD-DFT) calculations of dinuclear H2Pc-H2Pc. Theoretical calculations and experimental data obtained by steady-state and time-resolved spectroscopy were matched. This approach enabled the determination of the photophysical properties of individual NH-tautomers of dinuclear H2Pc-H2Pc in solution. In accordance with the TD-DFT calculations, only one dinuclear ZnPc-ZnPc species was experimentally established. First spectroscopic investigations of novel trinuclear Pcs gave evidence that NH-tautomerism might have a strong influence on their photophysical properties, even at room temperature.
APA, Harvard, Vancouver, ISO, and other styles
10

Lohrberg, Dörte. "Untersuchungen zur affinitäts-basierten Aufreinigung von tight junction-proteinen und deren potentiellen Interaktionspartnern." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2009. http://dx.doi.org/10.18452/15903.

Full text
Abstract:
Die Epithelien vielzelliger Organismen bilden eine funktionelle Grenzschicht, die für die Homöostase innerhalb und für den spezifischen Stoffaustausch zwischen den Kompartimenten verantwortlich ist. Der interzelluläre Spalt zwischen Epithelzellen wird durch tight junctions verschlossen, die eine selektive Permeabilitätsbarriere bilden. Da viele Krankheiten auf eine Dysfunktion der Barriere zurückzuführen sind, ist eine genaue Kenntnis der molekularen Zusammensetzung der tight junctions aus pharmakologischer Sicht von großem Interesse. In dieser Arbeit wurden Anreicherungsstrategien entwickelt, die eine Proteomanalyse der tight junction-Proteine erlauben. Der Fokus wurde dabei auf die Claudine und Tricellulin gelegt, die als transmembranale Proteine das molekulare Rückgrat der tight junctions bilden. Durch Affinitätsreinigung gelang erstmals eine Anreicherung verschiedener Claudine, die durch Massenspektrometrie identifiziert wurden. Die metabolische Markierung der Proteine mit stabilen Isotopen (SILAC) erlaubte die quantitative Diskriminierung von Proteinen, die unspezifisch an das Matrixmaterial banden. Von den potentiellen Interaktionspartnern der Claudine wurden Integrin-a3, SUMO-1 und Sphingosinkinase 2 ausgewählt, um deren Interaktion mit Claudinen weiter zu verifizieren. Es wurden keine Hinweise auf Wechselwirkungen zwischen Claudinen und Integrin-a3 sowie SUMO-1 gefunden, während die Interaktion von Claudinen mit Sphingosinkinase 2 weder bestätigt noch ausgeschlossen werden konnte. Ferner wurde eine Affinitätsreinigung durchgeführt, um Interaktionspartner von Tricellulin anzureichern. Durch die quantitative massenspektrometrische Analyse wurde ausschließlich Claudin-4 nicht aber Claudin-3 und 7 als potentieller, spezifischer Interaktionspartner von Tricellulin identifiziert. Es wurde aber gezeigt, dass die Kombination aus Affinitätsreinigung und quantitativer Massenspektrometrie einen wertvollen Beitrag zur Entschlüsselung von Protein-Komplexen leisten kann.
Epithelia function as specialized barriers that separate different compartments within multicellular organisms and regulate the specific exchange of substances between them. The intercellular space between adjacent epithelial cells is sealed by tight junctions forming a permeability barrier. Dysregulation of the barrier occurs in a variety of diseases. Hence, a deeper knowledge is required of the molecular composition of tight junctions, in particular with respect to pharmacological applications. In the present study, new enrichment strategies have been established that allow the proteomic analysis of tight junction proteins. Special emphasis was placed on claudins and tricellulin as these transmembrane proteins constitute the molecular backbone of the tight junctions. For the first time, using an affinity purification, the enrichment of several claudins was accomplished that were identified by mass spectrometry. The metabolic labeling of proteins with stable isotopes (SILAC) allowed the quantitative discrimination of proteins that bound unspecifically to the matrix. Integrin-a3, SUMO 1 and sphingosin kinase 2 were chosen for further verifications from the proteins considered to potentially interact with claudins. While there was no evidence for an association of claudins with integrin-a3 and SUMO-1, an interaction of claudins with sphingosin kinase 2 could be neither confirmed nor disproved. Furthermore, an affinity purification was performed in order to enrich interaction partners of tricellulin. Claudin-4 was identified as a specific, potential interaction partner of tricellulin by quantitative mass spectrometric analysis whereas claudin 3 and -7 were determined to be enriched unspecifically. The present study demonstrates that a combination of affinity purification and quantitative mass spectrometry can substantially contribute to the elucidation of protein complexes.
APA, Harvard, Vancouver, ISO, and other styles
11

Wagenführ, Katja. "Die Typ III Restriktionsendonuklease EcoP15I." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2009. http://dx.doi.org/10.18452/15908.

Full text
Abstract:
EcoP15I gehört zu den heterooligomeren Typ III Restriktionsendonukleasen. Der multifunktionale Enzymkomplex ist aus zwei Modifikations- und zwei Restriktions-Untereinheiten aufgebaut und katalysiert sowohl die Spaltung als auch die Methylierung der DNA. Für die effektive Spaltung der doppelsträngigen DNA benötigt EcoP15I zwei invers orientierte Erkennungsorte mit der DNA-Sequenz 5’-CAGCAG. Die Spaltung erfolgt im oberen Strang 24 bis 26 Basen in 3’-Richtung nach dem Erkennungsort und im unteren Strang 26 bis 28 Basen in 5’-Richtung nach dem Erkennungsort. Aufgrund des bislang größten definierten Abstandes zwischen Erkennungs- und Spaltort ist EcoP15I ein wichtiges Werkzeug in der funktionellen Genomanalyse. Die Aufklärung der Domänenstruktur beider EcoP15I-Untereinheiten durch limitierte Proteolyse zeigte, dass die Restriktions-Untereinheit modular aufgebaut ist. Sie besteht aus zwei stabil gefalteten Domänen, der N-terminalen Translokase- und der C-terminalen Endo-Domäne. Beide Domänen sind durch einen flexiblen Linker verbunden. In der Modifikations-Untereinheit dagegen wurden keine Domänen identifiziert. Durch Insertion von Aminosäuren in und um den Linkerbereich konnten Enzymmutanten hergestellt werden, die bevorzugt die Positionen mit größten Abstand zum Erkennungsort spalteten. Wurden dagegen in dieser Region Aminosäuren deletiert, verloren die Enzymmutanten ihre DNA-Spaltaktivität. Die photochemische Vernetzung von EcoP15I mit spezifischer DNA ergab, dass EcoP15I drei Kontakte zum Phosphatrückgrat des ersten Adenins im Erkennungsort ausbildet. Ein Kontakt wird dabei über die Aminosäure S635 im C-terminalen Teil der Modifikations-Untereinheit hergestellt, zwei weitere entstehen durch die Aminosäuren Y248 und K421 der Restriktions-Untereinheit. Die transmissionselektronenmikroskopische Abbildung des negativ kontrastierten EcoP15I-Enzym zeigte einen symmetrischen Aufbau und stellt somit eine Grundlage für die Entwicklung eines dreidimensionalen Modells dar.
EcoP15I belongs to the hetero-oligomeric type III restriction endonucleases. The multifunctional enzyme complex consists of two modification and two restriction subunits and catalyses both the cleavage and the methylation of DNA. For effective cleavage of double-stranded DNA, EcoP15I needs two inversely oriented recognition sites with the DNA sequence 5'-CAGCAG. Cleavage occurs 24 to 26 bases in 3'-direction from the recognition sequence in the top strand and 26 to 28 bases in 5'-direction from the recognition sequence in the bottom strand. Because of this largest defined distance between recognition and cleavage site known so far, EcoP15I is an important tool in functional genomics. The elucidation of the domain structure of the EcoP15I restriction as well as the modification subunit by limited proteolysis showed that the restriction subunit has a modular structure. It consists of two stably folded domains, the N-terminal translocase domain and the C-terminal endonuclease domain. Both domains are connected by a flexible linker. In contrast to the restriction subunit, no domains could be detected in the modification subunit. Enzyme mutants that were constructed by insertion of amino acids in and around the linker region cleaved preferentially at the positions with the largest distance between recognition and cleavage site. The enzyme mutants lost their DNA cleavage activity when amino acids in this region were deleted. Photochemical crosslinking of EcoP15I with specific DNA showed that EcoP15I forms three contacts to the phosphate backbone of the first adenine of the recognition site. One contact is made by amino acid S635 in the C-terminal part of the modification subunit; two others are made by amino acids Y248 and K421 of the restriction subunit. The transmission electron microscopy image of the negatively stained EcoP15I enzyme showed a symmetric shape and thus provides a basis for the development of a three-dimensional model.
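As a small illustration of the sequence geometry described above (recognition sequence 5'-CAGCAG, cleavage roughly 24 to 26 bases downstream on the top strand, and the need for two inversely oriented sites), the Python sketch below scans a toy DNA string for sites on both strands; the example sequence and the exact offset arithmetic are assumptions made for the illustration.

```python
RECOGNITION = "CAGCAG"
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def find_sites(seq):
    """Return (position, strand) pairs for EcoP15I recognition sites.
    Positions are 0-based indices on the given (top) strand."""
    seq = seq.upper()
    sites = []
    for i in range(len(seq) - len(RECOGNITION) + 1):
        window = seq[i:i + len(RECOGNITION)]
        if window == RECOGNITION:
            sites.append((i, "+"))                      # site on the top strand
        if window.translate(COMPLEMENT)[::-1] == RECOGNITION:
            sites.append((i, "-"))                      # site on the bottom strand
    return sites

if __name__ == "__main__":
    dna = "TTCAGCAGAAAGGGTTTCCCAAAGGGTTTCCCAAACTGCTGTT"
    for pos, strand in find_sites(dna):
        if strand == "+":
            start = pos + len(RECOGNITION) + 24
            print(f"+ strand site at {pos}, top-strand cut expected ~{start}-{start + 2}")
        else:
            print(f"- strand site at {pos} (inversely oriented partner site)")
```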
APA, Harvard, Vancouver, ISO, and other styles
12

Topphoff, Ulf Schulze. "Mechanismen der Schädigung und der gestörten Regeneration im entzündeten zentralen Nervensystem." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2009. http://dx.doi.org/10.18452/15905.

Full text
Abstract:
Schädigungen im zentralen Nervensystem treten nicht nur bei der Multiplen Sklerose (MS), sondern auch bei einer Vielzahl weiterer entzündlicher Schadensparadigmen auf. Allgemeines Kennzeichen dieser primär wie auch sekundär entzündlichen neurodegenerativen Erkrankungen ist das Auftreten von oxidativem Stress in Verbindung mit einer eingeschränkten Regeneration von Nervenzellen und einem übermäßiges Auftreten von Astrozyten. Allerdings ist bislang nicht bekannt, welche Faktoren für eine frühe neuronale Schädigung verantwortlich sind, und welche Faktoren zu einem übermäßigen Auftreten von Astrozyten beitragen. Vorarbeiten belegten, dass Apoptose-regulierende Systeme, wie z.B. der TRAIL-Signalweg, sowohl an der Immunregulation als auch an Schädigungsprozessen im Gehirn beteiligt sein können. Im Rahmen dieser Arbeit wurde gezeigt, dass eine auf das ZNS beschränkte Blockade des TRAIL-Signalwegs in der EAE, dem Tiermodell der MS, zu einer signifikanten Verminderung des Erkrankungsgrades führte. Darüber hinaus wurde eine reduzierte Enzephalitogenität von TRAIL-defiziente Myelin-spezifischen Lymphozyten belegt. Die Ergebnisse deuten darauf hin, dass der TRAIL-vermittelte Schädigungsmechanismus die Pathogenese der Neuroinflammation entscheidend mitbestimmt und die immunregulatorische Wirkung eine eher untergeordnete Rolle spielt. Dagegen stellte sich im Tiermodell der bakteriellen Meningitis heraus, dass TRAIL hier eine anti-inflammatorische Rolle im ZNS spielt, die vor allem durch eine TRAIL-R-abhängige apoptotische Minderung der Entzündungsreaktion vermittelt wird. Eine Beeinflussung der Migration von Effektorzellen durch TRAIL konnte in diesem Modell ausgeschlossen werden. Anscheinend hängt die therapeutische Modulation des TRAIL-Systems entscheidend von der jeweils zu Grunde liegenden Ätiopathogenese ab und kann nicht allgemein auf entzündliche ZNS-Erkrankungen übertragen werden. Als mögliche Ursache für eine verminderte Regenerationsfähigkeit endogener Stammzellen konnte hier ein endogener Mechanismus aufgedeckt werden, der als Antwort auf oxidativen Stress zu einem quantitativen Überwiegen von Astrozyten führt. Dabei zeigte sich, dass nicht toxische oxidative Bedingungen das Proliferationsvermögen von neuralen Stammzellen deutlich hemmten und dazu führten, dass anstelle von Neuronen vornehmlich Astrozyten entstehen. Dieses veränderte Differenzierungsvermögen ließ sich sowohl in vitro als auch in vivo experimentell nachvollziehen und wies darauf hin, dass durch milde Entzündungsprozesse hervorgerufene basale metabolische Veränderungen die neuronale und astrogliale Entwicklung aus neuralen Stammzellen reziprok reguliert wird. In weiteren Untersuchungen stellte sich heraus, dass die Histondeacetylase Sirt1 in neuralen Stammzellen als Sensor für das Redox-Potenzial dient. Schon geringe metabolische Änderungen induzierten die Bindung an den bHLH-Transkriptionsfaktor Hes1, die zu einer direkten Modulation des pro-neuronalen Transkriptionsfaktors Mash1 führten und die Differenzierung von neuralen Stammzellen zugunsten der astroglialen Entwicklung beeinflussten. Die Aufklärung dieses Mechanismus könnte somit zukünftig helfen, intrinsische Regenerationsprozesse nach Schädigung des ZNS zu verstärken und damit neue therapeutische Perspektiven bei neurologischen Erkrankungen zu öffnen.
Damage processes of the central nervous system (CNS) are found not only in multiple sclerosis (MS) but also in a variety of other inflammatory diseases. A common feature of these inflammatory neurodegenerative disorders is the existence of oxidative stress in combination with a failure of neuronal replenishment and the predominant occurrence of astrocytes (known as astrogliosis). So far, the factors which are responsible for early neuronal damage and the overwhelming generation of astrocytes are not known. Recent studies have shown that the tumor necrosis factor-related apoptosis-inducing ligand (TRAIL) might be involved in immune regulation as well as in damage processes in the CNS. Here, it could be shown that blockade of TRAIL in the CNS of animals suffering from experimental autoimmune encephalomyelitis (EAE) significantly ameliorates the disease. Furthermore, transfer of myelin-specific TRAIL-deficient T cells into wild-type recipients led to a significantly attenuated disease score. These findings underline the contribution of TRAIL to irreversible CNS damage. In the adult mammalian brain, multipotent and self-renewing neural progenitor cells (NPCs) have the capacity to generate new neurons, astrocytes and oligodendrocytes. NPCs may thus serve as a regenerative tool by which brain damage could be compensated for. However, repair processes in response to all forms of neuronal injury, be they inflammatory, ischemic, metabolic, traumatic or other, are characterized by the failure of neuronal replenishment and the predominant occurrence of astrocytes. The common molecular pathways underlying this phenomenon are only poorly understood. Here, it could be shown that subtle alterations of the redox state, found in different brain damage scenarios, substantially regulate the fate of murine NPCs via the histone deacetylase silent mating type information regulation 2 homolog 1 (Sirt1). Mild oxidative conditions suppress proliferation of NPCs and direct their differentiation towards the astroglial at the expense of the neuronal lineage (and vice versa). Under oxidative conditions, NPCs upregulate Sirt1 in vitro and in vivo, which then binds to the transcriptional repressor Hes1 and finally downregulates the pro-neuronal basic helix-loop-helix transcription factor Mash1. Furthermore, it could be shown that targeted modulation of Sirt1 activity mimics the effects of subtle redox alterations. The results provide evidence for an as yet unknown metabolic master switch which determines the fate of NPCs. Targeting these mechanisms may minimize undesired aspects of reactive astrogliosis as well as improve the success of therapeutic neural stem cell implantation.
APA, Harvard, Vancouver, ISO, and other styles
13

Gupta, Shashank. "Identification and characterization of peptide-like MHC-ligand exchange catalyst as immune response enhancer." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2009. http://dx.doi.org/10.18452/15904.

Full text
Abstract:
MHC Klasse II Moleküle präsentieren Peptidantigene für die Überwachung durch CD4+ T Zellen an der Zelloberfläche. Um Sicherzustellen, dass diese Peptidliganden möglichst genau die intrazelluläre Proteinzusammensetzung widerspiegeln, hat sich im Verlauf der Evolution ein komplexer Prozessierungsweg entwickelt, welcher möglichst stabile Peptid/MHC Komplexe an die Zelloberfläche liefert. MHC Moleküle, welche ihren Liganden verloren haben, konvertieren zudem spontan in einen ‚nichtrezeptiven’ Zustand, was als zusätzlicher Sicherheitsmechanismus dient. Diese Studie zeigt jedoch, dass Aminosäureseitenketten kurzer Peptide diesen Sicherheitsmechanismus umgehen können indem sie katalytisch einen reversiblen Ligandenaustausch auslösen. Die katalytische Aktivität von Dipeptiden, wie z.B. Tyr-Arg (YR), war dabei stereospezifisch und konnte durch zusätzliche Modifikationen verstärkt werden, welche das konservierte H-Brückennetzwerk der so genannten P1-Tasche des MHC Moleküls adressierten. Die Dipeptide verstärkten dabei sowohl die Antigenbeladung als auch den Ligandenaustausch, wobei deren relative Aktivität genau mit den bekanten Ankerpräferenzen der P1 Tasche korrelierte. Letzteres weist somit auf eine direkte Interaktion der katalytischen Seitenkette des Dipeptides mit dieser Tasche hin. Der Verstärkungseffekt war auch in CD4+ T Zellassays zu beobachten, bei denen der alleleselektive Einfluss der Dipeptide direkt in eine deutliche Erhöhung der Sensitivität der antigenspezifischen T Zellantwort führte. Durch weitere molekulardynamische Berechnungen konnte die Hypothese unterstützt werden, dass die Besetzung der P1 Tasche durch Aminosäureseitenketten einen Kollaps der leeren Bindungstasche zum ‚nichtrezeptiven’ Zustand verhindert. Während der Antigenpräsentation könnte P1 somit unmittelbar als ‚Sensor’ für die Beladung mit Peptiden dienen. Diese Annahme konnte experimentell durch spektroskopische Untersuchungen unter Verwendung des ANS-Farbstoffes (8-Anilino-1-Naphtalensulfonsäure) sowie durch Messung der intrinsischen Tryptophanfluoreszenz bestätigt werden. Darüber hinaus konnten konformationsspezifische Antikörper, welche bislang lediglich mit unbeladenen MHC Molekülen in Verbindung gebracht wurden, hier als spezifische Sonden für den nichtrezeptiven Zustand definiert werden. Als mögliche Risikofaktoren könnten katalytische kurze Peptide eine Rolle bei der Auslösung von Autoimmunerkrankungen spielen. In dieser Studie konnte gezeigt werden, dass sie die Beladung von Glutenantigenen auf das Zöliakie-assozierte HLA-DQ2 Molekül verstärken können. Zumindest in vitro konnte ihre Anwesenheit deshalb auch die antigenspezifische Antwort von CD4+ T Zellen verstärken, welche zuvor von Zöliakiepatienten isoliert worden waren. Auf der einen Seite könnten diese Peptide als ‚MHC-loading enhancer’ (MLE) deshalb als mögliche Risikofaktoren die Ausbildung entzündlicher (Auto-) Immunerkrankungen beschleunigen. Auf der anderen Seite könnten sie jedoch auch als ‚drug-like’ Vakzinadditiv zur Verbesserung von Immuntherapien führen.
MHC class II molecules present antigenic peptides on the cell surface for surveillance by CD4+ T cells. To ensure that these ligands accurately reflect the content of the intracellular MHC loading compartment, a complex processing pathway has evolved that delivers only stable peptide/MHC complexes to the surface. As an additional safeguard mechanism, MHC molecules quickly acquire a ‘non-receptive’ state once they have lost their ligand. This study shows that amino acid side chains of short peptides can bypass these safety mechanisms by triggering reversible ligand exchange. The catalytic activity of dipeptides such as Tyr-Arg (YR) is stereo-specific and could be enhanced by modifications addressing the conserved H-bond network near the P1 pocket of the MHC molecule. It enhanced both antigen loading and ligand release and strictly correlated with reported anchor preferences of P1, the specific target site for the catalytic side chain of the dipeptide. The effect was evident also in CD4+ T cell assays, where the allele-selective influence of the dipeptides translated into increased sensitivities of the antigen-specific immune response. The hypothesis that occupation of P1 prevents the ‘closure’ of the ‘empty’ peptide binding site into the ‘non-receptive’ state was further supported by molecular dynamics calculations. During antigen processing and presentation, P1 may therefore function as an important ‘sensor’ for peptide load. Spectroscopic studies using ANS dye (8-anilino-1-naphthalenesulfonic acid) and intrinsic tryptophan fluorescence data confirm this postulate by providing direct evidence for the conformational transitions. Moreover, conformation-specific antibodies previously described to be specific for ‘empty’ MHC could be shown to be a ‘probe’ for the ‘receptive conformation’. As potent risk factors, short peptides may be involved in the induction of autoimmune diseases. It could be shown here that they can enhance the loading of gluten-derived antigen onto the celiac disease-linked HLA-DQ2 allele. At least in vitro, the effect could enhance the gluten-specific CD4+ T cell response of T cell clones obtained from celiac disease patients. Thus, on the one hand, short peptides might work as ‘MHC loading enhancers’ (MLE) in the precipitation of inflammatory ‘autoimmune’ disorders; on the other hand, they might be used as drug-like vaccine ‘additives’ in various therapeutic settings.
APA, Harvard, Vancouver, ISO, and other styles
14

Junhager, Carina. "Våld i Strängnäs stads tänkebok 1590-1599 : Våldsmål i Strängnäs rådstuga under 1590-talet." Thesis, Uppsala universitet, Historiska institutionen, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-351496.

Full text
Abstract:
The Middle Ages and the 1600s are often portrayed as a dark time in history, associated with violence and death. That has led to the common conclusion that it was a time when people, usually men, used violence to solve disputes and that it often led to manslaughter. Violence was an acknowledged part of everyday life in a time when honour and reputation were crucial in the hierarchy of society. By Lake Mälaren in the middle of Sweden lies Strängnäs, an old diocese town that rose up around the cathedral and monastery in the 1300s. In 1523 a crucial event took place in Strängnäs, one that would change not only Strängnäs but the whole of Sweden forever. Gustav Vasa was elected new king of Sweden and the Reformation began. The church lost its power and was no longer a threat to the crown's power over the realm. During this time the power of the king strengthened and the country's administration developed. But society is regulated by laws, and the development of society shines through in the enforcement of these laws. By studying the records from the courts you get a direct insight into how the law was practised in reality. In this study I examine the records from the municipal court of Strängnäs during 1590-1599 to see how law and order were enforced in a small Swedish town during the 1590s.
APA, Harvard, Vancouver, ISO, and other styles
15

Ha, Polly. "English Presbyterianism, c. 1590-1640." Thesis, University of Cambridge, 2006. https://www.repository.cam.ac.uk/handle/1810/272130.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Fiumana, Lorenzo. "Politiche di manutenzione su condizione e big data analisys in ottica industria 4.0 : il caso Sacmi S.C." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/15905/.

Full text
Abstract:
This study concerns condition-based maintenance applied to a plant for the production of plastic caps, carried out in collaboration with the SACMI S.C. group in Imola, within the Closures & Containers division. After introducing the topic of maintenance of production plants and its evolution over recent decades, attention turns to the company's activities and the context in which the study was carried out. A theoretical treatment of Big Data Analysis follows, covering how it can be exploited for predictive maintenance and for estimating the RUL of a particular component, with a focus on the most recent techniques applied in this field. The operation of the production line under examination is then explained in detail, and the management of the database of raw data collected by the sensors installed on the plant is addressed. After presenting the data, charts were generated for the different functional groups of the line over a period of nine months of production at a customer's site, in order to assess operating conditions, possible anomalies, and trends indicating wear or malfunction of one or more components. Whenever possible, the results were compared with the actual maintenance reports to confirm the robustness of the information generated by the sensors, with the possible long-term goal of laying the groundwork for a future condition-based maintenance system.
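The abstract describes using sensor trends for condition monitoring and for estimating the remaining useful life (RUL) of a component. One common minimal approach, shown here as a generic Python sketch with invented numbers rather than SACMI's actual method, is to fit a linear trend to a degrading health indicator and extrapolate the time at which it crosses a failure threshold.

```python
import numpy as np

def estimate_rul(times, health_indicator, failure_threshold):
    """Fit a linear trend to a monotonically degrading health indicator and
    extrapolate the time at which it crosses the failure threshold.
    Returns the remaining useful life from the last observation, or None
    if the fitted trend is not degrading."""
    slope, intercept = np.polyfit(times, health_indicator, 1)
    if slope <= 0:
        return None  # indicator not rising: no crossing predicted
    t_fail = (failure_threshold - intercept) / slope
    return max(t_fail - times[-1], 0.0)

if __name__ == "__main__":
    # e.g. weekly vibration RMS readings (arbitrary units) drifting upward
    weeks = np.arange(10, dtype=float)
    vibration = 1.0 + 0.08 * weeks + np.random.default_rng(1).normal(0, 0.02, 10)
    rul = estimate_rul(weeks, vibration, failure_threshold=2.5)
    print(f"estimated RUL: {rul:.1f} weeks")
```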
APA, Harvard, Vancouver, ISO, and other styles
17

Moretto, Irene. "Introduzione alle rappresentazioni lineari di gruppi finiti." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/15900/.

Full text
Abstract:
This thesis is an introduction to the representation theory of groups. For simplicity, it is restricted to finite-dimensional complex representations of finite groups; however, some results generalize to fields whose characteristic does not divide the order of the group. The basic definitions and first important results are given, including Maschke's theorem and Schur's lemma. Character theory is then discussed and some important examples of finite groups are treated. For each of these the character table is determined, and from these one can also derive the character tables of all finite groups up to order 11.
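As a small worked example of the character tables mentioned above (for the symmetric group S3, a standard illustration and not necessarily one of the groups treated in the thesis), the three irreducible complex representations, namely the trivial, the sign and the two-dimensional standard representation, give the following table over the conjugacy classes of the identity, the transpositions and the 3-cycles (class sizes in parentheses).

```latex
% Character table of S_3: columns are conjugacy classes (with sizes),
% rows are the irreducible complex characters.
\[
\begin{array}{c|ccc}
                    & e \; (1) & (12) \; (3) & (123) \; (2) \\ \hline
\chi_{\text{triv}}  & 1 & 1  & 1  \\
\chi_{\text{sign}}  & 1 & -1 & 1  \\
\chi_{\text{std}}   & 2 & 0  & -1 \\
\end{array}
\]
```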
APA, Harvard, Vancouver, ISO, and other styles
18

Balboni, Chiara. "Vedere oltre le figure. Un'esperienza didattica di geometria dinamica in una scuola secondaria di primo grado." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/15903/.

Full text
Abstract:
In the mid-twentieth century Emma Castelnuovo overturned the teaching conventions that had prevailed until then, proposing a mathematics that starts from the observation and manipulation of real objects and only later arrives at the formalization and abstraction proper to mathematics. From the concrete to the abstract, in four words: "seeing beyond the figures". The method she proposed is far removed from a transmissive model of knowledge; it rests instead on construction, discussion and discovery. The student is at the centre of this process and is guided to build his or her own knowledge and give it meaning. Castelnuovo thus anticipated, by some fifty years, current teaching trends and the present requirements of the Italian Ministry of Education. In an attempt to propose a non-procedural mathematics free of technicalities, a geometry workshop strongly inspired by Emma Castelnuovo's ideas was carried out with two classes of a lower secondary school. In this thesis, after a historical and cultural overview, an account is given of the activity carried out and of the reasons behind certain didactic choices. The project was called a "laboratory" precisely to indicate a teaching methodology in which trying things out, observing, building and exercising critical thinking were fundamental elements. An effort was made to create a climate of listening and dialogue with room for mistakes and alternative solutions, so as to build cohesion in the class and involve all pupils, even those at different levels. During the project, the properties of quadrilaterals and the areas of plane figures were treated through dynamic models built by the students and through the use of the geoboard.
APA, Harvard, Vancouver, ISO, and other styles
19

Gridelli, Eleonora. "Algebra modulare e cifre di controllo per il rilevamento di errori." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/15906/.

Full text
Abstract:
All information is encoded in sequences of bits and, to ensure its reliability and completeness, it must be accompanied by check digits, which must at least allow the recognition of the frequent, non-malicious errors that occur while data are read or transmitted. There are two possible types of code for maintaining data integrity: error detection and error correction. To determine the check digits, the strings to be protected are processed in different ways: one can perform calculations on the data sequences, apply notions of modular algebra, or implement algorithms. In this thesis we illustrate how groups can be used to build methods capable of detecting errors and thus guaranteeing the reliability of the information received. In the first chapter the simplest type of code, the one for the binary system, is presented. We then consider the types of errors that occur when more than two characters are available, and try to understand the importance of studying the frequency of such errors in order to speak of prevention. The second chapter covers the theory on which the methods studied later rest, namely cyclic groups and dihedral groups. The third chapter is about applications: the topics treated earlier are brought into our specific case and their properties are exploited to study techniques for computing the check digit used to detect errors. Finally, some examples are presented.
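A minimal sketch of the cyclic-group case in Python (the classical ISBN-10 scheme over the integers mod 11; the dihedral-group constructions studied in the thesis are not shown here):

def isbn10_check_digit(body):
    """Check digit for the first nine ISBN-10 digits, computed in the cyclic group Z/11.
    A full code d1..d10 is valid when sum(i * d_i) is congruent to 0 mod 11."""
    total = sum((i + 1) * d for i, d in enumerate(body))   # weights 1..9 on the body digits
    check = total % 11                                      # d10 carries weight 10 = -1 mod 11
    return 'X' if check == 10 else str(check)

def isbn10_is_valid(code):
    digits = [10 if c == 'X' else int(c) for c in code]
    return sum((i + 1) * d for i, d in enumerate(digits)) % 11 == 0

print(isbn10_check_digit([0, 3, 0, 6, 4, 0, 6, 1, 5]))  # -> '2'
print(isbn10_is_valid("0306406152"))                     # -> True

Since 11 is prime and the weights 1, ..., 10 are distinct and nonzero mod 11, this code detects every single-digit error and every transposition of two distinct digits.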
APA, Harvard, Vancouver, ISO, and other styles
20

Sandri, Serena. "PROOFS WITHOUT WORDS: dal linguaggio illustrato al registro formale. Una sperimentazione nelle classi di un liceo." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/15908/.

Full text
Abstract:
This thesis is about Proofs Without Words, realised with GeoGebra: I animated some of the classic proofs without words and proposed them to students as a starting point for constructing a deductive argument to be subsequently written up. The final focus of the experiment is to record the students' behaviour as they try to express and formalise discursively what they have done. Alongside this primary research core, the work includes an investigation of the relationship between the formal register and the visual one. In the first chapter I introduce and present PWWs. In the second chapter I lay out the theoretical foundations within which I built the experiment, and which guide the reading and analysis of the results: visualisation and argumentation in mathematics. In the third chapter I set out the specific questions motivating the thesis and describe in detail the design of the experiment, the choice of the proofs without words, the various ways in which I animated them, and the activities proposed to the students in class. In the fourth chapter I present the results obtained from reading and analysing the protocols collected in the classes. In particular, the analysis of the formalised proofs written by the pupils is conducted following precise observable criteria, some strictly concerning the form of the proofs, others specific to the content of the written work, and still others halfway between the two previous types. The aim is also to investigate what role the language used in the formalisations plays, whether it makes a deductive process explicit or takes on a purely descriptive role. In the last chapter I draw some conclusions on the possible use of Proofs Without Words as a valid and effective (or otherwise) teaching tool in a course aimed at initiating or consolidating formalised, structured argumentative thinking.
APA, Harvard, Vancouver, ISO, and other styles
21

Cordella, Davide. "Equilibrium measures on trees and square tilings." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/15909/.

Full text
Abstract:
The thesis treats some topics in potential theory, in particular the notions of the capacity of a set, the equilibrium function and the equilibrium measure. Definitions and the main results of axiomatic potential theory are presented. These results are then translated into the discrete setting of trees, defining the boundary of a tree as a metric space and the capacities of its subsets: to these are associated the corresponding equilibrium functions on the edges of the tree, which in turn correspond to equilibrium measures on the boundary. One of the main results of the thesis is the characterisation of equilibrium measures on trees in terms of a discrete non-linear equation. To an equilibrium measure one can associate a square tiling, that is, the subdivision of a rectangle into non-overlapping square tiles. Some examples of trees, equilibrium measures and associated tilings are presented. Drawing a parallel with a theorem of I. Benjamini and O. Schramm that associates the tiling of a cylinder to a more general graph, an interpretation of equilibrium measures and capacities on trees is given in probabilistic terms by considering random walks. Finally, a result of O. Schramm on the construction of finite square tilings with an assigned combinatorial structure is analysed.
APA, Harvard, Vancouver, ISO, and other styles
22

Sighinolfi, Silvia. "Valutazione dell'Helichrysum italicum (Roth) G. Don per un possibile impiego in attività di fitorimedio di suoli contaminati." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/15901/.

Full text
Abstract:
Phytoremediation belongs to the field of new remediation interventions as a green, sustainable and low-cost technique that can replace conventional engineering techniques under low to moderate levels of contamination. It is based on the use of plants to contain, remove or degrade contaminants present in soil, sediments and water. In this study the species H. italicum was evaluated for the application of phytoremediation techniques to soils deriving from mining activity. Sampling was carried out in the Montevecchio mining district (SU) and in other non-mining sites used as reference and control, located both in areas adjacent to the Montevecchio mining district and in areas of the Tuscan-Emilian-Romagna Apennines. At each site, rhizosphere soil and plant samples of the species H. italicum were collected. The study focused on five widespread potentially toxic elements (PTEs), namely Cu, Ni, Pb, Zn and Cd. To assess at which level of the plant they are accumulated, each plant sample was divided into three portions (roots, stem and leaves) that were analysed separately; in addition, the total and bioavailable concentrations of the five metals were determined in the sampled rhizosphere soils. The study showed that H. italicum tends to accumulate the investigated contaminants mainly in the roots, but it shows good accumulation of Zn in the leaves. Thanks to the plant's good ability to accumulate contaminants in the root system and to transfer Zn to the foliage, H. italicum can be considered a good candidate for phytoremediation interventions.
APA, Harvard, Vancouver, ISO, and other styles
23

Zapparrata, Sergio. "Tecnologie di separazione e recupero del compound di PVC da cavi elettrici a fine vita." Master's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/15904/.

Full text
Abstract:
In the present study, construction and processability parameters of end-of-life cables and parameters identifying their plastics were considered. From the first set, data on the type of conductor, the number of cores in the cable and their diameter demonstrated the possibility of recovering cable segments through granulation. The type of insulation and the designation codes, where available, made it possible to define the average polymer composition of the end-of-life cable samples. The figure used for the average polymer composition refers to the presence of such materials in at least one of the structural components of the end-of-life cable for which data were available, either supplied directly by the manufacturer or obtained by decoding the identification codes. PVC compound is the prevalent type in the three samples representative of the population of end-of-life electrical cables (77.8%, 45% and 80%). Partly for environmental reasons, its recovery, as defined by the R3-R13 activities, was studied. Electrostatic technologies based on the corona and triboelectric effects demonstrate the possibility of separating PVC compound granules from, respectively, granules of conductors (copper and aluminium) and of plastics (PE and rubber) making up the insulation and sheaths of end-of-life electrical cables. By applying these techniques, 50.3% of the PVC compound was recovered at 95% purity. Its mechanical recycling can be completed with melt filtration combined with blending with virgin polymer. Using a methodology based on the Technology Readiness Level (TRL), it was shown that mechanical recycling combined with filtration of the molten polymer is a strongly competitive alternative to recycling via the Vinyloop® process. At present, it is a technologically more qualified and mature alternative than chemical recycling via conventional incineration.
APA, Harvard, Vancouver, ISO, and other styles
24

Banks, Alec. "A nature inspired guidance system for unmanned autonomous vehicles employed in a search role." Thesis, Bournemouth University, 2009. http://eprints.bournemouth.ac.uk/15906/.

Full text
Abstract:
Since the very earliest days of the human race, people have been studying animal behaviours. In those early times, being able to predict animal behaviour gave hunters the advantages required for success. Then, as societies began to develop, this gave way, to an extent, to agriculture, and early studies, much of it trial and error, enabled farmers to successfully breed and raise livestock to feed an ever growing population. Following the advent of scientific endeavour, more rigorous academic research has taken human understanding of the natural world to much greater depth. In recent years, some of this understanding has been applied to the field of computing, creating the more specialised field of natural computing. In this arena, a considerable amount of research has been undertaken to exploit the analogy between, say, searching a given problem space for an optimal solution and the natural process of foraging for food. Such analogies have led to useful solutions in areas such as numerical optimisation and communication network management, prominent examples being ant colony systems and particle swarm optimisation; however, these solutions often rely on well-defined fitness landscapes that may not always be available. One practical application of natural computing may be to create behaviours for the control of autonomous vehicles that would utilise the findings of ethological research, identifying the natural world behaviours that have evolved over millennia to surmount many of the problems that autonomous vehicles find difficult; for example, long range underwater navigation or obstacle avoidance in fast moving environments. This thesis provides an exploratory investigation into the use of natural search strategies for improving the performance of autonomous vehicles operating in a search role. It begins with a survey of related work, including recent developments in autonomous vehicles and a groundbreaking study of behaviours observed within the natural world that highlights general cooperative group behaviours, search strategies and communication methods that might be useful within a wider computing context beyond optimisation, where the information may be sparse but new paradigms could be developed that capitalise on research into biological systems that have developed over millennia within the natural world. Following this, using a 2-dimensional model, novel research is reported that explores whether autonomous vehicle search can be enhanced by applying natural search behaviours for a variety of search targets. Having identified useful search behaviours for detecting targets, it then considers scenarios where detection is lost and whether natural strategies for re-detection can improve overall systemic performance in search applications. Analysis of empirical results indicates that search strategies exploiting behaviours found in nature can improve performance over random search and commonly applied systematic searches, such as grids and spirals, across a variety of relative target speeds, from static targets to twice the speed of the searching vehicles, and against various target movement types such as deterministic movement, random walks and other nature inspired movement. It was found that strategies were most successful under similar target-vehicle relationships as were identified in nature. Experiments with target occlusion also reveal that natural reacquisition strategies could improve the probability of target redetection.
APA, Harvard, Vancouver, ISO, and other styles
25

Kadlec, Petr. "On robust and adaptive soft sensors." Thesis, Bournemouth University, 2009. http://eprints.bournemouth.ac.uk/15907/.

Full text
Abstract:
In process industries, there is a great demand for additional process information such as the product quality level or the exact process state estimation. At the same time, there is a large amount of process data like temperatures, pressures, etc. measured and stored every moment. This data is mainly measured for process control and monitoring purposes but its potential reaches far beyond these applications. The task of soft sensors is the maximal exploitation of this potential by extracting and transforming the latent information from the data into more useful process knowledge. Theoretically, achieving this goal should be straightforward since the process data as well as the tools for soft sensor development in the form of computational learning methods, are both readily available. However, contrary to this evidence, there are still several obstacles which prevent soft sensors from broader application in the process industry. The identification of the sources of these obstacles and proposing a concept for dealing with them is the general purpose of this work. The proposed solution addressing the issues of current soft sensors is a conceptual architecture for the development of robust and adaptive soft sensing algorithms. The architecture reflects the results of two review studies that were conducted during this project. The first one focuses on the process industry aspects of soft sensor development and application. The main conclusions of this study are that soft sensor development is currently being done in a non-systematic, ad-hoc way which results in a large amount of manual work needed for their development and maintenance. It is also found that a large part of the issues can be related to the process data upon which the soft sensors are built. The second review study dealt with the same topic but this time it was biased towards the machine learning viewpoint. The review focused on the identification of machine learning tools, which support the goals of this work. The machine learning concepts which are considered are: (i) general regression techniques for building of soft sensors; (ii) ensemble methods; (iii) local learning; (iv) meta-learning; and (v) concept drift detection and handling. The proposed architecture arranges the above techniques into a three-level hierarchy, where the actual prediction-making models operate at the bottom level. Their predictions are flexibly merged by applying ensemble methods at the next higher level. Finally from the top level, the underlying algorithm is managed by means of metalearning methods. The architecture has a modular structure that allows new pre-processing, predictive or adaptation methods to be plugged in. Another important property of the architecture is that each of the levels can be equipped with adaptation mechanisms, which aim at prolonging the lifetime of the resulting soft sensors. The relevance of the architecture is demonstrated by means of a complex soft sensing algorithm, which can be seen as its instance. This algorithm provides mechanisms for autonomous selection of data preprocessing and predictive methods and their parameters. It also includes five different adaptation mechanisms, some of which can be applied on a sample-by-sample basis without any requirement to store the on-line data. Other, more complex ones are started only on-demand if the performance of the soft sensor drops below a defined level. The actual soft sensors are built by applying the soft sensing algorithm to three industrial data sets. 
The different application scenarios aim at the analysis of the fulfilment of the defined goals. It is shown that the soft sensors are able to follow changes in a dynamic environment and keep a stable performance level by exploiting the implemented adaptation mechanisms. It is also demonstrated that, although the algorithm is rather complex, it can be applied to develop simple and transparent soft sensors. In another experiment, the soft sensors are built without any manual model selection or parameter tuning, which demonstrates the ability of the algorithm to reduce the effort required for soft sensor development. However, if desirable, the algorithm is at the same time very flexible and provides a number of parameters that can be manually optimised. Evidence is also given that the algorithm can deploy soft sensors with minimal training data, offering the possibility of avoiding time-consuming and costly training data collection.
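A minimal sketch (hypothetical interface and weighting scheme, not the algorithm developed in the thesis) of the middle, ensemble level of such an architecture, where base-model predictions are merged with weights that adapt to recent errors so the combined soft sensor can track drift:

import numpy as np

class AdaptiveEnsemble:
    """Merge base-model predictions with weights driven by running prediction errors."""

    def __init__(self, models, forgetting=0.95):
        self.models = models                  # objects exposing predict(x) -> float (assumed interface)
        self.err = np.ones(len(models))       # running absolute-error estimate per model
        self.forgetting = forgetting

    def predict(self, x):
        preds = np.array([m.predict(x) for m in self.models])
        weights = 1.0 / (self.err + 1e-9)     # better recent accuracy -> larger weight
        weights /= weights.sum()
        return float(weights @ preds), preds

    def update(self, preds, y_true):
        # Exponentially forgetting error estimate: adapts sample-by-sample, no data stored.
        self.err = self.forgetting * self.err + (1.0 - self.forgetting) * np.abs(preds - y_true)

class Constant:
    """Trivial stand-in base model, only to make the sketch runnable."""
    def __init__(self, c): self.c = c
    def predict(self, x): return self.c

ens = AdaptiveEnsemble([Constant(1.0), Constant(2.0)])
y_hat, preds = ens.predict(None)
ens.update(preds, y_true=1.2)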
APA, Harvard, Vancouver, ISO, and other styles
26

Skidmore, David. "Pedagogical discourse and the dynamic of school development." Thesis, University of Reading, 1998. http://opus.bath.ac.uk/15906/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
27

Dogar, Omara. "Understanding variations in the outcome of a smoking cessation programme in tuberculosis patients in Pakistan : the role of fidelity?" Thesis, University of York, 2016. http://etheses.whiterose.ac.uk/15900/.

Full text
Abstract:
I conducted a review to show that smoking cessation in TB patients can have positive TB outcomes. Behavioural interventions (BI) for smoking cessation are known to be effective and cost effective but there are important variations in quit rates that are hard to explain. My aim was to develop and validate a fidelity index for BIs in smoking cessation, and use it to explore the extent to which variations in smoking quit outcomes reflect the degree to which providers implement the various components and the quality of delivery of a smoking cessation programme in TB treatment settings. I developed and tested a theoretically mapped fidelity index comprising two sub-indices: ‘Adherence’ for measuring compliance (37 items) and ‘Quality’ for measuring provider competence (8 items). Items were rated as fully, partially, or not implemented against a behaviourally anchored response scale. Fidelity was measured in a prospective study of 18 providers in TB clinics in Pakistan (154 patients) whose sessions were audio-recorded and then coded. These providers had participated in delivering the same BIs as part of the ASSIST trial four years earlier. Reliability was assessed in three ways. There was good inter-coder reliability using Krippendorff’s alpha, Principal Components Analysis showed the items of the index were coherent in measuring fidelity and Generalisability theory showed that the index reliably differentiated between providers by capturing the variation in their BI delivery practice. Provider Adherence and Quality of provision were positively correlated. I tested the assumption that relative provider practice was consistent between the ASSIST trial in 2010 and the fidelity study in 2014. Using Kendall’s W coefficient of concordance on data from a self-recorded checklist used in both studies, I found moderate to strong concordance. I then used binomial regression analysis to estimate the relationship between fidelity and ASSIST trial provider-level quit rates. This showed that the provider-level quit rate was positively associated with Quality of interaction (odds ratio: 2.15; 95% CI, 1.43 to 3.24) but negatively associated with Adherence to BI content (odds ratio: 0.55; 95% CI, 0.40 to 0.77). A negative interaction was found between Adherence and Quality and quit rates. This research makes several contributions to the field. We have a better understanding of the impact of smoking on TB. I developed and validated a new, theoretically-informed fidelity index which can be used to quantify and score delivery of BI ingredients in a standardised way and used to better understand how BIs influence outcomes. I report that the quality of delivery of BIs may be as important as content in influencing smoking cessation.
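As an illustration of the kind of provider-level binomial regression described above (the data, scaling and interaction term here are hypothetical, not the thesis data), one possible sketch in Python with statsmodels:

import numpy as np
import statsmodels.api as sm

# Hypothetical provider-level data: quits out of patients counselled, plus fidelity scores.
quits     = np.array([3, 5, 2, 7, 4, 6])
patients  = np.array([10, 12, 8, 15, 9, 14])
adherence = np.array([0.6, 0.8, 0.5, 0.9, 0.7, 0.4])   # Adherence sub-index, scaled 0-1
quality   = np.array([0.7, 0.6, 0.8, 0.5, 0.9, 0.8])   # Quality sub-index, scaled 0-1

X = sm.add_constant(np.column_stack([adherence, quality, adherence * quality]))
model = sm.GLM(np.column_stack([quits, patients - quits]), X,
               family=sm.families.Binomial())
fit = model.fit()
print(np.exp(fit.params))   # exponentiated coefficients read as odds ratios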
APA, Harvard, Vancouver, ISO, and other styles
28

O'Connor, Dominic. "A novel heat recovery device for passive ventilation systems." Thesis, University of Sheffield, 2016. http://etheses.whiterose.ac.uk/15904/.

Full text
Abstract:
The purpose of this study was to assess the performance of a novel heat recovery device for integration with a passive ventilation system. Current heat recovery devices are not suitable for integration with passive ventilation systems due to the high pressure drop experienced by airstreams across the devices. This would result in ventilation supply rates too low for good indoor air quality. The device could also be reconfigured to dehumidify an incoming airstream, lowering the relative humidity of the air. These modifications would improve the air quality and reduce energy demand on mechanical ventilation systems. The novel heat recovery device was designed and constructed using 3D printing techniques and tested experimentally using different inlet conditions for two counter-current airstreams. Numerical analysis using Computational Fluid Dynamics (CFD) calculated solutions for air velocity, gauge air pressure, air temperature and relative humidity before and after the heat recovery device using the same geometry as the printed prototype. The experimental testing of prototypes of the heat recovery device validated these characteristics. The results from the experiments and CFD analysis showed that the novel design of the heat recovery device achieved the three primary objectives of the project. The pressure drop measured across the heat recovery device was between 10.02 and 10.31 Pa, significantly lower than the 150 Pa experienced in standard devices. The resultant air velocity suggested that an air supply rate of 140.86 litres per second was possible, high enough to provide ventilation to a room with 17 occupants. The device was capable of increasing the temperature of the incoming airstream by up to 0.68°C when the temperature of the outgoing airstream was 40°C. Finally, the relative humidity of an incoming airstream with 100% relative humidity was reduced by up to 67.01% at regeneration temperatures between 25 and 40°C, significantly lower than current regeneration temperatures of 120°C.
APA, Harvard, Vancouver, ISO, and other styles
29

Kasdirin, Hyreil. "Adaptive bio-inspired firefly and invasive weed algorithms for global optimisation with application to engineering problems." Thesis, University of Sheffield, 2016. http://etheses.whiterose.ac.uk/15905/.

Full text
Abstract:
The focus of the research is to investigate and develop enhanced versions of the swarm-intelligence firefly algorithm and the ecology-based invasive weed algorithm to solve global optimisation problems and apply them to practical engineering problems. The work presents two adaptive variants of the firefly algorithm by introducing a spread factor mechanism that exploits the fitness intensity during the search process. The spread factor mechanism is proposed to enhance the adaptive parameter terms of the firefly algorithm. The adaptive algorithms are formulated to avoid premature convergence and achieve better optimum solution values. Two new adaptive variants of the invasive weed algorithm are also developed, with a seed spread factor mechanism introduced in the dispersal process of the algorithm. The working principles and structure of the adaptive firefly and invasive weed algorithms are described and discussed. A hybrid invasive weed-firefly algorithm and a hybrid invasive weed-firefly algorithm with spread factor mechanism are also proposed. The new hybridisation algorithms are developed by retaining their individual advantages to help overcome the shortcomings of the original algorithms. The performances of the proposed algorithms are investigated and assessed on single-objective, constrained and multi-objective optimisation problems. Well-known benchmark functions as well as current CEC 2006 and CEC 2014 test functions are used in this research. A selection of performance measurement tools is also used to evaluate the performances of the algorithms. The algorithms are further tested with practical engineering design problems and in modelling and control of dynamic systems. The systems considered comprise a twin rotor system, a single-link flexible manipulator system and assistive exoskeletons for upper and lower extremities. The performance results are evaluated in comparison to the original firefly and invasive weed algorithms. It is demonstrated that the proposed approaches are superior to the individual algorithms in terms of efficiency, convergence speed and quality of the optimal solution achieved.
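For orientation, a minimal sketch of the standard (non-adaptive) firefly update that such variants build on; the spread factor mechanism itself is not reproduced here, and the parameter values are illustrative assumptions:

import numpy as np

def firefly_step(X, objective, alpha=0.2, beta0=1.0, gamma=1.0):
    """One iteration of the basic firefly algorithm for minimisation.
    X is an (n, d) array of firefly positions; objective maps a position to a scalar."""
    f = np.array([objective(x) for x in X])
    n, d = X.shape
    for i in range(n):
        for j in range(n):
            if f[j] < f[i]:                               # firefly j is brighter, so i moves towards it
                r2 = np.sum((X[i] - X[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)        # attractiveness decays with distance
                X[i] = X[i] + beta * (X[j] - X[i]) + alpha * (np.random.rand(d) - 0.5)
    return X

# Example: a few iterations on the sphere function, starting from random positions.
X = np.random.uniform(-5.0, 5.0, size=(15, 2))
for _ in range(50):
    X = firefly_step(X, lambda x: float(np.sum(x ** 2)))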
APA, Harvard, Vancouver, ISO, and other styles
30

Quan, Guan. "A component-based approach to modelling beam-end buckling adjacent to beam-column connections in fire." Thesis, University of Sheffield, 2016. http://etheses.whiterose.ac.uk/15908/.

Full text
Abstract:
The investigation of the collapse of “7 World Trade” as part of the events of 11 September 2001 in New York City (Gann, 2008) indicated that connections were among the most vulnerable elements of steel-framed or composite buildings, and their characteristics can determine whether such buildings survive in extreme scenarios such as fire. In this case total collapse of the building was triggered by the fracture of beam-to-column connections caused largely by thermal expansion of long-span beams. This emphasized the importance of investigating the complex mechanisms through which forces are transferred from the adjacent parts of a structure to the connections under fire conditions. The Cardington fire tests in 1995-96 (Newman, 2000) provided ample evidence that both shear buckling of beam webs and beam bottom-flange buckling, near to the ends of steel beams, are very prevalent under fire conditions. Both of these phenomena could affect the force distribution at the adjacent column-face connection bolt rows, and therefore the sequence of fracture of components. However, there is a distinct lack of practical research investigating the post-buckling behaviour of beams of Classes 1 and 2 sections adjacent to connections at elevated temperatures. In this PhD thesis, the development of analytical models of pure beam-web shear buckling and a combination of both beam-web shear buckling and bottom-flange buckling of beams of Classes 1 and 2 sections are reported. The analytical models are able to predict the post-buckling behaviour of the beam-end buckling panels in the vicinity of beam-column connections at elevated temperatures. A transition criterion, to distinguish between cases in which pure beam-web shear buckling occurs and those in which the instability is a combination of shear buckling and bottom-flange buckling, has been proposed, including a calculation procedure to detect the transition length between these two buckling modes. A component-based buckling element has been created and implemented in the three-dimensional structural fire analysis software Vulcan. The influence of the buckling elements on the bolt row force redistribution of the adjacent connections has been investigated in isolated beams and a simple two-span two-floor frame. It is expected that the buckling element will be involved in more complex performance-based frame analysis for design, and that it will be used with an explicit dynamic procedure to simulate local and progressive collapse of whole buildings.
APA, Harvard, Vancouver, ISO, and other styles
31

Peyvast, Negin. "Quantum well and quantum dot broadband optical devices for optical coherence tomography applications." Thesis, University of Sheffield, 2016. http://etheses.whiterose.ac.uk/15906/.

Full text
Abstract:
In this thesis quantum well (QW) and quantum dot (QD) based devices are investigated with the aim of obtaining broad bandwidth light sources for optical coherence tomography (OCT) applications. QD based structures have many possible advantages for broadband applications due to their inhomogeneous broadening. However, more investigation is required in order to fulfill this potential. Firstly, in chapter one, an introduction to the fundamental principles of semiconductor heterostructures is provided, followed by basic concepts of OCT. The experimental techniques used in this thesis are outlined and briefly discussed. Brief reviews of the gain measurement techniques which have been used throughout this thesis are presented. Free carrier effects have been highlighted as a source of line-width broadening in QD structures. However, to date the effects of free carriers have mostly been experimentally determined at comparatively high carrier densities. In chapter 2 I develop a model for the gain and spontaneous emission spectra of QD active elements and show that not only are free carrier effects important at high QD occupancies, but also at much lower carrier densities where QD lasers would normally operate. Furthermore, it is shown that the choice of carrier distribution function is far less important than was previously thought in describing the experimentally observed gain and spontaneous emission spectra. The literature has suggested that incorporating QW layers in hybrid QW/QD structures changes the behaviour of the QDs. Optical pumping of the QD active element by emission from the QW active element is investigated experimentally in chapter 3. Analysis of a QD laser, a hybrid QW/QD superluminescent diode (SLD) and mesa diodes with different active element designs shows that emission from the quantum well layer does indeed modify the QD spontaneous emission, suggesting optical pumping of the QD states and the prospect for enhanced gain from the QD ground-state. Finally in chapter 4, different configurations of swept light sources (SLSs) are implemented with the aim of obtaining broader spectral bandwidth. It is demonstrated that increasing the gain of the QD-SOA is important in enhancing the sweep range. The use of complementary SOAs is then explored. InP QW-SOAs and GaAs based QD-SOAs have overlapping gain and SE spectra, which is utilised in a swept source laser (SSL) and filtered ASE configuration SLS. The results suggest that such sources may be able to achieve ~220 nm sweep bandwidths. Chapter 5 summarizes the whole thesis and provides an overview of future work.
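For context (a standard OCT relation, not a result of this thesis): for a source with a Gaussian spectrum of centre wavelength \lambda_0 and FWHM bandwidth \Delta\lambda, the axial resolution in air is

\[
\delta z = \frac{2\ln 2}{\pi}\,\frac{\lambda_0^{2}}{\Delta\lambda},
\]

so, at a fixed centre wavelength, roughly doubling the usable bandwidth (for example towards the ~220 nm sweep mentioned above) roughly halves \delta z, which is why broadband QW/QD sources are attractive for OCT.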
APA, Harvard, Vancouver, ISO, and other styles
32

Ryan, Thomas Anthony. "Defining the early cellular response to mitochondrial stress : implications for Parkinson's." Thesis, University of Leeds, 2016. http://etheses.whiterose.ac.uk/15902/.

Full text
Abstract:
Parkinson’s disease (PD) is characterised by loss of dopaminergic neurons within the substantia nigra pars compacta. Mitochondrial dysfunction and oxidative stress are pathological characteristics of PD. It is not fully understood how neurons respond to these stresses, though adaptive responses involving the activity of PD-associated proteins such as Parkin may play a critical role. When these adaptive responses are overcome, neurons are lost through apoptotic mechanisms aimed at minimising further tissue damage. AP-1 transcription factors are encoded by immediate early genes (IEGs) such as FOS and JUN. They regulate early cellular responses to stress. Mitogen-activated protein kinases (MAPKs), such as JNK and ERK, regulate AP-1 function. Previous studies have implicated MAPK/AP-1 signalling in the progression of PD. The aim of this thesis was to elucidate whether the AP-1 system was involved in the response to mitochondrial stress in neurons and to dissect the signalling pathways by which this was regulated. SH-SY5Y neuroblastoma cells were used as a model system. CCCP-induced mitochondrial uncoupling induced oxidative stress in this cell line. In response to this, AP-1 levels were modulated at both the mRNA and protein levels. SH-SY5Y differentiation or Parkin overexpression altered AP-1 protein response profiles. Treatment with c-Jun siRNA led to an increase in cell death induced by lower levels of mitochondrial uncoupling, but a decrease under higher levels of uncoupling. Additionally, a novel link between Parkin and the AP-1 response was observed and investigated. Further work indicated that both JNK and ERK could regulate c-Jun function. Critically, this was dependent on both cell differentiation and the level of stress. Notably, JNK signalling was the primary promoter of apoptosis in response to high levels of mitochondrial stress in differentiated cells. This study suggests that the JNK/c-Jun pathway is differentially regulated by the level of mitochondrial stress, contributing to both adaptive and apoptotic responses and thereby playing a central role in deciding neuronal fate. It further suggests that enhancing adaptive cytoprotective pathways represents a better approach than inhibiting apoptotic pathways when developing therapeutics for the neurodegenerative progression observed in PD.
APA, Harvard, Vancouver, ISO, and other styles
33

Deakin, Robert Luke. "Economic Information Warfare : Analysis of the relationship between the protection of Financial Information Infrastructure and Australia's National Security." Queensland University of Technology, 2003. http://eprints.qut.edu.au/15900/.

Full text
Abstract:
The thesis presents an argument for the re-alignment of Australia's National Security efforts so that they mirror the changes occurring in our economy and society. Specifically, it seeks outcomes that can focus our efforts in protecting the critical infrastructure, including our financial information infrastructure, from new and emerging threats. The thesis presents a definition of security as it applies to the Nation-State and provides the evidence of the changes in conflict and global security. It outlines the changes in the National Security environment and identifies emerging forms of conflict that affect national economic systems and financial information infrastructures. It shows how the traditional view of National Security is being eroded by the Information Age. To support this argument a view of the key forces that are changing weapons, warfare, and approaches to National Security is provided. The thesis presents a view of the unconventional threats to economic systems. In particular, it describes the targeting of critical national information infrastructures including energy, banking and finance, transportation, vital human services, and telecommunications. The thesis outlines concern regarding the vulnerability of developed nations' social structures through increasing dependence on information and communications technology (ICT) systems, and the small population of specialists that support and protect them. The thesis argues that this dependency exposes national and global economies to new threats and forms of attack. The thesis consists of an overview of the evolution of National Security approaches. In particular, it examines collective security and security concepts as they relate to the Nation-State. It identifies a number of drivers, which outline the changing nature of the global security environment. The thesis describes asymmetric warfare and discusses Information Warfare (IW), describing its elements and goals. An analysis of offensive and defensive information warfare, strategies, and operations is provided. This is then related to economic systems and how they have featured in conflicts between nations throughout history. The information and communications technology (ICT) that underpins the financial environment (in particular payment systems) is reviewed and the thesis closes by focusing on the elements of economic infrastructure assets and the types of attacks from which they require protection. We conclude that the challenges for Critical Infrastructure protection in Australia are daunting and a number of recommendations are made that should be considered in light of the rapidly changing security environment. The challenges of Information Warfare will stretch Australia's resources, require re-conceptualising of our national defence forces, greater participation of the private sector, and changes in our daily lives.
APA, Harvard, Vancouver, ISO, and other styles
34

Williams, Elizabeth A. "The Globalisation Of Regulation And Its Impact On The Domain Name System : Domain Names And A New Regulatory Economy." Queensland University of Technology, 2003. http://eprints.qut.edu.au/15901/.

Full text
Abstract:
This is a multidisciplinary work that encompasses considerations of politics, regulation and technology. It considers the impact of technology on the way in which, politically, we are able to regulate technology and how we devise policy to guide that regulation. The added complication is that Internet technology knows no jurisdiction. The rulemaking established in recent years is globally applicable and is carried out without the direct involvement of national governments in the key decision making processes, particularly in the environment under examination here which focuses on the management of the technical resources of the Internet. In formulating the hypothesis that grounds this work, I have focused on two things. Firstly, that technical regulation has political, and therefore, policy implications. Secondly, that where there are policy implications with direct commercial impact, we can expect to see the vigorous involvement of corporations as they manage the environment in which they do business. These two critical conditions have driven the formulation of policies and procedures for making decisions about Internet governance. They have also driven the actual decisions which have been implemented, to a greater or lesser degree of success. This research contributes to the scholarship in four significant ways. The first is that the Internet Domain Name System (IDNS) and its governance present a new perspective on the discussion of the globalisation of business regulation. The data used to support the analysis has not been collated or examined previously and is presented here to illustrate the extension of the literature and to frame the hypothesis. The second is that I have found that national governments have, despite ongoing control within their national jurisdiction, little effective influence over the management and governance of the Domain Name System (DNS) at an international level. Thirdly, I have found that corporations have significant power to determine the way in which policies for the management of the technical resources of the Internet are discussed, developed to consensus policy positions, implemented and reviewed. Finally, the research has opened up new lines of inquiry into the rise of a new class of bureaucrats, the cosmocrats and their cosmocracy, on which further research continues.
APA, Harvard, Vancouver, ISO, and other styles
35

Tang, Eng Hoo Joseph. "The Petrogenesis Of The Station Creek Igneous Complex And Associated Volcanics, Northern New England Orogen." Queensland University of Technology, 2004. http://eprints.qut.edu.au/15902/.

Full text
Abstract:
The Station Creek Igneous Complex (SCIC) is one of the largest Middle-Late Triassic plutonic bodies in the northern New England Orogen of Eastern Australia. The igneous complex comprises five plutons - the Woonga Granodiorite (237 Ma), the Woolooga Granodiorite (234 Ma), the Rush Creek Granodiorites (231 Ma), and the Gibraltar Quartz Monzodiorite and Mount Mucki Diorite (both 227 Ma) - emplaced as high-level or epizonal bodies within the Devonian-Carboniferous subduction complex that resulted from a westward subduction along the east Australian margin. Composition of the SCIC ranges from monzogabbro to monzogranite, and includes diorite, monzodiorite, quartz monzodiorite and granodiorite. The SCIC has the typical I-type granitoid mineralogy, geochemistry and isotopic compositions. Its geochemistry is characteristic of continental arc magma, and has a depleted-upper mantle signature with up to 14 wt% supracrustal components ((87Sr/86Sr)initial = 0.70312 to 0.70391; εNd = +1.35 to +4.9; high CaO, Sr, MgO; and low Ni, Cr, Ba, Rb, Zr, Nb, Ga and Y). The SCIC (SiO2 47%-76%) has similar Nd and Sr isotopic values to island-arc and continentalised island-arc basalts, which suggests major involvement of upper mantle sourced melts in its petrogenesis. The SCIC comprises two geochemical groups - the Woolooga-Rush Creek Granodiorite group (W-RC) and the Mount Mucki Diorite-Gibraltar Quartz Monzodiorite group (MMD-GQM). The W-RC Group is high-potassium, calc-alkalic and metaluminous, whereas the MMD-GQM Group is medium to high potassium, transitional calc-alkalic to tholeiitic and metaluminous. The two geochemical groups of the SCIC magmas are generated from at least two distinct sources - an isotopically evolved Neoproterozoic mantle-derived source with a greater supracrustal component (10-14 wt%), and an isotopically primitive mafic source with upper mantle affinity. Petrogenetic modelling using both major and trace elements established that the variations within each geochemical group resulted from fractional crystallisation of clinopyroxene, amphibole and plagioclase from mafic magma, and late fractionation of alkalic and albitic plagioclase in the more evolved magma. Volcanic rocks associated with the SCIC are the North Arm Volcanics (232 Ma) and the Neara Volcanics (241-242 Ma) of the Toogoolawah Group. The major and trace element geochemistry of the North Arm Volcanics is similar to the SCIC, suggesting a possible co-magmatic relationship between the SCIC and the volcanic rock. The age of the North Arm Volcanics matches the age of the fractionated Rush Creek Granodiorite, and xenoliths of the pluton are found within epiclastic flows of the volcanic unit. The Neara Volcanics (87Sr/86Sr = 0.70152-0.70330, 143Nd/144Nd = 0.51253-0.51259) differs isotopically from the SCIC, indicating a source region within the HIMU mantle reservoir (commonly associated with upper mantle contaminated by altered oceanic crust). The Neara Volcanics is not co-magmatic with the SCIC and is derived from partial melting of the upper mantle with additional components from the subducting oceanic plate. The high-level emplacement of an isotopically primitive mantle-derived magma of the SCIC suggests periods of extension during the waning stage of convergence associated with the Hunter Bowen Orogeny in the northern New England Orogen.
The geochemical change between 237 and 227 Ma, from a depleted-mantle source with diminishing crustal components to a depleted-mantle fractionate, reflects a fundamental change in the source region that can be related to the tectonic styles. The decreasing amount of supracrustal component suggests thinning of the subduction complex due to crustal attenuation, leading to the late Triassic extension that enabled mantle melts to reach subcrustal levels.
APA, Harvard, Vancouver, ISO, and other styles
36

Dwyer, Sarah Blyth. "Identifying Children At Risk Of Developing Mental Health Problems : Screening For Family Risk Factors In The School Setting." Queensland University of Technology, 2002. http://eprints.qut.edu.au/15903/.

Full text
Abstract:
Children's mental health problems are a significant public health concern. They are costly to society in both human and financial terms. This thesis contributes to the 'science of prevention' by examining issues related to the identification of children at risk of mental health problems. In particular, it was of interest to determine whether 'at-risk' children could be identified before the development of significant behavioural or emotional problems. Three areas were explored: family risk factors that predict the development of children's mental health problems, teachers' ability to identify family risk factors, and parent- and teacher-report screening methods. Data were collected from the parents and teachers of over 1000 children in preschool to Year 3 as part of the Promoting Adjustment in Schools (PROMAS) Project. Parents and teachers each completed two questionnaires at two time points, one year apart. Parents completed the Family Risk Factor Checklist - Parent (FRFCP) and the Child Behaviour Checklist (CBCL) and the equivalent instruments for teachers were, respectively, the Family Risk Factor Checklist - Teacher (FRFC-T) and the Teacher Report Form (TRF). The FRFC-P and FRFC-T were original to the current research and were designed to assess children's exposure to multiple family risk factors across five domains: adverse life events and instability (ALI), family structure and socioeconomic status (SES), parenting practices (PAR), parental verbal conflict and mood problems (VCM), and parental antisocial and psychotic behaviour (APB). Paper 1 investigated the psychometric properties of the FRFC-P and the potential for its use at a population-level to establish community risk factor profiles that subsequently inform intervention planning. The FRFC-P had satisfactory test-retest reliability and construct validity, but modest internal consistency. Risk assessed by the PAR domain was the most important determinant of mental health problem onset, while the PAR, VCM, and APB domains were the strongest predictors of mental health problem persistence. This risk factor profile suggests that, for the studied population, the largest preventive effects may be achieved through addressing parenting practices. Paper 2 examined teachers' knowledge of children's exposure to family risk factors using the FRFC-T. While teachers had accurate knowledge of children's exposure to risk factors within the ALI and SES domains, they had poor knowledge of children's exposure to risk factors within the PAR, VCM, or APB domains - the types of risk factors found in Paper 1 to be the most strongly related to children's mental health problems. Nevertheless, teachers' knowledge of children's exposure to risk factors within the ALI and SES domains predicted children's mental health problems at one year follow-up even after accounting for children's behaviour at the first assessment. Paper 3 investigated the potential of both the FRFC-P and FRFC-T for identifying individual, at-risk children. The accuracy of the FRFC in predicting internalising versus externalising disorders was compared against behavioural and simple nomination screening methods. For both parents and teachers, the behavioural screening methods were superior, however, the simple nomination method also showed promise for teachers. Both parents and teachers were more accurate at identifying children at risk of externalising mental health problems than children at risk of internalising problems. 
The performance of the FRFC and simple nomination methods in identifying children for selective interventions, before the development of significant behavioural or emotional problems, was also tested. Both the FRFC and simple nomination methods showed only modest predictive accuracy for these children. Combined, the results suggest that while on the one hand, the FRFC is useful for population level screening to inform intervention planning, on the other hand, it falls short of achieving good predictive accuracy for individual children. Future research should investigate ways to optimise predictive accuracy for individual children, particularly those at risk of developing internalising disorders. One option may be to use the FRFC in conjunction with behavioural screening methods. The challenge is to develop accurate screening methods that remain practical to complete at a population level. Finally, this body of research provides insight into the feasibility of offering selective preventive interventions within the school setting. While significant obstacles remain, there were several promising indications that using screening methods such as FRFC-T or simple nomination, teachers may be able to identify children earlier on the developmental pathway, before significant behavioural or emotional symptoms have developed.
APA, Harvard, Vancouver, ISO, and other styles
37

Trapp, Jamie Vincent. "Imaging And Radiation Interactions Of Polymer Gel Dosimeters." Queensland University of Technology, 2003. http://eprints.qut.edu.au/15904/.

Full text
Abstract:
Aim: The past two decades have seen a large body of work dedicated to the development of a three dimensional gel dosimetry system for the recording of radiation dose distributions in radiation therapy. The purpose of much of the work to date has been to improve methods by which the absorbed dose information is extracted. Current techniques include magnetic resonance imaging (MRI), optical tomography, Raman spectroscopy, x-ray computed tomography (CT) and ultrasound. This work examines CT imaging as a method of evaluating polymer gel dosimeters. Apart from publications resulting from this work, there have been only two other journal articles to date reporting results of CT gel dosimetry. This indicates that there is still much work required to develop the technique. Therefore, the aim of this document is to develop CT gel dosimetry to the extent that it is of use to clinical and research physicists. Scope: Each chapter in this document describes an aspect of CT gel dosimetry which was examined, with Chapters 2 to 7 containing brief technical backgrounds for each aspect. Chapter 1 contains a brief review of gel dosimetry. The first step in the development of any method for reading a signal is to determine whether the signal can actually be obtained. However, before polymer gel dosimeters can be imaged using a CT scanner, imaging techniques are required which are employable to obtain reliable readings. Chapter 2 examines the various artifacts inherent in CT which interfere with the quantitative analysis of gel dosimeters and a method for their removal is developed. The method for artifact reduction is based on a subtraction technique employed previously in a feasibility study and a system is designed to greatly simplify the process. The simplification of the technique removes the requirement for accurate realignment of the phantom within the scanner and the imaging of calibration vials is enabled. Having established a method by which readings of polymer gel dosimeters can be obtained with CT, Chapter 3 examines the CT dose response. A number of formulations of polymer gel dosimeter are studied by varying the constituent chemicals and their concentrations. The results from this chapter can be employed to determine the concentration of chemicals when manufacturing a polymer gel dosimeter with a desired CT dose response. With the CT dose response characterised in Chapter 3, the macroscopic cause of the CT signal is examined in Chapter 4. To this end direct measurement of the linear attenuation coefficient is obtained with a collimated radiation source and detector. Density is measured by Archimedes' principle. Comparison of the two results shows that the cause of the CT signal is a density change and the implications for polymer gel dosimetry are discussed. The CT scanner is revisited in Chapter 5 to examine the CT imaging techniques required for optimal performance. The main limitation of the use of CT in gel dosimetry to date has been image noise. In Chapter 5 stochastic noise is investigated and reduced. The main source of non-stochastic noise in CT is found and imaging techniques are examined which can greatly reduce this residual noise. Predictions of computer simulations are verified experimentally. Although techniques for the reduction of noise are developed in Chapter 5, there may be situations where the noise must be further reduced. An image processing algorithm is designed in Chapter 6 which employs a combination of commonly available image filters.
The algorithm and the filters are tested for their suitability in gel dosimetry through the use of a simulated dose distribution and by performing a pilot study on an irradiated polymer gel phantom. Having developed CT gel dosimetry to the point where a suitable image can be obtained, the final step is to investigate the uncertainty in the dose calibration. Methods used for calibration uncertainty in MRI gel dosimetry to date have either assumed a linear response up to a certain dose, or have removed the requirement for linearity but incorrectly ignored the reliability of the data and fit of the calibration function. In Chapter 7 a method for treatment of calibration data in CT gel dosimetry is proposed which allows for non-linearity of the calibration function, as well as the goodness of its fit to the data. Alternatively, it allows for the reversion to MRI techniques if linearity is assumed in a limited dose range. Conclusion: The combination of the techniques developed in this project and the newly formulated normoxic gels (not extensively studied here) means that gel dosimetry is close to becoming viable for use in the clinic. The only capital purchase required for a typical clinic is a suitable water tank, which is easily and inexpensively producible if the clinic has access to a workshop.
APA, Harvard, Vancouver, ISO, and other styles
38

Walker, Tara L. "The Development Of Microalgae As A Bioreactor System For The Production Of Recombinant Proteins." Queensland University of Technology, 2004. http://eprints.qut.edu.au/15905/.

Full text
Abstract:
Dunaliella, a genus of unicellular, biflagellate green algae, is one of the most studied microalgae for mass culture and is of commercial importance as a source of natural β-carotene. Dunaliella species have the desirable properties of halotolerance and photoautotrophy that make their large-scale culture simple and cheap using resources unsuitable for conventional agriculture. The ease and cost-effectiveness of culture make Dunaliella a desirable target for increased production of natural compounds by metabolic engineering or for exploitation as biological factories for the synthesis of novel high-value compounds. However, the lack of efficient genetic transformation systems has been a major limitation in the manipulation of these microalgae. In chapter four we describe the development of a nuclear transformation system for Dunaliella tertiolecta. The gene encoding the phleomycin-binding protein from Streptoalloteichus hindustanus was chosen as the selectable marker as this protein retains activity at high salt concentrations. To drive expression of the chosen selectable marker, two highly expressed Dunaliella tertiolecta RbcS genes and their associated 5' and 3' regulatory regions were isolated and characterised (chapter three). Dunaliella transformation cassettes containing the RbcS promoter and terminator regions flanking the ble antibiotic resistance gene were constructed. These expression cassettes were tested in Chlamydomonas reinhardtii cells and found to drive expression of the ble gene in this heterologous system. This study also demonstrated that truncation of both the D. tertiolecta RbcS1 and RbcS2 regulatory regions significantly increases the expression of the ble gene in C. reinhardtii cells. To determine if the foreign DNA could stably integrate into the Dunaliella genome, four transformation methods - microprojectile bombardment, glass bead-mediated transformation, PEG-mediated transformation and electroporation - were tested and a number of parameters varied. Southern blot analysis revealed that the plasmid DNA transiently entered the Dunaliella cells following electroporation but was rapidly degraded. Following electroporation, one stably transformed Dunaliella line was recovered. This is the first demonstration of the stable transformation of this alga. Chloroplast transformation is becoming a favoured method for the production of recombinant proteins in plants, as levels of heterologous protein are often higher than those achieved by transforming the nucleus. The Dunaliella chloroplast genome has not been genetically characterised, and thus there were no existing promoter and terminator sequences or sequences of intergenic regions that could be used for vectors in transformation of the chloroplast. Therefore, this study aimed to isolate and characterise promoters of highly expressed genes and matching terminators capable of driving transgene expression, and also to characterise intergenic regions that would be suitable insertion sites for the vector construct (chapter five). The complete gene sequences of two highly expressed Dunaliella chloroplast genes, psbB and rbcL, including the promoter and terminator regions, as well as the coding sequence of the psbA gene, were cloned and sequenced. In addition, the psbA gene is useful as a selectable marker as introduced mutations confer resistance to the herbicide 3-(3,4-Dichlorophenyl)-1,1-Dimethylurea (DCMU).
Two homologous transformation constructs based on mutated psbA genes were developed and tested using microprojectile bombardment. A number of parameters were tested, including the size of the gold microprojectile particle, the distance of the plates from the point of discharge, plating onto membranes or filter paper, helium pressure, addition of an osmoticum to the medium, and recovery time. Although no chloroplast transformants were recovered in this study, these homologous recombination constructs should prove useful in the development of a chloroplast transformation protocol. The other major component of this study was to investigate the use of microalgae as an expression system for the production of recombinant proteins. Transformation of Chlamydomonas reinhardtii, a species related to Dunaliella, is well developed. In chapter six, this study examined the expression of two human proteins, α-lactalbumin and IGF-1, in Chlamydomonas reinhardtii. Plasmids containing the C. reinhardtii RbcS2 promoter upstream of the cDNAs of these two proteins were introduced into C. reinhardtii cells using glass bead-mediated transformation. Transgenic C. reinhardtii lines were generated and shown to contain the transgenes by PCR and Southern hybridisation. RT-PCR and northern hybridisation were subsequently used to demonstrate that the transgenes were transcriptionally active. The transcripts, however, could only be detected by RT-PCR, indicating that the genes were transcribed at low levels. Accumulation of the α-lactalbumin protein could not be demonstrated, suggesting that although the transgenes were transcribed, they were either not translated or translated at levels below the sensitivity of western blot analysis, or that any protein produced was rapidly degraded. Previous studies have indicated that in microalgae codon usage is vital in translation of the foreign protein. Codon modification of the IGF-1 and α-lactalbumin genes should lead to higher levels of protein accumulation. This study reports the first successful stable nuclear transformation of Dunaliella tertiolecta. It is therefore now feasible to examine Dunaliella as a bioreactor for the expression of recombinant proteins. In addition, two chloroplast genes (psbB and rbcL) and their corresponding promoters and terminators have been characterised, and a selectable marker cassette based on the mutated psbA gene constructed.
APA, Harvard, Vancouver, ISO, and other styles
39

French, Nicole. "Identity stressors associated with the reintegration experiences of Australasian undercover police officers." Queensland University of Technology, 2003. http://eprints.qut.edu.au/15906/.

Full text
Abstract:
This dissertation investigated a very specialised, highly sensitive and complex research area in policing known as undercover policing or covert policing operations. This is the first examination to be conducted in the Australasian policing context and the only known research to explore, empirically, undercover operatives' experiences of returning to mainstream police duties after completing their covert duties.

There were two main research objectives in this dissertation. The first was to develop research methodology specifically for use with undercover police officers. The second was to conduct an empirical investigation to identify the psychological processes associated with the reintegration or re-assimilation of undercover police officers into mainstream policing environments. Social identity theory was applied to deconstruct undercover police officers' reintegration experiences.

Given the closed and protected nature of covert policing, careful consideration of methodological and ethical issues was a high priority in the development of research practices. Addressing these considerations protected the anonymity and security of those involved in the research. Tailoring research methods to suit the officers' circumstances and satisfy police management's security concerns improved the practical application of research methods and research relations with police members and, therefore, the quality of the findings.

In developing a research methodology specifically for use with undercover police officers, a multi-method approach was adopted. Data triangulation, using a variety of data sources, and methodological triangulation, using multiple methods and multiple indicators, were employed. This approach proved constructive in creating a more holistic perspective of undercover policing and officers' experiences of re-assimilation.

In theoretical terms, the major issue under investigation is that of negotiating dual memberships or multiple identities. Three studies are reported. The first study is a field study, in which the researcher spent more than 18 months in the covert policing context as a participant observer. Through field research, the researcher was able to learn about the Australasian covert policing context; obtain in-house police documents; define research issues and hypotheses; understand methodological considerations; identify a psychological theoretical framework; and examine "the fit" between theory and the social dynamics of covert policing. Other benefits of becoming immersed in the working life of undercover police officers and the police organisation included understanding the ways of proceeding and the social and organisational structure that exists among covert personnel.

The second study interviewed 20 former covert police personnel, from two police jurisdictions, who had been reintegrated for more than three years. The majority of officers found returning to mainstream police duties a difficult experience, and two separate profiles of reintegration experiences emerged from the data. This study identified the presence of more than one police identity among former operatives. It found that some officers internalise aspects of the undercover policing norms and use these norms to define aspects of the self, both as a police officer and as an individual. That is, role-playing the undercover police persona became an extension of the officer as an individual and contributed positively to their personal self-worth. It was noted that the majority of officers expressed cognitive confusion over how to behave in the mainstream policing environment after covert duties had ceased.

The other profile to emerge from the data was of officers who characterised their undercover policing experiences as being more integrated into their overall police persona. Officers interviewed in this study employed different identity decision-making strategies to restructure their police identities. In sum, this study found that the extent to which the undercover and mainstream memberships were integrated cognitively influenced officers' experiences of reintegration.

The third study is a cross-sectional design using survey methods. Thirty-eight trainees, 31 currently operational and 38 former undercover operatives from four police jurisdictions took part in this study. A group of mainstream police officers matched according to former operatives' age, gender and years of policing experience was also included. This study found that police identities change over the phases of undercover police work and that changes in former operatives' mainstream police identity were a function of covert police work. Cross-sectional comparisons revealed that former operatives' undercover police identity had declined since covert policing; however, officers' mainstream police identity had not significantly increased. Failure to increase identification with mainstream police after undercover police work has ceased has a number of implications in terms of predicting re-assimilation. Operatives most likely to experience difficulties were those who resisted the mainstream police identification and reported difficult relations with their mainstream peers. Trend analysis revealed that despite the physical change, 'cognitive' re-assimilation actually commences in the second year of the operatives' reintegration. These exploratory analyses revealed that following return to the mainstream policing environment, identity stressors were most likely to be experienced in the first year of reintegration.

To determine psychological adjustment since undercover police work, the person-environment fit was also investigated. Operatives' current perceptions of working in the mainstream context were reported using a number of behavioural and organisational indicators. Overall, this study found that former operatives remain committed to their policing profession; however, those who experienced identity stress during the re-assimilation process were less satisfied with their current work duties, failed to find their work interesting, tended to perceive undercover duties as having been detrimental to their career, and expressed greater intentions to leave the service within 12 months of the survey. Overall, former operatives' satisfaction and commitment levels were not significantly different from those of mainstream officers. Mainstream police, however, reported being under greater pressure and felt more overworked in the mainstream context than former operatives. In summary, these organisational indicators revealed that the difficulties of re-assimilation and intentions to leave the service are related more to the stress of modifying officers' police identity during this period than to the workload characteristics of mainstream policing.

Overall, these studies demonstrated that the process of negotiating police identities is an important psychological dynamic in undercover operatives' reintegration experiences. The identity stress experienced during this period was shown to have a number of organisational-behavioural consequences, such as problematic intergroup relations and greater intentions to leave the police service after undercover police work. Based on findings from this research, a number of practical recommendations are made and suggestions for the direction of future research are outlined. Contributions to theory are also discussed.
APA, Harvard, Vancouver, ISO, and other styles
40

Taylor, Hazel Ann. "Risk management and tacit knowledge in IT projects: making the implicit explicit." Queensland University of Technology, 2004. http://eprints.qut.edu.au/15907/.

Full text
Abstract:
This research addressed the need for in-depth investigation of what actually happens in the practice of risk management in software package implementation projects. There is strong 'official' sanction in the IT literature for the use of formal risk management processes for IT projects, but there is a confused picture of their application in practice. While many potential risk factors for IT projects have been identified, and formal procedures have been prescribed for the management of these risks, there has been little work investigating how project managers assess these risks in practice and what countermeasures they employ against these risks in their projects. In particular, the study used an interpretive critical decision interview approach to focus on those areas of risk management knowledge that project managers have acquired through experience, i.e. tacit knowledge. A new categorization of risk factors emanating from three sources (vendor, client, and third party) reveals risk factors not previously identified. Some of these new factors arise from the three sources noted, while others arise from the package implementation focus of the projects and from the location of the projects in Hong Kong. Key factors that cause problems even when anticipated and mitigated, and the most often unanticipated problems, are also identified. The study further presents an examination of the studied managers' risk management practices, and the strategies they use to address both potential and actual problems. This examination revealed close conformance with recommended literature prescriptions at some stages of projects, and significant variation at other stages, with strategies applied being broad and general rather than risk specific. A useful categorization of these strategies into four broad groups relating to different sets of risk factors is presented, reflecting the actual practice of respondents. Tacit knowledge was revealed throughout these investigations in the variances observed between prescribed and actual practice, and particularly from an examination of project managers' decision-making practices from two different perspectives: rational and naturalistic. A hybrid decision-making model is proposed to capture the actual processes observed, and to provide prescriptive guidance for risk management practice. The investigation makes a contribution to the field of IT project risk management in three ways. First, the investigation has addressed the need for empirical studies into IT risk management practices and the factors influencing project managers in their choice and application of strategies to manage risk. Second, by examining how experienced IT project managers approach the task of managing risk in software package implementations, the study has extended our understanding of the nature of the knowledge and skills that effective IT project managers develop through experience. Third, the study makes a theoretical contribution to our understanding of IT project risk management by examining the decision-making processes followed by IT project managers from the perspective of two contrasting theories of decision-making: the rational method and Naturalistic Decision Making theory.
APA, Harvard, Vancouver, ISO, and other styles
41

Busch, Andrew W. "Wavelet Transform For Texture Analysis With Application To Document Analysis." Queensland University of Technology, 2004. http://eprints.qut.edu.au/15908/.

Full text
Abstract:
Texture analysis is an important problem in machine vision, with applications in many fields including medical imaging, remote sensing (SAR), automated flaw detection in various products, and document analysis, to name but a few. Over the last four decades many techniques for the analysis of textured images have been proposed in the literature for the purposes of classification, segmentation, synthesis and compression. Such approaches include analysis of the properties of individual texture elements, statistical features obtained from the grey-level values of the image itself, random field models, and multichannel filtering. The wavelet transform, a unified framework for the multiresolution decomposition of signals, falls into this final category, and allows a texture to be examined at a number of resolutions whilst maintaining spatial resolution. This thesis explores the application of the wavelet transform to the specific task of texture classification and proposes a number of improvements to existing techniques, both in the area of feature extraction and classifier design. By applying a nonlinear transform to the wavelet coefficients, a better characterisation can be obtained for many natural textures, leading to increased classification performance when using first and second order statistics of these coefficients as features. In the area of classifier design, a combination of an optimal discriminant function and a non-parametric Gaussian mixture model classifier is shown experimentally to outperform other classifier configurations. By modelling the relationships between neighbouring bands of the wavelet transform, more information regarding a texture can be obtained. Using such a representation, an efficient algorithm for the searching and retrieval of textured images from a database is proposed, as well as a novel set of features for texture classification. These features are experimentally shown to outperform features proposed in the literature, as well as provide increased robustness to small changes in scale. Determining the script and language of a printed document is an important task in the field of document processing. In the final part of this thesis, the use of texture analysis techniques to accomplish these tasks is investigated. Using maximum a posteriori (MAP) adaptation, prior information regarding the nature of script images can be used to increase the accuracy of these methods. Novel techniques for estimating the skew of such documents, normalising text blocks prior to extraction of texture features, and accurately classifying multiple fonts are also presented.
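As a rough illustration of the general approach summarised in this abstract (not code from the thesis itself), the sketch below decomposes an image with a 2-D discrete wavelet transform and uses simple first- and second-order statistics of the detail subbands as texture features. The library (PyWavelets), the wavelet ('db4'), the decomposition depth, and the choice of mean and standard deviation as the statistics are all illustrative assumptions rather than details taken from the work described.

    # Minimal sketch of wavelet-based texture features (illustrative assumptions only).
    import numpy as np
    import pywt  # PyWavelets

    def wavelet_texture_features(image, wavelet="db4", levels=3):
        """Return a feature vector of per-subband statistics for a 2-D image."""
        coeffs = pywt.wavedec2(image, wavelet, level=levels)
        features = []
        # coeffs[0] is the approximation; coeffs[1:] are (horizontal, vertical,
        # diagonal) detail subbands, one tuple per decomposition level.
        for detail_level in coeffs[1:]:
            for subband in detail_level:
                mag = np.abs(subband)
                # First- and second-order statistics of the coefficient magnitudes.
                features.extend([mag.mean(), mag.std()])
        return np.array(features)

    # Usage: compare feature vectors of two texture patches.
    patch_a = np.random.rand(128, 128)
    patch_b = np.random.rand(128, 128)
    print(wavelet_texture_features(patch_a).shape)  # 3 levels x 3 subbands x 2 stats -> (18,)
    print(np.linalg.norm(wavelet_texture_features(patch_a) - wavelet_texture_features(patch_b)))

In a classification setting, feature vectors of this kind would typically be fed to a classifier such as the Gaussian mixture model mentioned in the abstract; the statistics and distance measure used here are only stand-ins for the richer features the thesis proposes.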
APA, Harvard, Vancouver, ISO, and other styles
42

Carey, Gemma Marian. "New Understanding Of 'Relevant' Keyboard Pedagogy In Tertiary Institutions." Queensland University of Technology, 2004. http://eprints.qut.edu.au/15909/.

Full text
Abstract:
In current times, issues of curriculum relevance are driving a raft of reforms and reviews in higher education. The unmet needs of students in terms of employment outcomes, particularly in the area of the performing arts, are increasingly a matter of concern. For tertiary music training institutions, the need to attach greater importance to student needs has forced a more critical reappraisal of curriculum priorities. An effect of this has been ongoing contestation and debate within music institutions about the nature and purposes of music curriculum as a university offering. This thesis examines the implications of the above by undertaking an investigation into the relevance of keyboard curriculum as it is currently understood in one tertiary institution, a Conservatorium of Music. It examines the contestation over student needs that is apparent within the keyboard curriculum of such an institution. The aim is to improve the institution's capacity to respond appropriately to 'student needs' by better understanding issues of curriculum relevance. This is done by investigating how needs become articulated within this particular institution and curriculum domain, and by investigating the effect these needs articulations have on the practices of those who teach and those who learn within this domain. The study uses the conceptual work of Nancy Fraser (1989) and Elizabeth Ellsworth (1989) and a doctoral study by Erica McWilliam (1992) to focus on needs articulations, or needs talk, related to the needs of keyboard students within this Conservatorium. This talk, which is generated in management, staff and student texts, is examined as produced out of systems of language use that are employed within and outside the Conservatorium. The analysis of the talk treats the contestations and struggle over student needs in the Conservatorium as products of, and productive of, power relations. The analysis reveals discourse communities that are not only fractured from within but which share very little common language. It demonstrates how systems of language use at work within the Conservatorium marginalise students at the same time as they permit the institution to continue its traditional work and practice. The study clearly demonstrates how the institution itself is actively producing 'failing' and 'blaming' students as discursive subjects. The conclusion is drawn that more attention needs to be paid to building communities that share a common discourse, rather than trying to wedge more 'relevant' material into the curriculum.
APA, Harvard, Vancouver, ISO, and other styles
43

Brejon de Lavergnée, Barbara. "Dessins de Simon Vouet: 1590-1649." Paris: Ed. de la Réunion des musées nationaux, 1987. http://catalogue.bnf.fr/ark:/12148/cb349061811.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

McKeown, Francis S. "Barnabe Googe : poetry and society in the 1560s." Thesis, Queen's University Belfast, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.286832.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Chapman, Christopher. "The representation of murder, c. 1590-1695." Thesis, University of Oxford, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.368848.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Jones, Chris. "Reformed sacramental piety in England 1590-1630." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:4cd35e30-c3dd-4764-a365-d14591e0f279.

Full text
Abstract:
England in the late-Elizabethan and early-Stuart period saw a surge of pastoral writings intended to provide lay-readers with information and advice about sacraments. Using sixty-four such texts from the period 1590-1630, this thesis analyses the conceptions of sacraments offered by cleric-authors to their audience. As a group these works had two structural features in common. First they were concerned to outline the ‘qualities’ of a ‘worthy’ receiver of the Lord’s Supper, foremost amongst which were knowledge, faith, newness of life and repentance. Second they tended to divide the concept of worthiness into three temporal chunks comprising the times before, during, and after the Supper. Using these rubrics as guidelines the thesis compares and contrasts the content of the corpus. In opposition to stereotypes of puritans neglecting sacraments, it is found that sacraments were presented by Reformed English clerics as highly efficacious entities, which truly communicated something to the believer. The importance of faith to the Reformed conception of sacraments is affirmed, with the caveat that the dominance of this concept did not prohibit clerics from extolling the sensuous or ceremonial aspects of sacraments. It is further contended that sacraments continued to be seen as spurs to moral amelioration, occasions for charity, and a demonstration of community – and that receiving sacraments did not become a wholly individualised enterprise. Building on this analysis the thesis offers three broader conclusions. Firstly it is shown that sacraments played a key part in the quest to gain assurance of salvation. Secondly it can be seen that in England there was a way of extolling sacraments and their use which is not usually thought about – a species of ‘sacramental piety’ which used mainstream Reformed ideas about sacrament to urge believers to comfort and increased Godliness. Thirdly it is contended that key Reformed theological distinctions were often submerged by the contingencies of pastoral writing.
APA, Harvard, Vancouver, ISO, and other styles
47

Dell'Olio, Fiorella. "The significance of the European Union for the evolution of citizenship and immigration policies : the cases of the United Kingdom and Italy." Thesis, London School of Economics and Political Science (University of London), 2001. http://etheses.lse.ac.uk/1590/.

Full text
Abstract:
This thesis analyses the link that the establishment of European citizenship creates between citizenship, nationality, and immigration policies. To be a European citizen, one needs to be a national of a member state. According to this criterion, nationality and citizenship are bound to each other. There is no possibility of access for those who do not have the status of national citizenship. European citizenship legitimised a privileged position to which not all individuals are entitled, and conditions of access are under the jurisdiction of each member state. It is argued that normatively European citizenship reinforces the ideology of nationality while empirically it has been used to forge a sort of European identity. In other words, the underlying argument is that European citizenship functions to define European identity and nationality functions towards the establishment of national immigration policies. This process leads to the formation of a binary typology of 'us and them', strengthened by legislation and political debates. The formation of the category of 'us' as Europeans does not find a response at the empirical level as the public does not fully identify with the Euro-polity. What emerges instead is that the public regards 'compatibility' between a European and national identity as more optimal. The principal benefit of Euro-citizenship is to re-prioritise the means of citizenship from political rights to social and economic rights. This 'opportunity structure', nevertheless, remains in a void as long as Community membership relies on the condition of nationality. The thesis proposes the introduction of a 'legal subjectivity' based on the redefinition of the concept of legality detached from nationality and grounded in the active exercise of civil, political, and social rights. Such a redefinition is necessary to sidestep the difficulties entailed in any attempt to separate citizenship from nationality in theory and practice. This would deprive citizenship of its regulative functions in terms of inclusion and exclusion, and it would reduce the importance attached to the inherent link between citizenship and nationality.
APA, Harvard, Vancouver, ISO, and other styles
48

Anghelé, Federico <1978>. "Il modello tedesco per la classe politica italiana (1866-1890)." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1590/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Heddon, Deirdre E. "In search of the subject : locating the shifting politics of women's performance art." Thesis, University of Glasgow, 1999. http://theses.gla.ac.uk/1590/.

Full text
Abstract:
From the late 1960s to the present, women have utilised performance art as a 'form' with which to resist, transgress, contest or reveal the position of women within wider society. However, as both the nature of feminist politics and the contexts within which the work has been produced have changed, the enactment of such oppositional strategies has also shifted. This thesis aims to locate and account for such shifts by mapping multiple subjects, including performance art, feminism(s), contemporary theory, performers and women's performance art. In the late 1960s and throughout the 1970s, the strategies most often utilised by women performance artists either offered alternative, supposedly more 'truthful' representations which drew on the real, material lives of women, or completely reimagined woman, locating her in a place before or outside of the patriarchal structure. From the 1980s onwards, however, the practice of women's performance art looks somewhat different. While performers continue to contest the material conditions and results of being positioned as female in Western society, such contestations are now often enacted from within what might be considered a 'deconstructive' or 'poststructuralist' frame. Acknowledging the impossibility of ever representing the 'real' woman, since 'woman' is always already a representation (and is always multiple), I suggest that the aim of this work is therefore not so much to reveal the 'real' woman behind the fiction, but to take apart the fiction itself, revealing the way in which the signifier 'woman' has been differentially constructed, for what purpose, and with what real effects. I have nominated this shift as a movement from a performance and politics of identity to a performance and politics of subjectivity.
APA, Harvard, Vancouver, ISO, and other styles
50

Alkhelaiwi, Khalid S. "The impact of oil revenue fluctuations on the Saudi Arabian economy." Thesis, Durham University, 2001. http://etheses.dur.ac.uk/1590/.

Full text
APA, Harvard, Vancouver, ISO, and other styles