To view the other types of publications on this topic, follow the link: Green technique.

Dissertations on the topic "Green technique"

Consult the top 50 dissertations for your research on the topic "Green technique".

Next to every entry in the bibliography you will find an "Add to bibliography" option. Use it, and the citation of the chosen work will be generated automatically in the required style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the publication in PDF format and read its online annotation, provided the relevant details are available in the metadata.

Browse dissertations from a wide variety of disciplines and compile your bibliography correctly.

1

Choudhury, Prasun. „Carbonaceous nanomaterials and composites green techniques for organic synthesis“. Thesis, University of North Bengal, 2020. http://ir.nbu.ac.in/handle/123456789/4336.

2

Tonyé, Emmanuel. „Contribution à l'étude de multipôles microruban utilisés en technique microélectronique“. Toulouse, INPT, 1987. http://www.theses.fr/1987INPT038H.

Annotation:
The diffraction of waves in microstrip multiports is studied by means of a two-dimensional integral formulation of the field in the junction and a waveguide model equivalent to the transmission lines. By writing the global continuity of the fields at the ports of the junction, one obtains directly the characterisation of the multiport on an isotropic or anisotropic substrate, either by its impedance matrix or by its multimodal scattering matrix.
3

ETZI, ANDREINA. „Granite By-Products for inverted pavement technique“. Doctoral thesis, Università degli Studi di Cagliari, 2016. http://hdl.handle.net/11584/266695.

Annotation:
As stated by the Sardinia Region: "Sustainable development is development that satisfies present needs without compromising those of future generations, thanks to the smart use of environmental resources and without waste." This research aims to use a resource currently available in large quantities in Sardinia, optimising the use of the extracted material rather than merely exploiting the region. Mining has been a major activity in Sardinia since the 1960s and has produced huge amounts of granite by-products over the years. Ornamental granite is an important source of income for the island but, because the extracted material must meet high aesthetic standards, many rock blocks are rejected. This research aims to make the most of the material stored at quarry sites and to optimise the use of the stone resource under examination. The target is to use granite by-products as a material with high mechanical performance for road pavements, from the foundation to the surface. This approach makes the most of regional resources and minimises the thickness of the asphalt layers, reducing construction and maintenance costs; it is also a good starting point for the island to profit from a product that is easy to export. Granite by-products will be used in an innovative road pavement design: the inverted pavement technique, studied and developed in South Africa since the 1950s, will be applied to road infrastructure. In particular, attention is focused on the behaviour of the granular base layer. This technique exploits the mechanical properties of the base layer, creating base layers that ensure high, long-lasting performance at almost zero environmental cost. The project is aimed at a sustainable design that uses resources so far considered waste, available in Sardinia, with minimal use of exhaustible and expensive raw materials such as asphalt; the increasing cost of petroleum products and their limited availability have led to a worldwide search for alternatives to flexible pavements. A further objective was to verify that European and South African regulations for granular aggregates match. Laboratory tests were used to analyse the physical, chemical, and mechanical properties of the granite by-products, comparing them with the dolerite used in South Africa for the inverted pavement technique.
4

Kumas, Gozde. „Detecting G-protein Coupled Receptor Interactions Using Enhanced Green Fluorescent Protein Reassembly“. Master's thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12614136/index.pdf.

Annotation:
The largest class of cell-surface receptors in mammalian genomes is the superfamily of G protein-coupled receptors (GPCRs), which are activated by a wide range of extracellular stimuli such as hormones, pheromones, odorants, and neurotransmitters. Drugs with therapeutic effects on a wide range of diseases act on GPCRs. In contrast to the traditional view, it is now becoming accepted that G protein-coupled receptors can form homo- and heterodimers, and this interaction may play an important role in maturation, internalization, function, and/or pharmacology. The bimolecular fluorescence complementation (BiFC) technique is an innovative approach based on the reassembly of protein fragments, which directly reports interactions. In this study we implemented this technique for detecting and visualizing GPCR interactions in yeast cells. The enhanced green fluorescent protein (EGFP) was fractionated at the genetic level into two fragments that possess no fluorescent function on their own. The target proteins to be tested for interaction are fused to the non-functional fragments to produce fusion proteins. The interaction between two target proteins, in this study Ste2p receptors, the alpha-pheromone receptors of Saccharomyces cerevisiae, brings the fragments into close proximity and enables their reassembly. After reassembly, EGFP regains its fluorescence, which provides a direct read-out for the detection of the interaction. Further studies are required to determine the subcellular localization of the interaction. Moreover, using the fusion-protein partners constructed in this study, the effects of agonist/antagonist binding and of post-translational modifications such as glycosylation and phosphorylation can be examined. Finally, the optimized conditions for the BiFC technique will guide the search for new protein-protein interactions.
5

Stroian, Gabriela Andreea. „Reconstruction d'images en tomographie par impédance électrique en utilisant une équation de type Lippmann-Schwinger“. Montpellier 2, 2001. http://www.theses.fr/2001MON20050.

6

Putter, Ad. „Narrative technique and chivalric ethos in Sir Gawain and the Green Knight and the Old French roman courtois“. Thesis, University of Cambridge, 1992. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.259478.

7

Waldebäck, Monica. „Pressurized Fluid Extraction : A Sustainable Technique with Added Values“. Doctoral thesis, Uppsala University, Department of Chemistry, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-6022.

Annotation:

The challenge for the future was defined by the Brundtland Commission (1987) and by the Rio Declaration (1992), which provided the fundamental principles for achieving sustainable development. Sustainable chemistry can be defined as the contribution of chemistry to the implementation of the Rio Declaration. This thesis shows how Pressurized Fluid Extraction (PFE) can be utilized in chemical analysis, and how this relates to Green Chemistry.

The reliability and efficiency of the PFE technique were investigated for a variety of analytes and matrices. Applications discussed include the extraction of the antioxidant Irganox 1076 from linear low-density polyethylene, of mobile forms of phosphorus from lake sediment, of chlorinated paraffins from source-separated household waste, a general analytical method for pesticide residues in rape seed, the total lipid content of cod muscle, and squalene in olive biomass. Improved or comparable extraction yields were achieved with reduced time and solvent consumption. The use of organic solvents decreased by 50-90%, resulting in minimal emissions of volatile organic compounds and fewer work-related health problems. Because of the higher extraction temperatures and more efficient extractions, the choice of solvent is less critical than at lower temperatures, which makes it possible to select cheaper solvents that are more benign to health and the environment. In general, extraction times are reduced to minutes, compared with several hours for conventional methods. As a result of the very short extraction times, the amount of co-extracted material is relatively low, requiring fewer clean-up steps and much shorter analysis times. Selective extractions could be obtained by varying the solvent or solvent mixture and/or by using adsorbents.

In this thesis, the PFE technique was compared to the twelve principles of Green Chemistry, and it was shown to follow several of them, thus making a major contribution to sustainable chemistry.

8

Nichile, Felipe Ferreira de. „Paciente-limite: entre Winnicott e Green“. Pontifícia Universidade Católica de São Paulo, 2013. https://tede2.pucsp.br/handle/handle/15285.

Annotation:
This psychoanalytic study is based on a comparative analysis of the works of D. W. Winnicott and André Green regarding the theoretical-clinical proposals the two authors present for the treatment of the borderline patient, whose dynamic is often refractory to the classical treatment proposed by Sigmund Freud. We therefore draw a parallel between the psychoanalytic treatment sessions conducted by each author, taking as reference the instigating case of a borderline patient they had in common. Based on the clinical interventions that both authors report about the case, we attempt to establish the main theoretical aspects that supported them and to investigate the innovations the authors bring to this type of patient, for whom they created clinical devices compatible with their respective understandings of this psychopathology. To that end we establish their main points of convergence and divergence. We conclude that, despite being quite close in some respects, the authors' propositions are to a large extent heterogeneous theories with quite divergent clinical proposals. According to Winnicott, the emphasis of the treatment lies in regression to dependence, as a way of resuming the normal process of personal maturation, interrupted by a traumatic situation that needs to be lived through again in analysis, this time with the support of an environment favourable to this return. For Green, the treatment is based on the creation of an analytic setting favourable to the processes of verbalization and symbolization, as a possibility of giving meaning to the vicissitudes of the drives by way of the internalization of the negative.
9

Spyrou, Julie. „Religion et création romanesque dans "Moira" et "L'Autre" de Julien Green“. Besançon, 1992. http://www.theses.fr/1992BESA1031.

Annotation:
Through the study of two works by Julien Green, Moira and L'Autre, this research attempts to show the relation between religion and novelistic creation. The heroes leave their native country and are thrown into a hostile universe, a situation that triggers a kind of quest. At first alone and cut off from communication, Green's characters are led to travel in time and space. There they eventually encounter God, in different manifestations of the beyond, but to reach the truth they must overcome traps and temptations. This material and mystical itinerary is reproduced in their unconscious, hence the psychoanalytic study of this journey. From then on there is a need to recreate a personal universe, whose main stages pass through the loss of the sacred, the rejection of the profane, and the religious crisis that brings them back to the sacred. The omnipresence of the sacred is thus indubitable and plays a preponderant role in novelistic creation. Green's creative process is directly linked to Genesis and to his relationship with God; it is also transmitted to his heroes, who become creators in their turn. Novelistic creation is then the recreation of the world. Thus, through writing, a mystical act and an act of love, Green becomes the intermediary between God and men.
10

Kiessling, Brittany L. „Ethnographic Investigations of Commercial Aquaculture as a Rural Development Technique in Tamil Nadu, India“. FIU Digital Commons, 2016. http://digitalcommons.fiu.edu/etd/2560.

Annotation:
Since the 1960s, international aid organizations and governments have invested millions of dollars in promoting aquaculture as a way to stimulate local economies and improve food security. India is one such country, incorporating aquaculture research and extension programs into its development plans as early as 1971. India's promotion efforts gained momentum after the Indian Ocean tsunami of 2004: the government sees aquaculture as a post-disaster development tool and a method to increase community resilience in rural areas. Aquaculture now constitutes nearly half of global seafood production. Because of this importance, and the attention such practices receive through funding and extension, many scholars have focused on the social impacts of aquaculture on rural communities, in particular its effects on environmental conditions, food security, livelihoods, gender relations, and social conflict. However, more scholarship is needed concerning the historical legacies that have shaped how aquaculture is promoted and practiced, particularly its connections to the Green Revolution, and more research is needed on commercial aquaculture as a post-disaster development strategy. My research, based on nine months of ethnographic fieldwork and archival analysis in Tamil Nadu, India, contributes to this body of literature. I synthesize post-development theory with theories of environmental risk and vulnerability, building upon the work of scholars such as James Ferguson, Tania Li, and Piers Blaikie. My analysis uncovers large disparities between the goals of aquaculture development programs and actual aquaculture outcomes. I attribute this to the technocratic governance structure of the aquaculture industry, which leads to a lack of engagement and participation among aquaculture managers, researchers, and practitioners. This lack of engagement ultimately makes the communities in which aquaculture is practiced more vulnerable to anthropogenic and natural disturbances. Additionally, I found that aquaculture practices in the study site are causing significant changes to local agrarian structures, particularly through changes to labor, with implications for social stratification and the disempowerment of women. Overall, these findings contribute to the anthropological study of aquaculture as well as to theories of post-development.
11

De, Domenico Antonio. „Technique de gestion de ressources radios pour l'amélioration de l'efficacité énergétique dans les réseaux cellulaires hétérogènes“. Thesis, Grenoble, 2012. http://www.theses.fr/2012GRENM012/document.

Annotation:
Wireless communication proliferates into nearly every aspect of human society, driving exponential growth in the number of permanently connected devices. Powerful smart-phones and tablets, ubiquitous wireless broadband access, and machine-to-machine communications generate volumes of data traffic that were unpredictable a few years ago. In this novel paradigm, the telecommunication industry has to simultaneously guarantee the economic sustainability of broadband wireless communications and users' quality of experience. Additionally, there is a strong social incentive to reduce the carbon footprint of mobile communications, which has notably increased in the last decade. In this context, the integration of femtocells in cellular networks is a low-power, low-cost solution to offer high data rates to indoor customers and simultaneously offload the macrocell network. However, the massive and unplanned deployment of femtocell access points and their uncoordinated operation may result in harmful co-channel interference. Moreover, a high number of lightly loaded cells increases the network energy consumption. In this thesis, we investigate the effects of femtocell deployment on cellular network energy efficiency. Moreover, we look into adaptive mechanisms for femtocell networks as a means to pave the way towards agile and economically viable mobile communications. Our goal is to dynamically match resource demand and offered capacity in order to limit the average power consumption and co-channel interference while guaranteeing quality-of-service constraints. We take advantage of the unusual communication context of femtocells to propose resource allocation and network management schemes that coordinate access point activity, power consumption, and coverage. Simulation results show that our proposals improve system energy efficiency and users' performance in both networked and stand-alone femtocell deployment scenarios.
12

Nyqvist, Daniel. „In vivo imaging of islet cells and islet revascularization /“. Stockholm, 2007. http://diss.kib.ki.se/2007/978-91-7357-116-6/.

13

Lee, Ki Sun. „Towards an Improved Baton Technique: The Application and Modification of Conducting Gestures Drawn from the Methods of Rudolf, Green and Saitō for Enhanced Performance of Orchestral Interpretations“. Diss., The University of Arizona, 2008. http://hdl.handle.net/10150/193786.

Annotation:
Since the early nineteenth century, orchestras in concert have been led by a conductor rather than by the concertmaster or by the composer at a keyboard instrument. In that early period there was no theory of the conductor's function and no technique for conducting an orchestra or choir; early conductors probably imitated the bow motions of the concertmaster, the leader of the instrumental players. As conducting grew in importance, conductors not only cued entrances and cut-offs as the concertmaster had done, but also helped the musicians understand their musical interpretation and play as a unified musical body. This new role soon required the training of future generations of conductors and, eventually, conducting textbooks with guidelines and other educational material for the apprentice conductor. In this paper, the author explores the historical background of conducting technique and the development of conducting textbooks in the twentieth century. Three textbooks representing different approaches were chosen: Max Rudolf's The Grammar of Conducting, Elizabeth A. H. Green's The Modern Conductor, and Hideo Saitō's The Saitō Conducting Method. The author analyzes the conducting theory presented in each textbook, pinpoints the strengths and weaknesses of the three schools, and then suggests integrated beat patterns that combine elements from them, proposing more effective conducting gestures for his interpretation of the music. The focus of these integrated beat patterns is on the physical gestures and patterns of the right hand, not on left-hand or specific expressive gestures. Chapter 2 summarizes the characteristics of the three conducting theories. Chapter 3 analyzes the basic characteristic motions of each school. Chapters 4 through 6 propose adaptations of conducting gestures, drawing on the three schools to interpret challenging sections of three examples: the Marche royale from Histoire du soldat by Igor Stravinsky, the Adagio for Strings by Samuel Barber, and the overture Die Hebriden by Felix Mendelssohn. For the most effective performance of the author's interpretations, integrated beat patterns are suggested for the phrases shown in the musical examples; some are presented in diagrams derived from the author's synthesis of the basic five motions of the three schools.
14

Ibrahim, Nidal. „Caractérisation des propriétés mécaniques des géomatériaux par technique de micro indentation“. Thesis, Lille 1, 2008. http://www.theses.fr/2008LIL10048/document.

Annotation:
The technology of micro-indentation is one of the techniques of material characterization (using small specimens) in various fields (mechanical engineering, civil engineering, the oil industry, and the pharmaceutical industry). Its main advantage lies in meeting a number of practical requirements regarding the problem of small specimens. The present study is devoted to the characterization of the mechanical properties of geomaterials, especially rocks involved in petroleum engineering. After presenting the methodology of the indentation test for isotropic rocks, we developed a semi-analytical method based on the use of the Green function to characterize transversely isotropic rocks (the five elastic parameters of these rocks). The influence of various loadings (mechanical, thermal, hydrous) on the rock-mechanics properties was studied using micro-indentation technology together with the methodology proposed for transverse isotropy. Moreover, we characterize the failure parameters (C and φ) by a combined approach of the indentation test and a micro-compression (MCS) test carried out with the indentation device. Finally, we use inverse analysis to identify the parameters of a Drucker-Prager model. In the absence of a direct solution of the indentation problem in the plastic regime, we resorted to numerical modelling with a finite element code (ABAQUS) to determine the calculated indentation curve. This determination proved entirely convincing and was further validated by a simulation of triaxial compression tests on the same material.
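As a rough illustration of how elastic properties are extracted from an indentation curve, the sketch below applies the standard Oliver-Pharr relations for an isotropic material. It is not the semi-analytical Green-function method of the thesis, and the stiffness, contact area, and indenter constants are illustrative assumptions.

```python
import math

def reduced_modulus(S, A_c, beta=1.0):
    # Oliver-Pharr: E_r = sqrt(pi)/(2*beta) * S / sqrt(A_c), with S the
    # unloading stiffness (N/m) and A_c the projected contact area (m^2).
    return math.sqrt(math.pi) / (2.0 * beta) * S / math.sqrt(A_c)

def sample_modulus(E_r, nu_s, E_i=1141e9, nu_i=0.07):
    # Invert 1/E_r = (1-nu_s^2)/E_s + (1-nu_i^2)/E_i (diamond indenter defaults).
    inv_E_s = 1.0 / E_r - (1.0 - nu_i ** 2) / E_i
    return (1.0 - nu_s ** 2) / inv_E_s

# Illustrative numbers: 5e5 N/m unloading stiffness over a 100 um^2 contact,
# with a Poisson ratio of 0.25 assumed for the rock.
E_r = reduced_modulus(S=5e5, A_c=100e-12)
print(f"E_r = {E_r / 1e9:.1f} GPa, E_sample = {sample_modulus(E_r, 0.25) / 1e9:.1f} GPa")
```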
15

Vimal, Ruppert. „Des aires protégées aux réseaux écologiques : science, technique et participation pour penser collectivement la durabilité des territoires“. Thesis, Montpellier 2, 2010. http://www.theses.fr/2010MON20246/document.

Annotation:
Nature conservation strategies are evolving from the strict protection of isolated spaces devoid of human activity towards the integration of biodiversity issues into territorial development. This change of perspective requires collective management of complexity and a collective conception of nature, and it highlights the developing relationship between science, policy, and society. Based on a multidisciplinary approach, the objective of this thesis was to elaborate recommendations for integrated conservation. In the context of the advent of ecological network strategies in France, our research concerns both the spatial methods and the socio-technical processes in charge of conservation planning. The analysis of these two issues produced results pertinent to each theme, as well as a more general reflection on the role of technical expertise in addressing such new issues. We show how an overly technical approach, notably one that aims to identify the network spatially, tends to limit the collective sharing of the major issues, and thus the stakeholders' adherence to conservation goals, and to provide a reductive and partial vision of conservation issues. In contrast, positioning expertise as an accompaniment to territorial dialogue favours social learning and leads to a framework for public action that more fully integrates the uncertainty and complexity of the natural world. The participative process could thus allow a transition from technical expertise to collective expertise as the foundation of public action, ensuring the inclusion of everyone's knowledge and know-how. The issue, then, is not one of compromise between science, technique, and social debate, but of how they can complement each other through interaction, and how this interaction may be conducted.
16

Bahoura, Samy Skander. „Etude du sup u × inf u pour l'équation de la courbure prescrite en dimension n ≥ 3 et inégalités de Harnack“. Paris 6, 2003. https://tel.archives-ouvertes.fr/tel-00009722.

17

Cortez, Juliana 1984. „Pré-concentração baseada na técnica de ring oven para microanálise : determinação simultânea de sódio, ferro e cobre em etanol hidratado combustível por espectroscopia de emissão óptica em plasma induzido por laser (LIBS)“. [s.n.], 2012. http://repositorio.unicamp.br/jspui/handle/REPOSIP/249968.

Annotation:
Advisor: Celio Pasquini
Doctoral thesis - Universidade Estadual de Campinas, Instituto de Química
The ring oven technique, originally developed by Weisz in 1954, is revisited for use in a simple yet highly efficient procedure for analyte preconcentration prior to determination by the micro-analytical techniques presently available. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection system. The filter paper is held in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center of the filter paper to the solvent front. After the total sample volume has been delivered, a ring with a sharp circular contour (c.a. 0.35 mm wide and about 2.0 cm in diameter) is formed on the paper, containing most of the analytes originally present in the sample volume. Preconcentration coefficients can reach 250-fold (on a m/m basis) for a sample volume as small as 600 µL. The proposed system and procedure were evaluated for the preconcentration of Na, Fe, and Cu at the µg mL⁻¹ level in fuel ethanol. The preconcentration procedure uses no solvent or reagent besides those already present in the sample, in agreement with the principles of green analytical chemistry. The simultaneous determination of these species in the ring contour was made by employing the micro-analytical technique of laser-induced breakdown spectroscopy (LIBS). The detection limits of 0.9, 0.5, and 0.4 mg kg⁻¹ for sodium, iron, and copper, respectively, are sufficient to meet the requirements of the National Agency of Petroleum (ANP) for the quality control of fuel ethanol. The potential of the association between the ring oven technique and LIBS, a representative micro-analytical technique, was demonstrated, helping to achieve better detectability in LIBS measurements and opening the path to new analytical proposals and to association with other micro-analytical techniques.
Doctorate in Analytical Chemistry (Doutora em Ciências)
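The roughly 250-fold (m/m) enrichment quoted in the abstract follows from simple mass bookkeeping: the analytes in the whole processed sample end up in the small mass of paper forming the ring. A minimal sketch, in which the ring-zone mass and the ethanol density are illustrative assumptions:

```python
def preconcentration_factor(sample_volume_uL, density_g_per_mL=0.79,
                            ring_mass_mg=1.9):
    # m/m enrichment = (mass of sample processed) / (mass of the ring zone
    # that finally holds the analytes); both parameters are assumed values.
    sample_mass_mg = sample_volume_uL * 1e-3 * density_g_per_mL * 1e3
    return sample_mass_mg / ring_mass_mg

print(f"{preconcentration_factor(600):.0f}-fold")  # ~250-fold for 600 uL
```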
18

Videv, Stefan. „Techniques for green radio cellular communications“. Thesis, University of Edinburgh, 2013. http://hdl.handle.net/1842/7988.

Annotation:
This thesis proposes four novel techniques to address the growing energy consumption of cellular communication networks. The first and second parts of this work propose a novel energy-efficient scheduling mechanism and two new bandwidth-management techniques, while the third part provides an algorithm to actively manage the power state of base stations (BSs) so that energy consumption is minimized throughout the day while users suffer a minimal loss in achieved data rate. The proposed energy-efficient score-based scheduler (EESBS) builds on the existing principle of score-based resource allocation: resource blocks (RBs) are given scores based on their energy efficiency for every user, and allocation is then decided by comparing the scores of the different users on each RB. Two additional techniques allow the scheduler to manage a user's bandwidth footprint, in other words the number of RBs allocated. The first, bandwidth expansion mode (BEM), lets users expand their bandwidth footprint while retaining their overall transmission data rate; this saves energy because data rate scales linearly with bandwidth but only logarithmically with transmission power. The second, time compression mode (TCoM), targets users whose energy consumption is dominated by signalling-overhead transmissions. If the overhead is assumed proportional to the number of RBs allocated, users with low data-rate demands can release some of their allocated RBs by using a higher-order modulation on the remaining ones and thus reduce their overall energy expenditure. A system that combines all of the aforementioned scheduling techniques is also discussed, with both theoretical and simulation results on its performance. The energy-efficient hardware state control (EESC) algorithm works by first collecting statistics on the load of each BS during the day, reflecting users' mobility patterns. It then uses that information to switch BSs off for the parts of the day when the expected load is low and nearby cell sites can absorb their users. Simplified theoretical results are included alongside complete system-level simulations. All the algorithms presented are straightforward to implement and not computationally intensive, and they deliver significant reductions in energy consumption at little or no cost in terms of user data rate.
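The rationale behind BEM, that rate grows linearly with bandwidth but only logarithmically with power, can be seen in a back-of-the-envelope Shannon calculation. The sketch below is illustrative only (the thesis's scheduler operates on scored resource blocks); the 100 dB channel loss and the thermal noise density are assumed values.

```python
def tx_power_w(rate_bps, bandwidth_hz, gain=1e-10, n0=3.98e-21):
    # Invert Shannon's R = B*log2(1 + P*gain/(n0*B)) for the transmit power P.
    return (2.0 ** (rate_bps / bandwidth_hz) - 1.0) * n0 * bandwidth_hz / gain

R = 2e6                  # hold the data rate fixed at 2 Mbit/s...
for rbs in (1, 2, 4):    # ...while expanding the bandwidth footprint
    B = rbs * 180e3      # an LTE resource block spans 180 kHz
    print(f"{rbs} RB(s): {tx_power_w(R, B) * 1e3:8.2f} mW")
```

Quadrupling the footprint in this toy setting cuts the required transmit power by roughly two orders of magnitude, which is exactly the trade-off BEM exploits.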
19

Maaz, Mohamad. „Allocation de ressource et analyse des critères de performance dans les réseaux cellulaires coopératifs“. Thesis, Rennes, INSA, 2013. http://www.theses.fr/2013ISAR0036/document.

Annotation:
In wireless systems, transmitting large amounts of information and doing so at a low energy cost are the two main issues that have never stopped drawing the attention of the scientific community over the past decade. It has been shown that cooperative communication is an appealing technique that exploits spatial diversity in the wireless channel; it promises robust and reliable communications and higher quality of service (QoS), making the cooperation concept attractive for future cellular systems. Typically, the QoS requirements are the packet error rate, throughput, and delay. These metrics are affected by the delay introduced when each erroneous packet is retransmitted several times under the Hybrid Automatic Repeat reQuest (HARQ) mechanism, which penalizes the demanded QoS but creates temporal diversity. Therefore, jointly adopting cooperative communications and HARQ mechanisms can be beneficial when designing cross-layer schemes. First, a new rate-maximization strategy under heterogeneous data-rate constraints among users is proposed. We propose an algorithm that allocates the optimal power at the base station (BS) and the relays, assigns subcarriers, and selects relays. The achievable data rate is investigated, as well as the average starvation rate in the network as the load, i.e. the number of active users, increases; the algorithm shows a significant gain in global capacity compared to the literature. Second, in block-fading channels, theoretical analyses of the packet error rate, delay, and throughput efficiency in relay-assisted HARQ networks are provided. In slow-fading channels, the average delay of HARQ mechanisms with respect to the fading states is not relevant because the fading process is non-ergodic. The delay outage is hence invoked to deal with the slow-fading channel; it is defined as the probability that the average delay with respect to the AWGN noise exceeds a predefined threshold. This criterion, although important for delay-sensitive applications in slow-fading channels, had not been studied in the literature. An analytical form of the delay-outage probability is then proposed, which may be useful to avoid lengthy simulations. These analyses consider a finite packet length and a given modulation and coding scheme (MCS), so the performance of practical systems is studied. Third, a theoretical analysis of the energy efficiency (bits/joule) in relay-assisted HARQ networks is provided. Based on this analysis, an energy-minimization problem in multiuser relay-assisted downlink cellular networks is investigated. Each user has an average delay constraint to be satisfied such that a total power constraint in the system is respected. The BS is assumed to know only the average channel statistics, with no instantaneous channel state information (CSI). Finally, an algorithm that jointly allocates the optimal power at the BS and the relay stations and selects the optimal relay in order to satisfy the users' delay constraints is proposed. The simulations show the improvement in energy consumption of relay-assisted techniques compared to non-aided transmission in delay-constrained systems. Hence, the work proposed in this thesis gives useful insights for engineering rules in the design of next-generation energy-efficient cellular systems.
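The delay-outage definition above lends itself to a quick Monte Carlo check. The sketch below is a toy model, not the closed form derived in the thesis: the per-round packet error rate follows an assumed exponential curve, and retransmissions are i.i.d. conditioned on the (slow) fading draw.

```python
import math
import random

def delay_outage(mean_snr_db=10.0, d_max=2.0, a=90.0, b=1.2, trials=200_000):
    # Estimates P(mean HARQ rounds > d_max), where the mean is taken over the
    # AWGN noise for a fixed fading gain g: with per-round error rate per(g),
    # the mean number of rounds is 1/(1 - per(g)) (geometric distribution).
    snr = 10.0 ** (mean_snr_db / 10.0)
    hits = 0
    for _ in range(trials):
        g = random.expovariate(1.0)                 # Rayleigh power gain
        per = min(1.0, a * math.exp(-b * g * snr))  # assumed PER model
        mean_rounds = math.inf if per >= 1.0 else 1.0 / (1.0 - per)
        hits += mean_rounds > d_max
    return hits / trials

print(f"delay-outage probability ~ {delay_outage():.3f}")
```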
20

Hermanns, Sebastian [Verfasser]. „Nonequilibrium Green functions - selfenergy approximation techniques / Sebastian Hermanns“. Kiel : Universitätsbibliothek Kiel, 2017. http://d-nb.info/1143595246/34.

21

Taylor, Sarah Frances Rebecca. „Green catalyst preparation using electrochemical and mechanochemical techniques“. Thesis, Queen's University Belfast, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.580117.

Annotation:
The conventional method for synthesizing supported metal catalysts is a multi-step process that produces large amounts of waste. The focus of this work has been to look at alternative methods of catalyst production which eliminate or reduce the number of steps and therefore the waste produced. The aim of the project was to synthesize supported metal catalysts by alternative (greener) techniques and then to test these catalysts alongside conventional catalysts for improved and novel activity. It was proposed that catalysts could be synthesized by a novel electrochemical route in which the pure metal is electrochemically oxidized into solution to form a transient soluble complex. This complex is then reduced electrochemically/electrolessly and the metal is deposited onto a support, with the ligand of the complex being recovered in solution for subsequent cycles. The development of such a system was studied for gold in ionic liquids; the stripping and deposition of gold was demonstrated using the dicyanamide ligand ([DCA]−), but attempts to prove the recyclability of the ligand were not successful. However, during these studies a set of active gold catalysts was prepared by the electroless deposition of gold from H[AuCl4]·3H2O in [C4mim][NTf2] onto silica and titania. The activity of these catalysts was compared to standard wet-impregnated catalysts; interestingly, the preparation method was found to control the selectivity of the reaction. The standard catalysts showed activity for the oxidation of benzyl alcohol in toluene, whereas the catalysts prepared by electroless deposition from ionic liquids showed Friedel-Crafts alkylation of benzyl alcohol with toluene. This is the first time that heterogeneously supported gold catalysts have been found to be active in Friedel-Crafts alkylations. Ag/Al2O3 and Pt/Al2O3 catalysts have also been prepared by means of solvent-free mechanochemistry using a ball mill. Remarkable catalytic activity for octane-SCR was observed using a Ag/Al2O3 catalyst prepared by ball milling (from Ag2O): compared with a conventionally prepared (wet-impregnated) Ag/Al2O3 catalyst, the ball-milled catalyst shows an increase in activity, with a reduction in the light-off temperature of about 150 °C and NOx conversion below 200 °C, the first time this has been achieved in the absence of hydrogen.
22

Adhinarayanan, Vignesh. „Models and Techniques for Green High-Performance Computing“. Diss., Virginia Tech, 2020. http://hdl.handle.net/10919/98660.

Annotation:
High-performance computing (HPC) systems have become power limited. For instance, the U.S. Department of Energy set a power envelope of 20 MW in 2008 for the first exascale supercomputer, now expected to arrive in 2021-22. Toward this end, we seek to improve the greenness of HPC systems by improving their performance per watt at the allocated power budget. In this dissertation, we develop a series of models and techniques to manage power at micro-, meso-, and macro-levels of the system hierarchy, specifically addressing data movement and heterogeneity. We target the chip interconnect at the micro-level, heterogeneous nodes at the meso-level, and a supercomputing cluster at the macro-level. Overall, our goal is to improve the greenness of HPC systems by intelligently managing power. The first part of this dissertation focuses on measurement and modeling problems for power. First, we study how to infer chip-interconnect power by observing the system-wide power consumption. Our proposal is to design a novel micro-benchmarking methodology based on data-movement distance by which we can properly isolate the chip interconnect and measure its power. Next, we study how to develop software power meters to monitor a GPU's power consumption at runtime. Our proposal is to adapt performance counter-based models for their use at runtime via a combination of heuristics, statistical techniques, and application-specific knowledge. In the second part of this dissertation, we focus on managing power. First, we propose to reduce the chip-interconnect power by proactively managing its dynamic voltage and frequency scaling (DVFS) state. Toward this end, we develop a novel phase predictor that uses approximate pattern matching to forecast future requirements and, in turn, proactively manage power. Second, we study the problem of applying a power cap to a heterogeneous node. Our proposal proactively manages the GPU power using phase prediction and a DVFS power model but reactively manages the CPU. The resulting hybrid approach can take advantage of the differences in the capabilities of the two devices. Third, we study how in-situ techniques can be applied to improve the greenness of HPC clusters. Overall, in our dissertation, we demonstrate that it is possible to infer the power consumption of real hardware components without directly measuring them, using the chip interconnect and GPU as examples. We also demonstrate that it is possible to build models of sufficient accuracy and apply them for intelligently managing power at many levels of the system hierarchy.
Doctor of Philosophy
Past research in green high-performance computing (HPC) mostly focused on managing the power consumed by general-purpose processors, known as central processing units (CPUs) and to a lesser extent, memory. In this dissertation, we study two increasingly important components: interconnects (predominantly focused on those inside a chip, but not limited to them) and graphics processing units (GPUs). Our contributions in this dissertation include a set of innovative measurement techniques to estimate the power consumed by the target components, statistical and analytical approaches to develop power models and their optimizations, and algorithms to manage power statically and at runtime. Experimental results show that it is possible to build models of sufficient accuracy and apply them for intelligently managing power on multiple levels of the system hierarchy: chip interconnect at the micro-level, heterogeneous nodes at the meso-level, and a supercomputing cluster at the macro-level.
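A minimal sketch of the counter-based power-modeling idea, under the assumption of a linear relationship between a few normalized counters and board power. The counters, readings, and wattages below are hypothetical, and the dissertation's runtime models layer heuristics and application-specific knowledge on top of such a fit.

```python
import numpy as np

# Hypothetical training data: per-interval readings of three GPU counters
# (utilization, DRAM accesses, SM clock) against measured board power (W).
X = np.array([[0.2, 1.0e6, 1.1e9],
              [0.5, 3.0e6, 1.3e9],
              [0.8, 5.5e6, 1.5e9],
              [0.9, 7.0e6, 1.5e9]])
y = np.array([55.0, 95.0, 140.0, 160.0])

mu, sd = X.mean(axis=0), X.std(axis=0)
A = np.hstack([np.ones((len(X), 1)), (X - mu) / sd])   # intercept + z-scores
w, *_ = np.linalg.lstsq(A, y, rcond=None)              # fit P = w0 + w . x

x_new = (np.array([0.7, 4.8e6, 1.4e9]) - mu) / sd      # fresh counter sample
print(f"estimated power: {w[0] + w[1:] @ x_new:.1f} W")
```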
23

Fransson, Jonas. „Non-Orthogonality and Electron Correlations in Nanotransport : Spin- and Time-Dependent Currents“. Doctoral thesis, Uppsala University, Department of Physics, 2002. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-2687.

Annotation:

The concept of the transfer Hamiltonian formalism has been reconsidered and generalized to include the non-orthogonality between the electron states in an interacting region, e.g. quantum dot (QD), and the states in the conduction bands in the attached contacts. The electron correlations in the QD are described by means of a diagram technique for Hubbard operator Green functions for non-equilibrium states.

It is shown that the non-orthogonality between the electron states in the contacts and the QD is reflected in the anti-commutation relations for the field operators of the subsystems. The derived formula for the current contains corrections from the overlap of the same order as the widely used conventional tunneling coefficients.

It is also shown that kinematic interactions between the QD states and the electrons in the contacts renormalize the QD energies in a spin-dependent fashion. The structure of the renormalization provides an opportunity to induce a spin splitting of the QD levels by polarizing the conduction bands in the contacts and/or imposing different hybridizations between the states in the contacts and the QD for the two spin channels. This leads to a substantial amplification of the spin polarization in the current, suggesting applications in magnetic sensors and spin filters.

APA, Harvard, Vancouver, ISO und andere Zitierweisen
24

West, Andrew. „Green techniques in reversible-deactivation radical polymerization : alternative synthetic methodologies“. Thesis, The University of Sydney, 2010. https://hdl.handle.net/2123/28902.

Der volle Inhalt der Quelle
Annotation:
Recently, environmental concerns caused the United States Environmental Protection Agency to draw up a list of 12 'Principles of Green Chemistry' in order to combat environmentally harmful and wasteful practices, with a view to reducing the environmental impact caused by the chemical industry. While many of these principles may be achieved through more conscientious design of experiments, it is sometimes unavoidable to use certain chemicals in the pursuit of the desired result. Living radical polymerization is a useful tool for the synthesis of well-defined structures through the use of various techniques, such as reversible addition fragmentation chain transfer (RAFT) and atom transfer radical polymerization (ATRP). These techniques are very powerful weapons in the polymer chemist's arsenal, but are not without limitation. ATRP in particular requires large amounts of soluble copper species in order to synthesise polymers, which can be difficult to remove. Most polymerizations are undertaken in a volatile organic solvent making up approximately 50% of the reaction media. Whilst alternatives have been suggested, many require specific and extreme reaction conditions. In this work we address these issues by proposing an alternative system for ATRP that requires little purification to remove metal contamination and provides the basis for an alternative initiation system for RAFT polymerization. We also detail the use of an alternative, low-cost green solvent for use in RAFT polymerizations, with a five-fold increase in the rate of polymerization when compared to standard volatile organic solvents.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
25

Wang, Rui. „Energy-efficient LTE transmission techniques : introducing Green Radio from resource allocation perspective“. Thesis, University of Edinburgh, 2011. http://hdl.handle.net/1842/9596.

Der volle Inhalt der Quelle
Annotation:
Energy consumption has recently become a key issue from both environmental and economic considerations. A typical mobile phone network in the UK may consume approximately 40-50 MW, contributing a significant proportion of the total energy consumed by the information technology industry. With the worldwide growth in the number of mobile subscribers, the associated carbon emissions and growing energy costs are becoming a significant operational expense, leading to the need for energy reduction. The Mobile VCE Green Radio Project has been launched, which targets a 100x energy reduction of current wireless networks by 2020. In this thesis, energy-efficient resource allocation strategies are investigated, taking the LTE system as an example. Firstly, a theoretical analysis of energy-efficient design in cellular environments is provided according to Shannon theory. Based on a two-link scenario, the performance of simultaneous transmission and orthogonal transmission for network power minimization under specified rate constraints is investigated. It is found that simultaneous transmission consumes less power than orthogonal transmission close to the base station, but much more power in the cell-edge area. Also, simulation results suggest that the energy-efficient switching margins between these two schemes are dominated by the sum of their required data rates. New definitions of power-utility and fairness metrics are further proposed, followed by the design of weighted resource allocation approaches based on efficiency-fairness trade-offs. Apart from energy-efficient multiple access between different links, the energy used by individual base stations can also be reduced. For example, deploying sleep modes is an effective approach to reduce radio base station operational energy consumption. By periodically switching off the base station transmission, or using fewer transmit antennas, the energy consumption of base station hardware may decrease. By delivering less control signalling overhead, the radio frequency energy consumption can also be reduced. Simulation results suggest that up to 90% energy reduction can be obtained in low traffic conditions by employing time-domain optimization in each radio frame; a toy model of this duty-cycle effect is sketched below. The optimum on/off duty cycle is derived, enabling the energy consumption of the base station to scale with traffic loads. In the spatial domain, an antenna selection criterion is proposed, indicating the most energy-efficient antenna configuration given knowledge of users' locations and quality-of-service requirements. Without time-domain sleep modes, using fewer transmit antennas can outperform full antenna transmission. However, with time-domain sleep modes, using all available antennas is generally the most energy-efficient choice.
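As a toy illustration of the time-domain sleep-mode argument, the sketch below computes per-frame energy when the on/off duty cycle tracks the offered load. All power figures are assumed for illustration and are not taken from the thesis.

```python
def frame_energy(load, p_active=1300.0, p_sleep=150.0, frame_s=0.010):
    """Energy (J) of one radio frame when the base station transmits
    only for the fraction of the frame needed to serve the load.

    load     -- offered traffic as a fraction of capacity, 0..1
    p_active -- power while transmitting (W), assumed figure
    p_sleep  -- power in sleep mode (W), assumed figure
    """
    duty = min(max(load, 0.0), 1.0)   # on/off duty cycle tracks the load
    return frame_s * (duty * p_active + (1.0 - duty) * p_sleep)

always_on = frame_energy(1.0)
low_load = frame_energy(0.05)
print(f"saving at 5% load: {100 * (1 - low_load / always_on):.0f}%")  # ~84%
```

With these assumed figures the saving at 5% load is about 84%, broadly consistent with the up-to-90% reduction the thesis reports for low-traffic conditions.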
APA, Harvard, Vancouver, ISO und andere Zitierweisen
26

Rahman, Mustazibur. „Management of City Traffic, Using Wireless Sensor Networks with Dynamic Model“. Thèse, Université d'Ottawa / University of Ottawa, 2014. http://hdl.handle.net/10393/30916.

Der volle Inhalt der Quelle
Annotation:
The road network of a region is of paramount importance to its overall development. Management of road traffic is a key task for the city authority, and reducing road traffic congestion is a significant challenge in this perspective. In this thesis, a Wireless Sensor Network (WSN) based road-traffic monitoring scheme with a dynamic mathematical traffic model is presented that does not necessarily include all adjacent intersections of a block, but rather the important major intersections of a city. The objective of this scheme is to reduce congestion by re-routing vehicles to better-performing road segments, informing downstream drivers by broadcasting the congestion information on a dedicated radio channel. The dynamic model can provide the instantaneous status of the traffic in the road network. The scheme is a WSN-based multi-hop relay network with a hierarchical architecture, composed of ordinary nodes, Cluster-Head nodes, Base Stations, Gateway nodes and Monitoring and Control Centers (MCC). By collecting the traffic information, the MCC checks the congestion status; threshold factors are used in this model to define congestion. For the congested state of a road segment, a cost function has been defined as a performance indicator and estimated using the weight factors (importance) of the selected intersections. This thesis considers a traffic network with twelve major intersections of a city and four major directions. Traffic arrivals at these intersections are assumed to follow a Poisson distribution. The model was simulated in MATLAB with traffic generated through a Poisson random number generator, and the cost function was estimated for the congestion status of the road segments over a simulation period of 1440 minutes starting from midnight. For optimization purposes, two different approaches were adopted: in the first, the performance of the scheme was evaluated iteratively for all threshold factor values, one at a time, applying a single threshold factor value to define the threshold capacities of all the road segments; traffic was generated and the relative cost estimated following the model specifications, with the purpose of congestion avoidance. In the second approach, different threshold factor values were used for different road segments to determine the optimum set-up, and an exhaustive search technique was applied to a smaller configuration in order to keep the computations tractable. Simulation results show the capacity of this scheme to improve traffic performance by reducing the congestion level at low congestion costs. A minimal sketch of such a simulation loop appears below.
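In the sketch below, the capacities, weight factors, threshold factor and arrival-rate profile are all invented for illustration; the thesis's MATLAB model is more detailed.

```python
import numpy as np

rng = np.random.default_rng(42)

n_segments = 12                                  # major intersections
capacity = rng.integers(40, 80, n_segments)      # vehicles/min, assumed
weights = rng.random(n_segments)                 # importance per segment
weights /= weights.sum()
threshold_factor = 0.8                           # congestion threshold

def step_cost(lam):
    """One simulated minute: Poisson arrivals per segment; congestion is
    flagged where arrivals exceed threshold_factor * capacity, and a
    weighted cost is accumulated over the congested segments."""
    arrivals = rng.poisson(lam)
    congested = arrivals > threshold_factor * capacity
    return float(np.sum(weights[congested]))

# 1440 one-minute steps starting at midnight, with a crude morning peak.
total = 0.0
for minute in range(1440):
    peak = 1.0 if 420 <= minute < 600 else 0.0   # 07:00-10:00
    lam = capacity * (0.3 + 0.5 * peak)
    total += step_cost(lam)
print(f"accumulated congestion cost: {total:.1f}")
```

An optimizer in the spirit of the thesis's first approach would repeat this loop over a grid of threshold_factor values and keep the value with the lowest accumulated cost.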
APA, Harvard, Vancouver, ISO und andere Zitierweisen
27

Dupleix, Anna. „Faisabilité du déroulage du bois assisté par infrarouge“. Thesis, Paris, ENSAM, 2013. http://www.theses.fr/2013ENAM0044/document.

Der volle Inhalt der Quelle
Annotation:
In the wood-products industry, 'peeling' is the process of converting a log into a continuous thin ribbon of green wood (from 0.6 to more than 3 mm thickness) termed veneer. Veneers are mainly used for manufacturing lightweight packaging and Engineered Wood Products (EWPs) such as plywood, Laminated Veneer Lumber (LVL) and Parallel Strand Lumber (PSL). These three EWPs, manufactured from veneers glued and pressed together, are amongst the most used wood products, which is why the production of veneer plays an important role in the wood-products industry. For certain species, the peeling process requires the prior heating of round green wood to temperatures ranging from 30 to 90 °C. This treatment is necessary to increase wood deformability, to reduce the severity of lathe checking in the veneers and to reduce cutting forces. It is usually done by immersion in hot water or by steam treatment. However, it has many disadvantages, amongst which are the duration of treatment (12 to 72 hours), the washing out of polyphenolic extractives (which causes water pollution and can affect wood's natural durability), low yield and energy losses. The goal of this PhD thesis was to develop a heating system embedded on the peeling lathe to circumvent many of these disadvantages. Infrared technology appears to be the most promising solution because of the ease of integration into the peeling process and the power it offers, enabling the required heating temperatures to be achieved quickly and the highly demanding peeling speeds in use in the industry (from 1 to 5 m/s) to be followed. This new technology, using radiant energy to heat green wood prior to peeling, would be a major innovation for the industries involved in the production of plywood, LVL, etc. The plan to achieve this goal consisted of:
- creating a model of infrared heat transfer in green wood while peeling it, with the characteristics of wood (moisture content, thermal properties) among the input variables;
- investigating the thermal and optical characteristics of green wood (in terms of penetration depth and infrared absorption) to feed the model;
- validating the model with experimental peeling tests assisted by an infrared heating system.
One of the main outputs of this study was to demonstrate that the penetration depth of infrared radiation into green wood is limited to several tens of micrometers. Heat transfer into green wood down to the cutting plane (located several millimeters underneath the surface) is therefore by conduction, which is slow owing to the insulating properties of wood. Heating green wood with infrared radiation is therefore unable to match the highly demanding peeling rates in use in the industry today; a back-of-the-envelope conduction model is sketched below. However, the use of an embedded heating system in the case of slicing, and its potential impact on veneer quality (colour, surface quality), remain open for further research.
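The conduction bottleneck can be illustrated with a one-dimensional explicit finite-difference model in which the absorbed infrared flux acts as a pure surface source (justified by the tens-of-micrometres penetration depth). The material properties and flux below are assumed round numbers, not the thesis's measured values.

```python
import numpy as np

# Assumed green-wood properties (illustrative round numbers).
k, rho, cp = 0.4, 1000.0, 2500.0       # W/m/K, kg/m^3, J/kg/K
alpha = k / (rho * cp)                 # thermal diffusivity ~1.6e-7 m^2/s

nx, depth = 200, 0.004                 # grid over the first 4 mm
dx = depth / nx
dt = 0.4 * dx * dx / alpha             # explicit stability: <= dx^2/(2*alpha)

T = np.full(nx, 20.0)                  # initial wood temperature, degC
q_abs = 20e3                           # absorbed IR flux at surface, W/m^2
watch = int(0.002 / dx)                # index of the 2 mm cutting plane

t = 0.0
while T[watch] < 50.0 and t < 120.0:
    Tn = T.copy()
    # Interior nodes: explicit heat-equation update.
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # Surface node: IR flux deposited in the first cell (the penetration
    # depth of tens of micrometres is below this grid's resolution).
    Tn[0] = T[0] + dt * (q_abs / (rho * cp * dx) + alpha / dx**2 * (T[1] - T[0]))
    T, t = Tn, t + dt
print(f"2 mm plane reached {T[watch]:.0f} degC after {t:.1f} s")
```

Even with a generous flux, the 2 mm plane takes on the order of tens of seconds to warm, whereas at peeling speeds of 1-5 m/s a surface point passes the knife in a fraction of a second, which is the thesis's central negative finding.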
APA, Harvard, Vancouver, ISO und andere Zitierweisen
28

Livingstone, Niall. „Isocrates' Busiris : a commentary; with special reference to rhetorical purpose and technique“. Thesis, University of Oxford, 1995. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.321516.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
29

Kakoulli, Ioanna. „Late Classical and Hellenistic monumental paintings : techniques, materials and analysis“. Thesis, University of Oxford, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.313475.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
30

Aston, Steven David. „Acoustic techniques for property estimation in green and fired ceramic powder compact components“. Thesis, Keele University, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.323738.

Der volle Inhalt der Quelle
Annotation:
A commonly used process for the formation of ceramic wall-tile bodies is powder compaction. Variations in density introduced by the compaction process can cause rejection at later stages of production, and the quality-control equipment currently employed does not identify reject bodies in the unfired state. Scope therefore exists to reduce production costs by the timely removal of reject bodies in the unfired state. In this thesis an ultrasonic non-destructive technique is presented which allows the mapping of the density variations found in wall-tile bodies in the unfired and fired states to an accuracy of ±0.5%. An effective medium theory for the propagation of ultrasound in porous media is developed. The significance of the dependence of Young's modulus on density in determining the relationship between compression-wave propagation velocity and density is explored; a sketch of this relationship follows below. Using a vibrational resonance technique, it is shown that the evolution of Young's modulus and Poisson's ratio in the wall-tile body material is very sensitive to the conditions used for the firing operation. The Biot two-phase theory of acoustic propagation in fluid-saturated porous media, which considers dissipation due to friction between the fluid and the porous frame, is reviewed, and its applicability to the wall-tile body material assessed. It is shown that this dissipation mechanism is insignificant for this particular material. A modification is made to the model in an attempt to include dissipation due to the inelasticity and scattering of the porous frame. The results show that the two-phase theory reduces to an effective medium theory in the limit of the saturating fluid being air. The thesis concludes that density variations in wall-tile bodies can be measured using an ultrasonic technique and that an effective medium theory can be used to describe the propagation of ultrasound in porous media.
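The velocity-density link at the heart of the technique can be sketched with an assumed power-law stiffness model. The calibration constants below are invented; the point is only that when Young's modulus rises steeply with density, a small density change produces a measurable velocity change that can be inverted.

```python
import numpy as np

# Assumed calibration: E = E0 * (rho/rho0)**n, long-wavelength bar
# velocity v = sqrt(E/rho). Constants are illustrative only.
E0, rho0, n = 5.0e9, 1900.0, 4.0       # Pa, kg/m^3, exponent

def velocity(rho):
    return np.sqrt(E0 * (rho / rho0) ** n / rho)

def density_from_velocity(v):
    # Invert v**2 = (E0 / rho0**n) * rho**(n - 1) for rho.
    return (v * v * rho0 ** n / E0) ** (1.0 / (n - 1.0))

rho_true = np.array([1850.0, 1900.0, 1950.0])      # kg/m^3
v_meas = velocity(rho_true)
print(np.round(density_from_velocity(v_meas), 1))  # recovers the densities
```

With n = 4, a 0.5% density variation shifts the velocity by roughly 1.5%, which is why velocity mapping can resolve density to the ±0.5% accuracy quoted above.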
APA, Harvard, Vancouver, ISO und andere Zitierweisen
31

Ng, Siak Hong. „Application of conventional machining techniques for green ceramic compacts produced by powder reaction moulding“. Thesis, Nottingham Trent University, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.429429.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
32

Thiel, Stefan U. „The use of image processing techniques for the automated detection of blue-green algae“. Thesis, University of South Wales, 1994. https://pure.southwales.ac.uk/en/studentthesis/the-use-of-image-processing-techniques-for-the-automated-detection-of-bluegreen-algae(fd73551d-72d8-46a1-a0e8-e3c08b51f03e).html.

Der volle Inhalt der Quelle
Annotation:
The determination of water quality in freshwater lakes and reservoirs is an important task, which must be carried out on a regular basis. Information about long-term water quality can be provided by the presence of particular organisms, for example blue-green algae. Currently the detection of these algae is done in a very time-consuming manual way, involving highly trained biologists, for example those employed by the National Rivers Authority. This thesis is a first investigation into the automatic detection of blue-green algae using image processing techniques. Samples of seven species of blue-green algae and two species of green algae were examined under a microscope and transferred to a computer, and the microscope pictures were stored as digital images. In order to locate the organisms, image segmentation routines were applied. In particular, a newly developed LoG thresholding operator proved to be effective for the segmentation of biological organisms, and image enhancement improved the quality and appearance of the segmented species in the images; a toy version of LoG-based segmentation is sketched below. In the identification process the biological key, which describes some important features of each species, needed to be implemented. With the aid of shape and textural algorithms, both occluding and non-occluding organisms were analyzed and meaningful features were extracted. The obtained features were then used to classify the organisms into different species. Both discriminant analysis and neural networks were used for classification purposes, and a detection rate of approximately 90% was achieved. The approach has produced promising results and it is hoped that further investigations will be encouraged.
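A toy stand-in for LoG-based segmentation (not a reimplementation of the thesis's operator) can be written with standard tools; the sigma, threshold and area values below are arbitrary.

```python
import numpy as np
from scipy import ndimage

def log_threshold_segment(image, sigma=2.0, min_area=50):
    """Threshold the Laplacian-of-Gaussian response and label the
    connected regions; small regions are discarded as noise."""
    log = ndimage.gaussian_laplace(image.astype(float), sigma=sigma)
    mask = log < -np.std(log)        # strongly negative response at bright regions
    labels, n = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = [i + 1 for i, a in enumerate(areas) if a >= min_area]
    return np.where(np.isin(labels, keep), labels, 0)

# Synthetic image: two bright organisms on a dark background.
img = np.zeros((128, 128))
img[30:50, 30:50] = 1.0
img[80:110, 70:100] = 1.0
seg = log_threshold_segment(img)
print("regions found:", len(np.unique(seg)) - 1)   # -> 2
```

Shape and texture features would then be computed per labelled region and fed to a classifier (discriminant analysis or a neural network), mirroring the pipeline described above.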
APA, Harvard, Vancouver, ISO und andere Zitierweisen
33

FRATERRIGO, GAROFALO SILVIA. „Valorization of rice and canned tuna processing wastes: a focus on green extraction techniques“. Doctoral thesis, Politecnico di Torino, 2022. http://hdl.handle.net/11583/2960759.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
34

Katsan, Gerasimus Michael. „Unmaking history: postmodernist technique and national identity in the contemporary greek novel“. The Ohio State University, 2003. http://rave.ohiolink.edu/etdc/view?acc_num=osu1062992115.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
35

Beck, John A. „Ancient translation technique analysis with application to the Greek and Targum Jonah“. Theological Research Exchange Network (TREN), 1993. http://www.tren.com.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
36

Uzunova, Milka. „Commande non-entière des systèmes. : développement et application pour les modèles du flux de trafic routier“. Thesis, Artois, 2009. http://www.theses.fr/2009ARTO0205/document.

Der volle Inhalt der Quelle
Annotation:
This thesis presents research carried out on several elements of macroscopic traffic flow: the model, its control and the simulation of the control system. The main aim of the study is to keep circulation on highways fluid, that is, to guarantee the stability of the process while offering the best performance and quality of traffic service to users of the road network. We use the analytical solution of the dynamic equation of the LWR traffic flow model to obtain a transfer function. Our objective is to obtain a result conforming to a toll plaza, and then to choose an appropriate control algorithm that satisfies the needs of the traffic network and its users. The need for traffic flow management results from ever-increasing flows, which cause saturation at certain places in the road network, widely known as traffic jams, usually at rush hours or because of accidents or road works. All of this delays transportation and has significant environmental after-effects. It is therefore very important to ensure the fluidity of traffic using control strategies which cancel, reduce or delay the appearance of traffic jams. For these reasons, we propose a non-integer order control algorithm to keep the traffic fluid through control of the toll barriers at the toll plaza. The control variable is the upstream density, which influences the downstream one. From the analytical solution of the toll plaza model we obtain a delay function, which represents the plant in our distributed-parameter system. For this system we apply a Smith-predictor non-integer control algorithm, and we further improve the system with a dead-time non-integer order compensator.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
37

Rutherford, I. „Technique and innovation in late Greek stylistics : Six studies in the idea-theory of Hermogenes and PS.Aristides“. Thesis, University of Oxford, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.375986.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
38

Mei, Xinxin. „Energy conservation techniques for GPU computing“. HKBU Institutional Repository, 2016. https://repository.hkbu.edu.hk/etd_oa/298.

Der volle Inhalt der Quelle
Annotation:
The emerging general-purpose graphics processing unit (GPGPU) computing has tremendously sped up a great variety of commercial and scientific applications, and GPUs have become prevalent accelerators in current high-performance clusters. Though the computational capacity per watt of GPUs is much higher than that of CPUs, hybrid GPU clusters still consume enormous power, so conserving energy on such clusters is of critical significance. In this thesis, we seek energy-conservative computing on GPU-accelerated servers. We introduce our studies as follows. First, we dissect the GPU memory hierarchy, because most GPU applications suffer from the GPU memory bottleneck. We find that conventional CPU cache models cannot be applied to modern GPU caches, and that the microbenchmarks used to study conventional CPU caches become invalid for the GPU. We propose GPU-specific microbenchmarks to examine GPU memory structures and properties. Our benchmark results verify that the design goal of the GPU has shifted from pure computational performance to better energy efficiency. Second, we investigate the impact of dynamic voltage and frequency scaling (DVFS), a successful energy management technique for CPUs, on GPU platforms. Our experimental results suggest that GPU DVFS is still promising for conserving energy, but the patterns that save energy differ strongly from those of the CPU, and the effect of GPU DVFS depends on the characteristics of the individual application. Third, we derive GPU DVFS power and performance models from our experimental results, based on which we find the optimal GPU voltage and frequency setting to minimize the energy consumption of a single GPU task; a toy version of this optimization is sketched below. We then study the problem of scheduling multiple tasks on a hybrid CPU-GPU cluster to minimize the total energy consumption by GPU DVFS, and design an effective offline scheduling algorithm which can reduce the energy consumption significantly. Finally, we combine GPU DVFS and dynamic resource sleep (DRS), another energy management technique, to further conserve energy for online task scheduling on hybrid clusters. Though the idle energy consumption increases significantly compared to the offline problem, our online scheduling algorithm still achieves more than 30% energy conservation with appropriate runtime GPU DVFS readjustments.
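The single-task optimization can be illustrated with a toy model in which dynamic power follows C·V²·f, a compute-bound task's runtime scales as 1/f, and static power is paid for the whole runtime. All constants and DVFS states below are invented for illustration.

```python
C_EFF = 1.0e-7     # effective switching capacitance, J/(V^2*Hz), assumed
P_STATIC = 100.0   # static power paid while the task runs (W), assumed
WORK = 2.0e12      # task length in GPU cycles, assumed

# Candidate (voltage V, frequency Hz) DVFS states, assumed values.
states = [(0.85, 0.8e9), (0.95, 1.0e9), (1.05, 1.2e9), (1.15, 1.4e9)]

def task_energy(v, f):
    t = WORK / f                          # compute-bound runtime model
    p = C_EFF * v * v * f + P_STATIC      # dynamic CV^2f plus static power
    return p * t

for v, f in states:
    print(f"V={v:.2f}  f={f/1e9:.1f} GHz  E={task_energy(v, f)/1e3:.0f} kJ")
best = min(states, key=lambda s: task_energy(*s))
print("most energy-efficient state:", best)   # interior optimum, not min f
```

Because static energy grows as frequency falls while dynamic energy grows as it rises, the optimum sits at an intermediate state ((0.95, 1.0 GHz) with these numbers), which is the qualitative shape of the DVFS trade-off studied in the thesis.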
APA, Harvard, Vancouver, ISO und andere Zitierweisen
39

Al-Nu'aimi, Abdallah S. N. A. „Design, Implementation and Performance Evaluation of Robust and Secure Watermarking Techniques for Digital Coloured Images. Designing new adaptive and robust imaging techniques for embedding and extracting 2D watermarks in the spatial and transform domain using imaging and signal processing techniques“. Thesis, University of Bradford, 2009. http://hdl.handle.net/10454/4255.

Der volle Inhalt der Quelle
Annotation:
The tremendous spread of multimedia via the Internet motivates watermarking as a promising new technology for copyright protection. This work is concerned with the design and development of novel algorithms, in the spatial and transform domains, for robust and secure watermarking of coloured images. These algorithms are adaptive, content-dependent and compatible with the Human Visual System (HVS). The host channels have the ability to carry a large information payload, with enough capacity to accept multiple watermarks. This work makes several contributions in the area of coloured-image watermarking. The most challenging problem is to obtain a robust algorithm that can overcome geometric attacks, which is solved in this work. A very secure algorithm has also been achieved through the use of double secret keys, and the problem of multiple claims of ownership is solved using an unusual approach. Furthermore, this work differentiates between terms which usually confuse researchers and have led to misunderstanding in most previous algorithms. One drawback of most previous algorithms is that the watermark consists of a small number of bits without strict meaning; this work overcomes that weakness by using meaningful images and text with large amounts of data. Contrary to what is found in the literature, this work shows that the green channel is better than the blue channel for hosting watermarks; a toy green-channel embedding is sketched below. A more general and comprehensive test bed, together with a broad set of performance evaluations, is used to judge the algorithms fairly.
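A toy spatial-domain illustration of green-channel hosting is given below; it is a plain LSB scheme, far simpler than the adaptive, HVS-aware algorithms the thesis develops, and every name in it is mine.

```python
import numpy as np

def embed_green_lsb(img, bits):
    """Embed a bit sequence in the least-significant bits of the
    green channel of an RGB uint8 image (toy scheme, not robust)."""
    out = img.copy()
    g = out[..., 1].ravel()                 # copy of the green plane
    g[:len(bits)] = (g[:len(bits)] & 0xFE) | np.asarray(bits, dtype=np.uint8)
    out[..., 1] = g.reshape(out[..., 1].shape)
    return out

def extract_green_lsb(img, n_bits):
    return img[..., 1].ravel()[:n_bits] & 1

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
mark = rng.integers(0, 2, 128, dtype=np.uint8)
stego = embed_green_lsb(cover, mark)
assert np.array_equal(extract_green_lsb(stego, 128), mark)
print("watermark recovered intact")
```

A plain LSB mark does not survive geometric attacks or recompression; robustness of that kind is precisely what the thesis's transform-domain, double-key algorithms add.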
APA, Harvard, Vancouver, ISO und andere Zitierweisen
40

McLay, Robert Timothy. „Translation technique and textual studies in the Old Greek and Theodotion versions of Daniel“. Thesis, Durham University, 1994. http://etheses.dur.ac.uk/5485/.

Der volle Inhalt der Quelle
Annotation:
This thesis focuses on two separate, but related, areas: the analysis of translation technique and the Greek texts of Daniel. Foremost in the research of translation technique (TT) in the Septuagint is the need for a model that is appropriate for the analysis of different ancient languages. In recent years there has been an increasing emphasis on the features of literalism in a translation, but it is argued in this thesis that the focus on literalism is inadequate as a methodology for the analysis of TT. The contention of this thesis is that the analysis of TT should incorporate insights from modern linguistic research. Therefore, the main purpose of this thesis is to develop and apply such a model to the Old Greek (OG) and Theodotion (Th) versions of Daniel. The existence of two complete Greek versions of the book of Daniel that are closely related to the same Vorlage (at least in chapters 1-3 and 7-12) furnishes ideal examples for the application of the methodology. Unfortunately, it is no straightforward matter to employ the OG of Daniel, because the available critical edition can no longer be regarded as reliable. The most important witness to the OG version of Daniel is Papyrus 967, and large portions of this manuscript have been published since the appearance of the critical edition of the OG of Daniel in 1954. Therefore, in order to analyze and compare the two Greek texts of Daniel, it is necessary to evaluate all of the variants of Papyrus 967 in order to establish a preliminary critical text of OG. Once a critical text is established, the proposed methodology for translation technique is applied to selected passages in the OG and Th versions of Daniel. An analysis and comparison of TT in OG and Th makes it possible to: 1) characterize the TT employed by OG and Th in detail; 2) determine Th's relationship to OG, i.e. whether it is a revision or an independent translation; 3) demonstrate how the Greek texts can be employed effectively for textual criticism of the Hebrew Bible. On the basis of the analysis of Th's text it is also possible to determine Th's relationship to the body of works exhibiting a close formal correspondence to the Masoretic text, known as Kaige-Theodotion.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
41

Al-Nu'aimi, Abdallah Saleem Na. „Design, implementation and performance evaluation of robust and secure watermarking techniques for digital coloured images : designing new adaptive and robust imaging techniques for embedding and extracting 2D watermarks in the spatial and transform domain using imaging and signal processing techniques“. Thesis, University of Bradford, 2009. http://hdl.handle.net/10454/4255.

Der volle Inhalt der Quelle
Annotation:
The tremendous spread of multimedia via the Internet motivates watermarking as a promising new technology for copyright protection. This work is concerned with the design and development of novel algorithms, in the spatial and transform domains, for robust and secure watermarking of coloured images. These algorithms are adaptive, content-dependent and compatible with the Human Visual System (HVS). The host channels have the ability to carry a large information payload, with enough capacity to accept multiple watermarks. This work makes several contributions in the area of coloured-image watermarking. The most challenging problem is to obtain a robust algorithm that can overcome geometric attacks, which is solved in this work. A very secure algorithm has also been achieved through the use of double secret keys, and the problem of multiple claims of ownership is solved using an unusual approach. Furthermore, this work differentiates between terms which usually confuse researchers and have led to misunderstanding in most previous algorithms. One drawback of most previous algorithms is that the watermark consists of a small number of bits without strict meaning; this work overcomes that weakness by using meaningful images and text with large amounts of data. Contrary to what is found in the literature, this work shows that the green channel is better than the blue channel for hosting watermarks. A more general and comprehensive test bed, together with a broad set of performance evaluations, is used to judge the algorithms fairly.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
42

Kahane, Ahuvia. „The interpretation of order : a study in the poetics of Homeric repetition“. Thesis, University of Oxford, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.670325.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
43

Hoyt, Sue Allen. „Masters, pupils and multiple images in Greek red-figure vase painting“. Columbus, Ohio : Ohio State University, 2006. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1150472109.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
44

Xu, Jialin. „Techniques of red-figure vase-painting in late sixth- and early fifth-century Athens“. Thesis, University of Oxford, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.670015.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
45

Goetz, Charity. „Textile dyes techniques and their effects on the environment with a recommendation for dyers concerning the Green effect /“. Lynchburg, Va. : Liberty University, 2008. http://digitalcommons.liberty.edu.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
46

Abi, Aad Maya P. „Modeling Techniques and Local Strategies of Green Infrastructure Capitals to Control Urban Stormwater Runoff and Combined Sewer Overflows“. University of Cincinnati / OhioLINK, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1236016465.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
47

McLaaughlin, Grainne Carmel. „A comparative analysis of the compositional technique of the Ancient Greek lyric and Gaelic poetic traditions“. Thesis, University of Cambridge, 1987. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.278363.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
48

Dimitrios, Rekleitis. „Cloud-based Knowledge Management in Greek SME’s“. Thesis, Linnéuniversitetet, Institutionen för informatik (IK), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-78715.

Der volle Inhalt der Quelle
Annotation:
Nowadays, cloud technologies are commonly used by many large organizations to aid knowledge sharing. This brings benefits to the organization by reducing costs, improving security, enhancing content accessibility and improving efficiency. Small and Medium Enterprises (SMEs), on the other hand, tend to manage their information in a more informal way, without using the specific language or terminology of Knowledge Management (KM), and are often reluctant to adopt cloud-based techniques for managing information, for reasons discussed later. This thesis examines the benefits and drawbacks of cloud-based KM techniques in Greek SMEs, and how knowledge processes are used in Greek SMEs in relation to such techniques. To accomplish this, I devised a methodology based on a qualitative approach: after an exhaustive literature review, I contacted five SMEs in Greece to explore, using different techniques, whether these SMEs can benefit from cloud-based KM techniques and how intent they are on adopting such techniques in their organizations. Three of the SMEs use cloud-based techniques for KM, while two do not; one of these two does not manage its knowledge at all. However, all five organizations showed great interest in adopting cloud-based and information-system technologies for KM. This work arrives at the following findings and insights: cloud-based KM techniques can bring considerable benefits in terms of cost savings and performance, but only when they are used properly and efficiently; used inefficiently, they may instead reduce the organization's performance and savings. The thesis also discusses directions for future work, such as analysing a larger set of organizations, investigating quantitative analysis, and combining qualitative and quantitative approaches.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
49

Nieuwoudt, Bernard Andre. „Aspects of the translation technique of the Septuagint : the finite verb in the Septuagint of Deuteronomy“. Thesis, Stellenbosch : Stellenbosch University, 1992. http://hdl.handle.net/10019.1/69510.

Der volle Inhalt der Quelle
Annotation:
Thesis (PhD)--Stellenbosch University, 1992.
One copy microfiche.
ENGLISH ABSTRACT: Two major religions, Judaism and Christianity, use the ancient Hebrew Bible as Holy Scripture. These books were translated in the last three centuries before the common era. The oldest of these translations is the Septuagint, a Greek translation. Not only are the Hebrew and Greek texts that were involved in the original translation process missing, but precious little is known about the doctrine and translation methods of the translators of the Septuagint. Much can be learned about these crucial issues, however, if the translation technique followed by those ancient translators is studied by comparing the present Hebrew and Greek texts. A new method to determine and describe the translation technique of the Septuagint was proposed and tested in this dissertation. This method is based on the use of the Computer Assisted Tools for Septuagint Studies (CATSS) database and statistical methods. The translation technique of the book of Deuteronomy was described using different criteria, all of which measure the frequency of non-literal renderings. Three different groups of criteria were utilized, viz. the Tov criteria as proposed by E. Tov, criteria defined using the markers in the CATSS database called the CATSS criteria, and grammatical criteria using the person of the verb. Each criterion was applied to the database individually. The translation units were determined first, after which the translation technique found within the translation unit was described. The methodology implemented discriminates between significant and insignificant trends in translation technique. It became clear that the results of the different criteria indicate different translation units and different translation techniques for each of the criteria. Except for some criteria using the person of the verb, very little indication was found that the traditional translation units are supported by the data used in this study. In fact, it seems as if translation units should be determined before the translation technique is described. The translation technique should then be described according to the indicated units. Not all the Tov criteria could be utilized, but their results are in agreement to some extent. The CATSS criteria proved to be more difficult to implement than expected, but some of the criteria rendered excellent results. The person of the verb was discussed in detail using 12 different criteria. The results of the criteria utilizing the person of the verb are disappointing, and provide some scope for future research. The results rendered by this new approach are firm and easy to interpret, and it is possible to utilize them when dealing with specific text-critical problems. A numerical sketch of the kind of frequency comparison involved is given below.
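The counts below are invented for illustration (the real analysis runs over the CATSS database); the sketch computes the non-literal rate for two hypothetical translation units and a two-proportion z-test for whether the units genuinely differ, which is one simple way to discriminate significant from insignificant trends.

```python
import math

# Toy counts per candidate unit: (renderings flagged non-literal, total).
units = {"Deut 1-11": (183, 1450),
         "Deut 12-26": (96, 1210)}

def two_proportion_z(a, b):
    """z statistic for H0: both units share one non-literal rate."""
    (x1, n1), (x2, n2) = a, b
    p = (x1 + x2) / (n1 + n2)                       # pooled rate under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (x1 / n1 - x2 / n2) / se

for name, (nl, total) in units.items():
    print(f"{name}: {nl / total:.1%} non-literal")
z = two_proportion_z(*units.values())
verdict = "significant" if abs(z) > 1.96 else "not significant"
print(f"z = {z:.2f} ({verdict} at the 5% level)")
```

With these invented counts z is about 3.9, so the two units would be judged to follow different translation techniques; non-significant differences would instead support grouping the passages into one unit.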
APA, Harvard, Vancouver, ISO und andere Zitierweisen
50

Buson, Cristina <1991&gt. „Indagine sull’interazione tra Nanoparticelle e Green Fluorescent Protein (GFP) mediante tecniche Spettroscopiche e Calorimetriche Investigation on the interaction between Nanoparticles and Green Fluorescent Protein (GFP) using Spectroscopic and Calorimetric techniques“. Master's Degree Thesis, Università Ca' Foscari Venezia, 2016. http://hdl.handle.net/10579/7947.

Der volle Inhalt der Quelle
Annotation:
Luminescent materials with nanometre-scale dimensions, such as nanoparticles (NPs) exhibiting upconversion (UC) properties or quantum dots (QDs), are currently widely studied for their interesting properties, with possible applications in the biomedical field as optical markers, for example as biosensors, in bio-imaging and in drug delivery. In particular, UC-NPs can be used to obtain high contrast in imaging, eliminating the significant problem of tissue autofluorescence and thus allowing greater measurement accuracy. The study of the interaction between NPs and biomolecules has, in recent years, involved many fields of research. Such interactions confer new and important properties on the NPs, such as hydrophilic behaviour, biocompatibility and biodegradability, and also provide better knowledge of the process by which cells internalize the bio-nanomaterial. The specificity of the biomolecules also makes it possible to obtain different biodistributions of the bio-nanomaterial. These new properties therefore broaden the possible applications of NPs in the biomedical field. Energy Transfer (ET) processes between two systems involve the absorption of light by one partner, called the donor, which transfers the stored energy to the other partner, called the acceptor, which in turn emits the energy as light. This ET can occur through radiative or non-radiative processes, depending on the characteristics of the system. One example of ET is the process known as FRET, Fluorescence Resonance Energy Transfer, which is a distance-dependent physical process. ET processes allow the analysis of a great variety of biological phenomena. The fluorophores commonly used as donors and acceptors belong to a group of auto-fluorescent proteins generically called GFPs (Green Fluorescent Proteins). In particular, ET-based imaging techniques using GFPs have been very important for determining cellular organization and for tracing the movements of different proteins within the cells themselves. In this work we study the interaction between different types of nanoparticles and GFP, in particular the energy transfer process from UC-NPs/QDs to GFP, using various spectroscopic techniques, such as absorption and emission spectroscopy, as well as calorimetric techniques such as ITC (Isothermal Titration Calorimetry). ITC is a very promising technique which allows the thermodynamics of the interaction between two systems to be investigated, in our case between UC-NPs/QDs and GFP.
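Since FRET is the distance-dependent process named above, a one-line model makes the dependence concrete. The Förster radius below is an assumed, typical value, not one measured in the thesis.

```python
def fret_efficiency(r_nm, r0_nm=5.0):
    """E = 1 / (1 + (r/R0)**6): transfer efficiency falls off steeply
    with donor-acceptor distance r; R0 is the 50%-efficiency radius."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

for r in (2.5, 5.0, 7.5, 10.0):
    print(f"r = {r:4.1f} nm  ->  E = {fret_efficiency(r):.3f}")
```

The sixth-power fall-off is what makes donor-acceptor pairs such as NP-GFP conjugates useful as nanometre-scale rulers in interaction studies.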
APA, Harvard, Vancouver, ISO und andere Zitierweisen