Dissertations / Theses on the topic "Distribution"

Consult the 50 best dissertations for your research on the topic "Distribution".

Next to each source in the reference list there is an "Add to bibliography" button. Press this button and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations on a wide variety of disciplines and organise your bibliography correctly.

1

Al-Awadhi, Shafeeqah. "Elicitation of prior distributions for a multivariate normal distribution". Thesis, University of Aberdeen, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.387799.

Full text
Abstract
This thesis focuses on elicitation methods for quantifying an expert's subjective opinion about a multivariate normal distribution. Firstly, it is assumed that the expert's opinion can be adequately represented by a natural conjugate prior distribution (a normal inverse-Wishart distribution) and an elicitation method is developed in which the expert performs various assessment tasks that enable the hyperparameters of the distribution to be estimated. An example illustrating use of the method is given. There are some choices in the way hyperparameters are determined and empirical work underlies the choices made. The empirical work aimed to provide a basis for choosing between alternative assessment tasks that may be used in the elicitation method and to examine different ways of using the elicited assessments to estimate the hyperparameters of the prior distribution. In particular, we compare two methods for estimating a spread matrix. The method is implemented in an interactive computer program that questions the expert and forms the subjective distribution. In some practical situations, it may not be possible to accurately represent an expert's opinions by a natural conjugate prior distribution, especially as the conjugate prior description suffers from some restrictions in the manner it represents dependencies between the mean vector and the covariance matrix. As a more flexible alternative, non-conjugate prior distributions are considered in which independent prior distributions for the mean vector and spread matrix are employed. A method of eliciting a prior distribution for the mean when it is assumed to be a multivariate normal distribution is developed. The implementation of the method is given through a pilot study. The prior distribution for the variance is assumed to have one of two forms: either an inverse-Wishart distribution or a generalised inverse-Wishart distribution. An elicitation method is developed for each of these forms of prior distribution. An example illustrating the implementation of the methods is given. Finally, the elicitation methods for the conjugate and the non-conjugate prior distributions are studied and compared in depth through an experiment with subject-matter experts. In this experiment two assessment tasks are used: one is related to the distribution of a sample mean and the other to the distribution of an individual item. A comparison is made between the expert assessments for these two types of task and marked differences are observed.
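
As a point of reference for the conjugate prior discussed above, the following is a minimal sketch (not taken from the thesis) of drawing from a normal inverse-Wishart prior with SciPy; all hyperparameter values are invented placeholders:

import numpy as np
from scipy.stats import invwishart, multivariate_normal

m0 = np.array([10.0, 20.0])       # prior mean vector (hypothetical)
kappa0 = 4.0                      # prior "sample size" attached to the mean
nu0 = 6.0                         # inverse-Wishart degrees of freedom
Psi0 = np.array([[4.0, 1.0],
                 [1.0, 3.0]])     # inverse-Wishart scale matrix

rng = np.random.default_rng(0)
mus = []
for _ in range(1000):
    Sigma = invwishart.rvs(df=nu0, scale=Psi0, random_state=rng)                   # Sigma ~ IW(nu0, Psi0)
    mu = multivariate_normal.rvs(mean=m0, cov=Sigma / kappa0, random_state=rng)    # mu | Sigma ~ N(m0, Sigma/kappa0)
    mus.append(mu)

print(np.mean(mus, axis=0))       # Monte Carlo estimate of the prior mean of mu, close to m0
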
2

King, Robert Arthur Ravenscroft. "New distributional fitting methods applied to the generalised λ distribution". Thesis, Queensland University of Technology, 1999.

Search for full text
3

Mild, Axel, and Arvid Mild. "Hållbar last mile distribution : Sustainable Last Mile Distribution". Thesis, Uppsala universitet, Industriell teknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-418146.

Full text
Abstract
The purpose of this thesis is to design a distribution centre's processes for identifying, sorting and packing goods, in order to enable sustainable last mile distribution by bicycle. Urban freight logistics is a major challenge for promoting sustainable urban development. Transport is necessary to supply businesses and residents with goods, but at the same time these transports contribute to negative environmental impacts in the form of noise and emissions. Municipalities play a central role in promoting sustainable development because of their planning monopoly and their ability to regulate traffic. One way to reduce heavy traffic in inner cities is to introduce smaller, energy-efficient distribution vehicles for last mile deliveries of lighter parcels. The study is based on a mix of qualitative and quantitative methods. Interviews with experts in the field, together with study visits to successful transport companies, provided broad knowledge of the area. Access to data on the current processes also helped shape the two proposed solutions that are finally presented in the study. The result of the study takes the form of two proposals: a present-day proposal and a future proposal. The present-day proposal contains changes to current processes as well as the introduction of new ones; the proposals presented here are less resource-intensive to implement and are therefore considered feasible today. The future proposal presents further improvements that can be applied within the terminal's processes; these proposals are more resource-intensive. Although the study was carried out according to the specific wishes of the commissioning company, it covers relevant areas that are of interest to any company wanting to implement a last mile solution using cargo bicycles.
4

Krech, Anthony. "Feasibility study of centralized distribution in a regional building materials distributor". Online version, 2009. http://www.uwstout.edu/lib/thesis/2009/2009krecha.pdf.

Full text
5

Hopkins, Michael. "Critical Node Analysis for Water Distribution System Using Flow Distribution". DigitalCommons@CalPoly, 2012. https://digitalcommons.calpoly.edu/theses/722.

Full text
Abstract
The expansive nature of water distribution systems (WDS) makes them susceptible to threats such as natural disasters and man-made destruction. Vulnerability assessment research efforts to harden WDS have increased since the passing of the “Bioterrorism Preparedness and Response Act” in 2002. This study aimed to develop a method that locates critical nodes without hydraulic analysis of every failure scenario, is applicable to a WDS of any size, incorporates critical infrastructure, and is capable of verifying its own accuracy. The Flow Distribution method is the application of the gravity model, typically used to predict traffic flows in transportation engineering, to a water distribution system. Flow Distribution predicts the amount of demand and population that would be affected if any node in the system were disabled, by solving for the distribution of each node’s outflow. Flow Distribution is applied to the hypothetical city Anytown, USA, using the computer simulation program WaterCAD to model two different disaster scenarios. Results were verified by analyzing sixteen failure scenarios (one for each node) to measure the actual demand and population affected, which was then compared with the nodes predicted by Flow Distribution. Flow Distribution predicted the critical nodes with 70% accuracy and can still be improved in future work.
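
For readers unfamiliar with the gravity model mentioned above, here is a toy illustration of the flow-distribution idea (a simplified example, not code from the thesis); the node counts, demands and distances are invented:

import numpy as np

supply = np.array([50.0, 30.0])            # outflow at the two source nodes (made-up units)
demand = np.array([20.0, 25.0, 35.0])      # demand at the three consumption nodes
dist = np.array([[1.0, 2.0, 4.0],          # distance (or hydraulic cost) source -> demand node
                 [3.0, 1.5, 2.0]])
beta = 2.0                                 # distance-decay exponent (assumed)

attraction = demand / dist**beta                           # gravity "attraction" of each demand node
share = attraction / attraction.sum(axis=1, keepdims=True) # each source spreads its outflow by share
flow = supply[:, None] * share                             # predicted flow matrix (source x demand)

affected = flow.sum(axis=0)                # demand cut off if a given demand node is disabled
ranking = np.argsort(affected)[::-1]       # most critical node first
print(flow.round(2))
print("criticality ranking of demand nodes:", ranking)
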
6

Bavouzet, Nicolas. "Les galaxies infrarouges : distribution spatiale, contributions au fond extragalactique et distributions spectrales d'énergie". Phd thesis, Université Paris Sud - Paris XI, 2008. http://tel.archives-ouvertes.fr/tel-00363975.

Full text
Abstract
While the formation of the large-scale structures of the Universe is fairly well understood, the formation and evolution of galaxies is much less so. We are particularly interested in the mechanisms that trigger star formation in galaxies. The study of infrared-luminous galaxies is one approach to answering these questions. The work carried out in this thesis is essentially based on the analysis of infrared data from the Spitzer satellite.

The first part of this work deals with the spatial distribution of infrared galaxies. We introduced a new method for measuring the angular correlation function of galaxies, which was validated on simulations and on data. We also showed how spatial correlation effects can bias the mean flux measurements obtained with the stacking method. Moreover, the angular correlation function measured for sources selected at 3.6 microns and 24 microns shows an excess of correlation at small angular scales. This could be related to the interaction of galaxies within the same dark matter halo, which would then favour the infrared emission mechanisms.

In a second step, we set out to better characterise the cosmic infrared background (CIB) by determining the contribution to this background of the sources detected at 3.6 microns and comparing it with that of the sources selected at 24 microns. We also estimated the contribution to the CIB at 3.6 and 24 microns of the sources selected at 3.6 microns as a function of their specific star formation rate.

Finally, we studied the spectral energy distributions of a large number of galaxies between z=0 and z=2: we showed, on the one hand, that the luminosities at 8 and 24 microns are good tracers of the total infrared luminosity (and thus of the star formation rate) and, on the other hand, that the properties of these galaxies do not seem to evolve between z=0 and z=1. We also studied in detail the infrared spectra of 17 galaxies selected at 70 microns and showed that the relative luminosity of the PAHs decreases as the radiation field increases.
7

Godin, Antoine. "Deciphering synaptic receptor distributions, clustering and stoichiometry using spatial intensity distribution analysis (SpIDA)". Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=96774.

Full text
Abstract
Measuring protein interactions in subcellular compartments is key to understanding cell signalling mechanisms, but quantitative analysis of these interactions in situ has remained a major challenge. This thesis presents a novel analysis technique, spatial intensity distribution analysis (SpIDA), which may be applied to images obtained using fluorescence microscopy. SpIDA measures fluorescent particle densities and oligomerization states within individual images. The method is based on fitting intensity histograms from single images with super-Poissonian distributions to obtain density maps of fluorescent molecules and their quantal brightness. Since distributions are acquired spatially rather than temporally, this analysis may be applied to both live and chemically fixed cells and tissue. The technique does not rely on spatial correlations, freeing it from biases due to subcellular compartmentalization and heterogeneity within tissue samples. First, we validated the analysis technique, evaluating its limits and demonstrating how it can be used to obtain useful information from complex biological samples. Analysis of simulations and heterodimeric GABAB receptors in spinal cord samples shows that the approach yields accurate estimates over a broad range of densities. SpIDA is applicable to sampling within subcellular areas and reveals the presence of monomers and multimers with single dye labeling. We show that the substance P receptor (NK-1r) almost exclusively forms homodimers on the membrane and is primarily monomeric in the cytoplasm of dorsal horn neurons. Triggering receptor internalization caused a measurable decrease in homodimer density on the membrane surface. Finally, using GFP-tagged receptor subunits, we show that SpIDA can resolve dynamic changes in receptor oligomerization in live cells and is applicable to detection of higher-order oligomerization states. We then compared SpIDA results with those obtained from fluorescence lifetime imaging, and used it to extract information on receptor tyrosine kinase (RTK) dimerization at the cell membrane in response to GPCR activation. We show that RTK dimerization can be used as an index of activation or transactivation and then characterize the level of transactivation of many RTK-GPCR pairs, in cell cultures and primary neuron cultures with endogenous levels of RTKs and GPCRs. Dose-response curves were obtained from which pharmacological parameters can be compared for each GPCR studied. Our data demonstrate that by allowing for time and space quantification of heterogeneous oligomeric states, SpIDA enables systematic quantitative mechanistic studies not only of RTK transactivation at the cell membrane, but also of other cell signaling processes involving changes in protein oligomerization, trafficking and activity in different subcellular localizations. Finally, we studied the changes in the number of synaptic sites in the neurons of the dorsal horn of the spinal cord of rats after a peripheral nerve injury (PNI), which constitutes our model of chronic pain. We show that, after the PNI, there is a general decrease in synaptic sites together with a scaling or increase of some of the GABAA receptor subunits. This scaling of the GABAA receptors at the postsynaptic sites was replicated by incubating the histological sections in brain-derived neurotrophic factor. Furthermore, we use SpIDA to obtain stoichiometry information for the GABAA receptor subunits directly at the postsynaptic sites.
In short, we observe a switch from receptors containing two alpha1 to receptors containing two alpha2 and alpha3. This general change in subunits will have a direct effect on the cell, as it will have different effects on the cell membrane conductance in response to GABA. As demonstrated, the advantages and greater versatility of SpIDA over current techniques open the door to a new level of quantification for studies of protein interactions in native tissue using standard fluorescence microscopy.
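
As a rough illustration of how particle density and quantal brightness can be read off image intensities, the sketch below uses a simple moment-based estimator on simulated pixel values; this is a simplified cousin of SpIDA rather than the published histogram-fitting algorithm, and all simulation parameters are invented:

import numpy as np

rng = np.random.default_rng(2)
true_density, true_brightness, n_pixels = 5.0, 20.0, 100_000

counts = rng.poisson(true_density, n_pixels)                           # particles per beam area
intensity = true_brightness * counts + rng.normal(0, 1.0, n_pixels)    # plus a little detector noise

mean, var = intensity.mean(), intensity.var()
brightness_est = var / mean           # for I = eps * n, n ~ Poisson(N):  Var/Mean ~ eps
density_est = mean**2 / var           #                                   Mean^2/Var ~ N
print(f"estimated brightness ~ {brightness_est:.1f}, density ~ {density_est:.2f}")
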
8

Persson, Erold. "Multicast Time Distribution". Thesis, Linköping University, Department of Electrical Engineering, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2274.

Full text
Abstract

The Swedish National Testing and Research Institute is maintaining the Swedish realization of the world time scale UTC, called UTC(SP). One area of research and development for The Swedish National Laboratory of Time and Frequency is time synchronization and how UTC(SP) can be distributed in Sweden. Dissemination of time information by SP is in Sweden mainly performed via Internet using the Network Time Protocol (NTP) as well as via a modem dial up service and a speaking clock (Fröken Ur). In addition to these services, time information from the Global Positioning System (GPS) and from the long-wave transmitter DCF77 in Germany, is also available in Sweden.

This master’s thesis considers how different available commercial communication systems could be used for multicast time distribution. DECT, Bluetooth, Mobile Telecommunication and Radio Broadcasting are different techniques that are investigated. One application of Radio Broadcasting, DARC, was found to be interesting for a more detailed study. A theoretical description of how DARC could be used for national time distribution is accomplished and a practical implementation of a test system is developed to evaluate the possibilities to use DARC for multicast time distribution.

The tests of DARC and the radio broadcast system showed that these could be interesting techniques to distribute time with an accuracy of a couple of milliseconds. This quality level is not obtained today but would be possible with some alterations of the system.

9

Carlsson, Daniel, and Johan Dvärsäter. "Flexibel Fysisk Distribution". Thesis, Linköping University, Department of Management and Economics, 2001. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-857.

Full text
Abstract

Background: We believe that a future challenge is to build flexibility into physical distribution. This means that a retail company should be able to switch between distributing one and the same product at times via a central warehouse and at times directly from the producer to the store in the supply chain: a flexible physical distribution.

Purpose: The purpose is to determine the preconditions for, and the consequences of, a flexible physical distribution for the various actors in a supply chain in which the retail company holds a considerable position of power.

Implementation: Empirical data was collected through a case study of IKEA's supply chain.

Results: An integrated supply chain with a clear supply chain management perspective, in which all actors cooperate and have incentives to participate, is a precondition for a flexible physical distribution. We also want to stress that only the best-selling products in the retail company's assortment, and only the stores with the largest capacity, should be included. Furthermore, if the physical distribution is to be able to switch within a supply chain, the actors concerned must be given a planning horizon. This implies that shifts in physical distribution must take place seasonally, preferably with direct deliveries during the high season. A flexible physical distribution will have consequences for inventories and thus for the actors' costs.

10

Helgason, Ólafur. "Opportunistic Content Distribution". Doctoral thesis, KTH, Kommunikationsnät, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-33828.

Full text
Abstract
In recent decades, communication networks have had a profound effect on society. Wireless communication has affected our lifestyle and altered how humans communicate and the Internet has revolutionized how we access, publish and disseminate information. In recent years we have also witnessed a radical change in how information is generated on the Internet. Today, information is no longer only generated by a small group of professionals but it is created by the users themselves and shared with a broad community with matching interests. This is evident with ”Web 2.0” applications such as blogs, podcasts, YouTube and social platforms like Facebook and Flickr. As a result of these trends, the Internet is today mainly used to provide users with access to contents. With recent advances in mobile platforms, information generation and consumption has spread from personal computers and Internet into people’s palms. This calls for efficient dissemination of information to and from mobile devices. This thesis considers content-centric networking in the context of mobile wireless networks. The main focus is on opportunistic distribution of content where mobile nodes directly exchange content items when they are within communication range. This communication mode enables dissemination of content between mobile nodes without relying on infrastructure, which can be beneficial for several reasons: infrastructure may be absent, overloaded, unreliable, expensive to use, censored or limited to certain users or contents. Opportunistic networking also has different properties than infrastructure based wireless networking, particularly in terms of scalability, locality and dissemination delay. The contributions of this thesis lie in two areas. Firstly we study the feasibility and performance of opportunistic networking among mobile nodes in urban areas using both analytic models and simulations. In particular we study the effect of two enablers of opportunistic networking: cooperation and mobility. By applying models from epidemic modeling, we show that if nodes cooperate by sharing, even in a limited manner, content can spread efficiently in a number of common case scenarios. We also study in detail which aspects of human mobility affect wireless communication and conclude that performance is not very sensitive to accurate estimation of the probability distributions of mobility parameters such as speed and arrival process. Our results however suggest that it is important to capture the scenario and space in which mobility occurs since this may affect performance significantly. Secondly, we present our design and implementation of a middleware architecture for a mobile peer-to-peer content distribution. Our system uses a decentralized content solicitation scheme that allows the distribution of content between mobile devices without requiring Internet connectivity and infrastructure support. Our system is based on the publish/subscribe paradigm and we describe the design and implementation of key components. We evaluate the performance and correctness of the system using both large-scale simulations and small-scale experimentation with our implementation. Finally we present the design and evaluation of an energy-efficient radio subsystem for opportunistic networking.
QC 20110524
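
A minimal sketch of the epidemic-style spreading models alluded to above, assuming homogeneous mixing and simple SI (susceptible-infected) dynamics, di/dt = lambda * i * (1 - i); the contact rate and initial fraction are illustrative values, not results from the thesis:

contact_rate = 0.5          # pairwise contact/sharing rate per hour (assumed)
i = 0.01                    # initially 1% of nodes carry the content item
dt, hours = 0.01, 24.0
for _ in range(int(hours / dt)):
    i += contact_rate * i * (1 - i) * dt    # forward-Euler step of the logistic growth
print(f"fraction of nodes holding the content after {hours:.0f} h: {i:.3f}")
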
11

Li, Minyue. "Distribution Preserving Quantization". Doctoral thesis, KTH, Skolan för elektro- och systemteknik (EES), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-38482.

Full text
Abstract
In the lossy coding of perceptually relevant signals, such as sound and images, the ultimate goal is to achieve good perceived quality of the reconstructed signal, under a constraint on the bit-rate. Conventional methodologies focus either on a rate-distortion optimization or on the preservation of signal features. Technologies resulting from these two perspectives are efficient only for high-rate or low-rate scenarios. In this dissertation, a new objective is proposed: to seek the optimal rate-distortion trade-off under a constraint that statistical properties of the reconstruction are similar to those of the source. The new objective leads to a new quantization concept: distribution preserving quantization (DPQ). DPQ preserves the probability distribution of the source by stochastically switching among an ensemble of quantizers. At low rates, DPQ exhibits a synthesis nature, resembling existing coding methods that preserve signal features. Compared with rate-distortion optimized quantization, DPQ yields some rate-distortion performance for perceptual benefits. The rate-distortion optimization for DPQ facilitates mathematical analysis. The dissertation defines a distribution preserving rate-distortion function (DP-RDF), which serves as a lower bound on the rate of any DPQ method for a given distortion. For a large range of sources and distortion measures, the DP-RDF approaches the classic rate-distortion function with increasing rate. This suggests that, at high rates, an optimal DPQ can approach conventional quantization in terms of rate-distortion characteristics. After verifying the perceptual advantages of DPQ with a relatively simple realization, this dissertation focuses on a method called transformation-based DPQ, which is based on dithered quantization and a non-linear transformation. Asymptotically, with increasing dimensionality, a transformation-based DPQ achieves the DP-RDF for i.i.d. Gaussian sources and the mean squared error (MSE). This dissertation further proposes a DPQ scheme that asymptotically achieves the DP-RDF for stationary Gaussian processes and the MSE. For practical applications, this scheme can be reduced to dithered quantization with pre- and post-filtering. The simplified scheme preserves the power spectral density (PSD) of the source. The use of dithered quantization and non-linear transformations to construct DPQ is extended to multiple description coding, which leads to a multiple description DPQ (MD-DPQ) scheme. MD-DPQ preserves the source probability distribution for any packet loss scenario. The proposed schemes generally require efficient entropy coding. The dissertation also includes an entropy coding algorithm for lossy coding systems, which is referred to as sequential entropy coding of quantization indices with update recursion on probability (SECURE). The proposed lossy coding methods were subjected to evaluations in the context of audio coding. The experimental results confirm the benefits of the methods and, therewith, the effectiveness of the proposed new lossy coding objective.
QC 20110829
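
The transformation-based DPQ scheme described above builds on subtractively dithered quantization; a small textbook-style sketch of that building block (not the thesis implementation, with an arbitrary step size and source) is:

import numpy as np

rng = np.random.default_rng(3)
delta = 0.5                                            # quantizer step size (assumed)
x = rng.normal(0.0, 1.0, 100_000)                      # i.i.d. Gaussian source

dither = rng.uniform(-delta / 2, delta / 2, x.shape)   # dither signal shared with the decoder
y = delta * np.round((x + dither) / delta) - dither    # quantize, then subtract the dither

err = y - x                                            # error is uniform, independent of the source
print(f"MSE = {np.mean(err**2):.4f} (theory: delta^2/12 = {delta**2 / 12:.4f})")
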
12

Sonalker, Anuja Anilkumar. "Asymmetric Key Distribution". NCSU, 2002. http://www.lib.ncsu.edu/theses/available/etd-20020403-040240.

Full text
Abstract

ABSTRACT BY Anuja A Sonalker on Asymmetric Key Distribution. (Under the direction of Dr. Gregory T. Byrd) Currently, in Threshold Public Key Systems, key shares are generated uniformly and distributed in the same manner to every participant. We propose a new scheme, Asymmetric Key Distribution (AKD), in which one share server is provided with a larger, unequal chunk of the original secret key. Asymmetric Key Distribution is a unique scheme for generating and distributing unequal shares via a Trusted Dealer to all the registered peers in the system, such that without the single compulsory share from the Special Server no transaction can be completed. This application is aimed at circumstances where a single party needs to co-exist within a group of semi-trusted peers, or in a coalition where every entity should have a choice to participate and one of the entities needs to be privileged with more powers. This thesis presents the algorithm and security model for Asymmetric Key Distribution, along with all the assumptions and dependencies within the boundaries of which this algorithm is guaranteed to be secure. Its robustness lies in its simplicity and in its distributed nature. We address all security concerns related to the model, including compromised share servers and cryptanalytic attacks. A variation, called the Dual Threshold Scheme, is created to reduce the vulnerability in the algorithm, namely, the compromise of the Special Server and its secret share. In this scheme, a threshold number of Distributed Special Servers must combine to collectively generate a share equivalent to the Special Server's share. This flexibility allows us to adjust our threshold scheme to the environment. We describe a Java-based implementation of the AKD algorithm, using Remote Method Invocation (RMI) for communication among share servers. A typical scenario of a Trusted Dealer, a Special Server and a number of Share Servers was created, where timed asymmetric key generation and distribution was carried out, after which the servers initiated and carried out certificate signing transactions in the appropriate manner. As an interesting exercise, the share servers were corrupted so that they would try to exclude the Special Server from the transactions and form its share themselves, in order to observe the consequence. All their efforts were futile. Another interesting aspect was the key generation timing. Key generation is known to be a very time-consuming process, but the key share reuse concept used in this implementation reduced the key generation time by 66-90% compared with classical key generation.
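
To make the notion of a compulsory, unequal share concrete, here is a toy sketch in which the secret is split into a mandatory special share plus a Shamir t-of-n sharing of the remainder; this is a simplified illustration of the idea only, not the AKD protocol itself, and the field modulus and parameters are arbitrary:

import random

P = 2**127 - 1  # a prime field modulus (toy choice)

def shamir_split(secret, t, n):
    # random degree-(t-1) polynomial with f(0) = secret, evaluated at x = 1..n
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def shamir_recover(shares):
    # Lagrange interpolation at x = 0
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, -1, P)) % P
    return total

secret = 123456789
special_share = random.randrange(P)              # held only by the special server
remainder = (secret - special_share) % P
peer_shares = shamir_split(remainder, t=3, n=5)  # ordinary peers get these

recovered = (special_share + shamir_recover(peer_shares[:3])) % P
assert recovered == secret                       # reconstruction needs the special share
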

13

Dahlgren, Fredrik. "Effective Distribution Theory". Doctoral thesis, Uppsala : Department of Mathematics, Uppsala University, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-8210.

Full text
14

Bhusal, Prabodh. "Distribution reliability analysis". Thesis, Wichita State University, 2007. http://hdl.handle.net/10057/1532.

Full text
Abstract
This thesis presents an example of optimizing the distribution maintenance schedule for a recloser. It applies a risk reduction technique associated with maintenance of the equipment. Furthermore, this thesis examines how various distribution system designs, including distributed energy resources (DER), affect the distribution reliability indices System Average Interruption Duration Index (SAIDI) and System Average Interruption Frequency Index (SAIFI).
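
For reference, the two reliability indices named above have standard definitions; a minimal sketch with invented outage data (not figures from the thesis) is:

# SAIFI = total customer interruptions / total customers served
# SAIDI = total customer interruption minutes / total customers served
total_customers = 10_000

# (customers_interrupted, duration_minutes) for each recorded outage event
outages = [(1200, 45), (300, 120), (2500, 10)]

saifi = sum(c for c, _ in outages) / total_customers
saidi = sum(c * d for c, d in outages) / total_customers

print(f"SAIFI = {saifi:.3f} interruptions/customer/yr")
print(f"SAIDI = {saidi:.1f} minutes/customer/yr")
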
Thesis (M.S.)--Wichita State University, College of Engineering, Dept. of Electrical and Computer Engineering
"December 2007."
15

Sundeck, Per. "Distribution Banverket PDÖ". Thesis, Linköpings universitet, Institutionen för teknik och naturvetenskap, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-97890.

Full text
Abstract
This thesis was made at Banverket Produktion region Öst (Banverket Production, Eastern region), which has decided to restructure its warehousing of spare parts. The basic problem is the amount of spare parts with long stock rotation cycles, which represents a great cost for the warehousing. To become more financially effective, several spare parts with a low frequency of use will be centralized to four larger warehouses, compared with seventeen earlier. When there are spare parts at every warehouse, the time to rectify problems is relatively short. If the repair-related transports are handled in the same way after the centralization, there is a large risk that the time to rectify a problem will be unacceptably long. The new warehousing conditions therefore create a need for a new distribution model. The purpose of this thesis is to examine how the repair time will be affected by the restructuring and the new transport situation that comes with it, and to see whether it is possible to find logistical solutions that make the prolonged repair time acceptable. When administering the warehousing, it is necessary to focus on how to manage the transports at the moment the spare parts are needed, if some spare parts are only to be found at a few warehouses. In the end there will be some kind of "Just-In-Time" system where the parts are only to be found at the central warehouses and transports are arranged when needed. The effect is that some of the financial resources saved by the centralization will be used to create an effective distribution system. A good basis is necessary to discuss and produce proposals for managing the distribution. That is why the current situation is analyzed with regard to the distribution system and the plans for the future; this gives relevant and necessary information about the considerations to take into account, as well as a clear picture of why the actions are needed. Due to the low stock rotation frequency of these parts, it will probably be most financially effective to purchase transportation services as they are needed. To produce proposals for how to manage the distribution, both ground and air transportation companies have been contacted and analyzed. These contacts have resulted in a model that gives information on how to manage different situations. The final solution is an alternative where most transports are performed by cars and trucks, while helicopters are used as a complement in extremely urgent situations. The model is validated through analysis and discussion of some hypothetical situations, which gives examples of how the system will work under the new circumstances. One can see how the new system will work in comparison with the old one and get a clear picture of the differences between the helicopter and the truck transportation alternatives. The validation shows that, from a time perspective, the model is a satisfactory solution to the problem. There are many opportunities to develop the model by including more information in the analysis; costs, for example, have not been in focus. Other opportunities might be to bring focus to the frequently used spare parts and all warehouses, as well as to look more closely at the possibilities of cooperating with neighbouring regions of Banverket Produktion.
16

Simonyan, Mesrop. "Rethinking Film Distribution". Digital Commons at Loyola Marymount University and Loyola Law School, 2012. https://digitalcommons.lmu.edu/etd/449.

Full text
17

Segura, Inès. "La distribution immobilière". Montpellier 1, 2006. http://www.theses.fr/2006MON10018.

Full text
Abstract
The concepts of real estate and of distribution are by no means foreign to us. The expression 'real estate distribution' (distribution immobilière), however, has never been put forward. That real estate distribution law has not been addressed within distribution law is probably explained by the nature of its subject matter. We examined, on the one hand, the elementary operations of real estate distribution through the study of the real estate agent, the real estate commercial agent, the salaried real estate negotiator (whether or not covered by the VRP sales representative status) and the occasional introducer. On the other hand, we examined the complex operations of real estate distribution, such as franchises. We observe, however, that apart from franchising, the notions of grouping and network can cover very different legal realities: associations, economic interest groupings, cooperative societies, etc. The purpose of this research was to study the contractual formulas used in the field of real estate distribution. Legal scholars may wish to examine the legal gaps, of which there is no shortage in this area and which this research has helped bring to light. Professionals and their advisers will find in it legal and practical information concerning the operators and operations of real estate distribution. May this work give anyone wishing to set up in this business a general view of current practice in real estate distribution contracts.
18

Bhusal, Prabodh Jewell Ward T. "Distribution reliability analysis /". Thesis, A link to full text of this thesis in SOAR, 2007. http://hdl.handle.net/10057/1532.

Full text
19

Oliveira, Costa Larisse. "La satisfaction des acteurs dans le canal de distribution : le cas de la relation entre la grande distribution et les fournisseurs locaux au Brésil". Thesis, Aix-Marseille, 2013. http://www.theses.fr/2013AIXM1082.

Full text
Abstract
This research studies the relationship between Brazilian suppliers and large food retailers in order to identify whether this relationship results in satisfaction. We present the concept of the distribution channel, behavioral theories and their key variables, relational marketing theories and the models found in the literature. Regarding the variables, we emphasize conflict and cooperation, with the aim of identifying the variables that have an impact on the satisfaction of these two actors. We adopted an exploratory approach. Our research concludes that there is satisfaction on the side of the large retailers, with cooperation and partnership strategies strengthened by trust between the actors. However, from the perspective of the local suppliers, two types of discourse were observed. On the one hand, the large suppliers share the same vision; on the other hand, the small and medium-sized local suppliers report an adversarial relationship and the exercise of power by the large retailers, with no cooperation or commitment in this relationship, leading to their dissatisfaction.
20

Hansen, Peder. "Approximating the Binomial Distribution by the Normal Distribution – Error and Accuracy". Thesis, Uppsala universitet, Matematisk statistik, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-155336.

Full text
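
The entry above concerns the classical normal approximation to the binomial distribution; as a minimal illustration of that textbook approximation with a continuity correction (arbitrary n, p and k, not values from the thesis):

from scipy.stats import binom, norm

n, p, k = 40, 0.3, 15
exact = binom.cdf(k, n, p)                                              # exact P(X <= k)
approx = norm.cdf(k + 0.5, loc=n * p, scale=(n * p * (1 - p)) ** 0.5)   # N(np, np(1-p)) with +0.5 correction
print(f"exact = {exact:.4f}, normal approx = {approx:.4f}, abs error = {abs(exact - approx):.4f}")
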
21

McCourt, James. "Retail distribution review : a critical evaluation of the retail distribution review". Thesis, University of Edinburgh, 2017. http://hdl.handle.net/1842/25942.

Full text
Abstract
Despite the high profile nature of the interventions made by regulators after the global financial crisis, there have been few objective assessments of their success and of the orthodoxy of market failure analysis that underpins the rationale for taking action. This study addresses both literature gaps by developing a distribution landscape segment model to measure the success of an exemplar; the Retail Distribution Review (RDR). It also undertakes exploratory research to establish a basis for a diagnostic paradigm based on customer value rather than well established, but criticised, classical economic indicators. A “stock flow” based model was constructed to assess post-RDR levels of asymmetry, agency and trust. The absence of source data prompted a second exploratory phase of research into Trust as a welfare benefit, using customer focus groups and telephone surveys. An evidential basis for an alternative framework based on what consumers value, rather than how economists think is rational for them to act, was established. The model results indicated a landscape which is more complex than 2013, with competing interests transmuted rather than eradicated and information asymmetry growing rather than shrinking. The results support a view that interventions focussing on narrow “market” definitions do not reflect the complexity of human behaviour and are simply “squeezing the balloon”. The customer value research found that trust is complicated and related to several key “motivators”. These have underlying attributes which differ between socio economic groups, the financial objectives and whether customers have advisers. The conclusion reached is that an evidence based customer perspective should be at the heart of regulatory analysis, if public welfare is to be maximised. The study provides evidence of complexities and connectedness between actors and economic forces in the retail financial services landscape, cautiously supporting the literature on regulatory interventions as socio-technical assemblages. It argues that the customer value framework enriches the regulatory toolkit by forming a guard against intellectual capture and unintended consequences of shaping reality to fit a so-called perfect market model.
22

Fernando, W. Anand K. "Techniques for Designing HFAC Power Distribution Systems; Power Conversion and Distribution". Thesis, The University of Sydney, 2018. http://hdl.handle.net/2123/17995.

Full text
Abstract
Modern power distribution systems (PDS) utilize multiple converters, so power undergoes several conversions between the source and the load. Using high frequency AC (HFAC) in PDSs eliminates some converter stages and allows smaller capacitors and inductors, making the converters much lighter and offering a variety of solutions for weight-critical applications such as spacecraft, aircraft and electric vehicle onboard PDSs. HFAC converters with resonant filters have been widely used in the past despite being tuned to a single frequency; as a result, variable frequency operation, as well as the parallel connection of multiple converters, has been less efficient. This part of the research work focusses on the development of a bi-directional AC-AC converter that can work within a range of grid parameters. The proposed two-stage converter, constructed with wide bandgap power switches, a high-performance microcontroller and low-pass filters operating at high switching frequencies, provides the desired variable frequency and voltage operation capability. A major drawback of HFAC PDS has been the excessive power loss and voltage drop due to the skin effect and proximity effect of conductors. This part of the research work investigates the development of new cable types with hollow core cross-sections. These would minimize skin effect losses by shifting much of the conducting material to within the skin depth, while keeping the weight increase minimal. Feasibility studies performed using PSCAD software showed improved cable performance up to 100 kHz, enabling their use in applications such as wireless power transmission.
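
The hollow-core cable idea rests on the standard skin-depth formula delta = sqrt(2*rho / (omega*mu)); a small sketch evaluating it for a generic copper conductor (typical material constants, not figures from the thesis) is:

import math

rho = 1.68e-8            # resistivity of copper, ohm*m
mu = 4 * math.pi * 1e-7  # permeability of free space, H/m (non-magnetic conductor)

for f in (50, 1e3, 10e3, 100e3):       # supply frequencies in Hz
    omega = 2 * math.pi * f
    delta = math.sqrt(2 * rho / (omega * mu))
    print(f"{f / 1e3:7.2f} kHz -> skin depth ~ {delta * 1e3:.2f} mm")
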
23

Acharya, Tanka Prasad Somers Greg Lynn. "Prediction of distribution for total height and crown ratio using normal versus other distributions". Auburn, Ala., 2006. http://repo.lib.auburn.edu/2006%20Fall/Theses/ACHARYA_TANKA_3.pdf.

Full text
24

Yoo, Jiyoun. "Modeling Compressive Stress Distributions at the Interface between a Pallet Deck and Distribution Packaging". Diss., Virginia Tech, 2011. http://hdl.handle.net/10919/39939.

Full text
Abstract
Three components of the unit load portion of the supply chain (a pallet, packaging, and material handling equipment) physically and mechanically interact during product storage and shipping. Understanding the interactions between the two primary components of a unit load, the pallet and the packaging, is a key step towards supply chain cost reduction and workplace safety improvement. Designing a unit load without considering the physical and mechanical interactions between those two components can result in human injury or death caused by an unsafe workplace environment, and in increased supply chain operating costs due to product damage, high packaging cost, disposal expense, and waste of natural resources. This research is directed towards developing predictive models of the compressive stress distributions using the principle of the beam on an elastic foundation, and towards experimentally quantifying the compressive stress distributions. The overall objective of this study is to develop a model that predicts compressive stress distributions at the interface between a pallet deck and packaging as a function of pallet deck stiffness, packaging stiffness, and pallet joint fixity. The developed models were validated by comparison to the results of physical testing of the unit load section. Design variables required for modeling included the Modulus of Elasticity (MOE) of pallet deckboards, the Rotation Modulus (RM) for nailed joints, and packaging stiffness. The predicted compressive stress distributions were non-uniform across the interface between pallet deckboards and packaging. Maximum compressive stresses were observed at the deckboard ends over stringer segments. All predicted compressive stress distributions were influenced by pallet deck stiffness, packaging stiffness, and joint fixity. The less the joint fixity, the greater the pallet deck deflection, and stiffer deckboards are more sensitive to joint fixity. For the predicted compressive stress distribution models, the measure of the stress concentrations was the Compressive Stress Intensity Factor (SIF), the ratio of the estimated maximum compressive stress to the applied stress. Less stiff pallets and stiffer packaging resulted in greater SIFs for all end condition models. The SIF was reduced by a stiffer joint, a stiffer pallet deck and more flexible packaging. The stiffer the pallet deck and pallet joint, the greater the effective bearing area. Lower-stiffness packaging resulted in a greater effective bearing area with all three packages. The predicted effective bearing area was more influenced by pallet deck stiffness than by packaging stiffness. The developed prediction models were validated by comparison to experimental results. All prediction models fell within 95% confidence bounds except the 3/8-inch deck with free ends and the 3/4-inch deck with fixed ends; the difference between predicted and measured results was due to a limitation in pressure sensor range and to test specimen construction for the free end model and fixed end model, respectively. The results show effects of pallet deck stiffness and packaging stiffness on SIFs, with percentage changes ranging from 2 to 26% (absolute value of change) for all three end conditions. The sensitivity study concluded that changing both pallet deck stiffness and packaging stiffness influenced the SIFs more significantly than the bearing areas.
Ph. D.
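
The predictive model described above rests on the classical beam-on-elastic-foundation solution; a small sketch of Hetenyi's infinite-beam formula for a point load is given below, with placeholder stiffness and load values rather than numbers from the dissertation:

import numpy as np

E = 9.0e9        # deckboard modulus of elasticity, Pa (assumed)
I = 1.2e-7       # deckboard cross-section moment of inertia, m^4 (assumed)
k = 5.0e6        # foundation (packaging) stiffness per unit length, N/m^2 (assumed)
P = 2000.0       # applied load, N

beta = (k / (4 * E * I)) ** 0.25                  # characteristic parameter, 1/m
x = np.linspace(0, 0.5, 6)                        # distance from the load, m
y = (P * beta / (2 * k)) * np.exp(-beta * x) * (np.cos(beta * x) + np.sin(beta * x))
pressure = k * y                                  # contact pressure q(x) = k * y(x)

for xi, qi in zip(x, pressure):
    print(f"x = {xi:.2f} m  contact pressure ~ {qi / 1e3:.2f} kN/m")
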
25

Buhari, Muhammad. "Reliability assessment of ageing distribution cable for replacement in 'smart' distribution systems". Thesis, University of Manchester, 2016. https://www.research.manchester.ac.uk/portal/en/theses/reliability-assessment-of-ageing-distribution-cable-for-replacement-in-smart-distribution-systems(e253c774-b5e3-4872-9139-894e7df553f0).html.

Full text
Abstract
The majority of electricity networks have a growing number of ageing elements. Critical network components, such as ageing underground cables, are very expensive to install and disruptive to replace. At the same time, global climate change has made the connection of new low carbon technologies (LCT) to the grid increasingly necessary. These factors are contributing to the increasing complexity of the planning and management of power systems. Numerous techniques published on this subject tend to ignore the impact of LCT integration and the accompanying 'Smart' solutions on ageing network assets, such as underground cables and transformers. This thesis presents the development of an ageing underground cable reliability model (the IEC-Arrhenius-Weibull model) and of cable ranking models for replacement based on system-wide effects and thermal loss-of-life metrics. In addition, a new concept of LCT integration and distribution network management is proposed using two optimization models. The first optimizes the connection of new wind sources by minimizing the connection cost and the cost of cable thermal loss-of-life over the planning period. In the second stage, the network is optimally reconfigured in such a way as to minimize the thermal loss-of-life of ageing cables. Both optimization models are formulated as mixed integer non-linear programming (MINLP) problems applicable to radially operated medium voltage networks. To quantify the reliability benefits of the proposed approach, a Sequential Monte Carlo Simulation (SMCS) procedure was formulated. Some of the main features of the SMCS procedure are the IEC-Arrhenius-Weibull model for ageing cable, optimal network reconfiguration, wind generation modelling using ARMA models and real time thermal ratings. The final outputs are reliability metrics, cable ranking lists for replacement, savings due to 'non-spend' cable thermal lives, etc. These studies have proven to be important in formulating an effective strategy for extending the lives of network cables, managing overall network reliability and planning cable replacement in power distribution networks.
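
As a flavour of the IEC-Arrhenius-Weibull idea mentioned above, the sketch below samples cable times-to-failure from a Weibull distribution whose characteristic life is de-rated by an Arrhenius-style temperature factor; every parameter value is invented for illustration and none comes from the thesis:

import numpy as np

rng = np.random.default_rng(1)

beta = 3.0                   # Weibull shape (ageing, so beta > 1)
eta_ref = 40.0               # characteristic life in years at the reference temperature
B = 6000.0                   # Arrhenius-style temperature sensitivity constant, in kelvin
T_ref, T_op = 313.0, 333.0   # reference and operating conductor temperatures, K

# hotter operation shrinks the characteristic life
eta_op = eta_ref * np.exp(B * (1.0 / T_op - 1.0 / T_ref))

samples = eta_op * rng.weibull(beta, size=100_000)   # Monte Carlo failure times, years
print(f"derated characteristic life: {eta_op:.1f} years, "
      f"mean time to failure: {samples.mean():.1f} years")
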
26

Kjellberg, Hans. "Organising distribution : Hakonbolaget and the efforts to rationalise food distribution, 1940-1960". Doctoral thesis, Stockholm : Economic Research Institute, Stockholm School of Economics [Ekonomiska forskningsinstitutet vid Handelshögsk.] (EFI), 2001. http://www.hhs.se/efi/summary/560.htm.

Full text
27

Heng, Hui Yi. "Long-term distribution network pricing and planning to facilitate efficient power distribution". Thesis, University of Bath, 2010. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.527519.

Full text
28

Marelli, Alina. "Innovative Distribution für Uhren". St. Gallen, 2005. http://www.biblio.unisg.ch/org/biblio/edoc.nsf/wwwDisplayIdentifier/01652320002/$FILE/01652320002.pdf.

Full text
29

Axén, Per-Allan, and Martina Stenvall. "Kundnöjdhet genom fysisk distribution". Thesis, Jönköping University, Jönköping International Business School, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-332.

Texto completo
Resumen

Introduction

Increased globalization and competition have forced companies to change their view of the supply chain, moving from supply chain management to demand chain management.

The company Martela is experiencing increased competition and a declining market. It has started a change programme in order to reclaim market share. The goal of one of the sub-projects is to increase customers' confidence in Martela and thereby increase customer satisfaction.

Purpose

The purpose of this study is to illustrate the strong influence of physical distribution on customer satisfaction.

Method

We have used a qualitative method, gathering information for this paper by means of discussions, interviews and processing of delivery and sales data. It has been an iterative process in which we have moved from empirical data to theory and back again in order to analyze the results. A significant part of this paper consists of interpretations, in which the theory as well as our background influenced the analysis and the discussion.

Conclusions

Customer satisfaction plays an important role in competing in the office furniture market. One step towards making the customer satisfied is to have satisfactory physical distribution. The conditions for good distribution and satisfied customers can be created by focusing on on-time, damage-free deliveries along with the relevant information.

Los estilos APA, Harvard, Vancouver, ISO, etc.
30

Daar, Waqas. "Distribution Agnostic Video Server". Thesis, KTH, School of Information and Communication Technology (ICT), 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-24269.

Texto completo
Resumen

With the advances in network and communication technology, real-time audio and video streaming services are becoming increasingly popular over the Internet. In order to enable universal access to multimedia streaming content, and thus the desired end-to-end QoS, it is desirable to design a video server that can be dynamically coupled to different streaming engines and deployed in a test bed for conducting different streaming experiments.

In this thesis we present the design of a video server that implements an agnostic "engine abstraction", which helps to automate and repeat deterministic streaming experiments using different engines. The proposed video server is also deployed in a test bed for evaluating different performance measurement parameters, such as CPU load and memory utilization. The test bed results support our proposed idea and open up many opportunities for the research community to perform different multimedia streaming experiments with the proposed video server.

Los estilos APA, Harvard, Vancouver, ISO, etc.
31

Stacklies, Wolfram. "Force Distribution in Macromolecules". Doctoral thesis, Universitätsbibliothek Leipzig, 2010. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-39367.

Texto completo
Resumen
All living organisms utilize thousands of molecular building blocks to perform mechanical tasks. These building blocks are mostly proteins, and their mechanical properties define the way they can be utilized by the cell. The spectrum ranges from rope-like structures that give support and stability to our bodies to microscopic engines helping us to perform or sense mechanical work. An increasing number of biological processes are revealed to be driven by force, and the well-directed distribution of strain is the very basis of many of these mechanisms. We need to be able to observe the distribution of strain within bio-molecules if we want to gain detailed insight into the function of these highly complex nano-machines. Only by theoretical understanding and prediction of mechanical processes on the molecular level will we be able to rationally tailor proteins to mimic specific biological functions. This thesis aims at understanding the molecular mechanics of a wide range of biological molecules, such as the muscle protein titin or silk fibers. We introduce Force Distribution Analysis (FDA), a new approach to directly study the forces driving molecular processes, instead of indirectly observing them by means of coordinate changes.
Los estilos APA, Harvard, Vancouver, ISO, etc.
32

Yu, Xuebei. "Distribution system reliability enhancement". Thesis, Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/41091.

Texto completo
Resumen
Practically all everyday tasks, from economic transactions to entertainment, depend on the availability of electricity. Some customers have come to expect a higher level of power quality and availability from their electric utility. Federal and state standards are now mandated for power service quality, and utilities may be penalized if the number of interruptions exceeds the mandated standards. In order to meet the requirements for safety, reliability and quality of supply in distribution systems, adaptive relaying and optimal network reconfiguration are proposed. By optimizing the system to be better prepared to handle a fault, the end result is that, in the event of a fault, the minimum number of customers will be affected; reliability will thus increase. The main function of power system protection is to detect and remove the faulted parts as quickly and as selectively as possible. The problem of coordinating protective relays in electric power systems consists of selecting suitable settings such that their fundamental protective function is met under the requirements of sensitivity, selectivity, reliability, and speed. In the proposed adaptive relaying approach, weather data are incorporated as follows. Using real-time weather information, the potential area that might be affected by severe weather is determined. An algorithm is proposed for adaptive optimal relay setting (relays will react optimally to a potential fault). Different types of relays (and relay functions) and fuses are considered in this optimization problem, as well as their coordination with one another. The proposed optimization method is based on mixed-integer programming and provides the optimal relay settings, including pickup current, time dial setting, different relay functions and so on. The main function of optimal network reconfiguration is to maximize the power supply using the existing breakers and switches in the system. The ability to quickly and flexibly reconfigure the power system of an interconnected network of feeders is a key component of the Smart Grid. New technologies are being introduced into distribution systems, such as advanced metering, distribution automation, distributed generation and distributed storage. With these new technologies, optimal network reconfiguration becomes more complicated. The proposed algorithms are implemented and demonstrated on a realistic test system. The end result is improved reliability. The improvements are quantified with reliability indices such as SAIDI.
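The abstract quantifies improvements with indices such as SAIDI. As a reference point, a minimal sketch of how SAIDI (and the related SAIFI) are computed from interruption records is given below; the event data are invented for illustration.

```python
# Minimal sketch of the reliability indices mentioned above (SAIDI/SAIFI),
# computed from a list of interruption events; the data are illustrative only.
from dataclasses import dataclass

@dataclass
class Interruption:
    customers_affected: int
    duration_hours: float

def saidi(events, total_customers: int) -> float:
    """System Average Interruption Duration Index: customer-hours lost per customer served."""
    return sum(e.customers_affected * e.duration_hours for e in events) / total_customers

def saifi(events, total_customers: int) -> float:
    """System Average Interruption Frequency Index: interruptions per customer served."""
    return sum(e.customers_affected for e in events) / total_customers

events = [Interruption(1200, 1.5), Interruption(300, 4.0), Interruption(5000, 0.25)]
print(saidi(events, total_customers=20000), saifi(events, total_customers=20000))
```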
Los estilos APA, Harvard, Vancouver, ISO, etc.
33

Tvedt, Ole Christian. "Quantum key distribution prototype". Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for elektronikk og telekommunikasjon, 2010. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-15845.

Texto completo
Resumen
This thesis covers the basics of cryptography, both classical and the newer quantum-based approaches. Further, it details an implementation of a BB84-based quantum key distribution system currently under construction, focusing on the controlling hardware and FPGA-based software. The overarching goal is to create a system impervious to currently known attacks on such systems. The system currently runs at 100 Mbit/s, though the goal is to double this as the design nears completion. The system currently chooses the encoding basis, the bit value and whether a state is a so-called decoy state. However, the modulator for bit encoding is not yet operational. The output for decoy state generation, on the other hand, is fully functional. Finally, the thesis describes the steps necessary to reach a complete BB84-based quantum key distribution system implementing decoy states.
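For readers unfamiliar with BB84, the sketch below simulates the basic prepare, measure and sift step in software over an ideal, noise-free channel. It only illustrates the protocol logic; it is not a model of the FPGA implementation or of the decoy-state machinery described in the thesis.

```python
# Schematic software model of BB84 sifting over an ideal channel (illustration only).
import secrets

def random_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

def bb84_sifted_key(n_qubits: int):
    alice_bits = random_bits(n_qubits)
    alice_bases = random_bits(n_qubits)   # 0 = rectilinear, 1 = diagonal
    bob_bases = random_bits(n_qubits)
    # Ideal, noise-free channel: Bob recovers Alice's bit whenever the bases
    # match; otherwise his result is random and is later discarded.
    bob_bits = [a if ab == bb else secrets.randbelow(2)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Sifting: keep only positions where the bases agree.
    return [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
            if ab == bb]

key_pairs = bb84_sifted_key(1000)
print(len(key_pairs), "sifted bits; all agree:", all(a == b for a, b in key_pairs))
```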
Los estilos APA, Harvard, Vancouver, ISO, etc.
34

Norström, Fredrik. "The Gompertz-Makeham distribution". Thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 1997. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-51351.

Texto completo
Resumen
This work is about the Gompertz-Makeham distribution, which has been applied in many contexts. The properties investigated here are the unimodality of the Gompertz-Makeham distribution and the relationship between its median value and its mean residual life time. For most realistic sets of parameter values the function is unimodal, but not for all; the example used in this work gives a set of parameter values for which the function is unimodal. The relationship between the median value and the mean life time remaining after some given time has also been investigated, with truncation at a fixed age S. A program for simulating life times from the Gompertz-Makeham distribution has been written in Pascal. To obtain estimators of the unknown parameters of the Gompertz-Makeham distribution, least squares estimation has been applied by means of a program written in Pascal. Descriptions of least squares estimation, of the method of maximum likelihood and of the EM algorithm are also included in this work; the parameter estimators can also be obtained with the last two methods. To test the hypothesis that extreme old ages follow the Gompertz-Makeham distribution, a goodness-of-fit test has been applied to real demographic data. For this example the hypothesis that extreme old ages follow the Gompertz-Makeham distribution, with parameters estimated by least squares, is rejected.
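The simulation step can be illustrated compactly: the Gompertz-Makeham hazard λ + α·exp(βt) is the sum of an exponential hazard and a Gompertz hazard, so a lifetime can be drawn as the minimum of the two component lifetimes. The Python sketch below is an independent re-sketch of that general idea, not the thesis's Pascal program, and the parameter values are made up.

```python
# Illustrative Gompertz-Makeham lifetime sampler via the composition
# min(Exponential(lam), Gompertz(alpha, beta)); hazard h(t) = lam + alpha*exp(beta*t).
import math
import random

def gompertz_makeham_sample(lam: float, alpha: float, beta: float) -> float:
    t_exp = random.expovariate(lam)                   # Makeham (background) component
    u = 1.0 - random.random()                          # uniform in (0, 1]
    t_gomp = math.log(1.0 - (beta / alpha) * math.log(u)) / beta  # Gompertz inverse CDF
    return min(t_exp, t_gomp)

random.seed(0)
samples = [gompertz_makeham_sample(lam=0.001, alpha=5e-5, beta=0.09) for _ in range(10000)]
print("mean simulated lifetime:", sum(samples) / len(samples))
```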
Los estilos APA, Harvard, Vancouver, ISO, etc.
35

Starzer, Michael. "Optimizing Tor Bridge Distribution". Thesis, Karlstads universitet, Institutionen för matematik och datavetenskap, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:kau:diva-26543.

Texto completo
Resumen
The Onion Router (Tor) is a good way to have privacy and anonymity while using the Internet. It can also be used to bypass governmental censorship, which has become a further goal of the Tor network, and this creates several problems the network has to deal with. Several governments and other parties with the capability to do so try to block the network completely, using different techniques. One technique is to overwhelm the distribution strategies for bridges, which are an essential part of the Tor network, especially for censored users. This thesis presents a possible approach for distributing bridges via online social networks (OSN). It is based on the Proximax distribution but also has the capability to separate and exclude possible adversaries who have managed to join the social group. Moreover, trustworthy users are rewarded with better status and shorter waiting times for bridges.
Los estilos APA, Harvard, Vancouver, ISO, etc.
36

Babahacene, Sarah Fadila. "L'après contrat de distribution". Thesis, Montpellier 1, 2014. http://www.theses.fr/2014MON10039.

Texto completo
Resumen
The distribution contract, a dynamic legal entity, is governed at the time of its formation by a body of statute and case law concerning the pre-contractual stage, and during its performance by the provisions of the Civil Code, the Commercial Code and the rules of competition law. The question arises, however, of what governs it beyond its term. This particular period is called the post-contractual phase of the distribution contract. Whatever the reason for its termination, once the distribution contract has ended it is necessary to ask what rules apply to the post-contractual relationship between the former contracting parties. Today the general law of contracts is no longer sufficient to settle the contractual past between distributor and supplier; another body of law is emerging, influenced on the one hand by economics and on the other by the internationalisation of the distribution contract, drawing inspiration from foreign law, from European and international law, and also from the many projects for reform of the law of obligations at different levels. All these reflections make it possible to establish a more practical approach to the legal regime applicable to this complex period following the distribution contract.
Los estilos APA, Harvard, Vancouver, ISO, etc.
37

Ajaja, Adile. "Distribution network optimal reconfiguration". Thesis, McGill University, 2012. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=110641.

Texto completo
Resumen
This thesis reports on research conducted on the Optimal Reconfiguration (OR) of distribution networks using Mixed Integer Linear Programming (MILP). At the operational hourly level, for a set of predicted bus loads, OR seeks the optimum on/off position of line section switches, shunt capacitors and distributed generators so that the distribution network is radial and operates at minimum loss. At the planning level, OR seeks the optimum placement of line switches and shunt capacitors so that, over the long-term, losses will be minimized. The main steps and outcomes of this research are (i) the development of a simplified single-phase distribution network model for Optimal Reconfiguration; (ii) the development of a linear DC load flow model with line and device switching variables accounting for both active and reactive power flows; (iii) the development of an algorithm HYPER which finds the minimum loss on/off status of existing line switches, shunt capacitors and distributed generators; (iv) the extension of HYPER to find the optimum (minimum loss) placement of switches, capacitors and distributed generators; (v) the representation of losses via supporting hyperplanes enabling the full linearization of the OR problem, which can then be solved using efficient and commercially available MILP solvers like CPLEX. Keywords: Distribution Network, Optimal Reconfiguration, OR, Loss minimization, Mixed-Integer Linear Programming, MILP, Operations research, Linear network model, DC load flow, Supporting hyperplanes, Real time optimization, Switch, Capacitor and Distributed Generator placement, Power Systems Operations and Planning.
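Item (v) above relies on supporting hyperplanes to linearize the quadratic line losses. The sketch below illustrates that idea in isolation: the loss r·p² is bounded from below by the maximum of its tangent lines at a few breakpoints, which is the piecewise-linear form an MILP solver such as CPLEX can handle. The resistance and breakpoint values are illustrative assumptions, not data from the thesis.

```python
# Illustrative supporting-hyperplane approximation of the quadratic loss f(p) = r*p**2:
# each tangent at p0 is f(p) >= 2*r*p0*p - r*p0**2, and the max over tangents gives a
# piecewise-linear lower approximation suitable for a linear (MILP) formulation.

def loss_hyperplanes(r: float, breakpoints):
    """Return (slope, intercept) pairs for the tangents of r*p**2 at the breakpoints."""
    return [(2.0 * r * p0, -r * p0 ** 2) for p0 in breakpoints]

def approx_loss(r: float, p: float, breakpoints) -> float:
    """Piecewise-linear loss estimate: maximum over the supporting hyperplanes."""
    return max(slope * p + intercept for slope, intercept in loss_hyperplanes(r, breakpoints))

r = 0.05
for p in (0.3, 1.0, 2.5):
    print(p, approx_loss(r, p, breakpoints=[0.5, 1.0, 2.0, 3.0]), r * p ** 2)
```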
Los estilos APA, Harvard, Vancouver, ISO, etc.
38

Novak, Julia. "Generalised key distribution patterns". Thesis, Royal Holloway, University of London, 2012. http://repository.royalholloway.ac.uk/items/f582aac8-df73-28ea-fe2e-80f1c37f5e59/8/.

Texto completo
Resumen
Given a network of users with certain secure communication requirements, we examine the mathematics that underpins the distribution of the necessary secret information to enable secure communications within that network. More precisely, we let 𝒫 be a network of users and 𝒢, ℱ be some predetermined families of subsets of those users. The secret information (keys or subkeys) must be distributed in such a way that for any G ∈ 𝒢, the members of G can communicate securely among themselves without fear of the members of some F ∈ ℱ (that has no users in common with G) colluding together to either eavesdrop on what is being said (and understand the content of the message) or tamper with the message undetected. In the case when 𝒢 and ℱ comprise all the subsets of 𝒫 that have some fixed cardinality t and w respectively, we have a well-known and much studied problem. However, in this thesis we remove these rigid cardinality constraints and make 𝒢 and ℱ as unrestricted as possible. This allows for situations where the members of 𝒢 and ℱ are completely irregular, giving a much less well-known and less studied problem. Without any regularity emanating from cardinality constraints, the best approach to the study of these general structures is unclear. It is unreasonable to expect that highly regular objects (such as designs or finite geometries) play any significant role in the analysis of such potentially irregular structures. Thus, we require some new techniques and a more general approach. In this thesis we use methods from set theory and ideas from convex analysis in order to construct these general structures and provide some mathematical insight into their behaviour. Furthermore, we analyse these general structures by exploiting the proof techniques of other authors in new ways, tightening existing inequalities and generalising results from the literature.
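To fix ideas, the toy sketch below builds a deliberately naive key distribution pattern for small, irregular families 𝒢 and ℱ by issuing one fresh key per disjoint (G, F) pair, and then checks the defining property. It illustrates the combinatorial object being studied, not the thesis's constructions or bounds; the example families are invented.

```python
# Toy key distribution pattern: for every privileged set G in family_G and every
# forbidden set F in family_F disjoint from G, some key is held by all of G and
# by no member of F. The trivial construction issues one fresh key per such pair.
from itertools import count

def trivial_kdp(family_G, family_F):
    """Return {user: set_of_key_ids} realising the (family_G, family_F) pattern."""
    keys = {u: set() for g in family_G for u in g}
    key_id = count()
    for G in family_G:
        for F in family_F:
            if not (set(G) & set(F)):          # only disjoint pairs impose a requirement
                k = next(key_id)
                for u in G:
                    keys.setdefault(u, set()).add(k)
    return keys

def satisfies(keys, family_G, family_F):
    """Check the defining property of a key distribution pattern."""
    for G in family_G:
        for F in family_F:
            if set(G) & set(F):
                continue
            common = set.intersection(*(keys.get(u, set()) for u in G))
            exposed = set().union(*(keys.get(u, set()) for u in F)) if F else set()
            if not (common - exposed):
                return False
    return True

family_G = [{"a", "b"}, {"b", "c", "d"}]      # irregular privileged sets
family_F = [{"e"}, {"a", "e"}]                # irregular forbidden sets
keys = trivial_kdp(family_G, family_F)
print(keys, satisfies(keys, family_G, family_F))
```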
Los estilos APA, Harvard, Vancouver, ISO, etc.
39

Edelstein, Elspeth Claire. "Syntax of adverb distribution". Thesis, University of Edinburgh, 2012. http://hdl.handle.net/1842/7919.

Texto completo
Resumen
The distribution of adverbs is particularly difficult to account for, given the amount of variation it encompasses. Not only are adverbs typically optional, but any adverb may also appear in several different positions relative to other constituents, with placement differing according to adverb type and language. As a result, although adverbs are not essential clausal mainstays, the way they are incorporated into the syntax has crucial implications for an overall understanding of clause structure. Some recent accounts of adverb distribution, most notably Cinque (1999), require a highly articulated clausal cartography, where each adverb fits into a specific syntactic position. The placement of adverbs is determined by their semantic properties inasmuch as their specified positions correspond to semantic classes. The ordering of these positions is syntactically predetermined, supposedly with little or no semantic input. More semantics-based accounts of adverb distribution, as exemplified by Ernst (2002), do not restrict adverbs to specific positions. Rather, any adverb may adjoin to any projection, as long as its individual semantic requirements are satisfied. Such theories of distribution thus depend on adverbs’ semantic interactions with each other and other constituents. The differences between these ‘syntactic’ and ‘semantic’ approaches have led to questions about the nature of verb movement, functional projections, and adjunction. The debate over adverb distribution also raises the issue of what contribution semantics makes to the syntax, and what is syntactically primitive. The aim of this dissertation is to develop an account of adverb distribution that neither requires the introduction of new functional projections, nor attempts to shoehorn an external semantic hierarchy onto a pre-existing syntactic one. It will focus on the position of adverbs in relation to other constituents rather than their order with respect to each other. In this thesis I will review previous theories of adverb distribution, giving special attention to Cinque’s (1999) ‘functional specifier’ approach and Ernst’s (2002) ‘semantic adjunction’ approach, as well as some alternatives, especially the VP-remnant analysis proposed in Nilsen (2003). I will then look at the little-discussed phenomenon of ‘Adverb Climbing’ (AC), in which an adverb precedes a verb that takes an infinitival complement, but is interpreted as modifying the embedded rather than the matrix verb. Taking the varying availability of AC with Control and Raising verbs as a starting point, I will develop a theory of adverb licensing that determines where an adverb may adjoin according to its location in relation to a particular projection. Specifically, I will propose that an adverb must c-command the projection it modifies, and must have access to that projection either in the same phase or at the edge of a lower phase. Based on this analysis I will argue that AC is in fact an indicator of restructuring, and that control and raising verbs take different sizes of infinitival complement. I will also examine the distribution of ‘verb-modifying’ adverbs. Drawing on previous ‘split VP’ proposals (e.g. Ramchand 2008; Travis 2010), I will contend that the varying distribution of agentive, subject-oriented, and manner adverbs indicates that each is distributed in relation to a different projection within the vP, and that some postverbal adverbs are complements of VP.
This proposal will require the introduction of crosslinguistically parameterised restrictions on the order in which adverbs and feature-checking elements may be merged to a single projection. Moreover, I will argue that the array of positions available to agentive adverbs indicates that English has head movement within the vP which bypasses a head, violating Travis’s (1984) Head Movement Constraint (HMC). I will then posit a new analysis of head movement which allows for this violation while still precluding the instances of ungrammaticality that the HMC was meant to rule out. I will finally discuss the distribution of adverbs and negation in the IP range, giving special attention to Pollock’s (1989) classic data from English and French. I will develop an analysis of negation which will allow me to explain the distribution of both sentential adverbs and negation without splitting the IP. Further refinement of the ordering restrictions on multiple merge will also provide an explanation for the ungrammaticality of an adverb between a subject and the highest verb in French, and between do and not in English. This dissertation will serve to situate the study of adverb distribution within Chomsky’s (1995) Minimalist framework while providing fresh insight into the extent to which adverb distribution may be used as an indicator of clause structure and movement of other constituents.
Los estilos APA, Harvard, Vancouver, ISO, etc.
40

Lamo, Ana Rosa. "Cross section distribution dynamics". Thesis, London School of Economics and Political Science (University of London), 1996. http://etheses.lse.ac.uk/1448/.

Texto completo
Resumen
This thesis contains four chapters. Each chapter constitutes an empirical exercise in which I apply econometric ideas on studying the dynamics of large cross sections of data (Random Fields). Three of them concern the empirics of convergence and the fourth analyses business cycle fluctuations. The first, "Notes on Convergence Empirics: Some Calculations for Spanish Regions," describes the econometric methods for studying the dynamics of the distributions and how to characterise convergence in this framework, explains why the standard cross-section regression analysis is misleading when testing for convergence and then performs some calculations for regions in Spain. The second chapter, "Dynamics of the Income Distribution Across OECD Countries", considers its baseline hypotheses to be those generated by the Solow growth model. Using sequential conditioning, it studies whether the convergence hypothesis implications can be shown to hold for the OECD economies. It finds that neither absolute nor conditional convergence, in the sense of economies approaching the OECD average, has taken place. The third chapter, "Cross Sectional Firm Dynamics: Theory and Empirical Results", extends ideas of distribution dynamics to a discrete choice setting, and extends the reasoning of Galton's Fallacy to the logit model. It provides evidence of the tendency of firm sizes to converge for the US chemicals sector by analysing dynamically evolving cross-section distributions. Finally, the fourth chapter, "Unemployment in Europe and Regional Labour Fluctuations" applies distribution dynamics ideas to a business cycle setting. It analyses the dynamics of employment for 51 European regions from 1960 to 1990, addressing the issue of whether regional shocks have aggregate effects on unemployment or the opposite. It uses a model for non-stationary evolving distributions to identify idiosyncratic and aggregate disturbances.
Los estilos APA, Harvard, Vancouver, ISO, etc.
41

林漢坤 y Hon-kwan Lam. "Modelling of income distribution". Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1989. http://hub.hku.hk/bib/B31975914.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
42

Phillips, C. "An automatic distribution tool". Thesis, Queen's University Belfast, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.396221.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
43

Nasr, Entesar. "Distribution problems in arithmetic". Thesis, University of Liverpool, 2018. http://livrepository.liverpool.ac.uk/3022467/.

Texto completo
Resumen
In this thesis we use modern developments in ergodic theory and uniform distribution theory to investigate the distribution of polynomials, partial quotients of convergents, random and oscillatory sequences.
Los estilos APA, Harvard, Vancouver, ISO, etc.
44

Lee, Christina (Christina Esther). "Computing stationary distribution locally". Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/82410.

Texto completo
Resumen
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2013.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 89-93).
Computing stationary probabilities of states in a large countable state space Markov Chain (MC) has become central to many modern scientific disciplines, whether in statistical inference problems, or in network analyses. Standard methods involve large matrix multiplications as in power iterations, or long simulations of random walks to sample states from the stationary distribution, as in Markov Chain Monte Carlo (MCMC). However, these approaches lack clear guarantees for convergence rates in the general setting. When the state space is prohibitively large, even algorithms that scale linearly in the size of the state space and require computation on behalf of every node in the state space are too expensive. In this thesis, we set out to address this outstanding challenge of computing the stationary probability of a given state in a Markov chain locally, efficiently, and with provable performance guarantees. We provide a novel algorithm that answers whether a given state has stationary probability smaller or larger than a given value Δ ∈ (0, 1). Our algorithm accesses only a local neighborhood of the given state of interest, with respect to the graph induced between states of the Markov chain through its transitions. The algorithm can be viewed as a truncated Monte Carlo method. We provide correctness and convergence rate guarantees for this method that highlight the dependence on the truncation threshold and the mixing properties of the graph. Simulation results complementing our theoretical guarantees suggest that this method is effective when our interest is in finding states with high stationary probability.
by Christina Lee.
S.M.
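The truncated Monte Carlo idea in the abstract above can be illustrated with the classical identity that a state's stationary probability equals the reciprocal of its mean return time: truncating return-time samples at a threshold shortens them, which pushes the reciprocal estimate upward and is therefore enough to suggest that a state's stationary probability lies below a threshold Δ. The sketch below is a schematic illustration of that general idea only, not the thesis's algorithm or its guarantees, and the toy chain is invented.

```python
# Schematic truncated Monte Carlo estimate of a stationary probability via return times.
import random

def truncated_return_time(step, state, theta: int) -> int:
    """Length of a walk from `state` back to `state`, truncated at `theta` steps."""
    current = step(state)
    t = 1
    while current != state and t < theta:
        current = step(current)
        t += 1
    return t

def stationary_upper_estimate(step, state, theta=1000, n_walks=2000) -> float:
    mean_t = sum(truncated_return_time(step, state, theta) for _ in range(n_walks)) / n_walks
    return 1.0 / mean_t   # truncation shortens return times, so this tends to over-estimate pi(state)

# Toy chain: biased random walk on {0,...,9} with reflecting ends.
def step(i: int) -> int:
    if i == 0:
        return 1
    if i == 9:
        return 8
    return i + 1 if random.random() < 0.4 else i - 1

random.seed(1)
delta = 0.05
estimate = stationary_upper_estimate(step, state=9)
print(estimate, "below delta" if estimate < delta else "not resolved below delta")
```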
Los estilos APA, Harvard, Vancouver, ISO, etc.
45

Simpkins, Travis L. (Travis Lee) 1977. "Active optical clock distribution". Thesis, Massachusetts Institute of Technology, 2002. http://hdl.handle.net/1721.1/87826.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
46

Chen, Sheit Sheng. "Refrigerant distribution model development". Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/35035.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
47

Bhatia, Vimal. "Distribution dependent adaptive learning". Thesis, University of Edinburgh, 2005. http://hdl.handle.net/1842/10801.

Texto completo
Resumen
To improve the performance of adaptive algorithms, we develop algorithms adapted to the noise characteristics rather than adapting only on second-order statistics. The developments in this thesis fall into two major pieces of work. The first is the development of a minimum bit-error rate (MBER) decision feedback equaliser (DFE) for impulsive noise modelled as an α-stable distribution. The development exploits the stable nature of the α-stable distribution, and the concepts build on earlier work in a Gaussian noise environment. Further, a Wiener-filter-with-limiter solution is also presented and used as a performance benchmark. An improvement in convergence and BER performance is achieved by using a minimum bit-error rate (MBER) cost function instead of a conventional least mean square (LMS) based design. The ability of least BER (LBER) equalisers based on a Gaussian noise assumption to operate in an α-stable noise environment is also highlighted. In the second piece of work, a block-based maximum-likelihood algorithm using kernel density estimates is proposed to improve channel estimation in non-Gaussian noise environments. The likelihood pdf is assumed unknown and is estimated using a kernel density estimator at the receiver. Combining a log-likelihood cost function with a kernel density estimator thereby provides a robust channel estimator, which can be used in various non-Gaussian noise environments without any modification. The performance of the proposed estimator is compared with the theoretical lower bounds for the associated noise distribution. The simulations for impulsive noise and co-channel interference (CCI) in the presence of Gaussian noise confirm that a better estimate can be obtained with the proposed technique than with traditional algorithms. The proposed algorithm is then applied to orthogonal frequency division multiplexing (OFDM) communication systems. A considerable performance improvement is observed when using a non-parametric channel estimator in conjunction with a symbol-by-symbol non-parametric maximum a posteriori probability (MAP) equaliser.
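As an illustration of the second contribution, the sketch below builds a Gaussian-kernel density estimate of the residual noise and uses its log-likelihood as a cost for comparing candidate channel estimates. It is a minimal, generic sketch under stated assumptions: the bandwidth, the heavy-tailed noise stand-in and the toy channel are invented, and none of the thesis's block-based algorithm details are reproduced.

```python
# Illustrative kernel-density log-likelihood cost for channel estimation in heavy-tailed noise.
import numpy as np

def gaussian_kde_logpdf(x: np.ndarray, samples: np.ndarray, bandwidth: float) -> np.ndarray:
    """Log of a Gaussian-kernel density estimate built from `samples`, evaluated at `x`."""
    z = (x[:, None] - samples[None, :]) / bandwidth
    kernel = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    return np.log(kernel.mean(axis=1) / bandwidth)

def kde_log_likelihood(h: np.ndarray, tx: np.ndarray, rx: np.ndarray, bandwidth=0.2) -> float:
    """Cost of a candidate channel `h`: KDE log-likelihood of the residuals rx - conv(tx, h)."""
    residuals = rx - np.convolve(tx, h, mode="full")[: len(rx)]
    return float(np.sum(gaussian_kde_logpdf(residuals, residuals, bandwidth)))

# Toy example with heavy-tailed noise: the true channel should score higher than a perturbed one.
rng = np.random.default_rng(0)
h_true = np.array([1.0, 0.5, -0.2])
tx = rng.choice([-1.0, 1.0], size=500)
noise = rng.standard_t(df=1.5, size=500)            # heavy-tailed stand-in for alpha-stable noise
rx = np.convolve(tx, h_true, mode="full")[:500] + 0.3 * noise
print(kde_log_likelihood(h_true, tx, rx), kde_log_likelihood(h_true + 0.3, tx, rx))
```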
Los estilos APA, Harvard, Vancouver, ISO, etc.
48

Broadfoot, Stuart Graham. "Long distance entanglement distribution". Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:e7039911-f16b-4f49-8aab-8bb30ae97daa.

Texto completo
Resumen
Developments in the interdisciplinary field of quantum information open up previously impossible abilities in the realms of information processing and communication. Quantum entanglement has emerged as one property of quantum systems that acts as a resource for quantum information processing and, in particular, enables teleportation and secure cryptography. Therefore, the creation of entangled resources is of key importance for the application of these technologies. Despite a great deal of research, the efficient creation of entanglement over long distances is limited by inevitable noise. This problem can be overcome by creating entanglement between nodes in a network and then performing operations to distribute the entanglement over a long distance. This thesis contributes to the field of entanglement distribution within such quantum networks. Entanglement distribution has been extensively studied for one-dimensional networks, resulting in "quantum repeater" protocols. However, little work has been done on higher-dimensional networks. In these networks a fundamentally different scaling, called "long distance entanglement distribution", can appear between the resources and the distance separating the systems to be entangled. I reveal protocols that enable long distance entanglement distribution for quantum networks composed of mixed states and give a few limitations on the capabilities of entanglement distribution. To aid in the implementation of all entanglement distribution protocols, I finish by introducing a new system, composed of an optical nanofibre coupled to a carbon nanotube, that may enable new forms of photo-detectors and quantum memories.
Los estilos APA, Harvard, Vancouver, ISO, etc.
49

Jurado, Ignacio. "The politics of distribution". Thesis, University of Oxford, 2012. http://ora.ox.ac.uk/objects/uuid:a9da1efe-7b7c-41df-aa5a-96ff380b955b.

Texto completo
Resumen
This dissertation presents a theoretical framework about which voters parties distribute to and with which policies. To develop this full framework of distributive policies, the dissertation proceeds in two stages. First, it analyses which voters parties have more incentives to target distributive policies. Second, it also develops the conditions under which political parties can focus exclusively on these voters or need to combine this strategy with appeals to a broader electorate. The first part of the argument analyses which voters parties have at the centre of their distributive strategies, or, in the words of Cox and McCubbins (1986) to whom parties will give an available extra dollar for distribution. The argument is that core voters provide more efficient conditions for distribution, contradicting Stokes’ (2005) claim that a dollar spent on core voters is a wasted dollar. The explanation is twofold. First, core supporters might not vote for another party, but they can get demobilised. Once we include the effects on turnout, core voters are more responsive. Their party identification makes them especially attentive and reactive to economic benefits provided by their party. Secondly, incumbents cannot individually select who receives a distributive policy, and not all voters are equally reachable with distributive policies. When a party provides a policy, it cannot control if some of those resources go to voters the party is not interested in. Core supporters are more homogenous groups with more definable traits, whereas swing voters are a residual category composed by heterogeneous voters with no shared interests. This makes it easier for incumbents to shape distributive benefits that target core voters more exclusively. These mechanisms define the general distribution hypothesis: parties will focus on core voters, by targeting their distributive strategies to them. The second part of the dissertation develops the conditions under which politicians stick to this distributive strategy or, instead, would provide more universalistic spending to a more undefined set of recipients. The conventional argument explaining this choice relies on the electoral system, arguing that proportional systems give more incentives to provide universalistic policies than majoritarian systems. This dissertation challenges this argument and provides two other contextual conditions that define when parties have a stronger interest in their core supporters or in a more general electorate. First, the geographic distribution of core supporters across districts is a crucial piece of information to know the best distributive strategy. When parties’ core supporters are geographically concentrated, they cannot simply rely on them, as the party will always fall short of districts to win the election. Therefore, parties will have greater incentives to expand their electorate by buying off other voters. This should reduce the predicted differences between electoral systems in the provision of universalistic programmes. Secondly, the policy positions of candidates are a result of strategic considerations that respond to other candidates’ positions. Thus, I argue that parties adapt their distributive strategies to the number of competing parties, independently of the electoral system. In a two-party scenario, parties need broader coalitions of electoral support. In equilibrium, any vote can change the electoral outcome. 
As more parties compete, the breadth of parties' electorates is reduced and parties will find narrow distributive policies more profitable. In summary, the main contribution of this dissertation is to provide a new framework for studying distributive politics. This framework innovates both in the characterisation of swing and core electoral groups and in the rationale of parties' distributive strategies, helping to advance previous theoretical and empirical research.
Los estilos APA, Harvard, Vancouver, ISO, etc.
50

Schmidt, Andreas Tupac. "Freedom and its distribution". Thesis, University of Oxford, 2014. http://ora.ox.ac.uk/objects/uuid:dce62f88-1419-4159-ad13-8bdb927a0d3c.

Texto completo
Resumen
This dissertation develops a new theory of specific and overall socio-political freedom and discusses its role in normative political theory. The aim is to dissolve some of the conceptual confusions that have often beset previous discussions and to develop a theoretical framework with which to approach questions of public policy. This dissertation consists of three parts. In the first part, I develop a new account that specifies under which conditions a person is specifically free and when she is unfree to do something. It is shown that republican accounts of freedom are unsatisfactory and that a trivalent liberal account that equates freedom with ability is most plausible. A new analysis of unfreedom is defended according to which a person is made unfree (as opposed to merely unable) to do something only if she would have this freedom in a better and available distribution that another person could have foreseeably brought about. In the second part, I discuss how to move from an account of specific freedom and unfreedom to a measure of overall freedom. I develop a new and simple aggregation function and argue that the measurement of overall freedom requires both quantitative and evaluative factors. In the third part, I then discuss what role freedom should play in a theory of distributive justice. Instead of freedom deontologically constraining the reach of distributive justice, freedom should be one of its distribuenda. I will first discuss how best to distribute freedom across a person’s lifetime and how this impacts on discussions of paternalistic policies. It will then be shown that we ought not simply maximise freedom between persons, not aim to give everyone enough freedom nor aim at equal freedom. Instead, distributing freedom requires a principle that combines maximisation with a concern for fairness.
Los estilos APA, Harvard, Vancouver, ISO, etc.