To view the other types of publications on this topic, follow this link: Analysis of deviations.

Dissertations on the topic "Analysis of deviations"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Choose the type of source:

Consult the top 50 dissertations for your research on the topic "Analysis of deviations".

Next to every work in the list of references there is an "Add to bibliography" option. Use it, and a bibliographic reference for the chosen work will be generated automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read its online abstract whenever the relevant parameters are provided in the metadata.

Browse dissertations on a wide variety of disciplines and compile your bibliography correctly.

1

Martori, Amanda Lynn. “A Wearable Motion Analysis System to Evaluate Gait Deviations”. Scholar Commons, 2013. http://scholarcommons.usf.edu/etd/4724.

Full text of the source
Abstract:
A Wearable Motion Analysis System (WMAS) was developed to evaluate gait, particularly parameters that are indicative of mild traumatic brain injury. The WMAS consisted of six Opal IMUs attached to the sternum, waist, left and right thighs, and left and right shanks. Algorithms were developed to calculate the knee flexion angle, stride length, and cadence during slow, normal, and fast gait speeds. The WMAS was validated for repeatability using a robotic arm and for accuracy using the Vicon motion capture system, the gold standard for gait analysis. The WMAS calculated the gait parameters to within a clinically acceptable range and is a powerful tool for gait analysis and potential concussion diagnosis outside a laboratory setting.
APA, Harvard, Vancouver, ISO, and other citation styles
2

De Losier, Clayton Ray. “Effects of Manufacturing Deviations on Core Compressor Blade Performance”. Thesis, Virginia Tech, 2009. http://hdl.handle.net/10919/31340.

Full text of the source
Abstract:
There has been recent incentive for understanding the possible deleterious effects that manufacturing deviations can have on compressor blade performance. This is of particular importance today, as compressor designs are pushing operating limits by employing fewer stages with higher loadings and are designed to operate at ever higher altitudes. Deviations in these advanced, as well as legacy, designs could negatively affect the performance and operation of a core compressor; thus, a numerical investigation to quantify manufacturing deviations and their effects is undertaken. Data from three radial sections of every compressor blade in a single row of a production compressor are used as the basis for this investigation. Deviations from the compressor blade design intent to the as-manufactured blades are quantified with a statistical method known as principal component analysis (PCA). MISES, an Euler solver coupled with integral boundary-layer calculations, is used to analyze the effects that the aforementioned deviations have on compressor blade performance when the inlet flow conditions produce a Mach number of approximately 0.7 and a Reynolds number of approximately 6.5e5. It was found that the majority of manufacturing deviations were within a range of plus or minus 4 percent of the design intent, and deviations at the leading edge had a critical effect on performance. Of particular interest is the fact that deviations at the leading edge not only degraded performance but significantly changed the boundary-layer behavior from that of the design case.
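The PCA step described in this abstract can be sketched in a few lines of NumPy (a minimal illustration, not the thesis's actual code; the array layout and function name are assumptions):

```python
import numpy as np

def pca_deviation_modes(blades):
    """blades: (n_blades, n_coords) array, one row per as-manufactured
    blade's stacked section coordinates. Returns the mean geometry, the
    principal deviation modes, per-blade mode amplitudes, and the
    fraction of geometric variance each mode explains."""
    mean_blade = blades.mean(axis=0)            # nominal (mean) geometry
    deviations = blades - mean_blade            # deviation of each blade
    # SVD of the deviation matrix yields the principal component modes
    U, s, Vt = np.linalg.svd(deviations, full_matrices=False)
    modes = Vt                                  # each row: one deviation mode
    scores = U * s                              # per-blade amplitude of each mode
    explained = s**2 / np.sum(s**2)             # variance fraction per mode
    return mean_blade, modes, scores, explained
```

Each measured blade is then exactly the mean geometry plus its score-weighted sum of modes, which is what makes the decomposition useful for quantifying how far each blade sits from design intent.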
Master of Science
APA, Harvard, Vancouver, ISO, and other citation styles
3

Dungca, Jason Tomas. “Large deviations and multifractal analysis for expanding countably-branched Markov maps”. Thesis, University of Bristol, 2018. http://hdl.handle.net/1983/0aed71dd-5e99-4fd7-a105-d4b5aae4d95c.

Full text of the source
Abstract:
We will use the theory of thermodynamic formalism for countable Markov shifts to pose and solve problems in multifractal analysis and large deviations. We start with an introduction outlining results in thermodynamic formalism, multifractal analysis, and large deviations in Chapter 1. We state necessary concepts and results from dynamical systems, ergodic theory, thermodynamic formalism, dimension theory, and large deviations in Chapter 2. In Chapter 3, we consider the multifractal analysis for Gibbs measures for expanding, countably branched Markov maps. We find conditions for the multifractal spectrum to have various numbers of phase transitions. Finally, in Chapter 4, we consider an expanding, countably branched Markov map T, the countable Markov shift, and a locally Hölder potential f. The behaviour of the dynamical system (T_λ, (0,1]) depends on the value of λ. We aim to form a large deviation principle for f for a fixed λ in (1/2, 1), and we discuss the method for forming such a principle for λ in (0, 1/2) in the introduction to Chapter 4.
APA, Harvard, Vancouver, ISO, and other citation styles
4

Leung, Vincent W. “Analysis and compensation of log-domain filter deviations due to transistor nonidealities”. Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk1/tape11/PQDD_0034/MQ50637.pdf.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
5

Hodzic, Amer, and Danny Hoang. “Detection of Deviations in Beehives Based on Sound Analysis and Machine Learning”. Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-105316.

Full text of the source
Abstract:
Honeybees are an essential part of our ecosystem, as they take care of most of the pollination in the world. They also produce honey, which is the main reason beekeeping was introduced in the first place. As the production of honey is affected by the living conditions of the honeybees, beekeepers aim to maintain the health of honeybee societies. TietoEVRY, together with HSB Living Lab, introduced connected beehives in a project named BeeLab. The goal of BeeLab is to provide a service to monitor and gain knowledge about honeybees using data collected with different sensors. Today they measure weight, temperature, air pressure, and humidity. It is known that honeybees produce different sounds when different events occur in the beehive. Therefore, BeeLab wants to add sound monitoring to its service. This project investigates the possibility of detecting deviations in beehives based on sound analysis and machine learning. This includes recording sound from beehives, followed by preprocessing of the sound data, feature extraction, and application of a machine learning algorithm. An experiment was done using Mel-Frequency Cepstral Coefficients (MFCC) to extract sound features and applying the DBSCAN machine learning algorithm to investigate the possibilities of detecting deviations in the sound data. The experiment showed promising results, as the deviating sounds used in the experiment were grouped into different clusters.
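The clustering step of such an experiment can be sketched as follows (a minimal sketch assuming the MFCC vectors have already been extracted, e.g. with an audio library such as librosa; the function name and the `eps`/`min_samples` values are illustrative, not those used in the thesis):

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

def detect_sound_deviations(mfcc_frames, eps=0.9, min_samples=5):
    """mfcc_frames: (n_frames, n_coefficients) array of MFCC vectors,
    one row per analysis frame. DBSCAN groups frames into dense
    clusters; frames it labels -1 (noise) fall outside every cluster
    and are flagged as deviating sounds."""
    X = StandardScaler().fit_transform(mfcc_frames)
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X)
    return labels == -1          # True where the frame is a deviation
```

Because DBSCAN needs no predefined number of clusters, "normal" hive sounds can form one or several clusters while rare events end up as noise points, which matches the deviation-detection framing of the experiment.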
APA, Harvard, Vancouver, ISO, and other citation styles
6

Keyser, Leonid. “Does specialization in security analysis and portfolio management explain deviations from the CAPM?” Thesis, Massachusetts Institute of Technology, 2005. http://hdl.handle.net/1721.1/33666.

Full text of the source
Abstract:
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management, 2005.
Includes bibliographical references (p. 44-45).
The Capital Asset Pricing Model (CAPM), which relates the risk of an individual security to its expected return, is frequently cited in investments textbooks and the academic literature as a centerpiece of modern finance theory. The main prediction of the CAPM is that investors are compensated in the form of expected return only for bearing systematic or market risk, which is the portion of a security's risk that cannot be diversified away. That investors demand compensation for, and only for, systematic risk is a consequence of the pivotal assumption that all investors have identical information for the entire universe of publicly traded securities. In actuality, professional active money managers rarely invest in a portfolio broad enough to be considered the market portfolio. Instead, the asset management industry has self-organized over time according to a top-down investment process, where asset allocators provide capital to security selectors who specialize in high-yield bonds, large-cap value stocks, and the like. Any losses in diversification benefits resulting from this theoretically suboptimal two-phase investment strategy are deemed an unavoidable cost of obtaining accurate forecasts through specialization in security analysis and portfolio management.
This research paper extends the ideas of the CAPM to formulate an equilibrium security pricing model that attempts to account for the top-down approach followed by investors in the real world.
by Leonid Keyser.
M.B.A.
APA, Harvard, Vancouver, ISO, and other citation styles
7

Tsang, Tat-shing. “Statistical inference on the coefficient of variation”. Hong Kong: University of Hong Kong, 2000. http://sunzi.lib.hku.hk/hkuto/record.jsp?B21903980.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
8

Ma, Jinyong. “Topics in sequence analysis”. Diss., Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/45908.

Full text of the source
Abstract:
This thesis studies two topics in sequence analysis. In the first part, we investigate the large deviations of the shape of the random RSK Young diagrams, associated with a random word of size n whose letters are independently drawn from an alphabet of size m=m(n). When the letters are drawn uniformly and when both n and m converge together to infinity, m not growing too fast with respect to n, the large deviations of the shape of the Young diagrams are shown to be the same as that of the spectrum of the traceless GUE. Since the length of the top row of the Young diagrams is the length of the longest (weakly) increasing subsequence of the random word, the corresponding large deviations follow. When the letters are drawn with non-uniform probability, a control of both highest probabilities will ensure that the length of the top row of the diagrams satisfies a large deviation principle. In either case, both speeds and rate functions are identified. To complete our study, non-asymptotic concentration bounds for the length of the top row of the diagrams are obtained for both models. In the second part, we investigate the order of the r-th, 1 ≤ r < +∞, central moment of the length of the longest common subsequence of two independent random words of size n whose letters are identically distributed and independently drawn from a finite alphabet. When all but one of the letters are drawn with small probabilities, which depend on the size of the alphabet, the r-th central moment is shown to be of order n^{r/2}. In particular, when r=2, we get the order of the variance of the longest common subsequence.
APA, Harvard, Vancouver, ISO, and other citation styles
9

Guimond, Jean-Francois. “Do Mutual Fund Managers Have Superior Skills? An Analysis of the Portfolio Deviations from a Benchmark”. Digital Archive @ GSU, 2006. http://digitalarchive.gsu.edu/finance_diss/12.

Full text of the source
Abstract:
By construction, actively managed portfolios must differ from passively managed ones. Consequently, the manager's problem can be viewed as selecting how to deviate from a passive portfolio composition. The purpose of this study is to see if we can infer the presence of superior skills through the analysis of portfolio deviations from a benchmark. Based on the Black-Litterman approach, we hypothesize that positive signals should lead to an increase in weight, from which it should follow that the largest deviations from a benchmark weight reveal the presence of superior skills. More precisely, this study looks at the subsequent performance of the securities corresponding to the largest deviations from different external benchmarks. We use a sample of 8385 US funds from the CRSP survivorship-bias-free database from June 2003 to June 2004 to test our predictions. We use two external benchmarks to calculate the deviations: the CRSP value-weighted index (consistent with the Black-Litterman model) and the investment objective of each fund. Our main result shows that a portfolio of the securities with the most important positive deviations with respect to a passive benchmark (either CRSP-VW or the investment objective) would have earned a positive abnormal return (on a risk-adjusted basis) for one month after the portfolio date. The magnitude of this return is around 0.6% for all funds and can be as high as 2.77% for small-cap value funds. This result is robust to all the performance measures used in this study.
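The deviation ranking at the heart of this design can be illustrated in a few lines (hypothetical weights; the function name is ours, not the author's):

```python
import numpy as np

def largest_positive_deviations(fund_weights, benchmark_weights, k=3):
    """Rank securities by active weight (fund minus benchmark) and
    return the indices of the k largest positive deviations, i.e. the
    positions the manager overweights most relative to the benchmark."""
    active = np.asarray(fund_weights) - np.asarray(benchmark_weights)
    top = np.argsort(active)[::-1][:k]      # indices, largest deviation first
    return top[active[top] > 0]             # keep only true overweights
```

For a fund holding [0.40, 0.10, 0.30, 0.20] against an equal-weight benchmark, securities 0 and 2 are the overweights whose subsequent performance the study would track.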
APA, Harvard, Vancouver, ISO, and other citation styles
10

Probst, George T. “Analysis of the Effects of Privacy Filter Use on Horizontal Deviations in Posture of VDT Operators”. Thesis, Virginia Tech, 2000. http://hdl.handle.net/10919/33544.

Full text of the source
Abstract:
The visual display terminal (VDT) is an integral part of the modern office. An issue of concern associated with the use of the VDT is maintaining the privacy of on-screen materials. Privacy filters are products designed to restrict the viewing angle to documents displayed on a VDT so that the on-screen material is not visible to persons other than the VDT operator. Privacy filters restrict the viewing angle either by diffraction or diffusion of the light emitted from the VDT. Constrained posture is a human factors engineering problem that has been associated with VDT use. The purpose of this research was to evaluate whether the use of privacy filters affected: 1) the restriction of postures associated with VDT use, 2) operator performance, and 3) subjective ratings of display issues, posture, and performance. Nine participants performed three types of tasks: word processing, data entry, and Web browsing. Each task was performed under three filter conditions: no filter, diffraction filter, and diffusion filter. Participants were videotaped during the tasks using a camera mounted above the VDT workstation. The videotape was analyzed and horizontal head deviation was measured at 50 randomly selected points during each task. Horizontal head deviation was measured as the angle between an absolute reference line, which bisects the center of the VDT screen, and a reference point located at the center of the participant's head. The standard deviation of head deviation was evaluated across filter type and task type. Accuracy- and/or time-based measures were used to evaluate performance within each task. Participants used a seven-point scale to rate the following: readability, image quality, brightness, glare, posture restriction, performance, and discomfort. The results indicated that the interaction between task and filter type affected the standard deviation of horizontal head deviation (a measure of the average range of horizontal deviation).
The standard deviation of horizontal deviation was significantly larger within the Web browsing task under the no filter and diffusion filter conditions as compared to the diffraction filter condition. Filter type affected subjective ratings of the following: readability, image quality, brightness, posture restriction, and discomfort. The diffraction filter resulted in lower readability, image quality, and brightness ratings than the diffusion and no filter conditions. Participants reported that the ability to change postures was significantly decreased by the use of the diffraction filter as compared to the no filter and diffusion filter conditions. The diffraction filter resulted in an increase in reported discomfort as compared to the no filter condition. The interaction between filter and task type affected subjective ratings of performance. Participants reported a decrease in the rating of perceived performance under the diffraction filter / Web browsing condition as compared to the no filter / word processing, diffusion filter / Web browsing, and diffusion filter / data entry conditions. A decrease in the rating of perceived performance was reported in the diffraction filter / data entry condition as compared to the no filter / word processing and diffusion filter / Web browsing conditions. Neither the diffraction nor the diffusion filter affected performance within any of the tasks, based on the objective performance measures used in the experiment.
Master of Science
APA, Harvard, Vancouver, ISO, and other citation styles
11

Kuhn, Jason William. “Measurement and Analysis of Wavefront Deviations and Distortions by Freeform Optical See-through Head Mounted Displays”. Thesis, The University of Arizona, 2016. http://hdl.handle.net/10150/613396.

Full text of the source
Abstract:
A head-mounted display with an optical combiner may introduce a significant amount of distortion to the real-world scene. The ability to accurately model the effects of both 2-dimensional and 3-dimensional distortion introduced by thick optical elements has many uses in the development of head-mounted display systems and applications. For instance, the computer rendering system must be able to accurately model this distortion and provide accurate compensation in the virtual path in order to provide a seamless overlay between the virtual and real-world scenes. In this paper, we present a ray tracing method that determines the ray shifts and deviations introduced by a thick optical element, giving us the ability to generate correct computational models for rendering a virtual object in 3D space with the appropriate amount of distortion. We also demonstrate how a Hartmann wavefront sensor approach can be used to evaluate the manufacturing errors in a freeform optical element to better predict wavefront distortion. A classic Hartmann mask is used as an inexpensive and easily manufacturable solution for accurate wavefront measurements. The paper further suggests two techniques for improving the slope measurement accuracy and resolution: scanning the Hartmann mask laterally to obtain dense sampling, and increasing the view-screen distance to the testing aperture. The paper quantifies the improvements of these techniques on measuring both the high- and low-sloped wavefronts often seen in freeform optical see-through head-mounted displays. By comparing the measured wavefront to theoretical wavefronts constructed with ray tracing software, we determine the sources of error within the freeform prism. We also present a testing setup capable of measuring off-axis viewing angles to replicate how the system would perform when worn by its user.
APA, Harvard, Vancouver, ISO, and other citation styles
12

曾達誠 and Tat-shing Tsang. “Statistical inference on the coefficient of variation”. Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2000. http://hub.hku.hk/bib/B31223503.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
13

Hall, Bentley (Tyler), Hayward (Trey) Hargrove, and James (Marshall) Willis. “Analysis of Current Flight Scheduling Practices and Recommendations to Efficiently Reduce Deviations from Syllabus Time-To-Train”. Thesis, Monterey, California. Naval Postgraduate School, 2011. http://hdl.handle.net/10945/7069.

Full text of the source
Abstract:
EMBA Project Report
EXECUTIVE SUMMARY: The objective of our project is to investigate current scheduling requirements, constraints, and procedures to identify problems with scheduling practices and syllabus management for Primary Flight Training in Training Wing 4. We analyzed three alternative scheduling approaches to reduce excess training time in the most efficient manner.

Alternative 1: Prioritize students based on deviations from syllabus flow. Changing the prioritization of students does not have a direct impact on reducing the Training Timeline, since no additional production capacity is being added. However, giving the highest scheduling priority to the students who are furthest behind should reduce gaps in training and increase proficiency, thereby reducing failures and the warm-up flights required after time out of the cockpit. This will reduce time-to-train (TTT) and additional overhead flights. The Training Timeline function of TIMS provides information on deviations from syllabus-designed TTT for use in scheduling prioritization.

Alternative 2: Utilize aircraft availability in schedule builds. Like instructors and students, aircraft are required to complete a flight event and should be managed accordingly. Schedule writers can use current metrics of aircraft availability and make reasonable assumptions about the longevity of the information to predict follow-on production capacity. Events scheduled without considering aircraft availability should be presumed unlikely until availability is confirmed.

Alternative 3: Monitor completer production / TTT deficits to trigger increased production. When necessary, increased production can be gained through very limited means without introducing further scheduling constraints.
Schedule writers must monitor when excess capacity is required and consider what can be gained at what cost; options can be prioritized based on a reasonable ordering (by relative costs, both monetary and follow-on production-loss risk) of the available options: Saturday operations, mandatory prepositions, forced cross-countries, or recommending a detachment. We recommend that TIMS Training Timeline function permissions be made available to schedule-writing personnel for the operational database. Training needs to be provided to all TRAWING 4 schedule writers by the TIMS help desk to ensure utilization and integration of the Training Timeline. Scheduling in this manner will help ensure that extra syllabus flight requirements and time out of the cockpit are minimized. Scheduling templates based on aircraft availability will ensure events are planned to the maximum capacity of the system. We recommend schedule writers monitor Daily Status Reports and build follow-on schedules based on predicted asset availability. This will help avoid unnecessary use of other variables that could contribute to rippling production limitations. When it is mandatory to fly outside normal weekday field hours, having the field open for mandatory Saturday operations is the best alternative for reducing the student deficit depicted on the Training Timeline. Simultaneously, squadrons can use prepositions and cross-countries to manage their own in-house training deficits as they see fit.
APA, Harvard, Vancouver, ISO, and other citation styles
14

Lantz, Robin. “Time series monitoring and prediction of data deviations in a manufacturing industry”. Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-100181.

Full text of the source
Abstract:
An automated manufacturing industry makes use of many interacting moving parts and sensors. These sensors generate complex multidimensional data in the production environment, which is difficult to interpret and in which patterns are hard to find. This project provides tools for gaining a deeper understanding of the production data of Swedsafe, a company in the automated manufacturing business, and will show the potential of that multidimensional data. The project mainly consists of predicting deviations from predefined threshold values in Swedsafe's production data. Machine learning is a good method for finding relationships in complex datasets, so supervised machine learning classification is used to predict deviations from threshold values in the data. An investigation is conducted to identify the classifier that performs best on Swedsafe's production data. The sliding-window technique is used for managing the time series data in this project. Apart from predicting deviations, the project also includes an implementation of live graphs for easily getting an overview of the production data. A steady production with stable process values is important, so being able to monitor and predict events in the production environment can provide the same benefit to other manufacturing companies and is therefore suitable not only for Swedsafe. The best-performing machine learning classifier tested in this project was the Random Forest classifier. The Multilayer Perceptron did not perform well on Swedsafe's data, but further investigation of recurrent neural networks using LSTM neurons would be recommended. During the project, a web-based application displaying the sensor data in live graphs was also developed.
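The sliding-window framing described above can be sketched as follows (a toy illustration with synthetic sensor data and an assumed threshold rule, not Swedsafe's actual pipeline):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def make_windows(series, labels, width):
    """Turn a 1-D sensor series into overlapping windows of `width`
    samples; window i is labeled with the flag of the sample that
    follows it, so the classifier learns to predict upcoming deviations."""
    X = np.lib.stride_tricks.sliding_window_view(series, width)[:-1]
    y = labels[width:]
    return X, y

# synthetic sensor trace: normal operation, then a drift above threshold
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(0.0, 0.3, 300), rng.normal(4.0, 0.3, 100)])
labels = (series > 2.0).astype(int)     # deviation = value beyond threshold

X, y = make_windows(series, labels, width=10)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
```

In a live setting, the most recent `width` samples would be fed to `clf.predict` each time a new reading arrives, flagging an approaching threshold violation before it is reached.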
APA, Harvard, Vancouver, ISO, and other citation styles
15

Nyqvist, Daniel. “Time management challenges of major refurbishment projects: An analysis of 20 hydropower outages at Fortum”. Thesis, Uppsala universitet, Industriell teknik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-255604.

Full text of the source
Abstract:
While most western hydropower sites are already developed or protected by legislation, the aging hydropower park requires refurbishment, especially to tackle the challenges of increased fluctuation on the grid caused by the expansion of other renewable energy sources such as wind power. The company Fortum carries out a number of major refurbishment projects every year and wants to enhance its time performance during the outage. Delayed projects result in unexpected costs and production losses. By investigating 20 historical refurbishment outages from a project manager perspective, the delays are related to different sections of the outage time. These sections are referred to as work packages, meaning a set of activities related to a functional part of the plant. The material is based on interviews and project documentation. The outages are divided into three groups depending on the amount of delay, and an additional set of factors is used in a comparison. The results are discussed from a time management and a multi-project perspective. The study can be viewed as an initial study to address time management challenges in a company, and the methodology proves to be an efficient way to bring a company's time management challenges to the surface. The results display late manufacturing deliveries and overruns of assembly and erection durations as the most common reasons for delay. A number of potential success/failure factors are suggested. It is also pointed out that small projects are at risk of being more delayed than larger ones.
APA, Harvard, Vancouver, ISO, and other citation styles
16

Giljum, Stefan, Hanspeter Wieland, Franz Stephan Lutter, Nina Eisenmenger, Heinz Schandl, and Anne Owen. “The impacts of data deviations between MRIO models on material footprints: A comparison of EXIOBASE, Eora, and ICIO”. Wiley, 2019. http://dx.doi.org/10.1111/jiec.12833.

Full text of the source
Abstract:
In various international policy processes such as the UN Sustainable Development Goals, an urgent demand for robust consumption-based indicators of material flows, or material footprints (MFs), has emerged over the past years. Yet MFs for national economies diverge when calculated with different Global Multiregional Input-Output (GMRIO) databases, constituting a significant barrier to a broad policy uptake of these indicators. The objective of this paper is to quantify the impact of data deviations between GMRIO databases on the resulting MF. We use two methods, structural decomposition analysis and structural production layer decomposition, and apply them for a pairwise assessment of three GMRIO databases, EXIOBASE, Eora, and the OECD Inter-Country Input-Output (ICIO) database, using an identical set of material extensions. Although all three GMRIO databases agree on the directionality of footprint results, that is, whether a country's final demand depends on net imports of raw materials from abroad or the country is a net exporter, they sometimes show significant differences in the level and composition of material flows. Decomposing the effects from the Leontief matrices (economic structures), we observe that a few sectors at the very first stages of the supply chain, that is, raw material extraction and basic processing, explain 60% of the total deviations stemming from the technology matrices. We conclude that further development of methods to align results from GMRIOs, in particular for material-intensive sectors and supply chains, should be an important research priority. This will be vital to strengthen the uptake of demand-based material flow indicators in the resource policy context.
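The consumption-based calculation underlying a material footprint can be written compactly as e · (I − A)⁻¹ · y; a minimal two-sector sketch (toy numbers, not values from EXIOBASE, Eora, or ICIO):

```python
import numpy as np

def material_footprint(A, extensions, final_demand):
    """A: (n, n) technical-coefficient matrix of the GMRIO system;
    `extensions`: material extraction per unit of sectoral output;
    `final_demand`: demand vector of the country of interest.
    Returns the material embodied in that demand (e.g. tonnes)."""
    n = A.shape[0]
    leontief_inverse = np.linalg.inv(np.eye(n) - A)
    total_output = leontief_inverse @ final_demand   # output induced by demand
    return float(extensions @ total_output)

# two-sector toy example: only sector 0 extracts material
A = np.array([[0.1, 0.2],
              [0.3, 0.1]])
e = np.array([2.0, 0.0])     # tonnes extracted per unit output
y = np.array([1.0, 1.0])     # final demand
mf = material_footprint(A, e, y)
```

Because the Leontief inverse propagates demand through all upstream production layers, differences in A between databases, especially in the extraction and basic-processing sectors the abstract highlights, translate directly into diverging footprints.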
APA, Harvard, Vancouver, ISO, and other citation styles
17

Kimball, Thomas L. “An analysis of similarities between developmentally delayed and non delayed preschool boys with attention deficit disorder in their differential responses to objective measures of vigilance and activity level”. The Ohio State University, 1987. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487584612166325.

Full text of the source
APA, Harvard, Vancouver, ISO, and other citation styles
18

Zemanová, Tereza. “Controlling ve firmě Llentab”. Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-113136.

Full text of the source
Abstract:
This diploma thesis aims to theoretically define the tools used by modern controlling and then apply them practically in a specific company in the Czech Republic. Both strategic and operational controlling instruments will be used to analyse the chosen company. The calculations and analyses will help the company to understand the environment in which it operates and what its current position is. Finally, I formulate opinions and recommendations on the procedures currently used in the company.
APA, Harvard, Vancouver, ISO, and other citation styles
19

Özaslan, Tan Hakan. “Computational analysis of expressivity in classical guitar performances”. Doctoral thesis, Universitat Pompeu Fabra, 2013. http://hdl.handle.net/10803/128877.

Full text of the source
Abstract:
The study of musical expressivity is an active field in sound and music computing. The research interest comes from different motivations: to understand or model musical expressivity; to identify the expressive resources that characterize an instrument, musical genre, or performer; or to build synthesis systems able to play expressively. To tackle this broad problem, researchers focus on specific instruments and/or musical styles. Hence, in this thesis we focused on the analysis of the expressivity in classical guitar and our aim is to model the use of expressive resources of the instrument. The foundations of all the methods used in this dissertation are based on techniques from the fields of information retrieval, machine learning, and signal processing. We combine several state of the art analysis algorithms in order to deal with modeling the use of the expressive resources. Classical guitar is an instrument characterized by the diversity of its timbral possibilities. Professional guitarists are able to convey a lot of nuances when playing a musical piece. This specific characteristic of classical guitar makes the expressive analysis is a challenging task. The research conducted focuses on two different issues related to musical expressivity. First, it proposes a tool able to automatically identify expressive resources such as legato, glissando, and vibrato, in commercial guitar recordings. Second, we conducted a comprehensive analysis of timing deviations in classical guitar. Timing variations are perhaps the most important ones: they are fundamental for expressive performance and a key ingredient for conferring a human-like quality to machine-based music renditions. However, the nature of such variations is still an open research question, with diverse theories that indicate a multi-dimensional phenomenon. Our system exploits feature extraction and machine learning techniques. 
Classification accuracies show that timing deviations are accurate predictors of the corresponding piece. To sum up, this dissertation contributes to the field of expressive analysis by providing, an automatic expressive articulation model and a musical piece prediction system by using timing deviations. Most importantly, it analyzes the behavior of proposed models by using commercial recordings.
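The piece-prediction idea described in this abstract, classifying a performance by its timing deviations, can be illustrated with a toy nearest-centroid classifier. The piece names, data, and the classifier itself are hypothetical stand-ins for the thesis's actual feature-extraction and machine-learning pipeline:

```python
# Hypothetical sketch: predict which piece a performance belongs to from its
# vector of per-note timing deviations, using a nearest-centroid classifier.
from math import dist  # Euclidean distance, Python 3.8+

def centroids(training):
    """training: {piece: [deviation vectors]} -> {piece: centroid vector}."""
    return {piece: [sum(col) / len(col) for col in zip(*vecs)]
            for piece, vecs in training.items()}

def predict(cents, vec):
    """Assign vec to the piece whose centroid is nearest."""
    return min(cents, key=lambda p: dist(cents[p], vec))

# Toy per-note timing deviations (seconds) for two pieces, two takes each.
train = {
    "study_in_b_minor": [[0.02, -0.01, 0.05], [0.03, -0.02, 0.04]],
    "lagrima":          [[-0.04, 0.06, -0.01], [-0.05, 0.05, 0.00]],
}
cents = centroids(train)
print(predict(cents, [0.02, -0.02, 0.05]))   # nearest to study_in_b_minor
```

The classification accuracies reported in the thesis come from a richer feature set; this sketch only shows why consistent timing-deviation patterns can identify a piece at all.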
APA, Harvard, Vancouver, ISO und andere Zitierweisen
20

Andersson, Albin. „Human Factors Analysis & Classifications System - Maintenance Extension applicerad på avvikelser vid underhållsarbete med JAS 39 Gripen“. Thesis, Luleå tekniska universitet, Institutionen för ekonomi, teknik och samhälle, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-67346.

Der volle Inhalt der Quelle
Annotation:
This report aims to improve and develop the Swedish Air Force's methods for categorizing and analysing aircraft maintenance deviations. Safety is the most important aspect of aviation, and human factors are usually a contributing element when safety is jeopardized. Human factors are directly related to aviation accidents, and maintenance work therefore plays a major role in aviation safety. Studies show that 70-80% of accidents are caused by human error at the maintenance level, which makes it of utmost importance to investigate the factors behind deviations. The aim of the work is to understand and explain human errors. An analysis of 202 deviation reports contributed by the Swedish Defense Forces department F21 was made. The model "Human Factors Analysis & Classification System - Maintenance Extension" (HFACS-ME) was used to categorize the deviations. The results show that the HFACS-ME categories were too general to categorize the deviations accurately and efficiently. Therefore, the author added new external categories involving suppliers and a new type of documentation shortage. The new model is called HFACS-SAFE. The report also suggests that, through systematic work with models and the distribution of information on erroneous actions, the number of deviations may hopefully be reduced.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
21

Decloedt, Andre. „Seeking common deviations from South Africa’s tax treaty policy: a comparative analysis identifying trends (regional or otherwise) in treaty practice in bi-lateral tax treaties with countries in Asia, Australasia, North America and South America“. Master's thesis, Faculty of Commerce, 2018. http://hdl.handle.net/11427/31554.

Der volle Inhalt der Quelle
Annotation:
South Africa has experienced unprecedented growth in its tax treaty network since 1994 as a result of an increase in global trade. In concluding these bi-lateral tax treaties with other countries, South Africa depends primarily on its national model policy during its negotiations with other contracting states. The country’s national tax treaty policy was previously defined in one document, the publication of which has since been discontinued. Apart from Professor C West’s contribution to the global tax community, there is little research information available on the current tax treaty policy of South Africa. It is submitted that the OECD Model and the positions recorded in its commentaries are now widely accepted as the national tax treaty policy of South Africa. The findings of the comparative analysis between the previously documented tax treaty policy and this new widely accepted position suggested that the OECD Model and its recorded positions in the commentaries, subject to a few exceptions, are a fair reflection of South Africa’s national tax treaty policy. It is submitted that South Africa accepted common deviations from its national tax treaty policy when negotiating bi-lateral treaties with countries in the Americas, Asia and Australasia. Previous research failed to provide guidance in this respect, and in an attempt to seek common deviations from South Africa’s national tax treaty policy, a comparative analysis was conducted to identify trends (whether regional or otherwise) in tax treaties with a sample of countries in Asia, Australasia, North America and South America. The findings of this comparative analysis indicated that South Africa applied its national tax treaty policy successfully to a large extent, but does accept common deviations from the policy.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
22

Rademeyer, Anerie. „The development of a root cause analysis process for variations in human performance“. Thesis, Pretoria : [s.n.], 2009. http://upetd.up.ac.za/thesis/available/etd-04012009-231223/.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
23

Novotná, Marta. „Využití controllingu v podniku“. Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2009. http://www.nusl.cz/ntk/nusl-222378.

Der volle Inhalt der Quelle
Annotation:
The diploma thesis deals with the analysis of the current status of controlling in the company Skynet, a.s. and suggests improvements by using controlling as a subsystem of the management system. Financial analysis, a crucial tool for decision making in corporate management, is used to evaluate the company's history and current state. In the first part of the financial analysis, the company's economic status is evaluated using the analysis of relative data, while in the second part it is evaluated using aggregate data. In the following chapter, the internal economy of the company is discussed, including calculations, the optimization of deviations and the consequences of including controlling in the management process. It is concluded that if controlling is used in the management process, the competitive edge grows thanks to improved adaptability to the business environment and more flexible strategic management.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
24

Rodrigues, Roberto Wagner da Silva. „Deviation analysis of inter-organisational workflow systems“. Thesis, Imperial College London, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.271151.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
25

Špicarová, Kateřina. „Využití controllingu v podniku“. Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2021. http://www.nusl.cz/ntk/nusl-442964.

Der volle Inhalt der Quelle
Annotation:
The master's thesis focuses on the application of a management control system in a company, especially on planning and the analysis of deviations. The theoretical part of the thesis describes the basic characteristics and terms, which are then used throughout the analytical part. In the analytical part, Honeywell s.r.o. is described and its current situation and cost-planning methods are analyzed. Deviations are then analyzed and interpreted based on actual data in MS Excel. The proposal part suggests actions to improve the current situation, in the form of introducing a new program, MS Power BI or Tableau, into the controlling team.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
26

Gumbo, Victor. „Mean absolute deviation skewness model with transactions costs“. Pretoria : [s.n.], 2005. http://upetd.up.ac.za/thesis/available/etd-09052005-115438.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
27

Hrůzová, Lucie. „Analýza a optimalizace nákladového plánu ve vybrané společnosti“. Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2020. http://www.nusl.cz/ntk/nusl-417394.

Der volle Inhalt der Quelle
Annotation:
This diploma thesis deals with the analysis of cost planning in the company Gebauer a Griller Kabeltechnik, spol. s r.o. for the third quarter of a chosen cost centre. All calculations are described, and the final cost plan is compared with actual data. On this basis, recommendations are proposed for optimizing cost planning and eliminating deviations.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
28

RIVERO, CÁMARA FRANCISCO JOSÉ. „POWER DEVIATION ANALYSIS OF THE ROCKNEBY WIND FARM“. Thesis, Högskolan i Halmstad, Energivetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-28453.

Der volle Inhalt der Quelle
Annotation:
Nowadays, globalization and the economic expansion of the emerging countries demand an increasing amount of energy. Therefore, energy production, as well as the efficiency of energy usage, is essential for the future development of societies. Renewable energies appear as a turnkey solution that could support the growing demand while not being harmful to the environment [1]. Among the types of renewable energy, wind energy can be considered one with large potential. In this paper I present the study of a Swedish wind farm located in Rockneby. Once the wind turbines were installed and working correctly, a discrepancy between the real energy obtained and the theoretical energy indicated by the manufacturer was detected. The data stored in the SCADA system were compared with the values provided by the manufacturer, and several analyses were performed. Initially, an anomaly in the power residual deviation was detected: it showed unusual behaviour at high wind speeds. The variation of the air density in the wind farm at hub height was considered as a possible reason for the disagreement observed in the power parameters, since the reference density used by the manufacturer was a constant value calculated in a laboratory environment. However, this idea was rejected because the power generated under both conditions is similar. The pitch angle was analysed after detecting significant variations in the wind speed measurements made by the anemometer on turbine number three. As a result, a pitch variation was found in the turbine, which seems to be due to a failure in the anemometer. Finally, the turbulence was analysed, leading to the conclusion that the turbulence intensity was around 20%. Therefore, I mainly suggest, as a possible explanation, the influence of turbulence accompanied by bad calibration or a failure in the anemometers.
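The power-residual check described in this abstract can be sketched as an interpolation of the manufacturer's power curve at each measured wind speed, compared against the SCADA output. This is a minimal illustration; the curve points and readings below are hypothetical, not the Rockneby data:

```python
# Hypothetical power-residual sketch: expected power from a datasheet-style
# power curve (linear interpolation) versus a measured SCADA value.
from bisect import bisect_left

# (wind speed m/s, power kW) points of a hypothetical manufacturer curve.
CURVE = [(3, 25), (5, 175), (7, 500), (9, 1000), (11, 1600), (13, 2000)]

def expected_power(v):
    """Linearly interpolate the power curve at wind speed v."""
    speeds = [s for s, _ in CURVE]
    i = min(max(bisect_left(speeds, v), 1), len(CURVE) - 1)
    (s0, p0), (s1, p1) = CURVE[i - 1], CURVE[i]
    return p0 + (p1 - p0) * (v - s0) / (s1 - s0)

def residual(v, measured_kw):
    """Measured minus expected power; large values hint at sensor issues."""
    return measured_kw - expected_power(v)

print(residual(8.0, 700.0))   # expected 750 kW at 8 m/s, so residual -50.0
```

In practice the reference curve is valid only for the manufacturer's reference air density, which is exactly why the abstract considers hub-height density as a candidate explanation before ruling it out.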
APA, Harvard, Vancouver, ISO und andere Zitierweisen
29

Samara, Myrto [Verfasser], Stefan M. [Akademischer Betreuer] [Gutachter] Leucht und Johann [Gutachter] Förstl. „Meta-analysis in schizophrenia trials: comparison of chlorpromazine versus every other antipsychotic drug for schizophrenia and assessment of an imputation technique for estimating response rates from means and standard deviations in schizophrenia / Myrto Samara ; Gutachter: Stefan M. Leucht, Johann Förstl ; Betreuer: Stefan M. Leucht“. München : Universitätsbibliothek der TU München, 2017. http://d-nb.info/1143826086/34.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
30

Li, Guodong. „On some nonlinear time series models and the least absolute deviation estimation“. Click to view the E-thesis via HKUTO, 2007. http://sunzi.lib.hku.hk/hkuto/record/B3878239X.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
31

Li, Guodong, und 李國棟. „On some nonlinear time series models and the least absolute deviation estimation“. Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2007. http://hub.hku.hk/bib/B3878239X.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
32

Tan, Vincent Yan Fu. „Large-deviation analysis and applications of learning tree-structured graphical models“. Thesis, Massachusetts Institute of Technology, 2011. http://hdl.handle.net/1721.1/64486.

Der volle Inhalt der Quelle
Annotation:
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2011.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Cataloged from student submitted PDF version of thesis.
Includes bibliographical references (p. 213-228).
The design and analysis of complexity-reduced representations for multivariate data is important in many scientific and engineering domains. This thesis explores such representations from two different perspectives: deriving and analyzing performance measures for learning tree-structured graphical models and salient feature subset selection for discrimination. Graphical models have proven to be a flexible class of probabilistic models for approximating high-dimensional data. Learning the structure of such models from data is an important generic task. It is known that if the data are drawn from tree-structured distributions, then the algorithm of Chow and Liu (1968) provides an efficient way of finding the tree that maximizes the likelihood of the data. We leverage this algorithm and the theory of large deviations to derive the error exponent of structure learning for discrete and Gaussian graphical models. We determine the extremal tree structures for learning, that is, the structures that lead to the highest and lowest exponents. We prove that the star minimizes the exponent and the chain maximizes the exponent, which means that among all unlabeled trees, the star and the chain are the worst and best for learning respectively. The analysis is also extended to learning forest-structured graphical models by augmenting the Chow-Liu algorithm with a thresholding procedure. We prove scaling laws on the number of samples and the number of variables for structure learning to remain consistent in high dimensions. The next part of the thesis is concerned with discrimination. We design computationally efficient tree-based algorithms to learn pairs of distributions that are specifically adapted to the task of discrimination and show that they perform well on various datasets vis-à-vis existing tree-based algorithms.
We define the notion of a salient set for discrimination using information-theoretic quantities and derive scaling laws on the number of samples so that the salient set can be recovered asymptotically.
by Vincent Yan Fu Tan.
Ph.D.
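The Chow-Liu step summarized in this abstract reduces to a maximum-weight spanning tree over pairwise empirical mutual information. A toy sketch of that reduction (the data and function names are hypothetical, and this is an illustration of the classical algorithm, not the thesis code):

```python
# Minimal Chow-Liu sketch: empirical pairwise mutual information, then a
# maximum-weight spanning tree via Kruskal's algorithm with union-find.
from collections import Counter
from itertools import combinations
from math import log

def mutual_information(xs, ys):
    """Empirical mutual information (nats) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def chow_liu_tree(samples):
    """samples: list of equal-length tuples; returns tree edges (i, j)."""
    d = len(samples[0])
    cols = list(zip(*samples))
    edges = sorted(((mutual_information(cols[i], cols[j]), i, j)
                    for i, j in combinations(range(d), 2)), reverse=True)
    parent = list(range(d))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    tree = []
    for _, i, j in edges:          # greedily keep the heaviest edges
        ri, rj = find(i), find(j)  # that do not close a cycle
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j))
    return tree

# Toy data: X0 determines X1, while X2 is an independent coin flip.
data = [(0, 0, 0), (0, 0, 1), (1, 1, 0), (1, 1, 1)] * 25
print(chow_liu_tree(data))   # the informative edge (0, 1) is always kept
```

The thesis's error-exponent analysis asks how quickly this estimated tree concentrates on the true one as the sample count grows; the sketch only shows the estimator itself.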
APA, Harvard, Vancouver, ISO und andere Zitierweisen
33

Pinto, Felipe Alves Pereira. „An automated approach for performance deviation analysis of evolving software systems“. Universidade Federal do Rio Grande do Norte, 2015. http://repositorio.ufrn.br/handle/123456789/21132.

Der volle Inhalt der Quelle
Annotation:
The maintenance and evolution of software systems have become a critical task over the last years due to the diversity and high demand of features, devices and users. The ability to understand and analyze how newly introduced changes impact the quality attributes of the architecture of those software systems is an essential prerequisite for avoiding the deterioration of their engineering quality during their evolution. This thesis proposes an automated approach for the deviation analysis of the quality attribute of performance in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and mining software repository techniques to provide an automated way to reveal potential sources - commits and issues - of performance deviation in scenarios of an evolving software system. The approach defines four phases: (i) preparation - choosing the scenarios and preparing the target releases; (ii) dynamic analysis - determining the performance of scenarios and methods by calculating their execution time; (iii) deviation analysis - processing and comparing the results of the dynamic analysis for different releases; and (iv) repository mining - identifying development issues and commits associated with performance deviation. Several empirical studies have been developed to assess the approach from different perspectives. An initial study shows the feasibility of the approach to support traceability of quality attributes with static analysis. An exploratory study analyzed the usefulness and domain independence of the proposal in automatically identifying source code assets with performance deviation and the changes that have affected them during an evolution. This study was performed using three systems: (i) SIGAA - a web academic management system; (ii) ArgoUML - a UML modeling tool; and (iii) Netty - a network application framework.
A third study performed an evolutionary analysis of applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. It analyzed twenty-one releases (seven releases of each system) and addressed a total of 57 scenarios. Overall, we found 14 scenarios with significant performance deviation for Netty, 13 for Wicket, and 9 for Jetty. In addition, the feedback obtained from an online survey with eight developers of Netty, Wicket and Jetty is also discussed. Finally, in our last study, we built a performance regression model in order to indicate the properties of code changes that are more likely to cause performance degradation. We mined a total of 997 commits, of which 103 were retrieved from degraded code assets and 19 from optimized ones, while 875 had no impact on execution time. The number of days before release and the day of the week were the most relevant variables of commits that cause performance degradation in our model. The receiver operating characteristic (ROC) area of our regression model is 60%, which means that deciding whether a commit will cause performance degradation by using the model is 10% better than a random guess.
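The deviation-analysis phase described above compares execution times of the same scenario across two releases. A minimal sketch of one way to make that comparison, using a Welch t-statistic over timing samples (the threshold and data are illustrative assumptions, not the thesis's statistical procedure):

```python
# Hypothetical deviation check: flag a scenario whose mean execution time
# changed significantly between two releases (Welch t-statistic, stdlib only).
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

def has_deviation(old_times, new_times, t_crit=2.0):
    """True if the timing difference exceeds the (illustrative) threshold."""
    return abs(welch_t(old_times, new_times)) > t_crit

# Hypothetical timings (ms) for one scenario in releases N and N+1.
old = [101, 99, 100, 102, 98, 100]
new = [115, 117, 114, 116, 118, 115]
print(has_deviation(old, new))   # True: release N+1 regressed by ~15 ms
```

Scenarios flagged this way would then feed the repository-mining phase, which links the regression to the commits and issues landed between the two releases.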
APA, Harvard, Vancouver, ISO und andere Zitierweisen
34

Kawnine, Tanzim. „A Radial-Ulnar Deviation and Wrist-Finger Flexion Analysis Based on Electromyography“. Thesis, Mälardalen University, Department of Computer Science and Electronics, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-7329.

Der volle Inhalt der Quelle
Annotation:

This study aims to record the electromyographic signals of the forearm using Ag/AgCl electrodes. The signals from the four major forearm muscles, which provide the bioelectrical currents, have been displayed and analysed to distinguish the different activities. In order to record the signals, an EMG device was developed and installed, and a schematic is also presented in this paper.

APA, Harvard, Vancouver, ISO und andere Zitierweisen
35

Paz, Alvarez Alfonso. „Deviation occurrence analysis in a human intensive production environment by using MES data“. Thesis, KTH, Industriell produktion, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-230674.

Der volle Inhalt der Quelle
Annotation:
Despite decades of automation initiatives, manual assembly still represents one of the most cost-effective approaches in scenarios with high product variety and complex geometry. It represents 50% of total production time and 20% of total production cost. Understanding human performance and its impact on the assembly line is key to improving the overall performance of an assembly line. In this thesis work, by studying the deviations occurring in the line, the aim is to understand how human workers are affected by certain functioning aspects of the assembly line. To do so, three different influence factors were chosen and their impact on human performance observed: i. how past events occurring in the line affect the current action of the worker; ii. how scheduled stops affect the current action of the worker; iii. how the theoretical cycle time affects the performance of the worker. To observe these influence relationships, data gathered on the shop floor from SCANIA's Manufacturing Execution System (MES) was used. By applying methods of Knowledge Discovery in Databases (KDD), the data was indexed and then analyzed, providing the necessary results for the study. Finally, from the results shown, it can be inferred that variability in the functioning of the line does have an impact on human performance overall. However, due to the complexity of the manufacturing system, the impact on human performance might not be as regular as initially thought.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
36

Škrnová, Lucie. „Controlling ve firmě Kunststoff-Frohlich Czech Plast, s.r.o“. Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-114288.

Der volle Inhalt der Quelle
Annotation:
The aim of this graduation thesis is to compare primary accounting with its present status in the Kunststoff-Frohlich Czech Plast company. A further aim is to evaluate the current situation in controlling, with a view to financial and cost controlling. On the basis of this evaluation, possible changes in the sense of expanding or completing the controlling activities will be proposed in order to increase the company's efficiency.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
37

Dobeš, Radim. „Řešení pro odchylkovou analýzu nákladů ve výrobní společnosti“. Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2021. http://www.nusl.cz/ntk/nusl-444622.

Der volle Inhalt der Quelle
Annotation:
At the very beginning of the diploma thesis, we introduce the reader to the issues of BI and controlling of manufacturing companies. Subsequently, we perform an analysis and evaluation of the current state of the selected manufacturing company in terms of variations in production. Then we use MSSQL server and SSAS to create a controlling model. The company will be able to unambiguously and quickly identify weaknesses in production and quickly eliminate them. Finally, we evaluate the real benefits of this project for the company.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
38

Cook, Guy William Davidson. „A theory of discourse deviation : the application of schema theory to the analysis of literary discourse“. Thesis, University of Leeds, 1990. http://etheses.whiterose.ac.uk/12996/.

Der volle Inhalt der Quelle
Annotation:
Schema theory suggests that people understand texts and experiences by comparing them with stereotypical mental representations of similar cases. This thesis examines the relevance of this theory (as developed in some Artificial Intelligence (AI) work of the 1970s and 1980s) to literary theory and the analysis of literary texts. The general theoretical framework is that of discourse analysis. In this approach, the usefulness of schema theory is already widely acknowledged for the contribution it can make to an explanation of 'coherence': the quality of meaningfulness and unity perceived in discourse. Building upon this framework, relevant AI work on text processing is discussed, evaluated, and applied to literary and non-literary discourse. The argument then moves on to literary theory, and in particular to the 'scientific' tradition of formalism, structuralism and Jakobsonian stylistics. The central concept of this tradition is 'defamiliarization': the refreshing of experience through deviation from expectation. In structuralism, attention has been concentrated on text structure, and in Jakobsonian stylistics on language. It is argued that whereas AI work on text pays little attention to linguistic and textual form, seeking to 'translate' texts into a neutral representation of 'content', the literary theories referred to above have erred in the opposite direction, and concentrated exclusively on form. Through contrastive analyses of literary and non-literary discourse, it is suggested that neither approach is capable of accounting for 'literariness' on its own. The two approaches are, however, complementary, and each would benefit from the insights of the other. Human beings need to change and refresh their schematic representations of the world, texts and language.
It is suggested that such changes to schemata are effected through linguistic and textual deviation from expectation, but that deviations at these levels are no guarantee of change (as is often the case in advertisements). Discourses which do effect changes through text and language are described as displaying 'discourse deviation'. Their primary function and value may be this effect. Discourse categorized as 'literary' is frequently of this type. Discourse deviation is best described by a combination of the methods of AI text analysis with formalist, structuralist and Jakobsonian literary theories. In illustration of these proposals, the thesis concludes with analyses of three well-known literary texts.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
39

Maslova, Maria. „Calibration of parameters for the Heston model in the high volatility period of market“. Thesis, Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-2206.

Der volle Inhalt der Quelle
Annotation:

The main idea of our work is the calibration of parameters for the Heston stochastic volatility model. We carry out this procedure using the OMXS30 index from the NASDAQ OMX Nordic Exchange Market. We separate our data into a stable period and a high-volatility period on this Nordic market. The deviation detection problem is solved using Bayesian analysis of change-points. We estimate the parameters of the Heston model for each of the periods and draw some conclusions.
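The thesis uses a Bayesian change-point analysis to split the data into stable and high-volatility periods. As a rough, non-Bayesian illustration of the underlying idea, a single change-point in return variance can be located by maximum likelihood on synthetic data (the numbers below are invented, not OMXS30 returns):

```python
import math

def variance_changepoint(returns):
    """Single change-point in variance by maximum likelihood: try every
    split of the zero-mean return series into [0:k) and [k:n) and keep
    the k minimising the summed Gaussian negative log-likelihood."""
    n = len(returns)
    def nll(seg):
        v = sum(r * r for r in seg) / len(seg)  # ML variance estimate
        return 0.5 * len(seg) * (math.log(2 * math.pi * v) + 1)
    best_k, best = None, float("inf")
    for k in range(2, n - 1):
        score = nll(returns[:k]) + nll(returns[k:])
        if score < best:
            best_k, best = k, score
    return best_k

# Synthetic log-returns: a calm regime followed by a high-volatility regime
calm = [0.001 * (-1) ** i for i in range(50)]
wild = [0.02 * (-1) ** i for i in range(50)]
split = variance_changepoint(calm + wild)
```

A Bayesian treatment would instead place a prior over the change-point location and report its full posterior rather than a single maximum-likelihood split.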

APA, Harvard, Vancouver, ISO und andere Zitierweisen
40

Huang, Yu-Fen. „Connecting orchestral conductors' interpretational intentions to conducting movement kinematics : a mixed-methods approach using Deviation Point Analysis“. Thesis, University of Edinburgh, 2018. http://hdl.handle.net/1842/31059.

Der volle Inhalt der Quelle
Annotation:
During orchestral performance, conductors play a role in which they provide their interpretations of the musical composition, communicating these interpretational intentions via their body movement. Pedagogical sources propose movement emblems for stock actions by which a conductor may deliver compositional and interpretational features in conducting practice. This thesis reports a mixed-methods study which provides empirical observations on the kinematic features evident in conducting practice, and which aims to explore the connection between such movements and compositional features and conductors' interpretative intentions. Six conductors' interpretational intentions were collected in interviews, and their conducting movements were recorded using a Qualisys motion capture system, while they worked on excerpts of repertoire by Mozart, Dvořák, and Bartók with a small string ensemble. In the interviews, conductors reported their general thoughts and beliefs about conducting. They were also prompted to identify the compositional events which they sought to highlight in their conducting, and to describe the conducting strategies they intended to use to highlight these musical events. The resulting qualitative data were thematically analysed. The conductor-identified compositional features were also used to guide kinematic investigations, using an innovative analysis method original to this project, Deviation Point Analysis (DPA). Conductors' movements are described using four dependent variables of the baton tip (movement distance, speed, acceleration, and jerk). Results are reported for two-way repeated-measures ANOVAs (repertoire x trial), and for t-tests revealing significant differences between cross-correlation coefficients for within-conductor trial pairs and between-conductor trial pairs. Further examination of the data using DPA serves to distinguish time-points with observable kinematic deviations from the conducting trials.
These kinematic deviations were compared with the conductors' stated intentions. Prominent clusters of kinematic deviations were seen to be associated with key musical events at which conductors intended to emphasize temporal, melodic, dynamic, and instrumental aspects. Minor clusters of kinematic deviations were seen to be connected with interpretational intentions in a less stable manner, some occurring remotely from the conductor-identified locations. The DPA method and findings are fully reported. The implications, advantages and limitations of this novel analysis approach are also discussed.
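Three of the four baton-tip variables named above (speed, acceleration, jerk) can be approximated from motion-capture samples by successive finite differences. A minimal 1-D sketch follows; the thesis's actual DPA implementation is not reproduced here:

```python
def kinematics(pos, dt):
    """Finite-difference speed, acceleration and jerk for a sampled
    1-D baton-tip trajectory (position in metres, dt in seconds)."""
    diff = lambda x: [(b - a) / dt for a, b in zip(x, x[1:])]
    vel = diff(pos)
    acc = diff(vel)
    jerk = diff(acc)
    return [abs(v) for v in vel], acc, jerk

# Sanity check: constant-velocity motion has zero acceleration and jerk
speed, acc, jerk = kinematics([0.0, 0.1, 0.2, 0.3, 0.4, 0.5], dt=0.1)
```

With real 3-D Qualisys data each difference would be taken per axis and the vector norm used, and the signal would typically be low-pass filtered first, since differentiation amplifies marker noise.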
APA, Harvard, Vancouver, ISO und andere Zitierweisen
41

Andersch, Adrienn. „Lean Implementation and the Role of Lean Accounting in the Transportation Equipment Manufacturing Industry“. Diss., Virginia Tech, 2014. http://hdl.handle.net/10919/50850.

Der volle Inhalt der Quelle
Annotation:
Implementing Lean in the United States transportation equipment manufacturing industry holds the promise of improvements in, among other things, productivity, quality, and innovation, resulting in more competitive success and profits. Although Lean has been applied throughout the industry with noted success, there have been some difficulties in demonstrating the financial benefits derived from Lean initiatives. Most of the evidence supporting a positive relationship between Lean implementation and improved financial performance is anecdotal. As companies have become more proficient in carrying out Lean initiatives in manufacturing, they have extended Lean ideas to other parts of their organization and throughout the entire supply chain. Nowadays, it is widely recognized that a holistic, enterprise-wide view is critical to obtaining the potential benefits of a Lean transformation. However, Lean transformations are often undertaken without consideration of supporting functions such as accounting and finance. Lean transformation in accounting and finance should be run in the same way as in the manufacturing environment, by decreasing reporting cycle time, improving transaction processing accuracy, eliminating unnecessary transaction processing, and changing product costing procedures and financial reporting, among many other things, but there is limited empirical evidence of that happening. To address these shortcomings, this research focuses on three areas. First, this study aims to evaluate transportation equipment manufacturing facilities with respect to their operational and financial performance. Second, this study aims to investigate the extent of Lean implementation of a given operation with respect to leadership, manufacturing, accounting and finance, and supplier and customer relationships, and to correlate these results with performance.
Finally, this study aims to further examine the contextual characteristics of companies that have successfully aligned their systems with Lean. A mixed-mode survey, addressed to a subset of the United States transportation equipment manufacturing industry, asked questions pertinent to companies' Lean transformation efforts, performance, and general characteristics. During the four-month survey period, a total of 69 valid responses were received, for a response rate of 3.78 percent. Of the 69 valid responses, 8 were eliminated because they contained more than 20 percent missing values. A multiple imputation procedure was applied to handle the remaining missing values in the dataset. Before testing the study hypotheses, scale reliability and construct validity tests were run to decide whether a particular survey item should be retained in further analysis. The study hypotheses were then tested using profile deviation analysis, multiple regression analysis, and hierarchical regression analysis. When the relationship between the level of Lean implementation and performance was investigated using multiple regression analysis, the results did not show any evidence that a higher level of Lean implementation along four business dimensions (leadership, manufacturing, accounting and finance, and supplier and customer relationship) of transportation equipment manufacturing facilities positively influences their operational and financial performance. However, it was revealed that a higher level of Lean implementation in transportation equipment manufacturing facilities' manufacturing dimension resulted in better quality performance as measured by first-time through, inbound quality, and outbound quality. When the same relationship was investigated using profile deviation analysis, the results were identical.
When the level of Lean implementation in accounting and finance and its relationship with performance was investigated using a single regression analysis, results showed that the higher level of Lean implementation in transportation equipment manufacturing facilities' accounting and finance dimension has a positive effect on accounting performance and on operational performance (e.g., on time-based performance and delivery-based performance), but no effect on financial performance. When the same relationship was investigated using a profile deviation analysis, results were different by showing no relationship between the level of Lean implementation in transportation equipment manufacturing facilities' accounting and finance dimension and accounting, operational, and financial performance. Lastly, the effect of contextual variables (e.g., industry segment, location, annual sales volume, and unionization) on performance, the level of Lean implementation, and the performance -- Lean implementation relationship was investigated using hierarchical regression. Results showed that transportation equipment manufacturing facilities' performance is influenced by annual sales volume. Their level of Lean implementation in the accounting and finance dimension is influenced by location, while their performance -- Lean implementation in the accounting and finance dimension relationship is influenced by industry segment.
Ph. D.
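Profile deviation analysis, one of the methods used above, scores each facility by its distance from a calibration profile of top performers; greater distance is hypothesised to predict weaker performance. A minimal sketch with entirely hypothetical scores on the study's four business dimensions:

```python
import math

def profile_deviation(firm, ideal):
    """Euclidean distance of a facility's Lean profile (leadership,
    manufacturing, accounting/finance, supplier/customer relationship)
    from the calibration profile of top performers."""
    return math.sqrt(sum((f - i) ** 2 for f, i in zip(firm, ideal)))

ideal = (4.5, 4.8, 4.2, 4.4)        # hypothetical top-performer means
firm = (3.5, 4.8, 3.2, 4.4)         # hypothetical facility scores
d = profile_deviation(firm, ideal)  # sqrt(1 + 0 + 1 + 0)
```

In the full method, these distances are then regressed against performance measures, with a negative coefficient supporting the fit-as-profile-deviation hypothesis.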
APA, Harvard, Vancouver, ISO und andere Zitierweisen
42

Pisztora, Ágoston. „Surface order large deviation behavior of the ising model in the phase transition regime : a Fortuin-Kasteleyn percolation analysis /“. [S.l.] : [s.n.], 1993. http://e-collection.ethbib.ethz.ch/show?type=diss&nr=10286.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
43

Zabel, Julia. „Deviations from chain ideality : are they detectable in simulations and neutron scattering of polyisobutylene ?“ Phd thesis, Université de Strasbourg, 2013. http://tel.archives-ouvertes.fr/tel-01064158.

Der volle Inhalt der Quelle
Annotation:
The Flory ideality hypothesis states that flexible polymer chains in a melt assume the shape of three-dimensional random walks, leading to so-called Gaussian coils. The basis of this hypothesis is that any local conformational information decays exponentially along the chain backbone and thus has no influence on the long-range conformation. Additionally, it is argued that the excluded-volume shielding by neighbor chains cancels out any swelling effects. Neutron scattering (NS) experiments dating back 30 years confirm the postulated Gaussian coil shape of polymers. This leads to a pillar of polymer theory: any flexible polymer can be described as a three-dimensional random walk. Advances in simulation techniques and computing power have opened the door to the possibility of studying very long chains. This allowed a closer look at the chain structure of polymer melts and revealed deviations from ideality. This deviation is very slight, and thus great care must be taken to distinguish it from noise. So far the deviation from the Gaussian coil structure has only been studied for coarse-grained models. The scope of this thesis is to explore whether these deviations are also measurable in atomistically realistic simulations and modern-day NS experiments.
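A standard observable for probing chain ideality is the mean-square internal distance ratio R²(s)/s between monomers s bonds apart: for an ideal (Gaussian) chain this ratio is flat in s, so systematic s-dependence signals non-ideality. A 1-D toy sketch on a genuinely ideal random walk, where the ratio should stay near 1:

```python
import random
from statistics import mean

def msid(chain, smax):
    """Mean-square internal distance ratio R^2(s)/s along a chain of
    monomer coordinates; flatness in s is the signature of ideality."""
    n = len(chain)
    return {s: mean((chain[i + s] - chain[i]) ** 2 for i in range(n - s)) / s
            for s in range(1, smax + 1)}

# An ideal 1-D random-walk "chain"
random.seed(1)
walk = [0]
for _ in range(20000):
    walk.append(walk[-1] + random.choice((-1, 1)))
ratios = msid(walk, smax=8)
```

The deviations discussed in the thesis are far subtler than this toy's statistical noise, which is precisely why distinguishing them from noise requires such care.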
APA, Harvard, Vancouver, ISO und andere Zitierweisen
44

Stärner, Nathalie. „Mass balance analysis of phosphorous in Motala Ström River Basin : A focus on lake Roxen and Glan“. Thesis, Linköpings universitet, Tema vatten i natur och samhälle, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-81971.

Der volle Inhalt der Quelle
Annotation:
Phosphorus (P) has been found to be the limiting nutrient in freshwater systems, directly affecting rates of planktonic growth. P circulation is very complex, and its pathways through lake systems are difficult to determine. Motala Ström is the biggest watercourse in the south-east of Sweden and an important source of P to the Baltic Sea. The aim of this study is to conduct a P mass balance analysis of the lakes Roxen and Glan over a period of time. The analysis also includes a quality control of the concentration data. The P concentration data used in this investigation were collected from the Motala Ström River Association, consisting of seasonal or monthly concentration data of Tot-P for the period 1960-2010. Daily water flow data used in this study were modelled by the Swedish Meteorological and Hydrological Institute (SMHI) using the S-HYPE model. P concentration deviations from monthly averages at each sampling station were calculated, followed by a seasonal Mann-Kendall trend analysis. At five out of eight sampling stations, negative trends were detected, indicating decreasing concentrations. The exceptions were the outflow from lake Glan, Stångån and Finspångsån. Linear interpolation of P concentration data was performed to create daily data for the period 1980-2010. Following interpolation, daily transport values were calculated and summed up to annual values. Lake Roxen has acted as a source of P during the whole period 1980-2010, except for one year. Lake Glan has acted as a source during 22 of the 31 years. There is a tendency for Glan to become more of a source over the years, which is in line with the deviation observations, but variation between years makes it necessary to analyse future data as well in order to establish any possible trend in P transports. Before the construction of wastewater treatment plants, the lakes were certainly sinks of phosphorus. But at least for Roxen, the switch from sink to source was completed before 1980.
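The transport step described above (interpolate sparse concentration samples to daily values, multiply by daily flow, sum to annual loads) can be sketched as follows; the day indices, concentrations and flows are hypothetical:

```python
def daily_load(conc_samples, flow):
    """Transport estimate: linearly interpolate sparse (day -> mg/m3)
    concentration samples onto a daily grid, multiply by daily flow
    (m3/day), and sum over the period."""
    days = sorted(conc_samples)
    def conc_at(d):
        if d <= days[0]:
            return conc_samples[days[0]]
        if d >= days[-1]:
            return conc_samples[days[-1]]
        for d0, d1 in zip(days, days[1:]):
            if d0 <= d <= d1:
                w = (d - d0) / (d1 - d0)
                return (1 - w) * conc_samples[d0] + w * conc_samples[d1]
    return sum(conc_at(d) * flow[d] for d in range(len(flow)))

conc = {0: 20.0, 2: 40.0}           # mg/m3, sampled on days 0 and 2
flow = [1000.0, 1000.0, 1000.0]     # m3/day
load = daily_load(conc, flow)       # day 1 interpolates to 30 mg/m3
```

A lake's annual mass balance then compares summed inflow loads against the summed outflow load: a positive outflow surplus marks the lake as a source, a deficit as a sink.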
APA, Harvard, Vancouver, ISO und andere Zitierweisen
45

Kim, Thanh Tùng. „Limited Feedback Information in Wireless Communications : Transmission Schemes and Performance Bounds“. Doctoral thesis, KTH, Kommunikationsteori, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-4737.

Der volle Inhalt der Quelle
Annotation:
This thesis studies some fundamental aspects of wireless systems with partial channel state information at the transmitter (CSIT), with a special emphasis on the high signal-to-noise ratio (SNR) regime. The first contribution is a study on multi-layer variable-rate communication systems with quantized feedback, where the expected rate is chosen as the performance measure. Iterative algorithms exploiting results in the literature of parallel broadcast channels are developed to design the system parameters. Necessary and sufficient conditions for single-layer coding to be optimal are derived. In contrast to the ergodic case, it is shown that a few bits of feedback information can improve the expected rate dramatically. The next part of the thesis is devoted to characterizing the tradeoff between diversity and multiplexing gains (D-M tradeoff) over slow fading channels with partial CSIT. In the multiple-input multiple-output (MIMO) case, we introduce the concept of minimum guaranteed multiplexing gain in the forward link and show that it influences the D-M tradeoff significantly. It is demonstrated that power control based on the feedback is instrumental in achieving the D-M tradeoff, and that rate adaptation is important in obtaining a high diversity gain even at high rates. Extending the D-M tradeoff analysis to decode-and-forward relay channels with quantized channel state feedback, we consider several different scenarios. In the relay-to-source feedback case, it is found that using just one bit of feedback to control the source transmit power is sufficient to achieve the multiantenna upper bound in a range of multiplexing gains. In the destination-to-source-and-relay feedback scenario, if the source-relay channel gain is unknown to the feedback quantizer at the destination, the diversity gain only grows linearly in the number of feedback levels, in sharp contrast to an exponential growth for MIMO channels. 
We also consider the achievable D-M tradeoff of a relay network with the compress-and-forward protocol when the relay is constrained to make use of standard source coding. Under a short-term power constraint at the relay, using source coding without side information results in a significant loss in terms of the D-M tradeoff. For a range of multiplexing gains, this loss can be fully compensated for by using power control at the relay. The final part of the thesis deals with the transmission of an analog Gaussian source over quasi-static fading channels with limited CSIT, taking the SNR exponent of the end-to-end average distortion as performance measure. Building upon results from the D-M tradeoff analysis, we develop novel upper bounds on the distortion exponents achieved with partial CSIT. We show that in order to achieve the optimal scaling, the CSIT feedback resolution must grow logarithmically with the bandwidth ratio for MIMO channels. The achievable distortion exponent of some hybrid schemes with heavily quantized feedback is also derived. As for the half-duplex fading relay channel, combining a simple feedback scheme with separate source and channel coding outperforms the best known no-feedback strategies even with only a few bits of feedback information.
QC 20100817
APA, Harvard, Vancouver, ISO und andere Zitierweisen
46

Ayyalasomayajula, Swarna Manjari. „Analysis of the South Korean Procedure for the Fuel Consumption and CO2 Emissions from Heavy Duty Vehicles. Sensitivity Analysis of the Fuel Consumption Deviation in Transient Cycles over Steady State Conditions“. Thesis, KTH, Maskinkonstruktion (Inst.), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-249230.

Der volle Inhalt der Quelle
Annotation:
To meet national energy consumption and greenhouse gas emission reduction targets based on environmental policy, the Ministry of Trade, Industry, and Energy of Korea formed a research consortium of government agencies and academic research institutions to establish the first fuel efficiency standards for heavy-duty commercial vehicles (HDV). The standards are expected to be introduced at the earliest in 2020 as Phase 1 of the plan. Research is also in progress to derive regulatory measures for CO2 emissions from heavy-duty vehicles. The test-driving cycle selected for comparison with current road situations is the Korean World Harmonized Vehicle Cycle (K-WHVC) for all heavy-duty vehicles. The Heavy duty vehicle Emission Simulator (HES) is used to simulate the fuel consumption and subsequent CO2 emissions for the specified HDV. The power demand can be too high for the HDV model at full payload, as the simulated velocity cannot always reach the demanded velocity. HES simulates the fuel consumption to within ±1.5% deviation in the transient part of the cycle; it overestimates the fuel consumption by up to 9% in the rest of the cycle. This report also studies the factors that affect fuel consumption during the transient cycle at engine level, and estimates fuel consumption under transient conditions at engine level. The deviations of the transient cycle (WHTC of the DC-13 164 engine) from the quasi-stationary values (interpolated steady-state values of the DC-13 164 engine), which are considered the transient characteristics of these parameters, are studied to identify the factors affecting fuel consumption. It is observed that the change in fuel flow varies inversely with the change in air-fuel ratio and directly with the change in boost pressure. The equations describing this behaviour of the air-fuel ratio with changing fuel flow are derived.
The model/equation results are compared with measurements under both steady-state conditions and a transient cycle (WHTC). It is observed that the percentage deviation of fuel consumption from transient to quasi-stationary flow for the DC-13 164 engine is 1.1 percent, whereas from transient to corrected fuel flow it is 0.4 percent.
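The cycle-level percentage deviation quoted above is a simple ratio of accumulated fuel masses. A sketch with hypothetical fuel-flow samples (not DC-13 164 measurements):

```python
def pct_deviation(transient, quasi_stationary):
    """Percentage deviation of accumulated transient fuel consumption
    from the quasi-stationary (steady-state-interpolated) flow over
    the same cycle, sampled at a common rate."""
    return 100.0 * (sum(transient) - sum(quasi_stationary)) / sum(quasi_stationary)

trans = [10.1, 12.3, 9.8, 11.0]    # hypothetical g/s samples over a cycle
quasi = [10.0, 12.0, 10.0, 10.8]   # matching quasi-stationary values
dev = pct_deviation(trans, quasi)
```

The "corrected" flow in the thesis replaces the quasi-stationary baseline with one adjusted via the derived air-fuel-ratio and boost-pressure equations, which is what shrinks the deviation from 1.1 to 0.4 percent.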
APA, Harvard, Vancouver, ISO und andere Zitierweisen
47

Alawneh, Tariq. „A critical analysis of the implied obligation against unjustified deviation : is the rule still relevant to the modern law on carriage of goods by sea?“ Thesis, University of Huddersfield, 2015. http://eprints.hud.ac.uk/id/eprint/26283/.

Der volle Inhalt der Quelle
Annotation:
The general area of this research is shipping law, more specifically the law governing the carriage of goods by sea. The research has been narrowed down to the implication of terms into contracts of affreightment, and further to the concept of deviation. The specific research question is whether or not the concept of deviation is still relevant to the law governing the carriage of goods by sea in the modern era. While this question has been posed before in the academic literature, it has never been discussed in sufficient depth. Through the literature review, the researcher was therefore able to identify gaps in the literature which the research has attempted to fill. The thesis on which the research is based is that the principle of deviation is a long-standing and very important rule of law which forms an integral part of the law and practice governing the carriage of goods by sea. However, a multi-jurisdictional review of both primary sources (i.e. conventions, statutes and cases) and secondary sources (academic literature) in relation to deviation indicates that there are many conceptual, legal and practical problems associated with the principle. Adding to this problem is the concept of quasi-deviation in some jurisdictions such as the United States, where there continue to be conflicting approaches to the concept even within the various federal circuits. Therefore the hypothesis of this study is based on the need for legal reform. Chapters 1 and 2 provide the background to the study as well as the conceptual framework for the research, including the literature review. The main research aims, objectives and research questions are addressed in Chapters 3, 4 and 5. Chapter 6 concludes the research by presenting the findings and recommendations together with an outline of the research contribution.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
48

Bencová, Monika. „Využití controllingu v podniku“. Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2020. http://www.nusl.cz/ntk/nusl-417395.

Der volle Inhalt der Quelle
Annotation:
The purpose of this thesis is to describe controlling and its function in a real company. The focus is specifically on cost imbalances and the analysis of their origin. The theoretical part serves as a basis for understanding the real processes in the company, followed by their evaluation and proposals for improvements in the scope of cost management.
APA, Harvard, Vancouver, ISO und andere Zitierweisen
49

Kolivand, Mohsen. „DEVELOPMENT OF TOOTH CONTACT AND MECHANICAL EFFICIENCY MODELS FOR FACE-MILLED AND FACE-HOBBED HYPOID AND SPIRAL BEVEL GEARS“. The Ohio State University, 2009. http://rave.ohiolink.edu/etdc/view?acc_num=osu1245266082.

Der volle Inhalt der Quelle
APA, Harvard, Vancouver, ISO und andere Zitierweisen
50

Agyei-Boapeah, Henry. „Mergers and acquisitions and corporate financial leverage : an empirical analysis of UK firms“. Thesis, Loughborough University, 2013. https://dspace.lboro.ac.uk/2134/13455.

Der volle Inhalt der Quelle
Annotation:
This thesis examines the link between mergers and acquisitions (M&As) and corporate financial leverage. The thesis proposes and tests various hypotheses regarding: (1) the relationship between the probability of firms undertaking M&As and corporate financial leverage; and (2) the changes in financial leverage prior to firms' decision to initiate M&As. The empirical evidence on the proposed hypotheses is based on a large sample of firms in the UK between 1996 and 2006. The empirical analysis presented in this study contributes to the large and growing body of literature on the interdependence of corporate financing and investment decisions. Specifically, this study contributes to the literature in two ways. First, the thesis investigates the link between firms' leverage deviations (i.e. the deviations of firms' observed leverage ratios from target leverage ratios) and the probability of undertaking M&As in the future. Building upon the earlier literature, it is argued that extreme leverage deviations lower the probability of undertaking M&As by impairing firms' ability to raise capital to finance these deals. The study's empirical analyses suggest that extremely overleveraged firms have a lower probability of undertaking M&As. Moreover, the link between extreme overleverage and the probability of undertaking M&As is weaker for diversification-increasing acquisitions (i.e. deals in which the acquirer and the target firm operate in different industries); for domestic acquisitions (i.e. deals in which the acquirer and the target firm are domiciled in the same country); and for focused (i.e. single-segment) firms undertaking acquisitions. Thus, the leverage deviation effect is not symmetric for all types of acquisitions and for all firms. Second, the thesis examines how the pre-acquisition changes in corporate financial leverage may be influenced by: (1) the extent to which firms deviate from their target leverage ratios; and (2) firms' intentions to initiate M&As.
Key empirical findings in this section suggest that firms that have higher leverage deviations adjust their leverage at a higher rate than those with lower deviations. More importantly, the empirical evidence suggests that firms that undertake M&As adjust their pre-acquisition leverage at a higher rate than those that do not. These findings suggest that, when making adjustments to corporate capital structure, managers tend to consider their firms' leverage deviations and their future acquisition plans. Furthermore, the study's findings partly explain the differences in the speeds of financial leverage adjustments reported in the existing literature.
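The speed-of-adjustment idea underlying these findings is usually cast as a partial adjustment model, L[t+1] - L[t] = λ(L* - L[t]), where λ is the fraction of the gap to target leverage closed each period. A minimal sketch estimating λ from simulated, noise-free firm data (all numbers hypothetical):

```python
def adjustment_speed(deviations, changes):
    """OLS slope through the origin of leverage changes on target
    deviations: the partial-adjustment speed lambda in
    L[t+1] - L[t] = lambda * (Ltarget - L[t])."""
    num = sum(d * c for d, c in zip(deviations, changes))
    den = sum(d * d for d in deviations)
    return num / den

# Simulate a firm closing 40% of its leverage gap each year
target, lev, lam = 0.30, 0.10, 0.4
devs, chgs = [], []
for _ in range(5):
    dev = target - lev
    chg = lam * dev
    devs.append(dev)
    chgs.append(chg)
    lev += chg
est = adjustment_speed(devs, chgs)  # recovers lambda on noise-free data
```

The thesis's finding is that the estimated λ is larger for firms with large deviations and for firms about to undertake M&As, i.e. adjustment speed is not a single constant across firms.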
APA, Harvard, Vancouver, ISO und andere Zitierweisen