
Dissertations on the topic "Analysis"

Format each source in APA, MLA, Chicago, Harvard, and other citation styles.

Browse the top 50 dissertations for your research on the topic "Analysis".

Next to each work in the list of references there is an "Add to bibliography" button. Use it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a publication in .pdf format and read its abstract online, whenever these are available in the record's metadata.

Browse dissertations from a wide range of disciplines and compile your bibliography correctly.

1

Shani, Najah Turki. "Multivariate analysis and survival analysis with application to company failure." Thesis, Bangor University, 1991. https://research.bangor.ac.uk/portal/en/theses/multivariate-analysis-and-survival-analysis-with-application-to-company-failure(a031bf91-13bc-4367-b4fc-e240ab54a73b).html.

Abstract:
This thesis offers an explanation of the statistical modelling of corporate financial indicators in the context where the life of a company is terminated. Whilst it is natural for companies to fail or close down, an excess of failure causes a reduction in the activity of the economy as a whole. Therefore, studies on business failure identification leading to models which may provide early warnings of impending financial crisis may make some contribution to improving economic welfare. This study considers a number of bankruptcy prediction models such as multiple discriminant analysis and logit, and then introduces survival analysis as a means of modelling corporate failure. Then, with a data set of UK companies which failed, or were taken over, or were still operating when the information was collected, we provide estimates of failure probabilities as a function of survival time, and we specify the significance of financial characteristics which are covariates of survival. Three innovative statistical methods are introduced. First, a likelihood solution is provided to the problem of takeovers and mergers in order to incorporate such events into the dichotomous outcome of failure and survival. Second, we move away from the more conventional matched pairs sampling framework to one that reflects the prior probabilities of failure and construct a sample of observations which are randomly censored, using stratified sampling to reflect the structure of the group of failed companies. The third innovation concerns the specification of survival models, which relate the hazard function to the length of survival time and to a set of financial ratios as predictors. These models also provide estimates of the rate of failure and of the parameters of the survival function. The overall adequacy of these models has been assessed using residual analysis and it has been found that the Weibull regression model fitted the data better than other parametric models. The proportional hazard model also fitted the data adequately and appears to provide a promising approach to the prediction of financial distress. Finally, the empirical analysis reported in this thesis suggests that survival models have lower classification error than discriminant and logit models.
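The survival-modelling idea above can be illustrated with a small, hypothetical sketch: fitting a two-parameter Weibull survival distribution to right-censored company lifetimes by maximum likelihood. The data, parameter values and variable names below are invented for illustration and are not the thesis's dataset or its full regression model (which also includes financial-ratio covariates).

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
true_shape, true_scale = 1.5, 8.0                 # hypothetical parameters
t = rng.weibull(true_shape, 500) * true_scale     # latent failure times (years)
c = rng.uniform(0, 12, 500)                       # censoring times (study end, takeover, ...)
time = np.minimum(t, c)                           # observed time
event = (t <= c).astype(float)                    # 1 = failure observed, 0 = right-censored

def neg_log_lik(params):
    """Right-censored Weibull log-likelihood:
    events contribute log f(t), censored observations contribute log S(t)."""
    log_k, log_lam = params                       # optimise on the log scale to keep parameters positive
    k, lam = np.exp(log_k), np.exp(log_lam)
    z = time / lam
    log_f = np.log(k / lam) + (k - 1) * np.log(z) - z**k
    log_S = -z**k
    return -np.sum(event * log_f + (1 - event) * log_S)

res = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
shape_hat, scale_hat = np.exp(res.x)
print(f"estimated shape = {shape_hat:.2f}, scale = {scale_hat:.2f}")
# Estimated probability of surviving beyond 10 years:
print("S(10) =", np.exp(-(10 / scale_hat) ** shape_hat))
```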
2

Gibson, Christine M. "Response Patterns in Functional Analyses: a Preliminary Analysis." Thesis, University of North Texas, 2012. https://digital.library.unt.edu/ark:/67531/metadc149594/.

Abstract:
Functional assessment procedures have proven effective in identifying the operant contingencies that maintain problem behavior. Typically, the evaluation of responding during functional analyses is conducted at the condition level. However, some variables affecting occurrences of behavior cannot be evaluated solely through the use of a cross-session analysis. Evaluating within-session patterns of responding may provide information about variables such as extinction bursts, discriminative stimuli, and motivating operations such as deprivation and satiation. The current study was designed to identify some typical response patterns that are generated when data are displayed across and within sessions of functional analyses, discuss some variables that may cause these trends, and evaluate the utility of within-session analyses. Results revealed that several specific patterns of responding were identified for both across- and within-session analyses, which may be useful in clarifying the function of behavior.
3

Ovenden, Simon P. B. "Preparation of a natural product extract library for investigation against disease states specific to defence health a mini long range research project /." Fishermans Bend Victoria : Defence Science and Technology Organisation, 2009. http://hdl.handle.net/1947/9861.

4

Smith, Fraser O. "Discontinuous flow analyser for process chemical analysis." Thesis, Queensland University of Technology, 1999.

5

Ashoor, Khalil Layla Ali. "Performance analysis integrating data envelopment analysis and multiple objective linear programming." Thesis, University of Manchester, 2013. https://www.research.manchester.ac.uk/portal/en/theses/performance-analysis-integrating-data-envelopment-analysis-and-multiple-objective-linear-programming(65485f28-f6c5-4eff-b422-6dd05f1b46fe).html.

Abstract:
Firms or organisations implement performance assessment to improve productivity but evaluating the performance of firms or organisations may be complex and complicated due to the existence of conflicting objectives. Data Envelopment Analysis (DEA) is a non-parametric approach utilized to evaluate the relative efficiencies of decision making units (DMUs) within firms or organizations that perform similar tasks. Although DEA measures the relative efficiency of a set of DMUs the efficiency scores generated do not consider the decision maker’s (DM’s) or expert preferences. DEA is used to measure efficiency and can be extended to include DM’s and expert preferences by incorporating value judgements. Value judgements can be implemented by two techniques: weight restrictions or constructing an equivalence Multiple Objective Linear Programming (MOLP) model. Weight restrictions require prior knowledge to be provided by the DM and moreover the DM cannot interfere during the assessment analysis. On the other hand, the second approach enables the DM to interfere during performance assessment without prior knowledge whilst providing alternative objectives that allow the DM to reach the most preferred decision subject to available resources. The main focus of this research was to establish interactive frameworks to allow the DM to set targets, according to his preferences, and to test alternatives that can realistically be measured through an interactive procedure. These frameworks are based on building an equivalence model between extended DEA and MOLP minimax formulation incorporating an interactive procedure. In this study two frameworks were established. The first is based on an equivalence model between DEA trade-off approach and MOLP minimax formulation which allows for incorporating DM’s and expert preferences. The second is based on an equivalence model between DEA bounded model and MOLP minimax formulation. This allows for integrating DM’s preferences through interactive steps to measure the whole efficiency score (i.e. best and worst efficiency) of individual DMU. In both approaches a gradient projection interactive approach is implemented to estimate, regionally, the most preferred solution along the efficient frontier. The second framework was further extended by including ranking based on the geometric average. All the frameworks developed and presented were tested through implementation on two real case studies.
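Because the frameworks above build on Data Envelopment Analysis, a minimal sketch of the standard input-oriented CCR envelopment model may help readers unfamiliar with DEA. It is not the extended DEA/MOLP minimax formulation developed in the thesis; the DMU data below are invented, and the linear programme is solved with SciPy.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 2 inputs and 1 output for 5 decision making units (DMUs).
X = np.array([[4.0, 140], [7.0, 96], [8.0, 154], [4.0, 120], [6.0, 100]]).T  # inputs, shape (m, n)
Y = np.array([[2.0], [3.0], [4.0], [1.5], [2.5]]).T                          # outputs, shape (s, n)
m, n = X.shape
s = Y.shape[0]

def ccr_efficiency(j0):
    """Input-oriented CCR envelopment model for DMU j0:
    minimise theta subject to X @ lam <= theta * x0, Y @ lam >= y0, lam >= 0."""
    # decision variables: [theta, lam_1, ..., lam_n]
    c = np.zeros(n + 1)
    c[0] = 1.0
    A_in = np.hstack([-X[:, [j0]], X])            # X lam - theta * x0 <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((s, 1)), -Y])     # -Y lam <= -y0
    b_out = -Y[:, j0]
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.x[0]

for j in range(n):
    print(f"DMU {j + 1}: efficiency = {ccr_efficiency(j):.3f}")
```

A score of 1.0 marks a DMU on the efficient frontier; the interactive, preference-driven extensions described in the abstract go well beyond this basic model.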
6

Bairu, Semere Ghebru. "Diamond paste based electrodes for inorganic analysis." Diss., Pretoria : [s.n.], 2003. http://hdl.handle.net/2263/27268.

Abstract:
Differential pulse voltammetry is one of the most widely used analytical polarographic techniques, especially for trace inorganic analysis. Up to now, mercury electrodes and different types of carbon electrodes have been used for such analysis. The emphasis of the present dissertation is on the design of a new class of electrodes, namely monocrystalline diamond paste based electrodes, to be used in differential pulse voltammetry for trace analysis of inorganic compounds. Monocrystalline diamond and boron-doped polycrystalline diamond based electrodes exhibit several superior electrochemical properties that are significantly different from those of other carbon-allotrope based electrodes, e.g., glassy carbon electrodes and highly oriented pyrolytic graphite based electrodes, which have been widely used for many years. The advantages are: (a) lower background currents and noise signals, which lead to improved S/B and S/N ratios and lower detection limits; (b) good electrochemical activity (pre-treatment is not necessary); (c) a wide electrochemical potential window in aqueous media; (d) very low capacitance; (e) extreme electrochemical stability; and (f) high reproducibility of analytical information. Furthermore, later studies have shown the superiority of monocrystalline diamond as an electrode material due to the high mobilities measured for electrons and holes. The design selected for the electrodes is simple, fast and reproducible. The diamond powder was mixed with paraffin oil to give the diamond paste used as the electroactive material in the electrodes. The results obtained by employing the diamond paste based electrodes demonstrated high sensitivity, selectivity, accuracy and reliability. These characteristics make them suitable for the analysis of different cations (e.g., Fe(II), Fe(III), Cr(III), Cr(VI), Pb(II), Ag(I)) as well as anions (e.g., iodide) in pharmaceutical, food and environmental matrices.
Dissertation (MSc (Chemistry))--University of Pretoria, 2006.
7

Hazell, Laurence Paul. "Rule analysis and social analysis." Thesis, Durham University, 1986. http://etheses.dur.ac.uk/6883/.

Abstract:
This thesis investigates the use of rules in the analysis of language mastery and human action, which are both viewed as social phenomena. The investigation is conducted through an examination of two analyses of the use of language in everyday social life and documents how each formulates a different understanding of rule-following in explaining linguistic and social action. The analyses in question are 'Speech Act Theory' and 'Ethnomethodology'. The principal idea of speech act theory is that social action is rule-governed, and the theory attempts to explain the possibility of meaningful social interaction on that basis. The rigidities imposed by the notion of rule-governance frustrate that aim. The thesis then turns to an examination of ethnomethodology and conversation analysis and contrasts the notion of rule-orientation developed by that perspective. From that examination it becomes clear that what is on offer is not just a greater flexibility in the use of rules, but a restructuring of the concept of analysis itself. It is argued that this restructuring amounts to a reflexive conception of analysis. Its meaning and implications are enlarged upon through a close scrutiny of the later philosophy of Wittgenstein, particularly his concern with the nature of rule-following in his 'Philosophical Investigations'. The thesis argues that his concern with rules was motivated by his insight that their use as 'explanations' of action said as much about the formulator of the rule as the activities the rules were held to formulate. The thesis concludes by outlining the meaning of this analytic reflexivity for social scientific findings.
8

Зайцев, Олександр Васильович, Александр Васильевич Зайцев, Oleksandr Vasylovych Zaitsev, and М. Л. Назаренко. "Analysis Mechanisms of Financial Markets. Fundamental Analysis and Technical Analysis." Thesis, Sumy State University, 2021. https://essuir.sumdu.edu.ua/handle/123456789/86030.

Abstract:
Conference paper abstract.
In order to find out what will happen to a currency tomorrow, or, in other words, to predict its value over a certain period of time in the future, it is necessary to know and understand the basic methods of analysing foreign exchange markets. At present, the two main methods, both confirmed by practical use, are fundamental analysis and technical analysis.
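As a loose illustration of what technical analysis means in practice (and not a method taken from the paper itself), the hypothetical sketch below computes a simple moving-average crossover signal on a synthetic exchange-rate series.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Synthetic daily exchange-rate series (random walk), purely for illustration.
price = pd.Series(1.10 + np.cumsum(rng.normal(0, 0.002, 500)),
                  index=pd.date_range("2020-01-01", periods=500, freq="D"))

fast = price.rolling(window=10).mean()    # short-term moving average
slow = price.rolling(window=50).mean()    # long-term moving average

# +1 = fast average above slow (bullish signal), -1 = below (bearish signal).
signal = np.sign(fast - slow).fillna(0)
print(signal.value_counts())
```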
9

Wang, Jing. "Investment analysis in practice : evidence from Chinese financial analysts." Thesis, Heriot-Watt University, 2006. http://hdl.handle.net/10399/176.

10

Yang, Di. "Analysis guided visual exploration of multivariate data." Worcester, Mass. : Worcester Polytechnic Institute, 2007. http://www.wpi.edu/Pubs/ETD/Available/etd-050407-005925/.

11

Cai, Yue. "Nondestructive multi-element analysis of colorants for forensic applications and artwork authentication." HKBU Institutional Repository, 2013. https://repository.hkbu.edu.hk/etd_ra/1528.

12

Zarif, Karimi Navid <1988&gt. "Analysis of drilling of composite laminates." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amsdottorato.unibo.it/8503/1/ANALYSIS%20OF%20DRILLING%20OF%20COMPOSITE%20LAMINATES.pdf.

Abstract:
This dissertation deals with the characterization, modeling, and monitoring of the drilling process of composite materials through various experimental and analytical investigations. Analytical models were developed which predict the critical thrust force and feed rate above which the delamination crack begins to propagate in the drilling of multi-directional laminated composites. The delamination zone was modeled as a circular plate, with clamped edge and subjected to different load profiles. Based on fracture mechanics, classical laminate theory and orthogonal cutting mechanics, expressions were obtained for critical thrusts and feed rates at different ply locations. The proposed models have been verified by experiments and compared with the existing models. It was found that the newly developed models provide more accurate and rigorous results than the former ones. The quality of holes and drilling-induced damage when drilling fiber-reinforced composite laminates were studied experimentally. Several quality responses were measured as indices of drilling performance, including thrust force, delamination size, residual compression strength, and flexural strength. Effects of key drilling parameters on these responses were statistically analyzed, and optimal drilling conditions for high-performance, damage-free drilling were identified. Experimental results revealed that the choice of drilling conditions is critical to hole performance, especially when these materials are subjected to structural loads. An experimental study of acoustic emission as a tool for in-process monitoring and nondestructive evaluation of drilling of composites was conducted. Acoustic emission was used to examine the relationship between signal response and drilling-induced damage. A procedure for discrimination and identification of different damage mechanisms was presented utilizing different signal analysis tools. Based on the results, the frequency distribution and energy percentage of the most important damage mechanisms occurring during drilling were determined. It was concluded that acoustic emission has great potential for online monitoring and damage characterization in the drilling of composite structures.
13

Seasholtz, Mary Beth. "Parsimonious construction of multivariate calibration models in chemometrics /." Thesis, Connect to this title online; UW restricted, 1992. http://hdl.handle.net/1773/8705.

14

Brückner, Carsten Albrecht. "Rapid chromatographic analysis using novel detection systems and chemometric techniques /." Thesis, Connect to this title online; UW restricted, 1998. http://hdl.handle.net/1773/11573.

15

Aksu, Ibrahim. "Performance analysis of image motion analysis algorithms." Thesis, Monterey, California. Naval Postgraduate School, 1991. http://hdl.handle.net/10945/28443.

16

Lak, Rashad Rashid Haji. "Harmonic analysis using methods of nonstandard analysis." Thesis, University of Birmingham, 2015. http://etheses.bham.ac.uk//id/eprint/5754/.

Abstract:
Throughout this research we use techniques of nonstandard analysis to derive and interpret results in classical harmonic analysis particularly in topological (metric) groups and theory of Fourier series. We define monotonically definable subset \(N\) of a nonstandard *finite group \(F\), which is the monad of the neutral element of \(F\) for some invariant *metric \(d\) on \(F\). We prove some nice properties of \(N\) and the nonstandard metrisation version of first-countable Hausdorff topological groups. We define locally embeddable in finite metric groups (LEFM). We show that every abelian group with an invariant metric is LEFM. We give a number of LEFM group examples using methods of nonstandard analysis. We present a nonstandard version of the main results of the classical space \(L\)\(^1\)(T) of Lebesgue integrable complex-valued functions defined on the topological circle group T, to study Fourier series throughout: the inner product space; the DFT of piecewise continuous functions; some useful properties of Dirichlet and Fejér functions; convolution; and convergence in norm. Also we show the relationship between \(L\)\(^1\)(T) and the nonstandard \(L\)\(^1\)(\(F\)) via Loeb measure. Furthermore, we model functionals defined on the test space of exponential polynomial functions on T by functionals in NSA.
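Since the abstract works with Dirichlet and Fejér kernels on the circle group T, a short, entirely classical numerical illustration may be useful alongside it: comparing the Dirichlet partial sums of a square wave (which exhibit the Gibbs overshoot) with the Fejér (Cesàro) means (which do not). This is standard Fourier analysis, not the nonstandard-analysis machinery of the thesis.

```python
import numpy as np

x = np.linspace(-np.pi, np.pi, 2001)
f = np.sign(x)                     # square wave on [-pi, pi]

def dirichlet_partial_sum(x, N):
    """S_N f(x) for the square wave: its Fourier series has only odd sine terms 4/(pi k)."""
    s = np.zeros_like(x)
    for k in range(1, N + 1, 2):
        s += 4.0 / (np.pi * k) * np.sin(k * x)
    return s

def fejer_mean(x, N):
    """Cesàro average of the partial sums S_0, ..., S_N (Fejér mean)."""
    return np.mean([dirichlet_partial_sum(x, n) for n in range(N + 1)], axis=0)

for N in (5, 25, 101):
    overshoot_S = np.max(dirichlet_partial_sum(x, N)) - 1.0   # Gibbs overshoot persists
    overshoot_F = np.max(fejer_mean(x, N)) - 1.0              # Fejér mean stays below sup|f|
    print(f"N={N:4d}  Dirichlet overshoot={overshoot_S:+.3f}  Fejér overshoot={overshoot_F:+.3f}")
```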
17

Tavares, Nuno Filipe Ramalho da Cunha. "Multivariate analysis applied to clinical analysis data." Master's thesis, Faculdade de Ciências e Tecnologia, 2014. http://hdl.handle.net/10362/12288.

Abstract:
Dissertation submitted to obtain the degree of Master in Industrial Engineering and Management
Folate, vitamin B12, iron and hemoglobin are essential for metabolic functions in the body. The deficiency of these can be the cause of several known pathologies and, untreated, can be responsible for severe morbidity and even death. The objective of this study is to characterize a population, residing in the metropolitan area of Lisbon and Setubal, concerning serum levels of folate, vitamin B12, iron and hemoglobin, as well as to find evidence of correlations between these parameters and illnesses, mainly cardiovascular, gastrointestinal, neurological and anemia. Clinical analysis data was collected and submitted to multivariate analysis. First the data was screened with Spearman correlation and Kruskal-Wallis analysis of variance to study correlations and variability between groups. To characterize the population, we used cluster analysis with Ward's linkage method. Finally a sensitivity analysis was performed to strengthen the results. A positive correlation of iron with ferritin and transferrin, and with hemoglobin, was observed with the Spearman correlation. The Kruskal-Wallis analysis of variance test showed significant differences between these biomarkers in persons aged 0 to 29, 30 to 59 and over 60 years old. Cluster analysis proved to be a useful tool when characterizing a population based on its biomarkers, showing evidence of low folate levels for the population in general, and hemoglobin levels below the reference values. Iron and vitamin B12 were within the reference range for most of the population. Low levels of the parameters were registered mainly in patients with cardiovascular, gastrointestinal, and neurological diseases and anemia.
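A rough, hypothetical sketch of the pipeline described above (Spearman screening, Kruskal-Wallis comparison across age groups, and Ward hierarchical clustering), run on simulated biomarker values rather than the clinical data:

```python
import numpy as np
from scipy.stats import spearmanr, kruskal
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
n = 300
iron = rng.normal(100, 25, n)                       # hypothetical serum iron
ferritin = 0.8 * iron + rng.normal(0, 15, n)        # correlated with iron by construction
hemoglobin = 0.05 * iron + rng.normal(9, 1, n)
age_group = rng.integers(0, 3, n)                   # 0: 0-29, 1: 30-59, 2: 60+

rho, p = spearmanr(iron, ferritin)
print(f"Spearman rho(iron, ferritin) = {rho:.2f} (p = {p:.3g})")

h, p_kw = kruskal(*[iron[age_group == g] for g in range(3)])
print(f"Kruskal-Wallis across age groups: H = {h:.2f}, p = {p_kw:.3g}")

# Ward's linkage on standardised biomarkers, cut into 3 clusters.
X = np.column_stack([iron, ferritin, hemoglobin])
X = (X - X.mean(axis=0)) / X.std(axis=0)
labels = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```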
18

Alqahtani, Abdullah Ayed F. "Comparative Analysis of Roundabout Capacity Analysis Methods." University of Dayton / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=dayton1557252181941848.

19

Roberts, David Anthony. "Discontinuous Systems Analysis: an Interdisciplinary Analysis Tool." Oxford, Ohio : Miami University, 2007. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=miami1196390609.

20

Liu, Xuan. "Some contribution to analysis and stochastic analysis." Thesis, University of Oxford, 2018. http://ora.ox.ac.uk/objects/uuid:485474c0-2501-4ef0-a0bc-492e5c6c9d62.

Abstract:
The dissertation consists of two parts. The first part (Chapter 1 to 4) is on some contributions to the development of a non-linear analysis on the quintessential fractal set Sierpinski gasket and its probabilistic interpretation. The second part (Chapter 5) is on the asymptotic tail decays for suprema of stochastic processes satisfying certain conditional increment controls. Chapters 1, 2 and 3 are devoted to the establishment of a theory of backward problems for non-linear stochastic differential equations on the gasket, and to derive a probabilistic representation to some parabolic type partial differential equations on the gasket. In Chapter 2, using the theory of Markov processes, we derive the existence and uniqueness of solutions to backward stochastic differential equations driven by Brownian motion on the Sierpinski gasket, for which the major technical difficulty is the exponential integrability of quadratic processes of martingale additive functionals. A Feynman-Kac type representation is obtained as an application. In Chapter 3, we study the stochastic optimal control problems for which the system uncertainties come from Brownian motion on the gasket, and derive a stochastic maximum principle. It turns out that the necessary condition for optimal control problems on the gasket consists of two equations, in contrast to the classical result on ℝd, where the necessary condition is given by a single equation. The materials in Chapter 2 are based on a joint work with Zhongmin Qian (referenced in Chapter 2). Chapter 4 is devoted to the analytic study of some parabolic PDEs on the gasket. Using a new type of Sobolev inequality which involves singular measures developed in Section 4.2, we establish the existence and uniqueness of solutions to these PDEs, and derive the space-time regularity for solutions. As an interesting application of the results in Chapter 4 and the probabilistic representation developed in Chapter 2, we further study Burgers equations on the gasket, to which the space-time regularity for solutions is deduced. The materials in Chapter 4 are based on a joint work with Zhongmin Qian (referenced in Chapter 4). In Chapter 5, we consider a class of continuous stochastic processes which satisfy the conditional increment control condition. Typical examples include continuous martingales, fractional Brownian motions, and diffusions governed by SDEs. For such processes, we establish a Doob type maximal inequality. Under additional assumptions on the tail decays of their marginal distributions, we derive an estimate for the tail decay of the suprema (Theorem 5.3.2), which states that the suprema decays in a manner similar to the margins of the processes. In Section 5.4, as an application of Theorem 5.3.2, we derive the existence of strong solutions to a class of SDEs. The materials in this chapter is based on the work [44] by the author (Section 5.2 and Section 5.3) and an ongoing joint project with Guangyu Xi (Section 5.4).
21

Blaschke, Tobias. "Independent component analysis and slow feature analysis." Doctoral thesis, Humboldt-Universität zu Berlin, Mathematisch-Naturwissenschaftliche Fakultät I, 2005. http://dx.doi.org/10.18452/15270.

Abstract:
Within this thesis, we focus on the relation between independent component analysis (ICA) and slow feature analysis (SFA). To allow a comparison between both methods we introduce CuBICA2, an ICA algorithm based on second-order statistics only, i.e.\ cross-correlations. In contrast to algorithms based on higher-order statistics not only instantaneous cross-correlations but also time-delayed cross correlations are considered for minimization. CuBICA2 requires signal components with auto-correlation like in SFA, and has the ability to separate source signal components that have a Gaussian distribution. Furthermore, we derive an alternative formulation of the SFA objective function and compare it with that of CuBICA2. In the case of a linear mixture the two methods are equivalent if a single time delay is taken into account. The comparison can not be extended to the case of several time delays. For ICA a straightforward extension can be derived, but a similar extension to SFA yields an objective function that can not be interpreted in the sense of SFA. However, a useful extension in the sense of SFA to more than one time delay can be derived. This extended SFA reveals the close connection between the slowness objective of SFA and temporal predictability. Furthermore, we combine CuBICA2 and SFA. The result can be interpreted from two perspectives. From the ICA point of view the combination leads to an algorithm that solves the nonlinear blind source separation problem. From the SFA point of view the combination of ICA and SFA is an extension to SFA in terms of statistical independence. Standard SFA extracts slowly varying signal components that are uncorrelated meaning they are statistically independent up to second-order. The integration of ICA leads to signal components that are more or less statistically independent.
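To make the SFA side of the comparison concrete, here is a minimal sketch of linear Slow Feature Analysis on a toy mixture: centre and whiten the signal, differentiate it in time, and take the whitened direction with the smallest derivative variance. This is the generic textbook formulation, not the CuBICA2 algorithm or the extended SFA developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 5000
t = np.linspace(0, 2 * np.pi, T)
slow = np.sin(t)                                   # slowly varying latent source
fast = np.sin(47 * t) + 0.1 * rng.normal(size=T)   # fast source
X = np.column_stack([slow + 0.5 * fast, fast - 0.3 * slow])   # observed linear mixture

# 1) centre and whiten the data
X = X - X.mean(axis=0)
cov = np.cov(X, rowvar=False)
d, E = np.linalg.eigh(cov)
W_white = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
Z = X @ W_white

# 2) minimise the variance of the time derivative: eigenvectors of cov(dZ/dt)
dZ = np.diff(Z, axis=0)
d2, V = np.linalg.eigh(np.cov(dZ, rowvar=False))
slow_feature = Z @ V[:, 0]                         # smallest eigenvalue = slowest direction

print("corr(slow feature, true slow source) =",
      abs(np.corrcoef(slow_feature, slow)[0, 1]).round(3))
```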
22

Haouas, Nabiha. "Wind energy analysis and change point analysis." Thesis, Clermont-Ferrand 2, 2015. http://www.theses.fr/2015CLF22554.

Abstract:
Wind energy, one of the most competitive renewable energies, is considered a solution to the drawbacks of fossil energy. For better management and exploitation of this energy, forecasts of its production are necessary. The forecasting methods used in the literature only allow a forecast of the annual mean of this production. Some recent works propose the use of the Central Limit Theorem (CLT), under non-classical hypotheses, for the estimation of the mean annual production of wind energy as well as its variance for a single turbine. In this thesis we propose an extension of these works to a wind farm by relaxing the stationarity hypothesis on the wind speed and the power production, assuming instead that they are seasonal. Under this hypothesis the quality of the annual forecast improves considerably. We also propose forecasting the wind power production over the four seasons of the year. The use of a fractal model allows us to find a "natural" division of the wind speed series in order to refine the estimation of the wind production by detecting abrupt change points. Statistical tools for change-point detection and for the estimation of fractal models are presented in the last two chapters.
23

Mattila, Max, and Hassan Salman. "Analysing Social Media Marketing on Twitter using Sentiment Analysis." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229787.

Abstract:
Social media is an increasingly important marketing platform in today’s society, and many businesses use them in one way or another in their advertising. This report aimed to determine the effect of different factors on the sentiment in the response to a tweet posted on Twitter for advertising purposes by companies in the fast food sector in North America. The factors considered were the time of posting, the length and the sentiment of a tweet, along with the presence of media other than text in the tweet. Sentiment was extracted from samples of the response to the advertising tweets collected daily between the 27th of March and the 28th of April and plotted against the factors mentioned. The results indicate that the sentiment of the advertising tweet along with the time of posting had the biggest impact on the response, though no definitive conclusions on their effects could be drawn.
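A minimal sketch of the kind of sentiment scoring applied to reply tweets, here using NLTK's VADER analyser on a few invented example replies (the thesis does not necessarily use this particular library or threshold):

```python
# pip install nltk
import nltk
nltk.download("vader_lexicon", quiet=True)
from nltk.sentiment.vader import SentimentIntensityAnalyzer

replies = [
    "Love the new burger, best thing on the menu!",      # invented example replies
    "Waited 40 minutes and the fries were cold.",
    "It's okay I guess.",
]

sia = SentimentIntensityAnalyzer()
for text in replies:
    score = sia.polarity_scores(text)["compound"]         # compound score in [-1, 1]
    label = "positive" if score > 0.05 else "negative" if score < -0.05 else "neutral"
    print(f"{score:+.2f}  {label}  {text}")
```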
24

Wardak, Mohammad Alif. "Survival analysis." CSUSB ScholarWorks, 2005. https://scholarworks.lib.csusb.edu/etd-project/2810.

Abstract:
Survival analysis pertains to a statistical approach designed to take into account the amount of time an experimental unit contributes to a study. A Mayo Clinic study of 418 Primary Biliary Cirrhosis patients during a ten year period was used. The Kaplan-Meier Estimator, a non-parametric statistic, and the Cox Proportional Hazard methods were the tools applied. Kaplan-Meier results include total values/censored values.
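A compact sketch of the Kaplan-Meier product-limit estimator on a small invented sample of censored survival times (not the Mayo Clinic PBC data used in the project):

```python
import numpy as np

# times in years; event=1 means death observed, event=0 means censored (invented data)
time  = np.array([1.2, 2.5, 2.5, 3.1, 4.0, 4.6, 5.3, 6.0, 7.2, 8.8])
event = np.array([1,   1,   0,   1,   0,   1,   0,   1,   0,   0])

def kaplan_meier(time, event):
    """Product-limit estimate S(t) = prod over event times of (1 - d_i / n_i)."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    surv, S = [], 1.0
    for t in np.unique(time[event == 1]):
        n_at_risk = np.sum(time >= t)           # n_i: still under observation just before t
        d = np.sum((time == t) & (event == 1))  # d_i: deaths at t
        S *= 1.0 - d / n_at_risk
        surv.append((t, S))
    return surv

for t, s in kaplan_meier(time, event):
    print(f"t = {t:.1f}  S(t) = {s:.3f}")
```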
25

Akula, Venkata Ganesh Ashish. "Implementation of Advanced Analytics on Customer Satisfaction Process in Comparison to Traditional Data Analytics." University of Akron / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=akron1555612496986004.

26

González, García Juan. "Application of clustering analysis and sequence analysis on the performance analysis of parallel applications." Doctoral thesis, Universitat Politècnica de Catalunya, 2013. http://hdl.handle.net/10803/128875.

Abstract:
High Performance Computing and Supercomputing is the high-end area of computing science that studies and develops the most powerful computers available. Current supercomputers are extremely complex, and so are the applications that run on them. To take advantage of the huge amount of computing power available it is strictly necessary to maximize the knowledge we have about how these applications behave and perform. This is the mission of (parallel) performance analysis. In general, performance analysis toolkits offer very simplistic manipulations of the performance data. First-order statistics such as the average or standard deviation are used to summarize the values of a given performance metric, hiding in some cases interesting facts available in the raw performance data. For this reason, we require Performance Analytics, i.e. the application of Data Analytics techniques in the performance analysis area. This thesis contributes two new techniques to the Performance Analytics field. The first contribution is the application of cluster analysis to detect the parallel application computation structure. Cluster analysis is the unsupervised classification of patterns (observations, data items or feature vectors) into groups (clusters). In this thesis we use cluster analysis to group the CPU bursts of a parallel application, the regions on each process in between communication calls or calls to the parallel runtime. The resulting clusters are the different computational trends or phases that appear in the application. These clusters are useful to understand the behaviour of the computation part of the application and to focus the analyses on those that present performance issues. We demonstrate that our approach requires different clustering algorithms from those previously used in the area. The second contribution of the thesis is the application of multiple sequence alignment algorithms to evaluate the computation structure detected. Multiple sequence alignment (MSA) is a technique commonly used in bioinformatics to determine the similarities across two or more biological sequences: DNA or proteins. The Cluster Sequence Score we introduce applies an MSA algorithm to evaluate the SPMDiness of an application, i.e. how well its computation structure represents the Single Program Multiple Data (SPMD) paradigm structure. We also use this score in the Aggregative Cluster Refinement, a new clustering algorithm we designed, able to detect the SPMD phases of an application at fine grain, surpassing the cluster algorithms we used initially. We demonstrate the usefulness of these techniques with three practical uses. The first one is an extrapolation methodology able to maximize the performance metrics that characterize the application phases detected using a single application execution. The second one is the use of the computation structure detected to speed up a multi-level simulation infrastructure. Finally, we analyse four production-class applications using the computation characterization to study the impact of possible application improvements and portings of the applications to different hardware configurations. In summary, this thesis proposes the use of cluster analysis and sequence analysis to automatically detect and characterize the different computation trends of a parallel application. These techniques provide the developer / analyst a useful insight into the application performance and ease the understanding of the application's behaviour.
The contributions of the thesis are not limited to the proposals and publications of the techniques themselves, but also include practical uses that demonstrate their usefulness in the analysis task. In addition, the research carried out during these years has produced a production tool for analysing applications' structure, part of the BSC Tools suite.
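As a rough sketch of the core idea of clustering CPU bursts by their performance counters, the snippet below applies a density-based clustering to synthetic (instructions, IPC) features; the actual tool, metrics and algorithms in the thesis differ.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
# Synthetic CPU bursts described by (completed instructions, IPC); three computation phases.
phase_a = np.column_stack([rng.normal(5e8, 2e7, 200), rng.normal(1.8, 0.05, 200)])
phase_b = np.column_stack([rng.normal(1e9, 3e7, 200), rng.normal(0.9, 0.05, 200)])
phase_c = np.column_stack([rng.normal(2e8, 1e7, 100), rng.normal(1.2, 0.05, 100)])
bursts = np.vstack([phase_a, phase_b, phase_c])

labels = DBSCAN(eps=0.3, min_samples=10).fit_predict(StandardScaler().fit_transform(bursts))
for lab in sorted(set(labels)):
    name = "noise" if lab == -1 else f"cluster {lab}"
    print(f"{name}: {np.sum(labels == lab)} bursts")
```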
27

Han, Jin Hee. "Isolation of phenylalanine hydroxylases and enzymatic studies with 3- and 4- pyridylalanine." Diss., Georgia Institute of Technology, 1988. http://hdl.handle.net/1853/27313.

28

Petersen, Wiebke, and Petja Heinrich. "Qualitative Citation Analysis Based on Formal Concept Analysis." Universitätsbibliothek Chemnitz, 2008. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-200801464.

Abstract:
Citation analysis (Kessler 1963) is one of the tasks of bibliometrics, that is, the analysis of co-citations (two texts are co-cited if there is a text in which both are cited) and of bibliographic coupling (two texts are bibliographically coupled if both share a common citation). The talk shows that Formal Concept Analysis (FCA) provides suitable tools for qualitative citation analysis. A particular property of FCA is that it allows heterogeneous (qualitative and scalar) attributes to be combined. The use of suitable scales also addresses the problem that, in qualitative approaches, the large number of texts to be analysed usually leads to cluttered citation graphs whose content cannot be grasped. The bibliographic coupling relation is closely related to the neighbourhood contexts developed by Priss, which are used for the analysis of lexica. Using several example analyses, the central notions of citation analysis are modelled in formal contexts and concept lattices. It turns out that the hierarchical concept lattices of FCA are superior to ordinary citation graphs in many respects, since their hierarchical lattice structure captures certain regularities explicitly. It is also shown how combining suitable attributes (doctoral advisor, institute, department, citation frequency, keywords) and scales can counter frequent sources of error such as courtesy citations, habitual citations, and so on.
29

Borhani, Khomami Arghavan. "Separate analysis of Small Pipes in Piping Analysis." Thesis, KTH, Hållfasthetslära (Inst.), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-177355.

Abstract:
A piping system generally consists of a primary system, the large pipes, and a secondary system, the small pipes [1]. The primary system can be analysed separately without considering the secondary system. However, the opposite is not true, because the primary system affects the movement of the secondary system. The aim of this study is to investigate the possibility of analysing the small pipes separately, which is referred to as sub-modelling, and, if such sub-modelling is possible, to generate and validate a method that makes it possible. The response spectrum method [2] is used for analysing the structure. A ground acceleration spectrum in three directions is applied to the primary structure, and a new floor response spectrum is then generated from the results of the analysis of the primary structure [3]. The calculated floor response spectrum is applied to the secondary structure. The results from this analysis are compared to the results obtained by applying the ground acceleration to the total structure. Two different ground accelerations are applied to two different models. Separation would be allowed if the results for the secondary structure are more than 90% of the results for the whole structure in all parts of the secondary structure in all studied cases. The results after separation reach more than 100% of the results from the analysis of the whole structure, referred to as conservative results, in three cases but not in all cases. The separation will therefore be allowed in the three cases where the results reach 90% of the results obtained from the whole structure, but not in the case where the results do not reach 90%.
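To illustrate the response spectrum method itself (not the piping models or software used in the thesis), the sketch below computes a pseudo-acceleration response spectrum from a synthetic acceleration record by integrating a family of damped single-degree-of-freedom oscillators with the Newmark average-acceleration scheme; the input signal and all numerical values are invented.

```python
import numpy as np

dt = 0.005
t = np.arange(0, 20, dt)
rng = np.random.default_rng(5)
ag = rng.normal(0, 1.0, t.size) * np.exp(-0.1 * t)    # synthetic base acceleration [m/s^2]

def sdof_peak_response(ag, dt, period, zeta=0.05):
    """Peak relative displacement of a damped SDOF oscillator under base acceleration,
    integrated with the Newmark average-acceleration method (gamma=1/2, beta=1/4)."""
    w = 2 * np.pi / period
    k, c, m = w**2, 2 * zeta * w, 1.0                 # unit mass
    gamma, beta = 0.5, 0.25
    u = v = 0.0
    a = -ag[0]                                        # m*a + c*v + k*u = -m*ag, with u = v = 0
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    umax = 0.0
    for p in -ag[1:]:
        dp = (p
              + m * (u / (beta * dt**2) + v / (beta * dt) + a * (1 / (2 * beta) - 1))
              + c * (gamma * u / (beta * dt) + v * (gamma / beta - 1)
                     + a * dt * (gamma / (2 * beta) - 1)))
        u_new = dp / keff
        v_new = ((gamma / (beta * dt)) * (u_new - u)
                 + v * (1 - gamma / beta)
                 + a * dt * (1 - gamma / (2 * beta)))
        a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - a * (1 / (2 * beta) - 1)
        u, v, a = u_new, v_new, a_new
        umax = max(umax, abs(u))
    return umax

periods = np.linspace(0.05, 2.0, 40)
spectrum = [(2 * np.pi / T)**2 * sdof_peak_response(ag, dt, T) for T in periods]
print("peak pseudo-acceleration over the period range:", max(spectrum))
```

In practice the same procedure, applied to the motion computed at a floor of the primary structure, yields the floor response spectrum that is then imposed on the secondary (small-pipe) model.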
30

Manthe, Alicia Louise. "Symbolic circuit analysis : DDD optimization and nonlinearity analysis /." Thesis, Connect to this title online; UW restricted, 2003. http://hdl.handle.net/1773/6082.

31

Clatworthy, Mark Anthony. "Transnational equity analysis : evidence from UK investment analysts and institutional investors." Thesis, Cardiff University, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.432665.

Abstract:
This thesis responds to calls for research into transnational investment decision-making by institutional investors and investment analysts. This issue has assumed a growing significance over recent decades due to the internationalisation and institutionalisation of equity markets. Transnational decision-making processes are also of import to the debate on the harmonization of international accounting. Using a questionnaire survey and semi-structured interviews, the thesis investigates three related issues. First, it examines and compares the analysis techniques used by UK analysts and fund managers in analysing domestic and overseas equities. Second, it investigates the sources used by fund managers and investment analysts to inform domestic and transnational equity decisions, placing particular emphasis on accounting information and direct company contact. Finally, the research examines the attitude of the investment community to the international harmonization of accounting. In particular, it assesses whether fund managers' and investment analysts' decisions are affected by international accounting differences and gauges the views of these important users of accounting information on the harmonization process. The main findings are that fundamental analysis is the predominant appraisal technique in domestic and transnational equity analysis, irrespective of the country of domicile of the company being analysed. However, in order to facilitate international accounting comparisons, analysts and fund managers use ratios and measures which are largely unaffected by international accounting differences when analysing overseas companies. In particular, earnings before interest, tax, depreciation and amortisation (EBITDA) is perceived as useful for cross-border comparisons. In respect of the information sources used to analyse overseas companies, direct company contact and the annual report remain the most influential overall. The financial statements are clearly the most heavily utilised components of the annual report, yet there is some evidence that qualitative information in the annual report is more useful in overseas company analysis. UK fund managers also rely on locally-based analysts in overseas company analysis in order to overcome possible information asymmetries and for assistance with international accounting differences. However, fund managers expressed concern over analysts' independence due to analysts' reluctance to jeopardise their relationship with company management. Although international accounting differences are not perceived to have a material effect on transnational investment decisions, the findings demonstrate substantial support from the financial community for harmonization. Indeed, analysts' and fund managers' decisions are themselves a market-based force for harmonization, as companies using opaque and 'unfamiliar' accounting standards are penalised through higher capital costs imposed to allow for additional accounting risk.
32

Al, Sayari Naji Mohammed Awn sulaiman. "Dynamic analysis of cage rotor induction motor using harmonic field analysis and coupling inductances method." Thesis, University of Manchester, 2011. https://www.research.manchester.ac.uk/portal/en/theses/dynamic-analysis-of-cage-rotor-induction-motor-using-harmonic-field-analysis-and-coupling-inductances-method(8c0aebfc-2d74-427e-9448-f9667a6ca099).html.

Abstract:
The work presented in this thesis involves the development of a new analytical method for predicting the transient behaviour of squirrel cage induction motors subjected to pulsating mechanical loads such as a reciprocating compressor. The objective of this project is to develop analysis that will better inform the subsequent design method for determining the rating of industrial induction motors driving an oscillatory load. The analytical approach used to determine the transient response of the motor is based upon the harmonic coupling inductance method which is capable of accommodating any stator winding arrangement used in industrial motor designs. The analytical work described in this thesis includes the response of an induction motor subjected to a general oscillating load in terms of the damping and synchronous torque components. These torque components can be used to determine the additional system inertia required to limit the motor speed and current oscillations to predetermined levels. The work further identifies the motor-load natural resonant frequency and demonstrates the potential issues when driving a general oscillatory load at or close to this frequency. The analytical model was cross-checked using software modelling in Matlab for an industrial squirrel cage induction motor driving a selection of compressor loads. The simulation results were finally correlated with a detailed experimental validation in the laboratory using a squirrel cage induction motor connected to a permanent magnet synchronous motor controlled electronically to simulate general oscillatory load.
33

Astiasuinzarra, Bereciartua Txomin. "COMPILATION OF TASK ANALYSIS METHODS: PRACTICAL APPROACH OF HIERARCHICAL TASK ANALYSIS, COGNITIVE WORK ANALYSYS AND GOALS, OPERATIONS, METHODS AND SELECTION RULES." Thesis, KTH, Industriell ekonomi och organisation (Inst.), 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-91369.

Abstract:
Human Factors methods are progressively becoming more relevant in companies. Companies are more conscious of how the work environment affects workers and, in turn, productivity. Globalization also affects companies, given that they have to be more competitive, effective and flexible. In this context, Human Factors play a very important role. With this perspective, this thesis is oriented towards getting acquainted with some of the different Human Factors methods. The Human Factors area is very extensive; for this reason, the thesis covers some of the most important methods. The main objective is to achieve a general, practical perspective. In accordance with this, the most relevant variables and constraints are analyzed and compared. The work is theoretically based: different papers, articles and books are the platform of the thesis, and the works of the most prestigious authors are included. Hierarchical Task Analysis (HTA), Cognitive Work Analysis (CWA) and Goals, Operators, Methods and Selection rules (GOMS) are the chosen methods. There are comparisons between HTA and CWA, and a general comparison between different techniques of GOMS. At the end, conclusions are drawn in order to underpin the previous analyses and comparisons.
34

Hackl, Matthias. "GPS analysis." Diss., lmu, 2012. http://nbn-resolving.de/urn:nbn:de:bvb:19-146274.

35

Lion, Majed, and Daniel Ramström. "Production Analysis." Thesis, KTH, Tillämpad maskinteknik (KTH Södertälje), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-222274.

36

黃美香 and Mee-heung Cecilia Wong. "Shape analysis." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1994. http://hub.hku.hk/bib/B31211999.

37

Fischer, Manfred M. "Spatial Analysis." WU Vienna University of Economics and Business, 1999. http://epub.wu.ac.at/4145/1/WSG_DP_6699.pdf.

Abstract:
This article views spatial analysis as a research paradigm that provides a unique set of specialised techniques and models for a wide range of research questions in which the prime variables of interest vary significantly over space. The heartland of spatial analysis is concerned with the analysis and modeling of spatial data. Spatial point patterns and area referenced data represent the most appropriate perspectives for applications in the social sciences. The researcher analysing and modeling spatial data tends to be confronted with a series of problems such as the data quality problem, the ecological fallacy problem, the modifiable areal unit problem, boundary and frame effects, and the spatial dependence problem. The problem of spatial dependence is at the core of modern spatial analysis and requires the use of specialised techniques and models in the data analysis. The discussion focuses on exploratory techniques and model-driven [confirmatory] modes of analysing spatial point patterns and area data. In closing, prospects are given towards a new style of data-driven spatial analysis characterized by computational intelligence techniques such as evolutionary computation and neural network modeling to meet the challenges of huge quantities of spatial data characteristic in remote sensing, geodemographics and marketing. (author's abstract)
Series: Discussion Papers of the Institute for Economic Geography and GIScience
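Since spatial dependence is described as the core problem of modern spatial analysis, a small self-contained sketch of Moran's I, a standard global measure of spatial autocorrelation, may be a useful complement; the lattice data and contiguity weights below are invented.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10                                            # 10 x 10 lattice of areal units
grid = np.add.outer(np.arange(n), np.arange(n)) + rng.normal(0, 2, (n, n))  # smooth trend + noise
x = grid.ravel()

# Rook-contiguity spatial weights: 1 if two cells share an edge, else 0.
W = np.zeros((n * n, n * n))
for i in range(n):
    for j in range(n):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= i + di < n and 0 <= j + dj < n:
                W[i * n + j, (i + di) * n + (j + dj)] = 1.0

def morans_i(x, W):
    """Global Moran's I = (N / sum(W)) * sum_ij W_ij (x_i - xbar)(x_j - xbar) / sum_i (x_i - xbar)^2."""
    z = x - x.mean()
    return (len(x) / W.sum()) * (z @ W @ z) / (z @ z)

print("Moran's I for the smooth surface:", round(morans_i(x, W), 3))
print("Moran's I for shuffled values:  ", round(morans_i(rng.permutation(x), W), 3))
```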
38

Ahlsén, Daniel. "Limitless Analysis." Thesis, Uppsala universitet, Algebra och geometri, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-224880.

39

Hess, Richard Christopher. "An analysis." Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/26855.

40

Kignell, Johannes. "Dispense Analysis." Thesis, Luleå tekniska universitet, Institutionen för teknikvetenskap och matematik, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-86522.

Abstract:
The automated production lines at Cepheid are called Robal. Robal operates around the clock to produce as many cartridges as possible. The production line consists of a group of stations that together produce the cartridges. This thesis is based on the Pre-lid filling station on Robal, whose purpose is to dispense a highly viscous liquid consisting of PEG (polyethylene glycol) into the cartridges. The amount of liquid dispensed varies between 0.5 and 4 grams depending on which cartridge recipe is used. The recipe for the CTNG cartridge is the focus of this work; CTNG is the cartridge with the longest cycle time, because it requires the largest amount of liquid to be dispensed. The practical work of this thesis was divided into several parts. The first task was to analyse different pressures and their respective dispensing times. The analysis was performed by testing a range of pressures on a dispensing station located in the lab at Cepheid, which is similar to the Pre-lid station on Robal. Pressures between 40 and 400 kPa were analysed in steps of 10 kPa. The next task was to validate the pressure vessel currently in use in order to find the maximum allowable pressure to which it may be subjected. This task began with collecting drawings of the pressure vessel to obtain the correct dimensions needed in the subsequent calculations. Calculations on the pressure vessel were carried out using the Swedish pressure vessel standard [7]. Since the lid is greatly over-dimensioned, it was neglected in these calculations. Once the theoretical calculations were complete, the pressure vessel was modelled from the drawings in Solidworks 2020. With a complete CAD model, an FEM analysis of the pressure vessel could be performed in the same software; this analysis covered the entire vessel, including the lid. The pressure analysis indicates that the dispensing time decreases drastically with only small increases in pressure. The results of the validation calculations show that the pressurised vessel is dimensioned according to the standard [7] to handle a pressure higher than the set target pressure.
Стилі APA, Harvard, Vancouver, ISO та ін.
41

Kliegr, Tomáš. "Clickstream Analysis." Master's thesis, Vysoká škola ekonomická v Praze, 2007. http://www.nusl.cz/ntk/nusl-2065.

Повний текст джерела
Анотація:
This thesis introduces current research trends in clickstream analysis and proposes a new heuristic that could be used for dimensionality reduction of semantically enriched data in Web Usage Mining (WUM). Click fraud and conversion fraud are identified as key prospective application areas for WUM. The thesis documents a conversion-fraud vulnerability of Google Analytics and proposes a defense: new clickstream acquisition software that collects data in sufficient granularity and structure to allow data mining approaches to fraud detection. Three variants of the K-means clustering algorithm and three association rule data mining systems are evaluated and compared on real-world web usage data.
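As a rough illustration of the clustering step evaluated in the thesis (not its actual feature set, K-means variants or data), the sketch below clusters hypothetical per-session clickstream features with scikit-learn; the feature names, the session values and the choice of three clusters are all assumptions.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-session features: [pages viewed, session length (s), conversion flag]
sessions = np.array([
    [  3,   45, 0],
    [ 25,  900, 1],
    [  2,   10, 0],
    [ 30, 1200, 1],
    [  1,    5, 0],
    [400,   60, 0],   # suspicious: hundreds of clicks in one minute
])

# Scale features so session length does not dominate the Euclidean distance.
scaled = StandardScaler().fit_transform(sessions)

# Plain K-means; the thesis compares three variants, which the abstract does not name.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scaled)
print(labels)   # sessions with similar behaviour end up in the same cluster

In a fraud-detection setting, sessions that land in a sparse or extreme cluster (such as the high-click, short-duration session above) would be the natural candidates for closer inspection.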
Стилі APA, Harvard, Vancouver, ISO та ін.
42

Wong, Mee-heung Cecilia. "Shape analysis /." [Hong Kong : University of Hong Kong], 1994. http://sunzi.lib.hku.hk/hkuto/record.jsp?B13637642.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
43

Erickson, Brice Carl. "Multicomponent flow injection analysis and quantitative infrared emission spectroscopy : chemometric applications /." Thesis, Connect to this title online; UW restricted, 1988. http://hdl.handle.net/1773/8633.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
44

Zhang, Hao. "Nondeterministic Linear Static Finite Element Analysis: An Interval Approach." Diss., Available online, Georgia Institute of Technology, 2005, 2005. http://etd.gatech.edu/theses/available/etd-08232005-020145/.

Повний текст джерела
Анотація:
Thesis (Ph. D.)--Civil and Environmental Engineering, Georgia Institute of Technology, 2006.
White, Donald, Committee Member ; Will, Kenneth, Committee Member ; Zureick, Abdul Hamid, Committee Member ; Hodges, Dewey, Committee Member ; Muhanna, Rafi, Committee Chair ; Haj-Ali, Rami, Committee Member.
Стилі APA, Harvard, Vancouver, ISO та ін.
45

Vera, Xavier. "Towards a static cache analysis for whole program analysis /." Västerås : Mälardalen Univ, 2002. http://www.mrtc.mdh.se/publications/0382.pdf.

Повний текст джерела
Стилі APA, Harvard, Vancouver, ISO та ін.
46

Blackburn, Gavin J. "Breath analysis : methodology towards a fieldable breath analysis device." Thesis, Loughborough University, 2011. https://dspace.lboro.ac.uk/2134/8518.

Повний текст джерела
Анотація:
In this work lung cancer is introduced along with the current detection methods. The inadequacies of the current situation are highlighted, along with the need for better detection technologies that would allow a more rigorous testing regime to be implemented. Metabolism and metabolites are introduced as potential biomarkers. The advanced detection techniques mass spectrometry (MS) and differential mobility spectrometry (DMS) are introduced and discussed with regard to being a fieldable device. The methods applicable to processing data generated by these instruments are discussed. Finally, the research objectives are highlighted.

The science of breath sampling is discussed along with the considerations when engaging in breath analysis research. Sampling and trapping of volatile organic compounds (VOCs) is discussed with particular emphasis on the adaptive breath sampler used in this work. The benefits of a dual-detector instrument allowing for analysis of a single sample using both MS and DMS are outlined. The design and implementation of a parallel, two-detector system is outlined, including the intricacies of balancing the two columns that operate at different pressures and developing a mount.

Processing DMS data currently lags behind the available hardware, as there are no methods that allow the full data surface to be utilised. This work outlines a method for transforming DMS data from three dimensions to two dimensions while retaining the full information contained within the data surface. This method was tested with generated data sets to show its utility and compared to the current standard processing method using real data sets.

An understanding of all aspects of a clinical research project is vital to ensure the smooth running and completion of the project. The documentation currently required for an outside researcher to work within the NHS is detailed, along with the expected timeframe for each step of designing, gaining ethical approval for, and implementing the research. The use of Gantt charts and work-flow diagrams is highlighted and examples are given.

An initial inspection of the data produced by a pilot study shows that there are several challenges that must be overcome: contamination and artefact peaks, retention time shifting, unresolved peaks, differing intensities in similar samples, and the complexities of correctly identifying compounds found in breath samples. These are discussed and a workflow is highlighted.
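The thesis's own 3D-to-2D transformation is not described in the abstract, so the sketch below shows only the generic chemometric baseline that such a method improves upon: unfolding a three-dimensional DMS data cube (samples x compensation voltage x retention time) into a two-dimensional sample-by-feature matrix. The array shape and axis names are assumptions.

import numpy as np

# Hypothetical DMS data cube: 12 breath samples, 50 compensation-voltage steps,
# 200 retention-time points. Real dimensions depend on the instrument settings.
rng = np.random.default_rng(0)
cube = rng.random((12, 50, 200))

# Generic unfolding: keep the sample axis, flatten the two instrumental axes into
# one feature axis so standard matrix methods (PCA, clustering) can be applied.
n_samples = cube.shape[0]
unfolded = cube.reshape(n_samples, -1)   # shape (12, 10000)
print(unfolded.shape)

Plain unfolding of this kind simply rearranges the values and leaves the dimensionality problem to whatever matrix method follows; the thesis's contribution is a transformation that keeps the full data surface usable.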
Стилі APA, Harvard, Vancouver, ISO та ін.
47

Kennedy, Christopher Brandon. "GPT-Free Sensitivity Analysis for Reactor Depletion and Analysis." Thesis, North Carolina State University, 2015. http://pqdtopen.proquest.com/#viewpdf?dispub=3710730.

Повний текст джерела
Анотація:

Introduced in this dissertation is a novel approach that forms a reduced-order model (ROM), based on subspace methods, that allows for the generation of response sensitivity profiles without the need to set up or solve the generalized inhomogeneous perturbation theory (GPT) equations. The new approach, denoted hereinafter as the generalized perturbation theory free (GPT-Free) approach, computes response sensitivity profiles in a manner that is independent of the number or type of responses, allowing for an efficient computation of sensitivities when many responses are required. Moreover, the reduction error associated with the ROM is quantified by means of a Wilks’ order statistics error metric denoted by the κ-metric.

Traditional GPT has been recognized as the most computationally efficient approach for performing sensitivity analyses of models with many input parameters, e.g., when forward sensitivity analyses are computationally overwhelming. However, most neutronics codes that can solve the fundamental (homogeneous) adjoint eigenvalue problem do not have GPT (inhomogeneous) capabilities unless these were envisioned during code development. Additionally, codes that use a stochastic algorithm, i.e., Monte Carlo methods, may have difficult or undefined GPT equations. Even when GPT calculations are available through software, the aforementioned efficiency gained from the GPT approach diminishes when the model has both many output responses and many input parameters. The GPT-Free approach addresses these limitations, first by requiring only the ability to compute the fundamental adjoint from perturbation theory, and second by constructing a ROM from fundamental adjoint calculations, constraining input parameters to a subspace. This approach bypasses the requirement to perform GPT calculations while simultaneously reducing the number of simulations required.

In addition to the reduction of simulations, a major benefit of the GPT-Free approach is explicit control of the reduced-order model (ROM) error. When building a subspace using the GPT-Free approach, the reduction error can be selected based on an error tolerance for generic flux response-integrals. The GPT-Free approach then solves the fundamental adjoint equation with randomly generated sets of input parameters. Using properties from linear algebra, the fundamental k-eigenvalue sensitivities, spanned by the various randomly generated models, can be related to response sensitivity profiles by a change of basis. These sensitivity profiles are the first-order derivatives of the responses with respect to the input parameters. The quality of the basis is evaluated using the κ-metric, developed from Wilks' order statistics, on the user-defined response functionals that involve the flux state-space. Because the κ-metric is formed from Wilks' order statistics, a probability-confidence interval can be established around the reduction error based on user-defined responses such as fuel flux, maximum flux error, or other generic inner products requiring the flux. In general, the GPT-Free approach will produce a ROM with a quantifiable, user-specified reduction error.

This dissertation demonstrates the GPT-Free approach for steady-state and depletion reactor calculations modeled by SCALE6, an analysis tool developed by Oak Ridge National Laboratory. Future work includes the development of GPT-Free for new Monte Carlo methods where the fundamental adjoint is available. Additionally, the approach in this dissertation examines only the first derivatives of responses, the response sensitivity profile; extension and/or generalization of the GPT-Free approach to higher-order response sensitivity profiles is a natural area for future research.
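A minimal sketch of the subspace construction and the Wilks-type error check described in the abstract is given below, under stated assumptions: a made-up low-rank "sensitivity" function stands in for the fundamental adjoint solves, the retained rank and sample counts are arbitrary, and SCALE6 and the dissertation's actual operators are not involved. The 59-sample rule used at the end is the standard first-order Wilks 95/95 criterion (0.95**59 < 0.05).

import numpy as np

rng = np.random.default_rng(1)
n_params = 200

def fake_sensitivity(x: np.ndarray) -> np.ndarray:
    # Stand-in for a k-eigenvalue sensitivity vector returned by one adjoint solve;
    # it has an effective rank of 5 so a small subspace captures it.
    modes = np.sin(np.outer(np.arange(1, 6), np.linspace(0, 1, n_params)))
    return modes.T @ (modes @ x)

# 1) Build the reduced basis from sensitivities at randomly generated parameter sets.
samples = [fake_sensitivity(rng.standard_normal(n_params)) for _ in range(30)]
U, s, _ = np.linalg.svd(np.column_stack(samples), full_matrices=False)
Q = U[:, :5]   # retained subspace, rank chosen from the singular-value decay

# 2) Wilks-style check: the largest relative projection error over 59 fresh random
#    samples bounds the 95th-percentile error with 95% confidence.
errors = []
for _ in range(59):
    g = fake_sensitivity(rng.standard_normal(n_params))
    errors.append(np.linalg.norm(g - Q @ (Q.T @ g)) / np.linalg.norm(g))
print(f"95/95 bound on relative reduction error: {max(errors):.2e}")

In this toy setting the bound comes out near machine precision because the fake sensitivities really do live in a five-dimensional subspace; for a real model the same check quantifies how much is lost by constraining the input parameters to the subspace.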

Стилі APA, Harvard, Vancouver, ISO та ін.
48

PACHAS, MAURO ARTEMIO CARRION. "LIMIT ANALYSIS WITH LARGE SCALE OPTIMIZER AND RELIABILITY ANALYSIS." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2009. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=31860@1.

Повний текст джерела
Анотація:
PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO
COORDENAÇÃO DE APERFEIÇOAMENTO DO PESSOAL DE ENSINO SUPERIOR
CONSELHO NACIONAL DE DESENVOLVIMENTO CIENTÍFICO E TECNOLÓGICO
PROGRAMA DE EXCELENCIA ACADEMICA
O presente trabalho tem por objetivo desenvolver um otimizador eficiente de grande escala, que permita a aplicabilidade prática da Análise Limite Numérica pelo MEF, para resolver problemas reais da Engenharia Geotécnica. Para isto, foi desenvolvido um otimizador para o programa GEOLIMA (GEOtechnical LIMit Analysis) (Carrión, 2004) baseado no algoritmo de Pontos Interiores, computacionalmente mais eficiente que os otimizadores comerciais existentes. Pelo fato das propriedades do solo serem de natureza aleatória, a possibilidade de aplicar Análise de Confiabilidade com a Análise Limite pelo método FORM em problemas geotécnicos é pesquisada também. Sendo a grande vantagem do método FORM a possibilidade de se aplicar para funções de falha quaisquer e variáveis com distribuição quaisquer. Inicialmente, são apresentados os fundamentos da teoria de Análise Limite e sua formulação numérica pelo MEF (Método dos Elementos Finitos). A seguir, é investigada a possibilidade de se usar otimizadores comerciais para resolver o problema matemático resultante da aplicação de Análise Limite com o MEF e são descritos os fundamentos teóricos do otimizador implementado baseado no algoritmo de Pontos Interiores. Um resumo dos fundamentos teóricos da Análise de Confiabilidade é apresentado. É descrito o processo de cálculo pelo método FORM e dois exemplos de aplicação são realizados. Finalmente, análises de diferentes problemas resolvidos com o otimizador implementado são apresentados indicando o grande potencial da Análise Limite Numérica, na solução de problemas reais da Engenharia Geotécnica.
This work has as its main objective the development of an efficient, large-scale optimizer that allows the practical application of Numerical Limit Analysis (NLA) with the Finite Element Method (FEM) to solve real problems in Geotechnical Engineering. For that purpose, an optimizer based on an interior-point algorithm, computationally more efficient than existing commercial optimizers, was developed for the GEOLIMA (GEOtechnical LIMit Analysis) program (Carrión, 2004). Because soils have random properties, the possibility of applying Reliability Analysis together with Limit Analysis using the FORM method was also investigated. Initially, Limit Analysis theory was presented together with its numerical formulation using the FEM. Next, the use of commercial optimizers to solve the resulting mathematical problem was investigated. Subsequently, the theoretical foundations of the developed optimizer, based on the interior-point algorithm, were described. A summary of Reliability Analysis was also presented, together with a description of the computational procedures using FORM, and two examples were developed. Finally, analyses of different problems solved with the developed optimizer were presented. The results obtained demonstrate the great potential of Numerical Limit Analysis (NLA) in the solution of real problems in Geotechnical Engineering.
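For the simplest case of a linear limit-state function with independent normal variables, the FORM calculation mentioned above reduces to a closed-form reliability index; the sketch below computes it for a hypothetical resistance-load pair (the distributions and numbers are invented for illustration and are not taken from the thesis).

from math import sqrt
from statistics import NormalDist

# Hypothetical limit state g = R - S: failure when the load effect S exceeds the resistance R.
mu_R, sigma_R = 150.0, 15.0   # resistance, e.g. collapse load from limit analysis (assumed units)
mu_S, sigma_S = 100.0, 20.0   # load effect (assumed units)

# For independent normal R and S and a linear g, FORM gives the exact result:
beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)   # reliability index
pf = NormalDist().cdf(-beta)                            # probability of failure
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")

For the nonlinear limit-state functions that arise in limit analysis, FORM instead iterates to the most probable failure point (for example with the HL-RF algorithm), but the output is the same pair: a reliability index and the corresponding failure probability.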
Стилі APA, Harvard, Vancouver, ISO та ін.
49

Mkhatshwa, Elijah Johan. "Grammatical analysis: its role in the reading of legal texts." Thesis, 2007. http://hdl.handle.net/10530/348.

Повний текст джерела
Анотація:
Submitted to the Faculty of Arts in fulfilment of the requirements for the Degree of Doctor of Philosophy in the Department of General Linguistics at the University of Zululand, 2007.
In almost all the statutory sentences that obtain in the statutes of the University of Zululand and the University of Swaziland respectively, modification and subordination, or rather embedding, form part of the essential techniques used by the writers to enhance the communicative potential of the sentences. The objective of the study, therefore, was to establish that using adjectival and adverbial information in legal texts does have an effect on the act of reading and interpretation and on the resultant meaning of the text. The construction of the sentences in the two statutes favours the study's hypotheses. The first hypothesis is that using adjectival and adverbial information in legal texts significantly enhances clarity and precision of the expression as mediated by the text. The second hypothesis is that reference, both within the nominal group and the verbal group in legal texts, is susceptible of further specification.

In chapter two, we argue, in Bex's (1996:95) terms, that texts orient themselves to readers in particular ways and organize their information in ways appropriate to the medium selected and the context in which they occur. We also note that in the construction and interpretation of texts due attention is given to the elements in the language which are capable of encoding various functions, and that particular realizations of these functions determine the register of the text under consideration (cf. Bex, 1996:95). In our analysis of the statutes of the University of Zululand and the University of Swaziland respectively, we establish that language varies according to the activity in which it plays a part (Leech et al., 1982:10). We also establish that sentences with different structures have different communicative functions and that one important property of a sentence is its communicative potential (Akmajian et al., 1995:229). This communicative potential of sentences, with specific reference to the statutory sentences under discussion, is, as already indicated, enhanced by using modification both within the nominal group and the verbal group.

Thus, it is worth emphasizing that in enhancing the effectiveness and communicative potential of the statutory sentences in order to achieve clarity and precision of the expression, modifying elements carrying adjectival and adverbial information are put to use in constructing the sentences. In consequence thereof, modification, which employs non-nuclear constituents, is accorded a central role in determining the effectiveness of the sentences, whilst the acceptability of the sentences in terms of their grammaticality is determined solely by the nuclear constituents. Thus the argument that the occurrence of a modifier is never essential for the internal structure of a noun phrase and that a modifier can be easily omitted without affecting the acceptability of the noun phrase (Aarts and Aarts, 1988:63) is, in our view, not at issue. Our concern is not so much with the acceptability of both reference and predication within the structure of the sentence. Rather, we are concerned with whether the communicative potential or effectiveness of the sentences makes it possible for the communicative intent to be realized as intended. Our analysis of the sentences in the statutes in question demonstrates that the necessary specification is contained in the modifier and that a modifier has the effect of explicitness and of specifying precisely that which is the point of information (Halliday and Hasan, 1997:96).
Our view, therefore, is that although non-nuclear constituents (modifiers) in a sentence are optional, their role of specification cannot go unnoticed, since they are tightly integrated into the structure of the clause (cf. Huddleston and Pullum, 2005). This view is corroborated by Akmajian et al.'s (1995:223) argument that the meaning of a syntactically complex expression is determined by the meaning of its constituents and their grammatical relations. Hence we argue that, notwithstanding the fact that nuclear constituents are obligatory for the sentence to be accepted as grammatical, the grammaticality of the sentence as determined by the nuclear constituents does not necessarily translate into its effectiveness as a communicative device of information. It bears repeating, therefore, that in almost all the statutory sentences of the two universities, modification and subordination, or rather embedding, form part of the essential techniques used by the writers to enhance the communicative potential and effectiveness of the sentences.
Стилі APA, Harvard, Vancouver, ISO та ін.
50

Lee, Arthur Foreman. "The application of anodic-stripping voltammetry to the determination of trace elements in standard reference materials." Thesis, 2015. http://hdl.handle.net/10210/14920.

Повний текст джерела
Анотація:
Ph.D. (Chemistry)
Materials that are to be used as reference samples are frequently analysed using costly and sophisticated instrumentation, itself calibrated with similar certified standards. Analytical programmes using such instrumentation are only as accurate as the initial calibrations, and their poor results have led to the adoption by the United States National Bureau of Standards of definitive methods of analysis for the determination of trace elements ...
Стилі APA, Harvard, Vancouver, ISO та ін.