
Dissertations / Theses on the topic 'Equity statistics'



Consult the top 50 dissertations / theses for your research on the topic 'Equity statistics.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.

1

Oladele, Oluwatosin Seun. "Low volatility alternative equity indices." Master's thesis, University of Cape Town, 2015. http://hdl.handle.net/11427/15691.

Full text
Abstract:
In recent years, there has been increasing interest in constructing low volatility portfolios. These portfolios have shown significant outperformance when compared with market capitalization-weighted portfolios. This study analyses low volatility portfolios in South Africa using sectors instead of individual stocks as building blocks for portfolio construction. The empirical results from back-testing these portfolios show significant outperformance when compared with their market capitalization-weighted equity benchmark counterpart (ALSI). In addition, a further analysis delves into the construction of low volatility portfolios using the Top 40 and Top 100 stocks. The results again show significant outperformance over the market-capitalization portfolio (ALSI), with the portfolios constructed using the Top 100 stocks performing better than the portfolios constructed using the Top 40 stocks. Finally, the low volatility portfolios are also blended with typical portfolios (the ALSI and SWIX indices) in order to establish their usefulness as effective portfolio strategies. The results show that the low volatility Single Index Model (SIM) and the equally weighted low-beta portfolio (Lowbeta) were the superior performers based on their Sharpe ratios.
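The core mechanics behind such low volatility portfolios, inverse-volatility weighting and a Sharpe-ratio comparison against a naive benchmark, can be sketched on synthetic data. All numbers below are hypothetical placeholders, not the thesis's ALSI/SWIX data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical monthly returns for five sectors (rows: months, columns: sectors),
# scaled so sector 0 is the least volatile and sector 4 the most volatile.
scales = np.array([0.5, 0.8, 1.0, 1.3, 1.6])
returns = rng.normal(0.008, 0.04, size=(120, 5)) * scales

vol = returns.std(axis=0, ddof=1)        # per-sector volatility
w_lowvol = (1 / vol) / (1 / vol).sum()   # inverse-volatility weights
w_equal = np.full(5, 0.2)                # naive equally weighted benchmark

def sharpe(weights, rets, rf=0.0, periods=12):
    """Annualised Sharpe ratio of a fixed-weight portfolio of periodic returns."""
    port = rets @ weights
    return (port.mean() - rf) / port.std(ddof=1) * np.sqrt(periods)

print(sharpe(w_lowvol, returns), sharpe(w_equal, returns))
```

The least volatile sectors receive the largest weights, which is the tilt whose historical payoff the back-tests in the thesis evaluate.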
APA, Harvard, Vancouver, ISO, and other styles
2

Van der Linden, Courtney Adele. "An Historical Analysis of Fiscal Equity in the Commonwealth of Virginia: 2004-2018." Diss., Virginia Tech, 2021. http://hdl.handle.net/10919/103965.

Full text
Abstract:
This research examines the horizontal and vertical equity of public school funding in the Commonwealth of Virginia from 2004 to 2018. This study analyzed and measured the horizontal and vertical equity funding allocations across each reporting division in the Commonwealth of Virginia from FY2004 to FY2018 in two-year increments reflective of the final year in each biennium where the local composite index (LCI) is calculated. Data were collected for the 132 reporting divisions in the Commonwealth of Virginia including funding amounts, student counts, categorical counts, and average daily membership. Weights were applied to specific groups within the study (i.e., economically disadvantaged students, special education students, and English language learners) in order to obtain vertical equity measures. The chosen measures of wealth neutrality and fiscal equity were range, restricted range, restricted range ratio, coefficient of variation, the Theil Index, the Pearson Correlation, regression, slope, elasticity, the Gini Coefficient, and the McLoone Index. At fixed intervals reflecting FY2004, 2006, 2008, 2010, 2012, 2014, 2016 and 2018, the measures were used to analyze the selected data points for each district across the Commonwealth of Virginia with both unweighted and weighted values. The information from these analyses will help inform researchers and educational leaders about the current state of equity for divisions across the Commonwealth of Virginia. Furthermore, it will inform stakeholders about whether or not horizontal and vertical fiscal equity measures have increased or decreased in the selected fiscal years for the Commonwealth of Virginia.
Doctor of Education
This research examines the equity of public school funding in the Commonwealth of Virginia from 2004 to 2018 in two different ways. First, the research measures equity where every student is mathematically identical, which is how funding currently works; this is called horizontal equity. The second measure of equity in this research applies mathematical weights of different amounts to students with different classifications that historically cost more to educate (i.e., economically disadvantaged students, special education students, and English language learners) (Berne and Stiefel, 1984; Verstegen and Knoeppel, 2012); this is referred to as vertical equity. This study analyzed and measured the horizontal and vertical equity funding allocations across each reporting division in the Commonwealth of Virginia from fiscal year 2004 to fiscal year 2018 in two-year increments. This is because every two years, the amount of funding a division receives is recalculated, as is the division's ability to pay, also known as the local composite index (LCI). For the purposes of this study, the final year of each two-year cycle was analyzed. Data were collected for the 132 reporting divisions in the Commonwealth of Virginia including funding amounts, student counts, categorical counts, and average daily membership. Weights were applied to specific groups within the study (i.e., economically disadvantaged students, special education students, and English language learners) in order to obtain vertical equity measures. The chosen measures of wealth neutrality and fiscal equity were range, restricted range, restricted range ratio, coefficient of variation, the Theil Index, the Pearson Correlation, regression, slope, elasticity, the Gini Coefficient, and the McLoone Index.
At fixed intervals reflecting FY2004, 2006, 2008, 2010, 2012, 2014, 2016 and 2018, the measures were used to analyze the selected data points for each district across the Commonwealth of Virginia with both unweighted and weighted values. The information from these analyses will help inform researchers and educational leaders about the current state of equity for divisions across the Commonwealth of Virginia. Furthermore, it will inform stakeholders about whether or not horizontal and vertical fiscal equity measures have increased or decreased in the selected fiscal years for the Commonwealth of Virginia.
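Several of the horizontal-equity statistics named above have compact formulas. A minimal sketch, using made-up per-pupil funding figures rather than the Virginia data, is:

```python
import numpy as np

# Hypothetical per-pupil funding amounts for a handful of divisions (not Virginia data)
funding = np.array([9500.0, 10200.0, 11000.0, 11800.0, 12500.0, 14000.0, 16500.0])

def coefficient_of_variation(x):
    """Dispersion relative to the mean; 0 indicates perfect horizontal equity."""
    return x.std(ddof=0) / x.mean()

def gini(x):
    """Gini coefficient via the mean-absolute-difference formulation (0 = perfect equity)."""
    n = len(x)
    return np.abs(x[:, None] - x[None, :]).sum() / (2 * n * n * x.mean())

def mcloone(x):
    """McLoone index: actual funding at or below the median divided by the cost
    of bringing those same divisions exactly to the median (1 = perfect equity)."""
    med = np.median(x)
    below = x[x <= med]
    return below.sum() / (len(below) * med)

def theil(x):
    """Theil index: 0 for perfect equity, larger values for more inequality."""
    r = x / x.mean()
    return float(np.mean(r * np.log(r)))

print(coefficient_of_variation(funding), gini(funding), mcloone(funding), theil(funding))
```

Applying the same statistics to weighted pupil counts (the categorical weights described above) yields the corresponding vertical equity measures.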
APA, Harvard, Vancouver, ISO, and other styles
3

Barkhagen, Mathias. "Risk-Neutral and Physical Estimation of Equity Market Volatility." Licentiate thesis, Linköpings universitet, Produktionsekonomi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-94360.

Full text
Abstract:
The overall purpose of the PhD project is to develop a framework for making optimal decisions on the equity derivatives markets. Making optimal decisions refers, for example, to how to optimally hedge an options portfolio or how to make optimal investments on the equity derivatives markets. The framework for making optimal decisions will be based on stochastic programming (SP) models, which means that it is necessary to generate high-quality scenarios of market prices at some future date as input to the models. This leads to a situation where the traditional methods described in the literature for modeling market prices do not provide scenarios of sufficiently high quality as input to the SP model. Thus, the main focus of this thesis is to develop methods that improve the estimation of option-implied surfaces from a cross-section of observed option prices compared with the traditional methods described in the literature. The estimation is complicated by the fact that observed option prices contain a lot of noise and possibly also arbitrage. This means that, in order to estimate option-implied surfaces which are free of arbitrage and of high quality, the noise in the input data has to be adequately handled by the estimation method. The first two papers of this thesis develop a non-parametric, optimization-based framework for the estimation of high-quality, arbitrage-free option-implied surfaces. The first paper covers the estimation of the risk-neutral density (RND) surface and the second paper the local volatility surface. Both methods provide smooth and realistic surfaces for market data. Estimation of the RND is a convex optimization problem, but the result is sensitive to the parameter choice. When the local volatility is estimated, the parameter choice is much easier, but the optimization problem is non-convex, although the algorithm does not seem to get stuck in local optima.
The SP models used to make optimal decisions on the equity derivatives markets also need generated scenarios for the underlying stock prices or index levels as input. The third paper of this thesis deals with the estimation and evaluation of existing equity market models. It gives preliminary results which show that, of the compared models, a GARCH(1,1) model with Poisson jumps provides a better fit for the Swedish OMXS30 index than more complex models with stochastic volatility.
(Swedish abstract, translated:) The overall purpose of the PhD project is to develop a framework for making optimal decisions in the equity derivatives markets. Making optimal decisions refers, for example, to how to optimally hedge an options portfolio or how to make optimal investments in the equity derivatives markets. The framework for making optimal decisions will be based on stochastic programming models (SP models), which means that it is necessary to generate high-quality scenarios of market prices at some future date as input to the SP model. This leads to a situation where the traditional methods described in the literature for modelling market prices do not provide scenarios of sufficiently high quality to serve as input to the SP model. Consequently, the main focus of this thesis is to develop methods that, compared with the traditional methods described in the literature, improve the estimation of the surfaces implied by a given set of observed option prices. The estimation is complicated by the fact that observed option prices contain a lot of noise and possibly also arbitrage. This means that, in order to estimate option-implied surfaces that are arbitrage-free and of high quality, the estimation method needs to handle the noise in the input data adequately. The first two papers of the thesis develop a non-parametric, optimization-based framework for the estimation of high-quality, arbitrage-free option-implied surfaces. The first paper treats the estimation of the risk-neutral density (RND) surface and the second the estimation of the local volatility surface. Both methods produce smooth and realistic surfaces for market data. Estimation of the RND surface is a convex optimization problem, but the result is sensitive to the choice of parameters.
When the local volatility surface is estimated, the parameter choice is much simpler, but the optimization problem is non-convex, although the algorithm does not seem to get stuck in local optima. The SP models used to make optimal decisions in the equity derivatives markets also need, as input, generated scenarios for the underlying stock prices or index levels. The third paper of the thesis treats the estimation and evaluation of existing equity market models. It provides preliminary results showing that, of the compared models, a GARCH(1,1) model with Poisson jumps gives a better description of the dynamics of the Swedish OMXS30 equity index than more complicated models that include stochastic volatility.
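The GARCH(1,1) variance recursion at the heart of the third paper's preferred model can be sketched as follows; the Poisson-jump component is omitted for brevity, and the returns and parameter values are placeholders:

```python
import numpy as np

def garch_filter(returns, omega, alpha, beta):
    """GARCH(1,1) conditional-variance recursion:
    sigma2[t] = omega + alpha * returns[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = np.empty_like(returns)
    sigma2[0] = returns.var()  # initialise at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

def gaussian_loglik(returns, sigma2):
    """Gaussian log-likelihood of the returns given the filtered variances."""
    return -0.5 * np.sum(np.log(2 * np.pi * sigma2) + returns ** 2 / sigma2)

rng = np.random.default_rng(0)
r = rng.normal(0, 0.01, 500)  # placeholder daily index returns
s2 = garch_filter(r, omega=1e-6, alpha=0.08, beta=0.90)
print(gaussian_loglik(r, s2))
```

Maximising this log-likelihood over (omega, alpha, beta) fits the model; adding jumps replaces the Gaussian density with a Poisson mixture of Gaussians.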
APA, Harvard, Vancouver, ISO, and other styles
4

Khuzwayo, Bhekinkosi. "Understanding the low volatility anomaly in the South African equity market." Master's thesis, University of Cape Town, 2015. http://hdl.handle.net/11427/20256.

Full text
Abstract:
The Capital Asset Pricing Model (CAPM) advocates that expected return has a linear, proportional relationship with beta (and subsequently volatility). As such, the higher the systematic risk of a security, the higher the CAPM expected return. However, empirical results have hardly supported this view, as argued as early as Black (1972). Instead, an anomaly has been evidenced across a multitude of developed and emerging markets, where portfolios constructed to have lower volatility have outperformed their higher-volatility counterparts, as found by Baker and Haugen (2012). This result has been found to exist in most equity markets globally. In the South African market, the studies of Khuzwayo (2011), Panulo (2014) and Oladele (2014) focused on establishing whether low volatility portfolios had outperformed market-cap weighted portfolios. While they found this to be the case, it is important to understand whether this is truly an anomaly or just a result of prevailing market conditions that have rewarded lower-volatility stocks over the back-test period. As such, those conditions might not exist in the future, and low volatility portfolios might then underperform. This research does not aim to show, yet again, the existence of this 'anomaly'; instead, the aim is to examine whether there is any theoretical backing for low volatility portfolios to outperform high volatility portfolios. If this can be uncovered, it should help one understand whether the 'anomaly' truly exists and whether it can be expected to continue into the future.
APA, Harvard, Vancouver, ISO, and other styles
5

Sumner, Steven W. "Bank equity and the monetary transmission mechanism /." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2003. http://wwwlib.umi.com/cr/ucsd/fullcit?p3099930.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Qvennerstedt, Eric, and William Svensson. "Pairs trading on the Swedish equity market; Cointegrate and Capitalize." Thesis, Uppsala universitet, Statistiska institutionen, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-353020.

Full text
Abstract:
This thesis investigates the long- and short-run stability of cointegrated dual-share equity pairs on the Swedish equity market. Testing for a cointegrated relationship in each pair is carried out over a 13-year period to establish the cointegrated pairs. The stability of each cointegrated pair is then estimated using a rolling two-year window, and an arbitrage trading strategy is applied to the cointegrated pairs over the following one-year period. The long-run relationship of the pairs is found to be stable. The short-term relationship varies from pair to pair, with some pairs breaking their cointegrated relationship for some time periods; generally, however, most pairs are stable over the short term as well. The trading strategy generates the highest returns during volatile market conditions and underperforms during positive market conditions with low volatility. The Sharpe ratio is far better than that of the index over the whole period.
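A minimal sketch of the Engle-Granger-style first step and the z-score entry rule such pairs strategies typically use; the thesis's exact tests and thresholds are not reproduced here, and the entry level is a placeholder:

```python
import numpy as np

def hedge_ratio(y, x):
    """OLS of y on x (the Engle-Granger first step); returns (intercept, slope)."""
    slope, intercept = np.polyfit(x, y, 1)
    return intercept, slope

def zscore_signal(y, x, entry=2.0):
    """Z-score of the cointegrating spread and a simple entry rule:
    +1 = long spread (long y, short x), -1 = short spread, 0 = flat.
    Exit logic and the formal stationarity test are omitted."""
    intercept, slope = hedge_ratio(y, x)
    spread = y - (intercept + slope * x)
    z = (spread - spread.mean()) / spread.std(ddof=1)
    signal = np.where(z < -entry, 1, np.where(z > entry, -1, 0))
    return z, signal
```

In practice the hedge ratio is re-estimated on a rolling window, mirroring the rolling two-year stability checks described in the abstract.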
APA, Harvard, Vancouver, ISO, and other styles
7

Ngundze, Unathi. "Statistical comparison of international size-based equity index using a mixture distribution." Thesis, Nelson Mandela Metropolitan University, 2011. http://hdl.handle.net/10948/d1012367.

Full text
Abstract:
Investors and financial analysts spend an inordinate amount of time, resources and effort in an attempt to perfect the science of maximising the level of financial returns. To this end, the field of distribution modelling and the analysis of the firm size effect are important as investment analysis and appraisal tools. Numerous studies have been conducted to determine which distribution best fits stock returns (Mandelbrot, 1963; Fama, 1965; Akgiray and Booth, 1988). Analysis and review of earlier research has revealed that researchers claim that the returns follow a normal distribution. However, the findings have not been without their own limitations, in that many also say that the research does not account for the fat tails and skewness of the data. Some research studies dealing with the anomaly of the firm size effect have led to the conclusion that smaller firms tend to command higher returns relative to their larger counterparts with a similar risk profile (Banz, 1981). Recently, Janse van Rensburg et al. (2009a) conducted a study in which both the non-normality of stock returns and the firm size effect were addressed simultaneously. They used a scale mixture of two normal distributions to compare the stock returns of large-capitalisation and small-capitalisation share portfolios. The study concluded that in periods of high volatility the small-capitalisation portfolio is far more risky than the large-capitalisation portfolio, while in periods of low volatility they are equally risky. Janse van Rensburg et al. (2009a) identified a number of limitations to the study. These included data problems, survivorship bias, the exclusion of dividends, and the use of standard statistical tests in the presence of non-normality. They concluded that it was difficult to generalise the findings because of the use of only two (limited) portfolios.
In an extension of the research, Janse van Rensburg (2009b) concluded that a scale mixture of two normal distributions provided a superior fit to any other mixture. The scope of this research is an extension of the work of Janse van Rensburg et al. (2009a) and Janse van Rensburg (2009b), with a view to addressing several of the limitations and findings of the earlier studies. The Janse van Rensburg (2009b) study was based on data from the Johannesburg Stock Exchange (JSE); this study looks at the New York Stock Exchange (NYSE) to determine whether similar results occur in a developed market. For analysis purposes, this study used the statistical software package R (R Development Core Team, 2008) and its package mixtools (Young, Benaglia, Chauveau, Elmore, Hettmansperger, Hunter, Thomas and Xuan, 2008). Some computation was also done using Microsoft Excel. This dissertation is arranged as follows: Chapter 2 is a literature review of some of the baseline studies and of research supporting the conclusion that earlier findings had serious limitations. Chapter 3 describes the data used in the study and gives a breakdown of portfolio formation and the methodology used. Chapter 4 provides the statistical background of the methods used. Chapter 5 presents the statistical analysis and distribution fitting of the data. Finally, Chapter 6 gives conclusions drawn from the results obtained in the analysis of the data, as well as recommendations for future work.
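The scale mixture of two normals used in those studies (a common mean with two variance regimes) can be fitted with a short EM loop. This is an illustrative Python sketch, not the R/mixtools code the dissertation used:

```python
import numpy as np

def norm_pdf(x, mu, s):
    """Normal density, written out to keep the sketch dependency-free."""
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def em_scale_mixture(x, iters=200):
    """EM for a two-component scale mixture of normals,
    f(x) = p*N(mu, s1^2) + (1-p)*N(mu, s2^2): common mean, two variance regimes."""
    mu, p = x.mean(), 0.5
    s1, s2 = 0.5 * x.std(), 1.5 * x.std()   # start with ordered scales
    for _ in range(iters):
        d1 = p * norm_pdf(x, mu, s1)
        d2 = (1 - p) * norm_pdf(x, mu, s2)
        r = d1 / (d1 + d2)                  # E-step: component responsibilities
        p = r.mean()                        # M-step: mixing weight
        prec = r / s1**2 + (1 - r) / s2**2  # precision-weighted common mean
        mu = np.sum(prec * x) / np.sum(prec)
        s1 = np.sqrt(np.sum(r * (x - mu) ** 2) / r.sum())
        s2 = np.sqrt(np.sum((1 - r) * (x - mu) ** 2) / (1 - r).sum())
    return p, mu, s1, s2
```

The fitted low-variance and high-variance components correspond to the calm and volatile regimes whose relative riskiness the portfolios are compared under.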
APA, Harvard, Vancouver, ISO, and other styles
8

Zhou, Zhenhao. "From valuing equity-linked death benefits to pricing American options." Diss., University of Iowa, 2017. https://ir.uiowa.edu/etd/5690.

Full text
Abstract:
Motivated by the Guaranteed Minimum Death Benefits (GMDB) in variable annuities, we are interested in valuing equity-linked options whose expiry date is the time of the death of the policyholder. Because the time-until-death distribution can be approximated by linear combinations of exponential distributions or mixtures of Erlang distributions, the analysis can be reduced to the case where the time-until-death distribution is exponential or Erlang. We present two probability methods to price American options with an exponential expiry date; both methods give the same results. An American option with an Erlang expiry date can be seen as an extension of the exponential expiry date case. We calculate its price as the sum of the price of the corresponding European option and the early exercise premium. Because the optimal exercise boundary takes the form of a staircase, the pricing formula is a triple sum. We determine the optimal exercise boundary recursively by imposing the "smooth pasting" condition. The examples of the put option, the exchange option, and the maximum option are provided to illustrate how the methods work. Another issue related to variable annuities is the surrender behavior of the policyholders. To model this behavior, we suggest using barrier options. We generalize the reflection principle and use it to derive explicit formulas for outside barrier options, double barrier options with constant barriers, and double barrier options with time-varying exponential barriers. Finally, we provide a method to approximate the distribution of the time-until-death random variable by combinations of exponential distributions or mixtures of Erlang distributions. Compared to directly fitting the distributions, our method has two advantages: 1) it is more robust to the initial guess; 2) it is more likely to obtain the global minimizer.
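The basic setting the abstract describes, an option whose expiry is an exponentially distributed time of death, can be illustrated with a small Monte Carlo valuation under geometric Brownian motion. This is a sketch with made-up parameters, not the dissertation's analytic pricing formulas:

```python
import numpy as np

rng = np.random.default_rng(0)

def gmdb_put_mc(s0, k, r, sigma, lam, n=200_000):
    """Monte Carlo value of a put-style death benefit max(K - S_T, 0) paid at an
    exponentially distributed time of death T ~ Exp(lam), with S following GBM."""
    t = rng.exponential(1.0 / lam, n)       # simulated times until death
    z = rng.standard_normal(n)
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(k - st, 0.0)
    return float(np.mean(np.exp(-r * t) * payoff))

print(gmdb_put_mc(s0=100.0, k=100.0, r=0.03, sigma=0.2, lam=0.05))
```

A mixture-of-exponentials time-of-death distribution is then just a probability-weighted average of such valuations, one per exponential component.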
APA, Harvard, Vancouver, ISO, and other styles
9

Ritchie, Felix. "Accessing the New Earnings Survey Panel Dataset : efficient techniques and applications." Thesis, University of Stirling, 1996. http://hdl.handle.net/1893/21519.

Full text
Abstract:
The New Earnings Survey Panel Dataset is one of the largest datasets of its kind in the world. Its size and confidentiality restrictions present considerable difficulties for analysis using standard econometric packages. This thesis presents a number of methods for accessing the information held within the panel relatively efficiently, based upon the use of cross-product matrices and on data compression techniques. These methods allow, for the first time, the panel aspect of the dataset to be used in analysis. The techniques described here are then employed to produce an overview of changes in the UK labour market from 1975 to 1990 and detailed estimates of male and female earnings over a fourteen-year period. These are the first panel estimates on the dataset, and they indicate the importance of allowing the parameters of any labour market model to vary over time. This is significant as panel estimators typically impose structural stability on the coefficients. A comparison of cross-section and panel estimates of earnings functions for males indicates that allowing for individual heterogeneity also has a notable effect on the estimates produced, implying that simple cross-sections may be significantly biased. Some preliminary estimates of the male-female wage gap indicate that variation over time has an important part to play in accounting for the differences in wages, and that "snapshot" studies may not capture dynamic changes in the labour market. Individual differences also play a significant role in the explanation of the wage gap.
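The point that individual heterogeneity can bias simple cross-sections is the classic motivation for the fixed-effects (within) estimator. A minimal sketch on synthetic data, since the NES panel itself is confidential:

```python
import numpy as np

def within_estimator(y, x, ids):
    """Fixed-effects (within) estimator: demean y and x within each individual,
    so time-invariant individual heterogeneity drops out, then run OLS."""
    yd, xd = y.astype(float).copy(), x.astype(float).copy()
    for i in np.unique(ids):
        m = ids == i
        yd[m] -= yd[m].mean()
        xd[m] -= xd[m].mean()
    return np.sum(xd * yd) / np.sum(xd * xd)
```

When the individual effect is correlated with the regressor, pooled (cross-section style) OLS is biased while the within estimator recovers the true slope, which is exactly the contrast the abstract draws.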
APA, Harvard, Vancouver, ISO, and other styles
10

Franksson, Rikard. "Private Equity Portfolio Management and Positive Alphas." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-275666.

Full text
Abstract:
This project aims to analyze Nordic companies active in the sector of Information and Communications Technology (ICT), and does so in two parts. Part I entails analyzing public companies to construct a valuation model aimed at predicting the enterprise value of private companies. Part II deals with analyzing private companies to determine whether there are opportunities that provide excess returns compared with investments in public companies. In Part I, a multiple regression approach is used to identify suitable valuation models. In doing so, it is revealed that 1-factor models provide the best statistical results in terms of significance and prediction error. In descending order of prediction accuracy, these are (1) total assets, (2) turnover, (3) EBITDA, and (4) cash flow. Part II uses model (1) and finds that Nordic ICT private equity does provide opportunities for positive alphas, and that it is possible to construct portfolio strategies that increase this alpha. However, with regard to previous research, it seems as though the returns offered by the private equity market analyzed do not adequately compensate investors for the additional risks related to investing in private equity.
(Swedish abstract, translated:) This project analyzes Nordic companies active in Information and Communications Technology (ICT) in two parts. Part I deals with the analysis of public companies in order to construct a valuation model intended to predict the enterprise value of private companies. Part II analyzes private companies to investigate whether there are opportunities to achieve excess returns compared with investments in public companies. In Part I, multiple regression analysis is used to identify applicable valuation models. In that process it is shown that models with only one factor give the best statistical results in terms of significance and prediction error. In descending order of prediction accuracy, these models are (1) total assets, (2) turnover, (3) EBITDA, and (4) cash flow. Part II uses model (1) and finds that the Nordic market for private ICT companies offers opportunities for excess returns compared with the corresponding public market, and that it is possible to construct portfolio strategies that increase returns further. However, with regard to previous research, it appears that the return opportunities found in the examined market for private companies do not compensate investors sufficiently for the additional risks related to investing in private companies.
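A 1-factor valuation model of the kind ranked above can be sketched as a simple regression of enterprise value on total assets. The data below are synthetic and the log-log functional form is an assumption for illustration, not the thesis's specification:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical training data for public companies: total assets and enterprise value
assets = rng.lognormal(mean=5.0, sigma=1.0, size=200)
ev = np.exp(0.3) * assets ** 0.9 * rng.lognormal(0.0, 0.2, size=200)

# 1-factor model fitted in logs: log(EV) = a + b * log(total assets)
b, a = np.polyfit(np.log(assets), np.log(ev), 1)

def predict_ev(total_assets):
    """Predict a private company's enterprise value from its total assets."""
    return np.exp(a + b * np.log(total_assets))

print(b, predict_ev(150.0))
```

The fitted model, trained on public comparables, is then applied to private companies' balance-sheet data, which is the bridge between Part I and Part II.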
APA, Harvard, Vancouver, ISO, and other styles
11

Neal, Mary Jo Johnson. "Teaching Probability and Statistics to English Language Learners in Grade Five." Digital Commons @ East Tennessee State University, 2007. https://dc.etsu.edu/etd/2072.

Full text
Abstract:
An increasing number of English Language Learners enrolling in the Washington County Virginia Public School System during the past several years prompted the idea of this thesis. These students are currently mainstreamed in the regular academic classroom. Adapting to their needs is a new challenge in education for teachers in Southwest Virginia. This thesis offers an opportunity for teachers to prepare for a multicultural classroom setting providing English Language Learners with learning strategies necessary to gain confidence in their mathematical ability and academic success in the areas of probability and statistics. Lesson plans have been specifically designed emphasizing teaching strategies, the role of an effective teacher, classroom environment, various cultures and relevant and authentic data.
APA, Harvard, Vancouver, ISO, and other styles
12

Pettersson, Fabian, and Oskar Ringström. "Portfolio Optimization: An Evaluation of the Downside Risk Framework on the Nordic Equity Markets." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-275688.

Full text
Abstract:
Risk management in portfolio construction is a widely discussed topic, and the tradeoff between risk and return is always considered before an investment is made. Modern portfolio theory is a mathematical framework which describes how a rational investor can use diversification to optimize a portfolio, and it suggests using variance to measure financial risk. However, since variance is a symmetrical metric, the framework fails to correctly account for the loss aversion preferences most investors exhibit. Therefore, the use of downside risk measures was proposed; these measure only the variance of the portfolio below a certain threshold, usually set to zero or the risk-free rate. This thesis empirically investigates the differences in performance between the two risk measures when used to solve a real-world portfolio optimization problem. Backtests using the different measures on all major Nordic equity markets are performed to highlight the dynamics between the frameworks, and when one should be preferred over the other. It is concluded that the optimization frameworks indeed provide a useful tool for investors to construct well-performing portfolios. However, even though the downside risk framework is more mathematically rigorous, implementing this risk measure instead of variance seems to be of lesser importance for the actual results.
(Swedish abstract, translated:) Risk management for equity portfolios is very central, and a tradeoff between risk and return is always made before an investment. Modern portfolio theory is a mathematical framework that describes how a rational investor can use diversification to optimize a portfolio. Central to this is using the portfolio's variance to measure risk. However, since variance is a symmetric measure, the framework fails to correctly account for the loss aversion that most investors experience. It has therefore been proposed to instead use measures of downside risk, which only consider the portfolio's variance below a certain return threshold, usually set to zero or the risk-free rate. This study examines the differences in performance between these two risk measures when they are used to solve a real-world portfolio optimization problem. Backtests with the risk measures have been carried out on the various Nordic equity markets to show similarities and differences between the risk measures, and when one is to be preferred over the other. The conclusion is that the frameworks give investors a useful tool for conveniently optimizing portfolios. However, the actual difference between the two risk measures seems to be of minor importance for portfolio performance, even though downside risk is more mathematically rigorous.
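The contrast between variance and the downside (semivariance) measure the thesis compares can be seen on two synthetic return series with similar variance but opposite skew; the series below are illustrative, not the Nordic market data:

```python
import numpy as np

def semivariance(returns, threshold=0.0):
    """Downside semivariance: mean squared shortfall below the threshold."""
    shortfall = np.minimum(returns - threshold, 0.0)
    return float(np.mean(shortfall ** 2))

rng = np.random.default_rng(2)
# Two synthetic return series with similar variance but opposite skew
r_right = rng.exponential(0.02, 1000) - 0.02    # right-skewed (long upside tail)
r_left = -(rng.exponential(0.02, 1000) - 0.02)  # left-skewed (long downside tail)

# Variance treats the two alike; semivariance penalises the left-skewed series more
print(np.var(r_right), np.var(r_left))
print(semivariance(r_right), semivariance(r_left))
```

Replacing variance with this semivariance in the mean-risk objective yields the downside-risk portfolio optimization the thesis backtests.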
APA, Harvard, Vancouver, ISO, and other styles
13

Sävendahl, Carl, and Erik Flodmark. "A Return Maximizing Strategy in Market Rebounds for Swedish Equity Funds." Thesis, KTH, Matematisk statistik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-252747.

Full text
Abstract:
The growing interest in saving via the financial markets implies that competition is intensifying and that managers of Swedish equity funds need to create shareholder value independent of the macroeconomic situation. The Swedish financial market experienced a rapid rebound during the first quarter of 2019, following the plunge in the preceding quarter. This thesis uses multiple linear regression to analyze Swedish equity funds during the first quarter of 2019. The aim is to identify variables affecting fund performance in a market rebound in order to formulate a performance-maximizing strategy. Based on the results for the performance-influencing variables, the strategy is to underweight small-cap stocks, overweight the energy and technology sectors, underweight the communication services sector, and stay neutral to overweighted in the remaining sectors. Furthermore, the strategy proposes an overweighted exposure to North American stocks and an underweight to Western European stocks; the overexposure to North America should be larger in absolute value than the underexposure to Western Europe. The strategy is tentative, since data from only one market rebound are analyzed; it is therefore not proven to be applicable to any market rebound. The model analysis is based on modern macroeconomic and financial theories. In addition, the discussion problematizes the neoclassical view of economics, based on the notion that a combination of rationality and irrationality is prevalent among investors. Further research is essential either to support or to reject the performance-affecting variables and the allocation strategy specified in the thesis.
(Swedish abstract, translated:) The growing interest in investing in the financial markets implies that competition among fund managers is intensifying. Managers of Swedish equity funds must therefore create shareholder value regardless of the macroeconomic situation. The financial market recovered quickly during the first quarter of 2019 after the steep decline in the preceding quarter. This study aims to identify the factors that contributed to the returns of Swedish equity funds during this rebound. Multiple linear regression is used for this purpose and to formulate a return-maximizing strategy. The strategy suggests that managers of Swedish equity funds should underweight small-cap companies, overweight stocks in energy and technology, and underweight stocks in the communications sector. The strategy is further to be neutral to overweighted in the remaining sectors. In addition, the strategy is to overweight North American stocks and underweight Western European stocks; the overweight in North America should be larger in absolute terms than the underweight in Western Europe. The strategy is tentative, as it is based on data from only one market rebound, and is therefore not proven to be applicable to any market rebound. The analysis is based on modern macroeconomic and financial theory. The discussion problematizes the neoclassical view of economics, based on the notion that investors are both irrational and rational in their investment decisions. Further research is essential either to strengthen or to reject the conclusions drawn in this study.
APA, Harvard, Vancouver, ISO, and other styles
14

Park, YoongSoo. "The development and field testing of an instrument for measuring citizens' attitudes toward public school funding in terms of equity, adequacy, and accountability." Ohio : Ohio University, 2010. http://www.ohiolink.edu/etd/view.cgi?ohiou1268147159.

Full text
15

Kwon, Tae Yeon. "Three Essays on Credit Risk Models and Their Bayesian Estimation." Thesis, Harvard University, 2012. http://dissertations.umi.com/gsas.harvard:10427.

Full text
Abstract:
This dissertation consists of three essays on credit risk models and their Bayesian estimation. In each essay, default or default correlation models are built under one of two main streams. In our first essay, sequential estimation of hidden asset value and model parameter estimation are implemented under the Black-Cox model. To capture short-term autocorrelation in the stock market, we assume that market noise follows a mean-reverting process. For estimation, two Bayesian methods are applied in this essay: the particle filter algorithm for sequential estimation of asset value and the generalized Gibbs sampling method for model parameter estimation. The first simulation study shows that sequential hidden asset value estimation using option price and equity price is more efficient and accurate than estimation using only equity price. The second simulation study shows that by applying the generalized Gibbs sampling method, model parameters can be successfully estimated under the model setting in which there is no closed-form solution. In an empirical analysis using eight companies, half of which are Dow Jones 30 companies and the other half non-Dow Jones 30 companies, the stock market noise for the firms with more liquid stock is estimated as having smaller volatility in the market noise processes. In our second essay, the frailty idea described in Duffie, Eckner, Horel, and Saita (2009) is expanded to industry-specific terms. The MCEM algorithm is used to estimate parameters and random effect processes under the condition of unknown hidden paths and analytically difficult likelihood functions. The estimates used in the study are based on U.S. public firms between 1990 and 2008. By introducing industry-specific hidden factors and assuming that they are random effects, a comparison is made of the relative scale of within- and between-industry correlations.
A comparison study is also developed among a without-hidden-factor model, a common-hidden-factor model, and our industry-specific common-factor model. The empirical results show that an industry-specific common factor is necessary for adjusting over- or under-estimation of default probabilities and over- or under-estimation of observed common factor effects. Our third essay combines and extends the work of the first two essays by proposing a common model frame for both structural and intensity credit risk models. The common model frame combines the merits of several default correlation studies which were independently developed under each model setting. Following the work of Duffie, Eckner, Horel, and Saita (2009), we apply not only observed common factors but also an unobserved hidden factor to explain correlated defaults. Bayesian techniques are used for estimation, and generalized Gibbs sampling and Metropolis-Hastings (MH) algorithms are developed. Going beyond a simple combination of the two model approaches (structural and intensity models), we relax the assumption, made in previous studies, of equal factor effects across all firms, instead adopting a random coefficients model. A further novelty of the approach lies in the fact that CDS and equity prices are used together for estimation. A simulation study shows that posterior convergence is improved by adding CDS prices in estimation. Empirical results based on daily data for 125 companies comprising CDX.NA.IG13 in 2009 support the necessity of such relaxations of the assumptions in previous studies. In order to demonstrate potential practical applications of the proposed framework, we derive the posterior distribution of CDX tranche prices. Our correlated structural model successfully predicts all the CDX tranche prices, but our correlated intensity model results suggest the need for further modification of the model.
Statistics
16

Wuilmart, Adam, and Erik Harrysson. "Assessing the Operational Value Creation by the Private Equity Industry in the Nordics." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-275693.

Full text
Abstract:
More and more capital is being directed towards the private equity industry. As a result, private equity owned firms make up an increasingly large share of the economy. Therefore, it is becoming more important to understand how the operational performance of firms changes under private equity ownership. This study looked at how operational efficiency, in terms of EBIT margin, changed over a three-year period after a private equity acquisition in the Nordic market. The study found that companies with an initial positive EBIT margin behaved differently from companies with an initial negative EBIT margin, and therefore two separate models were created. Companies with a positive EBIT margin before being bought by a private equity firm saw an average decrease in EBIT margin of 1.14 percentage points. For firms with an initial negative EBIT margin, a private equity acquisition led to an average increase in EBIT margin of 1.99 percentage points compared to the reference data. This study thus shows that private equity ownership affects the operational efficiency of companies. Moreover, it shows that one should distinguish between PE ownership in profitable growth cases and in turn-around cases of inefficient companies, and that the impact of PE ownership on operational profitability can be vastly different depending on the nature of the acquisition in this regard.
17

Petrov, Pavel. "Cointegration in equity markets: a comparison between South African and major developed and emerging markets." Thesis, Rhodes University, 2011. http://hdl.handle.net/10962/d1005539.

Full text
Abstract:
Cointegration has important implications for portfolio diversification. One of these is that, to spread risk, it is advisable to invest in markets that are not cointegrated. Over the last several decades communication technology has made the world a smaller place, and hence cointegration in equity markets has become more prevalent. The bulk of research into cointegration focuses on developed and Asian markets, with little research done on African markets. This study compares the Engle-Granger and Johansen tests for cointegration and uses them to calculate the level of cointegration between South African and other global equity markets. Each market is compared pair-wise with South Africa, and the general finding is that South Africa is cointegrated with other emerging markets but not with African or developed markets. Short-run analysis with the error correction model was carried out and showed that markets generally respond slowly to any disequilibrium. Innovation accounting methods showed that the country placed first in Cholesky ordering dominates the other one. Multivariate cointegration was carried out using three selections of 4, 6 and 8 market portfolios. One of the markets was SA and the others were all chosen based on the criterion that they are not pair-wise cointegrated with SA. The level of cointegration varied depending on the portfolios, as did the error correction rates, impulse responses and variance decompositions. The one constant was that the USA dominated any portfolio into which it was introduced. Recommendations were finally made about which market portfolio an investor should consider most favourable.
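The two-step Engle-Granger procedure the thesis compares can be sketched in a few lines: regress one price series on the other by OLS, then apply a Dickey-Fuller-type test to the regression residuals. The Python sketch below is an illustration on simulated series, not the South African data, and the simplified statistic must be judged against Engle-Granger critical values rather than standard normal ones:

```python
import numpy as np

def engle_granger_stat(x, y):
    """Two-step Engle-Granger sketch: OLS of y on x, then a
    Dickey-Fuller-type t-statistic on the residuals.  A large
    negative value suggests cointegration (compare against
    Engle-Granger critical values, not the standard normal)."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    # DF regression without lags: d(resid_t) = rho * resid_{t-1} + err
    dz, z = np.diff(resid), resid[:-1]
    rho = (z @ dz) / (z @ z)
    err = dz - rho * z
    se = np.sqrt((err @ err) / (len(dz) - 1) / (z @ z))
    return rho / se

rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(size=2000))   # shared stochastic trend
x = trend + rng.normal(size=2000)
y = 0.8 * trend + rng.normal(size=2000)    # cointegrated with x
stat = engle_granger_stat(x, y)            # strongly negative for this pair
```

Because the two simulated series share a common stochastic trend, the residuals are stationary and the statistic comes out strongly negative, which is the signature of cointegration the pair-wise tests in the study look for.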
18

Tokunaga, Meagan. "Implementing (Environmental) Justice: Equity and Performance in California's S.B. 535." Scholarship @ Claremont, 2015. http://scholarship.claremont.edu/pomona_theses/137.

Full text
Abstract:
This thesis evaluates the equity performance of a recent state environmental justice policy, California’s Senate Bill 535 (S.B. 535). “Environmental justice” refers to the disproportionate environmental harm imposed on low-income and minority communities. S.B. 535 uses competitive grants to provide funding to these communities. The research centers on two questions: (1) to what extent has S.B. 535 experienced successful implementation in its first year of operation, and (2) how can policy actors improve implementation while balancing performance and equity goals? With regard to the first question, I utilize a case study of the policy’s implementation within 17 local governments in Riverside County. I find that the number of actors involved and the alignment of their interests prevent the policy from more successful implementation. Local government officials identify staff capacity as a primary concern in the program’s implementation. I then evaluate the policy’s balance of program performance and equity with an econometric analysis that characterizes the decisions of local governments to implement the policy. I find impressive equity performance, as low-income and minority populations are more likely to participate. The implementing governments have sufficient capacity to achieve program goals, as larger cities and cities with more staff per capita are more likely to participate. My findings support the use of competitive grants in environmental justice policies. The S.B. 535 grant program demonstrates the ability to distribute funding to governments with both socioeconomic disadvantage and the capacity for successful implementation. The analysis concludes with policy recommendations.
19

Jiang, Huijing. "Statistical computation and inference for functional data analysis." Diss., Georgia Institute of Technology, 2010. http://hdl.handle.net/1853/37087.

Full text
Abstract:
My doctoral research dissertation focuses on two aspects of functional data analysis (FDA): FDA under spatial interdependence and FDA for multi-level data. The first part of my thesis focuses on developing a modeling and inference procedure for functional data under spatial dependence. The methodology introduced in this part is motivated by a research study on inequities in accessibility to financial services. The first research problem in this part is concerned with a novel model-based method for clustering random time functions which are spatially interdependent. A cluster consists of time functions which are similar in shape. The time functions are decomposed into spatial global and time-dependent cluster effects using a semi-parametric model. We also assume that the clustering membership is a realization from a Markov random field. Under these model assumptions, we borrow information across curves from nearby locations, resulting in enhanced estimation accuracy of the cluster effects and of the cluster membership. In a simulation study, we assess the estimation accuracy of our clustering algorithm under a series of settings: small number of time points, high noise level and varying dependence structures. Over all simulation settings, the spatial-functional clustering method outperforms existing model-based clustering methods. In the case study presented in this project, we estimate and classify service accessibility patterns varying over a large geographic area (California and Georgia) and over a period of 15 years. The focus of this study is on financial services, but it applies generally to any other service operation. The second research project of this part develops an association analysis of space-time varying processes which is rigorous, computationally feasible and implementable with standard software. We introduce general measures to model different aspects of the temporal and spatial association between processes varying in space and time.
Using a nonparametric spatiotemporal model, we show that the proposed association estimators are asymptotically unbiased and consistent. We complement the point association estimates with simultaneous confidence bands to assess the uncertainty in the point estimates. In a simulation study, we evaluate the accuracy of the association estimates with respect to the sample size as well as the coverage of the confidence bands. In the case study in this project, we investigate the association between service accessibility and income level. The primary objective of this association analysis is to assess whether there are significant changes in the income-driven equity of financial service accessibility over time and to identify potential under-served markets. The second part of the thesis discusses novel statistical methodology for analyzing multilevel functional data, including a clustering method based on a functional ANOVA model and a spatio-temporal model for functional data with a nested hierarchical structure. In this part, I introduce and compare a series of clustering approaches for multilevel functional data. For brevity, I present the clustering methods for two-level data: multiple samples of random functions, each sample corresponding to a case and each random function within a sample/case corresponding to a measurement type. A cluster consists of cases which have similar within-case means (level-1 clustering) or similar between-case means (level-2 clustering). Our primary focus is to compare a model-based clustering approach with more straightforward hard clustering methods. The clustering model is based on a multilevel functional principal component analysis. In a simulation study, we assess the estimation accuracy of our clustering algorithm under a series of settings: small vs. moderate number of time points, high noise level and small number of measurement types.
We demonstrate the applicability of the clustering analysis to a real data set consisting of time-varying sales for multiple products sold by a large retailer in the U.S. My ongoing research work in multilevel functional data analysis is developing a statistical model for estimating temporal and spatial associations of a series of time-varying variables with an intrinsic nested hierarchical structure. This work has a great potential in many real applications where the data are areal data collected from different data sources and over geographic regions of different spatial resolution.
20

Novick-Finder, Taylor. "Stand Clear of the Closing Doors, Please: Transit Equity, Social Exclusion, and the New York City Subway." Scholarship @ Claremont, 2017. http://scholarship.claremont.edu/pitzer_theses/78.

Full text
Abstract:
The history of transportation planning in New York City has created disparities between those who have sufficient access to the public transportation network, and those who face structural barriers to traveling from their home to education, employment, and healthcare opportunities. This thesis analyzes the legacy of discriminatory policy surrounding the Metropolitan Transportation Authority (MTA) and city and state governments that have failed to support vital infrastructure improvement projects and service changes to provide multi-modal welfare to New York’s working poor. By exploring issues of transit equity as they pertain to the New York City subway system, this thesis raises the question: which communities lack adequate access to public transit opportunity and what are the policies and historical developments that have created these inequities? Through examination of grassroots community-based movements towards social justice and transportation equity, this thesis will review the proposals, campaigns, and demands that citizen-driven organizations have fought for in New York City. These movements, I argue, are the most effective method to achieve greater transportation justice and intergenerational equity.
21

Hardy, James C. (James Clifford). "A Monte Carlo Study of the Robustness and Power Associated with Selected Tests of Variance Equality when Distributions are Non-Normal and Dissimilar in Form." Thesis, University of North Texas, 1990. https://digital.library.unt.edu/ark:/67531/metadc332130/.

Full text
Abstract:
When selecting a method for testing variance equality, a researcher should select a method which is robust to distribution non-normality and dissimilarity. The method should also possess sufficient power to ascertain departures from the equal variance hypothesis. This Monte Carlo study examined the robustness and power of five tests of variance equality under specific conditions. The tests examined included one procedure proposed by O'Brien (1978), two by O'Brien (1979), and two by Conover, Johnson, and Johnson (1981). Specific conditions included assorted combinations of the following factors: k=2 and k=3 groups, normal and non-normal distributional forms, similar and dissimilar distributional forms, and equal and unequal sample sizes. Under the k=2 group condition, a total of 180 combinations were examined. A total of 54 combinations were examined under the k=3 group condition. The Type I error rates and statistical power estimates were based upon 1000 replications in each combination examined. Results of this study suggest that when sample sizes are relatively large, all five procedures are robust to distribution non-normality and dissimilarity, as well as being sufficiently powerful.
22

Masindi, Khuthadzo. "Statistical arbitrage in South African equity markets." Master's thesis, University of Cape Town, 2014. http://hdl.handle.net/11427/13427.

Full text
Abstract:
The dissertation implements a model-driven statistical arbitrage strategy that uses the principal components from Principal Component Analysis as factors in a multi-factor stock model, to isolate the idiosyncratic component of returns, which is then modelled as an Ornstein-Uhlenbeck process. The idiosyncratic process (referred to as the residual process) is estimated in discrete time by an autoregressive process with one lag (an AR(1) process). Trading signals are generated based on the level of the residual process. This strategy is then evaluated through backtesting over historical data for the South African equity market from 2001 to 2013. In addition, the strategy is evaluated over data generated from Monte Carlo simulations as well as bootstrapped historical data. The results show that the strategy was able to significantly outperform cash for most of the periods under consideration. The performance of the strategy over data generated from Monte Carlo simulations demonstrated that the strategy is not suitable for markets that are asymptotically efficient.
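The estimation step described above, fitting a discrete-time AR(1) to the residual process and mapping the coefficients back to Ornstein-Uhlenbeck parameters, can be sketched as follows. This is an illustrative reconstruction on simulated data, not the dissertation's code; the trading-day scaling `dt` and the s-score form of the signal are assumptions in the spirit of Avellaneda-Lee-style strategies:

```python
import numpy as np

def ou_from_ar1(resid, dt=1.0 / 252):
    """Fit AR(1): X_{t+1} = a + b*X_t + eps, then map the estimates
    to the parameters of dX = kappa*(m - X) dt + sigma dW."""
    x0, x1 = resid[:-1], resid[1:]
    A = np.column_stack([np.ones_like(x0), x0])
    (a, b), *_ = np.linalg.lstsq(A, x1, rcond=None)
    eps = x1 - A @ np.array([a, b])
    kappa = -np.log(b) / dt                       # mean-reversion speed
    m = a / (1.0 - b)                             # long-run mean
    sigma_eq = np.sqrt(eps.var() / (1.0 - b**2))  # equilibrium std dev
    return kappa, m, sigma_eq

def s_score(resid, m, sigma_eq):
    """Level-based signal: trade when the score is far from zero."""
    return (resid[-1] - m) / sigma_eq

# Simulate an AR(1) residual path with known parameters.
rng = np.random.default_rng(42)
n, a_true, b_true = 5000, 0.1, 0.9
path = np.zeros(n)
for t in range(n - 1):
    path[t + 1] = a_true + b_true * path[t] + rng.normal(scale=0.05)

kappa, m, sigma_eq = ou_from_ar1(path)  # m should sit near 0.1/(1-0.9) = 1.0
```

On the simulated path the recovered long-run mean is close to the true value of 1.0, and a trading rule would open positions when the s-score of the live residual exceeds some threshold in absolute value.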
23

Mdete, Devotha. "INVESTIGATING THE ROBUSTNESS OF MULTIVARIATE TESTS OF EQUALITY OF MEANS USING DIFFERENT SCENARIOS." Thesis, Örebro universitet, Handelshögskolan vid Örebro Universitet, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:oru:diva-49389.

Full text
24

Dunu, Emeka Samuel. "Comparing the Powers of Several Proposed Tests for Testing the Equality of the Means of Two Populations When Some Data Are Missing." Thesis, University of North Texas, 1994. https://digital.library.unt.edu/ark:/67531/metadc278198/.

Full text
Abstract:
In comparing the means of two normally distributed populations with unknown variance, two tests very often used are the two independent sample and the paired sample t tests. There is a possible gain in the power of the significance test by using the paired sample design instead of the two independent samples design.
25

Rudy, Jozef. "Four essays in statistical arbitrage in equity markets." Thesis, Liverpool John Moores University, 2011. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.546739.

Full text
26

Showalter, Daniel A. "Estimating the Causal Effect of High School Mathematics Coursetaking on Placement out of Postsecondary Remedial Mathematics." Ohio University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1395226381.

Full text
27

Li, Haiyin. "Power Analysis for Alternative Tests for the Equality of Means." Digital Commons @ East Tennessee State University, 2011. https://dc.etsu.edu/etd/1304.

Full text
Abstract:
The two sample t-test is the test usually taught in introductory statistics courses to test for the equality of means of two populations. However, the t-test is not the only test available to compare the means of two populations. The randomization test is being incorporated into some introductory courses. There is also the bootstrap test. It is also not uncommon to decide the equality of the means based on confidence intervals for the means of these two populations. Are all those methods equally powerful? Can the idea of non-overlapping t confidence intervals be extended to bootstrap confidence intervals? The powers of seven alternative ways of comparing two population means are analyzed using small samples with data coming from distributions with different degrees of skewness and kurtosis. The analysis is done using simulation; programs in GAUSS were especially written for this purpose.
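A power study of this kind can be reproduced with a small Monte Carlo simulation. The sketch below (in Python rather than the thesis's GAUSS, covering only a randomization test, which is one of the alternatives the thesis considers; the exponential populations, sample sizes and replication counts are arbitrary choices for the illustration) estimates empirical size and power on skewed data:

```python
import numpy as np

rng = np.random.default_rng(2)

def perm_pvalue(a, b, n_perm=200):
    """Two-sided randomization test for equality of means."""
    obs = abs(a.mean() - b.mean())
    pooled = np.concatenate([a, b])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        hits += abs(pooled[:len(a)].mean() - pooled[len(a):].mean()) >= obs
    return hits / n_perm

def empirical_power(delta, n=20, reps=200, alpha=0.05):
    """Rejection rate against a mean shift `delta`, skewed populations."""
    rejections = 0
    for _ in range(reps):
        a = rng.exponential(size=n)        # skewed population
        b = rng.exponential(size=n) + delta
        rejections += perm_pvalue(a, b) < alpha
    return rejections / reps

size_at_null = empirical_power(0.0)   # should sit near alpha
power_at_one = empirical_power(1.0)   # should be substantially higher
```

Running the same loop with the other six procedures (t-test, bootstrap test, interval-overlap rules, and so on) in place of `perm_pvalue` gives the kind of side-by-side power comparison the thesis reports.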
28

Trivette, Carol M., Michael Garrett, Hongxia Zhao, and Carol Landry. "Research Evidence for Environment Recommended Practices." Digital Commons @ East Tennessee State University, 2016. https://dc.etsu.edu/etsu-works/4436.

Full text
29

Waesche, Matthew J. "The equity of punishment in the Naval Academy conduct system : a statistical analysis." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2002. http://library.nps.navy.mil/uhtbin/hyperion-image/02Jun%5FWaesche.pdf.

Full text
Abstract:
Thesis (M.S. in Leadership and Human Resource Development)--Naval Postgraduate School, June 2002.
Thesis advisor(s): J. Eric Fredland, Erik Jansen. Includes bibliographical references (p. 101-102). Also available online.
30

Rogers, Francis H. III. "The measurement and decomposition of achievement equity - an introduction to its concepts and methods including a multiyear empirical study of sixth grade reading scores." The Ohio State University, 2004. http://rave.ohiolink.edu/etdc/view?acc_num=osu1092419197.

Full text
31

Park, Seoungbyung. "Factor Based Statistical Arbitrage in the U.S. Equity Market with a Model Breakdown Detection Process." Thesis, Marquette University, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10280168.

Full text
Abstract:

Many researchers have studied different strategies of statistical arbitrage to provide a steady stream of returns that is unrelated to the market condition. Among different strategies, factor-based mean-reverting strategies have been popular and covered by many. This thesis aims to add value by evaluating the generalized pairs trading strategy and suggesting enhancements to improve out-of-sample performance. The enhanced strategy generated a daily Sharpe ratio of 6.07% in the out-of-sample period from January 2013 through October 2016, with a correlation of -0.03 versus the S&P 500. During the same period, the S&P 500 generated a Sharpe ratio of 6.03%.

This thesis is differentiated from previous relevant studies in three ways. First, the factor selection process in previous statistical arbitrage studies has often been unclear or rather subjective. Second, most of the literature focuses on in-sample results rather than the out-of-sample results of the strategies, which are what practitioners are mainly interested in. Third, by implementing a hidden Markov model, the thesis aims to detect regime changes to improve the timing of trades.

32

Pervaiz, M. K. "A comparison of tests of equality of covariance matrices, with special reference to the case of cluster sampling." Thesis, University of Southampton, 1986. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.374234.

Full text
33

Pollock, Adam. "Statistical examination of the relationship between return on equity and plant investment for natural gas pipelines." CONNECT TO ELECTRONIC THESIS, 2007. http://dspace.wrlc.org/handle/4185.

Full text
34

Kureshy, Imran A. 1965. "Credit derivatives : market dimensions, correlation with equity and implied option volatility, regression modeling and statistical price risk." Thesis, Massachusetts Institute of Technology, 2004. http://hdl.handle.net/1721.1/17896.

Full text
Abstract:
Thesis (M.B.A.)--Massachusetts Institute of Technology, Sloan School of Management, 2004.
Includes bibliographical references (leaves 50-51).
This research thesis explores the market dimensions of credit derivatives including the prevalent product structures, leading participants, market applications and the issues confronting this relatively new product. We find the market continues to experience significant growth particularly in single name default swaps. This growth is fueled in part by increased participation of hedge funds and applications beyond risk management as an acceptable trading instrument. As this market continues to grow it must address the need for specialized technology infrastructure to support continued growth and consistency in documentation to ensure confidence. We then set out to explore the relationship between CDS spreads and the equity markets. We find a strong correlation with implied option volatility. While this relationship does not suggest causation, the magnitude of these relationships should assist market participants in developing effective trading and portfolio management strategies. We also explore the volatility of default spreads and find that there is wide disparity in volatility among common credit ratings. This leads to a suggestion that market participants may be able to reduce spread volatility and earn enhanced risk-adjusted yields by constructing credit portfolios based on spread widening risk rather than default risk. Finally, since the focus of this thesis report was market application, we take the analyses and develop a regression model that serves as a quick and easy to use reference tool for credit derivative market participants. The data sample for this research paper spans 97 issuers across 19 industries and 10 credit rating levels including non-investment grade. The sample period covers daily observations between
September 20, 2002 and December 31, 2003.
by Imran A. Kureshy.
M.B.A.
35

Costa, Ana Maria Paes Scott da. "Análise temporal da ocorrência da anemia infecciosa equina no Brasil no período de 2005 a 2016 /." Jaboticabal, 2018. http://hdl.handle.net/11449/180494.

Full text
Abstract:
Advisor: Luis Antônio Mathias
Equid farming is of great economic importance in Brazil, and equine infectious anemia (EIA) poses a threat to the sector's growth. Little is known about the temporal pattern of the disease's occurrence in Brazil, so this work was conducted to investigate the frequency and distribution of equine infectious anemia throughout the Brazilian territory from 2005 to 2016, through temporal analyses of the morbidity index, the reactor elimination index, the ratio between the number of equids eliminated and the number of reactor equids, the outbreak notification index, and the index of new and old outbreaks over the period. A time series analysis of EIA from 2005 to 2016 was performed using two databases, one from the OIE and one from MAPA. In Brazil, 2010 was the year with the lowest EIA morbidity index, at 100.9 cases per 100,000 head, and 2009 had the highest, at 159.1 per 100,000 head; the annual rate of change over the period was -2.4% (-7.5% to 2.8%), indicating stability of this indicator (P = 0.317). The EIA reactor elimination index in Brazil was highest in 2010, at 62.7 per 100,000 head, and lowest in 2016, at 32.1. Throughout the study period, the annual variation in the country remained stable, at -6.1% (-18.2% to 7.8%), with a non-significant P value (0.333). The ratio between the number of equids eliminated and the number of reactor equids per 100 equids was low in almost all federative units ... (Full abstract available via the electronic access link below)
Master's
36

Mu, Zhiqiang. "Comparing the Statistical Tests for Homogeneity of Variances." Digital Commons @ East Tennessee State University, 2006. https://dc.etsu.edu/etd/2212.

Full text
Abstract:
Testing the homogeneity of variances is an important problem in many applications, since statistical methods of frequent use, such as ANOVA, assume equal variances for two or more groups of data. However, testing the equality of variances is a difficult problem because many of the tests are not robust against non-normality. It is known that the kurtosis of the distribution of the source data can affect the performance of the tests for variance. We review the classical tests and their latest, more robust modifications, some other tests that have recently appeared in the literature, and use bootstrap and permutation techniques to test for equal variances. We compare the performance of these tests under different types of distributions, sample sizes and true ratios of the variances of the populations. Monte Carlo methods are used in this study to calculate empirical powers and type I errors under different settings.
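The central point of such comparisons, that the classical variance-ratio test is fragile under kurtosis while Levene/Brown-Forsythe-type statistics are robust, can be reproduced with a short simulation. The sketch below is not the thesis's code: it calibrates 5% critical values empirically under normality instead of using F tables, and the t(3) population, n = 30 and replication counts are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(7)

def f_ratio(a, b):
    """Classical variance-ratio statistic."""
    return a.var(ddof=1) / b.var(ddof=1)

def brown_forsythe(a, b):
    """Levene-type statistic on absolute deviations from the medians."""
    za = np.abs(a - np.median(a))
    zb = np.abs(b - np.median(b))
    grand = np.concatenate([za, zb]).mean()
    num = len(za) * (za.mean() - grand) ** 2 + len(zb) * (zb.mean() - grand) ** 2
    den = ((za - za.mean()) ** 2).sum() + ((zb - zb.mean()) ** 2).sum()
    return num / (den / (len(za) + len(zb) - 2))

def empirical_size(stat, sampler, crit, reps=2000, n=30):
    """Rejection rate of `stat` at critical value `crit`, equal variances."""
    return np.mean([stat(sampler(n), sampler(n)) > crit for _ in range(reps)])

normal = lambda n: rng.normal(size=n)
heavy = lambda n: rng.standard_t(df=3, size=n)   # kurtotic population

# Calibrate 5% critical values under normality, then check size under t(3).
crit_f = np.quantile([f_ratio(normal(30), normal(30)) for _ in range(4000)], 0.95)
crit_bf = np.quantile([brown_forsythe(normal(30), normal(30)) for _ in range(4000)], 0.95)
size_f = empirical_size(f_ratio, heavy, crit_f)          # inflated above nominal
size_bf = empirical_size(brown_forsythe, heavy, crit_bf)  # stays near nominal
```

Under the heavy-tailed population the variance-ratio test rejects far more often than the nominal 5% even though the variances are equal, while the median-based statistic holds its level, which is the pattern of type I error distortion these Monte Carlo comparisons document.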
37

Falkenborn, Filip, and Mehdi Lahlou. "Do Correlations Between Macroeconomic Variables and Equity Return Change during Volatile Times? : A statistical Analysis with Focus on the Oil Crisis 2014." Thesis, KTH, Matematisk statistik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-169865.

Full text
Abstract:
Every investor places his or her investment with the desire of maximum return at the lowest possible risk. To accomplish this, a good knowledge of how macro variables affect equity returns is important. During the last two decades we have had several crises turning these basics upside down. This thesis aims to examine how macroeconomic variables have affected equity returns during stable times and further analyse what impact the recent oil crisis has had on these correlations. This research is limited to a few selected countries in Europe, namely Germany, Sweden, France, Norway and the United Kingdom. We have analysed this phenomenon using multiple linear regressions with a lagged dependent variable on data from February 2010 to August 2014. The data was gathered from 55 consecutive months before the oil crisis and also during the six succeeding months of volatility. The models obtained from tranquil times were then used to predict stock development during times of turmoil. The estimated index values in each country were then compared to the actual outcome. From these comparisons it was possible to determine whether the models were accurate even in times of crisis. Our results confirm many of the known correlations between macro variables and equity prices during stable times, but also produce more unforeseen findings. The results further imply that the models are not suited for predicting performance in times of uncertainty. This conclusion was drawn by investigating the probabilities of occurrence of the estimated returns using the obtained models. Germany and Sweden appeared to yield particularly high returns during the time of turmoil, while the Norwegian stock market instead decreased in value.
APA, Harvard, Vancouver, ISO, and other styles
38

Myers, Ron Y. "The Effects of the Use of Technology In Mathematics Instruction on Student Achievement." FIU Digital Commons, 2009. http://digitalcommons.fiu.edu/etd/136.

Full text
Abstract:
The purpose of this study was to examine the effects of the use of technology on students’ mathematics achievement, particularly the Florida Comprehensive Assessment Test (FCAT) mathematics results. Eleven schools within the Miami-Dade County Public School System participated in a pilot program on the use of Geometer's Sketchpad (GSP). Three of these schools were randomly selected for this study. Each school sent a teacher to a summer in-service training program on how to use GSP to teach geometry. In each school, the GSP class and a traditional geometry class taught by the same teacher were the study participants. Students’ mathematics FCAT results were examined to determine if the GSP produced any effects. Students’ scores were compared based on assignment to the control or experimental group as well as gender and SES. SES measurements were based on whether students qualified for free lunch. The findings of the study revealed a significant difference in the FCAT mathematics scores of students who were taught geometry using GSP compared to those who used the traditional method. No significant differences existed between the FCAT mathematics scores of the students based on SES. Similarly, no significant differences existed between the FCAT scores based on gender. In conclusion, the use of technology (particularly GSP) is likely to boost students’ FCAT mathematics test scores. The findings also show that the use of GSP may be able to close known gender and SES related achievement gaps. The results of this study promote policy changes in the way geometry is taught to 10th grade students in Florida’s public schools.
APA, Harvard, Vancouver, ISO, and other styles
39

Jung, Yoonsung. "Tests for unequal treatment variances in crossover designs." Diss., Manhattan, Kan. : Kansas State University, 2009. http://hdl.handle.net/2097/1581.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Harding, Seeromanie. "Mortality and morbidity patterns in ethnic minorities in England and Wales: evidence from the Office for National Statistics Longitudinal Study." Thesis, Connect to e-thesis record to view abstract. Move to record for print version, 2007. http://theses.gla.ac.uk/94/.

Full text
Abstract:
Thesis (Ph.D.) - University of Glasgow, 2007.
Ph.D. thesis submitted by published work to the MRC Social and Public Health Sciences Unit, University of Glasgow, 2007. Includes bibliographical references. Print version also available.
APA, Harvard, Vancouver, ISO, and other styles
41

MAHOI, ISATA. "UNA VALUTAZIONE SULL'EQUITÀ E FONDIARIA E IL SUO IMPATTO SULLA SICUREZZA ALIMENTARE IN SIERRA LEONE." Doctoral thesis, Università Cattolica del Sacro Cuore, 2016. http://hdl.handle.net/10280/12223.

Full text
Abstract:
Landownership is associated with status, power and wealth in most African societies, and agricultural land property belongs to men. The aim of this study is to examine the link between land ownership and gender differences in land tenure systems. This study explores women’s access to land under customary tenure systems. It reviews the major aspects of African women's contribution to food and cash crop production and offers some suggestions to improve their participation and intensification in the smallholder sector. Also, the study examines how changes in land tenure, ownership, access and rights to land as a consequence of customary laws are affecting agricultural productivity, food security and poverty alleviation. The debate is centred on concerns of equitable distribution among men and women and looks at rural women as agricultural workers at a level where gender inequalities coincide. The findings from this study illustrate that the predominant culture and traditional practices still affect women, disadvantaging them in favour of men regarding inheritance and direct ownership of land and property in the household. Keywords: Gender Equity, Land ownership, Land Reform, Food Security.
APA, Harvard, Vancouver, ISO, and other styles
42

Gayle, Suelen S. "A simulation study of the size and power of Cochran’s Q versus the standard Chi-square test for testing the equality of correlated proportions." Kansas State University, 2010. http://hdl.handle.net/2097/3881.

Full text
Abstract:
Master of Science
Department of Statistics
Paul I. Nelson
The standard Chi-square test for the equality of proportions of positive responses to c specified binary questions is valid when the observed responses arise from independent random samples of units. When the responses to all c questions are recorded on the same unit, a situation called correlated proportions, the assumptions under which this test is derived are no longer valid. Under the additional assumption of compound symmetry, Cochran's Q test is a valid test for the equality of proportions of positive responses. The purpose of this report is to use simulation to examine and compare the performance of Cochran's Q test and the standard Chi-square test when testing for the equality of correlated proportions. It is found that Cochran's Q test is superior to the Chi-square test in terms of size and power, especially when the common correlation among the binary responses is large.
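Cochran's Q statistic itself is simple to compute from the row and column totals of the n-by-c table of binary responses. The sketch below uses hypothetical toy data, not the report's simulated designs:

```python
def cochrans_q(responses):
    """Cochran's Q for n units answering c correlated binary questions.
    responses: n rows, each a list of c binary (0/1) answers.
    Under H0 (equal proportions), Q is approximately chi-square with c-1 df."""
    c = len(responses[0])
    col_totals = [sum(row[j] for row in responses) for j in range(c)]  # C_j
    row_totals = [sum(row) for row in responses]                       # R_i
    N = sum(row_totals)
    # Q = (c-1) * (c * sum C_j^2 - N^2) / (c*N - sum R_i^2)
    num = (c - 1) * (c * sum(t * t for t in col_totals) - N * N)
    den = c * N - sum(r * r for r in row_totals)
    return num / den

data = [  # made-up answers of 6 units to 3 questions
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [1, 1, 0],
    [0, 1, 0],
    [1, 0, 0],
]
q = cochrans_q(data)  # compare to the chi-square 5% critical value with 2 df, 5.99
```

For c = 2 the statistic reduces to McNemar's test on the discordant pairs, which is a quick sanity check on any implementation.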
APA, Harvard, Vancouver, ISO, and other styles
43

Silva, Marcos Roberto Alves da. "Decisões de estrutura de capital no Brasil - uma abordagem por setor de atividade, fatores econômicos e de mercado e desempenho empresarial." Universidade Presbiteriana Mackenzie, 2015. http://tede.mackenzie.br/jspui/handle/tede/833.

Full text
Abstract:
Universidade Presbiteriana Mackenzie
The aim of this study is to verify the influence of the sector of activity, economic and market factors, and business performance on the definition of capital structure. It uses data from Economática on 415 Brazilian companies that operated in the capital market (BM&FBOVESPA) between 2001 and 2014 to examine the behaviour of these dimensions and their adherence to the most widespread theoretical framework today. Inappropriate capital structure decisions raise the cost of capital, hindering acceptable investments that would maximize the owners' wealth. Many studies of corporate indebtedness have been carried out in recent decades, but so far no conclusive answer has emerged. In this sense, one can conclude that there is still no fully accepted theory of capital structure. It is difficult to generalize about funding policies because they differ widely from company to company and across sectors of activity. Firm-specific performance variables continue to be used exhaustively in the attempt to underpin a theoretical framework on the subject. Other studies, on a smaller scale, mainly in Brazil, investigate a possible influence of the sector of activity and of economic and market conditions/restrictions on the choice of capital structure. Perceiving this gap in capital structure studies in Brazil that address sectors of activity and economic and market variables opens up the opportunity for this research. As a result, it appears that, after robust regression to correct problems of autocorrelation of errors and heteroscedasticity, the variables average leverage of the sector, investment of the sector, Ibovespa, GDP, inflation, market-to-book, Tobin's Q, profitability, liquidity, growth and business risk were statistically significant in explaining the variations of the dependent variable, i.e., leverage at market value. Other variables, such as concentration of the sector, interest rate, size and tangibility, did not show statistical significance after the robust regression.
APA, Harvard, Vancouver, ISO, and other styles
44

Chatelain, Megan E. "Minority Representations in Crime Drama: An Examination of Roles, Identity, and Power." Scholarly Commons, 2020. https://scholarlycommons.pacific.edu/uop_etds/3716.

Full text
Abstract:
The storytelling ability of television can be observed in any genre. Crime drama offers a unique perspective because victims and offenders change every episode, reinforcing stereotypes with each new character. In other words, the more victims and criminals observed by the audience, the more likely the show creates the perception of a mean world. Based on previous literature, three questions emerged on which this study focused, asking to what extent Criminal Minds portrays crime accurately compared with the Federal Bureau of Investigation's Uniform Crime Report (UCR) and the Behavioral Analysis Unit's (BAU-4) report on serial murderers, and how those portrayals changed over the fifteen years of the show. A content analysis was conducted through the lens of cultivation theory, coding 324 episodes, which produced a sample size of 354 different cases to answer the research questions. Two additional coders focused on the first, middle, and last episodes of each season (N=45) for reliability. The key findings are low levels of realism with respect to the UCR and high levels of realism with respect to the BAU-4 statistics. Mean-world syndrome was found to be highly likely to be cultivated in heavy viewers. Finally, roles for minority groups did improve over time for Black and Brown bodies, yet Asian bodies saw only a very small increase in representation, and LGBT members were nearly nonexistent. The findings indicated that there is still not enough space in television for minority roles and that the show perpetuated stereotypes. Additional implications and themes include a lack of discourse on violence and the erasure of sexual assault victims.
APA, Harvard, Vancouver, ISO, and other styles
45

Cañadas, Alejandro A. "Inequality and economic growth evidence from Argentina's provinces using spatial econometrics /." Columbus, Ohio : Ohio State University, 2008. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1211944935.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Gordaliza, Pastor Paula. "Fair learning : une approche basée sur le transport optimale." Thesis, Toulouse 3, 2020. http://www.theses.fr/2020TOU30084.

Full text
Abstract:
The aim of this thesis is two-fold. On the one hand, optimal transportation methods are studied for statistical inference purposes. On the other hand, the recent problem of fair learning is addressed through the prism of optimal transport theory. The generalization of applications based on machine learning models in everyday life and the professional world has been accompanied by concerns about the ethical issues that may arise from the adoption of these technologies. In the first part of the thesis, we motivate the fairness problem by presenting some comprehensive results from the study of the statistical parity criterion through the analysis of the disparate impact index on the real and well-known Adult Income dataset. Importantly, we show that trying to make fair machine learning models may be a particularly challenging task, especially when the training observations contain bias. Then a review of the mathematics of fairness in machine learning is given in a general setting, with some novel contributions in the analysis of the price for fairness in regression and classification. In the latter, we finish this first part by recasting the links between fairness and predictability in terms of probability metrics. We analyze repair methods based on mapping conditional distributions to the Wasserstein barycenter. Finally, we propose a random repair which yields a tradeoff between minimal information loss and a certain amount of fairness. The second part is devoted to the asymptotic theory of the empirical transportation cost. We provide a Central Limit Theorem for the Monge-Kantorovich distance between two empirical distributions with different sizes n and m, Wp(Pn,Qm), p >= 1, for observations on R. In the case p > 1, our assumptions are sharp in terms of moments and smoothness. We prove results dealing with the choice of centering constants.
We provide a consistent estimate of the asymptotic variance which enables the construction of two-sample tests and confidence intervals to certify the similarity between two distributions. These are then used to assess a new criterion of data set fairness in classification. Additionally, we provide a moderate deviation principle for the empirical transportation cost in general dimension. Finally, Wasserstein barycenters and a variance-like criterion based on the Wasserstein distance are used in many problems to analyze the homogeneity of collections of distributions and structural relationships between the observations. We propose the estimation of the quantiles of the empirical process of the Wasserstein variation using a bootstrap procedure. Then we use these results for statistical inference on a distribution registration model with general deformation functions. The tests are based on the variance of the distributions with respect to their Wasserstein barycenters, for which we prove central limit theorems, including bootstrap versions.
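For observations on R, the distance Wp(Pn, Qm) between two empirical distributions of different sizes can be computed exactly by integrating the gap between the two empirical quantile functions, which are step functions. A minimal sketch for equal-weight samples follows; the sample values are made up:

```python
def wasserstein_p(xs, ys, p=1):
    """Exact W_p between the empirical distributions of two real samples,
    possibly of different sizes n and m, via the quantile formula
    W_p^p = integral over (0,1] of |F^{-1}(t) - G^{-1}(t)|^p dt."""
    xs, ys = sorted(xs), sorted(ys)
    n, m = len(xs), len(ys)
    # both empirical quantile functions are constant between these cut points
    cuts = sorted({i / n for i in range(1, n + 1)} | {j / m for j in range(1, m + 1)})
    total, prev = 0.0, 0.0
    for t in cuts:
        mid = (prev + t) / 2   # any point strictly inside (prev, t]
        total += (t - prev) * abs(xs[int(mid * n)] - ys[int(mid * m)]) ** p
        prev = t
    return total ** (1.0 / p)

# hypothetical samples of different sizes n = 3 and m = 2
w1 = wasserstein_p([0.1, 0.5, 0.9], [0.2, 0.8], p=1)
```

Plugging in bootstrap resamples of the two samples gives a crude feel for the sampling variability that the thesis's central limit theorems describe rigorously.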
APA, Harvard, Vancouver, ISO, and other styles
47

Benichou, Sarah. "Le droit à la non-discrimination "raciale" : instruments juridiques et politiques publiques." Thesis, Paris 10, 2011. http://www.theses.fr/2011PA100168/document.

Full text
Abstract:
For the past ten years, France has been committed to fighting "racial" discrimination, specifically discrimination based on the genuine or surmised origin of individuals as inferred from their physical features or names. Influenced by EU law, French legislation has adopted an objective definition of discrimination: direct discrimination no longer requires an underlying intention to discriminate, and indirect discrimination serves to ensure that otherwise neutral measures do not have a deleterious effect on immigrant and Caribbean populations. Equal treatment can therefore be objectively appraised, which reinforces the effectiveness of the right not to be discriminated against. The admissibility of evidence must evolve to bring more discrimination cases to trial, which is a prerequisite to endow anti-discrimination law with more legitimacy and clarity. At the same time, a strict definition of discrimination creates positive obligations for legal entities to review their selection criteria and processes. It calls for a commitment among public authorities, including through HALDE, to support victims, to raise awareness of anti-discrimination law, and to promote the right to equal treatment. Nevertheless, implementing these definitions is challenging, specifically in the context of "racial" discrimination. The French background, as well as the constitutional ban on all discrimination (art. 1), which fully guarantees the right not to be discriminated against on racial grounds and bans the collection of ethnic data, must both be taken into account. Finally, the effectiveness of the right not to be discriminated against may be undermined by the rise of a utilitarian view of diversity.
APA, Harvard, Vancouver, ISO, and other styles
48

Nunes, Gustavo de Faro Colen. "Modelo da dinâmica de um livro de ordens para aplicações em high-frequency trading." reponame:Repositório Institucional do FGV, 2013. http://hdl.handle.net/10438/10570.

Full text
Abstract:
High-frequency trading (HFT) is growing rapidly on BOVESPA (São Paulo Stock Exchange), but its volume is still far behind that of similar operations performed on other internationally relevant exchanges. The main objective of this work is to create opportunities for future research and applications in this area. Aiming at practical applications, this work focuses on applying a model that governs the dynamics of the order book to the Brazilian market. This model is built based on the information of the order book alone. After building the model, a high-frequency statistical arbitrage strategy is simulated to validate it. The database used for this work consists of the orders posted for the equity PETR4 on BOVESPA.
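As a generic illustration of the kind of order-book state such a model describes, the sketch below keeps resting limit orders aggregated by price level and lets market orders consume the best opposite levels. It is not the thesis's model; sides, prices and quantities are hypothetical, and per-order time priority within a level is deliberately ignored:

```python
import heapq

class OrderBook:
    """Minimal price-level order book: limit orders rest, market orders consume."""

    def __init__(self):
        self.bids = []   # max-heap via negated prices: (-price, qty)
        self.asks = []   # min-heap: (price, qty)

    def limit(self, side, price, qty):
        """Post a resting limit order (no crossing logic in this sketch)."""
        if side == "buy":
            heapq.heappush(self.bids, (-price, qty))
        else:
            heapq.heappush(self.asks, (price, qty))

    def market(self, side, qty):
        """Consume liquidity from the opposite side; returns filled quantity."""
        book = self.asks if side == "buy" else self.bids
        filled = 0
        while qty > 0 and book:
            price, avail = heapq.heappop(book)
            take = min(qty, avail)
            filled += take
            qty -= take
            if avail > take:                       # partial fill leaves the rest
                heapq.heappush(book, (price, avail - take))
        return filled

    def best_bid_ask(self):
        bid = -self.bids[0][0] if self.bids else None
        ask = self.asks[0][0] if self.asks else None
        return bid, ask

book = OrderBook()
book.limit("buy", 99, 10)    # hypothetical resting liquidity
book.limit("sell", 100, 5)
book.limit("sell", 101, 5)
book.market("buy", 7)        # lifts 5 @ 100, then 2 @ 101
```

A dynamics model of the kind the thesis applies would then specify stochastic arrival, cancellation and execution rates for events like these.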
APA, Harvard, Vancouver, ISO, and other styles
49

Saldivia, Miguel Enrique Tejos. "A relação causal entre comprometimento e desempenho: um estudo em Centros de Pesquisa." Universidade de São Paulo, 2006. http://www.teses.usp.br/teses/disponiveis/85/85134/tde-18052012-085904/.

Full text
Abstract:
In this work, relations among leadership, motivation, organizational climate, teamwork, and organizational and occupational commitment were studied. The exploratory study was carried out in three parts. In the first, the most important performance factors were identified from a larger set resulting from the bibliographical research; in the second, the four performance factors were quantified together with the organizational and occupational commitments. The first part consisted of an exploratory survey of experienced staff, with emphasis on the management area. In the second part, 52 employees of the CCTM at the Energy and Nuclear Research Institute (IPEN) and 252 employees of the IAE at the Aerospace Technical Center (CTA) were interviewed. The research used 18 indicators of organizational commitment and 18 indicators of occupational commitment, all extracted from the instrument of Meyer, Allen and Smith. In addition, 7 demographic variables and 71 performance variables built from the theoretical review were used. The results of the first part identified the four performance factors mentioned above. In the second part, the results obtained at the two main sites supported the hypothesis that the sites or groups of employees showing a higher degree of commitment tend to show a higher degree of performance. In the third part, Structural Equation Modelling (SEM) was applied, starting from a theoretical model defined with the 12 most important performance variables at both research sites; with the assistance of the statistical software SPSS and LISREL, a strengthened model of causal relations was obtained to explain the relationships among the variables used.
APA, Harvard, Vancouver, ISO, and other styles
50

Vieira, Tiago de Medeiros. "Conceitos e técnicas da mecânica estatística e termodinâmica aplicados ao estudo dos grafos aleatórios." Universidade Federal do Rio Grande do Norte, 2012. http://repositorio.ufrn.br:8080/jspui/handle/123456789/16621.

Full text
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico
This dissertation briefly presents random graphs and the main quantities calculated from them. At the same time, basic thermodynamic quantities such as energy and temperature are associated with some of their characteristics. Approaches commonly used in Statistical Mechanics are employed, and rules that describe a time evolution for the graphs are proposed in order to study their ergodicity and a possible thermal equilibrium between them.
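One common way to attach a temperature to a graph ensemble, in the spirit described here, is Metropolis dynamics in which the edge count plays the role of energy; the stationary state then has each edge present independently with probability 1/(1 + exp(-beta*mu)), i.e. an Erdős-Rényi G(n, p) ensemble. The Hamiltonian and parameters below are illustrative assumptions, not the dissertation's exact rules:

```python
import math
import random

def metropolis_graph(n, beta, mu, steps, seed=1):
    """Metropolis dynamics on simple graphs with energy E = -mu * (#edges).
    Each step proposes toggling one random vertex pair and accepts with the
    usual rule min(1, exp(-beta * dE))."""
    rng = random.Random(seed)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    edges = set()
    for _ in range(steps):
        e = rng.choice(pairs)
        d_energy = mu if e in edges else -mu    # energy change if e is toggled
        if d_energy <= 0 or rng.random() < math.exp(-beta * d_energy):
            if e in edges:
                edges.discard(e)
            else:
                edges.add(e)
    return edges

# "Hot" ensemble (beta*mu = 0): every toggle accepted, edge density near 1/2.
hot = metropolis_graph(10, beta=1.0, mu=0.0, steps=20000)
# Edge-favoring field at low temperature: density close to 1.
cold = metropolis_graph(10, beta=5.0, mu=1.0, steps=20000)
```

Tracking the edge count along such a run is one way to probe ergodicity and equilibration between graph ensembles, as the abstract suggests.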
APA, Harvard, Vancouver, ISO, and other styles
