Academic literature on the topic 'Conditional parametric refinement'


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Conditional parametric refinement.'


Journal articles on the topic "Conditional parametric refinement"

1

Chen, Ting-Li, Hsieh Fushing, and Elizabeth P. Chou. "Learned Practical Guidelines for Evaluating Conditional Entropy and Mutual Information in Discovering Major Factors of Response-vs.-Covariate Dynamics." Entropy 24, no. 10 (September 28, 2022): 1382. http://dx.doi.org/10.3390/e24101382.

Abstract:
We reformulate and reframe a series of increasingly complex parametric statistical topics into a framework of response-vs.-covariate (Re-Co) dynamics that is described without any explicit functional structures. Then we resolve these topics’ data analysis tasks by discovering major factors underlying such Re-Co dynamics by only making use of data’s categorical nature. The major factor selection protocol at the heart of the Categorical Exploratory Data Analysis (CEDA) paradigm is illustrated and carried out by employing Shannon’s conditional entropy (CE) and mutual information (I[Re;Co]) as the two key information-theoretic measurements. Through the process of evaluating these two entropy-based measurements and resolving statistical tasks, we acquire several computational guidelines for carrying out the major factor selection protocol in a do-and-learn fashion. Specifically, practical guidelines are established for evaluating CE and I[Re;Co] in accordance with the criterion called [C1:confirmable]. Following the [C1:confirmable] criterion, we make no attempt to acquire consistent estimates of these theoretical information measurements. All evaluations are carried out on a contingency table platform, upon which the practical guidelines also provide ways of lessening the effects of the curse of dimensionality. We explicitly carry out six examples of Re-Co dynamics, within each of which several widely extended scenarios are also explored and discussed.
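The CE and I[Re;Co] evaluations described in this abstract are carried out on contingency tables; a minimal stdlib sketch of those two measurements (the function name and table layout are illustrative, not taken from the paper):

```python
import math

def entropies(table):
    """Shannon conditional entropy H(Re|Co) and mutual information I(Re;Co)
    from a joint-count contingency table, where table[i][j] is the count of
    (response category i, covariate category j)."""
    total = sum(sum(row) for row in table)
    # Marginal distributions of the response (rows) and covariate (columns).
    p_re = [sum(row) / total for row in table]
    p_co = [sum(table[i][j] for i in range(len(table))) / total
            for j in range(len(table[0]))]
    h_re = -sum(p * math.log2(p) for p in p_re if p > 0)
    h_co = -sum(p * math.log2(p) for p in p_co if p > 0)
    h_joint = -sum((c / total) * math.log2(c / total)
                   for row in table for c in row if c > 0)
    ce = h_joint - h_co   # H(Re|Co) = H(Re,Co) - H(Co)
    mi = h_re - ce        # I(Re;Co) = H(Re) - H(Re|Co)
    return ce, mi
```

For an independent table the mutual information is zero, while a table where the covariate determines the response drives the conditional entropy to zero.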
2

Gyamfi, N. E., K. A. Kyei, and R. Gill. "African Stock Markets and Return Predictability." Journal of Economics and Behavioral Studies 8, no. 5(J) (October 30, 2016): 91–99. http://dx.doi.org/10.22610/jebs.v8i5(j).1434.

Abstract:
This article re-examines the return predictability of eight African stock markets. When returns of stocks are predictable, arbitrageurs make abnormal gains from analyzing prices. The study uses a non-parametric Generalised Spectral (GS) test in a rolling-window approach. The rolling-window approach tracks the periods of efficiency over time. The GS test is robust to conditional heteroscedasticity and detects the presence of linear and nonlinear dependencies in a stationary time series. Our results support the Adaptive Market Hypothesis (AMH): indices whose returns were observed to be predictable when analyzed in absolute form, and were therefore weak-form inefficient, showed trends of unpredictability in a rolling window.
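The rolling-window idea can be illustrated with a much simpler statistic; the sketch below uses lag-1 autocorrelation as a crude stand-in for the GS test (which is substantially more involved), so the statistic itself is an assumption for illustration only:

```python
def rolling_autocorr(returns, window):
    """Rolling-window lag-1 autocorrelation of a return series.

    Values near zero suggest unpredictability in that window; large absolute
    values flag episodes where returns look predictable.
    """
    out = []
    for start in range(len(returns) - window + 1):
        w = returns[start:start + window]
        mean = sum(w) / window
        num = sum((w[t] - mean) * (w[t - 1] - mean) for t in range(1, window))
        den = sum((x - mean) ** 2 for x in w)
        out.append(num / den if den else 0.0)
    return out
```

Tracking this statistic window by window mirrors how the study tracks periods of efficiency over time.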
3

Rezaei, Ashkan, Rizal Fathony, Omid Memarrast, and Brian Ziebart. "Fairness for Robust Log Loss Classification." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 04 (April 3, 2020): 5511–18. http://dx.doi.org/10.1609/aaai.v34i04.6002.

Abstract:
Developing classification methods with high accuracy that also avoid unfair treatment of different groups has become increasingly important for data-driven decision making in social applications. Many existing methods enforce fairness constraints on a selected classifier (e.g., logistic regression) by directly forming constrained optimizations. We instead re-derive a new classifier from the first principles of distributional robustness that incorporates fairness criteria into a worst-case logarithmic loss minimization. This construction takes the form of a minimax game and produces a parametric exponential family conditional distribution that resembles truncated logistic regression. We present the theoretical benefits of our approach in terms of its convexity and asymptotic convergence. We then demonstrate the practical advantages of our approach on three benchmark fairness datasets.
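The two quantities traded off in this line of work, predictive log loss and a group-fairness gap, are easy to compute directly; a toy sketch (demographic parity is used here as an illustrative fairness criterion, not necessarily the exact constraint in the paper):

```python
import math

def log_loss(y_true, p_pred):
    """Average logarithmic loss of predicted probabilities for binary labels."""
    eps = 1e-12  # guard against log(0)
    return -sum(y * math.log(max(p, eps)) + (1 - y) * math.log(max(1 - p, eps))
                for y, p in zip(y_true, p_pred)) / len(y_true)

def demographic_parity_gap(p_pred, groups):
    """Absolute difference in mean predicted positive rate between two groups."""
    g0 = [p for p, g in zip(p_pred, groups) if g == 0]
    g1 = [p for p, g in zip(p_pred, groups) if g == 1]
    return abs(sum(g0) / len(g0) - sum(g1) / len(g1))
```

A fairness-constrained log-loss method such as the one described would keep the first quantity low while driving the second toward zero.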
4

Dobrovolskiy, Vladimir. "Optimization of Portfolio of Federal Loan Bonds and REPO Trades." Economics and Mathematical Methods 58, no. 3 (2022): 129. http://dx.doi.org/10.31857/s042473880018212-2.

Abstract:
Within the framework of the considered model, the investor can make transactions for the buying and selling of federal loan bonds (OFZ), as well as direct and reverse REPO deals secured by OFZ. Transactions are made for liquidity management and increasing interest income. This paper discusses the problem of constructing an optimal portfolio of such transactions. The paper considers the approach for the generation of scenarios for OFZ price changes, the mathematical formulation of the optimization problem, the assessment of its dimension depending on the number of assets and the number of scenarios, numerical experiments on historical data and the construction of an efficient portfolio frontier. The generation of scenarios for OFZ price changes is implemented using historical modeling of a parametric zero-coupon yield curve. The optimization criterion is the conditional value at risk (CVaR) risk measure. Constraints on the average return and self-financing of the portfolio are taken into account. As a result, a method of portfolio rebalancing without additional investment, whose purpose is to minimize risk for a given profitability, is proposed. Numerical experiments are based on historical data on actively traded OFZ in 2014–2020. The model is close to the real world: it takes into account commissions, repo discounts, bid-ask spreads and trade volumes. Numerical results show that a trading strategy based on the introduced model is more profitable on average than investments in individual OFZ with comparable risk. Note that this effect is shown for strategies with high average profit constraints.
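The CVaR criterion used as the optimization objective has a simple empirical form over a set of scenarios; a minimal sketch (the scenario P&L values and the rounding convention for the tail size are illustrative assumptions):

```python
def cvar(pnl, alpha):
    """Empirical Conditional Value at Risk at level alpha for a list of
    scenario P&L values: the average loss over the worst (1 - alpha)
    fraction of scenarios, reported as a positive number."""
    losses = sorted((-x for x in pnl), reverse=True)  # worst losses first
    k = max(1, int(round(len(losses) * (1 - alpha))))
    return sum(losses[:k]) / k
```

In a scenario-based portfolio problem like the one described, this quantity is minimized subject to average-return and self-financing constraints.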
5

Dohi, Tadashi, Hiroyuki Okamura, and Cun Hua Qian. "Statistical software fault management based on bootstrap confidence intervals." International Journal of Quality & Reliability Management 37, no. 6/7 (June 1, 2020): 905–23. http://dx.doi.org/10.1108/ijqrm-10-2019-0326.

Abstract:
Purpose: In this paper, the authors propose two construction methods to estimate confidence intervals of the time-based optimal software rejuvenation policy and its associated maximum system availability via a parametric bootstrap method. Through simulation experiments the authors investigate their asymptotic behaviors and statistical properties. Design/methodology/approach: The present paper is the first challenge to derive the confidence intervals of the optimal software rejuvenation schedule, which maximizes the system availability in the long run. In other words, the authors address statistical software fault management by employing an idea of process control in quality engineering and a parametric bootstrap. Findings: As a remarkably different point from the existing work, the authors carefully take account of a special case where the two-sided confidence interval of the optimal software rejuvenation time does not exist, due to the fact that the estimator distribution of the optimal software rejuvenation time is defective. Here the authors propose two useful construction methods of the two-sided confidence interval: a conditional confidence interval and a heuristic confidence interval. Research limitations/implications: Although the authors applied a simulation-based bootstrap confidence method in this paper, another resampling-based approach can also be applied to the same problem. In addition, the authors focused on a parametric bootstrap, but a non-parametric bootstrap method can also be applied to the confidence interval estimation of the optimal software rejuvenation time interval when complete knowledge of the distribution form is not available. Practical implications: The statistical software fault management techniques proposed in this paper are useful to control the system availability of operational software systems by means of the control chart. Social implications: Through online monitoring in operational software systems, it would be possible to estimate the optimal software rejuvenation time and its associated system availability without applying any approximation. By implementing this function in an application programming interface (API), it is possible to realize low-cost fault tolerance for software systems with aging. Originality/value: In the past literature, almost all authors employed parametric and non-parametric inference techniques to estimate the optimal software rejuvenation time but focused only on point estimation. This may often lead to misjudgment based on over-estimation or under-estimation under uncertainty. The authors overcome the problem by introducing the two-sided confidence interval approach.
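The parametric bootstrap machinery behind such confidence intervals can be sketched on a deliberately simple stand-in problem, here a percentile interval for the median of an exponential model rather than the authors' rejuvenation schedule (model, estimator and defaults are all illustrative assumptions):

```python
import math
import random

def bootstrap_ci(sample, n_boot=2000, level=0.95, seed=0):
    """Parametric-bootstrap percentile confidence interval for the median
    of an exponential model fitted to `sample`.

    Fit the rate by maximum likelihood, draw n_boot synthetic data sets
    from the fitted model, re-estimate the median ln(2)/rate each time,
    and take percentile bounds of the bootstrap distribution."""
    rng = random.Random(seed)
    n = len(sample)
    lam_hat = n / sum(sample)  # MLE of the exponential rate
    medians = []
    for _ in range(n_boot):
        boot = [rng.expovariate(lam_hat) for _ in range(n)]
        medians.append(math.log(2) * sum(boot) / n)  # ln(2) / rate estimate
    medians.sort()
    lo = medians[int(n_boot * (1 - level) / 2)]
    hi = medians[int(n_boot * (1 + level) / 2) - 1]
    return lo, hi
```

The paper's "defective estimator" case corresponds to situations where such a bootstrap distribution puts mass at infinity, which is what motivates the conditional and heuristic constructions.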
6

Stupfler, Gilles, and Fan Yang. "ANALYZING AND PREDICTING CAT BOND PREMIUMS: A FINANCIAL LOSS PREMIUM PRINCIPLE AND EXTREME VALUE MODELING." ASTIN Bulletin 48, no. 1 (November 2, 2017): 375–411. http://dx.doi.org/10.1017/asb.2017.32.

Abstract:
CAT bonds play an important role in transferring insurance risks to the capital market. It has been observed that typical CAT bond premiums have changed since the recent financial crisis, which has been attributed to market participants being increasingly risk averse. In this work, we first propose a new premium principle, the financial loss premium principle, which includes a term measuring losses in the financial market that we represent here by the Conditional Tail Expectation (CTE) of the negative daily log-return of the S&P 500 index. Our analysis of empirical evidence suggests indeed that in the post-crisis market, instead of simply increasing the fixed level of risk load universally, the increased risk aversion should be modeled jointly by a fixed level of risk load and a financial loss factor to reflect trends in the financial market. This new premium principle is shown to be flexible with respect to the confidence/exceedance level of CTE. In the second part, we focus on the particular example of extreme wildfire risk. The distribution of the amount of precipitation in Fort McMurray, Canada, which is a very important factor in the occurrence of wildfires, is analyzed using extreme value modeling techniques. A wildfire bond with parametric trigger of precipitation is then designed to mitigate extreme wildfire risk, and its premium is predicted using an extreme value analysis of its expected loss. With an application to the 2016 Fort McMurray wildfire, we demonstrate that the extreme value model is sensible, and we further analyze how our results and construction can be used to provide a design framework for CAT bonds which may appeal to (re)insurers and investors alike.
7

Finkelstein, T. "A new isothermal theory for Stirling machine analysis and a volume optimization using the concept of ‘ancillary’ and ‘tidal’ domains." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 212, no. 3 (March 1, 1998): 225–36. http://dx.doi.org/10.1243/0954406981521178.

Abstract:
Theoretical studies of Stirling cycle machines have always utilized a topological system view that goes back to Schmidt's isothermal analysis, where the process is analysed by reference to the expansion space volume variations. Due to this idiosyncrasy in the formulation, it has been difficult to deduce meaningful design criteria from the results. In this paper an alternative visualization is presented, using the newly introduced concepts of a ‘tidal phase angle’ and overlapping ‘tidal’ and ‘ancillary’ domains. With vectorial parameters and a centralized reference basis, a non-dimensional parameter, the ‘tidal compression ratio’, equal to the ratio of the average masses in the tidal and ancillary domains, is derived. This number uniquely characterizes the operation of equivalent machines and is therefore akin to the compression ratio in internal combustion engines. On the basis of this, a second new parametric grouping emerged to enhance the usefulness of the resultant integrated equations for use with dimensional analysis. It was defined as the ‘specific performance’ and is proportional to the output per unit mass, the gas constant and the operating temperature range. It is applicable to engines, heat pumps and refrigerators. Prior attempts at optimizing the proportions of a Stirling engine have not yielded usable results and consequently nearly all Stirling cycle machines built up to the present time have expansion and compression spaces of equal size. The new analysis shows that this is not the most appropriate configuration and it readily yields an optimization of the component volumes. One single analytical conditional equation for the optimum relative sizes of the constituent spaces was obtained from the new formulation for performance that quantifies the condition for an optimized proportioning of any Stirling cycle machine. It has three distinct usable solutions, one of which is an analytical confirmation of a postulate that has previously been published by the author without proof, equating VE/VC and also Vh/Vk to the temperature ratio TE/TC. A numerical verification of this rule based on the proportions of the United Stirling V-160 engine compares it with 12 equivalent re-proportioned derivative engines, all with equal charge masses and operating at precisely the same conditions. This shows a substantial increase in the ideal performance through the use of the derived criteria. The main conclusion is that this theory may lead to a re-examination of the overall layout of Stirling cycle machines and the emergence of a new class of machines with superior performance.
8

Kulinkina, Alexandra V., Andrea Farnham, Nana-Kwadwo Biritwum, Jürg Utzinger, and Yvonne Walz. "How do disease control measures impact spatial predictions of schistosomiasis and hookworm? The example of predicting school-based prevalence before and after preventive chemotherapy in Ghana." PLOS Neglected Tropical Diseases 17, no. 6 (June 16, 2023): e0011424. http://dx.doi.org/10.1371/journal.pntd.0011424.

Abstract:
Background: Schistosomiasis and soil-transmitted helminth infections are among the neglected tropical diseases (NTDs) affecting primarily marginalized communities in low- and middle-income countries. Surveillance data for NTDs are typically sparse, and hence geospatial predictive modeling based on remotely sensed (RS) environmental data is widely used to characterize disease transmission and treatment needs. However, as large-scale preventive chemotherapy has become a widespread practice, resulting in reduced prevalence and intensity of infection, the validity and relevance of these models should be re-assessed. Methodology: We employed two nationally representative school-based prevalence surveys of Schistosoma haematobium and hookworm infections from Ghana conducted before (2008) and after (2015) the introduction of large-scale preventive chemotherapy. We derived environmental variables from fine-resolution RS data (Landsat 8) and examined a variable distance radius (1–5 km) for aggregating these variables around point-prevalence locations in a non-parametric random forest modeling approach. We used partial dependence and individual conditional expectation plots to improve interpretability. Principal findings: The average school-level S. haematobium prevalence decreased from 23.8% to 3.6% and that of hookworm from 8.6% to 3.1% between 2008 and 2015. However, hotspots of high-prevalence locations persisted for both diseases. The models with environmental data extracted from a buffer radius of 2–3 km around the school location where prevalence was measured had the best performance. Model performance (according to the R2 value) was already low and declined further, from approximately 0.4 in 2008 to 0.1 in 2015 for S. haematobium and from approximately 0.3 to 0.2 for hookworm. According to the 2008 models, land surface temperature (LST), modified normalized difference water index (MNDWI), elevation, slope, and streams variables were associated with S. haematobium prevalence. LST, slope, and improved water coverage were associated with hookworm prevalence. Associations with the environment in 2015 could not be evaluated due to low model performance. Conclusions/significance: Our study showed that in the era of preventive chemotherapy, associations between S. haematobium and hookworm infections and the environment weakened, and thus the predictive power of environmental models declined. In light of these observations, it is timely to develop new cost-effective passive surveillance methods for NTDs as an alternative to costly surveys, and to focus on persisting hotspots of infection with additional interventions to reduce reinfection. We further question the broad application of RS-based modeling for environmental diseases for which large-scale pharmaceutical interventions are in place.
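The partial dependence plots used for interpretability in studies like this one rest on a simple averaging scheme; a model-agnostic stdlib sketch (the model and data shapes are illustrative assumptions, not the study's random forest):

```python
def partial_dependence(model, data, feature_index, grid):
    """One-dimensional partial dependence of `model` on a single feature.

    For each grid value, fix that feature across every row of `data`,
    evaluate the model, and average the predictions; `model` is any
    callable mapping a feature list to a numeric prediction."""
    pd_curve = []
    for value in grid:
        preds = []
        for row in data:
            modified = list(row)
            modified[feature_index] = value  # hold the chosen feature fixed
            preds.append(model(modified))
        pd_curve.append(sum(preds) / len(preds))
    return pd_curve
```

Individual conditional expectation plots are the same construction without the final averaging step, one curve per data row.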

Dissertations / Theses on the topic "Conditional parametric refinement"

1

Haque, Inzemamul. "Verification of a Generative Separation Kernel." Thesis, 2020. https://etd.iisc.ac.in/handle/2005/4571.

Abstract:
A Separation Kernel is a small specialized microkernel that provides a sand-boxed execution environment for a given set of processes (also called "subjects"). The subjects may communicate only via declared memory channels, and are otherwise isolated from each other. A generative separation kernel is one in which a specialized separation kernel is generated for each system configuration. Separation kernels are typically used in safety-critical and security-critical systems in the avionics and military domains, and it is of utmost importance to have a high level of assurance regarding the correct working of the separation kernel. In this thesis we present a formal verification of the functional correctness of the Muen separation kernel, which is representative of the class of modern separation kernels that leverage hardware virtualization support and are generative in nature. The nature of generativeness is template-based, in that the kernel is essentially based on a fixed template of code and various constant data structures are filled in by the kernel generator. Although there has been a great deal of work on verification of OS kernels in the past two decades, applying those techniques to a separation kernel like Muen becomes challenging because of its generative nature. The use of hardware virtualization support poses another challenge. We propose a verification framework called conditional parametric refinement to reason about the functional correctness of template-based generative systems. Conditional parametric refinement extends classical refinement to parametric programs, which are programs with uninitialized variables. This is a two-step technique for parametric programs.
We first perform a general verification step (independent of the input specification) to verify that the parametric program refines a parametric abstract specification, assuming certain natural conditions on the parameter values (for example, injectivity of the page tables) that are to be filled in. This first step essentially tells us that for any input specification P, if the parameters generated by the system generator satisfy the assumed conditions, then the generated system is correct vis-a-vis the abstract specification. In the second step, which is input-specific, we check that, for a given input specification, the assumptions actually hold for the generated parameter values. This gives us an effective technique for verifying generative systems. We applied this technique to verify the Muen Separation Kernel. We chose to model the virtualization layer (in this case Intel's VT-x layer), along with the rest of the hardware components like registers and memory, programmatically in software. We carried out the first step of conditional parametric refinement for Muen using the SPARK Ada verification tool. The effort involved about 20K lines of source code and annotation. We have also implemented a tool that automatically and efficiently performs the Step 2 check for a given separation kernel configuration. The tool is effective in proving the assumptions, leading to machine-checked proofs of correctness for 16 different input configurations, as well as in detecting issues like undeclared sharing of memory components in some seeded faulty configurations.
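The second, input-specific step amounts to checking concrete conditions on the generated parameter values. A toy sketch of such a checker, covering two conditions named in the abstract, page-table injectivity and undeclared memory sharing (the data shapes and condition set are illustrative, not Muen's actual ones):

```python
def check_configuration(page_tables, channels, subject_memory):
    """Step-2 style check on generated parameter values (toy sketch).

    page_tables: subject -> {virtual page: physical frame} mapping.
    channels: iterable of declared (subject_a, subject_b) channel pairs.
    subject_memory: subject -> set of physical regions it may access.
    Returns a list of violation messages; an empty list means the
    assumptions of the general (step-1) proof hold for this configuration."""
    violations = []
    # Condition (a): each subject's page-table map must be injective.
    for subject, table in page_tables.items():
        targets = list(table.values())
        if len(targets) != len(set(targets)):
            violations.append(f"page table of {subject} is not injective")
    # Condition (b): shared regions must be covered by a declared channel.
    declared = {frozenset(pair) for pair in channels}
    subjects = list(subject_memory)
    for i, a in enumerate(subjects):
        for b in subjects[i + 1:]:
            shared = subject_memory[a] & subject_memory[b]
            if shared and frozenset((a, b)) not in declared:
                violations.append(
                    f"undeclared sharing between {a} and {b}: {sorted(shared)}")
    return violations
```

Running such a checker per generated configuration is what makes the combined technique effective: the expensive refinement proof is done once, and only these cheap parameter checks are repeated per input.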