Journal articles on the topic 'Incremental variational approach'


Consult the top 50 journal articles for your research on the topic 'Incremental variational approach.'


1

Balzani, Daniel, and Thomas Schmidt. "Relaxed incremental variational approach for damage in arteries." PAMM 15, no. 1 (October 2015): 81–82. http://dx.doi.org/10.1002/pamm.201510031.

2

Huang, Y., A. Abou-Chakra Guéry, and J. F. Shao. "Incremental variational approach for time dependent deformation in clayey rock." International Journal of Plasticity 64 (January 2015): 88–103. http://dx.doi.org/10.1016/j.ijplas.2014.07.003.

3

Veerse, F., and J. N. Thepaut. "Multiple-truncation incremental approach for four-dimensional variational data assimilation." Quarterly Journal of the Royal Meteorological Society 124, no. 550 (July 1998): 1889–908. http://dx.doi.org/10.1002/qj.49712455006.

4

Dimitrov, S., and E. Schnack. "Variational approach for strain-driven, incremental homogenization of inelastic solids." PAMM 5, no. 1 (December 2005): 329–30. http://dx.doi.org/10.1002/pamm.200510141.

5

Fletcher, Steven J., and Andrew S. Jones. "Multiplicative and Additive Incremental Variational Data Assimilation for Mixed Lognormal–Gaussian Errors." Monthly Weather Review 142, no. 7 (June 27, 2014): 2521–44. http://dx.doi.org/10.1175/mwr-d-13-00136.1.

Abstract:
An advance that made Gaussian-based three- and four-dimensional variational data assimilation (3D- and 4DVAR, respectively) operationally viable for numerical weather prediction was the introduction of the incremental formulation. This reduces the computational cost of the variational methods by searching for a small increment to a background state whose evolution is approximately linear. In this paper, incremental formulations for 3D- and 4DVAR with lognormal and mixed lognormal–Gaussian-distributed background and observation errors are presented. As the lognormal distribution has geometric properties, a geometric version of the tangent linear model (TLM) is proven that enables the linearization of the observational component of the cost functions with respect to a geometric increment. This is combined with the additive TLM for the mixed distribution–based cost function. Results using the mixed incremental scheme with the Lorenz ’63 model are presented for different observational error variances, observation set sizes, and assimilation window lengths. It is shown that for sparse, accurate observations the scheme has a relative error of ±0.5% for an assimilation window of 100 time steps; this improves to ±0.3% with more frequent observations. The distributions of the analysis errors appear to approximate a lognormal distribution with a mode at 1, which, given that the background and observational errors are unbiased in Gaussian space, shows that the scheme approximates a mode and not a median. The mixed approach is also compared against a Gaussian-only incremental scheme, where it is shown that as the z-component observational errors become more lognormal, the mixed approach appears to be more accurate than the Gaussian approach.
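Several of the assimilation papers in this list rest on the incremental idea summarized above: minimize a quadratic cost in a small increment to the background state. As a hedged illustration only (toy dimensions, synthetic Gaussian errors, none of it taken from the cited papers), a minimal incremental 3DVar sketch might look like:

```python
import numpy as np
from scipy.optimize import minimize

# Toy incremental 3DVar: minimize, over the increment dx,
#   J(dx) = 0.5 dx^T B^-1 dx + 0.5 (H dx - d)^T R^-1 (H dx - d)
# with innovation d = y - H(xb). All matrices/sizes below are invented.
rng = np.random.default_rng(0)
n, m = 4, 3
B = np.eye(n) * 0.5                         # background error covariance (assumed)
R = np.eye(m) * 0.1                         # observation error covariance (assumed)
H = rng.standard_normal((m, n))             # linearized observation operator
xb = rng.standard_normal(n)                 # background state
y = H @ xb + rng.normal(scale=0.1, size=m)  # synthetic observations
d = y - H @ xb                              # innovation vector

Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)

def J(dx):
    return 0.5 * dx @ Binv @ dx + 0.5 * (H @ dx - d) @ Rinv @ (H @ dx - d)

def gradJ(dx):
    return Binv @ dx + H.T @ Rinv @ (H @ dx - d)

res = minimize(J, np.zeros(n), jac=gradJ, method="L-BFGS-B")
xa = xb + res.x  # analysis = background + minimizing increment
```

The quadratic cost has the closed-form minimizer (B⁻¹ + HᵀR⁻¹H)⁻¹HᵀR⁻¹d, which makes the sketch easy to check; the lognormal and mixed cases discussed in the abstract above replace the additive increment with a geometric one.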
6

Ye, Fei, and Adrian G. Bors. "Continual Variational Autoencoder via Continual Generative Knowledge Distillation." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (June 26, 2023): 10918–26. http://dx.doi.org/10.1609/aaai.v37i9.26294.

Abstract:
Humans and other living beings are capable of both short- and long-term memorization throughout their lifespan. However, most existing Continual Learning (CL) methods can only account for short-term information when training on infinite streams of data. In this paper, we develop a new unsupervised continual learning framework consisting of two memory systems using Variational Autoencoders (VAEs): a Short-Term Memory (STM) and a parameterised, scalable memory implemented by a Teacher model that aims to preserve long-term information. To incrementally enrich the Teacher's knowledge during training, we propose the Knowledge Incremental Assimilation Mechanism (KIAM), which evaluates the knowledge similarity between the STM and the already accumulated information as a signal to expand the Teacher's capacity. We then train a VAE as a Student module and propose a new Knowledge Distillation (KD) approach that gradually transfers generative knowledge from the Teacher to the Student. To ensure the quality and diversity of knowledge in KD, we propose a new expert-pruning approach that selectively removes the Teacher's redundant parameters, associated with unnecessary experts that have learnt information overlapping with other experts. This mechanism further reduces the complexity of the Teacher module while ensuring the diversity of knowledge for the KD procedure. We show theoretically and empirically that the proposed framework can train a statistically diversified Teacher module for continual VAE learning that is applicable to learning infinite data streams.
7

Schmidt, Thomas, and Daniel Balzani. "Relaxed incremental variational approach for the modeling of damage-induced stress hysteresis in arterial walls." Journal of the Mechanical Behavior of Biomedical Materials 58 (May 2016): 149–62. http://dx.doi.org/10.1016/j.jmbbm.2015.08.005.

8

Günther, Christina, Philipp Junker, and Klaus Hackl. "A variational viscosity-limit approach to the evolution of microstructures in finite crystal plasticity." Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 471, no. 2180 (August 2015): 20150110. http://dx.doi.org/10.1098/rspa.2015.0110.

Abstract:
A micromechanical model for finite single-crystal plasticity was introduced by Kochmann & Hackl (2011 Contin. Mech. Thermodyn. 23, 63–85 (doi:10.1007/s00161-010-0714-5)). This model is based on thermodynamic variational principles and leads to a non-convex variational problem. Based on the Lagrange functional, an incremental strategy was outlined to model the time-continuous evolution of a first-order laminate microstructure. Although this model provides interesting results at the material-point level, the global minimization in the evolution equations means that calculation time and numerical instabilities may cause problems when applying the model to macroscopic specimens. In this paper, a smooth transition zone between the laminates is introduced to avoid the global minimization that makes the numerical calculations cumbersome in the model of Kochmann & Hackl. Introducing a smooth viscous transition zone requires adapting the dissipation potential and its numerical treatment. We outline rate-dependent time-evolution equations for the internal variables based on variational techniques and show single-slip shear and tension/compression tests as first examples.
9

Pethe, Rohit, Thomas Heuzé, and Laurent Stainier. "Variational h-adaption for coupled thermomechanical problems." Engineering Computations 37, no. 4 (November 25, 2019): 1261–90. http://dx.doi.org/10.1108/ec-05-2019-0243.

Abstract:
Purpose: The purpose of this paper is to present a variational mesh h-adaption approach for strongly coupled thermomechanical problems.
Design/methodology/approach: The mesh is adapted by local subdivision controlled by an energy criterion. Thermal and thermomechanical problems are of interest here. In particular, steady and transient purely thermal problems, and transient strongly coupled thermoelasticity and thermoplasticity problems, are investigated.
Findings: Different test cases are performed to test the robustness of the algorithm for the problems listed above. It is found that better cost-effectiveness can be obtained with this approach compared to a uniform refining procedure. Because the algorithm is based on a set of tolerance parameters, parametric analyses and a study of their respective influence on the mesh adaption are carried out. This detailed analysis is performed on one-dimensional problems, and a final example is provided in two dimensions.
Originality/value: This work presents an original approach for independent h-adaption of a mechanical and a thermal mesh in strongly coupled problems, based on an incremental variational formulation. The approach does not rely on (or attempt to provide) error estimation in the classical sense; it could merely be considered an error indicator. Instead, it provides a practical methodology to adapt the mesh on the basis of the variational structure of the underlying mathematical problem.
10

Hernández-Sanjaime, Rocío, Martín González, Antonio Peñalver, and Jose J. López-Espín. "Estimating Simultaneous Equation Models through an Entropy-Based Incremental Variational Bayes Learning Algorithm." Entropy 23, no. 4 (March 24, 2021): 384. http://dx.doi.org/10.3390/e23040384.

Abstract:
The presence of unaccounted heterogeneity in simultaneous equation models (SEMs) is frequently problematic in many real-life applications. Under the usual assumption of homogeneity, the model can be seriously misspecified, and it can potentially induce an important bias in the parameter estimates. This paper focuses on SEMs in which data are heterogeneous and tend to form clustering structures in the endogenous-variable dataset. Because the identification of different clusters is not straightforward, a two-step strategy that first forms groups among the endogenous observations and then uses the standard simultaneous equation scheme is provided. Methodologically, the proposed approach is based on a variational Bayes learning algorithm and does not need to be executed for varying numbers of groups in order to identify the one that adequately fits the data. We describe the statistical theory, evaluate the performance of the suggested algorithm by using simulated data, and apply the two-step method to a macroeconomic problem.
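The abstract above describes a two-step strategy whose first step groups the endogenous observations with a variational Bayes algorithm that does not need to be rerun for different numbers of groups. As an illustration only, using scikit-learn's `BayesianGaussianMixture` as a stand-in for the authors' algorithm (the data and every parameter below are invented), that clustering step might be sketched as:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Variational Bayes mixture: surplus components get weights shrunk toward
# zero, so the effective number of clusters emerges from a single fit.
rng = np.random.default_rng(2)
Y = np.vstack([rng.normal(0.0, 0.3, size=(60, 2)),
               rng.normal(3.0, 0.3, size=(60, 2))])  # two synthetic clusters

vb = BayesianGaussianMixture(n_components=8,               # deliberate over-provision
                             weight_concentration_prior=0.01,
                             random_state=0).fit(Y)
labels = vb.predict(Y)  # step 2 would fit a simultaneous equation model per group
```

The second step of the strategy (fitting the SEM within each group) is omitted here; the point of the sketch is only that the number of groups need not be fixed in advance.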
11

Kutyłowski, R., and B. Rasiak. "Incremental method of Young modulus updating procedure in topology optimization." Bulletin of the Polish Academy of Sciences: Technical Sciences 60, no. 2 (October 1, 2012): 223–28. http://dx.doi.org/10.2478/v10175-012-0029-2.

Abstract:
This paper presents a new Young modulus updating procedure as an extension to the SIMP method used for topology optimization. In essence, the modified Young modulus updating procedure consists in taking into account in a given optimization step not only the material density from the preceding step, but also the increments in density over the two preceding steps. Thanks to this, it is possible to obtain a solution in cases in which the classic SIMP method failed. The variational approach was adopted and the structure's strain energy was minimized under constraints imposed on body mass. FEM was used to solve numerical examples. The numerical analysis confirmed the effectiveness of the proposed method, particularly for structures with relatively long spans.
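The abstract above does not give the update rule in closed form, so the following is a hypothetical sketch only: a SIMP-style penalized modulus E = E0·ρ_eff^p in which ρ_eff blends the current density with the density increments of the two preceding steps. The function name, the blending weight `w`, and the clipping bounds are all invented for illustration.

```python
import numpy as np

def young_modulus(rho_k, rho_km1, rho_km2, E0=1.0, p=3.0, w=0.5):
    """Hypothetical modified-SIMP modulus: E = E0 * rho_eff^p, where rho_eff
    augments the current density rho_k with a weighted average of the
    increments from the two preceding optimization steps (assumed form)."""
    d1 = rho_k - rho_km1            # increment over the last step
    d2 = rho_km1 - rho_km2          # increment over the step before
    rho_eff = np.clip(rho_k + w * 0.5 * (d1 + d2), 1e-3, 1.0)
    return E0 * rho_eff ** p

E = young_modulus(0.6, 0.5, 0.4)  # density has been rising, so E is boosted
```

With a stationary density history the rule reduces to plain SIMP, E = E0·ρ^p, which is one sanity check on the assumed form.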
12

Hasrati, E., R. Ansari, and H. Rouhi. "A numerical approach to the elastic/plastic axisymmetric buckling analysis of circular and annular plates resting on elastic foundation." Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science 233, no. 19-20 (August 9, 2019): 7041–61. http://dx.doi.org/10.1177/0954406219867726.

Abstract:
Presented herein is the elastic/plastic axisymmetric buckling analysis of circular and annular plates resting on elastic foundation under radial loading, based on a variational numerical method named variational differential quadrature. To accomplish this aim, a first-order shear deformable plate model is developed in the context of the incremental theory of plasticity (IT) (with the Prandtl-Reuss constitutive equations) and the deformation theory of plasticity (DT) (with the Hencky constitutive equations). It is considered that the material of the plates exhibits strain hardening characterized by the Ramberg-Osgood relation. Also, the Winkler and Pasternak models are employed in order to formulate the elastic foundation. To implement the variational differential quadrature method, the matrix formulations of strain rates and constitutive relations are first derived. Then, based upon Hamilton's principle and using the variational differential quadrature derivative and integral operators, the discretized energy functional of the problem is directly obtained. Selected numerical results are presented to study the effects of various parameters including thickness-to-radius ratio, elastic modulus-to-nominal yield stress ratio, power of the Ramberg-Osgood relation and parameters of the elastic foundation on the elastic/plastic buckling of circular and annular plates subject to different boundary conditions. Moreover, several comparisons are provided between the results of the two plasticity theories, i.e. IT and DT. The effect of transverse shear deformation is also highlighted.
13

Qin, Qing-Hua. "Trefftz Finite Element Method and Its Applications." Applied Mechanics Reviews 58, no. 5 (September 1, 2005): 316–37. http://dx.doi.org/10.1115/1.1995716.

Abstract:
This paper presents an overview of the Trefftz finite element and its application in various engineering problems. Basic concepts of the Trefftz method are discussed, such as T-complete functions, special purpose elements, modified variational functionals, rank conditions, intraelement fields, and frame fields. The hybrid-Trefftz finite element formulation and numerical solutions of potential flow problems, plane elasticity, linear thin and thick plate bending, transient heat conduction, and geometrically nonlinear plate bending are described. Formulations for all cases are derived by means of a modified variational functional and T-complete solutions. In the case of geometrically nonlinear plate bending, exact solutions of the Lamé-Navier equations are used for the in-plane intraelement displacement field, and an incremental form of the basic equations is adopted. Generation of elemental stiffness equations from the modified variational principle is also discussed. Some typical numerical results are presented to show the application of the finite element approach. Finally, a brief summary of the approach is provided and future trends in this field are identified. There are 151 references cited in this revised article.
14

Peigney, M., and J. P. Seguin. "An incremental variational approach to coupled thermo-mechanical problems in anelastic solids. Application to shape-memory alloys." International Journal of Solids and Structures 50, no. 24 (November 2013): 4043–54. http://dx.doi.org/10.1016/j.ijsolstr.2013.08.013.

15

Karima, M. "Blank Development and Tooling Design for Drawn Parts Using a Modified Slip Line Field Based Approach." Journal of Engineering for Industry 111, no. 4 (November 1, 1989): 345–50. http://dx.doi.org/10.1115/1.3188770.

Abstract:
Box-shaped parts demonstrate unusual characteristics as opposed to drawn cylindrical cups. A novel approach for blank development, based on a modified plane-strain slip line field (SLF), is presented in this work. The approach is to balance element volume between the final position in the wall and the starting position in the flat flange. The end result of this SLF-based unfolding technique is a set of elements representing the deformation path of the part. By post-processing the information on the nodal coordinates and invoking the variational principle in its incremental form, the strain distribution in the flange and the wall stresses are determined. A methodology is also presented for understanding the implications of the metal flow lines for tooling design.
16

Liu, Chengsi, Qingnong Xiao, and Bin Wang. "An Ensemble-Based Four-Dimensional Variational Data Assimilation Scheme. Part II: Observing System Simulation Experiments with Advanced Research WRF (ARW)." Monthly Weather Review 137, no. 5 (May 1, 2009): 1687–704. http://dx.doi.org/10.1175/2008mwr2699.1.

Abstract:
An ensemble-based four-dimensional variational data assimilation (En4DVAR) algorithm and its performance in a low-dimension space with a one-dimensional shallow-water model have been presented in Part I. This algorithm adopts the standard incremental approach and preconditioning in the variational algorithm but avoids the need for a tangent linear model and its adjoint, so that it can be easily incorporated into variational assimilation systems. The current study explores techniques for En4DVAR application in real-dimension data assimilation. The EOF-decomposed correlation function operator and analysis-time tuning are formulated to reduce the impact of sampling errors in En4DVAR upon its analysis. With the Advanced Research Weather Research and Forecasting (ARW-WRF) model, Observing System Simulation Experiments (OSSEs) are designed and their performance in real-dimension data assimilation is examined. It is found that the designed En4DVAR localization techniques can effectively alleviate the impacts of sampling errors upon analysis. Most forecast errors and biases in ARW are reduced by En4DVAR compared to those in a control experiment. En3DVAR cycling experiments are used to compare the ensemble-based sequential algorithm with the ensemble-based retrospective algorithm. These experiments indicate that the ensemble-based retrospective assimilation, En4DVAR, produces an overall better analysis than the ensemble-based sequential algorithm, the En3DVAR cycling approach.
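The ensemble-variational entries in this list (this one and the 4DEnVar papers below) all build a flow-dependent background error covariance from ensemble forecasts. Under the standard sample-covariance assumption (a generic textbook construction, not code from any cited paper), that ingredient can be sketched as:

```python
import numpy as np

# Flow-dependent background error covariance from an ensemble:
#   B ≈ X' X'^T / (N - 1),
# where the columns of X' are ensemble perturbations about the ensemble
# mean. Localization (e.g. a Schur product with a compactly supported
# correlation function) would normally be applied to temper the sampling
# errors discussed in the abstract above; it is omitted here.
rng = np.random.default_rng(1)
n_state, n_ens = 6, 40
X = rng.standard_normal((n_state, n_ens))   # ensemble of model states (columns)
Xp = X - X.mean(axis=1, keepdims=True)      # perturbations about the mean
B_ens = Xp @ Xp.T / (n_ens - 1)             # sample covariance estimate of B
```

For a small ensemble, B_ens is rank-deficient (rank at most N−1), which is one reason the cited schemes rely on localization.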
17

Miehe, C., F. E. Hildebrand, and L. Böger. "Mixed variational potentials and inherent symmetries of the Cahn–Hilliard theory of diffusive phase separation." Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 470, no. 2164 (April 8, 2014): 20130641. http://dx.doi.org/10.1098/rspa.2013.0641.

Abstract:
This work shows that the Cahn–Hilliard theory of diffusive phase separation is related to an intrinsic mixed variational principle that determines the rate of concentration and the chemical potential. The principle characterizes a canonically compact model structure, where the two balances involved for the species content and microforce appear as the Euler equations of a variational statement. The existence of the variational principle underlines an inherent symmetry in the two-field representation of the Cahn–Hilliard theory. This can be exploited in the numerical implementation by the construction of time- and space-discrete incremental potentials, which fully determine the update problems of typical time-stepping procedures. The mixed variational principles provide the most fundamental approach to the finite-element solution of the Cahn–Hilliard equation based on low-order basis functions, leading to monolithic symmetric algebraic systems of iterative update procedures based on a linearization of the nonlinear problem. They induce in a natural format the choice of symmetric solvers for Newton-type iterative updates, providing a speed-up and reduction of data storage when compared with non-symmetric implementations. In this sense, the potentials developed are believed to be fundamental ingredients to a deeper understanding of the Cahn–Hilliard theory.
18

Wang, Hongli, Juanzhen Sun, Xin Zhang, Xiang-Yu Huang, and Thomas Auligné. "Radar Data Assimilation with WRF 4D-Var. Part I: System Development and Preliminary Testing." Monthly Weather Review 141, no. 7 (July 1, 2013): 2224–44. http://dx.doi.org/10.1175/mwr-d-12-00168.1.

Abstract:
The major goal of this two-part study is to assimilate radar data into the high-resolution Advanced Research Weather Research and Forecasting Model (ARW-WRF) for the improvement of short-term quantitative precipitation forecasting (QPF) using a four-dimensional variational data assimilation (4D-Var) technique. In Part I, the development of a radar data assimilation scheme within the WRF 4D-Var system and the preliminary testing of the scheme are described. In Part II, the performance of the enhanced WRF 4D-Var system is examined by comparing it with the three-dimensional variational data assimilation system (WRF 3D-Var) for a convective system over the U.S. Great Plains. The WRF 4D-Var radar data assimilation system has been developed within the existing framework of an incremental formulation. The new development for radar data assimilation includes the tangent-linear and adjoint models of a Kessler warm-rain microphysics scheme and the new control variables of cloud water, rainwater, and vertical velocity and their error statistics. An ensemble forecast with 80 members is used to produce the background error covariance. The preliminary testing presented in this paper includes single-observation experiments as well as real-data assimilation experiments on a squall line with assimilation windows of 5, 15, and 30 min. The results indicate that the system is able to obtain anisotropic multivariate analyses at the convective scale and improve precipitation forecasts. The results also suggest that the incremental approach with successive basic-state updates works well at the convection-permitting scale for radar data assimilation with the selected assimilation windows.
19

Lin, Xuxin, Jianwen Gan, Chaohao Jiang, Shuai Xue, and Yanyan Liang. "Wi-Fi-Based Indoor Localization and Navigation: A Robot-Aided Hybrid Deep Learning Approach." Sensors 23, no. 14 (July 12, 2023): 6320. http://dx.doi.org/10.3390/s23146320.

Abstract:
Indoor localization and navigation have become an increasingly important problem in both industry and academia with the widespread use of mobile smart devices and the development of network techniques. The Wi-Fi-based technology shows great potential for applications due to the ubiquitous Wi-Fi infrastructure in public indoor environments. Most existing approaches use trilateration or machine learning methods to predict locations from a set of annotated Wi-Fi observations. However, annotated data are not always readily available. In this paper, we propose a robot-aided data collection strategy to obtain the limited but high-quality labeled data and a large amount of unlabeled data. Furthermore, we design two deep learning models based on a variational autoencoder for the localization and navigation tasks, respectively. To make full use of the collected data, a hybrid learning approach is developed to train the models by combining supervised, unsupervised and semi-supervised learning strategies. Extensive experiments suggest that our approach enables the models to learn effective knowledge from unlabeled data with incremental improvements, and it can achieve promising localization and navigation performance in a complex indoor environment with obstacles.
20

Buehner, Mark, Ron McTaggart-Cowan, Alain Beaulne, Cécilien Charette, Louis Garand, Sylvain Heilliette, Ervig Lapalme, et al. "Implementation of Deterministic Weather Forecasting Systems Based on Ensemble–Variational Data Assimilation at Environment Canada. Part I: The Global System." Monthly Weather Review 143, no. 7 (July 1, 2015): 2532–59. http://dx.doi.org/10.1175/mwr-d-14-00354.1.

Abstract:
A major set of changes was made to the Environment Canada global deterministic prediction system during the fall of 2014, including the replacement of four-dimensional variational data assimilation (4DVar) by four-dimensional ensemble–variational data assimilation (4DEnVar). The new system provides improved forecast accuracy relative to the previous system, based on results from two sets of two-month data assimilation and forecast experiments. The improvements are largest at shorter lead times, but significant improvements are maintained in the 120-h forecasts for most regions and vertical levels. The improvements result from the combined impact of numerous changes in addition to the use of 4DEnVar. These include an improved treatment of radiosonde and aircraft observations, an improved radiance bias correction procedure, the assimilation of ground-based GPS data, a doubling of the number of assimilated channels from hyperspectral infrared sounders, and an improved approach for initializing model forecasts. Because of the replacement of 4DVar with 4DEnVar, the new system is also more computationally efficient and easier to parallelize, facilitating a doubling of the analysis-increment horizontal resolution. Replacement of a full-field digital filter with the 4D incremental analysis update approach, and the recycling of several key variables that are not directly analyzed, significantly reduced the model spinup during both the data assimilation cycle and medium-range forecasts.
21

Ting, Jo-Anne, Aaron D'Souza, Sethu Vijayakumar, and Stefan Schaal. "Efficient Learning and Feature Selection in High-Dimensional Regression." Neural Computation 22, no. 4 (April 2010): 831–86. http://dx.doi.org/10.1162/neco.2009.02-08-702.

Abstract:
We present a novel algorithm for efficient learning and feature selection in high-dimensional regression problems. We arrive at this model through a modification of the standard regression model, enabling us to derive a probabilistic version of the well-known statistical regression technique of backfitting. Using the expectation-maximization algorithm, along with variational approximation methods to overcome intractability, we extend our algorithm to include automatic relevance detection of the input features. This variational Bayesian least squares (VBLS) approach retains its simplicity as a linear model, but offers a novel statistically robust black-box approach to generalized linear regression with high-dimensional inputs. It can be easily extended to nonlinear regression and classification problems. In particular, we derive the framework of sparse Bayesian learning, the relevance vector machine, with VBLS at its core, offering significant computational and robustness advantages for this class of methods. The iterative nature of VBLS makes it most suitable for real-time incremental learning, which is crucial especially in the application domain of robotics, brain-machine interfaces, and neural prosthetics, where real-time learning of models for control is needed. We evaluate our algorithm on synthetic and neurophysiological data sets, as well as on standard regression and classification benchmark data sets, comparing it with other competitive statistical approaches and demonstrating its suitability as a drop-in replacement for other generalized linear regression techniques.
22

Daescu, Dacian N., and Ricardo Todling. "Adjoint Estimation of the Variation in Model Functional Output due to the Assimilation of Data." Monthly Weather Review 137, no. 5 (May 1, 2009): 1705–16. http://dx.doi.org/10.1175/2008mwr2659.1.

Abstract:
A parametric approach to the adjoint estimation of the variation in model functional output due to the assimilation of data is considered as a tool to analyze and develop observation impact measures. The parametric approach is specialized to a linear analysis scheme and is used to derive various high-order approximation equations. This framework includes the Kalman filter and incremental three- and four-dimensional variational data assimilation schemes implementing a single outer-loop iteration. Distinction is made between Taylor series methods and numerical quadrature methods. The novel quadrature approximations require minimal additional software development and are suitable for testing and implementation at operational numerical weather prediction centers where a data assimilation system (DAS) and the associated adjoint DAS are in place. Their potential use as tools for observation impact estimates needs to be further investigated. Preliminary numerical experiments are provided using the fifth-generation NASA Goddard Earth Observing System (GEOS-5) atmospheric DAS.
23

Toledano, A., and H. Murakami. "High-Order Mixture Homogenization of Fiber-Reinforced Composites." Journal of Energy Resources Technology 113, no. 4 (December 1, 1991): 254–63. http://dx.doi.org/10.1115/1.2905909.

Abstract:
An asymptotic mixture theory of fiber-reinforced composites with periodic microstructure is presented for rate-independent inelastic responses, such as elastoplastic deformation. Key elements are the modeling capability of simulating critical interaction across material interfaces and the inclusion of the kinetic energy of micro-displacements. The construction of the proposed mixture model, which is deterministic rather than phenomenological, is accomplished by resorting to a variational approach. The principle of virtual work is used for total quantities to derive mixture equations of motion and boundary conditions, while Reissner’s mixed variational principle (1984, 1986), applied to the incremental boundary value problem, yields consistent mixture constitutive relations. In order to assess the model accuracy, numerical experiments were conducted for static and dynamic loads. The prediction of the model in the time domain was obtained by an explicit finite element code. DYNA2D is used to furnish numerically exact data for the problems by discretizing the details of the microstructure. On the other hand, the model capability of predicting effective tangent moduli was tested by comparing results with NIKE2D. In all cases, good agreement was observed between the predicted and exact data for plastic, as well as elastic, responses.
24

Liu, Chengsi, and Qingnong Xiao. "An Ensemble-Based Four-Dimensional Variational Data Assimilation Scheme. Part III: Antarctic Applications with Advanced Research WRF Using Real Data." Monthly Weather Review 141, no. 8 (July 25, 2013): 2721–39. http://dx.doi.org/10.1175/mwr-d-12-00130.1.

Abstract:
A four-dimensional ensemble-based variational data assimilation (4DEnVar) algorithm proposed in Part I of the 4DEnVar series (denoted En4DVar in Part I, but referred to here as 4DEnVar, following the WMO conference recommendation, to differentiate it from the En4DVar algorithm that uses an adjoint model) uses a flow-dependent background error covariance calculated from ensemble forecasts and performs 4DVar optimization based on an incremental approach and a preconditioning algorithm. In Part II, the authors evaluated 4DEnVar with observing system simulation experiments (OSSEs) using the Advanced Research Weather Research and Forecasting Model (ARW-WRF, hereafter WRF). The current study extends 4DEnVar to assimilate real observations for a cyclone in the Antarctic and the Southern Ocean in October 2007. The authors performed an intercomparison of four different WRF variational approaches for the case: three-dimensional variational data assimilation (3DVar), first guess at the appropriate time (FGAT), and ensemble-based three-dimensional (En3DVar) and four-dimensional (4DEnVar) variational data assimilation. It is found that all data assimilation approaches produce positive impacts in this case. Applying the flow-dependent background error covariance in En3DVar and 4DEnVar yields forecast skill superior to that obtained with the homogeneous and isotropic background error covariance in 3DVar and FGAT. In addition, the authors carried out 3-day FGAT and 4DEnVar cycling and 72-h forecasts. The results show that 4DEnVar produces better performance in the cyclone prediction, and the inflation factor can effectively improve the 4DEnVar analysis. The authors also conducted a short period (the 10-day lifetime of the cyclone in the domain) of analysis/forecast intercomparison experiments using 4DEnVar, FGAT, and 3DVar. The 4DEnVar scheme demonstrates overall superior and robust performance.
APA, Harvard, Vancouver, ISO, and other styles
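The incremental approach mentioned in this abstract (and in several assimilation entries below) replaces the full nonlinear 4DVar cost function with a quadratic one in the analysis increment. In standard textbook notation (a generic sketch, not taken from any single paper listed here), the inner-loop problem is:

```latex
J(\delta\mathbf{x})
  = \tfrac{1}{2}\,\delta\mathbf{x}^{\mathrm{T}}\mathbf{B}^{-1}\,\delta\mathbf{x}
  + \tfrac{1}{2}\sum_{i=0}^{N}
      \bigl(\mathbf{H}_i\mathbf{M}_i\,\delta\mathbf{x}-\mathbf{d}_i\bigr)^{\mathrm{T}}
      \mathbf{R}_i^{-1}
      \bigl(\mathbf{H}_i\mathbf{M}_i\,\delta\mathbf{x}-\mathbf{d}_i\bigr)
```

where the innovations are d_i = y_i − H_i(M_i(x_b)), B and R_i are the background and observation error covariances, and M_i and H_i are the tangent linear model and observation operators. Because J is quadratic, the inner loop can be minimized efficiently (typically by preconditioned conjugate-gradient methods), with outer loops relinearizing the trajectory.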
25

Mancilla, Javier, and Christophe Pere. "A Preprocessing Perspective for Quantum Machine Learning Classification Advantage in Finance Using NISQ Algorithms." Entropy 24, no. 11 (November 15, 2022): 1656. http://dx.doi.org/10.3390/e24111656.

Full text
Abstract:
Quantum Machine Learning (QML) has not yet demonstrated extensively and clearly its advantages compared to the classical machine learning approach. So far, there are only specific cases where some quantum-inspired techniques have achieved small incremental advantages, and a few experimental cases in hybrid quantum computing are promising, considering a mid-term future (not taking into account the achievements purely associated with optimization using quantum-classical algorithms). The current quantum computers are noisy and have few qubits to test, making it difficult to demonstrate the current and potential quantum advantage of QML methods. This study shows that we can achieve better classical encoding and performance of quantum classifiers by using Linear Discriminant Analysis (LDA) during the data preprocessing step. As a result, the Variational Quantum Algorithm (VQA) shows a gain of performance in balanced accuracy with the LDA technique and outperforms baseline classical classifiers.
APA, Harvard, Vancouver, ISO, and other styles
26

Zhao, Qingyun, John Cook, Qin Xu, and Paul R. Harasti. "Improving Short-Term Storm Predictions by Assimilating both Radar Radial-Wind and Reflectivity Observations." Weather and Forecasting 23, no. 3 (June 1, 2008): 373–91. http://dx.doi.org/10.1175/2007waf2007038.1.

Full text
Abstract:
Abstract A high-resolution data assimilation system is under development at the Naval Research Laboratory (NRL). The objective of this development is to assimilate high-resolution data, especially those from Doppler radars, into the U.S. Navy’s Coupled Ocean–Atmosphere Mesoscale Prediction System to improve the model’s capability and accuracy in short-term (0–6 h) prediction of hazardous weather for nowcasting. A variational approach is used in this system to assimilate the radar observations into the model. The system is upgraded in this study with new capabilities to assimilate not only the radar radial-wind data but also reflectivity data. Two storm cases are selected to test the upgraded system and to study the impact of radar data assimilation on model forecasts. Results from the data assimilation experiments show significant improvements in storm prediction especially when both radar radial-wind and reflectivity observations are assimilated and the analysis incremental fields are adequately constrained by the model’s dynamics and properly adjusted to satisfy the model’s thermodynamical balance.
APA, Harvard, Vancouver, ISO, and other styles
27

Lancioni, Giovanni, Gianluca Zitti, and Tuncay Yalcinkaya. "Rate-Independent Deformation Patterning in Crystal Plasticity." Key Engineering Materials 651-653 (July 2015): 944–49. http://dx.doi.org/10.4028/www.scientific.net/kem.651-653.944.

Full text
Abstract:
Metal forming processes involve continuous strain path changes inducing plastic anisotropy which could result in the failure of the material. It has been often observed that the formation and evolution of meso-scale dislocation microstructures under monotonic and non-proportional loading have substantial effect on the induced anisotropy. It is therefore quite crucial to study the microstructure evolution to understand the underlying physics of the macroscopic transient plastic behavior. In this context the deformation patterning induced by the non-convex plastic energies is investigated in a multi-slip crystal plasticity framework. An incremental variational approach is followed, which results in a rate-independent model exhibiting a number of similarities to the rate-dependent formulation proposed in [Yalcinkaya, Brekelmans, Geers, Int. J. of Solids and Structures, 49, 2625-2636, 2012]. However there is a pronounced difference in the dissipative character of the models. The influence of the plastic potential on the evolution of dislocation microstructures is studied through a Landau-Devonshire double-well plastic potential. Numerical simulations are performed and the results are discussed with respect to the observed microstructure evolution in metals.
APA, Harvard, Vancouver, ISO, and other styles
28

Zheng, Tao, Nancy H. F. French, and Martin Baxter. "Development of the WRF-CO2 4D-Var assimilation system v1.0." Geoscientific Model Development 11, no. 5 (May 4, 2018): 1725–52. http://dx.doi.org/10.5194/gmd-11-1725-2018.

Full text
Abstract:
Abstract. Regional atmospheric CO2 inversions commonly use Lagrangian particle trajectory model simulations to calculate the required influence function, which quantifies the sensitivity of a receptor to flux sources. In this paper, an adjoint-based four-dimensional variational (4D-Var) assimilation system, WRF-CO2 4D-Var, is developed to provide an alternative approach. This system is developed based on the Weather Research and Forecasting (WRF) modeling system, including the system coupled to chemistry (WRF-Chem), with tangent linear and adjoint codes (WRFPLUS), and with data assimilation (WRFDA), all in version 3.6. In WRF-CO2 4D-Var, CO2 is modeled as a tracer and its feedback to meteorology is ignored. This configuration allows most WRF physical parameterizations to be used in the assimilation system without incurring a large amount of code development. WRF-CO2 4D-Var solves for the optimized CO2 flux scaling factors in a Bayesian framework. Two variational optimization schemes are implemented for the system: the first uses the limited memory Broyden–Fletcher–Goldfarb–Shanno (BFGS) minimization algorithm (L-BFGS-B) and the second uses the Lanczos conjugate gradient (CG) in an incremental approach. WRFPLUS forward, tangent linear, and adjoint models are modified to include the physical and dynamical processes involved in the atmospheric transport of CO2. The system is tested by simulations over a domain covering the continental United States at 48 km × 48 km grid spacing. The accuracy of the tangent linear and adjoint models is assessed by comparing against finite difference sensitivity. The system's effectiveness for CO2 inverse modeling is tested using pseudo-observation data. The results of the sensitivity and inverse modeling tests demonstrate the potential usefulness of WRF-CO2 4D-Var for regional CO2 inversions.
APA, Harvard, Vancouver, ISO, and other styles
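The Lanczos/CG inner loop mentioned in this abstract amounts to solving a symmetric positive-definite linear system arising from the quadratic incremental cost. A minimal, library-free sketch of conjugate gradients (the 3×3 system below is invented for illustration and stands in for the 4D-Var Hessian system, not for anything in the paper):

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for symmetric positive-definite A, i.e. minimize
    0.5*x^T A x - b^T x, as in the inner loop of incremental 4D-Var."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual b - A x (x = 0 initially)
    p = r[:]
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new ** 0.5 < tol:
            break
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

# Invented SPD system standing in for (B^-1 + H^T R^-1 H) dx = H^T R^-1 d
A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]
dx = conjugate_gradient(A, b)
```

In exact arithmetic CG converges in at most n iterations for an n-dimensional system, which is why truncated Lanczos/CG iterations are attractive when each matrix-vector product costs one tangent linear plus one adjoint model integration.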
29

Tressou, Benjamin, Reza Vaziri, and Carole Nadot-Martin. "Application of the incremental variational approach (EIV model) to the linear viscoelastic homogenization of different types of microstructures: long fiber-, particle-reinforced and strand-based composites." European Journal of Mechanics - A/Solids 68 (March 2018): 104–16. http://dx.doi.org/10.1016/j.euromechsol.2017.10.006.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Berak, E. G., and J. C. Gerdeen. "A Finite Element Technique for Limit Analysis of Structures." Journal of Pressure Vessel Technology 112, no. 2 (May 1, 1990): 138–44. http://dx.doi.org/10.1115/1.2928599.

Full text
Abstract:
Limit analysis provides an alternative to incremental elastic-plastic analysis for determining a limit load. The limit load is obtained from the lower and upper-bound theorems. These theorems, which are based on variational principles, establish the static and kinematic methods, respectively, and are particularly attractive for finite element implementation. A finite element approach using the definition of the p-norm is developed for calculating upper and lower bounds of the limit load multiplier for two-dimensional, rigid perfectly plastic structures which obey the von Mises yield criterion. Displacement and equilibrium building block quadrilateral elements are used in these dual upper and lower-bound formulations, respectively. The nonlinear finite element equations are transformed into systems of linear algebraic equations during the iteration process, and the solution vectors are determined using a frontal equation solver. The upper and lower-bound solutions are obtained in a reasonable number of iteration steps, and provide a good estimate of the limit load multiplier. Numerical results are provided to demonstrate this finite element procedure. In addition, this procedure is particularly applicable to the solution of complex problems using parallel processing on a supercomputer.
APA, Harvard, Vancouver, ISO, and other styles
31

Kim, S., B. J. Jung, and Y. Jo. "Development of a tangent linear model (version 1.0) for the High-Order Method Modeling Environment dynamical core." Geoscientific Model Development 7, no. 3 (June 17, 2014): 1175–82. http://dx.doi.org/10.5194/gmd-7-1175-2014.

Full text
Abstract:
Abstract. We describe development and validation of a tangent linear model for the High-Order Method Modeling Environment, the default dynamical core in the Community Atmosphere Model and the Community Earth System Model that solves a primitive hydrostatic equation using a spectral element method. A tangent linear model is primarily intended to approximate the evolution of perturbations generated by a nonlinear model, provides a computationally efficient way to calculate a nonlinear model trajectory for a short time range, and serves as an intermediate step to write and test adjoint models, as the forward model in the incremental approach to four-dimensional variational data assimilation, and as a tool for stability analysis. Each module in the tangent linear model (version 1.0) is linearized by hands-on derivations, and is validated by the Taylor–Lagrange formula. The linearity checks confirm all modules correctly developed, and the field results of the tangent linear modules converge to the difference field of two nonlinear modules as the magnitude of the initial perturbation is sequentially reduced. Also, experiments for stable integration of the tangent linear model (version 1.0) show that the linear model is also suitable with an extended time step size compared to the time step of the nonlinear model without reducing spatial resolution, or increasing further computational cost. Although the scope of the current implementation leaves room for a set of natural extensions, the results and diagnostic tools presented here should provide guidance for further development of the next generation of the tangent linear model, the corresponding adjoint model, and four-dimensional variational data assimilation, with respect to resolution changes and improvements in linearized physics and dynamics.
APA, Harvard, Vancouver, ISO, and other styles
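The Taylor–Lagrange validation described in this abstract can be illustrated on a toy scalar "model": as the perturbation size ε is sequentially reduced, the ratio of the nonlinear difference to the tangent linear prediction should approach 1. A minimal sketch (the cubic model below is invented for illustration and has nothing to do with the HOMME dynamical core):

```python
import math

def model(x):
    """Toy nonlinear 'forecast model' step (invented for illustration)."""
    return x ** 3 + math.sin(x)

def tangent_linear(x, dx):
    """Hand-derived tangent linear of the toy model."""
    return (3 * x ** 2 + math.cos(x)) * dx

x0, dx0 = 0.7, 1.0
ratios = []
for k in range(1, 8):
    eps = 10.0 ** (-k)
    nonlinear_diff = model(x0 + eps * dx0) - model(x0)
    ratios.append(nonlinear_diff / (eps * tangent_linear(x0, dx0)))
# The ratios converge to 1 as eps is sequentially reduced.
```

This is the same check applied module by module in the paper: the tangent linear field converges to the difference of two nonlinear runs as the initial perturbation shrinks.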
32

Cleja-Ţigoiu, Sanda. "Disclinations and GND tensor effects on the multislip flow rule in crystal plasticity." Mathematics and Mechanics of Solids 25, no. 8 (February 3, 2020): 1643–76. http://dx.doi.org/10.1177/1081286519896394.

Full text
Abstract:
This paper deals with new elastoplastic models for crystalline materials with microstructural defects, such as dislocations and disclinations, which are consistent with the multislip plastic flow rule, and compatible with the free energy imbalance principle. The defect free energy function is a function of the disclination tensor and its gradient, and of the geometrically necessary dislocation (GND) tensor, via the Cartan torsion. By applying the free energy imbalance, the appropriate viscoplastic (diffusion-like) evolution equations are derived for shear plastic rates (in slip systems) and for the disclination tensor. The two sets of differential (or partial differential, i.e., non-local) equations describe the rate form of the adopted disclination–dislocation model. The first set is typical for finite deformation formalism, while the second set refers to the evolution equations with respect to the reference configuration. The dislocation appears to be a source for producing disclination defects. A pure dislocation elastoplastic model is also proposed. Multislip models with disclination within the small deformation approach are derived from the finite deformation models. The initial and boundary value problems are formulated and the incremental (rate) equilibrium equation leads to a variational equality for the velocity field, at any time, which is coupled with the rate type models for the set of variables. First, the elastic problem is solved for a certain time interval by assuming that the existing defects inside the body remain inactive. Subsequently, the variational equality is solved for the velocity field, at any time, if the slip systems are activated. Consequently, the state of the body with defects is defined by the solution of the differential-type equations, when the velocity field is known for a certain time interval. Appropriate initial conditions are necessary, including those associated with defects which became active. Finally, an update algorithm must be provided in order to compute the fields at the current moment.
APA, Harvard, Vancouver, ISO, and other styles
33

Song, Hajoon, Christopher A. Edwards, Andrew M. Moore, and Jerome Fiechter. "Data assimilation in a coupled physical–biogeochemical model of the California Current System using an incremental lognormal 4-dimensional variational approach: Part 1—Model formulation and biological data assimilation twin experiments." Ocean Modelling 106 (October 2016): 131–45. http://dx.doi.org/10.1016/j.ocemod.2016.04.001.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Song, Hajoon, Christopher A. Edwards, Andrew M. Moore, and Jerome Fiechter. "Data assimilation in a coupled physical-biogeochemical model of the California Current System using an incremental lognormal 4-dimensional variational approach: Part 2—Joint physical and biological data assimilation twin experiments." Ocean Modelling 106 (October 2016): 146–58. http://dx.doi.org/10.1016/j.ocemod.2016.09.003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Noii, Nima, Amirreza Khodadadian, Jacinto Ulloa, Fadi Aldakheel, Thomas Wick, Stijn François, and Peter Wriggers. "Bayesian inversion for unified ductile phase-field fracture." Computational Mechanics 68, no. 4 (August 26, 2021): 943–80. http://dx.doi.org/10.1007/s00466-021-02054-w.

Full text
Abstract:
Abstract The prediction of crack initiation and propagation in ductile failure processes are challenging tasks for the design and fabrication of metallic materials and structures on a large scale. Numerical aspects of ductile failure dictate a sub-optimal calibration of plasticity- and fracture-related parameters for a large number of material properties. These parameters enter the system of partial differential equations as a forward model. Thus, an accurate estimation of the material parameters enables the precise determination of the material response in different stages, particularly for the post-yielding regime, where crack initiation and propagation take place. In this work, we develop a Bayesian inversion framework for ductile fracture to provide accurate knowledge regarding the effective mechanical parameters. To this end, synthetic and experimental observations are used to estimate the posterior density of the unknowns. To model the ductile failure behavior of solid materials, we rely on the phase-field approach to fracture, for which we present a unified formulation that allows recovering different models on a variational basis. In the variational framework, incremental minimization principles for a class of gradient-type dissipative materials are used to derive the governing equations. The overall formulation is revisited and extended to the case of anisotropic ductile fracture. Three different models are subsequently recovered by certain choices of parameters and constitutive functions, which are later assessed through Bayesian inversion techniques. A step-wise Bayesian inversion method is proposed to determine the posterior density of the material unknowns for a ductile phase-field fracture process. To estimate the posterior density function of ductile material parameters, three common Markov chain Monte Carlo (MCMC) techniques are employed: (i) the Metropolis–Hastings algorithm, (ii) delayed-rejection adaptive Metropolis, and (iii) ensemble Kalman filter combined with MCMC. To examine the computational efficiency of the MCMC methods, we employ the $\hat{R}$-convergence tool. The resulting framework is algorithmically described in detail and substantiated with numerical examples.
APA, Harvard, Vancouver, ISO, and other styles
36

Gauthier, Pierre, Monique Tanguay, Stéphane Laroche, Simon Pellerin, and Josée Morneau. "Extension of 3DVAR to 4DVAR: Implementation of 4DVAR at the Meteorological Service of Canada." Monthly Weather Review 135, no. 6 (June 1, 2007): 2339–54. http://dx.doi.org/10.1175/mwr3394.1.

Full text
Abstract:
Abstract On 15 March 2005, the Meteorological Service of Canada (MSC) proceeded to the implementation of a four-dimensional variational data assimilation (4DVAR) system, which led to significant improvements in the quality of global forecasts. This paper describes the different elements of MSC’s 4DVAR assimilation system, discusses some issues encountered during the development, and reports on the overall results from the 4DVAR implementation tests. The 4DVAR system adopted an incremental approach with two outer iterations. The simplified model used in the minimization has a horizontal resolution of 170 km and its simplified physics includes vertical diffusion, surface drag, orographic blocking, stratiform condensation, and convection. One important element of the design is its modularity, which has permitted continued progress on the three-dimensional variational data assimilation (3DVAR) component (e.g., addition of new observation types) and the model (e.g., computational and numerical changes). This paper discusses some numerical problems that occur in the vicinity of the Poles where the semi-Lagrangian scheme becomes unstable when there is a simultaneous occurrence of converging meridians and strong wind gradients. These could be removed by filtering the winds in the zonal direction before they are used to estimate the upstream position in the semi-Lagrangian scheme. The results show improvements in all aspects of the forecasts over all regions. The impact is particularly significant in the Southern Hemisphere where 4DVAR is able to extract more information from satellite data. In the Northern Hemisphere, 4DVAR accepts more asynoptic data, in particular coming from profilers and aircrafts. The impact noted is also positive and the short-term forecasts are particularly improved over the west coast of North America. Finally, the dynamical consistency of the 4DVAR global analyses leads to a significant impact on regional forecasts. Experimentation has shown that regional forecasts initiated directly from a 4DVAR global analysis are improved with respect to the regional forecasts resulting from the regional 3DVAR analysis.
APA, Harvard, Vancouver, ISO, and other styles
37

Song, Hajoon, Christopher A. Edwards, Andrew M. Moore, and Jerome Fiechter. "Data assimilation in a coupled physical-biogeochemical model of the California current system using an incremental lognormal 4-dimensional variational approach: Part 3—Assimilation in a realistic context using satellite and in situ observations." Ocean Modelling 106 (October 2016): 159–72. http://dx.doi.org/10.1016/j.ocemod.2016.06.005.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Kim, S., B. J. Jung, and Y. Jo. "Development of a tangent linear model (version 1.0) for the high-order method modelling environment dynamical core." Geoscientific Model Development Discussions 7, no. 1 (January 28, 2014): 1175–96. http://dx.doi.org/10.5194/gmdd-7-1175-2014.

Full text
Abstract:
Abstract. We describe development and validation of a tangent linear model for the High-Order Method Modelling Environment, the default dynamical core in the Community Atmosphere Model and the Community Earth System Model that solves a primitive hydrostatic equation using a spectral element. A tangent linear model is primarily intended to approximate the evolution of perturbations generated by a nonlinear model, provides a computationally efficient way to calculate a nonlinear model trajectory for a short time range, and serves as an intermediate step to write and test adjoint models, as the forward model in the incremental approach to 4-D-Var, and as a tool for stability analysis. Each module in the tangent linear model (version 1.0) is linearized by hands-on derivations, and is validated by the Taylor–Lagrange formula. The linearity checks confirm all modules correctly developed, and the field results of the tangent linear modules converge to the difference field of two nonlinear modules as the magnitude of the initial perturbation is sequentially reduced. Also, experiments for stable integration of the tangent linear model (version 1.0) show that the linear model is also suitable with an extended time step size compared to the time step of the nonlinear model without reducing spatial resolution, or increasing further computational cost. Although the scope of the current implementation leaves room for a set of natural extensions, the results and diagnostic tools presented here should provide guidance for further development of the next generation of the tangent linear model, the corresponding adjoint model, and 4-dimensional variational data assimilation, with respect to resolution changes and improvements in linearized physics and dynamics.
APA, Harvard, Vancouver, ISO, and other styles
39

Ziou, Hassina, and Mohamed Guenfoud. "SIMPLE INCREMENTAL APPROACH FOR ANALYSING OPTIMAL NON-PRISMATIC FUNCTIONALLY GRADED BEAMS." Advances in Civil and Architectural Engineering 14, no. 26 (April 17, 2023): 118–37. http://dx.doi.org/10.13167/2023.26.8.

Full text
Abstract:
This paper presents a simple incremental approach of analysing the static behaviour of functionally graded tapered beams. This approach involves dividing the non-uniform beam into segments with uniform cross-sections, and using two separate finite element models to analyse the structural behavior of slender beams (Euler-Bernoulli model) and deep beams (Timoshenko beam theory). The material properties of the beam vary according to a power law distribution through the thickness, resulting in smooth variations in the mechanical properties. The finite element system of equations is obtained using the principle of virtual work. Detailed information on the shape functions and stiffness matrix of the beam is provided, and the numerical results are evaluated and validated using data from the literature. The comparison demonstrates that the response of the functionally graded tapered beams is accurately assessed by the proposed approach. Additionally, the effects of material distribution, boundary conditions, and tapering parameter on the deflection behavior are presented. Results show that an increase in the power law index increases the flexibility of the functionally graded tapered beams, resulting in higher deflection. Furthermore, lower tapering parameters also result in higher deflection. Compared to other boundary conditions, clamped-clamped boundary conditions demonstrate the best performance in terms of maximum deflection.
APA, Harvard, Vancouver, ISO, and other styles
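The segmentation idea in this abstract — replacing a non-uniform beam by piecewise-uniform segments — can be illustrated on the simplest case: a tapered cantilever under a tip load, with the tip deflection obtained from the unit-load integral δ = ∫ M m / (EI(x)) dx evaluated segment by segment. All numbers and the linear taper law below are invented, and this plain Euler–Bernoulli sketch is not the paper's finite element formulation:

```python
def tip_deflection(P, L, E, I_of_x, n_seg=200):
    """Cantilever tip deflection under a tip load P via the unit-load
    method, treating each segment as uniform (midpoint rule):
    delta = sum over segments of P*(L - x)^2 / (E*I(x)) * dx."""
    dx = L / n_seg
    delta = 0.0
    for k in range(n_seg):
        x = (k + 0.5) * dx          # segment midpoint, x from the fixed end
        delta += P * (L - x) ** 2 / (E * I_of_x(x)) * dx
    return delta

# Invented data: 2 m beam, linearly tapered second moment of area
P, L, E = 1000.0, 2.0, 70e9
I_root, I_tip = 8e-6, 4e-6
I_taper = lambda x: I_root + (I_tip - I_root) * x / L
delta_tapered = tip_deflection(P, L, E, I_taper)
delta_uniform = tip_deflection(P, L, E, lambda x: I_root)
# With zero taper this should approach the closed form P*L^3 / (3*E*I_root).
```

The uniform-section limit recovers the textbook result, and reducing the stiffness toward the tip increases the deflection, consistent with the abstract's observation that lower tapering parameters yield higher deflection.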
40

Black, Paul, Iqbal Gondal, Adil Bagirov, and Md Moniruzzaman. "Malware Variant Identification Using Incremental Clustering." Electronics 10, no. 14 (July 8, 2021): 1628. http://dx.doi.org/10.3390/electronics10141628.

Full text
Abstract:
Dynamic analysis and pattern matching techniques are widely used in industry, and they provide a straightforward method for the identification of malware samples. Yara is a pattern matching technique that can use sandbox memory dumps for the identification of malware families. However, pattern matching techniques fail silently due to minor code variations, leading to unidentified malware samples. This paper presents a two-layered Malware Variant Identification using Incremental Clustering (MVIIC) process and proposes clustering of unidentified malware samples to enable the identification of malware variants and new malware families. The novel incremental clustering algorithm is used in the identification of new malware variants from the unidentified malware samples. This research shows that clustering can provide a higher level of performance than Yara rules, and that clustering is resistant to small changes introduced by malware variants. This paper proposes a hybrid approach, using Yara scanning to eliminate known malware, followed by clustering, acting in concert, to allow the identification of new malware variants. F1 score and V-Measure clustering metrics are used to evaluate our results.
APA, Harvard, Vancouver, ISO, and other styles
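The MVIIC algorithm itself is not reproduced in this listing; as a generic illustration of the incremental idea, an incoming feature vector either joins the nearest existing cluster (when close enough) or seeds a new one. The threshold, centroid update, and 2-D "feature vectors" below are all invented for the sketch:

```python
def incremental_cluster(samples, threshold):
    """Assign each incoming feature vector to the nearest existing
    cluster centroid, or start a new cluster if none is close enough."""
    centroids, members = [], []
    for s in samples:
        best, best_d = None, None
        for i, c in enumerate(centroids):
            d = sum((a - b) ** 2 for a, b in zip(s, c)) ** 0.5
            if best_d is None or d < best_d:
                best, best_d = i, d
        if best is not None and best_d <= threshold:
            members[best].append(s)
            n = len(members[best])
            centroids[best] = [sum(col) / n for col in zip(*members[best])]
        else:
            centroids.append(list(s))
            members.append([s])
    return members

# Two tight groups of invented 2-D feature vectors
data = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 4.9), (0.05, 0.1)]
clusters = incremental_cluster(data, threshold=1.0)
```

Because samples are processed one at a time, new malware variants that fall outside every existing cluster automatically open a new cluster, which is the property the paper exploits to surface new families among unidentified samples.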
41

Jung, Useok, Moon-Gyeang Cho, Ji-Won Woo, and Chang-Joo Kim. "Trajectory-Tracking Controller Design of Rotorcraft Using an Adaptive Incremental-Backstepping Approach." Aerospace 8, no. 9 (September 4, 2021): 248. http://dx.doi.org/10.3390/aerospace8090248.

Full text
Abstract:
This paper treats a robust adaptive trajectory-tracking control design for a rotorcraft using a high-fidelity math model subject to model uncertainties. In order to control the nonlinear rotorcraft model which shows strong inter-axis coupling and high nonlinearity, incremental backstepping approach with state-dependent control effectiveness matrix is utilized. Since the incremental backstepping control suffers from performance degradation in the presence of control matrix uncertainties due to change of flight conditions, control system robustness is improved by combining the least squares parameter estimator to estimate time varying uncertainties contained in the control effectiveness matrix. Also, by selecting a suitable gain set by investigating the error dynamics, a uniform trajectory-tracking performance over operational flight envelope of the rotorcraft is ensured without resorting to the conventional gain scheduling method. To evaluate the proposed controller, comparative results between IBSC and Adaptive IBSC are provided in this paper with sequential maneuvers from the ADS-33E-PRF. The proposed method shows improved tracking performance under variations in control effective matrix in the flight simulation. Robust and stable parameter estimation is also guaranteed due to the implementation of the DF-RLS algorithm for the least squares estimator.
APA, Harvard, Vancouver, ISO, and other styles
42

Sturtevant, Nathan, Nicolas Decroocq, Aaron Tripodi, and Matthew Guzdial. "The Unexpected Consequence of Incremental Design Changes." Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment 16, no. 1 (October 1, 2020): 130–36. http://dx.doi.org/10.1609/aiide.v16i1.7421.

Full text
Abstract:
Human designers may find it difficult to anticipate the impact of small changes to some games, particularly in puzzle games. However, it is not difficult for computers to simulate all mechanical impacts of such small changes. This suggests that computers might be able to aid humans designers as they build and analyze game levels. This paper takes one step towards this larger goal by studying how Exhaustive Procedural Content Generation (EPCG) can be used for analysis of incremental changes of existing game levels. Using an incremental EPCG approach, we analyze all of the levels in the popular puzzle game Snakebird, showing that incremental variations in the level designs can significantly increase the length of the shortest possible solution. A user study on a subset of these modified levels shows that the modified levels are both interesting and challenging for humans to play. Thus, through the analysis of Snakebird, we demonstrate the broader potential for incremental applications of EPCG.
APA, Harvard, Vancouver, ISO, and other styles
43

Li, Shaoyong, Tianrui Li, and Dun Liu. "Incremental updating approximations in dominance-based rough sets approach under the variation of the attribute set." Knowledge-Based Systems 40 (March 2013): 17–26. http://dx.doi.org/10.1016/j.knosys.2012.11.002.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Li, Shaoyong, and Tianrui Li. "Incremental update of approximations in dominance-based rough sets approach under the variation of attribute values." Information Sciences 294 (February 2015): 348–61. http://dx.doi.org/10.1016/j.ins.2014.09.056.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Wang, Yu. "An Incremental Classification Algorithm for Mining Data with Feature Space Heterogeneity." Mathematical Problems in Engineering 2014 (2014): 1–9. http://dx.doi.org/10.1155/2014/327142.

Full text
Abstract:
Feature space heterogeneity often exists in many real world data sets so that some features are of different importance for classification over different subsets. Moreover, the pattern of feature space heterogeneity might dynamically change over time as more and more data are accumulated. In this paper, we develop an incremental classification algorithm, Supervised Clustering for Classification with Feature Space Heterogeneity (SCCFSH), to address this problem. In our approach, supervised clustering is implemented to obtain a number of clusters such that samples in each cluster are from the same class. After the removal of outliers, relevance of features in each cluster is calculated based on their variations in this cluster. The feature relevance is incorporated into distance calculation for classification. The main advantage of SCCFSH lies in the fact that it is capable of solving a classification problem with feature space heterogeneity in an incremental way, which is favorable for online classification tasks with continuously changing data. Experimental results on a series of data sets and application to a database marketing problem show the efficiency and effectiveness of the proposed approach.
APA, Harvard, Vancouver, ISO, and other styles
46

Yu, Min-Su, Tae-Won Jung, Dai-Yeol Yun, Chi-Gon Hwang, Sea-Young Park, Soon-Chul Kwon, and Kye-Dong Jung. "A Variational Autoencoder Cascade Generative Adversarial Network for Scalable 3D Object Generation and Reconstruction." Sensors 24, no. 3 (January 24, 2024): 751. http://dx.doi.org/10.3390/s24030751.

Full text
Abstract:
Generative Adversarial Networks (GANs) for 3D volume generation and reconstruction, such as shape generation, visualization, automated design, real-time simulation, and research applications, are receiving increased amounts of attention in various fields. However, challenges such as limited training data, high computational costs, and mode collapse issues persist. We propose combining a Variational Autoencoder (VAE) and a GAN to uncover enhanced 3D structures and introduce a stable and scalable progressive growth approach for generating and reconstructing intricate voxel-based 3D shapes. The cascade-structured network involves a generator and discriminator, starting with small voxel sizes and incrementally adding layers, while subsequently supervising the discriminator with ground-truth labels in each newly added layer to model a broader voxel space. Our method enhances the convergence speed and improves the quality of the generated 3D models through stable growth, thereby facilitating an accurate representation of intricate voxel-level details. Through comparative experiments with existing methods, we demonstrate the effectiveness of our approach in evaluating voxel quality, variations, and diversity. The generated models exhibit improved accuracy in 3D evaluation metrics and visual quality, making them valuable across various fields, including virtual reality, the metaverse, and gaming.
APA, Harvard, Vancouver, ISO, and other styles
47

Knežević, Nikola, Miloš Petrović, and Kosta Jovanović. "Cartesian Stiffness Shaping of Compliant Robots—Incremental Learning and Optimization Based on Sequential Quadratic Programming." Actuators 13, no. 1 (January 13, 2024): 32. http://dx.doi.org/10.3390/act13010032.

Full text
Abstract:
Emerging robotic systems with compliant characteristics, incorporating nonrigid links and/or elastic actuators, are opening new applications with advanced safety features, as well as improved performance and energy efficiency in contact tasks. However, the complexity of such systems poses challenges in modeling and control due to their nonlinear nature and model variations over time. To address these challenges, the paper introduces Locally Weighted Projection Regression (LWPR) and its online learning capabilities to keep the model of the compliant actuators accurate and make model-based control more robust. The approach is experimentally validated in Cartesian position and stiffness control for a 4 DoF planar robot driven by Variable Stiffness Actuators (VSA), whose real-time implementation is supported by the Sequential Least Squares Programming (SLSQP) optimization approach.
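LWPR itself combines receptive-field weighting with incremental partial least squares; a much-simplified locally weighted regression sketch of the underlying idea — fit a local linear model around each query point, downweighting distant samples — might look like this (all values illustrative):

```python
import numpy as np

def lwr_predict(x_query, X, y, bandwidth=0.3):
    """Predict y at x_query from a line fitted with Gaussian weights that
    favor training samples near the query point."""
    w = np.exp(-((X - x_query) ** 2) / (2 * bandwidth ** 2))
    A = np.vstack([np.ones_like(X), X]).T        # [1, x] design matrix
    W = np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return beta[0] + beta[1] * x_query

X = np.linspace(0, 2 * np.pi, 50)
y = np.sin(X)                                    # stand-in actuator response
print(lwr_predict(np.pi / 2, X, y))              # close to sin(pi/2) = 1
```

The appeal for compliant actuators is that such local models can be updated sample-by-sample as the plant drifts, which is what LWPR does efficiently online.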
APA, Harvard, Vancouver, ISO, and other styles
48

Borutzky, W., and J. Granda. "Bond graph based frequency domain sensitivity analysis of multidisciplinary systems." Proceedings of the Institution of Mechanical Engineers, Part I: Journal of Systems and Control Engineering 216, no. 1 (February 1, 2002): 85–99. http://dx.doi.org/10.1243/0959651021541453.

Full text
Abstract:
Multidisciplinary systems are described most suitably by bond graphs. In order to determine unnormalized frequency-domain sensitivities in symbolic form, this paper proposes to construct, in a systematic manner, a bond graph from another bond graph, called the associated incremental bond graph in this paper. Contrary to other approaches reported in the literature, the variables at the bonds of the incremental bond graph are not sensitivities but variations (incremental changes) in the power variables from their nominal values due to parameter changes; thus their product is power. For linear elements, the corresponding model in the incremental bond graph also has a linear characteristic. By deriving the system equations in symbolic state-space form from the incremental bond graph in the same way as they are derived from the initial bond graph, the sensitivity matrix of the system can be set up in symbolic form. Its entries are transfer functions depending on the nominal parameter values and on the nominal states and inputs of the original model. The sensitivities can be determined automatically by the bond graph preprocessor CAMP-G and the widely used program MATLAB together with the Symbolic Toolbox for symbolic mathematical calculation. No particular program is needed for the proposed approach. The initial bond graph model may be nonlinear and may contain controlled sources and multiport elements; in that case the sensitivity model is linear time-variant and must be solved in the time domain. The rationale and the generality of the proposed approach are presented. For illustration purposes, a mechatronic example system, a load positioned by a constant-excitation d.c. motor, is presented and its sensitivities are determined in symbolic form by means of CAMP-G/MATLAB.
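The incremental bond graph yields unnormalized sensitivities as symbolic transfer functions. Numerically, the same quantity ∂H/∂θ can be checked by finite differences; the sketch below does this for an assumed first-order motor-like transfer function H(jω) = K/(R + jωL), not the paper's actual example model.

```python
import numpy as np

def H(w, K=2.0, R=1.0, L=0.5):
    """Illustrative first-order transfer function at angular frequency w."""
    return K / (R + 1j * w * L)

def dH_dR(w, K=2.0, R=1.0, L=0.5, h=1e-6):
    """Unnormalized sensitivity of H with respect to R, via finite differences."""
    return (H(w, K, R + h, L) - H(w, K, R, L)) / h

w = 4.0
analytic = -2.0 / (1.0 + 1j * w * 0.5) ** 2   # dH/dR = -K/(R + jwL)^2
print(abs(dH_dR(w) - analytic) < 1e-4)        # → True
```

The symbolic route in the paper delivers this derivative as a closed-form transfer function in the nominal parameters, which the numeric check above merely confirms at one frequency.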
APA, Harvard, Vancouver, ISO, and other styles
49

Huang, Yanyan, Huijun Wang, and Ke Fan. "Improving the Prediction of the Summer Asian–Pacific Oscillation Using the Interannual Increment Approach." Journal of Climate 27, no. 21 (October 24, 2014): 8126–34. http://dx.doi.org/10.1175/jcli-d-14-00209.1.

Full text
Abstract:
The summer Asian–Pacific oscillation (APO) is a dominant teleconnection pattern over the extratropical Northern Hemisphere that links the large-scale atmospheric circulation anomalies over the Asian–North Pacific Ocean sector. In this study, the direct model outputs of the Development of a European Multimodel Ensemble System for Seasonal-to-Interannual Prediction (DEMETER) from 1960 to 2001, which are limited in their ability to predict the interannual variability of the summer Asian upper-tropospheric temperature and its decadal variations, are applied using the interannual increment approach to improve predictions of the summer APO. By treating the year-to-year increment as the predictand, the interannual increment scheme is shown to significantly improve the predictive ability for the interannual variability of the summer Asian upper-tropospheric temperature and its decadal variations. The improvements in the interannual and interdecadal summer APO variability predictions under the interannual increment scheme, relative to the original scheme, are clear and significant. Compared with the DEMETER direct outputs, the statistical model with two predictors, the APO and the sea surface temperature anomaly over the Atlantic, shows a significantly improved ability to predict the interannual variability of the summer rainfall over the middle and lower reaches of the Yangtze River valley (SRYR). This study therefore describes a more efficient approach for predicting the APO and the SRYR.
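The year-to-year increment scheme described above amounts to predicting the increment Δy(t) = y(t) − y(t−1) rather than the raw index, then reconstructing y(t) = y(t−1) + Δ̂. A minimal sketch with synthetic data (the "forecast" below is a noisy stand-in for a real model):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1960, 2002)
y = 0.05 * (years - 1960) + np.sin(years / 3.0)   # trend + interannual signal

dy = np.diff(y)                                    # observed year-to-year increments
dy_hat = dy + rng.normal(0.0, 0.05, dy.size)       # stand-in for a model forecast

y_hat = y[:-1] + dy_hat                            # reconstruct the raw index
corr = np.corrcoef(y_hat, y[1:])[0, 1]
print(corr > 0.99)                                 # → True
```

The reconstruction anchors each prediction to last year's observed value, which is one reason the increment predictand can outperform direct prediction of a drifting index.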
APA, Harvard, Vancouver, ISO, and other styles
50

Kazemi, Pezhman, Jaume Giralt, Christophe Bengoa, Armin Masoumian, and Jean-Philippe Steyer. "Fault detection and diagnosis in water resource recovery facilities using incremental PCA." Water Science and Technology 82, no. 12 (August 5, 2020): 2711–24. http://dx.doi.org/10.2166/wst.2020.368.

Full text
Abstract:
Because of the static nature of conventional principal component analysis (PCA), natural process variations may be interpreted as faults when it is applied to processes with time-varying behavior. In this paper, therefore, we propose a complete adaptive process monitoring framework based on incremental principal component analysis (IPCA). This framework updates the eigenspace by incrementing new data into the PCA at a low computational cost. Moreover, the contribution of variables is recursively provided using complete decomposition contribution (CDC). To impute missing values, the empirical best linear unbiased prediction (EBLUP) method is incorporated into this framework. The effectiveness of this framework is evaluated using benchmark simulation model No. 2 (BSM2). Our simulation results show the ability of the proposed approach to distinguish between time-varying behavior and faulty events while correctly isolating the sensor faults, even when these faults are relatively small.
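The adaptive principle behind the framework — refresh the principal subspace as each new batch arrives so natural drift is absorbed into the model — can be illustrated with a numpy running-covariance sketch. The IPCA in the paper updates the eigenspace directly at lower cost; this brute-force variant only demonstrates the idea on assumed synthetic data.

```python
import numpy as np

rng = np.random.default_rng(1)

n, mean, cov_sum = 0, np.zeros(2), np.zeros((2, 2))
for batch in range(5):
    # new batch of process measurements: std 3.0 on sensor 0, 0.3 on sensor 1
    X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [0.0, 0.3]])
    for x in X:                        # Welford-style rank-1 updates
        n += 1
        delta = x - mean
        mean += delta / n
        cov_sum += np.outer(delta, x - mean)

cov = cov_sum / (n - 1)                # running covariance after all batches
eigvals, eigvecs = np.linalg.eigh(cov) # refreshed principal subspace
print(eigvals[-1] > eigvals[0])        # dominant direction recovered: → True
```

Monitoring statistics (and CDC-style contributions) would then be computed against this continuously updated subspace rather than a fixed one.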
APA, Harvard, Vancouver, ISO, and other styles