Academic literature on the topic 'Random feature expansion'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Random feature expansion.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Random feature expansion"

1

Febiana Anistya and Erwin Budi Setiawan. "Hate Speech Detection on Twitter in Indonesia with Feature Expansion Using GloVe." Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) 5, no. 6 (December 30, 2021): 1044–51. http://dx.doi.org/10.29207/resti.v5i6.3521.

Abstract:
Twitter is one of the popular social media platforms for voicing opinions in the form of criticism and suggestions. Criticism can amount to hate speech if it implies an attack on something (an individual, race, or group). With the limit of 280 characters in a tweet, there is often a vocabulary mismatch due to abbreviations, which can be addressed with word embedding. This study utilizes feature expansion to reduce vocabulary mismatches in Indonesian-language hate speech on Twitter by using Global Vectors (GloVe). Feature selection for the best model is carried out using the Logistic Regression (LR), Random Forest (RF), and Artificial Neural Network (ANN) algorithms. The results show that the Random Forest model with 5,000 features and a combination of TF-IDF and a Tweet corpus built with GloVe produces the best accuracy among the models, with an average accuracy of 88.59%, which is 1.25% higher than the predetermined baseline. The number of features used is proven to improve the performance of the system.
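The TF-IDF weighting that this entry combines with GloVe-based expansion can be sketched in a few lines. This is an illustrative toy (documents, tokens, and the exact weighting variant are invented, not the paper's), producing the kind of feature matrix a Random Forest classifier would then consume:

```python
import math
from collections import Counter

def tfidf_matrix(docs):
    """Compute a dense TF-IDF matrix for a list of tokenised documents."""
    vocab = sorted({w for d in docs for w in d})
    n = len(docs)
    # document frequency per term
    df = Counter(w for d in docs for w in set(d))
    # smoothed inverse document frequency
    idf = {w: math.log(n / df[w]) + 1.0 for w in vocab}
    rows = []
    for d in docs:
        tf = Counter(d)
        rows.append([tf[w] / len(d) * idf[w] if w in tf else 0.0 for w in vocab])
    return vocab, rows

# Invented toy tweets, already tokenised.
docs = [["hate", "speech", "bad"], ["nice", "speech"], ["bad", "day"]]
vocab, X = tfidf_matrix(docs)
```

Each row of `X` is one tweet's feature vector; in the study these vectors are expanded with GloVe neighbours before classification.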
2

Dewi, Mila Putri Kartika, and Erwin Budi Setiawan. "Feature Expansion Using Word2vec for Hate Speech Detection on Indonesian Twitter with Classification Using SVM and Random Forest." JURNAL MEDIA INFORMATIKA BUDIDARMA 6, no. 2 (April 25, 2022): 979. http://dx.doi.org/10.30865/mib.v6i2.3855.

Abstract:
Hate speech is one of the most common cases on Twitter. Tweets are limited to 280 characters, resulting in many word variations and possible vocabulary mismatches. Therefore, this study aims to overcome these problems and build a hate speech detection system for Indonesian Twitter. This study uses 20,571 tweets and implements the Feature Expansion method using Word2vec to overcome vocabulary mismatches. Other methods applied are Bag of Words (BoW) and Term Frequency-Inverse Document Frequency (TF-IDF) to represent feature values in tweets. This study examines two methods in the classification process, namely Support Vector Machine (SVM) and Random Forest (RF). The final result shows that the Feature Expansion method with TF-IDF weighting in the Random Forest classification gives the best accuracy, which is 88.37%. The Feature Expansion method with TF-IDF weighting can increase the accuracy value across several tests in detecting hate speech and overcoming vocabulary mismatches.
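The expansion step itself, mapping an out-of-vocabulary abbreviation to its nearest embedding neighbour that the model does know, can be sketched as below. The embedding table and tokens are invented stand-ins for trained Word2vec vectors:

```python
import math

# Toy embedding table standing in for trained Word2vec vectors.
EMB = {
    "good":  [0.9, 0.1, 0.0],
    "great": [0.7, 0.3, 0.2],
    "bad":   [-0.9, 0.1, 0.0],
    "gd":    [0.85, 0.15, 0.05],   # slang abbreviation of "good"
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return dot / (nu * nv)

def expand(token, vocab):
    """Replace a token absent from the feature vocabulary with its
    nearest embedding neighbour that is inside the vocabulary."""
    if token in vocab:
        return token
    return max(vocab, key=lambda w: cosine(EMB[token], EMB[w]))

vocab = {"good", "great", "bad"}
```

Here `expand("gd", vocab)` resolves the abbreviation to `"good"`, which is the mechanism by which embedding-based expansion reduces vocabulary mismatch.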
3

Elqi Ashok, Sri Suryani Prasetiyowati, and Yuliant Sibaroni. "DHF Incidence Rate Prediction Based on Spatial-Time with Random Forest Extended Features." Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) 6, no. 4 (August 22, 2022): 612–23. http://dx.doi.org/10.29207/resti.v6i4.4268.

Abstract:
This study proposes a prediction of the classification of the spread of dengue hemorrhagic fever (DHF) with the expansion of Random Forest (RF) features based on spatial time. The RF classification model was developed by extending the features based on the previous 2 to 4 years. The three best RF models were obtained with an accuracy of 97%, 93%, and 93%, respectively. Meanwhile, the best kriging model was obtained with an RMSE value of 0.762 for 2022, 0.996 for 2023, and 0.953 for 2024. This model produced a prediction of the classification of dengue incidence rates (IR) with a distribution of 33% medium class and 67% high class for 2022. In 2023, the medium class is predicted to decrease by 6%, causing an increase in the high class to 73%. Meanwhile, in 2024, the medium class is predicted to increase by 10%, from 27% to 37%, and the distribution of the high class is predicted to be around 63%. The contribution of this research is to provide predictive information on the classification of the spread of DHF in the Bandung area for three years with the expansion of features based on time.
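The time-based feature expansion described here, using the previous years' incidence as extra predictors, can be sketched as follows (district names and values are invented; the resulting matrix would feed the Random Forest):

```python
def expand_with_lags(series_by_region, lags=(1, 2, 3)):
    """Turn per-region yearly series into training rows whose features
    are the incidence values of the previous `lags` years."""
    X, y = [], []
    for region, series in series_by_region.items():
        for t in range(max(lags), len(series)):
            X.append([series[t - k] for k in lags])  # lagged features
            y.append(series[t])                      # current-year target
    return X, y

# Invented yearly incidence rates for two districts.
data = {"district_a": [10, 12, 15, 20, 26], "district_b": [5, 6, 6, 7, 9]}
X, y = expand_with_lags(data)
```

Each row carries the 1-, 2-, and 3-year lags of the target year, which is the "expansion of features based on time" the abstract refers to.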
4

Xue, Liang, Diao Li, Cheng Dai, and Tongchao Nan. "Characterization of Aquifer Multiscale Properties by Generating Random Fractal Field with Truncated Power Variogram Model Using Karhunen–Loève Expansion." Geofluids 2017 (2017): 1–11. http://dx.doi.org/10.1155/2017/1361289.

Abstract:
Traditional geostatistics for describing the spatial variation of hydrogeological properties is based on the assumption of stationarity or statistical homogeneity. However, growing evidence shows, and it has been widely recognized, that the spatial distribution of many hydrogeological properties can be characterized as random fractals with multiscale features, whose spatial variation can be described by a power variogram model. It is difficult to generate a multiscale random fractal field directly from a nonstationary power variogram model due to the lack of an explicit covariance function. Here we adopt the stationary truncated power variogram model to avoid this difficulty and generate the multiscale random fractal field using Karhunen–Loève (KL) expansion. The results show that either the unconditional or conditional (on measurements) multiscale random fractal field can be generated by using the truncated power variogram model and KL expansion when the upper limit of the integral scale is sufficiently large, and the main structure of the spatial variation can be described by using only the first few dominant KL expansion terms associated with large eigenvalues. The latter provides a foundation for performing dimensionality reduction and saves computational effort when analyzing stochastic flow and transport problems.
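A discrete Karhunen–Loève expansion of a random field reduces to an eigendecomposition of the covariance matrix. The sketch below substitutes a simple exponential covariance for the paper's truncated power variogram (an assumption made purely to keep the example short); truncating to the dominant modes mirrors the dimensionality reduction the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D grid and a stationary exponential covariance (stand-in for the
# covariance implied by the truncated power variogram model).
x = np.linspace(0.0, 1.0, 64)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)

# Discrete KL expansion: eigendecomposition of the covariance matrix.
eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# Keep the dominant modes; a realisation uses iid N(0,1) weights.
m = 10
xi = rng.standard_normal(m)
field = eigvec[:, :m] @ (np.sqrt(np.clip(eigval[:m], 0.0, None)) * xi)

# Fraction of the total variance captured by the truncation.
captured = eigval[:m].sum() / eigval.sum()
```

The few leading eigenpairs carry most of the variance, which is why only the first dominant KL terms are needed to describe the main spatial structure.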
5

Li, Jia, Fangcheng Sun, and Meng Li. "A Study on the Impact of Digital Finance on Regional Productivity Growth Based on Artificial Neural Networks." Computational Intelligence and Neuroscience 2022 (May 31, 2022): 1–7. http://dx.doi.org/10.1155/2022/7665954.

Abstract:
The relationship between financial development and economic growth has become a hot topic in recent years, and for China, which is undergoing financial liberalisation and policy reform, the efficiency of the use of digital finance and the deepening of the balance between quality and quantity in financial development are particularly important for economic growth. This paper investigates the effect of digital finance and financial development on total factor productivity in China using an interprovincial panel data model, decomposing financial development into financial scale and financial efficiency. This involves the collection and preprocessing of financial data, including feature engineering, and the development of an optimised predictive model. We preprocess the original dataset to remove anomalous information and improve data quality. This work uses feature engineering to select relevant features for fitting and training the model. In this process, the random forest algorithm is used to effectively avoid overfitting problems and to facilitate the dimensionality reduction of the relevant features. The random forest regression model was chosen for training. The empirical results show that digital finance has contributed to productivity growth but is not efficiently utilised; China should give high priority to improving financial efficiency while promoting financial expansion; rapid expansion of finance without a focus on financial efficiency will not be conducive to productivity growth.
6

Wu, Fan, Yufen Ren, and Xiaoke Wang. "Application of Multi-Source Data for Mapping Plantation Based on Random Forest Algorithm in North China." Remote Sensing 14, no. 19 (October 3, 2022): 4946. http://dx.doi.org/10.3390/rs14194946.

Abstract:
The expansion of plantations poses new challenges for mapping forests, especially in mountainous regions. Using multi-source data, this study explored the capability of the random forest (RF) algorithm for the extraction and mapping of five forest types located in Yanqing, north China. Google Earth imagery, forest inventory data, GaoFen-1 wide-field-of-view (GF-1 WFV) images and a DEM were used to obtain 125 features in total. The recursive feature elimination (RFE) method selected 32 features for mapping the five forest types. The results attained an overall accuracy of 87.06%, with a Kappa coefficient of 0.833. The mean decrease accuracy (MDA) reveals that the DEM, LAI and EVI in winter and three texture features (entropy, variance and mean) make great contributions to forest classification. The texture features from the NIR band are important, while the other texture features contribute little. This study has demonstrated the potential of applying multi-source data based on the RF algorithm for extracting and mapping plantation forest in north China.
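Recursive feature elimination works by repeatedly fitting a model, ranking features, and discarding the weakest. The study ranks features with a Random Forest; the self-contained sketch below swaps in least-squares coefficient magnitudes as the ranking, on synthetic data invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def rfe_least_squares(X, y, n_keep):
    """Recursive feature elimination: repeatedly fit a least-squares
    model and drop the feature with the smallest coefficient magnitude."""
    keep = list(range(X.shape[1]))
    while len(keep) > n_keep:
        coef, *_ = np.linalg.lstsq(X[:, keep], y, rcond=None)
        worst = int(np.argmin(np.abs(coef)))
        keep.pop(worst)          # eliminate the weakest feature
    return sorted(keep)

# Synthetic data: only features 0 and 3 actually drive the response.
X = rng.standard_normal((200, 6))
y = 3.0 * X[:, 0] - 2.0 * X[:, 3] + 0.01 * rng.standard_normal(200)
selected = rfe_least_squares(X, y, n_keep=2)
```

With 125 candidate features, the same loop (using the RF's importance scores as the ranking) is how the study arrives at its 32 selected features.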
7

KOTA, V. K. B., MANAN VYAS, and K. B. K. MAYYA. "SPECTRAL DISTRIBUTION ANALYSIS OF RANDOM INTERACTIONS WITH J-SYMMETRY AND ITS EXTENSIONS." International Journal of Modern Physics E 17, supp01 (December 2008): 318–33. http://dx.doi.org/10.1142/s0218301308011951.

Abstract:
Spectral distribution theory, based on average-fluctuation separation and trace propagation, is applied in the analysis of some properties of a system of m (identical) nucleons in shell model j-orbits with random interactions preserving angular momentum J-symmetry. Employing the bivariate Gaussian form with Edgeworth corrections for fixed E (energy) and M (Jz eigenvalue) density of states ρ(E,M), analytical results, in the form of expansions to order [J(J+1)]2, are derived for energy centroids Ec(m,J) and spectral variances σ2(m,J). They are used to study distribution of spectral widths, J=0 preponderance in energy centroids, lower order cross correlations in states with different J's and so on. Also, an expansion is obtained for occupation probabilities over spaces with fixed M. All the results obtained with spectral distribution theory compare well with those obtained recently using quite different methods. In addition, using trace propagation methods, a regular feature, that they are nearly constant, of spectral variances generated by random interactions is demonstrated using several examples. These open a new window to study regular structures generated by random interactions.
8

Mason, William S., Allison R. Jilbert, and Samuel Litwin. "Hepatitis B Virus DNA Integration and Clonal Expansion of Hepatocytes in the Chronically Infected Liver." Viruses 13, no. 2 (January 30, 2021): 210. http://dx.doi.org/10.3390/v13020210.

Abstract:
Human hepatitis B virus (HBV) can cause chronic, lifelong infection of the liver that may lead to persistent or episodic immune-mediated inflammation against virus-infected hepatocytes. This immune response results in elevated rates of killing of virus-infected hepatocytes, which may extend over many years or decades, lead to fibrosis and cirrhosis, and play a role in the high incidence of hepatocellular carcinoma (HCC) in HBV carriers. Immune-mediated inflammation appears to cause oxidative DNA damage to hepatocytes, which may also play a major role in hepatocarcinogenesis. An additional DNA damaging feature of chronic infections is random integration of HBV DNA into the chromosomal DNA of hepatocytes. While HBV DNA integration does not have a role in virus replication it may alter gene expression of the host cell. Indeed, most HCCs that arise in HBV carriers contain integrated HBV DNA and, in many, the integrant appears to have played a role in hepatocarcinogenesis. Clonal expansion of hepatocytes, which is a natural feature of liver biology, occurs because the hepatocyte population is self-renewing and therefore loses complexity due to random hepatocyte death and replacement by proliferation of surviving hepatocytes. This process may also represent a risk factor for the development of HCC. Interestingly, during chronic HBV infection, hepatocyte clones detected using integrated HBV DNA as lineage-specific markers, emerge that are larger than those expected to occur by random death and proliferation of hepatocytes. The emergence of these larger hepatocyte clones may reflect a survival advantage that could be explained by an ability to avoid the host immune response. While most of these larger hepatocyte clones are probably not preneoplastic, some may have already acquired preneoplastic changes. 
Thus, chronic inflammation in the HBV-infected liver may be responsible, at least in part, for both initiation of HCC via oxidative DNA damage and promotion of HCC via stimulation of hepatocyte proliferation through immune-mediated killing and compensatory division.
9

Viny, Aaron D., Alan Lichtin, Brad Pohlman, Zachary Nearman, Thomas Loughran, and Jaroslaw P. Maciejewski. "Chronic B-Cell Dyscrasias Are an Important Clinical Feature of LGL Leukemia." Blood 110, no. 11 (November 16, 2007): 4675. http://dx.doi.org/10.1182/blood.v110.11.4675.4675.

Abstract:
T-cell large granular lymphocyte leukemia (T-LGL) is a chronic clonal lymphoproliferation of CTL. The context of immune-mediated diseases led to the hypothesis that T-LGL represents an exaggerated clonal immune response to a persistent antigen such as an autoantigen or viral antigen, or is associated with an immune surveillance reaction to occult malignancies. Based on the structural similarity of TCR CDR3, we have demonstrated that the transformation event in T-LGL may not be random and is driven by a related antigen. Similar to the clonal evolution in LGL, B cell expansion in low grade non-Hodgkin lymphoma may also not be entirely stochastic. There have been coincidental case reports of T-LGL patients with concomitant B cell dyscrasia. It is therefore possible that similar pathogenic triggers may be operative in chronic proliferation of T and B cells, which subsequently predispose to clonal outgrowth. Consequently, we systematically examined a large series of T-LGL patients for evidence of B cell dyscrasias. When our patients (N=70) were studied, we found a frequent association with low grade B cell lymphoproliferative disorders (28%). In general, all clinical comparisons with reported studies suggested that our LGL cohort had a composition equivalent to that of prior series and that the findings with regard to B cell dyscrasias are not due to selection bias. By comparison, a total of 51 patients with concomitant B and T cell dyscrasia were previously reported in small series or case reports. In our series, MGUS was the most common of the B cell disorders identified in T-LGL (21%); B-CLL was also present in 7%. Of note, 8 patients received rituximab, and evolution of clonal T cell expansion after therapy with rituximab has been identified in isolated case reports. In addition to clonal B cell expansion, polyclonal hyperglobulinemia was found in 26% of T-LGL, similar to previous reports. Hypoglobulinemia was identified in 12% of patients.
Evidence of involvement of both the T and B cell compartments in T-LGL fits into several models of disease pathogenesis. T-LGL may represent anti-tumor surveillance reflecting an exaggerated clonal expansion in the context of a polyclonal anti-tumor response. An alternative theory is that both conditions may result from an initial polyclonal immune reaction directed against an unrelated common target; one could speculate that autoimmune/viral diseases associated with T-LGL or malignancies (e.g., MDS) provide antigenic triggers. It is also conceivable that an impaired humoral immune response could result in an exuberant T cell reaction against an uncleared antigen. If B cell function is insufficient to fully clear the inciting pathogen, then chronic antigenic stimulation could polarize the T cell-mediated response, resulting in LGL expansion. Finally, identification of decreased immunoglobulin levels in the context of LGL leukemia may also give merit to the theory that both B and T cell compartments are governed by regulatory/compensatory feedback mechanisms, and T-LGL could evolve from unchecked T cell expansion in the context of B cell dysfunction. In sum, we describe here a high frequency of B cell dyscrasias in patients with T-LGL. The association is unlikely to be coincidental and provides important insight into dysregulated T cell and B cell function.
10

Moutselos, Konstantinos, Ilias Maglogiannis, and Aristotelis Chatziioannou. "Integration of High-Volume Molecular and Imaging Data for Composite Biomarker Discovery in the Study of Melanoma." BioMed Research International 2014 (2014): 1–14. http://dx.doi.org/10.1155/2014/145243.

Abstract:
In this work the effects of simple imputations are studied, regarding the integration of multimodal data originating from different patients. Two separate datasets of cutaneous melanoma are used: an image analysis (dermoscopy) dataset together with a transcriptomic one, specifically DNA microarrays. Each modality is related to a different set of patients, and four imputation methods are employed in the formation of a unified, integrative dataset. The application of backward selection together with ensemble classifiers (random forests), followed by principal components analysis and linear discriminant analysis, illustrates the implication of the imputations for feature selection and dimensionality reduction methods. The results suggest that the expansion of the feature space through data integration, achieved by the exploitation of imputation schemes in general, aids the classification task, imparting stability as regards the derivation of putative classifiers. In particular, although the biased imputation methods significantly increase the predictive performance and the class discrimination of the datasets, they still contribute to the study of prominent features and their relations. The fusion of separate datasets, which provide a multimodal description of the same pathology, represents an innovative, promising avenue, enhancing robust composite biomarker derivation and promoting the interpretation of the biomedical problem studied.
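The simple imputations studied here can be illustrated with mean imputation over a patient-by-feature table in which each modality covers a different subset of patients (all values invented):

```python
def mean_impute(rows):
    """Fill missing entries (None) with the column mean -- a simple
    stand-in for the imputation schemes compared in the study."""
    n_cols = len(rows[0])
    means = []
    for j in range(n_cols):
        vals = [r[j] for r in rows if r[j] is not None]
        means.append(sum(vals) / len(vals))
    return [[means[j] if r[j] is None else r[j] for j in range(n_cols)]
            for r in rows]

# Imaging features known for patients 0-1, omics feature for patients 1-2;
# imputation yields one unified table covering all three patients.
rows = [[0.2, 0.4, None], [0.4, 0.6, 1.0], [None, None, 3.0]]
filled = mean_impute(rows)
```

The filled table is the "unified, integrative dataset" whose expanded feature space then feeds feature selection and classification.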

Book chapters on the topic "Random feature expansion"

1

Gravner, Janko. "Growth Phenomena in Cellular Automata." In New Constructions in Cellular Automata. Oxford University Press, 2003. http://dx.doi.org/10.1093/oso/9780195137170.003.0010.

Abstract:
We illustrate growth phenomena in two-dimensional cellular automata (CA) by four case studies. The first CA, which we call Obstacle Course, describes the effect that obstacles have on such features of simple growth models as linear expansion and coherent asymptotic shape. Our next CA is random-walk-based Internal Diffusion Limited Aggregation, which spreads sublinearly, but with a shape which can be explicitly computed due to hydrodynamic effects. Then we propose a simple scheme for characterizing CA according to their growth properties, as indicated by two Larger than Life examples. Finally, a very simple case of Spatial Prisoner’s Dilemma illustrates nucleation analysis of CA. In essence, analysis of growth models is an attempt to study properties of physical systems far from equilibrium (e.g., Meakin [34] and the more than 1300 references cited therein). Cellular automata (CA) growth models, by virtue of their simplicity and amenability to computer experimentation [25], have become particularly popular in the last 20 years, especially in physics research literature [40, 42]. Needless to say, precise mathematical results are hard to come by, and many basic questions remain completely open at the rigorous level. The purpose of this chapter, then, is to outline some successes of the mathematical approach and to identify some fundamental difficulties. We will mainly address three themes which can be summarized by the terms: aggregation, nucleation, and constraint-expansion transition. These themes also provide opportunities to touch on the roles of randomness, monotonicity, and linearity in CA investigations. We choose to illustrate these issues by particular CA rules, with little attempt to formulate a general theory. Simplicity is often, and rightly, touted as an important selling point of cellular automata.
We have, therefore, tried to choose the simplest models which, while being amenable to some mathematical analysis, raise a host of intriguing unanswered questions. The next few paragraphs outline subsequent sections of this chapter. Aggregation models typically study properties of growth from a small initial seed. Arguably, the simplest dynamics are obtained by adding sites on the boundary in a uniform fashion.
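The "simplest dynamics" mentioned at the end, growth by adding boundary sites in a uniform fashion, is the Eden model. A minimal sketch on the square lattice from a single seed:

```python
import random

random.seed(0)

def eden_growth(steps):
    """Eden-style aggregation: start from a seed and repeatedly occupy
    a uniformly chosen empty neighbour of the current cluster."""
    cluster = {(0, 0)}
    for _ in range(steps):
        # all empty sites adjacent to the cluster
        boundary = {(x + dx, y + dy)
                    for (x, y) in cluster
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))} - cluster
        cluster.add(random.choice(sorted(boundary)))
    return cluster

cluster = eden_growth(200)
```

Each step adds exactly one new site, so the cluster grows with a roughly compact asymptotic shape, the kind of behaviour the aggregation case studies analyse.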
2

Winblad, Stefan, and Anne-Berit Ekström. "Myotonic Dystrophy." In Cognitive and Behavioral Abnormalities of Pediatric Diseases. Oxford University Press, 2010. http://dx.doi.org/10.1093/oso/9780195342680.003.0057.

Abstract:
The myotonic dystrophies, type 1 (DM1) and 2 (DM2), are progressive, autosomal, dominantly inherited disorders, mainly characterized by muscle weakness and atrophy but also by a variable impact on heart, eye, brain, and the endocrine and gastrointestinal systems (Meola 2000). The worldwide prevalence is approximately 1 in 8,000. They are considered to be most common in Western Europe and Japan, but less prevalent in Southeast Asia, and rare or absent in southern and central Africa (Emery 1991). A prevalence of 18 in 340,000 children has been reported (Darin and Tulinius 2000). The cause of the myotonic dystrophies is unstable inherited DNA repeat expansion. Repeat elements occur throughout the human genome and are typically polymorphic in the general population. Repeats can become unstable during DNA replication and, depending on the specific repeat motif and location, expanded repeats can become pathogenic. In disease states, the number of repeats exceeds the normal range, leading to various pathogenic mechanisms (Ranum and Cooper 2006). DM1 is associated with an expanded (CTG)n repeat (>50 to several thousands) within the noncoding 3′ untranslated region of the myotonic dystrophy protein kinase (DMPK) gene on chromosome 19q13.3. In DM2, another mutation exists, namely an expanded CCTG tetranucleotide repeat (from 75 to 11,000 repeats) in the first intron of the zinc finger protein 9 (ZNF9) gene on chromosome 3q21 (Day and Ranum 2005). This means that two unrelated genes are associated with similar phenotypes, although there are differences, including the age of onset and severity of symptoms (Meola 2000). The first signs of DM2 typically appear in adulthood, and no study has as yet systematically described cognitive or behavioral abnormalities in a childhood DM2 phenotype. Consequently, the following chapter focuses on a description of DM1.
In this disorder, the age of onset is variable, meaning that there are congenital cases, as well as children, adults and patients experiencing the first symptoms very late in life. DM1 is traditionally divided into categories, each presenting with specific clinical features and broadly associated with the age of onset and extent of genetic abnormality.
3

Harding, Dennis. "Graves and grave-goods." In Death and Burial in Iron Age Britain. Oxford University Press, 2015. http://dx.doi.org/10.1093/oso/9780199687565.003.0011.

Abstract:
It has been stressed that the archaeological remains of the dead in a formal grave represent only the final stage in what may well have been a protracted and complex series of stages in funerary ritual. From this final stage, however, the archaeologist is potentially able to make an informed assessment of several aspects of the prevailing funerary practice, notably: • the context of burial, whether individual, grouped, or collective; • its structure, whether simple pit, with or without coffin, cist, or more elaborate tomb with the provision of additional space for accompaniments; • the placement of the remains, whole or part, cremation or inhumation, in the latter case including factors such as orientation and posture; • the presence or absence of grave-goods, their intrinsic character, and their choreography within the burial area; • any adjacent features, such as remains of pyres or related structures that might reflect pre-depositional stages in funerary ritual; • any secondary episodes of activity, such as subsequent burials or ‘grave robbing’. There is an implicit assumption that cemeteries should be relatively compact groups of graves, with or without a defining enclosure boundary. In the case of a larger cemetery, it might even be possible from grave associations to determine that it expanded over time in one particular direction, as in the case of Wetwang Slack or in the classic instance at Münsingen. Some graves in larger cemeteries were collectively ordered in regular ranks, as at Rudston or Harlyn Bay, implying an informed rather than random pattern of expansion. Smaller burial grounds, however, perhaps used over a shorter period of time, may be dispersed, or in small clusters over a wider area, as at Adanac Park, Cockey Down, Melton, or Little Woodbury, making their recognition more difficult in the absence of widespread stripping. 
This pattern could arise if a family group, for example, was segregated from the next allowing for infilling over time, which may not have happened if the settlement served by the cemetery for some reason was abandoned. In present-day western society a grave is simply a place of burial, designed for the disposal and commemoration of the dead.

Conference papers on the topic "Random feature expansion"

1

Liu, Zhulin, C. L. Philip Chen, Tong Zhang, and Jin Zhou. "Multi-Kernel Broad Learning Systems Based on Random Features: A Novel Expansion for Nonlinear Feature Nodes." In 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC). IEEE, 2019. http://dx.doi.org/10.1109/smc.2019.8914328.
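The random-feature expansion underlying such nonlinear feature nodes is commonly realised as random Fourier features, whose inner products approximate an RBF kernel. This generic sketch is not the paper's broad-learning architecture, just the standard construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_fourier_features(X, n_features, gamma):
    """Map inputs to a random feature space in which inner products
    approximate the RBF kernel exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    # spectral sampling: w ~ N(0, 2*gamma*I), phase b ~ U(0, 2*pi)
    W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.standard_normal((5, 3))
Z = random_fourier_features(X, n_features=5000, gamma=0.5)

# Compare the random-feature Gram matrix with the exact RBF kernel.
approx = Z @ Z.T
exact = np.exp(-0.5 * np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))
err = np.max(np.abs(approx - exact))
```

With enough random features the approximation error shrinks like O(1/sqrt(n_features)), which is what makes such expansions usable as cheap nonlinear feature nodes.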

2

Liu, Jie, Jimeng Chen, Yi Zhang, and Yalou Huang. "Learning Conditional Random Fields with Latent Sparse Features for Acronym Expansion Finding." In Proceedings of the 20th ACM International Conference on Information and Knowledge Management (CIKM '11). New York, New York, USA: ACM Press, 2011. http://dx.doi.org/10.1145/2063576.2063701.

3

Hu, Chao, and Byeng D. Youn. "Adaptive-Sparse Polynomial Chaos Expansion for Reliability Analysis and Design of Complex Engineering Systems." In ASME 2009 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2009. http://dx.doi.org/10.1115/detc2009-87713.

Abstract:
This paper presents an adaptive-sparse polynomial chaos expansion (adaptive-sparse PCE) method for performing engineering reliability analysis and design. The proposed method leverages three ideas: (i) an adaptive scheme to build sparse PCE with the minimum number of bivariate basis functions, (ii) a new projection method using dimension reduction techniques to effectively compute the expansion coefficients of system responses, and (iii) an integration of copula to handle nonlinear correlation of input random variables. The proposed method thus has three distinct features for reliability analysis and design: (a) no need of response sensitivities, (b) no extra cost to evaluate probabilistic sensitivity for design, and (c) capability to handle a nonlinear correlation. In addition, an error decomposition scheme of the proposed method is presented to help analyze error sources in probability analysis. Several engineering problems are used to demonstrate the effectiveness of the adaptive-sparse PCE method.
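A plain (non-adaptive, one-dimensional) polynomial chaos expansion can be fitted by regression on probabilists' Hermite polynomials; the response function and degree below are invented for illustration, not taken from the paper:

```python
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(0)

# Samples of one standard-normal input and the "system response".
xi = rng.standard_normal(2000)
y = np.exp(xi)

# Regression matrix of He_0..He_6 evaluated at the samples;
# least squares gives the PCE coefficients.
degree = 6
Psi = np.column_stack([He.hermeval(xi, [0] * k + [1])
                       for k in range(degree + 1)])
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# By orthogonality, the PCE mean is the zeroth coefficient;
# the exact mean of exp(xi) is e^0.5 ~ 1.6487.
pce_mean = coef[0]
```

The adaptive-sparse method in the paper extends this idea to many inputs by choosing which (bivariate) basis terms to keep rather than fitting a full tensor basis.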
4

Wang, Zequn, Yan Fu, Ren-Jye Yang, Saeed Barbat, and Wei Chen. "Model Validation of Dynamic Engineering Models Under Uncertainty." In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-59437.

Abstract:
Validating dynamic engineering models is critically important in practical applications by assessing the agreement between simulation results and experimental observations. Though significant progress has been made, the existing metrics lack the capability of managing uncertainty in both simulations and experiments, which may stem from computer model instability, imperfection in material fabrication and manufacturing processes, and variations in experimental conditions. In addition, it is challenging to validate a dynamic model aggregately over both the time domain and a model input space with data at multiple validation sites. To overcome these difficulties, this paper presents an area-based metric to systemically handle uncertainty and validate computational models for dynamic systems over an input space by simultaneously integrating the information from multiple validation sites. To manage the complexity associated with a high-dimensional data space, eigen analysis is performed for the time series data from simulations at each validation site to extract the important features. A truncated Karhunen-Loève (KL) expansion is then constructed to represent the responses of dynamic systems, resulting in a set of uncorrelated random coefficients with unit variance. With the development of a hierarchical data fusion strategy, probability integral transform is then employed to pool all the resulting random coefficients from multiple validation sites across the input space into a single aggregated metric. The dynamic model is thus validated by calculating the cumulative area difference of the cumulative density functions. The proposed model validation metric for dynamic systems is illustrated with a mathematical example, a supported beam problem with stochastic loads, and real data from a vehicle occupant restraint system.
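The eigen-analysis step, a truncated KL expansion of an ensemble of simulated time series yielding uncorrelated unit-variance coefficients, can be sketched with an SVD (the ensemble here is synthetic, invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of 50 simulated time series (rows) at one validation site.
t = np.linspace(0.0, 1.0, 100)
runs = np.array([np.sin(2 * np.pi * t) * (1 + 0.1 * rng.standard_normal())
                 + 0.05 * rng.standard_normal(100) for _ in range(50)])

# Empirical (truncated) KL expansion via SVD of the centred ensemble.
mean = runs.mean(axis=0)
U, s, Vt = np.linalg.svd(runs - mean, full_matrices=False)

m = 3                                     # keep the dominant modes
# KL coefficients, scaled to be uncorrelated with unit variance.
coeffs = U[:, :m] * np.sqrt(runs.shape[0] - 1)

# Sample covariance of the retained coefficients (identity by construction).
cov = coeffs.T @ coeffs / (runs.shape[0] - 1)
```

These whitened coefficients are exactly the kind of uncorrelated, unit-variance quantities the paper then pools across validation sites via the probability integral transform.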
5

Wang, B. X., and C. Y. Zhao. "Polarized Radiative Transfer in Anisotropic Disordered Media With Short-Range Order." In ASME 2017 Heat Transfer Summer Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/ht2017-5051.

Abstract:
The aim of this study is to present a general method to investigate radiative transfer in disordered media with a subwavelength, anisotropic short-range order and provide a fundamental understanding of the interplay between polarized radiative transfer and microstructural anisotropy as well as short-range order. We show that the anisotropy of short-range order, described by an anisotropic correlation length in a Gaussian random permittivity model, induces a significant anisotropy of radiative properties. Here the photon scattering mean free path is derived using the Feynman diagrammatic expansion of the self-energy, and the transport mean free path and phase function are calculated based on the diagrammatic representation of the irreducible vertex in the Bethe-Salpeter equation. We further consider the transport of polarized light in such media by directly solving the Bethe-Salpeter equation (BSE) for photons, without the use of the traditional vector radiative transfer equation (VRTE). The present method advantageously allows us to elegantly relate anisotropic structural parameters to polarized radiative transport properties and obtain more fundamental physical insights, because the approximations in all steps of our derivation are given explicitly with reasonable explanations from the exact ab-initio BSE. Moreover, through a polarization eigen-channel expansion technique for the intensity tensor, we show that values of the transport mean free path in different polarization eigen-channels are rather different, and are also strongly affected by structural anisotropy and short-range order. In conclusion, this study depicts some fundamental physical features of polarized radiative transfer in disordered media, and is also valuable for potential applications of utilizing anisotropic short-range order in disordered media for the manipulation of polarized radiative transfer.
APA, Harvard, Vancouver, ISO, and other styles
6

Osakue, Edward E., Lucky Anetor, and Christopher Odetunde. "Reliability-Based Component Design." In ASME 2015 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/imece2015-50700.

Full text
Abstract:
This paper uses the lognormal probability function to convert deterministic design equations into probabilistic ones, thereby transforming the traditional safety factor into a reliability factor. The reliability factor is related to the coefficients of variation (covs) of the design parameters and to a failure probability. An approximation of the reliability factor for initial sizing is defined as the probabilistic design factor. The serviceability design model parameters are treated as random variables characterized by mean values and covs, and the cov of the design model is obtained using a first-order Taylor series expansion. Multiple serviceability criteria, such as bending strength, lateral torsional stability, transverse deflection, and fillet weld strength, are considered. The results from this study compare favorably with previous ones and sometimes give solutions with lower weight. In the first example, the solution from the present approach deviates on the conservative side from the previous one by 2.6% for 99.9% reliability and 3.8% for 99.997% reliability. These results are practically the same, suggesting that the method presented is reasonable and accurate. In the second example, the beam in the new solution has 23.65% lower volume or weight, and the weld bead volume is lower by 8.4%, suggesting possible substantial cost reductions. From the sizes of the beam and weld bead, it can be concluded that the "factor of reliability" approach of this study and the stochastic Monte Carlo simulation method used previously are in good agreement. Given the very good results from the examples considered, the "factor of reliability" method presented appears to be a satisfactory model. The approach has the advantage of being much less computationally intensive than simulation and requires no specialized software or skills, which can lead to cost savings in design projects. Design sizes from this method may be used to create solid models that can then be optimized using the finite element method (FEM). From an instructional perspective, the method could also be used to introduce undergraduate engineering students to probabilistic design approaches.
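The two ingredients described in the abstract, a first-order Taylor-series combination of parameter covs and a lognormal reliability factor tied to a target reliability, can be sketched as follows. The expressions below use a common lognormal load/strength formulation and hypothetical cov values; they are not necessarily the paper's exact equations:

```python
import math
from statistics import NormalDist

def combined_cov(covs):
    # First-order Taylor-series combination of independent parameter
    # covs (exact for a multiplicative design model).
    return math.sqrt(sum(c * c for c in covs))

def reliability_factor(cov_load, cov_strength, reliability):
    # Common lognormal reliability-factor form: the margin between mean
    # strength and mean load effect needed to reach the target
    # reliability (an assumed textbook expression, not the paper's own).
    z = NormalDist().inv_cdf(reliability)
    spread = math.sqrt(math.log((1 + cov_load**2) * (1 + cov_strength**2)))
    return math.exp(z * spread)

cov_model = combined_cov([0.05, 0.08, 0.10])   # hypothetical parameter covs
n_r = reliability_factor(cov_model, 0.07, 0.999)
print(cov_model, n_r)
```

At 50% reliability the factor reduces to 1, and it grows with both the target reliability and the scatter of the inputs, which is the behavior the paper exploits to replace a fixed safety factor.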
APA, Harvard, Vancouver, ISO, and other styles
7

Hölle, Magnus, Christian Bartsch, and Peter Jeschke. "Evaluation of Measurement Uncertainties for Pneumatic Multi-Hole Probes Using a Monte Carlo Method." In ASME Turbo Expo 2016: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/gt2016-56626.

Full text
Abstract:
The subject of this paper is a statistical method for the accurate evaluation of uncertainties in pneumatic multi-hole probe measurements. The method can be applied to different types of evaluation algorithms and is suitable for steady flowfield measurements in compressible flows. The uncertainties are evaluated with a Monte Carlo method (MCM), which rests on the statistical law of large numbers. Each input quantity, including calibration and measurement quantities, is randomly varied according to its corresponding probability density function (PDF) and propagated through the deterministic parameter evaluation algorithm. Unlike linear Taylor-series-based uncertainty evaluation methods, MCM offers several advantages. On the one hand, MCM does not suffer from lower-order expansion errors and can therefore reproduce nonlinearity effects. On the other hand, different types of PDFs can be assumed for the input quantities, and the corresponding coverage intervals can be calculated for any coverage probability. To demonstrate the uncertainty evaluation, a calibration and subsequent measurements in the wake of an airfoil are performed with a 5-hole probe, and MCM is applied to different parameter evaluation algorithms. It is found that the MCM approach presented cannot be applied to polynomial curve fits if the differences between the calibration data and the polynomial curve fits are of the same order of magnitude as the calibration uncertainty. Since this method has not yet been used to evaluate measurement uncertainties for pneumatic multi-hole probes, the aim of the paper is to present a highly accurate and easy-to-implement uncertainty evaluation method.
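The propagation scheme the abstract describes, drawing each input from its PDF, pushing it through the deterministic algorithm, and reading coverage intervals off the empirical output distribution, can be sketched in a few lines. The evaluation function, pressure levels, and standard deviations below are hypothetical stand-ins, not the paper's calibration algorithm:

```python
import random
import statistics

def evaluate(p_total, p_static):
    # Hypothetical deterministic evaluation: a dynamic-pressure ratio
    # from two probe pressures (stand-in for a multi-hole probe
    # parameter evaluation algorithm).
    return (p_total - p_static) / p_static

def monte_carlo_uncertainty(n=100_000, seed=1):
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        # Each input quantity is drawn from its assumed PDF (here
        # Gaussian) and propagated through the deterministic algorithm.
        pt = rng.gauss(101_800.0, 50.0)   # total pressure [Pa], assumed sigma
        ps = rng.gauss(101_325.0, 50.0)   # static pressure [Pa], assumed sigma
        results.append(evaluate(pt, ps))
    results.sort()
    mean = statistics.fmean(results)
    # 95% coverage interval taken directly from the empirical
    # distribution, so no linearisation (Taylor) error is introduced.
    lo, hi = results[int(0.025 * n)], results[int(0.975 * n)]
    return mean, lo, hi

mean, lo, hi = monte_carlo_uncertainty()
print(mean, lo, hi)
```

Swapping the Gaussian draws for any other PDF, or the percentile pair for any other coverage probability, requires no change to the rest of the procedure, which is the flexibility the abstract highlights over Taylor-series methods.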
APA, Harvard, Vancouver, ISO, and other styles
8

Zhang, Hewen, Honglan Zou, Xuemei Yan, Mingyue Cui, Chong Liang, Yuping Sun, Haibo Li, and Xuewu Wang. "Carbonate Acidising Calculation Model Coupled with Dual-Fractal Wormhole." In International Petroleum Technology Conference. IPTC, 2021. http://dx.doi.org/10.2523/iptc-21262-ms.

Full text
Abstract:
Matrix acidising is widely used because it is recognized as the most economical and effective measure to increase production from carbonate oil and gas wells. The main feature of acidising, and the key factor in increasing production, is the type of acid-etched wormholes that forms. Actual etched wormholes grow randomly and in a disordered manner, which makes them extremely difficult to describe with classical mathematical methods. Because a quantitative model for the growth law of acid-etched wormholes, their penetration depth, and the competition and distribution patterns among different wormholes has been lacking, no effective method for optimizing acidising parameters could be established, and the stimulated production of different wells varies greatly. To establish quantitative models for the geometry of acid-etched wormholes, the three-dimensional competitive distribution of different wormholes, and production prediction, and thereby to optimize treatment parameters for individual wells and improve oil and gas production, we first designed and completed laboratory core etching experiments and used CT scanning to obtain the true three-dimensional morphology of linear acid-etched wormholes. The radial wormholes in 14-cubic-foot super-large cores were also shown to satisfy fractal behaviour, and the fractal dimensions of both wormhole types were obtained. On this basis, a quantitative model for wormhole length growth was established. Second, from a mathematical model of the growth competition among different acid-etched wormholes, the fractal distribution law of wormhole lengths in the vertical direction was obtained. Combined with the fractal wormhole length model, a method for solving the dual-fractal model by calculating the maximum wormhole length is given. Finally, the classical acidised production rate model was revised.
The influence of the three-dimensional expansion length and distribution of wormholes on the skin factor was considered in detail, and the sensitivity of key acidising parameters, such as acid strength and pumping rate, was analyzed. The results show that both the linear wormholes obtained from conventional cores and the radial wormholes obtained from super-large cores can be described by fractal geometry. The fractal dimension corresponding to the optimal pumping rate is 1.46-1.63. When the dual-fractal distribution of acid-etched wormholes is considered, the skin factor is larger than that of the conventional equalization model, mainly because the equalization model uses only the maximum wormhole length. This also explains why wells or layers with negative skin factors can still increase their production rate after uniform acidising. At the same time, for a specific layer there is an optimal acid strength and pumping rate at which acid consumption and the predicted production rate are both improved, providing a theoretical basis for the quantitative optimization of acidising treatment parameters.
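The qualitative claim that a fractal (unequal) distribution of wormhole lengths yields a larger skin factor than an equalization model crediting the full maximum length can be illustrated with a toy calculation. The formulas and values below are illustrative assumptions, not the paper's model:

```python
import math

def skin_uniform(l_max, r_well):
    # Equalization model: the near-well region is treated as fully
    # stimulated out to the maximum wormhole length (Hawkins-type form).
    return -math.log(l_max / r_well)

def skin_fractal(l_max, r_well, d=1.5, n=50):
    # Toy fractal distribution: wormhole lengths follow a power law
    # below l_max; the effective stimulated radius is the mean length,
    # so the skin is less negative than in the uniform model.
    lengths = [l_max * ((i + 1) / n) ** (1.0 / d) for i in range(n)]
    r_eff = sum(lengths) / n
    return -math.log(r_eff / r_well)

r_w, l_max = 0.1, 2.0   # wellbore radius and max wormhole length [m], assumed
print(skin_uniform(l_max, r_w), skin_fractal(l_max, r_w))
```

Both skins are negative (stimulation), but the fractal-distributed case is less negative because most wormholes fall short of the maximum length, matching the abstract's comparison with the equalization model.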
APA, Harvard, Vancouver, ISO, and other styles