Journal articles on the topic 'Random feature expansion'




Consult the top 50 journal articles for your research on the topic 'Random feature expansion.'


You can also download the full text of each publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Anistya, Febiana, and Erwin Budi Setiawan. "Hate Speech Detection on Twitter in Indonesia with Feature Expansion Using GloVe." Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) 5, no. 6 (December 30, 2021): 1044–51. http://dx.doi.org/10.29207/resti.v5i6.3521.

Abstract:
Twitter is one of the most popular social media platforms for voicing opinions in the form of criticism and suggestions. Criticism can constitute hate speech if it attacks an individual, race, or group. With the 280-character limit on tweets, vocabulary mismatches often arise from abbreviations, which can be addressed with word embeddings. This study applies feature expansion using Global Vectors (GloVe) to reduce vocabulary mismatches in Indonesian-language hate speech on Twitter. Selection of the best model is carried out using the Logistic Regression (LR), Random Forest (RF), and Artificial Neural Network (ANN) algorithms. The results show that the Random Forest model with 5,000 features and a combination of TF-IDF and a tweet corpus built with GloVe produces the best accuracy among the models, with an average accuracy of 88.59%, which is 1.25% higher than the predetermined baseline. The number of features used is shown to improve system performance.
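The feature-expansion step described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the embedding table is a toy stand-in for GloVe vectors trained on a tweet corpus, and the vocabulary and values are invented.

```python
# Sketch of GloVe-based feature expansion for short, abbreviation-heavy text.
# Idea: replace out-of-vocabulary tokens with their nearest in-vocabulary
# neighbor in embedding space before TF-IDF weighting and classification.
import numpy as np

# Toy "GloVe" vectors; in the paper these would be trained on a tweet corpus.
embeddings = {
    "hate": np.array([0.9, 0.1]),
    "h8":   np.array([0.88, 0.12]),   # abbreviation close to "hate"
    "love": np.array([0.1, 0.9]),
    "luv":  np.array([0.12, 0.88]),   # abbreviation close to "love"
}
classifier_vocab = {"hate", "love"}    # words the TF-IDF model knows

def expand_token(token):
    """Map an OOV token to its nearest in-vocabulary word by cosine similarity."""
    if token in classifier_vocab or token not in embeddings:
        return token
    v = embeddings[token]
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(classifier_vocab, key=lambda w: cos(v, embeddings[w]))

tweet = ["i", "h8", "this", "luv", "that"]
expanded = [expand_token(t) for t in tweet]
print(expanded)  # ['i', 'hate', 'this', 'love', 'that']
```

The expanded tokens can then be fed to a standard TF-IDF vectorizer and classifier, which no longer sees the abbreviations as unknown words.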
2

Dewi, Mila Putri Kartika, and Erwin Budi Setiawan. "Feature Expansion Using Word2vec for Hate Speech Detection on Indonesian Twitter with Classification Using SVM and Random Forest." JURNAL MEDIA INFORMATIKA BUDIDARMA 6, no. 2 (April 25, 2022): 979. http://dx.doi.org/10.30865/mib.v6i2.3855.

Abstract:
Hate speech is one of the most common problems on Twitter. Tweets are limited to 280 characters, resulting in many word variations and possible vocabulary mismatches. This study therefore aims to overcome these problems and build a hate speech detection system for Indonesian Twitter. It uses 20,571 tweets and implements the feature expansion method using Word2vec to overcome vocabulary mismatches. Bag of Words (BOW) and Term Frequency-Inverse Document Frequency (TF-IDF) are applied to represent feature values in tweets. Two methods are examined in the classification process: Support Vector Machine (SVM) and Random Forest (RF). The final results show that feature expansion with TF-IDF weighting and Random Forest classification gives the best accuracy, 88.37%. Across several tests, the feature expansion method with TF-IDF weighting increases accuracy in detecting hate speech and overcomes vocabulary mismatches.
3

Ashok, Elqi, Sri Suryani Prasetiyowati, and Yuliant Sibaroni. "DHF Incidence Rate Prediction Based on Spatial-Time with Random Forest Extended Features." Jurnal RESTI (Rekayasa Sistem dan Teknologi Informasi) 6, no. 4 (August 22, 2022): 612–23. http://dx.doi.org/10.29207/resti.v6i4.4268.

Abstract:
This study proposes prediction of the classification of the spread of dengue hemorrhagic fever (DHF) with Random Forest (RF) feature expansion based on space and time. The RF classification model was developed by expanding features based on the previous two to four years. The three best RF models achieved accuracies of 97%, 93%, and 93%, respectively, while the best kriging models obtained RMSE values of 0.762 for 2022, 0.996 for 2023, and 0.953 for 2024. The model predicts a distribution of dengue incidence rate (IR) classes of 33% medium and 67% high for 2022. For 2023, the medium class is predicted to decrease by 6%, raising the high class to 73%. For 2024, the medium class is predicted to increase by 10%, from 27% to 37%, with the high class at around 63%. The contribution of this research is predictive information on the classification of DHF spread in the Bandung area over three years using time-based feature expansion.
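The time-based feature expansion described above, using the previous two to four years as predictors, can be sketched as a simple lag-matrix construction; the incidence values below are invented for illustration.

```python
# Sketch of time-based feature expansion: each yearly example gets the
# incidence rates of the preceding years as extra columns. Pure NumPy;
# the numbers are made up.
import numpy as np

ir = np.array([10.0, 12.0, 9.0, 15.0, 14.0, 11.0])  # yearly incidence rates

def expand_lags(series, n_lags):
    """Rows = years with a full lag history; cols = values at t-1, ..., t-n_lags."""
    rows = [[series[t - k] for k in range(1, n_lags + 1)]
            for t in range(n_lags, len(series))]
    return np.array(rows)

X = expand_lags(ir, 3)      # features from the previous 3 years
y = ir[3:]                  # the year being predicted
print(X.shape, X[0])
```

The resulting X and y can be passed directly to a random forest classifier or regressor.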
4

Xue, Liang, Diao Li, Cheng Dai, and Tongchao Nan. "Characterization of Aquifer Multiscale Properties by Generating Random Fractal Field with Truncated Power Variogram Model Using Karhunen–Loève Expansion." Geofluids 2017 (2017): 1–11. http://dx.doi.org/10.1155/2017/1361289.

Abstract:
Traditional geostatistics describes the spatial variation of hydrogeological properties under the assumption of stationarity, or statistical homogeneity. However, growing evidence shows, and it is now widely recognized, that the spatial distribution of many hydrogeological properties can be characterized as random fractals with multiscale features, whose spatial variation is described by a power variogram model. Generating a multiscale random fractal field directly from the nonstationary power variogram model is difficult because it lacks an explicit covariance function. Here we adopt the stationary truncated power variogram model to avoid this difficulty and generate the multiscale random fractal field using a Karhunen–Loève (KL) expansion. The results show that both unconditional and conditional (on measurements) multiscale random fractal fields can be generated with the truncated power variogram model and KL expansion when the upper limit of the integral scale is sufficiently large, and that the main structure of the spatial variation can be described using only the first few dominant KL terms associated with large eigenvalues. The latter provides a foundation for dimensionality reduction and saves computational effort when analyzing stochastic flow and transport problems.
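A minimal sketch of the KL machinery the abstract relies on: a Gaussian random field is generated from the eigendecomposition of a covariance matrix, and a few dominant modes capture most of the variance. An exponential covariance is used here as a simple stand-in for the truncated power variogram covariance.

```python
# Minimal sketch of generating a 1-D Gaussian random field by Karhunen–Loève
# expansion; the exponential covariance and correlation length are assumptions
# standing in for the paper's truncated power variogram model.
import numpy as np

n = 100
x = np.linspace(0.0, 1.0, n)
ell = 0.2                                            # correlation length (assumed)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / ell)   # covariance matrix

# KL expansion = eigendecomposition of the covariance matrix.
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]                    # sort modes by decreasing energy
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# A few dominant modes capture most of the variance, enabling dimension reduction.
m = 10
energy = eigvals[:m].sum() / eigvals.sum()

rng = np.random.default_rng(0)
xi = rng.standard_normal(m)                          # independent N(0, 1) coefficients
field = eigvecs[:, :m] @ (np.sqrt(np.maximum(eigvals[:m], 0.0)) * xi)

print(f"variance captured by {m} modes: {energy:.3f}")
```

Conditioning on measurements would additionally constrain the coefficients, but the truncation idea, keeping only the modes with large eigenvalues, is the same.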
5

Li, Jia, Fangcheng Sun, and Meng Li. "A Study on the Impact of Digital Finance on Regional Productivity Growth Based on Artificial Neural Networks." Computational Intelligence and Neuroscience 2022 (May 31, 2022): 1–7. http://dx.doi.org/10.1155/2022/7665954.

Abstract:
The relationship between financial development and economic growth has become a hot topic in recent years. For China, which is undergoing financial liberalisation and policy reform, the efficiency of digital finance and the balance between quality and quantity in financial development are particularly important for economic growth. This paper investigates the effect of digital finance and financial development on total factor productivity in China, decomposing financial development into financial scale and financial efficiency, using an interprovincial panel data model. The work involves collecting and preprocessing financial data, including feature engineering, and developing an optimised predictive model. The original dataset is preprocessed to remove anomalous records and improve data quality, and feature engineering is used to select relevant features for fitting and training the model. In this process, the random forest algorithm is used to avoid overfitting and to facilitate dimensionality reduction of the relevant features, and a random forest regression model was chosen for training. The empirical results show that digital finance has contributed to productivity growth but is not used efficiently; China should give high priority to improving financial efficiency while promoting financial expansion, since rapid expansion of finance without attention to financial efficiency will not be conducive to productivity growth.
6

Wu, Fan, Yufen Ren, and Xiaoke Wang. "Application of Multi-Source Data for Mapping Plantation Based on Random Forest Algorithm in North China." Remote Sensing 14, no. 19 (October 3, 2022): 4946. http://dx.doi.org/10.3390/rs14194946.

Abstract:
The expansion of plantations poses new challenges for forest mapping, especially in mountainous regions. Using multi-source data, this study explored the capability of the random forest (RF) algorithm for extracting and mapping five forest types in Yanqing, north China. Google Earth imagery, forest inventory data, GaoFen-1 wide-field-of-view (GF-1 WFV) images and a DEM were used to obtain 125 features in total, from which the recursive feature elimination (RFE) method selected 32 for mapping the five forest types. The results attained an overall accuracy of 87.06%, with a Kappa coefficient of 0.833. The mean decrease in accuracy (MDA) reveals that the DEM, winter LAI and EVI, and three texture features (entropy, variance and mean) contribute greatly to forest classification. The texture features from the NIR band are important, while the other texture features contribute little. This study demonstrates the potential of applying multi-source data with the RF algorithm for extracting and mapping plantation forest in north China.
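The RFE step described above, shrinking a large multi-source feature set before classification, can be sketched with scikit-learn; synthetic data stands in for the GF-1/DEM/texture features, and the sizes are illustrative.

```python
# Sketch of recursive feature elimination (RFE) with a random forest, mirroring
# the paper's step of shrinking 125 multi-source features to a smaller subset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

# 40 synthetic features, of which only 5 are informative.
X, y = make_classification(n_samples=300, n_features=40, n_informative=5,
                           n_redundant=5, random_state=0)

rf = RandomForestClassifier(n_estimators=100, random_state=0)
selector = RFE(rf, n_features_to_select=10, step=5)  # drop 5 features per round
selector.fit(X, y)

kept = selector.support_.sum()
print(f"features kept: {kept} of {X.shape[1]}")
```

RFE works here because the random forest exposes `feature_importances_`, which ranks features for elimination at each round.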
7

Kota, V. K. B., Manan Vyas, and K. B. K. Mayya. "Spectral Distribution Analysis of Random Interactions with J-Symmetry and Its Extensions." International Journal of Modern Physics E 17, supp01 (December 2008): 318–33. http://dx.doi.org/10.1142/s0218301308011951.

Abstract:
Spectral distribution theory, based on average-fluctuation separation and trace propagation, is applied to the analysis of some properties of a system of m (identical) nucleons in shell-model j-orbits with random interactions preserving angular momentum (J) symmetry. Employing the bivariate Gaussian form with Edgeworth corrections for the density of states ρ(E,M) at fixed E (energy) and M (Jz eigenvalue), analytical results, in the form of expansions to order [J(J+1)]², are derived for the energy centroids Ec(m,J) and spectral variances σ²(m,J). They are used to study the distribution of spectral widths, the preponderance of J=0 in energy centroids, lower-order cross correlations in states with different J, and so on. An expansion is also obtained for occupation probabilities over spaces with fixed M. All the results obtained with spectral distribution theory compare well with those obtained recently using quite different methods. In addition, using trace propagation methods, a regular feature of the spectral variances generated by random interactions, namely that they are nearly constant, is demonstrated with several examples. These results open a new window for studying regular structures generated by random interactions.
8

Mason, William S., Allison R. Jilbert, and Samuel Litwin. "Hepatitis B Virus DNA Integration and Clonal Expansion of Hepatocytes in the Chronically Infected Liver." Viruses 13, no. 2 (January 30, 2021): 210. http://dx.doi.org/10.3390/v13020210.

Abstract:
Human hepatitis B virus (HBV) can cause chronic, lifelong infection of the liver that may lead to persistent or episodic immune-mediated inflammation against virus-infected hepatocytes. This immune response results in elevated rates of killing of virus-infected hepatocytes, which may extend over many years or decades, lead to fibrosis and cirrhosis, and play a role in the high incidence of hepatocellular carcinoma (HCC) in HBV carriers. Immune-mediated inflammation appears to cause oxidative DNA damage to hepatocytes, which may also play a major role in hepatocarcinogenesis. An additional DNA-damaging feature of chronic infection is the random integration of HBV DNA into the chromosomal DNA of hepatocytes. While HBV DNA integration has no role in virus replication, it may alter gene expression of the host cell. Indeed, most HCCs that arise in HBV carriers contain integrated HBV DNA and, in many, the integrant appears to have played a role in hepatocarcinogenesis. Clonal expansion of hepatocytes, a natural feature of liver biology, occurs because the hepatocyte population is self-renewing and therefore loses complexity through random hepatocyte death and replacement by proliferation of surviving hepatocytes. This process may also represent a risk factor for the development of HCC. Interestingly, during chronic HBV infection, hepatocyte clones, detected using integrated HBV DNA as lineage-specific markers, emerge that are larger than those expected from random death and proliferation of hepatocytes. The emergence of these larger hepatocyte clones may reflect a survival advantage, which could be explained by an ability to avoid the host immune response. While most of these larger hepatocyte clones are probably not preneoplastic, some may have already acquired preneoplastic changes.
Thus, chronic inflammation in the HBV-infected liver may be responsible, at least in part, for both initiation of HCC via oxidative DNA damage and promotion of HCC via stimulation of hepatocyte proliferation through immune-mediated killing and compensatory division.
9

Viny, Aaron D., Alan Lichtin, Brad Pohlman, Zachary Nearman, Thomas Loughran, and Jaroslaw P. Maciejewski. "Chronic B-Cell Dyscrasias Are an Important Clinical Feature of LGL Leukemia." Blood 110, no. 11 (November 16, 2007): 4675. http://dx.doi.org/10.1182/blood.v110.11.4675.4675.

Abstract:
T-cell large granular lymphocyte leukemia (T-LGL) is a chronic clonal lymphoproliferation of CTL. Its context of immune-mediated disease led to the hypothesis that T-LGL represents an exaggerated clonal immune response to a persistent antigen, such as an autoantigen or viral antigen, or is associated with an immune surveillance reaction to occult malignancies. Based on the structural similarity of TCR CDR3, we have demonstrated that the transformation event in T-LGL may not be random and is driven by a related antigen. Similar to the clonal evolution in LGL, B cell expansion in low-grade non-Hodgkin lymphoma may also not be entirely stochastic. There have been coincidental case reports of T-LGL patients with concomitant B cell dyscrasia. It is therefore possible that similar pathogenic triggers are operative in chronic proliferation of T and B cells, which subsequently predisposes to clonal outgrowth. Consequently, we systematically examined a large series of T-LGL patients for evidence of B cell dyscrasias. Among our patients (N=70) we found a frequent association with low-grade B cell lymphoproliferative disorders (28%). In general, all clinical comparisons with reported studies suggested that our LGL cohort had a composition equivalent to that of prior series and that the findings with regard to B cell dyscrasias are not due to selection bias. By comparison, a total of 51 patients with concomitant B and T cell dyscrasia were previously reported in small series or case reports. In our series, MGUS was the most common B cell disorder identified in T-LGL (21%); B-CLL was also present in 7%. Of note, 8 patients received rituximab, and evolution of clonal T cell expansion after rituximab therapy has been identified in isolated case reports. In addition to clonal B cell expansion, polyclonal hyperglobulinemia was found in 26% of T-LGL patients, similar to previous reports. Hypoglobulinemia was identified in 12% of patients.
Evidence of involvement of both the T and B cell compartments in T-LGL fits several models of disease pathogenesis. T-LGL may represent anti-tumor surveillance, reflecting an exaggerated clonal expansion in the context of a polyclonal anti-tumor response. An alternative theory is that both conditions result from an initial polyclonal immune reaction directed against an unrelated common target; one could speculate that autoimmune/viral diseases associated with T-LGL, or malignancies (e.g., MDS), provide antigenic triggers. It is also conceivable that an impaired humoral immune response could result in an exuberant T cell reaction against an uncleared antigen: if B cell function is insufficient to fully clear the inciting pathogen, chronic antigenic stimulation could polarize the T cell-mediated response, resulting in LGL expansion. Finally, the identification of decreased immunoglobulin levels in the context of LGL leukemia also gives merit to the theory that the B and T cell compartments are governed by regulatory/compensatory feedback mechanisms, and that T-LGL could evolve from unchecked T cell expansion in the context of B cell dysfunction. In sum, we describe here a high frequency of B cell dyscrasias in patients with T-LGL. The association is unlikely to be coincidental and provides important insight into dysregulated T cell and B cell function.
10

Moutselos, Konstantinos, Ilias Maglogiannis, and Aristotelis Chatziioannou. "Integration of High-Volume Molecular and Imaging Data for Composite Biomarker Discovery in the Study of Melanoma." BioMed Research International 2014 (2014): 1–14. http://dx.doi.org/10.1155/2014/145243.

Abstract:
In this work the effects of simple imputations are studied with regard to the integration of multimodal data originating from different patients. Two separate datasets of cutaneous melanoma are used: an image analysis (dermoscopy) dataset together with a transcriptomic one, specifically DNA microarrays. Each modality is related to a different set of patients, and four imputation methods are employed in the formation of a unified, integrative dataset. The application of backward selection together with ensemble classifiers (random forests), followed by principal components analysis and linear discriminant analysis, illustrates the implications of the imputations for feature selection and dimensionality reduction methods. The results suggest that the expansion of the feature space through data integration, achieved by the exploitation of imputation schemes in general, aids the classification task, imparting stability to the derivation of putative classifiers. In particular, although the biased imputation methods significantly increase the predictive performance and class discrimination of the datasets, they still contribute to the study of prominent features and their relations. The fusion of separate datasets providing a multimodal description of the same pathology represents an innovative, promising avenue, enhancing robust composite biomarker derivation and promoting the interpretation of the biomedical problem studied.
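The imputation step at the core of this study, filling in the modality each patient lacks before the datasets are fused, can be sketched with scikit-learn's SimpleImputer; the values and the mean strategy below are illustrative assumptions, not the paper's actual data or schemes.

```python
# Sketch of simple imputation when fusing datasets whose patients don't overlap:
# each patient is missing one modality, filled in before joint analysis.
import numpy as np
from sklearn.impute import SimpleImputer

# Rows = patients; first two columns "imaging", last two "transcriptomic".
# NaN marks the modality a patient is missing. Values are made up.
X = np.array([
    [0.7, 0.2, np.nan, np.nan],   # imaging-only patient
    [0.6, 0.3, np.nan, np.nan],
    [np.nan, np.nan, 1.5, 2.0],   # microarray-only patient
    [np.nan, np.nan, 1.7, 2.2],
])

imputer = SimpleImputer(strategy="mean")   # one possible imputation scheme
X_full = imputer.fit_transform(X)
print(X_full[0])   # imaging row now carries mean transcriptomic values
```

After imputation the unified matrix can be passed to feature selection and classifiers, which cannot handle NaN entries directly.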
11

Liu, Junfeng, Wendan Tao, Zhetao Wang, Xinyue Chen, Bo Wu, and Ming Liu. "Radiomics-based prediction of hemorrhage expansion among patients with thrombolysis/thrombectomy related-hemorrhagic transformation using machine learning." Therapeutic Advances in Neurological Disorders 14 (January 2021): 175628642110600. http://dx.doi.org/10.1177/17562864211060029.

Abstract:
Introduction: Patients with hemorrhagic transformation (HT) have been reported to show hemorrhage expansion. However, identifying the patients at high risk of hemorrhage expansion has not been well studied. Objectives: We aimed to develop a radiomic score to predict hemorrhage expansion after HT among patients treated with thrombolysis/thrombectomy during the acute phase of ischemic stroke. Methods: A total of 104 patients with HT after reperfusion treatment at West China Hospital, Sichuan University, between 1 January 2012 and 31 December 2020, were retrospectively included in this study. The preprocessed initial non-contrast-enhanced computed tomography (NECT) brain images were used for radiomic feature extraction. A synthetic minority oversampling technique (SMOTE) was applied to the original data set, and the resulting data set was randomly split into training and testing cohorts at an 8:2 ratio by stratified random sampling. Least absolute shrinkage and selection operator (LASSO) regression was applied to identify candidate radiomic features and construct the radiomic score. The performance of the score was evaluated by receiver operating characteristic (ROC) analysis and a calibration curve, and decision curve analysis (DCA) was performed to evaluate the clinical value of the model. Results: Among the 104 patients, 17 were identified with hemorrhage expansion after HT detection. A total of 154 candidate predictors were extracted from the NECT images, and five optimal features were ultimately included in the radiomic score, developed using a logistic regression machine-learning approach. The radiomic score showed good performance, with high areas under the curve in the training data set (0.91; sensitivity 0.83; specificity 0.89), the test data set (0.87; sensitivity 0.60; specificity 0.85), and the original data set (0.82; sensitivity 0.77; specificity 0.78).
The calibration curve and DCA also indicated high accuracy and clinical usefulness of the radiomic score for predicting hemorrhage expansion after HT. Conclusions: The NECT-based radiomic score established here is valuable in predicting hemorrhage expansion after HT among patients treated with reperfusion treatment after ischemic stroke, and may aid clinicians in identifying the patients with HT most likely to benefit from anti-expansion treatment.
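The LASSO-based score construction described in the Methods can be sketched as L1-penalized logistic regression, which drives most coefficients to zero and keeps a sparse feature subset. This sketch omits the SMOTE oversampling step and uses synthetic data; the sizes only loosely mirror the study's 104 patients and 154 candidate features.

```python
# Sketch of building a sparse "radiomic score" via L1 (LASSO) logistic
# regression: many candidate features in, a small selected subset out.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Imbalanced synthetic data standing in for radiomic features (assumption).
X, y = make_classification(n_samples=104, n_features=50, n_informative=4,
                           weights=[0.84, 0.16], random_state=42)

# The L1 penalty zeroes out most coefficients, performing feature selection.
lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.3)
lasso.fit(X, y)

selected = int(np.sum(lasso.coef_ != 0))
auc = roc_auc_score(y, lasso.decision_function(X))
print(f"features selected: {selected} of 50, in-sample AUC: {auc:.2f}")
```

The nonzero coefficients define the score; a held-out split and calibration analysis, as in the paper, would be needed for honest performance estimates.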
12

Xu, Kaibin, Jing Qian, Zengyun Hu, Zheng Duan, Chaoliang Chen, Jun Liu, Jiayu Sun, Shujie Wei, and Xiuwei Xing. "A New Machine Learning Approach in Detecting the Oil Palm Plantations Using Remote Sensing Data." Remote Sensing 13, no. 2 (January 12, 2021): 236. http://dx.doi.org/10.3390/rs13020236.

Abstract:
The rapid expansion of oil palm is a major driver of deforestation and other associated damage to the climate and ecosystem in tropical regions, especially Southeast Asia. It is therefore necessary to precisely detect and monitor oil palm plantations to safeguard the ecosystem services and biodiversity of tropical forests. Compared with optical data, which are vulnerable to cloud cover, the Sentinel-1 dual-polarization C-band synthetic aperture radar (SAR) acquires global observations under all weather conditions at any time of day and shows good performance for oil palm detection in the humid tropics. However, accurately distinguishing mature from young oil palm trees using optical and SAR data is difficult, and existing classification algorithms depend strongly on their input parameter values when detecting oil palm plantations. We therefore propose an innovative method to improve the accuracy of classifying the oil palm type (mature or young) and detecting the oil palm planting area in Sumatra by fusing Landsat-8 and Sentinel-1 images. We extract multitemporal spectral characteristics, SAR backscattering values, vegetation indices, and texture features to establish different feature combinations. We then use the random forest algorithm based on improved grid search optimization (IGSO-RF) and select optimal feature subsets to establish a classification model and detect oil palm plantations. Based on the IGSO-RF classifier and optimal features, our method improved the oil palm detection accuracy and obtained the best model performance (OA = 96.08% and kappa = 0.9462). Moreover, the contributions of different features to oil palm detection differ; the optimal feature subset performed best and demonstrated good potential for the detection of oil palm plantations.
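The hyperparameter tuning that IGSO-RF improves upon can be sketched as a plain grid search over a random forest. This is the standard baseline, not the paper's improved algorithm, and the data are synthetic stand-ins for the fused Landsat-8/Sentinel-1 features.

```python
# Sketch of tuning a random forest by exhaustive grid search with
# cross-validation, the baseline that "improved grid search" refines.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Three classes loosely standing in for mature / young oil palm / other.
X, y = make_classification(n_samples=300, n_features=12, n_informative=6,
                           n_classes=3, random_state=0)

param_grid = {"n_estimators": [50, 100], "max_depth": [5, None]}
search = GridSearchCV(RandomForestClassifier(random_state=0),
                      param_grid, cv=3, scoring="accuracy")
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

An "improved" grid search typically narrows or refines this grid iteratively instead of evaluating every combination at full resolution.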
13

Hamrouni, Adam, Daniel Dias, and Xiangfeng Guo. "Behavior of Shallow Circular Tunnels—Impact of the Soil Spatial Variability." Geosciences 12, no. 2 (February 21, 2022): 97. http://dx.doi.org/10.3390/geosciences12020097.

Abstract:
Spatial variability is unavoidable in soils, and it is important to consider this feature in geotechnical design, as it may lead to structural behaviors that cannot be predicted by calculations assuming homogeneous soils. This paper evaluates the performance of a shallow circular tunnel at the serviceability limit state, considering soil spatial variability. Log-normally distributed random fields, generated by the Karhunen–Loève expansion method, are used for the spatial modeling. A two-dimensional numerical model, based on the finite difference method, is constructed to deterministically estimate two quantities of interest (the tunnel lining bending moment and the surface settlement). The model is combined with the random fields and embedded in Monte Carlo simulations to investigate the effects of soil spatial variability on the tunnel responses. The autocorrelation distance, an important random-field parameter, is varied across multiple probabilistic analyses. For both tunnel responses, variability increases with increasing autocorrelation distance, while a minimum mean value is observed when this parameter is approximately equal to the tunnel radius. This finding is very useful for practical design. A sensitivity analysis is also conducted to show the importance of each random parameter.
14

de Santis, Rodrigo Barbosa, Tiago Silveira Gontijo, and Marcelo Azevedo Costa. "A Data-Driven Framework for Small Hydroelectric Plant Prognosis Using Tsfresh and Machine Learning Survival Models." Sensors 23, no. 1 (December 20, 2022): 12. http://dx.doi.org/10.3390/s23010012.

Abstract:
Maintenance in small hydroelectric plants (SHPs) is essential for securing the expansion of clean energy sources and supplying the energy estimated to be required for the coming years. Identifying failures in SHPs before they happen is crucial for allowing better management of asset maintenance, lowering operating costs, and enabling the expansion of renewable energy sources. Most fault prognosis models proposed thus far for hydroelectric generating units are based on signal decomposition and regression models. In the specific case of SHPs, there is a high occurrence of data being censored, since the operation is not consistently steady and can be repeatedly interrupted due to transmission problems or scarcity of water resources. To overcome this, we propose a two-step, data-driven framework for SHP prognosis based on time series feature engineering and survival modeling. We compared two different strategies for feature engineering: one using higher-order statistics and the other using the Tsfresh algorithm. We adjusted three machine learning survival models—CoxNet, survival random forests, and gradient boosting survival analysis—for estimating the concordance index of these approaches. The best model presented a significant concordance index of 77.44%. We further investigated and discussed the importance of the monitored sensors and the feature extraction aggregations. The kurtosis and variance were the most relevant aggregations in the higher-order statistics domain, while the fast Fourier transform and continuous wavelet transform were the most frequent transformations when using Tsfresh. The most important sensors were related to the temperature at several points, such as the bearing generator, oil hydraulic unit, and turbine radial bushing.
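The concordance index used to compare the survival models above can be computed directly for right-censored data; this is a minimal pure-Python version, with invented times and risk scores, rather than the implementation the authors used.

```python
# Sketch of the concordance index (C-index): among comparable pairs, the
# fraction where the model assigns higher risk to the unit that fails earlier.
def concordance_index(times, events, risk_scores):
    """times: observed times; events: 1 = failure observed, 0 = censored;
    risk_scores: higher = predicted to fail sooner."""
    concordant, ties, comparable = 0, 0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # A pair is comparable only if i's failure precedes j's observed time.
            if events[i] == 1 and times[i] < times[j]:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / comparable

times = [2, 4, 5, 7, 9]
events = [1, 1, 0, 1, 0]           # two units are censored (e.g. interrupted operation)
risks = [0.9, 0.7, 0.6, 0.4, 0.1]  # perfectly anti-ordered with time
print(concordance_index(times, events, risks))  # 1.0
```

A C-index of 0.5 corresponds to random ranking, so the paper's 77.44% indicates the model orders failures substantially better than chance.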
15

Maabreh, Majdi, Ibrahim Obeidat, Esraa Abu Elsoud, Asma Alnajjar, Rahaf Alzyoud, and Omar Darwish. "Towards Data-Driven Network Intrusion Detection Systems: Features Dimensionality Reduction and Machine Learning." International Journal of Interactive Mobile Technologies (iJIM) 16, no. 14 (July 26, 2022): 123–35. http://dx.doi.org/10.3991/ijim.v16i14.30197.

Abstract:
Cyberattacks have increased in tandem with the exponential expansion of computer networks and network applications throughout the world. In this study, we evaluate and compare four feature selection methods, seven classical machine learning algorithms, and a deep learning algorithm on one million random instances of the CSE-CIC-IDS2018 big data set of network intrusions. The data set was preprocessed and cleaned, and all learning algorithms were trained on the original feature values. The feature selection methods highlighted the importance of features related to the forward direction (FWD) and two flow measures (FLOW) in predicting the binary traffic type, benign or attack. Furthermore, the results revealed that whether models are trained on all features or on the top 30 features selected by any of the four feature selection techniques used in this experiment, there is no significant difference in model performance. Moreover, we may be able to train ML models on only four features and have them perform similarly to models trained on all features, which may yield preferable models in terms of complexity, explainability, and scale for deployment. By choosing the four unanimously selected features instead of all traffic features, training time may be reduced to between 10% and 50% of the training time on all features.
16

Luo, Xin, Xiaohua Tong, Zhongwen Hu, and Guofeng Wu. "Improving Urban Land Cover/Use Mapping by Integrating A Hybrid Convolutional Neural Network and An Automatic Training Sample Expanding Strategy." Remote Sensing 12, no. 14 (July 16, 2020): 2292. http://dx.doi.org/10.3390/rs12142292.

Abstract:
Moderate spatial resolution (MSR) satellite images, which offer a trade-off among radiometric, spectral, spatial and temporal characteristics, are extremely popular data for acquiring land cover information. However, the low accuracy of existing classification methods for MSR images is still a fundamental issue restricting their capability in urban land cover mapping. In this study, we proposed a hybrid convolutional neural network (H-ConvNet) for improving urban land cover mapping with MSR Sentinel-2 images. The H-ConvNet was structured with two streams: a lightweight 1D ConvNet for deep spectral feature extraction and a lightweight 2D ConvNet for deep context feature extraction. To obtain a well-trained 2D ConvNet, a training sample expansion strategy was introduced to assist context feature learning. The H-ConvNet was tested in six highly heterogeneous urban regions around the world and compared with a support vector machine (SVM), object-based image analysis (OBIA), a Markov random field model (MRF) and a newly proposed patch-based ConvNet system. The results showed that the H-ConvNet performed best. We hope that the proposed H-ConvNet will benefit land cover mapping with MSR images in highly heterogeneous urban regions.
17

Dučinskas, K., and J. Šaltytė. "Quadratic Discriminant Analysis of Spatially Correlated Data." Nonlinear Analysis: Modelling and Control 6, no. 2 (December 5, 2001): 15–28. http://dx.doi.org/10.15388/na.2001.6.1.15212.

Full text
Abstract:
The problem of classifying a realisation of a stationary univariate Gaussian random field into one of two populations with different means and different factorised covariance matrices is considered. In this case, the optimal classification rule in the sense of minimum probability of misclassification is associated with a non-linear (quadratic) discriminant function. The unknown means and covariance matrices of the feature vector components are estimated from spatially correlated training samples using the maximum likelihood approach, assuming the spatial correlations to be known. An explicit formula for the Bayes error rate and the first-order asymptotic expansion of the expected error rate associated with the quadratic plug-in discriminant function are presented. A set of numerical calculations for the spherical spatial correlation function is performed and two different spatial sampling designs are compared.
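For two univariate Gaussian populations, the quadratic discriminant mentioned above reduces to comparing log-densities. A minimal plug-in sketch (equal priors assumed; the estimated parameters are simply passed in):

```python
import math

def qda_score(x, mu0, var0, mu1, var1):
    """Quadratic discriminant: log p1(x) - log p0(x) for two univariate
    Gaussians with equal priors; positive score -> assign to population 1."""
    return (-0.5 * math.log(var1 / var0)
            - (x - mu1) ** 2 / (2 * var1)
            + (x - mu0) ** 2 / (2 * var0))

def classify(x, mu0, var0, mu1, var1):
    return 1 if qda_score(x, mu0, var0, mu1, var1) > 0 else 0
```

When the variances are equal the score is linear in x with the boundary at the midpoint of the means; unequal variances make it genuinely quadratic.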
APA, Harvard, Vancouver, ISO, and other styles
18

Liu, Chen, Chunjiang Zhao, Huarui Wu, Xiao Han, and Shuqin Li. "ADDLight: An Energy-Saving Adder Neural Network for Cucumber Disease Classification." Agriculture 12, no. 4 (March 23, 2022): 452. http://dx.doi.org/10.3390/agriculture12040452.

Full text
Abstract:
It is an urgent task to improve the applicability of the cucumber disease classification model in greenhouse edge-intelligent devices. The energy consumption of disease diagnosis models designed based on deep learning methods is a key factor affecting their applicability. Based on this motivation, two methods, reducing the model’s calculation amount and changing the calculation method of feature extraction, were used in this study to reduce the model’s computational energy consumption, thereby prolonging the working time of greenhouse edge devices deployed with disease models. First, a cucumber disease dataset with complex backgrounds is constructed in this study. Second, random data augmentation is used to enhance the data during model training. Third, the conventional feature extraction module, the depthwise separable feature extraction module, and the squeeze-and-excitation module are the main modules for constructing the classification model. In addition, the strategies of channel expansion and shortcut connection are used to further improve the model’s classification accuracy. Finally, the additive feature extraction method is used to reconstruct the proposed model. The experimental results show that the computational energy consumption of the adder cucumber disease classification model is reduced by 96.1% compared with a convolutional neural network of the same structure. In addition, the model size is only 0.479 MB, the calculation amount is 0.03 GFLOPs, and the classification accuracy on cucumber disease images with complex backgrounds is 89.1%. All results prove that our model has high applicability in cucumber greenhouse intelligent equipment.
APA, Harvard, Vancouver, ISO, and other styles
19

Malamatinos, Marios-Christos, Eleni Vrochidou, and George A. Papakostas. "On Predicting Soccer Outcomes in the Greek League Using Machine Learning." Computers 11, no. 9 (August 31, 2022): 133. http://dx.doi.org/10.3390/computers11090133.

Full text
Abstract:
The global expansion of the sports betting industry has brought the prediction of outcomes of sport events into the foreground of scientific research. In this work, soccer outcome prediction methods are evaluated, focusing on the Greek Super League. Data analysis, including data cleaning, Sequential Forward Selection (SFS), feature engineering methods and data augmentation, is conducted. The most important features are used to train five machine learning models: k-Nearest Neighbor (k-NN), LogitBoost (LB), Support Vector Machine (SVM), Random Forest (RF) and CatBoost (CB). For comparative reasons, the best model is also tested on the English Premier League and the Dutch Eredivisie, exploiting data statistics from the six seasons from 2014 to 2020. Convolutional neural networks (CNN) and transfer learning are also tested by encoding tabular data to images, using 10-fold cross-validation and grid and randomized hyperparameter tuning, with four backbones: DenseNet201, InceptionV3, MobileNetV2 and ResNet101V2. This is the first time the Greek Super League is investigated in depth, providing important features and comparative performance between several machine and deep learning models, as well as between leagues. Experimental results in all cases demonstrate that the most accurate prediction model is CB, reporting 67.73% accuracy, while the Greek Super League is the most predictable league.
APA, Harvard, Vancouver, ISO, and other styles
20

Masri, S. F., A. W. Smyth, and M. I. Traina. "Probabilistic Representation and Transmission of Nonstationary Processes in Multi-Degree-of-Freedom Systems." Journal of Applied Mechanics 65, no. 2 (June 1, 1998): 398–409. http://dx.doi.org/10.1115/1.2789068.

Full text
Abstract:
A relatively simple and straightforward procedure is presented for representing non-stationary random process data in a compact probabilistic format which can be used as excitation input in multi-degree-of-freedom analytical random vibration studies. The method involves two main stages of compaction. The first stage is based on the spectral decomposition of the covariance matrix by the orthogonal Karhunen-Loeve expansion. The dominant eigenvectors are subsequently least-squares fitted with orthogonal polynomials to yield an analytical approximation. This compact analytical representation of the random process is then used to derive an exact closed-form solution for the nonstationary response of general linear multi-degree-of-freedom dynamic systems. The approach is illustrated by the use of an ensemble of free-field acceleration records from the 1994 Northridge earthquake to analytically determine the covariance kernels of the response of a two-degree-of-freedom system resembling a commonly encountered problem in the structural control field. Spectral plots of the extreme values of the rms response of representative multi-degree-of-freedom systems under the action of the subject earthquake are also presented. It is shown that the proposed random data-processing method is not only a useful data-archiving and earthquake feature-extraction tool, but also provides a probabilistic measure of the average statistical characteristics of earthquake ground motion corresponding to a spatially distributed region. Such a representation could be a valuable tool in risk management studies to quantify the average seismic risk over a spatially extended area.
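The first compaction stage above, spectral decomposition of the covariance matrix via the Karhunen-Loeve expansion, can be illustrated in closed form for the 2x2 case. This toy sketch (not the authors' procedure) extracts the dominant mode and reports the fraction of total variance it captures:

```python
import math

def eig_sym2(a, b, c):
    """Eigen-decomposition of the symmetric 2x2 covariance [[a, b], [b, c]].
    Returns ((l1, v1), l2) with l1 >= l2 and v1 the unit dominant eigenvector."""
    tr, det = a + c, a * c - b * b
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    if b == 0:
        v1 = (1.0, 0.0) if a >= c else (0.0, 1.0)
    else:
        n = math.hypot(l1 - c, b)          # (l1 - c, b) solves (A - l1 I) v = 0
        v1 = ((l1 - c) / n, b / n)
    return (l1, v1), l2

def kl_truncate(a, b, c):
    """Fraction of total variance captured by the dominant KL mode."""
    (l1, _), l2 = eig_sym2(a, b, c)
    return l1 / (l1 + l2)
```

Keeping only eigenvectors whose eigenvalues carry most of the variance, and then least-squares fitting them with polynomials, is the compaction idea the abstract describes.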
APA, Harvard, Vancouver, ISO, and other styles
21

Shaffie, Ahmed, Ahmed Soliman, Amr Eledkawy, Victor van Berkel, and Ayman El-Baz. "Computer-Assisted Image Processing System for Early Assessment of Lung Nodule Malignancy." Cancers 14, no. 5 (February 22, 2022): 1117. http://dx.doi.org/10.3390/cancers14051117.

Full text
Abstract:
Lung cancer is one of the most dreadful cancers, and its detection in the early stage is very important and challenging. This manuscript proposes a new computer-aided diagnosis system for lung cancer diagnosis from chest computed tomography scans. The proposed system extracts two different kinds of features, namely, appearance features and shape features. For the appearance features, a Histogram of oriented gradients, a Multi-view analytical Local Binary Pattern, and a Markov Gibbs Random Field are developed to give a good description of the lung nodule texture, which is one of the main distinguishing characteristics between benign and malignant nodules. For the shape features, Multi-view Peripheral Sum Curvature Scale Space, Spherical Harmonics Expansion, and a group of some fundamental morphological features are implemented to describe the outer contour complexity of the nodules, which is main factor in lung nodule diagnosis. Each feature is fed into a stacked auto-encoder followed by a soft-max classifier to generate the initial malignancy probability. Finally, all these probabilities are combined together and fed to the last network to give the final diagnosis. The system is validated using 727 nodules which are subset from the Lung Image Database Consortium (LIDC) dataset. The system shows very high performance measures and achieves 92.55%, 91.70%, and 93.40% for the accuracy, sensitivity, and specificity, respectively. This high performance shows the ability of the system to distinguish between the malignant and benign nodules precisely.
APA, Harvard, Vancouver, ISO, and other styles
22

Mazrouee, Sepideh, Susan J. Little, and Joel O. Wertheim. "Incorporating metadata in HIV transmission network reconstruction: A machine learning feasibility assessment." PLOS Computational Biology 17, no. 9 (September 22, 2021): e1009336. http://dx.doi.org/10.1371/journal.pcbi.1009336.

Full text
Abstract:
HIV molecular epidemiology estimates transmission patterns by clustering genetically similar viruses. The process involves connecting genetically similar genotyped viral sequences in a network implying epidemiological transmissions. This technique relies on genotype data, which are collected only from HIV-diagnosed and in-care populations and leave many persons with HIV (PWH) who have no access to consistent care out of the tracking process. We use machine learning algorithms to learn the non-linear correlation patterns between patient metadata and transmissions between HIV-positive cases. This enables us to expand the transmission network reconstruction beyond the molecular network. We employed multiple commonly used supervised classification algorithms to analyze the San Diego Primary Infection Resource Consortium (PIRC) cohort dataset, consisting of genotypes and nearly 80 additional non-genetic features. First, we trained classification models to distinguish genetically unrelated individuals from related ones. Our results show that random forest and decision tree achieved over 80% in accuracy, precision, recall, and F1-score by using only a subset of meta-features, including age, birth sex, sexual orientation, race, transmission category, estimated date of infection, and first viral load date, besides genetic data. Additionally, both algorithms achieved approximately 80% sensitivity and specificity. The Area Under the Curve (AUC) is reported as 97% and 94% for the random forest and decision tree classifiers, respectively. Next, we extended the models to identify clusters of similar viral sequences. The support vector machine demonstrated an order of magnitude improvement in the accuracy of assigning sequences to the correct cluster compared to a dummy uniform random classifier. These results confirm that metadata carry important information about the dynamics of HIV transmission as embedded in transmission clusters.
Hence, novel computational approaches are needed to apply the non-trivial knowledge collected from inter-individual genetic information to metadata from PWH in order to expand the estimated transmissions. We note that feature extraction alone will not be effective in identifying patterns of transmission and will result in random clustering of the data, but its utilization in conjunction with genetic data and the right algorithm can contribute to the expansion of the reconstructed network beyond individuals with genetic data.
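The AUC values reported above equal the probability that the classifier scores a random positive case above a random negative one (ties count half). A minimal pure-Python computation of that statistic:

```python
def roc_auc(scores, labels):
    """AUC as P(random positive outranks random negative), ties counted 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

This pairwise form is exact but quadratic in the sample size; production code usually sorts once and uses ranks instead.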
APA, Harvard, Vancouver, ISO, and other styles
23

Olevsky, Gregory, and Timurs Safiulins. "Knowledge Intensive Business Sector in Latvian National Economy: Random Effect Factor outlook." Humanities and Social Sciences: Latvia 29, no. 2 (December 2021): 36–51. http://dx.doi.org/10.22364/hssl.29.2.03.

Full text
Abstract:
Global experience shows that sustainable economic development takes place in countries with economies focused on the creation and intensive use of knowledge. Entrepreneurs are interested in investing in knowledge and using the findings obtained in the company’s development. Investing in knowledge strengthens a company’s market position, thus increasing the probability of successful introduction of its new products and services. Based on the general idea of a tailor-made mix of the content, structure and functioning mechanism of market relations, it can be stated that knowledge is necessary for market participants in order to reach a broader market share, take business advantage of innovations, increase competitiveness and enter new markets, as well as ensure higher customer satisfaction with both goods and services. The ability to accumulate knowledge by investing in large-scale research projects is a power of large corporations, which further determines their dominance in the global market. However, knowledge in terms of disruptive services is even more important among owners and managers of small and medium-sized enterprises (SMEs). The expansion of knowledge in the medium and especially the small business environment promoted the emergence of a specific business niche known as knowledge intensive business. The knowledge-based economy is gradually “displacing” the resource-based economy, stimulating entrepreneurs to put more focus on the use of information resources as a feature of the knowledge-intensive economy, thus pacing the overall growth dynamics of the segment. This article focuses on the identification and analysis of factors affecting the development of the knowledge intensive business sector in the Latvian national economy using a random effects regression model.
Random effects regression was used since it is best suited for panel data: the compiled repeated observations on the same units allow the model to be enriched by inserting an additional term in the regression, capturing individual-specific, time-invariant factors affecting the dependent variable.
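The random-effects specification described above, with an individual-specific, time-invariant term alongside the idiosyncratic error, is conventionally written as:

```latex
y_{it} = \alpha + \beta^{\top} x_{it} + u_i + \varepsilon_{it},
\qquad u_i \sim \mathrm{iid}(0, \sigma_u^2),
\qquad \varepsilon_{it} \sim \mathrm{iid}(0, \sigma_\varepsilon^2),
```

where $i$ indexes units, $t$ indexes time, and $u_i$ is the additional term capturing unobserved unit-level heterogeneity, assumed uncorrelated with the regressors $x_{it}$.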
APA, Harvard, Vancouver, ISO, and other styles
24

Guo, Jiawei, Yu Jin, Huichun Ye, Wenjiang Huang, Jinling Zhao, Bei Cui, Fucheng Liu, and Jiajian Deng. "Recognition of Areca Leaf Yellow Disease Based on PlanetScope Satellite Imagery." Agronomy 12, no. 1 (December 23, 2021): 14. http://dx.doi.org/10.3390/agronomy12010014.

Full text
Abstract:
Areca yellow leaf disease is a major threat to the planting and production of arecanut (Areca catechu L.). The continuous expansion of arecanut planting areas in Hainan has created a great need to strengthen the monitoring of this disease, yet at present there is little research on its monitoring. PlanetScope imagery achieves daily global coverage at a high spatial resolution (3 m) and is thus suitable for the high-precision monitoring of plant pests and diseases. In this paper, 13 spectral features commonly used in vegetation growth, pest and disease monitoring were extracted from PlanetScope images to form the initial feature space, followed by Correlation Analysis (CA) and independent t-testing to optimize the feature space. Then, the Random Forest (RF), Backward Propagation Neural Network (BPNN) and AdaBoost algorithms were applied to the optimized feature space to construct binary (healthy, diseased) monitoring models for areca yellow leaf disease. The results indicated that the green, blue and red bands, the plant senescence reflectance index (PSRI) and the enhanced vegetation index (EVI) exhibited highly significant differences and strong correlations between healthy and diseased samples. The RF model exhibits the highest overall recognition accuracy for areca yellow leaf disease (88.24%), 2.95% and 20.59% higher than the BPNN and AdaBoost models, respectively. The commission and omission errors were lowest with the RF model for both healthy and diseased samples, and this model also exhibited the highest Kappa coefficient, at 0.765. Our results show the feasibility of applying PlanetScope imagery to the regional large-scale monitoring of areca yellow leaf disease, with the RF method identified as the most suitable for this task.
Our study provides a reference for the monitoring of the disease, a rapid assessment of the affected area and management planning in the agricultural and forestry industries.
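The spectral screening described above combines vegetation indices with significance testing. A small sketch, with band reflectances assumed in [0, 1]; the broad-band PSRI form here is an approximation of the narrow-band definition (R678 − R500)/R750, and the Welch statistic is one way to run the independent t-test between healthy and diseased samples:

```python
from statistics import mean, variance

def evi(nir, red, blue):
    """Enhanced Vegetation Index (MODIS coefficients, reflectances in [0, 1])."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def psri(red, blue, nir):
    """Plant Senescence Reflectance Index, broad-band approximation."""
    return (red - blue) / nir

def welch_t(x0, x1):
    """Welch's t statistic for screening one feature between two classes."""
    n0, n1 = len(x0), len(x1)
    return (mean(x1) - mean(x0)) / ((variance(x0) / n0 + variance(x1) / n1) ** 0.5)
```

Features whose |t| exceeds the critical value (and whose correlation with the label is strong) survive into the optimized feature space.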
APA, Harvard, Vancouver, ISO, and other styles
25

Voronin, E. G. "On the displacements of the contours of the optic-electronic space images. Causes and evaluation of offsets." Geodesy and Cartography 923, no. 5 (June 20, 2017): 34–41. http://dx.doi.org/10.22389/0016-7126-2017-923-5-34-41.

Full text
Abstract:
This is the second of three consecutive articles devoted to the phenomenon of displacement of identical points in the overlapping scans obtained by adjacent CCD matrices during optoelectronic imaging. This article uncovers the main causes of the displacements and gives theoretical estimates of the values attributable to each of these causes. It is noted that the obtained theoretical estimates of the displacements are consistent with the results of measurements of real optical-electronic space images. The considered causes are classified into gross, systematic and random offsets. It is shown that gross offsets are caused mainly by clouds and digital correlation errors in image measurement. Systematic offsets depend on the inclination of the imaging system; the size, location and installation errors of the CCDs in the focal plane; the deviation of the actual image motion speed from its theoretical value; and the thermal expansion of the optical-electronic converter. The random offsets are determined by a number of factors, chief among which are terrain relief, temperature changes and instability of the angular motion of the imaging system. It is concluded that the displacements of identical points in the overlap of adjacent CCDs are a characteristic feature of optic-electronic space images and affect their measuring accuracy.
APA, Harvard, Vancouver, ISO, and other styles
26

BRODY, DORJE C., and STALA HADJIPETRI. "COHERENT CHAOS INTEREST-RATE MODELS." International Journal of Theoretical and Applied Finance 18, no. 03 (May 2015): 1550016. http://dx.doi.org/10.1142/s0219024915500168.

Full text
Abstract:
The Wiener chaos approach to interest-rate modeling arises from the observation that in the general context of an arbitrage-free model with a Brownian filtration, the pricing kernel admits a representation in terms of the conditional variance of a square-integrable generator, which in turn admits a chaos expansion. When the expansion coefficients of the random generator factorize into multiple copies of a single function, the resulting interest-rate model is called "coherent", whereas a generic interest-rate model is necessarily "incoherent". Coherent representations are of fundamental importance because an incoherent generator can always be expressed as a linear superposition of coherent elements. This property is exploited to derive general expressions for the pricing kernel and the associated bond price and short rate processes in the case of a generic nth order chaos model, for each n ∈ ℕ. Pricing formulae for bond options and swaptions are obtained in closed form for a number of examples. An explicit representation for the pricing kernel of a generic incoherent model is then obtained by use of the underlying coherent elements. Finally, finite-dimensional realizations of coherent chaos models are investigated and we show that a class of highly tractable models can be constructed having the characteristic feature that the discount bond price is given by a piecewise-flat (simple) process.
APA, Harvard, Vancouver, ISO, and other styles
27

Aslam, Nida, Irfan Ullah Khan, Samiha Mirza, Alanoud AlOwayed, Fatima M. Anis, Reef M. Aljuaid, and Reham Baageel. "Interpretable Machine Learning Models for Malicious Domains Detection Using Explainable Artificial Intelligence (XAI)." Sustainability 14, no. 12 (June 16, 2022): 7375. http://dx.doi.org/10.3390/su14127375.

Full text
Abstract:
With the expansion of the internet, a major threat has emerged involving the spread of malicious domains used by attackers to perform illegal activities: targeting governments, violating the privacy of organizations, and even manipulating everyday users. Detecting these harmful domains is therefore necessary to combat the growing number of network attacks. Machine Learning (ML) models have shown significant results in the detection of malicious domains. However, the “black box” nature of complex ML models obstructs their wide-ranging acceptance in some fields. The emergence of Explainable Artificial Intelligence (XAI) has successfully incorporated interpretability and explainability into complex models. Furthermore, post hoc XAI models enable interpretability without affecting the performance of the models. This study proposes an Explainable Artificial Intelligence (XAI) model to detect malicious domains on a recent dataset containing 45,000 samples of malicious and non-malicious domains. In the current study, several interpretable ML models, such as Decision Tree (DT) and Naïve Bayes (NB), and black box ensemble models, such as Random Forest (RF), Extreme Gradient Boosting (XGB), AdaBoost (AB), and CatBoost (CB), were first implemented, and XGB was found to outperform the other classifiers. Furthermore, the post hoc XAI global surrogate model (Shapley additive explanations) and the local surrogate LIME were used to generate explanations of the XGB predictions. Two sets of experiments were performed: initially the model was executed on the preprocessed dataset, and later on features selected with the Sequential Forward Feature selection algorithm. The results demonstrate that the ML algorithms were able to distinguish benign and malicious domains with overall accuracy ranging from 0.8479 to 0.9856.
The ensemble classifier XGB achieved the highest result, with an AUC of 0.9991 and accuracy of 0.9856 before feature selection, and an AUC of 0.999 and accuracy of 0.9818 after feature selection. The proposed model outperformed the benchmark study.
APA, Harvard, Vancouver, ISO, and other styles
28

Han, Lijing, Jianli Ding, Jinjie Wang, Junyong Zhang, Boqiang Xie, and Jianping Hao. "Monitoring Oasis Cotton Fields Expansion in Arid Zones Using the Google Earth Engine: A Case Study in the Ogan-Kucha River Oasis, Xinjiang, China." Remote Sensing 14, no. 1 (January 4, 2022): 225. http://dx.doi.org/10.3390/rs14010225.

Full text
Abstract:
Rapid and accurate mapping of the spatial distribution of cotton fields helps to ensure the safe production of cotton fields and the rationalization of land-resource planning. As cotton is an important economic pillar in Xinjiang, accurate and efficient mapping of cotton fields supports the implementation of the rural revitalization strategy in the Xinjiang region. In this paper, based on the Google Earth Engine cloud computing platform, we use a random forest machine-learning algorithm to classify Landsat 5 and 8 and Sentinel 2 satellite images to obtain the spatial distribution characteristics of cotton fields in 2011, 2015 and 2020 in the Ogan-Kucha River oasis, Xinjiang. Unlike previous studies, the mulching process was considered when using cotton field phenology information as a classification feature. The results show that the Landsat 5, Landsat 8 and Sentinel 2 satellites can all successfully classify cotton field information when the mulching process is considered, with the Sentinel 2 classification achieving the best user accuracy, 0.947. Sentinel 2 images can distinguish some cotton fields from roads well because they have a higher spatial resolution than Landsat 8. After the cotton fields were mulched, there was a significant increase in spectral reflectance in the visible, red-edge and near-infrared bands, and a decrease in the short-wave infrared band. The increase in the area of oasis cotton fields and the extensive use of mulched drip-irrigation water-saving facilities may lead to a decrease in the groundwater level. Overall, the use of mulch as a phenological feature for classification mapping is a good indicator in cotton-growing areas covered by mulch, and mulch drip irrigation may lead to a decrease in groundwater levels in oases in arid areas.
APA, Harvard, Vancouver, ISO, and other styles
29

Schaafsma, S. J., and J. Duysens. "Neurons in the ventral intraparietal area of awake macaque monkey closely resemble neurons in the dorsal part of the medial superior temporal area in their responses to optic flow patterns." Journal of Neurophysiology 76, no. 6 (December 1, 1996): 4056–68. http://dx.doi.org/10.1152/jn.1996.76.6.4056.

Full text
Abstract:
1. Neurons in the ventral intraparietal area (VIP) are known to respond to translating random dot patterns. Such responses can be explained on the basis of the input of the middle temporal area (MT) to this area. Anatomic evidence has shown that VIP receives input from the dorsal part of the medial superior temporal area (MSTd) also. Neurons in the latter area are thought to be involved in egomotion because they are sensitive to first-order optic flow components such as divergence and rotation. Because of their MT and MSTd input, neurons in VIP may be expected to show sensitivity to such first-order optic flow as well. 2. The question of whether VIP neurons are selective to translation and/or first-order optic flow was investigated quantitatively in two awake monkeys by recording the responses of 52 visually responsive units and by fitting their tuning curves. The responses after presentation of random dot patterns exhibiting either expansion, contraction, clockwise rotation, or anticlockwise rotation were compared with the responses to translation stimuli tested in eight directions. 3. Most VIP neurons showed clear direction-selective responses, particularly to expansion but sometimes also to a combination of components (spiral stimuli). 4. A typical feature of VIP neurons is that their responses to these optic flow components remain when different parts of the receptive field are stimulated separately (“scale invariance”). For the most responsive subfield the response was on average 93% of the whole field response. For all subfields the mean response was on average 64% of the whole field response. 5. To test whether the scale invariance arose from convergence of translation-sensitive subfields with radial or circular direction preferences (“mosaic hypothesis”), the direction selectivity for translating stimuli was tested over these subfields.
Basically the direction selectivity for translation was unchanged in the various subfields, thereby excluding the direction mosaic hypothesis. 6. It is concluded that the receptive field characteristics of VIP are very similar to those of MSTd neurons.
APA, Harvard, Vancouver, ISO, and other styles
30

Dadsena, Ravi, Deboleena Sadhukhan, and Ramakrishnan Swaminathan. "Differentiation of Mild Cognitive Impairment Conditions in MR Images using Fractional order Jacobi Fourier Moment Features." Current Directions in Biomedical Engineering 7, no. 2 (October 1, 2021): 724–27. http://dx.doi.org/10.1515/cdbme-2021-2185.

Full text
Abstract:
Mild Cognitive Impairment (MCI) is the asymptomatic, preclinical transitional stage between aging and Alzheimer’s Disease (AD). Detection of MCI can ensure the timely intervention required to manage the disease’s severity. Morphological alteration of the Lateral Ventricle (LV) is considered a significant biomarker for disease diagnosis. This research aims to analyze the shape alterations of the LV region using Fractional Order Jacobi Fourier Moment (FOJFM) features, which are characterized by their generic nature and their capability for time-frequency analysis. T1-weighted transaxial-view brain MR images (HC = 92 and MCI = 63) are obtained from the publicly available Open Access Series of Imaging Studies (OASIS) database. The LV region is delineated using the Weighted Level Set (WLS) segmentation method and the results are compared to Ground Truth (GT) images. FOJFM features are employed to characterize the morphometry of the LV region. From this segmented region, 200 features are computed by varying the order and fractional parameters. Random Forest (RF) and Support Vector Machine (SVM) classifiers are used to differentiate Healthy Control (HC) and MCI subjects. Results show that WLS is able to delineate the LV structure, and the segmented region shows good correlation with the GT area. FOJFM features are observed to be statistically significant in discriminating HC and MCI subjects with p < 0.05. For MCI subjects, the feature values show higher variation compared with the HC brain, which might be due to the surface expansion of the ventricular area during disease progression. SVM and RF classifiers show high F-measure values of 93.14% and 86.24%, respectively, for differentiating MCI conditions. The proposed moment-based FOJFM features are able to capture the morphological changes of the LV region related to the MCI condition. Hence, the proposed pipeline can be useful for the automated and early diagnosis of diseased conditions.
APA, Harvard, Vancouver, ISO, and other styles
31

Numbisi, Frederick N., Frieke M. B. Van Coillie, and Robert De Wulf. "Delineation of Cocoa Agroforests Using Multiseason Sentinel-1 SAR Images: A Low Grey Level Range Reduces Uncertainties in GLCM Texture-Based Mapping." ISPRS International Journal of Geo-Information 8, no. 4 (April 6, 2019): 179. http://dx.doi.org/10.3390/ijgi8040179.

Full text
Abstract:
Delineating the cropping area of cocoa agroforests is a major challenge in quantifying the contribution of land use expansion to tropical deforestation. Discriminating cocoa agroforests from tropical transition forests using multispectral optical images is difficult due to the similarity of the spectral characteristics of their canopies. Moreover, the frequent cloud cover in the tropics greatly impedes optical sensors. This study evaluated the potential of multiseason Sentinel-1 C-band synthetic aperture radar (SAR) imagery to discriminate cocoa agroforests from transition forests in a heterogeneous landscape in central Cameroon. We used an ensemble classifier, Random Forest (RF), to average the SAR image texture features of a grey level co-occurrence matrix (GLCM) across seasons. We then compared the classification performance with results from RapidEye optical data. Moreover, we assessed the performance of GLCM texture feature extraction at four different grey levels of quantization: 32 bits, 8 bits, 6 bits, and 4 bits. The classification’s overall accuracy (OA) from texture-based maps outperformed that from an optical image. The highest OA (88.8%) was recorded at the 6 bits grey level. This quantization level, in comparison to the initial 32 bits in the SAR images, reduced the class prediction error by 2.9%. The texture-based classification achieved an acceptable accuracy and revealed that cocoa agroforests have considerably fragmented the remnant transition forest patches. The Shannon entropy (H) or uncertainty provided a reliable validation of the class predictions and enabled inferences about discriminating inherently heterogeneous vegetation categories.
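GLCM texture extraction at a reduced grey-level quantization, the key step compared above, can be sketched in a few lines (pure Python; only the contrast statistic is shown, one of several Haralick features):

```python
def quantize(img, levels, max_val=255):
    """Map pixel values in [0, max_val] to [0, levels) grey levels."""
    return [[min(v * levels // (max_val + 1), levels - 1) for v in row]
            for row in img]

def glcm(img, levels, dx=1, dy=0, symmetric=True):
    """Normalised grey level co-occurrence matrix for pixel offset (dx, dy)."""
    P = [[0.0] * levels for _ in range(levels)]
    h, w = len(img), len(img[0])
    for i in range(h):
        for j in range(w):
            ii, jj = i + dy, j + dx
            if 0 <= ii < h and 0 <= jj < w:
                P[img[i][j]][img[ii][jj]] += 1
                if symmetric:
                    P[img[ii][jj]][img[i][j]] += 1
    total = sum(map(sum, P))
    return [[p / total for p in row] for row in P]

def contrast(P):
    """GLCM contrast: sum over (i, j) of (i - j)^2 * P[i][j]."""
    return sum((i - j) ** 2 * p
               for i, row in enumerate(P) for j, p in enumerate(row))
```

Lowering `levels` (e.g. to the 6-bit quantization the study found best) shrinks the co-occurrence matrix and smooths the per-season texture estimates averaged by the Random Forest.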
APA, Harvard, Vancouver, ISO, and other styles
32

Xuejun, Zhao, Wang Mingfang, Wang Jie, Tong Chuangming, and Yuan Xiujiu. "Application Research on Fault Diagnosis of the Filter Unit Based on Intelligent Algorithm of GA and WNN." Open Mechanical Engineering Journal 9, no. 1 (October 7, 2015): 922–26. http://dx.doi.org/10.2174/1874155x01509010922.

Full text
Abstract:
This paper exploits the potential of the genetic algorithm (GA) for adaptive random global search, together with the resolution and fault tolerance of the wavelet neural network (WNN), to build a multi-intelligent-algorithm GA-WNN model for fault diagnosis of the filter unit of an analog circuit. Construction of the GA-WNN model was divided into two stages: in the first stage, the GA was used to optimize the initial weights, thresholds, expansion factors and translation factors of the WNN structure; in the second stage, the global optimal solution was obtained through WNN training and learning. The analog output signal was decomposed by wavelets, and the absolute values of the coefficients of each frequency band sequence were obtained along with the energy characteristics of each band; their combination formed the feature vectors used as the input of the neural network. This preprocessing reduces the neural network input, shrinks the number of neurons in each layer, and increases the convergence speed of the neural network. The experimental results show that the method can diagnose single and multiple soft faults of the circuit with high speed and high precision.
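The wavelet band-energy features described above can be sketched with a Haar decomposition (a stand-in for whichever wavelet the authors used; signal length assumed a power of two):

```python
def haar_step(signal):
    """One level of the orthonormal Haar DWT: (approximation, detail)."""
    half = len(signal) // 2
    a = [(signal[2 * i] + signal[2 * i + 1]) / 2 ** 0.5 for i in range(half)]
    d = [(signal[2 * i] - signal[2 * i + 1]) / 2 ** 0.5 for i in range(half)]
    return a, d

def band_energies(signal, levels):
    """Energy of each detail band plus the final approximation band."""
    energies, a = [], list(signal)
    for _ in range(levels):
        a, d = haar_step(a)
        energies.append(sum(x * x for x in d))
    energies.append(sum(x * x for x in a))
    return energies
```

Because the transform is orthonormal, the band energies sum to the signal energy, so the feature vector is a compact redistribution rather than a lossy summary of the raw signal.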
33

Lin, Zhiyang, Jihua Zhu, Zutao Jiang, Yujie Li, Yaochen Li, and Zhongyu Li. "Merging Grid Maps in Diverse Resolutions by the Context-based Descriptor." ACM Transactions on Internet Technology 21, no. 4 (July 22, 2021): 1–21. http://dx.doi.org/10.1145/3403948.

Full text
Abstract:
Building an accurate map is essential for autonomous robot navigation in environments without GPS. Compared with a single robot, a multiple-robot system performs much better in terms of accuracy, efficiency and robustness for simultaneous localization and mapping (SLAM). As a critical component of multiple-robot SLAM, the problem of map merging remains a challenge. To this end, this article casts it as a point set registration problem and proposes an effective map merging method based on context-based descriptors and correspondence expansion. It first extracts interest points from grid maps with the Harris corner detector. By exploiting the neighborhood information of interest points, it automatically calculates the maximum response radius as scale information to compute the context-based descriptor, which includes eigenvalues and normals computed from the local structure of each interest point. Then, it establishes initial matches of low precision by applying a nearest neighbor search on the context-based descriptor. Further, it designs a scale-based correspondence expansion strategy to expand each initial match into a set of feature matches, from which one similarity transformation between two grid maps can be estimated by the Random Sample Consensus (RANSAC) algorithm. Subsequently, a measure function formulated from the trimmed mean square error is utilized to confirm the best similarity transformation and accomplish the coarse map merging. Finally, it utilizes the scaling trimmed iterative closest point algorithm to refine the initial similarity transformation so as to achieve accurate merging. As the proposed method considers scale information in the context-based descriptor, it is able to merge grid maps of diverse resolutions. Experimental results on real robot datasets demonstrate its superior accuracy and robustness over other related methods.
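The RANSAC step described in the abstract (estimating one similarity transformation between two grid maps from candidate matches) can be sketched in 2-D using complex numbers; this is a generic illustration, not the authors' implementation, and all names and the complex-number parameterization are assumptions.

```python
import numpy as np

def ransac_similarity(src, dst, iters=500, tol=0.5, rng=None):
    """Estimate a 2-D similarity transform dst ~ alpha*src + beta, with points
    encoded as complex numbers and alpha = scale * e^{i*rotation}, via RANSAC.
    Returns (alpha, beta, inlier_count)."""
    rng = np.random.default_rng(rng)
    z = src[:, 0] + 1j * src[:, 1]
    w = dst[:, 0] + 1j * dst[:, 1]
    best = (1.0 + 0j, 0j, -1)
    for _ in range(iters):
        i, j = rng.choice(len(z), size=2, replace=False)
        if z[i] == z[j]:
            continue                              # degenerate sample
        alpha = (w[j] - w[i]) / (z[j] - z[i])     # scale + rotation
        beta = w[i] - alpha * z[i]                # translation
        count = int(np.sum(np.abs(alpha * z + beta - w) < tol))
        if count > best[2]:
            best = (alpha, beta, count)
    return best
```

In the paper's pipeline the sampled pairs would come from the expanded feature matches rather than uniformly at random, and the resulting transform would then be refined by the scaling trimmed ICP stage.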
34

Ou, Cong, Jianyu Yang, Zhenrong Du, Tingting Zhang, Bowen Niu, Quanlong Feng, Yiming Liu, and Dehai Zhu. "Landsat-Derived Annual Maps of Agricultural Greenhouse in Shandong Province, China from 1989 to 2018." Remote Sensing 13, no. 23 (November 28, 2021): 4830. http://dx.doi.org/10.3390/rs13234830.

Full text
Abstract:
Agricultural greenhouse (AG), one of the fastest-growing technology-based approaches worldwide for controlling the environmental conditions of crops, plays an essential role in food production, resource conservation and the rural economy, but has also caused environmental and socio-economic problems due to policy promotion and market demand. Therefore, long-term monitoring of AG is of utmost importance for the sustainable management of protected agriculture, and previous efforts have verified the effectiveness of remote sensing-based techniques for mono-temporal AG mapping in relatively small areas. However, a continuous annual AG remote sensing-based dataset at large scale is currently unavailable. In this study, an annual AG mapping method oriented to a provincial area and a long-term period was developed to produce the first Landsat-derived annual AG dataset for Shandong province, China from 1989 to 2018 on the Google Earth Engine (GEE) platform. The mapping window for each year was selected based on vegetation growth and phenological information, which was critical in distinguishing AG from other easily misclassified categories. Classification for each year was initially carried out with the random forest classifier after feature optimization. A temporal consistency correction algorithm based on classification probability was then applied to the classified AG maps for further improvement. Finally, the average User’s Accuracy, Producer’s Accuracy and F1-score of AG based on visually-interpreted samples over 30 years reached 96.56%, 86.64% and 0.911, respectively. Furthermore, we also found that ranking features by the importance of each tested feature resulted in the highest accuracy and the strongest stability in the initial classification stage, and the proposed temporal consistency correction algorithm improved the final products by approximately five percent on average.
In general, the resultant AG sequence dataset from our study has revealed the expansion of this typical object of “Human–Nature” interaction in agriculture and has a potential application in use of greenhouse-related technology and the scientific planning of protected agriculture.
35

Abraham-Ekeroth, Ricardo Martín. "Enhanced deep-tissue photoacoustics by using microcomposites made of radiofrequency metamaterials and soft polymers: Double- and triple-resonance phenomena." Journal of Applied Physics 132, no. 8 (August 28, 2022): 083103. http://dx.doi.org/10.1063/5.0086553.

Full text
Abstract:
Photoacoustic imaging systems offer a high-resolution platform to explore body tissues, food, and artwork. Plasmonics, in turn, constitutes a source of resonant heating and thermal expansion to generate acoustic waves. However, its associated techniques are seriously limited by laser penetration and nonspecific hyperthermia in the sample. To address this issue, the present work adopts a paradigm shift in photoacoustics. By simulating microparticles made of random composites, the calculated pressure can be made similar or superior to that calculated via plasmonic optoacoustics. The improvement is due to a phenomenon called double or triple resonance: the excitation of one or both of the electric and magnetic plasmons within the radiofrequency range and the simultaneous excitation of the particle’s acoustic mode. Given that electromagnetic pulses are restricted to nanosecond pulse widths and MHz frequencies, the proposed method overcomes the poor penetration in tissues and reduces thermal damage, thereby offering a noninvasive theragnosis technique. Moreover, the resonant pressure obtained lasts longer than conventional photoacoustic pressure, providing a central feature to enhance detection. To fully comprehend the multi-resonance framework, we develop a complete photoacoustic solution. The proposed approach could pave the way to thermoacoustic imaging and manipulation methods for sensitive materials and tissues with micrometer resolution.
36

Buser, Othmar, and Perry Bartelt. "An energy-based method to calculate streamwise density variations in snow avalanches." Journal of Glaciology 61, no. 227 (2015): 563–75. http://dx.doi.org/10.3189/2015jog14j054.

Full text
Abstract:
Snow avalanches are gravity-driven flows consisting of hard snow/ice particles. Depending on the snow quality, particularly temperature, avalanches exhibit different flow regimes, varying from dense flowing avalanches to highly disperse, mixed flowing-powder avalanches. In this paper we investigate how particle interactions lead to streamwise density variations, and therefore an understanding of why avalanches exhibit different flow types. A basic feature of our model is to distinguish between the velocity of the avalanche in the mean, downslope direction and the velocity fluctuations around the mean, associated with random particle movements. The mechanical energy associated with the velocity fluctuations is not entirely kinetic, as particle movements in the slope-perpendicular direction are inhibited by the hard boundary at the bottom giving rise to a change in flow height and therefore change in flow density. However, this volume expansion cannot occur without raising the center of mass of the particle ensemble, i.e. an acceleration, which, in turn, exerts a pressure on the bottom, the so-called dispersive pressure. As soon as the volume no longer expands, the dispersive pressure vanishes and the pressure returns to the hydrostatic pressure. Different streamwise density distributions, and therefore different avalanche flow regimes, are possible.
37

Lasseux, Didier, Francisco J. Valdés Parada, and Mark L. Porter. "An improved macroscale model for gas slip flow in porous media." Journal of Fluid Mechanics 805 (September 16, 2016): 118–46. http://dx.doi.org/10.1017/jfm.2016.562.

Full text
Abstract:
We report on a refined macroscopic model for slightly compressible gas slip flow in porous media developed by upscaling the pore-scale boundary value problem. The macroscopic model is validated by comparisons with an analytic solution on a two-dimensional (2-D) ordered model structure and with direct numerical simulations on random microscale structures. The symmetry properties of the apparent slip-corrected permeability tensor in the macroscale momentum equation are analysed. Slip correction at the macroscopic scale is more accurately described if an expansion in the Knudsen number, beyond the first order considered so far, is employed at the closure level. Corrective terms beyond the first order are a signature of the curvature of solid–fluid interfaces at the pore scale that is incompletely captured by the classical first-order correction at the macroscale. With this expansion, the apparent slip-corrected permeability is shown to be the sum of the classical intrinsic permeability tensor and tensorial slip corrections at the successive orders of the Knudsen number. All the tensorial effective coefficients can be determined from intrinsic and coupled but easy-to-solve closure problems. It is further shown that the complete form of the slip boundary condition at the microscale must be considered and an important general feature of this slip condition at the different orders in the Knudsen number is highlighted. It justifies the importance of slip-flow correction terms beyond the first order in the Knudsen number in the macroscopic model and sheds more light on the physics of slip flow in the general case, especially for large porosity values. Nevertheless, this new nonlinear dependence of the apparent permeability with the Knudsen number should be further verified experimentally.
38

Hidalgo-Soria, M., E. Barkai, and S. Burov. "Cusp of Non-Gaussian Density of Particles for a Diffusing Diffusivity Model." Entropy 23, no. 2 (February 17, 2021): 231. http://dx.doi.org/10.3390/e23020231.

Full text
Abstract:
We study a two state “jumping diffusivity” model for a Brownian process alternating between two different diffusion constants, D+>D−, with random waiting times in both states whose distribution is rather general. In the limit of long measurement times, Gaussian behavior with an effective diffusion coefficient is recovered. We show that, for equilibrium initial conditions and when the limit of the diffusion coefficient D−⟶0 is taken, the short time behavior leads to a cusp, namely a non-analytical behavior, in the distribution of the displacements P(x,t) for x⟶0. Visually this cusp, or tent-like shape, resembles similar behavior found in many experiments of diffusing particles in disordered environments, such as glassy systems and intracellular media. This general result depends only on the existence of finite mean values of the waiting times at the different states of the model. Gaussian statistics in the long time limit is achieved due to ergodicity and convergence of the distribution of the temporal occupation fraction in state D+ to a δ-function. The short time behavior of the same quantity converges to a uniform distribution, which leads to the non-analyticity in P(x,t). We demonstrate that the super-statistical framework is a zeroth-order short-time expansion of P(x,t) in the number of transitions, which does not yield the cusp-like shape. The latter, considered the key feature of experiments in the field, is recovered with the first correction in perturbation theory.
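A quick Monte-Carlo sketch of the two-state jumping-diffusivity model (an illustration with assumed parameter names; equal mean exponential waiting times in both states and a symmetric equilibrium start) reproduces the setup: at long times the displacement statistics are Gaussian with the effective diffusion coefficient, while with D−⟶0 the particles that never leave the slow state pile up near x=0, the extreme of the cusp described above.

```python
import numpy as np

def displacement_samples(n, t, d_plus=1.0, d_minus=0.0, tau=1.0, rng=None):
    """Sample n displacements of a two-state 'jumping diffusivity' process:
    the particle alternates between diffusivities d_plus and d_minus, with
    exponential waiting times of mean tau in each state."""
    rng = np.random.default_rng(rng)
    x = np.zeros(n)
    for k in range(n):
        remaining = t
        state = rng.integers(2)   # symmetric equilibrium initial condition
        t_plus = 0.0              # occupation time in the d_plus state
        while remaining > 0:
            dwell = min(rng.exponential(tau), remaining)
            if state == 1:
                t_plus += dwell
            remaining -= dwell
            state ^= 1
        # conditionally Gaussian displacement given the occupation time
        var = 2.0 * (d_plus * t_plus + d_minus * (t - t_plus))
        x[k] = rng.normal(0.0, np.sqrt(var)) if var > 0 else 0.0
    return x
```

For t much larger than tau the sample variance approaches 2*D_eff*t with D_eff = (d_plus + d_minus)/2, matching the effective-diffusion limit stated in the abstract.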
39

Babikir Adam, Edriss Eisa, and Sathesh A. "Construction of Accurate Crack Identification on Concrete Structure using Hybrid Deep Learning Approach." Journal of Innovative Image Processing 3, no. 2 (June 19, 2021): 85–99. http://dx.doi.org/10.36548/jiip.2021.2.002.

Full text
Abstract:
In general, several conventional techniques are available for detecting cracks in concrete bridges, but they have significant limitations, including low accuracy and efficiency. With the expansion of neural network methods, purely digital-image-processing-based crack identification has recently been outperformed. Many single-classifier approaches detect cracks with high accuracy, but such classifiers do not account for random fluctuations in the training dataset, which appears in the final output as over-fitting: although the model contains enough parameters to fit the training data, it fails on the residual variation. Such residual variations are frequent in UAV-recorded photos as well as many camera images. To address this challenge, a noise reduction technique is utilized along with an SVM classifier to reduce classification error. The proposed technique performs classification via the SVM approach, while feature extraction and network training are implemented using the CNN method. The captured digital images are processed by incorporating a bending test on reinforced concrete beams. Moreover, the proposed method determines crack widths by applying binary conversion to the captured images. The proposed model outperforms conventional techniques, single-classifier approaches, and image-segmentation-based methods in terms of accuracy. The obtained results prove that the proposed hybrid method is more accurate and suitable for crack detection in concrete bridges, especially in unmanned environments.
40

Hogland, John, and David L. R. Affleck. "Improving Estimates of Natural Resources Using Model-Based Estimators: Impacts of Sample Design, Estimation Technique, and Strengths of Association." Remote Sensing 13, no. 19 (September 29, 2021): 3893. http://dx.doi.org/10.3390/rs13193893.

Full text
Abstract:
Natural resource managers need accurate depictions of existing resources to make informed decisions. The classical approach to describing resources for a given area in a quantitative manner uses probabilistic sampling and design-based inference to estimate population parameters. While probabilistic designs are accepted as being necessary for design-based inference, many recent studies have adopted non-probabilistic designs that do not include elements of random selection or balance and have relied on models to justify inferences. While common, model-based inference alone assumes that a given model accurately depicts the relationship between response and predictors across all populations. Within complex systems, this assumption can be difficult to justify. Alternatively, models can be trained to a given population by adopting design-based principles such as balance and spread. Through simulation, we compare estimates of population totals and pixel-level values using linear and nonlinear model-based estimators for multiple sample designs that balance and spread sample units. The findings indicate that model-based estimators derived from samples spread and balanced across predictor variable space reduce the variability of population and unit-level estimators. Moreover, if samples achieve approximate balance over feature space, then model-based estimates of population totals approached simple expansion-based estimates of totals. Finally, in all comparisons made, improvements in estimation were achieved using model-based estimation over design-based estimation alone. Our simulations suggest that samples drawn from a probabilistic design, that are spread and balanced across predictor variable space, improve estimation accuracy.
41

Gleichauf, Daniel, Felix Oehme, Michael Sorg, and Andreas Fischer. "Laminar-Turbulent Transition Localization in Thermographic Flow Visualization by Means of Principal Component Analysis." Applied Sciences 11, no. 12 (June 12, 2021): 5471. http://dx.doi.org/10.3390/app11125471.

Full text
Abstract:
Thermographic flow visualization is a contactless, non-invasive technique to visualize the boundary layer flow on wind turbine rotor blades, to assess the aerodynamic condition and consequently the efficiency of the entire wind turbine. In applications on wind turbines in operation, the distinguishability between the laminar and turbulent flow regime cannot be easily increased artificially and solely depends on the energy input from the sun. State-of-the-art image processing methods are able to increase the contrast slightly but are not able to reduce systematic gradients in the image or need excessive a priori knowledge. In order to cope with a low-contrast measurement condition and to increase the distinguishability between the flow regimes, an enhanced image processing by means of the feature extraction method, principal component analysis, is introduced. The image processing is applied to an image series of thermographic flow visualizations of a steady flow situation in a wind tunnel experiment on a cylinder and DU96W180 airfoil measurement object without artificially increasing the thermal contrast between the flow regimes. The resulting feature images, based on the temporal temperature fluctuations in the images, are evaluated with regard to the global distinguishability between the laminar and turbulent flow regime as well as the achievable measurement error of an automatic localization of the local flow transition between the flow regimes. By applying the principal component analysis, systematic temperature gradients within the flow regimes as well as image artefacts such as reflections are reduced, leading to an increased contrast-to-noise ratio by a factor of 7.5. Additionally, the gradient between the laminar and turbulent flow regime is increased, leading to a minimal measurement error of the laminar-turbulent transition localization. The systematic error was reduced by 4% and the random error by 5.3% of the chord length. 
As a result, the principal component analysis is proven to be a valuable complementary tool to the classical image processing method in flow visualizations. After noise-reducing methods such as the temporal averaging and subsequent assessment of the spatial expansion of the boundary layer flow surface, the PCA is able to increase the laminar-turbulent flow regime distinguishability and reduce the systematic and random error of the flow transition localization in applications where no artificial increase in the contrast is possible. The enhancement of contrast increases the independence from the amount of solar energy input required for a flow evaluation, and the reduced errors of the flow transition localization enables a more precise assessment of the aerodynamic condition of the rotor blade.
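The core PCA step in this kind of processing, treating each pixel's temporal temperature fluctuation as a variable and extracting the dominant spatial modes, can be sketched with an SVD. This is a generic illustration, not the authors' pipeline; the function name and the frame layout are assumptions.

```python
import numpy as np

def pca_feature_images(frames, k=3):
    """frames: (t, h, w) thermographic image series. Returns k spatial
    principal-component 'feature images' of the temporal temperature
    fluctuations, plus all singular values (variance ordering)."""
    t, h, w = frames.shape
    X = frames.reshape(t, h * w).astype(float)
    X = X - X.mean(axis=0)   # keep only the temporal fluctuation per pixel
    # economy SVD: rows of Vt are orthonormal spatial modes, strongest first
    _, S, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:k].reshape(k, h, w), S
```

Because static patterns (systematic temperature gradients, constant reflections) are removed by the per-pixel mean subtraction, the leading feature images emphasize regions whose temperature actually fluctuates, which is what improves the laminar/turbulent contrast reported above.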
42

Liu, Yaqun, and Jieyong Wang. "Revealing Annual Crop Type Distribution and Spatiotemporal Changes in Northeast China Based on Google Earth Engine." Remote Sensing 14, no. 16 (August 19, 2022): 4056. http://dx.doi.org/10.3390/rs14164056.

Full text
Abstract:
Northeast China (NEC) produces 1/4 of the grain and 1/3 of the commercial grain in China, and is essential for food security and a sustainable socio-ecological system development. However, long-term annual crop type distribution in this vital area remains largely unknown, compromising the scientific basis for planting structure adjustment and sustainable agriculture management. To this end, we integrated 111-dimensional MOD09A1 features, feature optimization and random forest algorithms on the Google Earth Engine (GEE) platform to classify annual crop types in the NEC during 2000–2020, and adopted multi-source spatial data and geostatistical methods to reveal anthropogenic and natural characteristics of crop type changes. The results demonstrated that sample-based classification accuracies were 84.73–86.93% and statistics-based R2 were 0.81–0.95. From 2000–2020, the sowing area of maize and rice increased by 11.92 × 106 ha (111.05%) and 4.03 × 106 ha (149.28%), whereas that of soybean and other crops decreased by 13.73 × 106 ha (−64.10%) and 1.03 × 106 ha (−50.94%), respectively. Spatially, maize expanded northwestward, rice expanded northeastward, and soybean demonstrated a south-north shrinkage. The soybean-to-maize shift was the main conversion type, and its area largely reduced from 8.68 × 106 ha in 2000–2010 to 4.15 × 106 ha in 2010–2020. Comparative economic benefit and climate change jointly affected crop types in the NEC. Higher-benefit maize and rice were mainly planted in more convenient areas with larger populations, closer to settlements, roads and waterways. The planting of maize and rice required higher temperature and precipitation, and climate change in the NEC provided favorable conditions for their expansion toward high-latitude areas. The crop type changes in the NEC have boosted economic benefits, but increased water–carbon–energy costs.
Thus, effective measures such as subsidy policies, ecological compensation, and knowledge-exchange should be implemented to aid crop type and rotation adjustment and ensure food-ecological security.
43

Ma, Jing, Zheng Li, Bin Wang, Shunzhao Sui, and Mingyang Li. "Cloning of an Expansin Gene from Chimonanthus praecox Flowers and Its Expression in Flowers Treated with Ethephon or 1-Methylcyclopropene." HortScience 47, no. 10 (October 2012): 1472–77. http://dx.doi.org/10.21273/hortsci.47.10.1472.

Full text
Abstract:
Expansins are extracellular proteins that are involved in cell wall modifications such as cell wall disassembly, cell separation, and cell expansion. Little is known about expansin gene expression during flower development of wintersweet (Chimonanthus praecox). In the present study, an expansin gene, CpEXP1, was isolated from the wintersweet flower cDNA library through random sequencing; this gene encodes a putative protein of 257 amino acids with the essential features conserved, as in other alpha-expansins. The CpEXP1 gene exhibited different transcription levels in different tissues, with significantly higher expression in flowers than in other tissues, and is strongly correlated with the development of the flower. The expression of CpEXP1 increased in the flower buds or whole flowers from Stage 1 to 4 and decreased from Stage 5 to 6 during natural opening. Ethephon (an ethylene releaser) treatment promoted cut flower senescence, whereas 1-methylcyclopropene (1-MCP) (an ethylene perception inhibitor) delayed the process of flower wilting. This result is associated with concomitantly lower transcript levels of CpEXP1 in the ethephon-treated samples, as well as the steady expression in the 1-MCP-treated samples, compared with control flowers. These studies show that the expression of the expansin gene CpEXP1 is correlated with the development of Chimonanthus praecox flowers: upregulation during flower opening versus downregulation during senescence.
44

Huisingh-Scheetz, Megan, and Jennifer Schrack. "Emerging Biotechnology Markers of Cognitive Impairment." Innovation in Aging 5, Supplement_1 (December 1, 2021): 443. http://dx.doi.org/10.1093/geroni/igab046.1719.

Full text
Abstract:
Abstract The early detection of cognitive impairment is among the National Institute on Aging’s (NIA) current research priorities. Sensor-based technologies have exploded in recent years allowing remote, continuous measurement of older adults’ free-living activity. This highly granular data has stimulated exciting new research exploring how change in health can be detected remotely using novel “biotechnology” markers. Yet, this area of research is in its infancy as it relates to predicting cognitive function. This symposium will provide an overview of the sensor-cognition research landscape and will feature 5 new studies exploring the relationship between biotechnology markers and cognitive function, each with unique sensors, cognitive measures and samples. The first three presentations will report associations between accelerometry-based activity measures (chest or wrist devices) and cognitive function (assessed by diagnosis, a neurocognitive assessment, or microstructural changes on DTI) in the Baltimore Longitudinal Study on Aging, a large, NIA-funded epidemiologic dataset. The fourth presentation will report the significance of free-living hip accelerometry activity measures beyond clinically-available information in a random forest prediction model of 1-year change in Montreal Cognitive Assessment scores among urban, predominantly African-American older adults without moderate-severe dementia residing in the community. The final presentation will report associations between room-to-room transitions as detected by in-home, infrared motion sensors and mild cognitive impairment using data from a community-dwelling sample of older adults residing alone. This symposium will provide a substantial expansion of current knowledge in this research space and will be relevant to clinicians or researchers with an interest in sensor technology or dementia.
45

Johnson, Sarah A., Spencer L. Seale, Rachel M. Gittelman, Julie A. Rytlewski, Harlan S. Robins, and Paul A. Fields. "Impact of HLA type, age and chronic viral infection on peripheral T-cell receptor sharing between unrelated individuals." PLOS ONE 16, no. 8 (August 30, 2021): e0249484. http://dx.doi.org/10.1371/journal.pone.0249484.

Full text
Abstract:
The human adaptive immune system must generate extraordinary diversity to be able to respond to all possible pathogens. The T-cell repertoire derives this high diversity through somatic recombination of the T-cell receptor (TCR) locus, a random process that results in repertoires that are largely private to each individual. However, factors such as thymic selection and T-cell proliferation upon antigen exposure can affect TCR sharing among individuals. By immunosequencing the TCRβ variable region of 426 healthy individuals, we find that, on average, fewer than 1% of TCRβ clones are shared between individuals, consistent with largely private TCRβ repertoires. However, we detect a significant correlation between increased HLA allele sharing and increased number of shared TCRβ clones, with each additional shared HLA allele contributing to an increase in ~0.01% of the total shared TCRβ clones, supporting a key role for HLA type in shaping the immune repertoire. Surprisingly, we find that shared antigen exposure to CMV leads to fewer shared TCRβ clones, even after controlling for HLA, indicative of a largely private response to major viral antigenic exposure. Consistent with this hypothesis, we find that increased age is correlated with decreased overall TCRβ clone sharing, indicating that the pattern of private TCRβ clonal expansion is a general feature of the T-cell response to other infectious antigens as well. However, increased age also correlates with increased sharing among the lowest frequency clones, consistent with decreased repertoire diversity in older individuals. Together, all of these factors contribute to shaping the TCRβ repertoire, and understanding their interplay has important implications for the use of T cells for therapeutics and diagnostics.
46

Liu, Yang, Huaiqing Zhang, Zeyu Cui, Kexin Lei, Yuanqing Zuo, Jiansen Wang, Xingtao Hu, and Hanqing Qiu. "Very High Resolution Images and Superpixel-Enhanced Deep Neural Forest Promote Urban Tree Canopy Detection." Remote Sensing 15, no. 2 (January 15, 2023): 519. http://dx.doi.org/10.3390/rs15020519.

Full text
Abstract:
Urban tree canopy (UTC) area is an important index for evaluating the urban ecological environment, and very high resolution (VHR) images are essential for improving urban tree canopy survey efficiency. However, traditional image classification methods often show low robustness when extracting complex objects from VHR images, with insufficient feature learning, blurred object edges and noise. Our objective was to develop a repeatable method, superpixel-enhanced deep neural forests (SDNF), to detect the UTC distribution from VHR images. Eight data expansion methods were used to construct the UTC training sample sets, four sample-size gradients were set to test the optimal sample size for the SDNF method, and the training duration with the fastest model convergence and lowest time consumption was selected. The accuracy of SDNF was tested by three indexes: F1 score (F1), intersection over union (IoU) and overall accuracy (OA). To benchmark the detection accuracy of SDNF, a random forest (RF) model was used as a synchronized control experiment. Compared with the RF model, SDNF always performed better in OA under the same training sample size. SDNF required more epochs than RF, converging at epoch 200 and 160, respectively. At convergence, the training accuracies of SDNF and RF were 95.16% and 83.16%, and the verification accuracies were 94.87% and 87.73%, respectively. The OA of SDNF reached 89.00%, a 10.00% improvement over the RF model. This study proves the effectiveness of SDNF in UTC detection based on VHR images. It can provide a more accurate solution for UTC detection in urban environmental monitoring, urban forest resource surveys, and national forest city assessment.
47

Hansen, Matthew C., Peter V. Potapov, Amy H. Pickens, Alexandra Tyukavina, Andres Hernandez-Serna, Viviana Zalles, Svetlana Turubanova, et al. "Global land use extent and dispersion within natural land cover using Landsat data." Environmental Research Letters 17, no. 3 (March 1, 2022): 034050. http://dx.doi.org/10.1088/1748-9326/ac46ec.

Full text
Abstract:
Abstract The conversion of natural land cover into human-dominated land use systems has significant impacts on the environment. Global mapping and monitoring of human-dominated land use extent via satellites provides an empirical basis for assessing land use pressures. Here, we present a novel 2019 global land cover, land use, and ecozone map derived from Landsat satellite imagery and topographical data using derived image feature spaces and algorithms suited per theme. From the map, we estimate the spatial extent and dispersion of land use disaggregated by climate domain and ecozone, where dispersion is the mean distance of land use to all land within a subregion. We find that percent of area under land use and distance to land use follow a power law that depicts an increasingly random spatial distribution of land use as it extends across lands of comparable development potential. For highly developed climate/ecozones, such as temperate and sub-tropical terra firma vegetation on low slopes, area under land use is contiguous and remnant natural land cover have low areal extent and high fragmentation. The tropics generally have the greatest potential for land use expansion, particularly in South America. An exception is Asian humid tropical terra firma vegetated lowland, which has land use intensities comparable to that of temperate breadbaskets such as the United States’ corn belt. Wetland extent is inversely proportional to land use extent within climate domains, indicating historical wetland loss for temperate, sub-tropical, and dry tropical biomes. Results highlight the need for planning efforts to preserve natural systems and associated ecosystem services. The demonstrated methods will be implemented operationally in quantifying global land change, enabling a monitoring framework for systematic assessments of the appropriation and restoration of natural land cover.
APA, Harvard, Vancouver, ISO, and other styles
48

Kysliuk, Larysa. "Uzual and occasional word-formation in different style language practice." Ukrainska mova, no. 2 (2020): 31–44. http://dx.doi.org/10.15407/ukrmova2020.02.031.

Full text
Abstract:
The article considers the interaction of the usual and the occasional in language practice through the transition of a textual word into the language system. The stylistic features of occasional derivatives functioning in journalistic, artistic, and scientific texts are analyzed. A transitional zone in the functioning of usual and occasional units has been identified in the texts of mass media and fiction, which can be defined either as an expansion of the derivation base of the model or as a repeated violation of the compatibility of components in the word-forming model. Such derivative units can be considered both usual and occasional. The factor of time and the spread of a phenomenon in the community influence the transition of some occasionalisms to the status of neologisms (комп’ютеризація). The peculiarities of the creation and functioning of occasional derivatives in journalistic, artistic, and scientific texts are analyzed. Active models in the media create occasionalisms that are important for the linguistic picture of society, albeit for a very short time, because they relate to a particular event or situation (ректоропад, банкопад, нафтопад). In fiction and poetry, the naming of a certain phenomenon is important primarily for the author (всебезодня, всемежа in Vasyl Stus). Therefore, a word-forming model that is a defining marker, a characteristic feature of one author’s creative work, is not so important in the set of texts of the same style and is often a random, peripheral phenomenon for the usus. The occasional creation of terms in informatics, IT, and the socio-political sphere renews the tradition of “forged words” («кованих слів») in the Ukrainian language. A modern author’s innovation can spread in the usus, become a terminoid, and stand a chance of entering terminological dictionaries (моментократія, застосунок). However, a term consciously created on an original basis (шкул, дошкуляч) often has no chance of becoming usualized, as opposed to a widespread internationalism (спам), owing to the broad meaning of the producing verb (дошкуляти). Keywords: word-forming model, occasional word-formation, occasional word, potential word, “forged” word.
APA, Harvard, Vancouver, ISO, and other styles
49

Santaguida, Marianne, Koen Schepers, Bryan King, and Emmanuelle Passegue. "Deciphering JunB Function in Regulating Hematopoietic Stem Cell Functions." Blood 110, no. 11 (November 16, 2007): 777. http://dx.doi.org/10.1182/blood.v110.11.777.777.

Full text
Abstract:
The AP-1 transcription factor JunB plays a key role in controlling normal homeostasis of the hematopoietic stem cell (HSC) compartment and acts as a tumor suppressor in mice. We have previously shown that inactivating JunB expression in HSC causes an aberrant stem cell expansion leading to myeloproliferative disorder (MPD) development and leukemia progression. JunB-deficient HSC are the leukemia-initiating (leukemic) stem cells (LSC) responsible for the initiation and maintenance of this disease. We have now investigated the mechanisms by which loss of JunB alters HSC properties. We found that loss of JunB severely impaired HSC transplantability and cell cycle regulation. Limiting dilution transplantation experiments with purified HSC (Lin−/c-Kit+/Sca-1+/Flk2− cells) revealed that junB-deficient HSC are extremely poor at providing engraftment on a cell-by-cell basis. Competitive bone marrow transplantation experiments confirmed that junB-deficient cells are on average 50% less efficient than normal cells in providing engraftment, although in every case where they did engraft, they mediated multilineage reconstitution followed by myeloid expansion and MPD development. JunB-deficient HSC also displayed rapid exhaustion of their self-renewal activity following serial transplantation. At the cellular level, the absence of JunB profoundly deregulated HSC proliferation and cell cycle distribution. Direct analysis of junB-deficient HSC revealed a striking decrease in the number of quiescent G0 cells (30% vs. 70% in normal HSC) and a correlative increase in the number of cycling cells. Quantitative RT-PCR analysis of junB-deficient HSC indicated a global decrease in the expression level of early G1 cyclins and almost all cyclin-dependent kinase inhibitors, associated with an increase in the expression level of late G1 and S-G2/M cyclins. These results provide a molecular understanding of junB-deficient HSC proliferation and strongly suggest that faster trafficking through the cell cycle is a central feature of their leukemic behavior. We also found that loss of JunB severely impaired the HSC migratory response, with junB-deficient HSC displaying increased random diffusion and a severely blunted response to the chemoattractant SDF1α. In fact, flow cytometry, qRT-PCR, and microarray analyses revealed that several molecules involved in cell adhesion/migration (including CXCR4, LFA-1, and VLA-4) are deregulated in junB-deficient HSC. Finally, using short-term in vivo homing assays and intrafemoral injections, we established that the engraftment defect exhibited by junB-deficient HSC is not due to a defect in their ability to home to the bone marrow (BM) cavity but to their defective ability to respond to, and be maintained in, this microenvironment. We are now studying how the localization of junB-deficient HSC and their interaction with cellular and molecular components of the BM niches contribute to their loss of quiescence and leukemic expansion. Taken together, these results provide a cellular and molecular understanding of how JunB controls the homeostasis of the stem cell compartment by coordinating stem cell maintenance with stem cell migration. They raise the exciting possibility that interfering with the deregulated cell cycle and altered maintenance of LSC in the BM microenvironment could lead to their specific eradication without impacting normal HSC function.
APA, Harvard, Vancouver, ISO, and other styles
50

Markandey, M., A. Bajaj, S. K. Vuyyuru, S. Mohta, M. Singh, M. Verma, S. Kumar, et al. "P709 Distinct Pattern of Gut Microbial Dysbiosis in Crohn’s Disease and Intestinal Tuberculosis - A Machine Learning-based classification model." Journal of Crohn's and Colitis 16, Supplement_1 (January 1, 2022): i606—i607. http://dx.doi.org/10.1093/ecco-jcc/jjab232.830.

Full text
Abstract:
Background: Crohn’s disease (CD) and intestinal tuberculosis (ITB) are chronic granulomatous inflammatory disorders characterized by a compromised mucosal immunity. Even with diverging etiologies, CD and ITB present an uncanny resemblance in clinical manifestation, resulting in a diagnostic dilemma. The gut microbiota regulates a myriad of gut mucosal immunological processes. The present study aims to decipher the gut microbial dysbiosis in the two disorders and to utilize the CD- and ITB-specific gut dysbiosis to construct a machine learning (ML)-based predictive model, which can aid in their differential diagnosis. Methods: Fecal samples from healthy controls (n=12) and from patients with CD (n=23) and ITB (n=25) were subjected to 16S (V3-V4) amplicon sequencing. Processing of raw reads, construction of ASV feature tables, diversity and core microbiome analyses, and ML classifier construction were done using QIIME2-2021.4. Differential abundance analysis (DAA) between the groups was carried out using DESeq2, after adjusting for subject-specific confounders. Results: The α and β diversity indices in the CD and ITB groups were significantly reduced compared with the HC group (p = 0.011 and 0.012, respectively), with no significant differences between the two diseases (Fig. 1A, 1B). Compared with HC, the CD and ITB groups showed a reduction in members of Firmicutes and Bacteroidetes, with an enrichment of Actinobacteria and Proteobacteria (Fig. 1C and 1D). DAA (FDR q < 0.1, FC > 2.5) between the CD and ITB groups revealed an expansion of Succinivibrio dextrinisolvens, Odoribacter splanchnicus, Megasphaera massiliensis, Bacteroides uniformis, and B. xylanisolvens in the CD group, while Clostridium sp., Haemophilus parainfluenzae, and Bifidobacterium sp. were elevated in ITB (Fig. 2A). A Random Forest-based ML model, constructed from the raw microbiome reads and trained on 80% of the samples, showed a predictive accuracy of 0.78 (AUC = 93%) (Fig. 2B). Conclusion: Our study shows that CD and ITB witness significant changes in gut microbial structure. With no significant differences in microbial diversity between the two diseases, the signature of gut dysbiosis is distinct between CD and ITB. Exploiting these differences to construct ML models can support the differential diagnosis of CD and ITB.
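The abstract's classifier was built with QIIME2's Random Forest machinery; the core idea (bootstrap-aggregated trees voting on taxa abundances, with an 80/20 train/test split) can be illustrated with the toy sketch below, which uses decision stumps in place of full trees. All data, sizes, and names are synthetic assumptions, not the study's pipeline:

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_stump(X, y):
    """Pick the single feature/threshold split with the best training accuracy."""
    best_acc, best = -1.0, None
    for j in range(X.shape[1]):
        for t in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            for polarity in (0, 1):                 # which side of the split is class 1
                pred = (X[:, j] > t).astype(int) ^ polarity
                acc = (pred == y).mean()
                if acc > best_acc:
                    best_acc, best = acc, (j, t, polarity)
    return best

def predict_stump(stump, X):
    j, t, polarity = stump
    return (X[:, j] > t).astype(int) ^ polarity

def forest_fit(X, y, n_trees=25):
    """Bagging: each 'tree' (a stump here) sees a bootstrap sample and a random feature subset."""
    forest, n_feats = [], max(1, int(np.sqrt(X.shape[1])))
    for _ in range(n_trees):
        rows = rng.integers(0, len(X), len(X))                  # bootstrap resample
        feats = rng.choice(X.shape[1], n_feats, replace=False)  # random feature subset
        forest.append((feats, fit_stump(X[rows][:, feats], y[rows])))
    return forest

def forest_predict(forest, X):
    votes = np.stack([predict_stump(stump, X[:, feats]) for feats, stump in forest])
    return (votes.mean(axis=0) > 0.5).astype(int)               # majority vote

# Synthetic "taxa abundance" matrix: 60 samples x 20 taxa; taxon 3 carries the class signal.
X = rng.lognormal(0.0, 1.0, (60, 20))
y = (X[:, 3] > np.median(X[:, 3])).astype(int)

split = int(0.8 * len(X))                                       # 80/20 split, as in the abstract
forest = forest_fit(X[:split], y[:split])
acc = (forest_predict(forest, X[split:]) == y[split:]).mean()
print(f"held-out accuracy: {acc:.2f}")
```

A production classifier would use full decision trees (e.g. scikit-learn's RandomForestClassifier) and report ROC AUC alongside accuracy, but the bootstrap-plus-vote structure is the same.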
APA, Harvard, Vancouver, ISO, and other styles