
Journal articles on the topic "Global Partition Methods"

Create an accurate reference in APA, MLA, Chicago, Harvard, and many other citation styles


Consult the 50 best journal articles on the topic "Global Partition Methods".

An "Add to bibliography" button is available next to each work in the list. Use it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in .pdf format and read its abstract online, whenever the relevant parameters are provided in the metadata.

Browse journal articles from a wide range of disciplines and compile your bibliography accordingly.

1

HU, TIANMING, JINZHI XIONG and GENGZHONG ZHENG. "SIMILARITY-BASED COMBINATION OF MULTIPLE CLUSTERINGS". International Journal of Computational Intelligence and Applications 05, no. 03 (September 2005): 351–69. http://dx.doi.org/10.1142/s1469026805001660.

Abstract:
Consensus clustering refers to combining multiple clusterings over a common set of objects into a single consolidated partition. After introducing the distribution-based view of partitions, we propose a series of entropy-based distance functions for comparing various partitions. Given a candidate partition set, consensus clustering is then formalized as an optimization problem of searching for a centroid partition with the smallest distance to that set. In addition to directly selecting the local centroid candidate, we also present two combining methods for the global centroid based on the new similarity determined by the whole candidate set. The centroid partition is likely to be top/middle-ranked in terms of closeness to the true partition. Finally we evaluate its effectiveness on both artificial and real datasets, with candidates from either the full space or the subspace.
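The optimization view in this abstract is easy to make concrete. Below is a minimal Python sketch (our illustration, not the paper's exact distance family): it compares two partitions with the variation of information, a standard entropy-based distance between partitions, and selects the medoid ("centroid") candidate with the smallest total distance to the candidate set.

    import numpy as np

    def variation_of_information(p, q):
        # Entropy-based distance between two partitions given as label vectors.
        p, q = np.asarray(p), np.asarray(q)
        vi = 0.0
        for a in np.unique(p):
            for b in np.unique(q):
                pab = np.mean((p == a) & (q == b))   # joint cluster probability
                if pab > 0:
                    pa, qb = np.mean(p == a), np.mean(q == b)
                    vi -= pab * (np.log(pab / pa) + np.log(pab / qb))
        return vi

    def centroid_partition(candidates):
        # Medoid search: the candidate minimizing the summed distance to the set.
        costs = [sum(variation_of_information(c, d) for d in candidates)
                 for c in candidates]
        return candidates[int(np.argmin(costs))]

The distance is zero for identical partitions and invariant under relabelling, which is what makes it suitable for comparing clusterings.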
2

ZHAO, YAN, YIYU YAO and JINGTAO YAO. "LEVEL-WISE CONSTRUCTION OF DECISION TREES FOR CLASSIFICATION". International Journal of Software Engineering and Knowledge Engineering 16, no. 01 (February 2006): 103–26. http://dx.doi.org/10.1142/s0218194006002690.

Abstract:
A partition-based framework is presented for a formal study of classification problems. An information table is used as a knowledge representation, in which all basic notions are precisely defined by using a language known as the decision logic language. Solutions to, and solution space of, classification problems are formulated in terms of partitions. Algorithms for finding solutions are modelled as searching in a space of partitions under the refinement order relation. We focus on a particular type of solutions called conjunctively definable partitions. Two level-wise methods for decision tree construction are investigated, which are related to two different strategies: local optimization and global optimization. They are not in competition with, but are complementary to each other. Experimental results are reported to evaluate the two methods.
3

Efendiev, Yalchin, Juan Galvis and M. Sebastian Pauletti. "Multiscale Finite Element Methods for Flows on Rough Surfaces". Communications in Computational Physics 14, no. 4 (October 2013): 979–1000. http://dx.doi.org/10.4208/cicp.170512.310113a.

Abstract:
In this paper, we present the Multiscale Finite Element Method (MsFEM) for problems on rough heterogeneous surfaces. We consider the diffusion equation on oscillatory surfaces. Our objective is to represent small-scale features of the solution via multiscale basis functions described on a coarse grid. This problem arises in many applications where processes occur on surfaces or thin layers. We present a unified multiscale finite element framework that entails the use of transformations that map the reference surface to the deformed surface. The main ingredients of MsFEM are (1) the construction of multiscale basis functions and (2) a global coupling of these basis functions. For the construction of multiscale basis functions, our approach uses the transformation of the reference surface to a deformed surface. On the deformed surface, multiscale basis functions are defined where reduced (1D) problems are solved along the edges of coarse-grid blocks to calculate nodal multiscale basis functions. Furthermore, these basis functions are transformed back to the reference configuration. We discuss the use of appropriate transformation operators that improve the accuracy of the method. The method has an optimal convergence if the transformed surface is smooth and the image of the coarse partition in the reference configuration forms a quasi-uniform partition. In this paper, we consider such transformations based on harmonic coordinates (following H. Owhadi and L. Zhang [Comm. Pure and Applied Math., LX(2007), pp. 675-723]) and discuss gridding issues in the reference configuration. Numerical results are presented where we compare the MsFEM when two types of deformations are used for multiscale basis construction. The first deformation employs local information and the second employs global information. Our numerical results show that the accuracy of the simulations can be improved when global information is used.
4

Yao, Hongtai, Xianpei Wang, Le Zhao, Meng Tian, Zini Jian, Li Gong and Bowen Li. "An Object-Based Markov Random Field with Partition-Global Alternately Updated for Semantic Segmentation of High Spatial Resolution Remote Sensing Image". Remote Sensing 14, no. 1 (29.12.2021): 127. http://dx.doi.org/10.3390/rs14010127.

Abstract:
The Markov random field (MRF) method is widely used in remote sensing image semantic segmentation because of its excellent spatial (relationship description) ability. However, some targets are relatively small and sparsely distributed in the entire image, which makes it easy to misclassify their pixels into different classes. To solve this problem, this paper proposes an object-based Markov random field method with partition-global alternate updating (OMRF-PGAU). First, four partition images are constructed from the original image; they overlap with each other and can be reconstructed into the original image. The number of categories and the region granularity for these partition images are then set. Then, the MRF model is built on the partition images and the original image, and their segmentations are alternately updated. The update path adopts a circular path, and a correlation assumption is adopted to establish the connection between the label fields of the partition images and the original image. Finally, the relationship between the label fields is repeatedly updated, and the final segmentation result is output after the segmentation has converged. Experiments on texture images and different remote sensing image datasets show that the proposed OMRF-PGAU algorithm has a better segmentation performance than other selected state-of-the-art MRF-based methods.
5

Zabotin, Vladislav V., and Pavel A. Chernyshevskij. "Continuous global optimization of multivariable functions based on Sergeev and Kvasov diagonal approach". Zhurnal Srednevolzhskogo Matematicheskogo Obshchestva 24, no. 4 (31.12.2022): 399–418. http://dx.doi.org/10.15507/2079-6900.24.202204.399-418.

Abstract:
One of the modern global optimization algorithms is the method of Strongin and Piyavskii, modified by the diagonal approach of Sergeev and Kvasov. In this paper we propose an extension of this approach to continuous multivariable functions defined on a multidimensional parallelepiped. It is known that the Sergeev and Kvasov method applies only to Lipschitz continuous functions, though it effectively extends the one-dimensional algorithm to the multidimensional case. We modify the mentioned method for continuous functions using the ε-Lipschitz property introduced by Vanderbei, which generalizes the conventional Lipschitz inequality. Vanderbei proved that a real-valued function is uniformly continuous on a convex domain if and only if it is ε-Lipschitz. Because a multidimensional parallelepiped is a convex compact set, we require the objective function only to be continuous on the search domain. We describe the extended Strongin and Piyavskii methods in the Sergeev and Kvasov modification and prove sufficient conditions for convergence. As an example of the proposed method's application, at the end of this article we show numerical optimization results for several continuous but non-Lipschitz functions using three known partition strategies: "partition on 2", "partition on 2N" and "effective". For the first two of them we present formulas for computing a new iteration point and for recalculating the ε-Lipschitz constant estimate. We also show an algorithm modification that allows a new search point to be found at any step of the algorithm.
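For reference, the ε-Lipschitz property attributed to Vanderbei above can be stated as follows (our paraphrase of the standard definition): f is ε-Lipschitz on a domain D if

\[
\forall\,\varepsilon>0 \;\; \exists\, L(\varepsilon)<\infty : \quad
|f(x)-f(y)| \le L(\varepsilon)\,\|x-y\| + \varepsilon
\qquad \forall\, x,y \in D.
\]

Allowing the slack ε recovers the conventional Lipschitz inequality as the special case where L(ε) can be chosen independently of ε.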
6

Guo, Tan, Fulin Luo, Yule Duan, Xinjian Huang and Guangyao Shi. "Rethinking Representation Learning-Based Hyperspectral Target Detection: A Hierarchical Representation Residual Feature-Based Method". Remote Sensing 15, no. 14 (19.07.2023): 3608. http://dx.doi.org/10.3390/rs15143608.

Abstract:
Representation learning-based hyperspectral target detection (HTD) methods generally follow a learning paradigm of single-layer or one-step representation residual learning and target detection on the original full spectral bands, which, in some cases, cannot accurately distinguish the target pixels from variable background pixels via one round of the detection process. To alleviate this problem and make full use of the latent discriminative characteristics in different spectral bands and the representation residual, this paper proposes a level-wise band-partition-based hierarchical representation residual feature (LBHRF) learning method for HTD with a parallel and cascaded hybrid structure. Specifically, the LBHRF method partitions and fuses different levels of sub-band spectra combinations and takes full advantage of the discriminative information in representation residuals from different levels of band-partition. The highlights of this work include three aspects. First, the original full spectral bands are partitioned in a parallel level-wise manner to obtain the augmented representation residual feature through level-wise band-partition-based representation residual learning, such that the global spectral integrity and contextual information of local adjacent sub-bands are flexibly fused. Second, the SoftMax transformation, pooling operation, and augmented representation residual feature reuse among different layers are equipped in cascade to enhance the learning ability of the nonlinear and discriminant hierarchical representation residual features of the method. Third, a hierarchical representation residual feature-based HTD method is developed in an efficient stepwise learning manner instead of back-propagation optimization. Experimental results on several HSI datasets demonstrate that the proposed model can yield promising detection performance in comparison to some state-of-the-art counterparts.
7

Lu, Zhigang, and Hong Shen. "An accuracy-assured privacy-preserving recommender system for internet commerce". Computer Science and Information Systems 12, no. 4 (2015): 1307–26. http://dx.doi.org/10.2298/csis140725056l.

Abstract:
Recommender systems, tools for predicting users' potential preferences by computing history data and users' interests, are of increasing importance in various Internet applications such as online shopping. As a well-known recommendation method, neighbourhood-based collaborative filtering has attracted considerable attention recently. The risk of revealing users' private information during the process of filtering has attracted noticeable research interest. Among the current solutions, probabilistic techniques have shown a powerful privacy-preserving effect. The existing methods deploying probabilistic techniques fall into three categories: one [19] adds differential privacy noise to the covariance matrix; one [1] introduces randomisation into the neighbour selection process; the other [29] applies differential privacy to both the neighbour selection process and the covariance matrix. When facing the k Nearest Neighbour (kNN) attack, all the existing methods provide no data utility guarantee because of the introduction of global randomness. In this paper, to overcome the problem of recommendation accuracy loss, we propose a novel approach, Partitioned Probabilistic Neighbour Selection, to ensure a required prediction accuracy while maintaining high security against the kNN attack. We define the sum of the k neighbours' similarity as the accuracy metric and the number of user partitions, across which we select the k neighbours, as the security metric, and we generalise the kNN attack accordingly. Differing from the existing approach that selects neighbours across the entire candidate list randomly, our method selects neighbours from each exclusive partition of size k with a decreasing probability. Theoretical and experimental analysis show that, to provide an accuracy-assured recommendation, our Partitioned Probabilistic Neighbour Selection method yields a better trade-off between recommendation accuracy and system security.
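A hedged Python sketch of the partitioned selection idea (parameter names and the geometric decay are our illustration, not the paper's exact scheme): candidates are ranked by similarity, cut into exclusive partitions of size k, and neighbours are drawn from partitions with decreasing probability.

    import random

    def partitioned_neighbour_selection(candidates, similarity, k, decay=0.5):
        # Rank candidates by similarity, then cut into exclusive partitions of size k.
        assert len(candidates) >= k
        ranked = sorted(candidates, key=lambda u: similarity[u], reverse=True)
        partitions = [ranked[i:i + k] for i in range(0, len(ranked), k)]
        weights = [decay ** j for j in range(len(partitions))]  # decreasing probability
        neighbours = set()
        while len(neighbours) < k:
            part = random.choices(partitions, weights=weights)[0]
            neighbours.add(random.choice(part))
        return list(neighbours)

Partition 0 holds the k most similar candidates, so accuracy degrades gracefully as the selection probability decays across partitions.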
8

Lin, Lijuan, Licheng Xing, Qingquan Jia and Xuerui Zhang. "Research on the Partition Method Based on the Observation Points of Voltage Distortion in Harmonic Optimization". Journal of Physics: Conference Series 2253, no. 1 (1.04.2022): 012005. http://dx.doi.org/10.1088/1742-6596/2253/1/012005.

Abstract:
With the use of a large number of power electronic devices, the harmonics in the power grid are becoming more and more serious. There are many methods for harmonic mitigation and optimization, such as global network optimization; however, when there are many nodes, the global network optimization method becomes complicated. Considering the problem of large numbers of harmonic injection nodes in the power grid, this paper proposes a method of clustering the voltage distortion nodes according to their correlation and using characteristic nodes to describe the harmonic voltage distortion in each region in order to mitigate harmonic voltage. The method performs dynamic selection and grouping based on the observation data of the voltage distortion of the nodes. The results show that this method meets the needs of global network optimization. The data processing methods used in this paper, such as the feature extraction methods and time-series interpolation optimization, can effectively reduce the data dimension and achieve a better optimization effect.
9

Cotillard, Aurélie, Soline Chaumont, Mathilde Saccareau, Nicole S. Litwin, Diana Gutierrez Lopez, Julien Tap, Franck Lejzerowicz et al. "Unsupervised Diet Partitions Better Explain Variations of the Gut Microbiome Compared to Individual Dietary Markers in U.S. Adults of the American Gut Project Cohort". Current Developments in Nutrition 5, Supplement_2 (June 2021): 397. http://dx.doi.org/10.1093/cdn/nzab038_009.

Abstract:
Objectives: Eating habits have been shown to impact the gut microbiome. Here we aimed to define several types of dietary patterns in a U.S. adult cohort and test their associations with the gut microbiome. Methods: Using supervised and unsupervised approaches, we built dietary patterns based on a food frequency questionnaire of the American Gut Project database. Focusing on 1800 adult participants living in the United States, we defined patterns as partitions (groups of participants) or factors (combinations of food variables) driven by specific dietary criteria: fibers, proteins, Healthy Eating Index (HEI 2010), food items, food groups and micronutrients. We then associated these patterns with 16S gut microbiome data for 744 participants, excluding those reporting antibiotic intake in the last year or specific diseases. Analyses were adjusted for age, sex and BMI. Results: Compared to individual features like fibers and proteins, or to factors representing reduced numbers of features, five unsupervised partitions based on food groups were best associated with gut microbiome beta-diversity. Two partitions presented a lower consumption of animal products, with one being almost completely exclusive and the other, close to a flexitarian diet, presenting the best diet quality as measured by HEI. A third one consisted mostly of participants under low carbohydrate diets with nearly no consumption of starchy foods or sweet products. Finally, the last two partitions presented Western-like diets with increased consumption of mixed dishes, sweet products and refined cereals, one of them being more diverse with increased nuts and whole cereals. Gut microbiome alpha-diversity was slightly increased in the flexitarian partition compared to the most westernized one. Strikingly, the low carbohydrate partition was associated with low levels of the Bifidobacterium genus. Conclusions: We showed in a U.S. adult cohort that a global diet may be more associated with gut microbiome variations than individual features like fibers or proteins. Five diet partitions were identified and their specific associations with the gut microbiome were studied. These results confirm the importance of considering diet as a whole when studying gut microbiota diversity. Funding Sources: Danone Research.
10

Wang, Yuexin, Guangcai Feng, Zhixiong Feng, Yuedong Wang, Xiuhua Wang, Shuran Luo, Yinggang Zhao and Hao Lu. "An MT-InSAR Data Partition Strategy for Sentinel-1A/B TOPS Data". Remote Sensing 14, no. 18 (13.09.2022): 4562. http://dx.doi.org/10.3390/rs14184562.

Abstract:
The Sentinel-1A/B satellites launched by the European Space Agency (ESA) in 2014 provide a huge amount of free Terrain Observation by Progressive Scans (TOPS) data with global coverage to the public. The TOPS data have a frame width of 250 km and have been widely used in surface deformation monitoring. However, traditional Multi-Temporal Interferometric Synthetic Aperture Radar (MT-InSAR) methods require large computer memory and long processing times when handling full-resolution data with large widths and long strips. In addition, they hardly correct atmospheric delays and orbital errors accurately over a large area. To solve these problems, this study proposes a data partition strategy based on MT-InSAR methods. We first process the partitioned images over a large area with a traditional MT-InSAR method, then stitch the deformation results into a complete deformation result by correcting the offsets between adjacent partitioned images. This strategy is validated in a flat urban area (Changzhou City in Jiangsu province, China) and a mountainous region (Qijiang in Chongqing City, China). Compared with traditional MT-InSAR methods, the precision of the results obtained by the new strategy is improved by about 5% for Changzhou and about 15% for Qijiang because of its advantage in atmospheric delay correction. Furthermore, the proposed strategy needs much less memory and time than traditional methods: in the Changzhou case, the total time needed by the traditional method is about 20 h, versus about 8.7 h for the proposed method with 5 parallel processes. The time is further reduced as the number of parallel processes increases.
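The stitching step can be illustrated with a small NumPy sketch (our simplification, assuming a constant offset between adjacent partitioned results and NaN outside each partition's coverage):

    import numpy as np

    def stitch(base, patch):
        # Estimate the constant offset in the overlap region and remove it,
        # then mosaic the corrected patch onto the base deformation map.
        overlap = ~np.isnan(base) & ~np.isnan(patch)
        offset = np.median(patch[overlap] - base[overlap])  # robust estimate
        return np.where(np.isnan(base), patch - offset, base)

Using the median makes the offset estimate robust to residual noise in the overlap region.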
11

Vrac, Mathieu, Alain Chédin and Edwin Diday. "Clustering a Global Field of Atmospheric Profiles by Mixture Decomposition of Copulas". Journal of Atmospheric and Oceanic Technology 22, no. 10 (1.10.2005): 1445–59. http://dx.doi.org/10.1175/jtech1795.1.

Abstract:
This work focuses on the clustering of a large dataset of atmospheric vertical profiles of temperature and humidity in order to model a priori information for the problem of retrieving atmospheric variables from satellite observations. Here, each profile is described by cumulative distribution functions (cdfs) of temperature and specific humidity. The method presented here is based on an extension of the mixture density problem to this kind of data. This method allows dependencies between and among temperature and moisture to be taken into account through copula functions, which are particular distribution functions linking a (joint) multivariate distribution with its (marginal) univariate distributions. After a presentation of the vertical profiles of temperature and humidity and the method used to transform them into cdfs, the clustering method is detailed and then applied to provide a partition into seven clusters based, first, on the temperature profiles only; second, on the humidity profiles only; and, third, on both the temperature and humidity profiles. The clusters are statistically described and explained in terms of airmass types, with reference to meteorological maps. To test the robustness and the relevance of the method for a larger number of clusters, a partition into 18 classes is established, where it is shown that even the smallest clusters are significant. Finally, comparisons with more classical efficient clustering or model-based methods are presented, and the advantages of the approach are discussed.
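The copula construction rests on Sklar's theorem: any joint distribution H with marginals F and G can be written as

\[
H(x, y) = C\big(F(x), G(y)\big),
\]

where the copula C is itself a distribution function on $[0,1]^2$ with uniform marginals; in the paper, the mixture decomposition is then applied to such copula-based densities rather than to H directly.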
12

Saranjam, Leila, Elisabet Fuguet, Miroslava Nedyalkova, Vasil Simeonov, Francesc Mas and Sergio Madurga. "Prediction of Partition Coefficients in SDS Micelles by DFT Calculations". Symmetry 13, no. 9 (19.09.2021): 1750. http://dx.doi.org/10.3390/sym13091750.

Abstract:
A computational methodology using Density-Functional Theory (DFT) calculations was developed to determine the partition coefficient of a compound in a solution of Sodium Dodecyl Sulfate (SDS) micelles. Different sets of DFT calculations were used to predict the free energy of a set of 63 molecules in 15 different solvents with the purpose of identifying the solvents with physicochemical characteristics similar to the studied micelles. Experimental partition coefficients were obtained from Micellar Electrokinetic Chromatography (MEKC) measurements. The experimental partition coefficient of these molecules was compared with the predicted partition coefficient in heptane/water, cyclohexane/water, N-dodecane/water, pyridine/water, acetic acid/water, decan-1-ol/water, octanol/water, propan-2-ol/water, acetone/water, propan-1-ol/water, methanol/water, 1,2-ethane diol/water, dimethyl sulfoxide/water, formic acid/water, and diethyl sulphide/water systems. It is observed that the propan-1-ol/water combination was the most appropriate for estimating the partition coefficient for SDS micelles. This approach allowed us to estimate the partition coefficient orders of magnitude faster than classical molecular dynamics simulations. The DFT calculations were carried out with the well-known exchange-correlation functional B3LYP and with the global hybrid functional M06-2X from the Minnesota functionals, using the 6-31++G** basis set. The effect of solvation was considered by the continuum model based on density (SMD). The proposed workflow for the prediction of the partition coefficient unveiled the symmetric balance between the experimental data and the computational methods.
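The link between the computed free energies and the partition coefficient is the standard thermodynamic relation (with the ΔG values being solvation free energies, here from the SMD continuum model):

\[
\log P_{\mathrm{solv/water}}
= \frac{\Delta G_{\mathrm{water}} - \Delta G_{\mathrm{solv}}}{RT \ln 10},
\]

so a solvent system whose predicted log P values track the MEKC measurements, here propan-1-ol/water, can serve as a proxy for the micellar pseudophase.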
13

Li, Jianxin, Guannan Si, Xinyu Liang, Zhaoliang An, Pengxin Tian and Fengyu Zhou. "Partition-Based Point Cloud Completion Network with Density Refinement". Entropy 25, no. 7 (2.07.2023): 1018. http://dx.doi.org/10.3390/e25071018.

Abstract:
In this paper, we propose a novel method for point cloud completion called PADPNet. Our approach uses a combination of global and local information to infer missing elements in the point cloud. We achieve this by dividing the input point cloud into uniform local regions, called perceptual fields, which are abstractly understood as special convolution kernels. The set of points in each local region is represented as a feature vector and transformed into N uniform perceptual fields as the input to our transformer model. We also designed a geometric density-aware block to better exploit the inductive bias of the point cloud's 3D geometric structure. Our method preserves sharp edges and detailed structures that are often lost in voxel-based or point-based approaches. Experimental results demonstrate that our approach outperforms other methods in reducing the ambiguity of output results. Our proposed method has important applications in 3D computer vision and can efficiently recover complete 3D object shapes from missing point clouds.
14

Zheng, Junbao, Chenke Xu, Wei Zhang and Xu Yang. "Single Image Dehazing Using Global Illumination Compensation". Sensors 22, no. 11 (30.05.2022): 4169. http://dx.doi.org/10.3390/s22114169.

Abstract:
Existing dehazing algorithms hardly consider background interference in the process of estimating the atmospheric illumination value and transmittance, resulting in an unsatisfactory dehazing effect. To solve this problem, this paper proposes a novel global illumination compensation-based image-dehazing algorithm (GIC). The GIC method compensates for the intensity of light scattered when light passes through atmospheric particles such as fog. First, the illumination compensation is accomplished in the CIELab color space using a shading partition enhancement mechanism. Second, the atmospheric illumination values and transmittance parameters of these enhanced images are computed to improve the performance of the atmospheric-scattering model and reduce the interference of background signals. Finally, the dehazing result maps with reduced background interference are obtained with the computed atmospheric-scattering model. Dehazing experiments were carried out on a public dataset, and the dehazing results for foggy images were compared with cutting-edge dehazing algorithms. The experimental results illustrate that the proposed GIC algorithm shows enhanced consistency with the real imaging situation in estimating atmospheric illumination and transmittance. Compared with established image-dehazing methods, the peak signal-to-noise ratio (PSNR) and the structural similarity (SSIM) metrics of the proposed GIC method increased by 3.25 and 0.084, respectively.
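The physical model behind this (and most) dehazing work is the atmospheric-scattering equation I = J·t + A·(1 - t). A minimal NumPy sketch of the inversion step, assuming estimates of the atmospheric light A and the transmission t are already available:

    import numpy as np

    def recover_radiance(I, A, t, t_min=0.1):
        # Invert I = J*t + A*(1 - t) for the scene radiance J.
        # I: hazy image (H, W, 3) in [0, 1]; A: (3,) light; t: (H, W) transmission.
        t = np.clip(t, t_min, 1.0)[..., None]   # floor t to avoid amplifying noise
        J = (I - A) / t + A
        return np.clip(J, 0.0, 1.0)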
15

Castro, C. E., M. Käser and E. F. Toro. "Space–time adaptive numerical methods for geophysical applications". Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 367, no. 1907 (28.11.2009): 4613–31. http://dx.doi.org/10.1098/rsta.2009.0158.

Abstract:
In this paper we present high-order formulations of the finite volume and discontinuous Galerkin finite-element methods for wave propagation problems with a space–time adaptation technique using unstructured meshes in order to reduce computational cost without reducing accuracy. Both methods can be derived in a similar mathematical framework and are identical in their first-order version. In their extension to higher order accuracy in space and time, both methods use spatial polynomials of higher degree inside each element, a high-order solution of the generalized Riemann problem and a high-order time integration method based on the Taylor series expansion. The static adaptation strategy uses locally refined high-resolution meshes in areas with low wave speeds to improve the approximation quality. Furthermore, the time step length is chosen locally adaptive such that the solution is evolved explicitly in time by an optimal time step determined by a local stability criterion. After validating the numerical approach, both schemes are applied to geophysical wave propagation problems such as tsunami waves and seismic waves comparing the new approach with the classical global time-stepping technique. The problem of mesh partitioning for large-scale applications on multi-processor architectures is discussed and a new mesh partition approach is proposed and tested to further reduce computational cost.
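The local time-step choice sketched above follows the usual stability reasoning: each element i advances with its own admissible step, e.g.

\[
\Delta t_i \le \mathrm{CFL} \cdot \frac{h_i}{(2N+1)\,|\lambda_{\max,i}|},
\]

where $h_i$ is a local mesh size, $\lambda_{\max,i}$ the fastest local wave speed, and $N$ the polynomial degree; the $(2N+1)$ factor is the common choice for discontinuous Galerkin schemes (our paraphrase of the standard criterion, not a formula quoted from the paper).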
16

Liu, Lu, Dong Sun, Wen Qing Wang and Rui Can Niu. "Immune Clonal Spectral Clustering for PolSAR Land Cover Classification". Applied Mechanics and Materials 644-650 (September 2014): 1783–86. http://dx.doi.org/10.4028/www.scientific.net/amm.644-650.1783.

Abstract:
In this paper, a novel PolSAR land cover classification method is proposed based on spectral clustering (SC) and the immune clonal selection principle. SC is employed to reduce the dimensionality of the polarimetric feature space. However, it is susceptible to local optima and sensitive to the initial partition. To address this issue, immune clonal clustering (ICC) is used due to its capability for global searching. The proposed method combines the complementary advantages of spectral clustering and immune clonal clustering. Experimental results demonstrate that the proposed method is more stable and efficient compared with the other methods.
17

Zhan, Guowei, Qi Wang, Weidong Min, Qing Han, Haoyu Zhao and Zitai Wei. "Unsupervised Vehicle Re-Identification Based on Cross-Style Semi-Supervised Pre-Training and Feature Cross-Division". Electronics 12, no. 13 (3.07.2023): 2931. http://dx.doi.org/10.3390/electronics12132931.

Abstract:
Vehicle Re-Identification (Re-ID) based on Unsupervised Domain Adaptation (UDA) has shown promising performance. However, two main issues remain: (1) existing methods that use Generative Adversarial Networks (GANs) for domain gap alleviation combine supervised learning with hard labels of the source domain, resulting in a mismatch between style transfer data and hard labels; (2) pseudo-label assignment in the fine-tuning stage is solely determined by similarity measures of global features using clustering algorithms, leading to inevitable label noise in the generated pseudo labels. To tackle these issues, this paper proposes an unsupervised vehicle re-identification framework based on cross-style semi-supervised pre-training and feature cross-division. The framework consists of two parts: cross-style semi-supervised pre-training (CSP) and feature cross-division (FCD) for model fine-tuning. The CSP module generates style transfer data containing source domain content and target domain style using a style transfer network, and then pre-trains the model in a semi-supervised manner using both source domain and style transfer data. A pseudo-label reassignment strategy is designed to generate soft labels assigned to the style transfer data. The FCD module obtains feature partitions through a novel interactive division to reduce the dependence of pseudo-labels on global features, and the final similarity measurement combines the results of partition features and global features. Experimental results on the VehicleID and VeRi-776 datasets show that the proposed method outperforms existing unsupervised vehicle re-identification methods. Compared with the previous best method on each dataset, the proposed method improves mAP by 0.63% and Rank-1 by 0.73% on average over the three sub-datasets of VehicleID, and improves mAP by 0.9% and Rank-1 by 1% on the VeRi-776 dataset.
18

Peyron, O., M. Magny, S. Goring, S. Joannin, J. L. de Beaulieu, E. Brugiapaglia, L. Sadori et al. "Contrasting patterns of climatic changes during the Holocene in the Central Mediterranean (Italy) reconstructed from pollen data". Climate of the Past Discussions 8, no. 6 (29.11.2012): 5817–66. http://dx.doi.org/10.5194/cpd-8-5817-2012.

Abstract:
Lake-level records from Italy suggest a north–south climatic partition in the Central Mediterranean during the Holocene with respect to precipitation, but the scarcity of reliable palaeoclimatic records in the North and Central-Southern Mediterranean means new evidence is needed to validate this hypothesis. Here, we provide robust quantitative estimates of Holocene climate in the Mediterranean region based on four high-resolution pollen records from Northern (Lakes Ledro and Accesa) and Southern (Lakes Trifoglietti and Pergusa) Italy. Multiple methods are used to provide an improved assessment of the uncertainty of the palaeoclimatic reconstruction. The multi-method approach uses the pollen-based Weighted Averaging, Weighted-Average-Partial-Least-Squares regression, Modern Analogues Technique, and Non-Metric-Multidimensional Scaling/Generalized-Additive-Model methods. The precipitation seasonality reconstructions are validated by independent lake-level data obtained from the same records. A climatic partition between the north and the south during the Holocene confirms the hypothesis of opposing mid-Holocene summer precipitation regimes in the Mediterranean. During the early-to-mid-Holocene, the northern sites (Ledro, Accesa) are characterized by minima in summer precipitation and lake levels, while the southern sites (Trifoglietti, Pergusa) are marked by maxima in precipitation and lake levels. During the late Holocene, both pollen-inferred precipitation and lake levels indicate the opposite pattern: a maximum in Northern Italy and a minimum in Southern Italy/Sicily. Summer temperatures also show partitioning, with warm conditions in Northern Italy and cool conditions in Sicily during the early/mid-Holocene, and a reversal during the Late Holocene. Comparison with marine cores from the Aegean Sea suggests that the climate trends and gradients observed in Italy show strong similarities with those recognized in the Aegean Sea and, more generally speaking, in the Eastern Mediterranean.
19

Bainiwal, Tejpaul Singh. "Religious and Political Dimensions of the Kartarpur Corridor: Exploring the Global Politics Behind the Lost Heritage of the Darbar Sahib". Religions 11, no. 11 (29.10.2020): 560. http://dx.doi.org/10.3390/rel11110560.

Abstract:
The 550th birth anniversary of Guru Nanak and the construction of the Kartarpur Corridor has helped the Darbar Sahib at Kartarpur in Pakistan gain global attention. In 2019, thousands of Sikhs embarked on a pilgrimage to Pakistan to take part in this momentous occasion. However, conversations surrounding modern renovations, government control of sacred sites, and the global implications of the corridor have been missing in the larger dialogue. Using historical methods and examining the Darbar Sahib through the context of the 1947 partition and the recent construction of the Kartarpur Corridor, this paper departs from the metanarrative surrounding the Darbar Sahib and explores the impact that Sikhs across the globe had on the “bridge of peace”, the politics behind the corridor, and how access to sacred Sikh spaces in Pakistan was only partially regained.
20

Walczak, Jakub, Tadeusz Poreda and Adam Wojciechowski. "Effective Planar Cluster Detection in Point Clouds Using Histogram-Driven Kd-Like Partition and Shifted Mahalanobis Distance Based Regression". Remote Sensing 11, no. 21 (23.10.2019): 2465. http://dx.doi.org/10.3390/rs11212465.

Abstract:
Point cloud segmentation for planar surface detection is a valid problem of automatic laser scan analysis. It is widely exploited for many industrial remote sensing tasks, such as LIDAR city scanning, creating inventories of buildings, or object reconstruction. Many current methods rely on robustly calculated covariance and centroid for plane model estimation, or on global energy optimization. This is coupled with point cloud division strategies based on uniform or regular space subdivision. These approaches result in many redundant divisions, plane maladjustments caused by outliers, and an excessive number of processing iterations. In this paper, a new robust method of point cloud segmentation, based on histogram-driven hierarchical space division inspired by the kd-tree, is presented. The proposed partition method produces results with a smaller oversegmentation rate. Moreover, state-of-the-art partitions often lead to nodes of low cardinality, which results in the rejection of many points. In the proposed method, the point rejection rate was reduced. Point cloud subdivision is followed by resilient plane estimation, using the Mahalanobis distance with respect to seven cardinal points. These points were established based on the eigenvectors of the covariance matrix of the considered point cluster. The proposed method shows high robustness and yields good quality metrics, much faster than a FAST-MCD approach. The overall results indicate improvements in terms of plane precision, plane recall, and the under- and over-segmentation rates with respect to the reference benchmark methods. Plane precision for the S3DIS dataset increased on average by 2.6 pp and plane recall by 3 pp, while the over- and under-segmentation rates fell by 3.2 pp and 4.3 pp, respectively.
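A simplified Python sketch of the plane-estimation core (the paper additionally derives seven cardinal points from the eigenvectors and uses a shifted Mahalanobis distance; this shows only the common covariance-based part):

    import numpy as np

    def fit_plane(points):
        # points: (N, 3) cluster; the plane normal is the eigenvector of the
        # covariance matrix with the smallest eigenvalue.
        centroid = points.mean(axis=0)
        cov = np.cov((points - centroid).T)
        eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
        return centroid, eigvecs[:, 0]           # smallest-variance direction

    def mahalanobis(x, mean, cov):
        # Distance of a point to a cluster, scaled by the cluster's covariance.
        d = x - mean
        return float(np.sqrt(d @ np.linalg.inv(cov) @ d))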
21

Li, Meng, and Shuangxin Wang. "Dynamic Fault Monitoring of Pitch System in Wind Turbines using Selective Ensemble Small-World Neural Networks". Energies 12, no. 17 (23.08.2019): 3256. http://dx.doi.org/10.3390/en12173256.

Abstract:
Pitch system failures occur primarily because wind turbines typically work in dynamic and variable environments. Conventional monitoring strategies show limitations in continuously identifying faults in most cases, especially when rapidly changing winds occur. A novel selective-ensemble monitoring strategy is presented to diagnose most pitch failures using Supervisory Control and Data Acquisition (SCADA) data. The proposed strategy consists of five steps. In the first step, the SCADA data are partitioned according to the turbine's four working states; Correlation Information Entropy (CIE) and 10 indicators are used to select correlated signals and extract features of the partitioned data, respectively. In the second step, multiple Small-World Neural Networks (SWNNs) are established as the ensemble members. In the third step, all the features are randomly sampled to train the SWNN members. The fourth step involves using an improved global correlation method to select appropriate ensemble members, while in the fifth step, the selected members are fused to obtain the final classification result based on the weighted integration approach. Compared with conventional methods, the proposed ensemble strategy achieves an accuracy rate of over 93.8% within a short delay time.
22

Singsathid, Pirapong, Pikul Puphasuk and Jeerayut Wetweerapong. "Adaptive differential evolution algorithm with a pheromone-based learning strategy for global continuous optimization". Foundations of Computing and Decision Sciences 48, no. 2 (1.06.2023): 243–66. http://dx.doi.org/10.2478/fcds-2023-0010.

Abstract:
The differential evolution algorithm (DE) is a well-known population-based method for solving continuous optimization problems. It has a simple structure and is easy to adapt to a wide range of applications. However, with suitable population sizes, its performance depends on the two main control parameters: the scaling factor (F) and the crossover rate (CR). The classical DE method can achieve high performance by a time-consuming tuning process or a sophisticated adaptive control implementation. We propose in this paper an adaptive differential evolution algorithm with a pheromone-based learning strategy (ADE-PS) inspired by ant colony optimization (ACO). ADE-PS embeds a pheromone-based mechanism that manages the probabilities associated with the partition values of F and CR. It also introduces a resetting strategy that resets the pheromone at a specific time to unlearn and relearn the progressing search. Preliminary experiments find a suitable number of subintervals (ns) for partitioning the control parameter ranges and a suitable reset period (rs) for resetting the pheromone. Comparison experiments then evaluate ADE-PS, using the suitable ns and rs, against some adaptive DE methods from the literature. The results show that ADE-PS is more reliable and outperforms several well-known methods in the literature.
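A hedged Python sketch of the pheromone mechanism (constants and method names are ours; the paper's update rules differ in detail): each control parameter's range is split into ns subintervals, pheromone values drive roulette-wheel selection of a subinterval, and a reset periodically restores the uniform state.

    import random

    class PheromoneParameter:
        def __init__(self, low, high, ns=10):
            self.edges = [low + (high - low) * i / ns for i in range(ns + 1)]
            self.pheromone = [1.0] * ns

        def sample(self):
            # Roulette-wheel choice of a subinterval, then a uniform draw in it.
            self.idx = random.choices(range(len(self.pheromone)),
                                      weights=self.pheromone)[0]
            return random.uniform(self.edges[self.idx], self.edges[self.idx + 1])

        def update(self, improved, deposit=0.1, rho=0.02):
            # Evaporate everywhere; reinforce the subinterval that just succeeded.
            self.pheromone = [(1 - rho) * p for p in self.pheromone]
            if improved:
                self.pheromone[self.idx] += deposit

        def reset(self):
            # The periodic "unlearn and relearn" step described above.
            self.pheromone = [1.0] * len(self.pheromone)

One such object would be kept for F and one for CR, with update() called after each trial vector is evaluated and reset() called every rs generations.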
23

Nascimento, Fabrícia F., Manon Ragonnet-Cronin, Tanya Golubchik, Siva Danaviah, Anne Derache, Christophe Fraser and Erik Volz. "Evaluating whole HIV-1 genome sequence for estimation of incidence and migration in a rural South African community". Wellcome Open Research 7 (21.06.2022): 174. http://dx.doi.org/10.12688/wellcomeopenres.17891.1.

Abstract:
Background: South Africa has the largest number of people living with HIV (PLWHIV) in the world, with HIV prevalence and transmission patterns varying greatly between provinces. Transmission between regions is still poorly understood, but the phylodynamics of HIV-1 evolution can reveal how many infections are attributable to contacts outside a given community. We analysed whole genome HIV-1 sequences to estimate incidence and the proportion of transmissions between communities in Hlabisa, a rural South African community. Methods: We separately analysed HIV-1 gag, pol, and env genes sampled from 2,503 PLWHIV. We estimated time-scaled phylogenies by maximum likelihood under a molecular clock model. Phylodynamic models were fitted to the time-scaled trees to estimate transmission rates, the effective number of infections, incidence through time, and the proportion of infections imported to Hlabisa. We also partitioned the time-scaled phylogenies into subtrees with significantly different distributions of coalescent times. Results: Phylodynamic analyses showed similar trends in epidemic growth rates between 1980 and 1990. Model-based estimates of incidence and effective number of infections were consistent across genes. Parameter estimates with gag were generally smaller than those estimated with pol and env. When estimating the proportions of new infections in Hlabisa from immigration or transmission from external sources, our posterior median estimates were 85% (95% credible interval (CI) = 78%–92%) for gag, 62% (CI = 40%–78%) for pol, and 77% (CI = 58%–90%) for env in 2015. Analysis of phylogenetic partitions by gene showed that most close global reference sequences clustered within a single partition. This suggests locally evolving epidemics or potential unmeasured heterogeneity in the population. Conclusions: We estimated consistent epidemic dynamic trends for gag, pol and env genes using phylodynamic models. There was a high probability that new infections were not attributable to endogenous transmission within Hlabisa, suggesting high inter-connectedness between communities in rural South Africa.
24

Stokowiec, K., and S. Sobura. "Hand-held and UAV camera comparison in building thermal inspection process". Journal of Physics: Conference Series 2339, no. 1 (1.09.2022): 012017. http://dx.doi.org/10.1088/1742-6596/2339/1/012017.

Abstract:
Thermal building inspections have attracted growing interest among researchers due to global climate change, which has resulted in legislation and policies promoting low energy consumption. The technologies applied during such experiments involve infrared cameras, both hand-held and mounted on unmanned aerial vehicles. The research conducted in a public building in Kielce included the analysis of a glass partition from the inside of the building by means of a hand-held and a UAV camera. The temperature distributions presented in the graphs proved that both methods are accurate for such investigations. However, during the experiments it was concluded that the hand-held camera is far more convenient.
25

Guo, Tong Ying, Wei Liu, Zhen Jun Du, Hai Chen Wang and Tong Yan Guo. "Application of Least Squares Algorithm on Maps Building for Mobile Robot". Applied Mechanics and Materials 376 (August 2013): 216–19. http://dx.doi.org/10.4028/www.scientific.net/amm.376.216.

Abstract:
In this paper, a laser sensor is used to detect environmental information, and the problem of map building for a mobile robot in an indoor structured environment is studied. A least squares algorithm is adopted to build local maps, and a dynamic partition method is presented based on the distance and angle of adjacent points. To accomplish self-localization and route planning of the mobile robot in an indoor environment lacking a prior map, this paper presents a global map building method based on matching feature points and characteristic lines. The experiments show that the presented methods are effective and practicable.
26

Gasmi, B., and R. Benacer. "NEW BRANCH AND BOUND METHOD OVER A BOXED SET OF $\mathbb{R}^{n}$". Advances in Mathematics: Scientific Journal 12, no. 6 (12.06.2023): 603–30. http://dx.doi.org/10.37418/amsj.12.6.3.

Abstract:
We present in this paper a new Branch and Bound method with a new quadratic approach over a boxed set (a rectangle) of $\mathbb{R}^{n}$. We construct approximate convex quadratic functions of the objective function to find a lower bound on the global optimal value of the original nonconvex quadratic problem (NQP) over each subset of this boxed set. We apply partitioning and reduction techniques to the domain of the (NQP) to accelerate the convergence of the proposed algorithm. Finally, we study the convergence of the proposed algorithm and give a simple comparison between this method and other methods that share the same principle.
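One standard way to build such convex quadratic underestimators over a box $[l,u]$ is the αBB construction (shown here for orientation; the authors' approach may differ):

\[
q(x) = f(x) + \alpha \sum_{i=1}^{n} (l_i - x_i)(u_i - x_i),
\qquad
\alpha \ge \max\!\Big(0,\,-\tfrac{1}{2}\,\lambda_{\min}\big(\nabla^2 f\big)\Big),
\]

where the added term is nonpositive on the box (so $q \le f$ there) and the eigenvalue condition makes $q$ convex, giving a lower bound on each sub-rectangle by convex minimization.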
27

Fuentes, Humberto Peredo. "Application of the mode-shape expansion based on model order reduction methods to a composite structure". Open Engineering 7, no. 1 (18.09.2017): 199–212. http://dx.doi.org/10.1515/eng-2017-0026.

Abstract:
The application of different mode-shape expansion (MSE) methods to a CFRP structure, based on model order reduction (MOR) and component mode synthesis (CMS) methods, is evaluated by combining the updated stiffness parameters of the full FE model obtained with a mixed numerical-experimental technique (MNET) in a previous work. The eigenvectors and eigenfrequencies obtained with the different MSE methods are compared with the experimental measurements and with the full FE model solutions using the modal assurance criterion (MAC). Furthermore, the stiffness- and mass-weighted coefficients (K-MAC and M-MAC, respectively) are calculated and compared to observe the influence of the different subspace-based expansion methods applying the MAC criteria. The K-MAC and M-MAC are basically the MAC coefficients weighted by a partition of the global stiffness and mass matrices, respectively. The best K-MAC and M-MAC results per paired mode-sensor are observed in the subspace-based expansion MODAL/SEREP and MDRE-WE methods using the updated stiffness parameters. A strong influence of the subspace based on MOR using MSE methods is observed in the K-MAC and M-MAC criteria implemented in SDTools when evaluating the stiffness parameters in a contrived example.
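For reference, the underlying criterion compares two mode shapes as (standard definition)

\[
\mathrm{MAC}(\phi_i,\phi_j)
= \frac{\left|\phi_i^{T}\phi_j\right|^{2}}
       {\left(\phi_i^{T}\phi_i\right)\left(\phi_j^{T}\phi_j\right)},
\]

with the inner products $\phi^{T}\psi$ replaced by $\phi^{T}K\psi$ for the K-MAC and by $\phi^{T}M\psi$ for the M-MAC, where K and M are the partitions of the global stiffness and mass matrices described above.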
28

Zuenko, Alexander A., Olga V. Fridman and Olga N. Zuenko. "An approach to finding a global optimum in constrained clustering tasks involving the assessments of several experts". Transaction Kola Science Centre 12, no. 5-2021 (27.12.2021): 75–90. http://dx.doi.org/10.37614/2307-5252.2021.5.12.007.

Abstract:
An approach to solving the constrained clustering problem has been developed, based on the aggregation of data obtained by having several independent experts evaluate the characteristics of the clustered objects, and on the analysis of alternative clustering variants by constraint programming methods using original heuristics. The clustered objects are represented as multisets, which makes it possible to use appropriate methods for the aggregation of expert opinions. It is proposed to solve the constrained clustering problem as a constraint satisfaction problem. The main attention is paid to reducing the number of, and simplifying, the constraints of the constraint satisfaction problem at the stage of its formalization. Within the framework of the approach, we have created: a) a method for estimating the optimal value of the objective function by hierarchical clustering of multisets, taking into account a priori constraints of the subject domain, and b) a method for generating additional constraints on the desired solution in the form of “smart tables”, based on the obtained estimate. The approach allows us to find the best partition in problems of the class under consideration, which are characterized by high dimensionality.
29

Guo, Y. T., X. M. Zhang, T. F. Long, W. L. Jiao, G. J. He, R. Y. Yin and Y. Y. Dong. "CHINA FOREST COVER EXTRACTION BASED ON GOOGLE EARTH ENGINE". ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences XLII-3/W10 (8.02.2020): 855–62. http://dx.doi.org/10.5194/isprs-archives-xlii-3-w10-855-2020.

Abstract:
Forest cover rate is the principal index reflecting the forest amount of a nation or region. In view of the difficulty of accurately calculating large-scale forest area by traditional statistical survey methods, it is proposed to extract China's forest area based on the Google Earth Engine platform. Nine different random forest classifiers, trained with sufficient samples selected through the Google Earth software, are applied to their corresponding zones. Using Landsat 8 surface reflectance data from 2018 and the modified forest partition map, China's forest cover is generated on the Google Earth Engine platform. The accuracy of the resulting forest coverage reaches 89.08%, while the accuracies of the University of Maryland's Global Forest Change dataset and Japan's ALOS Forest/Non-Forest product are 87.78% and 84.57%, respectively. Moreover, the precisions for tropical/subtropical forest, temperate coniferous forest and nonforest regions are 83.25%, 87.94% and 97.83%, higher than those of the other products. Our results show that, by means of the random forest algorithm and enough samples, tropical and subtropical broadleaf forest, temperate coniferous forest and nonforest partitions can be extracted more accurately. The computation of forest cover shows that China had a forest area of 220.42 million hectares in 2018.
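The per-zone classification step can be sketched generically in Python (scikit-learn stands in for Google Earth Engine's random forest here; band names and sample arrays are illustrative assumptions, not the paper's setup):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    BANDS = ["B2", "B3", "B4", "B5", "B6", "B7"]   # illustrative Landsat 8 bands

    def train_zone_classifier(X_train, y_train, n_trees=100):
        # X_train: (n_samples, len(BANDS)) reflectances; y_train: 1 = forest, 0 = nonforest.
        clf = RandomForestClassifier(n_estimators=n_trees, random_state=0)
        clf.fit(X_train, y_train)
        return clf

    def classify_zone(clf, reflectance):
        # reflectance: (H, W, len(BANDS)) image -> (H, W) forest/nonforest mask.
        flat = reflectance.reshape(-1, reflectance.shape[-1])
        return clf.predict(flat).reshape(reflectance.shape[:2])

One classifier per partition zone, as in the paper, would simply mean training this function nine times on zone-specific samples.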
30

Lefèvre, Sébastien, David Sheeren and Onur Tasar. "A Generic Framework for Combining Multiple Segmentations in Geographic Object-Based Image Analysis". ISPRS International Journal of Geo-Information 8, no. 2 (30.01.2019): 70. http://dx.doi.org/10.3390/ijgi8020070.

Abstract:
The Geographic Object-Based Image Analysis (GEOBIA) paradigm relies strongly on the segmentation concept, i.e., partitioning of an image into regions or objects that are then further analyzed. Segmentation is a critical step, for which a wide range of methods, parameters and input data are available. To reduce the sensitivity of the GEOBIA process to the segmentation step, here we consider that a set of segmentation maps can be derived from remote sensing data. Inspired by the ensemble paradigm that combines multiple weak classifiers to build a strong one, we propose a novel framework for combining multiple segmentation maps. The combination leads to a fine-grained partition of segments (super-pixels) that is built by intersecting individual input partitions, and each segment is assigned a segmentation confidence score that relates directly to the local consensus between the different segmentation maps. Furthermore, each input segmentation can be assigned some local or global quality score based on expert assessment or automatic analysis. These scores are then taken into account when computing the confidence map that results from the combination of the segmentation processes. This means the process is less affected by incorrect segmentation inputs either at the local scale of a region, or at the global scale of a map. In contrast to related works, the proposed framework is fully generic and does not rely on specific input data to drive the combination process. We assess its relevance through experiments conducted on ISPRS 2D Semantic Labeling. Results show that the confidence map provides valuable information that can be produced when combining segmentations, and fusion at the object level is competitive w.r.t. fusion at the pixel or decision level.
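A minimal NumPy sketch of the intersection step (the confidence proxy below, the fraction of each input region covered by the super-pixel, is our illustration of "local consensus", not the paper's exact score):

    import numpy as np

    def combine_segmentations(label_maps):
        # Intersect M label maps: pixels sharing the same tuple of input
        # labels form one super-pixel.
        stack = np.stack(label_maps)                     # (M, H, W)
        keys = stack.reshape(len(label_maps), -1).T      # label tuple per pixel
        _, sp = np.unique(keys, axis=0, return_inverse=True)
        sp = sp.reshape(label_maps[0].shape)
        confidence = np.zeros(sp.max() + 1)
        for s in range(sp.max() + 1):
            mask = sp == s
            # For each input map, how much of its region does this super-pixel cover?
            ratios = [mask.sum() / (m == m[mask][0]).sum() for m in label_maps]
            confidence[s] = np.mean(ratios)              # 1.0 = full consensus
        return sp, confidence

When all input segmentations agree on a region, the super-pixel coincides with it and the score is 1; heavy fragmentation across inputs drives the score toward 0.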
31

LEE, BEN, and ALI R. HURSON. "A STRATEGY FOR SCHEDULING PARTIALLY ORDERED PROGRAM GRAPHS ONTO MULTICOMPUTERS". Parallel Processing Letters 05, no. 04 (December 1995): 575–86. http://dx.doi.org/10.1142/s0129626495000515.

Abstract:
The issue of scalability is key to the success of massively parallel processing. Due to their distributed nature, message-passing multicomputers are appropriate for achieving scalable performance. However, the message-passing model lacks programmability due to the difficulties programmers encounter in partitioning and scheduling the computation over the processors and in establishing efficient inter-processor communication in the user code. Therefore, this paper presents a compile-time scheduling heuristic, called BLS, that maps programs onto the processors of a message-passing multicomputer. In contrast to other proposed methods, BLS takes a more global approach in an attempt to balance the tradeoff between exploiting parallelism and reducing communication overhead. To evaluate the effectiveness of BLS, simulation studies of scheduling SISAL programs are presented.
32

Natarajan, Sundararajan. "On the application of the partition of unity method for nonlocal response of low-dimensional structures". Journal of the Mechanical Behavior of Materials 23, no. 5-6 (1.12.2014): 153–68. http://dx.doi.org/10.1515/jmbm-2014-0017.

Abstract:
The main objectives of the paper are to (1) present an overview of nonlocal integral elasticity and the Aifantis gradient elasticity theory and (2) discuss the application of partition of unity methods to study the response of low-dimensional structures. We present different choices of approximation functions for gradient elasticity, namely Lagrange interpolants, moving least-squares approximants and non-uniform rational B-splines. Next, we employ these approximation functions to study the response of nanobeams based on the Euler-Bernoulli and Timoshenko theories as well as to study nanoplates based on first-order shear deformation theory. The response of nanobeams and nanoplates is studied using Eringen's nonlocal elasticity theory. The influence of the nonlocal parameter, the beam and plate aspect ratio and the boundary conditions on the global response is numerically studied. The influence of a crack on the axial vibration and buckling characteristics of nanobeams is also numerically studied.
33

Konforte, Danijela, Jennifer L. Shea, Lianna Kyriakopoulou, David Colantonio, Ashley H. Cohen, Julie Shaw, Dana Bailey, Man Khun Chan, David Armbruster and Khosrow Adeli. "Complex Biological Pattern of Fertility Hormones in Children and Adolescents: A Study of Healthy Children from the CALIPER Cohort and Establishment of Pediatric Reference Intervals". Clinical Chemistry 59, no. 8 (1.08.2013): 1215–27. http://dx.doi.org/10.1373/clinchem.2013.204123.

Abstract:
BACKGROUND Pediatric endocrinopathies are commonly diagnosed and monitored by measuring hormones of the hypothalamic-pituitary-gonadal axis. Because growth and development can markedly influence normal circulating concentrations of fertility hormones, accurate reference intervals established on the basis of a healthy, nonhospitalized pediatric population and that reflect age-, gender-, and pubertal stage–specific changes are essential for test result interpretation. METHODS Healthy children and adolescents (n = 1234) were recruited from a multiethnic population as part of the CALIPER study. After written informed parental consent was obtained, participants filled out a questionnaire including demographic and pubertal development information (assessed by self-reported Tanner stage) and provided a blood sample. We measured 7 fertility hormones including estradiol, testosterone (second generation), progesterone, sex hormone–binding globulin, prolactin, follicle-stimulating hormone, and luteinizing hormone by use of the Abbott Architect i2000 analyzer. We then used these data to calculate age-, gender-, and Tanner stage–specific reference intervals according to Clinical Laboratory Standards Institute C28-A3 guidelines. RESULTS We observed a complex pattern of change in each analyte concentration from the neonatal period to adolescence. Consequently, many age and sex partitions were required to cover the changes in most fertility hormones over this period. An exception to this was prolactin, for which no sex partition and only 3 age partitions were necessary. CONCLUSIONS This comprehensive database of pediatric reference intervals for fertility hormones will be of global benefit and should lead to improved diagnosis of pediatric endocrinopathies. The new database will need to be validated in local populations and for other immunoassay platforms as recommended by the Clinical Laboratory Standards Institute.
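The interval computation itself is simple; a generic Python sketch of the nonparametric approach used in CLSI C28-A3-style studies (central 95% of healthy values within each age/sex/Tanner partition; the 120-subject minimum is the guideline's recommendation):

    import numpy as np

    def reference_interval(values, low=2.5, high=97.5):
        # Nonparametric central 95% interval of a healthy reference sample.
        v = np.asarray(values, dtype=float)
        return np.percentile(v, low), np.percentile(v, high)

    def intervals_by_partition(records, min_n=120):
        # records: iterable of (partition_key, value) pairs, e.g.
        # (("F", "12-14y", "Tanner 3"), 4.2); one interval per partition.
        groups = {}
        for key, value in records:
            groups.setdefault(key, []).append(value)
        return {k: reference_interval(v)
                for k, v in groups.items() if len(v) >= min_n}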
APA, Harvard, Vancouver, ISO, etc. styles
34

Valori, Gherardo, Pascal Démoulin, Etienne Pariat, Anthony Yeates, Kostas Moraitis and Luis Linan. "Additivity of relative magnetic helicity in finite volumes". Astronomy & Astrophysics 643 (October 28, 2020): A26. http://dx.doi.org/10.1051/0004-6361/202038533.

Full text source
Abstract:
Context. Relative magnetic helicity is conserved by magneto-hydrodynamic evolution even in the presence of moderate resistivity. For that reason, it is often invoked as the most relevant constraint on the dynamical evolution of plasmas in complex systems, such as solar and stellar dynamos, photospheric flux emergence, solar eruptions, and relaxation processes in laboratory plasmas. However, such studies often indirectly imply that relative magnetic helicity in a given spatial domain can be algebraically split into the helicity contributions of the composing subvolumes, in other words that it is an additive quantity. A limited number of very specific applications have shown that this is not the case. Aims. Progress in understanding the nonadditivity of relative magnetic helicity requires removal of restrictive assumptions in favor of a general formalism that can be used in both theoretical investigations and numerical applications. Methods. We derive the analytical gauge-invariant expression for the partition of relative magnetic helicity between contiguous finite volumes, without any assumptions on either the shape of the volumes and interface, or the employed gauge. Results. We prove the nonadditivity of relative magnetic helicity in finite volumes in the most general, gauge-invariant formalism, and verify this numerically. We adopt more restrictive assumptions to derive known specific approximations, which yields a unified view of the additivity issue. As an example, the case of a flux rope embedded in a potential field shows that the nonadditivity term in the partition equation is, in general, non-negligible. Conclusions. The nonadditivity of relative magnetic helicity can potentially be a serious impediment to the application of relative helicity conservation as a constraint on the complex dynamics of magnetized plasmas. The relative helicity partition formula can be applied to numerical simulations to precisely quantify the effect of nonadditivity on global helicity budgets of complex physical processes.
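For readers unfamiliar with the quantity involved, the gauge-invariant relative helicity the paper builds on is the Finn-Antonsen form; the schematic below states the partition issue, with the explicit form of the nonadditive term left to the paper:

```latex
% Finn & Antonsen (1985) relative helicity of a finite volume V,
% gauge-invariant provided B and the reference field B_p share the
% normal component on the boundary of V:
\[
  H_V \;=\; \int_V \left(\mathbf{A} + \mathbf{A}_p\right) \cdot
            \left(\mathbf{B} - \mathbf{B}_p\right) \,\mathrm{d}V .
\]
% Splitting V into contiguous subvolumes V_a and V_b, additivity would
% require H_V = H_{V_a} + H_{V_b}; the paper shows that, in general,
\[
  H_V \;=\; H_{V_a} + H_{V_b} + \Delta H , \qquad \Delta H \neq 0 ,
\]
% where the nonadditive term \Delta H collects gauge-invariant
% contributions tied to the interface between the subvolumes.
```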
APA, Harvard, Vancouver, ISO, etc. styles
35

Yang, Shiqiang, Qi Li, Duo He, Jinhua Wang and Dexin Li. "Global Correlation Enhanced Hand Action Recognition Based on NST-GCN". Electronics 11, no. 16 (August 11, 2022): 2518. http://dx.doi.org/10.3390/electronics11162518.

Full text source
Abstract:
Hand action recognition is an important part of intelligent monitoring, human–computer interaction, robotics and other fields. Compared with other methods, hand action recognition using skeleton information can ignore the errors caused by complex backgrounds and changes in movement speed, and its computational cost is relatively small. The spatial-temporal graph convolution network (ST-GCN) model performs excellently in skeleton-based action recognition. To address the weak connection between the root joint and more distant joints, which degrades hand action recognition, this paper first replaces the standard convolution in the temporal dimension with dilated convolution to process the time-series features of the hand action video; this enlarges the temporal receptive field and strengthens the connections between features. Then, by adding non-physical connections, the fingertip joints are linked to the finger-root joints, and a new partition strategy is adopted to strengthen the hand-level correlation of each joint's information, which helps the network extract the spatial-temporal features of the hand. The improved model is tested on public datasets and in real scenarios. The experimental results show that, compared with the original model, the 14-category top-1 and 28-category top-1 metrics on the dataset improve by 4.82% and 6.96%, respectively. In the real scene, recognition is better for categories with large changes in hand movement and poorer for categories with similar movement trends, so there is still room for improvement.
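A minimal sketch of the temporal-dilation idea (kernel size, dilation and joint count are illustrative; this is not the paper's implementation):

```python
import torch
import torch.nn as nn

class DilatedTemporalConv(nn.Module):
    """Temporal convolution over skeleton sequences (N, C, T, V) with
    dilation to enlarge the temporal receptive field, as the abstract
    describes replacing the standard ST-GCN temporal convolution."""
    def __init__(self, channels, kernel_t=9, dilation=2):
        super().__init__()
        pad_t = (kernel_t - 1) * dilation // 2        # keep T unchanged
        self.conv = nn.Conv2d(channels, channels,
                              kernel_size=(kernel_t, 1),
                              padding=(pad_t, 0),
                              dilation=(dilation, 1))

    def forward(self, x):                             # x: (N, C, T, V)
        return self.conv(x)

x = torch.randn(2, 64, 100, 21)                       # 21 hand joints (illustrative)
print(DilatedTemporalConv(64)(x).shape)               # torch.Size([2, 64, 100, 21])
```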
APA, Harvard, Vancouver, ISO, etc. styles
36

Tzirakis, Konstantinos, Yiannis Kamarianakis, Nikolaos Kontopodis and Christos V. Ioannou. "Classification of Blood Rheological Models through an Idealized Symmetrical Bifurcation". Symmetry 15, no. 3 (March 2, 2023): 630. http://dx.doi.org/10.3390/sym15030630.

Full text source
Abstract:
The assumed rheological behavior of blood influences the hemodynamic characteristics of numerical blood flow simulations. Until now, alternative rheological specifications have been utilized, with uncertain implications for the results obtained. This work aims to group sixteen blood rheological models into homogeneous clusters by exploiting data generated from numerical simulations on an idealized symmetrical arterial bifurcation. Blood flow is assumed to be pulsatile and is simulated using a commercial finite volume solver. An appropriate mesh convergence study is performed, and all results are collected at three different time instants throughout the cardiac cycle: at peak systole, early diastole, and late diastole. Six hemodynamic variables are computed: the time-averaged wall shear stress, oscillatory shear index, relative residence time, global and local non-Newtonian importance factors, and non-Newtonian effect factor. The resulting data are analyzed using hierarchical agglomerative clustering algorithms, which are typical unsupervised classification methods. Interestingly, the rheological models can be partitioned into three homogeneous groups, whereas three specifications appear as outliers that do not belong to any partition. Our findings suggest that models defined in a similar manner from a mathematical perspective may behave substantially differently in terms of the data they produce. On the other hand, models characterized by different mathematical formulations may belong to the same statistical group (cluster) and can thus be considered interchangeably.
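The clustering step described here is standard hierarchical agglomerative clustering; a minimal sketch with synthetic feature vectors standing in for the six hemodynamic variables (Ward linkage and three clusters are assumptions for illustration, not details from the abstract):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Synthetic stand-in: 16 rheological models x 6 hemodynamic features
# (e.g., TAWSS, OSI, RRT, importance/effect factors), standardized per column.
X = rng.normal(size=(16, 6))
X = (X - X.mean(0)) / X.std(0)

Z = linkage(X, method="ward")                     # agglomerative, Ward criterion
labels = fcluster(Z, t=3, criterion="maxclust")   # cut the dendrogram into 3 groups
for k in np.unique(labels):
    print(f"cluster {k}: models {np.where(labels == k)[0].tolist()}")
```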
APA, Harvard, Vancouver, ISO, etc. styles
37

Liu, Yingli, Minghua Hu, Jianan Yin, Jiaming Su, Shuce Wang and Zheng Zhao. "Optimization Design and Performance Evaluation of U-Shaped Area Operation Procedures in Complex Apron". Aerospace 10, no. 2 (February 9, 2023): 161. http://dx.doi.org/10.3390/aerospace10020161.

Full text source
Abstract:
In view of the U-shaped apron structure common at large- and medium-sized airports worldwide, this study considered the optimization design and performance evaluation of U-shaped apron operation procedures. First, by analyzing the physical structure and traffic operation characteristics of the U-shaped area, exclusive, partition-shared, and global-shared operation procedures were designed, and differentiated apron-operation rules and traffic models were constructed for each type of procedure. Then, from the perspectives of safety, efficiency, and environmental protection, a multi-dimensional evaluation index system of U-shaped area operation performance was established, and a classification measurement and comprehensive evaluation method based on the CRITIC weighting technique is proposed. Finally, a traffic simulation model was established based on airport network topology modeling. We used Monte Carlo methods for the simulation in Python 3.6, and the experimental results show that, under high-density traffic, the global-shared procedure significantly outperforms the exclusive and partition-shared procedures: apron operation capacity increased by 14.8% and 5.0%, respectively; the probability of aircraft conflict decreased by 32.2% and 11.8%, respectively; the time to resolve a single conflict decreased by 16.1 s and 10.6 s, respectively; the average resource utilization in each U-shaped area increased by 66% and 25%, respectively; the average daily carbon emissions of a single aircraft were reduced by 16.7 kg and 11.0 kg; and the average daily fuel consumption of a single aircraft was reduced by 3.6 kg and 2.4 kg, respectively. The proposed method is scientific and effective and can provide theoretical and methodological support for optimizing the configuration of surface operation modes at complex airports and for improving flight operation efficiency.
APA, Harvard, Vancouver, ISO, etc. styles
38

Tang, Yu, Sheng Wang, Yuancai Huang, Xiaokai Zhao, Weinan Zhao, Yitao Duan and Xu Wang. "Retrieval-Based Factorization Machines for Human Click Behavior Prediction". Computational Intelligence and Neuroscience 2022 (November 18, 2022): 1–15. http://dx.doi.org/10.1155/2022/1105048.

Full text source
Abstract:
Human click behavior prediction is crucial for recommendation scenarios such as online commodity or advertisement recommendation, as it helps improve the quality of services and user satisfaction. In recommender systems, the click-through rate (CTR) is used to estimate the probability that a user will click on a recommended candidate. Many methods have been proposed to predict CTR and have achieved good results. However, they usually optimize their parameters through a global objective function, such as minimizing logloss or root mean square error (RMSE) over all training samples; they thus capture global knowledge of user click behavior but ignore local information. In this work, we propose a novel approach of retrieval-based factorization machines (RFM) for CTR prediction, which effectively predicts CTR by combining the global knowledge learned by the FM method with neighbor-based local information. We also leverage a clustering technique to partition the large training set into multiple small regions for efficient retrieval of neighbors. We evaluate our RFM model on three public datasets. The experimental results show that RFM performs better than other models in terms of RMSE, area under the ROC curve (AUC), and accuracy. Moreover, it is efficient because of its small number of model parameters.
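As a reminder of the global component involved, a second-order factorization machine scores a feature vector as a linear term plus pairwise interactions expressed through factor vectors; per the abstract, RFM then augments this score with neighbors retrieved from the query's cluster. A minimal sketch of the FM score only (random parameters for illustration):

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order factorization machine score for one feature vector x:
    y = w0 + <w, x> + 1/2 * sum_f [ (sum_i V_if x_i)^2 - sum_i V_if^2 x_i^2 ]."""
    lin = w0 + w @ x
    s = V.T @ x                                 # factor sums, shape (k,)
    pair = 0.5 * np.sum(s**2 - (V**2).T @ x**2) # pairwise interaction term
    return lin + pair

rng = np.random.default_rng(0)
d, k = 8, 4                                     # feature and factor dimensions
x = rng.random(d)
print(fm_predict(x, 0.1, rng.normal(size=d), rng.normal(size=(d, k))))
```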
APA, Harvard, Vancouver, ISO, etc. styles
39

Jiang, Tao, Ting Chen, Xuexue Meng, Han Meng and Min Xiao. "A new method for calculating p-value under unconditional exact test". Thermal Science 25, no. 3 Part B (2021): 2385–97. http://dx.doi.org/10.2298/tsci200220129j.

Full text source
Abstract:
An unconditional exact test is a classic method for testing the significance of the difference between two independent binomial proportions or multinomial distributions. The p-value based on the unconditional exact test is computed by maximizing the probability of the tail region over the nuisance parameter. The grid search method and the polynomial method can find this maximum given a sufficiently fine partition of the parameter space, but they require a rather long computing time and become computationally intensive for studies beyond two groups. In this paper, we propose a new method, based on a fixed-point iterative algorithm, for obtaining the global maximum with reduced computing time. Additionally, both simulation and experiment indicate that this method is more competitive than the grid search and polynomial methods while guaranteeing accuracy.
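A hedged sketch of the quantity being maximized, using a grid-search baseline with the difference in sample proportions as the test statistic (one common choice; the paper's contribution is replacing this exhaustive search with a faster fixed-point iteration):

```python
import numpy as np
from scipy.stats import binom

def unconditional_p(x1, n1, x2, n2, grid=2001):
    """Grid-search sketch of the unconditional exact p-value for two binomials:
    p = sup_pi sum over tables (y1, y2) at least as extreme as the observed one
    of P(y1; n1, pi) * P(y2; n2, pi)."""
    t_obs = abs(x1 / n1 - x2 / n2)
    y1 = np.arange(n1 + 1)[:, None]                 # all possible tables
    y2 = np.arange(n2 + 1)[None, :]
    extreme = np.abs(y1 / n1 - y2 / n2) >= t_obs - 1e-12
    best = 0.0
    for pi in np.linspace(1e-6, 1 - 1e-6, grid):    # nuisance parameter sweep
        p = (binom.pmf(y1, n1, pi) * binom.pmf(y2, n2, pi))[extreme].sum()
        best = max(best, p)
    return best

print(unconditional_p(7, 12, 1, 10))
```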
APA, Harvard, Vancouver, ISO, etc. styles
40

Hanifa, Tsamara Nurul, and Erlinda Muslim. "Strategic Design to Increase MSME Intentions in Exporting with Standards for Global Market Needs". Jurnal Manajemen Teknologi 21, no. 3 (2022): 264–82. http://dx.doi.org/10.12695/jmt.2022.21.3.3.

Full text source
Abstract:
Abstract. Although Micro, Small, and Medium Enterprises (MSMEs) account for 99.99% of Indonesia's business entities, their contribution to exports is relatively low at 14.37%, while large business entities contribute 85.63%. Indonesia is predicted to emerge from the middle-income-trap phenomenon in 2036, implying that Indonesia will become a high-income country with the fourth-largest GDP in the world. This demonstrates that MSMEs have enormous potential and opportunities to expand international trade. The many factors hindering MSMEs from exporting, together with high export standards, are among the critical reasons for MSMEs' low intention to export and their choice to focus on domestic selling. This study used the Principal Component Analysis (PCA) and Interpretive Structural Modeling (ISM) methods to identify the main factors behind this low export intention, using PCA to reduce the data into perceived barrier and enabler factors. The ISM method helped produce policy formulation and strategic planning by analyzing the relationships between factors. Thirteen main barrier and six main enabler factors were obtained; from these, the action plan was designed through strategic planning, ordered by the priority of barrier and enabler factors as determined by the partition levels in the ISM model and agreed upon in discussions with experts. Keywords: export; export barriers; export enablers; MSMEs; Indonesian MSEs; principal component analysis (PCA); interpretive structural modeling (ISM)
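The ISM level partition mentioned here follows a standard algorithm: a factor is assigned to the current level when its reachability set equals the intersection of its reachability and antecedent sets. A minimal sketch on a hypothetical, already transitively closed reachability matrix:

```python
import numpy as np

def ism_levels(reach):
    """ISM level partition: place factor i at the current level when its
    reachability set R(i) satisfies R(i) == R(i) & A(i), where A(i) is the
    antecedent set. 'reach' is the final boolean reachability matrix."""
    n = reach.shape[0]
    remaining = set(range(n))
    levels = []
    while remaining:
        level = []
        for i in remaining:
            R = {j for j in remaining if reach[i, j]} | {i}
            A = {j for j in remaining if reach[j, i]} | {i}
            if R == R & A:
                level.append(i)
        levels.append(level)
        remaining -= set(level)
    return levels

# Toy 4-factor example (hypothetical chain: factor 0 drives 1 drives 2 drives 3).
M = np.array([[1, 1, 1, 1],
              [0, 1, 1, 1],
              [0, 0, 1, 1],
              [0, 0, 0, 1]], dtype=bool)
print(ism_levels(M))      # top level extracted first: [[3], [2], [1], [0]]
```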
APA, Harvard, Vancouver, ISO, etc. styles
41

Enumula, Mahesh, M. Giri and V. K. Sharma. "A New Efficient Forgery Detection Method using Scaling, Binning, Noise Measuring Techniques and Artificial Intelligence (AI)". International Journal of Innovative Technology and Exploring Engineering 12, no. 9 (August 30, 2023): 17–21. http://dx.doi.org/10.35940/ijitee.i9703.0812923.

Full text source
Abstract:
New and constantly updated editing tools on the market make it easy to forge images. In this research paper we propose a new forgery detection technique based on estimating noise at various scales of the input image: noise alters the frequency content of an image, the noise signal is correlated with the original image, and in compressed images the quantization-level frequencies also change because of noise. We partition the input image into M x N blocks and process the resized blocks further, taking image color into account; the noise value of each block is evaluated at both the local and the global level. For each color channel of the input image, local and global noise levels are estimated and compared using a binning method. A heat map of each block and each color channel is also measured, and all these values are stored in bins. Finally, the average mean noise value is calculated from all noise values and used to decide whether the input image is forged. The performance of the proposed method is compared with existing methods.
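A hedged sketch of the block-wise noise-measurement idea as we read it, using Immerkaer's Laplacian-based noise estimator as a stand-in for the paper's unspecified estimator (block size, data and threshold are illustrative):

```python
import numpy as np
from scipy.signal import convolve2d

# Immerkaer's fast noise estimator: sigma is proportional to the mean
# absolute response of a Laplacian-like kernel over the image.
LAP = np.array([[1, -2, 1], [-2, 4, -2], [1, -2, 1]], float)

def noise_sigma(img):
    h, w = img.shape
    conv = convolve2d(img, LAP, mode="valid")
    return np.sqrt(np.pi / 2) / (6 * (h - 2) * (w - 2)) * np.abs(conv).sum()

def block_noise_map(img, bs=64):
    H, W = img.shape
    return np.array([[noise_sigma(img[r:r + bs, c:c + bs])
                      for c in range(0, W - bs + 1, bs)]
                     for r in range(0, H - bs + 1, bs)])

rng = np.random.default_rng(0)
img = rng.normal(128, 2, (256, 256))            # synthetic "clean" background
img[:64, :64] += rng.normal(0, 8, (64, 64))     # one noisier (possibly spliced) block
local = block_noise_map(img)
print("global sigma:", round(noise_sigma(img), 2))
print("suspicious blocks:", np.argwhere(local > 2 * np.median(local)))
```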
APA, Harvard, Vancouver, ISO, etc. styles
42

Lu, Pengli, Junxia Yang and Teng Zhang. "Identifying influential nodes in complex networks based on network embedding and local structure entropy". Journal of Statistical Mechanics: Theory and Experiment 2023, no. 8 (August 1, 2023): 083402. http://dx.doi.org/10.1088/1742-5468/acdceb.

Full text source
Abstract:
Abstract. The identification of influential nodes in complex networks remains a crucial research direction, as it paves the way for analyzing and controlling information diffusion. Current network embedding algorithms can represent high-dimensional, sparse networks in low-dimensional, dense vector spaces, preserving the network structure with high accuracy. In this work, a novel centrality approach based on network embedding and local structure entropy, called ELSEC, is proposed to capture richer information for evaluating the importance of nodes from both local and global perspectives. In short, the local structure entropy is first used to measure the self-importance of nodes. Second, the network is mapped to a vector space using the Node2vec network embedding algorithm so that Manhattan distances between nodes can be calculated, and the global importance of nodes is defined by combining correlation coefficients. To reveal the effectiveness of ELSEC, we select three types of key-node identification algorithms as contrast approaches, including methods based on node centrality, optimal-decycling-based algorithms and graph-partition-based methods, and conduct experiments on ten real networks for correlation, ranking monotonicity, accuracy of high-ranking nodes and the size of the giant connected component. Experimental results show that the ELSEC algorithm has excellent ability to identify influential nodes.
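One common form of local structure entropy, shown here as a hedged illustration of the local term only (the paper's exact definition, and the Node2vec-based global term, are not reproduced):

```python
import math
import networkx as nx

def local_structure_entropy(G, i):
    """Shannon entropy of the degree distribution over node i's closed
    neighborhood, p_j = d_j / sum of neighborhood degrees. This is one
    common definition; the paper's may differ in detail."""
    nbrs = list(G.neighbors(i)) + [i]
    degs = [G.degree(j) for j in nbrs]
    total = sum(degs)
    return -sum((d / total) * math.log(d / total) for d in degs)

G = nx.karate_club_graph()
scores = {i: local_structure_entropy(G, i) for i in G}
top = sorted(scores, key=scores.get, reverse=True)[:5]
print("top-5 by local structure entropy:", top)
```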
APA, Harvard, Vancouver, ISO, etc. styles
43

Yu, Xiang, Shui-Hua Wang, Juan Manuel Górriz, Xian-Wei Jiang, David S. Guttery and Yu-Dong Zhang. "PeMNet for Pectoral Muscle Segmentation". Biology 11, no. 1 (January 14, 2022): 134. http://dx.doi.org/10.3390/biology11010134.

Full text source
Abstract:
As an important imaging modality, mammography is considered the global gold standard for early detection of breast cancer. Computer-Aided Diagnosis (CAD) systems have played a crucial role in facilitating quicker diagnostic procedures, which could otherwise take weeks if only radiologists were involved. In some of these CAD systems, pectoral muscle segmentation is required to separate the breast region from the pectoral muscle for specific analysis tasks, so accurate and efficient breast pectoral muscle segmentation frameworks are in high demand. Here, we propose a novel deep learning framework, code-named PeMNet, for pectoral muscle segmentation in mammography images. In the proposed PeMNet, we integrated a novel attention module called the Global Channel Attention Module (GCAM), which can effectively improve the segmentation performance of Deeplabv3+ with minimal parameter overhead. In GCAM, channel attention maps (CAMs) are first extracted by concatenating feature maps after parallel global average pooling and global maximum pooling operations. The CAMs are then refined and scaled up by a multi-layer perceptron (MLP) for elementwise multiplication with the CAMs of the next feature level. By iteratively repeating this procedure, the global CAMs (GCAMs) are formed and multiplied elementwise with the final feature maps to produce the final segmentation. In this way, CAMs from the early stages of a deep convolutional network are effectively passed on to later stages, leading to better information usage. Experiments on a dataset merged from two datasets, INbreast and OPTIMAM, showed that PeMNet greatly outperformed state-of-the-art methods, achieving an IoU of 97.46%, global pixel accuracy of 99.48%, Dice similarity coefficient of 96.30%, and Jaccard index of 93.33%.
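A minimal sketch of the channel-attention pattern the abstract describes: parallel global average and maximum pooling, MLP refinement, and elementwise channel scaling (layer sizes are illustrative; this is not the PeMNet implementation):

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel attention in the GCAM spirit: pooled channel descriptors are
    refined by an MLP and used to rescale each channel of the input."""
    def __init__(self, c, reduction=8):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * c, c // reduction),
                                 nn.ReLU(),
                                 nn.Linear(c // reduction, c))

    def forward(self, x):                              # x: (N, C, H, W)
        avg = x.mean(dim=(2, 3))                       # global average pooling
        mx = x.amax(dim=(2, 3))                        # global maximum pooling
        cam = torch.sigmoid(self.mlp(torch.cat([avg, mx], dim=1)))
        return x * cam[:, :, None, None]               # scale each channel

x = torch.randn(2, 32, 64, 64)
print(ChannelAttention(32)(x).shape)                   # torch.Size([2, 32, 64, 64])
```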
APA, Harvard, Vancouver, ISO, etc. styles
44

Ahmad, A., P. Claudio, A. Alizadeh Naeini and G. Sohn. "WI-FI RSS FINGERPRINTING FOR INDOOR LOCALIZATION USING AUGMENTED REALITY". ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences V-4-2020 (August 3, 2020): 57–64. http://dx.doi.org/10.5194/isprs-annals-v-4-2020-57-2020.

Full text source
Abstract:
Abstract. Indoor localization has attracted the attention of researchers for wide applications in areas like construction, facility management, industry, logistics, and health. The Received Signal Strength (RSS) based fingerprinting method is widely adopted because it has a lower cost than other methods. RSS is a measurement of the power present in a received radio signal. While this fingerprinting method is very popular, a significant amount of effort is required to collect fingerprints for an indoor space. In this paper, we propose an RSS fingerprinting method using Augmented Reality (AR) that does not rely on external sensors, resulting in ease of use and maintenance. This method uses spatial mapping techniques to help align the floor plan of existing buildings; after the alignment, we map local device coordinates to global coordinates. We then partition the space into equally spaced reference points for RSS fingerprint collection. We developed an application for Microsoft HoloLens to align the floor plan and collect fingerprints at the reference points, and we tested the collected fingerprints with existing RSS-based indoor localization methods for accuracy and performance.
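For context, localization against an RSS fingerprint database is often done at query time with Weighted K-Nearest Neighbors; a minimal sketch with a toy database (the paper's contribution is the AR-assisted collection, not this lookup):

```python
import numpy as np

def wknn_locate(rss, fp_rss, fp_xy, k=3, eps=1e-6):
    """Weighted K-Nearest-Neighbors fingerprinting: find the k reference
    points whose stored RSS vectors are closest to the observed one and
    average their coordinates with inverse-distance weights."""
    d = np.linalg.norm(fp_rss - rss, axis=1)        # distances in RSS space
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + eps)
    return (w[:, None] * fp_xy[idx]).sum(0) / w.sum()

# Toy fingerprint database: 4 reference points, 3 access points (dBm).
fp_rss = np.array([[-40, -70, -80], [-55, -55, -75],
                   [-70, -45, -65], [-80, -60, -50]], float)
fp_xy = np.array([[0, 0], [2, 0], [4, 0], [6, 0]], float)
print(wknn_locate(np.array([-52, -58, -74.0]), fp_rss, fp_xy))
```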
APA, Harvard, Vancouver, ISO, etc. styles
45

Lovelock, James. "Archer John Porter Martin CBE. 1 March 1910 — 28 July 2002". Biographical Memoirs of Fellows of the Royal Society 50 (January 2004): 157–70. http://dx.doi.org/10.1098/rsbm.2004.0012.

Full text source
Abstract:
We judge the worth of a scientist by the benefits he or she brings to science and society; by this measure Archer Martin was outstanding, and rightfully his contribution was recognized with a Nobel Prize. Scientific instruments and instrumental methods now come almost entirely from commercial sources and we take them for granted and often have little idea how they work. Archer Martin was of a different time when scientists would often devise their own new instruments, which usually they fully understood, and then they would use them to explore the world. The chromatographic methods and instruments Martin devised were at least as crucial in the genesis and development of molecular biology as were those from X-ray crystallography. Liquid partition chromatography, especially in its two-dimensional paper form, revealed the amino acid composition of proteins and the nucleic acid composition of DNA and RNA with a rapid and elegant facility. Gas chromatography (GC) enabled the accurate and rapid analysis of lipids, which previously had been painfully slow and little more than a greasy sticky confusion of beaker chemistry. Martin's instruments enabled progress in the sciences ranging from geophysics to biology, and without him we might have waited decades before another equivalent genius appeared. More than this, the environmental awareness that Rachel Carson gave us would never have solidified as it did without the evidence of global change measured by GC. This instrumental method provided accurate evidence about the ubiquity of pesticides and pollutants and later made us aware of the growing accumulation in the atmosphere of chlorinated fluorocarbons, nitrous oxide and other ozone-depleting chemicals. If all this were not enough to glorify Martin's partition chromatography, there is the undoubted fact that its simplicity, economy and exquisite resolving power transformed the chemical industry and made possible so many of the conveniences we now take for granted.
APA, Harvard, Vancouver, ISO, etc. styles
46

Li, Zhonghan, and Yongbo Zhang. "Constrained ESKF for UAV Positioning in Indoor Corridor Environment Based on IMU and WiFi". Sensors 22, no. 1 (January 5, 2022): 391. http://dx.doi.org/10.3390/s22010391.

Full text source
Abstract:
The indoor autonomous navigation of unmanned aerial vehicles (UAVs) is a current research hotspot. Unlike the broad outdoor environment, the indoor environment is unknown and complicated. Global Navigation Satellite System (GNSS) signals are easily blocked and reflected by complex indoor spatial features, making indoor positioning and navigation based on GNSS impossible. This article proposes a set of indoor corridor positioning methods based on the integration of WiFi and IMU. The zone-partition-based Weighted K Nearest Neighbors (WKNN) algorithm is used to achieve higher WiFi-based positioning accuracy. The WiFi-based and IMU-based methods are then fused using the Error-State Kalman Filter (ESKF) algorithm to realize higher positioning accuracy, and a probability-based optimization method is used for further improvement. After data fusion, the positioning accuracy increased by 51.09% compared to the IMU-based algorithm and by 66.16% compared to the WiFi-based algorithm; after optimization, it increased by a further 20.9% compared to the ESKF-based data fusion algorithm. All of these results prove that methods based on WiFi and IMU (low-cost sensors) can deliver high indoor positioning accuracy.
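A minimal linear-Kalman sketch of the fusion idea only (this is not the paper's constrained ESKF: there is no error state, attitude, or corridor constraint here, and all numbers are illustrative):

```python
import numpy as np

# IMU propagation predicts a 2-D position; a WiFi WKNN fix arrives as a
# direct position measurement and corrects the prediction.
x = np.array([0.0, 0.0])          # predicted position from IMU integration
P = np.diag([4.0, 4.0])           # its covariance (IMU drift, illustrative)
z = np.array([1.2, 0.4])          # WiFi position fix
R = np.diag([1.0, 1.0])           # WiFi measurement noise
H = np.eye(2)                     # measurement model: z = x + noise

S = H @ P @ H.T + R               # innovation covariance
K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
x = x + K @ (z - H @ x)           # corrected state
P = (np.eye(2) - K @ H) @ P       # corrected covariance
print("fused position:", x)
```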
APA, Harvard, Vancouver, ISO, etc. styles
47

Holiuk, M. I., O. M. Khotiaintseva, V. M. Khotiaintsev, A. V. Nosovskyi and V. I. Gulik. "The Optimization of Calculation Time and Statistical Error for the Radiation Shielding Properties Simulation of Containers for Storage of Spent Nuclear Fuel". Nuclear Power and the Environment 22, no. 3 (2021): 19–27. http://dx.doi.org/10.31717/2311-8253.21.3.3.

Full text source
Abstract:
Radiation protection is an important issue in the operation of nuclear power plants and artificial radioactive sources, including spent nuclear fuel storage facilities. Monte Carlo codes are effective instruments for calculating radiation shielding properties and radiation field characteristics for complex geometries. However, achieving a satisfactory statistical error when modeling the passage of neutrons and photons through biological shielding may require excessively long calculation times. To solve this problem, Monte Carlo codes use variance reduction methods that direct particles toward regions with detectors to improve statistical accuracy. Our paper presents the application of the weight-window-based variance reduction function of the Serpent Monte Carlo code, investigated on a simplified 2D model of the HI-STORM 190 spent nuclear fuel storage container. The simple variance reduction option with fixed Cartesian and cylindrical meshes was investigated for different numbers and sizes of mesh nodes, and the global variance reduction option with fixed Cartesian and cylindrical meshes was analyzed for the case of achieving satisfactory results over the entire simulated volume. For a qualitative assessment of the variance reduction function, we use the figure of merit (FOM) indicator proposed by the developers of the Serpent Monte Carlo code. It is shown that the variance reduction function leads to a significant decrease in statistical error and calculation time, and can therefore be useful for biological shielding calculations. We conclude that: the cylindrical mesh is less effective in terms of FOM than the Cartesian mesh; for both meshes a recommended node size can be found; adding azimuthal partitioning of the cylindrical mesh to the radial partitioning increases the FOM; and global variance reduction is useful for asymmetric biological shielding geometries, although the FOM decreases.
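The figure of merit mentioned here is the standard Monte Carlo efficiency indicator FOM = 1/(R^2 T), with R the relative statistical error of the tally and T the computing time; a tiny worked example with illustrative numbers:

```python
# Compare variance-reduction settings by figure of merit. Higher FOM means
# the variance reduction pays for its extra per-history cost.
runs = {"analog":         (0.20, 600.0),   # (relative error R, minutes T)
        "weight windows": (0.05, 900.0)}
for name, (R, T) in runs.items():
    fom = 1.0 / (R**2 * T)                 # FOM = 1 / (R^2 * T)
    print(f"{name:15s} FOM = {fom:.3f}")
```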
APA, Harvard, Vancouver, ISO, etc. styles
48

Yanez-Sierra, Jedidiah, Arturo Diaz-Perez and Victor Sosa-Sosa. "An Efficient Partition-Based Approach to Identify and Scatter Multiple Relevant Spreaders in Complex Networks". Entropy 23, no. 9 (September 15, 2021): 1216. http://dx.doi.org/10.3390/e23091216.

Full text source
Abstract:
One of the main problems in graph analysis is the correct identification of relevant nodes for spreading processes. Spreaders are crucial for accelerating or hindering information diffusion, increasing product exposure, controlling diseases and rumors, and more. Correct identification of spreaders is therefore a relevant task for optimally using the network structure and ensuring a more efficient flow of information. Additionally, network topology has proven to play a relevant role in spreading processes. In this sense, most of the existing methods based on local, global, or hybrid centrality measures select relevant nodes based only on their ranking values and do not intentionally focus on their distribution over the graph. In this paper, we propose a simple yet effective method that takes advantage of the underlying graph topology to guarantee that the selected nodes are not only relevant but also well scattered. Our proposal also suggests how to define the number of spreaders to select. The approach is composed of two phases: first, graph partitioning, and second, identification and distribution of relevant nodes. We have tested our approach by applying the SIR spreading model over nine real complex networks. The experimental results showed that the sets of relevant nodes identified by our approach are more influential and better scattered than those of several reference algorithms, including degree, closeness, betweenness, VoteRank, HybridRank, and IKS. The results further showed an improvement in propagation influence when combining our distribution strategy with classical metrics such as degree, outperforming computationally more complex strategies. Moreover, our proposal has good computational complexity and can be applied to large-scale networks.
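A hedged sketch of the two-phase idea (community detection and a degree criterion are illustrative stand-ins for the paper's specific partitioning and ranking choices):

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Phase 1: partition the graph; Phase 2: pick one relevant node per
# partition, so the selected spreaders are scattered across the topology.
G = nx.karate_club_graph()
communities = greedy_modularity_communities(G)
spreaders = [max(c, key=G.degree) for c in communities]
print("one spreader per partition:", spreaders)
```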
APA, Harvard, Vancouver, ISO, etc. styles
49

Tonial, MLS, HLR Silva, IJ Tonial, MC Costa, NJ Silva Júnior and JAF Diniz-Filho. "Geographical patterns and partition of turnover and richness components of beta-diversity in faunas from Tocantins river valley". Brazilian Journal of Biology 72, no. 3 (August 2012): 497–504. http://dx.doi.org/10.1590/s1519-69842012000300012.

Full text source
Abstract:
There has been resurging interest in patterns of β-diversity, especially in the mechanisms driving broad-scale, continental and global patterns, and in how partitioning β-diversity into richness (or nestedness) and turnover components can be linked with such mechanisms. Here we compared two recent methodologies for finding the richness and turnover components of β-diversity, using a large regional-scale dataset of mammal, bird, reptile and amphibian species found in seven regions of Central, North and Northeastern Brazil. In addition to a simple comparison of the available metrics, we analyzed spatial patterns (i.e., distance-decay of similarity) and the effects of biome type on these components using raw and partial Mantel tests. Our analyses revealed that turnover estimated using Baselga's (2010) approach is slightly higher than the estimate from Carvalho et al.'s (2012) approach, but all analyses show consistent spatial patterns in species turnover for both methods. Spatial patterns in β-diversity revealed by the Mantel tests are also consistent with expectations based on differential dispersal abilities. Our results reinforce that spatial patterns in β-diversity, mainly in the turnover components expressing faunal differentiation, are determined by a mix of broad-scale environmental effects and short-distance, spatially structured dispersal.
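For reference, Baselga's (2010) pairwise partition used in the comparison can be written directly from shared and unique species counts; a minimal sketch with hypothetical faunas:

```python
def baselga_partition(site1, site2):
    """Baselga's (2010) pairwise partition of Sorensen dissimilarity into
    turnover (beta_sim) and nestedness-resultant (beta_sne) components.
    a = shared species; b, c = species unique to each site."""
    a = len(site1 & site2)
    b = len(site1 - site2)
    c = len(site2 - site1)
    beta_sor = (b + c) / (2 * a + b + c)            # total dissimilarity
    beta_sim = min(b, c) / (a + min(b, c))          # turnover component
    beta_sne = beta_sor - beta_sim                  # richness/nestedness part
    return beta_sor, beta_sim, beta_sne

fauna_A = {"sp1", "sp2", "sp3", "sp4"}
fauna_B = {"sp3", "sp4", "sp5"}
print(baselga_partition(fauna_A, fauna_B))
```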
APA, Harvard, Vancouver, ISO, etc. styles
50

Ngabea, Murtala Audu, Dike B. Ojji, Hayatu Umar and Simeon A. Isezuo. "Prevalence of echocardiographic left ventricular hypertrophy among hypertensives in a tertiary health institution in Nigeria". Annals of Medical Research and Practice 3 (June 2, 2022): 3. http://dx.doi.org/10.25259/anmrp_1_2022.

Full text source
Abstract:
Objective: Systemic hypertension remains an important risk factor for cardiovascular diseases and a major global public health problem. Left ventricular hypertrophy (LVH) is a recognized complication of hypertension and strongly predicts cardiovascular morbidity and mortality. In Nigeria, few studies have evaluated the role of echocardiography in the diagnosis of LVH among hypertensives. This study set out to determine the prevalence of LVH among hypertensives as determined by echocardiography. Material and Methods: One hundred and seventy-eight hypertensives and eighty-nine age- and sex-matched controls were recruited consecutively into the study. All had echocardiography to determine who among them had LVH. The partition value for LVH in hypertensives was determined using the 97th percentile of left ventricular mass in controls as the cutoff point. Results: The echocardiographically determined prevalence of LVH among hypertensives was 32.4%. Conclusion: The echocardiographic prevalence of LVH was 32.4% in the study population. This is a significant proportion of the study population, considering the clinical impact of LVH among patients with hypertension.
APA, Harvard, Vancouver, ISO, etc. styles