A ready bibliography on the topic "Disjoint component analysis"

Create an accurate reference in APA, MLA, Chicago, Harvard, and many other styles

Select the type of source:

See the lists of current articles, books, dissertations, abstracts, and other scholarly sources on the topic "Disjoint component analysis".

An "Add to bibliography" button is available next to every work in the bibliography. Use it, and we will automatically create a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication in ".pdf" format and read its abstract online, when the corresponding parameters are available in the metadata.

Journal articles on the topic "Disjoint component analysis"

1

Ferrara, Carla, Francesca Martella, and Maurizio Vichi. "Probabilistic Disjoint Principal Component Analysis". Multivariate Behavioral Research 54, no. 1 (November 7, 2018): 47–61. http://dx.doi.org/10.1080/00273171.2018.1485006.

2

Vichi, Maurizio, and Gilbert Saporta. "Clustering and disjoint principal component analysis". Computational Statistics & Data Analysis 53, no. 8 (June 2009): 3194–208. http://dx.doi.org/10.1016/j.csda.2008.05.028.

3

Lamboy, Warren F. "Disjoint Principal Component Analysis: A Statistical Method of Botanical Identification". Systematic Botany 15, no. 1 (January 1990): 3. http://dx.doi.org/10.2307/2419010.

4

Hu, Qiguo, and Jinyin He. "Path Sets Combination Method for Reliability Analysis of Phased-Mission Systems Based on Cumulative Exposure Model". Xibei Gongye Daxue Xuebao/Journal of Northwestern Polytechnical University 36, no. 5 (October 2018): 995–1003. http://dx.doi.org/10.1051/jnwpu/20183650995.

Abstract:
Modeling phased-mission systems is difficult, and the solution process is complex, because phase tasks are correlated and components are shared within and between phases. To solve this problem, a path-sets combination method for phased-mission systems, based on the cumulative exposure model, is proposed. To handle the cross-phase correlation of components and their different failure rates in each phase, the cumulative exposure model, which accounts for the historical damage of components, is solved by obtaining the cumulative damage distribution of each component in each phase. First, a phased-mission system reliability model is built by mapping phased-mission system fault trees into a Bayesian network. By traversing the Bayesian network, the minimal path sets of each phase are obtained. Second, the disjoint formulas introduced by the variable elimination method are used to perform the disjoint operation on the minimal path sets of each phase, and the conditional probability relations of the common components are used to reduce the scale of the minimal path sets. Finally, the minimal disjoint path sets of each phase are combined and summed according to the component conditional probability relations. The path-sets combination method avoids the large conditional probability tables, large storage, and large computation caused by the excessive discrete states in the traditional Bayesian method, as well as the PMS-BDD method's strict requirements on variable ordering and its difficulty in solving the system reliability when components have multiple failure distribution types. In the end, reliability modeling and solving are carried out for a geosynchronous-orbit satellite and compared with the PMS-BDD method, which verifies the correctness of the proposed method.
5

Martin-Barreiro, Carlos, John A. Ramirez-Figueroa, Xavier Cabezas, Víctor Leiva, and M. Purificación Galindo-Villardón. "Disjoint and Functional Principal Component Analysis for Infected Cases and Deaths Due to COVID-19 in South American Countries with Sensor-Related Data". Sensors 21, no. 12 (June 14, 2021): 4094. http://dx.doi.org/10.3390/s21124094.

Abstract:
In this paper, we group South American countries based on the number of infected cases and deaths due to COVID-19. The countries considered are: Argentina, Bolivia, Brazil, Chile, Colombia, Ecuador, Peru, Paraguay, Uruguay, and Venezuela. The data used are collected from a database of Johns Hopkins University, an institution that is dedicated to sensing and monitoring the evolution of the COVID-19 pandemic. A statistical analysis, based on principal components with modern and recent techniques, is conducted. Initially, utilizing the correlation matrix, standard components and varimax rotations are calculated. Then, by using disjoint components and functional components, the countries are grouped. An algorithm that allows us to keep the principal component analysis updated with a sensor in the data warehouse is designed. As reported in the conclusions, this grouping changes depending on the number of components considered, the type of principal component (standard, disjoint or functional) and the variable to be considered (infected cases or deaths). The results obtained are compared to the k-means technique. The COVID-19 cases and their deaths vary in the different countries due to diverse reasons, as reported in the conclusions.
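The grouping idea summarized above can be sketched in a few lines (an illustrative example, not the authors' code: the synthetic case counts and the argmax assignment rule are assumptions standing in for disjoint PCA's constraint that each variable contributes to exactly one component):

```python
import numpy as np

# Made-up daily case counts: rows are days, columns are four "countries".
# Columns 0-1 follow one latent epidemic curve, columns 2-3 another.
rng = np.random.default_rng(42)
b1 = rng.gamma(2.0, 50.0, size=(30, 1))
b2 = rng.gamma(2.0, 50.0, size=(30, 1))
X = np.hstack([b1, 1.2 * b1, b2, 0.8 * b2]) + rng.normal(0, 5, size=(30, 4))

Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize -> correlation-matrix PCA
R = np.corrcoef(Z, rowvar=False)           # 4x4 correlation matrix of countries
vals, vecs = np.linalg.eigh(R)             # eigh returns ascending eigenvalues
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]   # reorder to decreasing variance

# Disjoint-style grouping: assign each country to the single retained
# component on which it loads most strongly in absolute value.
groups = np.abs(vecs[:, :2]).argmax(axis=1)
print(groups)
```

As the abstract notes, such groupings shift with the number of components retained and the variable analyzed; the sketch keeps two components only for illustration.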
6

Meghanathan, Natarajan. "Quantifying the Theory Vs. Programming Disparity using Spectral Bipartivity Analysis and Principal Component Analysis". International Journal of Computer Science and Information Technology 14, no. 5 (October 31, 2022): 1–15. http://dx.doi.org/10.5121/ijcsit.2022.14501.

Abstract:
Some students in Computer Science and related majors excel in programming-related assignments but not equally well in theoretical (non-programming) assignments, and vice versa. We refer to this as the "Theory vs. Programming Disparity (TPD)". In this paper, we propose a spectral bipartivity analysis-based approach to quantify the TPD metric for any student in a course based on the student's percentage scores (considered as decimal values in the range 0 to 1) in the course assignments (comprising both theoretical and programming-based assignments). We also propose a principal component analysis (PCA)-based approach to quantify the TPD metric for the entire class based on the students' percentage scores (on a scale of 0 to 100) in the theoretical and programming assignments. The spectral analysis approach partitions the set of theoretical and programming assignments into two disjoint sets whose constituents are closer to each other within each set and relatively more different from each other across the two sets. The TPD metric for a student is computed on the basis of the Euclidean distance between the tuples representing the actual numbers of theoretical and programming assignments vis-à-vis the numbers of theoretical and programming assignments in each of the two disjoint sets. The PCA-based analysis identifies the dominating principal components within the sets of theoretical and programming assignments and computes the TPD metric for the entire class as a weighted average of the correlation coefficients between the dominating principal components representing these two sets.
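One plausible reading of the PCA side of this idea can be sketched as follows (the scores, the helper function, and the single-correlation summary are all hypothetical; this is not the paper's implementation):

```python
import numpy as np

# Hypothetical scores (0-100) of 6 students on 3 theoretical and
# 3 programming assignments; all values are made up.
theory = np.array([[90, 85, 88], [40, 45, 50], [70, 75, 72],
                   [95, 92, 90], [55, 50, 60], [80, 78, 82]], float)
prog = np.array([[60, 55, 58], [90, 95, 92], [65, 70, 68],
                 [50, 55, 52], [95, 90, 88], [75, 80, 78]], float)

def first_pc_scores(scores):
    """Project each student (row) onto the dominant principal component."""
    centered = scores - scores.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[0]

# Correlation between the dominant components of the two assignment sets.
# The PC sign is arbitrary (SVD sign ambiguity), so only |r| is meaningful.
r = np.corrcoef(first_pc_scores(theory), first_pc_scores(prog))[0, 1]
print(round(abs(r), 2))
```

In this toy data, students strong in theory are weak in programming, so the dominant components of the two score matrices are strongly (anti-)correlated.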
7

Nose-Filho, Kenji, and Joao Marcos Travassos Romano. "Low-Rank Decomposition Based on Disjoint Component Analysis With Applications in Seismic Imaging". IEEE Transactions on Computational Imaging 3, no. 2 (June 2017): 275–81. http://dx.doi.org/10.1109/tci.2017.2691548.

8

LONG, HAO, and XIAO-WEI LIU. "A UNIFIED COMMUNITY DETECTION ALGORITHM IN LARGE-SCALE COMPLEX NETWORKS". Advances in Complex Systems 22, no. 03 (May 2019): 1950004. http://dx.doi.org/10.1142/s0219525919500048.

Abstract:
A community is the basic structural component of complex networks and is important for network analysis. In recent decades, researchers from different fields have witnessed a boom in community detection, and many algorithms have been proposed to retrieve disjoint or overlapping communities. In this paper, a unified expansion approach is proposed to obtain two different network partitions, which provides divisions with higher accuracy and scales well to large networks. First, we define the edge intensity to quantify the density of network edges; a higher edge intensity indicates a more compact pair of nodes. Second, the vertices of higher-intensity edges are extracted and denoted as core nodes, whereas other vertices are treated as margin nodes. Finally, we apply an expansion strategy to form disjoint communities: closely connected core nodes are combined into disjoint skeleton communities, and margin nodes are gradually attached to the nearest skeleton communities. To detect overlapping communities, extra steps are adopted: potential overlapping nodes are identified from the existing disjoint communities and replicated, and communities that bear replicas are further partitioned into smaller clusters. Because replicas of potential overlapping nodes may remain in different communities, overlapping communities can be acquired. Experimental results on real and synthetic networks illustrate the higher accuracy and better performance of our method.
9

Yu, Kong Shuai, and Dong Hu. "Appearance Model Based Moving Object Matching across Disjoint Camera Views". Advanced Materials Research 760-762 (September 2013): 1322–26. http://dx.doi.org/10.4028/www.scientific.net/amr.760-762.1322.

Abstract:
A new object-tracking scheme for multi-camera surveillance with non-overlapping views is proposed in this paper. A brightness transfer function (BTF) is used to establish appearance correspondence between different views. A mixture of probabilistic principal component analysis (MPPCA) is incorporated to learn the subspace of brightness transfer functions, in order to deal with multiple areas of different brightness in a scene. The incremental major color spectrum histogram (IMCSH) is used as the similarity measure for reliable matching. Experimental results on real-world videos show the effectiveness of the proposed algorithm.
10

Vesper, Stephen, Jennie Wakefield, Peter Ashley, David Cox, Gary Dewalt, and Warren Friedman. "Geographic Distribution of Environmental Relative Moldiness Index Molds in USA Homes". Journal of Environmental and Public Health 2011 (2011): 1–11. http://dx.doi.org/10.1155/2011/242457.

Abstract:
Objective. The objective of this study was to quantify and describe the distribution of the 36 molds that make up the Environmental Relative Moldiness Index (ERMI). Materials and Methods. As part of the 2006 American Healthy Homes Survey, settled dust samples were analyzed by mold-specific quantitative PCR (MSQPCR) for the 36 ERMI molds. Each species' geographical distribution pattern was examined individually, followed by partitioning analysis in order to identify spatially meaningful patterns. For mapping, the 36 mold populations were divided into disjoint clusters on the basis of their standardized concentrations, and First Principal Component (FPC) scores were computed. Results and Conclusions. The partitioning analyses failed to uncover a valid partitioning that yielded compact, well-separated partitions with systematic spatial distributions, either on global or local criteria. Disjoint variable clustering resulted in seven mold clusters. The 36 molds and ERMI values themselves were found to be heterogeneously distributed across the United States of America (USA).

Doctoral dissertations on the topic "Disjoint component analysis"

1

Zaghloul, Sara. "Application du DCA aux Radars de Surveillances Secondaires". Electronic Thesis or Diss., Reims, 2024. http://www.theses.fr/2024REIMS017.

Abstract:
The objective of this thesis was to develop a fast algorithm to separate a mixture of Secondary Surveillance Radar (SSR) signals. This mixture may include different modes, such as Mode A/C and Mode S, which complicate the separation due to their varied formats and different coding characteristics. During this thesis, three methods were developed using a relatively little-known criterion, Disjoint Component Analysis (DCA), which aims to separate sources by maximizing the disjointness between them. The first is a post-processing approach that uses linear algebra to solve the problems encountered when applying the real-valued version of DCA. However, this method can pose several problems, including signal loss, residual mixing, and linear dependencies between signals. We therefore concluded that it was necessary to develop a method that considers SSR signals in their original complex-valued format. The second method aims to demonstrate the effectiveness of the DCA criterion for SSR signals, using an exhaustive search approach while considering the signals in their complex format. This method separates the signals with a high degree of accuracy but is computationally expensive. The third proposed method optimizes the search for the minimum using a gradient descent algorithm, which significantly improves computational efficiency while maintaining similar quality of results. These algorithms were tested in simulations and compared with various algorithms from the literature, to evaluate their performance as a function of different reception parameters. Finally, they were tested on real-world data.
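The exhaustive-search idea can be illustrated in a toy two-source setting (a sketch under assumed conditions: real-valued signals, a pure rotation mixture, and a magnitude-overlap disjointness cost; none of this is the thesis code):

```python
import numpy as np

# Two disjoint-support sources: s1 is active only on the first half of the
# record, s2 only on the second half.
rng = np.random.default_rng(1)
n = 200
s1 = np.zeros(n); s1[:100] = rng.normal(size=100)
s2 = np.zeros(n); s2[100:] = rng.normal(size=100)
S = np.vstack([s1, s2])

theta = 0.6                                    # unknown mixing angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta), np.cos(theta)]])
X = A @ S                                      # observed mixture

def disjointness_cost(phi):
    """Overlap of output magnitudes; zero iff the outputs have disjoint support."""
    W = np.array([[np.cos(phi), -np.sin(phi)],
                  [np.sin(phi), np.cos(phi)]])
    Y = W @ X
    return np.sum(np.abs(Y[0]) * np.abs(Y[1]))

# Exhaustive search over unmixing angles in [0, pi/2].
grid = np.linspace(0.0, np.pi / 2, 901)
best = min(grid, key=disjointness_cost)
print(round(best, 2))  # near pi/2 - 0.6: W undoes the mixing up to a permutation
```

A gradient descent on the same cost, as in the third method, would replace the grid with iterative angle updates and reach the minimum far more cheaply.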

Book chapters on the topic "Disjoint component analysis"

1

Nose-Filho, K., L. T. Duarte, and J. M. T. Romano. "On Disjoint Component Analysis". In Latent Variable Analysis and Signal Separation, 519–28. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-53547-0_49.

2

Anemüller, Jörn, and Hendrik Kayser. "Acoustic Source Localization by Combination of Supervised Direction-of-Arrival Estimation with Disjoint Component Analysis". In Latent Variable Analysis and Signal Separation, 99–108. Cham: Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-53547-0_10.

3

Sekgweleo, Tefo, and Tiko Iyamu. "Human Interactions in Software Deployment". In Advances in Human and Social Aspects of Technology, 161–78. IGI Global, 2014. http://dx.doi.org/10.4018/978-1-4666-6126-4.ch009.

Abstract:
Software is intended to enable and support organisations so that they can function effectively and efficiently towards achieving their strategic objectives. Hence, its deployment is critical to the focal actors (managers and sponsors). Software deployment involves two primary components: technology and non-technology actors. Both offer vital contributions to software deployment from different perspectives. Unfortunately, the focus over the years has been mainly on the technological actors. This may explain why the same types of challenges, such as the disjoint between development and implementation, persist. This chapter holistically examines the roles of non-technology actors in the deployment of software in organisations. Actor-Network Theory (ANT) provides a lens for the analysis of the empirical data.

Conference abstracts on the topic "Disjoint component analysis"

1

Zaghloul, S., N. Petrochilos, and M. Mboup. "Secondary Surveillance Radar replies source separation via the Disjoint Component Analysis". In International Conference on Radar Systems (RADAR 2022). Institution of Engineering and Technology, 2022. http://dx.doi.org/10.1049/icp.2022.2365.

2

Akhonda, M. A. B. S., Qunfang Long, Suchita Bhinge, Vince D. Calhoun, and Tulay Adali. "Disjoint Subspaces for Common and Distinct Component Analysis: Application to Task FMRI Data". In 2019 53rd Annual Conference on Information Sciences and Systems (CISS). IEEE, 2019. http://dx.doi.org/10.1109/ciss.2019.8693045.

3

Lagniez, Jean-Marie, and Pierre Marquis. "An Improved Decision-DNNF Compiler". In Twenty-Sixth International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2017. http://dx.doi.org/10.24963/ijcai.2017/93.

Abstract:
We present and evaluate a new compiler, called d4, targeting the Decision-DNNF language. Like the state-of-the-art compilers C2D and Dsharp targeting the same language, d4 is a top-down tree-search algorithm exploring the space of propositional interpretations. d4 is based on the same ingredients as those considered in C2D and Dsharp (mainly, disjoint component analysis, conflict analysis with non-chronological backtracking, and component caching). d4 takes advantage of a dynamic decomposition approach based on hypergraph partitioning, used sparingly. Some simplification rules are also used to minimize the time spent in the partitioning steps and to promote the quality of the decompositions. Experiments show that the compilation times and the sizes of the Decision-DNNF representations computed by d4 are in many cases significantly lower than those obtained by C2D and Dsharp.
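The disjoint component analysis ingredient mentioned above can be illustrated with a toy model counter (a sketch, not d4 itself): clauses sharing no variables form independent components, and the formula's model count is the product of the component counts.

```python
from itertools import product

def components(clauses):
    """Group CNF clauses (lists of signed integer literals) into
    variable-disjoint components."""
    comps = []  # list of (variable set, clause list) pairs
    for clause in clauses:
        vars_c = {abs(l) for l in clause}
        merged, rest = [clause], []
        for cvars, cclauses in comps:
            if cvars & vars_c:          # shares a variable: merge components
                vars_c |= cvars
                merged += cclauses
            else:
                rest.append((cvars, cclauses))
        comps = rest + [(vars_c, merged)]
    return comps

def count_models(clauses, nvars):
    """Model count via per-component brute force, multiplied across components."""
    def brute(cls, vs):
        vs = sorted(vs)
        return sum(
            all(any(a[abs(l)] == (l > 0) for l in c) for c in cls)
            for bits in product([False, True], repeat=len(vs))
            for a in [dict(zip(vs, bits))]
        )
    used, total = set(), 1
    for vars_c, cls in components(clauses):
        used |= vars_c
        total *= brute(cls, vars_c)
    return total * 2 ** (nvars - len(used))  # unconstrained variables are free

# (x1 or x2) and (x3 or not x4): two disjoint components over 4 variables,
# each with 3 models, so 3 * 3 = 9 models in total.
print(count_models([[1, 2], [3, -4]], 4))  # -> 9
```

Real compilers such as d4 apply this decomposition recursively at every decision node and cache component counts, which is where most of their efficiency comes from.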
4

Enright, Michael P., R. Craig McClung, Kwai S. Chan, John McFarland, Jonathan P. Moody, and James C. Sobotka. "Micromechanics-Based Fracture Risk Assessment Using Integrated Probabilistic Damage Tolerance Analysis and Manufacturing Process Models". In ASME Turbo Expo 2016: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/gt2016-58089.

Abstract:
Materials engineering and damage tolerance assessment have traditionally been performed as disjoint processes involving repeated tests that can ultimately prolong the time required for certification of new materials. Computational advances have been made both in the prediction of material properties and probabilistic damage tolerance analysis, but have been pursued primarily as independent efforts. Integrated computational materials engineering (ICME) has the potential to significantly reduce the time required for development and insertion of new materials in the gas turbine industry. A manufacturing process software tool called DEFORM™ has been linked with a probabilistic damage tolerance analysis (PDTA) software tool called DARWIN® to form a new capability for ICME of gas turbine engine components. DEFORM simulates rotor manufacturing processes including forging, heat treating, and machining to compute residual stress and strain, track anomaly location, and predict microstructure including grain size and orientation. DARWIN integrates finite element stress analysis results, fracture mechanics models, material anomaly data, probability of anomaly detection, and inspection schedules to compute the probability of fracture of a gas turbine engine rotor as a function of operating cycles. Previous papers have focused on probabilistic modeling of residual stresses in DARWIN based on manufacturing process training data from DEFORM. This paper describes recent efforts to extend the probabilistic link between DEFORM and DARWIN to enable modeling of residual strain, average grain size, and ALA (unrecrystallized) grain size as random variables. Gaussian Process modeling is used to estimate the relationship between model responses and material processing parameters. These random variables are applied to microstructure-based fatigue crack nucleation and growth models for use in probabilistic risk assessments.
The integrated DARWIN-DEFORM capability is demonstrated for a representative engine disk model which illustrates the influences of manufacturing-induced random variables on component fracture risk. The results provide critical insight regarding the potential benefits of integrating probabilistic computational material processing models with probabilistic damage tolerance-based risk assessment.
5

Kuczera, Ramon C., Zissimos P. Mourelatos, and Michael Latcha. "A Monte Carlo Reliability Assessment for Multiple Failure Region Problems Using Approximate Metamodels". In ASME 2007 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/detc2007-34957.

Abstract:
An efficient Monte Carlo reliability assessment methodology is presented for engineering systems with multiple failure regions and potentially multiple most probable points. The method can handle implicit, nonlinear limit-state functions, with correlated or non-correlated random variables described by any probabilistic distribution. It uses a combination of approximate or "accurate-on-demand" global and local metamodels which serve as indicators to determine the failure and safe regions. Samples close to the limit states define transition regions between the safe and failure domains. A clustering technique identifies all transition regions, which can in general be disjoint, and local metamodels of the actual limit states are generated for each transition region. A Monte Carlo simulation calculates the probability of failure using the global and local metamodels. A robust maximin "space-filling" sampling technique is used to construct the metamodels. In addition, a principal component analysis addresses the problem dimensionality, making the proposed method attractive for problems with a large number of random variables. Two numerical examples highlight the accuracy and efficiency of the method.
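The metamodel-plus-Monte-Carlo idea can be sketched in one dimension (a hypothetical limit state and polynomial surrogate chosen for illustration, not the paper's method):

```python
import numpy as np

def g(x):
    """Stand-in for an expensive limit-state function: failure when g(x) < 0."""
    return 3.0 - x ** 2

# Global metamodel: fit a cheap polynomial surrogate on a few training points.
x_train = np.linspace(-4.0, 4.0, 25)
coeffs = np.polyfit(x_train, g(x_train), deg=2)
surrogate = np.poly1d(coeffs)

# Monte Carlo on the surrogate: X ~ N(0, 1); the theoretical failure
# probability is P[|X| > sqrt(3)] = 2 * Phi(-sqrt(3)), about 0.083.
rng = np.random.default_rng(7)
x_mc = rng.normal(0.0, 1.0, 100_000)
pf = np.mean(surrogate(x_mc) < 0.0)
print(round(pf, 3))
```

The paper's contribution lies in the parts this sketch omits: locating possibly disjoint transition regions by clustering, refining local metamodels there on demand, and using PCA to keep the dimensionality manageable.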
6

Slanjankic, Emir, Haris Balta, Adil Joldic, Alsa Cvitkovic, Djenan Heric, and Emir Veledar. "Data mining techniques and SAS as a tool for graphical presentation of principal components analysis and disjoint cluster analysis results". In 2009 XXII International Symposium on Information, Communication and Automation Technologies (ICAT 2009). IEEE, 2009. http://dx.doi.org/10.1109/icat.2009.5348419.

