
Journal articles on the topic "Worst-case complexity analysis"


Consult the top 50 journal articles for your research on the topic "Worst-case complexity analysis".


You can also download the full text of each publication as a PDF and read its abstract online whenever it is available in the metadata.

Explore journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Szirmay-Kalos, L., and G. Márton. "Worst-case versus average case complexity of ray-shooting." Computing 61, no. 2 (1998): 103–31. http://dx.doi.org/10.1007/bf02684409.

Full text
APA, Harvard, Vancouver, ISO, and other citation styles
2

Kwas, Marek, and Youming Li. "Worst case complexity of multivariate Feynman–Kac path integration." Journal of Complexity 19, no. 6 (2003): 730–43. http://dx.doi.org/10.1016/s0885-064x(03)00048-7.

3

Jackowski, Tomasz. "Complexity of multilinear problems in the worst case setting." Journal of Complexity 6, no. 4 (1990): 389–408. http://dx.doi.org/10.1016/0885-064x(90)90030-h.

4

Milanese, M., and A. Vicino. "Information-Based Complexity and Nonparametric Worst-Case System Identification." Journal of Complexity 9, no. 4 (1993): 427–46. http://dx.doi.org/10.1006/jcom.1993.1028.

5

Plaskota, Leszek. "Worst Case Complexity of Problems with Random Information Noise." Journal of Complexity 12, no. 4 (1996): 416–39. http://dx.doi.org/10.1006/jcom.1996.0026.

6

Pemmaraju, Sriram V., and Clifford A. Shaffer. "Analysis of the worst case space complexity of a PR quadtree." Information Processing Letters 49, no. 5 (1994): 263–67. http://dx.doi.org/10.1016/0020-0190(94)90065-5.

7

Li, Youming, and Grzegorz W. Wasilkowski. "Worst Case Complexity of Weighted Approximation and Integration over Rd." Journal of Complexity 18, no. 1 (2002): 330–45. http://dx.doi.org/10.1006/jcom.2001.0632.

8

Kumar, Pijush Kanti. "Breaking the O(n^2) Barrier: Novel Approaches to Time Complexity Analysis in Quick Sort." International Journal of Computer Science and Mobile Computing 9, no. 11 (2020): 107–17. http://dx.doi.org/10.47760/ijcsmc.2020.v09i11.010.

Abstract
Quicksort is the most commonly used sorting algorithm. As the name suggests, it is one of the fastest sorting algorithms in use since the invention of sorting. However, it has often been criticised for its worst-case time complexity of O(n^2). In practice, quicksort tends to follow its average-case and best-case behaviour most of the time, which makes it the most efficient practical sorting algorithm. In this paper we fine-tune the algorithm, remove its worst-case time complexity of O(n^2), and make it the best sorting algorithm both theo
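The worst case this abstract refers to is easy to reproduce: with a fixed first-element pivot, already-sorted input degrades quicksort to O(n^2), while a randomized pivot avoids that pattern with high probability. A minimal sketch under those assumptions (names are ours; this is not the paper's fine-tuned algorithm):

```python
import random

def quicksort(items, pivot="random"):
    # Plain out-of-place quicksort. pivot="first" reproduces the classic
    # O(n^2) worst case on already-sorted input (the recursion becomes
    # completely unbalanced); pivot="random" gives expected O(n log n).
    if len(items) <= 1:
        return list(items)
    p = items[0] if pivot == "first" else random.choice(items)
    left = [x for x in items if x < p]
    mid = [x for x in items if x == p]
    right = [x for x in items if x > p]
    return quicksort(left, pivot) + mid + quicksort(right, pivot)
```

With `pivot="first"`, the recursion depth on sorted input grows linearly in n, which is exactly the quadratic behaviour the paper sets out to remove.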
9

Short, Michael. "Bounds on Worst-Case Deadline Failure Probabilities in Controller Area Networks." Journal of Computer Networks and Communications 2016 (2016): 1–12. http://dx.doi.org/10.1155/2016/5196092.

Abstract
Industrial communication networks like the Controller Area Network (CAN) are often required to operate reliably in harsh environments which expose the communication network to random errors. Probabilistic schedulability analysis can employ rich stochastic error models to capture random error behaviors, but this is most often at the expense of increased analysis complexity. In this paper, an efficient method (of time complexity O(n log n)) to bound the message deadline failure probabilities for an industrial CAN network consisting of n periodic/sporadic message transmissions is proposed. The paper
10

De Haan, Ronald, Anna Roubickova, and Stefan Szeider. "Parameterized Complexity Results for Plan Reuse." Proceedings of the AAAI Conference on Artificial Intelligence 27, no. 1 (2013): 224–31. http://dx.doi.org/10.1609/aaai.v27i1.8655.

Abstract
Planning is a notoriously difficult computational problem of high worst-case complexity. Researchers have been investing significant efforts to develop heuristics or restrictions to make planning practically feasible. Case-based planning is a heuristic approach where one tries to reuse previous experience when solving similar problems in order to avoid some of the planning effort. Plan reuse may offer an interesting alternative to plan generation in some settings. We provide theoretical results that identify situations in which plan reuse is provably tractable. We perform our analysis in the f
11

Rodriguez Ferrandez, Ivan, Alvaro Jover Alvarez, Matina Maria Trompouki, Leonidas Kosmidis, and Francisco J. Cazorla. "Worst Case Execution Time and Power Estimation of Multicore and GPU Software: A Pedestrian Detection Use Case." ACM SIGAda Ada Letters 43, no. 1 (2023): 111–17. http://dx.doi.org/10.1145/3631483.3631502.

Abstract
Worst Case Execution Time estimation of software running on parallel platforms is a challenging task, due to resource interference from other tasks and the complexity of the underlying CPU and GPU hardware architectures. Similarly, the increased complexity of the hardware challenges the estimation of worst case power consumption. In this paper, we employ Measurement Based Probabilistic Timing Analysis (MBPTA), which is capable of managing complex architectures such as multicores. We enable its use by software randomisation, which, as we show for the first time, is also possible on GPUs. We demo
12

Dai, Qiangqiang, Rong-Hua Li, Donghang Cui, Meihao Liao, Yu-Xuan Qiu, and Guoren Wang. "Efficient Maximal Biplex Enumerations with Improved Worst-Case Time Guarantee." Proceedings of the ACM on Management of Data 2, no. 3 (2024): 1–26. http://dx.doi.org/10.1145/3654938.

Abstract
A k-biplex is an induced subgraph of a bipartite graph in which every vertex on one side is non-adjacent to at most k vertices on the other side. Enumerating all maximal k-biplexes in a bipartite graph is a fundamental operator in bipartite graph analysis and finds applications in various domains, including community detection, online recommendation, and fraud detection in finance networks. The state-of-the-art solutions for maximal k-biplex enumeration suffer from efficiency issues as k increases (k ≥ 2), with a time complexity of O(m2^n), where n (m) denotes the number of vertices (
13

Kon, Mark, and Leszek Plaskota. "Complexity of Neural Network Approximation with Limited Information: A Worst Case Approach." Journal of Complexity 17, no. 2 (2001): 345–65. http://dx.doi.org/10.1006/jcom.2001.0575.

14

Vakhania, Nodari. "Probabilistic quality estimations for combinatorial optimization problems." Georgian Mathematical Journal 25, no. 1 (2018): 123–34. http://dx.doi.org/10.1515/gmj-2017-0041.

Abstract
The computational complexity of an algorithm is traditionally measured for the worst and the average case. The worst-case estimation guarantees a certain worst-case behavior of a given algorithm, although it might be rough, since in “most instances” the algorithm may have a significantly better performance. The probabilistic average-case analysis claims to derive an average performance of an algorithm, say, for an “average instance” of the problem in question. That instance may be far away from the average of the problem instances arising in a given real-life application, and so the av
15

Chan, Tsz Nam, Leong Hou U, Yun Peng, Byron Choi, and Jianliang Xu. "Fast network k-function-based spatial analysis." Proceedings of the VLDB Endowment 15, no. 11 (2022): 2853–66. http://dx.doi.org/10.14778/3551793.3551836.

Abstract
Network K-function has been the de facto operation for analyzing point patterns in spatial networks, which is widely used in many communities, including geography, ecology, transportation science, social science, and criminology. To analyze a location dataset, domain experts need to generate a network K-function plot that involves computing multiple network K-functions. However, the network K-function is a computationally expensive operation that is not feasible to support large-scale datasets, let alone to generate a network K-function plot. To handle this issue, we develop two efficient alg
16

Matsuo, Hirofumi. "Cyclic sequencing problems in the two-machine permutation flow shop: Complexity, worst-case, and average-case analysis." Naval Research Logistics 37, no. 5 (1990): 679–94. http://dx.doi.org/10.1002/1520-6750(199010)37:5<679::aid-nav3220370507>3.0.co;2-q.

17

Kacewicz, Bolesław. "Asymptotically tight worst case complexity bounds for initial-value problems with nonadaptive information." Journal of Complexity 47 (August 2018): 86–96. http://dx.doi.org/10.1016/j.jco.2018.02.002.

18

Aschieri, Federico. "Game Semantics and the Geometry of Backtracking: A New Complexity Analysis of Interaction." Journal of Symbolic Logic 82, no. 2 (2017): 672–708. http://dx.doi.org/10.1017/jsl.2016.48.

Abstract
We present abstract complexity results about Coquand and Hyland–Ong game semantics that will lead to new bounds on the length of first-order cut-elimination, normalization, interaction between expansion trees and any other dialogical process game semantics can model and apply to. In particular, we provide a novel method to bound the length of interactions between visible strategies and to measure precisely the tower of exponentials defining the worst-case complexity. Our study improves the old estimates on average by several exponentials.
19

Gordon, Ofir, Yuval Filmus, and Oren Salzman. "Revisiting the Complexity Analysis of Conflict-Based Search: New Computational Techniques and Improved Bounds." Proceedings of the International Symposium on Combinatorial Search 12, no. 1 (2021): 64–72. http://dx.doi.org/10.1609/socs.v12i1.18552.

Abstract
The problem of Multi-Agent Path Finding (MAPF) calls for finding a set of conflict-free paths for a fleet of agents operating in a given environment. Arguably, the state-of-the-art approach to computing optimal solutions is Conflict-Based Search (CBS). In this work we revisit the complexity analysis of CBS to provide tighter bounds on the algorithm's run-time in the worst-case. Our analysis paves the way to better pinpoint the parameters that govern (in the worst case) the algorithm's computational complexity. Our analysis is based on two complementary approaches: In the first approach we boun
20

Cade, Chris, Marten Folkertsma, Ido Niesen, and Jordi Weggemans. "Quantifying Grover speed-ups beyond asymptotic analysis." Quantum 7 (October 10, 2023): 1133. http://dx.doi.org/10.22331/q-2023-10-10-1133.

Abstract
Run-times of quantum algorithms are often studied via an asymptotic, worst-case analysis. Whilst useful, such a comparison can often fall short: it is not uncommon for algorithms with a large worst-case run-time to end up performing well on instances of practical interest. To remedy this it is necessary to resort to run-time analyses of a more empirical nature, which for sufficiently small input sizes can be performed on a quantum device or a simulation thereof. For larger input sizes, alternative approaches are required. In this paper we consider an approach that combines classical emulation w
21

Harrington, Paul, Colm Ó Dúnlaing, and Chee K. Yap. "Optimal Voronoi Diagram Construction with n Convex Sites in Three Dimensions." International Journal of Computational Geometry & Applications 17, no. 06 (2007): 555–93. http://dx.doi.org/10.1142/s0218195907002483.

Abstract
This paper presents a worst-case optimal algorithm for constructing the Voronoi diagram for n disjoint convex and rounded semi-algebraic sites in 3 dimensions. Rather than extending optimal 2-dimensional methods, we base our method on a suboptimal 2-dimensional algorithm, outlined by Lee and Drysdale and modified by Sharir, for computing the diagram of circular sites. For complexity considerations, we assume the sites have bounded complexity, i.e., the algebraic degree is bounded as is the number of algebraic patches making up the site. For the sake of simplicity we assume that t
22

Han, Ai Li. "Complexity Research on B Algorithm." Applied Mechanics and Materials 20-23 (January 2010): 173–77. http://dx.doi.org/10.4028/www.scientific.net/amm.20-23.173.

Abstract
The time complexity of B algorithm, one of the intelligent search algorithms, is discussed. By anatomizing some instances, it is pointed out that the cost of calculating the value of heuristic function should be included in the range of time complexity analysis for B algorithm. And then, an algorithm of calculating the value of heuristic function is presented. By analyzing the cost of calculating the value of heuristic function, it is pointed out that the number of recursions in B algorithm is O(n!) in the worst case. Therefore, the time complexity of B algorithm is exponential instead of O(n2
23

Journal, IJSREM. "Advanced In-Place Merge Sort Approach for Enhanced Sorting Performance." International Journal of Scientific Research in Engineering and Management 08, no. 008 (2024): 1–5. http://dx.doi.org/10.55041/ijsrem37088.

Abstract
This paper introduces a new sorting algorithm designed to sort array elements directly within the array itself. The algorithm features a best-case time complexity of O(n) and an average and worst-case time complexity of O(n log n). It achieves this performance through a method that combines recursive breakdown with in-place merging techniques. We compare this new approach with existing popular sorting algorithms to assess its relative effectiveness. The paper concludes with insights into the algorithm's strengths and limitations, and proposes potential areas for further development and refine
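For contrast with this abstract's claims, a standard top-down merge sort achieves the same O(n log n) worst case but needs O(n) auxiliary space for merge buffers; the paper's in-place merging, which removes that overhead, is not reproduced here. A generic sketch:

```python
def merge_sort(a):
    # Standard top-down merge sort: O(n log n) comparisons in the worst
    # case, but O(n) auxiliary space for the merge buffers -- unlike the
    # in-place variant the paper proposes.
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```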
24

Procaccia, A. D., and J. S. Rosenschein. "Junta Distributions and the Average-Case Complexity of Manipulating Elections." Journal of Artificial Intelligence Research 28 (February 28, 2007): 157–81. http://dx.doi.org/10.1613/jair.2148.

Abstract
Encouraging voters to truthfully reveal their preferences in an election has long been an important issue. Recently, computational complexity has been suggested as a means of precluding strategic behavior. Previous studies have shown that some voting protocols are hard to manipulate, but used NP-hardness as the complexity measure. Such a worst-case analysis may be an insufficient guarantee of resistance to manipulation. Indeed, we demonstrate that NP-hard manipulations may be tractable in the average case. For this purpose, we augment the existing theory of average-case complexity with some ne
25

Kuhlmann, Marco, Giorgio Satta, and Peter Jonsson. "On the Complexity of CCG Parsing." Computational Linguistics 44, no. 3 (2018): 447–82. http://dx.doi.org/10.1162/coli_a_00324.

Abstract
We study the parsing complexity of Combinatory Categorial Grammar (CCG) in the formalism of Vijay-Shanker and Weir (1994). As our main result, we prove that any parsing algorithm for this formalism will take in the worst case exponential time when the size of the grammar, and not only the length of the input sentence, is included in the analysis. This sets the formalism of Vijay-Shanker and Weir (1994) apart from weakly equivalent formalisms such as Tree Adjoining Grammar, for which parsing can be performed in time polynomial in the combined size of grammar and input sentence. Our results
26

Goldman, C. V., and S. Zilberstein. "Decentralized Control of Cooperative Systems: Categorization and Complexity Analysis." Journal of Artificial Intelligence Research 22 (November 1, 2004): 143–74. http://dx.doi.org/10.1613/jair.1427.

Abstract
Decentralized control of cooperative systems captures the operation of a group of decision makers that share a single global objective. The difficulty in solving optimally such problems arises when the agents lack full observability of the global state of the system when they operate. The general problem has been shown to be NEXP-complete. In this paper, we identify classes of decentralized control problems whose complexity ranges between NEXP and P. In particular, we study problems characterized by independent transitions, independent observations, and goal-oriented objective functions. Two a
27

Lee, Ching-pei, and Stephen J. Wright. "Random permutations fix a worst case for cyclic coordinate descent." IMA Journal of Numerical Analysis 39, no. 3 (2018): 1246–75. http://dx.doi.org/10.1093/imanum/dry040.

Abstract
Variants of the coordinate descent approach for minimizing a nonlinear function are distinguished in part by the order in which coordinates are considered for relaxation. Three common orderings are cyclic (CCD), in which we cycle through the components of x in order; randomized (RCD), in which the component to update is selected randomly and independently at each iteration; and random-permutations cyclic (RPCD), which differs from CCD only in that a random permutation is applied to the variables at the start of each cycle. Known convergence guarantees are weaker for CCD and RPCD tha
28

Yan, Huichao, and Jia Chen. "Tractability of Approximation of Functions Defined over Weighted Hilbert Spaces." Axioms 13, no. 2 (2024): 108. http://dx.doi.org/10.3390/axioms13020108.

Abstract
We investigate L2-approximation problems in the worst case setting in the weighted Hilbert spaces H(KRd,α,γ) with weights Rd,α,γ under parameters 1 ≥ γ1 ≥ γ2 ≥ ⋯ ≥ 0 and 1 < α1 ≤ α2 ≤ ⋯. Several interesting weighted Hilbert spaces H(KRd,α,γ) appear in this paper. We consider the worst case error of algorithms that use finitely many arbitrary continuous linear functionals. We discuss tractability of L2-approximation problems for the involved Hilbert spaces, which describes how the information complexity depends on d and ε−1. As a consequence we study the strongly polynomial tractability (SPT), polynomial
29

Della Croce, Federico, and Vangelis Th Paschos. "Exploiting dominance conditions for computing non trivial worst-case complexity for bounded combinatorial optimization problems." Operational Research 8, no. 3 (2008): 235–56. http://dx.doi.org/10.1007/s12351-008-0020-8.

30

Hu, Xiao, and Qichen Wang. "Towards Update-Dependent Analysis of Query Maintenance." Proceedings of the ACM on Management of Data 3, no. 2 (2025): 1–25. https://doi.org/10.1145/3725254.

Abstract
This paper studies the hardness of maintaining self-join-free conjunctive queries over a dynamic database, where tuples can be inserted or deleted. The worst-case complexity of this problem under arbitrary updates has been well understood. It is known that most practical queries require Ω(√|D|) maintenance time for each update to ensure O(1)-delay enumeration, barring a very restricted class of queries (known as "q-hierarchical" queries). Nonetheless, most real-world update sequences are not arbitrary, far away from the worst-case scenario; instead, they are so "nice" that queries can greatly
31

Anastasi, G., P. Foglia, and L. Lenzini. "MPEG video traffic on a MetaRing network: complexity reduction of a ‘worst-case’ model for bandwidth allocation analysis." Computer Communications 21, no. 2 (1998): 111–25. http://dx.doi.org/10.1016/s0140-3664(97)00135-7.

32

Barbay, Jérémy. "Optimal Prefix Free Codes with Partial Sorting." Algorithms 13, no. 1 (2019): 12. http://dx.doi.org/10.3390/a13010012.

Abstract
We describe an algorithm computing an optimal prefix free code for n unsorted positive weights in time within O(n(1 + lg α)) ⊆ O(n lg n), where the alternation α ∈ [1..n−1] approximates the minimal amount of sorting required by the computation. This asymptotical complexity is within a constant factor of the optimal in the algebraic decision tree computational model, in the worst case over all instances of size n and alternation α. Such results refine the state of the art complexity of Θ(n lg n) in the worst case over instances of size n in the same computational model, a l
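The Θ(n lg n) baseline that this abstract refines is the classic heap-based Huffman construction over the full weight sequence. A minimal sketch of that baseline (not the partial-sorting algorithm of the paper), returning the total weighted code length of an optimal prefix free code:

```python
import heapq

def huffman_cost(weights):
    # Classic Huffman construction via a min-heap: repeatedly merge the
    # two smallest weights. Each merge adds its combined weight to the
    # total, which equals the weighted path length of the optimal tree.
    # Runs in Theta(n lg n) time over n weights.
    if len(weights) < 2:
        return 0
    heap = list(weights)
    heapq.heapify(heap)
    cost = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        cost += a + b
        heapq.heappush(heap, a + b)
    return cost
```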
33

Xia, Lirong, and Weiqiang Zheng. "The Smoothed Complexity of Computing Kemeny and Slater Rankings." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 6 (2021): 5742–50. http://dx.doi.org/10.1609/aaai.v35i6.16720.

Abstract
The computational complexity of winner determination under common voting rules is a classical and fundamental topic in the field of computational social choice. Previous work has established the NP-hardness of winner determination under some commonly-studied voting rules, such as the Kemeny rule and the Slater rule. In a recent position paper, Baumeister, Hogrebe, and Rothe (2020) questioned the relevance of the worst-case nature of NP-hardness in social choice and proposed to conduct smoothed complexity analysis (Spielman and Teng 2009) under Blaser and Manthey’s (2015) framework. In this pap
34

Dai, Jingke, Peng Zhao, Xiao Chen, and Fenggan Zhang. "Analysis and Design of Systematic Rateless Codes in FH/BFSK System with Interference." Mobile Information Systems 2020 (February 10, 2020): 1–8. http://dx.doi.org/10.1155/2020/9049284.

Abstract
The asymptotic analysis of systematic rateless codes in frequency hopping (FH) systems with interference is first provided using discretized density evolution (DDE) and compared with the traditional fixed-rate scheme. A simplified analysis with Gaussian assumption of initial message is proposed in the worst case of interference, which has much lower complexity and provides a very close result to DDE. Based on this simplified analysis, the linear programming is employed to design rateless codes and the simulation results on partial-band interference channels show that the optimized codes have m
35

Touil, Imene, and Wided Chikouche. "Primal-dual interior point methods for Semidefinite programming based on a new type of kernel functions." Filomat 34, no. 12 (2020): 3957–69. http://dx.doi.org/10.2298/fil2012957t.

Abstract
In this paper, we propose the first hyperbolic-logarithmic kernel function for Semidefinite programming problems. By simple analysis tools, several properties of this kernel function are used to compute the total number of iterations. We show that the worst-case iteration complexity of our algorithm for large-update methods improves the obtained iteration bounds based on hyperbolic [24] as well as classic kernel functions. For small-update methods, we derive the best known iteration bound.
36

Khan, Farhan Hai, Tannistha Pal, and Rounik Roy. "Development of a Universal Running Time Predictor using Multivariate Regression." International Journal of Innovative Research in Physics 3, no. 3 (2022): 1–9. http://dx.doi.org/10.15864/ijiip.3301.

Abstract
Various types of running times exist for analysis of algorithmic efficiency. This research presents a more empirical approach to the problem for the practical measurements of the actual running time of algorithms by considering a plethora of randomized inputs Rn and then fitting a regression curve in n to the algorithm of practical time complexity ν(n). This will also provide us the productivity factor η which will quantify the universal running time with respect to the asymptotic worst-case complexity and evaluate the efficiency of the given algorithm with the help of leading coefficients. T
37

Neunhöffer, Max, and Cheryl E. Praeger. "Computing Minimal Polynomials of Matrices." LMS Journal of Computation and Mathematics 11 (2008): 252–79. http://dx.doi.org/10.1112/s1461157000000590.

Abstract
We present and analyse a Monte-Carlo algorithm to compute the minimal polynomial of an n × n matrix over a finite field that requires O(n^3) field operations and O(n) random vectors, and is well suited for successful practical implementation. The algorithm, and its complexity analysis, use standard algorithms for polynomial and matrix operations. We compare features of the algorithm with several other algorithms in the literature. In addition we present a deterministic verification procedure which is similarly efficient in most cases but has a worst-case complexity of O(n^4). Finally, we repo
38

Selvi, V. "Clustering Analysis of Greedy Heuristic Method in Zero_One Knapsack Problem." International Journal of Emerging Research in Management and Technology 6, no. 7 (2018): 39. http://dx.doi.org/10.23956/ijermt.v6i7.181.

Abstract
The knapsack problem is a well-known class of optimization problems, which tries to maximize the profit of items in a knapsack without exceeding its capacity. Knapsack can be solved by several algorithms such as greedy, dynamic programming, branch & bound, etc. The solution to the zero_one knapsack problem (KP) can be viewed as the result of a sequence of decisions. Clustering is the process of resolving that type of application, grouping elements with equal priority. In this paper we introduce a greedy heuristic algorithm for solving the zero_one kn
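The greedy heuristic this abstract builds on is typically the profit/weight-ratio rule: sort the items once (O(n log n)) and take each item that still fits. A generic 0/1 knapsack sketch under that assumption (the paper's clustering step is not reproduced; names are ours):

```python
def greedy_knapsack(items, capacity):
    # Greedy heuristic for the 0/1 knapsack problem: consider items in
    # decreasing profit/weight ratio and take each one that still fits.
    # Fast (O(n log n)) but not exact -- it can miss the optimum.
    # items: list of (profit, weight) pairs with positive weights.
    order = sorted(items, key=lambda it: it[0] / it[1], reverse=True)
    total, chosen = 0, []
    for profit, weight in order:
        if weight <= capacity:
            chosen.append((profit, weight))
            capacity -= weight
            total += profit
    return total, chosen
```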
39

Yu, Kaiqiang, Kaixin Wang, Cheng Long, Laks Lakshmanan, and Reynold Cheng. "Fast Maximum Common Subgraph Search: A Redundancy-Reduced Backtracking Approach." Proceedings of the ACM on Management of Data 3, no. 3 (2025): 1–27. https://doi.org/10.1145/3725404.

Abstract
Given two input graphs, finding the largest subgraph that occurs in both, i.e., finding the maximum common subgraph, is a fundamental operator for evaluating the similarity between two graphs in graph data analysis. Existing works for solving the problem are of either theoretical or practical interest, but not both. Specifically, the algorithms with a theoretical guarantee on the running time are known to be not practically efficient; algorithms following the recently proposed backtracking framework called McSplit, run fast in practice but do not have any theoretical guarantees. In this paper,
40

Jiang, Tianzi. "The worst case complexity of the fredholm equation of the second kind with non-periodic free term and noise information." Numerical Functional Analysis and Optimization 19, no. 3-4 (1998): 329–43. http://dx.doi.org/10.1080/01630569808816831.

41

Song, Xin, Jingguo Ren, and Qiuming Li. "Doubly Constrained Robust Blind Beamforming Algorithm." Journal of Applied Mathematics 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/245609.

Abstract
We propose doubly constrained robust least-squares constant modulus algorithm (LSCMA) to solve the problem of signal steering vector mismatches via the Bayesian method and worst-case performance optimization, which is based on the mismatches between the actual and presumed steering vectors. The weight vector is iteratively updated with penalty for the worst-case signal steering vector by the partial Taylor-series expansion and Lagrange multiplier method, in which the Lagrange multipliers can be optimally derived and incorporated at each step. A theoretical analysis for our proposed algorithm i
42

Zhao, Lulu, Guang Liang, and Huijie Liu. "An Improved Robust Beamforming Design for Cognitive Multiantenna Relay Networks." Mobile Information Systems 2017 (2017): 1–11. http://dx.doi.org/10.1155/2017/2719543.

Abstract
This paper investigates the robust relay beamforming design for the multiantenna nonregenerative cognitive relay networks (CRNs). Firstly, it is proved that the optimal beamforming matrix could be simplified as the product of a variable vector and the conjugate transposition of a known channel response vector. Then, by exploiting the optimal beamforming matrix with simplified structure, an improved robust beamforming design is proposed. Analysis and simulation results show that, compared with the existing suboptimal scheme, the proposed method can achieve higher worst-case channel capacity wit
43

Liu, Yang, Hejiao Huang, Kaiqiang Yu, Shengxin Liu, and Cheng Long. "Efficient Maximum s-Bundle Search via Local Vertex Connectivity." Proceedings of the ACM on Management of Data 3, no. 1 (2025): 1–27. https://doi.org/10.1145/3709687.

Abstract
The s-bundle, as a cohesive subgraph model which relaxes the clique, remains connected whenever fewer than n−s vertices are removed, where n is the number of vertices inside. Finding the largest s-bundle is a fundamental problem and has diverse applications in various fields such as social network analysis, graph visualization, and bioinformatics. Existing studies for solving the problem follow the same branch-and-bound framework and improve the efficiency by developing pruning techniques. As a result, all share the same worst-case time complexity of O*(2^n), where O* suppresses the polyno
44

Wang, Biing-Feng. "A Better Analysis of Ben-Asher’s Algorithm for the Conditional Cartesian Product Problem." Parallel Processing Letters 06, no. 03 (1996): 331–44. http://dx.doi.org/10.1142/s0129626496000327.

Abstract
In [2], a new problem called conditional cartesian product problem was introduced. The problem has a trivial time lower bound [Formula: see text], where A and B are two groups of N elements and CA,B is a subset of the cartesian product A×B satisfying some unknown condition C. Ben-Asher proposed an efficient algorithm for the problem and showed that the proposed algorithm can be performed in [Formula: see text] time [2]. As compared with the trivial time lower bound, the algorithm is not optimal. In this paper, with a better analysis, we show that the worst-case time complexity of Ben-Asher’s a
Los estilos APA, Harvard, Vancouver, ISO, etc.
45

Behnke, Gregor, and Marcel Steinmetz. "On the Computational Complexity of Stackelberg Planning and Meta-Operator Verification." Proceedings of the International Conference on Automated Planning and Scheduling 34 (May 30, 2024): 20–24. http://dx.doi.org/10.1609/icaps.v34i1.31456.

Full text
Abstract
Stackelberg planning is a recently introduced single-turn, two-player adversarial planning model in which two players act in a joint classical planning task, the first player's objective being to hamper the second player from achieving its goal. This places the Stackelberg planning problem somewhere between classical planning and general combinatorial two-player games. But where, exactly? All investigations of Stackelberg planning so far have focused on practical aspects. We close this gap by conducting the first theoretical complexity analysis of Stackelberg planning. We show that in general…
46

Meghdouri, Fares, Félix Iglesias Vázquez, and Tanja Zseby. "Modeling data with observers." Intelligent Data Analysis 26, no. 3 (2022): 785–803. http://dx.doi.org/10.3233/ida-215741.

Full text
Abstract
Compact data models have become relevant due to the massive, ever-increasing generation of data. We propose Observers-based Data Modeling (ODM), a lightweight algorithm to extract low-density data models (aka coresets) that are suitable for both static and stream data analysis. ODM coresets preserve the internal structure of the data while alleviating the computational cost of machine learning during evaluation phases, with an O(n log n) worst-case complexity. We compare ODM with previous proposals in classification, clustering, and outlier detection. Results show the preponderance of ODM for obtaining…
48

Xu, Zhang Yan, Wei Zhang, and Yan Ying Fan. "Attribute Reduction Algorithm of Incomplete Dicision Table Based on Conditional Entropy." Applied Mechanics and Materials 380-384 (August 2013): 1505–9. http://dx.doi.org/10.4028/www.scientific.net/amm.380-384.1505.

Full text
Abstract
The search for attribute reduction algorithms of rough sets in incomplete decision tables is a research hot spot. Through an analysis of the advantages and disadvantages of the existing attribute reduction algorithms, we put forward a definition of the relative discernibility matrix based on the positive region. We then compute the tolerance classes using the idea of the cardinal-number sorting method, giving a quick heuristic attribute reduction algorithm based on conditional entropy and the relative discernibility matrix, whose worst-case time complexity is [Formula: see text]. The test results show that the algorithm…
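Conditional entropy over a decision table, as used by heuristics like the one the abstract describes, can be sketched in a few lines. This is a simplified illustration for a *complete* table partitioned into equivalence classes (the incomplete-table case in the paper uses tolerance classes instead); the table and attribute indices are hypothetical:

```python
from collections import Counter, defaultdict
from math import log2

def conditional_entropy(rows, cond_idx, dec_idx):
    """H(D | C): entropy of the decision attribute D given the
    condition attributes C, over a complete decision table."""
    n = len(rows)
    groups = defaultdict(list)
    for row in rows:
        # rows agreeing on all condition attributes form one class
        key = tuple(row[i] for i in cond_idx)
        groups[key].append(row[dec_idx])
    h = 0.0
    for decisions in groups.values():
        p_class = len(decisions) / n
        h_class = -sum((c / len(decisions)) * log2(c / len(decisions))
                       for c in Counter(decisions).values())
        h += p_class * h_class
    return h

# toy table: two condition attributes, one decision attribute
table = [
    ("a", 1, "yes"),
    ("a", 1, "yes"),
    ("b", 0, "no"),
    ("b", 1, "yes"),
]
print(conditional_entropy(table, cond_idx=(0, 1), dec_idx=2))  # prints 0.0: C determines D
```

A reduction heuristic would greedily drop attributes from `cond_idx` while the conditional entropy stays unchanged, keeping only those needed to preserve the decision.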
49

García Ojeda, Juan Carlos. "A Look at Algorithm BEPtoPNST." Revista Ingenierías Universidad de Medellín 20, no. 39 (2020): 115–28. http://dx.doi.org/10.22395/rium.v20n39a7.

Full text
Abstract
This work analyzes the computational complexity of algorithm BEPtoPNST, which transforms a building-evacuation problem (BEP) into a time-expanded process-network synthesis (PNST) problem. The latter is solved by resorting to the P-graph method, which exploits the combinatorial nature of a BEP. Unlike other approaches, the P-graph method provides not only the optimal solution (best evacuation route as a function of egress time) but also the best n sub-optimal solutions. For the complexity analysis, a generic processor and a Random-Access Machine (RAM) model were deployed as…
50

Liu, Fangyao, Yayu Peng, Zhengxin Chen, and Yong Shi. "Modeling of Characteristics on Artificial Intelligence IQ Test: a Fuzzy Cognitive Map-Based Dynamic Scenario Analysis." International Journal of Computers Communications & Control 14, no. 6 (2019): 653–69. http://dx.doi.org/10.15837/ijccc.2019.6.3692.

Full text
Abstract
This research article uses a Fuzzy Cognitive Map (FCM) approach to improve the characteristics of an earlier proposed IQ test for Artificial Intelligence (AI) systems. The defuzzification process makes use of fuzzy logic and the triangular membership function along with linguistic-term analyses. Each edge of the proposed FCM is assigned a positive or negative influence type associated with a quantitative weight. All weights are based on the defuzzified values. The research also leverages a dynamic scenario analysis to investigate the interrelationships between…
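The triangular membership function and centroid defuzzification mentioned in the abstract are standard fuzzy-logic building blocks; a minimal sketch, with a hypothetical linguistic term on [0, 1], looks like this:

```python
def tri_membership(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)  # rising edge
    return (c - x) / (c - b)      # falling edge

def centroid_defuzzify(a, b, c, steps=1000):
    """Approximate the centroid (center of gravity) of a triangular
    fuzzy set by sampling the membership function."""
    xs = [a + (c - a) * i / steps for i in range(steps + 1)]
    mus = [tri_membership(x, a, b, c) for x in xs]
    den = sum(mus)
    return sum(x * m for x, m in zip(xs, mus)) / den if den else (a + c) / 2

# hypothetical linguistic term "moderate influence" on [0, 1]
print(round(centroid_defuzzify(0.2, 0.5, 0.8), 3))  # prints 0.5 (symmetric triangle)
```

The defuzzified value would then serve as the signed edge weight in the FCM; the influence type (positive or negative) fixes its sign.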