Academic literature on the topic "Partially-Separable Structure"
Consult the topical lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Partially-Separable Structure".
Journal articles on the topic "Partially-Separable Structure":
Sun, Yifan, Martin S. Andersen, and Lieven Vandenberghe. "Decomposition in Conic Optimization with Partially Separable Structure". SIAM Journal on Optimization 24, no. 2 (January 2014): 873–97. http://dx.doi.org/10.1137/130926924.
Michelini, Giorgia, Celeste H. M. Cheung, Viryanaga Kitsune, Daniel Brandeis, Tobias Banaschewski, Gráinne McLoughlin, Philip Asherson, Frühling Rijsdijk, and Jonna Kuntsi. "The Etiological Structure of Cognitive-Neurophysiological Impairments in ADHD in Adolescence and Young Adulthood". Journal of Attention Disorders 25, no. 1 (May 3, 2018): 91–104. http://dx.doi.org/10.1177/1087054718771191.
Bai, Fu-Sheng, and Ling Xu. "A Partially Parallel Prediction-Correction Splitting Method for Convex Optimization Problems with Separable Structure". Journal of the Operations Research Society of China 5, no. 4 (April 24, 2017): 529–44. http://dx.doi.org/10.1007/s40305-017-0163-5.
Porcelli, Margherita, and Philippe L. Toint. "Exploiting Problem Structure in Derivative Free Optimization". ACM Transactions on Mathematical Software 48, no. 1 (March 31, 2022): 1–25. http://dx.doi.org/10.1145/3474054.
Gartside, Paul, and Ana Mamatelashvili. "The Tukey Order on Compact Subsets of Separable Metric Spaces". Journal of Symbolic Logic 81, no. 1 (March 2016): 181–200. http://dx.doi.org/10.1017/jsl.2015.49.
Wilson, Alexander C., and Dorothy V. M. Bishop. ""If you catch my drift...": ability to infer implied meaning is distinct from vocabulary and grammar skills". Wellcome Open Research 4 (April 15, 2019): 68. http://dx.doi.org/10.12688/wellcomeopenres.15210.1.
Wilson, Alexander C., and Dorothy V. M. Bishop. ""If you catch my drift...": ability to infer implied meaning is distinct from vocabulary and grammar skills". Wellcome Open Research 4 (July 10, 2019): 68. http://dx.doi.org/10.12688/wellcomeopenres.15210.2.
Wilson, Alexander C., and Dorothy V. M. Bishop. ""If you catch my drift...": ability to infer implied meaning is distinct from vocabulary and grammar skills". Wellcome Open Research 4 (August 30, 2019): 68. http://dx.doi.org/10.12688/wellcomeopenres.15210.3.
Gustavson, Daniel E., Naomi P. Friedman, Pierre Fontanillas, Sarah L. Elson, Abraham A. Palmer, and Sandra Sanchez-Roige. "The Latent Genetic Structure of Impulsivity and Its Relation to Internalizing Psychopathology". Psychological Science 31, no. 8 (July 27, 2020): 1025–35. http://dx.doi.org/10.1177/0956797620938160.
Yao, Qiong, Xiang Xu, and Wensheng Li. "A Sparsified Densely Connected Network with Separable Convolution for Finger-Vein Recognition". Symmetry 14, no. 12 (December 19, 2022): 2686. http://dx.doi.org/10.3390/sym14122686.
Theses on the topic "Partially-Separable Structure":
Raynaud, Paul. "L'exploitation de la structure partiellement-séparable dans les méthodes quasi-Newton pour l'optimisation sans contrainte et l'apprentissage profond". Electronic Thesis or Diss., Université Grenoble Alpes, 2024. http://www.theses.fr/2024GRALI021.
This thesis studies and improves the use of the partially-separable structure in unconstrained optimization, particularly for quasi-Newton methods and the training of neural networks. A partially-separable function is a sum of element functions, each of lower dimension than the total problem. The Hessian can therefore be aggregated by separately approximating the Hessian of each element function with a dense matrix. These partitioned quasi-Newton methods are applicable to high-dimensional problems and preserve the sparse structure of the Hessian, unlike limited-memory quasi-Newton methods. In practice, they require fewer iterations than a limited-memory quasi-Newton method and are parallelizable by distributing the computations related to the element functions. However, a comprehensive literature review on the subject revealed some limitations, particularly when the dimension of the element functions is large. Additionally, the only open-source optimization software exploiting the partially-separable structure is unusable for inexperienced users, leaving only commercial software as an option. This thesis proposes two solutions to address these shortcomings, along with an application of partially-separable optimization concepts to the supervised training of a neural network.
The first contribution is a software suite based on automatic detection of the partially-separable structure of a problem, i.e., it retrieves each reduced-dimensional element function. Partitioned data structures for storing the derivatives, or their approximations, are then allocated and used to define partitioned quasi-Newton optimization methods. The entire suite is integrated into the "JuliaSmoothOptimizers" ecosystem, which gathers numerous tools for smooth optimization, including optimization algorithms that can thereby exploit the detected partial separability.
The second contribution replaces the dense-matrix approximation of each element Hessian with a limited-memory quasi-Newton linear operator. As a result, the memory cost of the total Hessian approximation no longer grows quadratically with the dimension of the element functions, and a limited-memory partitioned quasi-Newton method becomes applicable when the element functions are large. Each limited-memory partitioned quasi-Newton method comes with a proof of global convergence, and numerical results show that these methods outperform both partitioned and limited-memory quasi-Newton methods when the elements are large.
The final contribution examines the exploitation of the partially-separable structure during the supervised training of a neural network. The optimization problem associated with training is generally not partially separable, so a partially-separable loss function and a partitioned network architecture are introduced to make the training partially separable. Numerical results combining these two contributions are competitive with standard architectures and loss functions trained by state-of-the-art methods. Moreover, this combination yields a parallelization scheme additional to existing methods for supervised learning: the computation of each element loss function can be distributed to a worker that requires only a fraction of the neural network to operate. Finally, a limited-memory partitioned quasi-Newton training method is proposed and shown empirically to be competitive with state-of-the-art training methods.
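The abstract above rests on one structural idea: a partially-separable function f(x) = Σᵢ fᵢ(Uᵢ x), where each element function fᵢ touches only a few variables, so the total Hessian can be assembled from small dense element Hessians. A minimal sketch of that assembly (the chained-Rosenbrock example, function names, and the use of exact element Hessians in place of quasi-Newton approximations are illustrative assumptions, not code from the thesis or from JuliaSmoothOptimizers):

```python
import numpy as np

def element_f(y):
    # A 2-D element function (Rosenbrock-style term, assumed for illustration).
    return (y[0] - 1.0) ** 2 + 100.0 * (y[1] - y[0] ** 2) ** 2

def element_hess(y):
    # Exact 2x2 Hessian of element_f; a partitioned quasi-Newton method
    # would instead maintain a small dense (or limited-memory) update here.
    return np.array([
        [2.0 - 400.0 * (y[1] - 3.0 * y[0] ** 2), -400.0 * y[0]],
        [-400.0 * y[0], 200.0],
    ])

def partitioned_hessian(x, elements):
    # Aggregate the total Hessian from element Hessians: each element
    # contributes only to the rows/columns of its own variables, so the
    # result keeps the problem's sparsity pattern.
    n = len(x)
    H = np.zeros((n, n))
    for idx in elements:
        Hi = element_hess(x[list(idx)])
        for a, i in enumerate(idx):
            for b, j in enumerate(idx):
                H[i, j] += Hi[a, b]
    return H

# Chained Rosenbrock in 6 variables: element i couples variables (i, i+1).
n = 6
elements = [(i, i + 1) for i in range(n - 1)]
x = np.zeros(n)
H = partitioned_hessian(x, elements)
# The assembled Hessian is tridiagonal: H[i, j] == 0 whenever |i - j| > 1.
```

Each element Hessian is only 2×2 regardless of n, which is why partitioned methods scale to high dimension, and the per-element loop is what the abstract's parallelization scheme distributes across workers.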
Book chapters on the topic "Partially-Separable Structure":
Lootsma, F. A. "Dual Methods for Large-scale, Partially-separable Nonlinear Optimization". In Discretization Methods and Structural Optimization — Procedures and Applications, 229–38. Berlin, Heidelberg: Springer Berlin Heidelberg, 1989. http://dx.doi.org/10.1007/978-3-642-83707-4_29.
Conference proceedings on the topic "Partially-Separable Structure":
Coster, J. E., N. Stander, and J. A. Snyman. "Trust Region Augmented Lagrangian Methods With Secant Hessian Updating Applied to Structural Optimization". In ASME 1996 Design Engineering Technical Conferences and Computers in Engineering Conference. American Society of Mechanical Engineers, 1996. http://dx.doi.org/10.1115/96-detc/dac-1461.