Academic literature on the topic 'Partially-Separable Structure'
Journal articles on the topic "Partially-Separable Structure":
Sun, Yifan, Martin S. Andersen, and Lieven Vandenberghe. "Decomposition in Conic Optimization with Partially Separable Structure." SIAM Journal on Optimization 24, no. 2 (January 2014): 873–97. http://dx.doi.org/10.1137/130926924.
Michelini, Giorgia, Celeste H. M. Cheung, Viryanaga Kitsune, Daniel Brandeis, Tobias Banaschewski, Gráinne McLoughlin, Philip Asherson, Frühling Rijsdijk, and Jonna Kuntsi. "The Etiological Structure of Cognitive-Neurophysiological Impairments in ADHD in Adolescence and Young Adulthood." Journal of Attention Disorders 25, no. 1 (May 3, 2018): 91–104. http://dx.doi.org/10.1177/1087054718771191.
Bai, Fu-Sheng, and Ling Xu. "A Partially Parallel Prediction-Correction Splitting Method for Convex Optimization Problems with Separable Structure." Journal of the Operations Research Society of China 5, no. 4 (April 24, 2017): 529–44. http://dx.doi.org/10.1007/s40305-017-0163-5.
Porcelli, Margherita, and Philippe L. Toint. "Exploiting Problem Structure in Derivative Free Optimization." ACM Transactions on Mathematical Software 48, no. 1 (March 31, 2022): 1–25. http://dx.doi.org/10.1145/3474054.
Gartside, Paul, and Ana Mamatelashvili. "The Tukey Order on Compact Subsets of Separable Metric Spaces." Journal of Symbolic Logic 81, no. 1 (March 2016): 181–200. http://dx.doi.org/10.1017/jsl.2015.49.
Wilson, Alexander C., and Dorothy V. M. Bishop. "'If you catch my drift...': Ability to infer implied meaning is distinct from vocabulary and grammar skills." Wellcome Open Research 4 (April 15, 2019): 68. http://dx.doi.org/10.12688/wellcomeopenres.15210.1.
Wilson, Alexander C., and Dorothy V. M. Bishop. "'If you catch my drift...': Ability to infer implied meaning is distinct from vocabulary and grammar skills." Wellcome Open Research 4 (July 10, 2019): 68. http://dx.doi.org/10.12688/wellcomeopenres.15210.2.
Wilson, Alexander C., and Dorothy V. M. Bishop. "'If you catch my drift...': Ability to infer implied meaning is distinct from vocabulary and grammar skills." Wellcome Open Research 4 (August 30, 2019): 68. http://dx.doi.org/10.12688/wellcomeopenres.15210.3.
Gustavson, Daniel E., Naomi P. Friedman, Pierre Fontanillas, Sarah L. Elson, Abraham A. Palmer, and Sandra Sanchez-Roige. "The Latent Genetic Structure of Impulsivity and Its Relation to Internalizing Psychopathology." Psychological Science 31, no. 8 (July 27, 2020): 1025–35. http://dx.doi.org/10.1177/0956797620938160.
Yao, Qiong, Xiang Xu, and Wensheng Li. "A Sparsified Densely Connected Network with Separable Convolution for Finger-Vein Recognition." Symmetry 14, no. 12 (December 19, 2022): 2686. http://dx.doi.org/10.3390/sym14122686.
Dissertations / Theses on the topic "Partially-Separable Structure":
Raynaud, Paul. "L'exploitation de la structure partiellement-séparable dans les méthodes quasi-Newton pour l'optimisation sans contrainte et l'apprentissage profond" [Exploiting the partially-separable structure in quasi-Newton methods for unconstrained optimization and deep learning]. PhD diss., Université Grenoble Alpes, 2024. http://www.theses.fr/2024GRALI021.
This thesis studies and improves the use of the partially-separable structure in unconstrained optimization, particularly for quasi-Newton methods and the training of neural networks. A partially-separable function is a sum of element functions, each of lower dimension than the total problem. The Hessian can therefore be aggregated by approximating the Hessian of each element function separately with a dense matrix. These partitioned quasi-Newton methods are applicable to high-dimensional problems and, unlike limited-memory quasi-Newton methods, preserve the sparse structure of the Hessian. In practice, they require fewer iterations than a limited-memory quasi-Newton method and are parallelizable by distributing the computations related to the element functions. However, a comprehensive literature review revealed some limitations, particularly when the dimension of the element functions is large. Moreover, the only open-source optimization software exploiting the partially-separable structure is unusable by inexperienced users, leaving commercial software as the only option. This thesis proposes two solutions to these shortcomings, along with an application of partially-separable optimization concepts to the supervised training of a neural network.

The first contribution is a software suite based on the automatic detection of the partially-separable structure of a problem, i.e., it retrieves each element function of reduced dimension. The partitioned data structures needed to store the derivatives, or their approximations, are then allocated and used to define partitioned quasi-Newton optimization methods. The whole suite is integrated into the "JuliaSmoothOptimizers" ecosystem, which gathers numerous tools for smooth optimization, including optimization algorithms that can thereby exploit the detected partial separability.

The second contribution replaces the dense-matrix approximation of each element Hessian with a limited-memory quasi-Newton linear operator. As a result, the memory cost of the total Hessian approximation no longer grows quadratically with the dimension of the element functions, so a limited-memory partitioned quasi-Newton method remains applicable when the element functions are large. Each limited-memory partitioned quasi-Newton method comes with a proof of global convergence, and numerical results show that these methods outperform both partitioned and limited-memory quasi-Newton methods when the elements are large.

The final contribution examines how the partially-separable structure can be exploited during the supervised training of a neural network. The optimization problem associated with training is generally not partially separable, so a partially-separable loss function and a partitioned network architecture are introduced to make the training partially separable. Numerical results combining these two contributions are competitive with standard architectures and loss functions trained by state-of-the-art methods. Moreover, this combination adds a parallelization scheme to existing methods for supervised learning: the computation of each element loss function can be distributed to a worker that needs only a fraction of the neural network to operate. Finally, a limited-memory partitioned quasi-Newton training method is proposed and shown empirically to be competitive with state-of-the-art training methods.
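To make the abstract concrete, the following is a minimal formalization of partial separability in the notation standard since Griewank and Toint; the symbols U_i, B_i, and n_i are conventional illustration, not drawn verbatim from the thesis:

    f(x) = \sum_{i=1}^{N} f_i(U_i x), \qquad U_i \in \mathbb{R}^{n_i \times n}, \quad n_i \ll n,

    \nabla^2 f(x) \approx B = \sum_{i=1}^{N} U_i^{\top} B_i U_i,

where each element function f_i depends on only n_i of the n variables and each B_i is a quasi-Newton approximation of the element Hessian \nabla^2 f_i: a dense n_i-by-n_i matrix in classical partitioned methods, or a limited-memory operator in the thesis's second contribution. Because every B_i is small, the aggregate B retains the sparsity pattern of the true Hessian while remaining cheap to store and update.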
Book chapters on the topic "Partially-Separable Structure":
Lootsma, F. A. "Dual Methods for Large-scale, Partially-separable Nonlinear Optimization." In Discretization Methods and Structural Optimization — Procedures and Applications, 229–38. Berlin, Heidelberg: Springer Berlin Heidelberg, 1989. http://dx.doi.org/10.1007/978-3-642-83707-4_29.
Conference papers on the topic "Partially-Separable Structure":
Coster, J. E., N. Stander, and J. A. Snyman. "Trust Region Augmented Lagrangian Methods With Secant Hessian Updating Applied to Structural Optimization." In ASME 1996 Design Engineering Technical Conferences and Computers in Engineering Conference. American Society of Mechanical Engineers, 1996. http://dx.doi.org/10.1115/96-detc/dac-1461.