Ready-made bibliography on the topic "Lipschitz neural network"

Create an accurate reference in APA, MLA, Chicago, Harvard, and many other citation styles


Consult the lists of up-to-date articles, books, dissertations, abstracts, and other scholarly sources on the topic "Lipschitz neural network".

An "Add to bibliography" button is available next to every work in the list. Use it, and we will automatically generate a bibliographic reference to the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read the work's abstract online, provided such details are available in the source's metadata.

Journal articles on the topic "Lipschitz neural network"

1. Zhu, Zelong, Chunna Zhao, and Yaqun Huang. "Fractional order Lipschitz recurrent neural network with attention for long time series prediction." Journal of Physics: Conference Series 2813, no. 1 (2024): 012015. http://dx.doi.org/10.1088/1742-6596/2813/1/012015.

Abstract:
Time series prediction holds significant importance in many applications, and this study concentrates on long time series. Recurrent Neural Networks are widely recognized as a fundamental architecture for processing time series data effectively, but they encounter vanishing or exploding gradients on long sequences. To resolve the gradient problem and improve accuracy, this paper proposes the Fractional Order Lipschitz Recurrent Neural Network (FOLRNN) model for long time series prediction. The method uses Lipschitz continuity to alleviate the gradient problem and applies fractional order integration to compute the hidden states of the recurrent network; fractional order calculus can capture the intricate dynamics of long time series, giving more accurate predictions than Lipschitz Recurrent Neural Network models. Self-attention is then used to improve the feature representation, describing the correlations between features and improving predictive performance. Experiments show that the FOLRNN model achieves better results than other methods.
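To make the first ingredient of this abstract concrete, the sketch below shows a minimal recurrent cell whose hidden-to-hidden map is kept 1-Lipschitz by rescaling the recurrent weight matrix with its spectral norm, so perturbations of the hidden state are never amplified across time steps. It is an illustration only, not the authors' FOLRNN: the fractional order integration and the self-attention module are omitted, and all sizes, scales, and names are assumptions.

```python
# Illustrative sketch only: a recurrent cell kept 1-Lipschitz in the hidden
# state by dividing the recurrent weight matrix by its spectral norm; tanh is
# itself 1-Lipschitz, so perturbations can never grow across time steps
# (no exploding gradients). The paper's FOLRNN additionally uses fractional-
# order integration and self-attention, which are NOT reproduced here.
import numpy as np

rng = np.random.default_rng(0)
hidden, inputs = 32, 8
W_h = rng.normal(size=(hidden, hidden))           # recurrent weights
W_x = rng.normal(size=(hidden, inputs)) * 0.1     # input weights

# Rescale so the largest singular value of W_h is at most 1.
W_h = W_h / max(np.linalg.norm(W_h, ord=2), 1.0)

def step(h, x):
    """One recurrent step; 1-Lipschitz in h once ||W_h||_2 <= 1."""
    return np.tanh(W_h @ h + W_x @ x)

xs = rng.normal(size=(1000, inputs))              # a long input sequence
h, h_pert = np.zeros(hidden), np.zeros(hidden) + 1e-3
for x in xs:
    h, h_pert = step(h, x), step(h_pert, x)

# The gap between the two trajectories never exceeds the initial perturbation.
print(np.linalg.norm(h_pert - h) <= 1e-3 * np.sqrt(hidden) + 1e-12)
```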
2. Zhang, Huan, Pengchuan Zhang, and Cho-Jui Hsieh. "RecurJac: An Efficient Recursive Algorithm for Bounding Jacobian Matrix of Neural Networks and Its Applications." Proceedings of the AAAI Conference on Artificial Intelligence 33, no. 1 (2019): 5757–64. http://dx.doi.org/10.1609/aaai.v33i01.33015757.

Abstract:
The Jacobian matrix (or the gradient, for single-output networks) is directly related to many important properties of neural networks, such as the function landscape, stationary points, (local) Lipschitz constants, and robustness to adversarial attacks. In this paper, we propose a recursive algorithm, RecurJac, to compute both upper and lower bounds for each element of the Jacobian matrix of a neural network with respect to the network's input, where the network can contain a wide range of activation functions. As a byproduct, we can efficiently obtain a (local) Lipschitz constant, which plays a crucial role in neural network robustness verification as well as in the training stability of GANs. Experiments show that the (local) Lipschitz constants produced by our method are of better quality than those of previous approaches, thus providing better robustness verification results. Our algorithm has polynomial time complexity, and its computation time is reasonable even for relatively large networks. Additionally, we use our Jacobian bounds to characterize the landscape of the neural network, for example to determine whether stationary points exist in a local neighborhood.
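For background on what such Jacobian bounds improve upon, the sketch below computes the naive global Lipschitz upper bound for a ReLU network, namely the product of the spectral norms of its weight matrices. It shows only this standard loose baseline under an assumed toy architecture, not the RecurJac algorithm itself.

```python
# Naive global Lipschitz upper bound for a ReLU MLP: the product of the
# spectral norms of its weight matrices. This is the loose baseline that
# tighter Jacobian-based methods such as RecurJac improve upon (the RecurJac
# recursion is not reproduced here); layer shapes are assumptions.
import numpy as np

rng = np.random.default_rng(0)
layer_shapes = [(64, 784), (64, 64), (10, 64)]    # assumed toy architecture
weights = [rng.normal(scale=0.05, size=s) for s in layer_shapes]

def naive_lipschitz_bound(weights):
    """prod_k ||W_k||_2 upper-bounds the Lipschitz constant of
    x -> W_n relu(... relu(W_1 x)), since relu is 1-Lipschitz."""
    bound = 1.0
    for W in weights:
        bound *= np.linalg.norm(W, ord=2)         # largest singular value
    return bound

print(naive_lipschitz_bound(weights))
```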
3. Araujo, Alexandre, Benjamin Negrevergne, Yann Chevaleyre, and Jamal Atif. "On Lipschitz Regularization of Convolutional Layers using Toeplitz Matrix Theory." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 8 (2021): 6661–69. http://dx.doi.org/10.1609/aaai.v35i8.16824.

Abstract:
This paper tackles the problem of Lipschitz regularization of convolutional neural networks. Lipschitz regularity is now established as a key property of modern deep learning, with implications for training stability, generalization, robustness against adversarial examples, and more. However, computing the exact Lipschitz constant of a neural network is known to be NP-hard. Recent attempts in the literature introduce upper bounds to approximate this constant that are either efficient but loose, or accurate but computationally expensive. In this work, by leveraging the theory of Toeplitz matrices, we introduce a new upper bound for convolutional layers that is both tight and easy to compute. Based on this result, we devise an algorithm to train Lipschitz-regularized convolutional neural networks.
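As a point of reference for this abstract, the sketch below estimates the spectral norm (the Lipschitz constant with respect to the L2 norm) of a convolutional layer by power iteration, a common but comparatively expensive baseline. The paper's Toeplitz-matrix bound is a different construction and is not reproduced here; the layer and input shapes are assumptions.

```python
# Generic power-iteration estimate of the spectral norm of a convolutional
# layer (its Lipschitz constant w.r.t. the L2 norm). This is a common baseline
# only; the paper's Toeplitz-matrix bound is not reproduced here, and the
# kernel and input shapes below are assumptions.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
weight = torch.randn(16, 3, 3, 3)                 # (out_ch, in_ch, kH, kW)

def conv_spectral_norm(weight, input_shape=(1, 3, 32, 32), iters=50):
    """Estimate ||K||_2 of the map x -> conv2d(x, weight, padding=1) by power
    iteration on K^T K, using conv_transpose2d as the adjoint K^T."""
    x = torch.randn(input_shape)
    for _ in range(iters):
        y = F.conv2d(x, weight, padding=1)                 # K x
        x = F.conv_transpose2d(y, weight, padding=1)       # K^T K x
        x = x / x.norm()
    y = F.conv2d(x, weight, padding=1)
    return (y.norm() / x.norm()).item()

print(conv_spectral_norm(weight))
```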
4. Xu, Yuhui, Wenrui Dai, Yingyong Qi, Junni Zou, and Hongkai Xiong. "Iterative Deep Neural Network Quantization With Lipschitz Constraint." IEEE Transactions on Multimedia 22, no. 7 (2020): 1874–88. http://dx.doi.org/10.1109/tmm.2019.2949857.

5. Mohammad, Ibtihal J. "Neural Networks of the Rational r-th Powers of the Multivariate Bernstein Operators." Basra Journal of Science 40, no. 2 (2022): 258–73. http://dx.doi.org/10.29072/basjs.20220201.

Abstract:
In this study, a novel neural network for the rational powers of the multivariate Bernstein operators is developed; these networks require a positive integer parameter. First, the pointwise and uniform approximation theorems are introduced and examined in the space of all real-valued continuous functions. After that, two key theorems are studied in the Lipschitz space. Additionally, some numerical examples are provided to demonstrate how well these neural networks approximate two test functions; the numerical outcomes show that the neural network provides a better approximation as the input grows. Finally, graphs of these neural network approximations show the average error between the approximation and the test function.
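For orientation, the classical univariate Bernstein operator underlying this line of work is B_n(f)(x) = Σ_{k=0}^{n} f(k/n) · C(n, k) · x^k (1 − x)^{n−k}. The sketch below only evaluates this classical operator on a Lipschitz test function to show the approximation improving with n; the rational r-th power, multivariate variants studied in the paper are not reproduced, and the test function and degrees are our own choices.

```python
# Classical univariate Bernstein operator
#   B_n(f)(x) = sum_k f(k/n) * C(n, k) * x^k * (1 - x)^(n - k).
# The paper's rational r-th power, multivariate variants are NOT reproduced;
# this only shows the classical operator approximating a Lipschitz test
# function better as n grows. Test function and degrees are assumptions.
import numpy as np
from math import comb

def bernstein(f, n, x):
    k = np.arange(n + 1)
    coeffs = np.array([comb(n, int(j)) for j in k], dtype=float)
    basis = coeffs * x[:, None] ** k * (1 - x[:, None]) ** (n - k)
    return basis @ f(k / n)

f = lambda t: np.abs(t - 0.5)                     # Lipschitz (constant 1), not smooth
x = np.linspace(0.0, 1.0, 201)
for n in (4, 16, 64, 256):
    err = np.max(np.abs(bernstein(f, n, x) - f(x)))
    print(n, round(float(err), 4))                # max error shrinks as n grows
```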
6. Mohammad, Ibtihal J., and Ali J. Mohammad. "Neural Network of Multivariate Square Rational Bernstein Operators with Positive Integer Parameter." European Journal of Pure and Applied Mathematics 15, no. 3 (2022): 1189–200. http://dx.doi.org/10.29020/nybg.ejpam.v15i3.4425.

Abstract:
This research defines a new neural network (NN) that depends on a positive integer parameter, using the multivariate square rational Bernstein polynomials. Some theorems for this network are proved, such as the pointwise and uniform approximation theorems. First, the absolute moment of a function belonging to the Lipschitz space is defined in order to estimate the order of the NN. Second, some numerical applications of this NN are given on two test functions. Finally, the numerical results for this network are compared with those of classical neural networks (NNs); they show that the new network is better than the classical one.
7. Liu, Kanglin, and Guoping Qiu. "Lipschitz constrained GANs via boundedness and continuity." Neural Computing and Applications 32, no. 24 (2020): 18271–83. http://dx.doi.org/10.1007/s00521-020-04954-z.

Abstract:
One of the challenges in the study of generative adversarial networks (GANs) is the difficulty of controlling their performance. A Lipschitz constraint is essential for guaranteeing training stability in GANs. Although heuristic methods such as weight clipping, gradient penalty, and spectral normalization have been proposed to enforce the Lipschitz constraint, it is still difficult to achieve a solution that is both practically effective and theoretically provably Lipschitz. In this paper, we introduce the boundedness and continuity (BC) conditions to enforce the Lipschitz constraint on the discriminator functions of GANs. We prove theoretically that GANs with discriminators meeting the BC conditions satisfy the Lipschitz constraint. We present a practically very effective implementation of a GAN based on a convolutional neural network (CNN) by forcing the CNN to satisfy the BC conditions (BC-GAN). We show that, compared with recent techniques including gradient penalty and spectral normalization, BC-GANs not only perform better but also have lower computational complexity.
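To ground the baselines this abstract compares against, the snippet below applies spectral normalization, one of the heuristics the paper mentions, to a small convolutional discriminator in PyTorch, together with optional WGAN-style weight clipping. It illustrates those standard Lipschitz-enforcing tricks only; the paper's boundedness and continuity (BC) conditions are not reproduced, and the architecture is an arbitrary assumption.

```python
# Standard Lipschitz-enforcing baselines mentioned in the abstract, applied to
# a toy convolutional discriminator: spectral normalization on every layer,
# plus optional WGAN-style weight clipping. The paper's BC conditions are a
# different mechanism and are NOT reproduced here; the architecture is assumed.
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

discriminator = nn.Sequential(
    spectral_norm(nn.Conv2d(3, 32, 4, stride=2, padding=1)),   # 32x32 -> 16x16
    nn.LeakyReLU(0.2),
    spectral_norm(nn.Conv2d(32, 64, 4, stride=2, padding=1)),  # 16x16 -> 8x8
    nn.LeakyReLU(0.2),
    nn.Flatten(),
    spectral_norm(nn.Linear(64 * 8 * 8, 1)),                   # critic score
)

def clip_weights(model, c=0.01):
    """WGAN-style weight clipping: another heuristic for a Lipschitz critic."""
    with torch.no_grad():
        for p in model.parameters():
            p.clamp_(-c, c)

scores = discriminator(torch.randn(4, 3, 32, 32))
print(scores.shape)                                            # torch.Size([4, 1])
```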
8. Othmani, S., N. E. Tatar, and A. Khemmoudj. "Asymptotic behavior of a BAM neural network with delays of distributed type." Mathematical Modelling of Natural Phenomena 16 (2021): 29. http://dx.doi.org/10.1051/mmnp/2021023.

Abstract:
In this paper, we examine a Bidirectional Associative Memory neural network model with distributed delays. Using a result due to Cid [J. Math. Anal. Appl. 281 (2003) 264–275], we prove an exponential stability result in the case where the standard Lipschitz continuity condition is violated; indeed, we deal with activation functions that may not be Lipschitz continuous. The standard Halanay inequality is therefore not applicable, and we use a nonlinear version of this inequality instead. The resulting differential inequality, which should imply exponential stability, turns out to be "state dependent": the usual constant depends in this case on the state itself. This adds some difficulties, which we overcome by a suitable argument.
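For context, the classical (linear) Halanay inequality referred to in this abstract can be stated as follows. The paper itself relies on a nonlinear generalization, which we do not attempt to state here.

```latex
% Classical (linear) Halanay inequality; the paper uses a nonlinear
% generalization, not reproduced here.
% Suppose a nonnegative function v satisfies, for t >= t_0,
\[
  v'(t) \;\le\; -a\, v(t) \;+\; b \sup_{t-\tau \le s \le t} v(s),
  \qquad a > b > 0 .
\]
% Then there exist constants C > 0 and \gamma > 0 such that
\[
  v(t) \;\le\; C\, e^{-\gamma (t - t_0)} \qquad \text{for all } t \ge t_0,
\]
% i.e. v decays exponentially.
```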
9. Xia, Youshen. "An Extended Projection Neural Network for Constrained Optimization." Neural Computation 16, no. 4 (2004): 863–83. http://dx.doi.org/10.1162/089976604322860730.

Abstract:
Recently, a projection neural network has been shown to be a promising computational model for solving variational inequality problems with box constraints. This letter presents an extended projection neural network for solving monotone variational inequality problems with linear and nonlinear constraints. In particular, the proposed neural network can include the projection neural network as a special case. Compared with the modified projection-type methods for solving constrained monotone variational inequality problems, the proposed neural network has a lower complexity and is suitable for parallel implementation. Furthermore, the proposed neural network is theoretically proven to be exponentially convergent to an exact solution without a Lipschitz condition. Illustrative examples show that the extended projection neural network can be used to solve constrained monotone variational inequality problems.
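To make the model concrete: the box-constrained projection neural network this abstract builds on follows the dynamics dx/dt = P_Ω(x − αF(x)) − x, where P_Ω is the projection onto the feasible box. The sketch below integrates these dynamics with forward Euler for a small monotone affine variational inequality; it illustrates the basic projection network only, not the extended network proposed in the paper, and the problem data are invented for illustration.

```python
# Basic box-constrained projection neural network dynamics
#   dx/dt = P_box(x - alpha * F(x)) - x,
# integrated with forward Euler for a small monotone affine VI, F(x) = M x + q.
# This is the classical model the abstract refers to, not the paper's extended
# network; M, q, the box, and the step sizes are invented for illustration.
import numpy as np

M = np.array([[3.0, 1.0],
              [1.0, 2.0]])                        # positive definite => monotone F
q = np.array([-4.0, -3.0])
lo, hi = np.array([0.0, 0.0]), np.array([1.5, 1.5])

F = lambda x: M @ x + q
project = lambda x: np.clip(x, lo, hi)            # projection onto the box

x, alpha, dt = np.array([1.5, 0.0]), 0.2, 0.05
for _ in range(2000):
    x = x + dt * (project(x - alpha * F(x)) - x)  # Euler step of the network

print(x)                                          # equilibrium solves the box-constrained VI
print(project(x - alpha * F(x)) - x)              # residual ~ 0 at equilibrium
```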
10. Li, Peiluan, Yuejing Lu, Changjin Xu, and Jing Ren. "Bifurcation Phenomenon and Control Technique in Fractional BAM Neural Network Models Concerning Delays." Fractal and Fractional 7, no. 1 (2022): 7. http://dx.doi.org/10.3390/fractalfract7010007.

Abstract:
In this study, we formulate a new kind of fractional BAM neural network model with five neurons and time delays. First, we explore the existence and uniqueness of the solution of the formulated fractional delayed BAM neural network model via the Lipschitz condition. Second, we study the boundedness of the solution using a suitable function. Third, we establish a novel sufficient criterion for the stability and the onset of Hopf bifurcation of the model by virtue of the stability criterion and bifurcation principle of fractional delayed dynamical systems. Fourth, a delayed feedback controller is applied to control the time of occurrence of the bifurcation and the stability domain of the model. Lastly, software simulations are provided to verify the key outcomes. The theoretical results obtained through this exploration can play a vital role in controlling and designing networks.
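For reference, fractional-order neural network models of this kind are usually written with a Caputo-type derivative, and the existence and uniqueness argument mentioned in the abstract rests on a global Lipschitz condition on the activation functions. The block below recalls both standard definitions; that the paper uses specifically the Caputo derivative is our assumption, as the abstract does not name the operator.

```latex
% Standard definitions recalled for context; the choice of the Caputo
% derivative is an assumption, since the abstract does not name the operator.
% Caputo fractional derivative of order 0 < \alpha < 1:
\[
  {}^{C}\!D^{\alpha} x(t)
  \;=\; \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{x'(s)}{(t-s)^{\alpha}}\, ds ,
  \qquad 0 < \alpha < 1 .
\]
% Global Lipschitz condition on an activation function f, as used for the
% existence and uniqueness of solutions:
\[
  |f(u) - f(v)| \;\le\; L\, |u - v| \qquad \text{for all } u, v \in \mathbb{R}.
\]
```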
