Academic literature on the topic "Constraint networks"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles

Browse the topical lists of articles, books, theses, conference proceedings, and other academic sources on the topic "Constraint networks".

Next to every source in the reference list there is an "Add to bibliography" button. Click this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication in PDF format and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Constraint networks"

1

Brosowsky, Mathis, Florian Keck, Olaf Dünkel, and Marius Zöllner. "Sample-Specific Output Constraints for Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 8 (2021): 6812–21. http://dx.doi.org/10.1609/aaai.v35i8.16841.

Abstract
It is common practice to constrain the output space of a neural network with the final layer to a problem-specific value range. However, for many tasks it is desired to restrict the output space for each input independently to a different subdomain with a non-trivial geometry, e.g. in safety-critical applications, to exclude hazardous outputs sample-wise. We propose ConstraintNet—a scalable neural network architecture which constrains the output space in each forward pass independently. Contrary to prior approaches, which perform a projection in the final layer, ConstraintNet applies an input-dependent parametrization of the constrained output space. Thereby, the complete interior of the constrained region is covered and computational costs are reduced significantly. For constraints in form of convex polytopes, we leverage the vertex representation to specify the parametrization. The second modification consists of adding an auxiliary input in form of a tensor description of the constraint to enable the handling of multiple constraints for the same sample. Finally, ConstraintNet is end-to-end trainable with almost no overhead in the forward and backward pass. We demonstrate ConstraintNet on two regression tasks: First, we modify a CNN and construct several constraints for facial landmark detection tasks. Second, we demonstrate the application to a follow object controller for vehicles and accomplish safe reinforcement learning in this case. In both experiments, ConstraintNet improves performance and we conclude that our approach is promising for applying neural networks in safety-critical environments.
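
The vertex-representation idea described in this abstract can be illustrated concretely: rather than predicting an output directly, the network predicts convex-combination weights over the vertices of the constraint polytope, so every output lies inside the polytope by construction. The snippet below is our own minimal sketch in PyTorch, with assumed tensor shapes and names; it is not the authors' implementation.

    # Minimal sketch (not the authors' code): constrain the output to a convex polytope
    # by predicting convex-combination weights over its vertices, supplied per sample.
    import torch
    import torch.nn as nn

    class PolytopeConstrainedHead(nn.Module):
        def __init__(self, in_features, num_vertices):
            super().__init__()
            self.logits = nn.Linear(in_features, num_vertices)   # one score per vertex

        def forward(self, features, vertices):
            # features: (batch, in_features); vertices: (batch, num_vertices, out_dim)
            w = torch.softmax(self.logits(features), dim=-1)      # convex weights, sum to 1
            return torch.einsum("bv,bvd->bd", w, vertices)        # point inside the polytope

    # Usage: y = PolytopeConstrainedHead(128, 4)(backbone_features, per_sample_vertices)
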
2

Rong, Zihao, Shaofan Wang, Dehui Kong, and Baocai Yin. "Improving object detection quality with structural constraints." PLOS ONE 17, no. 5 (2022): e0267863. http://dx.doi.org/10.1371/journal.pone.0267863.

Abstract
Recent research has revealed that object detection networks trained with the simple “classification loss + localization loss” objective are not effectively optimized in many cases, while imposing additional constraints on network features can effectively improve object detection quality. Specifically, some works have used constraints on training-sample relations to successfully learn discriminative network features. Based on these observations, we propose a structural constraint for improving object detection quality. The structural constraint supervises feature learning in the classification and localization network branches with a Fisher Loss and an Equi-proportion Loss respectively, by requiring the feature similarities of training sample pairs to be consistent with the corresponding ground-truth label similarities. The structural constraint can be applied to any object detection network architecture with the assistance of our Proxy Feature design. Our experimental results showed that the structural constraint mechanism is able to optimize the distribution of object class instances in the network feature space, and consequently the detection results. Evaluations on the MSCOCO2017 and KITTI datasets showed that our structural constraint mechanism enables baseline networks to outperform modern counterpart detectors in terms of object detection quality.
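
The requirement stated in this abstract, that feature similarities of training sample pairs stay consistent with their ground-truth label similarities, can be written as a simple pairwise consistency loss. The sketch below is a generic illustration of that requirement only; it is not the paper's Fisher Loss or Equi-proportion Loss, whose exact forms are not reproduced in the abstract.

    # Generic pairwise-consistency sketch (not the paper's Fisher / Equi-proportion losses):
    # penalize disagreement between feature similarity and ground-truth label similarity.
    import torch
    import torch.nn.functional as F

    def pairwise_consistency_loss(features, labels):
        # features: (batch, dim); labels: (batch,) integer class ids
        f = F.normalize(features, dim=-1)
        feat_sim = f @ f.t()                                        # cosine similarity in [-1, 1]
        label_sim = (labels[:, None] == labels[None, :]).float()    # 1 for same class, else 0
        return F.mse_loss(feat_sim, label_sim)
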
3

Zhang, Y., and R. H. C. Yap. "Set Intersection and Consistency in Constraint Networks." Journal of Artificial Intelligence Research 27 (December 13, 2006): 441–64. http://dx.doi.org/10.1613/jair.2058.

Abstract
In this paper, we show that there is a close relation between consistency in a constraint network and set intersection. A proof schema is provided as a generic way to obtain consistency properties from properties on set intersection. This approach not only simplifies the understanding of and unifies many existing consistency results, but also directs the study of consistency to that of set intersection properties in many situations, as demonstrated by the results on the convexity and tightness of constraints in this paper. Specifically, we identify a new class of tree convex constraints where local consistency ensures global consistency. This generalizes row convex constraints. Various consistency results are also obtained on constraint networks where only some constraints, in contrast to all constraints in existing work, are tight.
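
As background for the notion of local consistency used in this abstract, the sketch below shows a standard AC-3-style arc-consistency filter for a binary constraint network; it only illustrates what enforcing local consistency means operationally and is not specific to the tree-convexity or tightness results of the paper.

    # Standard AC-3-style arc consistency for a binary constraint network (background sketch).
    # The constraints dict should contain both directions of every binary relation.
    from collections import deque

    def ac3(domains, constraints):
        # domains: {var: set(values)}; constraints: {(x, y): predicate(vx, vy)}
        queue = deque(constraints.keys())
        while queue:
            x, y = queue.popleft()
            pred = constraints[(x, y)]
            unsupported = {vx for vx in domains[x]
                           if not any(pred(vx, vy) for vy in domains[y])}
            if unsupported:
                domains[x] -= unsupported             # prune values with no support in y
                if not domains[x]:
                    return False                      # domain wipe-out: network inconsistent
                queue.extend(arc for arc in constraints if arc[1] == x)  # recheck arcs into x
        return True
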
4

Kharroubi, Idris, Thomas Lim, and Xavier Warin. "Discretization and machine learning approximation of BSDEs with a constraint on the Gains-process." Monte Carlo Methods and Applications 27, no. 1 (2021): 27–55. http://dx.doi.org/10.1515/mcma-2020-2080.

Abstract
We study the approximation of backward stochastic differential equations (BSDEs for short) with a constraint on the gains process. We first discretize the constraint by applying a so-called facelift operator at times of a grid. We show that this discretely constrained BSDE converges to the continuously constrained one as the mesh grid converges to zero. We then focus on the approximation of the discretely constrained BSDE. For that we adopt a machine learning approach. We show that the facelift can be approximated by an optimization problem over a class of neural networks under constraints on the neural network and its derivative. We then derive an algorithm converging to the discretely constrained BSDE as the number of neurons goes to infinity. We conclude with numerical experiments.
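
For orientation, the objects treated in this paper can be written schematically: a BSDE whose gains process Z is confined to a set C, together with a facelift-type operator built from the support function of C. The display below uses our own generic notation and the textbook form of these objects (ignoring, for instance, any role of the diffusion coefficient); it is not taken verbatim from the paper.

    % Schematic only (generic notation, not the paper's): Z-constrained BSDE and facelift.
    \[
      Y_t = g(X_T) + \int_t^T f(s, X_s, Y_s, Z_s)\, ds - \int_t^T Z_s\, dW_s + K_T - K_t,
      \qquad Z_s \in C, \quad K \text{ nondecreasing},
    \]
    \[
      F[g](x) = \sup_{u} \bigl( g(x+u) - \delta_C(u) \bigr),
      \qquad \delta_C(u) = \sup_{z \in C} \langle z, u \rangle .
    \]
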
5

Dechter, Rina, Itay Meiri, and Judea Pearl. "Temporal constraint networks." Artificial Intelligence 49, no. 1-3 (1991): 61–95. http://dx.doi.org/10.1016/0004-3702(91)90006-6.

6

Msaaf, Mohammed, and Fouad Belmajdoub. "Diagnosis of Discrete Event Systems under Temporal Constraints Using Neural Network." International Journal of Engineering Research in Africa 49 (June 2020): 198–205. http://dx.doi.org/10.4028/www.scientific.net/jera.49.198.

Abstract
The correct functioning of a discrete event system (DES) depends on how well its temporal constraints are respected. This paper presents a new approach, based on a statistical model and a neural network, for verifying temporal constraints in DES. We perform online temporal constraint checking that can detect in real time any abnormal behaviour related to the violation of a temporal constraint. In the first phase, temporal constraints are constructed from the statistical model; after that, neural networks handle the online temporal constraint checking.
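
The two phases described in this abstract, deriving temporal constraints from observed statistics and then checking event delays against them online, can be illustrated generically. The snippet below uses simple mean plus/minus k standard deviations as a stand-in for the paper's statistical model and omits the neural-network stage entirely; the names and the choice of k are ours.

    # Generic illustration only: statistically derived temporal bounds plus an online check
    # (a stand-in for the paper's statistical model; the neural-network stage is omitted).
    import statistics

    def learn_bounds(observed_delays, k=3.0):
        mu = statistics.mean(observed_delays)
        sigma = statistics.pstdev(observed_delays)
        return (mu - k * sigma, mu + k * sigma)       # admissible [min, max] delay window

    def check_online(delay, bounds):
        lo, hi = bounds
        return lo <= delay <= hi                      # False flags a temporal-constraint violation

    # Example: check_online(3.5, learn_bounds([1.9, 2.1, 2.0, 2.2])) -> False
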
7

Wang, Xiao Fei, Xi Zhang, Yue Bing Chen, Lei Zhang, and Chao Jing Tang. "Spectrum Assignment Algorithm Based on Clonal Selection in Cognitive Radio Networks." Advanced Materials Research 457-458 (January 2012): 931–39. http://dx.doi.org/10.4028/www.scientific.net/amr.457-458.931.

Abstract
An improved-immune-clonal-selection-based spectrum assignment algorithm (IICSA) for cognitive radio networks is proposed, combining graph theory and immune optimization. It uses a constraint satisfaction operation to make the encoded antibody population satisfy the constraints and thus achieves global optimization. A random-constraint satisfaction operator and a fair-constraint satisfaction operator are designed to guarantee efficiency and fairness, respectively. Simulations are performed to compare the performance of the IICSA with the color-sensitive graph coloring algorithm. The results indicate that the proposed algorithm increases network utilization and effectively improves fairness.
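
The constraint satisfaction operation mentioned in this abstract can be illustrated in the usual graph-theoretic spectrum-assignment model, where an assignment must respect per-user channel availability and pairwise interference constraints. The sketch below is a generic repair step over that model, not the paper's random- or fair-constraint satisfaction operators.

    # Generic repair step for the graph-theoretic spectrum-assignment model (illustration only,
    # not the paper's random-/fair-constraint satisfaction operators).
    def repair(assign, avail, interfere):
        # assign[i][k], avail[i][k] in {0, 1}; interfere[i][j][k] == 1 when users i and j
        # may not share channel k.  Returns an assignment satisfying both constraint sets.
        n, m = len(assign), len(assign[0])
        for i in range(n):
            for k in range(m):
                if assign[i][k] and not avail[i][k]:
                    assign[i][k] = 0                  # drop channels the user cannot access
        for k in range(m):
            for i in range(n):
                for j in range(i + 1, n):
                    if assign[i][k] and assign[j][k] and interfere[i][j][k]:
                        assign[j][k] = 0              # resolve interference, keeping user i
        return assign
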
8

Buscema, Massimo. "Constraint Satisfaction Neural Networks." Substance Use & Misuse 33, no. 2 (1998): 389–408. http://dx.doi.org/10.3109/10826089809115873.

9

Suter, D. "Constraint networks in vision." IEEE Transactions on Computers 40, no. 12 (1991): 1359–67. http://dx.doi.org/10.1109/12.106221.

10

Gottlob, Georg. "On minimal constraint networks." Artificial Intelligence 191-192 (November 2012): 42–60. http://dx.doi.org/10.1016/j.artint.2012.07.006.
