Journal articles on the topic 'Neural fields'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Neural fields.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1. Coombes, Stephen. "Neural fields." Scholarpedia 1, no. 6 (2006): 1373. http://dx.doi.org/10.4249/scholarpedia.1373.
2. Aigerman, Noam, Kunal Gupta, Vladimir G. Kim, Siddhartha Chaudhuri, Jun Saito, and Thibault Groueix. "Neural Jacobian fields." ACM Transactions on Graphics 41, no. 4 (July 2022): 1–17. http://dx.doi.org/10.1145/3528223.3530141.

Abstract:
This paper introduces a framework designed to accurately predict piecewise linear mappings of arbitrary meshes via a neural network, enabling training and evaluating over heterogeneous collections of meshes that do not share a triangulation, as well as producing highly detail-preserving maps whose accuracy exceeds current state of the art. The framework is based on reducing the neural aspect to a prediction of a matrix for a single given point, conditioned on a global shape descriptor. The field of matrices is then projected onto the tangent bundle of the given mesh, and used as candidate jacobians for the predicted map. The map is computed by a standard Poisson solve, implemented as a differentiable layer with cached pre-factorization for efficient training. This construction is agnostic to the triangulation of the input, thereby enabling applications on datasets with varying triangulations. At the same time, by operating in the intrinsic gradient domain of each individual mesh, it allows the framework to predict highly-accurate mappings. We validate these properties by conducting experiments over a broad range of scenarios, from semantic ones such as morphing, registration, and deformation transfer, to optimization-based ones, such as emulating elastic deformations and contact correction, as well as being the first work, to our knowledge, to tackle the task of learning to compute UV parameterizations of arbitrary meshes. The results exhibit the high accuracy of the method as well as its versatility, as it is readily applied to the above scenarios without any changes to the framework.
3. Smaragdis, Paris. "Neural acoustic fields." Journal of the Acoustical Society of America 153, no. 3_supplement (March 1, 2023): A175. http://dx.doi.org/10.1121/10.0018569.

Abstract:
We present a way to compactly represent acoustic transfer functions using a small, yet flexible, parametric representation. We show that we can use a neural network as a “soft” table lookup and train it to produce the value of transfer functions at arbitrary points in space and time. Doing so allows us to interpolate and produce unseen data, and to represent acoustic environments using a remarkably compact representation. Due to this representation being differentiable, this opens up multiple opportunities to employ such models within more sophisticated audio processing systems.
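To make the "soft table lookup" idea concrete, here is a minimal sketch (the class name, layer sizes, and four-dimensional query layout are illustrative assumptions, not details from the paper): a small coordinate MLP queried at an arbitrary point in space and time returns a transfer-function value.

```python
import torch
import torch.nn as nn

class AcousticField(nn.Module):
    """Hypothetical coordinate MLP acting as a differentiable 'soft table lookup'."""
    def __init__(self, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(4, hidden), nn.ReLU(),   # query: (x, y, z, t)
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),              # value of the transfer function
        )

    def forward(self, query):                  # query: (batch, 4)
        return self.net(query)

field = AcousticField()
values = field(torch.rand(8, 4))  # interpolate at 8 unseen space-time points
```

Because every operation here is differentiable, such a lookup can sit inside a larger audio-processing system and be trained end to end, which is the opportunity the abstract highlights.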
4. Friston, Karl. "Mean-Fields and Neural Masses." PLoS Computational Biology 4, no. 8 (August 29, 2008): e1000081. http://dx.doi.org/10.1371/journal.pcbi.1000081.
5. Chappet De Vangel, Benoît, Cesar Torres-Huitzil, and Bernard Girau. "Randomly Spiking Dynamic Neural Fields." ACM Journal on Emerging Technologies in Computing Systems 11, no. 4 (April 27, 2015): 1–26. http://dx.doi.org/10.1145/2629517.
6. Igel, Christian, Wolfram Erlhagen, and Dirk Jancke. "Optimization of dynamic neural fields." Neurocomputing 36, no. 1-4 (February 2001): 225–33. http://dx.doi.org/10.1016/s0925-2312(00)00328-3.
7. Belhe, Yash, Michaël Gharbi, Matthew Fisher, Iliyan Georgiev, Ravi Ramamoorthi, and Tzu-Mao Li. "Discontinuity-Aware 2D Neural Fields." ACM Transactions on Graphics 42, no. 6 (December 5, 2023): 1–11. http://dx.doi.org/10.1145/3618379.

Abstract:
Neural image representations offer the possibility of high fidelity, compact storage, and resolution-independent accuracy, providing an attractive alternative to traditional pixel- and grid-based representations. However, coordinate neural networks fail to capture discontinuities present in the image and tend to blur across them; we aim to address this challenge. In many cases, such as rendered images, vector graphics, diffusion curves, or solutions to partial differential equations, the locations of the discontinuities are known. We take those locations as input, represented as linear, quadratic, or cubic Bézier curves, and construct a feature field that is discontinuous across these locations and smooth everywhere else. Finally, we use a shallow multi-layer perceptron to decode the features into the signal value. To construct the feature field, we develop a new data structure based on a curved triangular mesh, with features stored on the vertices and on a subset of the edges that are marked as discontinuous. We show that our method can be used to compress a 100,000²-pixel rendered image into a 25 MB file; can be used as a new diffusion-curve solver by combining with Monte-Carlo-based methods or directly supervised by the diffusion-curve energy; or can be used for compressing 2D physics simulation data.
8. Esselle, K. P., and M. A. Stuchly. "Neural stimulation with magnetic fields: analysis of induced electric fields." IEEE Transactions on Biomedical Engineering 39, no. 7 (July 1992): 693–700. http://dx.doi.org/10.1109/10.142644.
9. Bressloff, Paul C., and Matthew A. Webber. "Front Propagation in Stochastic Neural Fields." SIAM Journal on Applied Dynamical Systems 11, no. 2 (January 2012): 708–40. http://dx.doi.org/10.1137/110851031.
10. Kilpatrick, Zachary P., and Grégory Faye. "Pulse Bifurcations in Stochastic Neural Fields." SIAM Journal on Applied Dynamical Systems 13, no. 2 (January 2014): 830–60. http://dx.doi.org/10.1137/140951369.
11. Laing, Carlo R. "Exact Neural Fields Incorporating Gap Junctions." SIAM Journal on Applied Dynamical Systems 14, no. 4 (January 2015): 1899–929. http://dx.doi.org/10.1137/15m1011287.
12. Bressloff, Paul C. "Spatiotemporal dynamics of continuum neural fields." Journal of Physics A: Mathematical and Theoretical 45, no. 3 (December 14, 2011): 033001. http://dx.doi.org/10.1088/1751-8113/45/3/033001.
13. Kilpatrick, Zachary P., and Bard Ermentrout. "Wandering Bumps in Stochastic Neural Fields." SIAM Journal on Applied Dynamical Systems 12, no. 1 (January 2013): 61–94. http://dx.doi.org/10.1137/120877106.
14. Buneo, Christopher A. "Analyzing neural responses with vector fields." Journal of Neuroscience Methods 197, no. 1 (April 2011): 109–17. http://dx.doi.org/10.1016/j.jneumeth.2011.02.008.
15. Pedrycz, W., M. G. Chun, and G. Succi. "N4: computing with neural receptive fields." Neurocomputing 55, no. 1-2 (September 2003): 383–401. http://dx.doi.org/10.1016/s0925-2312(02)00630-6.
16. Pinotsis, D. A., R. J. Moran, and K. J. Friston. "Dynamic causal modeling with neural fields." NeuroImage 59, no. 2 (January 2012): 1261–74. http://dx.doi.org/10.1016/j.neuroimage.2011.08.020.
17. Touboul, Jonathan. "Propagation of chaos in neural fields." Annals of Applied Probability 24, no. 3 (June 2014): 1298–328. http://dx.doi.org/10.1214/13-aap950.
18. Ventriglia, Francesco. "Global rhythmic activities in hippocampal neural fields and neural coding." Biosystems 86, no. 1-3 (October 2006): 38–45. http://dx.doi.org/10.1016/j.biosystems.2006.02.015.
19. Zhao, Congyu. "Applications of neural networks in different fields." Applied and Computational Engineering 39, no. 1 (February 21, 2024): 296–301. http://dx.doi.org/10.54254/2755-2721/39/20230618.

Abstract:
In the realm of modern technology, the rapid advancements in neural networks have ignited transformative shifts across various domains. This essay explores the multifaceted impacts of neural networks on technology, shedding light on their application in diverse sectors. The discussion encompasses the foundational principles of neural networks, including supervised and unsupervised learning, as well as their intersection with cutting-edge techniques. The essay elucidates the role of neural networks in fostering artificial intelligence, enabling autonomous systems, enhancing healthcare diagnostics, and revolutionizing natural language processing. Additionally, it delves into the ethical considerations and challenges presented by these technologies. This paper encapsulates the exploration of neural networks' rise from theoretical constructs to instrumental agents of change in the technological landscape, inviting readers to delve into the intricate interplay between innovation and its implications.
20. Doubrovinski, Konstantin, and J. Michael Herrmann. "Stability of Localized Patterns in Neural Fields." Neural Computation 21, no. 4 (April 2009): 1125–44. http://dx.doi.org/10.1162/neco.2008.11-06-392.

Abstract:
We investigate two-dimensional neural fields as a model of the dynamics of macroscopic activations in a cortex-like neural system. While the one-dimensional case was treated comprehensively by Amari 30 years ago, two-dimensional neural fields are much less understood. We derive conditions for the stability for the main classes of localized solutions of the neural field equation and study their behavior beyond parameter-controlled destabilization. We show that a slight modification of the original model yields an equation whose stationary states are guaranteed to satisfy the original problem and numerically demonstrate that it admits localized noncircular solutions. Typically, however, only periodic spatial tessellations emerge on destabilization of rotationally invariant solutions.
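For orientation, the neural field equation whose localized solutions are analyzed here is, in its standard Amari form for a two-dimensional field,

```latex
\tau \frac{\partial u(\mathbf{x},t)}{\partial t}
  = -u(\mathbf{x},t)
  + \int_{\mathbb{R}^{2}} w(\mathbf{x}-\mathbf{x}')\, f\!\left(u(\mathbf{x}',t)\right) d\mathbf{x}'
  + h,
```

where u(x, t) is the macroscopic activation at cortical position x, w is the lateral connectivity kernel, f is the firing-rate nonlinearity, and h is the resting level. The localized "bump" patterns whose stability the paper studies are stationary states of this dynamics.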
21. Pai, Neng-Sheng, Her-Terng Yau, Tzu-Hsiang Hung, and Chin-Pao Hung. "Application of CMAC Neural Network to Solar Energy Heliostat Field Fault Diagnosis." International Journal of Photoenergy 2013 (2013): 1–8. http://dx.doi.org/10.1155/2013/938162.

Abstract:
Solar energy heliostat fields comprise numerous sun-tracking platforms, so fault detection is a highly challenging problem. Accordingly, the present study proposes a cerebellar model arithmetic computer (CMAC) neural network for automatically diagnosing faults within the heliostat field according to the rotational speed, vibration, and temperature characteristics of the individual heliostat transmission systems. Compared with radial basis function (RBF) and back-propagation (BP) neural networks for heliostat field fault diagnosis, the experimental results show that the proposed network has a low training time, good robustness, and reliable diagnostic performance. It therefore provides an ideal solution for fault diagnosis in modern, large-scale heliostat fields.
22. Xie, Yiheng, Towaki Takikawa, Shunsuke Saito, Or Litany, Shiqin Yan, Numair Khan, Federico Tombari, James Tompkin, Vincent Sitzmann, and Srinath Sridhar. "Neural Fields in Visual Computing and Beyond." Computer Graphics Forum 41, no. 2 (May 2022): 641–76. http://dx.doi.org/10.1111/cgf.14505.
23. Karakonstantis, Xenofon, and Efren Fernandez-Grande. "Invertible neural networks for reconstructing acoustic fields." Journal of the Acoustical Society of America 151, no. 4 (April 2022): A231. http://dx.doi.org/10.1121/10.0011156.

Abstract:
Sound field reconstruction from finite measurement arrays provides a means to interpolate and extrapolate acoustic quantities that describe the field. By assuming a linear projection on a basis that follows a principled source propagation, one can recover accurate estimates of the aforementioned sound fields. However, the recovery of the basis coefficients relies on explicit models of which measurement noise and data incompleteness can profoundly affect the uncertainty of the solution. This work aims to estimate the distribution of the underlying pressure conditioned on the observations of the measured pressure in a room. A framework for approximate inference is adapted for sound field reconstruction by applying generative flow-based models and invertible neural network architectures. In particular, we use conditional normalising flows for fast conditional posterior estimation and uncertainty quantification. The model's evaluation is carried out using experimental data measured with a spherical array and compared to hierarchical Bayes with Markov Chain Monte-Carlo sampling.
24. Lu, Y., K. Jiang, J. A. Levine, and M. Berger. "Compressive Neural Representations of Volumetric Scalar Fields." Computer Graphics Forum 40, no. 3 (June 2021): 135–46. http://dx.doi.org/10.1111/cgf.14295.
25. Price, C. B. "Traveling Turing patterns in nonlinear neural fields." Physical Review E 55, no. 6 (June 1, 1997): 6698–706. http://dx.doi.org/10.1103/physreve.55.6698.
26. Liu, Caihua, Jie Liu, Zhicheng He, Yujia Zhai, Qinghua Hu, and Yalou Huang. "Convolutional neural random fields for action recognition." Pattern Recognition 59 (November 2016): 213–24. http://dx.doi.org/10.1016/j.patcog.2016.03.019.
27. Buessler, Jean-Luc, Philippe Smagghe, and Jean-Philippe Urban. "Image receptive fields for artificial neural networks." Neurocomputing 144 (November 2014): 258–70. http://dx.doi.org/10.1016/j.neucom.2014.04.045.
28. Pinotsis, D. A., and K. J. Friston. "Neural fields, spectral responses and lateral connections." NeuroImage 55, no. 1 (March 2011): 39–48. http://dx.doi.org/10.1016/j.neuroimage.2010.11.081.
29. Kuehn, Christian, and Martin G. Riedler. "Large Deviations for Nonlocal Stochastic Neural Fields." Journal of Mathematical Neuroscience 4, no. 1 (2014): 1. http://dx.doi.org/10.1186/2190-8567-4-1.
30. Abbassian, A. H., M. Fotouhi, and M. Heidari. "Neural fields with fast learning dynamic kernel." Biological Cybernetics 106, no. 1 (January 2012): 15–26. http://dx.doi.org/10.1007/s00422-012-0475-9.
31. Stuchly, Maria A., and Karu P. Esselle. "Factors affecting neural stimulation with magnetic fields." Bioelectromagnetics 13, S1 (1992): 191–204. http://dx.doi.org/10.1002/bem.2250130718.
32. Yulita, Intan Nurma. "Conditional Neural Fields untuk Pengenalan Fase Gerak" [Conditional Neural Fields for Gesture Phase Recognition]. Jurnal Informatika: Jurnal Pengembangan IT 2, no. 1 (January 19, 2017): 18–22. http://dx.doi.org/10.30591/jpit.v2i1.432.

Abstract:
Pattern recognition is an area of informatics that is still widely studied today, owing to its broad range of applications in everyday life. This paper presents pattern recognition for movement, specifically for gesture phases, and focuses on pattern recognition in sequential data. Such recognition could ignore the sequential structure, but doing so would lower the achievable accuracy. To address this challenge, the use of Conditional Neural Fields (CNF) is proposed. This method combines Conditional Random Fields (CRF) with Artificial Neural Networks (ANN): the ANN component is represented as gates in the middle layer of the CRF, whose purpose is to map the non-linear relationships between the inputs and outputs present in the data. The results show that CNF is more effective and efficient than CRF in terms of accuracy and the number of iterations required. However, using too many gates proved ineffective, because convergence of the recognition model becomes increasingly hard to reach; on the other hand, with only one gate convergence is reached but the resulting accuracy is low. An effort is therefore needed to find the optimal number of gates.
33. Chen, Anpei, Zexiang Xu, Xinyue Wei, Siyu Tang, Hao Su, and Andreas Geiger. "Dictionary Fields: Learning a Neural Basis Decomposition." ACM Transactions on Graphics 42, no. 4 (July 26, 2023): 1–12. http://dx.doi.org/10.1145/3592135.

Abstract:
We present Dictionary Fields, a novel neural representation which decomposes a signal into a product of factors, each represented by a classical or neural field representation, operating on transformed input coordinates. More specifically, we factorize a signal into a coefficient field and a basis field, and exploit periodic coordinate transformations to apply the same basis functions across multiple locations and scales. Our experiments show that Dictionary Fields lead to improvements in approximation quality, compactness, and training time when compared to previous fast reconstruction methods. Experimentally, our representation achieves better image approximation quality on 2D image regression tasks, higher geometric quality when reconstructing 3D signed distance fields, and higher compactness for radiance field reconstruction tasks. Furthermore, Dictionary Fields enable generalization to unseen images/3D scenes by sharing bases across signals during training which greatly benefits use cases such as image regression from partial observations and few-shot radiance field reconstruction.
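As a rough sketch of the factorization described above (the names, two-dimensional input, and the simple modulo wrap are illustrative assumptions; the paper's actual fields, resolutions, and transforms differ), the signal is a sum of coefficient-field times basis-field products, where the basis is evaluated on periodically wrapped coordinates so the same basis functions are reused across locations and scales:

```python
import torch
import torch.nn as nn

class FactorizedField(nn.Module):
    """Sketch: f(x) = sum_k coeff_k(x) * basis_k(wrap_k(x))."""
    def __init__(self, n_factors=4):
        super().__init__()
        self.freqs = [2.0 ** k for k in range(n_factors)]  # one scale per factor
        self.coeff = nn.ModuleList(nn.Linear(2, 1) for _ in self.freqs)
        self.basis = nn.ModuleList(nn.Linear(2, 1) for _ in self.freqs)

    def forward(self, x):                        # x: (batch, 2), coords in [0, 1)
        out = torch.zeros(x.shape[0], 1)
        for c, b, f in zip(self.coeff, self.basis, self.freqs):
            out = out + c(x) * b((x * f) % 1.0)  # periodic coordinate transform
        return out

pred = FactorizedField()(torch.rand(16, 2))
```

Sharing the basis across signals during training is what enables the few-shot generalization mentioned at the end of the abstract.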
34. Park, Keunhong, Philipp Henzler, Ben Mildenhall, Jonathan T. Barron, and Ricardo Martin-Brualla. "CamP: Camera Preconditioning for Neural Radiance Fields." ACM Transactions on Graphics 42, no. 6 (December 5, 2023): 1–11. http://dx.doi.org/10.1145/3618321.

Abstract:
Neural Radiance Fields (NeRF) can be optimized to obtain high-fidelity 3D scene reconstructions of objects and large-scale scenes. However, NeRFs require accurate camera parameters as input --- inaccurate camera parameters result in blurry renderings. Extrinsic and intrinsic camera parameters are usually estimated using Structure-from-Motion (SfM) methods as a pre-processing step to NeRF, but these techniques rarely yield perfect estimates. Thus, prior works have proposed jointly optimizing camera parameters alongside a NeRF, but these methods are prone to local minima in challenging settings. In this work, we analyze how different camera parameterizations affect this joint optimization problem, and observe that standard parameterizations exhibit large differences in magnitude with respect to small perturbations, which can lead to an ill-conditioned optimization problem. We propose using a proxy problem to compute a whitening transform that eliminates the correlation between camera parameters and normalizes their effects, and we propose to use this transform as a preconditioner for the camera parameters during joint optimization. Our preconditioned camera optimization significantly improves reconstruction quality on scenes from the Mip-NeRF 360 dataset: we reduce error rates (RMSE) by 67% compared to state-of-the-art NeRF approaches that do not optimize for cameras like Zip-NeRF, and by 29% relative to state-of-the-art joint optimization approaches using the camera parameterization of SCNeRF. Our approach is easy to implement, does not significantly increase runtime, can be applied to a wide variety of camera parameterizations, and can straightforwardly be incorporated into other NeRF-like models.
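The preconditioning step itself is compact. In this sketch, whitening_preconditioner is a hypothetical helper, and jacobian stands for the derivative of the proxy problem's outputs with respect to the raw camera parameters; the paper's actual proxy and solver details differ:

```python
import numpy as np
from scipy.linalg import sqrtm

def whitening_preconditioner(jacobian, damping=1e-8):
    """Whitening transform P: optimize z, map back via p = p0 + P @ z."""
    # Correlation structure of the proxy outputs w.r.t. the camera parameters.
    sigma = jacobian.T @ jacobian + damping * np.eye(jacobian.shape[1])
    # Sigma^(-1/2) normalizes each parameter's effect and decorrelates them.
    return np.real(sqrtm(np.linalg.inv(sigma)))
```

Unit steps in the whitened parameters z then perturb the proxy outputs by comparable, uncorrelated amounts, which is precisely the ill-conditioning fix the abstract describes.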
35. Lian, Haojie, Xinhao Li, Leilei Chen, Xin Wen, Mengxi Zhang, Jieyuan Zhang, and Yilin Qu. "Uncertainty Quantification of Neural Reflectance Fields for Underwater Scenes." Journal of Marine Science and Engineering 12, no. 2 (February 18, 2024): 349. http://dx.doi.org/10.3390/jmse12020349.

Abstract:
Neural radiance fields and neural reflectance fields are novel deep learning methods for generating novel views of 3D scenes from 2D images. To extend neural scene representation techniques to complex underwater environments, Beyond Neural Reflectance Fields Underwater (BNU) was proposed, which considers the relighting conditions of onboard light sources by using neural reflectance fields, and approximates the attenuation and backscatter effects of water with an additional constant. Because the quality of the neural representation of underwater scenes is critical to downstream tasks such as marine surveying and mapping, the model's reliability should be considered and evaluated. However, current neural reflectance models lack the ability to quantify the uncertainty of underwater scenes that are not directly observed during training, which hinders their widespread use in underwater unmanned autonomous navigation. To address this issue, we introduce an ensemble strategy to BNU that quantifies cognitive uncertainty in color space and unobserved regions with the expectation and variance of RGB values and termination probabilities along the ray. We also employ a regularization method to smooth the density of the underwater neural reflectance model. The effectiveness of the present method is demonstrated in numerical experiments.
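The ensemble computation itself reduces to a few lines once the members are trained. A minimal sketch, assuming each member is a callable that maps a batch of rays to RGB values (the interface is hypothetical; the paper tracks termination probabilities the same way):

```python
import torch

def ensemble_rgb_uncertainty(members, rays):
    # Render the same rays with each independently trained reflectance field.
    preds = torch.stack([m(rays) for m in members])  # (K, n_rays, 3)
    # Ensemble mean = expected color; variance = epistemic uncertainty.
    return preds.mean(dim=0), preds.var(dim=0)
```

Regions the training images never observed disagree across members, so their variance is high, flagging them as unreliable for downstream navigation.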
36. Yulita, Intan Nurma, Mohamad Ivan Fanany, and Aniati Murni Arymurthy. "Fuzzy Latent-Dynamic Conditional Neural Fields for Gesture Recognition in Video." International Journal on Information and Communication Technology (IJoICT) 2, no. 2 (July 25, 2017): 1. http://dx.doi.org/10.21108/ijoict.2016.22.124.

Abstract:
The explosion of data on the internet has led to the era of big data, which requires data processing in order to extract useful information. One of the challenges is gesture recognition in video processing. This study therefore proposes Latent-Dynamic Conditional Neural Fields and compares them with the other members of the Conditional Random Fields family. To improve accuracy, these methods are combined with fuzzy clustering. The results show that Latent-Dynamic Conditional Neural Fields perform below Conditional Neural Fields but above Conditional Random Fields and Latent-Dynamic Conditional Random Fields, and that the combination of Latent-Dynamic Conditional Neural Fields with Fuzzy C-Means Clustering performs best. The evaluation is carried out on a temporal dataset of gesture phase segmentation.
37. Wu, Benjamin, Chao Liu, Benjamin Eckart, and Jan Kautz. "Neural Interferometry: Image Reconstruction from Astronomical Interferometers Using Transformer-Conditioned Neural Fields." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 3 (June 28, 2022): 2685–93. http://dx.doi.org/10.1609/aaai.v36i3.20171.

Abstract:
Astronomical interferometry enables a collection of telescopes to achieve angular resolutions comparable to that of a single, much larger telescope. This is achieved by combining simultaneous observations from pairs of telescopes such that the signal is mathematically equivalent to sampling the Fourier domain of the object. However, reconstructing images from such sparse sampling is a challenging and ill-posed problem, with current methods requiring precise tuning of parameters and manual, iterative cleaning by experts. We present a novel deep learning approach in which the representation in the Fourier domain of an astronomical source is learned implicitly using a neural field representation. Data-driven priors can be added through a transformer encoder. Results on synthetically observed galaxies show that transformer-conditioned neural fields can successfully reconstruct astronomical observations even when the number of visibilities is very sparse.
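The measurement model the abstract leans on (each telescope pair samples one point of the sky image's Fourier transform) is easy to simulate, and the toy sketch below shows why reconstruction is ill-posed when visibilities are sparse (all sizes and counts are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
sky = rng.random((64, 64))       # toy stand-in for the source image
vis_plane = np.fft.fft2(sky)     # full Fourier ("visibility") plane

# A sparse interferometer observes only a few (u, v) points of that plane.
n_baselines = 200
u = rng.integers(0, 64, size=n_baselines)
v = rng.integers(0, 64, size=n_baselines)
visibilities = vis_plane[u, v]   # 200 samples must constrain 4096 unknowns
```

The neural field acts as a continuous, implicit model of vis_plane, and the transformer encoder injects the data-driven priors that fill in the unsampled region.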
38. Hamdan, Baida Abdulredha. "Neural Network Principles and its Application." Webology 19, no. 1 (January 20, 2022): 3955–70. http://dx.doi.org/10.14704/web/v19i1/web19261.

Abstract:
Neural networks, also known as artificial neural networks, are a computing technique formed and designed to simulate the human brain and serve as a problem-solving method. Artificial neural networks gain their abilities through training or learning; each method has certain inputs and outputs (also called results). Learning works by forming probability-weighted associations between inputs and results, which are stored within the network's data structure. Any training process depends on identifying the difference between the processed output, which is usually a prediction, and the real target output, which constitutes an error; a series of adjustments is then made to achieve a proper learning result. This process is called supervised learning. Artificial neural networks have found and proven themselves in many applications across a variety of fields due to their capacity to reproduce and simulate nonlinear phenomena: system identification and control (process control, vehicle control, quantum chemistry, trajectory prediction, natural resource management, etc.), in addition to face recognition, where they have proved very effective. Neural networks have proved to be a very promising technique in many fields due to their accuracy and problem-solving properties.
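The error-driven adjustment this abstract describes is the basic supervised-learning loop. As a minimal single-neuron illustration (the delta rule, shown here as a generic example rather than anything specific to the article):

```python
import numpy as np

def delta_rule_step(weights, x, target, lr=0.1):
    prediction = weights @ x         # processed output of the neuron
    error = target - prediction      # difference from the real target output
    return weights + lr * error * x  # series of adjustments reducing the error

w = np.zeros(3)
for _ in range(100):                 # repeated training steps
    w = delta_rule_step(w, np.array([1.0, 2.0, 3.0]), target=1.0)
```

Each step nudges the weights so the prediction moves toward the target, which is exactly the "identify the error, then adjust" cycle of supervised learning.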
39. Soares, Alessandra M., Bruno J. T. Fernandes, and Carmelo J. A. Bastos-Filho. "Structured Pyramidal Neural Networks." International Journal of Neural Systems 28, no. 05 (April 19, 2018): 1750021. http://dx.doi.org/10.1142/s0129065717500216.

Abstract:
Pyramidal Neural Networks (PNN) are a successful, recently proposed model inspired by the human visual system and deep learning theory. PNNs are applied to computer vision and are based on the concept of receptive fields. This paper proposes a variation of PNN, named here the Structured Pyramidal Neural Network (SPNN). SPNN has self-adaptive variable receptive fields, while the original PNNs rely on the same field size for all neurons, which limits the model since it is not possible to devote more computing resources to a particular region of the image. Another limitation of the original approach is the need to set values for a sizable number of parameters, which can make PNNs difficult to apply for users without experience. SPNN, by contrast, has fewer parameters, and its structure is determined using a novel method combining Delaunay triangulation and k-means clustering. SPNN achieved better results than PNNs and similar performance to the Convolutional Neural Network (CNN) and Support Vector Machine (SVM), while using less memory and processing time.
40. Coombes, Stephen, and Helmut Schmidt. "Neural fields with sigmoidal firing rates: Approximate solutions." Discrete & Continuous Dynamical Systems - A 28, no. 4 (2010): 1369–79. http://dx.doi.org/10.3934/dcds.2010.28.1369.
41. Fu, Bofeng, and Zheng Wang. "Multi-scene Representation Learning with Neural Radiance Fields." Journal of Physics: Conference Series 1880, no. 1 (April 1, 2021): 012034. http://dx.doi.org/10.1088/1742-6596/1880/1/012034.
42. Burlakov, E. O., T. V. Zhukovskaya, E. S. Zhukovskiy, and N. P. Puchkov. "On Continuous and Discontinuous Models of Neural Fields." Journal of Mathematical Sciences 259, no. 3 (October 28, 2021): 272–82. http://dx.doi.org/10.1007/s10958-021-05616-8.
43. Rougier, N. P., and A. Hutt. "Synchronous and asynchronous evaluation of dynamic neural fields." Journal of Difference Equations and Applications 17, no. 8 (August 2011): 1119–33. http://dx.doi.org/10.1080/10236190903051575.
44. Bressloff, Paul C., and Samuel R. Carroll. "Spatiotemporal Dynamics of Neural Fields on Product Spaces." SIAM Journal on Applied Dynamical Systems 13, no. 4 (January 2014): 1620–53. http://dx.doi.org/10.1137/140976339.
45. Poll, Daniel, and Zachary P. Kilpatrick. "Stochastic Motion of Bumps in Planar Neural Fields." SIAM Journal on Applied Mathematics 75, no. 4 (January 2015): 1553–77. http://dx.doi.org/10.1137/140999505.
46. Burby, J. W., Q. Tang, and R. Maulik. "Fast neural Poincaré maps for toroidal magnetic fields." Plasma Physics and Controlled Fusion 63, no. 2 (December 18, 2020): 024001. http://dx.doi.org/10.1088/1361-6587/abcbaa.
47. Engel, A., H. Englisch, and A. Schütte. "Improved Retrieval in Neural Networks with External Fields." Europhysics Letters (EPL) 8, no. 4 (February 15, 1989): 393–97. http://dx.doi.org/10.1209/0295-5075/8/4/016.
48. Socas-Navarro, Hector. "Measuring solar magnetic fields with artificial neural networks." Neural Networks 16, no. 3-4 (April 2003): 355–63. http://dx.doi.org/10.1016/s0893-6080(03)00024-8.
49. Pearson, D. W. "Approximating vertical vector fields for feedforward neural networks." Applied Mathematics Letters 9, no. 2 (March 1996): 61–64. http://dx.doi.org/10.1016/0893-9659(96)00013-4.
50. Bressloff, Paul C. "From invasion to extinction in heterogeneous neural fields." Journal of Mathematical Neuroscience 2, no. 1 (2012): 6. http://dx.doi.org/10.1186/2190-8567-2-6.