Academic literature on the topic 'Neural fields'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Neural fields.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Neural fields"

1

Coombes, Stephen. "Neural fields." Scholarpedia 1, no. 6 (2006): 1373. http://dx.doi.org/10.4249/scholarpedia.1373.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Aigerman, Noam, Kunal Gupta, Vladimir G. Kim, Siddhartha Chaudhuri, Jun Saito, and Thibault Groueix. "Neural jacobian fields." ACM Transactions on Graphics 41, no. 4 (July 2022): 1–17. http://dx.doi.org/10.1145/3528223.3530141.

Full text
Abstract:
This paper introduces a framework designed to accurately predict piecewise linear mappings of arbitrary meshes via a neural network, enabling training and evaluating over heterogeneous collections of meshes that do not share a triangulation, as well as producing highly detail-preserving maps whose accuracy exceeds current state of the art. The framework is based on reducing the neural aspect to a prediction of a matrix for a single given point, conditioned on a global shape descriptor. The field of matrices is then projected onto the tangent bundle of the given mesh, and used as candidate jacobians for the predicted map. The map is computed by a standard Poisson solve, implemented as a differentiable layer with cached pre-factorization for efficient training. This construction is agnostic to the triangulation of the input, thereby enabling applications on datasets with varying triangulations. At the same time, by operating in the intrinsic gradient domain of each individual mesh, it allows the framework to predict highly-accurate mappings. We validate these properties by conducting experiments over a broad range of scenarios, from semantic ones such as morphing, registration, and deformation transfer, to optimization-based ones, such as emulating elastic deformations and contact correction, as well as being the first work, to our knowledge, to tackle the task of learning to compute UV parameterizations of arbitrary meshes. The results exhibit the high accuracy of the method as well as its versatility, as it is readily applied to the above scenarios without any changes to the framework.
APA, Harvard, Vancouver, ISO, and other styles
3

Smaragdis, Paris. "Neural acoustic fields." Journal of the Acoustical Society of America 153, no. 3_supplement (March 1, 2023): A175. http://dx.doi.org/10.1121/10.0018569.

Full text
Abstract:
We present a way to compactly represent acoustic transfer functions using a small, yet flexible, parametric representation. We show that we can use a neural network as a “soft” table lookup and train it to produce the value of transfer functions at arbitrary points in space and time. Doing so allows us to interpolate and produce unseen data, and to represent acoustic environments using a remarkably compact representation. Due to this representation being differentiable, this opens up multiple opportunities to employ such models within more sophisticated audio processing systems.
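As an illustration of the "soft table lookup" idea described in this abstract, the following sketch fits a small coordinate MLP that maps a space-time query to a transfer-function value. It is a toy under stated assumptions, not the authors' implementation: the class name AcousticFieldMLP, the four-dimensional query layout, the layer sizes, and the random training data are all placeholders.

import torch
import torch.nn as nn

class AcousticFieldMLP(nn.Module):
    """Coordinate network acting as a differentiable 'soft table lookup'."""
    def __init__(self, in_dim=4, hidden=128, depth=3):
        super().__init__()
        layers, d = [], in_dim              # in_dim: assumed (x, y, z, t) query layout
        for _ in range(depth):
            layers += [nn.Linear(d, hidden), nn.ReLU()]
            d = hidden
        layers.append(nn.Linear(d, 1))      # scalar transfer-function value at the query
        self.net = nn.Sequential(*layers)

    def forward(self, query):               # query: (batch, in_dim)
        return self.net(query)

model = AcousticFieldMLP()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
queries = torch.rand(1024, 4)               # placeholder measurement locations/times
targets = torch.rand(1024, 1)               # placeholder measured transfer-function values
for step in range(200):                     # fit the compact representation to the samples
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(queries), targets)
    loss.backward()
    optimizer.step()

Because the fitted function is differentiable in its inputs and parameters, it can be queried at unseen points and embedded in larger audio-processing systems, which is the property the abstract emphasizes.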
APA, Harvard, Vancouver, ISO, and other styles
4

Friston, Karl. "Mean-Fields and Neural Masses." PLoS Computational Biology 4, no. 8 (August 29, 2008): e1000081. http://dx.doi.org/10.1371/journal.pcbi.1000081.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Chappet De Vangel, Benoît, Cesar Torres-Huitzil, and Bernard Girau. "Randomly Spiking Dynamic Neural Fields." ACM Journal on Emerging Technologies in Computing Systems 11, no. 4 (April 27, 2015): 1–26. http://dx.doi.org/10.1145/2629517.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Igel, Christian, Wolfram Erlhagen, and Dirk Jancke. "Optimization of dynamic neural fields." Neurocomputing 36, no. 1-4 (February 2001): 225–33. http://dx.doi.org/10.1016/s0925-2312(00)00328-3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Belhe, Yash, Michaël Gharbi, Matthew Fisher, Iliyan Georgiev, Ravi Ramamoorthi, and Tzu-Mao Li. "Discontinuity-Aware 2D Neural Fields." ACM Transactions on Graphics 42, no. 6 (December 5, 2023): 1–11. http://dx.doi.org/10.1145/3618379.

Full text
Abstract:
Neural image representations offer the possibility of high fidelity, compact storage, and resolution-independent accuracy, providing an attractive alternative to traditional pixel- and grid-based representations. However, coordinate neural networks fail to capture discontinuities present in the image and tend to blur across them; we aim to address this challenge. In many cases, such as rendered images, vector graphics, diffusion curves, or solutions to partial differential equations, the locations of the discontinuities are known. We take those locations as input, represented as linear, quadratic, or cubic Bézier curves, and construct a feature field that is discontinuous across these locations and smooth everywhere else. Finally, we use a shallow multi-layer perceptron to decode the features into the signal value. To construct the feature field, we develop a new data structure based on a curved triangular mesh, with features stored on the vertices and on a subset of the edges that are marked as discontinuous. We show that our method can be used to compress a 100,000²-pixel rendered image into a 25 MB file; can be used as a new diffusion-curve solver by combining with Monte-Carlo-based methods or directly supervised by the diffusion-curve energy; or can be used for compressing 2D physics simulation data.
APA, Harvard, Vancouver, ISO, and other styles
8

Esselle, K. P., and M. A. Stuchly. "Neural stimulation with magnetic fields: analysis of induced electric fields." IEEE Transactions on Biomedical Engineering 39, no. 7 (July 1992): 693–700. http://dx.doi.org/10.1109/10.142644.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Bressloff, Paul C., and Matthew A. Webber. "Front Propagation in Stochastic Neural Fields." SIAM Journal on Applied Dynamical Systems 11, no. 2 (January 2012): 708–40. http://dx.doi.org/10.1137/110851031.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Kilpatrick, Zachary P., and Grégory Faye. "Pulse Bifurcations in Stochastic Neural Fields." SIAM Journal on Applied Dynamical Systems 13, no. 2 (January 2014): 830–60. http://dx.doi.org/10.1137/140951369.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Neural fields"

1

Ueda, Hiroyuki. "Studies on low-field functional MRI to detect tiny neural magnetic fields." Doctoral thesis, Kyoto University, 2021. http://hdl.handle.net/2433/263666.

Full text
Abstract:
Associated degree program: Kyoto University Advanced Graduate Program "先端光・電子デバイス創成学" (Advanced Photonic and Electronic Device Creation)
Kyoto University
New degree system, doctorate by coursework
Doctor of Engineering
Degree No.: Kō 23205
Engineering Doctorate No.: 4849
Kyoto University Graduate School of Engineering, Department of Electrical Engineering
(Chief examiner) Professor 小林 哲生, Professor 松尾 哲司, Program-Specific Professor 中村 武恒
Qualified under Article 4, Paragraph 1 of the Degree Regulations
Doctor of Philosophy (Engineering)
Kyoto University
DFAM
APA, Harvard, Vancouver, ISO, and other styles
2

Webber, Matthew. "Stochastic neural field models of binocular rivalry waves." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:c444a73e-20e3-454d-85ae-bbc8831fdf1f.

Full text
Abstract:
Binocular rivalry is an interesting phenomenon where perception oscillates between different images presented to the two eyes. This thesis is primarily concerned with modelling travelling waves of visual perception during transitions between these perceptual states. In order to model this effect in such a way that we retain as much analytical insight into the mechanisms as possible we employed neural field theory. That is, rather than modelling individual neurons in a neural network we treat the cortical surface as a continuous medium and establish integro-differential equations for the activity of a neural population. Our basic model which has been used by many previous authors both within and outside of neural field theory is to consider a one dimensional network of neurons for each eye. It is assumed that each network responds maximally to a particular feature of the underlying image, such as orientation. Recurrent connections within each network are taken to be excitatory and connections between the networks are taken to be inhibitory. In order for such a topology to exhibit the oscillations found in binocular rivalry there needs to be some form of slow adaptation which weakens the cross-connections under continued firing. By first considering a deterministic version of this model, we will show that, in fact, this slow adaptation also serves as a necessary "symmetry breaking" mechanism. Using this knowledge to make some mild assumptions we are then able to derive an expression for the shape of a travelling wave and its wave speed. We then go on to show that these predictions of our model are consistent not only with numerical simulations but also experimental evidence. It will turn out that it is not acceptable to completely ignore noise as it is a fundamental part of the underlying biology. Since methods for analyzing stochastic neural fields did not exist before our work, we first adapt methods originally intended for reaction-diffusion PDE systems to a stochastic version of a simple neural field equation. By regarding the motion of a stochastic travelling wave as being made up of two distinct components, firstly, the drift-diffusion of its overall position, secondly, fast fluctuations in its shape around some average front shape, we are able to derive a stochastic differential equation for the front position with respect to time. It is found that the front position undergoes a drift-diffusion process with constant coefficients. We then go on to show that our analysis agrees with numerical simulation. The original problem of stochastic binocular rivalry is then re-visited with this new toolkit and we are able to predict that the first passage time of a perceptual wave hitting a fixed barrier should be an inverse Gaussian distribution, a result which could potentially be experimentally tested. We also consider the implications of our stochastic work on different types of neural field equation to those used for modelling binocular rivalry. In particular, for neural fields which support pulled fronts propagating into an unstable state, the stochastic version of such an equation has wave fronts which undergo subdiffusive motion as opposed to the standard diffusion in the binocular rivalry case.
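For readers unfamiliar with the formalism referred to here, a commonly used form of the coupled left/right-eye neural field equations with slow synaptic depression reads as follows. This is a generic textbook version under standard assumptions, not necessarily the exact system analysed in the thesis:

\tau\,\partial_t u_L(x,t) = -u_L(x,t) + \int_{\mathbb{R}} w_e(x-y)\, f\big(u_L(y,t)\big)\,dy - \int_{\mathbb{R}} w_c(x-y)\, q_R(y,t)\, f\big(u_R(y,t)\big)\,dy + I_L(x,t),
\tau_q\,\partial_t q_R(x,t) = 1 - q_R(x,t) - \beta\, q_R(x,t)\, f\big(u_R(x,t)\big),

with the mirror-image pair obtained by exchanging L and R. Here u_L and u_R are the activities of the left- and right-eye populations, w_e is the recurrent excitatory kernel, w_c the cross-inhibitory kernel, f a sigmoidal firing-rate function, I_L and I_R the stimulus inputs, and q_L and q_R the depression variables that weaken the cross-connections under sustained firing, providing the slow adaptation and symmetry breaking discussed above.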
APA, Harvard, Vancouver, ISO, and other styles
3

Davenport, Christopher M. "Neural circuitry of retinal receptive fields in primate." Thesis, Connect to this title online; UW restricted, 2007. http://hdl.handle.net/1773/10652.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Arocena, Miguel. "Control of neural stem cell migration by electric fields." Thesis, University of Aberdeen, 2010. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.540498.

Full text
Abstract:
Neural stem cells showed strong electrotaxis, evidenced by highly directed migration towards the cathode.  Optimal electrotaxis was found to require growth factors and phosphoinositide 3-kinase (PI3-K) signalling, although reduced electrotaxis could be obtained without growth factors at the highest EFs used.  After EF exposure, neural stem cell trajectories became much more linear, and a reduction in the number of cell protrusions oriented towards the anode was observed.  Also, protrusions initially orienting towards the cathode retracted after the polarity of the EF was reversed, suggesting that EFs could inhibit the extension of anodal protrusions.  A simple model of neural stem cell migration was built with only two key parameters, which reproduced accurately neural stem cell migration patterns, and predicted that PI3-K functions in electrotaxis mainly by controlling cell orientation.  Finally, wild-type and Pax6-/- embryonic neural stem cells were exposed simultaneously to EFs and contact guidance cues in conflicting orientations.  Only wild-type neural stem cells showed significant integrative migratory responses, suggesting that Pax6 is important for integration of diverse guidance cues during cell migration. The results obtained in this thesis show that neural stem cells display strong electrotaxis in vitro, which is accompanied by a qualitative change in the pattern of migration.  The results also identify the control of protrusion orientation by EFs as an important element in neural stem cell electrotaxis, contributing insight into the mechanisms of electrotaxis.  Finally, these results warrant further studies to assess the possibility of using EFs in brain repair therapies.
APA, Harvard, Vancouver, ISO, and other styles
5

Ferguson, Archibald Stewart. "Theoretical calculation of magnetic fields generated by neural currents." Case Western Reserve University School of Graduate Studies / OhioLINK, 1991. http://rave.ohiolink.edu/etdc/view?acc_num=case1055524502.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Qi, Yang. "Anomalous neural pattern dynamics: formation mechanisms and functional roles." Thesis, The University of Sydney, 2018. http://hdl.handle.net/2123/18808.

Full text
Abstract:
Spatiotemporal activity patterns with complex dynamics have been widely observed in the cortex, but their formation mechanisms and functional roles remain unclear. In this thesis, we first analyze how interactions of distributed bump activity patterns give rise to anomalous subdiffusive dynamics. Unlike normal diffusion whose mean squared error in representing working memory increases linearly with time, subdiffusion is characterized by a sublinear increase in its error, thereby significantly reducing memory degradation. The computational role of subdiffusive pattern dynamics in working memory is confirmed by our analysis of existing experimental data. To obtain theoretical insights into the mechanism underlying the formation of these activity patterns, we develop a new type of two-dimensional neural field model that incorporates refractoriness as a nonlinear negative feedback. We construct explicit bump solutions and perform a linear stability analysis, which reveals the emergence of stable bump activity as well as a critical transition from the bump state into propagating waves. Using numerical simulation, we show that the neural field exhibits local propagating patterns with rich dynamics including periodic, rotating and chaotic dynamics. We then show that propagating local patterns undergoing Levy flight emerge from a realistic cortical circuit model and that they can account for a wide range of neural response properties during both spontaneous and stimulus-driven activities. The fractional Levy dynamics of the spatiotemporal activity patterns provide a dynamic mechanism for a novel type of probabilistic representation which we refer to as fractional neural sampling (FNS). The Levy process naturally incorporates large discontinuous jumps into the sample path, enabling the neural sampler to effectively ‘tunnel’ through high energy barriers.
APA, Harvard, Vancouver, ISO, and other styles
7

Rohlén, Andreas. "UAV geolocalization in Swedish fields and forests using Deep Learning." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-300390.

Full text
Abstract:
The ability for unmanned autonomous aerial vehicles (UAV) to localize themselves in an environment is fundamental for them to be able to function, even if they do not have access to a global positioning system. Recently, with the success of deep learning in vision based tasks, there have been some proposed methods for absolute geolocalization using vison based deep learning with satellite and UAV images. Most of these are only tested in urban environments, which begs the question: How well do they work in non-urban areas like forests and fields? One drawback of deep learning is that models are often regarded as black boxes, as it is hard to know why the models make the predictions they do, i.e. what information is important and is used for the prediction. To solve this, several neural network interpretation methods have been developed. These methods provide explanations so that we may understand these models better. This thesis investigates the localization accuracy of one geolocalization method in both urban and non-urban environments as well as applies neural network interpretation in order to see if it can explain the potential difference in localization accuracy of the method in these different environments. The results show that the method performs best in urban environments, getting a mean absolute horizontal error of 38.30m and a mean absolute vertical error of 16.77m, while it performed significantly worse in non-urban environments, getting a mean absolute horizontal error of 68.11m and a mean absolute vertical error 22.83m. Further, the results show that if the satellite images and images from the unmanned aerial vehicle are collected during different seasons of the year, the localization accuracy is even worse, resulting in a mean absolute horizontal error of 86.91m and a mean absolute vertical error of 23.05m. The neural network interpretation did not aid in providing an explanation for why the method performs worse in non-urban environments and is not suitable for this kind of problem.
APA, Harvard, Vancouver, ISO, and other styles
8

Curtis, Maurice A. "Neural progenitor cells in the Huntington's Disease human brain." Thesis, University of Auckland, 2004. http://hdl.handle.net/2292/3114.

Full text
Abstract:
The recent demonstration of endogenous progenitor cells in the adult mammalian brain raises the exciting possibility that these undifferentiated cells may be able to generate new neurons for cell replacement in diseases such as Huntington's disease (HD). Previous studies have shown that neural stem cells in the rodent brain subependymal layer (SEL), adjacent to the caudate nucleus, proliferate and differentiate into neurons and glial cells but no previous study has characterised the human SEL or shown neurogenesis in the diseased human brain. In this study, histochemical and immunohistochemical techniques were used to demonstrate the regional anatomy and staining characteristics of the normal and HD brain SEL using light and laser scanning confocal microscopy. The results demonstrated that the normal and HD SEL contained migrating neuroblasts, glial cells and precursor cells but there were more of each cell type present in the HD brain, and that the increase in cell numbers correlated with HD neuropathological grade. The normal and HD SEL was stained with a proliferative marker, proliferating cell nuclear antigen (PCNA), to label dividing cells. The results showed a significant increase in the number of dividing cells in the HD brain that correlated with HD grade and with CAG repeat length. Furthermore, the results showed that neurogenesis had occurred in the SEL as evidenced by co-localisation of PCNA and the neuronal marker βIII-tubulin. Also, gliogenesis had occurred in the SEL as evidenced by the co-localisation of PCNA with the glial marker GFAP. These studies also revealed a 2.6 fold increase in the number of new neurons in the HD SEL. PCNA positive cells were distributed throughout the SEL overlying the caudate nucleus but most notably the ventral and central regions of the SEL adjacent to the caudate nucleus contained the highest number of proliferating cells. I examined the SEL for mature cell markers and demonstrated many of the same cell types that are present in the normal striatum. With the exception of neuropeptide Y (NPY) neurons, there was a reduction in the number of mature neurons in the HD SEL. The NPY neurons were more abundant in the HD SEL suggesting they play a role in progenitor cell proliferation. The results in this thesis provide evidence of increased progenitor cell proliferation and neurogenesis in the diseased adult human brain and indicate the regenerative potential of the human brain. These findings may be of major relevance to the development of therapeutic approaches in the treatment of neurodegenerative diseases.
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Yiming. "Applications of artificial neural networks (ANNs) in several different materials research fields." Thesis, Queen Mary, University of London, 2010. http://qmro.qmul.ac.uk/xmlui/handle/123456789/362.

Full text
Abstract:
In materials science, the traditional methodological framework is the identification of the composition-processing-structure-property causal pathways that link hierarchical structure to properties. However, all the properties of materials can be derived ultimately from structure and bonding, and so the properties of a material are interrelated to varying degrees. The work presented in this thesis employed artificial neural networks (ANNs) to explore the correlations of different material properties, with several examples in different fields. These include 1) verifying and quantifying known correlations between physical parameters and solid solubility of alloy systems, which were first discovered by Hume-Rothery in the 1930s; 2) exploring unknown cross-property correlations without investigating complicated structure-property relationships, which is exemplified by i) predicting structural stability of perovskites from bond-valence based tolerance factors tBV, and predicting formability of perovskites by using A-O and B-O bond distances, and ii) correlating polarizability with other properties, such as first ionization potential, melting point, heat of vaporization and specific heat capacity; and 3) in the process of discovering unanticipated relationships between combinations of properties of materials, ANNs were also found to be useful for highlighting unusual data points in handbooks, tables and databases that deserve to have their veracity inspected. By applying this method, massive errors in handbooks were found, and a systematic, intelligent and potentially automatic method to detect errors in handbooks is thus developed. Through presenting these four distinct examples from three aspects of ANN capability, different ways that ANNs can contribute to progress in materials science have been explored. These approaches are novel and deserve to be pursued as part of the newer methodologies that are beginning to underpin materials research.
APA, Harvard, Vancouver, ISO, and other styles
10

Harris, William H. (William Hunt). "Machine learning transferable physics-based force fields using graph convolutional neural networks." Thesis, Massachusetts Institute of Technology, 2020. https://hdl.handle.net/1721.1/128979.

Full text
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Department of Materials Science and Engineering, 2020
Cataloged from student-submitted PDF of thesis.
Includes bibliographical references (pages 22-24).
Molecular dynamics and Monte Carlo methods allow the properties of a system to be determined from its potential energy surface (PES). In the domain of crystalline materials, the PES is needed for electronic structure calculations, critical for modeling semiconductors, optical, and energy-storage materials. While first principles techniques can be used to obtain the PES to high accuracy, their computational complexity limits applications to small systems and short timescales. In practice, the PES must be approximated using a computationally cheaper functional form. Classical force field (CFF) approaches simply define the PES as a sum over independent energy contributions. Commonly included terms include bonded (pair, angle, dihedral, etc.) and non bonded (van der Waals, Coulomb, etc.) interactions, while more recent CFFs model polarizability, reactivity, and other higher-order interactions.
Simple, physically-justified functional forms are often implemented for each energy type, but this choice - and the choice of which energy terms to include in the first place - is arbitrary and often hand-tuned on a per-system basis, severely limiting PES transferability. This flexibility has complicated the quest for a universal CFF. The simplest usable CFFs are tailored to specific classes of molecules and have few parameters, so that they can be optimally parameterized using a small amount of data; however, they suffer low transferability. Highly-parameterized neural network potentials can yield predictions that are extremely accurate for the entire training set; however, they suffer over-fitting and cannot interpolate.
We develop a tool, called AuTopology, to explore the trade-offs between complexity and generalizability in fitting CFFs; focus on simple, computationally fast functions that enforce physics-based regularization and transferability; use message-passing neural networks to featurize molecular graphs and interpolate CFF parameters across chemical space; and utilize high performance computing resources to improve the efficiency of model training and usage. A universal, fast CFF would open the door to high-throughput virtual materials screening in the pursuit of novel materials with tailored properties.
by William H. Harris.
S.M.
S.M. Massachusetts Institute of Technology, Department of Materials Science and Engineering
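To make the "sum over independent energy contributions" described in this abstract concrete, here is a minimal sketch of a classical force-field energy with harmonic bond, harmonic angle, and Lennard-Jones terms. The functional forms are standard, but the parameters and coordinates are placeholder values rather than anything from AuTopology.

import numpy as np

def bond_energy(r, r0, k):
    """Harmonic bond stretch: k * (r - r0)**2."""
    return k * (r - r0) ** 2

def angle_energy(theta, theta0, k):
    """Harmonic angle bend (theta in radians)."""
    return k * (theta - theta0) ** 2

def lj_energy(r, epsilon, sigma):
    """12-6 Lennard-Jones non-bonded pair term."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

# Placeholder geometry: a bent three-atom fragment plus one non-bonded neighbour.
r_bond1, r_bond2 = 1.02, 0.98           # bond lengths (arbitrary units)
theta = np.deg2rad(108.0)               # bond angle
r_nb = 3.4                              # non-bonded distance

total = (bond_energy(r_bond1, 1.0, 450.0)
         + bond_energy(r_bond2, 1.0, 450.0)
         + angle_energy(theta, np.deg2rad(109.5), 55.0)
         + lj_energy(r_nb, epsilon=0.15, sigma=3.2))
print(f"Total classical force-field energy: {total:.3f} (arbitrary units)")

A graph-based model of the kind described above would replace the hand-assigned constants (k, r0, epsilon, sigma, ...) with values predicted per atom, bond, or angle from the molecular graph.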
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Neural fields"

1

Coombes, Stephen, Peter beim Graben, Roland Potthast, and James Wright, eds. Neural Fields. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Pribram, Karl H., and Sir John C. Eccles, eds. Rethinking neural networks: Quantum fields and biological data. Hillsdale, N.J.: Erlbaum, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Pinter, Robert B., and Bahram Nabet, eds. Nonlinear vision: Determination of neural receptive fields, function, and networks. Boca Raton: CRC Press, 1992.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Kozma, Robert, and Walter J. Freeman. Cognitive Phase Transitions in the Cerebral Cortex - Enhancing the Neuron Doctrine by Modeling Neural Fields. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-24406-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Horowitz, John. The effects of hypergravic fields on neural signalling in the hippocampus. [Washington, DC]: National Aeronautics and Space Administration, 1991.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Booth, John Nicholas. The application of weak complex magnetic fields on the neural correlates of consciousness. Sudbury, Ont: Laurentian University, School of Graduate Studies, 2006.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Pribram, Karl H., and Sir John C. Eccles, eds. Rethinking neural networks: Quantum fields and biological data: proceedings of the First Appalachian Conference on Behavioral Neurodynamics. Hillsdale, N.J.: Erlbaum, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Ames Research Center, ed. Cascading a systolic array and a feedforward neural network for navigation and obstacle avoidance using potential fields. Moffett Field, Calif.: National Aeronautics and Space Administration, Ames Research Center, 1991.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Dougherty, Edward R., and Society of Photo-Optical Instrumentation Engineers, eds. Neural, morphological, and stochastic methods in image and signal processing: 10–11 July 1995, San Diego, California. Bellingham, Wash., USA: SPIE, 1995.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Helias, Moritz, and David Dahmen. Statistical Field Theory for Neural Networks. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-46444-8.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Neural fields"

1

Coombes, Stephen, Peter beim Graben, and Roland Potthast. "Tutorial on Neural Field Theory." In Neural Fields, 1–43. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

beim Graben, Peter, and Serafim Rodrigues. "On the Electrodynamics of Neural Networks." In Neural Fields, 269–96. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

beim Graben, Peter, and Roland Potthast. "Universal Neural Field Computation." In Neural Fields, 299–318. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Lins, Jonas, and Gregor Schöner. "A Neural Approach to Cognition Based on Dynamic Field Theory." In Neural Fields, 319–39. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1_12.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Erlhagen, Wolfram, and Estela Bicho. "A Dynamic Neural Field Approach to Natural and Efficient Human-Robot Collaboration." In Neural Fields, 341–65. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Liley, David T. J. "Neural Field Modelling of the Electroencephalogram: Physiological Insights and Practical Applications." In Neural Fields, 367–92. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1_14.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Steyn-Ross, D. Alistair, Moira L. Steyn-Ross, and Jamie W. Sleigh. "Equilibrium and Nonequilibrium Phase Transitions in a Continuum Model of an Anesthetized Cortex." In Neural Fields, 393–416. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1_15.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Jirsa, Viktor. "Large Scale Brain Networks of Neural Fields." In Neural Fields, 417–32. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Pinotsis, Dimitris A., and Karl J. Friston. "Neural Fields, Masses and Bayesian Modelling." In Neural Fields, 433–55. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1_17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Wright, James J., and Paul D. Bourke. "Neural Field Dynamics and the Evolution of the Cerebral Cortex." In Neural Fields, 457–82. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-642-54593-1_18.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Neural fields"

1

Choi, Hyunsoo, and Chulhee Lee. "Neural Network Deinterlacing Using Multiple Fields and Field-MSEs." In 2007 International Joint Conference on Neural Networks. IEEE, 2007. http://dx.doi.org/10.1109/ijcnn.2007.4371072.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Takikawa, Towaki, Alex Evans, Jonathan Tremblay, Thomas Müller, Morgan McGuire, Alec Jacobson, and Sanja Fidler. "Variable Bitrate Neural Fields." In SIGGRAPH '22: Special Interest Group on Computer Graphics and Interactive Techniques Conference. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3528233.3530727.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Müller, Thomas, Alex Evans, Christoph Schied, Marco Foco, András Bódis-Szomorú, Isaac Deutsch, Michael Shelley, and Alexander Keller. "Instant Neural Radiance Fields." In SIGGRAPH '22: Special Interest Group on Computer Graphics and Interactive Techniques Conference. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3532833.3538678.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ost, Julian, Issam Laradji, Alejandro Newell, Yuval Bahat, and Felix Heide. "Neural Point Light Fields." In 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2022. http://dx.doi.org/10.1109/cvpr52688.2022.01787.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Kim, Youngchan, Wonjoon Jin, Sunghyun Cho, and Seung-Hwan Baek. "Neural Spectro-polarimetric Fields." In SA '23: SIGGRAPH Asia 2023. New York, NY, USA: ACM, 2023. http://dx.doi.org/10.1145/3610548.3618172.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Tompkin, James. "Neural Fields for Scalable Scene Reconstruction." In Design Computation Input/Output 2022. Design Computation, 2022. http://dx.doi.org/10.47330/dcio.2022.axbl8798.

Full text
Abstract:
Neural fields are a new (and old!) approach to solving problems over spacetime via first-order optimization of a neural network. Over the past three years, combining neural fields with classic computer graphics approaches has allowed us to make significant advances in solving computer vision problems like scene reconstruction. I will present recent work that can reconstruct indoor scenes for photorealistic interactive exploration using new scalable hybrid neural field representations. This has applications where any real-world place needs to be digitized, especially for visualization purposes.
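As a rough illustration of what a "hybrid neural field representation" can look like, the snippet below combines a small learnable 2D feature grid, bilinear interpolation, and a tiny MLP decoder, which is the common grid-plus-network pattern in scalable scene reconstruction. It is a toy sketch under assumptions (the class name HybridField2D, the grid resolution, and the random training data are placeholders), not the representation used in the talk.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HybridField2D(nn.Module):
    """Learnable feature grid + shallow MLP decoder, queried at continuous coordinates."""
    def __init__(self, grid_res=64, feat_dim=8, hidden=64, out_dim=3):
        super().__init__()
        self.grid = nn.Parameter(0.01 * torch.randn(1, feat_dim, grid_res, grid_res))
        self.decoder = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, coords):                      # coords: (N, 2) in [-1, 1]
        grid_pts = coords.view(1, -1, 1, 2)         # grid_sample expects (1, H_out, W_out, 2)
        feats = F.grid_sample(self.grid, grid_pts,
                              mode="bilinear", align_corners=True)
        feats = feats.squeeze(-1).squeeze(0).t()    # -> (N, feat_dim)
        return self.decoder(feats)

# First-order optimization of the field against observed samples (placeholders here).
field = HybridField2D()
opt = torch.optim.Adam(field.parameters(), lr=5e-3)
coords = torch.rand(2048, 2) * 2 - 1                # random query points in [-1, 1]^2
observed = torch.rand(2048, 3)                      # placeholder observed values (e.g. colours)
for _ in range(300):
    opt.zero_grad()
    loss = F.mse_loss(field(coords), observed)
    loss.backward()
    opt.step()

Because most of the capacity lives in the grid, the decoder can stay tiny, which is what lets such hybrid representations scale to large scenes.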
APA, Harvard, Vancouver, ISO, and other styles
7

Gu, Jeffrey, Kuan-Chieh Wang, and Serena Yeung. "Generalizable Neural Fields as Partially Observed Neural Processes." In 2023 IEEE/CVF International Conference on Computer Vision (ICCV). IEEE, 2023. http://dx.doi.org/10.1109/iccv51070.2023.00491.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Luo, Haimin, Anpei Chen, Qixuan Zhang, Bai Pang, Minye Wu, Lan Xu, and Jingyi Yu. "Convolutional Neural Opacity Radiance Fields." In 2021 IEEE International Conference on Computational Photography (ICCP). IEEE, 2021. http://dx.doi.org/10.1109/iccp51581.2021.9466273.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Kania, Kacper, Kwang Moo Yi, Marek Kowalski, Tomasz Trzciński, and Andrea Tagliasacchi. "CoNeRF: Controllable Neural Radiance Fields." In 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2022. http://dx.doi.org/10.1109/cvpr52688.2022.01807.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hu, Tao, Shu Liu, Yilun Chen, Tiancheng Shen, and Jiaya Jia. "EfficientNeRF - Efficient Neural Radiance Fields." In 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2022. http://dx.doi.org/10.1109/cvpr52688.2022.01256.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Neural fields"

1

Burby, Joshua William, and Qi Tang. Fast neural Poincare maps for toroidal magnetic fields. Office of Scientific and Technical Information (OSTI), July 2020. http://dx.doi.org/10.2172/1637687.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Orkwis, Paul D., and Terry Daviaux. Advanced Neural Network Modeling of Synthetic Jet Flow Fields. Fort Belvoir, VA: Defense Technical Information Center, March 2006. http://dx.doi.org/10.21236/ada473581.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Gonzalez Pibernat, Gabriel, and Miguel Mascaró Portells. Dynamic structure of single-layer neural networks. Fundación Avanza, May 2023. http://dx.doi.org/10.60096/fundacionavanza/2392022.

Full text
Abstract:
This article examines the practical applications of single hidden layer neural networks in machine learning and artificial intelligence. They have been used in diverse fields, such as finance, medicine, and autonomous vehicles, due to their simplicity…
APA, Harvard, Vancouver, ISO, and other styles
4

Warrick, Arthur W., Gideon Oron, Mary M. Poulton, Rony Wallach, and Alex Furman. Multi-Dimensional Infiltration and Distribution of Water of Different Qualities and Solutes Related Through Artificial Neural Networks. United States Department of Agriculture, January 2009. http://dx.doi.org/10.32747/2009.7695865.bard.

Full text
Abstract:
The project exploits the use of Artificial Neural Networks (ANN) to describe infiltration, water, and solute distribution in the soil during irrigation. It provides a method of simulating water and solute movement in the subsurface which, in principle, is different and has some advantages over the more common approach of numerical modeling of flow and transport equations. The five objectives were (i) Numerically develop a database for the prediction of water and solute distribution for irrigation; (ii) Develop predictive models using ANN; (iii) Develop an experimental (laboratory) database of water distribution with time; within a transparent flow cell by high resolution CCD video camera; (iv) Conduct field studies to provide basic data for developing and testing the ANN; and (v) Investigate the inclusion of water quality [salinity and organic matter (OM)] in an ANN model used for predicting infiltration and subsurface water distribution. A major accomplishment was the successful use of Moment Analysis (MA) to characterize “plumes of water” applied by various types of irrigation (including drip and gravity sources). The general idea is to describe the subsurface water patterns statistically in terms of only a few (often 3) parameters which can then be predicted by the ANN. It was shown that ellipses (in two dimensions) or ellipsoids (in three dimensions) can be depicted about the center of the plume. Any fraction of water added can be related to a ‘‘probability’’ curve relating the size of the ellipse (or ellipsoid) that contains that amount of water. The initial test of an ANN to predict the moments (and hence the water plume) was with numerically generated data for infiltration from surface and subsurface drip line and point sources in three contrasting soils. The underlying dataset consisted of 1,684,500 vectors (5 soils×5 discharge rates×3 initial conditions×1,123 nodes×20 print times) where each vector had eleven elements consisting of initial water content, hydraulic properties of the soil, flow rate, time and space coordinates. The output is an estimate of subsurface water distribution for essentially any soil property, initial condition or flow rate from a drip source. Following the formal development of the ANN, we have prepared a “user-friendly” version in a spreadsheet environment (in “Excel”). The input data are selected from appropriate values and the output is instantaneous resulting in a picture of the resulting water plume. The MA has also proven valuable, on its own merit, in the description of the flow in soil under laboratory conditions for both wettable and repellant soils. This includes non-Darcian flow examples and redistribution and well as infiltration. Field experiments were conducted in different agricultural fields and various water qualities in Israel. The obtained results will be the basis for the further ANN models development. Regions of high repellence were identified primarily under the canopy of various orchard crops, including citrus and persimmons. Also, increasing OM in the applied water lead to greater repellency. Major scientific implications are that the ANN offers an alternative to conventional flow and transport modeling and that MA is a powerful technique for describing the subsurface water distributions for normal (wettable) and repellant soil. Implications of the field measurements point to the special role of OM in affecting wettability, both from the irrigation water and from soil accumulation below canopies. 
Implications for agriculture are that a modified approach for drip system design should be adopted for open area crops and orchards, and taking into account the OM components both in the soil and in the applied waters.
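The moment-analysis (MA) idea summarized in the abstract above can be made concrete with a short sketch: given a gridded 2D water-content field, compute the plume centroid and weighted second moments, then take the eigen-decomposition of the covariance to obtain the axes of a descriptive ellipse. The Gaussian-quantile scaling used below to capture a chosen water fraction, and the synthetic plume itself, are illustrative assumptions, not the project's exact procedure.

import numpy as np

def plume_moments(x, y, w):
    """Centroid and weighted covariance of a 2D water-content distribution w(x, y)."""
    w = w / w.sum()
    cx, cy = (w * x).sum(), (w * y).sum()
    dx, dy = x - cx, y - cy
    cov = np.array([[(w * dx * dx).sum(), (w * dx * dy).sum()],
                    [(w * dx * dy).sum(), (w * dy * dy).sum()]])
    return (cx, cy), cov

def ellipse_axes(cov, fraction=0.7):
    """Semi-axes of an ellipse containing ~`fraction` of the water, assuming a Gaussian-like plume."""
    scale = -2.0 * np.log(1.0 - fraction)   # chi-square quantile with 2 degrees of freedom
    eigvals, eigvecs = np.linalg.eigh(cov)
    return np.sqrt(scale * eigvals), eigvecs  # semi-axis lengths and principal directions

# Placeholder example: a synthetic elliptical plume on a regular grid.
xs, ys = np.meshgrid(np.linspace(0, 100, 101), np.linspace(0, 200, 201))
water = np.exp(-(((xs - 50) / 15) ** 2 + ((ys - 80) / 30) ** 2))
center, cov = plume_moments(xs, ys, water)
axes, directions = ellipse_axes(cov, fraction=0.7)
print("centroid:", center, "semi-axes:", axes)

These few moment parameters (centroid plus covariance) are exactly the kind of low-dimensional plume description that an ANN can then be trained to predict from soil properties, flow rate, and time.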
APA, Harvard, Vancouver, ISO, and other styles
5

Cooper, Leon N., and Christopher L. Scofield. Mean Field Theory of a Neural Network. Fort Belvoir, VA: Defense Technical Information Center, January 1988. http://dx.doi.org/10.21236/ada190801.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Elliott, Daniel S., and David B. Janes. Neutral Atom Lithography With Multi-Frequency Laser Fields. Fort Belvoir, VA: Defense Technical Information Center, June 2006. http://dx.doi.org/10.21236/ada459307.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Jau, Yuan-Yu. Imaging electric field with electrically neutral particles. Office of Scientific and Technical Information (OSTI), July 2021. http://dx.doi.org/10.2172/1821957.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Yaroshchuk, Svitlana O., Nonna N. Shapovalova, Andrii M. Striuk, Olena H. Rybalchenko, Iryna O. Dotsenko, and Svitlana V. Bilashenko. Credit scoring model for microfinance organizations. [n.p.], February 2020. http://dx.doi.org/10.31812/123456789/3683.

Full text
Abstract:
The purpose of the work is the development and application of models for scoring assessment of microfinance institution borrowers. This model makes it possible to increase the efficiency of work in the field of credit. The object of research is lending. The subject of the study is a scoring model for improving the quality of lending using machine learning methods. The objectives of the study are to determine the criteria for choosing a solvent borrower, to develop a model for an early assessment, and to create software based on neural networks to determine the probability of a loan default risk. The research methods used include analysis of the literature on banking scoring; artificial intelligence methods for scoring; modelling of a scoring estimation algorithm using neural networks; an empirical method for determining the optimal parameters of the training model; and object-oriented design and programming. The result of the work is a neural network scoring model with high calculation accuracy and an implemented system for automatic customer lending.
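As a minimal sketch of the kind of neural scoring model described above (illustrative only: the feature set, architecture, and synthetic data are invented placeholders, not the authors' system), one can train a small classifier and read the predicted default probability from it:

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Placeholder borrower features: income, loan amount, age, number of previous loans.
X = rng.normal(size=(500, 4))
y = (X[:, 1] - 0.5 * X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)  # synthetic default flag

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
model.fit(X_scaled, y)

applicant = scaler.transform([[1.2, -0.3, 0.5, 0.0]])     # one new applicant, same feature order
print("estimated default probability:", model.predict_proba(applicant)[0, 1])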
APA, Harvard, Vancouver, ISO, and other styles
9

Wilmont, Martyn, Greg Van Boven, and Tom Jack. GRI-96-0452_1 Stress Corrosion Cracking Under Field Simulated Conditions I. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), November 1997. http://dx.doi.org/10.55274/r0011963.

Full text
Abstract:
Electrochemical measurements have been performed on polished and mill scaled steel samples. The solutions investigated have included carbonate bicarbonate mixtures of varying pH as well as solutions of neutral pH such as NS4. Results indicate that the mechanism of corrosion associated with the carbonate bicarbonate environments involves passive film formation. No such passivation is observed for solutions associated with neutral pH SCC. Electrochemical corrosion rates measured on polished steel specimens exposed to NS4 solutions in the pH range 5 to 6.8 were in the region of 5 x 10e-1 to 1 x 10e-2 mm/s. However, rates obtained on mill scaled surfaces went much lower and in the region of 5 x 10e-10 mm/s. Field determined crack propagation rates are estimated to be in the region of 2 x 10e-8 mm/s. Whilst the laboratory determined corrosion rates are lower than the field propagation rate it should be remembered that the laboratory rates were obtained on unstressed specimens. The application of load would be expected to increase the corrosion rate and may indicate that stress focused dissolution process may be sufficient to explain the propagation of neutral pH stress corrosion cracks. However, as hydrogen evolution is the most likely cathodic reaction involved in the mechanism of neutral pH SCC the role of hydrogen in the crack propagation mechanism may also be important.
APA, Harvard, Vancouver, ISO, and other styles
10

Tayeb, Shahab. Taming the Data in the Internet of Vehicles. Mineta Transportation Institute, January 2022. http://dx.doi.org/10.31979/mti.2022.2014.

Full text
Abstract:
As an emerging field, the Internet of Vehicles (IoV) has a myriad of security vulnerabilities that must be addressed to protect system integrity. To stay ahead of novel attacks, cybersecurity professionals are developing new software and systems using machine learning techniques. Neural network architectures improve such systems, including Intrusion Detection Systems (IDSs), by implementing anomaly detection, which differentiates benign data packets from malicious ones. For an IDS to best predict anomalies, the model is trained on data that is typically pre-processed through normalization and feature selection/reduction. These pre-processing techniques play an important role in training a neural network to optimize its performance. This research studies the impact of applying normalization techniques as a pre-processing step to learning, as used by IDSs. This report proposes a Deep Neural Network (DNN) model with two hidden layers for the IDS architecture and compares two commonly used normalization pre-processing techniques. Our findings are evaluated using accuracy, Area Under Curve (AUC), Receiver Operator Characteristic (ROC), F-1 Score, and loss. The experiments demonstrate that Z-Score normalization outperforms both no normalization and Min-Max normalization.
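The two pre-processing options compared in the report are standard techniques; a small sketch of both follows, using placeholder data and with the usual caveat that the statistics must come from the training split only.

import numpy as np

def z_score(train, test):
    """Standardize to zero mean / unit variance using training-set statistics."""
    mean, std = train.mean(axis=0), train.std(axis=0) + 1e-12
    return (train - mean) / std, (test - mean) / std

def min_max(train, test):
    """Rescale features to [0, 1] using training-set minima and maxima."""
    lo, hi = train.min(axis=0), train.max(axis=0)
    span = np.where(hi - lo == 0, 1.0, hi - lo)
    return (train - lo) / span, (test - lo) / span

rng = np.random.default_rng(42)
train = rng.normal(loc=10.0, scale=3.0, size=(1000, 5))   # placeholder packet features
test = rng.normal(loc=10.0, scale=3.0, size=(200, 5))

train_z, test_z = z_score(train, test)
train_mm, test_mm = min_max(train, test)
print(train_z.mean(axis=0).round(2), train_mm.min(axis=0).round(2))

Either transformed dataset would then be fed to the two-hidden-layer DNN; the report's finding is simply that the Z-Score variant yields the more accurate detector.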
APA, Harvard, Vancouver, ISO, and other styles
