Academic literature on the topic 'Back propagation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Back propagation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Back propagation"

1

Chahar, Vikas. "Analysis of Back Propagation Algorithm." International Journal of Scientific Research 2, no. 8 (June 1, 2012): 305–6. http://dx.doi.org/10.15373/22778179/aug2013/98.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

A, Dr Deepa, and Fathima Thasliya P A. "Back Propagation." International Journal for Research in Applied Science and Engineering Technology 11, no. 4 (April 30, 2023): 334–39. http://dx.doi.org/10.22214/ijraset.2023.50077.

Full text
Abstract:
Back Propagation Algorithm research is now very active in the Artificial Neural Network (ANN) and machine learning communities. It has found a wide range of applications, including image compression, pattern recognition, time series prediction, sequence identification, data filtering, and other intelligent processes carried out by the human brain, with enormous results. In this paper, we give a quick introduction to ANN and BP algorithms, explain how they operate, and highlight some of the ongoing research projects and the difficulties they face.
APA, Harvard, Vancouver, ISO, and other styles
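The abstract above outlines how the BP algorithm trains a multi-layer network by propagating output errors backward through the layers. As a concrete illustration, here is a minimal sketch of back-propagation for a single-hidden-layer sigmoid network on the XOR problem; the architecture, learning rate, and iteration count are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR dataset (4 patterns, 2 inputs, 1 output)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units, sigmoid output
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: push the output error back through each layer
    d_out = (out - y) * out * (1 - out)   # delta at output (MSE loss)
    d_h = (d_out @ W2.T) * h * (1 - h)    # delta at hidden layer
    # Gradient-descent weight updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
loss = float(((out - y) ** 2).mean())
```

A network that has not learned XOR sits near a mean squared error of 0.25 (all outputs at 0.5); successful training drives the loss well below that.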
3

Hui, Hui, Dayou Liu, and Yafei Wang. "Sequential back-propagation." Journal of Computer Science and Technology 9, no. 3 (July 1994): 252–60. http://dx.doi.org/10.1007/bf02939506.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Shing Wang, Chau. "Power Disturbance Recognition Using Back-Propagation Neural Networks." International Journal of Engineering and Technology 4, no. 4 (2012): 430–33. http://dx.doi.org/10.7763/ijet.2012.v4.403.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Bernard, C., and D. Johnston. "Distance-Dependent Modifiable Threshold for Action Potential Back-Propagation in Hippocampal Dendrites." Journal of Neurophysiology 90, no. 3 (September 2003): 1807–16. http://dx.doi.org/10.1152/jn.00286.2003.

Full text
Abstract:
In hippocampal CA1 pyramidal neurons, action potentials generated in the axon back-propagate in a decremental fashion into the dendritic tree where they affect synaptic integration and synaptic plasticity. The amplitude of back-propagating action potentials (b-APs) is controlled by various biological factors, including membrane potential (Vm). We report that, at any dendritic location (x), the transition from weak (small-amplitude b-APs) to strong (large-amplitude b-APs) back-propagation occurs when Vm crosses a threshold potential, θx. When Vm > θx, back-propagation is strong (mostly active). Conversely, when Vm < θx, back-propagation is weak (mostly passive). θx varies linearly with the distance (x) from the soma. Close to the soma, θx ≪ resting membrane potential (RMP) and a strong hyperpolarization of the membrane is necessary to switch back-propagation from strong to weak. In the distal dendrites, θx ≫ RMP and a strong depolarization is necessary to switch back-propagation from weak to strong. At ∼260 μm from the soma, θ260 ≈ RMP, suggesting that in this dendritic region back-propagation starts to switch from strong to weak. θx depends on the availability or state of Na+ and K+ channels. Partial blockade or phosphorylation of K+ channels decreases θx and thereby increases the portion of the dendritic tree experiencing strong back-propagation. Partial blockade or inactivation of Na+ channels has the opposite effect. We conclude that θx is a parameter that captures the onset of the transition from weak to strong back-propagation. Its modification may alter dendritic function under physiological and pathological conditions by changing how far large action potentials back-propagate in the dendritic tree.
APA, Harvard, Vancouver, ISO, and other styles
6

Buscema, Massimo. "Back Propagation Neural Networks." Substance Use & Misuse 33, no. 2 (January 1998): 233–70. http://dx.doi.org/10.3109/10826089809115863.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Solanki, Shital. "A review on back propagation algorithms for Feedforward Networks." Global Journal For Research Analysis 2, no. 1 (June 15, 2012): 73–75. http://dx.doi.org/10.15373/22778160/january2013/61.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Garkani-Nejad, Zahra, and Behzad Ahmadi-Roudi. "Investigating the role of weight update functions in developing artificial neural network modeling of retention times of furan and phenol derivatives." Canadian Journal of Chemistry 91, no. 4 (April 2013): 255–62. http://dx.doi.org/10.1139/cjc-2012-0372.

Full text
Abstract:
A quantitative structure–retention relationship study has been carried out on the retention times of 63 furan and phenol derivatives using artificial neural networks (ANNs). First, a large number of descriptors were calculated using the HyperChem, Mopac, and Dragon software packages. Then, a suitable number of these descriptors were selected using a multiple linear regression technique. This paper focuses on investigating the role of weight update functions in developing ANNs. Therefore, the selected descriptors were used as inputs for ANNs with six different weight update functions, including the Levenberg–Marquardt back-propagation network, scaled conjugate gradient back-propagation network, conjugate gradient back-propagation with Powell–Beale restarts network, one-step secant back-propagation network, resilient back-propagation network, and gradient descent with momentum back-propagation network. Comparison of the results indicates that the Levenberg–Marquardt back-propagation network has better predictive power than the other methods.
APA, Harvard, Vancouver, ISO, and other styles
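The study above compares six weight-update functions for back-propagation training. As a small illustration of what such an update rule looks like, the following sketch implements one of them, gradient descent with momentum, on a toy quadratic loss; the matrix, targets, and hyperparameters are invented for this example, and the paper itself applied these rules inside full ANN training rather than to a bare quadratic.

```python
import numpy as np

# Toy least-squares objective L(w) = 0.5 * ||A w - b||^2
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
b = np.array([1.0, -2.0])

def grad(w):
    # Gradient of the quadratic loss
    return A.T @ (A @ w - b)

# Gradient descent with momentum: the velocity term accumulates past
# gradients, smoothing the trajectory and accelerating convergence.
w = np.zeros(2)
v = np.zeros(2)
lr, momentum = 0.05, 0.9
for _ in range(500):
    v = momentum * v - lr * grad(w)
    w = w + v

# A is invertible here, so the minimizer solves A w = b exactly
w_exact = np.linalg.solve(A, b)
```

Swapping the two update lines for another rule (e.g. plain gradient descent, `w = w - lr * grad(w)`) is what distinguishes the training functions the paper compares.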
9

AL-Assady, Nidhal, Baydaa Khaleel, and Shahbaa Khaleel. "Improvement the Back-propagation Technique." AL-Rafidain Journal of Computer Sciences and Mathematics 1, no. 2 (December 1, 2004): 127–51. http://dx.doi.org/10.33899/csmj.2004.164115.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Moody, John, and Chris Darken. "Speedy alternatives to back propagation." Neural Networks 1 (January 1988): 202. http://dx.doi.org/10.1016/0893-6080(88)90239-0.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Back propagation"

1

Lowton, Andrew D. "A constructive learning algorithm based on back-propagation." Thesis, Aston University, 1995. http://publications.aston.ac.uk/10663/.

Full text
Abstract:
There has been a resurgence of interest in the neural networks field in recent years, provoked in part by the discovery of the properties of multi-layer networks. This interest has in turn raised questions about the possibility of making neural network behaviour more adaptive by automating some of the processes involved. Prior to these particular questions, the process of determining the parameters and network architecture required to solve a given problem had been a time-consuming activity. A number of researchers have attempted to address these issues by automating these processes, concentrating in particular on the dynamic selection of an appropriate network architecture. The work presented here specifically explores the area of automatic architecture selection; it focuses upon the design and implementation of a dynamic algorithm based on the Back-Propagation learning algorithm. The algorithm constructs a single hidden layer as the learning process proceeds, using individual pattern error as the basis of unit insertion. This algorithm is applied to several problems of differing type and complexity and is found to produce near-minimal architectures that are shown to have a high level of generalisation ability. (DX 187, 339)
APA, Harvard, Vancouver, ISO, and other styles
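The thesis abstract describes constructing a single hidden layer during learning and inserting units based on individual pattern error. A toy sketch of that growth loop is below; for compactness it fits the output weights by least squares over random sigmoid hidden units rather than by full back-propagation training, so it illustrates only the insert-units-until-error-is-small idea, not the thesis's actual algorithm. The dataset and thresholds are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression task: 64 patterns of a smooth 2-D function
X = rng.uniform(-1, 1, (64, 2))
y = np.sin(np.pi * X[:, 0]) * X[:, 1]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W = np.empty((2, 0))      # hidden-layer input weights, grown column by column
c = np.empty(0)           # hidden-unit biases
threshold, max_units = 0.05, 64
n_units, worst = 0, np.inf
while n_units < max_units:
    # Insert one new hidden unit with random input weights
    W = np.hstack([W, rng.normal(0, 2, (2, 1))])
    c = np.append(c, rng.normal(0.0, 1.0))
    n_units += 1
    # Refit the output weights, then check the worst per-pattern error
    H = sigmoid(X @ W + c)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    worst = float(np.abs(H @ beta - y).max())
    if worst < threshold:
        break   # the architecture is now large enough for this problem
```

The loop stops as soon as the worst individual pattern error falls below the threshold, yielding a hidden layer no larger than the problem requires.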
2

Sanner, Robert M. (Robert Michael). "Neuromorphic regulation of dynamic systems using back propagation networks." Thesis, Massachusetts Institute of Technology, 1988. http://hdl.handle.net/1721.1/34995.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Fernando, Thudugala Mudalige K. G. "Hydrological applications of MLP neural networks with back-propagation." Thesis, Hong Kong : University of Hong Kong, 2002. http://sunzi.lib.hku.hk/hkuto/record.jsp?B25085517.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Bennett, Richard Campbell. "Classification of underwater signals using a back-propagation neural network." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1997. http://handle.dtic.mil/100.2/ADA331774.

Full text
Abstract:
Thesis (M.S. in Electrical Engineering)--Naval Postgraduate School, June 1997.
Thesis advisors, Monique P. Fargues, Roberto Cristi. Includes bibliographical references (p. 95). Also available online.
APA, Harvard, Vancouver, ISO, and other styles
5

Ramachandran, Adithya. "HEV fuel optimization using interval back propagation based dynamic programming." Thesis, Georgia Institute of Technology, 2016. http://hdl.handle.net/1853/55054.

Full text
Abstract:
In this thesis, the primary powertrain components of a power split hybrid electric vehicle are modeled. In particular, the dynamic model of the energy storage element (i.e., traction battery) is exactly linearized through an input transformation method to take advantage of the proposed optimal control algorithm. A Lipschitz continuous and nondecreasing cost function is formulated in order to minimize the net amount of consumed fuel. The globally optimal solution is obtained using a dynamic programming routine that produces the optimal input based on the current state of charge and the future power demand. It is shown that the global optimal control solution can be expressed in closed form for a time invariant and convex incremental cost function utilizing the interval back propagation approach. The global optimality of both time varying and invariant solutions is rigorously proved. The optimal closed-form solution is further shown to be applicable to the time varying case provided that the time variations of the incremental cost function are sufficiently small. The real-time implementation of this algorithm in Simulink is discussed, and a 32.84% improvement in fuel economy is observed compared to existing rule-based methods.
APA, Harvard, Vancouver, ISO, and other styles
6

Teo, Chin Hock. "Back-propagation neural networks in adaptive control of unknown nonlinear systems." Thesis, Monterey, California. Naval Postgraduate School, 1991. http://hdl.handle.net/10945/26898.

Full text
Abstract:
Approved for public release; distribution is unlimited
The objective of this research is to develop a Back-propagation Neural Network (BNN) to control certain classes of unknown nonlinear systems and explore the network's capabilities. The structure of the Direct Model Reference Adaptive Controller (DMRAC) for Linear Time Invariant (LTI) systems with unknown parameters is first analyzed. This structure is then extended using a BNN for adaptive control of unknown nonlinear systems. The specific structure of the BNN DMRAC is developed for control of four general classes of nonlinear systems modeled in discrete time. Experiments are conducted by placing a representative system from each class under the BNN's control. The conditions under which the BNN DMRAC can successfully control these systems are investigated. The design and training of the BNN are also studied. The results of the experiments show that the BNN DMRAC works for the representative systems considered, while the conventional least-squares estimator DMRAC fails. Based on analysis and experimental findings, some general conditions required to ensure that this technique works are postulated and discussed. General guidelines used to achieve the stability of the BNN learning process and good learning convergence are also discussed. To establish this as a general and significant control technique, further research is required to obtain analytically the conditions for stability of the controlled system, and to develop more specific rules and guidelines in the BNN design and training.
APA, Harvard, Vancouver, ISO, and other styles
7

Cakarcan, Alpay. "Back-propagation neural networks in adaptive control of unknown nonlinear systems." Thesis, Monterey, California. Naval Postgraduate School, 1994. http://hdl.handle.net/10945/30830.

Full text
Abstract:
The objective of this thesis research is to develop a Back-Propagation Neural Network (BNN) to control certain classes of unknown nonlinear systems and explore the network's capabilities. The structure of the Direct Model Reference Adaptive Controller (DMRAC) for Linear Time Invariant (LTI) systems with unknown parameters is first analyzed and then extended to nonlinear systems by using a BNN. Nonminimum phase systems, both linear and nonlinear, have also been considered. The analysis of the experiments shows that the BNN DMRAC gives satisfactory results for the representative nonlinear systems considered, while the conventional least-squares estimator DMRAC fails. Based on the analysis and experimental findings, some general conditions are shown to be required to ensure that this technique is satisfactory. These conditions are presented and discussed. It has been found that further research needs to be done for the nonminimum phase case in order to guarantee stability and tracking. Also, to establish this as a more general and significant control technique, further research is required to develop more specific rules and guidelines for the BNN design and training.
APA, Harvard, Vancouver, ISO, and other styles
8

Xiao, Nancy Y. (Nancy Ying). "Using the modified back-propagation algorithm to perform automated downlink analysis." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/40206.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1996.
Includes bibliographical references (p. 121-122).
by Nancy Y. Xiao.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
9

Le, Chau Giang. "Application of a back-propagation neural network to isolated-word speech recognition." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1993. http://handle.dtic.mil/100.2/ADA272495.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Chen, Peng. "Analysis of contribution rates and prediction based on back propagation neural networks." Thesis, University of Macau, 2017. http://umaclib3.umac.mo/record=b3691340.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Back propagation"

1

Yves, Chauvin, and Rumelhart David E, eds. Back propagation: Theory, architectures and applications. Hillsdale: Erlbaum, 1995.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Lowton, Andrew David. A constructive learning algorithm based on back-propagation. Birmingham: Aston University, Department of Computer Science and Applied Mathematics, 1995.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

Narendra, Kumpati S. Back propagation in dynamical systems containing neural networks. New Haven, CT: Yale University Center for Systems Science, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Bennett, Richard Campbell. Classification of underwater signals using a back-propagation neural network. Monterey, Calif: Naval Postgraduate School, 1997.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Phatak, Anil, Gano Chatterji, and Ames Research Center, eds. Scene segmentation of natural images using texture measures and back-propagation. Moffett Field, Calif: National Aeronautics and Space Administration, Ames Research Center, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Teo, Chin Hock. Back-propagation neural networks in adaptive control of unknown nonlinear systems. Monterey, Calif: Naval Postgraduate School, 1991.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Walker, James L. Back propagation neural networks for predicting ultimate strengths of unidirectional graphite/epoxy tensile specimens. [Washington, DC]: National Aeronautics and Space Administration, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

S, Piascik Robert, and Langley Research Center, eds. A back face strain compliance expression for the compact tension specimen. Hampton, Va: National Aeronautics and Space Administration, Langley Research Center, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Sundararajan, N., and Shou King Foo, eds. Parallel implementations of backpropagation neural networks on transputers: A study of training set parallelism. Singapore: World Scientific, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Plants Plus P/back. Collins, 1987.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Back propagation"

1

Trejo, Luis A., and Carlos Sandoval. "Improving back-propagation: Epsilon-back-propagation." In Lecture Notes in Computer Science, 427–32. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/3-540-59497-3_205.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Henseler, J. "Back Propagation." In Lecture Notes in Computer Science, 37–66. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/bfb0027022.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Utgoff, Paul E. "Constraint Back-Propagation." In The Kluwer International Series in Engineering and Computer Science, 63–90. Boston, MA: Springer US, 1986. http://dx.doi.org/10.1007/978-1-4613-2283-2_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

McKeon, Richard. "Training with Back Propagation." In Neural Networks for Electronics Hobbyists, 75–98. Berkeley, CA: Apress, 2018. http://dx.doi.org/10.1007/978-1-4842-3507-2_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Sheu, Bing J., and Joongho Choi. "Back-Propagation Neural Networks." In Neural Information Processing and VLSI, 277–96. Boston, MA: Springer US, 1995. http://dx.doi.org/10.1007/978-1-4615-2247-8_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Gasparini, Sonia, and Michele Migliore. "Action Potential Back-Propagation." In Encyclopedia of Computational Neuroscience, 1–6. New York, NY: Springer New York, 2019. http://dx.doi.org/10.1007/978-1-4614-7320-6_123-5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Mashinchi, M. Hadi, and Siti Mariyam H. J. Shamsuddin. "Three-Term Fuzzy Back-Propagation." In Studies in Computational Intelligence, 143–58. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01082-8_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Müller, Berndt, Joachim Reinhardt, and Michael T. Strickland. "BTT: Back-Propagation Through Time." In Neural Networks, 296–302. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/978-3-642-57760-4_28.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Alankar, Bhavya, Nowsheena Yousf, and Shafqat Ul Ahsaan. "Predictive Analytics for Weather Forecasting Using Back Propagation and Resilient Back Propagation Neural Networks." In Advances in Intelligent Systems and Computing, 99–115. Singapore: Springer Singapore, 2019. http://dx.doi.org/10.1007/978-981-13-9330-3_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Bedzak, Miroslaw. "Fuzzy-η for Back Propagation Networks." In Computational Intelligence. Theory and Applications, 86–91. Berlin, Heidelberg: Springer Berlin Heidelberg, 2001. http://dx.doi.org/10.1007/3-540-45493-4_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Back propagation"

1

Fazayeli, Farideh, Lipo Wang, and Wen Liu. "Back-propagation with chaos." In 2008 International Conference on Neural Networks and Signal Processing (ICNNSP). IEEE, 2008. http://dx.doi.org/10.1109/icnnsp.2008.4590298.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Gusciora, George L. "Back Propagation On Warp." In 32nd Annual Technical Symposium, edited by J. P. Letellier. SPIE, 1989. http://dx.doi.org/10.1117/12.948568.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

"[Back cover]." In 1985 Antennas and Propagation Society. IEEE, 1985. http://dx.doi.org/10.1109/aps.1985.1149404.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

"[Back cover]." In 1986 Antennas and Propagation Society. IEEE, 1986. http://dx.doi.org/10.1109/aps.1986.1149633.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

"[Back cover]." In 1987 Antennas and Propagation Society. IEEE, 1987. http://dx.doi.org/10.1109/aps.1987.1149917.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Zhang, Xishan, Shaoli Liu, Rui Zhang, Chang Liu, Di Huang, Shiyi Zhou, Jiaming Guo, et al. "Fixed-Point Back-Propagation Training." In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2020. http://dx.doi.org/10.1109/cvpr42600.2020.00240.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Heilpern, T., A. Shlivinski, and E. Heyman. "Beam-based back-propagation imaging." In 2009 International Conference on Electromagnetics in Advanced Applications (ICEAA). IEEE, 2009. http://dx.doi.org/10.1109/iceaa.2009.5297549.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Jurik, M. "Back error propagation - A critique." In COMPCON Spring 88. IEEE, 1988. http://dx.doi.org/10.1109/cmpcon.1988.4895.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Son Nguyen, Kanishka Tyagi, Parastoo Kheirkhah, and Michael Manry. "Partially affine invariant back propagation." In 2016 International Joint Conference on Neural Networks (IJCNN). IEEE, 2016. http://dx.doi.org/10.1109/ijcnn.2016.7727283.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Baum, Eric B. "Generalizing back propagation to computation." In AIP Conference Proceedings Volume 151. AIP, 1986. http://dx.doi.org/10.1063/1.36218.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Back propagation"

1

Deller, J. R., Jr., and S. D. Hunt. A Simple 'Linearized' Learning Algorithm Which Outperforms Back-Propagation. Fort Belvoir, VA: Defense Technical Information Center, January 1992. http://dx.doi.org/10.21236/ada249697.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

St. George, Brett A. Speech Coding and Phoneme Classification Using a Back-Propagation Neural Network. Fort Belvoir, VA: Defense Technical Information Center, May 1997. http://dx.doi.org/10.21236/ada418472.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Vanderbeek, Richard G., and Alice M. Harper. Back-Propagation Network for Analog Signal Separation in High Noise Environments. Fort Belvoir, VA: Defense Technical Information Center, July 1992. http://dx.doi.org/10.21236/ada254245.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Malik. L51877 Crack Arrest Toughness to Avoid Dynamic Ductile Fracture in Gas Transmission Pipelines. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), March 2001. http://dx.doi.org/10.55274/r0010192.

Full text
Abstract:
Design against long ductile fracture propagation in gas pipelines involves an analysis of the balance between driving force, derived from the gas pressure, and the fracture resistance of the material. Initially, the shelf energy in the Charpy test was successfully used as a measure of fracture propagation resistance. As material strength, pipe diameter and operating pressures increased and required greater fracture propagation resistance, the limitations of the Charpy energy approach became increasingly apparent. This limitation for modern steels is due to the fact that the Charpy test involves significant energy absorption contributions from processes not related to fracture propagation. If an energy-balance approach is to be maintained, and if material resistance is to be measured in a fairly simple laboratory notch bend test (e.g. Charpy or drop-weight tear), the problem reduces to the isolation of the propagation energy absorption per unit of crack advance. To resolve crack propagation energy, a novel modification was evaluated for both Charpy and DWTT specimens by employing a back-slot including a snug fitting shim to replace the removed material. In most cases, this modification was effective in curtailing the load-displacement trace when the propagating crack interacted with the slot on the backside of the specimen. It is also noted that this approach did not affect the initial portion of the load-displacement history and thus allowed crack propagation energies to be resolved.
APA, Harvard, Vancouver, ISO, and other styles
5

Kinnan, Cynthia, Krislert Samphantharak, Robert Townsend, and Diego A. Vera-Cossio. Propagation and Insurance in Village Networks. Inter-American Development Bank, July 2022. http://dx.doi.org/10.18235/0004385.

Full text
Abstract:
In village economies, small firm owners facing idiosyncratic shocks adjust production by cutting spending and reducing employment. Households with whom they trade inputs and labor scale back their own businesses and reduce consumption. As effects reverberate through local economies, the aggregate indirect adverse effects are larger than the direct effects. Propagation is more severe when transmitted through labor networks as opposed to material supply-chain networks, and goes beyond input-output/sectoral considerations as it varies with network position, closeness to a shocked household, and network density. Participation in gift-giving insurance networks mitigates direct and hence indirect effects. Supply chain and labor networks are fragile as the broken links are not easily replaced, leading to persistent damage. Social gains from better-targeted safety nets are substantially higher than private gains.
APA, Harvard, Vancouver, ISO, and other styles
6

Muhlestein, Michael, and Carl Hart. Numerical analysis of weak acoustic shocks in aperiodic array of rigid scatterers. Engineer Research and Development Center (U.S.), October 2020. http://dx.doi.org/10.21079/11681/38579.

Full text
Abstract:
Nonlinear propagation of shock waves through periodic structures has the potential to exhibit interesting phenomena. Frequency content of the shock that lies within a bandgap of the periodic structure is strongly attenuated, but nonlinear frequency-frequency interactions pump energy back into those bands. To investigate the relative importance of these propagation phenomena, numerical experiments using the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation are carried out. Two-dimensional propagation through a periodic array of rectangular waveguides is performed by iteratively using the output of one waveguide as the input for the next waveguide. Comparison of the evolution of the initial shock wave for both the linear and nonlinear cases is presented.
APA, Harvard, Vancouver, ISO, and other styles
7

Wilkins, C. A., and W. A. Sands. Comparison of a Back Propagation Artificial Neural Network Model with a Linear Regression Model for Personnel Selection. Fort Belvoir, VA: Defense Technical Information Center, May 1994. http://dx.doi.org/10.21236/ada280023.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Biagio, Di. L52037 Ductile Fracture Propagation Resistance for Advanced Pipeline Designs. Chantilly, Virginia: Pipeline Research Council International, Inc. (PRCI), January 2008. http://dx.doi.org/10.55274/r0011001.

Full text
Abstract:
The development of a method able to evaluate the ductile fracture behavior on pipelines has been documented. Therefore, methods for the determination of material fracture resistance and crack driving force have been accurately investigated. In particular, the techniques to determine the critical fracture characterizing parameter CTOA (crack tip opening angle) have been reviewed in-depth (back-slotted drop weight tear tests [DWTT], two-specimen CTOA tests, etc.), and in view of a future pipe-mill application. For a more reliable CTOA estimate, the following needed parameters have been investigated: 1) the rotation factor in a DWTT and 2) the material flow stress to be used in dynamic tests. On the other hand, as far as the crack driving force is concerned, a finite element code developed by CSM (PICPRO) has been successfully used to evaluate the correlation between the CTOA inferred by DWT tests and that measured on pipe. In addition, PICPRO has been used to determine the driving force acting on pipe in a wide range of operating conditions, finally supplying an appropriate formula for its calculation. Once the driving force and the fracture resistance have been determined, their comparison allows crack arrest assessment.
APA, Harvard, Vancouver, ISO, and other styles
9

Kirichek, Galina, Vladyslav Harkusha, Artur Timenko, and Nataliia Kulykovska. System for detecting network anomalies using a hybrid of an uncontrolled and controlled neural network. [n.p.], February 2020. http://dx.doi.org/10.31812/123456789/3743.

Full text
Abstract:
This article presents a method for detecting attacks and anomalies by training on normal and attack packets, respectively. The method used to learn attacks is a combination of an unsupervised and a supervised neural network. In the unsupervised network, attacks are classified into smaller categories, taking their features into account, using a self-organizing map. To manage the clusters, a neural network based on the back-propagation method is used. We use PyBrain as the main framework for designing, developing, and training the perceptron. This framework has a sufficient number of solutions and algorithms for training, designing, and testing various types of neural networks. The software architecture is presented using a procedural-object approach. Because there is no need to save intermediate results of the program (after learning, the entire perceptron is stored in a file), all learning progress is stored in ordinary files on the hard disk.
APA, Harvard, Vancouver, ISO, and other styles
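The report above pairs an unsupervised self-organizing map (for grouping traffic into categories) with a supervised back-propagation classifier. The following sketch shows only the unsupervised half, a minimal 1-D self-organizing map run on synthetic two-cluster "traffic feature" data; the data, map size, and learning schedules are invented for illustration and do not reproduce the report's PyBrain setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "traffic features": two clusters standing in for normal
# and attack packets
data = np.vstack([rng.normal(0.2, 0.05, (100, 2)),
                  rng.normal(0.8, 0.05, (100, 2))])

# 1-D self-organizing map: 10 prototype vectors arranged on a line
n_nodes = 10
protos = rng.uniform(0, 1, (n_nodes, 2))
n_epochs = 20
for epoch in range(n_epochs):
    lr = 0.5 * (1 - epoch / n_epochs)          # decaying learning rate
    radius = max(1, n_nodes // 2 - epoch)      # shrinking neighborhood
    for x in rng.permutation(data):
        # The closest prototype wins; it and its map neighbors move toward x
        winner = int(np.argmin(((protos - x) ** 2).sum(axis=1)))
        for j in range(n_nodes):
            if abs(j - winner) <= radius:
                protos[j] += lr * (x - protos[j])

# Quantization error: mean distance from each point to its nearest prototype
dists = np.sqrt(((data[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1))
qerr = float(dists.min(axis=1).mean())
```

After training, each input can be assigned to its nearest prototype; in the report's pipeline those cluster assignments then feed a separate back-propagation network.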
10

Cuadra, Gabriel, and Victoria Nuguer. Research Insights: How Can Macro-Prudential Policy Control the Impact of Cross-Border Bank Flows on Emerging Market Economies? Inter-American Development Bank, June 2021. http://dx.doi.org/10.18235/0003327.

Full text
Abstract:
Advanced economies (AEs) transmit economic crises to Emerging Market Economies (EMEs) through cross-border bank flows, impacting their output, credit, and asset prices. Empirical evidence suggests that the transmission of the crisis from AEs to EMEs is higher in the absence of macro-prudential policy. A macro-prudential policy in the form of a levy on EMEs' banks, when credit grows faster than deposits, reduces the propagation of AEs' crises to EMEs: the consumption drop is 12 percent lower, and the reaction of the labor market is smoother, so consumers are better off with the policy than without it.
APA, Harvard, Vancouver, ISO, and other styles