Journal articles on the topic 'Progressive learning'

To see the other types of publications on this topic, follow the link: Progressive learning.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the top 50 journal articles for your research on the topic 'Progressive learning.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles from a wide variety of disciplines and organise your bibliography correctly.

1

Acharya, Avidit, and Juan Ortner. "Progressive Learning." Econometrica 85, no. 6 (2017): 1965–90. http://dx.doi.org/10.3982/ecta14718.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Yu, Zhengxu, Dong Shen, Zhongming Jin, Jianqiang Huang, Deng Cai, and Xian-Sheng Hua. "Progressive Transfer Learning." IEEE Transactions on Image Processing 31 (2022): 1340–48. http://dx.doi.org/10.1109/tip.2022.3141258.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Daniels, Jonathan S., David Moreau, and Brooke N. Macnamara. "Learning and Transfer in Problem Solving Progressions." Journal of Intelligence 10, no. 4 (October 12, 2022): 85. http://dx.doi.org/10.3390/jintelligence10040085.

Full text
Abstract:
Do individuals learn more effectively when given progressive or variable problem-solving experience, relative to consistent problem-solving experience? We investigated this question using a Rubik’s Cube paradigm. Participants were randomly assigned to a progression-order condition, where they practiced solving three progressively more difficult Rubik’s Cubes (i.e., 2 × 2 × 2 to 3 × 3 × 3 to 4 × 4 × 4), a variable-order condition, where they practiced solving three Rubik’s Cubes of varying difficulty (e.g., 3 × 3 × 3 to 2 × 2 × 2 to 4 × 4 × 4), or a consistent-order condition, where they consistently practiced on three 5 × 5 × 5 Rubik’s Cubes. All the participants then attempted a 5 × 5 × 5 Rubik’s Cube test. We tested whether variable training is as effective as progressive training for near transfer of spatial skills and whether progressive training is superior to consistent training. We found no significant differences in performance across conditions. Participants’ fluid reasoning predicted 5 × 5 × 5 Rubik’s Cube test performance regardless of training condition.
APA, Harvard, Vancouver, ISO, and other styles
4

Yu, Zhiwen, Daxing Wang, Jane You, Hau-San Wong, Si Wu, Jun Zhang, and Guoqiang Han. "Progressive subspace ensemble learning." Pattern Recognition 60 (December 2016): 692–705. http://dx.doi.org/10.1016/j.patcog.2016.06.017.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Fayek, Haytham M., Lawrence Cavedon, and Hong Ren Wu. "Progressive learning: A deep learning framework for continual learning." Neural Networks 128 (August 2020): 345–57. http://dx.doi.org/10.1016/j.neunet.2020.05.011.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Ahmed, Naveed, JeeWoong Park, Cristian Arteaga, and Haroon Stephen. "Investigation of Progressive Learning within a Statics Course: An Analysis of Performance Retention, Critical Topics, and Active Participation." Education Sciences 13, no. 6 (June 2, 2023): 576. http://dx.doi.org/10.3390/educsci13060576.

Full text
Abstract:
Previous research has demonstrated a link between prior knowledge and student success in engineering courses. However, while course-to-course relations exist, researchers have paid insufficient attention to how performance develops within a course. This study aims to address this gap, quantifying performance to extract meaningful insights, by examining a fundamental engineering course, Statics, from three perspectives: (1) progressive learning reflected in performance retention throughout the course; (2) critical topics and their influence on students’ performance progression; and (3) student active participation as a surrogate measure of progressive learning. By analyzing data collected from 222 students over five semesters, this study draws insights into students’ in-course progressive learning. The results show that early learning had significant implications for building a foundation for progressive learning throughout the semester. Additionally, insufficient knowledge of certain topics can hinder student learning progression more than others and eventually lead to course failure. Finally, student participation is a pathway to enhanced learning and excellent course performance. The presented analysis approach provides educators with a mechanism for diagnosing and devising strategies to address conceptual lapses in STEM (science, technology, engineering, and mathematics) courses, especially where progressive learning is essential.
APA, Harvard, Vancouver, ISO, and other styles
7

Yang, Boo-Ho, and H. Asada. "Progressive learning and its application to robot impedance learning." IEEE Transactions on Neural Networks 7, no. 4 (July 1996): 941–52. http://dx.doi.org/10.1109/72.508937.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Gitlin, Andrew. "Global learning & progressive school change." Literacy Information and Computer Education Journal 6, no. 4 (December 1, 2015): 2064–68. http://dx.doi.org/10.20533/licej.2040.2589.2015.0275.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Hui, Danqing Kang, Haibo He, and Fei-Yue Wang. "APLNet: Attention-enhanced progressive learning network." Neurocomputing 371 (January 2020): 166–76. http://dx.doi.org/10.1016/j.neucom.2019.08.086.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Han, Bo, Ivor W. Tsang, Ling Chen, Celina P. Yu, and Sai-Fu Fung. "Progressive Stochastic Learning for Noisy Labels." IEEE Transactions on Neural Networks and Learning Systems 29, no. 10 (October 2018): 5136–48. http://dx.doi.org/10.1109/tnnls.2018.2792062.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Yu, Zhiwen, Ye Lu, Jun Zhang, Jane You, Hau-San Wong, Yide Wang, and Guoqiang Han. "Progressive Semisupervised Learning of Multiple Classifiers." IEEE Transactions on Cybernetics 48, no. 2 (February 2018): 689–702. http://dx.doi.org/10.1109/tcyb.2017.2651114.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Quan, Ruijie, Yu Wu, Xin Yu, and Yi Yang. "Progressive Transfer Learning for Face Anti-Spoofing." IEEE Transactions on Image Processing 30 (2021): 3946–55. http://dx.doi.org/10.1109/tip.2021.3066912.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Corley, Gene. "Learning from disaster to prevent progressive collapse." Proceedings of the Institution of Civil Engineers - Civil Engineering 161, no. 6 (November 2008): 41–48. http://dx.doi.org/10.1680/cien.2008.161.6.41.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Lv, Yang, and Chen Xi. "PET image reconstruction with deep progressive learning." Physics in Medicine & Biology 66, no. 10 (May 14, 2021): 105016. http://dx.doi.org/10.1088/1361-6560/abfb17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Siddiqui, Zahid Ali, and Unsang Park. "Progressive Convolutional Neural Network for Incremental Learning." Electronics 10, no. 16 (August 5, 2021): 1879. http://dx.doi.org/10.3390/electronics10161879.

Full text
Abstract:
In this paper, we present a novel incremental learning technique to solve the catastrophic forgetting problem observed in CNN architectures. We used a progressive deep neural network to incrementally learn new classes while keeping the performance of the network unchanged on old classes. The incremental training requires us to train the network only for new classes and fine-tune the final fully connected layer, without needing to train the entire network again, which significantly reduces the training time. We evaluate the proposed architecture extensively on the image classification task using the Fashion MNIST, CIFAR-100 and ImageNet-1000 datasets. Experimental results show that the proposed network architecture not only alleviates catastrophic forgetting but can also leverage prior knowledge via lateral connections to previously learned classes and their features. In addition, the proposed scheme is easily scalable and does not require structural changes to the network trained on the old task, which are highly desirable properties in embedded systems.
APA, Harvard, Vancouver, ISO, and other styles
16

Chen, Yisong, Guoping Wang, and Shihai Dong. "Learning with progressive transductive support vector machine." Pattern Recognition Letters 24, no. 12 (August 2003): 1845–55. http://dx.doi.org/10.1016/s0167-8655(03)00008-4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Barth, Alison L., and Ajit Ray. "Progressive Circuit Changes during Learning and Disease." Neuron 104, no. 1 (October 2019): 37–46. http://dx.doi.org/10.1016/j.neuron.2019.09.032.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Zhang, Feiqian, Zhengxing Sun, Mofei Song, and Xufeng Lang. "Progressive 3D shape segmentation using online learning." Computer-Aided Design 58 (January 2015): 2–12. http://dx.doi.org/10.1016/j.cad.2014.08.008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Xu, Cai, Wei Zhao, Jinglong Zhao, Ziyu Guan, Yaming Yang, Long Chen, and Xiangyu Song. "Progressive Deep Multi-View Comprehensive Representation Learning." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (June 26, 2023): 10557–65. http://dx.doi.org/10.1609/aaai.v37i9.26254.

Full text
Abstract:
Multi-view Comprehensive Representation Learning (MCRL) aims to synthesize information from multiple views to learn comprehensive representations of data items. Prevalent deep MCRL methods typically concatenate synergistic view-specific representations or average aligned view-specific representations in the fusion stage. However, the performance of synergistic fusion methods inevitably degenerates or even fails when partial views are missing in real-world applications; alignment-based fusion methods usually cannot fully exploit the complementarity of multi-view data. To eliminate these drawbacks, in this work we present a Progressive Deep Multi-view Fusion (PDMF) method. Considering that the multi-view comprehensive representation should contain complete information while the view-specific data contain only partial information, we deem it unstable to directly learn the mapping from partial information to complete information. Hence, PDMF employs a progressive learning strategy comprising pre-training and fine-tuning stages. In the pre-training stage, PDMF decodes the auxiliary comprehensive representation to the view-specific data. It also captures the consistency and complementarity by learning the relations between the dimensions of the auxiliary comprehensive representation and all views. In the fine-tuning stage, PDMF learns the mapping from the original data to the comprehensive representation with the help of the auxiliary comprehensive representation and relations. Experiments conducted on a synthetic toy dataset and 4 real-world datasets show that PDMF outperforms state-of-the-art baseline methods. The code is released at https://github.com/winterant/PDMF.
APA, Harvard, Vancouver, ISO, and other styles
20

Cárdenas, Monica, and Daniela Rocio Ramirez Orellana. "Progressive Reduction of Captions in Language Learning." Journal of Information Technology Education: Innovations in Practice 23 (2024): 002. http://dx.doi.org/10.28945/5263.

Full text
Abstract:
Aim/Purpose: This exploratory qualitative case study examines the perceptions of high-school learners of English regarding a pedagogical intervention involving the progressive reduction of captions (full, sentence-level, keyword captions, and no captions) in enhancing language learning.
Background: Recognizing the limitations of caption usage in fostering independent listening comprehension in non-captioned environments, this research builds upon and extends the foundational work of Vanderplank (2016), who highlighted the necessity of a comprehensive blend of tasks, strategies, and focused viewing, and the need to actively engage language learners in watching captioned materials.
Methodology: Using a qualitative research design, the participants were exposed to authentic video texts in a five-week listening course. Participants completed an entry survey and, upon interaction with each captioning type, wrote individual reflections and participated in focus group sessions. This methodological approach allowed for an in-depth exploration of learners’ experiences across different captioning scenarios, providing a nuanced understanding of the pedagogical intervention’s impact on their perceived language development process.
Contribution: By bridging the research-practice gap, our study offers valuable insights into designing pedagogical interventions that reduce caption dependence, thereby preparing language learners for success in real-world, caption-free listening scenarios.
Findings: Our findings show that learners appreciate the varied captioning approaches not only for their role in supporting text comprehension, vocabulary acquisition, pronunciation, and on-task focus but also for facilitating the integration of new linguistic knowledge with existing background knowledge. Crucially, our study uncovers a positive reception towards the gradual shift from fully captioned to uncaptioned materials, highlighting a stepwise reduction of caption dependence as instrumental in boosting learners’ confidence and sense of achievement in mastering L2 listening skills.
Recommendations for Practitioners: The implications of our findings are threefold: addressing input selection, task design orchestration, and reflective practices. We advocate for a deliberate selection of input that resonates with learners’ interests and contextual realities, alongside task designs that progressively reduce caption reliance and encourage active learner engagement and collaborative learning opportunities. Furthermore, our study underscores the importance of reflective practices in enabling learners to articulate their learning preferences and strategies, thereby fostering a more personalized and effective language learning experience.
Recommendation for Researchers: Listening comprehension is a complex process that can be clearly influenced by the input, the task, and/or the learner characteristics. Comparative studies may struggle to control and account for all these variables, making it challenging to attribute observed differences solely to caption reduction.
Impact on Society: This research responds to the call for innovative teaching practices in language education. It sets the stage for future inquiries into the nuanced dynamics of caption usage in language learning, advocating for a more learner-centered and adaptive approach.
Future Research: Longitudinal quantitative studies that measure comprehension as caption support is gradually reduced (full, partial, and keyword) are strongly needed. Other studies could examine a range of individual differences (working memory capacity, age, levels of engagement, and language background) when reducing caption support. Future research could also examine captions with students with learning difficulties and/or disabilities.
APA, Harvard, Vancouver, ISO, and other styles
21

Kimura, Daisuke, and Natalia Kazik. "Learning in-progress." Gesture 16, no. 1 (June 15, 2017): 127–51. http://dx.doi.org/10.1075/gest.16.1.05kim.

Full text
Abstract:
Though gesture is a growing area in second language research, its role in the teaching and learning of grammar remains on the margins. Drawing from Sociocultural Theory, the present case study addresses this gap by offering a microgenetic analysis of an ESL learner’s developing understanding of the progressive aspect. Our analysis is threefold. First, we observe how the learner’s gesture reveals her initial understanding of the progressive aspect. This is followed by study of the learner’s appropriation of the teacher’s gesture for the progressive aspect. Finally, we examine the crucial ways in which the learner’s gesture differs from the teacher’s, arguing that the learner merged her initial understanding and the teacher’s gesture, instead of merely copying the teacher. We contend that gesture should not be regarded as supplementary to speech but as an indispensable window into the learning process that may not be accessible through the verbal channel alone.
APA, Harvard, Vancouver, ISO, and other styles
22

Rymarchuk, M. I. "Distance Learning as a Progressive Step on the Postgraduate Learning Platform." Bulletin of Problems Biology and Medicine 3, no. 152 (2019): 247. http://dx.doi.org/10.29254/2077-4214-2019-3-152-247-251.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Clearihan, Lyn, Silvia Vogel, Leon Piterman, and Neil Spike. "Transgenerational learning: maximising resources, minimising teaching gaps and fostering progressive learning." Australian Journal of Primary Health 17, no. 1 (2011): 29. http://dx.doi.org/10.1071/py10057.

Full text
Abstract:
The need to rationalise teaching resources underpinned a project at Monash University that used a Delphi technique to re-examine the teaching curriculum of two key topic areas in the medical curriculum – ophthalmology and dermatology – from an undergraduate, graduate and vocational perspective. Using Bloom’s taxonomy, the learning objectives from these topic areas were collated and analysed. This process allowed the learning objectives of the curricula to be revised and redistributed, reducing the likelihood of duplication of teaching or, more importantly, of gaps in teaching. It highlighted the potential utility of a transgenerational approach to curriculum planning, but the outcomes are limited due to the small number of participating educators and the lack of formal evaluation of the method.
APA, Harvard, Vancouver, ISO, and other styles
24

Atsawaraungsuk, Sarutte, Wasaya Boonphairote, Kritsanapong Somsuk, Chanwit Suwannapong, and Suchart Khummanee. "A progressive learning for structural tolerance online sequential extreme learning machine." TELKOMNIKA (Telecommunication Computing Electronics and Control) 21, no. 5 (October 1, 2023): 1039. http://dx.doi.org/10.12928/telkomnika.v21i5.24564.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Hu, Wenyi, Yuchen Jin, Xuqing Wu, and Jiefu Chen. "Progressive transfer learning for low-frequency data prediction in full-waveform inversion." GEOPHYSICS 86, no. 4 (June 1, 2021): R369–R382. http://dx.doi.org/10.1190/geo2020-0598.1.

Full text
Abstract:
To effectively overcome the cycle-skipping issue in full-waveform inversion (FWI), we have developed a deep neural network (DNN) approach to predict the absent low-frequency (LF) components by exploiting the hidden physical relation connecting the LF and high-frequency (HF) data. To efficiently solve this challenging nonlinear regression problem, two novel strategies are proposed to design the DNN architecture and to optimize the learning process: (1) the dual data feed structure and (2) progressive transfer learning. With the dual data feed structure, not only the HF data, but also the corresponding beat tone data, are fed into the DNN to relieve the burden of feature extraction. The second strategy, progressive transfer learning, enables us to train the DNN using a single evolving training data set. Within the framework of progressive transfer learning, the training data set continuously evolves in an iterative manner by gradually retrieving the subsurface information through the physics-based inversion module, progressively enhancing the prediction accuracy of the DNN and propelling the inversion process out of the local minima. The synthetic numerical experiments suggest that, without any a priori geologic information, the LF data predicted by the progressive transfer learning are sufficiently accurate for an FWI engine to produce reliable subsurface velocity models free of cycle-skipping artifacts.
APA, Harvard, Vancouver, ISO, and other styles
26

Fahmi Aajami, Raghad. "A Cognitive Framework in Learning English Progressive Tense." International Journal of Language and Literary Studies 4, no. 2 (June 4, 2022): 100–111. http://dx.doi.org/10.36892/ijlls.v4i2.924.

Full text
Abstract:
Dealing with the English language and the skills of using it is still a focus of interest for many researchers, teachers and workers in the field of education. Cognitive grammar theory, founded by Langacker (1987), is one of the prominent theories in this field. Iraqi students face a problem in mastering and understanding the use of English language tenses. A lot of research has been conducted in the Iraqi context to improve the level of Iraqi students and to benefit from cognitive theory, such as analysing the polysemy of English prepositions. This research is an empirical study in which 85 students from the English department of the College of Education for Women at the University of Baghdad participated. The data were collected through two tests, pre and post, and the SPSS statistical package was used to measure the extent of improvement in the participants' performance, in addition to a focus group discussion and a questionnaire before and after the experiment. The results showed an improvement in the students’ achievement. This indicates that cognitive grammar theory makes positive contributions to improving the understanding, assimilation and use of tenses in English as a foreign language.
APA, Harvard, Vancouver, ISO, and other styles
27

Ma, Wenguang, Shibiao Xu, Wei Ma, Xiaopeng Zhang, and Hongbin Zha. "Progressive Feature Learning for Facade Parsing With Occlusions." IEEE Transactions on Image Processing 31 (2022): 2081–93. http://dx.doi.org/10.1109/tip.2022.3152004.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Wang, Hui, Hanbin Zhao, and Xi Li. "Progressive Class-Based Expansion Learning for Image Classification." IEEE Signal Processing Letters 28 (2021): 1430–34. http://dx.doi.org/10.1109/lsp.2021.3094174.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Sun, Fanghui, Shen Wang, and Hongli Zhang. "A progressive learning method on unknown protocol behaviors." Journal of Network and Computer Applications 197 (January 2022): 103249. http://dx.doi.org/10.1016/j.jnca.2021.103249.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Duan, Yanfei, Yintian Liu, Ruixiang Wang, Dengguo Yao, and Hang Zhang. "Progressive face super-resolution via learning prior information." Journal of Physics: Conference Series 1651 (November 2020): 012127. http://dx.doi.org/10.1088/1742-6596/1651/1/012127.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Warsito, D. Muhtadi, Sukirwan, and H. Saleh. "The Role of Progressive Mathematics in Geometry Learning." Journal of Physics: Conference Series 1613 (August 2020): 012042. http://dx.doi.org/10.1088/1742-6596/1613/1/012042.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Moore, Philip B. "Research to Practice: Progressive Education and Accelerated Learning." Journal of Continuing Higher Education 54, no. 3 (October 2006): 48–51. http://dx.doi.org/10.1080/07377366.2006.10401225.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Yan, Tiantian, Shijie Wang, Zhihui Wang, Haojie Li, and Zhongxuan Luo. "Progressive learning for weakly supervised fine-grained classification." Signal Processing 171 (June 2020): 107519. http://dx.doi.org/10.1016/j.sigpro.2020.107519.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Yan, Yichao, Bingbing Ni, Huawei Wei, and Xiaokang Yang. "Fine-grained image analysis via progressive feature learning." Neurocomputing 396 (July 2020): 254–65. http://dx.doi.org/10.1016/j.neucom.2018.07.100.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Samson, Kurt. "Study Suggests Learning Disabilities May Foretell Progressive Aphasia." Neurology Today 8, no. 6 (March 2008): 30–31. http://dx.doi.org/10.1097/01.nt.0000314565.66129.21.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Maguire, Dee. "Progressive learning: structured induction for the novice nurse." British Journal of Nursing 22, no. 11 (June 13, 2013): 645–49. http://dx.doi.org/10.12968/bjon.2013.22.11.645.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

de Carvalho, A., D. L. Bisset, and M. C. Fairhurst. "Progressive learning algorithm for GSN feedforward neural architectures." Electronics Letters 30, no. 6 (March 17, 1994): 506–7. http://dx.doi.org/10.1049/el:19940326.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Lonka, Kirsti, Kai Hakkarainen, and Matti Sintonen. "Progressive inquiry learning for children — Experiences, possibilities, limitations." European Early Childhood Education Research Journal 8, no. 1 (January 2000): 7–23. http://dx.doi.org/10.1080/13502930085208461.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Guo, Junwen, Guobao Xiao, Shiping Wang, and Jun Yu. "Graph Context Transformation Learning for Progressive Correspondence Pruning." Proceedings of the AAAI Conference on Artificial Intelligence 38, no. 3 (March 24, 2024): 1968–75. http://dx.doi.org/10.1609/aaai.v38i3.27967.

Full text
Abstract:
Most existing correspondence pruning methods concentrate only on gathering as much context information as possible while neglecting effective ways to utilize such information. To tackle this dilemma, in this paper we propose the Graph Context Transformation Network (GCT-Net), which enhances context information to conduct consensus guidance for progressive correspondence pruning. Specifically, we design the Graph Context Enhance Transformer, which first generates the graph network and then transforms it into multi-branch graph contexts. Moreover, it employs self-attention and cross-attention to magnify the characteristics of each graph context, emphasizing the unique as well as the shared essential information. To further apply the recalibrated graph contexts to the global domain, we propose the Graph Context Guidance Transformer. This module adopts a confidence-based sampling strategy to temporarily screen high-confidence vertices and guide accurate classification by searching for global consensus between the screened vertices and the remaining ones. The extensive experimental results on outlier removal and relative pose estimation clearly demonstrate the superior performance of GCT-Net compared to state-of-the-art methods across outdoor and indoor datasets.
APA, Harvard, Vancouver, ISO, and other styles
40

Budnick, D., A. Ghannoum, F. Steinlehner, A. Weinschenk, W. Volk, S. Huhn, W. Melek, and M. Worswick. "Predicting Dynamic Process Limits in Progressive Die Sheet Metal Forming." IOP Conference Series: Materials Science and Engineering 1238, no. 1 (May 1, 2022): 012068. http://dx.doi.org/10.1088/1757-899x/1238/1/012068.

Full text
Abstract:
Tool makers have a limited selection of tools and are afforded limited flexibility during progressive die try-outs when attempting to identify suitable process control parameters and optimize throughput. The performance of a given tooling design hinges on selecting a suitable stroke rate for the press. Cost efficiencies are realized when operating a press at higher stroke rates, but higher rates risk subjecting the sheet metal strip to larger, uncontrolled oscillations, which can lead to collisions and strip misalignment during strip progression. Introducing active control to the strip feeder and lifters can offer increased flexibility to tool makers by allowing the strip progression to be fine-tuned to reduce strip oscillations at higher stroke rates. To alleviate uncertainties and assist in fine-tuning the process control parameters, machine learning models, such as an artificial neural network, are constructed to predict whether a given set of process parameters will lead to a collision or strip misalignment during the strip progression. The machine learning models are trained using a dataset of FEA simulations which model the same progressive die operation using different process control inputs for the feeder, lifter and press. The machine learning models are shown to be capable of predicting the outcome of a given process permutation with a classification accuracy of about 87% and assist in identifying the dynamic process limits in the progressive die operation.
APA, Harvard, Vancouver, ISO, and other styles
41

Montagut, Núria, Sergi Borrego-Écija, Magdalena Castellví, Immaculada Rico, Ramón Reñé, Mircea Balasa, Albert Lladó, and Raquel Sánchez-Valle. "Errorless Learning Therapy in Semantic Variant of Primary Progressive Aphasia." Journal of Alzheimer's Disease 79, no. 1 (January 5, 2021): 415–22. http://dx.doi.org/10.3233/jad-200904.

Full text
Abstract:
Background: The semantic variant of primary progressive aphasia (svPPA) is characterized by a progressive loss of semantic knowledge impairing the ability to name and to recognize the meaning of words. Objective: We aimed to evaluate the immediate and short-term effect of errorless learning speech therapy on the naming and recognition of commonly used words in patients with svPPA. Methods: Eight participants diagnosed with svPPA received 16 sessions of intensive errorless learning speech therapy. Naming and word comprehension tasks were evaluated at baseline, immediately postintervention, and at follow-up after 1, 3, and 6 months. These evaluations were performed using two item sets (a trained list and an untrained list). Results: In the naming tasks, patients showed a significant improvement in trained items immediately after the intervention, but that improvement decayed progressively when therapy ended. No improvements were found either in trained comprehension or in untrained tasks. Conclusion: Errorless learning therapy could improve naming ability in patients with svPPA. This effect may be due to the relative preservation of episodic memory, but the benefit is not maintained over time, presumably because there is no consolidation.
APA, Harvard, Vancouver, ISO, and other styles
42

Gariscsak, PJ, H. Braund, and F. Haji. "P.132 Investigation of Simulation-Based Lumbar Puncture Teaching Paradigms for Novice Learners." Canadian Journal of Neurological Sciences / Journal Canadien des Sciences Neurologiques 48, s3 (November 2021): S57. http://dx.doi.org/10.1017/cjn.2021.408.

Full text
Abstract:
Background: The prevalence of simulation-based education within the clinical neurosciences is on the rise; however, investigation into what environment is most conducive to optimizing learning performance is limited. We aimed to determine whether training in a simple-to-complex (progressive) sequence would result in superior learning compared to complex-to-simple (mixed) or complex-only sequences. Methods: A three-arm, prospective, randomised experiment was conducted to determine the effects on novice learner LP performance and cognitive load during learning and during a very complex simulated reality assessment test 9-11 days later. Results: During learning, sterility breaches decreased linearly over time (p<.01) with no group differences, and accuracy was higher in the progressive group compared to the complex-only group (p<.01), with a similar trend in the mixed group (p<.09). Across the learning phase, cognitive load increased in the progressive group (p<.01) and decreased in the mixed group (p<.01). At assessment, there were no group differences in the number of sterility breaches (p=.66), needle passes (p=.68) or cognitive load (p=.25). Conclusions: Contrary to our hypothesis, equivocal assessment performance was found across groups. Our results suggest that successive progression in the complexity of simulation does not improve novice learner outcomes. Further, a "one-size-fits-all" approach to simulated environment complexity in clinical neurosciences education may be warranted, given the equivocal learning outcomes and the fewer resources required.
APA, Harvard, Vancouver, ISO, and other styles
43

Mulig-Cruz, Charity I., Manuel B. Barquilla, Josefina R. Tabudlong, and Jingle B. Magallanes. "Development of Progressive Learning Theory – Based Physics Enhancement Course." Proceedings Journal of Education, Psychology and Social Science Research 2, no. 1 (May 23, 2015): 27–34. http://dx.doi.org/10.21016/icepss.2015.ap03wf88.

Full text
Abstract:
This paper details the procedures for designing an enhancement course based on existing literature, expert inputs, a needs assessment survey, identified constraints (time, material resources and teacher expertise), and validation procedures. The goal of the training is to enhance the Physics competence of non-Physics-major Grade 7 teachers through a PLT-based learning environment. The constraints considered in the conceptualization of the design are time and material and human resources. A needs assessment survey was conducted to identify the topics to be included in the training. The design incorporates equipping participants with ways of promoting self-directed learning, particularly the use of interactive science notebooks, the Cornell note-taking method, and graphic organizers. The main learning materials prepared for the participants were worksheets, which were also made available to them online. Each learning session consisted of two learning activities, one of which was a hands-on inquiry activity. This training design can be used by the Department of Education and private schools to equip science teachers with both the content knowledge and pedagogical skills that they need to teach topics in the K-12 spiraling curriculum.
APA, Harvard, Vancouver, ISO, and other styles
44

Fu, Mengying, Tianning Yuan, Fang Wan, Songcen Xu, and Qixiang Ye. "Agreement-Discrepancy-Selection: Active Learning with Progressive Distribution Alignment." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 8 (May 18, 2021): 7466–73. http://dx.doi.org/10.1609/aaai.v35i8.16915.

Full text
Abstract:
In active learning, failing to align the distribution of unlabeled samples with that of labeled samples hinders the model trained on labeled samples from selecting informative unlabeled samples. In this paper, we propose an agreement-discrepancy-selection (ADS) approach that aims to unify distribution alignment with sample selection by introducing adversarial classifiers to the convolutional neural network (CNN). Minimizing the classifiers' prediction discrepancy (maximizing prediction agreement) drives the learning of CNN features that reduce the distribution bias between labeled and unlabeled samples, while maximizing the classifiers' discrepancy highlights informative samples. Iterative optimization of agreement and discrepancy losses calibrated with an entropy function aligns sample distributions in a progressive fashion for effective active learning. Experiments on image classification and object detection tasks demonstrate that ADS is task-agnostic and significantly outperforms previous methods when the labeled sets are small.
APA, Harvard, Vancouver, ISO, and other styles
45

Liu, Xin, Guobao Xiao, Zuoyong Li, and Riqing Chen. "Point2CN: Progressive two-view correspondence learning via information fusion." Signal Processing 189 (December 2021): 108304. http://dx.doi.org/10.1016/j.sigpro.2021.108304.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Wang, Xinyan. "Ensemble of Online Extreme Learning Machine with Progressive Amnesia." Journal of Information and Computational Science 11, no. 4 (March 1, 2014): 1093–101. http://dx.doi.org/10.12733/jics20102920.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Wu, Yu, Yutian Lin, Xuanyi Dong, Yan Yan, Wei Bian, and Yi Yang. "Progressive Learning for Person Re-Identification With One Example." IEEE Transactions on Image Processing 28, no. 6 (June 2019): 2872–81. http://dx.doi.org/10.1109/tip.2019.2891895.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Xie, Jin, Fan Zhu, Guoxian Dai, Ling Shao, and Yi Fang. "Progressive Shape-Distribution-Encoder for Learning 3D Shape Representation." IEEE Transactions on Image Processing 26, no. 3 (March 2017): 1231–42. http://dx.doi.org/10.1109/tip.2017.2651408.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Melander, Helen, and Fritjof Sahlström. "Learning to Fly—The Progressive Development of Situation Awareness." Scandinavian Journal of Educational Research 53, no. 2 (April 2009): 151–66. http://dx.doi.org/10.1080/00313830902757576.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Venkatesan, Rajasekar, and Meng Joo Er. "A novel progressive learning technique for multi-class classification." Neurocomputing 207 (September 2016): 310–21. http://dx.doi.org/10.1016/j.neucom.2016.05.006.

Full text
APA, Harvard, Vancouver, ISO, and other styles