Academic literature on the topic 'Gradient Smoothing'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Gradient Smoothing.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Journal articles on the topic "Gradient Smoothing"
Fang, Shuai, Zhenji Yao, and Jing Zhang. "Scale and Gradient Aware Image Smoothing." IEEE Access 7 (2019): 166268–81. http://dx.doi.org/10.1109/access.2019.2953550.
Wang, Dongdong, Jiarui Wang, and Junchao Wu. "Superconvergent gradient smoothing meshfree collocation method." Computer Methods in Applied Mechanics and Engineering 340 (October 2018): 728–66. http://dx.doi.org/10.1016/j.cma.2018.06.021.
Zhou, Zhengyong, and Qi Yang. "An Active Set Smoothing Method for Solving Unconstrained Minimax Problems." Mathematical Problems in Engineering 2020 (June 24, 2020): 1–25. http://dx.doi.org/10.1155/2020/9108150.
Xu, Li, Cewu Lu, Yi Xu, and Jiaya Jia. "Image smoothing via L0 gradient minimization." ACM Transactions on Graphics 30, no. 6 (December 2011): 1–12. http://dx.doi.org/10.1145/2070781.2024208.
Burke, James V., Tim Hoheisel, and Christian Kanzow. "Gradient Consistency for Integral-convolution Smoothing Functions." Set-Valued and Variational Analysis 21, no. 2 (March 29, 2013): 359–76. http://dx.doi.org/10.1007/s11228-013-0235-6.
Pinilla, Samuel, Tamir Bendory, Yonina C. Eldar, and Henry Arguello. "Frequency-Resolved Optical Gating Recovery via Smoothing Gradient." IEEE Transactions on Signal Processing 67, no. 23 (December 1, 2019): 6121–32. http://dx.doi.org/10.1109/tsp.2019.2951192.
Avrashi, Jacob. "High order gradient smoothing towards improved C1 eigenvalues." Engineering Computations 12, no. 6 (June 1995): 513–28. http://dx.doi.org/10.1108/02644409510799749.
Wang, Bao, Difan Zou, Quanquan Gu, and Stanley J. Osher. "Laplacian Smoothing Stochastic Gradient Markov Chain Monte Carlo." SIAM Journal on Scientific Computing 43, no. 1 (January 2021): A26–A53. http://dx.doi.org/10.1137/19m1294356.
Lin, Qihang, Xi Chen, and Javier Peña. "A smoothing stochastic gradient method for composite optimization." Optimization Methods and Software 29, no. 6 (March 13, 2014): 1281–301. http://dx.doi.org/10.1080/10556788.2014.891592.
He, Liangtian, and Yilun Wang. "Image smoothing via truncated ℓ0 gradient regularisation." IET Image Processing 12, no. 2 (February 1, 2018): 226–34. http://dx.doi.org/10.1049/iet-ipr.2017.0533.
Dissertations / Theses on the topic "Gradient Smoothing"
Lee, Chang-Kye. "Gradient smoothing in finite elasticity : near-incompressibility." Thesis, Cardiff University, 2016. http://orca.cf.ac.uk/94491/.
Mao, Zirui. "A Novel Lagrangian Gradient Smoothing Method for Fluids and Flowing Solids." University of Cincinnati / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1553252214052311.
Pierucci, Federico. "Optimisation non-lisse pour l'apprentissage statistique avec régularisation matricielle structurée." Thesis, Université Grenoble Alpes (ComUE), 2017. http://www.theses.fr/2017GREAM024/document.
Full textTraining machine learning methods boils down to solving optimization problems whose objective functions often decomposes into two parts: a) the empirical risk, built upon the loss function, whose shape is determined by the performance metric and the noise assumptions; b) the regularization penalty, built upon a norm, or a gauge function, whose structure is determined by the prior information available for the problem at hand.Common loss functions, such as the hinge loss for binary classification, or more advanced loss functions, such as the one arising in classification with reject option, are non-smooth. Sparse regularization penalties such as the (vector) l1- penalty, or the (matrix) nuclear-norm penalty, are also non-smooth. However, basic non-smooth optimization algorithms, such as subgradient optimization or bundle-type methods, do not leverage the composite structure of the objective. The goal of this thesis is to study doubly non-smooth learning problems (with non-smooth loss functions and non-smooth regularization penalties) and first- order optimization algorithms that leverage composite structure of non-smooth objectives.In the first chapter, we introduce new regularization penalties, called the group Schatten norms, to generalize the standard Schatten norms to block- structured matrices. We establish the main properties of the group Schatten norms using tools from convex analysis and linear algebra; we retrieve in particular some convex envelope properties. We discuss several potential applications of the group nuclear-norm, in collaborative filtering, database compression, multi-label image tagging.In the second chapter, we present a survey of smoothing techniques that allow us to use first-order optimization algorithms designed for composite objectives decomposing into a smooth part and a non-smooth part. 
We also show how smoothing can be used on the loss function corresponding to the top-k accuracy, used for ranking and multi-class classification problems. We outline some first-order algorithms that can be used in combination with the smoothing technique: i) conditional gradient algorithms; ii) proximal gradient algorithms; iii) incremental gradient algorithms.In the third chapter, we study further conditional gradient algorithms for solving doubly non-smooth optimization problems. We show that an adaptive smoothing combined with the standard conditional gradient algorithm gives birth to new conditional gradient algorithms having the expected theoretical convergence guarantees. We present promising experimental results in collaborative filtering for movie recommendation and image categorization
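The recipe this abstract describes (smooth the non-smooth loss, keep the non-smooth penalty, then apply a proximal first-order method) can be illustrated with a small sketch. This is not code from the thesis: the Huber-style smoothed hinge, the l1 penalty, and all constants below are illustrative assumptions.

```python
import numpy as np

def smoothed_hinge(z, mu=0.5):
    """Huber-style smoothing of the hinge loss max(0, 1 - z):
    flat for z >= 1, quadratic on (1 - mu, 1), linear below."""
    out = np.zeros_like(z)
    lin = z <= 1 - mu
    mid = (z > 1 - mu) & (z < 1)
    out[lin] = 1 - z[lin] - mu / 2
    out[mid] = (1 - z[mid]) ** 2 / (2 * mu)
    return out

def smoothed_hinge_grad(z, mu=0.5):
    """Derivative of the smoothed hinge; (1/mu)-Lipschitz."""
    g = np.zeros_like(z)
    g[z <= 1 - mu] = -1.0
    mid = (z > 1 - mu) & (z < 1)
    g[mid] = -(1 - z[mid]) / mu
    return g

def soft_threshold(w, t):
    """Proximal operator of t * ||w||_1 (handles the non-smooth penalty)."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def prox_grad_svm(X, y, lam=0.1, mu=0.5, steps=500):
    """Proximal gradient on smoothed-hinge risk + lam * l1 penalty."""
    n, d = X.shape
    w = np.zeros(d)
    L = np.linalg.norm(X, 2) ** 2 / (n * mu)  # Lipschitz bound on the smooth part
    eta = 1.0 / L
    for _ in range(steps):
        z = y * (X @ w)
        grad = X.T @ (y * smoothed_hinge_grad(z, mu)) / n
        w = soft_threshold(w - eta * grad, eta * lam)
    return w
```

A usage sketch: on linearly separable data with a sparse ground-truth direction, the iterate recovers the sign pattern of the informative features while the l1 prox shrinks the irrelevant ones.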
Bhowmick, Sauradeep. "Advanced Smoothed Finite Element Modeling for Fracture Mechanics Analyses." University of Cincinnati / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1623240613376967.
Mayrink, Victor Teixeira de Melo. "Avaliação do algoritmo Gradient Boosting em aplicações de previsão de carga elétrica a curto prazo." Universidade Federal de Juiz de Fora (UFJF), 2016. https://repositorio.ufjf.br/jspui/handle/ufjf/3563.
Full textApproved for entry into archive by Adriana Oliveira (adriana.oliveira@ufjf.edu.br) on 2017-03-07T15:06:57Z (GMT) No. of bitstreams: 1 victorteixeirademelomayrink.pdf: 2587774 bytes, checksum: 1319cc37a15480796050b618b4d7e5f7 (MD5)
Made available in DSpace on 2017-03-07T15:06:57Z (GMT). No. of bitstreams: 1 victorteixeirademelomayrink.pdf: 2587774 bytes, checksum: 1319cc37a15480796050b618b4d7e5f7 (MD5) Previous issue date: 2016-08-31
FAPEMIG - Fundação de Amparo à Pesquisa do Estado de Minas Gerais
The storage of electrical energy is still not feasible on a large scale due to technical and economic issues. Therefore, all energy to be consumed must be produced instantly; it is not possible to store the production surplus, or to cover supply shortages with safety stocks, even for a short period of time. Thus, one of the main challenges of energy planning is computing accurate forecasts of future demand. In this work, we present a model for short-term load forecasting. The methodology consists in composing a prediction committee by applying the Gradient Boosting algorithm in combination with decision tree models and the exponential smoothing technique. This strategy comprises a supervised learning method that fits the forecasting model on historical energy consumption data, recorded temperatures, and calendar variables. The proposed models were tested on two different datasets and showed good performance when compared with results published in recent papers.
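The two ingredients this abstract combines can be sketched in a few lines: gradient boosting of decision stumps on a calendar feature (hour of day), and simple exponential smoothing of the load series. This is a toy illustration under assumed data, not the thesis's actual committee; the real work uses full decision trees and temperature features.

```python
import numpy as np

def exp_smooth(series, alpha=0.3):
    """Simple exponential smoothing of a load series."""
    s = np.empty(len(series), dtype=float)
    s[0] = series[0]
    for t in range(1, len(series)):
        s[t] = alpha * series[t] + (1 - alpha) * s[t - 1]
    return s

def fit_stump(x, r):
    """Best single-threshold split on x minimizing squared error on residuals r."""
    best = (np.inf, 0.0, r.mean(), r.mean())
    for thr in np.unique(x):
        left, right = r[x <= thr], r[x > thr]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, thr, left.mean(), right.mean())
    return best[1:]

def boost(x, y, rounds=200, lr=0.1):
    """Gradient boosting for squared loss: each stump fits the
    current residuals (the negative gradient) and is added with shrinkage lr."""
    pred = np.full(len(y), y.mean(), dtype=float)
    stumps = []
    for _ in range(rounds):
        r = y - pred
        thr, lv, rv = fit_stump(x, r)
        pred += lr * np.where(x <= thr, lv, rv)
        stumps.append((thr, lv, rv))
    return pred, stumps
```

On a synthetic daily load curve (a sinusoid over the hour of day plus noise), the boosted stumps drive the training error well below the variance of the raw series, and `exp_smooth` gives a second, lag-based committee member.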
Heinrich, André. "Fenchel duality-based algorithms for convex optimization problems with applications in machine learning and image restoration." Doctoral thesis, Universitätsbibliothek Chemnitz, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:ch1-qucosa-108923.
Huang, Chih-Ping, and 黃志平. "Piecewise Linear Function Solution Space and Modified-Gradient Smoothing Domain Method." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/51737947931004919031.
Cheng, Ching Wen, and 鄭景文. "Simplification Of Centroid Gradient Smoothing Domain Method Using Finite Element Basis." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/52320644772318750123.
Jhong, Jhih-Syong, and 鍾智雄. "A Study of Gradient Smoothing Methods for Boundary Value Problems on Triangular Meshes." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/93662964677132066520.
Full textHsieh, Hsun, and 謝. 洵. "Automatic tumor segmentation of breast ultra-sound images using a distance-regularized level-set evolution method with initial contour obtained by guided image filter, L0 gradient minimization smoothing pre-processing, and morphological features." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/t6z6cs.
Full text國立清華大學
電機工程學系所
105
Due to the speckle noise and low contrast in breast ultrasound images, it is hard to locate the contour of a tumor with a single method. In this thesis, a new method for finding an initial contour is proposed, which improves the result of DRLSE on the segmentation of BUS images. The new method focuses on improving the algorithm proposed by Tsai-Wen Niu, which searches for an initial contour based on local minima in the images. When a BUS image contains calcification, that algorithm's search for an initial contour can fail, leading to a poor segmentation result when the initial contour lands in the wrong place. Therefore, we acquire a larger initial contour by using a series of image smoothing methods and binarization, which eliminate weak edges and adjust the contrast in BUS images. In addition, some images without local minima can be successfully handled by the proposed method. However, the pixel values in such images are similar, so it can be hard to separate the tumor region from the non-tumor region by pixel-value differences alone. These obstacles are overcome by measuring the differences in length and pixel value of each suspect region, and the ranking is further improved by using morphological features. After applying DRLSE, our initial contour reaches the tumor region more accurately. To evaluate the segmentation, it is compared with the outcomes of DRLSE obtained from different initial contours: the one proposed by Tsai-Wen Niu, an expansion DRLSE method, and a contraction DRLSE method, using three evaluation metrics (ME, RFAE, and MHD). The experimental results indicate that the proposed method is generally better than the other methods. The initial contour might contain non-tumor regions when the tumor's boundary is too ambiguous; even so, the proposed method drastically reduces the number of DRLSE iterations and the computation time.
According to the experimental results, the proposed method has three advantages over the other methods. First, it sets the initial contour automatically, which is more efficient than setting it manually. Second, the region of the initial contour is much larger than those obtained by the other methods, which reduces the computation time and the number of DRLSE iterations. Third, if the tumor boundary is distinct, the new initial contour improves the segmentation result of DRLSE.
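The "smooth, then binarize" step at the heart of this initial-contour idea can be sketched with numpy alone. This is a deliberately simplified stand-in, not the thesis's pipeline: a box blur replaces the guided filter and L0 gradient minimization, and a mean-intensity threshold replaces the morphological-feature ranking; dark (hypoechoic) pixels become the candidate mask.

```python
import numpy as np

def box_blur(img, k=5):
    """Separable box filter: a crude stand-in for the smoothing
    stage (the thesis uses guided filtering and L0 smoothing)."""
    pad = k // 2
    out = np.pad(img, pad, mode='edge').astype(float)
    kern = np.ones(k) / k
    # Valid-mode convolution along each axis undoes the padding.
    out = np.apply_along_axis(lambda r: np.convolve(r, kern, mode='valid'), 1, out)
    out = np.apply_along_axis(lambda c: np.convolve(c, kern, mode='valid'), 0, out)
    return out

def initial_contour_mask(img, k=5):
    """Binarize the smoothed image at its mean intensity;
    pixels darker than the mean form the candidate tumor mask."""
    sm = box_blur(img, k)
    return sm < sm.mean()
```

On a synthetic image (a dark disk on a bright, noisy background, mimicking a hypoechoic lesion), the mask recovers most of the disk, which is all an initial contour for DRLSE needs to be.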
Books on the topic "Gradient Smoothing"
Geological Survey (U.S.), ed. Combining edge-gradient information to improve adaptive discontinuity-preserving smoothing of multispectral images. Reston, VA (521 National Center, Reston 22092): U.S. Geological Survey, 1994.
Book chapters on the topic "Gradient Smoothing"
Bui, Tinh Quoc. "A Smoothing Gradient-Enhanced Damage Model." In Computational and Experimental Simulations in Engineering, 91–96. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-27053-7_9.
Welk, Martin. "Diffusion, Pre-smoothing and Gradient Descent." In Lecture Notes in Computer Science, 78–90. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-75549-2_7.
Zhang, He, François Petitjean, and Wray Buntine. "Hierarchical Gradient Smoothing for Probability Estimation Trees." In Advances in Knowledge Discovery and Data Mining, 222–34. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-47426-3_18.
Howlett, John, and Alan Zundel. "Size Function Smoothing Using an Element Area Gradient." In Proceedings of the 18th International Meshing Roundtable, 1–12. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-04319-2_1.
Cox, Ingemar J., Sunita Hingorani, Bruce M. Maggs, and Satish B. Rao. "Stereo Without Disparity Gradient Smoothing: a Bayesian Sensor Fusion Solution." In BMVC92, 337–46. London: Springer London, 1992. http://dx.doi.org/10.1007/978-1-4471-3201-1_35.
Ahmad, Zohaib, Kaizhe Nie, Junfei Qiao, and Cuili Yang. "Batch Gradient Training Method with Smoothing l0 Regularization for Echo State Networks." In Machine Learning and Intelligent Communications, 491–500. Cham: Springer International Publishing, 2019. http://dx.doi.org/10.1007/978-3-030-32388-2_42.
Chen, Li, Hongzhi Zhang, Dongwei Ren, David Zhang, and Wangmeng Zuo. "Fast Augmented Lagrangian Method for Image Smoothing with Hyper-Laplacian Gradient Prior." In Communications in Computer and Information Science, 12–21. Berlin, Heidelberg: Springer Berlin Heidelberg, 2014. http://dx.doi.org/10.1007/978-3-662-45643-9_2.
Ul Rahman, Jamshaid, Akhtar Ali, Masood Ur Rehman, and Rafaqat Kazmi. "A Unit Softmax with Laplacian Smoothing Stochastic Gradient Descent for Deep Convolutional Neural Networks." In Communications in Computer and Information Science, 162–74. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-5232-8_14.
Iqbal, Mansoor, Muhammad Awais Rehman, Naveed Iqbal, and Zaheer Iqbal. "Effect of Laplacian Smoothing Stochastic Gradient Descent with Angular Margin Softmax Loss on Face Recognition." In Communications in Computer and Information Science, 549–61. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-5232-8_47.
Najid, Najib Mohamed, Marouane Alaoui-Selsouli, and Abdemoula Mohafid. "Dantzig-Wolfe Decomposition and Lagrangean Relaxation-Based Heuristics for an Integrated Production and Maintenance Planning with Time Windows." In Handbook of Research on Modern Optimization Algorithms and Applications in Engineering and Economics, 601–29. IGI Global, 2016. http://dx.doi.org/10.4018/978-1-4666-9644-0.ch023.
Conference papers on the topic "Gradient Smoothing"
Akai, Yuji, Toshihiro Shibata, Ryo Matsuoka, and Masahiro Okuda. "L0 Smoothing Based on Gradient Constraints." In 2018 25th IEEE International Conference on Image Processing (ICIP). IEEE, 2018. http://dx.doi.org/10.1109/icip.2018.8451436.
Pinilla, Samuel, Jorge Bacca, Jhon Angarita, and Henry Arguello. "Phase Retrieval via Smoothing Projected Gradient Method." In ICASSP 2018 - 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2018. http://dx.doi.org/10.1109/icassp.2018.8461445.
Gudkov, Vladimir, and Ilia Moiseev. "Image Smoothing Algorithm Based on Gradient Analysis." In 2020 Ural Symposium on Biomedical Engineering, Radioelectronics and Information Technology (USBEREIT). IEEE, 2020. http://dx.doi.org/10.1109/usbereit48449.2020.9117646.
Feng Huang, Hu Cheng, and S. Vijayakumar. "Gradient weighted smoothing for MRI intensity correction." In 2005 IEEE Engineering in Medicine and Biology 27th Annual Conference. IEEE, 2005. http://dx.doi.org/10.1109/iembs.2005.1617109.
Jiao, Jian, Hong Lu, Zijian Wang, Wenqiang Zhang, and Lizhe Qi. "L0 Gradient Smoothing and Bimodal Histogram Analysis." In MMAsia '19: ACM Multimedia Asia. New York, NY, USA: ACM, 2019. http://dx.doi.org/10.1145/3338533.3366554.
Knyazev, Andrew, and Alexander Malyshev. "Conjugate gradient acceleration of non-linear smoothing filters." In 2015 IEEE Global Conference on Signal and Information Processing (GlobalSIP). IEEE, 2015. http://dx.doi.org/10.1109/globalsip.2015.7418194.
Chalasani, Rakesh, and Jose C. Principe. "Dynamic sparse coding with smoothing proximal gradient method." In ICASSP 2014 - 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2014. http://dx.doi.org/10.1109/icassp.2014.6854995.
Heiden, Eric, Luigi Palmieri, Sven Koenig, Kai O. Arras, and Gaurav S. Sukhatme. "Gradient-Informed Path Smoothing for Wheeled Mobile Robots." In 2018 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2018. http://dx.doi.org/10.1109/icra.2018.8460818.
Subhan, Fazli, Salman Ahmed, and Khalid Ashraf. "Extended Gradient Predictor and Filter for smoothing RSSI." In 2014 16th International Conference on Advanced Communication Technology (ICACT). Global IT Research Institute (GIRI), 2014. http://dx.doi.org/10.1109/icact.2014.6779148.
Liu, Jun, Ming Yan, Jinshan Zeng, and Tieyong Zeng. "Image Smoothing Via Gradient Sparsity and Surface Area Minimization." In 2019 IEEE International Conference on Image Processing (ICIP). IEEE, 2019. http://dx.doi.org/10.1109/icip.2019.8804271.