A selection of scholarly literature on the topic "Human Activity Prediction"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Browse lists of current articles, books, dissertations, conference papers, and other scholarly sources on the topic "Human Activity Prediction."
Next to every item in the bibliography there is an "Add to bibliography" button. Click it, and we will automatically generate a bibliographic reference for the selected work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, and so on.
You can also download the full text of the publication as a .pdf file and read its abstract online, provided that these details are available in the metadata.
Journal articles on the topic "Human Activity Prediction"
Dönmez, İlknur. "Human Activity Analysis and Prediction Using Google n-Grams." International Journal of Future Computer and Communication 7, no. 2 (June 2018): 32–36. http://dx.doi.org/10.18178/ijfcc.2018.7.2.516.
Yan, Aixia, Zhi Wang, Jiaxuan Li, and Meng Meng. "Human Oral Bioavailability Prediction of Four Kinds of Drugs." International Journal of Computational Models and Algorithms in Medicine 3, no. 4 (October 2012): 29–42. http://dx.doi.org/10.4018/ijcmam.2012100104.
D., Manju, and Radha V. "A survey on human activity prediction techniques." International Journal of Advanced Technology and Engineering Exploration 5, no. 47 (October 21, 2018): 400–406. http://dx.doi.org/10.19101/ijatee.2018.547006.
Keshinro, Babatunde, Younho Seong, and Sun Yi. "Deep Learning-based human activity recognition using RGB images in Human-robot collaboration." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 66, no. 1 (September 2022): 1548–53. http://dx.doi.org/10.1177/1071181322661186.
Bragança, Hendrio, Juan G. Colonna, Horácio A. B. F. Oliveira, and Eduardo Souto. "How Validation Methodology Influences Human Activity Recognition Mobile Systems." Sensors 22, no. 6 (March 18, 2022): 2360. http://dx.doi.org/10.3390/s22062360.
Giri, Pranit. "Human Activity Recognition System." International Journal for Research in Applied Science and Engineering Technology 11, no. 5 (May 31, 2023): 6671–73. http://dx.doi.org/10.22214/ijraset.2023.53135.
Bhambri, Pankaj, Sachin Bagga, Dhanuka Priya, Harnoor Singh, and Harleen Kaur Dhiman. "Suspicious Human Activity Detection System." December 2020 2, no. 4 (October 31, 2020): 216–21. http://dx.doi.org/10.36548/jismac.2020.4.005.
Tan, Xu-Nan. "Human Activity Recognition Based on CNN and LSTM." 電腦學刊 34, no. 3 (June 2023): 221–35. http://dx.doi.org/10.53106/199115992023063403016.
Esther, Ekemeyong, and Teresa Zielińska. "Predicting Human Activity – State of the Art." Pomiary Automatyka Robotyka 27, no. 2 (June 16, 2023): 31–46. http://dx.doi.org/10.14313/par_248/31.
Liu, Zhenguang, Kedi Lyu, Shuang Wu, Haipeng Chen, Yanbin Hao, and Shouling Ji. "Aggregated Multi-GANs for Controlled 3D Human Motion Prediction." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 3 (May 18, 2021): 2225–32. http://dx.doi.org/10.1609/aaai.v35i3.16321.
Dissertations on the topic "Human Activity Prediction"
Coen, Paul Dixon. "Human Activity Recognition and Prediction using RGBD Data." OpenSIUC, 2019. https://opensiuc.lib.siu.edu/theses/2562.
Bergelin, Victor. "Human Activity Recognition and Behavioral Prediction using Wearable Sensors and Deep Learning." Thesis, Linköpings universitet, Matematiska institutionen, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-138064.
Baldo, Fatima Magdi Hamza. "Integrating chemical, biological and phylogenetic spaces of African natural products to understand their therapeutic activity." Thesis, University of Cambridge, 2019. https://www.repository.cam.ac.uk/handle/1810/289714.
Snyder, Kristian. "Utilizing Convolutional Neural Networks for Specialized Activity Recognition: Classifying Lower Back Pain Risk Prediction During Manual Lifting." University of Cincinnati / OhioLINK, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1583999458096255.
Mehdi, Nima. "Approches probabilistes pour la perception et l'interprétation de l'activité humaine." Electronic Thesis or Diss., Université de Lorraine, 2024. http://www.theses.fr/2024LORR0202.
Повний текст джерелаFrom industry to services, intelligent systems are required to observe, interact with, or cooperate with humans. This thesis is therefore set in the context of intelligent perception methods for the analysis of humans, using the pose and activity associated with them. Due to the variable and changing nature of humans, it is difficult to obtain an accurate representation of theprocesses guiding their movements and actions. These difficulties are compounded when it comes to estimating or predicting movements or activities. In order to take account of the uncertainty inherent in humans, we propose a Bayesian approach to the perception and analysis of human activity. The first contribution is dedicated to the simultaneous estimation of human pose and posture. Using a monocular camera and wearable sensors, we aim to estimate human 3D pose in real time. For robust estimation, a multimodal fusion approach is suggested, incorporating measurements from wearable inertial sensors with camera observations. In this way, we overcome measurement ambiguities related to the camera and inertial drift due to inertial units. We use a particle filter so as to take into account the non-deterministic nature of human motion and thenon-Gaussian nature of posture. In order to reduce the computational cost, we put forward an architecture composed of two consecutive filters. A first filter estimates the posture in a factorized way from inertial observations only. Then a second filter estimates the complete pose from the camera, incorporating the estimation of the first filter. Our approach achieves fusion by constructing the sampling distribution of the second filter. This architecture makes it possible to estimate pose and posture simultaneously, at low computational cost, and is robust to cloaking and drift. The second contribution pertains to the prediction of human activity. Hidden Markov models have proved effective for the analysis of human activity through segmentation and activity recognition tasks. However, they have modeling limitations that make them insufficient for prediction. We therefore propose the use of semi-Markovian models for prediction. These models extend the definition of Markov models by explicitly modeling the duration spent in each state. This explicit modeling of duration enables better modeling of non-stationary processes and improves the predictive capability of these models. Our study thus demonstrates the usefulness of such models for activity prediction while taking uncertainty into account
Rozman, Peter Andrew. "Multi-Unit Activity in the Human Cortex as a Predictor of Seizure Onset." Thesis, Harvard University, 2015. http://nrs.harvard.edu/urn-3:HUL.InstRepos:15821597.
Karst, Gregory Mark. "Multijoint arm movements: Predictions and observations regarding initial muscle activity at the shoulder and elbow." Diss., The University of Arizona, 1989. http://hdl.handle.net/10150/184920.
Cheradame, Stéphane. "Biomodulation du 5-fluorouracile par l'acide folinique et recherche des facteurs de prédiction de la sensibilité tumorale à cette association." Université Joseph Fourier (Grenoble ; 1971-2015), 1996. http://www.theses.fr/1996GRE10252.
Silva, Joana. "Smartphone Based Human Activity Prediction." Dissertação, 2013. http://hdl.handle.net/10216/74272.
Silva, Joana Raquel Cerqueira da. "Smartphone based human activity prediction." Master's thesis, 2013. http://hdl.handle.net/10216/72620.
Books on the topic "Human Activity Prediction"
Fu, Yun, ed. Human Activity Recognition and Prediction. Cham: Springer International Publishing, 2016. http://dx.doi.org/10.1007/978-3-319-27004-3.
Fu, Yun. Human Activity Recognition and Prediction. Springer London, Limited, 2015.
Andersson, Jenny. The Future as Social Technology. Prediction and the Rise of Futurology. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198814337.003.0005.
Cook, Diane J., and Narayanan C. Krishnan. Activity Learning: Discovering, Recognizing, and Predicting Human Behavior from Sensor Data. John Wiley & Sons, Incorporated, 2015.
Cook, Diane J., and Narayanan C. Krishnan. Activity Learning: Discovering, Recognizing, and Predicting Human Behavior from Sensor Data. John Wiley & Sons, Limited, 2015.
Activity Learning: Discovering, Recognizing, and Predicting Human Behavior from Sensor Data. Wiley, 2015.
Andersson, Jenny. The Future of the World. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198814337.001.0001.
Book chapters on the topic "Human Activity Prediction"
Kong, Yu, and Yun Fu. "Activity Prediction." In Human Activity Recognition and Prediction, 107–22. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27004-3_6.
Li, Kang, and Yun Fu. "Actionlets and Activity Prediction." In Human Activity Recognition and Prediction, 123–51. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27004-3_7.
Kong, Yu, and Yun Fu. "Action Recognition and Human Interaction." In Human Activity Recognition and Prediction, 23–48. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27004-3_2.
Kong, Yu, and Yun Fu. "Introduction." In Human Activity Recognition and Prediction, 1–22. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27004-3_1.
Jia, Chengcheng, and Yun Fu. "Subspace Learning for Action Recognition." In Human Activity Recognition and Prediction, 49–69. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27004-3_3.
Jia, Chengcheng, Wei Pang, and Yun Fu. "Multimodal Action Recognition." In Human Activity Recognition and Prediction, 71–85. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27004-3_4.
Jia, Chengcheng, Yu Kong, Zhengming Ding, and Yun Fu. "RGB-D Action Recognition." In Human Activity Recognition and Prediction, 87–106. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27004-3_5.
Li, Kang, Sheng Li, and Yun Fu. "Time Series Modeling for Activity Prediction." In Human Activity Recognition and Prediction, 153–74. Cham: Springer International Publishing, 2015. http://dx.doi.org/10.1007/978-3-319-27004-3_8.
Friedrich, Björn, and Andreas Hein. "Ensemble Classifier for Nurse Care Activity Prediction Based on Care Records." In Human Activity and Behavior Analysis, 323–32. Boca Raton: CRC Press, 2024. http://dx.doi.org/10.1201/9781003371540-22.
Piergiovanni, A. J., Anelia Angelova, Alexander Toshev, and Michael S. Ryoo. "Adversarial Generative Grammars for Human Activity Prediction." In Computer Vision – ECCV 2020, 507–23. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58536-5_30.
Conference papers on the topic "Human Activity Prediction"
Shete, Amar, Aashita Gupta, Ajay Waghumbare, Upasna Singh, Triveni Dhamale, and Kiran Napte. "Human Activity Prediction Using Generative Adversarial Networks." In 2024 15th International Conference on Computing Communication and Networking Technologies (ICCCNT), 1–6. IEEE, 2024. http://dx.doi.org/10.1109/icccnt61001.2024.10726013.
Sukanya, K., Addagatla Prashanth, and Ugendhar Addagatla. "Development of Human Activity Prediction Systems in Smart Homes." In 2024 IEEE International Conference on Smart Power Control and Renewable Energy (ICSPCRE), 1–6. IEEE, 2024. http://dx.doi.org/10.1109/icspcre62303.2024.10675115.
Nirmala, S., and R. A. Priya. "A Human Activity Determination Predicting Abnormality Using SVM Approach for Mining Field Workers." In 2024 7th International Conference on Circuit Power and Computing Technologies (ICCPCT), 1659–63. IEEE, 2024. http://dx.doi.org/10.1109/iccpct61902.2024.10673253.
Mansoor, Zara, Mustansar Ali Ghazanfar, Syed Muhammad Anwar, Ahmed S. Alfakeeh, and Khaled H. Alyoubi. "Pain Prediction in Humans using Human Brain Activity Data." In Companion of the The Web Conference 2018. New York, New York, USA: ACM Press, 2018. http://dx.doi.org/10.1145/3184558.3186348.
Karthikeyan, M. V., Mohamed Faisal M, and Jithesh R. "Public Human Assault Prediction Using Human Activity Recognition with AI." In 2024 International Conference on Advances in Data Engineering and Intelligent Computing Systems (ADICS). IEEE, 2024. http://dx.doi.org/10.1109/adics58448.2024.10533461.
Ziaeefard, Maryam, Robert Bergevin, and Jean-Francois Lalonde. "Deep Uncertainty Interpretation in Dyadic Human Activity Prediction." In 2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA). IEEE, 2017. http://dx.doi.org/10.1109/icmla.2017.00-55.
Dönnebrink, Robin, Fernando Moya Rueda, Rene Grzeszick, and Maximilian Stach. "Miss-placement Prediction of Multiple On-body Devices for Human Activity Recognition." In iWOAR 2023: 8th international Workshop on Sensor-Based Activity Recognition and Artificial Intelligence. New York, NY, USA: ACM, 2023. http://dx.doi.org/10.1145/3615834.3615838.
Lee, Dong-Gyu, and Seong-Whan Lee. "Human activity prediction based on Sub-volume Relationship Descriptor." In 2016 23rd International Conference on Pattern Recognition (ICPR). IEEE, 2016. http://dx.doi.org/10.1109/icpr.2016.7899939.
Rodrigues, Royston, Neha Bhargava, Rajbabu Velmurugan, and Subhasis Chaudhuri. "Multi-timescale Trajectory Prediction for Abnormal Human Activity Detection." In 2020 IEEE Winter Conference on Applications of Computer Vision (WACV). IEEE, 2020. http://dx.doi.org/10.1109/wacv45572.2020.9093633.
Nagpal, Diana, and Shikha Gupta. "Human Activity Recognition and Prediction: Overview and Research Gaps." In 2023 IEEE 8th International Conference for Convergence in Technology (I2CT). IEEE, 2023. http://dx.doi.org/10.1109/i2ct57861.2023.10126458.
Organization reports on the topic "Human Activity Prediction"
Allen-Dumas, Melissa, Kuldeep Kurte, Haowen Xu, Jibonananda Sanyal, and Guannan Zhang. A Spatiotemporal Sequence Forecasting Platform to Advance the Prediction of Changing Spatiotemporal Patterns of CO2 Concentration by Incorporating Human Activity and Hydrological Extremes. Office of Scientific and Technical Information (OSTI), April 2021. http://dx.doi.org/10.2172/1769653.
Harris, Virginia, Gerald C. Nelson, and Steven Stone. Spatial Econometric Analysis and Project Evaluation: Modeling Land Use Change in the Darién. Inter-American Development Bank, November 1999. http://dx.doi.org/10.18235/0008801.
Alter, Ross, Michelle Swearingen, and Mihan McKenna. The influence of mesoscale atmospheric convection on local infrasound propagation. Engineer Research and Development Center (U.S.), February 2024. http://dx.doi.org/10.21079/11681/48157.
Saville, Alan, and Caroline Wickham-Jones, eds. Palaeolithic and Mesolithic Scotland: Scottish Archaeological Research Framework Panel Report. Society of Antiquaries of Scotland, June 2012. http://dx.doi.org/10.9750/scarf.06.2012.163.
Eparkhina, Dina. EuroSea Legacy Report. EuroSea, 2023. http://dx.doi.org/10.3289/eurosea_d8.12.