Journal articles on the topic "Activities of daily living recognition"

To explore other types of publications on this topic, follow the link: Activities of daily living recognition.


Browse the top 50 journal articles for research on the topic "Activities of daily living recognition".

Next to each entry in the reference list there is an "Add to bibliography" button. Click it and a bibliographic reference to the selected work will be generated automatically in the citation style of your choice: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of a publication as a .pdf file and read its abstract online, when these are available in the metadata.

Browse journal articles across a wide range of disciplines and compile your bibliography correctly.

1

Wu, Jiaxuan, Yunfei Feng, and Peng Sun. "Sensor Fusion for Recognition of Activities of Daily Living." Sensors 18, no. 11 (November 19, 2018): 4029. http://dx.doi.org/10.3390/s18114029.

Full text of the source
APA, Harvard, Vancouver, ISO and other citation styles
Abstract:
Activity of daily living (ADL) is a significant predictor of an individual's independence and functional capabilities. Measurements of ADLs help to indicate one's health status and capacity for quality living. At present, the most common ways to capture ADL data are far from automated: costly 24/7 observation by a designated caregiver, laborious self-reporting by the user, or filling out a written ADL survey. Fortunately, in the Internet of Things (IoT) era, ubiquitous sensors exist in our surroundings and on electronic devices. We propose an ADL Recognition System that utilizes sensor data from a single point of contact, such as a smartphone, and conducts time-series sensor fusion. Raw data is collected by the ADL Recorder App running constantly on a user's smartphone with multiple embedded sensors, including the microphone, Wi-Fi scan module, heading orientation of the device, light proximity, step detector, accelerometer, gyroscope, and magnetometer. Key technologies in this research cover audio processing, Wi-Fi indoor positioning, proximity-sensing localization, and time-series sensor data fusion. By merging the information of multiple sensors with a time-series error-correction technique, the ADL Recognition System is able to accurately profile a person's ADLs and discover his or her life patterns. This paper is particularly concerned with the care of older adults who live independently.
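The paper's actual fusion pipeline is only summarized above; as a hedged illustration of the basic idea of aligning readings from different sensors by time, the following minimal sketch (function name, tolerance, and all data invented for this example) pairs each sample of a primary stream with the nearest-in-time reading of a secondary stream:

```python
def fuse_nearest(primary, secondary, tol=0.5):
    """For each (timestamp, value) sample in `primary`, attach the
    secondary-sensor reading whose timestamp is closest, provided it
    falls within `tol` seconds; otherwise attach None."""
    fused = []
    for t, v in primary:
        best = min(secondary, key=lambda s: abs(s[0] - t))
        fused.append((t, v, best[1] if abs(best[0] - t) <= tol else None))
    return fused

accel = [(0.0, 0.98), (1.0, 1.02), (2.0, 3.50)]   # timestamp (s), |a| in g
audio = [(0.1, "quiet"), (1.9, "footsteps")]       # timestamp (s), audio label
print(fuse_nearest(accel, audio))
```

A real system would also apply the error correction the abstract mentions; this sketch shows only the time alignment that any such fusion must start from.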
2

Ihianle, Isibor Kennedy, Usman Naeem, and Abdel-Rahman Tawil. "Recognition of Activities of Daily Living from Topic Model." Procedia Computer Science 98 (2016): 24–31. http://dx.doi.org/10.1016/j.procs.2016.09.007.

3

Chua, Sook-Ling, Lee Kien Foo, and Hans W. Guesgen. "Predicting Activities of Daily Living with Spatio-Temporal Information." Future Internet 12, no. 12 (November 27, 2020): 214. http://dx.doi.org/10.3390/fi12120214.

Abstract:
The smart home has begun playing an important role in supporting independent living by monitoring the activities of daily living, typically for the elderly who live alone. Activity recognition in smart homes has been studied by many researchers with much effort spent on modeling user activities to predict behaviors. Most people, when performing their daily activities, interact with multiple objects both in space and through time. The interactions between user and objects in the home can provide rich contextual information in interpreting human activity. This paper shows the importance of spatial and temporal information for reasoning in smart homes and demonstrates how such information is represented for activity recognition. Evaluation was conducted on three publicly available smart-home datasets. Our method achieved an average recognition accuracy of more than 81% when predicting user activities given the spatial and temporal information.
4

Ortis, Alessandro, Giovanni M. Farinella, Valeria D’Amico, Luca Addesso, Giovanni Torrisi, and Sebastiano Battiato. "Organizing egocentric videos of daily living activities." Pattern Recognition 72 (December 2017): 207–18. http://dx.doi.org/10.1016/j.patcog.2017.07.010.

5

Refonaa, J., Bandaru Suhas, B. V. S. Bhaskar, S. L. JanyShabu, S. Dhamodaran, Sardar Maran, Maria Anu, and M. Lakshmi. "Fall Detection and Daily Living Activity Recognition Logic Regression." Journal of Computational and Theoretical Nanoscience 17, no. 8 (August 1, 2020): 3520–25. http://dx.doi.org/10.1166/jctn.2020.9223.

Abstract:
With the number of older people in the world increasing, and most of them choosing to live independently and at risk of injury, a fall detection system is a necessity. Falls are dangerous and in some cases can even lead to fatal injuries, so a very robust fall detection system must be built to counter this problem. Here, we establish fall detection and recognition of daily living behaviour through a machine learning system. To detect different types of activities, including falls and day-to-day activities, we use two shared datasets of acceleration and angular-velocity data. Logistic regression is applied to this data to distinguish motions such as falling, walking, climbing, sitting, standing, and lying. More specifically, the mean triaxial acceleration value is used to achieve accurate fall detection.
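The abstract names logistic regression over triaxial acceleration features; as a rough, self-contained sketch of that idea (synthetic data, feature choice, and learning-rate settings are invented here and are not the paper's dataset or exact model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy features per window: [mean |acceleration| (g), peak |acceleration| (g)].
# Falls produce a large acceleration spike; ordinary walking does not.
walking = rng.normal([1.0, 1.3], 0.05, size=(50, 2))
falls   = rng.normal([1.1, 3.0], 0.15, size=(50, 2))
X = np.vstack([walking, falls])
y = np.array([0] * 50 + [1] * 50)  # 0 = daily activity, 1 = fall

# Logistic regression fitted by plain gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid
    w -= 0.5 * (X.T @ (p - y) / len(y))
    b -= 0.5 * float(np.mean(p - y))

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
accuracy = float(np.mean(pred == y))
print(f"training accuracy: {accuracy:.2f}")
```

With well-separated classes like these, the fitted weight on the peak-acceleration feature dominates, which mirrors the abstract's point that acceleration magnitude drives fall detection.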
6

Nguyen, Thi-Hoa-Cuc, Jean-Christophe Nebel, and Francisco Florez-Revuelta. "Recognition of Activities of Daily Living with Egocentric Vision: A Review." Sensors 16, no. 1 (January 7, 2016): 72. http://dx.doi.org/10.3390/s16010072.

7

Salguero, Alberto, Macarena Espinilla, Pablo Delatorre, and Javier Medina. "Using Ontologies for the Online Recognition of Activities of Daily Living." Sensors 18, no. 4 (April 14, 2018): 1202. http://dx.doi.org/10.3390/s18041202.

8

Avgerinakis, Konstantinos, Alexia Briassouli, and Ioannis Kompatsiaris. "Activities of daily living recognition using optimal trajectories from motion boundaries." Journal of Ambient Intelligence and Smart Environments 7, no. 6 (November 20, 2015): 817–34. http://dx.doi.org/10.3233/ais-150347.

9

Cheng, Bo-Chao, Yi-An Tsai, Guo-Tan Liao, and Eui-Seok Byeon. "HMM machine learning and inference for Activities of Daily Living recognition." Journal of Supercomputing 54, no. 1 (October 9, 2009): 29–42. http://dx.doi.org/10.1007/s11227-009-0335-0.

10

Al Huda, Fais, Herman Tolle, and Rosa Andrie Asmara. "Realtime Online Daily Living Activity Recognition Using Head-Mounted Display." International Journal of Interactive Mobile Technologies (iJIM) 11, no. 3 (April 27, 2017): 67. http://dx.doi.org/10.3991/ijim.v11i3.6469.

Abstract:
Human activity recognition is a popular research field whose results can be applied in many other areas, such as the military, commerce, and health. The advent of wearable head-mounted display devices, most notably Google Glass, opens up new possibilities for this research. This study attempts to identify everyday activities, often called ambient activities. The system is developed to run online using a smartphone and a head-mounted display. It achieves an accuracy above 90%, from which we conclude that the system is able to recognize these activities with high accuracy.
11

Mohamed, Samer A., and Uriel Martinez-Hernandez. "A Light-Weight Artificial Neural Network for Recognition of Activities of Daily Living." Sensors 23, no. 13 (June 24, 2023): 5854. http://dx.doi.org/10.3390/s23135854.

Abstract:
Human activity recognition (HAR) is essential for the development of robots to assist humans in daily activities. HAR is required to be accurate, fast and suitable for low-cost wearable devices to ensure portable and safe assistance. Current computational methods can achieve accurate recognition results but tend to be computationally expensive, making them unsuitable for the development of wearable robots in terms of speed and processing power. This paper proposes a light-weight architecture for recognition of activities using five inertial measurement units and four goniometers attached to the lower limb. First, a systematic extraction of time-domain features from wearable sensor data is performed. Second, a small high-speed artificial neural network and line search method for cost function optimization are used for activity recognition. The proposed method is systematically validated using a large dataset composed of wearable sensor data from seven activities (sitting, standing, walking, stair ascent/descent, ramp ascent/descent) associated with eight healthy subjects. The accuracy and speed results are compared against methods commonly used for activity recognition including deep neural networks, convolutional neural networks, long short-term memory and convolutional–long short-term memory hybrid networks. The experiments demonstrate that the light-weight architecture can achieve a high recognition accuracy of 98.60%, 93.10% and 84.77% for seen data from seen subjects, unseen data from seen subjects and unseen data from unseen subjects, respectively, and an inference time of 85 μs. The results show that the proposed approach can perform accurate and fast activity recognition with a reduced computational complexity suitable for the development of portable assistive devices.
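The "systematic extraction of time-domain features" this abstract describes is a standard preprocessing step; a minimal numpy sketch follows (the particular feature set and the test signal are chosen for illustration and are not the paper's exact list):

```python
import numpy as np

def time_domain_features(window):
    """A few common time-domain features computed over one sliding
    window of a single sensor channel."""
    w = np.asarray(window, dtype=float)
    return {
        "mean": float(w.mean()),
        "std": float(w.std()),
        "min": float(w.min()),
        "max": float(w.max()),
        "rms": float(np.sqrt(np.mean(w ** 2))),
        "zero_crossings": int(np.sum(np.diff(np.sign(w)) != 0)),
    }

# A 2 Hz sinusoid over one second stands in for a periodic gait signal.
t = np.linspace(0, 1, 100, endpoint=False)
feats = time_domain_features(np.sin(2 * np.pi * 2 * t))
print(feats)
```

In a pipeline like the one described, a vector of such features per window and per sensor channel would be concatenated and fed to the small neural network.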
12

Peate, Ian. "Activities of living, 4: breathing." British Journal of Healthcare Assistants 18, no. 4 (April 2, 2024): 122–27. http://dx.doi.org/10.12968/bjha.2024.18.4.122.

Abstract:
Breathing is the focus of this article in the series discussing the activities of living (Roper et al, 2000). Breathing is a fundamental and vital activity of living; it plays a crucial role in the process of respiration. It is the mechanism by which organisms exchange gases, primarily oxygen and carbon dioxide, with their environment. This process is essential for the production of energy through cellular respiration, which is crucial for the survival of most aerobic organisms, including humans. This article highlights the recognition that breathing is a fundamental aspect of daily life that contributes to an individual's overall wellbeing. The healthcare assistant (HCA) and assistant practitioner (AP) have a key role to play in assisting people, where needed, with this activity of living.
13

Su, Muchun, Diana Wahyu Hayati, Shaowu Tseng, Jiehhaur Chen, and Hsihsien Wei. "Smart Care Using a DNN-Based Approach for Activities of Daily Living (ADL) Recognition." Applied Sciences 11, no. 1 (December 22, 2020): 10. http://dx.doi.org/10.3390/app11010010.

Abstract:
Health care for independently living elders is more important than ever. Automatic recognition of their Activities of Daily Living (ADL) is the first step towards solving the health care issues faced by seniors in an efficient way. The paper describes a Deep Neural Network (DNN)-based recognition system aimed at facilitating smart care, which combines ADL recognition, image/video processing, movement calculation, and a DNN. An algorithm is developed for processing skeletal data, filtering noise, and pattern recognition to identify the 10 most common ADL, including standing, bending, squatting, sitting, eating, hand holding, hand raising, sitting plus drinking, standing plus drinking, and falling. The evaluation results show that this DNN-based system is a suitable method for ADL recognition, with an accuracy rate of over 95%. The findings support the feasibility of the system, which is efficient enough for both practical and academic applications.
14

Nasreen, Shamila, Muhammad Awais Azam, Usman Naeem, Mustansar Ali Ghazanfar, and Asra Khalid. "Recognition Framework for Inferring Activities of Daily Living Based on Pattern Mining." Arabian Journal for Science and Engineering 41, no. 8 (March 18, 2016): 3113–26. http://dx.doi.org/10.1007/s13369-016-2091-9.

15

Poularakis, Stergios, Konstantinos Avgerinakis, Alexia Briassouli, and Ioannis Kompatsiaris. "Efficient motion estimation methods for fast recognition of activities of daily living." Signal Processing: Image Communication 53 (April 2017): 1–12. http://dx.doi.org/10.1016/j.image.2017.01.005.

16

Ferreira, José M., Ivan Miguel Pires, Gonçalo Marques, Nuno M. García, Eftim Zdravevski, Petre Lameski, Francisco Flórez-Revuelta, Susanna Spinsante, and Lina Xu. "Activities of Daily Living and Environment Recognition Using Mobile Devices: A Comparative Study." Electronics 9, no. 1 (January 18, 2020): 180. http://dx.doi.org/10.3390/electronics9010180.

Abstract:
The recognition of Activities of Daily Living (ADL) with high accuracy, using the sensors available in off-the-shelf mobile devices, is significant for the development of ADL recognition frameworks. Previously, a framework comprising data acquisition, data processing, data cleaning, feature extraction, data fusion, and data classification was proposed. However, the results may be improved with the implementation of other methods. Similar to the initial proposal of the framework, this paper proposes the recognition of eight ADL, i.e., walking, running, standing, going upstairs, going downstairs, driving, sleeping, and watching television, and nine environments, i.e., bar, hall, kitchen, library, street, bedroom, living room, gym, and classroom, but using the Instance-Based k-nearest neighbour (IBk) and AdaBoost methods as well. The primary purpose of this paper is to find the best machine learning method for ADL and environment recognition. The results obtained show that IBk and AdaBoost reported better results with complex data than the deep neural network methods.
17

Howedi, Aadel, Ahmad Lotfi, and Amir Pourabdollah. "Exploring Entropy Measurements to Identify Multi-Occupancy in Activities of Daily Living." Entropy 21, no. 4 (April 19, 2019): 416. http://dx.doi.org/10.3390/e21040416.

Abstract:
Human Activity Recognition (HAR) is the process of automatically detecting human actions from the data collected from different types of sensors. Research related to HAR has devoted particular attention to monitoring and recognizing the human activities of a single occupant in a home environment, in which it is assumed that only one person is present at any given time. Recognition of the activities is then used to identify any abnormalities within the routine activities of daily living. Despite the assumption in the published literature, living environments are commonly occupied by more than one person and/or accompanied by pet animals. In this paper, a novel method based on different entropy measures, including Approximate Entropy (ApEn), Sample Entropy (SampEn), and Fuzzy Entropy (FuzzyEn), is explored to detect and identify a visitor in a home environment. The research has mainly focused on when another individual visits the main occupier, and it is, therefore, not possible to distinguish between their movement activities. The goal of this research is to assess whether entropy measures can be used to detect and identify the visitor in a home environment. Once the presence of the main occupier is distinguished from others, the existing activity recognition and abnormality detection processes could be applied for the main occupier. The proposed method is tested and validated using two different datasets. The results obtained from the experiments show that the proposed method could be used to detect and identify a visitor in a home environment with a high degree of accuracy based on the data collected from the occupancy sensors.
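Of the entropy measures this abstract lists, Sample Entropy is the easiest to sketch. Below is a compact, self-contained implementation (a common textbook formulation, not necessarily the paper's exact variant), applied to a regular signal standing in for a single occupant's routine versus an irregular one standing in for a visitor's disturbance:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample Entropy: -ln(A/B), where B counts pairs of length-m template
    vectors within tolerance r (Chebyshev distance, self-matches excluded)
    and A counts the same for length m+1. Lower values = more regular signal."""
    x = np.asarray(x, dtype=float)
    r = r * np.std(x)

    def count_matches(m):
        templates = np.array([x[i:i + m] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += int(np.sum(dist <= r)) - 1   # exclude the self-match
        return count

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 20 * np.pi, 400))   # highly regular pattern
irregular = rng.standard_normal(400)                 # irregular pattern
e_reg, e_irr = sample_entropy(regular), sample_entropy(irregular)
print(e_reg, e_irr)
```

The regular signal yields a much lower entropy than the irregular one, which is the property the paper exploits to flag the presence of a visitor.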
18

Martinelli, Alessio, Simone Morosi, and Enrico Del Re. "Daily Living Movement Recognition for Pedestrian Dead Reckoning Applications." Mobile Information Systems 2016 (2016): 1–13. http://dx.doi.org/10.1155/2016/7128201.

Abstract:
Nowadays, activity recognition is a central topic in numerous applications such as patient and sport activity monitoring, surveillance, and navigation. By focusing on the latter, in particular Pedestrian Dead Reckoning navigation systems, activity recognition is generally exploited to get landmarks on the map of the buildings in order to permit the calibration of the navigation routines. The present work aims to provide a contribution to the definition of a more effective movement recognition for Pedestrian Dead Reckoning applications. The signal acquired by a belt-mounted triaxial accelerometer is considered as the input to the movement segmentation procedure which exploits Continuous Wavelet Transform to detect and segment cyclic movements such as walking. Furthermore, the segmented movements are provided to a supervised learning classifier in order to distinguish between activities such as walking and walking downstairs and upstairs. In particular, four supervised learning classification families are tested: decision tree, Support Vector Machine, k-nearest neighbour, and Ensemble Learner. Finally, the accuracy of the considered classification models is evaluated and the relative confusion matrices are presented.
19

Dorronzoro Zubiete, Enrique, Keigo Nakahata, Nevrez Imamoglu, Masashi Sekine, Guanghao Sun, Isabel Gomez, and Wenwei Yu. "Evaluation of a Home Biomonitoring Autonomous Mobile Robot." Computational Intelligence and Neuroscience 2016 (2016): 1–8. http://dx.doi.org/10.1155/2016/9845816.

Abstract:
An increasingly ageing population demands more services in the healthcare domain. It has been shown that mobile robots could be a potential solution to home biomonitoring for the elderly. In our previous studies, a mobile robot system able to track a subject and identify his daily living activities was developed. However, the system had not been tested in any home living scenarios. In this study we performed a series of experiments to investigate the accuracy of activity recognition of the mobile robot in a home living scenario. The daily activities tested in the evaluation experiment included watching TV and sleeping. A dataset recorded by a distributed distance-measuring sensor network was used as a reference for the activity recognition results. It was shown that the accuracy is not consistent across activities; that is, the mobile robot could achieve a high success rate for some activities but a poor one for others. It was found that the observation position of the mobile robot and the subject's surroundings have a high impact on the accuracy of activity recognition, due to the variability of home living daily activities and their transitional processes. The possibility of improving recognition accuracy is shown as well.
20

Pires, Ivan Miguel, Gonçalo Marques, Nuno M. Garcia, Nuno Pombo, Francisco Flórez-Revuelta, Susanna Spinsante, Maria Canavarro Teixeira, and Eftim Zdravevski. "Recognition of Activities of Daily Living and Environments Using Acoustic Sensors Embedded on Mobile Devices." Electronics 8, no. 12 (December 7, 2019): 1499. http://dx.doi.org/10.3390/electronics8121499.

Abstract:
The identification of Activities of Daily Living (ADL) is intrinsically linked to the recognition of the user's environment. This detection can be executed through standard sensors present in everyday mobile devices. On the one hand, the main proposal is to recognize the user's environment and standing activities. On the other hand, these features are included in a framework for ADL and environment identification. Therefore, this paper is divided into two parts: firstly, acoustic sensors are used for the collection of data towards recognition of the environment and, secondly, the information on the recognized environment is fused with the information gathered by motion and magnetic sensors. The environment and ADL recognition are performed by pattern recognition techniques that aim at the development of a system including data collection, processing, fusion, and classification procedures. The classification techniques include distinct types of Artificial Neural Networks (ANN), analyzing various implementations of ANN and choosing the most suitable for further inclusion in the different stages of the developed system. The results show 85.89% accuracy using Deep Neural Networks (DNN) with normalized data for ADL recognition and 86.50% accuracy using Feedforward Neural Networks (FNN) with non-normalized data for environment recognition. Furthermore, the tests conducted show 100% accuracy for standing-activity recognition using DNN with normalized data, which is the most suited for the intended purpose.
21

Nguyen, Nhan Duc, Duong Trong Bui, Phuc Huu Truong, and Gu-Min Jeong. "Position-Based Feature Selection for Body Sensors regarding Daily Living Activity Recognition." Journal of Sensors 2018 (September 13, 2018): 1–13. http://dx.doi.org/10.1155/2018/9762098.

Abstract:
This paper proposes a novel approach to recognize activities based on sensor-placement feature selection. The method is designed to address a problem of multisensor fusion information of wearable sensors which are located in different positions of a human body. Precisely, the approach can extract the best feature set that characterizes each activity regarding a body-sensor location to recognize daily living activities. We firstly preprocess the raw data by utilizing a low-pass filter. After extracting various features, feature selection algorithms are applied separately on feature sets of each sensor to obtain the best feature set for each body position. Then, we investigate the correlation of the features in each set to optimize the feature set. Finally, a classifier is applied to an optimized feature set, which contains features from four body positions to classify thirteen activities. In experimental results, we obtain an overall accuracy of 99.13% by applying the proposed method to the benchmark dataset. The results show that we can reduce the computation time for the feature selection step and achieve a high accuracy rate by performing feature selection for the placement of each sensor. In addition, our proposed method can be used for a multiple-sensor configuration to classify activities of daily living. The method is also expected to deploy to an activity classification system-based big data platform since each sensor node only sends essential information characterizing itself to a cloud server.
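The feature selection algorithms used per body position are not named in this abstract; as a hedged stand-in, a simple filter-style selector that ranks one sensor's features by absolute correlation with the activity label might look like this (all data and the ranking criterion are invented for illustration):

```python
import numpy as np

def select_top_features(X, y, k):
    """Rank features by absolute Pearson correlation with the activity
    label and keep the indices of the top k."""
    scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(2)
y = rng.integers(0, 2, size=200).astype(float)      # two activity classes
informative = y + 0.1 * rng.standard_normal(200)    # feature that tracks the label
noise1 = rng.standard_normal(200)                   # irrelevant features
noise2 = rng.standard_normal(200)
X = np.column_stack([noise1, informative, noise2])  # one body position's feature set

print(select_top_features(X, y, k=1))   # expected: [1]
```

In the paper's setting, a selector like this would run separately on each body position's feature set before the per-position results are combined, which is what reduces the computation reported in the abstract.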
22

Cavalcante, Ariany F., Victor H. de L. Kunst, Thiago de M. Chaves, Júlia D. T. de Souza, Isabela M. Ribeiro, Jonysberg P. Quintino, Fabio Q. B. da Silva, André L. M. Santos, Veronica Teichrieb, and Alana Elza F. da Gama. "Deep Learning in the Recognition of Activities of Daily Living Using Smartwatch Data." Sensors 23, no. 17 (August 29, 2023): 7493. http://dx.doi.org/10.3390/s23177493.

Abstract:
The recognition of human activities (HAR) using wearable device data, such as smartwatches, has gained significant attention in the field of computer science due to its potential to provide insights into individuals’ daily activities. This article aims to conduct a comparative study of deep learning techniques for recognizing activities of daily living (ADL). A mapping of HAR techniques was performed, and three techniques were selected for evaluation, along with a dataset. Experiments were conducted using the selected techniques to assess their performance in ADL recognition, employing standardized evaluation metrics, such as accuracy, precision, recall, and F1-score. Among the evaluated techniques, the DeepConvLSTM architecture, consisting of recurrent convolutional layers and a single LSTM layer, achieved the most promising results. These findings suggest that software applications utilizing this architecture can assist smartwatch users in understanding their movement routines more quickly and accurately.
23

Fleury, Anthony, Norbert Noury, and Michel Vacher. "Improving Supervised Classification of Activities of Daily Living Using Prior Knowledge." International Journal of E-Health and Medical Communications 2, no. 1 (January 2011): 17–34. http://dx.doi.org/10.4018/jehmc.2011010102.

Abstract:
The increase in life expectancy is producing a bottleneck at the entry to care institutions. Telemedicine therefore becomes a timely solution and is being widely explored for the care of elderly people living independently at home. It requires identifying the behaviours and activities of the person at home with non-intrusive sensors, and processing the data to detect the main trends in health status. This paper presents the results of a study on the introduction of prior knowledge into Support Vector Machines to improve the automatic recognition of Activities of Daily Living. From a set of activities performed in the experimental smart home in Grenoble, the authors obtained models for seven Activities of Daily Living and tested the performance of this classification with the introduction of spatial and temporal priors. Finally, the different results are discussed.
24

Meditskos, Georgios, and Ioannis Kompatsiaris. "iKnow: Ontology-driven situational awareness for the recognition of activities of daily living." Pervasive and Mobile Computing 40 (September 2017): 17–41. http://dx.doi.org/10.1016/j.pmcj.2017.05.003.

25

Onthoni, Djeane Debora, and Prasan Kumar Sahoo. "Artificial-Intelligence-Assisted Activities of Daily Living Recognition for Elderly in Smart Home." Electronics 11, no. 24 (December 11, 2022): 4129. http://dx.doi.org/10.3390/electronics11244129.

Abstract:
Activity Recognition (AR) is a method to identify a certain activity from a set of actions. It is commonly used to recognize a set of Activities of Daily Living (ADLs) performed by the elderly in a smart home environment. AR can be beneficial for monitoring an elder's health condition, where the information can be further shared with family members, caretakers, or doctors. Due to the unpredictable behaviour of an elderly person, the performance of ADLs can vary from day to day: each activity may be performed differently, which can affect the sequence of the sensor's raw data. Because of this, recognizing ADLs from a sensor's raw data remains a challenge. In this paper, we propose an Activity Recognition method for the prediction of Activities of Daily Living using an Artificial Intelligence approach. Data acquisition techniques and a modified Naive Bayes supervised learning algorithm are used to design the prediction model for ADLs. Our experimental results establish that the proposed method can achieve high accuracy in comparison to other well-established supervised learning algorithms.
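The paper's modification of Naive Bayes is not detailed in the abstract; for orientation, here is a plain Gaussian Naive Bayes baseline in numpy (class names, features, and data are invented toy examples, not the paper's sensors):

```python
import numpy as np

class GaussianNB:
    """Minimal Gaussian Naive Bayes: per-class feature means/variances
    plus class priors, combined as log-likelihoods at prediction time."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.stats = {}
        for c in self.classes:
            Xc = X[y == c]
            self.stats[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
        return self

    def predict(self, X):
        scores = []
        for c in self.classes:
            mu, var, prior = self.stats[c]
            log_lik = -0.5 * np.sum(np.log(2 * np.pi * var) + (X - mu) ** 2 / var, axis=1)
            scores.append(log_lik + np.log(prior))
        return self.classes[np.argmax(scores, axis=0)]

rng = np.random.default_rng(3)
# Toy sensor features for two ADLs: [motion level, kitchen-sensor activations].
sleeping = rng.normal([0.1, 0.0], 0.05, size=(40, 2))
cooking  = rng.normal([0.6, 3.0], 0.30, size=(40, 2))
X = np.vstack([sleeping, cooking])
y = np.array(["sleeping"] * 40 + ["cooking"] * 40)

model = GaussianNB().fit(X, y)
print(model.predict(np.array([[0.12, 0.1], [0.7, 2.5]])))
```

A "modified" variant as in the paper would adjust some part of this pipeline (e.g. the likelihood model or priors); the baseline above is only the common starting point.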
26

Ehatisham-ul-Haq, Muhammad, Fiza Murtaza, Muhammad Awais Azam, and Yasar Amin. "Daily Living Activity Recognition In-The-Wild: Modeling and Inferring Activity-Aware Human Contexts." Electronics 11, no. 2 (January 12, 2022): 226. http://dx.doi.org/10.3390/electronics11020226.

Abstract:
Advancements in smart sensing and computing technologies have provided a dynamic opportunity to develop intelligent systems for human activity monitoring and thus assisted living. Consequently, many researchers have put their efforts into implementing sensor-based activity recognition systems. However, recognizing people's natural behavior and physical activities with diverse contexts is still a challenging problem, because human physical activities are often distracted by changes in their surroundings/environments. In addition to physical activity recognition, it is therefore also vital to model and infer the user's context information in order to better capture human-environment interactions. This research paper proposes a new idea for activity recognition in-the-wild, which entails modeling and identifying detailed human contexts (such as human activities, behavioral environments, and phone states) using portable accelerometer sensors. The proposed scheme offers a detailed, fine-grained representation of natural human activities with contexts, which is crucial for effectively modeling human-environment interactions in context-aware applications and systems. The proposed idea is validated using a series of experiments, and it achieves an average balanced accuracy of 89.43%, which proves its effectiveness.
27

Woznowski, Przemysław, Emma Tonkin, and Peter Flach. "Activities of Daily Living Ontology for Ubiquitous Systems: Development and Evaluation." Sensors 18, no. 7 (July 20, 2018): 2361. http://dx.doi.org/10.3390/s18072361.

Abstract:
Ubiquitous eHealth systems based on sensor technologies are seen as key enablers in the effort to reduce the financial impact of an ageing society. At the heart of such systems sit activity recognition algorithms, which need sensor data to reason over, and a ground truth of adequate quality used for training and validation purposes. The large set up costs of such research projects and their complexity limit rapid developments in this area. Therefore, information sharing and reuse, especially in the context of collected datasets, is key in overcoming these barriers. One approach which facilitates this process by reducing ambiguity is the use of ontologies. This article presents a hierarchical ontology for activities of daily living (ADL), together with two use cases of ground truth acquisition in which this ontology has been successfully utilised. Requirements placed on the ontology by ongoing work are discussed.
28

Bhattacharya, Sarnab, Rebecca Adaimi, and Edison Thomaz. "Leveraging Sound and Wrist Motion to Detect Activities of Daily Living with Commodity Smartwatches." Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6, no. 2 (July 4, 2022): 1–28. http://dx.doi.org/10.1145/3534582.

Abstract:
Automatically recognizing a broad spectrum of human activities is key to realizing many compelling applications in health, personal assistance, human-computer interaction and smart environments. However, in real-world settings, approaches to human action perception have been largely constrained to detecting mobility states, e.g., walking, running, standing. In this work, we explore the use of inertial-acoustic sensing provided by off-the-shelf commodity smartwatches for detecting activities of daily living (ADLs). We conduct a semi-naturalistic study with a diverse set of 15 participants in their own homes and show that acoustic and inertial sensor data can be combined to recognize 23 activities such as writing, cooking, and cleaning with high accuracy. We further conduct a completely in-the-wild study with 5 participants to better evaluate the feasibility of our system in practical unconstrained scenarios. We comprehensively studied various baseline machine learning and deep learning models with three different fusion strategies, demonstrating the benefit of combining inertial and acoustic data for ADL recognition. Our analysis underscores the feasibility of high-performing recognition of daily activities using inertial-acoustic data from practical off-the-shelf wrist-worn devices while also uncovering challenges faced in unconstrained settings. We encourage researchers to use our public dataset to further push the boundary of ADL recognition in-the-wild.
29

Belmonte-Fernández, Óscar, Antonio Caballer-Miedes, Eris Chinellato, Raúl Montoliu, Emilio Sansano-Sansano, and Rubén García-Vidal. "Anomaly Detection in Activities of Daily Living with Linear Drift." Cognitive Computation 12, no. 6 (July 1, 2020): 1233–51. http://dx.doi.org/10.1007/s12559-020-09740-6.

30

Lee, Jaeryoung, and Nicholas Melo. "Habit Representation Based on Activity Recognition." Sensors 20, no. 7 (March 30, 2020): 1928. http://dx.doi.org/10.3390/s20071928.

Abstract:
With the increasing elderly population, attention has been drawn to the development of applications for habit assessment using activity data from smart environments that can be implemented in care facilities. In this paper, we introduce a novel habit assessment method based on information of human activities. First, a recognition system tracks the user’s activities of daily living by collecting data from multiple object sensors and ambient sensors that are distributed within the environment. Based on this information, the activities of daily living are expressed using a Fourier series representation. The durations and sequence of the activities are represented by the phases and amplitudes of the harmonics. In this manner, each sequence is represented in a form that we refer to as a behavioral spectrum. The signals are then clustered to find habits; by calculating the variability and comparing the explained variance, the types of habits are identified. For an evaluation, two datasets (young and elderly population) were used, and the results showed the potential habits of each group. The outcomes of this study can help improve and expand the applications of smart homes.
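The Fourier-series idea can be sketched by treating a day's activity sequence as a step signal and keeping the first harmonics, whose amplitudes and phases loosely encode durations and ordering (the activity codes and per-minute sampling below are assumptions, not the paper's exact formulation):

```python
import numpy as np

# Hypothetical encoding: each activity label maps to an integer level,
# and the day is sampled once per minute.
ACTIVITIES = {"sleep": 0, "cook": 1, "eat": 2, "tv": 3}

def behavioral_spectrum(day, n_harmonics=8):
    """Return amplitudes and phases of the first harmonics of a day's
    activity step signal, a rough analogue of the 'behavioral spectrum'
    described above. `day` is a list of (activity, duration_minutes)."""
    signal = np.concatenate([np.full(d, ACTIVITIES[a]) for a, d in day])
    coeffs = np.fft.rfft(signal) / len(signal)   # normalized spectrum
    harmonics = coeffs[1:n_harmonics + 1]        # skip the DC term
    return np.abs(harmonics), np.angle(harmonics)

amp, phase = behavioral_spectrum(
    [("sleep", 480), ("cook", 30), ("eat", 30), ("tv", 120)])
```

Days with similar routines then yield nearby points in this amplitude/phase space, which is what a clustering step could exploit.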
31

Javeed, Madiha, Naif Al Mudawi, Abdulwahab Alazeb, Sultan Almakdi, Saud S. Alotaibi, Samia Allaoua Chelloug, and Ahmad Jalal. "Intelligent ADL Recognition via IoT-Based Multimodal Deep Learning Framework." Sensors 23, no. 18 (September 16, 2023): 7927. http://dx.doi.org/10.3390/s23187927.

Abstract:
Smart home monitoring systems via the internet of things (IoT) are required for taking care of elders at home. They provide the flexibility of monitoring elders remotely for their families and caregivers. Activities of daily living are an efficient way to effectively monitor elderly people at home and patients at caregiving facilities. The monitoring of such actions depends largely on IoT-based devices, either wireless or installed at different places. This paper proposes an effective and robust layered architecture using multisensory devices to recognize the activities of daily living from anywhere. Multimodality refers to sensory devices of multiple types working together to achieve the objective of remote monitoring. The proposed multimodal approach therefore fuses data from IoT devices, such as wearable inertial sensors, with videos recorded during daily routines. The data from these multiple sensors are processed in a pre-processing layer through several stages, such as data filtration, segmentation, landmark detection, and 2D stick modeling. In the next layer, called feature processing, different features from the multimodal sensors are extracted, fused, and optimized. The final layer, called classification, recognizes the activities of daily living via a deep learning technique known as a convolutional neural network. The proposed IoT-based multimodal layered system achieves an acceptable mean accuracy rate of 84.14%.
32

Iseda, Hikoto, Keiichi Yasumoto, Akira Uchiyama, and Teruo Higashino. "Daily Living Activity Recognition with Frequency-Shift WiFi Backscatter Tags." Sensors 24, no. 11 (May 21, 2024): 3277. http://dx.doi.org/10.3390/s24113277.

Abstract:
To provide diverse in-home services like elderly care, versatile activity recognition technology is essential. Radio-based methods, including WiFi CSI, RFID, and backscatter communication, are preferred due to their minimal privacy intrusion, reduced physical burden, and low maintenance costs. However, these methods face challenges, including environmental dependence, proximity limitations between the device and the user, and untested accuracy amidst various radio obstacles such as furniture, appliances, walls, and other radio waves. In this paper, we propose a frequency-shift backscatter tag-based in-home activity recognition method and test its feasibility in a near-real residential setting. Consisting of simple components such as antennas and switches, these tags facilitate ultra-low power consumption and demonstrate robustness against environmental noise, because the context corresponding to a tag can be obtained by observing frequency shifts alone. We implemented a sensing system consisting of SD-WiFi, a software-defined WiFi AP, and physical switches on backscatter tags tailored for detecting the movements of daily objects. Our experiments demonstrate that frequency shifts by tags can be detected within a 2 m range with 72% accuracy under line-of-sight (LoS) conditions and achieve a 96.0% accuracy (F-score) in recognizing seven typical daily living activities with an appropriate receiver/transmitter layout. Furthermore, in an additional experiment, we confirmed that increasing the number of overlaying packets enables frequency-shift detection even without LoS at distances of 3–5 m.
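At its core, detecting a frequency-shift tag amounts to finding a spectral peak at the tag's switching frequency in the received signal; a highly simplified sketch, where the baseband signal model and all parameters are assumptions for illustration only:

```python
import numpy as np

def dominant_shift(rx, fs):
    """Return the dominant frequency (Hz) of a received baseband signal.
    A tag toggling its switch at some offset frequency would appear as a
    spectral peak at that offset; this is a toy view of frequency-shift
    backscatter detection, not the paper's SD-WiFi pipeline."""
    windowed = rx * np.hanning(len(rx))        # reduce spectral leakage
    spec = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(rx), 1.0 / fs)
    return freqs[np.argmax(spec)]

# Example: a synthetic 50 Hz tone stands in for a tag's frequency shift.
fs = 1000
t = np.arange(1024) / fs
rx = np.sin(2 * np.pi * 50 * t)
```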
33

Camp, Nicola, Martin Lewis, Kirsty Hunter, Julie Johnston, Massimiliano Zecca, Alessandro Di Nuovo, and Daniele Magistro. "Technology Used to Recognize Activities of Daily Living in Community-Dwelling Older Adults." International Journal of Environmental Research and Public Health 18, no. 1 (December 28, 2020): 163. http://dx.doi.org/10.3390/ijerph18010163.

Abstract:
The use of technology has been suggested as a means of allowing continued autonomous living for older adults, while reducing the burden on caregivers and aiding decision-making relating to healthcare. However, more clarity is needed relating to the Activities of Daily Living (ADL) recognised, and the types of technology included within current monitoring approaches. This review aims to identify these differences and highlight the current gaps in these systems. A scoping review was conducted in accordance with PRISMA-ScR, drawing on PubMed, Scopus, and Google Scholar. Articles and commercially available systems were selected if they focused on ADL recognition of older adults within their home environment. Thirty-nine ADL recognition systems were identified, nine of which were commercially available. One system incorporated environmental and wearable technology, two used only wearable technology, and 34 used only environmental technologies. Overall, 14 ADL were identified but there was variation in the specific ADL recognised by each system. Although the use of technology to monitor ADL of older adults is becoming more prevalent, there is a large variation in the ADL recognised, how ADL are defined, and the types of technology used within monitoring systems. Key stakeholders, such as older adults and healthcare workers, should be consulted in future work to ensure that future developments are functional and usable.
34

Salguero, Alberto G., Pablo Delatorre, Javier Medina, Macarena Espinilla, and Antonio J. Tomeu. "Ontology-Based Framework for the Automatic Recognition of Activities of Daily Living Using Class Expression Learning Techniques." Scientific Programming 2019 (April 1, 2019): 1–19. http://dx.doi.org/10.1155/2019/2917294.

Abstract:
The miniaturization and price reduction of sensors have encouraged the proliferation of smart environments, in which multitudinous sensors detect and describe the activities carried out by inhabitants. In this context, the recognition of activities of daily living has represented one of the most developed research areas in recent years. Its objective is to determine what daily activity is performed by the inhabitants of a smart environment. In this field, many proposals have been presented in the literature, many of them being based on ad hoc ontologies to formalize logical rules, which hinders their reuse in other contexts. In this work, we propose the use of class expression learning (CEL), an ontology-based data mining technique, for the recognition of ADL. This technique is based on combining the entities in the ontology, trying to find the expressions that best describe those activities. As far as we know, it is the first time that this technique is applied to this problem. To evaluate the performance of CEL for the automatic recognition of activities, we have first developed a framework that is able to convert many of the available datasets to all the ontology models we have found in the literature for dealing with ADL. Two different CEL algorithms have been employed for the recognition of eighteen activities in two different datasets. Although all the available ontologies in the literature are focused on the description of the context of the activities, the results show that the sequence of the events produced by the sensors is, in general terms, more relevant for their automatic recognition.
35

Pires, Ivan Miguel, Gonçalo Marques, Nuno M. Garcia, Francisco Flórez-Revuelta, Maria Canavarro Teixeira, Eftim Zdravevski, Susanna Spinsante, and Miguel Coimbra. "Pattern Recognition Techniques for the Identification of Activities of Daily Living Using a Mobile Device Accelerometer." Electronics 9, no. 3 (March 19, 2020): 509. http://dx.doi.org/10.3390/electronics9030509.

Abstract:
The application of pattern recognition techniques to data collected from accelerometers available in off-the-shelf devices, such as smartphones, allows for the automatic recognition of activities of daily living (ADLs). This data can be used later to create systems that monitor the behaviors of their users. The main contribution of this paper is to use artificial neural networks (ANN) for the recognition of ADLs with the data acquired from the sensors available in mobile devices. Firstly, before ANN training, the mobile device is used for data collection. After training, mobile devices are used to apply an ANN previously trained for the identification of ADLs on a less restrictive computational platform. The motivation is to verify whether the overfitting problem can be solved using only the accelerometer data, which also requires fewer computational resources and reduces the energy expenditure of the mobile device when compared with the use of multiple sensors. This paper presents a method based on ANN for the recognition of a defined set of ADLs. It provides a comparative study of different implementations of ANN to choose the most appropriate method for ADLs identification. The results show an accuracy of 85.89% using deep neural networks (DNN).
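Before any ANN is trained, the accelerometer stream is typically windowed and summarized into feature vectors; a minimal sketch of that stage, with an illustrative feature set and window sizes rather than the paper's exact choices:

```python
import numpy as np

def window_features(acc, win=128, step=64):
    """Slide a window over an (N, 3) accelerometer stream and extract
    simple per-axis statistics plus magnitude statistics, the kind of
    features typically fed to an ANN for ADL recognition. The feature
    set and window parameters are illustrative assumptions."""
    feats = []
    for start in range(0, len(acc) - win + 1, step):
        w = acc[start:start + win]
        mag = np.linalg.norm(w, axis=1)           # per-sample magnitude
        feats.append(np.concatenate([
            w.mean(axis=0),                       # 3 per-axis means
            w.std(axis=0),                        # 3 per-axis std devs
            [mag.mean(), mag.std()],              # 2 magnitude stats
        ]))
    return np.array(feats)
```

Each row of the returned array is one training example for whatever classifier follows.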
36

Espinilla, Macarena, Javier Medina, and Chris Nugent. "UCAmI Cup. Analyzing the UJA Human Activity Recognition Dataset of Activities of Daily Living." Proceedings 2, no. 19 (October 26, 2018): 1267. http://dx.doi.org/10.3390/proceedings2191267.

Abstract:
Many real-world applications, which are focused on addressing the needs of a human, require information pertaining to the activities being performed. The UCAmI Cup is an event held within the context of the International Conference on Ubiquitous Computing and Ambient Intelligence, where delegates are given the opportunity to use their tools and techniques to analyse a previously unseen human activity recognition dataset and to compare their results with others working in the same domain. In this paper, the human activity recognition dataset used relates to activities of daily living generated in the UJAmI Smart Lab, University of Jaén. The dataset chosen for the first edition of the UCAmI Cup represents 246 activities performed by a single inhabitant over a period of ten days. The dataset includes four data sources: (i) event streams from 30 binary sensors, (ii) intelligent floor location data, (iii) proximity data between a smart watch worn by the inhabitant and 15 Bluetooth Low Energy beacons and (iv) acceleration of the smart watch. In this first edition of the UCAmI Cup, 26 participants from 10 different countries contacted the organizers to obtain the dataset.
37

Saint-Maurice, Pedro De, Andres M. Calabró, and Gregory J. Welk. "Validation Of A Pattern-recognition Activity Monitor In Older Adults During Daily Living Activities." Medicine & Science in Sports & Exercise 41 (May 2009): 544. http://dx.doi.org/10.1249/01.mss.0000356208.09006.b3.

38

Bensamoun, D., R. David, D. Alexandre, V. Marie, and P. Robert. "537 – Comparative study of postural recognition from four actigraphs during activities of daily living." European Psychiatry 28 (January 2013): 1. http://dx.doi.org/10.1016/s0924-9338(13)75833-4.

39

Pires, Ivan Miguel, Maria Canavarro Teixeira, Nuno Pombo, Nuno M. Garcia, Francisco Flórez-Revuelta, Susanna Spinsante, Rossitza Goleva, and Eftim Zdravevski. "Android Library for Recognition of Activities of Daily Living: Implementation Considerations, Challenges, and Solutions." Open Bioinformatics Journal 11, no. 1 (May 22, 2018): 61–88. http://dx.doi.org/10.2174/1875036201811010061.

Abstract:
Background: Off-the-shelf mobile devices have several sensors available onboard that may be used for the recognition of Activities of Daily Living (ADL) and the environments where they are performed. This research is focused on the development of Ambient Assisted Living (AAL) systems, using mobile devices for the acquisition of different types of data related to the physical and physiological conditions of the subjects and the environments. Mobile devices with the Android operating system are the least expensive, have the largest market share, and provide a variety of models and onboard sensors. Objective: This paper describes the implementation considerations, challenges, and solutions for a framework for the recognition of ADL and the environments, provided as an Android library. The framework adapts to the number of sensors available in different mobile devices and utilizes a variety of activity recognition algorithms to provide rapid feedback to the user. Methods: The Android library includes data fusion, data processing, feature engineering, and classification methods. The sensors that may be used are the accelerometer, the gyroscope, the magnetometer, the Global Positioning System (GPS) receiver, and the microphone. The data processing includes the application of data cleaning methods and the extraction of features, which are used with Deep Neural Networks (DNN) for the classification of ADL and environment. Throughout this work, the limitations of the mobile devices were explored and their effects minimized. Results: The implementation of the Android library reported an overall accuracy between 58.02% and 89.15%, depending on the number of sensors used and the number of ADL and environments recognized. Compared with the results available in the literature, the library's performance shows a mean improvement of 2.93% and does not differ significantly from the best results of prior work, based on Student's t-test. Conclusion: This study proves that ADL such as walking, going upstairs and downstairs, running, watching TV, driving, sleeping, and standing, as well as the bedroom, cooking/kitchen, gym, classroom, hall, living room, bar, library, and street environments, may be recognized with the sensors available in off-the-shelf mobile devices. Finally, these results may serve as preliminary research for the development of a personal digital life coach with a multi-sensor mobile device commonly used daily.
40

Bouchabou, Damien, Juliette Grosset, Sao Mai Nguyen, Christophe Lohr, and Xavier Puig. "A Smart Home Digital Twin to Support the Recognition of Activities of Daily Living." Sensors 23, no. 17 (September 1, 2023): 7586. http://dx.doi.org/10.3390/s23177586.

Abstract:
One of the challenges in the field of human activity recognition in smart homes based on IoT sensors is the variability in the recorded data. This variability arises from differences in home configurations, sensor network setups, and the number and habits of inhabitants, resulting in a lack of data that accurately represent the application environment. Although simulators have been proposed in the literature to generate data, they fail to bridge the gap between training and field data or produce diverse datasets. In this article, we propose a solution to address this issue by leveraging the concept of digital twins to reduce the disparity between training and real-world data and generate more varied datasets. We introduce the Virtual Smart Home, a simulator specifically designed for modeling daily life activities in smart homes, which is adapted from the Virtual Home simulator. To assess its realism, we compare a set of activity data recorded in a real-life smart apartment with its replication in the VirtualSmartHome simulator. Additionally, we demonstrate that an activity recognition algorithm trained on the data generated by the VirtualSmartHome simulator can be successfully validated using real-life field data.
41

Najjar, Mehdi, François Courtemanche, Habib Hamam, Alexandre Dion, and Jéremy Bauchet. "Intelligent Recognition of Activities of Daily Living for Assisting Memory and/or Cognitively Impaired Elders in Smart Homes." International Journal of Ambient Computing and Intelligence 1, no. 4 (October 2009): 46–62. http://dx.doi.org/10.4018/jaci.2009062204.

Abstract:
The article describes a recognition approach of undertaken activities of daily living (ADLs) performed by memory and/or cognitively impaired elders in smart homes. The proposed technique is materialized via a recognition module inserted in a modular generic architecture which aims to offer a framework to conceive intelligent ADLs assistance systems.
42

Lye, Mohd Haris, Nouar AlDahoul, and Hezerul Abdul Karim. "Fusion of Appearance and Motion Features for Daily Activity Recognition from Egocentric Perspective." Sensors 23, no. 15 (July 30, 2023): 6804. http://dx.doi.org/10.3390/s23156804.

Abstract:
Videos from a first-person or egocentric perspective offer a promising tool for recognizing various activities related to daily living. In the egocentric perspective, the video is obtained from a wearable camera, and this enables the capture of the person’s activities in a consistent viewpoint. Recognition of activity using a wearable sensor is challenging due to various reasons, such as motion blur and large variations. The existing methods are based on extracting handcrafted features from video frames to represent the contents. These features are domain-dependent, where features that are suitable for a specific dataset may not be suitable for others. In this paper, we propose a novel solution to recognize daily living activities from a pre-segmented video clip. The pre-trained convolutional neural network (CNN) model VGG16 is used to extract visual features from sampled video frames, which are then aggregated by the proposed pooling scheme. The proposed solution combines appearance and motion features extracted from video frames and optical flow images, respectively. The methods of mean and max spatial pooling (MMSP) and max mean temporal pyramid (TPMM) pooling are proposed to compose the final video descriptor. The feature is applied to a linear support vector machine (SVM) to recognize the type of activities observed in the video clip. The evaluation of the proposed solution was performed on three public benchmark datasets. We performed studies to show the advantage of aggregating appearance and motion features for daily activity recognition. The results show that the proposed solution is promising for recognizing activities of daily living. Compared to several methods on three public datasets, the proposed MMSP–TPMM method produces higher classification performance in terms of accuracy (90.38% with the LENA dataset, 75.37% with the ADL dataset, 96.08% with the FPPA dataset) and average per-class precision (AP) (58.42% with the ADL dataset and 96.11% with the FPPA dataset).
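The spatial pooling step can be illustrated by concatenating the mean and the max of per-frame CNN features over a clip; this is a simplified take on the MMSP idea (the actual method additionally uses temporal pyramid pooling, which is omitted here):

```python
import numpy as np

def mean_max_spatial_pooling(frame_features):
    """Aggregate per-frame feature vectors (shape: frames x dims) into
    a single clip descriptor by concatenating the per-dimension mean
    and max over frames. A sketch of the mean/max pooling idea, not the
    paper's full MMSP-TPMM pipeline."""
    f = np.asarray(frame_features, dtype=float)
    return np.concatenate([f.mean(axis=0), f.max(axis=0)])
```

The resulting fixed-length descriptor is what a linear SVM would then classify, regardless of clip length.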
43

Polo-Rodriguez, Aurora, Jose Manuel Vilchez Chiachio, Cristiano Paggetti, and Javier Medina-Quero. "Ambient Sound Recognition of Daily Events by Means of Convolutional Neural Networks and Fuzzy Temporal Restrictions." Applied Sciences 11, no. 15 (July 29, 2021): 6978. http://dx.doi.org/10.3390/app11156978.

Abstract:
The use of multimodal sensors to describe activities of daily living in a noninvasive way is a promising research field in continuous development. In this work, we propose the use of ambient audio sensors to recognise events which are generated from the activities of daily living carried out by the inhabitants of a home. An edge–fog computing approach is proposed to integrate the recognition of audio events with smart boards where the data are collected. To this end, we compiled a balanced dataset which was collected and labelled in controlled conditions. A spectral representation of sounds was computed as input to convolutional networks to recognise ambient sounds, with encouraging results. Next, fuzzy processing of audio event streams was included in the IoT boards by means of temporal restrictions defined by protoforms to filter the raw audio event recognition, which is key to removing false positives in real-time event recognition.
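A fuzzy temporal restriction of this kind can be sketched as a decaying membership degree over the time since each detection, so that isolated one-off detections score low; the linear membership function and time constant below are assumptions, not the paper's protoforms:

```python
def fuzzy_persistence(events, now, tau=5.0):
    """Fuzzy degree that each audio event 'has just occurred': a
    linearly decaying membership over the time since detection, taking
    the max over repeated detections of the same label. Filtering on
    this degree suppresses one-off false positives. `tau` (seconds) is
    a hypothetical time constant."""
    degrees = {}
    for label, t in events:
        mu = max(0.0, 1.0 - (now - t) / tau)
        degrees[label] = max(degrees.get(label, 0.0), mu)
    return degrees

# Example: "water" was heard twice recently, "tv" once a while ago.
d = fuzzy_persistence([("water", 0.0), ("water", 4.0), ("tv", 1.0)], now=5.0)
```

Downstream logic would then report only labels whose degree exceeds some threshold.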
44

Ayanniyi, Abdulkabir A., Christianah O. Fadamiro, Fatai O. Olatunji, Mustafa B. Hassan, Bola J. Adkoya, Joshua F. Owoeye, and Isaac A. Uyanne. "Visual Disability: Causes and Implications on Patients’ Daily Living." Asian Journal of Medical Sciences 4, no. 1 (April 24, 2013): 21–29. http://dx.doi.org/10.3126/ajms.v4i1.6842.

Abstract:
Objective: To determine the causes and implications of visual disability (VD) on patients’ daily living. Methods: A cross-sectional survey of 130 visually disabled Nigerians on visually related basic life activities, psychology, and emotion in 2008. Structured interviews and relevant clinical examinations were conducted to gather the necessary information. Results: VD was due mainly to cataract (82, 63.1%) and glaucoma (29, 22.3%). At least 78% of the causes of VD were avoidable (treatable, curable). VD reduced or eliminated available manpower/workforce and increased the number of dependants. The most severely affected activities included driving, reading, and threading a needle, but most participants could still cope with feeding and wearing clothes. The activities missed most by the participants were appreciation of the beauty of nature, people/object recognition, and reading. There was an association between the activities missed most and the participants’ levels of education (P=0.001) but not with gender (P=0.406). Most participants (85%) expressed sadness over VD, and reported sadness had an association with educational levels (P=0.042) but not with gender (P=0.167). Though most (97.7%) thought life was meaningless due to VD, all (100%) had hope of regaining normal vision. Most participants (82.3%) expressed sadness over dependence on the sighted for basic visually demanding tasks. Conclusion: Both cataract and glaucoma are leading causes of visual disability. Visual disability diminishes the quality of daily living and has economic, psychosocial, and emotional implications. Renewed efforts towards preventing avoidable blindness and rehabilitating the irreversibly blind will reduce the burden of visual disability. DOI: http://dx.doi.org/10.3126/ajms.v4i1.6842 Asian Journal of Medical Sciences 4(2013) 21-29
45

Li, Yun Jie, and Hui Song. "Applying Data Mining Techniques on Continuous Sensed Data for Daily Living Activity Recognition." Applied Mechanics and Materials 738-739 (March 2015): 191–96. http://dx.doi.org/10.4028/www.scientific.net/amm.738-739.191.

Abstract:
In this paper, several data mining techniques were discussed and analyzed in order to achieve the objective of human daily activity recognition based on a continuous sensing dataset. The data mining techniques of Decision Tree, Naïve Bayes, and Neural Network were successfully applied to the dataset. The paper also proposed the idea of combining the Neural Network with the Decision Tree; the results show that the combination performs considerably better than either the typical Neural Network or the typical Decision Tree model alone.
46

Hegde, Nagaraj, Matthew Bries, Tracy Swibas, Edward Melanson, and Edward Sazonov. "Automatic Recognition of Activities of Daily Living Utilizing Insole-Based and Wrist-Worn Wearable Sensors." IEEE Journal of Biomedical and Health Informatics 22, no. 4 (July 2018): 979–88. http://dx.doi.org/10.1109/jbhi.2017.2734803.

47

Zhang, Shuai, S. I. McClean, and B. W. Scotney. "Probabilistic Learning From Incomplete Data for Recognition of Activities of Daily Living in Smart Homes." IEEE Transactions on Information Technology in Biomedicine 16, no. 3 (May 2012): 454–62. http://dx.doi.org/10.1109/titb.2012.2188534.

48

Herrera-Alcántara, Oscar, Ari Barrera-Animas, Miguel González-Mendoza, and Félix Castro-Espinoza. "Monitoring Student Activities with Smartwatches: On the Academic Performance Enhancement." Sensors 19, no. 7 (April 3, 2019): 1605. http://dx.doi.org/10.3390/s19071605.

Abstract:
Motivated by the importance of studying the relationship between habits of students and their academic performance, daily activities of undergraduate participants have been tracked with smartwatches and smartphones. Smartwatches collect data together with an Android application that interacts with the users, who provide the labeling of their own activities. The tracked activities include eating, running, sleeping, classroom sessions, exams, job, homework, transportation, watching TV series, and reading. The collected data were stored in a server for activity recognition with supervised machine learning algorithms. The methodology for the proof of concept includes the extraction of features with the discrete wavelet transform from gyroscope and accelerometer signals to improve the classification accuracy. The results of activity recognition with Random Forest were satisfactory (86.9%) and support the relationship between smartwatch sensor signals and the daily living activities of students, which opens the possibility of future experiments with automatic activity labeling and, ultimately, of a recommendation system based on recognized activity patterns to enhance each student's academic performance.
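The wavelet feature-extraction step can be sketched with a hand-rolled Haar decomposition whose per-sub-band energies would then feed a classifier such as a Random Forest (an illustrative stand-in for the paper's actual wavelet and feature choices):

```python
import numpy as np

def haar_dwt(signal, levels=3):
    """One-dimensional Haar wavelet decomposition: repeatedly split the
    signal into a coarse approximation and a detail band. A minimal
    stand-in for a full discrete wavelet transform."""
    x = np.asarray(signal, dtype=float)
    details = []
    for _ in range(levels):
        if len(x) % 2:                     # pad odd-length input
            x = np.append(x, x[-1])
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        details.append(detail)
        x = approx
    return x, details

def wavelet_features(signal):
    """Energy of each sub-band: [approx, detail_1, ..., detail_n].
    A common, illustrative feature vector for activity recognition."""
    approx, details = haar_dwt(signal)
    return np.array([np.sum(approx ** 2)] + [np.sum(d ** 2) for d in details])
```

Because the Haar transform is orthonormal, the sub-band energies sum to the total signal energy, which makes the features easy to sanity-check.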
49

Mekruksavanich, Sakorn, and Anuchit Jitpattanakul. "Identifying Smartphone Users Based on Activities in Daily Living Using Deep Neural Networks." Information 15, no. 1 (January 15, 2024): 47. http://dx.doi.org/10.3390/info15010047.

Abstract:
Smartphones have become ubiquitous, allowing people to perform various tasks anytime and anywhere. As technology continues to advance, smartphones can now sense and connect to networks, providing context-awareness for different applications. Many individuals store sensitive data on their devices, like financial credentials and personal information, due to the convenience and accessibility. However, losing control of this data poses risks if the phone gets lost or stolen. While passwords, PINs, and pattern locks are common security methods, they can still be compromised through exploits such as smudge residue left from touching the screen. This research explored leveraging smartphone sensors to authenticate users based on behavioral patterns when operating the device. The proposed technique uses a deep learning model called DeepResNeXt, a type of deep residual network, to efficiently and accurately identify smartphone owners through sensor data. Publicly available smartphone datasets were used to train the suggested model and other state-of-the-art networks to conduct user recognition. Multiple experiments validated the effectiveness of this framework, surpassing previous benchmark models in this area with a top F1-score of 98.96%.
50

Banerjee, Tanvi, James M. Keller, Mihail Popescu, and Marjorie Skubic. "Recognizing complex instrumental activities of daily living using scene information and fuzzy logic." Computer Vision and Image Understanding 140 (November 2015): 68–82. http://dx.doi.org/10.1016/j.cviu.2015.04.005.
