Fall Classification Using Neural Network


ABSTRACT

Automated fall detection provides valuable information to doctors treating their patients. This information is particularly important for children with cerebral palsy (CP), who are more vulnerable to falling.

Shriners Hospitals for Children–Northern California is searching for an automatic fall detection system that can accurately recognize not only fall versus non-fall events but also different types of falls. The purpose of this work is to design effective fall detection methods based on neural networks built with the MATLAB neural network tool. Impact, orientation, and rotation features are extracted from signal waveforms generated by accelerometers worn on the lower backs of children with and without CP.

The designed neural networks use the gradient descent, scaled conjugate gradient, and Levenberg–Marquardt learning algorithms in the search for a high-accuracy neural network for detecting falls. The results show that the Levenberg–Marquardt algorithm performs best, at 84.6% accuracy, in categorizing torso falls, falls to the knees, falls to the bottom, and non-fall events. When a new falls class is created, made up of torso falls, falls to the knees, and falls to the bottom, accuracy improves to 94.3%. Alternatively, when the falls to the knees and falls to the bottom are included in the non-fall class, the best performing neural network is 93.2% accurate.

PAST RESEARCH

Two common approaches to fall detection are image processing and signal processing. The image processing approach uses fuzzy logic and thresholding. This approach requires setting up cameras to record data, which is not practical for implementation in a wearable device. The signal processing approach uses support vector machines, the naive Bayes algorithm, wavelet-based methods, thresholds, and neural networks. All of these works, except for Smith and Bagley and McCurry–Nieto, were developed to assist elderly independence.

ARTIFICIAL NEURAL NETWORKS

Artificial Neural Network:

Often referred to as a neural network, an artificial neural network (ANN) is “a computing system made up of a number of simple, highly interconnected processing elements, which process information by their dynamic state response to external inputs”.

The processing elements, called neurons, are arranged in layers in the network. Similar to a biological neuron system, such networks have the ability to learn and adapt. They learn by recognizing patterns in the input data, much as a child who is repeatedly shown an apple and told what it is eventually recognizes apples on sight. Artificial neural networks have wide applications in object detection, pattern recognition, classification, and prediction.

Artificial Neural Network Components:

An ANN has an input layer, one or more hidden layers, and an output layer. The input layer receives patterns as input values. Layers that are neither input nor output are hidden layers. Learning takes place in the hidden layers and the output layer, where the weights and biases are first initialized and then adjusted by a learning rule. Weights are values that express the importance of each input with respect to the output.

Example of a Neural Network. There Are Two Inputs in the Input Layer, Three Hidden Neurons in One Hidden Layer, and Two Output Neurons in the Output Layer.
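To make the role of weights, biases, and transfer functions concrete, the following MATLAB sketch computes one forward pass through a network shaped like the example above (two inputs, three hidden neurons, two outputs). The weight, bias, and input values are random placeholders for illustration only, not values taken from this work.

% Forward pass through a 2-3-2 feed-forward network (illustrative values only)
x  = [0.5; -1.2];                    % input vector (2 x 1)
W1 = rand(3, 2);  b1 = rand(3, 1);   % hidden-layer weights and biases
W2 = rand(2, 3);  b2 = rand(2, 1);   % output-layer weights and biases
h  = tansig(W1 * x + b1);            % hidden activations (tan-sigmoid transfer function)
y  = purelin(W2 * h + b2);           % network outputs (linear transfer function)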

DATA AND EXTRACTED FEATURES

Description of Data:

Dr. Anita Bagley of Shriners Hospitals for Children–Northern California, Dr. Warren D. Smith of California State University, Sacramento, and 12 of his students collected the data used in this work. A total of 86 children, ages 2 to 14, played in 15-minute sessions. Among these children, 35 were diagnosed with CP and 51 were not. The children were of different ethnicities, including Caucasian, African American, Hispanic, Asian, and Pacific Islander. Three to four cameras were set up to record the activities of the children playing.

Extracted Features:

Dr. Smith provided the data set with three extracted features for each data event: fall impact, orientation, and rotation. Fall impact measures the overall non-gravity acceleration from the tri-axial accelerometers. This value is the moving average of the sums of squared accelerations over a 0.3 s time period.
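As a rough illustration of how such an impact feature could be computed, the sketch below forms a moving average of the summed squared accelerations over a 0.3 s window. The sampling rate, the placeholder signals, and the assumption that gravity has already been removed are illustrative assumptions, not details taken from the original data processing.

% Sketch of an impact-style feature: moving average of summed squared
% non-gravity accelerations over a 0.3 s window (all values illustrative).
fs = 100;                                  % assumed sampling rate in Hz
ax = randn(1000, 1);                       % placeholder x-axis acceleration (gravity removed)
ay = randn(1000, 1);                       % placeholder y-axis acceleration (gravity removed)
az = randn(1000, 1);                       % placeholder z-axis acceleration (gravity removed)
winLen = round(0.3 * fs);                  % number of samples in the 0.3 s window
impact = movmean(ax.^2 + ay.^2 + az.^2, winLen);   % moving average of summed squares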

Orientation measures how far the z-axis of the accelerometer monitor is tilted away from vertical. Rotation is the rate at which the z-axis deviates away from the vertical direction. These three features are used as inputs to the neural networks in this work, and they were plotted for all 88 events.
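One way such orientation and rotation features could be estimated is sketched below: the tilt of the z-axis from vertical is taken from a low-pass (gravity) estimate of the z-axis acceleration, and rotation is its rate of change. The sampling rate, the filtering choice, and the placeholder signal are all assumptions made only for illustration, not the processing used in the original study.

% Sketch of orientation (tilt of z-axis from vertical) and rotation (its rate of change).
fs = 100;                                      % assumed sampling rate in Hz
g  = 9.81;                                     % gravitational acceleration in m/s^2
az = g * ones(1000, 1) + 0.2 * randn(1000, 1); % placeholder raw z-axis acceleration
azGravity = movmean(az, round(1.0 * fs));      % assumed low-pass estimate of gravity along z
cosTilt = max(min(azGravity / g, 1), -1);      % clamp before taking the inverse cosine
orientation = acosd(cosTilt);                  % tilt angle from vertical, in degrees
rotation = [0; diff(orientation)] * fs;        % rate of change, in degrees per second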

NEURAL NETWORK IMPLEMENTATION

Matlab Neural Network Tool:

The MATLAB software package is a computing environment with programming capabilities. It allows easy data manipulation, graphic presentation of data, fast data processing, and extensive tools for specific purposes such as optimization, signal processing, and image processing. For this project, MATLAB's neural network tool is utilized extensively.

Neural Network Creation:

Using the neural network tool, a neural network in MATLAB is created by collecting data, configuring the model, initializing its weights and biases, training the network, validating the network, and applying the network. This workflow is summarized in the flowchart below.

Workflow of an Artificial Neural Network Model in MATLAB.

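A minimal programmatic sketch of this workflow is shown below, using standard MATLAB neural network tool functions. The random data are placeholders standing in for the real feature set; in this work the inputs are the three extracted features for the 88 events.

% Sketch of the create-configure-initialize-train-apply workflow (placeholder data).
inputs  = rand(3, 88);                         % placeholder: 3 features for 88 events
targets = full(ind2vec(randi(4, 1, 88)));      % placeholder one-hot targets for 4 classes
net = feedforwardnet(2, 'trainlm');            % feed-forward network, 2 hidden neurons, LM training
net = configure(net, inputs, targets);         % configure input/output sizes from the data
net = init(net);                               % initialize weights and biases
[net, tr] = train(net, inputs, targets);       % train with automatic train/validation/test split
outputs = net(inputs);                         % apply the trained network to the data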

EXPERIMENTS

Categorize Four Classes: NF (Non-Falls), FB (Falls to the Bottom), FK (Falls to the Knees), and TF (Torso Falls):

The purpose of this work is to search for an effective neural network whose percent accuracy on the combined training, validation, and testing sets is at least 90% in categorizing the four different classes of fall and non-fall events. Using the MATLAB neural network tool nntool, a neural network is created by defining the network type, the input data, the desired output, the training function, the number of hidden layers, the number of neurons in each hidden layer, and the neuron transfer function. The network type used here is feed-forward backpropagation.

Neural Network Architecture with Three Inputs, Two Hidden Neurons in a Single Hidden Layer, and Four Outputs.

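A hedged sketch of how such a network could be set up programmatically is given below, matching the architecture in the figure (three inputs, two hidden neurons in one hidden layer, four outputs). The placeholder data and the specific transfer function choices are assumptions; equivalent settings can also be entered through the nntool interface, and the training function can be swapped to compare algorithms.

% Sketch of a 3-input, 2-hidden-neuron, 4-output feed-forward backpropagation network.
inputs  = rand(3, 88);                         % placeholder: impact, orientation, rotation features
targets = full(ind2vec(randi(4, 1, 88)));      % placeholder one-hot targets for NF, FB, FK, TF
net = feedforwardnet(2, 'traingd');            % 2 hidden neurons, gradient descent training
net.layers{1}.transferFcn = 'tansig';          % assumed hidden-layer transfer function
net.layers{2}.transferFcn = 'purelin';         % assumed output-layer transfer function
[net, tr] = train(net, inputs, targets);       % change 'traingd' to 'trainscg' or 'trainlm' to compare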

RESULTS

Experimental Results for Categorizing Four Classes: TF, FB, FK, and NF:

Gradient Descent Algorithm Experiments Using Neural Network Tool

When using the gradient descent algorithm and two hidden neurons, all 39 TF’s are correctly classified, and none of the events in the other three categories is correctly classified. With three hidden neurons, one out of six FB’s is correctly classified, but 37 out of 39 TF’s are correctly classified.

When using four hidden neurons, none of the TF's is correctly classified, but the network can classify all 25 NF's and four out of six FB's. The five-hidden-neuron case can classify 24 of 25 NF's, two of 18 FK's, and seven of 39 TF's. For six hidden neurons, the network goes back to recognizing all of the NF's and none of the other three categories. The confusion matrices for two through six hidden neurons summarize these results. In these matrices, the results are for the entire dataset, that is, for the combined training, validation, and test sets.
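Confusion counts of this kind can be produced directly from a trained network's outputs. The short sketch below shows one way to do so; it continues from the placeholder variables (net, inputs, targets) defined in the earlier workflow sketch.

% Sketch: confusion matrix and overall percent accuracy for a trained network.
outputs   = net(inputs);                             % network responses to all 88 events
predicted = vec2ind(outputs);                        % predicted class index (1..4) per event
actual    = vec2ind(targets);                        % true class index per event
confMat   = accumarray([actual(:) predicted(:)], 1, [4 4]);   % rows: true class, columns: predicted
accuracy  = 100 * sum(predicted == actual) / numel(actual);   % percent correct overall
plotconfusion(targets, outputs);                     % MATLAB's confusion matrix plot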

CONCLUSIONS

This work uses the neural network approach to create automatic fall detection systems for children with cerebral palsy. Using the three features of impact, orientation, and rotation, the developed neural network models are shown to be effective, reaching 84.6% accuracy in one experiment, in classifying torso falls, falls to the bottom, falls to the knees, and non-falls. The Levenberg–Marquardt (LM) algorithm yields the highest accuracy in categorizing the four classes.

Even though the pattern recognition tool specializes in classification problems, it does not achieve the highest accuracy among the four-class experiments. The general neural network tool with the LM algorithm performs better.

When only falls and non-falls are considered, the neural networks detect falls more effectively. Grouping TF, FB, and FK into an expanded fall (EF) class results in a highest accuracy of 94.3% using four hidden neurons and the LM learning algorithm. When FB, FK, and NF are regrouped into a new expanded non-falls (ENF) class, the neural network with five hidden neurons performs best, at 93.2% accuracy, with the LM method.
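The regrouping itself amounts to remapping the four original labels onto two classes before training. A small sketch of this step is given below; the numeric label codes 1 through 4 for NF, FB, FK, and TF are an assumption made only for illustration.

% Sketch: remap four-class labels (1=NF, 2=FB, 3=FK, 4=TF) into two-class targets.
labels = randi(4, 1, 88);                      % placeholder four-class labels for 88 events
isEF   = ismember(labels, [2 3 4]);            % expanded fall (EF): TF, FB, and FK together
efTargets  = full(ind2vec(isEF + 1));          % two-class targets: EF versus NF
isENF  = ismember(labels, [1 2 3]);            % expanded non-fall (ENF): NF, FB, and FK together
enfTargets = full(ind2vec((~isENF) + 1));      % two-class targets: TF versus ENF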

It is worth noting that, even though the neural network training cycle uses a relatively small sample size of 88 events, the resulting neural network may still be useful. Raudys and Jain concluded that, for a simple neural network structure with a small number of features, a neural network can be sufficiently trained on a small sample.

Source: California State University
Author: Thao Thanh Chau
