
Desire or Need prediction system for children with PIMD/SMID

Children with profound intellectual and multiple disabilities (PIMD) or severe motor and intellectual disabilities (SMID) communicate only through movements, vocalizations, body postures, muscle tension, or facial expressions, at a pre- or protosymbolic level. Yet, to the best of our knowledge, hardly any system has been developed to collect, categorize, and interpret these behaviors to support independent communication and mobility.


This project includes the design and development of the ChildSIDE app, which collects children’s behaviors together with the associated location and environment data from several sources (GPS, an iBeacon device, an ALPS sensor, and the OpenWeatherMap API) and transmits them to a database.
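To make this data flow concrete, below is a minimal Python sketch of how one behavior observation could be bundled with GPS, iBeacon, and OpenWeatherMap context and posted to a backend database. The backend URL, field names, API key, and coordinates are illustrative assumptions, not the actual ChildSIDE implementation.

```python
# Sketch of the ChildSIDE-style data flow: package an observed behavior together
# with location and weather context and send it to a backend database.
# Assumptions: BACKEND_URL, the payload field names, and the API key are hypothetical.
import datetime
import requests

OPENWEATHER_KEY = "YOUR_API_KEY"                    # assumption: user-supplied key
BACKEND_URL = "https://example.org/api/behaviors"   # hypothetical database endpoint


def fetch_weather(lat: float, lon: float) -> dict:
    """Query OpenWeatherMap's current-weather endpoint for the child's location."""
    resp = requests.get(
        "https://api.openweathermap.org/data/2.5/weather",
        params={"lat": lat, "lon": lon, "appid": OPENWEATHER_KEY, "units": "metric"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


def record_behavior(behavior: str, lat: float, lon: float, beacon_id: str | None) -> None:
    """Bundle a caregiver-tagged behavior with GPS, iBeacon, and weather context."""
    weather = fetch_weather(lat, lon)
    payload = {
        "timestamp": datetime.datetime.utcnow().isoformat(),
        "behavior": behavior,                # e.g. a minor or major behavior category
        "gps": {"lat": lat, "lon": lon},
        "ibeacon_id": beacon_id,             # indoor location from the nearest beacon
        "temperature_c": weather["main"]["temp"],
        "humidity_pct": weather["main"]["humidity"],
    }
    requests.post(BACKEND_URL, json=payload, timeout=10).raise_for_status()


if __name__ == "__main__":
    record_behavior("vocalization", lat=33.59, lon=130.40, beacon_id="beacon-12")
```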


We also investigated whether more accurate behavior classification could be achieved by recalibrating the datasets to include minor behavior categories, major behavior categories, or both; by combining location and weather data; and by applying the Boruta feature selection method during training. Classification performance was evaluated for both binary and multiclass outcomes using eXtreme Gradient Boosting (XGB), support vector machine (SVM), random forest (RF), and neural network (NN) classifiers.
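As an illustration of this setup, the sketch below runs Boruta feature selection and then compares the four classifier families with cross-validation. The dataset file name, column names, and hyperparameters are placeholders, not the exact pipeline used in the study.

```python
# Sketch: Boruta feature selection followed by XGB, SVM, RF, and NN classifiers.
# Assumptions: a tabular CSV with one row per behavior episode and a "behavior" label column.
import pandas as pd
from boruta import BorutaPy
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import LabelEncoder, StandardScaler
from sklearn.svm import SVC
from xgboost import XGBClassifier

df = pd.read_csv("behavior_dataset.csv")            # hypothetical file name
X = df.drop(columns=["behavior"]).to_numpy()
y = LabelEncoder().fit_transform(df["behavior"])    # binary or multiclass labels

# Boruta keeps only features judged relevant against shadow (permuted) features.
selector = BorutaPy(
    RandomForestClassifier(n_jobs=-1, class_weight="balanced", max_depth=5),
    n_estimators="auto",
    random_state=42,
)
selector.fit(X, y)
X_sel = X[:, selector.support_]

# Compare the four classifiers with 5-fold cross-validation.
classifiers = {
    "XGB": XGBClassifier(),
    "SVM": make_pipeline(StandardScaler(), SVC()),
    "RF": RandomForestClassifier(n_estimators=500),
    "NN": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000)),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X_sel, y, cv=5, scoring="f1_macro")
    print(f"{name}: macro-F1 = {scores.mean():.3f} (+/- {scores.std():.3f})")
```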


One recently developed technology for capturing human movement is the optical motion capture system, whose outputs can be analyzed with trajectory analysis, a powerful tool in motor behavior studies. Facial feature extraction, eye tracking, and movement recognition are among the most advanced systems that aid communication and the interpretation of the needs of children with PIMD/SMID. However, whether these movements can be used to classify and predict behavior has not been investigated. Thus, we also examined whether body and hand movement and facial expression data can be used to predict the behavior of children with PIMD/SMID using machine learning algorithms.
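The sketch below shows one way tracked movement data (for example, time-stamped x/y coordinates exported per video clip from trajectory tracking) could be summarized into features and fed to a classifier. The file layout, column names, and feature set are assumptions for illustration only, not the analysis performed in the study.

```python
# Sketch: turn per-clip movement trajectories into summary features and
# predict the behavior label of each clip with a random forest.
# Assumptions: a long-format CSV with columns clip_id, t, x, y, behavior.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def trajectory_features(track: pd.DataFrame) -> dict:
    """Summarize one tracked point (e.g., a hand marker) over a video clip."""
    dx = np.diff(track["x"])
    dy = np.diff(track["y"])
    dt = np.diff(track["t"])
    speed = np.hypot(dx, dy) / dt
    return {
        "path_length": float(np.hypot(dx, dy).sum()),
        "mean_speed": float(speed.mean()),
        "max_speed": float(speed.max()),
        "x_range": float(track["x"].max() - track["x"].min()),
        "y_range": float(track["y"].max() - track["y"].min()),
    }


frames = pd.read_csv("trajectories.csv")     # hypothetical export, one row per frame
rows, labels = [], []
for clip_id, track in frames.groupby("clip_id"):
    rows.append(trajectory_features(track))
    labels.append(track["behavior"].iloc[0])  # one behavior label per clip

X = pd.DataFrame(rows).to_numpy()
y = np.array(labels)
scores = cross_val_score(RandomForestClassifier(n_estimators=300), X, y, cv=5)
print(f"Mean accuracy: {scores.mean():.3f}")
```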

ChildSIDE app development and environment data


Machine-learning behavior classification


Movement analysis and prediction using Kinovea motion trajectory software
