Abstract
Human activity recognition (HAR) plays a vital role in ambient assisted living (AAL), which supports the welfare of elderly people who live alone at home. AAL delivers its services through ambient sensors, vision systems, smartphones, and wearable sensors. Among these, smartphones are familiar, portable, and cost-effective, and they simplify the monitoring process. Various research works have proposed smartphone-based HAR systems that recognize basic and complex activities; however, the results remain unsatisfactory for postural transitions such as stand-to-sit, sit-to-sleep, etc. To improve the recognition rate, this paper couples principal component analysis (PCA) for dimensionality reduction with stacking ensemble learning for classification. Extensive experiments on UCI repository datasets such as UCI-HAR have been performed, and performance is measured using standard metrics such as accuracy, precision, and recall.
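The pipeline described above (PCA for dimensionality reduction feeding a stacking ensemble classifier) can be sketched as follows. This is a minimal illustrative sketch using scikit-learn: the choice of base learners, meta-learner, PCA dimensionality, and the synthetic stand-in data are all assumptions for demonstration, not the paper's exact configuration or results.

```python
# Hypothetical sketch: PCA for dimensionality reduction, then a stacking
# ensemble for classification, as outlined in the abstract.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# Synthetic stand-in for smartphone accelerometer/gyroscope feature vectors
# (e.g. the 561-dimensional windows in UCI-HAR); classes stand in for
# activities and postural transitions.
X, y = make_classification(n_samples=600, n_features=60, n_informative=20,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = Pipeline([
    # Step 1: project features onto a lower-dimensional subspace.
    ("pca", PCA(n_components=20)),
    # Step 2: stacking ensemble -- base learners' predictions are combined
    # by a logistic-regression meta-learner (illustrative choices).
    ("stack", StackingClassifier(
        estimators=[("rf", RandomForestClassifier(random_state=0)),
                    ("svm", SVC(probability=True, random_state=0))],
        final_estimator=LogisticRegression(max_iter=1000),
    )),
])

model.fit(X_train, y_train)
print("test accuracy:", round(model.score(X_test, y_test), 2))
```

In practice the same pipeline object can then be evaluated with accuracy, precision, and recall via `sklearn.metrics.classification_report` on the held-out test split.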