for the past two months, my team and i have been exploring the intersection of neuroscience, artificial intelligence (specifically machine learning), and the Internet of Things (IoT).
we call our work the Life2Well Project, which stands for Learning at the intersection of AI, physiology, EEG, our environment and well-being.
we are very privileged and happy to have just been given the opportunity to share our early work in the Cognitive Computing and IoT track of the 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022), which will take place from the 24th to the 28th of July.
my co-authors and i have titled our work "The Life2Well Project: studying environment, physiology and EEG". it is scheduled to be presented on the 26th of July. the abstract follows:
The 2021 United Nations Climate Change Conference (COP26) resulted in the Glasgow Climate Pact, which encouraged urgent cuts in greenhouse gas emissions and promised climate finance to developing countries. In Singapore, the Centre for Climate Research released its second National Climate Change Study in December 2021, which projects increases in daily mean temperatures and greater contrasts between wetter and drier months. This paper describes ongoing work as part of an independent research study. Preliminary work in this study investigated relationships between environmental and physiological measurements using smartwatches and bespoke, self-designed environmental modules worn around the waist. Data from this preliminary work were analysed with a Random Forest regression model. The work reported in this paper describes a subsequent phase, in which neurophysiological measurement, specifically electroencephalography (EEG), was introduced to explore how changes in environmental or biometric measurements correlate with changes in neurophysiological measurements. In this latter study, EEG data are viewed as an independent data type, distinct from environmental and other physiological data. The headset used to record EEG data is again a bespoke, hand-made design, comprising a biosensing board and electrodes from OpenBCI together with widely available items such as adhesive tape and staples. All recorded data are stored in Google Drive; Python is used to synchronize and pre-process the data and to train regression models. The first headset prototype was assembled in mid-October 2021 and was tested and developed in early November. From mid-November 2021 to late January 2022, the authors wore the devices for one to two hours per day to collect data. For EEG data, eight channels were recorded; basic filters (bandpass and notch) and REST re-referencing were applied. In this project, EEG time-series are used as inputs to regression models, with other data types as outputs. Two regression models were trained and then compared: the first a convolutional neural network with a pre-built architecture, and the second a Random Forest model using features extracted from the EEG time-series. Inferences are made from the models using open-source interpreters, with the eventual aim of inferring how one's local environment might impact one's emotions and health. To elaborate, the authors' long-term goal is to contribute to work identifying ideal microclimate ranges for optimal human health and emotions, with a view to improving productivity and quality of life for future generations. Further, the analysis of biometric data from this project could be developed into a smart device that might eventually suggest proactive measures facilitative of good health. Wearables arising from this and similar studies have the potential to improve the lived environment and well-being of the community as personal health devices, in addition to being incorporated into smart homes. The regression results could also be used to improve life satisfaction and productivity, especially in urban areas (Palme & Salvati, 2020).
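
to give a concrete (if very simplified) picture of the kind of pipeline the abstract describes, here is a minimal sketch in Python. everything below is illustrative only: the 250 Hz sampling rate, the band choices, the synthetic data and the temperature target are assumptions made for the example, not the exact code or parameters used in the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch, welch
from sklearn.ensemble import RandomForestRegressor

FS = 250          # assumed sampling rate in Hz
N_CHANNELS = 8    # eight EEG channels, as in the abstract

def bandpass(data, low=1.0, high=45.0, fs=FS, order=4):
    # zero-phase Butterworth bandpass applied along the time axis
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, data, axis=-1)

def notch(data, freq=50.0, fs=FS, q=30.0):
    # IIR notch to suppress mains interference (50 Hz in Singapore)
    b, a = iirnotch(freq, q, fs=fs)
    return filtfilt(b, a, data, axis=-1)

def band_powers(epoch, fs=FS):
    # mean power per channel in the delta/theta/alpha/beta bands
    bands = [(1, 4), (4, 8), (8, 13), (13, 30)]
    freqs, psd = welch(epoch, fs=fs, nperseg=fs * 2, axis=-1)
    return np.hstack([psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
                      for lo, hi in bands])

# synthetic stand-in data: 100 one-minute EEG epochs and a hypothetical
# matching target (e.g. ambient temperature from the environmental module)
rng = np.random.default_rng(0)
epochs = rng.standard_normal((100, N_CHANNELS, FS * 60))
temperature = rng.uniform(26.0, 33.0, size=100)

# filter each epoch, extract band-power features, fit a Random Forest regressor
X = np.vstack([band_powers(notch(bandpass(e))) for e in epochs])
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, temperature)

# a crude first look at what the model relies on
print(model.feature_importances_)
```

the overall shape, filter, extract features, fit a regressor, then inspect what the model relies on, mirrors the Random Forest branch of the study; the CNN branch and the REST re-referencing step are omitted here for brevity.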
we hope to get the chance to interact with you during our session!