A New Scene Sensing Model Based on Multi-Source Data from Smartphones

Sensors (Basel). 2024 Oct 16;24(20):6669. doi: 10.3390/s24206669.

Abstract

Smartphones with integrated sensors play an important role in people's lives, and in advanced multi-sensor fusion navigation systems, the use of individual sensor information is crucial. Because environments differ, the weights assigned to each sensor differ as well, which in turn affects the method and results of multi-source fusion positioning. Based on multi-source data from smartphone sensors, this study exploits five types of information: Global Navigation Satellite System (GNSS), Inertial Measurement Unit (IMU), cellular network, optical sensor, and Wi-Fi measurements. It characterizes the temporal, spatial, and statistical features of these data and constructs a multi-scale, multi-window, context-connected scene sensing model that accurately detects indoor, semi-indoor, outdoor, and semi-outdoor scenes, thereby providing an environmental basis for multi-sensor fusion localization in a multi-sensor navigation system. The model comprises four main parts: multi-sensor data mining, a multi-scale convolutional neural network (CNN), a bidirectional long short-term memory (BiLSTM) network combined with contextual information, and a meta-heuristic optimization algorithm.
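As a rough illustration of the architecture outlined above, the following PyTorch sketch combines a multi-scale 1-D CNN (parallel convolution branches with different kernel widths) with a BiLSTM over windows of sensor-derived features, ending in a four-class scene head. All layer sizes, kernel widths, the feature count, and the windowing scheme are assumptions for illustration, not the paper's actual configuration; the data-mining and meta-heuristic optimization stages are omitted.

```python
import torch
import torch.nn as nn

class MultiScaleCNNBiLSTM(nn.Module):
    """Hypothetical sketch: multi-scale 1-D CNN over windows of sensor
    features, a BiLSTM for bidirectional temporal context, and a
    4-class head (indoor / semi-indoor / outdoor / semi-outdoor)."""

    def __init__(self, n_features=20, hidden=64, n_classes=4):
        super().__init__()
        # Parallel convolution branches with different kernel sizes
        # play the role of the "multi-scale" feature extractor.
        self.branches = nn.ModuleList([
            nn.Conv1d(n_features, 32, kernel_size=k, padding=k // 2)
            for k in (3, 5, 7)  # assumed scales
        ])
        self.relu = nn.ReLU()
        # The BiLSTM consumes the concatenated multi-scale feature maps,
        # adding contextual information in both temporal directions.
        self.bilstm = nn.LSTM(32 * 3, hidden, batch_first=True,
                              bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):            # x: (batch, time, n_features)
        x = x.transpose(1, 2)        # -> (batch, n_features, time) for Conv1d
        feats = torch.cat([self.relu(b(x)) for b in self.branches], dim=1)
        feats = feats.transpose(1, 2)   # -> (batch, time, channels)
        out, _ = self.bilstm(feats)     # bidirectional contextual pass
        return self.head(out[:, -1, :])  # classify the final time step

# Example: a batch of 8 windows, 100 time steps, 20 sensor features.
logits = MultiScaleCNNBiLSTM()(torch.randn(8, 100, 20))
print(logits.shape)  # torch.Size([8, 4])
```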

Keywords: CNN; GNSS; data mining; multi-source sensor; scene classification.