Over the past decade, the explosion of Artificial Intelligence (Machine, Deep, and Reinforcement Learning) has expanded the directions of sensor data fusion (SDF). These SDF-AI approaches are mainly focused on low-level information fusion (LLIF) object assessment; however, there is also a need to consider the directions of AI aligned with high-level information fusion (HLIF), i.e., situation/impact assessment and sensor, user, and mission management. This tutorial brings together contemporary concepts, models, and definitions to give the attendee a summary of the state of the art in HLIF theories (operational, functional, formal, and cognitive), representations (semantics, ontologies, axiomatics, and agents), and design (modeling, testbeds, evaluation, and human-machine interfaces).
Track-to-track fusion is required in numerous multisensor applications, either for distributing the computational load of processing raw sensor data or for sharing information on jointly observed tracks via low-bandwidth communication links. Due to the common process noise, track fusion is a challenging task, and in general no optimal solution exists. This tutorial explains the problem of cross-correlations in distributed target tracking and derives the various solutions that have been found, so that participants can understand the advantages and limitations of each algorithm.
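To give a flavor of one such algorithm, the following minimal Python sketch (illustrative only, not taken from the tutorial materials) implements covariance intersection, a well-known fusion rule that remains consistent when the cross-correlation between two track estimates is unknown; the grid search over the weight and all names are our own choices.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_omega=50):
    """Fuse two track estimates with unknown cross-correlation via
    covariance intersection: P^-1 = w*P1^-1 + (1-w)*P2^-1.
    The weight w is chosen here to minimize trace(P)."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.0, 1.0, n_omega):
        P = np.linalg.inv(w * I1 + (1 - w) * I2)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * I1 @ x1 + (1 - w) * I2 @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]

# Two local tracks of the same target from different sensors
x1 = np.array([10.0, 1.0]);  P1 = np.diag([4.0, 1.0])
x2 = np.array([10.5, 0.8]);  P2 = np.diag([1.0, 4.0])
x_fused, P_fused = covariance_intersection(x1, P1, x2, P2)
print(x_fused, np.trace(P_fused))
```

The trade-off made here, guaranteed consistency under unknown correlations at the price of some conservatism, illustrates why no single optimal solution exists in general.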
This tutorial will present to participants several of the latest state-of-the-art algorithms for estimating the states of multiple targets in clutter and for multisensor information fusion. These algorithms form the basis of automated decision systems for advanced surveillance and targeting.
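As one concrete building block of tracking in clutter, the sketch below (a hypothetical illustration, not the tutorial's code) applies chi-square gating to discard unlikely measurements before data association; the gate probability and all names are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def gate_measurements(z_pred, S, measurements, p_gate=0.99):
    """Keep only measurements whose Mahalanobis distance to the
    predicted measurement lies inside a chi-square gate; a first
    step in association-based trackers operating in clutter."""
    gamma = chi2.ppf(p_gate, df=z_pred.size)  # gate threshold
    S_inv = np.linalg.inv(S)
    kept = []
    for z in measurements:
        nu = z - z_pred                  # innovation
        d2 = float(nu @ S_inv @ nu)      # squared Mahalanobis distance
        if d2 <= gamma:
            kept.append(z)
    return kept

z_pred = np.array([5.0, 5.0])            # predicted measurement
S = np.diag([1.0, 1.0])                  # innovation covariance
scan = [np.array([5.2, 4.8]), np.array([9.0, 1.0]), np.array([4.5, 5.6])]
print(gate_measurements(z_pred, S, scan))  # the far clutter point is dropped
```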
This tutorial provides an introduction to graph-based inference and its application to localization, tracking, and mapping. It covers the theoretical background and methods for developing a factor graph from a statistical model and for deriving a message-passing algorithm for parameter estimation based on the developed factor graph. The tutorial then applies these graph-based modeling and inference techniques to localization, tracking, and mapping problems. The intended audience is graduate students or postdocs with a background in probability theory, statistical signal processing, linear algebra, and filtering.
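As a small worked example of this pipeline, the following sketch (our own illustration; the scalar model and all names are chosen for brevity) runs forward sum-product message passing on the chain factor graph of a scalar linear-Gaussian state-space model. Because every message is Gaussian, each is fully described by a (mean, variance) pair, and the recursion reduces to the Kalman filter.

```python
import numpy as np

# Scalar linear-Gaussian chain: x_k = a*x_{k-1} + w,  z_k = x_k + v.
a, q, r = 0.95, 0.1, 0.5          # dynamics, process var, measurement var
rng = np.random.default_rng(0)
x = 0.0
mean, var = 0.0, 1.0              # prior message into x_0
for k in range(20):
    x = a * x + rng.normal(0, np.sqrt(q))     # simulate true state
    z = x + rng.normal(0, np.sqrt(r))         # simulate measurement
    # message through the dynamics factor (prediction)
    mean, var = a * mean, a * a * var + q
    # multiply by the measurement-factor message (correction)
    gain = var / (var + r)
    mean = mean + gain * (z - mean)
    var = (1 - gain) * var
print(f"final belief: mean={mean:.3f}, var={var:.3f}, true x={x:.3f}")
```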
This tutorial will introduce the emerging area of statistical graph signal processing, which extends traditional signal processing concepts to data indexed by generic graphs. The tutorial will cover fundamental graph signal processing (GSP) concepts, including the graph Fourier transform, graph filter design, and sampling and recovery of graph signals. The tutorial will then focus on developing statistical methods for GSP, including the estimation and tracking of graph signals, anomaly detection, and topology identification.
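To make the core definitions concrete, here is a minimal sketch (illustrative, not from the tutorial) of the graph Fourier transform and an ideal low-pass graph filter on a small path graph; the graph, cutoff, and signal are arbitrary choices.

```python
import numpy as np

# Graph Fourier transform (GFT): the GFT basis is the eigenvector
# matrix of the graph Laplacian L = D - A, and the graph frequencies
# are the Laplacian eigenvalues.
n = 8
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0   # path-graph adjacency
L = np.diag(A.sum(axis=1)) - A        # combinatorial Laplacian
lam, U = np.linalg.eigh(L)            # graph frequencies / GFT basis

rng = np.random.default_rng(1)
signal = np.sin(np.linspace(0, np.pi, n)) + 0.3 * rng.standard_normal(n)
s_hat = U.T @ signal                  # forward GFT
h = (lam < 1.0).astype(float)         # ideal low-pass graph filter
smoothed = U @ (h * s_hat)            # spectral filtering, inverse GFT
print(np.round(smoothed, 3))
```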
Although the Kalman filter has been widely applied to target tracking since its introduction in the early 1960s, until recently no systematic design methodology was available to predict tracking performance for maneuvering targets and to optimize filter parameter selection. When tracking maneuvering targets with a Kalman filter, selecting the process noise variance (e.g., the variance of the acceleration errors) is complicated by the fact that the motion modeling errors are represented as white Gaussian noise, while actual target maneuvers are deterministic or highly correlated in time. This tutorial presents systematic procedures for selecting the optimal process noise variance for the nearly constant velocity (NCV) and nearly constant acceleration (NCA) Kalman filters.
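The sketch below (an illustration under our own parameter choices, not the tutorial's procedure) shows where this design parameter enters: an NCV Kalman filter with the discrete white noise acceleration model, whose process noise variance q, together with the sampling period and measurement noise, defines the dimensionless tracking index commonly used to summarize such designs.

```python
import numpy as np

# NCV Kalman filter, discrete white noise acceleration model.
# The acceleration variance q is the key design parameter; the
# tracking index Gamma = sqrt(q) * T**2 / sigma_v summarizes the design.
T, q, sigma_v = 1.0, 0.5, 10.0
F = np.array([[1.0, T], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = q * np.array([[T**4 / 4, T**3 / 2], [T**3 / 2, T**2]])
R = np.array([[sigma_v**2]])

x = np.array([0.0, 30.0]);  P = np.diag([100.0, 25.0])
truth = np.array([0.0, 30.0])
rng = np.random.default_rng(2)
for k in range(50):
    truth = F @ truth                            # true NCV motion
    z = H @ truth + rng.normal(0, sigma_v, 1)    # noisy position
    x, P = F @ x, F @ P @ F.T + Q                # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (z - H @ x)).ravel()            # update
    P = (np.eye(2) - K @ H) @ P
print("tracking index:", np.sqrt(q) * T**2 / sigma_v)
print("final position error:", x[0] - truth[0])
```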
In this tutorial, we will introduce deep-learning-based fusion architectures: early, late, and middle fusion. We will discuss how they are employed for different computer vision tasks: classification, detection, and segmentation. The talk covers the methods and principles behind these architectures in two popular case studies: autonomous ships and remote sensing. Attendees will leave with a good sense of how deep learning can be used for multispectral, multiresolution, and multisensor data fusion.
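As a schematic of the three architectures (with random, untrained weights purely for illustration; no deep learning framework is assumed, and all names are ours), the sketch below contrasts early fusion (feature concatenation), late fusion (decision combination), and middle fusion (merging in a shared embedding).

```python
import numpy as np

rng = np.random.default_rng(3)
rgb_feat = rng.standard_normal(64)     # features from an RGB branch
ir_feat = rng.standard_normal(32)      # features from an infrared branch

# Early fusion: concatenate low-level features, then one classifier.
W_early = 0.1 * rng.standard_normal((10, 64 + 32))
logits_early = W_early @ np.concatenate([rgb_feat, ir_feat])

# Late fusion: one classifier per modality, combine the decisions.
W_rgb = 0.1 * rng.standard_normal((10, 64))
W_ir = 0.1 * rng.standard_normal((10, 32))
logits_late = 0.5 * (W_rgb @ rgb_feat) + 0.5 * (W_ir @ ir_feat)

# Middle fusion: project each modality into a shared embedding,
# merge there, then classify.
P_rgb = 0.1 * rng.standard_normal((16, 64))
P_ir = 0.1 * rng.standard_normal((16, 32))
shared = np.tanh(P_rgb @ rgb_feat) + np.tanh(P_ir @ ir_feat)
W_mid = 0.1 * rng.standard_normal((10, 16))
logits_mid = W_mid @ shared
print(logits_early.argmax(), logits_late.argmax(), logits_mid.argmax())
```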
The objectives of this tutorial are as follows: to gain a working knowledge of how probability vectors relate to Dempster-Shafer belief functions, of how to apply important pre-fusion techniques to improve fusion results, and of how to build a tensor fusion AI system leveraging Dempster-Shafer Theory. To accomplish these objectives, this tutorial covers essential concepts in Dempster-Shafer Theory, transformations between probability and belief space, pre-fusion processing techniques, tensor data fusion techniques, and post-fusion processing. The tutorial includes a functional MATLAB codebase that will be distributed to attendees, allowing them to see the techniques in action as they follow along.
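Since the distributed codebase is in MATLAB, here is a flavor of two of the covered concepts in Python instead (our own minimal sketch, not the tutorial's code): Dempster's rule of combination and the pignistic transformation from belief space back to a probability vector.

```python
def dempster_combine(m1, m2):
    """Dempster's rule for mass functions keyed by frozenset focal elements."""
    combined, conflict = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B
            if C:
                combined[C] = combined.get(C, 0.0) + a * b
            else:
                conflict += a * b                # mass on the empty set
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

def pignistic(m):
    """Pignistic transform: spread each focal element's mass evenly
    over its singletons to obtain a probability vector."""
    p = {}
    for A, v in m.items():
        for w in A:
            p[w] = p.get(w, 0.0) + v / len(A)
    return p

# Two sensors reporting on the frame {friend, hostile}
m1 = {frozenset({"friend"}): 0.6, frozenset({"friend", "hostile"}): 0.4}
m2 = {frozenset({"hostile"}): 0.3, frozenset({"friend", "hostile"}): 0.7}
fused = dempster_combine(m1, m2)
print(fused)
print(pignistic(fused))   # back to a probability vector
```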
Knowledge of a dynamic system model is crucial for many tracking, state estimation, signal processing, and fault detection problems. An incorrect description of the stochastic part of the model, i.e., the noise statistics, may significantly degrade tracking, estimation, signal processing, or detection performance, or even cause the underlying algorithms to fail. The tutorial surveys more than six decades of history, recent advances, and state-of-the-art methods for estimating the noise properties, including implementation issues.
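As a taste of the correlation-method family of such estimators, the sketch below (a simplified, self-contained illustration; the methods covered in the tutorial are far more general) identifies both noise variances of a scalar random walk from the autocovariance of differenced measurements.

```python
import numpy as np

# Correlation-based noise identification for a scalar random walk:
#   x_k = x_{k-1} + w_k,  z_k = x_k + v_k,  w ~ N(0,q), v ~ N(0,r).
# The differences d_k = z_k - z_{k-1} = w_k + v_k - v_{k-1} satisfy
# Var(d) = q + 2r and Cov(d_k, d_{k-1}) = -r, giving method-of-moments
# estimates of both noise variances.
rng = np.random.default_rng(4)
q_true, r_true, N = 0.2, 1.5, 20000
w = rng.normal(0, np.sqrt(q_true), N)
v = rng.normal(0, np.sqrt(r_true), N)
z = np.cumsum(w) + v                  # measurements of the random walk

d = np.diff(z)
c0 = np.mean(d * d)                   # lag-0 autocovariance of d
c1 = np.mean(d[1:] * d[:-1])          # lag-1 autocovariance of d
r_hat = -c1
q_hat = c0 - 2 * r_hat
print(f"q_hat={q_hat:.3f} (true {q_true}), r_hat={r_hat:.3f} (true {r_true})")
```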
Quantum algorithms for data fusion may become game changers as soon as quantum processing kernels embedded in hybrid processing architectures with classical processors become available. While emerging quantum technologies such as quantum computers directly apply quantum physics, quantum algorithms do not exploit quantum physical phenomena as such; rather, they use the sophisticated framework of quantum physics to deal with “uncertainty”. Both approaches may be used, and potentially combined, to find new solutions to problems in data fusion.