Knowledge Acquisition
Creating Awareness with Sensor Fusion
Brainteaser: Given the following sequence, “ABCDFGHJ,” predict the next character. Are memories of college aptitude tests flooding back? Perhaps a MENSA entrance exam? It might help to know that the characters are letters, in sequence, from the English alphabet.

It might make more sense if I told you I recently spilled a diet soda all over my keyboard and the third row of keys no longer works. So your guess is “K”? Well, actually, the next letter in the sequence is “L.” The K key got fouled up with mustard a while back. Setting aside my bad habit of snacking over my computer, the salient point is how additional relevant information can be extremely valuable when drawing conclusions and making predictions.
“More is better” has definitely been a major battle cry of the data acquisition community over the past two decades. The infusion of the personal computer into the laboratory has facilitated the collection of untold terabytes of values that were previously collected only by the handful, by technicians uniformed in black-rimmed safety glasses and lab coats. The appearance of magnetic and optical storage media has sent the clipboard to the buggy-whip club and has shifted technicians from the task of data recording to data management and analysis. Individual laboratory notebooks are being supplanted by networked laboratory information management systems (LIMS) that benefit from information technology (IT) advances in equipment and software designed for business enterprise resource planning (ERP) systems.
Handling large amounts of data is not a new requirement; it has been a subject of study by statisticians for hundreds of years. While chemical analysis was only beginning to emerge from beneath the shroud of alchemy, sovereigns maintained databases of subjects to levy the correct amount of taxation and administer the defense of their realms. Around the beginning of the last century, William S. Gosset, a statistician working for the Guinness brewery in Ireland, developed a table of probability distributions used to compare the average of a small number of measurements to the mean and standard deviation of a much larger collection. Over time, the brewing company had amassed a formidable archive of analysis values attesting to the quality of its product, and Gosset’s tables determined whether the current batches differed from the historical best while allowing for random fluctuations in measurement error. Realizing the academic significance of his development, Gosset sought to publish in the open literature; however, his employer required that he publish under the pen name “Student.” Student’s t-tests are still used today to determine whether a sample differs from a known standard, whether two samples differ from each other, or whether two sampling methods differ. These tests can be used for quality control, to prove the innocence of an accused criminal, or to certify an alternative analysis method.
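As a concrete illustration of the first of those uses (comparing a small sample against a known standard), here is a minimal sketch using SciPy’s one-sample t-test. The batch values and the historical mean are hypothetical, chosen only to show the mechanics.

```python
# A minimal sketch (illustrative numbers): a one-sample Student's t-test
# comparing a handful of measurements against a long-run historical mean,
# in the spirit of Gosset's original quality-control problem.
from scipy import stats

historical_mean = 4.2                       # long-run target value (hypothetical)
batch = [4.18, 4.25, 4.11, 4.30, 4.16]      # current batch measurements (hypothetical)

t_stat, p_value = stats.ttest_1samp(batch, popmean=historical_mean)

print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("The batch differs from the historical mean beyond random measurement error.")
else:
    print("No evidence the batch differs from the historical mean.")
```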
The myriad instrumental analysis techniques presently available for quality assurance and quality control can be used to differentiate between acceptable and unacceptable product, unfortunately often after the fact. What is needed is the ability to incorporate these many measurements into the processes that generated them and steer production toward the optimum product, rather than grading the output pass/fail. Much like the progression of a calculus course, the task of data acquisition and analysis is evolving from differentiation to integration.
At first, integrating additional values acquired from various sensors and instruments may appear trivial: simply include the new values along with the others and generate an updated average. The difficulty is that different sensors may have different resolutions and accuracies; they most often produce data at different rates and in unique formats, and their values may be redundant with, or even contradictory to, those of other sensors. The process of integrating these varied values into a valid estimate of reality with higher quality and reliability than that provided by any individual value is known as data or sensor fusion. Humans integrate measurements from their senses of sight, hearing, smell, taste, and touch to build an awareness of their current environment; at a higher level, image and speech processing provide extremely high-bandwidth communication channels. Sensor fusion seeks to mimic these abilities at the machine level to make our factories and tools more “aware” of their own performance.
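In its simplest form, fusing several readings of the same quantity amounts to weighting each reading by the inverse of its variance, so that the more precise sensors count for more. The sketch below illustrates that idea with hypothetical temperature readings; a real fusion system must also cope with differing rates, formats, and outright disagreement between sensors.

```python
# Minimal sketch: fuse readings of the same quantity from sensors with
# different accuracies by weighting each reading by the inverse of its
# variance.  The fused estimate has lower variance than any single sensor.
# (Sensor values and uncertainties below are hypothetical.)

def fuse(readings):
    """readings: list of (value, standard_deviation) pairs."""
    weights = [1.0 / (sigma ** 2) for _, sigma in readings]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, readings)) / total
    sigma = (1.0 / total) ** 0.5            # standard deviation of the fused estimate
    return value, sigma

# Three instruments measuring the same temperature (deg C) with different precision.
sensors = [(98.6, 0.5), (99.1, 0.2), (98.9, 1.0)]
value, sigma = fuse(sensors)
print(f"fused estimate: {value:.2f} +/- {sigma:.2f}")
```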
In 1960, Rudolf E. Kalman published one of the most widely used algorithms for sensor fusion, known today as the Kalman filter. It uses a set of mathematical equations to model the current state of a stochastic system’s parameters in terms of their current values and standard deviations. The algorithm predicts future values of the state parameters by extrapolating the model, and the predictions are compared with additional actual measurements. Weighted by the known accuracy and precision of the respective sensors, the predicted parameter values are then adjusted to reflect the state values actually measured. In this fashion, the Kalman filter iteratively predicts its model parameters and corrects them using the latest available (but noisy) sensor readings to reach an optimal estimate of the actual values. As originally proposed, the Kalman filter uses linear models to predict the behavior of linear systems; the extended Kalman filter (EKF) was later developed to handle nonlinear systems.
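The predict/correct cycle is easiest to see in one dimension. The sketch below is a minimal scalar Kalman filter with a constant-value process model; the noise levels and measurements are made up for illustration, and a real application would use the full matrix form.

```python
# A minimal sketch of a one-dimensional Kalman filter tracking a slowly
# drifting process value from noisy sensor readings.  The process model,
# noise levels, and measurements below are hypothetical.

def kalman_step(x, p, z, q=0.01, r=0.25):
    """One predict/correct cycle.
    x, p : current state estimate and its variance
    z    : new (noisy) sensor reading
    q, r : process-noise and measurement-noise variances
    """
    # Predict: extrapolate the model (here a constant-value model) forward.
    x_pred = x
    p_pred = p + q
    # Correct: blend prediction and measurement, weighted by their variances.
    k = p_pred / (p_pred + r)               # Kalman gain
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0                             # initial guess and its variance
for z in [0.39, 0.50, 0.48, 0.29, 0.25, 0.32, 0.34, 0.48, 0.41, 0.45]:
    x, p = kalman_step(x, p, z)
    print(f"measurement {z:5.2f} -> estimate {x:5.2f} (variance {p:.3f})")
```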
In addition to predicting and correcting model values along discrete values of time, the Kalman filter can be used to incorporate data from multiple sensors obtained at any single state in time. As additional values are fed to the Kalman filter model, its estimation of the current state is optimized based on all available sensor data, much like the input from our five senses. The robotics industry has made great use of the Kalman filter in its sensor fusion applications for unmanned combat air vehicles (UCAVs). Fusing sensor readings from velocimeters, accelerometers, pressure sensors and global positioning system (GPS) transponders, permits the UCAV to fly autonomously while the remote-control pilot concentrates on navigation and mission objectives.
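One simple way to fold several simultaneous sensors into the estimate is to apply the correction step once per sensor, each reading weighted by its own noise variance. The sketch below, with hypothetical readings, fuses three simultaneous measurements of the same state into a single estimate.

```python
# Minimal sketch: fusing readings taken by several sensors at the same
# instant.  Each correction step blends the running estimate with one
# sensor's reading, weighted by their respective variances.
# (Readings and noise figures are hypothetical.)

def correct(x, p, z, r):
    """Blend estimate (x, p) with reading z whose noise variance is r."""
    k = p / (p + r)                         # gain: how much to trust the new reading
    return x + k * (z - x), (1.0 - k) * p

x, p = 0.40, 0.10                           # prior estimate of the state and its variance
# (reading, noise variance) for three different instruments at one instant
for z, r in [(0.45, 0.25), (0.38, 0.04), (0.52, 1.00)]:
    x, p = correct(x, p, z, r)

print(f"fused estimate at this instant: {x:.3f} (variance {p:.4f})")
```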
Research into the development of additional sensor fusion algorithms is very active and generally draws on the tools of the optimization community. Fuzzy logic, neural networks, genetic algorithms, and gradient searches are all finding utility in sensor fusion. Manufacturers are investigating the application of the technique to intelligent process control (IPC): the ability to dynamically adjust the control parameters required to produce a defect-free product or operate a machine at near-optimal efficiency. The defense industry is developing it for multi-spectral imaging of battlefields, while the medical industry seeks to fuse images obtained by x-ray, MRI, and ultrasound instruments in order to make more accurate diagnoses.
Researchers are also working toward “virtual sensors” that permit a value to be interpolated at any X-Y-Z spatial position by fusing the data obtained from a network of distributed sensors (a minimal sketch appears below). For example, a team of mobile robots exploring the ocean floor, the surface of a distant planet, or a laboratory soccer pitch can communicate their individual views of the surroundings to their teammates. Each robot can fuse the views obtained from different positions and angles to generate an image of what lies behind an obstacle currently obscuring its own view. All of these advances are made practical by the development of sensor technology, data distribution networks, processing power, and algorithm research over the past two decades. Just as the personal computer created entirely new economies in the mid-1980s, data, information, and knowledge technologies are poised to create even more.
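To make the “virtual sensor” idea concrete, the sketch below interpolates a value at an arbitrary (x, y, z) position by inverse-distance weighting of readings from a small, hypothetical network of fixed sensors; production systems would use richer spatial models such as kriging.

```python
# Minimal sketch of a "virtual sensor": estimate the value at an arbitrary
# (x, y, z) position by inverse-distance weighting of readings from a few
# distributed physical sensors.  Sensor positions and readings are hypothetical.
import math

# (x, y, z, reading) for a small network of fixed sensors
network = [
    (0.0, 0.0, 0.0, 21.3),
    (5.0, 0.0, 0.0, 22.8),
    (0.0, 5.0, 2.0, 20.9),
    (5.0, 5.0, 2.0, 23.5),
]

def virtual_sensor(x, y, z, power=2.0):
    """Interpolate a reading at (x, y, z) from the surrounding network."""
    num = den = 0.0
    for sx, sy, sz, value in network:
        d = math.dist((x, y, z), (sx, sy, sz))
        if d == 0.0:
            return value                    # query point sits on a real sensor
        w = 1.0 / d ** power                # closer sensors get larger weights
        num += w * value
        den += w
    return num / den

print(f"estimated value at (2.5, 2.5, 1.0): {virtual_sensor(2.5, 2.5, 1.0):.2f}")
```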
This material originally appeared as a Contributed Editorial in Scientific Computing and Instrumentation 20:10 September 2003, pp. 16–17.
William L. Weaver is an Associate Professor in the Department of Integrated Science, Business, and Technology at La Salle University in Philadelphia, PA USA. He holds a B.S. degree with double majors in Chemistry and Physics and earned his Ph.D. in Analytical Chemistry with expertise in ultrafast laser spectroscopy. He teaches, writes, and speaks on the application of Systems Thinking to the development of new products and innovation.

