Machine Learning and Friends Lunch






Information-driven Inference in Resource-constrained Environments


John Fisher
MIT

Abstract

Inference in resource-constrained environments is a challenging problem for which optimal solutions are generally intractable. For example, Bayesian filtering in distributed sensor networks presents a fundamental trade-off between the value of information contained in a distributed set of measurements and the resources expended to acquire them, fuse them into a model of uncertainty, and then transmit the resulting model. Approximate approaches have been proposed that treat a subset of these issues; however, these approaches are indirect and usually consider at most one or two future time steps. I will discuss a method that enables long time-horizon sensor planning in the context of object tracking with a distributed sensor network. The approach integrates the value of information discounted by resource expenditures over a rolling time horizon. Simulation results demonstrate that the resulting algorithm can provide estimation performance similar to that of the common "most informative sensor selection" method for a fraction of the resource expenditures.
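To make the rolling-horizon idea concrete, below is a minimal sketch (not the speaker's algorithm) of selecting sensors by maximizing information gain discounted by resource cost over a short planning horizon, for a linear-Gaussian tracking model. The dynamics, the sensor noise levels and costs, the trade-off weight LAMBDA, and the horizon length are illustrative assumptions, not values from the talk.

# Sketch: rolling-horizon, information-minus-cost sensor selection for tracking.
# All model parameters below are assumptions chosen for illustration only.
import itertools
import numpy as np

A = np.array([[1.0, 1.0], [0.0, 1.0]])   # constant-velocity dynamics
Q = 0.05 * np.eye(2)                     # process noise covariance
SENSORS = [                              # (measurement matrix, noise var, cost)
    (np.array([[1.0, 0.0]]), 0.2, 5.0),  # accurate but expensive sensor
    (np.array([[1.0, 0.0]]), 1.0, 1.0),  # cheaper, noisier sensor
]
LAMBDA = 0.05                            # weight converting cost into "nats"
HORIZON = 3                              # planning horizon (time steps)

def predict(P):
    return A @ P @ A.T + Q

def update(P, H, r):
    S = H @ P @ H.T + r
    K = P @ H.T @ np.linalg.inv(S)
    return (np.eye(2) - K @ H) @ P

def info_gain(P_prior, P_post):
    # Mutual information between state and measurement (Gaussian case).
    return 0.5 * np.log(np.linalg.det(P_prior) / np.linalg.det(P_post))

def plan_first_action(P, horizon=HORIZON):
    # In a linear-Gaussian model the covariance evolves independently of the
    # measurement values, so every sensor sequence can be scored
    # deterministically; only the first choice is executed (rolling horizon).
    best_seq, best_score = None, -np.inf
    for seq in itertools.product(range(len(SENSORS)), repeat=horizon):
        Pk, score = P.copy(), 0.0
        for s in seq:
            H, r, cost = SENSORS[s]
            P_prior = predict(Pk)
            Pk = update(P_prior, H, r)
            score += info_gain(P_prior, Pk) - LAMBDA * cost
        if score > best_score:
            best_seq, best_score = seq, score
    return best_seq[0]

# One step of the loop: plan, then filter with the chosen sensor.
P = np.eye(2)
choice = plan_first_action(P)
H, r, cost = SENSORS[choice]
P = update(predict(P), H, r)
print(f"selected sensor {choice}, posterior covariance trace {np.trace(P):.3f}")

Setting LAMBDA to zero recovers purely myopic "most informative sensor" behavior when the horizon is one; increasing it trades estimation quality for reduced resource expenditure, which is the trade-off the abstract describes.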


Additionally, I will present performance guarantees that bound the difference in performance between optimal and approximate methods of measurement selection for information-driven Bayesian filtering. The structure of these bounds applies to a variety of machine learning problems, including active learning and inference in graphical models. Furthermore, it can be shown that the bounds are tight.


This is joint work with Jason Williams.
