We show that both the PHD and CPHD filters fit in the context of assumed density filtering and implicitly perform Kullback-Leibler divergence (KLD) minimizations after the prediction and update steps.
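As a brief sketch of the underlying idea (the specific families are our assumption, not quoted from the snippet): in assumed density filtering, the exact predicted or updated density is projected back onto an assumed family by minimizing the Kullback-Leibler divergence,
\[
q^{*} \;=\; \arg\min_{q \in \mathcal{Q}} \, \mathrm{KL}(p \,\|\, q),
\qquad
\mathrm{KL}(p \,\|\, q) \;=\; \int p(x) \log \frac{p(x)}{q(x)} \, dx ,
\]
where \(p\) is the exact density and \(\mathcal{Q}\) is the assumed family; for the PHD filter this family is typically the Poisson point processes, and for the CPHD filter the i.i.d. cluster processes.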
Abstract: We propose a new geometric regularization principle for reconstructing vector fields based on prior knowledge about their divergence. As one important example of this general idea, we focus ...
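A minimal sketch of how prior knowledge about the divergence might enter such a reconstruction functional (an illustrative assumption, not the paper's actual formulation):
\[
\hat{\mathbf{v}} \;=\; \arg\min_{\mathbf{v}} \;
\sum_{i} \bigl\| \mathbf{v}(x_i) - \mathbf{y}_i \bigr\|^2
\;+\; \lambda \int_{\Omega} \bigl( \nabla \cdot \mathbf{v}(x) - d(x) \bigr)^2 \, dx ,
\]
where \(\mathbf{y}_i\) are noisy samples of the field, \(d\) encodes the assumed prior on the divergence (e.g. \(d \equiv 0\) for a divergence-free field), and \(\lambda > 0\) controls the strength of the regularization.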