Before we consider new applications of dependency networks, we review
related work on the basic concepts. As we have already mentioned,
several researchers who developed Markov networks began with an
examination of what we call consistent dependency networks. For an
excellent discussion of this development as well as original
contributions in this area, see Besag (1974). Besag
(1975) also described an approach called
pseudo-likelihood estimation, in which the conditionals are learned
directly--as in our approach--without respecting the consistency
constraints. We use the name pseudo-Gibbs sampling to make a
connection to his work.
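In pseudo-likelihood estimation, the joint likelihood is replaced by a
product of full conditionals, one term per variable per case. In its
standard form (notation assumed here, with $\mathbf{x}_{-i}$ denoting
all variables other than $X_i$, and with $N$ cases over $n$ variables):

\begin{displaymath}
\log \mathrm{PL}(\theta) = \sum_{j=1}^{N} \sum_{i=1}^{n}
\log p\left(x_i^{(j)} \mid \mathbf{x}_{-i}^{(j)}, \theta\right).
\end{displaymath}

Maximizing this criterion fits each conditional directly from data,
with no guarantee that the learned conditionals are consistent with any
single joint distribution. Pseudo-Gibbs sampling then applies the
ordinary ordered Gibbs update to these independently learned
conditionals. The following minimal Python sketch illustrates the idea;
the sampling interface is a hypothetical stand-in for whatever form the
learned local distributions take:

    def pseudo_gibbs(sample_conditional, x, num_sweeps):
        # Minimal sketch (assumed interface): sample_conditional[i](x)
        # draws a value for variable i from the learned conditional
        # p(x_i | x_-i). Because the conditionals were estimated
        # independently, they need not cohere with any single joint
        # distribution; hence the name "pseudo"-Gibbs.
        for _ in range(num_sweeps):
            for i, draw in enumerate(sample_conditional):
                x[i] = draw(x)  # resample variable i given the rest
        return x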
Tresp and Hofmann (1998) described (general) dependency networks,
calling them Markov blanket networks. They stated and proved
Theorem 3, and evaluated
the predictive accuracy of the representation on several data sets
using local distributions consisting of conditional Parzen windows.
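For concreteness, a conditional Parzen window estimate can be formed as
a ratio of kernel density estimates; the following is one standard
construction (an assumption on our part, not necessarily the exact
estimator used by Tresp and Hofmann):

\begin{displaymath}
\hat{p}(x_i \mid \mathbf{x}_{-i}) =
\frac{\sum_{j=1}^{N} K_h\left(x_i - x_i^{(j)}\right)
      K_h\left(\mathbf{x}_{-i} - \mathbf{x}_{-i}^{(j)}\right)}
     {\sum_{j=1}^{N} K_h\left(\mathbf{x}_{-i} - \mathbf{x}_{-i}^{(j)}\right)},
\end{displaymath}

where $K_h$ is a kernel (for example, Gaussian) with bandwidth $h$ and
the sums run over the $N$ training cases.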