Motivation
N-gram models and HMM tagging only allowed us to model language in terms of the linear order of words.
However, even simple sentences require a nonlinear model that reflects the hierarchical structure of sentences rather than the linear order of words.
Probabilistic Context-Free Grammars (PCFGs) are the simplest and most natural probabilistic model for tree structures, and the algorithms for them are closely related to those for HMMs.
Note, however, that there are other ways of building probabilistic models of syntactic structure (see Chapter 12).
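To make the idea concrete, here is a minimal sketch of a PCFG, assuming a hypothetical toy grammar with made-up rule probabilities. Each rule rewrites a nonterminal with some probability, and the probability of a parse tree is simply the product of the probabilities of the rules used to build it:

```python
from functools import reduce

# Hypothetical rule probabilities P(rhs | lhs), for illustration only.
PCFG = {
    ("S", ("NP", "VP")): 1.0,
    ("NP", ("astronomers",)): 0.4,
    ("NP", ("stars",)): 0.6,
    ("VP", ("V", "NP")): 1.0,
    ("V", ("saw",)): 1.0,
}

# A parse tree as nested tuples: (label, child, child, ...).
tree = ("S",
        ("NP", "astronomers"),
        ("VP", ("V", "saw"), ("NP", "stars")))

def tree_prob(t):
    """Multiply the probabilities of all rules used in tree t."""
    if isinstance(t, str):
        return 1.0  # a terminal word contributes no rule
    label, *children = t
    rhs = tuple(c if isinstance(c, str) else c[0] for c in children)
    return PCFG[(label, rhs)] * reduce(lambda p, c: p * tree_prob(c), children, 1.0)

print(tree_prob(tree))  # product of the rule probabilities: 1.0 * 0.4 * 1.0 * 1.0 * 0.6
```

Just as an HMM assigns a probability to a sequence by multiplying transition and emission probabilities along a path, a PCFG assigns a probability to a sentence by multiplying rule probabilities down a tree; this parallel is what makes the algorithms for the two models so closely related.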