A Markov Chain (MC) is an important model used in speech recognition as well as POS tagging. POS tagging is not difficult for a human who understands English, but our goal is to let a machine do it.

We can treat each POS tag as a state and use the Markov chain to model the transition probability from one POS tag to another within a sentence. In the simple (made-up) example below there are just 2 states (tags): when the current word is a verb, there is a 90% chance that the next word is a noun and a 10% chance that the next word is another verb; when the current word is a noun, there is a 50% chance that the next word is a verb and a 50% chance that the next word is another noun.

In the original example "I go home", the word "go" is a verb, so the above MC would predict that the word after "go" has a 90% chance of being a noun, which correctly predicts the POS tag of "home", since "home" appears after "go". In a general MC graph, there can be any number of states, and the transition probabilities are fully described between every possible pair of states (an impossible transition has a transition probability of 0).
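As a minimal sketch, the two-state chain above can be written as a transition table, where each row holds P(next tag | current tag). The probabilities are the made-up numbers from the example; the function and variable names here are illustrative, not from any standard library.

```python
# Transition probabilities P(next tag | current tag) from the made-up example:
# verb -> noun 0.9, verb -> verb 0.1, noun -> verb 0.5, noun -> noun 0.5.
transitions = {
    "verb": {"noun": 0.9, "verb": 0.1},
    "noun": {"verb": 0.5, "noun": 0.5},
}

def predict_next_tag(current_tag):
    """Return the most likely next POS tag under the Markov chain."""
    probs = transitions[current_tag]
    return max(probs, key=probs.get)

# In "I go home", "go" is tagged as a verb, so the chain predicts the
# next word is most likely a noun -- matching the true tag of "home".
print(predict_next_tag("verb"))  # noun
```

Each row of the table sums to 1, as required for a valid transition distribution; a transition that is impossible would simply get probability 0.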