Revision history of "Tokenizer"

From WebLichtWiki

  • 16:16, 7 March 2012 Kbeck (Talk | contribs) (833 bytes) (Created page with "A token is an individual unit within a sentence. Tokens are single words, numbers, punctuation marks, etc. Extracting words and sentences are fundamental operations that are r...")