Article

IRS for Computer Character Sequences Filtration: a new software tool and algorithm to support the IRS at tokenization process

International Journal of Advanced Computer Science and Applications (IJACSA), 2013.

Abstract

Tokenization is the task of chopping a character sequence up into pieces, called tokens, perhaps at the same time throwing away certain characters, such as punctuation. A token is an instance of a sequence of characters in some particular document that are grouped together as a useful semantic unit for processing. A new software tool and algorithm to support the IRS (information retrieval system) in the tokenization process are presented. The proposed tool filters out the following computer character sequences: IP addresses, Web URLs, dates, and email addresses. The tool uses pattern matching algorithms and filtration methods. After this process, the IRS can start a new tokenization process on the retrieved text, which will be free of these sequences.
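As a rough illustration of the filtration step described in the abstract, the sketch below uses regular-expression pattern matching to strip IP addresses, Web URLs, dates, and email addresses from text before tokenization. The patterns, function names, and sample input are illustrative assumptions, not the authors' implementation.

```python
import re

# Illustrative patterns (assumptions, not the paper's exact expressions) for the
# character sequences the tool is described as filtering out.
PATTERNS = {
    "ip_address": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "url": re.compile(r"\bhttps?://\S+|\bwww\.\S+", re.IGNORECASE),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "date": re.compile(r"\b\d{1,2}[/-]\d{1,2}[/-]\d{2,4}\b"),
}

def filter_sequences(text: str) -> str:
    """Remove matched sequences so the IRS can tokenize the remaining text."""
    for pattern in PATTERNS.values():
        text = pattern.sub(" ", text)
    return text

def tokenize(text: str) -> list[str]:
    """Simple word tokenization applied to the filtered text."""
    return re.findall(r"\w+", filter_sequences(text))

if __name__ == "__main__":
    sample = ("Contact admin@example.com or visit http://example.com "
              "from 192.168.0.1 on 12/05/2013 for details.")
    print(tokenize(sample))
    # -> ['Contact', 'or', 'visit', 'from', 'on', 'for', 'details']
```

In this sketch the filtration happens as a pre-processing pass, so the subsequent tokenization never sees the removed sequences, matching the two-stage flow the abstract describes.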
