Thursday, December 3, 2015

Python tokenizer unit testing: inserting one token into a generated token list

I implemented a Python tokenizer that extracts tokens from a text file. Each token corresponds to a string matching a pattern (regular expression) I defined for that token. I use the lexer functionality of the Python package ply to implement the tokenizer. After scanning the text file, all found tokens are returned as an iterator. For unit testing, I would like to insert additional tokens at defined positions in the returned token stream to verify that such a bad-case situation is handled correctly.
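For context, a minimal sketch of the tokenizer; the token names NUMBER and WORD and the tokenize helper are placeholders for illustration, not my real grammar:

    import ply.lex as lex

    # Placeholder token names for illustration
    tokens = ('NUMBER', 'WORD')

    t_WORD = r'[A-Za-z]+'

    def t_NUMBER(t):
        r'\d+'
        t.value = int(t.value)
        return t

    t_ignore = ' \t\n'

    def t_error(t):
        t.lexer.skip(1)

    def tokenize(text):
        """Build the lexer and return an iterator over the scanned tokens."""
        lexer = lex.lex()
        lexer.input(text)
        return iter(lexer)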

How can I add an additional token in this situation (using the Python ply.lex module)? Is there a more common unit testing pattern for this situation?
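One approach I am considering is to wrap the returned iterator and splice in a hand-built ply.lex.LexToken at a given index; the helper name inject_token and the example token are made up for illustration:

    import ply.lex as lex

    def inject_token(token_iter, index, tok_type, tok_value):
        """Re-yield tokens, inserting one synthetic token before position index."""
        for i, tok in enumerate(token_iter):
            if i == index:
                extra = lex.LexToken()
                extra.type = tok_type
                extra.value = tok_value
                # Reuse the neighbouring token's position so downstream
                # error reporting stays plausible.
                extra.lineno = tok.lineno
                extra.lexpos = tok.lexpos
                yield extra
            yield tok

    # Example: insert an unexpected WORD token at position 1
    # bad_stream = inject_token(tokenize("12 34"), 1, 'WORD', 'oops')

I am unsure whether wrapping the iterator like this is idiomatic, or whether the usual pattern is to feed the consumer a fake lexer object instead.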
