- `nth_char(0)`
- `next()`
- `cursor.first()`
Also optimize the iterator returned by `tokenize()`. This improves lexer performance by 35%.
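One common way to make `tokenize()` cheaper is to return a lazy iterator instead of eagerly collecting tokens into a `Vec`, so callers only pay for the tokens they consume. The sketch below is a minimal, hypothetical illustration of that pattern using `std::iter::from_fn`; the `Token` type and the one-character "lexing" step are placeholders, not the actual lexer's logic.

```rust
// Hypothetical sketch: a lazy `tokenize` built on std::iter::from_fn.
// `Token` and the per-token consumption rule are placeholders.
#[derive(Debug, PartialEq)]
struct Token {
    len: usize, // byte length of the consumed text
}

/// Lazily yields one token per step; no allocation for the token stream itself.
fn tokenize(mut input: &str) -> impl Iterator<Item = Token> + '_ {
    std::iter::from_fn(move || {
        // Placeholder rule: each token consumes exactly one char.
        let first = input.chars().next()?;
        let len = first.len_utf8();
        input = &input[len..];
        Some(Token { len })
    })
}

fn main() {
    // The iterator is lazy: nothing is lexed until it is consumed.
    let count = tokenize("ab").count();
    assert_eq!(count, 2);
    println!("{count}");
}
```

Because the iterator borrows the input, tokens are produced on demand and the token stream never needs to be materialized in memory.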