lex: consolidate literal parsing into lexer #51

@coderkalyan

Description

Currently, literal parsing is handled in two stages: the lexer performs initial discrimination, while full sanitization and parsing (calculating the underlying value) is deferred to AST construction. This is wasteful, and it is buggy because not all invalid literal tokens are rejected by the lexer.

A better solution is to integrate the integer and float literal state machines into the lexer and have it append each calculated value to an ArrayList(u64) literal "tape". The parser can then pop literals off this tape whenever it encounters a corresponding .int_lit or .float_lit token. A minimal sketch of the idea follows.
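A rough sketch of what this could look like, assuming Zig's managed std.ArrayList API. The Lexer, Parser, and Token names are hypothetical; only the .int_lit/.float_lit tags and the ArrayList(u64) tape come from this issue, and float handling, base prefixes, and overflow checks are elided:

```zig
const std = @import("std");

// Hypothetical names for illustration; real token tags are richer.
const Tag = enum { int_lit, float_lit, eof };
const Token = struct { tag: Tag };

const Lexer = struct {
    source: []const u8,
    index: usize = 0,
    // The literal "tape": fully parsed values, appended in token order.
    literals: std.ArrayList(u64),

    fn next(self: *Lexer) !Token {
        while (self.index < self.source.len and self.source[self.index] == ' ')
            self.index += 1;
        if (self.index >= self.source.len) return Token{ .tag = .eof };

        // The integer state machine lives in the lexer: it validates the
        // literal and computes its value in a single pass, so invalid
        // literals never survive past tokenization.
        var value: u64 = 0;
        while (self.index < self.source.len and std.ascii.isDigit(self.source[self.index])) : (self.index += 1) {
            value = value * 10 + (self.source[self.index] - '0');
        }
        try self.literals.append(value);
        return Token{ .tag = .int_lit };
    }
};

const Parser = struct {
    lexer: *Lexer,
    literal_cursor: usize = 0,

    // On seeing an .int_lit token, pop the next value off the tape
    // instead of re-parsing the source text.
    fn takeIntLiteral(self: *Parser) u64 {
        const value = self.lexer.literals.items[self.literal_cursor];
        self.literal_cursor += 1;
        return value;
    }
};

test "literal tape round-trip" {
    var lexer = Lexer{
        .source = "42 7",
        .literals = std.ArrayList(u64).init(std.testing.allocator),
    };
    defer lexer.literals.deinit();
    _ = try lexer.next();
    _ = try lexer.next();

    var parser = Parser{ .lexer = &lexer };
    try std.testing.expectEqual(@as(u64, 42), parser.takeIntLiteral());
    try std.testing.expectEqual(@as(u64, 7), parser.takeIntLiteral());
}
```

Since the lexer emits literal tokens in source order and the parser consumes them in the same order, a simple monotonic cursor into the tape is enough; no per-token index needs to be stored.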
