COM 212 Introduction to System Programming (Theory)



Lexical analysis
In computer science, lexical analysis is the process of converting a sequence of characters into a sequence of tokens. Programs performing lexical analysis are called lexical analyzers or lexers. A lexer is often organized as separate scanner and tokenizer functions, though the boundary between the two may not be clearly defined.

Lexical grammar
The specification of a programming language includes a set of rules, often expressed as patterns (for example, regular expressions), specifying the set of possible character sequences that can form a token or lexeme. Whitespace characters are often ignored during lexical analysis.
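For instance, an identifier might be defined by the rule "a letter followed by letters or digits", and an integer literal by "one or more digits". The following C snippet is a minimal sketch of how two such rules, plus whitespace skipping, translate into scanning code; the function names and the sample input are illustrative, not taken from any particular lexer:

    #include <ctype.h>
    #include <stdio.h>

    /* Advance past whitespace, which the lexical grammar treats as meaningless. */
    static const char *skip_whitespace(const char *p) {
        while (isspace((unsigned char)*p))
            p++;
        return p;
    }

    /* Match the rule: identifier = letter (letter | digit)* */
    static const char *match_identifier(const char *p) {
        if (!isalpha((unsigned char)*p))
            return NULL;                  /* the rule does not apply here */
        while (isalnum((unsigned char)*p))
            p++;
        return p;                         /* first character after the lexeme */
    }

    int main(void) {
        const char *input = "   sum = 3 + 2 ;";
        const char *start = skip_whitespace(input);
        const char *end = match_identifier(start);
        if (end != NULL)
            printf("identifier lexeme: \"%.*s\"\n", (int)(end - start), start);
        return 0;
    }

Run on the sample input, this prints identifier lexeme: "sum", since the identifier rule matches the three characters after the leading whitespace.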
Tokens
A token is a categorized block of text. The block of text corresponding to the token is known as a lexeme. A lexical analyzer processes lexemes and categorizes them according to function, giving them meaning. This assignment of meaning is known as tokenization. A token can look like anything; it just needs to be a useful part of the structured text. Consider this expression in the C programming language:

    sum=3+2;
Tokenized, it yields the lexeme/token pairs in the following table:

    Lexeme    Token type
    sum       Identifier
    =         Assignment operator
    3         Integer literal
    +         Addition operator
    2         Integer literal
    ;         End of statement
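A complete lexer for this fragment can be sketched in a few dozen lines of C. This is an illustrative toy rather than any real compiler's analyzer; the scanner role (looking at characters) and tokenizer role (grouping them into lexemes) described earlier are marked in the comments, and the token names follow the table above:

    #include <ctype.h>
    #include <stdio.h>

    /* Print one lexeme together with its token category. */
    static void emit(const char *start, int len, const char *kind) {
        printf("%-6.*s %s\n", len, start, kind);
    }

    int main(void) {
        const char *p = "sum=3+2;";

        while (*p != '\0') {
            /* Scanner role: inspect the next character to pick a rule. */
            if (isspace((unsigned char)*p)) {          /* whitespace: skip */
                p++;
            } else if (isalpha((unsigned char)*p)) {
                /* Tokenizer role: group letters/digits into one identifier. */
                const char *start = p;
                while (isalnum((unsigned char)*p))
                    p++;
                emit(start, (int)(p - start), "Identifier");
            } else if (isdigit((unsigned char)*p)) {
                /* Group consecutive digits into one integer literal. */
                const char *start = p;
                while (isdigit((unsigned char)*p))
                    p++;
                emit(start, (int)(p - start), "Integer literal");
            } else if (*p == '=') {
                emit(p++, 1, "Assignment operator");
            } else if (*p == '+') {
                emit(p++, 1, "Addition operator");
            } else if (*p == ';') {
                emit(p++, 1, "End of statement");
            } else {
                fprintf(stderr, "unexpected character '%c'\n", *p);
                return 1;
            }
        }
        return 0;
    }

Running the program prints one line per token, reproducing the table above in order: sum, =, 3, +, 2, ; with their categories.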
