lexical analysis

lexical analysis
The conversion of a stream of characters into a stream of meaningful tokens, normally to simplify parsing.

While it's often not difficult to identify tokens while parsing, having a separate stage for lexical analysis simplifies the structure of your compiler.
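As a minimal sketch of the idea, the lexer below converts a character stream into (kind, lexeme) tokens using regular expressions. The token kinds and the tiny expression grammar are illustrative assumptions, not taken from any particular compiler.

```python
import re

# Token specification: each pattern is tagged with a token kind.
# These kinds (NUMBER, IDENT, OP) are hypothetical, for illustration only.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]

# Combine the patterns into one regex with named groups.
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Convert a stream of characters into a stream of (kind, lexeme) tokens."""
    tokens = []
    pos = 0
    while pos < len(text):
        m = MASTER.match(text, pos)
        if not m:
            raise SyntaxError(f"unexpected character {text[pos]!r} at position {pos}")
        if m.lastgroup != "SKIP":          # whitespace is dropped, not tokenized
            tokens.append((m.lastgroup, m.group()))
        pos = m.end()
    return tokens

print(tokenize("x = 10 + y2"))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '10'), ('OP', '+'), ('IDENT', 'y2')]
```

A later parsing stage can then consume this flat token list without ever touching raw characters, which is exactly the separation of concerns described above.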


Wikipedia Foundation.


Look at other dictionaries:

  • lexical analysis — noun (computing) A stage during the compilation of a program in which standard components of a statement are replaced by internal codes (tokens) which identify their meaning • • • Main Entry: ↑lexicon …   Useful english dictionary

  • Lexical analysis — In computer science, lexical analysis is the process of converting a sequence of characters into a sequence of tokens. Programs performing lexical analysis are called lexical analyzers or lexers. A lexer is often organized as separate scanner and …   Wikipedia

  • lexical analysis — The detection of ↑lexemes (names, base words, numbers, operator signs written with several characters) in the text of a program written in a programming language, or in a script, and their encoding so that further …   Enciklopedinis kompiuterijos žodynas


  • Analysis — (from Greek ἀνάλυσις , a breaking up ) is the process of breaking a complex topic or substance into smaller parts to gain a better understanding of it. The technique has been applied in the study of mathematics and logic since before Aristotle,… …   Wikipedia

  • lexical analyzer — noun A computer program that performs lexical analysis. Syn: lexer …   Wiktionary

  • Lexical functional grammar — (LFG) is a grammar framework in theoretical linguistics, a variety of generative grammar. The development of the theory was initiated by Joan Bresnan and Ronald Kaplan in the 1970s, in reaction to the direction research in the area of… …   Wikipedia

  • Lexical decision task — The lexical decision task is a procedure used in many psychology and psycholinguistics experiments. The basic procedure involves measuring how quickly people classify stimuli as words or nonwords. Although versions of the task had been used by… …   Wikipedia

  • Lexical density — In computational linguistics, lexical density constitutes the estimated measure of content per functional (grammatical) and lexical units (lexemes) in total. Specifically, this is a coefficient of the word type to token ratio of a text. The main… …   Wikipedia

  • Conversation analysis — (commonly abbreviated as CA) is the study of talk in interaction (both verbal and non verbal in situations of everyday life). CA generally attempts to describe the orderliness, structure and sequential patterns of interaction, whether… …   Wikipedia
