Lexical Analysis

Updated: 04/26/2017 by Computer Hope

In computer science, lexical analysis is the process of converting a sequence of characters into a sequence of meaningful units called tokens. A program that performs lexical analysis is called a lexical analyzer, lexer, or tokenizer. This program is often used in conjunction with a software component called a parser, which converts the stream of tokens into structured data.
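As a rough sketch of the idea, the following Python snippet tokenizes a simple arithmetic expression using regular expressions. The token names (NUMBER, IDENT, OP) are illustrative, not part of any standard:

```python
import re

# Hypothetical token set: each entry is (token name, matching pattern).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),          # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"), # identifiers such as variable names
    ("OP",     r"[+\-*/=]"),     # arithmetic and assignment operators
    ("SKIP",   r"\s+"),          # whitespace, discarded below
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Convert a character sequence into a list of (token name, lexeme) pairs."""
    tokens = []
    for match in MASTER.finditer(text):
        if match.lastgroup != "SKIP":  # whitespace carries no meaning here
            tokens.append((match.lastgroup, match.group()))
    return tokens

print(tokenize("count = count + 1"))
```

A parser would then consume this token list to build structured data, such as an abstract syntax tree.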

Where is lexical analysis used?

Lexical analysis and parsing are used by compilers, which use the parsed form of a programmer's source code to create a compiled binary executable. They are also used by web browsers, which parse HTML, CSS, and JavaScript to format and display a web page.
