
Best Answer

Structured or formal grammars are a method of describing a computer language in another language, a meta-language; that description is the grammar. For example, you could use EBNF (Extended Backus-Naur Form) to describe the grammar of the language Pascal. The meta-language, the language in which the grammar is expressed, varies with the tools in use.

Structured grammars are typically used to implement programming languages, but are also commonly used to create parsers for markup languages such as HTML or XML, for command-oriented streams, or for complex protocols (such as HTTP).

This formal grammar definition acts as input to a parser generator. Parser generators are also called compiler generators or compiler compilers, although these tools never generate a complete compiler. Instead, they generate lexical analyzers and parsers, which typically produce an abstract syntax tree (a tree of semantic nodes) in some form.

A lexical analyzer is the tool that digests the input in the target language (e.g. Pascal) and decides, for each fragment (token), what it is: a keyword, a number, etc. The parser then folds the stream of tokens into semantic nodes (an expression, a variable definition, etc.).

At that point, the input (the Pascal program in this example) is known to be syntactically correct. The next stages include semantic analysis (does it make sense and what does it mean?), code generation and code optimization.
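As a sketch of that pipeline (hand-written Python rather than a parser generator's output, and a toy expression grammar instead of Pascal), a lexical analyzer and recursive-descent parser might look like this; the token set and AST shape are illustrative assumptions, not any tool's actual output:

```python
import re

# Toy lexer: classify each fragment of the input as a NUM or OP token.
TOKEN_RE = re.compile(r"\s*(?:(?P<NUM>\d+)|(?P<OP>[+*()]))")

def tokenize(src):
    """Lexical analysis: turn a character stream into (kind, text) tokens."""
    src = src.rstrip()
    tokens, pos = [], 0
    while pos < len(src):
        m = TOKEN_RE.match(src, pos)
        if not m:
            raise SyntaxError(f"unexpected character at position {pos}")
        pos = m.end()
        tokens.append((m.lastgroup, m.group(m.lastgroup)))
    return tokens

def parse(tokens):
    """Recursive-descent parser for the grammar
         expr   = term   { '+' term }
         term   = factor { '*' factor }
         factor = NUM | '(' expr ')'
       folding the token stream into a nested-tuple syntax tree."""
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else (None, None)
    def eat():
        nonlocal pos
        pos += 1
        return tokens[pos - 1][1]
    def factor():
        if peek()[0] == "NUM":
            return ("num", int(eat()))
        eat()                      # consume '('
        node = expr()
        eat()                      # consume ')'
        return node
    def term():
        node = factor()
        while peek()[1] == "*":
            eat()
            node = ("*", node, factor())
        return node
    def expr():
        node = term()
        while peek()[1] == "+":
            eat()
            node = ("+", node, term())
        return node
    return expr()

print(parse(tokenize("1+2*(3+4)")))
# ('+', ('num', 1), ('*', ('num', 2), ('+', ('num', 3), ('num', 4))))
```

At this point the tree is "syntactically correct" in the sense above; semantic analysis and code generation would walk this tree in later stages.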

The most famous (and infamous) representatives are the pair lex and yacc. Lex generates a lexical analyzer; yacc (short for Yet Another Compiler Compiler) generates a parser. Their more modern successors are Flex and Bison; other popular tools include ANTLR. Many other tools are available.

Wiki User, 12y ago
Q: What are structured grammars and languages generated by a grammar?
Continue Learning about Engineering

What is Chomsky's classification of languages?

Chomsky's classification is a hierarchy of formal grammars that govern the production of sentences (human, computer, or otherwise). The approach takes a generative view of meaningful sentences: each one can be generated by rules defined by a grammar, i.e. syntactic rules. The classification is ordered by levels of expressiveness and complexity: regular (Type 3), context-free (Type 2), context-sensitive (Type 1), and recursively enumerable (Type 0). See the related link on Wikipedia for further information.
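The gap between hierarchy levels can be seen in a small sketch (the function names are illustrative): the language of balanced parentheses is context-free, generated by S → ( S ) | ε, but not regular, since a finite automaton has only finitely many states and cannot count unbounded nesting.

```python
# L = { "("^n ")"^n } is context-free -- generated by S -> ( S ) | empty --
# but not regular, because a finite automaton has finitely many states
# and so cannot track unbounded nesting depth.
def generate(n):
    """Apply S -> ( S ) n times, then S -> empty."""
    return "(" * n + ")" * n

def balanced(s):
    """Recognize the language with one unbounded counter (a stack depth)."""
    depth = 0
    for ch in s:
        depth += 1 if ch == "(" else -1
        if depth < 0:
            return False
    return depth == 0

print(generate(3))         # ((()))
print(balanced("((()))"))  # True
print(balanced("(()"))     # False
```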


What is the use of theory of computation?

In simple words: to learn any natural language, such as English, Hindi, or French, we first need to learn the vocabulary and grammar of that language; that is, how the language is actually specified. In the same way, programming languages (formal languages) such as C, C++, and Java have their own vocabulary and grammar, and that grammar is specified with the help of mathematical models studied in the field called the Theory of Computation.


What is the grammatical rules governing programming languages?

A language consists of a set of strings (syntactically correct programs) of characters from some alphabet of symbols. A grammar is the formal definition of the syntax of the language; it naturally defines the hierarchical structure of many programming languages. Source: My CMSC 124 (Design and Implementation of Programming Languages) teacher


Difference between chomsky normal form and greibach normal form?

1. In computer science, a formal grammar is said to be in Chomsky normal form (CNF) if all of its production rules are of the form A → BC, A → α, or S → λ, where A, B and C are nonterminal symbols, α is a terminal symbol, S is the start symbol, and λ is the empty string. Neither B nor C may be the start symbol. Every grammar in Chomsky normal form is context-free, and conversely, every context-free grammar can be efficiently transformed into an equivalent one in Chomsky normal form.
With the exception of the optional rule S → λ (included when the grammar may generate the empty string), all rules of a grammar in Chomsky normal form are expansive; throughout the derivation of a string, each string of terminals and nonterminals is either the same length as or one element longer than the previous such string. The derivation of a string of length n is always exactly 2n − 1 steps long. Furthermore, since every rule deriving nonterminals turns one nonterminal into exactly two, a parse tree based on a grammar in Chomsky normal form is a binary tree whose height is at most the length of the string. Because of these properties, many proofs in the field of languages and computability make use of the Chomsky normal form, and it underlies various efficient algorithms; for example, the CYK algorithm, which decides whether a given string can be generated by a given grammar, requires the grammar in Chomsky normal form. The form is named after Noam Chomsky, the US linguist who introduced the Chomsky hierarchy.
2. In computer science, a context-free grammar is in Greibach normal form (GNF) if all production rules are of the form A → αX or S → λ, where A is a nonterminal symbol, α is a terminal symbol, X is a (possibly empty) sequence of nonterminal symbols not including the start symbol, S is the start symbol, and λ is the null string. Observe that such a grammar has no left recursion. Every context-free grammar can be transformed into an equivalent grammar in Greibach normal form. (Some definitions do not permit the rule S → λ, in which case a context-free grammar that generates the null string cannot be so transformed.) This can be used to prove that every context-free language can be accepted by a non-deterministic pushdown automaton. Given a grammar in GNF and a derivable string of length n, any top-down parser will halt at depth n. Greibach normal form is named after Sheila Greibach.
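The CYK algorithm mentioned above can be sketched in a few lines of Python; the grammar encoding (tuples of right-hand sides) and the example CNF grammar for { a^n b^n : n ≥ 1 } are illustrative choices, not part of any standard library:

```python
# Compact CYK membership test. A grammar in Chomsky normal form is
# encoded as (lhs, rhs) pairs where rhs is a 1-tuple (terminal rule
# A -> a) or a 2-tuple (binary rule A -> BC).
def cyk(word, rules, start="S"):
    n = len(word)
    # table[i][j] holds the nonterminals deriving word[i : i + j + 1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        for lhs, rhs in rules:
            if rhs == (ch,):
                table[i][0].add(lhs)
    for span in range(2, n + 1):            # substring length
        for i in range(n - span + 1):       # substring start
            for k in range(1, span):        # split point
                for lhs, rhs in rules:
                    if (len(rhs) == 2
                            and rhs[0] in table[i][k - 1]
                            and rhs[1] in table[i + k][span - k - 1]):
                        table[i][span - 1].add(lhs)
    return n > 0 and start in table[0][n - 1]

# Example CNF grammar for { a^n b^n : n >= 1 }:
#   S -> AB | AT,  T -> SB,  A -> a,  B -> b
RULES = [("S", ("A", "B")), ("S", ("A", "T")), ("T", ("S", "B")),
         ("A", ("a",)), ("B", ("b",))]

print(cyk("aabb", RULES))  # True
print(cyk("aab", RULES))   # False
```

Because every rule is terminal or binary, the table-filling loop runs in O(n³·|rules|), which is why CNF is the standard input form for this algorithm.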


What is a syntax analyzer?

In computer science and linguistics, parsing, or, more formally, syntactic analysis, is the process of analyzing a text, made of a sequence of tokens (for example, words), to determine its grammatical structure with respect to a given (more or less) formal grammar.

Parsing is also an earlier term for the diagramming of sentences of natural languages, and is still used for the diagramming of inflected languages, such as the Romance languages or Latin. The term parsing comes from Latin pars (ōrātiōnis), meaning part (of speech).

In computing, a parser is one of the components in an interpreter or compiler which checks for correct syntax and builds a data structure (often some kind of parse tree, abstract syntax tree or other hierarchical structure) implicit in the input tokens. The parser often uses a separate lexical analyser to create tokens from the sequence of input characters. Parsers may be programmed by hand or may be (semi-)automatically generated (in some programming languages) by a tool (such as Yacc) from a grammar written in Backus-Naur form.

Related questions

What is the difference between context free grammar and regular grammar?

A context-free grammar allows any string of terminals and nonterminals on the right-hand side of a production, while a regular grammar restricts each production to at most one nonterminal, placed at the edge (right- or left-linear rules). Regular grammars are therefore less powerful: they generate only the regular languages, a proper subset of the context-free languages. Context-free grammars are more expressive and can describe nested structure (for example, balanced parentheses) that no regular grammar can.
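One way to see the containment is that every regular grammar is already a context-free grammar whose rules happen to be right-linear. A small sketch, with a hypothetical grammar encoding:

```python
import random

# A right-linear (regular) grammar encoded as a dict of alternatives:
# each alternative is ("terminal", "NextNonterminal") or ("terminal",).
# The grammar below (an illustrative choice) generates (ab)+:
#   S -> aB,  B -> bS | b
REGULAR = {"S": [("a", "B")], "B": [("b", "S"), ("b",)]}

def derive(grammar, start="S", seed=0):
    """Randomly derive one string from a right-linear grammar."""
    rng = random.Random(seed)
    out, symbol = [], start
    # Right-linearity means at most one nonterminal is ever pending,
    # so derivation is a plain loop with no stack -- exactly why a
    # finite automaton suffices for regular languages.
    while True:
        alt = rng.choice(grammar[symbol])
        out.append(alt[0])
        if len(alt) == 1:
            return "".join(out)
        symbol = alt[1]

print(derive(REGULAR))  # some string in (ab)+, e.g. "ab" or "abab"
```

A context-free rule such as S → ( S ) has a nonterminal in the middle of the right-hand side, which this loop (and any finite automaton) cannot handle; that is the extra power of context-free grammars.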


What has the author Anton Nijholt written?

Anton Nijholt has written: 'Context-free grammars' -- subject(s): Formal languages, Parsing (Computer grammar)


What is relation between regular languages finite automaton and regular grammars?

finite automaton is the graphical representation of language and regular grammar is the representation of language in expressions


Context sensitive grammar?

A context-sensitive grammar is a formal grammar in which the production rules take the form αAβ → αγβ, where α and β are (possibly empty) strings of terminals and nonterminals, A is a nonterminal, and γ is a nonempty string of terminals and nonterminals. These grammars are more powerful than context-free grammars and can describe languages in which the applicability of a rule depends on its surrounding context. They appear in linguistic analysis and natural language processing.
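The textbook example of a context-sensitive (but not context-free) language is { a^n b^n c^n : n ≥ 1 }, commonly generated by rules such as S → aSBC | aBC, CB → BC, aB → ab, bB → bb, bC → bc, cC → cc (which can be massaged into the strict αAβ → αγβ form). Membership is easy to check directly; a minimal sketch:

```python
# Direct membership check for { a^n b^n c^n : n >= 1 }, the classic
# language that needs three-way counting no context-free grammar can do.
def in_anbncn(s):
    n = len(s) // 3
    return n >= 1 and s == "a" * n + "b" * n + "c" * n

print(in_anbncn("aabbcc"))  # True
print(in_anbncn("aabbc"))   # False
```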


What are examples of transformational grammar?

Examples of transformational grammars include Chomsky's transformational-generative grammar, which pairs phrase-structure rules with transformational rules: sentences are generated by transforming basic phrase-structure trees according to specific syntactic operations. Transformational grammars are used in linguistics to study the underlying structure of language.


What has the author Theodore August Norman written?

Theodore August Norman has written: 'Simultaneous rule application in context-free grammars' -- subject(s): Artificial Languages, Comparative and general Grammar, Grammar, Comparative and general, Languages, Artificial, Programming languages (Electronic computers) 'Discriminant analysis document classification' -- subject(s): Information storage and retrieval systems, Documentation, Automatic indexing, Classification


What has the author V de Fivas written?

V. de Fivas has written: 'New grammar of French grammars..' -- subject(s): French language, Grammar




Is Danish same as swedish?

Danish and Swedish are both North Germanic languages that are mutually intelligible to some extent due to their similarities in vocabulary and grammar. However, they are distinct languages with different sounds, spelling, and some vocabulary differences.


Is German grammar inspired by Latin grammar?

Both German and Latin descend from a common ancestor language called Proto-Indo-European, which was likely spoken in the steppes of what is today southern Russia and Ukraine, perhaps 5,000 years ago or more. Both German and Latin are related to dozens of other Indo-European languages as well (from Irish to Farsi, from Portuguese to Russian, from Hindi to Albanian). Because German and Latin both come from the same source, Indo-European, their grammars are quite a bit more similar to each other than they are to completely unrelated languages such as Bantu, Chinese, Navajo, or Samoan. To answer the question, then: German grammar is *not* "inspired" by Latin grammar; rather, German and Latin grammars are closely related and therefore similar to each other.


Does Finnish have grammar or is it just random letters?

All languages have grammar.


What has the author Jeffrey Steven Gruber written?

Jeffrey Steven Gruber has written: 'Functions of the lexicon in formal descriptive grammars' -- subject(s): Comparative and general Grammar, Grammar, Comparative and general, Lexicology, Semantics, Syntax