Compilers History 1.1
Kenneth Louden, Compiler Construction: Principles and Practice
| Question | Answer |
|---|---|
| who created the stored program computer | John von Neumann |
| when was the stored program computer created | late 1940s |
| what changes did the stored program computer create | it became necessary to write sequences of codes (or programs) that would cause these computers to perform the desired computations |
| what were programs originally written in | programs were written in machine language |
| key terms | machine language, assembly language, assembler, grammars, Chomsky hierarchy, context-free grammars, parsing problem, finite automata, regular expressions, optimization techniques, code improvement techniques, parser generators, scanner generators, IDE |
| what is machine language | numeric codes that represented the actual machine operations to be performed |
| what is an example of machine language | C7 06 0000 0002 (represents the instruction to move the number 2 to the location 0000 in hexadecimal on the Intel 8x86 processors used in IBM PCs) |
| why is writing machine code hard | writing such codes is extremely time-consuming and tedious |
| what replaced coding directly in machine language | assembly language |
| assembly language | instructions and memory locations are given symbolic forms |
| example of assembly language instructions | MOV X,2 (represents the same instruction to move the number 2 to location 0000 on the Intel 8x86 processors, assuming the symbolic memory location X stands for 0000) |
| assembler | translates the symbolic codes and memory locations of assembly language into the corresponding numeric codes of machine language |
| what was improved with assembly language | assembly language greatly improved the speed and accuracy with which programs could be written |
| when is assembly language used today | especially when extreme speed or conciseness of code is needed |
| what are the defects of assembly language | it is still not easy to write and is difficult to read and understand; it is also extremely dependent on the particular machine for which it was written, so code written for one computer must be rewritten for another |
| what was the goal of the next major step in programming after assembly language | to write the operations of a program in a concise form more nearly resembling mathematical notation or natural language, in a way that was independent of any one particular machine and yet capable of itself being translated by a program into executable code |
| write the machine-independent form of the assembly instruction MOV X,2 | X = 2 |
| what were the initial doubts about developing a machine-independent form of computation instructions | it was feared this might not be possible at all, or that if it was, the object code would be so inefficient as to be useless |
| what event showed that a machine independent form of computation instructions was possible | the development of FORTRAN language and its compiler |
| About FORTRAN | developed by a team at IBM led by John Backus between 1954 and 1957 |
| Describe the process involved in the development of FORTRAN language and its compiler | success of this project came about only with a great deal of effort, since most of the processes involved in translating programming languages were not well understood at the time |
| when was Noam Chomsky studying | 1954 to 1957, about the same time that the first compiler was under development |
| what was Noam Chomsky studying | the structure of natural language |
| why is Noam Chomsky significant | his findings eventually made the construction of compilers considerably easier and even capable of partial automation. |
| what was the output of Chomsky's study | Chomsky's study led to the classification of languages according to the complexity of their grammars (the rules specifying their structure) and the power of the algorithms needed to recognize them |
| grammars | rules specifying a language's structure |
| Chomsky hierarchy | four levels of grammars, called type 0, type 1, type 2, and type 3 grammars (each a specialization of its predecessor) |
| what type of grammar proved to be most useful for programming languages | type 2, or context-free, grammars |
| type 2 grammars | context free grammars |
| context free grammars | type 2 grammars |
| what are context-free grammars used for | they are the standard way to represent the structure of programming languages (a small example grammar and parser follow this table) |
| what are type 2 grammars used for | they are the standard way to represent the structure of programming languages |
| parsing problem | the determination of efficient algorithms for the recognition of context-free languages |
| when was the parsing problem pursued | 1960s and 1970s |
| is the parsing problem solved | study during the 1960s and 1970s led to a fairly complete solution |
| how does the parsing problem relate to compilers | its solution has become a standard part of compiler theory |
| context-free languages | languages whose structure is described by a context-free (type 2) grammar |
| parsing algorithms | algorithms for recognizing the structure of context-free languages |
| what are closely related to context-free grammars | finite automata and regular expressions |
| finite automata | abstract machines that recognize the regular languages (those described by type 3 grammars) |
| regular expressions | a notation for describing the patterns of regular languages, equivalent in power to finite automata |
| what corresponds to Chomskys type 3 grammars | finite automata and regular expressions |
| what did the study of finite automata and regular expressions lead to | symbolic methods for expressing the structure of the words, or tokens, of a programming language (a small scanner sketch follows this table) |
| what has been a complex topic in the process of compiler construction | the development of methods for generating efficient object code |
| is the development of methods for generating efficient object code still relevant | began with the first compilers and continues to this day |
| optimization techniques | the development of methods for generating efficient object code |
| why is the name optimization techniques misleading | they really should be called code improvement techniques, since they almost never result in truly optimal object code but only improve its efficiency |
| optimization techniques | code improvement techniques |
| code improvement techniques | optimization techniques |
| after the parsing problem became well understood, what work followed | a great deal of work was devoted to developing programs that would automate this part of compiler development |
| what were the programs called that automate the parsing part of compiler development | compiler-compilers, or parser generators |
| parser generators | programs developed to automate the parsing part of compiler development |
| parser generator | compiler-compiler |
| compiler-compilers | parser generator |
| why the name parser generator instead of compiler-compilers | they only automate one part of the compilation process |
| what is the best known parser generator | YACC (Yet Another Compiler-Compiler), written by Steve Johnson in 1975 for Unix |
| what did the study of finite automata lead to | the development of another tool called a scanner generator |
| scanner generator | a program that automates the construction of a scanner (the part of a compiler that recognizes tokens); it grew out of the study of finite automata |
| what is the best known scanner generator | Lex, developed for Unix by Mike Lesk at about the same time as YACC (1975) |
| what happened during the 1970s and 1980s in compiler research | a number of projects focused on automating the generation of other parts of a compiler, including code generation |
| why have projects focused on automating the generation of other parts of a compiler been less successful | the complex nature of the operations and our less-than-perfect understanding of them (a research opportunity) |
| what are recent advances in compiler design (part 1) | compilers have included the application of more sophisticated algorithms for inferring and/or simplifying the information contained in a program, hand in hand with the development of more sophisticated programming languages that allow this kind of analysis |
| algorithm used in the compilation of functional languages | Hindley-Milner type checking |
| what is an advanced algorithm used in recent advances in compiler design | Hindley-Milner type checking |
| what are recent advances in compiler design (part 2) | compilers have become more a part of window-based IDEs that include editors, linkers, debuggers, and project managers |
| interactive development environment | IDE |
| IDE | interactive development environment |
| what is an IDE | includes editors, linkers, debuggers, and project managers (and compilers) |
| what is an interactive development environment | includes editors, linkers, debuggers, and project managers (and compilers) |
| is there standardization of IDEs | there is little standardization so far, but the development of standard windowing environments is leading in that direction |
| IDEs | interactive development environments, which bundle compilers with editors, linkers, debuggers, and project managers |
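As a concrete illustration of the context-free grammar and parsing entries above, here is a minimal sketch, not taken from Louden's text: a tiny grammar for arithmetic expressions (in EBNF-like notation) and a recursive-descent recognizer for it in C. The grammar, function names, and the sample input are all illustrative assumptions.

```c
/* Illustrative sketch only: a tiny context-free grammar for arithmetic
 * expressions, written in EBNF-like notation, and a recursive-descent
 * recognizer for it (one C function per nonterminal).
 *
 *   exp    -> term   { ("+" | "-") term }
 *   term   -> factor { ("*" | "/") factor }
 *   factor -> "(" exp ")" | digit
 */
#include <stdio.h>
#include <stdlib.h>
#include <ctype.h>

static const char *input;                 /* current position in the input */

static void error(void) { fprintf(stderr, "parse error at '%c'\n", *input); exit(1); }
static void match(char c) { if (*input == c) input++; else error(); }

static void parse_exp(void);              /* one function per nonterminal */
static void parse_term(void);
static void parse_factor(void);

static void parse_exp(void) {
    parse_term();
    while (*input == '+' || *input == '-') { input++; parse_term(); }
}

static void parse_term(void) {
    parse_factor();
    while (*input == '*' || *input == '/') { input++; parse_factor(); }
}

static void parse_factor(void) {
    if (*input == '(') { match('('); parse_exp(); match(')'); }
    else if (isdigit((unsigned char)*input)) input++;   /* a single digit */
    else error();
}

int main(void) {
    input = "(1+2)*3-4";                  /* a sample sentence of the grammar */
    parse_exp();
    if (*input != '\0') error();          /* reject trailing input */
    printf("accepted\n");
    return 0;
}
```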
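Likewise, the entries on regular expressions, finite automata, and scanner generators can be made concrete with a hand-written scanner sketch. The token classes below (identifiers letter(letter|digit)* and numbers digit+) and the sample input are assumptions of mine; a tool such as Lex would generate comparable code automatically from the same patterns.

```c
/* Illustrative sketch only: each loop acts as a small finite automaton
 * recognizing a regular expression:
 *
 *   identifier:  letter (letter | digit)*
 *   number:      digit digit*
 */
#include <stdio.h>
#include <ctype.h>

int main(void) {
    const char *p = "count1 = 42 + x9";                 /* sample input */
    while (*p != '\0') {
        if (isspace((unsigned char)*p)) {               /* skip whitespace */
            p++;
        } else if (isalpha((unsigned char)*p)) {        /* identifier automaton */
            const char *start = p;
            while (isalnum((unsigned char)*p)) p++;
            printf("IDENTIFIER: %.*s\n", (int)(p - start), start);
        } else if (isdigit((unsigned char)*p)) {        /* number automaton */
            const char *start = p;
            while (isdigit((unsigned char)*p)) p++;
            printf("NUMBER: %.*s\n", (int)(p - start), start);
        } else {                                        /* any other character */
            printf("SYMBOL: %c\n", *p);
            p++;
        }
    }
    return 0;
}
```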