| Term | Definition |
| Reasons for Studying Concepts of Programming Languages | -Increased ability to express ideas
-Improved background for choosing appropriate languages
-Increased ability to learn new languages
-Better understanding of significance of implementation |
| Programming Domains | -Scientific applications
-Business applications
-Artificial Intelligence
- Systems programming
-Web Software
-Mobile Devices
-Real Time
-Concurrent |
| Language Evaluation Criteria | -Readability: the ease with which programs can be read and understood
-Writability: the ease with which a language can be used to create programs
-Reliability: conformance to specifications
-Cost: the ultimate total cost |
| Evaluation Criteria: Readability | -Overall simplicity
-Orthogonality
-Data types
-Syntax considerations |
| Evaluation Criteria: Writability | -Simplicity and orthogonality
-Support for abstraction
-Expressivity |
| Evaluation Criteria: Reliability | -Type checking
-Exception handling
-Aliasing
-Readability and writability |
| Evaluation Criteria: Cost | -Training programmers to use the language
-Writing programs
-Compiling programs
-Executing programs
-Language implementation system: availability of free compilers
-Reliability: poor reliability leads to high costs
-Maintaining programs |
| Evaluation Criteria: Others | -Portability
-Generality
-Well-definedness |
| Von Neumann Architecture Influence | Imperative languages are most dominant
- variables model memory cells
- assignment statements model piping
- iteration is efficient |
| Von Neumann Computers | - data and programs stored in memory
- memory is separate from CPU
- instructions are piped in from memory to CPU |
| 1950s and early 1960s Programming Methodologies | simple applications; the main concern was machine efficiency |
| Late 1960s Programming Methodologies | people efficiency became important; readability, better control structures
-structured programming
-top-down design and step-wise refinement |
| Reasons for Studying Concepts of Programming Languages pt 2 | -Better use of languages that are already known
-Overall advancement of computing
-Increased job prospects |
| Late 1970s Programming Methodologies | Process-oriented to data-oriented
-data abstraction |
| Middle 1980s Programming Methodologies | Object-oriented programming
-Data abstraction + inheritance + polymorphism |
| Imperative Language Category | -features: variables, assignment statements, and iteration
-include languages that support object-oriented programming
-C, Java, Perl, JavaScript, Visual Basic .NET, C++ |
| Functional Language Category | -main means of making computations is by applying functions to given parameters
-LISP, Scheme, ML, F# |
| Logic Language Category | -rule-based
- Prolog |
| Markup/programming hybrid | -markup languages extended to support some programming
-JSTL, XSLT |
| Language Design Trade-Offs | -Reliability vs. cost of execution
-Readability vs. writability
-Writability vs. reliability |
| Compilation Implementation Method | -Programs are translated into machine language; includes JIT systems
-Faster and code can be optimized
-Only translates once
-Use: Large commercial applications |
| Pure Interpretation Implementation Method | -Programs are interpreted by another program known as an interpreter
-Nicer to develop (REPL)
-Use: Small programs or when efficiency is not an issue |
| Hybrid Implementation Method | -A compromise between compilers and pure interpreters
-Use: Small and medium systems when efficiency is not the first concern
-Java |
| Von Neumann Bottleneck | -Connection speed between a computer’s memory and its processor determines the speed of a computer
-Program instructions often can be executed much faster than the speed of the connection
-The primary limiting factor in the speed of computers |
| What is a PL? | syntax: grammatical rules
semantics: meaning
natural languages are enormous and imprecise and ambiguous |
| PL Generations | 1st: machine instructions (binary)
2nd: assembly languages (not portable & low level)
3rd: high level languages
4th: code generators
5th: logic programming (Prolog) |
| Abstractions on PLs | -data
-control
-process |
| Programs | data structures + algorithms = programs |
| Zuse's Plankalkul | -Designed in 1945 but not published until 1972
-Never implemented
-Advanced data structures (floating point, arrays, records)
- invariants
-Designed by Konrad Zuse |
| Issues with using machine code | -Poor readability
-Poor modifiability
-Expression coding was tedious
-Machine deficiencies--no indexing or floating point |
| Short Code | -developed by John Mauchly in 1949
-developed for the BINAC computer
- expressions were coded left to right
-was implemented with a pure interpreter |
| Speedcoding | -developed by John Backus in 1954
- developed for the IBM 701
-pseudo ops for arithmetic and math functions
-conditional and unconditional branching
-Auto-increment register for array access
-Slow! |
| UNIVAC Compiling System | -developed between 1951 and 1953 by a team led by Grace Hopper
- expanded a pseudo code into machine code subprograms |
| Fortran | -Designed for the IBM 704, which had index registers and floating point hardware
- Fortran I was developed in 1957
- Developed by John Backus
-Fortran II was distributed in 1958
-Changed forever the way computers are used
-Was developed for scientific computations |
| LISP | -Designed by John McCarthy in 1958
-AI research needed a language to process data in lists and symbolic computation
-syntax is based on lambda calculus
-pioneered functional programming
-still the dominant language for AI |
| Scheme | -Developed at MIT in the mid 1970s
-small
-extensive use of static scoping
-functions as first class entities
-well suited to educational applications
-functional programming language |
| Common Lisp | -an effort to combine features of several dialects of Lisp into a single language
-large and complex language |
| ALGOL 60 | -result of efforts to design a universal language
-ACM and GAMM met for four days in 1958
-was the standard way to publish algorithms for over 20 years
-all subsequent imperative languages are based on it
-first machine-independent language
-lacked I/O |
| ALGOL 60 pt 2 | -first language whose syntax was formally defined (BNF)
-never widely used
-lacked I/O
-lack of support from IBM |
| COBOL | -computerizing business records
-based on FLOW-MATIC
-first design meeting (Pentagon) May 1959
-nested selection statements
-still the most widely used business application language
-first language required by the DoD |
| Basic | -designed by Kemeny & Kurtz at Dartmouth in 1964
-Easy to learn and use for non-science students
-current popular dialect: Visual Basic
-first widely used language with time sharing |
| PL/I | -designed by IBM and SHARE in 1964
-originally named NPL (New Programming Language)
-built to do both scientific and business applications
-too large and too complex
-first pointer data type |
| APL | -the name stands for A Programming Language
-designed by Ken Iverson at IBM in 1960
-highly expressive
-programs are difficult to read |
| SNOBOL | -designed as a string manipulation language
-Designed at Bell Labs by Farber, Griswold, and Polensky in 1964
- powerful operators for string pattern matching
-still used for certain text processing tasks |
| SIMULA 67 | -Designed for system simulation in Norway by Nygaard and Dahl
-based on ALGOL 60 and SIMULA 1
-Coroutines
-classes, objects, and inheritance |
| ALGOL 68 | -user-defined data structures
-reference types
-dynamic arrays
-less usage than ALGOL 60 |
| Pascal | -developed in 1971 by Niklaus Wirth
-designed for teaching structured programming
-small and simple
-largest impact was on teaching programming |
| C | -designed for systems programming at Bell Labs by Dennis Ritchie in 1972
-powerful set of operators, but poor type checking
-designed as a systems language
-used in many application areas
-initially spread through UNIX |
| Prolog | -developed by Colmerauer and Roussel
-early 1970s
-based on formal logic
-non-procedural
-comparatively inefficient
-few application areas |
| Ada | -huge design effort and took about 8 years
-Sequence of requirements (1975-1978)
-Named Ada after Augusta Ada Byron, the first programmer
-exception handling
-concurrency
-generic program units
-packages |
| Ada 95 | -began in 1988
-support for OOP
-new concurrency features |
| Ada 2005 | -interfaces and synchronizing interfaces
-popularity suffered because the DoD no longer requires its use and because of the popularity of C++ |
| Smalltalk | -developed at Xerox PARC by Alan Kay and Adele Goldberg
-first full object-oriented language
-pioneered the graphical user interface design
-promoted OOP |
| C++ | -developed by Bjarne Stroustrup at Bell Labs in 1980
-evolved from C and SIMULA 67
-supports both procedural and OOP
-rapidly grew in popularity
-ANSI standard approved in November 1997 |
| Objective-C | -designed by Brad Cox in the early 1980s
-used by Apple for systems programs
-uses Smalltalk-style syntax for method calls |
| Java | -imperative- based OOP language
-developed at Sun in the early 1990s
-based on C++
-supports only OOP
-has references, but not pointers
-supports concurrency
-portable (JVM)
-widely used for web programming
-its use increased faster than that of any previous language |
| Perl | -designed by Larry Wall in 1987
-scripting language
-variables are statically typed but implicitly declared
-powerful and somewhat dangerous
-replacement for UNIX system administration language |
| JavaScript | -Began at Netscape
-client-side HTML embedded scripting language
-purely interpreted
-related to Java only through syntax |
| PHP | -stands for PHP: Hypertext Preprocessor
-designed by Rasmus Lerdorf
-purely interpreted |
| Python | - object-oriented interpreted scripting language
- dynamically typed
-type checked
-supports lists, tuples, and hashes |
| Ruby | -designed in Japan by Yukihiro Matsumoto (Matz)
-began as a replacement for Perl and Python
-a pure object-oriented scripting language
-purely interpreted |
| Lua | -an object-oriented interpreted scripting language
-type checked but dynamically typed
-easily extendable
-supports lists, tuples, and hashes all with its single data structure, the table |
| C# | -Part of the .NET development platform (2000)
-based on C++, Java and Delphi
-includes pointers, delegates, properties, enumeration types, a limited kind of dynamic typing, and anonymous types
-is evolving rapidly |
| XSLT | -operates on XML (eXtensible Markup Language) documents
-eXtensible Stylesheet Language Transformations
-programming constructs
-hybrid language |
| JSP | -java server pages: collection of technologies to support dynamic Web Documents
-JSTL, a JSP library, includes programming constructs in the form of HTML elements |
| syntax | the form or structure of the expressions, statements, and program units (grammar) |
| semantics | the meaning of the expressions, statements, and program units |
| Language definition | provided by the syntax and semantics of a language |
| sentence | a string of characters over some alphabet |
| language | a set of sentences |
| lexeme | the lowest level syntactic unit of a language (*, sum, begin) |
| token | is a category of lexemes (identifier) |
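| Lexemes and tokens: worked example | An illustrative breakdown (the statement itself is made up, not from any card above): for the statement index = 2 * count + 17;
index -> identifier
= -> equal_sign
2 -> int_literal
* -> mult_op
count -> identifier
+ -> plus_op
17 -> int_literal
; -> semicolon |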
| recognizer | a recognition device that reads input strings over the alphabet of the language and decides whether the input strings belong to the language |
| generators | -a device that generates sentences of a language
-one can determine if the syntax of a particular sentence is syntactically correct by comparing it to the structure of the generator |
| context-free grammars | -developed by Noam Chomsky in the mid-1950s
-language generators, meant to describe the syntax of natural languages
-define a class of languages called context-free languages |
| Backus-Naur Form | -invented by John Backus to describe the syntax of Algol 58
-BNF is equivalent to context-free grammars
-developed in 1959 |
| BNF Fundamentals | -abstractions are used to represent classes of syntactic structures
-terminals are lexemes or tokens
-LHS (nonterminal) and RHS (string of terminals and/or nonterminals)
-nonterminals are enclosed in angle brackets < > |
| Grammar | a finite non-empty set of rules |
| Start symbol | a special element of the nonterminals of a grammar |
| BNF Rules | -an abstraction (nonterminal) can have more than one RHS |
| Describing Lists BNF | -syntactic lists are described using recursion
<ident_list> -> ident | ident, <ident_list> |
| derivation | a repeated application of rules, starting with the start symbol and ending with a sentence (all terminal symbols) |
| Example BNF Grammar | <program> -> <stmts>
<stmts> -> <stmt> | <stmt> ; <stmts>
<stmt> -> <var> = <expr>
<var> -> a | b | c | d
<expr> -> <term> + <term> | <term> - <term>
<term> -> <var> | const |
| Example BNF Derivation | <program> => <stmts> => <stmt>
=> <var> = <expr>
=> a = <expr>
=> a = <term> + <term>
=> a = <var> + <term>
=> a = b + <term>
=> a = b + const |
| derivations | -every string of symbols in a derivation is a sentential form
-a sentence is a sentential form that only has terminal symbols
-can be leftmost or rightmost |
| left-most derivation | a derivation in which the left-most nonterminal in each sentential form is the one that is expanded |
| ambiguous grammar | if and only if it generates a sentential form that has two or more distinct parse trees |
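| Ambiguous grammar: illustrative example | A minimal sketch, using const as the only terminal (this grammar is invented for illustration, not taken from the cards above):
<expr> -> <expr> + <expr> | const
The sentence const + const + const has two distinct parse trees, depending on which + is applied last, so the grammar is ambiguous. |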
| precedence levels | operator precedence can be encoded in an unambiguous expression grammar by giving each precedence level its own nonterminal |
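| Precedence levels: illustrative grammar | A sketch of one common approach, using one nonterminal per precedence level (the nonterminal names are illustrative):
<expr> -> <expr> + <term> | <term>
<term> -> <term> * <factor> | <factor>
<factor> -> const
Because <factor> is derived below <term>, which is derived below <expr>, the * operator always sits lower in the parse tree than +, so * has higher precedence. |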
| Extended BNF (EBNF) | -optional parts are placed in brackets [ ]
-alternative parts of RHSs are placed inside parentheses and separated via vertical bars
-Repetitions (0 or more) are placed inside braces {} |
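| EBNF vs. BNF: illustrative example | A small illustrative translation (the rules themselves are invented for illustration):
BNF: <expr> -> <expr> + <term> | <expr> - <term> | <term>
EBNF: <expr> -> <term> {(+ | -) <term>}
BNF: <if_stmt> -> if <expr> then <stmt> | if <expr> then <stmt> else <stmt>
EBNF: <if_stmt> -> if <expr> then <stmt> [else <stmt>] |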
| Context-sensitive syntax | -static semantics (scope rules: must declare id prior to referencing)
-attribute grammars: associate attribute values with symbols |
| Operational Semantics | define semantics in terms of effect on state of simple virtual machine |
| Axiomatic semantics | goal of proving program correctness, impose constraints called preconditions and postconditions |
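| Axiomatic semantics: worked example | A minimal illustrative Hoare triple (the statement and numbers are made up): {x > 10} x = x - 3 {x > 7}. The precondition x > 10 guarantees the postcondition x > 7, because x > 7 holds after the assignment exactly when x - 3 > 7, i.e., x > 10, held before it. |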
| denotational semantics | formal approach based on recursive function theory |
| Theory of computation | -computability
-complexity |
| P | polynomial time |
| NP | nondeterministic solution in polynomial time |
| Formal Language Theory (Syntax) | Chomsky Hierarchy
-type 3: regular languages (finite automata)
-type 2: context free languages (pushdown automata)
-type 1: context sensitive languages (linear bounded automata)
-type 0: recursively enumerable languages (Turing machines) |
| nondeterminism | the ability to "guess" the correct alternative |
| close relation between generators and recognizers | generators: production rules
recognizers: automata |
| (E)BNF generators and recognizers | a recognizer (parser) can be produced from a generator (an (E)BNF grammar) |
| lexical analysis (tokenizing) | -break stream of characters into tokens (lexemes)
-pattern matching |
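| Lexical analysis: minimal sketch | A minimal pattern-matching tokenizer in Python; the token names and regular expressions are assumptions for illustration only, not part of any card above:
import re

# Illustrative token patterns; the order matters for the alternation built below.
TOKEN_SPEC = [
    ("INT_LIT", r"\d+"),
    ("IDENT", r"[A-Za-z_]\w*"),
    ("ASSIGN", r"="),
    ("ADD_OP", r"[+\-]"),
    ("MULT_OP", r"[*/]"),
    ("SKIP", r"\s+"),
]

def tokenize(source):
    # Break the stream of characters into (token, lexeme) pairs by pattern matching.
    pattern = "|".join(f"(?P<{name}>{regex})" for name, regex in TOKEN_SPEC)
    for match in re.finditer(pattern, source):
        if match.lastgroup != "SKIP":
            yield match.lastgroup, match.group()

# Example: list(tokenize("index = 2 * count + 17")) yields pairs such as ("IDENT", "index"). |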
| syntax analysis (parsing) | check syntax and build a parse tree |
| parse tree | can parse top-down or bottom-up |
| concurrency can occur at four levels | -machine instruction level
-high-level language statement level
-unit level
-program level |
| Late 1950s multiprocessor architectures | one general purpose processor and one or more special purpose processors for input and output operations |
| Early 1960s multiprocessor architectures | multiple complete processors, used for program-level concurrency |
| Mid 1960s multiprocessor architectures | multiple partial processors, used for instruction level concurrency |
| Flynn's Taxonomy | SISD: single instruction, single data
SIMD: single instruction, multiple data
MISD: multiple instructions, single data
MIMD: multiple instructions, multiple data |
| Categories of Concurrency | -physical concurrency
- logical concurrency |
| Physical Concurrency | Multiple independent processors (multiple threads of control) |
| Logical Concurrency | The appearance of physical concurrency is presented by time-sharing one processor (software can be designed as if there were multiple threads of control) |
| Coroutines (quasi-concurrency) | have a single thread of control |
| thread of control | the sequence of program points reached as control flows through the program |
| Concurrency vs. Parallelism | -concurrency >= 1 CPU
-parallelism > 1 CPU |
| multithreading | > 1 thread |
| concurrent units | coroutines, tasks |
| task/process/thread | a program unit that can be in concurrent execution with other program units |
| heavyweight tasks | execute in their own address space |
| lightweight tasks | run in the same address space
more efficient |
| disjoint task | does not communicate with or affect the execution of any other task in the program in any way |
| Task Synchronization | a mechanism that controls the order in which tasks execute |
| Task communication is necessary for synchronization | -shared nonlocal variables
-parameters
-message passing |
| cooperation synchronization | task A must wait for task B to complete some specific activity before task A can continue its execution (producer-consumer problem) |
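| Cooperation synchronization: minimal sketch | A minimal Python producer-consumer sketch using a bounded queue; the item values and buffer size are made up for illustration:
import queue
import threading

buffer = queue.Queue(maxsize=5)      # bounded buffer shared by both tasks

def producer():
    for item in range(10):           # made-up items
        buffer.put(item)             # waits (blocks) if the buffer is full
    buffer.put(None)                 # sentinel telling the consumer to stop

def consumer():
    while True:
        item = buffer.get()          # waits until the producer has added an item
        if item is None:
            break
        print("consumed", item)

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join() |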
| competition synchronization | two or more tasks must use the same resource that cannot be simultaneously used (a shared counter)
-usually provided by mutually exclusive access |
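| Competition synchronization: minimal sketch | A minimal Python sketch of mutually exclusive access to a shared counter; the thread count and loop bound are made up for illustration:
import threading

counter = 0                          # the shared resource the tasks compete for
lock = threading.Lock()              # provides mutually exclusive access

def increment(times):
    global counter
    for _ in range(times):
        with lock:                   # only one task may update the counter at a time
            counter += 1

threads = [threading.Thread(target=increment, args=(100000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)                       # 400000 with the lock; unpredictable without it |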
| Major issues | -communication between tasks (message passing, lock on nonlocal variables)
-synchronization (cooperative, competitive) |