PLaC-4.1  4. Attribute grammars and semantic analysis

Input: abstract program tree

Task                                       Compiler module
  name analysis                            environment module
  properties of program entities           definition module
  type analysis, operator identification   signature module

Output: attributed program tree

Standard implementations and generators exist for the compiler modules.
Operations of the compiler modules are called at nodes of the abstract program tree.

Model: dependent computations in trees
Specification: attribute grammars
Generated: a tree-walking algorithm that calls functions of the semantic modules
in specified contexts and in an admissible order.
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 401
Objectives:
Tasks and methods of semantic analysis
In the lecture:
Explanation of the
* tasks,
* compiler modules,
* principle of dependent computations in trees.
Suggested reading:
Kastens / Übersetzerbau, Introduction of Ch. 5 and 6
--------------------------------------------------------------------------------
PLaC-4.2  4.1 Attribute grammars

Attribute grammar (AG): specifies dependent computations in abstract program trees;
declarative: only the dependences are specified explicitly; a suitable order of execution is computed.

Computations solve the tasks of semantic analysis (and transformation).

A generator produces a plan for tree walks that executes the calls of the computations,
such that the specified dependences are obeyed and computed values are propagated through the tree.

Result: an attribute evaluator, applicable to any tree specified by the AG.

Example: the AG specifies the size of declarations:

RULE: Decls ::= Decls Decl COMPUTE
  Decls[1].size = ADD (Decls[2].size, Decl.size);
END;

RULE: Decls ::= Decl COMPUTE
  Decls.size = Decl.size;
END;

RULE: Decl ::= Type Name COMPUTE
  Decl.size = Type.size;
END;

(Figure: an evaluated tree with dependent size attributes; the Decls nodes carry the
accumulated sizes 16, 12, 8, 4, and each Decl node has size 4.)
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 402
Objectives:
Get an informal idea of attribute grammars
In the lecture:
Explain computations in tree contexts using the example
Suggested reading:
Kastens / Übersetzerbau, Section 5, 5.1
Questions:
Why is it useful NOT to specify an evaluation order explicitly?
--------------------------------------------------------------------------------
PLaC-4.3  Basic concepts of attribute grammars (1)

An AG specifies computations in trees,
expressed by computations associated to productions of the abstract syntax:

RULE q: X ::= w COMPUTE
  f(...);
  g(...);
END;

The computations f(...) and g(...) are executed in every tree context of type q.
(Figure: a tree context of type q with the computations f(...) and g(...) attached.)

An AG specifies dependences between computations,
expressed by attributes associated to grammar symbols:

RULE p: Y ::= u X v COMPUTE
  Y.b = f(X.a);
  X.a = g(...);
END;

(Figure: a tree context of type p with the attributes Y.b and X.a and the computations f and g.)

Attributes represent properties of symbols and pre- and post-conditions of computations:
  post-condition = f (pre-condition)

f(X.a) uses the result of g(...); hence X.a = g(...) is specified to be executed before f(X.a).
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 403
Objectives:
Get a basic understanding of AGs
In the lecture:
Explain
* the AG notation,
* dependent computations
Suggested reading:
Kastens / Übersetzerbau, Section 5, 5.1
Assignments:
* Read and modify examples in Lido notation to introduce AGs
--------------------------------------------------------------------------------
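Since all attributes of the size AG on PLaC-4.2 are synthesized, the generated evaluator amounts to a single bottom-up walk over the tree. A minimal hand-coded sketch of that walk in C (the node structs, field names, and the size value 4 are illustrative assumptions, not part of the AG):

#include <stdio.h>

/* Decl ::= Type Name */
typedef struct Decl  { int type_size; int size; } Decl;
/* Decls ::= Decls Decl | Decl (rest == NULL for the second alternative) */
typedef struct Decls { struct Decls *rest; Decl *decl; int size; } Decls;

/* one function per nonterminal: computes the synthesized attribute 'size' bottom-up */
int eval_decl(Decl *d)  { d->size = d->type_size; return d->size; }   /* Decl.size = Type.size */

int eval_decls(Decls *ds)
{
    if (ds->rest == NULL)                        /* RULE: Decls ::= Decl        */
        ds->size = eval_decl(ds->decl);
    else                                         /* RULE: Decls ::= Decls Decl  */
        ds->size = eval_decls(ds->rest) + eval_decl(ds->decl);
    return ds->size;
}

int main(void)
{
    /* three declarations of size 4 each */
    Decl d1 = { 4, 0 }, d2 = { 4, 0 }, d3 = { 4, 0 };
    Decls l1 = { NULL, &d1, 0 }, l2 = { &l1, &d2, 0 }, l3 = { &l2, &d3, 0 };
    printf("total size = %d\n", eval_decls(&l3));   /* prints: total size = 12 */
    return 0;
}

The point of the AG is that this walk does not have to be written by hand: the generator derives it from the specified dependences.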
PLaC-4.4  Basic concepts of attribute grammars (2)

Dependent computations in adjacent contexts (of types q and p):

RULE q: Y ::= u X v COMPUTE
  Y.b = f(X.a);
END;

RULE p: X ::= w COMPUTE
  X.a = g(...);
END;

(Figure: two adjacent tree contexts of types q and p; the computation g in the lower
context p supplies X.a, which f in the upper context q uses.)

Attributes may specify dependences without propagating any value;
such a dependence specifies the order of the effects of computations:

  X.GotType = ResetTypeOf(...);
  Y.Type = GetTypeOf(...) <- X.GotType;

ResetTypeOf will be called before GetTypeOf.
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 404
Objectives:
Get a basic understanding of AGs
In the lecture:
Explain
* dependent computations in adjacent contexts in trees
Suggested reading:
Kastens / Übersetzerbau, Section 5, 5.1
Assignments:
* Read and modify examples in Lido notation to introduce AGs
--------------------------------------------------------------------------------
PLaC-4.5  Definition of attribute grammars

An attribute grammar AG = (G, A, C) is defined by
* a context-free grammar G (abstract syntax),
* for each symbol X of G a set of attributes A(X), written X.a if a ∈ A(X),
* for each production (rule) p of G a set of computations of one of the forms
    X.a = f ( ... Y.b ... )   or   g ( ... Y.b ... )
  where X and Y occur in p.

(Figure: an upper context p: Y ::= u X v and a lower context q: X ::= w of a node of type X;
AI(X) is computed in the upper context, AS(X) in the lower context.)

Consistency and completeness of an AG:

Each A(X) is partitioned into two disjoint subsets AI(X) and AS(X):
* AI(X): inherited attributes, computed in rules p where X occurs on the right-hand side of p,
* AS(X): synthesized attributes, computed in rules p where X occurs on the left-hand side of p.

Each rule p: Y ::= ... X ... has exactly one computation
for each attribute of AS(Y), for the symbol on the left-hand side of p, and
for each attribute of AI(X), for each symbol occurrence on the right-hand side of p.
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 405
Objectives:
Formal view on AGs
In the lecture:
The completeness and consistency rules are explained using the example of PLaC-4.6
--------------------------------------------------------------------------------
PLaC-4.6  AG Example: Compute expression values

The AG specifies: the value of each expression is computed and printed at the root.

ATTR value: int;
SYMBOL Opr: left, right: int;
TERM Number: int;

RULE: Root ::= Expr COMPUTE
  printf ("value is %d\n", Expr.value);
END;

RULE: Expr ::= Number COMPUTE
  Expr.value = Number;
END;

RULE: Expr ::= Expr Opr Expr COMPUTE
  Expr[1].value = Opr.value;
  Opr.left = Expr[2].value;
  Opr.right = Expr[3].value;
END;

RULE: Opr ::= '+' COMPUTE
  Opr.value = ADD (Opr.left, Opr.right);
END;

RULE: Opr ::= '*' COMPUTE
  Opr.value = MUL (Opr.left, Opr.right);
END;

Attribute sets:
  A(Expr) = AS(Expr) = {value}
  AS(Opr) = {value}
  AI(Opr) = {left, right}
  A(Opr)  = {value, left, right}
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 406
Objectives:
Exercise the formal definition
In the lecture:
* Show synthesized and inherited attributes.
* Check consistency and completeness.
Questions:
* Add a computation such that a pair of sets AI(X), AS(X) is no longer disjoint.
* Add a computation such that the AG is inconsistent.
* Which computations can be omitted without making the AG incomplete?
* What would the effect be if the order of the three computations on the bottom left of the slide was altered?
--------------------------------------------------------------------------------
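The expression AG of PLaC-4.6 also has inherited attributes (Opr.left, Opr.right). A sketch in C of a depth-first evaluator for it, assuming a simple tree representation (structs, enums, and field names are illustrative assumptions); it shows that the inherited attributes of Opr are computed from the sibling subtrees before Opr is visited, while value flows upward:

#include <stdio.h>

typedef enum { NUM, BIN } Kind;
typedef enum { PLUS, TIMES } OprSym;

typedef struct Opr  { OprSym sym; int left, right, value; } Opr;   /* AI = {left, right}, AS = {value} */
typedef struct Expr {
    Kind kind;
    int number;                       /* RULE: Expr ::= Number        */
    struct Expr *e1, *e2; Opr op;     /* RULE: Expr ::= Expr Opr Expr */
    int value;                        /* AS(Expr) = {value}           */
} Expr;

static void visit_opr(Opr *o)         /* RULE: Opr ::= '+' | '*'      */
{
    o->value = (o->sym == PLUS) ? o->left + o->right : o->left * o->right;
}

static void visit_expr(Expr *e)
{
    if (e->kind == NUM) {             /* Expr.value = Number          */
        e->value = e->number;
    } else {                          /* RULE: Expr ::= Expr Opr Expr */
        visit_expr(e->e1);
        visit_expr(e->e2);
        e->op.left  = e->e1->value;   /* inherited attributes of Opr  */
        e->op.right = e->e2->value;
        visit_opr(&e->op);
        e->value = e->op.value;       /* Expr[1].value = Opr.value    */
    }
}

int main(void)                        /* RULE: Root ::= Expr          */
{
    Expr n2 = { NUM, 2 }, n3 = { NUM, 3 };
    Expr sum = { BIN, 0, &n2, &n3, { PLUS } };
    visit_expr(&sum);
    printf("value is %d\n", sum.value);   /* prints: value is 5 */
    return 0;
}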
PLaC-4.7  AG Binary numbers

Attributes:
  L.v, B.v   value
  L.lg       number of digits in the sequence L
  L.s, B.s   scaling of B or of the least significant digit of L

RULE p1: D ::= L '.' L COMPUTE
  D.v = ADD (L[1].v, L[2].v);
  L[1].s = 0;
  L[2].s = NEG (L[2].lg);
END;

RULE p2: L ::= L B COMPUTE
  L[1].v = ADD (L[2].v, B.v);
  B.s = L[1].s;
  L[2].s = ADD (L[1].s, 1);
  L[1].lg = ADD (L[2].lg, 1);
END;

RULE p3: L ::= B COMPUTE
  L.v = B.v;
  B.s = L.s;
  L.lg = 1;
END;

RULE p4: B ::= '0' COMPUTE
  B.v = 0;
END;

RULE p5: B ::= '1' COMPUTE
  B.v = Power2 (B.s);
END;

Scaled binary value: B.v = 1 * 2^B.s
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 407
Objectives:
A complete example for an AG
In the lecture:
* Explain the task.
* Explain the role of the attributes.
* Explain the computations in tree contexts.
* Show a tree with attributes and dependencies (PLaC-4.8)
--------------------------------------------------------------------------------
PLaC-4.8  An attributed tree for AG Binary numbers

(Figure: the attributed tree for the input 101.01; every L node carries the attributes lg, s, v,
every B node carries s, v, and the root D carries v = 5.25; arrows between the attributes mark
the dependences established by the computations of the rules p1 - p5.)
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 408
Objectives:
An attributed tree
In the lecture:
* Show a tree with attributes.
* Show tree contexts specified by grammar rules.
* Relate the dependences to computations.
* Evaluate the attributes.
Questions:
* Some attributes do not have an incoming arc. Why?
* Show that the attributes of each L node can be evaluated in the order lg, s, v.
--------------------------------------------------------------------------------
PLaC-4.9  Dependence graphs for AG Binary numbers

(Figure: one dependence graph for each of the rules p1 - p5; the nodes are the attributes
D.v, L.lg, L.s, L.v, B.s, B.v of the symbols occurring in the rule, the arcs are the
dependences specified by the computations.)

If some tree has a path from X.a to X.b at a node of type X,
then the graphs have an indirect dependence X.a -> X.b.
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 409
Objectives:
Represent dependences
In the lecture:
* graph representation of the dependences that are specified by computations,
* compose the graphs to yield a tree with dependences,
* explain indirect dependences
* Use the graphs as an example for partitions (PLaC-4.10)
* Use the graphs as an example for the LAG(k) algorithm (see a later slide)
--------------------------------------------------------------------------------
PLaC-4.10  Attribute partitions

The sets AI(X) and AS(X) are each partitioned such that
  AI(X, i) is computed before the i-th visit of X,
  AS(X, i) is computed during the i-th visit of X.

(Figure: the upper context p: Y ::= u X v and the lower context q: X ::= w of a node of type X;
the partition AI(X,1), AS(X,1), AI(X,2), AS(X,2) with dependences between the attributes;
the tree walk switches between the two contexts at the X node.)

Necessary precondition for the existence of such a partition:
No node in any tree has direct or indirect dependences that contradict the evaluation order
of the sequence of sets AI(X, 1), AS(X, 1), ..., AI(X, k), AS(X, k).
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 410
Objectives:
Understand the concept of attribute partitions
In the lecture:
Explain the concepts
* context switch,
* attribute partitions: a sequence of disjoint sets which alternate between synthesized and inherited
Suggested reading:
Kastens / Übersetzerbau, Section 5.2
Assignments:
Construct AGs that are as simple as possible, each exhibiting one of the following properties:
* Some trees have a dependence cycle, other trees don't.
* The cycles extend over more than one context.
* There is an X that has a partition with k=2 but not with k=1.
* There is no partition, although no tree exists that has a cycle. (caution: difficult puzzle!)
(Exercise 22)
--------------------------------------------------------------------------------
PLaC-4.11  Construction of attribute evaluators

For a given attribute grammar an attribute evaluator is constructed:
* It is applicable to any tree that obeys the abstract syntax specified in the rules of the AG.
* It performs a tree walk and executes computations in the visited contexts.
* The execution order obeys the attribute dependences.

Pass-oriented strategies for the tree walk:              AG class:
  k times depth-first left-to-right                      LAG(k)
  k times depth-first right-to-left                      RAG(k)
  alternatingly left-to-right / right-to-left            AAG(k)
  once bottom-up (synthesized attributes only)           SAG
The AG is checked to determine whether its attribute dependences fit the desired
pass-oriented strategy; see the LAG(k) check.

Non-pass-oriented strategy:
  visit-sequences: OAG
  an individual plan for each rule of the abstract syntax;
  a generator fits the plans to the dependences of the AG.

(Figure: small example trees illustrating pass-oriented walks and individually planned
visit-sequence walks.)
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 411
Objectives:
Tree walk strategies
In the lecture:
* Show the relation between tree walk strategies and attribute dependences.
Suggested reading:
Kastens / Übersetzerbau, Section 5, 5.1
--------------------------------------------------------------------------------
PLaC-4.11a  Hierarchy of AG classes

(Figure: the hierarchy of AG classes, from the most general to the most restricted:)

  Attribute Grammar
  non-circular AG (no dependence cycle in any attributed program tree)
  ANCAG (absolutely non-circular AG)
  visit-sequence AG (a set of visit-sequences exists)
  OAG
  AAG(k)
  LAG(k)   RAG(k)
  SAG
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 411a
Objectives:
Understand the AG hierarchy
In the lecture:
It is explained
* A grammar class is more powerful if it covers AGs with more complex dependences.
* The relationship of the AG classes in the hierarchy.
Suggested reading:
Kastens / Übersetzerbau, Section 5, 5.1
--------------------------------------------------------------------------------
PLaC-4.12  Visit-sequences

A visit-sequence (German: Besuchssequenz) vs_p is specified for each production of the tree grammar
  p: X0 ::= X1 ... Xi ... Xn

A visit-sequence is a sequence of operations:
  ↓i,j     j-th visit of the i-th subtree
  ↑j       j-th return to the ancestor node
  eval c   execution of a computation c associated to p

Example from the AG for binary numbers:

  vsp3: L ::= B
    L.lg=1; ↑1; B.s=L.s; ↓B,1; L.v=B.v; ↑2
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 412
Objectives:
Understand the concept of visit-sequences
In the lecture:
Using the example it is explained:
* operations,
* context switch,
* sequence with respect to a context
Suggested reading:
Kastens / Übersetzerbau, Section 5.2.2
--------------------------------------------------------------------------------
PLaC-4.13  Interleaving of visit-sequences

Visit-sequences for adjacent contexts are executed interleaved.
The attribute partition of the common nonterminal specifies the interface between the
upper and the lower visit-sequence:
  AI(X,1), AS(X,1), AI(X,2), AS(X,2), ...

Example in the tree: interleaved visit-sequences:

  vsp: ... ↓C,1 ... ↓B,1 ... ↓C,2 ... ↑1        p: A ::= B C
  vsq: ... ↓D,1 ... ↑1 ... ↓E,1 ... ↑2          q: C ::= D E

Implementation:
  one procedure for each section of a visit-sequence up to a ↑,
  a call with a switch over the applicable productions for each ↓.
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 413
Objectives:
Understand interleaved visit-sequences
In the lecture:
Explain
* interleaving of visit-sequences for adjacent contexts,
* partitions are "interfaces" for context switches,
* implementation using procedures and calls
Suggested reading:
Kastens / Übersetzerbau, Section 5.2.2
Assignments:
* Construct a set of visit-sequences for a small tree grammar, such that the tree walk solves a certain task.
* Find the description of the design pattern "Visitor" and relate it to visit-sequences.
Questions:
* Describe visit-sequences that cause trees to be traversed twice depth-first left-to-right.
--------------------------------------------------------------------------------
PLaC-4.14  Visit-sequences for the AG Binary numbers

vsp1: D ::= L '.' L
  ↓L[1],1; L[1].s=0; ↓L[1],2; ↓L[2],1; L[2].s=NEG(L[2].lg); ↓L[2],2;
  D.v=ADD(L[1].v, L[2].v); ↑1

vsp2: L ::= L B
  ↓L[2],1; L[1].lg=ADD(L[2].lg,1); ↑1;
  L[2].s=ADD(L[1].s,1); ↓L[2],2; B.s=L[1].s; ↓B,1; L[1].v=ADD(L[2].v, B.v); ↑2

vsp3: L ::= B
  L.lg=1; ↑1;
  B.s=L.s; ↓B,1; L.v=B.v; ↑2

vsp4: B ::= '0'
  B.v=0; ↑1

vsp5: B ::= '1'
  B.v=Power2(B.s); ↑1

(L nodes are visited twice, B nodes are visited once.)

Implementation:
  a procedure for each section of a vs_p up to a ↑i,
  a call with a switch over the alternative rules for each ↓X,i.
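Applying this implementation scheme to the visit-sequences above yields one C procedure per visit; the switch over the alternative rules for L and B appears as a test on the form of the node. A sketch, assuming a straightforward node representation (structs and field names are illustrative, not generated code):

#include <stdio.h>
#include <math.h>

typedef struct B { int bit; int s; double v; } B;                  /* p4: B ::= '0', p5: B ::= '1' */
typedef struct L { struct L *sub; B *b; int lg, s; double v; } L;  /* p2: L ::= L B,  p3: L ::= B  */
typedef struct D { L *intpart, *frac; double v; } D;               /* p1: D ::= L '.' L            */

static void visit1_B(B *b)                  /* vsp4 / vsp5 up to ↑1 */
{
    if (b->bit == 0) b->v = 0.0;            /* p4: B.v = 0           */
    else             b->v = pow(2.0, b->s); /* p5: B.v = Power2(B.s) */
}

static void visit1_L(L *l)                  /* first sections of vsp2 / vsp3 up to ↑1 */
{
    if (l->sub != NULL) {                   /* p2: ↓L[2],1; L[1].lg = L[2].lg + 1 */
        visit1_L(l->sub);
        l->lg = l->sub->lg + 1;
    } else {                                /* p3: L.lg = 1 */
        l->lg = 1;
    }
}

static void visit2_L(L *l)                  /* second sections of vsp2 / vsp3 up to ↑2 */
{
    if (l->sub != NULL) {                   /* p2 */
        l->sub->s = l->s + 1;               /* L[2].s = L[1].s + 1   */
        visit2_L(l->sub);                   /* ↓L[2],2               */
        l->b->s = l->s;                     /* B.s = L[1].s          */
        visit1_B(l->b);                     /* ↓B,1                  */
        l->v = l->sub->v + l->b->v;         /* L[1].v = L[2].v + B.v */
    } else {                                /* p3 */
        l->b->s = l->s;
        visit1_B(l->b);
        l->v = l->b->v;
    }
}

static void visit1_D(D *d)                  /* vsp1 */
{
    visit1_L(d->intpart);  d->intpart->s = 0;            visit2_L(d->intpart);
    visit1_L(d->frac);     d->frac->s = -(d->frac->lg);  visit2_L(d->frac);
    d->v = d->intpart->v + d->frac->v;      /* D.v = L[1].v + L[2].v */
}

int main(void)
{
    /* the tree for the input 101.01 (cf. PLaC-4.8); the leftmost digit is at the bottom */
    B b1 = {1}, b0 = {0}, b1b = {1}, f0 = {0}, f1 = {1};
    L i1 = {NULL, &b1}, i2 = {&i1, &b0}, i3 = {&i2, &b1b};   /* 101 */
    L g1 = {NULL, &f0}, g2 = {&g1, &f1};                     /* 01  */
    D d = {&i3, &g2};
    visit1_D(&d);
    printf("%g\n", d.v);                    /* prints: 5.25 */
    return 0;
}

visit1_L and visit2_L are the two sections of vsp2/vsp3 separated by ↑1; calling them from visit1_D reproduces the tree walk shown on PLaC-4.15.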

--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 414
Objectives:
Example for visit-sequences used in PLaC-4.13
In the lecture:
* Show interfaces and interleaving,
* show the tree walk (PLaC-4.15),
* show the sections for the implementation.
Questions:
* Check that adjacent visit-sequences interleave correctly.
* Check that all dependences between computations are obeyed.
* Write procedures that implement these visit-sequences.
--------------------------------------------------------------------------------
PLaC-4.14a  Visit-sequences for AG Binary numbers (tree patterns)

(Figure: the visit-sequences vsp1 - vsp5 drawn as tree patterns; for each rule the visits of
the subtrees, the returns to the ancestor, and the computations are shown at the attributes
D.v, L.lg, L.s, L.v, B.s, B.v.)
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 414a
Objectives:
Example for visit-sequences used in PLaC-4.13
In the lecture:
* Create a tree walk by pasting instances of visit-sequences together
--------------------------------------------------------------------------------
PLaC-4.15  Tree walk for AG Binary numbers

(Figure: the attributed tree of PLaC-4.8 for the input 101.01, together with the tree walk
produced by the visit-sequences of PLaC-4.14; the walk visits every L node twice and every
B node once and evaluates the attributes along the way.)
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 415
Objectives:
See a concrete tree walk
In the lecture:
Show that the visit-sequences of PLaC-4.14 produce this tree walk for the tree of PLaC-4.8.
--------------------------------------------------------------------------------
PLaC-4.16  LAG(k) condition

An AG is LAG(k) if:
For each symbol X there is an attribute partition A(X,1), ..., A(X,k), such that the attributes
in A(X,i) can be computed in the i-th depth-first left-to-right pass.

Crucial dependences: in every dependence graph, every dependence
* Y.a -> X.b, where X and Y occur on the right-hand side and Y is to the right of X,
  implies that Y.a belongs to an earlier pass than X.b, and
* X.a -> X.b, where X occurs on the right-hand side,
  implies that X.a belongs to an earlier pass than X.b.

This is a necessary and sufficient condition over the dependence graphs.
(Figure: the two crucial dependence patterns drawn graphically: a dependence from a symbol Y to
a symbol X to its left, requiring Y.a ∈ A(Y,i) and X.b ∈ A(X,j) with i < j; and a dependence
between two attributes of one right-hand-side symbol X, requiring a ∈ A(X,i) and b ∈ A(X,j) with i < j.)
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 416
Objectives:
Understand the LAG condition
In the lecture:
* Explain the LAG(k) condition,
* motivate it by depth-first left-to-right tree walks.
Suggested reading:
Kastens / Übersetzerbau, Section 5.2.3
--------------------------------------------------------------------------------
PLaC-4.17  LAG(k) algorithm

The algorithm checks whether there is a k >= 1 such that an AG is LAG(k).

Method: compute iteratively A(1), ..., A(k); in each iteration try to allocate all remaining
attributes to the current pass, i.e. to A(i); remove those that cannot be evaluated in that pass.

Algorithm:
  set i = 1 and Cand = all attributes
  repeat
    set A(i) = Cand; set Cand to empty;
    while attributes can still be removed from A(i) do
      remove an attribute X.b from A(i) and add it to Cand if
      - there is a crucial dependence Y.a -> X.b such that X and Y are on the right-hand side,
        Y is to the right of X, and Y.a is in A(i),
        or X.a -> X.b such that X is on the right-hand side and X.a is in A(i), or
      - X.b depends on an attribute that is not yet in any A(i)
    if Cand is empty: exit: the AG is LAG(k), and all attributes are assigned to their passes
    if A(i) is empty: exit: the AG is not LAG(k) for any k
    else: set i = i + 1
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 417
Objectives:
Understand the LAG(k) check
In the lecture:
* explain the algorithm using the example of PLaC-4.10.
Suggested reading:
Kastens / Übersetzerbau, Section 5.2.3
Assignments:
* Check the LAG(k) condition for AGs (Exercise 20)
Questions:
* At the end of each iteration of the i-loop one of three conditions holds. Explain them.
--------------------------------------------------------------------------------
PLaC-4.17a  AG not LAG(k) for any k

Productions:
  p0: S ::= A
  p1: A ::= C A
  p2: C ::= ','
  p3: A ::= '.'

(Figure: a tree for this grammar with the attributes a, b at each A node and c, d at each C node,
and the dependences among them.)

A.a can be allocated to the first left-to-right pass.
C.c, C.d, A.b can not be allocated to any pass.
The AG is RAG(1), AAG(2), and can be evaluated by visit-sequences.
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 417a
Objectives:
Understand a non-LAG pattern
In the lecture:
* Explain the tree,
* derive the AG,
* try the LAG(k) algorithm.
--------------------------------------------------------------------------------
PLaC-4.17b  AG not evaluable in passes

Productions:
  p0: S ::= A
  p1: A ::= ',' A
  p2: A ::= '.'

(Figure: a tree for this grammar with the attributes a, b, c, d at each A node and the
dependences among them.)

No attribute can be allocated to any pass for any strategy.
The AG can be evaluated by visit-sequences.
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 417b
Objectives:
Understand a non-pass pattern
In the lecture:
* Explain the tree,
* derive the AG,
* try the LAG(k) algorithm.
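A compact sketch in C of the LAG(k) check of PLaC-4.17, applied to the binary-numbers AG. As an illustrative simplification it collapses attribute occurrences to the three attribute classes lg, s, v and encodes the dependences as two boolean matrices (crucial and dep); a real generator works on the dependence graphs of all rules, so this only illustrates the iteration:

#include <stdio.h>
#include <stdbool.h>

enum { LG, S, V, NATTR };
static const char *name[NATTR] = { "lg", "s", "v" };

/* crucial[a][b]: crucial dependence a -> b (a must lie in an earlier pass than b) */
static bool crucial[NATTR][NATTR] = { [LG] = { [S] = true } };  /* L[2].s = NEG(L[2].lg) in p1 */
/* dep[a][b]: some computation of b uses a */
static bool dep[NATTR][NATTR] = {
    [LG] = { [S] = true },         /* s uses lg            */
    [S]  = { [V] = true },         /* v uses s (p5)        */
    [V]  = { [V] = true },         /* v uses v (p1, p2)    */
};

int main(void)
{
    int pass[NATTR] = { 0 };               /* 0 = not yet assigned to a pass */
    int assigned = 0, i = 0;

    while (assigned < NATTR) {
        i++;                               /* try to fill pass i             */
        bool inA[NATTR];                   /* A(i): the current candidates   */
        for (int a = 0; a < NATTR; a++) inA[a] = (pass[a] == 0);

        bool changed = true;               /* remove attributes that cannot  */
        while (changed) {                  /* be evaluated in pass i         */
            changed = false;
            for (int b = 0; b < NATTR; b++) {
                if (!inA[b]) continue;
                for (int a = 0; a < NATTR; a++) {
                    bool postponed = (pass[a] == 0 && !inA[a]);   /* a is in Cand */
                    if ((crucial[a][b] && inA[a])        /* crucial a -> b, a still in A(i)   */
                        || (dep[a][b] && postponed)) {   /* b uses an unallocated attribute   */
                        inA[b] = false; changed = true; break;
                    }
                }
            }
        }

        int placed = 0;
        for (int a = 0; a < NATTR; a++)
            if (inA[a]) { pass[a] = i; placed++; assigned++; }
        if (placed == 0) { printf("not LAG(k) for any k\n"); return 1; }
    }

    for (int a = 0; a < NATTR; a++) printf("%s: pass %d\n", name[a], pass[a]);
    printf("the AG is LAG(%d)\n", i);
    return 0;
}

For this input the check assigns lg to pass 1 and s, v to pass 2, i.e. the AG is LAG(2); this matches the two visits of every L node in the visit-sequences of PLaC-4.14.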
--------------------------------------------------------------------------------
PLaC-4.18  Generators for attribute grammars

  LIGA    University of Paderborn    OAG
  FNC-2   INRIA                      ANCAG (superset of OAG)
  CoCo    Universität Linz           LAG(k)

Properties of the generator LIGA:
* integrated in the Eli system, cooperates with other Eli tools
* high-level specification language Lido
* modular and reusable AG components
* object-oriented constructs usable for abstraction of computational patterns
* computations are calls of functions implemented outside the AG
* side-effect computations can be controlled by dependencies
* notations for remote attribute access
* visit-sequence controlled attribute evaluators, implemented in C
* attribute storage optimization
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 418
Objectives:
See what generators can do
In the lecture:
* Explain the generators
* Explain properties of LIGA
Suggested reading:
Kastens / Übersetzerbau, Section 5.4
--------------------------------------------------------------------------------
PLaC-4.19  Explicit left-to-right depth-first propagation

Definitions are enumerated and printed from left to right.
The next Definition number is propagated by a pair of attributes at each node:
pre (inherited) and post (synthesized).
The value is initialized in the Root context and incremented in the Definition context.
The computations for the propagation are systematic and redundant.

ATTR pre, post: int;

RULE: Root ::= Block COMPUTE
  Block.pre = 0;
END;

RULE: Block ::= '{' Constructs '}' COMPUTE
  Constructs.pre = Block.pre;
  Block.post = Constructs.post;
END;

RULE: Constructs ::= Constructs Construct COMPUTE
  Constructs[2].pre = Constructs[1].pre;
  Construct.pre = Constructs[2].post;
  Constructs[1].post = Construct.post;
END;

RULE: Constructs ::= COMPUTE
  Constructs.post = Constructs.pre;
END;

RULE: Construct ::= Definition COMPUTE
  Definition.pre = Construct.pre;
  Construct.post = Definition.post;
END;

RULE: Construct ::= Statement COMPUTE
  Statement.pre = Construct.pre;
  Construct.post = Statement.post;
END;

RULE: Definition ::= 'define' Ident ';' COMPUTE
  Definition.printed =
    printf ("Def %d defines %s in line %d\n",
            Definition.pre, StringTable (Ident), LINE);
  Definition.post = ADD (Definition.pre, 1) <- Definition.printed;
END;

RULE: Statement ::= 'use' Ident ';' COMPUTE
  Statement.post = Statement.pre;
END;

RULE: Statement ::= Block COMPUTE
  Block.pre = Statement.pre;
  Statement.post = Block.post;
END;
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 419
Objectives:
Understand left-to-right propagation
In the lecture:
Explain
* the systematic use of attribute pairs for propagation,
* the strict dependences of the computations on the "propagation chain".
Questions:
What would the output look like if we had omitted the pre/post attributes and their dependences?
--------------------------------------------------------------------------------
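Hand-coded, the pre/post propagation of PLaC-4.19 amounts to a counter that is passed into each subtree and returned from it. A sketch in C, assuming a simple tree representation (structs and names are illustrative; line numbers are omitted):

#include <stdio.h>

typedef enum { DEFINE, USE, BLOCK } Kind;
typedef struct Construct {
    Kind kind;
    const char *ident;               /* for DEFINE / USE               */
    struct Construct *constructs;    /* for BLOCK: list of children    */
    struct Construct *next;          /* next sibling in a Constructs list */
} Construct;

static int visit(Construct *c, int pre)          /* returns 'post' */
{
    for (; c != NULL; c = c->next) {
        switch (c->kind) {
        case DEFINE:
            printf("Def %d defines %s\n", pre, c->ident);
            pre = pre + 1;                       /* Definition.post = Definition.pre + 1 */
            break;
        case USE:                                /* Statement.post = Statement.pre       */
            break;
        case BLOCK:                              /* Block.pre in, Block.post out         */
            pre = visit(c->constructs, pre);
            break;
        }
    }
    return pre;
}

int main(void)
{
    /* { define a; { define b; } use a; define c; } */
    Construct def_b = { DEFINE, "b", NULL, NULL };
    Construct def_c = { DEFINE, "c", NULL, NULL };
    Construct use_a = { USE,    "a", NULL, &def_c };
    Construct inner = { BLOCK,  NULL, &def_b, &use_a };
    Construct def_a = { DEFINE, "a", NULL, &inner };
    visit(&def_a, 0);   /* prints: Def 0 defines a, Def 1 defines b, Def 2 defines c */
    return 0;
}

The CHAIN construct of the next slide lets the AG writer state exactly this pattern without writing the systematic copy rules by hand.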
PLaC-4.20  Left-to-right depth-first propagation using a CHAIN

A CHAIN specifies a left-to-right depth-first dependency through a subtree.
There is one CHAIN name; attribute pairs are generated where needed.
CHAINSTART initializes the CHAIN in the root context of the CHAIN.
Computations on the CHAIN are strictly bound by dependences.

CHAIN count: int;

RULE: Root ::= Block COMPUTE
  CHAINSTART Block.count = 0;
END;

RULE: Definition ::= 'define' Ident ';' COMPUTE
  Definition.print =
    printf ("Def %d defines %s in line %d\n",
            Definition.count,        /* incoming */
            StringTable (Ident), LINE);
  Definition.count =                 /* outgoing */
    ADD (Definition.count, 1) <- Definition.print;
END;

Trivial computations of the form X.pre = Y.pre in CHAIN order can be omitted;
they are generated where needed.
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 420
Objectives:
Understand LIDO's CHAIN constructs
In the lecture:
* Explain the CHAIN constructs.
* Compare the example with PLaC-4.19.
--------------------------------------------------------------------------------
PLaC-4.21  Dependency pattern INCLUDING

The nesting depths of Blocks are computed.
An attribute at the root of a subtree is accessed from within the subtree.
The propagation from the computation to the uses is generated as needed.
No explicit computations or attributes are needed for the remaining rules and symbols.

ATTR depth: int;

RULE: Root ::= Block COMPUTE
  Block.depth = 0;
END;

RULE: Statement ::= Block COMPUTE
  Block.depth = ADD (INCLUDING Block.depth, 1);
END;

RULE: Definition ::= 'define' Ident COMPUTE
  printf ("%s defined on depth %d\n",
          StringTable (Ident),
          INCLUDING Block.depth);
END;

INCLUDING Block.depth accesses the depth attribute of the next upper node of type Block.
--------------------------------------------------------------------------------
Lecture Programming Languages and Compilers WS 2013/14 / Slide 421
Objectives:
Understand the LIDO construct INCLUDING
In the lecture:
Explain the use of the INCLUDING construct.
--------------------------------------------------------------------------------
PLaC-4.22  Dependency pattern CONSTITUENTS

A CONSTITUENTS computation accesses attributes from the subtree below its context.
The propagation from the computation to the CONSTITUENTS construct is generated where needed.
The combination with INCLUDING shown here is a common dependency pattern:
all printf calls in Definition contexts are done before any printf call in a Statement context.

RULE: Root ::= Block COMPUTE
  Root.DefDone = CONSTITUENTS Definition.DefDone;
END;

RULE: Definition ::= 'define' Ident ';' COMPUTE
  Definition.DefDone =
    printf ("%s defined in line %d\n",
            StringTable (Ident), LINE);
END;

RULE: Statement ::= 'use' Ident ';' COMPUTE
  printf ("%s used in line %d\n",
          StringTable (Ident), LINE)
  <- INCLUDING Root.DefDone;
END;

CONSTITUENTS Definition.DefDone accesses the DefDone attributes of all Definition nodes
in the subtree below this context.
--------------------------------------------------------------------------------
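For comparison, a sketch in C of the dependency pattern behind INCLUDING on PLaC-4.21: the depth of the innermost enclosing Block is handed down the recursion, which is what the generated propagation amounts to (tree representation and names are illustrative assumptions):

#include <stdio.h>

typedef enum { DEF, BLOCK } Kind;
typedef struct Node {
    Kind kind;
    const char *ident;          /* for DEF                     */
    struct Node *children;      /* for BLOCK: list of children */
    struct Node *next;          /* next sibling                */
} Node;

static void visit(Node *n, int depth)     /* depth plays the role of INCLUDING Block.depth */
{
    for (; n != NULL; n = n->next) {
        if (n->kind == DEF)
            printf("%s defined on depth %d\n", n->ident, depth);
        else                              /* nested Block: Block.depth = INCLUDING Block.depth + 1 */
            visit(n->children, depth + 1);
    }
}

int main(void)
{
    /* contents of the outermost block: { define a; { define b; } } */
    Node def_b = { DEF, "b", NULL, NULL };
    Node inner = { BLOCK, NULL, &def_b, NULL };
    Node def_a = { DEF, "a", NULL, &inner };
    visit(&def_a, 0);   /* prints: a defined on depth 0, b defined on depth 1 */
    return 0;
}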