"FUN" With Attribute Grammars

Presented by Jeff Palm
Transcribed by Peter Douglass
 

Some motivations for attribute grammars:

Knuth '92

This paper is a retrospective

What is an attribute grammar?

example:

productions                       attribute equations
                                  (v = value)            (s = scale)                 (l = length)

Z -> L1 '.' L2                    Z.v = L1.v + L2.v      L1.s = 0    L2.s = -L2.l
       |  L                       Z.v = L.v              L.s = 0

L -> L1 B                         L.v = L1.v + B.v       L1.s = L.s + 1  B.s = L.s   L.l = L1.l + 1
      |  B                        L.v = B.v              B.s = L.s                   L.l = 1

B -> '1'                          B.v = 2^B.s
      |  '0'                      B.v = 0

This example provides an attribute grammar for binary numbers: the synthesized attribute Z.v at the root is the value of the numeral.
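The attribute flow above can be sketched as a small evaluator (my own rendering, not from the talk; the function names eval_L and eval_Z are invented for illustration). Synthesized attributes flow up (v = value, l = length); the inherited attribute s (scale) flows down to each bit.

```python
# A minimal sketch of Knuth's binary-number attribute grammar.
# Synthesized: v (value), l (length).  Inherited: s (scale).

def eval_L(bits, s):
    """L -> L1 B | B.  Given the inherited scale s of the rightmost bit,
    return the synthesized attributes (L.v, L.l)."""
    # Bit i counted from the right has scale s + i, so B.v = 2^B.s per '1' bit.
    v = sum(int(b) * 2 ** (s + i) for i, b in enumerate(reversed(bits)))
    return v, len(bits)

def eval_Z(text):
    """Z -> L '.' L | L.  Z.v is the value of the whole numeral."""
    if '.' in text:
        left, right = text.split('.')
        lv, _ = eval_L(left, 0)              # L1.s = 0
        rv, _ = eval_L(right, -len(right))   # L2.s = -L2.l
        return lv + rv
    v, _ = eval_L(text, 0)                   # L.s = 0
    return v

print(eval_Z('1101.01'))  # 13.25
```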

------------------
 

How did attribute grammars come about?

Knuth and Wegner
Wegner asked, "What is the big problem in computer science?"
Knuth answered, "We don't know how to talk about programming languages."
Solution proposed:  attribute grammars with "up" (synthesized) and "down" (inherited) attributes

The theory of attribute grammars was worked out in the '70s.
In the '80s there was research into applications of attribute grammars.
After the '80s, little research.

Papers

Reps '84      Synthesizer Generator

    takes an attribute grammar
    generates a syntax-directed editor from the attribute grammar
    similar to YACC

    specify abstract syntax
    exp: NullExpr()
           | Sum(exp exp)
           | Const(Int)

    paper uses phylum/phyla for type/types

    Question:  Why is there a NullExpr() in the abstract syntax?
    Answer:  We want to be able to determine all attributes at all times.

    Syntax with attributes:

exp:
    NullExpr()      { exp.v = 0 }
 |  Sum(exp exp)    { exp0.v = exp1.v + exp2.v }
 |  Const(Int)      { exp.v = Int }
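The attributed abstract syntax above can be sketched in plain Python (my own rendering, not the Synthesizer Generator's notation): each constructor becomes a dataclass, and the synthesized attribute v becomes a recursive function over the tree.

```python
# Sketch of the attributed abstract syntax: constructors as dataclasses,
# the synthesized attribute v as a recursive evaluator.

from dataclasses import dataclass

class Exp: pass

@dataclass
class NullExpr(Exp):
    pass

@dataclass
class Sum(Exp):
    left: Exp
    right: Exp

@dataclass
class Const(Exp):
    value: int

def v(e):
    """The synthesized attribute v, defined by the equations above."""
    if isinstance(e, NullExpr):
        return 0                       # exp.v = 0
    if isinstance(e, Sum):
        return v(e.left) + v(e.right)  # exp0.v = exp1.v + exp2.v
    if isinstance(e, Const):
        return e.value                 # exp.v = Int
    raise TypeError(e)

print(v(Sum(Const(1), Sum(Const(2), NullExpr()))))  # 3
```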

Other components of this system were

    an unparser
        Sum [exp1 "+" exp2]
        Const [Int]

    Question:  If we want to go bottom up, what do we need?
    Answer:
        exp { abs }
            Sum: { exp0.abs = Sum(exp1.abs, exp2.abs) }

    Question:  This looks redundant?
                    Does it have to do with the difference between a parse tree and an AST?
    Answer:  An abstract grammar is not revealed by a concrete grammar

    Question:  Why don't people use these?
    Answer:  They don't scale up.
    Disagreement:  They do scale up; suggestion that an Ada compiler uses attribute-grammar technology.

    The next two systems have admitted hacks

    Question:  Do YACC/Bison use attribute grammars?
    Answer:   No, they use action routines, which are stronger.

    Question:  Why restrict yourself if action routines are stronger and can produce safe programs?
    Answer:  Attribute grammars have some nice properties
 
        denotable values in the language could have attributes
        e.g. environments: to add a new type, add it as a syntactic entity

        ENV = NullEnv()
            | EnvConcat(Binding ENV)

        This is similar to define-datatype in scheme, in that it provides constructors and destructors
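The ENV datatype can be sketched in Python to illustrate the constructors-and-destructors point (my own rendering; representing a Binding as a (name, value) pair is an assumption): NullEnv and EnvConcat build environments as syntax, and lookup destructures them.

```python
# Sketch of the ENV datatype: constructors build the environment as
# syntax; lookup is the corresponding destructor.

from dataclasses import dataclass

@dataclass
class NullEnv:
    pass

@dataclass
class EnvConcat:
    binding: tuple   # (name, value) -- representation assumed
    rest: object     # NullEnv or EnvConcat

def lookup(env, name):
    """Destructure the environment to find the binding for name."""
    if isinstance(env, NullEnv):
        raise KeyError(name)
    n, val = env.binding
    return val if n == name else lookup(env.rest, name)

env = EnvConcat(('x', 1), EnvConcat(('y', 2), NullEnv()))
print(lookup(env, 'y'))  # 2
```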

     We could write a calculator with this method, but not much that is more expressive than that.  Why?
 
    Question:  What property are they taking advantage of?
    Answer:  Equations are functional -- no side effects.  Action routines may have side effects.
                    The language is "static"; it can't specify "dynamic" properties of a language, such as state.
                    The Reps system is strongly normalizing.
    Comment:  There was a company formed which made use of this technology.  It isn't completely useless
 

Ganapathi and Fischer '82  -- Code-Generator Generator

The aim of the work in this paper is to create retargetable compilers with attribute grammars.
There is an attribute grammar for the target architecture.
The system would generate code.
The claim is that this method would support both machine-dependent and machine-independent optimizations.

Question:  What are the constraints on the target language?
Answer:  Must be expressible in a grammar with two types of productions

Example:

   (The '^' symbol indicates synthesized attributes, '_' indicates inherited attributes)

   (addressing mode)
    address^a -> Disp^b  Base^c ADDR(_b_c^a)

            := Addr^a - Addr^b Addr^c

    (instruction selection)
    Byte^r -> Byte^a Byte^r EMIT(add2_a_r)
    Byte^x -> Addr^x IsByte(_x)   (<-- disambiguating predicate)

Disambiguating predicates not in formal definition of Attribute Grammars

Possible optimizations in code generation

    Byte^s -> Byte^s Byte^r IsOne(_r) EMIT(decb_s)
    Byte^r -> Byte^a Byte^r EMIT(sub2_a_r)
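The idea behind these rules can be sketched as ordinary code (an illustration of the technique, not Ganapathi and Fischer's system; the function names are invented): a disambiguating predicate like IsOne guards a rule that selects a cheaper instruction before the general rule applies.

```python
# Sketch of predicate-guided instruction selection: try the specialized
# rule (decb, guarded by IsOne) before the general rule (sub2).

def is_one(operand):
    return operand == 1          # disambiguating predicate IsOne(_r)

def select_sub(dst, src):
    """Lower 'dst := dst - src', preferring the cheaper decb when src is 1."""
    if is_one(src):
        return f"decb {dst}"     # optimization rule: EMIT(decb_s)
    return f"sub2 {src}, {dst}"  # general rule: EMIT(sub2_a_r)

print(select_sub('r0', 1))   # decb r0
print(select_sub('r0', 5))   # sub2 5, r0
```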

The claim is that this system is able to specify portable optimizations.
However, the optimizations are specified in the attribute grammar of the target language!

Discussion:  Why is this "portable" or "retargetable"?
    Optimizations are declaratively stated.
    Perhaps in 1982 "copy and paste" retargetable was meant
    Perhaps in 1982 "many" machines had an architecture similar to the example
 

Generator from Attribute Grammars '84

Reps tried to do the whole thing and spit out a whole system
CGG is just a code generator

GAG (Generator from Attribute Grammars) not all in a single grammar
--front end only

Able to generate a number of front ends: for Pascal, PEARL (not Perl), Ada
The biggest hurdle seemed to be scalability
Could it have worked 20 years later?

Discussion:  Why did people stop researching this area?
1)  Most of the pure research was completed
2)  Practical aspects ran into trouble
3)  Denotational semantics had greater success in specifying compilers succinctly