comp.lang.ada
* Re: The Red Language
       [not found]   ` <340ED5D8.2DEF6D3@ux4.sp.cs.cmu.edu>
@ 1997-09-04  0:00     ` Robert Munck
  1997-09-07  0:00       ` Robert Dewar
       [not found]     ` <199709051335.PAA25952@basement.replay.com>
  1 sibling, 1 reply; 46+ messages in thread
From: Robert Munck @ 1997-09-04  0:00 UTC



On Thu, 04 Sep 1997 11:38:00 -0400, "Dean F. Sutherland"
<dfsuther@ux4.sp.cs.cmu.edu> wrote:

>We used a descendent of Red as our main implementation language at
>Tartan in the mid-80s. ... Gnal ... Tartan was named in honor of
>the company's roots at Carnegie-Mellon.

I stand corrected.  I was thinking of a language design called
"Tartan" that was done in answer to the four colored design
proposals.  The name that comes to mind is Mary Shaw (and wasn't
she at CMU then?) but I'm not sure about that.  (I do remember
where the report was on the shelves of the SofTech library some
two decades ago.)

Tartan was done as a protest against the complexity of the four 
bespoke designs.  It was much simpler; the specification was
about 1/20th as thick as the others (I know, that's not a very
good measure of language complexity).  It was said to be compliant
with Steelman, a claim I was unqualified to evaluate.  (I do
remember John Goodenough saying that our (SofTech's) design
(Blue?) came closer to meeting that goal than did Green or either
of the others, and that we should have been a bit less fanatical
about Steelman.  Hey, how does Ada95 do against Steelman?  It
might be interesting to hear in what major ways Ada95 violates
the old requirements.)

Bob Munck
Mill Creek Systems LC





* Re: The Red Language
       [not found]     ` <199709051335.PAA25952@basement.replay.com>
@ 1997-09-05  0:00       ` Dean F. Sutherland
  1997-09-08  0:00         ` Robert A Duff
  0 siblings, 1 reply; 46+ messages in thread
From: Dean F. Sutherland @ 1997-09-05  0:00 UTC



> How does this differ from a function with a pragma Inline, and a
> compiler that does inlining?
> 

The first significant difference that I recall is that Gnal required
that the inlining must occur, even across compilation units.  As you
know, Ada's pragma Inline is advisory; compilers may ignore it if they
so choose.

The second difference is the name binding.  Arguments were evaluated on
each reference in the routine, rather than just once at the call site. 
Note that name binding was a choice you could make; it was not required.
(Looking at my old Gnal manual, I find:

nam	This primary binding mode is permitted only for formals of
	inline functions and abbreviations.  Each reference to the
	formal results in a reevaluation of the actual object
	expression.

Extracted from section 6.3, binding rules)
Other choices for binding were val, ref, any, and repval. 
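
To make the difference concrete, here is a small Ada 95 sketch (my own
emulation, not Gnal syntax) that fakes nam binding with a generic
formal function -- a thunk -- so that each reference re-evaluates the
actual expression:

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Nam_Demo is

      X : Integer := 0;

      --  Ordinary Ada binding: the actual is evaluated once, at the call.
      procedure By_Val (V : Integer) is
      begin
         X := X + 1;
         Put_Line (Integer'Image (V));  --  value captured at the call
         Put_Line (Integer'Image (V));  --  same value again
      end By_Val;

      --  Emulated nam binding: the formal is a parameterless function,
      --  re-evaluated at each reference in the body.
      generic
         with function V return Integer;
      procedure By_Nam;

      procedure By_Nam is
      begin
         X := X + 1;
         Put_Line (Integer'Image (V));  --  re-evaluates the actual
         Put_Line (Integer'Image (V));
      end By_Nam;

      function X_Plus_10 return Integer is
      begin
         return X + 10;
      end X_Plus_10;

      procedure Call_By_Nam is new By_Nam (V => X_Plus_10);

   begin
      By_Val (X + 10);  --  prints 10 twice; X was 0 at the call
      Call_By_Nam;      --  prints 12 twice; X is 2 at each reference
   end Nam_Demo;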

Dean





* Re: The Red Language
       [not found] <340E2DC5.25D7@worldnet.att.net>
       [not found] ` <340ebdaf.230366903@news.mindspring.com>
@ 1997-09-07  0:00 ` Robert Dewar
  1997-09-08  0:00   ` Tucker Taft
  1997-09-12  0:00 ` Robert A Duff
  2 siblings, 1 reply; 46+ messages in thread
From: Robert Dewar @ 1997-09-07  0:00 UTC



Michael said

<<So. Would the Red language be "doable" with today's compiler
technology? Would it (might it) be superior to present-day
languages?

Or do we already HAVE the Red language in our midst?
(Red = Intermetrics = STT = Ada95)  ;^)>>


No, Ada 95 is nothing like Red (at least not in the respects where
Red and Green most importantly differed). The genesis of Ada 95,
despite your rather confused equation, is that first a requirements
document was written, which was very much focused on the revision
requests that had been received. As one of the authors of this document,
I cannot remember Red coming up at any point during the writing of the
requirements.

Then, given these requirements, a contract was let to Intermetrics to design
to these requirements, and at that point the design effort, led by STT, but
involving many others of course, was a process of realizing these requirements,
which of course were only guideposts, nothing like a full language design.

But even in this later design process, I can barely remember Red being
considered as a source of inspiration. Most of the critical missing 
capability in Ada 83 was also missing in Red, so mostly Red/Green
differences were simply not relevant to the process.

Do not assume that everyone agrees with the idea that Red was superior,
but too hard to implement. I personally *always* preferred Green from
a technical point of view, even allowing for the fact that Green was
far more polished and complete than Red.






* Re: The Red Language
  1997-09-04  0:00     ` The Red Language Robert Munck
@ 1997-09-07  0:00       ` Robert Dewar
  1997-09-08  0:00         ` Richard Kenner
  0 siblings, 1 reply; 46+ messages in thread
From: Robert Dewar @ 1997-09-07  0:00 UTC



Bob says

<<Tartan was done as a protest against the complexity of the four
bespoke designs.  It was much simpler; the specification was
about 1/20th as thick as the others (I know, that's not a very
good measure of language complexity).  It was said to be compliant
with Steelman, a claim I was unqualified to evaluate.  (I do
remember John Goodenough saying that our (SofTech's) design
(Blue?) came closer to meeting that goal than did Green or either
of the others, and that we should have been a bit less fanatical
about Steelman.  Hey, how does Ada95 do against Steelman?  It
might be interesting to hear in what major ways Ada95 violates
the old requirements.)>>


One obvious violation is the presence of pointers to procedures, which
were specifically ruled out (for good reason) by Steelman! There are
undoubtedly others, but this comes immediately to mind :-)






* Re: The Red Language
  1997-09-05  0:00       ` Dean F. Sutherland
@ 1997-09-08  0:00         ` Robert A Duff
  1997-09-09  0:00           ` Arthur Evans Jr
  0 siblings, 1 reply; 46+ messages in thread
From: Robert A Duff @ 1997-09-08  0:00 UTC



In article <34102DF5.36D589F1@ux4.sp.cs.cmu.edu>,
Dean F. Sutherland <dfsuther@ux4.sp.cs.cmu.edu> wrote:
>The first significant difference that I recall is that Gnal required
>that the inlining must occur, even across compilation units.  As you
>know, Ada's pragma Inline is advisory; compilers may ignore it if they
>so choose.

How can a language definition *require* any such thing?  How could you
test whether the implementation did so?

- Bob





* Re: The Red Language
  1997-09-07  0:00       ` Robert Dewar
@ 1997-09-08  0:00         ` Richard Kenner
  1997-09-12  0:00           ` David Wheeler
  0 siblings, 1 reply; 46+ messages in thread
From: Richard Kenner @ 1997-09-08  0:00 UTC



In article <dewar.873678467@merv> dewar@merv.cs.nyu.edu (Robert Dewar) writes:
><<Hey, how does Ada95 do against Steelman?  It
>might be interesting to hear in what major ways Ada95 violates
>the old requirements.)>>
>
>One obvious violation is the presence of pointers to procedures, which
>were specifically ruled out (for good reason) by Steelman! There are
>undoubtedly others, but this comes immediately to mind :-)

There was an article "rating" Ada95, C, C++, and Java against Steelman
in the latest Ada Letters (XII.4).

That was one.  Another was 3-3F (nonassignable record components).  The
third was 10F (assertions, but they point out that GNAT has this).
There were also 5 "partial" and 11 "mostly" entries.







* Re: The Red Language
  1997-09-07  0:00 ` Robert Dewar
@ 1997-09-08  0:00   ` Tucker Taft
  0 siblings, 0 replies; 46+ messages in thread
From: Tucker Taft @ 1997-09-08  0:00 UTC



Robert Dewar (dewar@merv.cs.nyu.edu) wrote:

: Michael said

: <<So. Would the Red language be "doable" with today's compiler
: technology? Would it (might it) be superior to present-day
: languages?

: Or do we already HAVE the Red language in our midst?
: (Red = Intermetrics = STT = Ada95)  ;^)>>

: No, Ada 95 is nothing like Red (at least not in the respects where
: Red and Green most importantly differed). 

Although I work at Intermetrics, I have very little familiarity
with "Red."  I joined Intermetrics in September 1980, after the "Red" 
language effort was over, and John Nestor had left for "greener"
pastures ;-).

Almost all of the ideas for Ada 95 were based on experiences with
Ada 83 and other 80's languages such as C++, Modula-X, CLOS, etc.
plus reading knowledge of other languages such as Oberon, ORCA,
Haskell, etc.

--
-Tucker Taft   stt@inmet.com   http://www.inmet.com/~stt/
Intermetrics, Inc.  Burlington, MA  USA





* Re: The Red Language
  1997-09-08  0:00         ` Robert A Duff
@ 1997-09-09  0:00           ` Arthur Evans Jr
       [not found]             ` <dewar.873953300@merv>
  0 siblings, 1 reply; 46+ messages in thread
From: Arthur Evans Jr @ 1997-09-09  0:00 UTC



In article <EG7FzG.5Dx@world.std.com>, bobduff@world.std.com (Robert A
Duff) wrote:

> In article <34102DF5.36D589F1@ux4.sp.cs.cmu.edu>,
> Dean F. Sutherland <dfsuther@ux4.sp.cs.cmu.edu> wrote:
> >The first significant difference that I recall is that Gnal required
> >that the inlining must occur, even across compilation units.  As you
> >know, Ada's pragma Inline is advisory; compilers may ignore it if they
> >so choose.
> 
> How can a language definition *require* any such thing?  How could you
> test whether the implementation did so?

Well, the language definition required it by saying in the document
that that was how the language worked.

Since the main target of the language definition (written by John
Nestor) was the implementation team (John Nestor), testing of this kind
never showed up as an issue.  John said he would do it; later he said he
had done it.  We believed him; why not?

Bob: I think maybe you've been involved too long with Ada.  :-)

Art

Arthur Evans Jr, PhD
Ada Consulting
evans@evans.pgh.pa.us





* Re: The Red Language
  1997-09-11  0:00               ` Robert Dewar
@ 1997-09-11  0:00                 ` Arthur Evans Jr
  1997-09-12  0:00                   ` Robert A Duff
  1997-09-12  0:00                   ` Robert Dewar
  1997-09-11  0:00                 ` Dean F. Sutherland
  1 sibling, 2 replies; 46+ messages in thread
From: Arthur Evans Jr @ 1997-09-11  0:00 UTC



In article <dewar.873953773@merv>, dewar@merv.cs.nyu.edu (Robert Dewar) wrote:

> Just because a language definition says XXX does not make it so,
> if the statement is vacuous.

Robert: You are missing the point entirely.  If a language definition,
like Ada, is to be useful in the presence of implementors other than the
designer, then of course it's vacuous to include specifications that
cannot be tested.

OTOH, if the only implementor is the designer, and the definition is
provided largely for the language's users, then such specs are of
interest to the users, and therefore useful.  That was the situation in
Gnal.

> Remember that a compiler is always allowed to treat a language definition
> "as-if", which means that any generated code that makes a compiler run
> the code the same way as some other way is equivalent. Since inlining
> has, by definition, no semantic effect, it is obvious that semantic
> rules in the language can neither require it nor proscribe it.

If inlined code is more time-efficient and the app thereby makes its
deadlines but does not otherwise, then there's a semantic effect.  The
plane flies one way; it crashes the other.

> As for "we believed him, why not?", the why not is because the claim was
> on its face unsupportable!

There are places in this world where folks say things and other folks
believe them.  The GNAT documentation explains the circumstances in
which 'pragma inline' has an effect; I believe what is written.  In
fact, I could test the claim by examining compiled code; why bother?

Art





* Re: The Red Language
       [not found]             ` <dewar.873953300@merv>
@ 1997-09-11  0:00               ` Robert Dewar
  1997-09-11  0:00                 ` Arthur Evans Jr
  1997-09-11  0:00                 ` Dean F. Sutherland
  0 siblings, 2 replies; 46+ messages in thread
From: Robert Dewar @ 1997-09-11  0:00 UTC



Art said

<<Well, the language definition required it by saying in the document
  that that was how the language worked.

  Since the main target of the language definition (written by John
  Nestor) was the implementation team (John Nestor), testing of this kind
  never showed up as an issue.  John said he would do it; later he said he
  had done it.  We believed him; why not?

  Bob: I think maybe you've been involved too long with Ada.  :-)>>

Just because a language definition says XXX does not make it so,
if the statement is vacuous.

For example, suppose we come across a statement in a language definition
that says

  All procedures in this language shall be octagonal.

Now we go and look up in the index what octagonal means, and if we don't
find a definition, then at best this is a comment with no normative force,
and at worst it is rubbish.

A statement that certain procedure calls shall be inlined is similarly
dubious in the absence of a definition of inlining *at an appropriate
level of abstraction* -- a definition that talks about generated code
is junk, since you cannot define the notion of generated code in the abstract.

Remember that a compiler is always allowed to treat a language definition
"as-if", which means that any generated code that makes a compiler run
the code the same way as some other way is equivalent. Since inlining
has, by definition, no semantic effect, it is obvious that semantic
rules in the language can neither require it nor proscribe it.

As for "we believed him, why not?", the why not is because the claim was
on its face unsupportable!







* Re: The Red Language
  1997-09-11  0:00               ` Robert Dewar
  1997-09-11  0:00                 ` Arthur Evans Jr
@ 1997-09-11  0:00                 ` Dean F. Sutherland
  1997-09-12  0:00                   ` Robert A Duff
  1 sibling, 1 reply; 46+ messages in thread
From: Dean F. Sutherland @ 1997-09-11  0:00 UTC
  To: Robert Dewar


I guess that Robert is correct for the general case of specifying inline
expansion.  There's one specific case where I believe that the Gnal
"spec" does indeed reasonbly "mandate" inline expansion (possibly even
to Robert's satisfaction).

Routines with nam binding for their formals would seem to "require"
actual inline expansion.  (You'll recall from earlier in this thread
that nam binding specifies re-evaluation of the actual object expression
from the call site at each reference in the body of the routine.) 
Treating this as (fully semantically checked) macro expansion is the
obvious and straightforward way to achieve this effect.

On the other hand, I suppose that one _could_ do something with
carefully produced function closures or some other such hack (I mean
workaround) -- but I'm hard pressed to see why it would be worth the
effort.

A formalist might argue that, given the lack of a formal language
definition, the inline expansion "mandate" is no more than strongly
worded implementation advice.  Of course, this would also imply that
many common procurement specifications for compilation systems are also
no more than strongly worded implementation advice...

Dean





* Re: The Red Language
  1997-09-08  0:00         ` Richard Kenner
@ 1997-09-12  0:00           ` David Wheeler
  1997-09-12  0:00             ` Robert A Duff
  0 siblings, 1 reply; 46+ messages in thread
From: David Wheeler @ 1997-09-12  0:00 UTC



Richard Kenner (kenner@lab.ultra.nyu.edu) wrote:
: In article <dewar.873678467@merv> dewar@merv.cs.nyu.edu (Robert Dewar) writes:
: ><<Hey, how does Ada95 do against Steelman?  It
: >might be interesting to hear in what major ways Ada95 violates
: >the old requirements.)>>
: >
: >One obvious violation is the presence of pointers to procedures, which
: >were specifically ruled out (for good reason) by Steelman! There are
: >undoubtedly others, but this comes immediately to mind :-)

: There was an article "rating" Ada95, C, C++, and Java against Steelman
: in the latest Ada Letters (XII.4).

This paper is also available on-line.
It's called "Ada, C, C++, and Java vs. The Steelman" and it's at:

  http://www.adahome.com/History/Steelman/steeltab.htm

This paper gives the exact Steelman text and how well each of these
languages supports the Steelman requirement (with additional commentary).

I would not use the term "rating"; the paper is simply a comparison.
Here's the abstract of the paper:
"This paper compares four computer programming languages (Ada95, C, C++,
 and Java) with the requirements of "Steelman", the original 1978
 requirements document for the Ada computer programming language. This
 paper provides a view of the capabilities of each of these languages,
 and should help those trying to understand their technical
 similarities, differences, and capabilities."

And note the concluding paragraph:
"Again, users of this paper should apply caution; differing percentages
 of "yes" values and capabilities do not necessarily imply that a
 particular language is more suitable than another for a given specific
 task. Readers should examine each answer and determine which "yes" and
 "no" values are of importance to them. This information is intended to
 help readers gain additional understanding of each of these different
 languages."

Also - an electronic version of Steelman is available at:
  http://www.adahome.com/History/Steelman/intro.htm


--- David A. Wheeler
    dwheeler@ida.org






* Re: The Red Language
       [not found] <340E2DC5.25D7@worldnet.att.net>
       [not found] ` <340ebdaf.230366903@news.mindspring.com>
  1997-09-07  0:00 ` Robert Dewar
@ 1997-09-12  0:00 ` Robert A Duff
  1997-09-12  0:00   ` Michael & Amy Hartsough
                     ` (2 more replies)
  2 siblings, 3 replies; 46+ messages in thread
From: Robert A Duff @ 1997-09-12  0:00 UTC



In article <340E2DC5.25D7@worldnet.att.net>,
Michael & Amy Hartsough  <Hartsough@worldnet.att.net> wrote:
>What about the "Red" language?

It's interesting, for historical reasons if nothing else.

>I recall reading somewhere (was it an Ada Letters article
>by JP Barnes? Or was it in the "History" chapter of Booch's
>Brown book?) something along the lines of, "The Red language
>contained the seed of a potentially superior language, but
>was deemed beyond the capabilities of then current compiler
>technology."

I think that's bogus.  Red had some nasty holes which made it impossible
to implement, but then so did Green (and so did Ada 83, and so does Ada
95).  If you read the manual assuming that anything unimplementable is
just ridiculous (so they couldn't have meant that), and assuming that
Red had undergone all the language bug-fixing that Green did, I think
you will find that Red is no harder to implement than Green.

>Of course, since I'm going by rather old memory, the quote may
>have mentioned either the Blue or the Yellow language, but my
>memory does say, "Red".

I know nothing of Blue or Yellow.  I've read the Red and Green manuals,
as well as various follow-ons of Green, including of course Ada 83 and
Ada 95.

>So. Would the Red language be "doable" with today's compiler
>technology?

Yes, given certain bug fixes.

>... Would it (might it) be superior to present-day
>languages?

In some ways, perhaps, but not overall, IMHO.  Ada has gone a long ways
since those days.  So have lots of other languages.

>Or do we already HAVE the Red language in our midst?
>(Red = Intermetrics = STT = Ada95)  ;^)

The Ada 9X team did not pay a whole lot of attention to Red.  I had read
the Red manual.  Probably Tucker had, too, but Tucker started working at
Intermetrics after Green was chosen to be Ada.  Lots and lots of
languages had some influence on Ada 95.

>From where can I obtain a draft of the Red language?

Beats me.  Maybe Ben Brosgol knows.  I have a paper copy...

In most ways, Red and Ada are similar, which is not surprising, since
they both came out of the same Steelman requirements, and they were both
(very roughly) based on Pascal.

Interesting things about Red:

Its modules are called "capsules" (not "packages").  Capsules are given
in one piece, not spec-vs-body.  (Note recent
mountains-out-of-mole-hills thread about Eiffel classes vs Ada
packages.)

Capsules can be "invoked" as either "old" or "new".  "Old" is sort of
like an Ada with_clause.  "New" is sort of like an Eiffel class, in that
it creates a new copy of that capsule (but not like an Eiffel class in
many other ways, of course.)

No underscores in numeric literals.  I really like Ada's ability to say
1_000_000, rather than 1000000, and I don't understand why many people
don't use it.  Red doesn't allow it.

Enumeration literals are syntactically distinguished (preceding quote).
Every enum type is essentially a character type.

You can declare variables locally within any sequence of statements
without introducing a declare block.

Forward references are allowed to procedures and capsules and so forth,
but not to variables and constants and so forth.  (Again, check out the
recent arguments re: Eiffel vs Ada.)

Assertions (similar in semantics to GNAT's pragma Assert).

Imports clauses -- you have to say what you import, as well as what you
export.  This applies to nested things like procedures, unlike Ada.

Parameter passing semantics is nailed down.  (E.g. the programmer
chooses whether pass-by-ref-read-only vs. pass-by-copy-read-only is
used.)

Strings always start at 1.

You can redefine "=" and "<", and then "/=", ">", "<=", ">=" follow.
Ada does this for "/=", but not the others.  In Red, redefining the
latter four is illegal.

if <statically-false-expression> then
    <something-illegal>
end if;

is a warning, not an error.  Similarly for case statements.  I guess
this is their take on conditional compilation.
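
(The nearest Ada idiom, for what it's worth -- a minimal sketch with
invented names -- keeps the dead branch fully legal and lets the
compiler delete it:

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Trace_Demo is
      Debugging : constant Boolean := False;  --  flip to True to enable
   begin
      if Debugging then
         --  Unlike Red, this branch must still be legal Ada even
         --  though it can never execute.
         Put_Line ("trace info");
      end if;
      Put_Line ("real work here");
   end Trace_Demo;

Red's warning-not-error rule goes further, since there the dead branch
may contain things that would not even compile.)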

Weaker overload resolution rules (than Ada).  Eg no top-down resolution,
except in limited cases.  Function result types not considered for
resolution.  But enum lits can be overloaded (and there's no pretense
that they're functions).

More user-definable stuff.  Like literal notation, array indexing and
slicing notation, dot-selection notation, etc.  One can redefine the
meaning of case statements and for loops, for a given type.

Case statements are more general than in Ada.  The when's don't need to
be static.  So you can say something like:

    case True is
        when Foo(...) => ...;
        when Bar(...) => ...;
    end case;

where Foo and/or Bar might return True.  If more than one alternative
matches, then the choice is arbitrary!

Function return of caller-unknown-size values is not allowed in Red.

No named-notation parameters.

User-defined finalization.

User-defined assignment.

Generics: Capsules and subprograms can be generic, but so can types.
Generic instantiation is implicit.  (See recent thread arguing about the
readability of such a feature.  The examples in the Red manual look
pretty readable to me, in this regard.)

Tasking uses lower-level primitives (than Ada).  Eg mailboxes.

Delay-until statement.

User-defined scheduling.

Garbage collection seems to be expected.  Unchecked_Deallocation raises
an exception on error, but this can be suppressed.

Uninitialized variables detected at run time.

One suppresses *exceptions* (not checks).  The predefined exceptions are
at a finer granularity than in Ada (which conflates many things under
Constraint_Error, for example).  Green was like Red in this respect
(i.e. Green had one exception for each "check" defined in 11.7, more or
less).

And lots of other stuff.

It's hard for me to judge Red, since I tend to compare it to Ada as it
is today, rather than Green as it was then, which is rather unfair.

Some of it seems better than Ada, some of it seems worse.  I would
particularly miss named-notation parameters, for example.

- Bob





* Re: The Red Language
  1997-09-11  0:00                 ` Dean F. Sutherland
@ 1997-09-12  0:00                   ` Robert A Duff
  0 siblings, 0 replies; 46+ messages in thread
From: Robert A Duff @ 1997-09-12  0:00 UTC



In article <3418390D.81B18056@ux4.sp.cs.cmu.edu>,
Dean F. Sutherland <dfsuther@ux4.sp.cs.cmu.edu> wrote:
>I guess that Robert is correct for the general case of specifying inline
>expansion.  There's one specific case where I believe that the Gnal
>"spec" does indeed reasonbly "mandate" inline expansion (possibly even
>to Robert's satisfaction).

I doubt it.  (That is, I doubt it's to Robert's satisfaction.  It
certainly doesn't satisfy me.)

>Routines with nam binding for their formals would seem to "require"
>actual inline expansion.  (You'll recall from earlier in this thread
>that nam binding specifies re-evaluation of the actual object expression
>from the call site at each reference in the body of the routine.) 
>Treating this as (fully semantically checked) macro expansion is the
>obvious and straightforward way to achieve this effect.

I assume that "nam" binding is the normal Algol notion of by-name
parameter passing.  I believe that passing thunks (ie function closures)
was the expected way to implement it in Algol.

>On the other hand, I suppose that one _could_ do something with
>carefully produced function closures or some other such hack (I mean
>workaround) -- but I'm hard pressed to see why it would be worth the
>effort.

I'm not convinced it's easier to implement using macro-expansion.  And
thunks might in fact be more efficient in some cases.  Anyway, I'm not
sure why you consider it a hack/workaround, since it was, apparently, a
well-known implementation technique in the 1960's (before I knew what a
computer even was).

>A formalist might argue that, given the lack of a formal language
>definition, the inline expansion "mandate" is no more than strongly
>worded implementation advice.

I do so argue.

>...Of course, this would also imply that
>many common procurement specifications for compilation systems are also
>no more than strongly worded implementation advice...

Indeed.

- Bob





* Re: The Red Language
  1997-09-12  0:00           ` David Wheeler
@ 1997-09-12  0:00             ` Robert A Duff
  0 siblings, 0 replies; 46+ messages in thread
From: Robert A Duff @ 1997-09-12  0:00 UTC



In article <5vbntn$gsb@news.ida.org>, David Wheeler <dwheeler@ida.org> wrote:
>: There was an article "rating" Ada95, C, C++, and Java against Steelman
>: in the latest Ada Letters (XII.4).
>
>This paper is also available on-line.
>It's called "Ada, C, C++, and Java vs. The Steelman" and it's at:
>
>  http://www.adahome.com/History/Steelman/steeltab.htm
>
>This paper gives the exact Steelman text and how well each of these
>languages supports the Steelman requirement (with additional commentary).

The paper overall was good, but I must say I disagreed with some of the
ratings.  That is, I thought there were several cases where the paper
claimed that Ada met a requirement that it does not.  I also think that
some (small number) of the Steelman requirements were foolish, and it
was quite right of Jean Ichbiah to violate them (or, in a few cases,
where he should have been more bold, and violated them where he didn't).

- Bob





* Re: The Red Language
  1997-09-11  0:00                 ` Arthur Evans Jr
@ 1997-09-12  0:00                   ` Robert A Duff
  1997-09-12  0:00                   ` Robert Dewar
  1 sibling, 0 replies; 46+ messages in thread
From: Robert A Duff @ 1997-09-12  0:00 UTC



In article <evans-1109972149130001@ppp33.s8.pgh.net>,
Arthur Evans Jr <evans@evans.pgh.pa.us> wrote:
>In article <dewar.873953773@merv>, dewar@merv.cs.nyu.edu (Robert Dewar) wrote:
>
>> Just because a language definition says XXX does not make it so,
>> if the statement is vacuous.
>
>Robert: You are missing the point entirely.  If a language definition,
>like Ada, is to be useful in the presence of implementors other than the
>designer, then of course it's vacuous to include specifications that
>cannot be tested.
>
>OTOH, if the only implementor is the designer, and the definition is
>provided largely for the language's users, then such specs are of
>interest to the users, and therefore useful.  That was the situation in
>Gnal.

In other words, the definition of Gnal was one particular compiler.  OK,
but I can't see how I could be expected to write a compiler for Gnal for
a different computer, since I would be required to have the exact timing
properties of the "original".

Remember this all started by somebody saying Ada was bad, because Gnal
*required* inlining, whereas Ada merely *recommends* inlining.  I claim
that's bogus nonsense.  I claim that Gnal *pretends* to require
inlining, whereas Ada is more honest, and says that inlining has no
semantic effect (but will speed up the code in some (unspecified)
cases).

>> Remember that a compiler is always allowed to treat a language definition
>> "as-if", which means that any generated code that makes a compiler run
>> the code the same way as some other way is equivalent. Since inlining
>> has, by definition, no semantic effect, it is obvious that semantic
>> rules in the language can neither require it nor proscribe it.
>
>If inlined code is more time-efficient and the app thereby makes its
>deadlines but does not otherwise, then there's a semantic effect.  The
>plane flies one way; it crashes the other.

Nonsense.  Inlining is *less* efficient in many cases.  If you claim
that the requirement is that an inlined procedure run faster than a
non-inlined one, well that's easy to implement -- the compiler can just
put a "delay Ten_Years;" at the start of each non-inlined procedure.  Or
do you claim that an inlined procedure must run faster than the
most-efficient, cleverest non-inlined version?  Well, then, show me the
cleverest compiler writer as part of your language definition.

>> As for "we believed him, why not?", the why not is because the claim was
>> on its face unsupportable!
>
>There are places in this world where folks say things and other folks
>believe them.  The GNAT documentation explains the circumstances in
>which 'pragma inline' has an effect; I believe what is written.  In
>fact, I could test the claim by examining compiled code; why bother?

Sure, you trust the GNAT docs.  So do I.  But that doesn't mean Gnal is
somehow better than Ada in *requiring* inlining as opposed to merely
*recommending* inlining.  I claim that, in a language standard,
requiring and recommending are equivalent in this case, and it's
obfuscatory to pretend that "require" means anything.  It certainly
doesn't mean the code will run faster.  Proof: Take your favorite large
Ada program, and enable inlining on every subprogram.  If the compiler
is doing its job, then that program will most certainly run slower.
(Note I said "large".)

- Bob





* Re: The Red Language
  1997-09-12  0:00 ` Robert A Duff
@ 1997-09-12  0:00   ` Michael & Amy Hartsough
  1997-09-13  0:00   ` Matthew Heaney
  1997-09-16  0:00   ` Brian Rogoff
  2 siblings, 0 replies; 46+ messages in thread
From: Michael & Amy Hartsough @ 1997-09-12  0:00 UTC



Robert A Duff wrote:
> 
> Interesting things about Red:
> 
snip snip.
Lots of great info deleted.
 
> - Bob

Thanks for the great info Bob!

	Michael





* Re: The Red Language
  1997-09-11  0:00                 ` Arthur Evans Jr
  1997-09-12  0:00                   ` Robert A Duff
@ 1997-09-12  0:00                   ` Robert Dewar
  1 sibling, 0 replies; 46+ messages in thread
From: Robert Dewar @ 1997-09-12  0:00 UTC



Art says

<<Robert: You are missing the point entirely.  If a language definition,
  like Ada, is to be useful in the presence of implementors other than the
  designer, then of course it's vacuous to include specifications that
  cannot be tested.

  OTOH, if the only implementor is the designer, and the definition is
  provided largely for the language's users, then such specs are of
  interest to the users, and therefore useful.  That was the situation in
  Gnal.
>>

No, Art, you are still confusing a language definition with implementation
notes. Who implements the language specification is quite irrelevant!

<<If inlined code is more time-efficient and the app thereby makes its
  deadlines but does not otherwise, then there's a semantic effect.  The
  plane flies one way; it crashes the other.>>

A very common confusion, actually two confusions mixed up.

  1. A language definition usually has nothing to say about timing
     performance, so a compiler is free to implement any statement
     efficiently or inefficiently. Since there is no constraint on
     efficiency, a change in efficiency is NOT a semantic effect,
     even though it may affect the timing of the program. Basically
     if you take timing into consideration, then all the language
     semantics are non-deterministic, and a change in performance
     is simply a manifestation of this non-determinism.

  2. You are assuming that inlining necessarily makes the program run
     faster. This is not at all the case:

        2a. Any movement of code like this may alter instruction cache
            effects, and arbitrarily slow down or speed up execution.

        2b. On some machines it is simply less efficient to inline the
            code. For example, on the IBM 360, if you inline, then you
            increase base register pressure, whereas out of line, the
            call establishes a convenient local base register.

     So it cannot be the case that ignoring inlining is a semantic effect
     because it would slow things down and make the app miss its deadline;
     it is equally likely that *doing* the inlining would slow things down
     and make the deadline be missed.

It is point 2 incidentally which leads us to be a bit vague in Ada about
what pragma Inline means, even informally. Generally a programmer who
specifies pragma Inline is asking for the compiler to generate code that
is as fast as possible, even if it requires more space. In a situation
where inlining would slow things down, it seems quite reasonable, even
desirable for the compiler to ignore the request.
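
(For concreteness, the request in Ada source looks like this -- a
minimal sketch with invented names:

   package Hot_Path is
      procedure Bump (X : in out Integer);
      pragma Inline (Bump);  --  advice only; the compiler may decline
   end Hot_Path;

   package body Hot_Path is
      procedure Bump (X : in out Integer) is
      begin
         X := X + 1;
      end Bump;
   end Hot_Path;

Nothing in the language semantics changes whether the compiler honors
the pragma or not.)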

> As for "we believed him, why not?", the why not is because the claim was
> on its face unsupportable!

<<There are places in this world where folks say things and other folks
  believe them.  The GNAT documentation explains the circumstances in
  which 'pragma inline' has an effect; I believe what is written.  In
  fact, I could test the claim by examining compiled code; why bother?>>

But the issue is whether GNAL achieved the effect of requiring inlining
*at the language definition level*. That's the issue here. Sure you
can believe an implementor who says something about his implementation,
and it was quite reasonable for you to believe that. However, you have
given no basis at all for believing that he achieved the goal of requiring
inlining at the language definition level.

Remember it was you who started this thread with the claim that GNAL, unlike
Ada, required inlining at the language definition level. You have not at all
substantiated that claim (nor do I think it is possible for you to do so!)







* Re: The Red Language
  1997-09-12  0:00 ` Robert A Duff
  1997-09-12  0:00   ` Michael & Amy Hartsough
@ 1997-09-13  0:00   ` Matthew Heaney
  1997-09-14  0:00     ` Robert A Duff
  1997-09-16  0:00   ` Brian Rogoff
  2 siblings, 1 reply; 46+ messages in thread
From: Matthew Heaney @ 1997-09-13  0:00 UTC



In article <EGEv58.7pn@world.std.com>, bobduff@world.std.com (Robert A
Duff) wrote:

>You can declare variables locally within any sequence of statements
>without introducing a declare block.

Funny you should mention that.  Ada wisely creates an implicit block for
statement lists in if statement alternatives and case statement
alternatives, etc, but, interestingly, doesn't extend implicit block-ness
to objects declared there.


>Weaker overload resolution rules (than Ada).  Eg no top-down resolution,
>except in limited cases.  Function result types not considered for
>resolution.

I like the Ada choice.  Why do so many language designers think less of
function return types?


>
>More user-definable stuff.  Like literal notation, array indexing and
>slicing notation, dot-selection notation, etc.

This is something I'd still like to see added to Ada: the ability to define
an index operator and a slice-operator for abstract data types, even if its
application can only appear on the RHS of assignment.

Of course, I'd really like to see name expressions allowed on the LHS.  So
we could say

Top (Stack) := 5;

or, if they really wanted to make me happy, adding user-definable selectors, ie

type Stack is private;

selector Stack.Top return Integer;

so I could say

Stack.Top := 5;

It would be cool, too, to be able to use an instantiation of
unchecked_conversion on the LHS of an assignment statement, so you could
have a writeable, alternate view of a variable.


>Case statements are more general than in Ada.  The when's don't need to
>be static.  So you can say something like:
>
>    case True is
>        when Foo(...) => ...;
>        when Bar(...) => ...;
>    end case;
>
>where Foo and/or Bar might return True.  If more than one alternative
>matches, then the choice is arbitrary!

This was probably inspired by Dijkstra's guarded command
alternative-statement.  Actually, his command made it into Green in another
guise: as a task select statement.

(Funny, I just read chap 10 of Gries' Science of Programming, the subject
of which is...guarded commands!)

>User-defined finalization.
>User-defined assignment.

Yeah, it's cool these finally got added to Ada.

And another thing I want: real constructors.  I'd like to be able to have
indefinite limited types that can be initialized in the declarative
region:

type Root_Stack is abstract tagged null record;

type Root_Stack_Iterator is abstract tagged limited null record;
 
constructor New_Iterator (Stack : access constant Root_Stack) return
Root_Stack_Iterator'Class;

which I can use to:

procedure Copy (Left : in Root_Stack'Class; Right : in out Bounded_Stack) is

   Iterator : Root_Stack_Iterator'Class'New_Iterator (Left'Access);
begin

Look Ma, no heap allocation!

--------------------------------------------------------------------
Matthew Heaney
Software Development Consultant
<mailto:matthew_heaney@acm.org>
(818) 985-1271





* Re: The Red Language
  1997-09-13  0:00   ` Matthew Heaney
@ 1997-09-14  0:00     ` Robert A Duff
  1997-09-16  0:00       ` Brian Rogoff
  0 siblings, 1 reply; 46+ messages in thread
From: Robert A Duff @ 1997-09-14  0:00 UTC



In article <mheaney-ya023680001309970036280001@news.ni.net>,
Matthew Heaney <mheaney@ni.net> wrote:
>Funny you should mention that.  Ada wisely creates an implicit block for
>statement lists in if statement alternatives and case statement
>alternatives, etc, but, interestingly, doesn't extend implicit block-ness
>to objects declared there.

I don't understand what you mean by "creates an implicit block".

>>Weaker overload resolution rules (than Ada).  Eg no top-down resolution,
>>except in limited cases.  Function result types not considered for
>>resolution.
>
>I like the Ada choice.

Me too.

>...Why do so many language designers think less of
>function return types?

Perhaps because it's harder to implement.  In C++, you can do overload
resolution in a single bottom up pass.  Start at the leaves of the
expression, and propagate the type of each expression up.  In Ada, you
need to propagate *sets* of types (or sets of interpretations, or some
such) up the tree, and then you have to do a second (top down) pass to
propagate the information back to the leaves.

>And another thing I want: real constructors.  I'd like to be able to have
>indefinite limited types that can be initialized in the declarative
>region:

Why don't access discriminants do what you want?  Just because there's
no access constant?

- Bob





* Re: The Red Language
  1997-09-12  0:00 ` Robert A Duff
  1997-09-12  0:00   ` Michael & Amy Hartsough
  1997-09-13  0:00   ` Matthew Heaney
@ 1997-09-16  0:00   ` Brian Rogoff
  2 siblings, 0 replies; 46+ messages in thread
From: Brian Rogoff @ 1997-09-16  0:00 UTC



On Fri, 12 Sep 1997, Robert A Duff wrote:
> No underscores in numeric literals.  I really like Ada's ability to say
> 1_000_000, rather than 1000000, and I don't understand why many people
> on't use it.  Red doesn't allow it.

Yes, it's little details like this that I really like about Ada.

> Parameter passing semantics is nailed down.  (E.g. the programmer
> chooses whether pass-by-ref-read-only vs. pass-by-copy-read-only is
> used.)

This sounds interesting.

> Weaker overload resolution rules (than Ada).  Eg no top-down resolution,
> except in limited cases.  Function result types not considered for
> resolution.  But enum lits can be overloaded (and there's no pretense
> that they're functions).

> More user-definable stuff.  Like literal notation, array indexing and
> slicing notation, dot-selection notation, etc.  One can redefine the
> meaning of case statements and for loops, for a given type.

This also sounds very nice. Frequently Ada proponents argue that such
capabilities hinder readability, but one could argue that overloading and 
use clauses similarly hinder readability. I prefer to think about
readability in the context of a good programmer, in whose hands these 
features would lead to more readable programs (IMO of course). In
particular, I'd like to have user definable infix operators for such things 
as convolution, morphological operations on images (erosion, dilation,
etc.) and also the ability to redefine array indexing to create associative 
arrays and such.
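
(Today's Ada already gives you the infix part, though not new operator
symbols or redefinable indexing syntax -- a minimal sketch with
invented types and names:

   package Morphology is
      type Image is array (1 .. 3, 1 .. 3) of Boolean;
      function "+" (Left, Right : Image) return Image;
   end Morphology;

   package body Morphology is
      --  Pointwise union as a stand-in; a real dilation would scan a
      --  structuring-element neighborhood around each point.
      function "+" (Left, Right : Image) return Image is
         Result : Image;
      begin
         for I in Image'Range (1) loop
            for J in Image'Range (2) loop
               Result (I, J) := Left (I, J) or Right (I, J);
            end loop;
         end loop;
         return Result;
      end "+";
   end Morphology;

What's missing is the rest of the wish list: brand-new operator symbols,
and a user-defined meaning for the A (I) indexing notation.)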

> No named-notation parameters.

This is a feature I miss very much when I'm not using Ada, as I use named 
parameters quite a bit and feel it contributes greatly to readability.

> Generics: Capsules and subprograms can be generic, but so can types.
> Generic instantiation is implicit.  (See recent thread arguing about the
> readability of such a feature.  The examples in the Red manual look
> pretty readable to me, in this regard.)

I was beginning to think I was the only person who liked Ada and thought
this way :-). I certainly think a little bit of optional implicit
instantiation reduces clutter and leads to more readable code. I reject the 
argument that it can lead to an unreadable mess in the same way I'd reject 
a similar argument about use clauses or overloading.

-- Brian







* Re: The Red Language
  1997-09-14  0:00     ` Robert A Duff
@ 1997-09-16  0:00       ` Brian Rogoff
  1997-09-18  0:00         ` Robert Dewar
                           ` (2 more replies)
  0 siblings, 3 replies; 46+ messages in thread
From: Brian Rogoff @ 1997-09-16  0:00 UTC



On Sun, 14 Sep 1997, Robert A Duff wrote:
> In article <mheaney-ya023680001309970036280001@news.ni.net>,
> Matthew Heaney <mheaney@ni.net> wrote:
> Me too.
> 
> >...Why do so many language designers think less of
> >function return types?
> 
> Perhaps because it's harder to implement.  In C++, you can do overload
> resolution in a single bottom up pass.  Start at the leaves of the
> expression, and propagate the type of each expression up.  In Ada, you
> need to propagate *sets* of types (or sets of interpretations, or some
> such) up the tree, and then you have to do a second (top down) pass to
> propagate the information back to the leaves.

Is this second pass strictly necessary? I seem to remember reading that it 
is possible to do overload resolution in one pass in Ada.

-- Brian







* Re: The Red Language
  1997-09-18  0:00         ` Robert Dewar
@ 1997-09-18  0:00           ` Brian Rogoff
  0 siblings, 0 replies; 46+ messages in thread
From: Brian Rogoff @ 1997-09-18  0:00 UTC



On 18 Sep 1997, Robert Dewar wrote:
> Brian said
> 
> <<Is this second pass strictly necessary? I seem to remember reading that it
> is possible to do overload resolution in one pass in Ada.
> 
> -- Brian>>
> 
> You remember wrong, yes, of course you need to go up the tree once and
> then down, this is fundamental and obvious if you think about it.

Sheesh, after I had forgiven you, too :-). See my reply to Bob Duff.

-- Brian







* Overload Resolution in Ada (Re: The Red Language)
  1997-09-18  0:00         ` Robert A Duff
@ 1997-09-18  0:00           ` Brian Rogoff
  1997-09-19  0:00             ` Robert A Duff
  1997-09-19  0:00             ` Robert Dewar
  1997-09-19  0:00           ` The Red Language Robert Dewar
  1 sibling, 2 replies; 46+ messages in thread
From: Brian Rogoff @ 1997-09-18  0:00 UTC



My apologies for the off topic post...

On Thu, 18 Sep 1997, Robert A Duff wrote:
> In article <Pine.SGI.3.95.970916190053.19184D-100000@shellx.best.com>,
> Brian Rogoff  <bpr@shellx.best.com> wrote:
> >Is this second pass strictly necessary? I seem to remember reading that it 
> >is possible to do overload resolution in one pass in Ada.
> 
> Well, I suppose you can always do a two-pass algorithm in one pass, by
> having a sufficiently complicated back-patching scheme or some such.  I
> read a paper a long time ago about one-pass overload resolution in Ada,
> and my impression was that the one-pass algorithm was basically
> calculating (during the first pass) all possible results that *might*
> occur on the second pass, and then throwing all but one of them away,
> thus avoiding the need for the second pass.

OK, I cracked open my compiler reference (Dragon book, if you care) to
refresh my memory, and in the bibliographic notes to chapter 6 we find, 
after a mention of the straightforward two pass algorithms 

	... Baker (1982) avoids an explicit backward pass by carrying along 
	a DAG of possible types.  

where the paper cited is 

	Baker, T.P. (1982) "A one-pass algorithm for overload resolution 
	in Ada"  TOPLAS 4:4 601-614

I haven't read this paper, but from that one line description it appears 
that you are correct as to the approach. I'll try to find a copy and 
confirm this.

> This seemed silly at the
> time -- it seemed like the so-called one-pass algorithm would be more
> complicated to program, and less efficient, than the two-pass algorithm.

Out of curiosity, what is the worst case running time(s) of the two pass 
algorithm(s)?
 
> <... nice example arguing for two-passedness deleted ...> 
> 
> The denotation of F is determined in part by the possible denotations of
> G, and vice versa.  This seems fundamentally two-pass to me -- I don't
> like to call it "one pass" if the one pass is doing all the work of all
> possible second passes.

I agree that the two pass algorithm is obvious, and that (what I am
guessing Baker does) is cheating :-), but I did write "strictly necessary". 

-- Brian







* Re: The Red Language
  1997-09-16  0:00       ` Brian Rogoff
  1997-09-18  0:00         ` Robert Dewar
@ 1997-09-18  0:00         ` Robert A Duff
  1997-09-18  0:00           ` Overload Resolution in Ada (Re: The Red Language) Brian Rogoff
  1997-09-19  0:00           ` The Red Language Robert Dewar
  1997-09-18  0:00         ` Robert Dewar
  2 siblings, 2 replies; 46+ messages in thread
From: Robert A Duff @ 1997-09-18  0:00 UTC



In article <Pine.SGI.3.95.970916190053.19184D-100000@shellx.best.com>,
Brian Rogoff  <bpr@shellx.best.com> wrote:
>Is this second pass strictly necessary? I seem to remember reading that it 
>is possible to do overload resolution in one pass in Ada.

Well, I suppose you can always do a two-pass algorithm in one pass, by
having a sufficiently complicated back-patching scheme or some such.  I
read a paper a long time ago about one-pass overload resolution in Ada,
and my impression was that the one-pass algorithm was basically
calculating (during the first pass) all possible results that *might*
occur on the second pass, and then throwing all but one of them away,
thus avoiding the need for the second pass.  This seemed silly at the
time -- it seemed like the so-called one-pass algorithm would be more
complicated to program, and less efficient, than the two-pass algorithm.

Suppose the following declarations are visible:

   type Int1 is range 1..10;
   type Int2 is range 1..10;
   type Int3 is range 1..10;

   procedure P(X: Int1; Y: Int1);
   procedure P(X: Int2; Y: Int2); -- (*)
   procedure P(X: Int3; Y: Int3);

   function F(X: Int1) return Int1;
   function F(X: Int2) return Int2; -- (*)

   function G(X: Int2) return Int2; -- (*)
   function G(X: Int3) return Int3;

And suppose we're trying to resolve this statement:

   P(F(1), G(2));

The correct answer is that P, F, and G denote the declarations that are
marked with "-- (*)" above.  It's easy to see that by trying all
combinations of denotations.

The denotation of F is determined in part by the possible denotations of
G, and vice versa.  This seems fundamentally two-pass to me -- I don't
like to call it "one pass" if the one pass is doing all the work of all
possible second passes.

To make the example more interesting, change it to:

   P(-(-(-(-(-(F(1)))))), -(-(-(-(-(G(2)))))));

where P, F, and G resolve as in the previous example, but the necessary
information is buried deep within each subtree.  (All of the occurrences
of "-" resolve to the predefined unary minus of type Int2.)  We can't
know which F we're talking about until we've propagated information all
the way up from the literal 2, and we can't know which G we're talking
about until we've propagated information all the way up from the literal
1.

In C++ and Red, the above examples would be considered ambiguous.  (I
mean, of course, after converting the syntax as appropriate.)  They're
perfectly legal in Ada (if I haven't made any silly mistakes!).
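
Here's a toy sketch of how the two passes narrow the first example down
(my own illustration, not code from any real compiler): bottom-up, each
call contributes its set of possible result types; the context -- P
wants both operands of one common type -- intersects the sets, and the
lone survivor is pushed back down to pick F, G, and the literals' types.

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Two_Pass_Demo is

      type T is (Int1, Int2, Int3);
      type T_Set is array (T) of Boolean;

      --  Bottom-up pass: a literal could be any of the three types,
      --  so each call's candidates are just the declared result types.
      F_Results : constant T_Set := (Int1 | Int2 => True, Int3 => False);
      G_Results : constant T_Set := (Int2 | Int3 => True, Int1 => False);

   begin
      for Candidate in T loop
         if F_Results (Candidate) and G_Results (Candidate) then
            --  Top-down pass: the single surviving type is propagated
            --  back down the tree.  This prints only INT2.
            Put_Line ("resolved to " & T'Image (Candidate));
         end if;
      end loop;
   end Two_Pass_Demo;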

- Bob





* Re: The Red Language
  1997-09-16  0:00       ` Brian Rogoff
@ 1997-09-18  0:00         ` Robert Dewar
  1997-09-18  0:00           ` Robert A Duff
  1997-09-18  0:00         ` Robert A Duff
  1997-09-18  0:00         ` Robert Dewar
  2 siblings, 1 reply; 46+ messages in thread
From: Robert Dewar @ 1997-09-18  0:00 UTC



Matthew said

<<> Perhaps because it's harder to implement.  In C++, you can do overload
> resolution in a single bottom up pass.  Start at the leaves of the
> expression, and propagate the type of each expression up.  In Ada, you
> need to propagate *sets* of types (or sets of interpretations, or some
> such) up the tree, and then you have to do a second (top down) pass to
> propagate the information back to the leaves.>>


I think only a non-implementor would say this. The two passes are not
a significant complexity issue. The difficulties in overloading resolution
in Ada (or C++ for that matter) have to do with special situations and
special rules that must be taken into account. The basic 2-pass resolution
algorithm is trivial.






* Re: The Red Language
  1997-09-16  0:00       ` Brian Rogoff
  1997-09-18  0:00         ` Robert Dewar
  1997-09-18  0:00         ` Robert A Duff
@ 1997-09-18  0:00         ` Robert Dewar
  1997-09-18  0:00           ` Brian Rogoff
  2 siblings, 1 reply; 46+ messages in thread
From: Robert Dewar @ 1997-09-18  0:00 UTC



Brian said

<<Is this second pass strictly necessary? I seem to remember reading that it
is possible to do overload resolution in one pass in Ada.

-- Brian>>

You remember wrong, yes, of course you need to go up the tree once and
then down, this is fundamental and obvious if you think about it.







* Re: The Red Language
  1997-09-18  0:00         ` Robert Dewar
@ 1997-09-18  0:00           ` Robert A Duff
  1997-09-20  0:00             ` Robert Dewar
  0 siblings, 1 reply; 46+ messages in thread
From: Robert A Duff @ 1997-09-18  0:00 UTC



In article <dewar.874578577@merv>, Robert Dewar <dewar@merv.cs.nyu.edu> wrote:
>Matthew said

No, I said it.  (That's the third time this week, Robert.  ;-))

><<> Perhaps because it's harder to implement.  In C++, you can do overload
>> resolution in a single bottom up pass.  Start at the leaves of the
>> expression, and propagate the type of each expression up.  In Ada, you
>> need to propagate *sets* of types (or sets of interpretations, or some
>> such) up the tree, and then you have to do a second (top down) pass to
>> propagate the information back to the leaves.>>
>
>
>I think only a non-implementor would say this.

Not true.  I'm an implementor, and I said it.  ;-)

>... The two passes are not
>a significant complexity issue. The difficulties in overloading resolution
>in Ada (or C++ for that matter) have to do with special situations and
>special rules that must be taken into account. The basic 2-pass resolution
>algorithm is trivial.

It's true that all kinds of special situations are a pain, but I still
claim that the 2-pass algorithm required by Ada is more complicated than
the 1-pass algorithm of C++ and Red.  The 2-pass algorithm requires
*sets* of types to be passed up the tree, whereas in the 1-pass
algorithm you can pass a single type up the tree.

Yeah, I agree that's not such a big deal, and I certainly think the Ada
way is Good.

So, Robert, why do *you* think the C++ and Red designers decided that
function results should not affect overload resolution?  My claim was
that it's because it's harder to implement.  Maybe that implementation
concern is not such a big deal, but I can't think of any *other* reason,
so what do *you* think?  Perhaps "It's not (much) harder to implement,
but they *thought* it was at the time"?

- Bob





* Re: Overload Resolution in Ada (Re: The Red Language)
  1997-09-18  0:00           ` Overload Resolution in Ada (Re: The Red Language) Brian Rogoff
@ 1997-09-19  0:00             ` Robert A Duff
  1997-09-19  0:00               ` Brian Rogoff
  1997-09-19  0:00             ` Robert Dewar
  1 sibling, 1 reply; 46+ messages in thread
From: Robert A Duff @ 1997-09-19  0:00 UTC



In article <Pine.SGI.3.95.970918180906.18953A-100000@shellx.best.com>,
Brian Rogoff  <bpr@shellx.best.com> wrote:
>My apologies for the off topic post...

Heh?  Surely you jest!

>Out of curiosity, what is the worst case running time(s) of the two pass 
>algorithm(s)?

Umm, I don't know.  I guess you can make it arbitrarily slow.  Are you
asking for O(f(N)), where N is the number of nodes in the expression
tree (in the "complete context", which is a single statement or
declaration or whatever).  Well, I don't think you can answer that
usefully, because it's also affected by the number of currently visible
declarations, and by the symbol table organization, and by the details
of the identifier lookup mechanism, and by the representation chosen for
"sets of possible interpretations".  E.g.:

    procedure P(X: T_1);
    procedure P(X: T_2);
    ...
    procedure P(X: T_10_000);

    X: T_1;

    P(X);

Resolving the procedure call is probably going to have to look at 10,000
declarations of P, and we can make it arbitrarily worse without changing
the size of the procedure call.  On the other hand, the compiler could
make this particular sort of case much faster using various tricks.

I suppose you could say the 2-pass algorithm does O(N) node visits in
total (each node at most twice), but that's a useless measure, since the
"look at node" operation is an enormously complicated algorithm.  Symbol
table lookups in compilers often have pretty good average time, but
horrible worst-case time.  A hash table is linear in the worst case, for
example, making a compiler O(M*N) overall (even without overloading),
where M is the number of decls, and N is the number of references.  A
compiler isn't a hard real-time program, so people tend to worry about
average or typical case times more than worst case times.

Well, maybe it's not totally useless.  I guess I would worry if the
compiler had to look at each node O(N**2) times.  Hmm.  N isn't usually
very big in Ada, since it's not an expression language, but it can be --
I've seen things like "X := (... 10_000 component positional aggregate ...);"
in machine-generated code.  Note that Ada has a special rule for
aggregates -- the compiler has to figure out the type of the aggregate
*before* looking inside it.
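
A tiny made-up example of that rule, if I have it right:

    type Ints   is array (1 .. 2) of Integer;
    type Floats is array (1 .. 2) of Float;

    procedure P (X : Ints);
    procedure P (X : Floats);

    P ((1, 2));  -- ambiguous: the aggregate's type must come from
                 -- context alone, even though only Ints could accept
                 -- integer literals as components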

Anyway, in an *optimizing* compiler, overload resolution is probably not
the bottleneck.

- Bob




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-18  0:00         ` Robert A Duff
  1997-09-18  0:00           ` Overload Resolution in Ada (Re: The Red Language) Brian Rogoff
@ 1997-09-19  0:00           ` Robert Dewar
  1997-09-19  0:00             ` Robert A Duff
  1997-09-19  0:00             ` Brian Rogoff
  1 sibling, 2 replies; 46+ messages in thread
From: Robert Dewar @ 1997-09-19  0:00 UTC (permalink / raw)




<<Well, I suppose you can always do a two-pass algorithm in one pass, by
having a sufficiently complicated back-patching scheme or some such.  I
read a paper a long time ago about one-pass overload resolution in Ada,
and my impression was that the one-pass algorithm was basically
calculating (during the first pass) all possible results that *might*
occur on the second pass, and then throwing all but one of them away,
thus avoiding the need for the second pass.  This seemed silly at the
time -- it seemed like the so-called one-pass algorithm would be more
complicated to program, and less efficient, than the two-pass algorithm.>>


Whether you call things one pass or two pass is to some extent just an
exercise in semantics. But when I say that two passes is fundamental,
what I mean is that before assigning the proper type to a given node,
it is necessary to visit nodes both above you and below you in the tree.
That means there is no canonical traversal scheme which allows you to
visit nodes just once. 

OK, of course you can visit nodes just once and then build a list of
results which are accessed separately, but by one pass scheme I would
mean a scheme that operates as follows.

There is a canonical traversal of the operator tree which visits nodes
one by one and assigns types to each node at the time they are visited.

Any other definition of one pass is unuseful. For example, the more flexible
definition would lead one to consider all compilers one pass because they
only look at the source once, and from then on they are dealing with
auxiliary data structures -- not a useful definition.

Note that in Algol-68, which uses operand type overloading only, there is
indeed a one pass algorithm in the sense I define it above.
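
In pseudocode (made-up names, not any particular compiler), such a
canonical traversal looks like this:

    function Resolve (N : Node) return Type_Id is
       Arg_Types : Type_List (1 .. Num_Children (N));
    begin
       for I in Arg_Types'Range loop
          Arg_Types (I) := Resolve (Child (N, I));  -- bottom up only
       end loop;
       --  The operand types alone determine the declaration; the type
       --  of N is assigned on this single visit and never revised.
       return Result_Type (Unique_Match (N, Arg_Types));
    end Resolve;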





^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-19  0:00           ` The Red Language Robert Dewar
@ 1997-09-19  0:00             ` Robert A Duff
  1997-09-21  0:00               ` Robert Dewar
  1997-09-30  0:00               ` Charles Lindsey
  1997-09-19  0:00             ` Brian Rogoff
  1 sibling, 2 replies; 46+ messages in thread
From: Robert A Duff @ 1997-09-19  0:00 UTC (permalink / raw)



In article <dewar.874664808@merv>, Robert Dewar <dewar@merv.cs.nyu.edu> wrote:
>...but by one pass scheme I would
>mean a scheme that operates as follows.
>
>There is a canonical traversal of the operator tree which visits nodes
>one by one and assigns types to each node at the time they are visited.

Well said.  That's what I was *trying* to say.

>Any other definition of one pass is unuseful. For example, the more flexible
>definition would lead one to consider all compilers one pass because they
>only look at the source once, and from then on they are dealing with
>auxiliary data structures -- not a useful definition.

Well, I suppose in the bad old days, a "pass" in a compiler had
something to do with reading a representation of the source program from
the disk, and writing back a different representation.  I agree this is
not a useful way to look at it.  (I recall using a Pascal compiler that
required me to remove the "pass 1" floppy, and insert the "pass 2"
floppy, for each compile.)

>Note that in Algol-68, which uses operand type overloading only, there is
>indeed a one pass algorithm in the sense I define it above.

Could you briefly explain the Algol 68 rule?  Better yet, tell me where
I can get my hands on the manual!  I've read stuff *about* Algol 68, but
I want to read the actual language definition (which, I understand, is
rather tough going).

I believe C++ overloading is also one pass.  As is Red.  Strangely, Red
allows overloading of enumeration literals, but not function results.
So actually, there's sort of a mini second pass needed in the case of
literals, but I'm not sure I'd call that a "pass", since it just goes
down one level in the tree.

One-pass overload resolution seems really horrible to me.  It requires
kludgery, in C++, where you have to write the type of a numeric literal
as part of the literal (as in 0L, for the literal zero of type long).
Imagine how that would work in a language with user-defined numeric
types.  Or user-defined string types, or just two built-in string types,
as in Ada 95.
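
For instance (made-up types):

    type Apples  is range 0 .. 1_000;
    type Oranges is range 0 .. 1_000;

    A : Apples  := 0;  -- the same literal token...
    O : Oranges := 0;  -- ...resolved to two different types purely by
                       -- context, with no suffix on the literal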

- Bob




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Overload Resolution in Ada (Re: The Red Language)
  1997-09-19  0:00             ` Robert A Duff
@ 1997-09-19  0:00               ` Brian Rogoff
  1997-09-20  0:00                 ` Robert Dewar
  0 siblings, 1 reply; 46+ messages in thread
From: Brian Rogoff @ 1997-09-19  0:00 UTC (permalink / raw)



On Fri, 19 Sep 1997, Robert A Duff wrote:

> In article <Pine.SGI.3.95.970918180906.18953A-100000@shellx.best.com>,
> Brian Rogoff  <bpr@shellx.best.com> wrote:
> >My apologies for the off topic post...
> 
> Heh?  Surely you jest!

I don't jest, and my name isn't Shirley :-). 
Yeah, I was kidding. I read news only once a day now, and snide little 
remarks like that relieve my annoyance at the Eternal Interlingual Flamewar 
whose skirmishes are now making a constant din on cla. I'll use a ":-)" 
next time.

> >Out of curiosity, what is the worst case running time(s) of the two pass 
> >algorithm(s)?

I suppose I should have elaborated further, though you took a very nice
stab at an answer. I'm not suggesting that overload resolution is in any 
way a limiting factor on the speed of an Ada compiler; I'm just slowly 
trying to understand how automatic generic instantiation could be made to
work, and there is a great deal of interaction between overload resolution
and type inference (as I'm sure you know, ML type inference is exponential
in the worst case, but not too bad in practice :-).

> Umm, I don't know.  I guess you can make it arbitrarily slow.  Are you
> asking for O(f(N)), where N is the number of nodes in the expression
> tree (in the "complete context", which is a single statement or
> declaration or whatever)?

Yes, although you can certainly introduce other variables in the big O
estimate for the other factors which matter. I'm most interested in 
average case behavior. We can factor out symbol table complexity and 
similar problem areas (those that can be tackled by complexity-reducing
"tricks") to get a first approximation of the complexity.

> Well, maybe it's not totally useless.  I guess I would worry if the
> compiler had to look at each node O(N**2) times.  Hmm.  N isn't usually
> very big in Ada, since it's not an expression language, but it can be --
> I've seen things like "X := (... 10_000 component positional aggregate ...);"
> in machine-generated code.  Note that Ada has a special rule for
> aggregates -- the compiler has to figure out the type of the aggregate
> *before* looking inside it.

Good point. It's hard to anticipate what will be done in machine-generated
code -- well, actually it's easy; I guess what I mean is that I'd never expect
to see a 10_000-component positional aggregate in *my* code :-).

> Anyway, in an *optimizing* compiler, overload resolution is probably not
> the bottleneck.

Sure. As I said, this is really an academic sort of question. We can now
return to the "God only codes in Eiffel" threads. :-)

-- Brian
 





^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-19  0:00           ` The Red Language Robert Dewar
  1997-09-19  0:00             ` Robert A Duff
@ 1997-09-19  0:00             ` Brian Rogoff
  1 sibling, 0 replies; 46+ messages in thread
From: Brian Rogoff @ 1997-09-19  0:00 UTC (permalink / raw)



On 19 Sep 1997, Robert Dewar wrote:
> <... on one-pass vs two passes for overload resolution in Ada ...> 
> 
> Whether you call things one pass or two pass is to some extent just an
> exercise in semantics. But when I say that two passes is fundamental,
> what I mean is that before assigning the proper type to a given node,
> it is necessary to visit nodes both above you and below you in the tree.
> That means there is no canonical traversal scheme which allows you to
> visit nodes just once. 
> 
> OK, of course you can visit nodes just once and then build a list of
> results which are accessed separately, but by one pass scheme I would
> mean a scheme that operates as follows.
> 
> There is a canonical traversal of the operator tree which visits nodes
> one by one and assigns types to each node at the time they are visited.
> 
> Any other definition of one pass is unuseful. For example, the more flexible
> definition would lead one to consider all compilers one pass because they
> only look at the source once, and from then on they are dealing with
> auxiliary data structures -- not a useful definition.

I read that last paragraph and thought "Duhhh. He's right!". I sit
corrected. Of course I can blame the Dragon book for the memory, but I 
should have thought this definition of "one pass" suspect.

-- Brian








^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Overload Resolution in Ada (Re: The Red Language)
  1997-09-18  0:00           ` Overload Resolution in Ada (Re: The Red Language) Brian Rogoff
  1997-09-19  0:00             ` Robert A Duff
@ 1997-09-19  0:00             ` Robert Dewar
  1 sibling, 0 replies; 46+ messages in thread
From: Robert Dewar @ 1997-09-19  0:00 UTC (permalink / raw)



Brian asks

<<Out of curiosity, what is the worst case running time(s) of the two pass
algorithm(s)?>>

I assume you are asking for the asymptotic behavior, and of course both
the two pass and the so-called "one pass" algorithms are linear if you are
just counting nodes visited, and potentially quadratic if enough choices of
types are possible, but in practice you will see close to linear behavior.





^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-18  0:00           ` Robert A Duff
@ 1997-09-20  0:00             ` Robert Dewar
  1997-09-22  0:00               ` Robert A Duff
  0 siblings, 1 reply; 46+ messages in thread
From: Robert Dewar @ 1997-09-20  0:00 UTC (permalink / raw)



Bob Duff said (I'm really really sure it was him :-)

<<It's true that all kinds of special situations are a pain, but I still
claim that the 2-pass algorithm required by Ada is more complicated than
the 1-pass algorithm of C++ and Red.  The 2-pass algorithm requires
*sets* of types to be passed up the tree, whereas in the 1-pass
algorithm you can pass a single type up the tree.

Yeah, I agree that's not such a big deal, and I certainly think the Ada
way is Good.>>


Right, it's not such a big deal, and Bob Duff's suggestion that the
concern for difficulty of compilation is what might motivate the choice
between the bottom-up overloading (first introduced in Algol-68) and the
both-ways overloading (first introduced in Ada 83) does not match the
history.

As almost always these days, the simplicity/complexity decisions are much
more focussed on users than on the implementor, especially in a case like
this where the implementation decisions are clear.

Certainly the designers of Algol-68 deliberately did NOT introduce the
both-way scheme, and this decision had nothing at all to do with concern
for ease of implementation (after all, in Algol-68 just telling whether
two variables are the same type requires a sufficiently complex algorithm
that more than one PhD thesis has been written on the topic!)

The concern is for simplicity from a user's point of view. We have had this
long discussion in the (rather tedious) Eiffel-vs-Ada thread about the
desirability of linear elaboration, and this issue is related. If you have
a really complex expression and you want to figure out what is being called
where, then clearly the one-pass scheme is easier for a human to figure out.

The question is whether this simplicity is gained at the expense of
expressive power.

There is no question that there are cases where the Ada scheme is
a great advantage:

  generic
    type Element_Type is private;
  package Set_Package is
    function Empty_Set return Element_Type;
    function Value (A : String) return Element_Type;
    ...

It is really useful to be able to instantiate Set_Package for various
different types, apply a use clause to the instantiations, and have the
compiler figure out which Empty_Set and Value functions you want.
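
For instance (Int_Set and Name_Set standing for whatever types you
instantiate with):

    package Int_Sets  is new Set_Package (Element_Type => Int_Set);
    package Name_Sets is new Set_Package (Element_Type => Name_Set);
    use Int_Sets, Name_Sets;

    S : Int_Set  := Empty_Set;        -- result type picks Int_Sets.Empty_Set
    N : Name_Set := Value ("a b c");  -- result type picks Name_Sets.Value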

The time that the two-pass scheme begins to cause trouble is in mixed
operations. For example, one might think it was useful to have all the
following in Unbounded_String:

(where UString is short for Unbounded_String)

function "&" (A : String;  B : String) return Unbounded_String;
function "&" (A : UString; B : String) return Unbounded_String;
function "&" (A : String;  B : UString) return Unbounded_String;
function "&" (A : UString; B : UString) return Unbounded_String;

But it is a mistake, because now

   Ustring := "abcc" & Stringvar & Ustring;

is annoyingly ambiguous. It is quite easy to trip up on this problem if
you are not careful in designing such sets of mixed operations.  (The
inner "abcc" & Stringvar can be read either as the predefined String "&"
or as the String,String version declared above, and either reading leads
to a legal interpretation of the whole expression.)






^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Overload Resolution in Ada (Re: The Red Language)
  1997-09-19  0:00               ` Brian Rogoff
@ 1997-09-20  0:00                 ` Robert Dewar
  0 siblings, 0 replies; 46+ messages in thread
From: Robert Dewar @ 1997-09-20  0:00 UTC (permalink / raw)



> Well, maybe it's not totally useless.  I guess I would worry if the
> compiler had to look at each node O(N**2) times.  Hmm.  N isn't usually
> very big in Ada, since it's not an expression language, but it can be --
> I've seen things like "X := (... 10_000 component positional aggregate ...);"
> in machine-generated code.  Note that Ada has a special rule for
> aggregates -- the compiler has to figure out the type of the aggregate
> *before* looking inside it.


Certainly you have to look at each node at most twice, so that much is
linear. However, you are manipulating sets of types, and unless you
play very special tricks, this can make things potentially quadratic
in theory, though in practice it will be rare to see any significant
non-linear behavior.





^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-19  0:00             ` Robert A Duff
@ 1997-09-21  0:00               ` Robert Dewar
  1997-09-21  0:00                 ` Algol 68 references (Was Re: The Red Language) Brian Rogoff
                                   ` (3 more replies)
  1997-09-30  0:00               ` Charles Lindsey
  1 sibling, 4 replies; 46+ messages in thread
From: Robert Dewar @ 1997-09-21  0:00 UTC (permalink / raw)



<<Well, I suppose in the bad old days, a "pass" in a compiler had
  something to do with reading a representation of the source program from
  the disk, and writing back a different representation.  I agree this is
  not a useful way to look at it.  (I recall using a Pascal compiler that
  required me to remove the "pass 1" floppy, and insert the "pass 2"
  floppy, for each compile.)>>

I think this is still the inspiration for whatever meaning a "pass" has
in compilers today, i.e. a full traversal of some representation of the
program.

The rub is "full", just how full must a traversal be to be considered
a pass. For example, in GNAT, there is a circuit at the end of the
parser to fill in declarations for labels. Is this a pass? I would
say no, but it is open to argument. As soon as you have random access
to the representation, the definition is no longer clear (and is,
incidentally, not particularly useful -- in the old days there was
far more connection between the notions of efficiency in compilation
time and number of passes than there is today).

<<Could you briefly explain the Algol 68 rule?  Better yet, tell me where
  I can get my hands on the manual!  I've read stuff *about* Algol 68, but
  I want to read the actual language definition (which, I understand, is
  rather tough going).>>

Algol-68 overloading is strictly based on operand types. There are two
sources for looking at A68. First there was an article in Computing
Surveys (one of its early issues), I *think* the title was something
like "Algol-68 without Tears", but I could be imagining.

A more comprehensive reference is Charles Lindsey's informal introduction
(Charles is a sometime reader of CLA, maybe he will notice this reference
and tell us the current availability of this book). Note that this is
informal only by comparison to the formal definition, it is in fact a
thorough and precise description of the entire language that is very
readable.

Finally, one more reference for just learning Algol-68 in a hurry is
Ian Currie's magnificent 70-page "yellow book" that came with Algol-68R.
This is a masterpiece of covering a lot of critical technical material,
at an easily readable level, in a remarkably short space. A corresponding
Ada 95 text would be most welcome.

Ian's book was the key for me in getting "into" Algol-68. I had given up
on the Revised Report (though later I joined the very select club of those
who really knew the RR very well, and appreciated its precision and style).





^ permalink raw reply	[flat|nested] 46+ messages in thread

* Algol 68 references (Was Re: The Red Language)
  1997-09-21  0:00               ` Robert Dewar
@ 1997-09-21  0:00                 ` Brian Rogoff
  1997-09-22  0:00                   ` Mark L. Fussell
  1997-09-22  0:00                 ` The Red Language Richard Kenner
                                   ` (2 subsequent siblings)
  3 siblings, 1 reply; 46+ messages in thread
From: Brian Rogoff @ 1997-09-21  0:00 UTC (permalink / raw)



On 21 Sep 1997, Robert Dewar wrote:
> <<Could you briefly explain the Algol 68 rule?  Better yet, tell me where
>   I can get my hands on the manual!  I've read stuff *about* Algol 68, but
>   I want to read the actual language definition (which, I understand, is
>   rather tough going).>>
> 
> Algol-68 overloading is strictly based on operand types. There are two
> sources for looking at A68. First there was an article in Computing
> Surveys (one of its early issues), I *think* the title was something
> like "Algol-68 without Tears", but I could be imagining.
begin
    comment
      Mixing up refs, but close. Charles Lindsey wrote "Algol 68 with fewer 
      tears" (Computer Journal, v15n2 1972) and A. Tanenbaum wrote "A 
      Tutorial on Algol 68" (Computing Surveys v8n2 1976). The reason I
      know is that I acquired reading knowledge of A68 from these two 
      sources. I was never brave enough to have a go at the ref manual, 
      which was printed in Acta Informatica v5 (1975). 

      Lindsey also wrote a (sad and beautiful) article on the history
      of Algol 68 for HOPL-II, I don't have that ref, as I read it in a
      bookstore.
    comment # end of my citations #
end # read Lindsey's ref to get the joke here :-) #
 
> Finally, one more reference for just learning Algol-68 in a hurry is
> Ian Currie's magnificent 70-page "yellow book" that came with Algol-68R.
> This is a masterpiece of covering a lot of critical technical material,
> at an easily readable level, in a remarkably short space. A corresponding
> Ada 95 text would be most welcome.

I've never read this (before my time :-) but I think the short (~20 page) 
description of SML by Mads Tofte (Tips on Standard ML?) seems like a
similar bit of "modern" literature, judging from your description.

Incidentally, SML has a very precise description too, "The Definition
of Standard ML" by Milner and others. ML is an interesting language, and I 
suspect the designers of Ada 95 were aware of it since generic formal
package parameters and signature packages make Ada a bit ML-like.

-- Brian






^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-21  0:00               ` Robert Dewar
                                   ` (2 preceding siblings ...)
  1997-09-22  0:00                 ` Richard A. O'Keefe
@ 1997-09-22  0:00                 ` Chris Morgan
  3 siblings, 0 replies; 46+ messages in thread
From: Chris Morgan @ 1997-09-22  0:00 UTC (permalink / raw)



dewar@merv.cs.nyu.edu (Robert Dewar) writes:

> Finally, one more reference for just learning Algol-68 in a hurry is
> Ian Currie's magnificent 70-page "yellow book" that came with Algol-68R.
> This is a masterpiece of covering a lot of critical technical material,
> at an easily readable level, in a remarkably short space. A corresponding
> Ada 95 text would be most welcome.

Could you describe just what it was that made Currie's work so
magnificent, so that any budding authors on the group have some pointers?
Failing that please just go ahead and write the Ada version, I'm
getting impatient :-) :-)

Chris
-- 
"O gummier hum, warder buffer-lore rum"




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-21  0:00               ` Robert Dewar
  1997-09-21  0:00                 ` Algol 68 references (Was Re: The Red Language) Brian Rogoff
@ 1997-09-22  0:00                 ` Richard Kenner
  1997-09-22  0:00                 ` Richard A. O'Keefe
  1997-09-22  0:00                 ` Chris Morgan
  3 siblings, 0 replies; 46+ messages in thread
From: Richard Kenner @ 1997-09-22  0:00 UTC (permalink / raw)



In article <dewar.874850594@merv> dewar@merv.cs.nyu.edu (Robert Dewar) writes:
>The rub is "full": just how full must a traversal be to be considered
>a pass? For example, in GNAT, there is a circuit at the end of the
>parser to fill in declarations for labels. Is this a pass? I would
>say no, but it is open to argument. As soon as you have random access
>to the representation, the definition is no longer clear (and is,
>incidentally, not particularly useful -- in the old days there was
>far more connection between the notions of efficiency in compilation
>time and number of passes than there is today).

In GCC, trying to define a "pass" is even harder.

Consider something like this, from combine.c:

  for (insn = f, i = 0; insn; insn = NEXT_INSN (insn))
    if (INSN_UID (insn) > i)
      i = INSN_UID (insn);

This loop is finding the highest value of INSN_UID for use in an array
allocation.  But this does scan all instructions in the subprogram
being compiled, so it might, by some definitions, be considered a
"pass".  But there are dozens of things like this all over the
compiler, to scan for one thing or another.




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-20  0:00             ` Robert Dewar
@ 1997-09-22  0:00               ` Robert A Duff
  0 siblings, 0 replies; 46+ messages in thread
From: Robert A Duff @ 1997-09-22  0:00 UTC (permalink / raw)



In article <dewar.874757740@merv>, Robert Dewar <dewar@merv.cs.nyu.edu> wrote:
>Bob Duff said (I'm really really sure it was him :-)

:-) :-) 

Now, how can I prove it?  Hmm.  How about, "We compared maple syrups at
the Vt ARG meeting."  Convinced?

>The concern is for simplicity from a user's point of view. We have had this
>long discussion in the (rather tedious) Eiffel-vs-Ada thread about the
>desirability of linear elaboration, and this issue is related. If you have
>a really complex expression and you want to figure out what is being called
>where, then clearly the one-pass scheme is easier for a human to figure out.

Ah, that makes sense.

I have often thought that the rules ought to allow bottom-up
information, but only left-to-right.  For example, the first argument of
a procedure can be used to resolve the second one, but not the other way
around.  It's always seemed strange to me that the right-hand side of an
assignment statement can help resolve the left-hand side.

So, "X := X + 1;" would be legal, but "X := 1 + X;" would be ambiguous.
(Of course, "X := 2*X;" would also be illegal, which would be somewhat
unfortunate.)

>The question is whether this simplicity is gained at the expense of
>expressive power.
>
>There is no question that there are cases where the Ada scheme is
>a great advantage:
>
>  generic
>    type Element_Type is private;
>  package Set_Package is
>    function Empty_Set return Element_Type;
>    function Value (A : String) return Element_Type;
>    ...
>
>It is really useful to be able to instantiate Set_Package for various
>different types, apply a use clause to the instantiations, and have the
>compiler figure out which Empty_Set and Value functions you want.

Yes, that's nice.

>The time that the two-pass scheme begins to cause trouble is in mixed
>operations. For example, one might think it was useful to have all the
>following in Unbounded_String:
>
>(where UString is short for Unbounded_String)
>
>function "&" (A : String;  B : String) return Unbounded_String;
>function "&" (A : UString; B : String) return Unbounded_String;
>function "&" (A : String;  B : UString) return Unbounded_String;
>function "&" (A : UString; B : UString) return Unbounded_String;
>
>But it is a mistake, because now
>
>   Ustring := "abcc" & Stringvar & Ustring;
>
>is annoyingly ambiguous. It is quite easy to trip up on this problem if
>you are not careful in designing such sets of mixed operations.

But this would be illegal under the Red/C++/Algol-68 rules, too.

- Bob




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-21  0:00               ` Robert Dewar
  1997-09-21  0:00                 ` Algol 68 references (Was Re: The Red Language) Brian Rogoff
  1997-09-22  0:00                 ` The Red Language Richard Kenner
@ 1997-09-22  0:00                 ` Richard A. O'Keefe
  1997-09-25  0:00                   ` Bruce Link
  1997-09-22  0:00                 ` Chris Morgan
  3 siblings, 1 reply; 46+ messages in thread
From: Richard A. O'Keefe @ 1997-09-22  0:00 UTC (permalink / raw)



dewar@merv.cs.nyu.edu (Robert Dewar) writes:

>Algol-68 overloading is strictly based on operand types. There are two
>sources for looking at A68. First there was an article in Computing
>Surveys (one of its early issues), I *think* the title was something
>like "Algol-68 without Tears", but I could be imagining.

The Computer Journal.
"Algol 68 with fewer tears."

Wasn't the RR published as an issue of Numerische Mathematik?
-- 
Unsolicited commercial E-mail to this account is prohibited; see section 76E
of the Commonwealth Crimes Act 1914 as amended by the Crimes Legislation
Amendment Act No 108 of 1989.  Maximum penalty:  10 years in gaol.
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: Algol 68 references (Was Re: The Red Language)
  1997-09-21  0:00                 ` Algol 68 references (Was Re: The Red Language) Brian Rogoff
@ 1997-09-22  0:00                   ` Mark L. Fussell
  0 siblings, 0 replies; 46+ messages in thread
From: Mark L. Fussell @ 1997-09-22  0:00 UTC (permalink / raw)
  To: Brian Rogoff


Brian Rogoff wrote:
...
>       Lindsey also wrote a (sad and beautiful) article on the history
>       of Algol 68 for HOPL-II, I don't have that ref, as I read it in a
>       bookstore.
...

"A History of ALGOL 68", C.H. Lindsey

Thomas J. Bergin & Richard G. Gibson, 
History of Programming Languages-II
Addison Wesley, Reading MA, 1996
ISBN 0-201-89502-1

--Mark
mark.fussell@chimu.com




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-22  0:00                 ` Richard A. O'Keefe
@ 1997-09-25  0:00                   ` Bruce Link
  0 siblings, 0 replies; 46+ messages in thread
From: Bruce Link @ 1997-09-25  0:00 UTC (permalink / raw)



Richard A. O'Keefe (ok@goanna.cs.rmit.edu.au) wrote:
: 
: Wasn't the RR published as an issue of Numerische Mathematik?
: -- 

My copy is Springer-Verlag, 1976.  I also found it very hard to read
and (like Robert) learned Algol 68 from Lindsey and van der Meulen's book.
I found it easier to follow after learning VW grammars from Cleaveland and
Uzgalis' book "Grammars for Programming Languages" (Elsevier, 1977).
-- 
----------------------------------------------------------------------------
Bruce Link                         |Team OS/2
bdl@mda.ca                         |Team Ada




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-19  0:00             ` Robert A Duff
  1997-09-21  0:00               ` Robert Dewar
@ 1997-09-30  0:00               ` Charles Lindsey
  1997-10-03  0:00                 ` Robert I. Eachus
  1 sibling, 1 reply; 46+ messages in thread
From: Charles Lindsey @ 1997-09-30  0:00 UTC (permalink / raw)



In <EGrv5y.C2D@world.std.com> bobduff@world.std.com (Robert A Duff) writes:

>Well, I suppose in the bad old days, a "pass" in a compiler had
>something to do with reading a representation of the source program from
>the disk, and writing back a different representation.  I agree this is
>not a useful way to look at it.  (I recall using a Pascal compiler that
>required me to remove the "pass 1" floppy, and insert the "pass 2"
>floppy, for each compile.)

In the REAL "bad old days" (before programmers took to eating quiche), the
passes were done to/from tape.

>>Note that in Algol-68, which uses operand type overloading only, there is
>>indeed a one pass algorithm in the sense I define it above.

>Could you briefly explain the Algol 68 rule?  Better yet, tell me where
>I can get my hands on the manual!  I've read stuff *about* Algol 68, but
>I want to read the actual language definition (which, I understand, is
>rather tough going).

You can deduce the operator solely from the type of its two operands.
There are some "related" rules which prevent you from declaring two
versions of an operator which could have confused that deduction (remember
that some coercions are allowed on the operands).

References:

Revised Report on the Algorithmic Language ALGOL 68.
	Acta Informatica Vol 5 (1975) pts 1-3
or	SIGPLAN Notices Vol 12 No 5 (1977)

Or send me your snail mail address and I will send you a microfiche (a bit
blurry :-( ).

See also my paper in the History of Programming Languages-II (ACM Press).

-- 
Charles H. Lindsey ---------At Home, doing my own thing-------------------------
Email:     chl@clw.cs.man.ac.uk   Web:   http://www.cs.man.ac.uk/~chl
Voice/Fax: +44 161 437 4506       Snail: 5 Clerewood Ave, CHEADLE, SK8 3JU, U.K.
PGP: 2C15F1A9      Fingerprint: 73 6D C2 51 93 A0 01 E7  65 E8 64 7E 14 A4 AB A5




^ permalink raw reply	[flat|nested] 46+ messages in thread

* Re: The Red Language
  1997-09-30  0:00               ` Charles Lindsey
@ 1997-10-03  0:00                 ` Robert I. Eachus
  0 siblings, 0 replies; 46+ messages in thread
From: Robert I. Eachus @ 1997-10-03  0:00 UTC (permalink / raw)




   Charles Lindsey (chl@clw.cs.man.ac.uk) writes:

   > In the REAL "bad old days" (before programmers took to eating quiche), the
   > passes were done to/from tape.

   Which type of tape?  I've used compilers that required multiple
source passes of paper tape, magnetic tape, and punched cards.  One
compiler required 11 passes!  The worst was a 4-pass compiler for the
IBM 650 that required reloading card decks; the likelihood of disaster
between passes was low with either type of tape, but not with cards.

   The problem wasn't dropping your deck on the floor, assuming you
numbered the deck and had a card sorter handy.  The problem was
that the old mechanical card readers were likely to eat a card or
three each pass.  The standard tactic was to dup your deck, and keep
that as a master, duplicating cards from it to replace those eaten.
--

					Robert I. Eachus

with Standard_Disclaimer;
use  Standard_Disclaimer;
function Message (Text: in Clever_Ideas) return Better_Ideas is...




^ permalink raw reply	[flat|nested] 46+ messages in thread

end of thread, other threads:[~1997-10-03  0:00 UTC | newest]

Thread overview: 46+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
     [not found] <340E2DC5.25D7@worldnet.att.net>
     [not found] ` <340ebdaf.230366903@news.mindspring.com>
     [not found]   ` <340ED5D8.2DEF6D3@ux4.sp.cs.cmu.edu>
1997-09-04  0:00     ` The Red Language Robert Munck
1997-09-07  0:00       ` Robert Dewar
1997-09-08  0:00         ` Richard Kenner
1997-09-12  0:00           ` David Wheeler
1997-09-12  0:00             ` Robert A Duff
     [not found]     ` <199709051335.PAA25952@basement.replay.com>
1997-09-05  0:00       ` Dean F. Sutherland
1997-09-08  0:00         ` Robert A Duff
1997-09-09  0:00           ` Arthur Evans Jr
     [not found]             ` <dewar.873953300@merv>
1997-09-11  0:00               ` Robert Dewar
1997-09-11  0:00                 ` Arthur Evans Jr
1997-09-12  0:00                   ` Robert A Duff
1997-09-12  0:00                   ` Robert Dewar
1997-09-11  0:00                 ` Dean F. Sutherland
1997-09-12  0:00                   ` Robert A Duff
1997-09-07  0:00 ` Robert Dewar
1997-09-08  0:00   ` Tucker Taft
1997-09-12  0:00 ` Robert A Duff
1997-09-12  0:00   ` Michael & Amy Hartsough
1997-09-13  0:00   ` Matthew Heaney
1997-09-14  0:00     ` Robert A Duff
1997-09-16  0:00       ` Brian Rogoff
1997-09-18  0:00         ` Robert Dewar
1997-09-18  0:00           ` Robert A Duff
1997-09-20  0:00             ` Robert Dewar
1997-09-22  0:00               ` Robert A Duff
1997-09-18  0:00         ` Robert A Duff
1997-09-18  0:00           ` Overload Resolution in Ada (Re: The Red Language) Brian Rogoff
1997-09-19  0:00             ` Robert A Duff
1997-09-19  0:00               ` Brian Rogoff
1997-09-20  0:00                 ` Robert Dewar
1997-09-19  0:00             ` Robert Dewar
1997-09-19  0:00           ` The Red Language Robert Dewar
1997-09-19  0:00             ` Robert A Duff
1997-09-21  0:00               ` Robert Dewar
1997-09-21  0:00                 ` Algol 68 references (Was Re: The Red Language) Brian Rogoff
1997-09-22  0:00                   ` Mark L. Fussell
1997-09-22  0:00                 ` The Red Language Richard Kenner
1997-09-22  0:00                 ` Richard A. O'Keefe
1997-09-25  0:00                   ` Bruce Link
1997-09-22  0:00                 ` Chris Morgan
1997-09-30  0:00               ` Charles Lindsey
1997-10-03  0:00                 ` Robert I. Eachus
1997-09-19  0:00             ` Brian Rogoff
1997-09-18  0:00         ` Robert Dewar
1997-09-18  0:00           ` Brian Rogoff
1997-09-16  0:00   ` Brian Rogoff

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox