comp.lang.ada
From: dewar@cs.nyu.edu (Robert Dewar)
Subject: Re: defaults on discriminants
Date: 20 Nov 1994 12:14:04 -0500
Message-ID: <3ao04s$jg2@gnat.cs.nyu.edu>
In-Reply-To: 3abe70$o5d@babyblue.cs.yale.edu

This is such a common question that it should be in the Ada FAQ.

When you give a default discriminant, one method of implementation (I actually
think it is the preferred method) is to allocate the maximum possible length.
Since your discriminant is of type Natural, this clearly won't work: the
maximum-size object would have to hold Natural'Last elements!
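The shape of the problem can be sketched roughly as follows (the type and
object names here are mine, not necessarily the original poster's):

```
type Var_String (Len : Natural := 0) is record
   Data : String (1 .. Len);
end record;

--  Because Len has a default, X is unconstrained and may later be
--  assigned any length.  A maximum-size implementation must therefore
--  reserve room for Natural'Last characters -- hopelessly large.
X : Var_String;
```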

GNAT may compile it, but it won't run it, and indeed I consider it a GNAT
bug (on the todo list) that no warning is issued at compile time for this
misuse.

Some compilers, notably Alsys and RR, have at least partially "solved"
this problem by introducing hidden pointers, but this to me is an undesirable
implementation choice.

First, it means there is hidden heap activity, which seems undesirable. In
a language where pointers are explicit, it is generally a good idea if
allocation is also explicit, and certainly for real-time work, hidden anything
is worrisome.

Second, it is not easy to do uniformly. Alsys ends up introducing arbitrary
restrictions on the composition of such types (try making an array of them),
and RR introduces non-contiguous representations, which are legal but
troublesome.

To "solve" the problem yourself, just declare a reasonable maximum length,
and use a subtype representing this length as the subtype of the
discriminant.
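Concretely, the workaround looks something like this (Max_Length and the
other names are illustrative; pick a bound that suits your application):

```
Max_Length : constant := 1_000;  --  a reasonable application-specific bound
subtype Length_Type is Natural range 0 .. Max_Length;

type Var_String (Len : Length_Type := 0) is record
   Data : String (1 .. Len);
end record;

--  Now the maximum-size strategy needs only Max_Length characters
--  per object, so stack allocation is perfectly feasible.
X : Var_String;
```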





Thread overview: 2+ messages
1994-11-15 22:54 defaults on discriminants Gene McCulley
1994-11-20 17:14 ` Robert Dewar [this message]