From: Anh Vo <anhvofrcaus@gmail.com>
Subject: Re: getting started with Spark; generic queue package
Date: Tue, 9 Jul 2019 09:18:55 -0700 (PDT)
Message-ID: <455ec3b5-2249-4404-914b-f15b65762086@googlegroups.com> (raw)
In-Reply-To: <64ac9bde-041a-4fcd-b293-8dc48da1e760@googlegroups.com>
On Monday, July 8, 2019 at 5:23:51 PM UTC-7, Stephen Leake wrote:
> On Monday, July 8, 2019 at 8:48:37 AM UTC-7, Anh Vo wrote:
> > On Friday, July 5, 2019 at 5:51:29 PM UTC-7, Stephen Leake wrote:
> > > I'm finally getting around to trying Spark, mostly because it finally has some support for access types. For now, I'm using GNAT Community 2018; I'll upgrade to 2019 soon.
> > >
> > > To start, I'm trying to add contracts to an existing bounded queue package:
> > >
> > > generic
> > >    type Item_Type is private;
> > > package SAL.Gen_Bounded_Definite_Queues
> > >   with Spark_Mode
> > > is
> > >    pragma Pure;
> > >
> > >    type Queue_Type (Size : Positive) is tagged private;
> > >    -- Size is maximum number of items in the queue.
> > >    -- Tagged to allow Object.Method syntax.
> > >
> > >    function Get_Overflow_Handling (Queue : in Queue_Type) return Overflow_Action_Type;
> > >    procedure Set_Overflow_Handling (Queue : in out Queue_Type; Handling : in Overflow_Action_Type);
> > >    -- See Add for meaning of Overflow_Handling. Default is Error.
> > >
> > >    procedure Clear (Queue : in out Queue_Type)
> > >      with Post => Queue.Count = 0;
> > >    -- Empty Queue of all items.
> > >
> > >    function Count (Queue : in Queue_Type) return Natural;
> > >    -- Returns count of items in the Queue
> > >
> > >    function Is_Empty (Queue : in Queue_Type) return Boolean
> > >      with Post => (if Is_Empty'Result then Queue.Count = 0);
> > >    -- Returns true if no items are in Queue.
> > >
> > >    function Is_Full (Queue : in Queue_Type) return Boolean
> > >      with Post => (if Is_Full'Result then Queue.Count = Queue.Size);
> > >    -- Returns true if Queue is full.
> > >
> > >    function Remove (Queue : in out Queue_Type) return Item_Type with
> > >      Pre  => Queue.Count > 0,
> > >      Post => Queue.Count = Queue.Count'Old - 1;
> > >    -- Remove head item from Queue, return it.
> > >    --
> > >    -- Raises Container_Empty if Is_Empty.
> > >
> > >    -- other stuff deleted
> > > private
> > >
> > >    type Item_Array_Type is array (Positive range <>) of Item_Type;
> > >
> > >    type Queue_Type (Size : Positive) is tagged record
> > >       Overflow_Handling : Overflow_Action_Type := Error;
> > >
> > >       Head  : Natural := 0;
> > >       Tail  : Natural := 0;
> > >       Count : Natural := 0;
> > >       Data  : Item_Array_Type (1 .. Size);
> > >       -- Add at Tail + 1, remove at Head. Count is current count;
> > >       -- easier to keep track of that than to compute Is_Empty for
> > >       -- each Add and Remove.
> > >    end record;
> > >
> > > end SAL.Gen_Bounded_Definite_Queues;
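> > > (An untested variant I am considering: strengthen the one-directional
> > > postconditions on Is_Empty and Is_Full into equivalences, so the provers
> > > can also conclude Queue.Count /= 0 when Is_Empty returns False:
> > >
> > >    function Is_Empty (Queue : in Queue_Type) return Boolean
> > >      with Post => Is_Empty'Result = (Queue.Count = 0);
> > >
> > >    function Is_Full (Queue : in Queue_Type) return Boolean
> > >      with Post => Is_Full'Result = (Queue.Count = Queue.Size);
> > > )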
> > >
> > >
> > > with SAL.Gen_Bounded_Definite_Queues;
> > > package Prove_Integer_Queues is new SAL.Gen_Bounded_Definite_Queues (Integer);
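> > > (A sketch I have not verified with GNAT Community 2018: a SPARK_Mode
> > > configuration pragma at the top of the instantiation's file should also
> > > request analysis of this instance:
> > >
> > > pragma SPARK_Mode (On);
> > > with SAL.Gen_Bounded_Definite_Queues;
> > > package Prove_Integer_Queues is new SAL.Gen_Bounded_Definite_Queues (Integer);
> > > )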
> > >
> > >
> > > Running gnatprove on this gives info messages like:
> > >
> > > prove_integer_queues.ads:2:01: info: in instantiation at sal-gen_bounded_definite_queues.ads:36
> > > prove_integer_queues.ads:2:01: info: implicit function contract not available for proof ("Count" might not return)
> > >
> > > Is there any way to turn those off? I don't see such an option in 'gnatprove --help', nor in the Spark 2014 user guide (though I have not read every page yet; I did go through the tutorial).
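> > > (One workaround I have seen suggested, untested here: annotate each such
> > > function as terminating, so gnatprove can use its implicit contract in
> > > proofs. For example:
> > >
> > >    function Count (Queue : in Queue_Type) return Natural;
> > >    pragma Annotate (GNATprove, Terminating, Count);
> > > )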
> > >
> > > Worse, in the gnatprove.out file, it says:
> > >
> > > Prove_Integer_Queues.Add at sal-gen_bounded_definite_queues.ads:66, instantiated at prove_integer_queues.ads:2 skipped
> > >
> > > and so on, for every subprogram.
> > >
> > > Is there a way to find out why it was skipped? And what do I need to do to fix this?
> > >
> > > The body file has 'with Spark_Mode' on the package.
> > >
> > > In general, is it useful to use Spark on generic packages? Section 4.3 in the Spark user guide says:
> > >
> > > Note that a generic package instance is considered to be declared at its
> > > instantiation point. For example, a generic package cannot be both
> > > marked SPARK_Mode and instantiated in a subprogram body.
> > >
> > > But I think that only applies if I run gnatprove on that instantiation; if I only run it on Prove_Integer_Queues, it should be ok. On the other hand, does that mean other instantiations of Gen_Bounded_Definite_Queues are "proved" in any sense?
> > >
> > > Most of my packages rely on Ada.Finalization, which is not currently supported in Spark. Is there any plan to add that?
> > >
> > > -- Stephe
> >
> > There is a dedicated mailing list for SPARK. The mailing address is spark2014-discuss@lists.adacore.com. You will get a quick reply from the experts in this list.
>
> ok, I'll post there.
>
> > By the way, I tried to download your SAL so I can check it out with GNAT Community 2019, but the server was not available (the download failed).
>
> I assume that's from http://stephe-leake.org/ada/sal.html; apparently I got halfway through my most recent release process and stopped. Fixed now.
It still fails as before. The download status is "Failed - No file".
Anh Vo
Thread overview: 7+ messages
2019-07-06 0:51 getting started with Spark; generic queue package Stephen Leake
2019-07-08 15:48 ` Anh Vo
2019-07-09 0:23 ` Stephen Leake
2019-07-09 16:18 ` Anh Vo [this message]
2019-07-10 16:53 ` Stephen Leake
2019-07-10 17:03 ` Stephen Leake
2019-07-13 21:35 ` Stephen Leake