comp.lang.ada
* getting started with Spark; generic queue package
@ 2019-07-06  0:51 Stephen Leake
  2019-07-08 15:48 ` Anh Vo
  2019-07-10 17:03 ` Stephen Leake
  0 siblings, 2 replies; 7+ messages in thread
From: Stephen Leake @ 2019-07-06  0:51 UTC (permalink / raw)


I'm finally getting around to trying Spark, mostly because it now has some support for access types. For now, I'm using GNAT Community 2018; I'll upgrade to 2019 soon.

To start, I'm trying to add contracts to an existing bounded queue package:

generic
   type Item_Type is private;
package SAL.Gen_Bounded_Definite_Queues
  with Spark_Mode
is
   pragma Pure;

   type Queue_Type (Size : Positive) is tagged private;
   --  Size is maximum number of items in the queue.
   --  Tagged to allow Object.Method syntax.

   function Get_Overflow_Handling (Queue : in Queue_Type) return Overflow_Action_Type;
   procedure Set_Overflow_Handling (Queue : in out Queue_Type; Handling : in Overflow_Action_Type);
   --  See Add for meaning of Overflow_Handling. Default is Error.

   procedure Clear (Queue : in out Queue_Type)
   with Post => Queue.Count = 0;
   --  Empty Queue of all items.

   function Count (Queue : in Queue_Type) return Natural;
   --  Returns count of items in the Queue

   function Is_Empty (Queue : in Queue_Type) return Boolean
   with Post => (if Is_Empty'Result then Queue.Count = 0);
   --  Returns true if no items are in Queue.

   function Is_Full (Queue : in Queue_Type) return Boolean
   with Post => (if Is_Full'Result then Queue.Count = Queue.Size);
   --  Returns true if Queue is full.

   function Remove (Queue : in out Queue_Type) return Item_Type with
     Pre => Queue.Count > 0,
     Post => Queue.Count = Queue.Count'Old - 1;
   --  Remove head item from Queue, return it.
   --
   --  Raises Container_Empty if Is_Empty.

   --  other stuff deleted
private

   type Item_Array_Type is array (Positive range <>) of Item_Type;

   type Queue_Type (Size : Positive) is tagged record
      Overflow_Handling : Overflow_Action_Type := Error;

      Head  : Natural := 0;
      Tail  : Natural := 0;
      Count : Natural := 0;
      Data  : Item_Array_Type (1 .. Size);
      --  Add at Tail + 1, remove at Head. Count is current count;
      --  easier to keep track of that than to compute Is_Empty for
      --  each Add and Remove.
   end record;

end SAL.Gen_Bounded_Definite_Queues;


with SAL.Gen_Bounded_Definite_Queues;
package Prove_Integer_Queues is new SAL.Gen_Bounded_Definite_Queues (Integer);


Running gnatprove on this gives info messages like:

prove_integer_queues.ads:2:01: info: in instantiation at sal-gen_bounded_definite_queues.ads:36
prove_integer_queues.ads:2:01: info: implicit function contract not available for proof ("Count" might not return)

Is there any way to turn those off? I don't see such an option in 'gnatprove -help', nor in the Spark 2014 user guide (not that I have read every page yet; I did go thru the tutorial).

Worse, in the gnatprove.out file, it says:

  Prove_Integer_Queues.Add at sal-gen_bounded_definite_queues.ads:66, instantiated at prove_integer_queues.ads:2 skipped

and so on, for every subprogram.

Is there a way to find out why it was skipped? And what do I need to do to fix this?

The body file has 'with Spark_Mode' on the package.

In general, is it useful to use Spark on generic packages? Section 4.3 in the Spark user guide says:

   Note that a generic package instance is considered to be declared at its   
   instantiation point.  For example, a generic package cannot be both 
   marked SPARK_Mode and instantiated in a subprogram body.

But I think that only applies if I run gnatprove on that instantiation; if I only run it on Prove_Integer_Queues, it should be ok. On the other hand, does that mean other instantiations of Gen_Bounded_Definite_Queues are "proved" in any sense?
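To make the quoted restriction concrete, here is a sketch of the two cases (unit names are illustrative, not from SAL): a library-level instantiation, which gnatprove can analyze, versus an instantiation inside a subprogram body, which the quoted note says is not allowed when the generic is marked SPARK_Mode:

--  Library-level instance: analyzable by gnatprove.
with SAL.Gen_Bounded_Definite_Queues;
package Lib_Level_Queues is new SAL.Gen_Bounded_Definite_Queues (Integer);

--  Instance inside a subprogram body: per the user-guide note, this
--  combination (generic marked SPARK_Mode, instantiated in a body)
--  is not permitted, since the instance is considered declared at
--  the instantiation point.
with SAL.Gen_Bounded_Definite_Queues;
procedure Local_Instance is
   package Local_Queues is new SAL.Gen_Bounded_Definite_Queues (Integer);
begin
   null;
end Local_Instance;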

Most of my packages rely on Ada.Finalization, which is not currently supported in Spark. Is there any plan to add that?

-- Stephe


* Re: getting started with Spark; generic queue package
  2019-07-06  0:51 getting started with Spark; generic queue package Stephen Leake
@ 2019-07-08 15:48 ` Anh Vo
  2019-07-09  0:23   ` Stephen Leake
  2019-07-10 17:03 ` Stephen Leake
  1 sibling, 1 reply; 7+ messages in thread
From: Anh Vo @ 2019-07-08 15:48 UTC (permalink / raw)


On Friday, July 5, 2019 at 5:51:29 PM UTC-7, Stephen Leake wrote:
> I'm finally getting around to trying Spark, mostly because it finally has some support for access types. For now, I'm using GNAT Community 2018; I'll upgrade to 2019 soon.
> [...]
> -- Stephe

There is a dedicated mailing list for SPARK: spark2014-discuss@lists.adacore.com. You will get a quick reply from the experts on that list.

By the way, I tried to download your SAL so I can check it out with GNAT Community 2019, but the server was not available. In the meantime, I will use the code you posted and fill in the missing pieces before proving it.

Anh Vo



* Re: getting started with Spark; generic queue package
  2019-07-08 15:48 ` Anh Vo
@ 2019-07-09  0:23   ` Stephen Leake
  2019-07-09 16:18     ` Anh Vo
  0 siblings, 1 reply; 7+ messages in thread
From: Stephen Leake @ 2019-07-09  0:23 UTC (permalink / raw)


On Monday, July 8, 2019 at 8:48:37 AM UTC-7, Anh Vo wrote:
> On Friday, July 5, 2019 at 5:51:29 PM UTC-7, Stephen Leake wrote:
> > [...]
> 
> There is a dedicated mailing list for SPARK. The mailing address is spark2014-discuss@lists.adacore.com. You will get a quick reply from the experts in this list.

ok, I'll post there.

> By the way, I tried to download your SAL. So, I can check it out with GNAT Community 2019. But, the server was not available/failed. 

I assume that's from http://stephe-leake.org/ada/sal.html; apparently I got halfway through my most recent release process and stopped. Fixed now.

--  Stephe



* Re: getting started with Spark; generic queue package
  2019-07-09  0:23   ` Stephen Leake
@ 2019-07-09 16:18     ` Anh Vo
  2019-07-10 16:53       ` Stephen Leake
  0 siblings, 1 reply; 7+ messages in thread
From: Anh Vo @ 2019-07-09 16:18 UTC (permalink / raw)


On Monday, July 8, 2019 at 5:23:51 PM UTC-7, Stephen Leake wrote:
> On Monday, July 8, 2019 at 8:48:37 AM UTC-7, Anh Vo wrote:
> > [...]
> 
> > By the way, I tried to download your SAL. So, I can check it out with GNAT Community 2019. But, the server was not available/failed. 
> 
> I assume that's from http://stephe-leake.org/ada/sal.html; apparently I got halfway thru my most recent release process and stopped. Fixed now.

It still failed, as before. The download status is "Failed - No file".

Anh Vo



* Re: getting started with Spark; generic queue package
  2019-07-09 16:18     ` Anh Vo
@ 2019-07-10 16:53       ` Stephen Leake
  0 siblings, 0 replies; 7+ messages in thread
From: Stephen Leake @ 2019-07-10 16:53 UTC (permalink / raw)


On Tuesday, July 9, 2019 at 9:18:56 AM UTC-7, Anh Vo wrote:
> On Monday, July 8, 2019 at 5:23:51 PM UTC-7, Stephen Leake wrote:
> > On Monday, July 8, 2019 at 8:48:37 AM UTC-7, Anh Vo wrote:
> > > [...]
> > 
> > > By the way, I tried to download your SAL. So, I can check it out with GNAT Community 2019. But, the server was not available/failed. 
> > 
> > I assume that's from http://stephe-leake.org/ada/sal.html; apparently I got halfway thru my most recent release process and stopped. Fixed now.
> 
> It failed still as before. The download status is "Failed - No file".
> 
> Anh Vo

Did you refresh the page? It should be downloading org.stephe_leake.sal-3.1.tar.bz2 (it was 3.0 before my fix). It works for me.


* Re: getting started with Spark; generic queue package
  2019-07-06  0:51 getting started with Spark; generic queue package Stephen Leake
  2019-07-08 15:48 ` Anh Vo
@ 2019-07-10 17:03 ` Stephen Leake
  2019-07-13 21:35   ` Stephen Leake
  1 sibling, 1 reply; 7+ messages in thread
From: Stephen Leake @ 2019-07-10 17:03 UTC (permalink / raw)


On Friday, July 5, 2019 at 5:51:29 PM UTC-7, Stephen Leake wrote:
> I'm finally getting around to trying Spark, mostly because it finally has some support for access types. For now, I'm using GNAT Community 2018; I'll upgrade to 2019 soon.
> [...]
> 
> Worse, in the gnatprove.out file, it says:
> 
>   Prove_Integer_Queues.Add at sal-gen_bounded_definite_queues.ads:66, instantiated at prove_integer_queues.ads:2 skipped
> 
> and so on, for every subprogram.
> 

Upgrading to GNAT Community 2019 improved this; it now skips only the subprograms that have no pre- or postconditions. Back to experimenting.


* Re: getting started with Spark; generic queue package
  2019-07-10 17:03 ` Stephen Leake
@ 2019-07-13 21:35   ` Stephen Leake
  0 siblings, 0 replies; 7+ messages in thread
From: Stephen Leake @ 2019-07-13 21:35 UTC (permalink / raw)


Just for completeness, here is the final version of the package, for which Spark proves all the contract conditions.

--  Abstract:
--
--  A generic queue, allowing definite non-limited item types.
--
--  Copyright (C) 2004, 2008, 2009, 2011, 2017, 2019 Stephen Leake.  All Rights Reserved.
--
--  This library is free software;  you can redistribute it and/or modify it
--  under terms of the  GNU General Public License  as published by the Free
--  Software  Foundation;  either version 3,  or (at your  option) any later
--  version. This library is distributed in the hope that it will be useful,
--  but WITHOUT ANY WARRANTY;  without even the implied warranty of MERCHAN-
--  TABILITY or FITNESS FOR A PARTICULAR PURPOSE.
--
--  As a special exception under Section 7 of GPL version 3, you are granted
--  additional permissions described in the GCC Runtime Library Exception,
--  version 3.1, as published by the Free Software Foundation.

pragma License (Modified_GPL);

generic
   type Item_Type is private;
   Max_Size : in Integer := 100;
package SAL.Gen_Bounded_Definite_Queues
  with Spark_Mode
is
   pragma Pure;

   subtype Size_Type is Integer range 1 .. Max_Size with
     Dynamic_Predicate => Max_Size in 1 .. Integer'Last / 2 - 1;
   --  We need to be able to say Head + N in expressions.

   type Queue_Type (Size : Size_Type) is tagged private;
   --  Size is maximum number of items in the queue.
   --  Tagged to allow Object.Method syntax.

   procedure Clear (Queue : in out Queue_Type)
   with Post => Queue.Count = 0;
   --  Empty Queue of all items.

   function Count (Queue : in Queue_Type) return Natural;
   --  Returns count of items in the Queue

   function Is_Empty (Queue : in Queue_Type) return Boolean
   with Post => (if Is_Empty'Result then Queue.Count = 0);
   --  Returns true if no items are in Queue.

   function Is_Full (Queue : in Queue_Type) return Boolean
   with Post => (if Is_Full'Result then Queue.Count = Queue.Size);
   --  Returns true if Queue is full.

   procedure Remove (Queue : in out Queue_Type; Item : out Item_Type) with
     Pre'Class => Queue.Count > 0,
     Post      => Queue.Count = Queue'Old.Count - 1;
   --  Remove head item from Queue, return it.
   --
   --  Raises Container_Empty if Is_Empty.

   function Remove (Queue : in out Queue_Type) return Item_Type with
     Spark_Mode => Off;

   procedure Drop (Queue : in out Queue_Type) with
     Pre'Class => Queue.Count > 0,
     Post      => Queue.Count = Queue'Old.Count - 1;
   --  Remove head item from Queue, discard it.
   --
   --  Raises Container_Empty if Is_Empty.

   function Peek (Queue : in Queue_Type; N : Integer := 0) return Item_Type with
     Pre'Class => Queue.Count > 0 and
                  N in 0 .. Queue.Count - 1;
   --  Return a copy of a queue item, without removing it. N = 0 is
   --  the queue head.

   procedure Add (Queue : in out Queue_Type; Item : in Item_Type) with
     Pre'Class => Queue.Count < Queue.Size;
   --  Add Item to the tail of Queue.

   procedure Add_To_Head (Queue : in out Queue_Type; Item : in Item_Type) with
     Pre'Class => Queue.Count < Queue.Size;
   --  Add Item to the head of Queue.

private

   type Item_Array_Type is array (Positive range <>) of Item_Type;

   type Queue_Type (Size : Size_Type) is tagged
   record
      Head       : Integer := 1;
      Tail       : Integer := 1;
      Item_Count : Integer := 0; -- not 'Count' to distinguish from the function
      Data       : Item_Array_Type (1 .. Size);
      --  Add at Tail + 1, remove at Head. Item_Count is current count;
      --  easier to keep track of that than to compute Is_Empty for
      --  each Add and Remove.
      --
      --  Empty is indicated by Item_Count = 0; head and tail are arbitrary
      --  in that case.
   end record with
     Dynamic_Predicate =>
       Head in 1 .. Size and
       Tail in 1 .. Size and
       Item_Count in 0 .. Size;

   function Count (Queue : in Queue_Type) return Natural is (Queue.Item_Count);

end SAL.Gen_Bounded_Definite_Queues;
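
For reference, a minimal client of the revised spec might look like the sketch below. It is illustrative only: it assumes the generic formal part (not shown above) now includes a `Max_Size : in Positive` formal alongside `Item_Type`, since `Size_Type` refers to `Max_Size`; the procedure and instantiation names are hypothetical.

```ada
--  Hypothetical instantiation; the Max_Size formal and all names
--  outside the spec above are assumptions.
with SAL.Gen_Bounded_Definite_Queues;

procedure Queue_Demo is
   package Int_Queues is new SAL.Gen_Bounded_Definite_Queues
     (Item_Type => Integer, Max_Size => 10);
   use Int_Queues;

   Q : Queue_Type (Size => 4);
   V : Integer;
begin
   Q.Clear;
   Q.Add (1);      --  Pre'Class holds: Count (0) < Size (4)
   Q.Add (2);
   Q.Remove (V);   --  FIFO: head item 1 comes out; Count drops to 1
   pragma Assert (V = 1 and Q.Count = 1);
end Queue_Demo;
```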

--  Abstract:
--
--  See spec.
--
--  Copyright (C) 2004, 2008, 2009, 2011, 2017, 2019 Stephen Leake.  All Rights Reserved.
--
--  This library is free software;  you can redistribute it and/or modify it
--  under terms of the  GNU General Public License  as published by the Free
--  Software  Foundation;  either version 3,  or (at your  option) any later
--  version. This library is distributed in the hope that it will be useful,
--  but WITHOUT ANY WARRANTY;  without even the implied warranty of MERCHAN-
--  TABILITY or FITNESS FOR A PARTICULAR PURPOSE.
--
--  As a special exception under Section 7 of GPL version 3, you are granted
--  additional permissions described in the GCC Runtime Library Exception,
--  version 3.1, as published by the Free Software Foundation.

pragma License (Modified_GPL);

package body SAL.Gen_Bounded_Definite_Queues
  with Spark_Mode
is

   --  Local subprograms

   function Wrap (Queue : in Queue_Type; I : in Integer) return Integer with
     Pre  => I in 0 .. (Queue.Size - 1) + Queue.Size,
     Post => Wrap'Result in 1 .. Queue.Size
   is begin
      if I > Queue.Size then
         return I - Queue.Size;
      elsif I < 1 then
         return Queue.Size + I;
      else
         return I;
      end if;
   end Wrap;
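
   --  Worked example (illustrative): with Queue.Size = 4,
   --  Wrap (Queue, 5) = 1, Wrap (Queue, 8) = 4, and Wrap (Queue, 0) = 4;
   --  so indices formed as Head + N, Tail + 1, or Head - 1 land back
   --  in 1 .. Queue.Size.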

   ----------
   --  Public subprograms

   procedure Clear (Queue : in out Queue_Type) is
   begin
      Queue.Item_Count := 0;
   end Clear;

   function Is_Empty (Queue : in Queue_Type) return Boolean is
   begin
      return Queue.Item_Count = 0;
   end Is_Empty;

   function Is_Full (Queue : in Queue_Type) return Boolean is
   begin
      return Queue.Item_Count = Queue.Size;
   end Is_Full;

   procedure Remove (Queue : in out Queue_Type; Item : out Item_Type)
   is begin
      Item := Queue.Data (Queue.Head);

      Queue.Item_Count := Queue.Item_Count - 1;

      if Queue.Item_Count > 0 then
         Queue.Head := Wrap (Queue, Queue.Head + 1);
      end if;
   end Remove;

   function Remove (Queue : in out Queue_Type) return Item_Type with
     Spark_Mode => Off
   is begin
      return Item : Item_Type do
         Remove (Queue, Item);
      end return;
   end Remove;

   procedure Drop (Queue : in out Queue_Type)
   is begin
      Queue.Item_Count := Queue.Item_Count - 1;

      if Queue.Item_Count > 0 then
         Queue.Head := Wrap (Queue, Queue.Head + 1);
      end if;
   end Drop;

   function Peek (Queue : in Queue_Type; N : Integer := 0) return Item_Type
   is begin
      return Queue.Data (Wrap (Queue, Queue.Head + N));
   end Peek;

   procedure Add (Queue : in out Queue_Type; Item : in Item_Type) is
   begin
      if Queue.Item_Count = 0 then
         Queue.Tail       := 1;
         Queue.Head       := 1;
         Queue.Item_Count := 1;
         Queue.Data (1)   := Item;
      else
         Queue.Tail              := Wrap (Queue, Queue.Tail + 1);
         Queue.Data (Queue.Tail) := Item;
         Queue.Item_Count        := Queue.Item_Count + 1;
      end if;
   end Add;

   procedure Add_To_Head (Queue : in out Queue_Type; Item : in Item_Type) is
   begin
      if Queue.Item_Count = 0 then
         Queue.Tail       := 1;
         Queue.Head       := 1;
         Queue.Item_Count := 1;
         Queue.Data (1)   := Item;
      else
         Queue.Head              := Wrap (Queue, Queue.Head - 1);
         Queue.Data (Queue.Head) := Item;
         Queue.Item_Count        := Queue.Item_Count + 1;
      end if;
   end Add_To_Head;

end SAL.Gen_Bounded_Definite_Queues;

