From: Stephen Leake
Newsgroups: comp.lang.ada
Subject: Re: getting started with Spark; generic queue package
Date: Mon, 8 Jul 2019 17:23:49 -0700 (PDT)
Message-ID: <64ac9bde-041a-4fcd-b293-8dc48da1e760@googlegroups.com>
References: <198a2d3c-2ce2-4016-9df5-9699568742e7@googlegroups.com>
 <80f41865-e398-490d-bb39-653daf84034b@googlegroups.com>

On Monday, July 8, 2019 at 8:48:37 AM UTC-7, Anh Vo wrote:
> On Friday, July 5, 2019 at 5:51:29 PM UTC-7, Stephen Leake wrote:
> > I'm finally getting around to trying Spark, mostly because it
> > finally has some support for access types. For now, I'm using GNAT
> > Community 2018; I'll upgrade to 2019 soon.
> >
> > To start, I'm trying to add contracts to an existing bounded queue
> > package:
> >
> >    generic
> >       type Item_Type is private;
> >    package SAL.Gen_Bounded_Definite_Queues
> >      with Spark_Mode
> >    is
> >       pragma Pure;
> >
> >       type Queue_Type (Size : Positive) is tagged private;
> >       -- Size is maximum number of items in the queue.
> >       -- Tagged to allow Object.Method syntax.
> >
> >       function Get_Overflow_Handling (Queue : in Queue_Type) return Overflow_Action_Type;
> >       procedure Set_Overflow_Handling (Queue : in out Queue_Type; Handling : in Overflow_Action_Type);
> >       -- See Add for meaning of Overflow_Handling. Default is Error.
> >
> >       procedure Clear (Queue : in out Queue_Type)
> >       with Post => Queue.Count = 0;
> >       -- Empty Queue of all items.
> >
> >       function Count (Queue : in Queue_Type) return Natural;
> >       -- Returns count of items in the Queue.
> >
> >       function Is_Empty (Queue : in Queue_Type) return Boolean
> >       with Post => (if Is_Empty'Result then Queue.Count = 0);
> >       -- Returns true if no items are in Queue.
> >
> >       function Is_Full (Queue : in Queue_Type) return Boolean
> >       with Post => (if Is_Full'Result then Queue.Count = Queue.Size);
> >       -- Returns true if Queue is full.
> >
> >       function Remove (Queue : in out Queue_Type) return Item_Type with
> >         Pre  => Queue.Count > 0,
> >         Post => Queue.Count = Queue.Count'Old - 1;
> >       -- Remove head item from Queue, return it.
> >       --
> >       -- Raises Container_Empty if Is_Empty.
> >
> >       -- other stuff deleted
> >
> >    private
> >
> >       type Item_Array_Type is array (Positive range <>) of Item_Type;
> >
> >       type Queue_Type (Size : Positive) is tagged record
> >          Overflow_Handling : Overflow_Action_Type := Error;
> >
> >          Head  : Natural := 0;
> >          Tail  : Natural := 0;
> >          Count : Natural := 0;
> >          Data  : Item_Array_Type (1 .. Size);
> >          -- Add at Tail + 1, remove at Head. Count is current count;
> >          -- easier to keep track of that than to compute Is_Empty for
> >          -- each Add and Remove.
> >       end record;
> >
> >    end SAL.Gen_Bounded_Definite_Queues;
> >
> >
> >    with SAL.Gen_Bounded_Definite_Queues;
> >    package Prove_Integer_Queues is new SAL.Gen_Bounded_Definite_Queues (Integer);
> >
> > Running gnatprove on this gives info messages like:
> >
> >    prove_integer_queues.ads:2:01: info: in instantiation at sal-gen_bounded_definite_queues.ads:36
> >    prove_integer_queues.ads:2:01: info: implicit function contract not available for proof ("Count" might not return)
> >
> > Is there any way to turn those off? I don't see such an option in
> > 'gnatprove -help', nor in the Spark 2014 user guide (not that I have
> > read every page yet; I did go thru the tutorial).
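(Following up on my own post while I'm here: if I'm reading that info
message right, gnatprove has no contract or body for Count at the point
of use, and can't assume it terminates, so the Queue.Count
postconditions above may be opaque to the provers. One thing I plan to
try is completing Count as an expression function in the private part,
after the full view of Queue_Type, so its value and termination are
visible for proof. Untested sketch against the spec above:

   private

      --  (full declaration of Queue_Type as above)

      function Count (Queue : in Queue_Type) return Natural
        is (Queue.Count);
      --  Expression function completing the visible declaration; SPARK
      --  uses the expression as an implicit postcondition and treats
      --  expression functions as terminating. Here Queue.Count denotes
      --  the record component, while in the visible postconditions
      --  Queue.Count denotes this function.

Untested; I'll see what gnatprove makes of it.)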
> > Worse, in the gnatprove.out file, it says:
> >
> >    Prove_Integer_Queues.Add at sal-gen_bounded_definite_queues.ads:66, instantiated at prove_integer_queues.ads:2 skipped
> >
> > and so on, for every subprogram.
> >
> > Is there a way to find out why it was skipped? And what do I need
> > to do to fix this?
> >
> > The body file has 'with Spark_Mode' on the package.
> >
> > In general, is it useful to use Spark on generic packages? Section
> > 4.3 in the Spark user guide says:
> >
> >    Note that a generic package instance is considered to be declared
> >    at its instantiation point. For example, a generic package cannot
> >    be both marked SPARK_Mode and instantiated in a subprogram body.
> >
> > But I think that only applies if I run gnatprove on that
> > instantiation; if I only run it on Prove_Integer_Queues, it should
> > be ok. On the other hand, does that mean other instantiations of
> > Gen_Bounded_Definite_Queues are "proved" in any sense?
> >
> > Most of my packages rely on Ada.Finalization, which is not currently
> > supported in Spark. Is there any plan to add that?
> >
> > -- Stephe
>
> There is a dedicated mailing list for SPARK. The mailing address is
> spark2014-discuss@lists.adacore.com. You will get a quick reply from
> the experts in this list.

Ok, I'll post there.

> By the way, I tried to download your SAL so I can check it out with
> GNAT Community 2019. But the server was not available/failed.

I assume that's from http://stephe-leake.org/ada/sal.html; apparently I
got halfway thru my most recent release process and stopped. Fixed now.

-- Stephe
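P.S. To make my reading of the section 4.3 restriction concrete: a
library-level instance like Prove_Integer_Queues above should be fine;
what the guide rules out is instantiating the SPARK_Mode generic inside
a subprogram body. Sketch of the rejected pattern (Demo is a made-up
name; not run through gnatprove):

   with SAL.Gen_Bounded_Definite_Queues;
   procedure Demo is
      --  Instance declared in a subprogram body: per section 4.3,
      --  this cannot be combined with SPARK_Mode on the generic
      --  package itself.
      package Local_Queues is new SAL.Gen_Bounded_Definite_Queues (Integer);
   begin
      null;
   end Demo;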