From: Stephen Leake
Newsgroups: comp.lang.ada
Subject: getting started with Spark; generic queue package
Date: Fri, 5 Jul 2019 17:51:28 -0700 (PDT)
Message-ID: <198a2d3c-2ce2-4016-9df5-9699568742e7@googlegroups.com>

I'm finally getting around to trying Spark, mostly because it finally
has some support for access types. For now, I'm using GNAT Community
2018; I'll upgrade to 2019 soon.

To start, I'm trying to add contracts to an existing bounded queue
package:

generic
   type Item_Type is private;
package SAL.Gen_Bounded_Definite_Queues
  with Spark_Mode
is
   pragma Pure;

   type Queue_Type (Size : Positive) is tagged private;
   -- Size is maximum number of items in the queue.
   -- Tagged to allow Object.Method syntax.
   function Get_Overflow_Handling (Queue : in Queue_Type) return Overflow_Action_Type;

   procedure Set_Overflow_Handling
     (Queue    : in out Queue_Type;
      Handling : in     Overflow_Action_Type);
   -- See Add for meaning of Overflow_Handling. Default is Error.

   procedure Clear (Queue : in out Queue_Type)
   with Post => Queue.Count = 0;
   -- Empty Queue of all items.

   function Count (Queue : in Queue_Type) return Natural;
   -- Returns count of items in the Queue.

   function Is_Empty (Queue : in Queue_Type) return Boolean
   with Post => (if Is_Empty'Result then Queue.Count = 0);
   -- Returns true if no items are in Queue.

   function Is_Full (Queue : in Queue_Type) return Boolean
   with Post => (if Is_Full'Result then Queue.Count = Queue.Size);
   -- Returns true if Queue is full.

   function Remove (Queue : in out Queue_Type) return Item_Type
   with Pre  => Queue.Count > 0,
        Post => Queue.Count = Queue.Count'Old - 1;
   -- Remove head item from Queue, return it.
   --
   -- Raises Container_Empty if Is_Empty.

   -- other stuff deleted

private
   type Item_Array_Type is array (Positive range <>) of Item_Type;

   type Queue_Type (Size : Positive) is tagged record
      Overflow_Handling : Overflow_Action_Type := Error;
      Head  : Natural := 0;
      Tail  : Natural := 0;
      Count : Natural := 0;
      Data  : Item_Array_Type (1 .. Size);
      -- Add at Tail + 1, remove at Head. Count is current count;
      -- easier to keep track of that than to compute Is_Empty for
      -- each Add and Remove.
   end record;
end SAL.Gen_Bounded_Definite_Queues;

with SAL.Gen_Bounded_Definite_Queues;
package Prove_Integer_Queues is new SAL.Gen_Bounded_Definite_Queues (Integer);

Running gnatprove on this gives info messages like:

prove_integer_queues.ads:2:01: info: in instantiation at sal-gen_bounded_definite_queues.ads:36
prove_integer_queues.ads:2:01: info: implicit function contract not available for proof ("Count" might not return)

Is there any way to turn those off?
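One thing I'm wondering about, since the message names "Count": would
completing Count as an expression function in the private part help?
As I understand it, the provers can then see its definition as an
implicit postcondition. Something like this (untested guess on my
part; the visible declaration stays as above):

private

   -- ... record declaration as before ...

   function Count (Queue : in Queue_Type) return Natural
   is (Queue.Count);
   -- Expression function completion; Queue.Count here selects the
   -- record component, so this matches what the body does anyway.

If that works, I'd expect the "implicit function contract not
available for proof" info message for Count to go away.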
I don't see such an option in 'gnatprove -help', nor in the Spark 2014
user guide (though I have not read every page yet; I did go thru the
tutorial).

Worse, in the gnatprove.out file, it says:

Prove_Integer_Queues.Add at sal-gen_bounded_definite_queues.ads:66, instantiated at prove_integer_queues.ads:2 skipped

and so on, for every subprogram. Is there a way to find out why it was
skipped? And what do I need to do to fix this? The body file has 'with
Spark_Mode' on the package.

In general, is it useful to use Spark on generic packages? Section 4.3
in the Spark user guide says:

   Note that a generic package instance is considered to be declared
   at its instantiation point. For example, a generic package cannot
   be both marked SPARK_Mode and instantiated in a subprogram body.

But I think that only applies if I run gnatprove on that
instantiation; if I only run it on Prove_Integer_Queues, it should be
ok. On the other hand, does that mean other instantiations of
Gen_Bounded_Definite_Queues are "proved" in any sense?

Most of my packages rely on Ada.Finalization, which is not currently
supported in Spark. Is there any plan to add that?

-- Stephe