comp.lang.ada
* Everything You Know Is Wrong
@ 2015-12-27  0:37 Jeffrey R. Carter
  2015-12-27  7:55 ` J-P. Rosen
                   ` (8 more replies)
  0 siblings, 9 replies; 40+ messages in thread
From: Jeffrey R. Carter @ 2015-12-27  0:37 UTC (permalink / raw)


When I started out in S/W development, I learned some rules, such as, "Integer
math is much faster than floating point," and, "Memory is scarce."

In the 90s, processors typically began to include an FPU, and floating-point
math became as fast as integer math, and in some cases faster, since it could
proceed in parallel with the CPU.

When computers began to commonly have RAM in quantities measured in GB rather
than KB or MB, memory ceased to be scarce, and things that were previously
laughable, such as

type Int_Set is array (Integer) of Boolean;
for Int_Set'Component_Size use Boolean'Size;

became possible (for a 32-bit Integer, Int_Set'Size would be 512 MB). What I
knew is wrong.

Today we learn that memory is much faster than persistent storage. That may soon
be wrong, too. I've been reading about non-volatile memory research, and it
seems that in a few years NV RAM will be available as fast as current RAM and as
persistent and durable as current disks.

This will no doubt revolutionize computer architectures and programming
languages. Instead of computers with distinct memory and storage, there will
probably be computers with lots of NV RAM (1-10 TB?) but no disk.

People will no doubt still want a hierarchical naming system for data stored in
that memory, but presumably S/W will map variables onto these "files". So
instead of the current "open, loop over read/modify/write, close" paradigm, we
might have something like

type R is record ...

type L is array (Positive range <>) of R;

F: mapped L with File_Name => "name";

All_Records : for I in F'range loop -- or "of F"

where the bounds of F will be determined from "name". A mechanism will be needed
for collections of heterogeneous data as well. F would be equivalent to a
Direct_IO file with in-out mode.
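
For contrast, today's paradigm, written with the standard Ada.Direct_IO
(a minimal sketch; the record contents and the modification are
placeholders):

with Ada.Direct_IO;

procedure Update_Records is
   type R is record
      Value : Integer;
   end record;

   package R_IO is new Ada.Direct_IO (Element_Type => R);
   use R_IO;

   F    : File_Type;
   Item : R;
begin
   Open (F, Inout_File, "name");

   All_Records : for I in 1 .. Size (F) loop
      Read  (F, Item, From => I);
      Item.Value := Item.Value + 1; -- some modification
      Write (F, Item, To => I);
   end loop All_Records;

   Close (F);
end Update_Records;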

I would think that the Ada 2X project should be thinking about these things, and
wonder what others here think about them.

-- 
Jeff Carter
"He that hath no beard is less than a man."
Much Ado About Nothing
132

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  0:37 Everything You Know Is Wrong Jeffrey R. Carter
@ 2015-12-27  7:55 ` J-P. Rosen
  2015-12-27 17:37   ` Jeffrey R. Carter
  2015-12-27  8:46 ` Dmitry A. Kazakov
                   ` (7 subsequent siblings)
  8 siblings, 1 reply; 40+ messages in thread
From: J-P. Rosen @ 2015-12-27  7:55 UTC (permalink / raw)


On 27/12/2015 01:37, Jeffrey R. Carter wrote:
> This will no doubt revolutionize computer architectures and programming
> languages. Instead of computers with distinct memory and storage, there will
> probably be computers with lots of NV RAM (1-10 TB?) but no disk.
> 
> People will no doubt still want a hierarchical naming system for data stored in
> that memory, but presumably S/W will map variables onto these "files". 

Time to resurrect Multics?

(For the education of the young generation: Multics was an OS of the
60s-70s, where everything was organized as "segments" of virtual memory.
There was a command to list "named segments" (equivalent of files),
which was naturally named... "ls".)

-- 
J-P. Rosen
Adalog
2 rue du Docteur Lombard, 92441 Issy-les-Moulineaux CEDEX
Tel: +33 1 45 29 21 52, Fax: +33 1 45 29 25 00
http://www.adalog.fr

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  0:37 Everything You Know Is Wrong Jeffrey R. Carter
  2015-12-27  7:55 ` J-P. Rosen
@ 2015-12-27  8:46 ` Dmitry A. Kazakov
  2015-12-27 17:36   ` Jeffrey R. Carter
  2015-12-28  9:57 ` Georg Bauhaus
                   ` (6 subsequent siblings)
  8 siblings, 1 reply; 40+ messages in thread
From: Dmitry A. Kazakov @ 2015-12-27  8:46 UTC (permalink / raw)


On 2015-12-27 01:37, Jeffrey R. Carter wrote:

> Today we learn that memory is much faster than persistent storage. That may soon
> be wrong, too. I've been reading about non-volatile memory research, and it
> seems that in a few years NV RAM will be available as fast as current RAM and as
> persistent and durable as current disks.
>
> This will no doubt revolutionize computer architectures and programming
> languages. Instead of computers with distinct memory and storage, there will
> probably be computers with lots of NV RAM (1-10 TB?) but no disk.

There will be no files and no I/O.

The idea of a memory-mapped object-oriented system is nothing new. On 
the contrary, it is more than 20 years old.

BTW, we will always have memory of multiple speeds. When the memory of each 
computer gets a unique place in the global address space, and networking 
I/O is thus eliminated, remote memory mapping will still remain much 
slower than local mapping. Similarly, locally shared writable memory will 
always be slower than single-ported memory, and that slower than cache, 
etc.

> People will no doubt still want a hierarchical naming system for data stored in
> that memory, but presumably S/W will map variables onto these "files". So
> instead of the current "open, loop over read/modify/write, close" paradigm, we
> might have something like
>
> type R is record ...
>
> type L is array (Positive range <>) of R;
>
> F: mapped L with File_Name => "name";

Static persistent object binding makes no sense, of course.

> All_Records : for I in F'range loop -- or "of F"
>
> where the bounds of F will be determined from "name". A mechanism will be needed
> for collections of heterogenous data as well. F would be equivalent to a
> Direct_IO file with in-out mode.

No, it will be equivalent to a container library.

BTW, Direct_IO stems from block-oriented devices, namely disks. Once 
there is no I/O, there will be no need to slice data into same-sized 
blocks.

> I would think that the Ada 2X project should be thinking about these things, and
> wonder what others here think about them.

Well, long ago I wrote here about requirements the language must have in 
order to support such persistent memory. The most important one is 
getting away from the trusted model. E.g. operations of a persistent 
protected object must go through the supervisor mode in order to keep 
protected members out of the caller's memory space. And you won't get 
anywhere without proper interfaces and an elaborated type system.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  8:46 ` Dmitry A. Kazakov
@ 2015-12-27 17:36   ` Jeffrey R. Carter
  2016-01-04 14:44     ` Alejandro R. Mosteo
  0 siblings, 1 reply; 40+ messages in thread
From: Jeffrey R. Carter @ 2015-12-27 17:36 UTC (permalink / raw)


On 12/27/2015 01:46 AM, Dmitry A. Kazakov wrote:
> 
> The idea of a memory-mapped object-oriented system is nothing new. On the
> contrary, it is more than 20 years old.

Well, as Rosen pointed out, if you accept virtual memory as memory, then it's
even older than that.

> No, it will be equivalent to a container library.

Yes, thinking more about the idea, when current S/W writes a file, it often has
no idea how big that file will be until it's finished. The equivalent would seem
to be a memory-mapped unbounded container that persists, with a name, after the
program ends.
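
As a sketch, the spec of such a container might look like this (the
package name and all operations here are invented for illustration):

generic
   type Element_Type is private;
package Persistent_Vectors is

   type Vector is tagged limited private;

   --  Bind V to the named persistent object, creating it if absent.
   procedure Attach (V : in out Vector; Name : String);

   procedure Append (V : in out Vector; Item : Element_Type);
   function  Element (V : Vector; Index : Positive) return Element_Type;
   function  Length (V : Vector) return Natural;

   --  No Close or Save: the data lives in NV RAM; Detach merely drops
   --  this program's binding to it.
   procedure Detach (V : in out Vector);

private
   type Vector is tagged limited null record; -- representation elided
end Persistent_Vectors;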

Another thought I've had is the need to wipe non-mapped objects at program
termination for security.

-- 
Jeff Carter
"Who wears beige to a bank robbery?"
Take the Money and Run
144

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  7:55 ` J-P. Rosen
@ 2015-12-27 17:37   ` Jeffrey R. Carter
  0 siblings, 0 replies; 40+ messages in thread
From: Jeffrey R. Carter @ 2015-12-27 17:37 UTC (permalink / raw)


On 12/27/2015 12:55 AM, J-P. Rosen wrote:
> 
> (For the education of the young generation: Multics was an OS of the
> 60s-70s, where everything was organized as "segments" of virtual memory.
> There was a command to list "named segments" (equivalent of files),
> which was naturally named... "ls".)

The name Unix was chosen as being the opposite of Multics.

-- 
Jeff Carter
"Who wears beige to a bank robbery?"
Take the Money and Run
144

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  0:37 Everything You Know Is Wrong Jeffrey R. Carter
  2015-12-27  7:55 ` J-P. Rosen
  2015-12-27  8:46 ` Dmitry A. Kazakov
@ 2015-12-28  9:57 ` Georg Bauhaus
  2015-12-28 11:19   ` Dmitry A. Kazakov
  2015-12-28 17:19 ` Nicholas Collin Paul de Gloucester
                   ` (5 subsequent siblings)
  8 siblings, 1 reply; 40+ messages in thread
From: Georg Bauhaus @ 2015-12-28  9:57 UTC (permalink / raw)


On 27.12.15 01:37, Jeffrey R. Carter wrote:
> I would think that the Ada 2X project should be thinking about these things, and
> wonder what others here think about them.

Maybe the word "distributed" needs to include more features
of the language and its library.

Considering economy, the increase in resources does not seem
to always entail new algorithms or corresponding new features
of languages(*). I have seen it simplify scaling when we could
just reuse the same program in a larger address space.

A program system that serves many instead of one will save the cost
of a few computers or their parts, so virtualization covers some
use cases already, without programs needing to change.

Another influence of economy on source text, given no major advances
in battery technology, dynamos, generators, etc., is energy
savings. This may inspire a whole new approach to optimizations in
program translation: I imagine an aspect called "Importance".  It
lets the compiler choose instructions and orders of execution in a
way that uses resources when needed or as available.
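
Purely as a sketch, such an aspect might be written like so (the
"Importance" aspect and its values are imagined here, not part of any
Ada standard):

procedure Reindex_Archive        -- background housekeeping:
  with Importance => Low;        -- compile and schedule for low energy

procedure Handle_Sensor_Alarm    -- time-critical:
  with Importance => High;       -- compile and schedule for speed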

__
(*) “We don’t have better algorithms. We just have more data.”
                                           -- P. Norvig

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-28  9:57 ` Georg Bauhaus
@ 2015-12-28 11:19   ` Dmitry A. Kazakov
  2015-12-28 16:27     ` Nicholas Collin Paul de Gloucester
  0 siblings, 1 reply; 40+ messages in thread
From: Dmitry A. Kazakov @ 2015-12-28 11:19 UTC (permalink / raw)


On 2015-12-28 10:57, Georg Bauhaus wrote:

> Another influence of economy on source text, given no major advances
> in battery technology, dynamos, generators, etc., is energy
> savings. This may inspire a whole new approach to optimizations in
> program translation: I imagine an aspect called "Importance".  It
> lets the compiler choose instructions and orders of execution in a
> way that uses resources when needed or as available.

I doubt that will play any significant role. What must change is the 
methods of binding. Modular, distributed, persistent data, long-living 
software require much more than plain linkers/loaders presently offer.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-28 11:19   ` Dmitry A. Kazakov
@ 2015-12-28 16:27     ` Nicholas Collin Paul de Gloucester
  2015-12-28 17:30       ` Dmitry A. Kazakov
  0 siblings, 1 reply; 40+ messages in thread
From: Nicholas Collin Paul de Gloucester @ 2015-12-28 16:27 UTC (permalink / raw)


On December 28th, 2015, Dmitry A. Kazakov posted:
|-----------------------------------------------------------------------------|
|"On 2015-12-28 10:57, Georg Bauhaus wrote:                                   |
|                                                                             |
|> Another influence of economy on source text, given no major advances       |
|> in battery technology, dynamos, generators, etc., is energy                |
|> savings. This may inspire a whole new approach to optimizations in         |
|> program translation: I imagine an aspect called "Importance".  It          |
|> lets the compiler choose instructions and orders of execution in a         |
|> way that uses resources when needed or as available.                       |
|                                                                             |
|I doubt that will play any significant role. What must change is the methods |
|of binding. Modular, distributed, persistent data, long-living software      |
|require much more than plain linkers/loaders presently offer.                |
|                                                                             |
|--                                                                           |
|Regards,                                                                     |
|Dmitry A. Kazakov                                                            |
|  http://www.dmitry-kazakov.de  "                                            |
|-----------------------------------------------------------------------------|


Reducing energy consumption is not a novel ideal. E.g. I quote from a
review by me of Petru Eles; Krzysztof Kuchcinski; and Zebo Peng,
"System Synthesis with VHDL", Kluwer Academic Publishers (
   WWW.ACCU.org/index.php?module=bookreviews&func=search&rid=1291 
) which was published in "CVu", Volume 16, Number 2:
"[. . .]

I learnt from the low-power synthesis chapter that two's complement is
believed to consume more power than sign-magnitude due to the high
level of switching needed if a variable/signal toggles between
positive and negative often.

[. . .]"
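
The switching claim can be made concrete with a small standard-Ada
program (a sketch; it prints the two's-complement bit patterns, in
which toggling between +1 and -1 flips 31 of 32 bits, whereas
sign-magnitude would flip only the sign bit):

with Ada.Text_IO;              use Ada.Text_IO;
with Ada.Unchecked_Conversion;
with Interfaces;               use Interfaces;

procedure Toggle_Demo is
   function To_Bits is new Ada.Unchecked_Conversion (Integer_32, Unsigned_32);

   function Image (U : Unsigned_32) return String is
      S : String (1 .. 32);
   begin
      for I in S'Range loop -- most significant bit first
         S (I) := (if (Shift_Right (U, 32 - I) and 1) = 1 then '1' else '0');
      end loop;
      return S;
   end Image;
begin
   Put_Line ("+1 = " & Image (To_Bits (1)));  -- 0000...0001
   Put_Line ("-1 = " & Image (To_Bits (-1))); -- 1111...1111
end Toggle_Demo;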

With kind regards,
Nicholas Collin Paul de Gloucester


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  0:37 Everything You Know Is Wrong Jeffrey R. Carter
                   ` (2 preceding siblings ...)
  2015-12-28  9:57 ` Georg Bauhaus
@ 2015-12-28 17:19 ` Nicholas Collin Paul de Gloucester
  2015-12-29 23:37 ` darkestkhan
                   ` (4 subsequent siblings)
  8 siblings, 0 replies; 40+ messages in thread
From: Nicholas Collin Paul de Gloucester @ 2015-12-28 17:19 UTC (permalink / raw)


On December 26th, 2015, Jeffrey R. Carter posted:
|--------------------------------------------------------------------------------|
|"When I started out in S/W development, I learned some rules, such as, "Integer |
|math is much faster than floating point," and, "Memory is scarce."              |
|                                                                                |
|In the 90s, processors began to typically have an FPU, and floating-point math  |
|became as fast as integer, and in some cases, since it could proceed in parallel|
|with the CPU, faster.                                                           |
|                                                                                |
|When computers began to commonly have RAM in quantities measured in GB rather   |
|than KB or MB, memory ceased to be scarce, [. . .]                              |
|[. . .]                                                                         |
|                                                                                |
|[. . .]                                                                         |
|                                                                                |
|--                                                                              |
|Jeff Carter"                                                                    |
|--------------------------------------------------------------------------------|


Even a number-crunching workstation is likely to have caches which are
much smaller than ideal.

Not all current FPGAs nor embedded microcontrollers have an FPU (nor
much memory, let alone a cache).

Yours sincerely,
Nicholas Collin Paul de Gloucester

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-28 16:27     ` Nicholas Collin Paul de Gloucester
@ 2015-12-28 17:30       ` Dmitry A. Kazakov
  2015-12-28 18:50         ` Nicholas Collin Paul de Gloucester
  0 siblings, 1 reply; 40+ messages in thread
From: Dmitry A. Kazakov @ 2015-12-28 17:30 UTC (permalink / raw)


On 2015-12-28 17:27, Nicholas Collin Paul de Gloucester wrote:

> Reducing energy consumption is not a novel ideal. E.g. I quote from a
> review by me of Petru Eles; Krzysztof Kuchcinski; and Zebo Peng,
> "System Synthesis with VHDL", Kluwer Academic Publishers (
>    WWW.ACCU.org/index.php?module=bookreviews&func=search&rid=1291 )
> which was published in "CVu", Volume 16, Number 2:
> "[. . .]
>
> I learnt from the low-power synthesis chapter that two's complement is
> believed to consume more power than sign-magnitude due to the high
> level of switching needed if a variable/signal toggles between
> positive and negative often.

That is not the goal of the energy-saving agenda, which is doing fewer 
computations, e.g. by turning off some circuits or reducing frequency. I 
don't think anybody would seriously consider this an advantage. From the 
SW POV it is only more complexity, less safety, etc.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-28 17:30       ` Dmitry A. Kazakov
@ 2015-12-28 18:50         ` Nicholas Collin Paul de Gloucester
  2015-12-28 20:40           ` Dmitry A. Kazakov
  0 siblings, 1 reply; 40+ messages in thread
From: Nicholas Collin Paul de Gloucester @ 2015-12-28 18:50 UTC (permalink / raw)


On December 28th, 2015, Dmitry A. Kazakov posted:
|----------------------------------------------------------------------------|
|"On 2015-12-28 17:27, Nicholas Collin Paul de Gloucester wrote:             |
|                                                                            |
|> Reducing energy consumption is not a novel ideal. E.g. I quote from a     |
|> review by me of Petru Eles; Krzysztof Kuchcinski; and Zebo Peng,          |
|> "System Synthesis with VHDL", Kluwer Academic Publishers (                |
|>    WWW.ACCU.org/index.php?module=bookreviews&func=search&rid=1291 )       |
|> which was published in "CVu", Volume 16, Number 2:                        |
|> "[. . .]                                                                  |
|>                                                                           |
|> I learnt from the low-power synthesis chapter that two's complement is    |
|> believed to consume more power than sign-magnitude due to the high        |
|> level of switching needed if a variable/signal toggles between            |
|> positive and negative often.                                              |
|                                                                            |
|That is not the goal of energy saving agenda, which is doing less           |
|computations, e.g. by turning off some circuits or reducing frequency. I    |
|don't think anybody would seriously consider this an advantage. From the SW |
|POV it is only more complexity, less safety etc.                            |
|                                                                            |
|--                                                                          |
|Regards,                                                                    |
|Dmitry A. Kazakov                                                           |
| http://www.dmitry-kazakov.de "                                             |
|----------------------------------------------------------------------------|

What is "this"?

If "this" would be a reification of an integer type which would not be
two's complement, then alternative reifications are of interest:
e.g. sign-magnitude as documented by Eles; Kuchcinski; and Peng for
saving energy; or Gray code for a different motivation.

If "this" would be "turning off some circuits", then witness clock
gating.

If "this" would be "reducing frequency", then there would also be
cases of this.

Happy New Year.

Regards,
Nicholas Collin Paul de Gloucester

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-28 18:50         ` Nicholas Collin Paul de Gloucester
@ 2015-12-28 20:40           ` Dmitry A. Kazakov
  2015-12-29 11:42             ` G.B.
  0 siblings, 1 reply; 40+ messages in thread
From: Dmitry A. Kazakov @ 2015-12-28 20:40 UTC (permalink / raw)


On 2015-12-28 19:50, Nicholas Collin Paul de Gloucester wrote:

> What is "this"?

Energy saving set as the goal [of computing?]. The proper goal is better 
performance; if that requires lower voltage, so be it.

I understood Georg to mean "green computing", which is the reverse: 
buying energy by selling performance.

In the real world we get better performance at lower energy, just per 
laws of physics, as you pointed out.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-28 20:40           ` Dmitry A. Kazakov
@ 2015-12-29 11:42             ` G.B.
  2015-12-29 12:36               ` Dmitry A. Kazakov
  0 siblings, 1 reply; 40+ messages in thread
From: G.B. @ 2015-12-29 11:42 UTC (permalink / raw)


On 28.12.15 21:40, Dmitry A. Kazakov wrote:
> On 2015-12-28 19:50, Nicholas Collin Paul de Gloucester wrote:
>
>> What is "this"?
>
> Energy saving set as the goal [of computing?]. The proper goal is better
> performance; if that requires lower voltage, so be it.
>
> I understood Georg to mean "green computing", which is the reverse:
> buying energy by selling performance.

I was thinking of mobile computing, among other forms of computing
that depend on the presence of, say, batteries.

The hardware people do a lot to reduce energy needs.  The software
people could add to that. One addition is almost existing already,
in JIT compilers that deploy alternative compiled routines. Of
course, the compilation step doesn't necessarily save energy
now. But optimization demonstrates that compiled routines could
differ by energy consumption.

Considering the above, I was also thinking of resource sharing, the
resource in this case being energy. When the amount of energy available
to the computer depends on wall-clock time, a programmer arranges for
code whose execution can be postponed until there is sufficient energy.
For example, if two subprograms are independent and their execution can
be postponed until a third one needs their results, then several orders
of execution become possible, and this does not require tasking. Even
interleaving is OK.

Sequential:       Reordered:

   op1             op1     op2
    |                 \   /
   op2                 op3
    |
   op3

Can pragma Pure be extended to allow this optimization?


> In the real world we get better performance at lower energy, just per
> laws of physics, as you pointed out.

You would be in control of Watts for performance.


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 11:42             ` G.B.
@ 2015-12-29 12:36               ` Dmitry A. Kazakov
  2015-12-29 13:50                 ` G.B.
  0 siblings, 1 reply; 40+ messages in thread
From: Dmitry A. Kazakov @ 2015-12-29 12:36 UTC (permalink / raw)


On 2015-12-29 12:42, G.B. wrote:

> The hardware people do a lot to reduce energy needs.  The software
> people could add to that.

I don't think so. Software is too costly and too volatile. Any potential 
win is negligible and will be overtaken by new hardware in just one year.

> For example, if two subprograms are independent
> and their execution can be postponed until a third one needs their
> results,

Firstly, you cannot know that. Secondly, this is a classic abstraction 
inversion example. Subprograms are the result of software decomposition, 
which is driven by the problem. If the decomposition is driven by a 
constraint, and this happens sometimes, e.g. in real-time systems, you 
get handed a huge design problem. We know how expensive real-time systems 
are. Now with the costs and risks of software development it simply does 
not make sense to do this. In one year there will be new hardware and, 
considering mobile platforms, the battery will degrade so much that you 
will notice no difference anyway.

>> In the real world we get better performance at lower energy, just per
>> laws of physics, as you pointed out.
>
> You would be in control of Watts for performance.

Yeah. Remember that "turbo" button on desktops in 90's?

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 12:36               ` Dmitry A. Kazakov
@ 2015-12-29 13:50                 ` G.B.
  2015-12-29 14:06                   ` J-P. Rosen
  2015-12-29 14:16                   ` Dmitry A. Kazakov
  0 siblings, 2 replies; 40+ messages in thread
From: G.B. @ 2015-12-29 13:50 UTC (permalink / raw)


On 29.12.15 13:36, Dmitry A. Kazakov wrote:
> On 2015-12-29 12:42, G.B. wrote:
>
>> The hardware people do a lot to reduce energy needs.  The software
>> people could add to that.
>
> I don't think so. Software is too costly and too volatile. Any potential
> win is negligible and will be overtaken by new hardware in just one year.

The managers of the billion $ computer companies have been
addressing battery related running times for years already. Suppose
that a compiler optimizes a program for energy consumption. So, no
big batteries are needed. If the sales people can therefore make the
designers design a thinner device or a device that runs longer
before recharging is needed, then that makes optimizing for energy
consumption a sales argument that will affect production.

>> For example, if two subprograms are independent
>> and their execution can be postponed until a third one needs their
>> results,
>
> Firstly, you cannot know that.

Independence of subprograms can follow from an abstract design
and from a solution, both of which the programmers know. A compiler
may not know it generally if there is no language for it.
(No "with" of one by the other, and no dependence on anything
with "Address" in it, or "Import", or I/O, ....)

SPARK will allow full formalization, I think.
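
For instance, a minimal sketch in SPARK 2014 notation (the signatures
are illustrative):

   --  Global => null asserts that the function neither reads nor
   --  writes global state, so calls on distinct arguments are
   --  independent and may be reordered or run in parallel.
   function F1 (A, B, C : Integer) return Integer
     with Global => null;

   function F2 (X : Integer) return Integer
     with Global => null;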


> Secondly, this is a classic abstraction
> inversion example. Subprograms are result of software decomposition,
> which is driven by the problem.

If problem driven decomposition happens to give independent
subprograms, why not have optimization make good use of the
opportunity? As "optimization" indicates, the corresponding
compiler switch would not normally drive designs.



^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 13:50                 ` G.B.
@ 2015-12-29 14:06                   ` J-P. Rosen
  2015-12-29 14:16                   ` Dmitry A. Kazakov
  1 sibling, 0 replies; 40+ messages in thread
From: J-P. Rosen @ 2015-12-29 14:06 UTC (permalink / raw)


On 29.12.15 13:36, Dmitry A. Kazakov wrote:
> On 2015-12-29 12:42, G.B. wrote:
>
>> The hardware people do a lot to reduce energy needs.  The software
>> people could add to that.
>
> I don't think so. Software is too costly and too volatile. Any potential
> win is negligible and will be overtaken by new hardware in just one year.

Well, I can tell you that the lifetime of my Nexus5 (on battery)
sensibly increased with the delivery of Android-Marshmallow...

-- 
J-P. Rosen
Adalog
2 rue du Docteur Lombard, 92441 Issy-les-Moulineaux CEDEX
Tel: +33 1 45 29 21 52, Fax: +33 1 45 29 25 00
http://www.adalog.fr


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 13:50                 ` G.B.
  2015-12-29 14:06                   ` J-P. Rosen
@ 2015-12-29 14:16                   ` Dmitry A. Kazakov
  2015-12-29 16:31                     ` Dennis Lee Bieber
  2015-12-29 16:57                     ` G.B.
  1 sibling, 2 replies; 40+ messages in thread
From: Dmitry A. Kazakov @ 2015-12-29 14:16 UTC (permalink / raw)


On 2015-12-29 14:50, G.B. wrote:
> On 29.12.15 13:36, Dmitry A. Kazakov wrote:
>> On 2015-12-29 12:42, G.B. wrote:
>>
>>> The hardware people do a lot to reduce energy needs.  The software
>>> people could add to that.
>>
>> I don't think so. Software is too costly and too volatile. Any potential
>> win is negligible and will be overtaken by new hardware in just one year.
>
> The managers of the billion $ computer companies have been
> addressing battery related running times for years already. Suppose
> that a compiler optimizes a program for energy consumption. So, no
> big batteries are needed.

In what sense? Less battery drain? 0.1%? The point is that at the 
instruction level, if we are talking about optimization rather than 
software redesign, no optimization could give you anything visible. With 
redesign, you could probably get 1%, bought with massive software 
problems, as if we had only a few today.

What does reduce energy consumption is the hardware architecture. E.g. 
designs with circuits having their own micro power sources. In a 
massively parallel system with distributed power supplies you will get 
better power management for free.

>>> For example, if two subprograms are independent
>>> and their execution can be postponed until a third one needs their
>>> results,
>>
>> Firstly, you cannot know that.
>
> Independence of subprograms can follow from an abstract design
> and from a solution, both of which the programmers know.

The compiler must become an oracle, not just programmer's mind reader, 
to guess that a subprogram call can be postponed.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 14:16                   ` Dmitry A. Kazakov
@ 2015-12-29 16:31                     ` Dennis Lee Bieber
  2015-12-29 17:02                       ` G.B.
  2015-12-29 16:57                     ` G.B.
  1 sibling, 1 reply; 40+ messages in thread
From: Dennis Lee Bieber @ 2015-12-29 16:31 UTC (permalink / raw)


On Tue, 29 Dec 2015 15:16:38 +0100, "Dmitry A. Kazakov"
<mailbox@dmitry-kazakov.de> declaimed the following:

>
>The compiler must become an oracle, not just programmer's mind reader, 
>to guess that a subprogram call can be postponed.

	It sure wouldn't make life easy for someone trying to certify the
software for civil avionics... WCET would have to assume the calls are
always invoked, the time would have to be included in the scheduling of
processing slices, etc.
-- 
	Wulfraed                 Dennis Lee Bieber         AF6VN
    wlfraed@ix.netcom.com    HTTP://wlfraed.home.netcom.com/


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 14:16                   ` Dmitry A. Kazakov
  2015-12-29 16:31                     ` Dennis Lee Bieber
@ 2015-12-29 16:57                     ` G.B.
  2015-12-29 17:36                       ` Dmitry A. Kazakov
  1 sibling, 1 reply; 40+ messages in thread
From: G.B. @ 2015-12-29 16:57 UTC (permalink / raw)


On 29.12.15 15:16, Dmitry A. Kazakov wrote:

>>>> For example, if two subprograms are independent
>>>> and their execution can be postponed until a third one needs their
>>>> results,
>>>
>>> Firstly, you cannot know that.
>>
>> Independence of subprograms can follow from an abstract design
>> and from a solution, both of which the programmers know.
>
> The compiler must become an oracle, not just programmer's mind reader,
> to guess that a subprogram call can be postponed.

No oracle is needed:

A parallel loop is non-trivially parallel only if in its body
there are, ultimately, statements that can be executed independently
of one another.

The point is that, since order does not matter among these parallel
computations, and since parallel loops are not just fantasy,
the possibility of postponement follows from parallel loops.

So, while no oracle is needed, presumably some rules in the language
need to say what can be safely put inside a parallel loop.

Consider F1 and F2 from a Pure package, X a variable of type T
and T without progenitors:

    declare
       A : T := F1 (1,2,3);
       B : T := F2 (X);
    begin
       G (A, B);
    end;

Is it impossible for a compiler to see that the initializing parts of
the declarations of A and B are independent?

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 16:31                     ` Dennis Lee Bieber
@ 2015-12-29 17:02                       ` G.B.
  0 siblings, 0 replies; 40+ messages in thread
From: G.B. @ 2015-12-29 17:02 UTC (permalink / raw)


On 29.12.15 17:31, Dennis Lee Bieber wrote:
> On Tue, 29 Dec 2015 15:16:38 +0100, "Dmitry A. Kazakov"
> <mailbox@dmitry-kazakov.de> declaimed the following:
>
>>
>> The compiler must become an oracle, not just programmer's mind reader,
>> to guess that a subprogram call can be postponed.
>
> 	It sure wouldn't make life easy for someone trying to certify the
> software for civil avionics... WCET would have to assume the calls are
> always invoked, the time would have to be included in the scheduling of
> processing slices, etc.
>

Apart from combinatorial complexity, is there a fundamental
difference? Can postponement or reordering be made analogues
of events that happen at unpredictable points in time?


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 16:57                     ` G.B.
@ 2015-12-29 17:36                       ` Dmitry A. Kazakov
  2015-12-29 17:53                         ` G.B.
  2015-12-29 21:58                         ` Randy Brukardt
  0 siblings, 2 replies; 40+ messages in thread
From: Dmitry A. Kazakov @ 2015-12-29 17:36 UTC (permalink / raw)


On 2015-12-29 17:57, G.B. wrote:

> The point is that, since order does not matter among these parallel
> computations, and since parallel loops are not just fantasy,
> the possibility of postponement follows from parallel loops.

Ada does not have parallel loops. The job is done!

> Consider F1 and F2 from a Pure package, X a variable of type T
> and T without progenitors:
>
>     declare
>        A : T := F1 (1,2,3);
>        B : T := F2 (X);
>     begin
>        G (A, B);
>     end;
>
> Is it impossible for a compiler to see that the initializing parts of
> the declarations of A and B are independent?

No, because initialization of T may have side effects. Ada mandates that 
B be initialized after A. Again, the "optimization" you wanted is done.

BTW, why do you believe that the same amount of computation performed 
consecutively should require less energy or (not quite same) less 
battery drain? It is no obvious. In any case it might depend on the 
specifics of the battery, the board, the memory controller, the number 
of cores etc. You cannot optimize for these, it would lead a 
combinatoric explosion of targets. All that mess to save 0.1% drain?

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 17:36                       ` Dmitry A. Kazakov
@ 2015-12-29 17:53                         ` G.B.
  2015-12-29 18:09                           ` G.B.
                                             ` (2 more replies)
  2015-12-29 21:58                         ` Randy Brukardt
  1 sibling, 3 replies; 40+ messages in thread
From: G.B. @ 2015-12-29 17:53 UTC (permalink / raw)


On 29.12.15 18:36, Dmitry A. Kazakov wrote:
> On 2015-12-29 17:57, G.B. wrote:
>
>> The point is that, since order does not matter among these parallel
>> computations, and since parallel loops are not just fantasy,
>> the possibility of postponement follows from parallel loops.
>
> Ada does not have parallel loops.

Ada is about to have parallel loops. Ada compilers already produce
code for parallel execution of loop bodies on today's processors.


>> Consider F1 and F2 from a Pure package, X a variable of type T
>> and T without progenitors:
>>
>>     declare
>>        A : T := F1 (1,2,3);
>>        B : T := F2 (X);
>>     begin
>>        G (A, B);
>>     end;
>>
>> Is it impossible for a compiler to see that the initializing parts of
>> the declarations of A and B are independent?
>
> No, because initialization of T may have side effects.

I assume you meant Yes, impossible. But,
if F1 and F2 are from a Pure package, is the compiler allowed to
ignore side effects because the programmer specified Pure? I think
that follows.

> BTW, why do you believe that the same amount of computation performed
> consecutively should require less energy or (not quite same) less
> battery drain?

Less energy at a certain point in time. Postpone calling sub if, say,
a photocell should first be given time to recharge a battery while
while regular operation continues, then call sub.



^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 17:53                         ` G.B.
@ 2015-12-29 18:09                           ` G.B.
  2015-12-29 22:05                           ` Randy Brukardt
  2016-01-04 14:51                           ` Alejandro R. Mosteo
  2 siblings, 0 replies; 40+ messages in thread
From: G.B. @ 2015-12-29 18:09 UTC (permalink / raw)


On 29.12.15 18:53, G.B. wrote:

> if F1 and F2 are from a Pure package, is the compiler allowed to
> ignore side effects because the programmer specified Pure? I think
> that follows.

(It doesn't)

Better: If T's (automatic) initialization, if any, has side effects,
then the program is either unsuitable, or, more like Ada, the
programmer needs to specify "I know what I'm doing". (For example,
side effects being innocuous if repeated.)


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 17:36                       ` Dmitry A. Kazakov
  2015-12-29 17:53                         ` G.B.
@ 2015-12-29 21:58                         ` Randy Brukardt
  1 sibling, 0 replies; 40+ messages in thread
From: Randy Brukardt @ 2015-12-29 21:58 UTC (permalink / raw)


"Dmitry A. Kazakov" <mailbox@dmitry-kazakov.de> wrote in message 
news:n5ugbf$fqr$1@speranza.aioe.org...
> On 2015-12-29 17:57, G.B. wrote:
>
>> The point is that, since order does not matter among these parallel
>> computations, and since parallel loops are not just fantasy,
>> the possibility of postponement follows from parallel loops.
>
> Ada does not have parallel loops. The job is done!

But Ada 202x most likely will.

>> Consider F1 and F2 from a Pure package, X a variable of type T
>> and T without progenitors:
>>
>>     declare
>>        A : T := F1 (1,2,3);
>>        B : T := F2 (X);
>>     begin
>>        G (A, B);
>>     end;
>>
>> Is it impossible for a compiler to see that the initializing parts of
>> the declarations of A and B are independent?
>
> No, because initialization of T may have side effects. Ada mandates that 
> B be initialized after A. Again, the "optimization" you wanted is done.

Irrelevant - the compiler knows if any side-effects are possible (T is known 
to the compiler). That is going to be a conservative determination, but it 
will be true for many types (there is no side-effect for elementary types, 
for instance). Similarly, the compiler could know that other side-effects 
aren't possible, either by inspecting the code or by a declaration (many 
compilers have mechanisms for this, and Ada 202x is likely to get one as 
well).

In any case, optimization is a very important part of compilation. Pretty 
much all Ada compilers make decisions like the one shown here every time you 
compile anything - the results would be terrible if they did not. For 
Janus/Ada, it makes a difference of up to 25% in time and space (compared to 
the unoptimized code). And of course one can optimize for many different 
things, including power usage -- I'm pretty sure the compilers used for 
mobile platforms do such optimizations as a matter of course. And that's for 
C -- Ada provides many more optimization possibilities because it has far 
more information about the program and the programmer's intentions.

                                          Randy.







^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 17:53                         ` G.B.
  2015-12-29 18:09                           ` G.B.
@ 2015-12-29 22:05                           ` Randy Brukardt
  2016-01-04 14:51                           ` Alejandro R. Mosteo
  2 siblings, 0 replies; 40+ messages in thread
From: Randy Brukardt @ 2015-12-29 22:05 UTC (permalink / raw)


"G.B." <bauhaus@futureapps.invalid> wrote in message 
news:n5uh6q$chh$1@dont-email.me...
> On 29.12.15 18:36, Dmitry A. Kazakov wrote:
...
>> No, because initialization of T may have side effects.
>
> I assume you meant Yes, impossible. But,
> if F1 and F2 are from a Pure package, is the compiler allowed to
> ignore side effects because the programmer specified Pure? I think
> that follows.


Ada does allow compilers to ignore side-effects for Pure functions, but only 
for successive calls to the same function with the same parameters. Which is 
not enough for your example.

And you're right, it *should* follow, but Pure is broken. For parallel 
execution, we need something stronger - and it's in the hopper today. 
(Algebraic optimizations also need something stronger.) [Pure, for instance, 
allows dereferencing of pointers, which could change by some other path. And 
making it apply to an entire package screws up organization, since most 
packages have a bunch of pure functions and many other, non-pure 
operations.] One of the reasons Ada 2012 has expression functions is that if 
the compiler can see the entire function definition, it can then do these 
sorts of optimizations. (But that's a weak solution for many reasons.)
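
For reference, an expression function (a trivial, hypothetical one):

   --  The full definition is part of the declaration, so a compiler
   --  that sees the spec can inline and optimize calls without
   --  needing the package body.
   function Double (X : Integer) return Integer is (2 * X);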

                                 Randy.



^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  0:37 Everything You Know Is Wrong Jeffrey R. Carter
                   ` (3 preceding siblings ...)
  2015-12-28 17:19 ` Nicholas Collin Paul de Gloucester
@ 2015-12-29 23:37 ` darkestkhan
  2016-01-05 13:52 ` brbarkstrom
                   ` (3 subsequent siblings)
  8 siblings, 0 replies; 40+ messages in thread
From: darkestkhan @ 2015-12-29 23:37 UTC (permalink / raw)


If a processor is not heavily used, it doesn't run at full 'capacity':
modern CPUs (and GPUs, too) can be underclocked by the OS, and usually,
if there isn't much work to be done at the moment, they are put at the
lowest frequency possible (not to mention that nowadays, if certain
circuits in the CPU aren't being used, the CPU can just switch that part
off). Thus, to save energy, you want your task to be computed ASAP, so
that the processor can sit in low-frequency mode for as long as possible.
That turns out to be the best power-saving optimization ("race to idle").
And, counterintuitive as it is, it matters most for big data centers
(less power used => less heat waste => less cooling needed)


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27 17:36   ` Jeffrey R. Carter
@ 2016-01-04 14:44     ` Alejandro R. Mosteo
  0 siblings, 0 replies; 40+ messages in thread
From: Alejandro R. Mosteo @ 2016-01-04 14:44 UTC (permalink / raw)


On 27/12/15 18:36, Jeffrey R. Carter wrote:
> On 12/27/2015 01:46 AM, Dmitry A. Kazakov wrote:
>>
>> The idea of a memory-mapped object-oriented system is nothing new. On the
>> contrary, it is more than 20 years old.
>
> Well, as Rosen pointed out, if you accept virtual memory as memory, then it's
> even older than that.
>
>> No, it will be equivalent to a container library.
>
> Yes, thinking more about the idea, when current S/W writes a file, it often has
> no idea how big that file will be until it's finished. The equivalent would seem
> to be a memory-mapped unbounded container that persists, with a name, after the
> program ends.

I'd say that JSON-persisted data structures are essentially this?

> Another thought I've had is the need to wipe non-mapped objects at program
> termination for security.

Sure. And we'll have problems with the fact that rebooting will not be 
tabula rasa ;) New jokes ahead?


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-29 17:53                         ` G.B.
  2015-12-29 18:09                           ` G.B.
  2015-12-29 22:05                           ` Randy Brukardt
@ 2016-01-04 14:51                           ` Alejandro R. Mosteo
  2 siblings, 0 replies; 40+ messages in thread
From: Alejandro R. Mosteo @ 2016-01-04 14:51 UTC (permalink / raw)


On 29/12/15 18:53, G.B. wrote:
> On 29.12.15 18:36, Dmitry A. Kazakov wrote:
>> On 2015-12-29 17:57, G.B. wrote:
>>
>>> The point is that, since order does not matter among these parallel
>>> computations, and since parallel loops are not just fantasy,
>>> the possibility of postponement follows from parallel loops.
>>
>> Ada does not have parallel loops.
>
> Ada is about to have parallel loops. Ada compilers already produce
> code for parallel execution of loop bodies on today's processors.

In this regard, just the other day I discovered Matlab's parfor, which 
as expected just parallelizes what would be a plain loop. It even emits 
some warnings about how the input data should be structured/used to 
minimize data transmission to the workers (which may be remote). I guess 
it would be interesting to look at how this and other languages do it.

-Álex.

>
>
>>> Consider F1 and F2 from a Pure package, X a variable of type T
>>> and T without progenitors:
>>>
>>>     declare
>>>        A : T := F1 (1,2,3);
>>>        B : T := F2 (X);
>>>     begin
>>>        G (A, B);
>>>     end;
>>>
>>> Is it impossible for a compiler to see that the initializing parts of
>>> the declarations of A and B are independent?
>>
>> No, because initialization of T may have side effects.
>
> I assume you meant Yes, impossible. But,
> if F1 and F2 are from a Pure package, is the compiler allowed to
> ignore side effects because the programmer specified Pure? I think
> that follows.
>
>> BTW, why do you believe that the same amount of computation performed
>> consecutively should require less energy or (not quite same) less
>> battery drain?
>
> Less energy at a certain point in time. Postpone calling sub if, say,
> a photocell should first be given time to recharge a battery while
> while regular operation continues, then call sub.
>
>


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  0:37 Everything You Know Is Wrong Jeffrey R. Carter
                   ` (4 preceding siblings ...)
  2015-12-29 23:37 ` darkestkhan
@ 2016-01-05 13:52 ` brbarkstrom
  2016-01-10 14:46 ` Michael Erdmann
                   ` (2 subsequent siblings)
  8 siblings, 0 replies; 40+ messages in thread
From: brbarkstrom @ 2016-01-05 13:52 UTC (permalink / raw)


There are three articles in the new Comm. ACM that pick up on this theme:

Greengard, S.: Better Memory, pp. 23-25

Nanavati, M., Schwarzkopf, M., and Warfield, A.: Non-Volatile Storage

Helland, P.: Immutability Changes Everything

The first two of these are on NVRAM and architectural changes that are similar
to the comments in this thread.  The last one deals more with database design
when you don't have to worry so much about versions because you can just store
everything.

This thread might also want to pick up on the security implications of using
block ciphers instead of just the usual encryption.

Bruce B.


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  0:37 Everything You Know Is Wrong Jeffrey R. Carter
                   ` (5 preceding siblings ...)
  2016-01-05 13:52 ` brbarkstrom
@ 2016-01-10 14:46 ` Michael Erdmann
  2016-02-29 12:14 ` Jacob Sparre Andersen
  2016-02-29 12:27 ` Jacob Sparre Andersen
  8 siblings, 0 replies; 40+ messages in thread
From: Michael Erdmann @ 2016-01-10 14:46 UTC (permalink / raw)


I think you forget the changes in the development ecosystem :-) 
"Some language" should reflect this as well. 

/Michael


BTW: All development organisations are capable of creating code with error 1202. 


On 27 Dec 2015 01:37 AM, "Jeffrey R. Carter" <spam.jrcarter.not@spam.not.acm.org> wrote:
> When I started out in S/W development, I learned some rules, such as, "Integer
> math is much faster than floating point," and, "Memory is scarce."
> 
> In the 90s, processors began to typically have an FPU, and floating-point math
> became as fast as integer, and in some cases, since it could proceed in parallel
> with the CPU, faster.
> 
> When computers began to commonly have RAM in quantities measured in GB rather
> than KB or MB, memory ceased to be scarce, and things that were previously
> laughable, such as
> 
> type Int_Set is array (Integer) of Boolean;
> for Int_Set'Component_Size use Boolean'Size;
> 
> became possible (for a 32-bit Integer, Int_Set'Size would be 512 MB). What I
> knew is wrong.
> 
> Today we learn that memory is much faster than persistent storage. That may soon
> be wrong, too. I've been reading about non-volatile memory research, and it
> seems that in a few years NV RAM will be available as fast as current RAM and as
> persistent and durable as current disks.
> 
> This will no doubt revolutionize computer architectures and programming
> languages. Instead of computers with distinct memory and storage, there will
> probably be computers with lots of NV RAM (1-10 TB?) but no disk.
> 
> People will no doubt still want a hierarchical naming system for data stored in
> that memory, but presumably S/W will map variables onto these "files". So
> instead of the current "open, loop over read/modify/write, close" paradigm, we
> might have something like
> 
> type R is record ...
> 
> type L is array (Positive range <>) of R;
> 
> F: mapped L with File_Name => "name";
> 
> All_Records : for I in F'range loop -- or "of F"
> 
> where the bounds of F will be determined from "name". A mechanism will be needed
> for collections of heterogenous data as well. F would be equivalent to a
> Direct_IO file with in-out mode.
> 
> I would think that the Ada 2X project should be thinking about these things, and
> wonder what others here think about them.
> 
> -- 
> Jeff Carter
> "He that hath no beard is less than a man."
> Much Ado About Nothing
> 132


-- 
Posted by Mimo Usenet Browser v0.2.5
http://www.mimousenet.com/mimo/post



^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  0:37 Everything You Know Is Wrong Jeffrey R. Carter
                   ` (6 preceding siblings ...)
  2016-01-10 14:46 ` Michael Erdmann
@ 2016-02-29 12:14 ` Jacob Sparre Andersen
  2016-03-02 14:11   ` vincent.diemunsch
  2016-02-29 12:27 ` Jacob Sparre Andersen
  8 siblings, 1 reply; 40+ messages in thread
From: Jacob Sparre Andersen @ 2016-02-29 12:14 UTC (permalink / raw)


Jeffrey R. Carter wrote:

> When I started out in S/W development, I learned some rules, such as,
> "Integer math is much faster than floating point," and, "Memory is
> scarce."

Some of the rules which (apparently) aren't wrong yet:

   http://www.cse.msu.edu/~cse320/Documents/FloatingPoint.pdf

Greetings,

Jacob
-- 
Photo of the day:
                    http://billeder.sparre-andersen.dk/dagens/2016-02-26

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2015-12-27  0:37 Everything You Know Is Wrong Jeffrey R. Carter
                   ` (7 preceding siblings ...)
  2016-02-29 12:14 ` Jacob Sparre Andersen
@ 2016-02-29 12:27 ` Jacob Sparre Andersen
  8 siblings, 0 replies; 40+ messages in thread
From: Jacob Sparre Andersen @ 2016-02-29 12:27 UTC (permalink / raw)


Jeffrey R. Carter wrote:

> People will no doubt still want a hierarchical naming system for data
> stored in that memory, but presumably S/W will map variables onto
> these "files". So instead of the current "open, loop over
> read/modify/write, close" paradigm, we might have something like
>
> type R is record ...
>
> type L is array (Positive range <>) of R;
>
> F: mapped L with File_Name => "name";
>
> All_Records : for I in F'range loop -- or "of F"
>
> where the bounds of F will be determined from "name". A mechanism will
> be needed for collections of heterogenous data as well. F would be
> equivalent to a Direct_IO file with in-out mode.
>
> I would think that the Ada 2X project should be thinking about these
> things, and wonder what others here think about them.

I think it looks somewhat similar to some of my experiments on
simplified persistence in Ada:

   http://www.jacob-sparre.dk/persistence/ae2010-slides.pdf
   http://www.jacob-sparre.dk/programming/persistent_containers-2015-paper.pdf

Considering the difficulties of implementing persistence in a simple way
with a library, I think it is worthwhile experimenting with something
like this - and considering it for the next major revision of Ada.

Greetings,

Jacob
-- 
"Get rid of the mess with the species"
 Ken Haste Andersen


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2016-02-29 12:14 ` Jacob Sparre Andersen
@ 2016-03-02 14:11   ` vincent.diemunsch
  2016-03-02 14:23     ` J-P. Rosen
  2016-03-02 14:32     ` Dmitry A. Kazakov
  0 siblings, 2 replies; 40+ messages in thread
From: vincent.diemunsch @ 2016-03-02 14:11 UTC (permalink / raw)


On Monday, 29 February 2016 13:14:09 UTC+1, Jacob Sparre Andersen wrote:

> Some of the rules which (apparently) aren't wrong yet:
> 
>    http://www.cse.msu.edu/~cse320/Documents/FloatingPoint.pdf
> 
> Greetings,
> 
> Jacob

Very interesting!
Here is the point about Ada:

Remarkably enough, some languages don't clearly specify that if x is a
floating-point variable (with say a value of 3.0/10.0), then every occurrence
of (say) 10.0*x must have the same value. For example Ada, which is based
on Brown's model, seems to imply that floating-point arithmetic only has to
satisfy Brown's axioms, and thus expressions can have one of many possible
values. Thinking about floating-point in this fuzzy way stands in sharp
contrast to the IEEE model, where the result of each floating-point operation is
precisely defined. In the IEEE model, we can prove that (3.0/10.0)*10.0
evaluates to 3 (Theorem 7). In Brown's model, we cannot.

Maybe this could be a topic for a new revision of Ada?

Kind regards,

Vincent


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2016-03-02 14:11   ` vincent.diemunsch
@ 2016-03-02 14:23     ` J-P. Rosen
  2016-03-02 15:44       ` Bob Brown
  2016-03-02 14:32     ` Dmitry A. Kazakov
  1 sibling, 1 reply; 40+ messages in thread
From: J-P. Rosen @ 2016-03-02 14:23 UTC (permalink / raw)


On 02/03/2016 15:11, vincent.diemunsch@gmail.com wrote:
> Here is the point about Ada : 
> 
> Remarkably enough, some languages don't clearly specify that if x is a
> floating-point variable (with say a value of 3.0/10.0), then every occurrence
> of (say) 10.0*x must have the same value. For example Ada, which is based
> on Brown's model, seems to imply that floating-point arithmetic only has to
> satisfy Brown's axioms, and thus expressions can have one of many possible
> values. Thinking about floating-point in this fuzzy way stands in sharp
> contrast to the IEEE model, where the result of each floating-point operation is
> precisely defined. In the IEEE model, we can prove that (3.0/10.0)*10.0
> evaluates to 3 (Theorem 7). In Brown's model, we cannot.
> 
> Maybe this could be a topic for a new revision of Ada ?

This was a deliberate decision, to make Ada compatible with various
floating point models. Many (most? all?) number crunching machines do
not have IEEE arithmetic, and there have been famous papers claiming that
Java requiring IEEE arithmetic was a huge mistake.

-- 
J-P. Rosen
Adalog
2 rue du Docteur Lombard, 92441 Issy-les-Moulineaux CEDEX
Tel: +33 1 45 29 21 52, Fax: +33 1 45 29 25 00
http://www.adalog.fr

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2016-03-02 14:11   ` vincent.diemunsch
  2016-03-02 14:23     ` J-P. Rosen
@ 2016-03-02 14:32     ` Dmitry A. Kazakov
  2016-03-02 15:31       ` vincent.diemunsch
  1 sibling, 1 reply; 40+ messages in thread
From: Dmitry A. Kazakov @ 2016-03-02 14:32 UTC (permalink / raw)


On 02/03/2016 15:11, vincent.diemunsch@gmail.com wrote:
> On Monday, 29 February 2016 at 13:14:09 UTC+1, Jacob Sparre Andersen wrote:
>
>> Some of the rules which (apparently) aren't wrong yet:
>>
>>     http://www.cse.msu.edu/~cse320/Documents/FloatingPoint.pdf
>>
>> Greetings,
>>
>> Jacob
>
> Very interesting !
> Here is the point about Ada :
>
> Remarkably enough, some languages don't clearly specify that if x is a
> floating-point variable (with say a value of 3.0/10.0), then every occurrence
> of (say) 10.0*x must have the same value. For example Ada, which is based
> on Brown's model, seems to imply that floating-point arithmetic only has to
> satisfy Brown's axioms, and thus expressions can have one of many possible
> values. Thinking about floating-point in this fuzzy way stands in sharp
> contrast to the IEEE model, where the result of each floating-point operation is
> precisely defined. In the IEEE model, we can prove that (3.0/10.0)*10.0
> evaluates to 3 (Theorem 7). In Brown's model, we cannot.
>
> Maybe this could be a topic for a new revision of Ada ?

No.

The IEEE model is wrong, and the Ada 83 model is right, being closer to
interval computations.

BTW, it is advisable to always turn the IEEE semantics off, e.g. not to
use Float but

    subtype Proper_Float is Float range Float'Range;
    --  The range constraint excludes the IEEE non-numeric values
    --  (infinities, NaN), so computations that would produce them
    --  raise Constraint_Error instead of propagating them silently.

instead. This saves a lot of debugging later.
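
A minimal sketch of the difference (my own demo; it assumes IEEE
hardware, a compiler such as GNAT where Float'Machine_Overflows is
False, and constraint checks enabled):

    with Ada.Text_IO; use Ada.Text_IO;

    procedure Overflow_Demo is
       subtype Proper_Float is Float range Float'Range;
       Plain   : Float        := Float'Last;
       Checked : Proper_Float := Proper_Float'Last;
    begin
       Plain := Plain * 2.0;  --  no exception: quietly becomes +infinity
       if Plain > Float'Last then
          Put_Line ("plain Float silently overflowed to +infinity");
       end if;
       Checked := Checked * 2.0;  --  range check rejects the infinity
       Put_Line ("no exception - Machine_Overflows must be True here");
    exception
       when Constraint_Error =>
          Put_Line ("Proper_Float raised Constraint_Error at once");
    end Overflow_Demo;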

 From the computational, rounding-error-control, engineering, etc. points
of view, there is no reason to expect that (x/y)*y = x should hold for
floating-point numbers. They are not reals and cannot be reals. The
model must sacrifice some less important properties, like (x/y)*y = x
(which does not hold for measured values anyway), for more important
ones: control of rounding errors, performance, optimization.

-- 
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2016-03-02 14:32     ` Dmitry A. Kazakov
@ 2016-03-02 15:31       ` vincent.diemunsch
  0 siblings, 0 replies; 40+ messages in thread
From: vincent.diemunsch @ 2016-03-02 15:31 UTC (permalink / raw)


On Wednesday, 2 March 2016 at 15:33:18 UTC+1, Dmitry A. Kazakov wrote:
> 
> IEEE model is wrong and Ada 83 model is right being closer to the interval computations.

Could you elaborate, or give me a link to some papers?

> 
> BTW, it is advisable to always turn the IEEE semantics off. E.g. not to 
> use Float but
> 
>     subtype Proper_Float is Float range Float'Range;
> 
> instead. This saves a lot of debugging later.

Why are the IEEE semantics involved at all, if the Ada model doesn't follow the IEEE model?
And what is the difference between Proper_Float and plain Float?

> 
>  From the computational, rounding error control, engineering, etc POVs 
> there is no reason to suggest that (x/y)*y=x should hold for 
> floating-point numbers. They are not reals and cannot be reals. The 
> model must sacrifice some less important properties like (x/y)*y=x 
> (which is wrong for any measured values anyway) for more important ones 
> like rounding errors control, performance, optimization.

I can accept this. The crucial point for me is to be able to prove an upper bound on the rounding errors.
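
As far as I can see, Ada already exposes the constants such a proof
would need, whatever the hardware; a small sketch (the values in the
comments assume a typical 32-bit Float):

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Model_Bounds is
   begin
      --  Maximal relative spacing of model numbers; 2.0**(-23),
      --  about 1.2E-07, for IEEE single precision:
      Put_Line (Float'Image (Float'Model_Epsilon));
      --  Guaranteed mantissa bits in the model (24 for IEEE single):
      Put_Line (Integer'Image (Float'Model_Mantissa));
   end Model_Bounds;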

Regards,

Vincent

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2016-03-02 14:23     ` J-P. Rosen
@ 2016-03-02 15:44       ` Bob Brown
  2016-03-02 16:30         ` J-P. Rosen
  0 siblings, 1 reply; 40+ messages in thread
From: Bob Brown @ 2016-03-02 15:44 UTC (permalink / raw)


On 2016-03-02, J-P. Rosen <rosen@adalog.fr> wrote:
> On 02/03/2016 15:11, vincent.diemunsch@gmail.com wrote:
>> Here is the point about Ada : 
>> 
>> Remarkably enough, some languages don't clearly specify that if x is a
>> floating-point variable (with say a value of 3.0/10.0), then every occurrence
>> of (say) 10.0*x must have the same value. For example Ada, which is based
>> on Brown's model, seems to imply that floating-point arithmetic only has to
>> satisfy Brown's axioms, and thus expressions can have one of many possible
>> values. Thinking about floating-point in this fuzzy way stands in sharp
>> contrast to the IEEE model, where the result of each floating-point operation is
>> precisely defined. In the IEEE model, we can prove that (3.0/10.0)*10.0
>> evaluates to 3 (Theorem 7). In Brown's model, we cannot.
>> 
>> Maybe this could be a topic for a new revision of Ada ?
>
> This was a deliberate decision, to make Ada compatible with various
> floating point models. Many (most? all?) number crunching machines do
> not have IEEE arithmetic, 

Really? Which one(s) are you talking about? I didn't even know there were
any number crunchers aside from the sad ones made from a kid's pool full
of Intel chips stuck together with tape and glue. All the real stuff like
CDC/Cray has been gone for ages.

> and there have been famous papers claiming that Java requiring IEEE
> arithmetic was a huge mistake. 

I would be interested in seeing the references if you have them at hand.

Thank you,

Bob

^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2016-03-02 15:44       ` Bob Brown
@ 2016-03-02 16:30         ` J-P. Rosen
  2016-03-02 16:36           ` Bob Brown
  0 siblings, 1 reply; 40+ messages in thread
From: J-P. Rosen @ 2016-03-02 16:30 UTC (permalink / raw)


On 02/03/2016 16:44, Bob Brown wrote:
>> This was a deliberate decision, to make Ada compatible with various
>> floating point models. Many (most? all?) number crunching machines do
>> not have IEEE arithmetic, 
> 
> Really? Which one(s) are you talking about? I didn't even know there were
> any number crunchers aside from the sad ones made from a kid's pool full
> of Intel chips stuck together with tape and glue. All the real stuff like
> CDC/Cray has been gone for ages.
Well, true enough, I don't know about recent machines. Even Silicon
Graphics has disappeared (not so long ago), and AFAIK they were not IEEE.

>> and there have been famous papers claiming that Java requiring IEEE
>> arithmetic was a huge mistake. 
> 
> I would be interested in seeing the references if you have them at hand.
> 
https://www.cs.berkeley.edu/~wkahan/JAVAhurt.pdf




-- 
J-P. Rosen
Adalog
2 rue du Docteur Lombard, 92441 Issy-les-Moulineaux CEDEX
Tel: +33 1 45 29 21 52, Fax: +33 1 45 29 25 00
http://www.adalog.fr


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2016-03-02 16:30         ` J-P. Rosen
@ 2016-03-02 16:36           ` Bob Brown
  2016-03-02 16:52             ` Bob Brown
  0 siblings, 1 reply; 40+ messages in thread
From: Bob Brown @ 2016-03-02 16:36 UTC (permalink / raw)


On 2016-03-02, J-P. Rosen <rosen@adalog.fr> wrote:
> On 02/03/2016 16:44, Bob Brown wrote:
>>> This was a deliberate decision, to make Ada compatible with various
>>> floating point models. Many (most? all?) number crunching machines do
>>> not have IEEE arithmetic, 
>> 
>> Really? Which one(s) are you talking about? I didn't even know there were
>> any number crunchers aside from the sad ones made from a kid's pool full
>> of Intel chips stuck together with tape and glue. All the real stuff like
>> CDC/Cray has been gone for ages.
> Well, true enough, I don't know for recent machines. Even Silicon
> Graphics has disappeared (not so long ago), and AFAIK, they were not IEEE.
>
>>> and there have been famous papers claiming that Java requiring IEEE
>>> arithmetic was a huge mistake. 
>> 
>> I would be interested in seeing the references if you have them at hand.
>> 
> https://www.cs.berkeley.edu/~wkahan/JAVAhurt.pdf

Thanks, I'll check that one out. I'm a little surprised, even before
reading it: I thought Kahan was the main reason IEEE math exists at all.

Bob


^ permalink raw reply	[flat|nested] 40+ messages in thread

* Re: Everything You Know Is Wrong
  2016-03-02 16:36           ` Bob Brown
@ 2016-03-02 16:52             ` Bob Brown
  0 siblings, 0 replies; 40+ messages in thread
From: Bob Brown @ 2016-03-02 16:52 UTC (permalink / raw)


On 2016-03-02, Bob Brown <bob@justplainbob.com> wrote:
> On 2016-03-02, J-P. Rosen <rosen@adalog.fr> wrote:
>> On 02/03/2016 16:44, Bob Brown wrote:
>>>> This was a deliberate decision, to make Ada compatible with various
>>>> floating point models. Many (most? all?) number crunching machines do
>>>> not have IEEE arithmetic, 
>>> 
>>> Really? Which one(s) are you talking about? I didn't even know there were
>>> any number crunchers aside from the sad ones made from a kid's pool full
>>> of Intel chips stuck together with tape and glue. All the real stuff like
>>> CDC/Cray has been gone for ages.
>> Well, true enough, I don't know for recent machines. Even Silicon
>> Graphics has disappeared (not so long ago), and AFAIK, they were not IEEE.
>>
>>>> and there have been famous papers claiming that Java requiring IEEE
>>>> arithmetic was a huge mistake. 
>>> 
>>> I would be interested in seeing the references if you have them at hand.
>>> 
>> https://www.cs.berkeley.edu/~wkahan/JAVAhurt.pdf

OK, skimming this quickly, it does not seem to me that the issue was Java
requiring IEEE arithmetic. The issue was that Java's implementation of
IEEE arithmetic is seriously broken and error-prone. It did not support
the highest-precision type, even though that type is in virtually all
desktop hardware. And it didn't support the various signalling and trap
mechanisms required (and provided by other languages) to protect against
serious errors in computations.

Furthermore (and this is well known, but touched on only briefly by
Kahan), the lack of integer math support in Java (and other very popular
scripting languages like JavaScript) is a terribly wrong-headed notion
that causes no end of problems when floating point is misapplied where
integer math should have been used instead.

What's really scary is reading claims by various knuckleheads that it's
OK to do financial and accounting calculations with floating point...
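
The classic counterexample is easy to write down in Ada, where a decimal
fixed-point type does the job properly (my sketch; it assumes a compiler
that supports decimal fixed point, as GNAT does):

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Money_Demo is
      type Cents is delta 0.01 digits 12;  --  exact decimal fixed point
      F : Float := 0.0;
      C : Cents := 0.0;
   begin
      for I in 1 .. 1_000 loop  --  deposit one cent a thousand times
         F := F + 0.01;         --  0.01 has no exact binary encoding
         C := C + 0.01;         --  exact: Cents counts whole hundredths
      end loop;
      Put_Line (Float'Image (F));  --  typically slightly off 10.0
      Put_Line (Cents'Image (C));  --  exactly 10.00
   end Money_Demo;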

Bob


^ permalink raw reply	[flat|nested] 40+ messages in thread

end of thread, other threads:[~2016-03-02 16:52 UTC | newest]

Thread overview: 40+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2015-12-27  0:37 Everything You Know Is Wrong Jeffrey R. Carter
2015-12-27  7:55 ` J-P. Rosen
2015-12-27 17:37   ` Jeffrey R. Carter
2015-12-27  8:46 ` Dmitry A. Kazakov
2015-12-27 17:36   ` Jeffrey R. Carter
2016-01-04 14:44     ` Alejandro R. Mosteo
2015-12-28  9:57 ` Georg Bauhaus
2015-12-28 11:19   ` Dmitry A. Kazakov
2015-12-28 16:27     ` Nicholas Collin Paul de Gloucester
2015-12-28 17:30       ` Dmitry A. Kazakov
2015-12-28 18:50         ` Nicholas Collin Paul de Gloucester
2015-12-28 20:40           ` Dmitry A. Kazakov
2015-12-29 11:42             ` G.B.
2015-12-29 12:36               ` Dmitry A. Kazakov
2015-12-29 13:50                 ` G.B.
2015-12-29 14:06                   ` J-P. Rosen
2015-12-29 14:16                   ` Dmitry A. Kazakov
2015-12-29 16:31                     ` Dennis Lee Bieber
2015-12-29 17:02                       ` G.B.
2015-12-29 16:57                     ` G.B.
2015-12-29 17:36                       ` Dmitry A. Kazakov
2015-12-29 17:53                         ` G.B.
2015-12-29 18:09                           ` G.B.
2015-12-29 22:05                           ` Randy Brukardt
2016-01-04 14:51                           ` Alejandro R. Mosteo
2015-12-29 21:58                         ` Randy Brukardt
2015-12-28 17:19 ` Nicholas Collin Paul de Gloucester
2015-12-29 23:37 ` darkestkhan
2016-01-05 13:52 ` brbarkstrom
2016-01-10 14:46 ` Michael Erdmann
2016-02-29 12:14 ` Jacob Sparre Andersen
2016-03-02 14:11   ` vincent.diemunsch
2016-03-02 14:23     ` J-P. Rosen
2016-03-02 15:44       ` Bob Brown
2016-03-02 16:30         ` J-P. Rosen
2016-03-02 16:36           ` Bob Brown
2016-03-02 16:52             ` Bob Brown
2016-03-02 14:32     ` Dmitry A. Kazakov
2016-03-02 15:31       ` vincent.diemunsch
2016-02-29 12:27 ` Jacob Sparre Andersen

This is a public inbox, see mirroring instructions
for how to clone and mirror all data and code used for this inbox