comp.lang.ada
From: Niklas Holsti <niklas.holsti@tidorum.invalid>
Subject: Re: C time_t 2038 problem s-os_lib.ads
Date: Fri, 24 Sep 2021 12:44:38 +0300	[thread overview]
Message-ID: <ir5l07F5kirU1@mid.individual.net> (raw)
In-Reply-To: <3c0272f8-4117-46a4-9051-5419d1edfdc6n@googlegroups.com>

On 2021-09-24 12:32, Joakim Strandberg wrote:
>> In C and C++, int is required to be at least 16 bits (POSIX
>> requires 32), long is at least 32 bits, and long long is at least
>> 64 bits. On most 64-bit Linux-based systems, int is 32 bits, and
>> long and long long are both 64 bits. On 64-bit MS Windows, int and
>> long are both 32 bits, and long long is 64 bits. time_t is 64 bits
>> on almost all 64-bit systems. I've never seen a 128-bit time_t; 64
>> bits with 1-second resolution is good for several hundred billion
>> years.
> 
> Thanks for the summary of the different integer types on different
> platforms, Keith. When I wrote above I had simply done a quick Google
> search and found
> https://www.tutorialspoint.com/what-is-long-long-in-c-cplusplus where
> it said "On Linux environment the long takes 64-bit (8-bytes) of
> space, and the long long takes 128-bits (16-bytes) of space." I have
> never seen 128-bit integers either but have seen on the development
> log on AdaCore's website that support for 128-bit integers has been
> added to the Interfaces package (Interfaces.Integer_128 and
> Interfaces.Unsigned_128).


Good that they have been added.


> I believe they are part of the new Ada2022 standard.

I believe not. The draft Ada 2022 RM still does not require any specific 
integer widths in section B.2, "The Package Interfaces". As in earlier 
standards, it says:

"An implementation shall provide the following declarations in the 
visible part of package Interfaces: - Signed and modular integer types 
of n bits, if supported by the target architecture, for each n that is 
at least the size of a storage element and that is a factor of the word 
size. The names of these types are of the form Integer_n for the signed 
types, and Unsigned_n for the modular types"

The change by AdaCore probably reflects the fact that gcc now supports 
128-bit integers on common platforms.

Wikipedia has a summary: https://en.wikipedia.org/wiki/128-bit_computing.


Thread overview: 12+ messages
2021-09-23 10:42 C time_t 2038 problem s-os_lib.ads Kevin Chadwick
2021-09-23 14:26 ` Jeffrey R. Carter
2021-09-23 15:01   ` Kevin Chadwick
2021-09-23 15:08     ` Joakim Strandberg
2021-09-23 15:39       ` Kevin Chadwick
2021-09-23 15:57         ` Kevin Chadwick
2021-09-23 19:52       ` Keith Thompson
2021-09-24  9:32         ` Joakim Strandberg
2021-09-24  9:44           ` Niklas Holsti [this message]
2021-09-24 22:54           ` Keith Thompson
2021-09-25 10:22             ` G.B.
2021-09-25 11:23             ` Simon Wright
