From: "Randy Brukardt" <randy@rrsoftware.com>
Subject: Re: generating and compiling a very large file
Date: Mon, 4 Jun 2018 16:56:09 -0500
Message-ID: <pf4chp$46s$1@franka.jacob-sparre.dk>
In-Reply-To: <e1aaa443-00b7-4217-8fe8-dfc9098e247b@googlegroups.com>
"Shark8" <onewingedshark@gmail.com> wrote in message
news:e1aaa443-00b7-4217-8fe8-dfc9098e247b@googlegroups.com...
On Sunday, June 3, 2018 at 1:14:40 PM UTC-6, Stephen Leake wrote:
>> I'm working on a parser generator. One of the files generated is very
>> large;
>
>The idea of a large aggregate is good,...
I can't speak to GNAT specifically, but in the case of Janus/Ada, the best
solution would be to use a number of medium-size aggregates. The usual
performance problem in Janus/Ada is the optimizer, and very large aggregates
get really slow as the code to construct them gets rearranged by the
optimizer. OTOH, lots of tiny subprogram calls can also get rearranged by
the optimizer.
So I'd suggest (a) turning off optimization to see if that alone is the
culprit, and (b) making fewer calls, each with a medium-size aggregate - one
for each state, perhaps?
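A sketch of what that might look like - all type and state names here are
hypothetical, not from any particular generator's output:

```ada
--  Sketch only: names are made up for illustration.
package Parse_Tables is
   type Action_Kind is (Shift, Reduce, Error);
   type Action is record
      Kind  : Action_Kind := Error;
      Index : Natural     := 0;
   end record;
   type Action_Row is array (1 .. 8) of Action;

   --  One medium-size aggregate per state, rather than one huge
   --  aggregate for the whole table or one call per table entry:
   State_1 : constant Action_Row :=
     (1 => (Shift, 4), 2 => (Reduce, 2), others => <>);
   State_2 : constant Action_Row :=
     (3 => (Shift, 7), others => <>);
end Parse_Tables;
```

Each aggregate is big enough to amortize call overhead but small enough
that the optimizer isn't chewing on one enormous expression.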
BTW, what Janus/Ada does for its own LALR parse tables is have a program to
stream out a binary representation of them to a file ("Janus1.Ovl"), the
compiler then streams that in to start. (It's a pure binary read/write -
these days, I'd use Stream_IO to do it, avoiding stream attributes as they
can easily drop to component-by-component I/O.) A text file would be quite a
bit slower, since Text_IO requires a lot more processing than just pure
binary I/O.
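A rough sketch of that approach (the Table type and file name are made up
for illustration): overlay the whole object with a Stream_Element_Array so
it moves in a single block read/write through Stream_IO, with no
per-component streaming involved:

```ada
with Ada.Streams;           use Ada.Streams;
with Ada.Streams.Stream_IO;

procedure Dump_Table is
   package SIO renames Ada.Streams.Stream_IO;

   --  Hypothetical table type; the real one would come from the
   --  parser generator.
   type Table is array (1 .. 1_000) of Integer;
   T : Table := (others => 7);

   --  Overlay the table with a raw byte array so the whole object
   --  is written and read in one block.
   Bytes : Stream_Element_Array (1 .. Table'Size / Stream_Element'Size)
     with Address => T'Address, Import;

   F    : SIO.File_Type;
   Last : Stream_Element_Offset;
begin
   SIO.Create (F, SIO.Out_File, "table.bin");
   SIO.Write (F, Bytes);
   SIO.Close (F);

   T := (others => 0);
   SIO.Open (F, SIO.In_File, "table.bin");
   SIO.Read (F, Bytes, Last);
   SIO.Close (F);
end Dump_Table;
```

Note the usual caveat with raw binary dumps: the file is only readable by a
program built with the same representation of Table (same compiler, same
target), which is exactly the situation when a generator and its consumer
are built together.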
Randy.
Thread overview: 5+ messages
2018-06-03 19:14 generating and compiling a very large file Stephen Leake
2018-06-03 19:48 ` Dmitry A. Kazakov
2018-06-03 21:43 ` Shark8
2018-06-04 21:56 ` Randy Brukardt [this message]
2018-07-13 7:04 ` Stephen Leake