
Compile times, and why “the obvious” might not be so

4 points by parsecs, about 4 years ago

1 comment

ggm, about 4 years ago
Love this blog, love this article. The "has GC" thing and "is a VM" thing goes pretty quickly to "you used a REPL, didn't you?"

To compile times: a lot of code I use is ./configure. The autoconf/autogen thing looks to me to be a shell wrapper around basic CPP/CC/AS/LD operations (well, actually the ${CC} called to do the phases, mostly), and I *think* this gets me .o and related files with timestamps I can trust. But past experience was that it was inherently unsafe to modify a .h and then expect the dependency chain to be correct.

So I got into the habit of destroying the intermediate products, to be sure the dependency chains remade everything they needed. I think this kind of thing blows out costs.

Also, there are the kinds of make scenarios where "make all" does stuff, but "make install" actually does MORE, and I hate those. The delayed cost of making things "for real" is high, for a thing I thought I had already done.

My own code had/has a really tiny structure. I think if I had coded enough to need richness in libraries and modules, I'd be more conscious of the maintenance side of this, and of the burden of making Makefiles which did the right thing, with dependencies.

I remember older compilers (OSF/1?) which demanded you make it, run it for 30 seconds, then let the compiler use the runtime profile to decide which optimisations to re-apply when making it for real.
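The header-dependency problem ggm describes has a standard cure in modern C toolchains: have the compiler emit dependency files during compilation and include them in the Makefile, so touching a .h rebuilds exactly the .o files that use it, and nothing has to be destroyed "to be sure". A minimal sketch, assuming GNU Make and GCC or Clang; the source and target names are hypothetical:

```make
CC     := cc
CFLAGS := -O2 -MMD -MP   # -MMD writes foo.d listing the headers foo.c includes;
                         # -MP adds phony targets so a deleted header doesn't break make

SRCS := main.c util.c    # hypothetical sources
OBJS := $(SRCS:.c=.o)

app: $(OBJS)
	$(CC) -o $@ $(OBJS)

# Recipe lines must be indented with tabs.
%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@

# Pull in the generated .d files; the leading '-' silences the
# error on a clean first build, when they don't exist yet.
-include $(OBJS:.o=.d)

clean:
	rm -f app $(OBJS) $(OBJS:.o=.d)
```

With this in place, editing util.h rebuilds only the objects whose generated .d files name it, so the timestamps really can be trusted.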
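The compile-run-recompile workflow ggm remembers from OSF/1 survives today as profile-guided optimisation. A hedged sketch of the GCC variant (the two -fprofile flags are real GCC options; the file names are hypothetical):

```sh
cc -O2 -fprofile-generate -o app main.c   # build an instrumented binary
./app < representative-input.txt          # run it on typical work; writes profile data
cc -O2 -fprofile-use -o app main.c        # rebuild "for real", applying the profile
```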