Minus the fuzz: A multitape Turing machine running in time <i>t</i> can be simulated using <i>O(sqrt(t log t))</i> space (and typically more than <i>t</i> time).<p><a href="https://arxiv.org/abs/2502.17779" rel="nofollow">https://arxiv.org/abs/2502.17779</a>
From the “Camel Book”, one of my favorite programming books (not because it was enlightening, but because it was entertaining); on the Perl philosophy:<p>“If you’re running out of memory, you can buy more. But if you’re running out of time, you’re screwed.”
Lookup tables with precalculated things for the win!<p>In fact I don’t think we would need processors anymore if we were centrally storing all of the operations ever done in our processors.<p>Now fast retrieval is another problem for another thread.
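A tiny sketch in the spirit of this comment (the popcount table is my example, not the commenter's): spend a little memory once, then answer every query with table lookups instead of recomputation.<p><pre><code>
# Precompute the number of set bits for every possible byte, once.
POPCOUNT = [bin(b).count("1") for b in range(256)]

def popcount32(x: int) -> int:
    """Population count of a 32-bit value via four table lookups."""
    return sum(POPCOUNT[byte] for byte in x.to_bytes(4, "little"))

print(popcount32(0b1011_0001_0000_1111))  # 8
</code></pre>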
I think it is very intuitive that more space beats the pants off of more time.<p>In time O(n) you can use O(n) cells on a tape, but there are O(2^n) possible configurations of symbols on a tape of length n (for an alphabet with 2 symbols), so you can do so much more with n space than with n time.
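Making the counting explicit (a sketch of the arithmetic behind the comment, with the binary alphabet it assumes):<p>\[
\underbrace{2^{\,n}}_{\text{distinct contents of an $n$-cell binary tape}}
\quad\text{vs.}\quad
\underbrace{n+1}_{\text{cells the head can even touch in $n$ steps}}
\]<p>So an n-step computation can only ever explore a vanishing fraction of what an n-cell tape can hold.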
Interesting. It's one of those things that I’ve always “just assumed,” without thinking about it.<p>I did a lot of raster graphics programming, in my career, and graphics work makes <i>heavy</i> use of lookup tables.<p>Yesterday, I posted a rather simple tool I wrote[0]: a server that “frontloads” a set of polygons into a database, and then uses them, at query time. It’s fairly fast (but I’m sure it could be a lot faster). I wrote it in a few hours, and got pretty good performance, right out of the starting gate.<p>Pretty basic stuff. I doubt the pattern is unique, but it’s one that I’ve used for ages. It’s where I try to do as much “upfront” work as possible, and store “half-baked” results into memory.<p>Like I said, I always “just worked that way,” and never really thought about it. There’s been a lot of “rule of thumb” stuff in my work. Didn’t have an MIT professor to teach it, but it’s sort of gratifying to see that it wasn’t just “old wives” stuff.<p>There’s probably a ton of stuff that we do, every day, that we don’t think about. Some of it almost certainly originated from really smart folks like him, finding the best way (like the “winding number” algorithm, in that server[1]), and some of it also probably comes from “grug-brained programmers,” simply doing what makes sense.<p>[0] <a href="https://news.ycombinator.com/item?id=44046227">https://news.ycombinator.com/item?id=44046227</a><p>[1] <a href="https://github.com/LittleGreenViper/LGV_TZ_Lookup/blob/e247f2f97fb83e5392d31f81c321814baa26dc1b/src/Sources/LGV_TZ_Lookup_Query.class.php#L101">https://github.com/LittleGreenViper/LGV_TZ_Lookup/blob/e247f...</a>
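For readers who haven't seen it, here is a minimal sketch of a winding-number point-in-polygon test (my own Python rendering of the general idea, not the linked PHP): the winding number is nonzero exactly when the point lies inside the polygon.<p><pre><code>
def is_left(p0, p1, p2):
    """Positive if p2 is left of the directed line p0 -> p1, negative if right, zero if on it."""
    return (p1[0] - p0[0]) * (p2[1] - p0[1]) - (p2[0] - p0[0]) * (p1[1] - p0[1])

def winding_number(point, polygon):
    """Winding number of `polygon` (a list of (x, y) vertices) around `point`."""
    wn = 0
    for i in range(len(polygon)):
        v0, v1 = polygon[i], polygon[(i + 1) % len(polygon)]
        if v0[1] <= point[1]:
            if v1[1] > point[1] and is_left(v0, v1, point) > 0:
                wn += 1   # upward crossing with the point strictly to the left
        elif v1[1] <= point[1] and is_left(v0, v1, point) < 0:
            wn -= 1       # downward crossing with the point strictly to the right
    return wn

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(winding_number((2, 2), square) != 0)  # True: inside
print(winding_number((5, 5), square) != 0)  # False: outside
</code></pre>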
I always thought that the "reverse" relationship between time and space requirements is explained by a simple fact: when you have a set of algorithms with certain time and space requirements and you constrain one of those parameters, the other parameter will not necessarily be (and in practice often isn't) optimal. But it doesn't necessarily mean that the faster algorithm will require less memory, or vice versa.<p>This "reverse" relationship shows up with other parameters too, such as in sorting algorithms, where besides the time and space requirements you have a stability requirement. If you constrain sorting algorithms to be stable, you gain no performance advantage (and likely take some disadvantage) over unconstrained algorithms. And we see that the known stable sorting algorithms are either slower or require more memory. Nobody has yet invented a stable sorting algorithm with performance (both time and space) comparable to unstable sorting algorithms. That's it.
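For anyone who hasn't bumped into the stability requirement, a minimal sketch of what it means (the fruit data is made up):<p><pre><code>
# Records that share a key: a stable sort keeps their original relative order.
records = [("apple", 3), ("banana", 1), ("cherry", 3), ("date", 1)]

# Python's sorted() (Timsort) is stable, so within each count the fruits
# stay in the order they appeared above.
print(sorted(records, key=lambda r: r[1]))
# [('banana', 1), ('date', 1), ('apple', 3), ('cherry', 3)]
</code></pre><p>Timsort pays for that guarantee with O(n) auxiliary space, which is exactly the kind of trade-off described above.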
Ryan Williams’ lecture (on how he got started): <a href="https://www.youtube.com/live/0DrFB2Cp7tg" rel="nofollow">https://www.youtube.com/live/0DrFB2Cp7tg</a><p>And the paper:
<a href="https://people.csail.mit.edu/rrw/time-vs-space.pdf" rel="nofollow">https://people.csail.mit.edu/rrw/time-vs-space.pdf</a>
I am confused. If a single-tape Turing machine receives a number N in binary and is supposed to write N ones on the tape, to the right of the binary representation of N, it performs at least N steps.<p>If you expect N ones in the output, how can this machine be simulated in space smaller than N?<p>This machine must decrement the number N at the beginning of the tape and move to the end of the tape to write "1", so it runs in time O(N^2), not O(N)? (It takes N "trips" to the end of the tape, and the trips take 1, 2, 3, ..., N steps.)<p>Since Turing machines cannot jump to an arbitrary place on the tape in constant time (like computers can), does this have any impact on real computers?
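Spelling out the trip arithmetic behind that O(N^2) estimate (the i-th trip has to walk past the ones already written, so it costs on the order of i steps):<p>\[
\text{total steps} \;\approx\; \sum_{i=1}^{N} i \;=\; \frac{N(N+1)}{2} \;=\; \Theta(N^{2}).
\]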
As an aside, I really enjoy a lot of the Quanta Magazine articles! They manage to write pieces that appeal both to compsci people and to interested "outsiders". The bird's-eye view and the informal "how and why" they pack into their explanation style often give me new perspectives and things to think about!
The poetic style of Quanta makes it impossible to understand what this means. Can someone familiar with the topic explain whether this is applicable to real-world computers or just a theoretical scenario? Does this mean we need more memory in computers, even if they have to run at a lower clock speed?
At the risk of sounding ridiculous: can there be a notion of a "speed of light" in the theory of computation, determining the ultimate limit of memory (space) vs. runtime?
Give algorithm designers a little more memory, and they will find a way to shave off the time.<p>Give operating system development organizations a lot more memory and they will find a way to worsen the response time.
> Williams’ proof established a mathematical procedure for transforming any algorithm — no matter what it does — into a form that uses much less space.<p>Ok, but space is cheap, and we usually want to spend space to save processing time.<p>I.e., the opposite.
You can determine both the time and space complexity for any algorithm, should you need to. And in some cases you can adjust to have more of one or the other, depending on your particular constraints.<p>More at eleven.
“If the problem is solved next week, Williams will be kicking himself. Before he wrote the paper, he spent months trying and failing to extend his result.”<p>What a strange, sad way to think about this. Academia is perverse.