- Without paywall: https://www.removepaywall.com/search?url=https://www.scienti... (by baruchel, 3 weeks ago)
- Link to paper: https://arxiv.org/abs/2502.17779 (by trueismywork, 2 weeks ago)
- Here's the gist: (by zerof1l, 2 weeks ago)
For nearly 50 years, theorists believed that if solving a problem takes t steps, it also needs roughly t bits of memory (more precisely, t/log(t)): a 100-step computation would need on the order of 100 bits.
Ryan Williams showed that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be simulated using something on the order of 10 bits (a rough numerical comparison follows below).
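To make the gap concrete, here's a quick back-of-the-envelope comparison in Python. Constants are ignored, and note that Williams' actual bound is O(sqrt(t log t)) rather than plain sqrt(t); this is just a sketch of the asymptotics, not anything from the paper:

```python
import math

# Back-of-the-envelope comparison of the two space bounds, constants ignored.
# Old bound (Hopcroft-Paul-Valiant, 1977): space ~ t / log t.
# New bound (Williams, 2025):              space ~ sqrt(t * log t).
for t in [10**3, 10**6, 10**9, 10**12]:
    old = t / math.log2(t)
    new = math.sqrt(t * math.log2(t))
    print(f"t = {t:.0e}: t/log t ~ {old:.3g} bits, sqrt(t log t) ~ {new:.3g} bits")
```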
- One of my favourite sci comms YouTubers explained this in great detail: https://youtu.be/8JuWdXrCmWg (by cyrillite, 2 weeks ago)
Highly recommend
- This seems very theoretical: just an upper bound on the space required, without saying anything about what is being computed. Does it have any import for real algorithms? (by kristianp, 2 weeks ago)
- I get that this is an interesting theoretical proof. I'm more interested in the inverse: trading memory for time, to make things faster even if they take more space. It seems to me the halting problem is almost a proof that the inverse of this result is impossible. (by mikewarot, 2 weeks ago)
Memoization is likely the best you can do, as in the sketch below.
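A minimal sketch of that memory-for-time trade, using Python's standard-library functools.lru_cache (fib is just an illustrative function, not anything from the paper):

```python
from functools import lru_cache

# Trading memory for time: caching results of a recursive computation
# turns exponential time into linear time, at the cost of O(n) stored values.
@lru_cache(maxsize=None)
def fib(n: int) -> int:
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(200))  # fast with the cache; astronomically slow without it
```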
- Related: (by JohnKemeny, 2 weeks ago)
For algorithms, a little memory outweighs a lot of time. 343 points, 139 comments, 39 days ago. https://news.ycombinator.com/item?id=44055347
You need much less memory than time. 126 points, 11 comments, 22 days ago. https://news.ycombinator.com/item?id=44212855
- Here's a quote from the SciAm article: "Technically, that equation was t/log(t), but for the numbers involved log(t) is typically negligibly small." (by bluenose69, 2 weeks ago)
Huh?
- One thing that's surprised me throughout my career is just how inefficient most of the code that I've inherited is. I suspect we could do a lot more with the hardware we have if we simply became better at programming. (by gwbas1c, 2 weeks ago)
(Perhaps more AI coaching could help?)
- I don't understand what they are saying at all. If I have a computation that requires going through a loop n times, why should the memory requirements increase with n at all? (by gweinberg, 2 weeks ago)
- > (Technically, that equation was t/log(t), but for the numbers involved log(t) is typically negligibly small.) (by snickerbockers, 2 weeks ago)
My dude, that is NOT how rational numbers work.
- [Made this a top-level comment as I don't see it mentioned in other comments:] (by utopcell, 2 weeks ago)
The result doesn't actually apply to _all_ algorithms, only oblivious ones. Oblivious algorithms are ones where step i does not depend on decisions made in steps j < i; almost all useful algorithms are non-oblivious, with some notable exceptions [1]. This work is a step towards proving the result for the general case, and has little practical use (and that's perfectly okay). A toy illustration of the distinction follows below.
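A toy illustration of the oblivious/non-oblivious distinction (function names are made up for this sketch, not taken from the paper): the first routine's memory accesses depend only on the input length, while the second decides its next access based on data it has already read:

```python
def oblivious_sum(xs):
    # Oblivious: the access pattern is always xs[0], xs[1], ..., xs[n-1],
    # regardless of the values stored in xs.
    total = 0
    for i in range(len(xs)):
        total += xs[i]
    return total

def non_oblivious_search(xs, target):
    # Non-oblivious: which index is probed next depends on values seen so far.
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(oblivious_sum([3, 1, 4, 1, 5]))            # 14
print(non_oblivious_search([1, 1, 3, 4, 5], 4))  # 3
```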
- The impossible chessboard problem must have something to do with the idea of solving tree evaluation with little memory (https://youtu.be/wTJI_WuZSwE?si=lgTc65RhXQesKchR)! When the chessboard is random, it feels impossible to add extra information. However, by choosing which cell to flip, and knowing that the sender followed a fixed procedure, you can encode the position information into the random data without needing more cells (a sketch of the trick follows below). (by Gehinnn, 2 weeks ago)
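A sketch of the standard XOR trick that the video builds up to, assuming a 64-square board and exactly one mandatory flip (variable names are illustrative): the encoded "message" is the XOR of the indices of all heads-up squares, and flipping the square at index (current_xor ^ target) makes that XOR equal any desired target. Flipping square 0 covers the case where the XOR is already correct, since XOR-ing in 0 changes nothing:

```python
import random

N = 64  # a power of two, so a single flip always suffices

def board_xor(board):
    # XOR together the indices of all heads-up squares.
    x = 0
    for i, heads in enumerate(board):
        if heads:
            x ^= i
    return x

def encode(board, target):
    # Flipping index f toggles f's presence in the XOR,
    # so the new XOR is old_xor ^ f = old_xor ^ old_xor ^ target = target.
    f = board_xor(board) ^ target
    board[f] = not board[f]

random.seed(0)
board = [random.random() < 0.5 for _ in range(N)]
secret = 42  # the square hiding the key, in 0..63
encode(board, secret)
assert board_xor(board) == secret
```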