    New proof dramatically compresses space needed for computation (scientificamerican.com)
    190 points by baruchel - 3 weeks ago

  • Without paywall: https://www.removepaywall.com/search?url=https://www.scienti...
    by baruchel - 3 weeks ago
  • Link to paper: https://arxiv.org/abs/2502.17779

    by trueismywork - 2 weeks ago
  • Here's the gist:

    For nearly 50 years theorists believed that if solving a problem takes t steps, it should also need roughly t bits of memory: 100 steps, about 100 bits. To be exact, t/log(t).

    Ryan Williams found that any problem solvable in time t needs only about sqrt(t) bits of memory: a 100-step computation could be compressed and solved with something on the order of 10 bits.
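
    As a rough sketch of the gap (a toy comparison, ignoring constant factors and the paper's exact O(sqrt(t log t)) form, and assuming base-2 logs):

      import math

      # Old bound (Hopcroft-Paul-Valiant, 1977): time t fits in ~ t/log(t) space.
      # New bound (Williams): time t fits in ~ sqrt(t) space.
      for t in (100, 10**6, 10**12):
          old = t / math.log2(t)
          new = math.sqrt(t)
          print(f"t = {t:>14,}  old ~ {old:>15,.0f} bits  new ~ {new:>9,.0f} bits")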

    by zerof1l - 2 weeks ago
    One of my favourite science-communication YouTubers explained this in great detail: https://youtu.be/8JuWdXrCmWg

    Highly recommend

    by cyrillite - 2 weeks ago
  • This seems very theoretical: an upper bound on the space required, without saying anything about what is being computed. Does it have any bearing on real algorithms?
    by kristianp - 2 weeks ago
  • I get that this is an interesting theoretical proof. I'm more interested in the inverse: trading memory for time, to make things faster even if they take more space. It seems to me the halting problem is almost a proof that the inverse is impossible.

    Memoization is likely the best you can do.
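
    A minimal sketch of that memory-for-time trade, using nothing beyond Python's standard functools.lru_cache (the fib example is illustrative, not from the thread):

      from functools import lru_cache

      # Cache each subproblem so it is computed once: O(n) extra memory
      # turns an exponential-time recursion into a linear-time one.
      @lru_cache(maxsize=None)
      def fib(n: int) -> int:
          return n if n < 2 else fib(n - 1) + fib(n - 2)

      print(fib(200))  # instant with the cache; hopeless without it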

    by mikewarot - 2 weeks ago
  • Related:

    For algorithms, a little memory outweighs a lot of time. 343 points, 139 comments, 39 days ago. https://news.ycombinator.com/item?id=44055347

    You need much less memory than time. 126 points, 11 comments, 22 days ago. https://news.ycombinator.com/item?id=44212855

    by JohnKemeny - 2 weeks ago
  • Here's a quote from the SciAm article: "Technically, that equation was t/log(t), but for the numbers involved log(t) is typically negligibly small."

    Huh?

    by bluenose69 - 2 weeks ago
  • One thing that's surprised me throughout my career is just how inefficient most of the code that I've inherited is. I suspect we could do a lot more with the hardware we have if we simply became better at programming.

    (Perhaps more AI coaching could help?)

    by gwbas1c - 2 weeks ago
  • Related: https://news.ycombinator.com/item?id=44212855 and https://news.ycombinator.com/item?id=44055347
    by krackers - 2 weeks ago
  • I don't understand what they are saying at all. If I have a computation that requires going through a loop n times, why should the memory requirements increase with n at all?
    by gweinberg - 2 weeks ago
  • >(Technically, that equation was t/log(t), but for the numbers involved log(t) is typically negligibly small.)

    My dude, that is NOT how rational numbers work. Dividing by something "negligibly small" would make t/log(t) enormous; presumably they meant that log(t) is negligible compared with t.

    by snickerbockers - 2 weeks ago
  • [ made this a top-level comment as I don't see it mentioned in other comments: ]

    The result doesn't actually apply to _all_ algorithms, only oblivious ones. Oblivious algorithms are ones where step i does not depend on decisions made in steps j < i. Almost all useful algorithms are non-oblivious, with some notable exceptions [1]; a small illustration follows below. This work is a step toward proving the result for the general case, with little practical use (and that's perfectly okay).

    [1] https://en.wikipedia.org/wiki/Sorting_network
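
    A minimal illustration of obliviousness (a sketch, not from the paper): a fixed 4-input sorting network runs the same compare-exchange schedule regardless of the data, unlike quicksort's pivot-driven branching.

      # An optimal 5-comparator sorting network for 4 inputs.
      NETWORK = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]

      def sort4(xs):
          xs = list(xs)
          for i, j in NETWORK:       # the schedule is fixed in advance...
              if xs[i] > xs[j]:      # ...only the swap itself inspects the data
                  xs[i], xs[j] = xs[j], xs[i]
          return xs

      print(sort4([3, 1, 4, 2]))  # [1, 2, 3, 4]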

    by utopcell - 2 weeks ago
  • The impossible chessboard puzzle must have something to do with the idea of solving tree evaluation with little memory (https://youtu.be/wTJI_WuZSwE?si=lgTc65RhXQesKchR)! When the chessboard is random, it feels impossible to add any information to it. However, by choosing which coin to flip, and knowing that both players follow the same scheme, you can encode the position information in the random data without needing more cells. A sketch of the encoding follows below.
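
    For reference, the 64-coin XOR trick from the video fits in a few lines (a sketch; the helper names here are made up):

      import functools, operator, random

      # The XOR of the indices of all heads-up coins names a square 0..63;
      # flipping the coin at (current ^ target) retargets it to any square.
      def board_value(board):
          return functools.reduce(operator.xor,
                                  (i for i, c in enumerate(board) if c), 0)

      board = [random.choice([0, 1]) for _ in range(64)]
      target = random.randrange(64)        # the square to communicate
      board[board_value(board) ^ target] ^= 1
      assert board_value(board) == target  # the second player recovers it
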
    by Gehinnn - 2 weeks ago
