- The Cray-1 did look futuristic, like something out of Star Trek.by hoppp - 1 day ago
It kinda reminded me of the trash can Mac. I wonder if it was the inspiration for it.
- > but then again if you'd showed me an RPi5 back in 1977 I would have said "nah, impossible" so who knows?by delichon - 1 day ago
I was reading lots of scifi in 1977, so I may have tried to talk to the pi like Scotty trying to talk to the mouse in Star Trek IV. And since you can run an LLM and text to speech on an RPi5, it might have answered.
- > If AI systems continue to improve at the current rate and we combine that with improvements in hardware that are measured in orders of magnitude every 15 years or so then it stands to reason that we'll get that "super-intelligent GAI" system any day now.by Cheer2171 - 1 day ago
Oh come off it now. This could have been just a good blog post that didn't make me want to throw my phone across the room. GenAI is a hell of a drug. It's shocking how many technical professionals fall for the hype and become irrationally exuberant.
- What happened to the programs/problems the Cray-1 solved? If anyone can now do it on commodity hardware, is it being done? Is it all solved?by bombcar - 1 day ago
- The Cray-1 should instead be compared to today's Raspberry Pi Pico 2 / RP2350, which has similar specs (using external RAM).by benob - 1 day ago
- These comparisons are fun and all, but a better one would be the gap between whatever "computer" an average citizen would have used back in the day and the Cray-1, versus whatever one can use now and the current "Cray" (or whatever humans use now), and then look at the difference in cost.by zouhair - 1 day ago
- Hardware has come a long way...by ajsnigrutin - 1 day ago
...software... well, that's a different story.
While a Cray could compute millions of things and did a bunch of useful work for the many groups of people who used it back then, a Raspberry Pi today has trouble even properly displaying a weather forecast at "acceptable speeds", because modern software has become very bloated, and that includes weather forecast sites that somehow have to include an autoplaying video, usually an ad.
- No benchmarks. Hard to take this seriously.by lawik - 1 day ago
- Reading this, I wonder: say we did have a time machine and were somehow able to give scientists back in the day access to an RPi5. What sort of crazy experiments would that have spawned?by _fat_santa - 1 day ago
I'm sure when the Cray 1 came out, access to it must have been very restricted, and there must have been hordes of scientists clamoring to run their experiments and computations on it. What would have happened if we gave every one of those clamoring scientists an RPi5?
And yes, I know this raises an interface problem of how they would even use one back in the day, but let's put that to the side and assume we figured out how to make an RPi5 behave exactly like a Cray 1 and allowed scientists to use it in a productive way.
- I'm finding more and more that power efficiency is what's driving me towards new gear rather than a lack of horsepower.by Havoc - 1 day ago
A few niche uses aside (gaming, LLMs), a vaguely modern desktop is good enough regardless of the details.
- Comparing against a Raspberry Pi 5 is kind of overkill. While a Pico 2 is now close to computationally equivalent to a Cray-1 (version 2 added hardware floating point), the Cray still has substantially more memory: almost 9 MB vs 520 KB.by dgacmu - 1 day ago
For parity, you have to move up to a Raspberry Pi Zero 2, which costs $15 and uses about 2 W of power.
A million times cheaper than a Cray in 2025 dollars, and quite a bit more capable.
- Are there any details or examples of the computational work the Cray-1 was used for?by omega3 - 1 day ago
- My former boss (Steve Parker, RIP) shared a story of Turner Whitted making predictions about how much compute would be needed to achieve real-time ray tracing, some time around when his seminal paper was published (~1980). As the story goes, Turner went through some calculations and came to the conclusion that it'd take 1 Cray per pixel. Because of the space each Cray takes, they'd be too far apart, and he thought they wouldn't be able to link them to a monitor and get the results in real time, so instead you'd probably have to put the array of Crays in the desert, each one attached to an RGB light, and fly over it in an airplane to see the image.by dahart - 1 day ago
Another comparison that is equally astonishing to the RPi one is that modern GPUs have exceeded Whitted's prediction. Turner's paper used 640x480 images. At that resolution, extrapolating the 160 MFLOPS number, 1 Cray per pixel would be about 49 teraflops. A 4080 GPU has just shy of 50 TFLOPS of peak performance, so it has surpassed what Turner thought we'd need.
Think about that: not just faster than a Cray for a lot less money, but one cheap consumer device is faster than 300,000 Crays (!). Faster than a whole Cray per pixel. We really have come a long, long way.
The 5090 has over 300 TFLOPS of ray tracing perf, and the Tensor cores are now in the petaflops range (with lower-precision math), so we're now exceeding the compute needed for 1 Cray per pixel at 1080p. One GPU faster than 2 million Crays. Mind blowing.
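For reference, here's a quick sketch of that napkin math in Python. The only inputs taken from the comment above are the 160 MFLOPS Cray-1 figure and Whitted's 640x480 resolution; the 1080p row is just the same extrapolation:

```python
# Back-of-the-envelope check of the "1 Cray-1 per pixel" estimate.
# Assumption: ~160 MFLOPS per Cray-1 (the figure quoted above).
CRAY_1_FLOPS = 160e6

def crays_per_frame(width, height):
    """Return (pixel count, total FLOPS) if every pixel had its own Cray-1."""
    pixels = width * height
    return pixels, pixels * CRAY_1_FLOPS

for label, (w, h) in [("640x480 (Whitted)", (640, 480)),
                      ("1920x1080", (1920, 1080))]:
    pixels, flops = crays_per_frame(w, h)
    print(f"{label}: {pixels:,} Crays ~ {flops / 1e12:.1f} TFLOPS")

# 640x480 (Whitted): 307,200 Crays ~ 49.2 TFLOPS  (about a 4080's peak)
# 1920x1080: 2,073,600 Crays ~ 331.8 TFLOPS       (5090 RT/Tensor territory)
```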
- Adjust the price of the Cray-1 for inflation, but not the power for Moore's law? Need I get my napkin out for a few calculations, or do we just forget Moore's law (mentioned no fewer than four times, without quantification)? Cray-1 (1976), RPi (2012): 37 years of elapsed time, roughly 24 2/3 doubling generations, a ~26,509,000x increase in power. The Cray-1 managed 160 MFLOPS; 26.5 million times that would be about 4.2 PFLOPS, while the Pi 1 is capable of 13.5 GFLOPS, so the RPi 1 (2012) delivers only a few millionths of what Moore's-law doubling would predict.by ForOldHack - 24 hours ago
Now let's compare this to the Top 500 (see the point? Don't invoke Moore's law while ignoring the mathematical implications).
The top of the Top 500 is about 1.7 exaflops, but by this Moore's-law extrapolation it should be around 4.2 exaflops by now, so even the Top 500 is not keeping up with Moore's law.
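A minimal sketch of that extrapolation, assuming the same 1976/2012 endpoints and a flat 18-month doubling period (the doubling period is an assumption, not something from the post, and the factor moves around a lot depending on which period you pick):

```python
# Moore's-law extrapolation from the Cray-1 (1976, ~160 MFLOPS)
# to the first Raspberry Pi (2012, ~13.5 GFLOPS as quoted above).
# The 18-month doubling period is an assumption, not from the post.
ELAPSED_YEARS = 37        # figure used in the comment above
DOUBLING_YEARS = 1.5      # the classic "every 18 months" reading

doublings = ELAPSED_YEARS / DOUBLING_YEARS
factor = 2 ** doublings
predicted_flops = 160e6 * factor   # what the Cray-1 "should" have become
pi1_flops = 13.5e9                 # Raspberry Pi 1

print(f"{doublings:.1f} doublings -> ~{factor:,.0f}x")
print(f"predicted: ~{predicted_flops / 1e15:.1f} PFLOPS")
print(f"Pi 1 as a fraction of the prediction: {pi1_flops / predicted_flops:.1e}")
# ~24.7 doublings -> ~26,600,000x; predicted ~4.3 PFLOPS;
# the Pi 1's 13.5 GFLOPS is roughly 3e-6 of that.
```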
- This thread should be a MasterClass. Awesome reading. Seriously. -a gen x'er.by grubrunner666 - 21 hours ago
- And you can 3D print a Cray Y-MP case for your Raspberry Pi: https://www.thingiverse.com/thing:6947303by smcameron - 14 hours ago
- What I find somewhat puzzling is that these machines were used for the "really big problems". We used supercomputers for weather forecasting, finite element simulations, molecular modeling. And we were getting results.by jwr - 13 hours ago
I don't feel we are getting results that are thousands of times better today.
- It is a frequent fantasy of mine to bring tech back to historical figures, like showing my phone to Galileo or taking Leonardo da Vinci for a ride in my car. But I guess you don't need to go that far to blow minds.by ziofill - 11 hours ago
- Back in 2020, someone built a working model of a Cray-1.[1] Not only is it instruction-compatible (using an FPGA), it's built into a 1/10-scale case that looks like a Cray-1.by Animats - 11 hours ago
The Cray-1 is really a very simple machine, with a small instruction set. It just has 64 of everything. It was built from discrete components, almost the last CPU built that way.
[1] https://www.cpushack.com/2010/09/15/homebrew-cray-1a-1976-vs...
- The Pi has a sub-$100 accelerator card that takes it to 30 TFLOPS. So you can add three more orders of magnitude of performance for roughly a doubling of the price.by _tom_ - 10 hours ago
- > the Cray had about 160MFLOPS of raw processing power; the Pi has... up to 30GFLOPS. Yes... that's gigaFLOPS. This makes it almost 200 times faster than the Cray.by dale_huevo - 10 hours ago
Imagine traveling back to 1977 and explaining to someone that in 2025 we've allocated all that extra computing power to processing javascript bundles and other assorted webshit.
- In 2013 I'd just built a new top-spec PC. I looked up the performance and then back-calculated using the TOP500 list, and I believe it would have been the most powerful supercomputer in the world in about 1993. If you back-calculated further, I think that around 1980 it became more powerful than every computer on the planet combined.by qingcharles - 9 hours ago
- I guess I'm old, because this hasn't really been that insightful or interesting an observation just by itself anymore. People often talk about the technological advancement of computing as if it were a force of nature, whereas the amazing specs of, say, an RP2350 compared to the Cray-1 are more a story of economies of scale than of mere technical know-how and design. The reason an RP2350 costs a few dollars is the fabs, infrastructure, and institutional knowledge behind it, which likely dwarf the cost of producing a Cray-1. I wouldn't even be surprised if, were someone to do a similar calculation, the infrastructure needed behind each Cray-1 at the time turned out to cost less than what is needed to produce RP2350s today. The unit price of an RP2350 being so cheap to consumers (right now, while fabs still want to make it) somewhat elides the actual costs involved.by noobermin - 7 hours ago
Animats below said that the Cray-1 was made from discrete components. Good luck making an RP2350 from discrete components; it likely wouldn't even function well at the desired frequency due to speed-of-light and RF interference issues, and it would be even worse for the GHz-class Broadcom chips used in the RPi 5. This means that in a post-apocalyptic future you could make another Cray-1, given enough time and resources. In 20 years, when the fabs have stopped making RP2350s, there simply will not be any more of them.
- That was a weird turn to AI at the end, but otherwise an interesting reflection. I'm a little too young to have grown up in the era of the Cray-1, but even in the early 90s, processors ran at 90 MHz and hard drives cost $1 per megabyte. Back when personal computers ran at single-digit megahertz and had kilobytes of RAM, a Cray was mind-blowing.by username223 - 3 hours ago
The exciting part back then was that, while computers were never "good enough," they were getting noticeably better every few months. If you were in the market for a computer, you knew you could get a noticeably better one for the same price if you just waited a little while. The next model was exciting, because it was tangibly better. At some point personal computers became "good enough" for most people. Other than compensating for creeping software bloat, there hasn't been much reason for most people to be excited about new computers in a decade or more.
- Yes but can you sit on your Raspberry Pi like this https://volumeone.org/uploads/image/article/005/898/5898/hea...by qgin - 3 hours ago