Hacker News | bpye's comments

> The best camera is the one you have with you.

This is why I ended up picking up an (admittedly quite expensive) Ricoh GR IV. It's tiny enough to take with me everywhere, has a modern APS-C sensor and great IBIS.


So keen on one of these! Hard to beat on a performance/quality to size ratio.

Nothing says you have to use the same browser at work and outside of work? I use Edge for work, Firefox everywhere else.

Alternative - return to tradition with beige - https://www.silverstonetek.com/en/product/info/computer-chas...

Nice, turbo button and all.

Out of interest, what are you seeing for token generation - especially as the context fills?

At least today, it isn't practical for most people to run these models locally. I think adding a dependency on a cloud service is different enough from using some local (possibly open source) tool like an IDE.


Self hosting at a reasonable scale is much cheaper than people think. I am running clusters of DGX Spark machines with BiFrost load balancers in our company and for client projects. They work flawlessly!

128 GB unified memory, an Nvidia chip and an ARM CPU for around 3k€ net. They easily push ~400 input and ~100 output tokens per second per device on, say, gpt-oss-120b. With two devices in a cluster, that's enough performance for >20 concurrent RAG users or >3 "AI augmented" developers.

And they don't even pull that much power.
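As a rough sanity check on those figures (the per-device throughput is the number quoted above; the per-user workload is my own illustrative guess, not a measurement):

```python
# Back-of-envelope capacity check for a 2-device gpt-oss-120b cluster,
# using the ~100 output tokens/sec per device quoted above. The workload
# assumption (a RAG user triggers a ~500-output-token answer roughly
# every 60 s) is an illustrative guess.

DEVICES = 2
OUTPUT_TOKS_PER_SEC = 100 * DEVICES        # aggregate generation throughput

RAG_TOKS_PER_USER_PER_SEC = 500 / 60       # assumed per-user demand

concurrent_rag_users = OUTPUT_TOKS_PER_SEC / RAG_TOKS_PER_USER_PER_SEC
print(f"~{concurrent_rag_users:.0f} concurrent RAG users")  # ~24
```

Under those assumptions the cluster supports about 24 concurrent RAG users, consistent with the ">20" claim; heavier per-user workloads would shrink that number proportionally.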


Factor in depreciation and energy costs, and a subscription might just be cheaper.


It is definitely cheaper now. My point is that a future where token costs rise so dramatically that AI usage becomes uneconomical is not very probable, even if AI subscriptions were sold heavily below cost (which is also unlikely, once R&D is accounted for).


Slack, GitHub, Figma, AWS, etc

Lots of people use firebase, supabase etc.

Many people's jobs are centered around using Salesforce

It all makes me uncomfortable. I want to be able to work without internet, but it's getting more difficult to do.


> Putting compute in space is expensive but so is building a data center in the US.

You know what's also really hard in a vacuum? Dissipating heat.
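With no convection in vacuum, a spacecraft can only reject heat by radiation. A quick Stefan-Boltzmann estimate (my numbers: emissivity 0.9, a 300 K radiator radiating from one face, purely illustrative) shows the scale of the problem:

```python
# Radiative heat rejection follows the Stefan-Boltzmann law: P = eps*sigma*A*T^4.
# Illustrative assumptions: emissivity 0.9, radiator surface at 300 K,
# radiating from one face only (real radiators use both sides).

SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W/(m^2*K^4)
EPSILON = 0.9        # assumed emissivity
T = 300.0            # assumed radiator temperature, K

watts_per_m2 = EPSILON * SIGMA * T**4
print(f"{watts_per_m2:.0f} W/m^2")      # ~413 W/m^2

# Radiator area for a 1 MW data centre under these assumptions:
area_m2 = 1e6 / watts_per_m2
print(f"~{area_m2:.0f} m^2 of radiator")  # ~2419 m^2
```

So even a modest 1 MW facility needs thousands of square metres of radiator at room-temperature rejection, which is why radiator mass dominates the discussion below.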


> You know what's also really hard in a vacuum? Dissipating heat

Correct. The economics of space-based DCs come down to permitting delays versus radiator mass.

At ISS-weight radiators (12 to 15 W/kg (EDIT: kg/kW)), you need almost decade-long delays on the ground (or 10+ percent interest rates) to make lifting worthwhile. Get down to current state-of-the-art in the 5 to 10 W/kg (EDIT: kg/kW) range, however, and you only need permitting delays of 2 to 3 years.

If there is a game-changing start-up waiting to be built, it's in someone commercialising a better vacuum-rated radiator.
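A minimal sketch of that trade-off, using the kg/kW radiator figures from the comment. The launch price and the cost of a year's ground delay are my own assumed numbers, chosen only to reproduce the rough break-even points stated above:

```python
# Break-even between lifting radiator mass to orbit and waiting out ground
# permitting delays. Radiator specific masses (kg per kW rejected) are the
# figures quoted above; the dollar figures are illustrative assumptions.

LAUNCH_COST_PER_KG = 1500.0       # $/kg to orbit (assumed)
DELAY_COST_PER_KW_YEAR = 2500.0   # $ per kW of capacity per year of ground delay (assumed)

def breakeven_delay_years(radiator_kg_per_kw: float) -> float:
    """Years of ground delay at which launching the radiator mass pays off."""
    launch_cost_per_kw = radiator_kg_per_kw * LAUNCH_COST_PER_KG
    return launch_cost_per_kw / DELAY_COST_PER_KW_YEAR

for kg_per_kw in (15, 10, 5):     # ISS-era vs state-of-the-art radiators
    print(f"{kg_per_kw} kg/kW -> break-even at "
          f"{breakeven_delay_years(kg_per_kw):.1f} years of delay")
# 15 kg/kW -> break-even at 9.0 years of delay
# 10 kg/kW -> break-even at 6.0 years of delay
# 5 kg/kW -> break-even at 3.0 years of delay
```

The exact dollar inputs don't matter much; the point is that break-even delay scales linearly with radiator specific mass, so halving kg/kW halves the permitting delay needed to justify going to orbit.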


Would you want more wattage per kg for a better radiator?


Yes! Thank you–fixed.


I believe the Mali graphics also require a blob if you want 3D acceleration.


You're right, 3D needs /lib/firmware/arm/mali/arch10.8/mali_csffw.bin, distributed by the linux-firmware package.


There are definitely projects where getting a full test pass can take a day or two. I worked on one where we only got a full run each weekend, and if someone broke the tests? Nobody gets their results...


My B580 works fine on Linux. Graphics perf is a bit worse than under Windows, but supposedly compute is pretty much the same.


I'm using a B580 in a Windows 10 media PC and it's fine even for moderate gaming when I drop down to 1080p on my 4K TV, although I did notice a little stuttering from time to time.

To be fair, that might be due to still running Windows 10 or due to not having reset the PC in 4 years. It's going to be moved over to Linux soon, I'm just being lazy.


Over the last two years I bought two 4 TB SSDs, 64 GB of DDR5 ECC UDIMM and four 14 TB HDDs.

I couldn't justify buying any of them today.

