
"I’m here to save you some time and energy by telling you that the distro you pick doesn’t matter that much. "

That's the point I stopped reading.


It really doesn't. And the desktop Linux community's obsession with bundling a few tweaks and preinstalled applications as separate "distros", splintering the community into a million tiny sub-groups, is part of the reason it's failed to achieve mainstream popularity.


I'd agree that it doesn't matter much within a certain band of distros. Fedora vs. Ubuntu, for a new user without opinions about things like Flatpak or rpm vs. dpkg, doesn't matter much, sure, I agree. Throw in Slackware, Gentoo, Void, and Arch, and now it kinda does matter which you choose. Even Debian, since your software will be farther behind current releases, and its preferences about things like non-free software are likely to be something you notice and have to deal with, one way or another.

But, among the small set of relatively user-friendly distros, sure, it doesn't actually matter that much. A generous reading would take that meaning from it, I think.


I've noticed that people tend to think of the surface level of any desktop OS.

They think about the GUI, the command-line programs that ship with it (curl/grep/ls/etc.), the driver support, and the package manager it ships with. These are all trivial abstractions built on deeper facilities.

The farther down you go, the less people understand. Who can actually articulate the difference between X11 and Wayland without weasel words and equivocation? What about Mesa, or D-Bus, or PulseAudio? These are all core components that alter the "flavor" of a desktop Linux system. And yet all distros basically use the same off-the-shelf components; they only change the higher-level GUIs and package managers and stuff.

And people GROSSLY underestimate how much the kernel contributes to the "flavor" of Linux. (I could go on and on about how the GNU GPL directly impacts how drivers are developed for the kernel, or how the small number of core devs are overwhelmed by additions for hardware drivers which move rapidly and break things, and the subsequent vulnerability patches, leaving little time for desktop-focused improvements).

People tend to say things like "the kernel just manages the hardware" or "the kernel is just an interface layer for the hardware" or "MacOS and BSD are the same, only the kernel differs". If only they knew. The kernel is like a seed crystal that defines what can grow outward from there (without massive painful compatibility shims).

Lastly, people OVERestimate the importance of things that are entirely irrelevant to a desktop OS. Just look at how many desktop Linux users are arguing over systemd vs. SysV vs. whatever else. Are desktop Linux users really digging into log files, and are annoyed that the log files are now in a binary format? Are desktop Linux users really annoyed that sudo is now part of systemd, instead of a standalone binary that they can swap out? I think the number is low, but the number of desktop Linux users arguing about such things is high.


That's why, after trying out a few distros back in the day, I settled on a rolling-release one and truly learned Linux.

I totally agree with the "uselessness" of too many distros but not with your conclusion.

There are multiple reasons for Linux not being mainstream on the desktop.

Windows comes preinstalled on 99% of all non-self-built laptops/desktops.

Installing an OS is not something the average user does.

It's different, and people hate change unless it's popular.


Why is there such a fascination with having so many versions?


Off the top of my head...

Some people want to build everything from source.

Some people want a preconfigured GNOME environment.

Some people want a preconfigured KDE environment.

Some people want a distro with the latest version of packages.

Some people want a distro that is conservative about upgrading packages.

Some people are running in VMs and don't need all that extra crap.

Some people want a version that pedantically sticks to the spirit of the GPL.

Some people want a version that is maximally convenient, GPL be damned.

I'm sure there are more reasons.


I think you have pretty much covered all the main reasons.

I use Arch (formerly Gentoo) on my work and home PCs/laptops because I like rolling releases on the bleeding edge. I generally run Ubuntu LTS minimal installs for servers because they are tiny and stable and guaranteed to be upgradeable to the next LTS release. I run Home Assistant IoT wranglers on Debian because that's what HA insists on for "Supervised".

My wife uses Arch because I look after it and she doesn't care. It simply has to just work and it has for years now without skipping a beat.

Upgrading hardware for laptops and PCs means dumping the filesystems to files on a server (or whatever) and blatting them onto the new device. If there is physical space, put the old HD/SSD/whatever into the new box and use a live CD like SystemRescue or Clonezilla. All the drivers are built in out of the box, and these days most things simply work with minimal fiddling. I can't remember the last time I fiddled with xorg.conf. OK, I disabled the touchscreen on this laptop when I cracked it, and that involved fiddling with xorg. I remember setting modelines by hand in XFree86 ...


Some people hate systemd :)


It gets on my nerves sometimes, but then I remember the days of creating init scripts à la Miquel van Smoorenburg (with various dialects), upstart, Gentoo-style OpenRC jobbies, and the rest.

I get on with my day and you can barely see the blood dribble out of the corner of my mouth when I put the verb in the wrong position. systemctl rofl restart ... [fuck] backspace etc ... [bollocks][arse] ... hit enter.


Thank you for the elaborate answer, so flexibility is key.


I ask the same questions about programming languages but usually get downvoted to hell.


Well, I upvoted you.

IMO it's about time we converge on just 4-5 languages -- each with unique benefits that can't be had easily in the other ones! -- and finally start to be truly productive.

Programming is mostly a complete mess today and everybody loves their own ugly disabled baby. Sigh.


Upvoted, and I do like the idea; it's an interesting question how many languages we really need. I remember some paper from the 80s suggesting that in the future all we would need is Ada and Lisp. This is somewhat echoed in the GNU project's attempt at C for low-level dev and Guile for higher-level dev and scripting. With no judgement of suitability, and not trying to exclude anything, just tossing out examples, I'd guess it looks something like:

Small-footprint systems language: C, Zig, Ada/SPARK

Higher-level dynamic language: CL, Scheme, JS, Clojure

Full-spectrum language: Rust, Red + Red/System

Basic low-level embedded: Forth

Higher-level static language: Haskell, SML, Java

But already these start breaking down; static languages in particular are a minefield. Lazy vs. eager evaluation and nominal vs. structural typing are two major axes of differentiation there. Then, for both static and dynamic languages, there's the question of immutable-by-default vs. immutability as an add-on. Further, do you need three languages for system, high-level, and full-spectrum work, or do you pick one of the two approaches?

And even this binning into 5 completely ignores to what degree concurrency should be a first-class concern of the language. Then there's the whole question of VMs: is there value in building your whole ecosystem on a shared VM?

And further, there's the problem of exposing functionality to non-developers. Should we include a tool like Lua or R that targets non-devs, use a tool like Racket or Red with their very explicit support for creating small custom DSLs, or is that a total non-issue because the correct solution is to write GUI tools for that market?

Then the bigger question is: to kitchen-sink or not. Languages like CL, Scala, and C++ have taken the approach of implementing a ton of features and then trusting the developers to sort it out. Other languages are laser-focused on a single feature and take it to its extreme, kinda like Clojure does with immutability or Pony with actors. Yet if we don't embrace multi-paradigm languages, we're leaving a ton of research on the table, or accepting the zoo of programming languages.

Yeah programming is a shitshow.


> I remember some paper from the 80s suggesting that in the future all we would need is Ada and Lisp. This is somewhat echoed in the GNU project's attempt at C for low-level dev and Guile for higher-level dev and scripting.

This argument is even more true today (minus the concrete language names). We desperately need such a curated subset.

The problem is of course us, the pesky humans, with short-sighted feelings we cling to as if our lives depend on it. Desired job security is a big offender as well.

The pertinent questions today with regards to a language are:

- Is the runtime fast (if it's interpreted)?

- If it's compiled ahead-of-time, does it produce efficient machine code? (Golang is one example whose machine code could be better, whereas OCaml and Rust are known to produce some seriously fast machine code.)

- Are the runtime's performance characteristics predictable, e.g. latency remains stable under load? (Especially if the runtime has a garbage collector.)

- Does it run on a reasonable number of platforms? ARM, x86 (32/64-bit), AVR, and a few more? Can it run on embedded devices?

- Does the language/runtime/ecosystem give you good parallel/concurrent abilities? Preemptively scheduled actors and/or green threads are probably the best idea for servers (Erlang and Elixir are good examples, due to the underlying BEAM VM). IMO this is hugely important nowadays. Stuff like parallel iterators and parallel map/reduce/join/various-transform operations are other important enablers.

- Does the language help you avoid various bugs? Examples are Rust's borrow checker or many languages' support for sum types.
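To make the parallel-map point above concrete, here's a minimal sketch in Python's standard library (chosen purely as a neutral illustration; `square` and `parallel_map` are hypothetical names, and none of the languages discussed above are implied):

```python
from concurrent.futures import ProcessPoolExecutor


def square(n: int) -> int:
    # Stand-in for an expensive, CPU-bound computation.
    return n * n


def parallel_map(fn, items):
    # Fans the work out across worker processes and returns the
    # results in input order, like a parallel iterator would.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(fn, items))


if __name__ == "__main__":
    print(parallel_map(square, range(8)))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The point isn't the particular API, but that the ecosystem hands you an ordered, parallel map without you touching threads or locks directly.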

---

Additionally... why do we even use languages that don't get compiled by GCC or LLVM, at all? I am tired of listening to people's stories about their beautifully simple and genius compiler... which of course can't even do SIMD intrinsics or properly unroll loops. Yeah, "genius" and super simple indeed. /facepalms

Will we ever learn? Gods.

---

I've been around, and I can claim that many times people get attached to a language simply because it has good libraries and a good community. I fully relate to that, and it's the reason why I want to work with Elixir even after 5 years of a mostly uninterrupted career with it, but... at one point, after many other problems are solved, you inevitably start hitting brick walls and questioning your career choices.

I can't claim any strong experience or authority even after almost 20 years in the industry but so far only Rust seems to be a very good all-around language with a dead-serious and dedicated community that's trying to penetrate basically everywhere. The good news is that they might just succeed because the language is that good -- although it too started showing some warts but so far they're bearable.

But I don't see the world putting their differences aside and starting to work on a common cause. It's sadly not how humanity works. :(


Are you asking why everyone doesn't want the same things from their computer?


The splintering is real, but the differences between distros are also real. The pace of updates and the availability of packages are pretty important, as is a large community that you can lean on for support, even if passively, by searching for solutions to common problems. Maybe you mean "of the 5 most popular distros it doesn't matter"; then sure.


I don't know if the splintering is that big a deal. For nearly everybody it's Ubuntu for personal use and RHEL for enterprise, right?

On Ubuntu you just use GNOME, and I don't think I've ever actually seen Red Hat (or any CentOS) attached to a desktop environment.


Been using Ubuntu for 15+ years and never used GNOME. I always install xfce4 (which has basic tiling), Plank, and a few other little gadgets. It stays so much out of my way that I basically don't even notice I'm using it. I tried one or two other distros but generally come back to Ubuntu because it hits the sweet spot of modernity and stability.


Red Hat moved away from the desktop because they were the first to realise there is no money in the Linux desktop.

https://m.slashdot.org/story/100116


What's the best way to bundle those?


Just don't. Why bundle stuff if the user can just install it using the package manager?

It's easier to add to the system than remove redundant packages, and if the distro developers focus on the repositories and package management software it benefits everyone much more. That's one of the big reasons people go for Arch.

Nobody cares if LibreOffice or Inkscape or whatever is preinstalled. Just put a usable appstore on the taskbar by default, it works for smartphones too. Many distros are about as necessary as Samsung's or LG's customized Android builds.


Why would you bundle them at all?


I think the Ubuntu-style 'flavours' pattern works well. They're called Spins in Fedora.


Some distros are far less buggy than others. E.g. I don't have any issues on Arch, but I did with Manjaro (there were often bugs that prevented updates from completing successfully).


I've been using Linux for years, so I've got my opinions about distros. Why should the distribution a new user pick matter that much? They can easily switch if they don't like the first one they pick.


GP probably conflated distro with DE, which is covered in the next paragraph in TFA, maybe shouldn't have stopped reading after all.


Well, IMO, a new user should pick a distribution that has different DEs available in its repository. That makes switching and trying different DEs very easy, as opposed to some distros which are specifically tailored to only a single supported DE.


Easily switch? You mean by reinstalling from scratch, or is there another way?

I’d like to know because I am interested in trying different distros but don’t want to have to keep setting up machines.


Stability of install.

New users probably shouldn't use a rolling release, because updates can leave the install unbootable, giving a bad impression of Linux.


What happens then is that the new Linux user soon needs a newer version of some package, so they add a third-party repo to their stable system. After they've done this a few times, an update from one of those repos will eventually leave the install unbootable anyway.

I've had a way better experience with desktop Linux when using a managed rolling release like Manjaro.


And I’d like to know the reason


> my productivity is easily 5-10x what it was before

How are you measuring this? Those numbers seem implausible.


I’d maybe get 1-2 hours of semi-productive time in per day, and write code 1 or 2 times per week. Honestly, I was pretty shocked at how I could get away with that, and it was very demotivating to not have any reason to do more.

Now I’m firing on all cylinders for 7-9 hours a day, completing the entirety of a previous WFH day by 11 AM, and it feels _really_ good.


Well, he gets done by 11 AM (8-11 AM?), so that's 8 hours of work into 3. But 8/3 isn't 5x, let alone 10x, so it sounds like an exaggeration.


Wait, who in tech is starting work at 8am? My average day starts around 10. 10x is an exaggeration to be sure, but I will happily defend 5x.


Lots of people. You never noticed because you're never in that early :).


I’m pretty sure it’s BS propaganda.


Oh, get a grip. Have you considered that other people might have different experiences from you?

The switch to WFH _destroyed_ my productivity and made me extremely depressed. At a previous company I switched to remote because of a move and the same thing happened, so it's not the pandemic that's causing it either.

Different people are different. For me, WFH is absolutely soul sucking.


“BS propaganda”? What would I possibly have to gain by lying…? Other people are allowed to have differing perspectives and opinions.


https://www.coindesk.com/tesla-sold-bitcoin-in-q1-for-procee...

This was very easy to find, and makes me question your position that you weren't aware of it.


that sale appears to have taken place before any of the communication, so that can't have been part of any pump&dump scheme either.



Stripe acquired TaxJar ~45 days ago – that's awesome


Looks like they lost all their possessions in an apartment fire: https://twitter.com/marak/status/1320465599319990272


Additionally, the FBI were called and Marak was charged with reckless endangerment for the potentially explosive bomb-making materials found on the premises, including potassium nitrate, magnesium powder, sulfur powder, copper powder, aluminum powder, hobby fuse and mixing cups, and books about military explosives and booby traps. (https://abc7ny.com/suspicious-package-queens-astoria-fire/64...)

So yeah "seems something is going on with him"


I was acquainted with Marak many years ago and he was an awful person then and probably still is. He put out some very well circulated revenge porn of his ex-girlfriend when she broke up with him long before that term was being used. He's a shitty person who seems to still be.


Has there been any coverage on the revenge porn incident? It sounds plausible given the circumstances but it's a pretty heavy claim.


It was a pretty well known incident when it happened. This was back in the Kazaa days (p2p file sharing). He clipped the whole thing together and made it like one of those old Mastercard commercials. The file was called "master card revenge." He's in the video himself, and it's clear he made it as an explicit fuck-you to her. He even put her email address and physical college address in it. I'm not sure what came of the incident. This was about 15 years ago.


>Next-door neighbor Debbie Riga said the box was suspicious, and so they decided to open it.

>"Obviously the man is sick," Riga said.

I guess that must be sick in the "pyrotechnics are fun" way, not sick in the neighborhood-cat-lady, prying-into-others'-possessions-and-then-running-their-mouth-to-the-news sort of way.


If someone chooses to endanger their neighbors by bringing dangerous, explosive materials into a densely populated living area, that shows very bad decision-making at minimum, and at worst malicious intent. That's something that others need to know about, in order to protect themselves and their families.

What you are calling "running their mouth" is the community protecting itself from someone who has already considerably disrespected the safety and lives of the community.



How can I browse the recipes?


As a fairly intentional design decision, you need to provide context from your kitchen - a search engine query, essentially - before you see results.

The concept is that RecipeRadar will ultimately help you track your kitchen inventory (all on your own device; there are no user accounts), and then the search step will disappear into the background - you'll be straight into browsing recipes that match your context.

It's not there yet and I'd appreciate hearing about how you like to browse recipes in case that's something else that can be supported.


You may want to provide 'suggested cupboards', with a limited selection of multi-use ingredients, to show how the app can help make the most of ingredients.


That's a great idea - "kitchen staples", perhaps? Thanks!


No problemo!

Sure, you may want to come up with a few different sets, perhaps Italian stuff, Asian-focused, vegetarian, etc.


I've translated this into a rough use case / spec at https://github.com/openculinary/frontend/issues/172 in case you'd like to follow along.


I definitely will! Thanks for the heads-up!


Plug for my favorite MUD, even if I haven't played in years: https://www.aardwolf.com/


This was one of the most popular MUDs if I recall? The one I used to play definitely stole their website design.


> He and Sokchea scaled back their lives to live within their solar-powered means — ditching their toaster and microwave, giving up laundry on cloudy days when their batteries wouldn’t be able to recharge.

I read this as they completely got rid of their toaster and microwave. Otherwise I think it would read "giving up their toaster, microwave and laundry on cloudy days".


The Google Fiber router doesn't support IPv6 port forwarding. That was a frustrating afternoon and a bit of a surprise to learn.


What would you possibly want to use IPv6 port forwarding for?


IPv6 port forwarding likely just means allowing incoming connections in IPv6 firewall. Exposing internal machines with IPv4 uses NAT and is called "port forwarding". IPv6 doesn't need NAT but people still call it "port forwarding".
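As a sketch of that distinction in nftables syntax (a config fragment, not any router's actual ruleset; the addresses come from the reserved documentation ranges and are purely illustrative):

```
# IPv4 "port forwarding": rewrite the destination address (DNAT)
table ip nat {
    chain prerouting {
        type nat hook prerouting priority dstnat;
        tcp dport 8080 dnat to 192.0.2.10:80
    }
}

# IPv6 equivalent: the internal host already has a routable address,
# so nothing is rewritten -- the firewall just permits the inbound flow
table inet filter {
    chain forward {
        type filter hook forward priority filter; policy drop;
        ip6 daddr 2001:db8::10 tcp dport 80 accept
    }
}
```

The IPv6 rule lives entirely in the filter table; no NAT table is involved, which is why calling it "port forwarding" is really just habit.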


Does it do IPv6 NAT? That's a very rare scenario. Otherwise I don't see how it could do port forwarding.


Yes, IPv6 has NAT and can do port forwarding, at least on Linux. A useful scenario is when you're developing embedded devices and don't want to make it easy for your ISP to guess how many devices you have.


That's a phenomenally strange requirement. If you wanna run interference on your ISP's device-count snooping, just create a bunch of fake interfaces, but why you would want to do that is beyond me.


Note that I was asking the parent specifically about the Google router. While IPv6 does have NAT, I have yet to see ISPs deploying it anywhere.


As a follow-up to this question, imagine a job where a significant portion of time is spent waiting on a computer (rendering animations, code compiling, etc).

Two contractors are hired, one with a modern laptop and the other with a 10 year old machine. The older machine takes at least twice as long to process the work.

Is it A) ethical to bill for time spent waiting for the machine to process and B) ethical to use the older machine? Assume the contractor using the older machine is using the best equipment currently available to them.

