Isn't the Macintosh desktop (with Cmd as the modifier for standard shortcuts) older than Windows and Linux desktops? So historically, it's not Apple that deviated but the others?
(I did not do an extensive search into this, so there might be Ctrl-based standard shortcuts that predate Apple.)
Apple moved the Ctrl key around at least a couple of times. On the Apple II it was next to the A key, the same as on the Xerox Star. The Cmd key was a later addition.
At this point, I'd say let history be history. It'd be better to standardize on what most people are using.
I think modern browsers are actually quite good here. They show a template in the form TT.MM.JJJJ for me (i.e. DD.MM.YYYY, the usual order and separator in German). I can just type the date, including the dots if I want (they're simply ignored; there would be extra points for moving me to the next component when I type "2.", but the world's not perfect). If I'm confused about the format, or want to see a calendar view, I can click on the calendar icon (also accessible via Tab) and select a date there.
For normal date inputs, I really don't think there is a good reason to use anything else. (Possible exceptions I can think of: Selecting date ranges and/or showing extra data about the dates (like daily prices).)
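As a sketch of what I mean: the native control needs nothing more than this (the `min`/`max` bounds are just illustrative):

```html
<!-- A native date input: the browser renders it in the user's
     locale format (e.g. TT.MM.JJJJ for German) and provides a
     calendar picker. The submitted value is always ISO 8601
     (YYYY-MM-DD), independent of the display format. -->
<label>
  Date of birth
  <input type="date" name="birthdate" min="1900-01-01" max="2025-12-31">
</label>
```

The nice part is that localization and validation come for free; the form handler only ever sees the ISO value.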
No, modern browsers are horrible at this: they often ignore your settings (at least Chrome and Edge on Windows do). They base the format entirely on the UI language instead of the date format configured in your Windows settings. Safari on iOS does not seem to have this issue, as far as I can tell.
I am also at the office almost every day because I think it's better for my mental health (and the food). But I also appreciate that for many people it's different, so actually having the choice individually is nice.
Wouldn't it also be much better to evaluate the Taylor polynomials using Horner's method instead? (Maybe C++ compilers can do this automatically, but given that reassociating the operations can change the rounding, they probably won't.)
Fun fact: I looked this up in the online version of the Duden (the predominant German dictionary). It does have an entry for "Black Hole" (the English term!) but none for "schwarzes Loch", which is the normal German term for it.
(In the printed versions, you might need to go to the Universalwörterbuch or similar to find the English entry; it might not be in the regular "Die deutsche Rechtschreibung". I have not checked.)
I wrote "predominant", not "official". And I think that is still true.
Also, from what I can tell from the site, it does not serve as a full dictionary. Rather, it lists the general rules of German orthography (as decided by the Rechtschreibrat) and has some limited tables of special cases.
I don't quite get what you mean here. While you need to allow infinite expansions without repeating patterns, you also need expansions with these patterns to get all reals. Maybe the most difficult part is explaining why 0.(9) and 1 should be the same, while no such identification happens for repeating patterns other than (9).
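For the 0.(9) case, the standard geometric-series argument makes the identification concrete:

```latex
0.\overline{9}
  \;=\; \sum_{k=1}^{\infty} \frac{9}{10^k}
  \;=\; 9 \cdot \frac{1/10}{1 - 1/10}
  \;=\; 1
```

And the identification only ever happens between an expansion ending in (9) and the corresponding terminating one, e.g. 0.24(9) = 0.25; other repeating patterns name a real that no second expansion produces.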
This article is weird. It takes aim at JSON, but then has to make half of its arguments against YAML instead. JSON has no "ambiguity in parsing", no indentation-based syntax, and no implicit typing that turns "no" into false.
XML "lost" because it is also a poor format for structured data. It requires a constant choice about whether content should be conveyed using attributes or child nodes. If you use child nodes, indentation may or may not be relevant (because it can introduce leading/trailing spaces). There is also no standard for data types. (Do you write a boolean attribute foo as foo="foo", foo="true", or foo="yes"? Sure, a schema can describe the allowed values, but whoever interprets the XML still needs to add the right translation to true and false in their programming language by hand, presumably.) Due to the manifold ways of expressing essentially the same data, working with XML is always painful.
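To illustrate the attribute-versus-child-node choice: these two documents plausibly encode the same record, and a consumer has to handle whichever convention the producer picked (element and attribute names here are made up):

```xml
<!-- Variant 1: everything in attributes -->
<user id="42" name="Ada" admin="true"/>

<!-- Variant 2: everything in child nodes -->
<user>
  <id>42</id>
  <name>Ada</name>
  <admin>yes</admin>
</user>
```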
This is not really XML's fault. It was designed as a way of marking up text. When that is your goal, XML is not a bad choice, and all of its features make a lot more sense. (For example, this child-attribute distinction: Is this part of the text? If so, it should be represented as child nodes. Otherwise, it should be an attribute.) Here, something like JSON really falls flat.
But most data is not marked up text. And yet, some people and institutions tried to push XML as the universal solution to all data transfer problems. This is the reason for the push-back against it.
This isn't to say that we did not also lose some things. In complex cases, having something like a schema is good. Comments are appreciated when the files are also read and edited by humans. JSON's data types are underspecified. But I firmly believe that the solution to this is not XML.
> This is the reason for the push-back against it.
Do you have evidence for that? From memory, it was basically because it was associated with the Java/.NET bloat of the early 2000s. Then Ruby on Rails came along.
I think that's basically the same reason, right? XML itself is bloated if you use it as a format for data that is not marked-up text, so it comes with bloated APIs (which were pushed by Java/.NET proponents). I believe that if XML had been kept to its intended purpose, it would be considered a relatively sane solution.
(But I don't have a source; I was just stating my impression/opinion.)
I assume that "all the different levels" might not exist yet. The author is probably creating them a bit in advance, and will keep going as long as they're motivated. Having a regular schedule for new releases helps, and doing it daily seems as sensible as any other schedule.
There is also the problem that they decided to make all references nullable, so `NullPointerException`s could appear everywhere. This "forced" them to introduce unchecked exceptions (`RuntimeException`) as an escape hatch, which was of course immediately overused, normalizing it.