The adjective modern is so popular, yet it's so devoid of meaning. What does having a modern syntax even mean? The impression I get when I read projects described like that is that the authors are saying they are following the latest trends they could find, but they don't quite know why.
I think it's effectively code for "it's not like C": easy iteration, lambdas, generics, pattern matching, type inference, and so on.
None of these features are technically novel, but they do tend to be shared by newer systems languages, as opposed to C/Pascal/Fortran/COBOL and friends. Then of course there's C++, which is both old-school and modern, because C++ is slowly but surely evolving into the programming language equivalent of the Borg: eventually all other programming languages are bound to become a subset of C++.
Same with "readable syntax". Nobody sets out to design something deliberately unreadable. It usually just means the syntax conforms to the author's personal tastes, and there's nothing wrong with that, but the implied "everything else is unreadable" subtweet doesn't really help anything. Case in point:
each planets() (planet) {
    println("Hello " + planet + "!");
}
I bet that the double () () here leads to some pretty unreadable code outside of this happy path. But that's just my opinion, not some objective truth.
Not to mention keywords such as ret, inl, imm, mut, mat, and even yon: 'readable' is certainly subjective. Not to dissuade anybody from trying anything outside the norm, but I wonder if calling it readable is accurate.
Buzzwords abound in modern technology circles these days.
To your point: "modern" is especially insidious because it implies that older technology is bad. I'd take vanilla PHP and JavaScript over TypeScript and the frameworks any day. I recognize that's an extremely unpopular opinion, but those languages have been battle-tested over 25 years, and if I run into a problem I haven't seen before, there's a high probability I'll get the correct answer from a search. For the "modern" stuff, I get five different answers and none of them solve my problem, because the language and/or framework has changed as many times since the question was first asked.
That said, given the choice I'd choose Clojure and ClojureScript. It's a young pup compared to PHP and JS, but it's still based on a 60-year-old language and just a joy to work with.
"My advanced, fast, high-performance, and powerful language/system/framework is so efficient, flexible, concise, usable, readable, extensible, structured, and simple."
In general, if I read any of those descriptors without an associated object of comparison, I categorize them as advertising weasel words for whatever X is being described as having them.
Those words are comparative. They are meaningless without either benchmarks or direct comparisons against other items in the IT landscape that are well-known quantities for X.
So plop modern on that pile as well. It's an even more loaded term, because "modern" implies a range of interpretations, from mere trendiness to actual advances in the state of the art (such as UTF-8 vs ASCII).
"Following the latest trends" basically works as a definition for "modern" in the context of programming languages: "involving recent techniques," etc. "Why modern?" is a fair question, I guess, but why not? Are there any good arguments against things like type inference, generics, and lambdas?
I see the first paragraph of a language's website as the "elevator pitch" for that language. You probably want to cut any vague and unneeded words in that context and focus on the defining aspects of your language. "Following the latest trends" is also vague, and the more precisely you define it, the more you'll need a timestamp next to the term. I think it's better to just spell out the actual features you really care about.
As far as programming language evolution goes, I have to assume "recent" refers to a span of 20 years or so; otherwise I don't think we have any recent techniques.
I’m fairly certain we’ve had lambdas (lisp) for something like 40 years. Type inference (ML/lisp compilers) for 30-40 years. And generics, at least in the sense of parametric polymorphism, (Ada/C++/Java/ML) for about the same amount of time. As far as I can tell, the borrow checker and some features around dependent types are newish, but actually modern features are few and far between.
Lisp has had lambdas for 60 years, and I think ML has had type inference for about 50 (maybe there are earlier languages that had it?); ML has also had generics for that long. So none of your examples were recent developments even by your standards.
Even in C-like languages, generics were part of C++ (or rather templates, in that case) since the beginning, I think (that's 35 years ago). I think you can make a case for type inference reaching the C language family recently, though.
Either way, it's hard to compare a 30-year-old language to a modern compiler, because the modern compiler has benefited from 30 years of optimizations: you can't tell how the old language would perform with a similar investment.
The adjective has a lot of meanings, nothing I'd call a void.
> More common, especially in the West, are those who see it as a socially progressive trend of thought that affirms the power of human beings to create, improve and reshape their environment with the aid of practical experimentation, scientific knowledge, or technology. From this perspective, modernism encouraged the re-examination of every aspect of existence, from commerce to philosophy, with the goal of finding that which was 'holding back' progress, and replacing it with new ways of reaching the same end.
Though the word does have a dictionary definition, it's just sort of empty, and that's what I was criticising. Both adjectives, modern and readable, are subjective and not really meaningful in a technical context. Unless you define precisely what modern and readable mean, they are just vacuous words in an otherwise technical write-up.