> My background is in OO programming, mostly using C#. C# being the versatile language it is, I have had the perception that whatever you do in other programming languages, you can with a little more code and hassle achieve in C# as well. If need be, I can program C# using a functional paradigm. And, of course I use recursion all the time. I know all there is to know about recursion.
IME there are two kinds of programmers: those who learned a single language deeply and have almost religious confidence that it's the best thing ever, even if they try to approach other languages with an open mind (i don't blame them, i was like that a couple decades ago, too), and those who understand that tools are just that - tools, which you should pick according to the problem you want to solve, because the other way around ends up in a tail-wags-the-dog situation. The blog post sounds like the author has realized that and started a journey from the first type to the second, which is to be commended. Good job.
I see what you are saying, but I have also seen plenty of Python code written by what are clearly Java developers. And framework collectors who learned the very basics of Django before they moved on to something new - and hence wrote a load of overcomplicated crap that could have been done a lot more cleanly if they had learned the framework in more depth. Learning some things in depth definitely has benefits.
Yours is concerned with code quality and maintainability, both generally benefit from idiomatic programming style which people new to the language rarely have. And you're 100% right.
BUT the parent post was talking about ideologies "the one true language" vs "hey, i can make this work anywhere"
the latter camp affects you in that many of them don't have the discipline or patience to really ingest the new language before spewing out code that "works" but sucks for maintainers.
All that recursion, lazy evaluation, monads, lambdas, virtual methods and coroutines. In the end, they are all stack manipulation and jump instructions. High level languages are just a more convenient way of writing assembly.
I like to take that approach when comparing languages: look at what a program might do in terms of machine instructions, and see how I can make another language output similar instructions. The one that can do it the most naturally wins.
>In the end, it is all machine code.
> All that recursion, lazy evaluation, monads, lambdas, virtual methods and coroutines. In the end, they are all stack manipulation and jump instructions. High level languages are just a more convenient way of writing assembly.
Well, in the end the universe dies in the heat death stage, and nothing matters.
But we're not "in the end", so this final analysis "it's all machine code anyway" is reductionistic.
It might be all machine code, but the abstraction at which we think and program matters, for the correctness, speed of writing, speed of understanding it later, and performance of the deliverable...
Let's extend this analogy to books: all books are just letters and words written on a page. And some pictures too. It doesn't matter how a book is written as long as it produces letters, words, paragraphs and pictures. It doesn't sound right, does it?
i don't blame them, i was like that a couple decades ago, too
This seems like a fairly good indication that there probably aren't really two kinds of programmers - unless you experienced that particular change, from those particular initial conditions, in some strikingly discrete fashion, and also believe both are universal.
I have worked with a couple of devs who decide to pick and choose what they work on. Usually they are seen as important by management / company owners and not team players.
Sure, you can, but its usually not desirable to jump ship the moment a task comes along that is less suitable to your tools of choice.
I often find myself using languages I'd rather not (YAML for ansible/docker-compose, or shell scripts, are common). Or maybe you love everything else about the job besides the language you must use. Or you love everything, including the language, but now you're asked to complete a task that's not really suited to it. Or maybe you love everything about your job and then are asked to fix something in a legacy project.
most devs have "appropriate jobs" for their skillset. It doesn't follow that the tooling chosen by others before you at that job is "appropriate" or that the language is "appropriate" or that you have any choice in the matter. Most of us don't. Most of us work within the constraints provided. Some of us work to remove or change those constraints.
getting an "appropriate job" rarely addresses these problems and even more rarely makes it evident that they exist before you take the "appropriate job"
I think it is better because it pushes you beyond your comfort zone. You will learn new things. And sometimes those things are better than what you currently think is "best". If you can only function well in a very specific environment then you are not very flexible.
That has not been my experience. In a C# shop, if you want to do something not-C#, best is to switch employers. I don't say that placing such high value just into what tool you use is smart; but changing tools in a company is often a total no-go.
I have been told many times in the last 20 years that the only thing that gets used comes from Microsoft. Literally the same shops end up with several piles of unmaintained crap which is either "not invented here" stuff, abandoned microsoft frameworks and all the staff have left to go find another job because it's not shiny any more.
Behind the enterprise tooling propaganda, there is the real programming endeavor with whatever tools are appropriate for the tasks (given the constraints of time, team size, developer expertise, maturity of libraries, etc.)
A software development business is not really about programming languages, libraries, or tools, but about the game of power in the organization (even when the end user doesn't care how the product was developed). Anything that threatens that power is easily dismissed.
A NIH attitude is kinda the opposite of what I meant. Choosing the right job for the tool would mean that all the wrong jobs should be outsourced.
> Choosing the right job for the tool, would mean that all the wrong jobs should be outsourced.
Outsourcing that is just "kicking the can down the road": some poor developer is still left dealing with the same problem. It doesn't change anything other than who has to do the thing you don't want to do / don't think is right.
I've done the opposite, striking out C++ code in a mostly C / Python shop.
In practice, it is far more important to allow your team to function when you eventually leave. Otherwise, they'll just throw your code away. Fortunately, a lot of code is thrown away. So maybe it doesn't matter. In this particular instance, I knew my C++ code was basically a throwaway script (and Python, the other language that my fellow coworkers use, was too slow to get the job done).
But if you ever create something that truly matters and needs long-term maintenance (which I have), it better be written in a way that the team can maintain that code. Writing it in the language your coworkers speak (that the managers are hiring / interviewing for, etc. etc) is very important.
This brings to mind the joke with the drunk looking for his car keys under a street light. A guy tries to help him, and after searching in vain for a while, asks him:
"Are you sure you lost them here?"
"No, I'm pretty sure I dropped them somewhere on the other side of the parking lot."
"So why are we looking for them here?"
"Because there's more light here which makes it easier to search!"
So, what's easier is not always what's best. E.g. it doesn't mean sticking to the job rather than the preferred tool is the optimal path to expertise, career success, or personal development.
I think it is easier, but I don't think that is important, or the right question. We live in a world that evolves, and problems (jobs) change all the time. It is in everyone's best interest to teach themselves new tools, both to stay relevant and to find interesting new problems to solve. So yes, we need to change from the mindset of religious worship of a certain tool, where we only search for problems that tool can solve, to researching the best tool for the problem we need to solve and learning it if necessary.
You generally have an established codebase of tens of thousands of lines of code that the business is based on... usually with little to no test coverage. Creating a parallel version of the code in the new language would take ages and be filled with errors, because the lack of tests proving the behavior of the old one means you rarely know if your new version is "correct" or not... and then you end up with annoyed customers. But all of that is moot, because you can't get management to sign off on using so many resources for so long to do a thing that customers will never see and that can't be marketed.
So if you consider some stack to be inferior (maybe because it's highly inconsistent in its design, or it only runs on closed and locked-down platforms, or whatever), you should still just dismiss it as 'oh, it's just a tool' instead of accepting that it's shitty and makes you miserable when working with it? Why wouldn't I want to work with the best thing ever if it helps me keep my sanity every day?
Do you want to use something that makes you miserable? Is any language, including one designed by a 3-year-old, 'just a tool' that is to be used despite obviously being shitty?
I think an analogy would be the following:
You need to hammer a nail into a piece of wood. You could do this with a screwdriver, but a hammer is better suited for the job. However, a hammer made out of glass is less useful than the screwdriver.
In other words, some kinds of tools are better suited for a problem than others, but there exist tools that are inferior even though they were designed to solve your original problem.
You should pick the best tool for the job. It might be unclear what that tool is, and some tools are better than others at solving the same problem, even if they were designed for it.
I take your point, but can we avoid this analogy please? Let’s talk about actual code if we’re going to talk about the best tool for a job that requires it.
It reminds me of something early in my career that irked me like few work-related things have: “Building software is like building a house” — No, it isn’t. Not even a little bit.
Never heard of that one, and if it works for you then more power to you! Analogies are all about making sense of new data using older, previously understood data, so if the old reference doesn't ring any bells... you just need to find one that does.
One of the main problems I have with that analogy is that a language is not like a tool at all. Frameworks and libraries are like tools. Different classes of languages are like different flavors of tools - Haskell/ML, C#/Java, and a plethora of others.
Maybe you should look within yourself and fix up whatever part of your personality is making you miserable? I'm not being flippant. Programming languages don't make people miserable.
There isn't really an argument, just an observation that the most efficient way of thinking about programming is in data structures and algorithms which are mostly language independent. Nobody should be spending most of their time leveraging language features, so it isn't critical to use any particular language.
I have a large range of pens of different quality. I have a favourite pen. But I'm willing to use any pen if I need to write something down. My focus is on composing a message, not on worrying about the quality of my calligraphy or the occasional scratchy mark. If I'm going to be writing all day I'd rather have my favourite pen, but any one will do.
Programming is in my opinion not just about data structures and algorithms. It is also about designing abstractions that capture the domain you're working in well, whether it be by designing classes in an OOP language, or data types and functions in a functional language.
It's not just about what a program does and how quickly it does it. What matters at least as much is how easy it is to understand, modify and extend. Of course, that isn't necessarily language-dependent either, but I think some languages make it easier for you to design nice abstractions.
For example, C can make it more difficult to write nice abstractions since it forces you to worry about memory allocation and memory safety, and doesn't really offer facilities for generics.
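To make the generics point concrete, here is a minimal sketch (the name `lookupBy` is mine, a simplified cousin of the Prelude's `lookup`, not code from the thread) of the kind of polymorphic abstraction Haskell gives you for free - one definition reused at every key/value type, where C would need void pointers or per-type copies:

```haskell
-- A polymorphic association-list lookup. The single definition works
-- for any key type with equality and any value type the caller picks.
lookupBy :: Eq k => k -> [(k, v)] -> Maybe v
lookupBy _ [] = Nothing
lookupBy k ((k', v) : rest)
  | k == k'   = Just v
  | otherwise = lookupBy k rest
```

The `Maybe` return type also forces callers to handle the missing-key case, which is part of what "designing nice abstractions" buys you.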
Saying a language makes you "miserable" does not necessarily mean you literally want to kill yourself because of it.
I can totally see myself having more or less fun executing the exact same programming job, depending on the language I'm asked to use. And yes, to some extent, some projects (especially the ones where you have to start from an old codebase in a deprecated and ancient language) make me "miserable". That doesn't mean I'll quit my job on the spot.
I used to be a C++ programmer. I was good at it. I could do advanced template magic. I understood multiple inheritance and used it correctly, for the right purpose. I literally had dreams in C++.
Then I started using Java, and realised that I could be 5x as productive with far fewer headaches chasing down yet another linker error with some unintelligible error message.
What do you mean __malloc is undefined!? How could it possibly be missing? It was there a minute ago! I'm linking the standard library already, so what exactly do you want me to do about it, Mr C++ Compiler!?
Then I discovered IntelliJ IDEA, and I swear that I felt like the skies had opened up, there was a beam of light shining upon me, my spirits were lifted, and I could hear the voice of God saying: "Fear not! For now you can make changes globally and ye can rest assured that this will not break thy code!"
And then along came C#, which was just-like-Java, but with another 2x boost in productivity from all the built-in stuff that Java was missing, the language niceties, and the removal of the boilerplate. It has made me so happy over the years. Nearly two decades of effortless productivity.
I mean, seriously, how ridiculously awesome and simple is the "async" keyword!? It's like magic! I love it. LINQ is neat. The debugger is awesome. IntelliTrace is nifty. I could go on.
At one point recently I was forced to use C++ to write just a few hundred lines of code. It was physically painful. The language is a quagmire of footguns. The code I wrote was trivial string processing code, yet despite using only std::string it still somehow managed to crash.
I decided right then and there that I'm never going back to C++. Never! It's not worth it. My sanity, nay, my very soul deserves better.
I agree with you that there is a deeper problem if choice of programming languages makes you miserable (with the possible exception of the situation where things are so bad that it makes it difficult for you to do your job and that makes you miserable).
At the same time, though, I think you might be writing off the link between choice of programming language and misery too easily. Rather than pens, where there is relatively little difference even between a fountain pen and a ballpoint, a better example might be a carpenter choosing between power tools and hand tools. There are definitely carpenters who are strongly specialized in using hand tools who would not enjoy being forced to use power tools.
I suspect this may boil down to different people coming to programming for different reasons. Some people are primarily interested in the end product of programming while others are interested in the process of using code to build abstractions. My opinion is that room should be made for each approach.
Vast majority of the time, I pick up a project half-way with an existing codebase rather than start from scratch, and it's natural to just keep using what's there unless there's a showstopper, in which case I make the bare minimum addition so I could keep going.
The things that make work pleasant are a sense of purpose, feeling valued, being stretched, having autonomy, personable colleagues, a good physical environment, decent equipment. These are far more important than what language you happen to be using.
That attack seems unwarranted, especially for the argument put forth above, which you really don't seem to have understood.
The argument is that, even when you compare between very good tools, each one of them will make your life miserable when you apply it to a problem for which it is not designed, and for which a different tool is a much better choice.
I.e. there's no "one tool that is the best thing" that can be applied to all your tasks, because the tasks are widely different so tools are necessarily limited. That doesn't preclude some tools designed for the same type of task being clearly worse than others of the same kind (your argument).
First, you can't always choose the stack you work with. You work with other people, who may have expertise in different languages. You may have to reuse existing code written in different languages. And it's rarely the case that some stack is better than another in every aspect; there is usually some form of compromise to be made.
This was such a nice read! I've only ever written Java & JavaScript really, and Haskell is always at the top of the list of things I want to learn!
When I got to the 7 line snippet at the end I was blown away by the elegance, and the fact that I couldn't really understand exactly what each thing did. So I decided to spend a few hours going through it character by character and documented my process here [1] in case it was useful to anyone else.
I'll continue by reading a book on Haskell, but open to any advice from anyone on good ways to get into it!
IEnumerable<int> filter(int p, IEnumerable<int> xs) {
    foreach (var j in xs)
        if (j % p > 0) yield return j;
}

IEnumerable<int> sieve(IEnumerable<int> s) {
    var p = s.First();
    yield return p;
    foreach (var e in sieve(filter(p, s.Skip(1))))
        yield return e;
}

var n = sieve(Enumerable.Range(2, 10000)).Skip(10).First();
nth :: Integral a => Int -> Maybe a
nth n | n < 1     = Nothing
      | otherwise = Just (primes !! (n - 1))

primes :: Integral a => [a]
primes = sieve [2..]
  where sieve (p:xs) = p : sieve [x | x <- xs, x `mod` p /= 0]
"It is rather an attempt to get my head around functional programming, and to me Haskell doesn’t seem to have any practical application beyond that."
Cardano/ADA's core Ouroboros protocol was entirely written, with formal proofs, in Haskell. It is by far the most serious attempt at proof of stake in the crypto industry.
I think what you're really saying is, you won't find many jobs out there w/Haskell as a requirement... but I wouldn't go as far as saying Haskell has no purpose past teaching FP. It certainly is the poster child of FP, however it is not without practical/industrial use. You just have to look harder to see where it's being used.
>You just have to look harder to see where it's being used.
This is where advocacy slips over the line into a kind of blind faith evangelism -- with an added pinch of pedantry peculiar to our field.
I will state without proof that every language ever invented is currently being used for something practical somewhere. E.g. someone has a useful shell utility they wrote in Brainfuck that they run every day and that they love. This does not mean that BF has 'practical application' in the usual sense of the phrase. In the same way, just because there is a real program written in Haskell somewhere does not mean it has practical application.
IMHO a language (or a tool in general) should have a community of practitioners that regularly think in and create with the tool, and a good signal that that is happening is the number of open-source releases in that language. I have never in my life installed a Haskell program, or known of anyone installing one. (The exception being this one Haskell fan at JPL who loved talking about the language and the lambda calculus but, to my knowledge, never wrote anything real in it; he just installed learning toys, not utilities.)
All that said, there are some good examples of unpopular tools that enjoyed success later in life. Quaternions lost to vectors historically, but graphics programmers rediscovered their benefits and resurrected them. Maybe Haskell will have its time, but that time is not now.
I'm not sure that holds up. Real people working at real companies are writing real code in Haskell every day. I found several job postings in London that advertised Haskell as a requirement / nice-to-have. Haskell is also taught and used extensively at my alma mater, and is in fact the first language you will be introduced to in a CS/SE degree there.
How is that in any way, shape, or form comparable to Brainfuck?!
When was the last time you installed and/or ran a Haskell program? If the answer is more recent than "never", then what was it?
It's possible that Haskell is like COBOL in that it's used in industry but not at all by the community. But I'm not sure how I would characterize it, then. Does industry use count as "practical"? I honestly don't know. Anyway, examples would be nice.
I installed and used Pandoc the other week, and you've likely used it without knowing it (e.g. via web backends) if you don't use it directly. I also use it for Elm (i.e. the Elm compiler) almost every day.
Haskell indeed has less penetration than something more popular like JavaScript.
But the Haskell language is not easy to use if you're a script-kiddy. Most "easy" languages can have some form of copy-paste reuse, and a lot of "the community", it would seem, just append together useful snippets to get their programs done (and whether they have a deep understanding or not is not needed).
This isn't true for Haskell. It's actually quite hard to just copy/paste Haskell without knowing the underlying concepts. I wager that the reason there seems to be less of a community in Haskell is due to this property.
I think it is a bit of an oversimplification to say that Haskell is less popular because it is not as easy to "copy/paste" together code for it. I would say it also isn't that easy to copy/paste together C++ code, for instance. This also conveys a message of "Haskell is only for smart people", which I don't think is the right attitude.
Haskell is significantly less popular than some other languages, but it has a non-trivial amount of people using it in serious applications in industry (e.g., a large part of Facebook’s spam filtering system is written in Haskell [1]), as well as for some reasonably popular open source projects like Pandoc. Furthermore, Haskell has a rich ecosystem of open source libraries on Hackage.
How does this not qualify it as a language with practical applications? Frankly, your comment comes across as being based upon hearsay, instead of actual experience with the language and its community. You’re right that Haskell is not as popular as more mainstream languages, but that does not make it similar to Brainfuck.
The crypto industry at its current stage is far from any practical purpose. It's just unfortunate that "Haskell in Practice" is associated with crypto BS.
One of the most beautiful introductions to recursion is the dragon stories in Chapter 8 of the book `Common Lisp: A Gentle Introduction to Symbolic Computation` by David S. Touretzky. [1] is the link to the free PDF.
(And yes, the nested yield is ABSOLUTELY an anti-pattern. They did promise to do a bulk yield at some point; I haven't used C# in a few years, so I don't know if I'm out of date!)
Depending on the particular language and platform, it can be quite dangerous to use recursion in production software due to the risk of a stack overflow. To be safe you have to first determine analytically that this can never happen. For algorithms that manipulate tree data structures it's often safer to avoid real recursion and instead sort of simulate recursion using a list or stack data structure allocated on the heap. At least that gives you a better opportunity to fail gracefully if the input data is too large to process within your resource constraints.
Tbf most code doesn't encounter trees deep enough to generate a stack overflow.
It's more of a concern with things like mutual recursion (multiplying your frame depth at each step, instead of just +1) or if your recursion depth is based on unbounded user input (eg iterating over an AST, or this code snippet)
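A minimal sketch of the explicit-stack technique mentioned above, using a hypothetical binary `Tree` type (my example, not from the thread): the pending subtrees live in a heap-allocated list instead of on the call stack.

```haskell
-- Hypothetical binary tree type for illustration.
data Tree a = Leaf | Node (Tree a) a (Tree a)

-- Naive recursion: call-stack depth grows with tree depth.
sumTree :: Num a => Tree a -> a
sumTree Leaf         = 0
sumTree (Node l x r) = sumTree l + x + sumTree r

-- Simulated recursion: a list of pending subtrees acts as the stack,
-- and the worker `go` is tail-recursive, so the work-list grows on
-- the heap rather than on the call stack.
sumTreeIter :: Num a => Tree a -> a
sumTreeIter t = go [t] 0
  where
    go []                  acc = acc
    go (Leaf : rest)       acc = go rest acc
    go (Node l x r : rest) acc = go (l : r : rest) (acc + x)
```

(In production Haskell you would also want a strict accumulator; this only shows the shape of the technique, and the same idea carries over to languages with bounded call stacks.)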
I was thinking about this recently: is there ever a reason to put recursion in an everyday “workman” code base?
Seems like it would be so out of place in a real industry code base, like an infinite loop waiting to happen. There are always better, more readable and maintainable ways to accomplish the same thing.
Sure there is: it lets you express loops immutably.
Rather than `state := null; while condition do: mutate-state-and-recompute-condition`, you can do `let loop(state) = if shouldContinue(condition) then loop(newState) else resultOfTheLoop`. Rely on the tail-call optimiser to compile this to a genuine imperative loop.
This looks very odd the first few times you see it, but it's much harder to get wrong.
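As a small sketch of the pattern described above (my example, in Haskell): summing 1..n as a recursive "loop" whose state is passed as arguments instead of being mutated.

```haskell
-- Imperative version: total := 0; i := 1; while i <= n { total += i; i += 1 }
-- Recursive version: each "iteration" is a tail call carrying the new state.
sumTo :: Int -> Int
sumTo n = loop 1 0
  where
    loop i acc
      | i > n     = acc                    -- condition fails: return the result
      | otherwise = loop (i + 1) (acc + i) -- "next iteration" with new state
```

No variable is ever mutated, and a tail-call-optimising compiler can turn `loop` into a plain jump.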
There are contexts where it's banned (MISRA, and other embedded scenarios). But it's very hard to work with tree structures without recursion - if you're not careful you just end up with an explicit stack rather than using the software stack.
In situations where you think auto-vectorisation might help, you definitely want to do it as iteration.
This is partly why I like the system of transformers that LINQ is built out of; you can specify your query in a natural nice functional manner, and then let the optimizer convert it into a fast query.
Loops are also infinite loops waiting to happen, yet we still use them :)
The reason one might want to use recursion is when you're working on a data structure that is recursive in shape, like trees. And trees are very common, the Hacker News comments are one such example.
Good developers can solve problems and be productive with any programming language/tools. Bad developers can't solve anything unless they use the one specific tool they know.
Clearly much worse than the original solution. Basically, a nested sequence of filters is built, one to filter out multiples of each prime number. One way in which the complexity of this is bad is that if you want prime numbers < N you only need to filter out primes smaller than sqrt(N). It is telling that the first solution contains an sqrt but not the second one....
These kinds of functional solutions are very cute mathematically, but...
I once wrote this way of generating an infinite list of primes in Unlambda. That was kind of interesting to do.
> One way in which the complexity of this is bad is that if you want prime numbers < N you only need to filter out primes smaller than sqrt(N). It is telling that the first solution contains an sqrt but not the second one....
That's not really true, though, because that filter is applied lazily, so it only gets evaluated up until the Nth prime and no further.
I'd say O(n) for both. The generated list only contains primes, so that's straightforwardly linear. Then there is the recursive sieving, which adds an additional pass over the sieve list for every prime we've found so far; we need to keep the computation of each pass in memory, so that's some additional memory that's also linear.
The time complexity is less obvious: each prime adds an additional pass to evaluate, so that seems linear, but each new pass only applies to a list that has already had all the previous passes filtered out, so it's not technically linear, and I find it hard to determine how much it actually is.
Clearly O(n^2) for time. Well, perhaps O(n^2/log n) or something, but that is not much different. The thing is, every prime p gets passed through a filter that filters multiples of primes p' for all p' < p.
> The thing is, every prime p gets passed through a filter that filters multiples of primes p' for all p' < p
But the filter for multiples of 3 only sees the values that weren't already filtered by 2. So if we have N filters we don't evaluate all N filters for each value. The 2 is applied to everything, the 3 filter is only applied to things that are not multiples of 2, the 5 filter is only applied to things that aren't multiple of 2 and 3, etc. That doesn't sound very quadratic to me.
The thing is that the prime numbers have to go through all of the filters. Although the prime numbers are a minority among the numbers they are not a very small minority. In fact, a random number N has about a probability of 1/log(N) to be prime. So that could reduce N^2 to maybe N^2/log(N) but not any further. So I suppose the complexity actually is somewhere between N^2 and N^2/log(N), but that is not much less than N^2. The trick of only checking primes smaller than sqrt(N) reduces the complexity to N^1.5, also maybe involving some division by log(N), but that is actually an improvement in the exponent.
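For comparison, a common Haskell formulation of that sqrt trick (not the article's code) is trial division that tests a candidate n only against primes p with p*p <= n:

```haskell
-- Trial division with the sqrt cutoff discussed above: a candidate n
-- is tested only against primes p with p * p <= n. The list is lazily
-- self-referential, which works because deciding the primality of n
-- only needs primes far below n.
primes :: [Int]
primes = 2 : filter isPrime [3, 5 ..]
  where
    isPrime n = all (\p -> n `mod` p /= 0)
                    (takeWhile (\p -> p * p <= n) primes)
```

This keeps the laziness of the original one-liner while gaining the exponent improvement described above, at the cost of being a few lines longer.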
I do not get what this publication is trying to say. Is Haskell better than C#? Are Haskell programs shorter than C#? Is recursion difficult? Is recursion difficult in C#? I don't get it.
I can't tell if you're seriously enquiring, or simply being snarky. However, taking your comment at face value ...
To me this post is saying that in some languages (in this case Haskell) there are ways of working that are hard to emulate in other languages (in this case C#). The post is talking through a specific example of this, and pointing out that if you only know one language (in this case C#) then you might be missing out on styles of thought that can help in solving certain problems.
The post isn't about one language being better or worse in absolute terms, it's about broadening your range as a programmer so you are aware of other techniques and styles of thought.
To go all "new age-y" it's about enlightenment. Usually Lisp is the language used to help programmers achieve enlightenment, but other options exist, such as fexl, Haskell, and others.
Seriously, many languages are "about the same," but some really do make you think differently. You can be an excellent and productive programmer without them, and the vast majority of programmers never try to step outside the bubble of Imperative & OO languages, but you will be missing out if you don't at some stage embrace one of these other "pure(ish) functional" languages that is genuinely different.
Thanks for a great explanation of a weird topic that I’ve struggled to put into words!
I’ve had a few similar experiences in my humble programming education.
The first was at uni, having to learn C, C++, and Python in one semester, after only using Java for the first two semesters. (And a tiny bit of PHP and Visual Basic before that.)
The second was exposure to scheme/racket and real functional programming.
The third time was the most amazing mix of haskell, type theory, lambda calculus, logics, agda, category theory, proof theory, model theory and just theoretical computer science in general.
It leaves you with this wonderful and strange view of programming, without any of the concrete computational models.