
I think it's very interesting how some people are repulsed by the use of mathematical symbols in the sample code, while others love it.

The vast majority of the code I write is not mathy. But I do write a fair bit of mathy code because of the field I operate in. Because of this, I'd side with those who hate it for some of these algos: the various sorts, for example. But for others, the use of mathematical symbols brings clarity, IMO, assuming you already know the algo.

I've had to translate mathematical notation from a set of papers into code before. When I first picked up Julia, I re-did something I had previously done in Python/pandas/numpy. I used mathy notation... and the Julia code ended up both MUCH clearer and quite a bit faster. I call that a win in my book.
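To make the contrast concrete, here's a minimal sketch in Python (standing in for Julia, which also allows Unicode identifiers) of what "mathy notation" can look like. The Gaussian PDF is just a stand-in formula I chose for illustration, not something from the papers mentioned above:

```python
import math

# Python 3 permits Unicode identifiers, so a formula can be
# transcribed almost symbol-for-symbol from a paper:
def gaussian_pdf(x, μ, σ):
    return math.exp(-((x - μ) ** 2) / (2 * σ ** 2)) / (σ * math.sqrt(2 * math.pi))

# The same thing with conventional descriptive names:
def gaussian_pdf_verbose(x, mean, std_dev):
    return math.exp(-((x - mean) ** 2) / (2 * std_dev ** 2)) / (
        std_dev * math.sqrt(2 * math.pi)
    )
```

If you already know the formula, the first version reads like the paper; if you don't, the second tells you what each argument means. That trade-off is basically this whole thread.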



In a vacuum real math symbols look better but they're virtually never used in practice and they mess up people's pattern recognition.

It's uncanny valley but for code.


> they mess up people's pattern recognition

you have described perfectly the problem that I have when reading code written by programmers who are not mathematicians... my mind wants to read variables with multiple letters as products of each letter.

I find multi-letter variable names extremely old fashioned, like when math was written out explicitly in Latin before the advent of algebraic notation.

EDIT: an illustrative tweet of my concern: https://mobile.twitter.com/fermatslibrary/status/14109451739...


> I find multi-letter variable names extremely old fashioned

Sometimes I read things here on Hacker News that throw me so hard I leave the site for a month or two. Congratulations, this time it's your fault. Goodbye.


One-letter variables all have approximately the same physical size; this makes the "tokenization" step of reading a formula faster.

They are also less descriptive, and this makes the semantic interpretation more difficult.

Usually mathematicians read entire papers, or large excerpts, at a time. In this situation the semantic association symbols<->concepts is often made at the beginning of each section and reused for several formulas, making mathematical notation more effective.

Programmers instead often look at code in smaller fragments. They don't have the same level of contextual information readily available, and so they often prefer to embed this information in variable names.

Add to that: programs are written mostly in ASCII, on a keyboard, with autocomplete, in a single typeface, while math is written by hand, on paper or a blackboard, with far more graphic possibilities.


What can I say... glad to have helped you to boost your productivity ;)


>they mess up people's pattern recognition.

Quite the opposite. Verbose, "readable" code destroys pattern recognition.



