I love Zed editor's git UI for this. If you click on the left gutter of a hunk, it opens a little interface in the top right corner that lets you either Stage or Restore that hunk.

Exactly. When was the last time you heard HTML called "HyperText Markup Language"? When was the last time you heard CSS called "Cascading Style Sheets"? We should stop saying "JavaScript" and fully switch to JS.

And if each cell were a cubic micrometer (a side length 200-300 times smaller than a pixel on a typical screen and 50-100 times thinner than a human hair), it'd still stretch 3.7 kilometers, about the length of a commercial airport runway.
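As a quick sanity check in code (a back-of-the-envelope sketch; the cell count below is just back-solved from the comment's own 3.7 km figure, not an independent fact):

    // TypeScript: back-solve the implied cell count from the numbers above.
    const cellSideMeters = 1e-6;     // one cubic micrometer per cell
    const lineLengthMeters = 3.7e3;  // the 3.7 km figure above
    const impliedCells = lineLengthMeters / cellSideMeters;
    console.log(impliedCells.toExponential(1)); // "3.7e+9" cells end to end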

The average ChatGPT user isn't utilizing agentic workflows.

And even for the people who do, just because an LLM isn't absolutely state of the art doesn't mean it isn't useful.


I noticed that the boing sound gets deeper with smaller-magnitude boings. Is the boing audio generated procedurally in response to the physics of the boing, or is it just playing a premade boing sound effect that's dynamically pitch-shifted?
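For the pitch-shifted variant, a minimal Web Audio sketch (purely illustrative; the sample path and magnitude scaling are assumptions, not the site's actual code) might look like:

    // TypeScript: play a premade boing sample, pitch-shifted by impact magnitude.
    const ctx = new AudioContext();

    async function loadBoing(url: string): Promise<AudioBuffer> {
      const res = await fetch(url);
      return ctx.decodeAudioData(await res.arrayBuffer());
    }

    function playBoing(sample: AudioBuffer, magnitude: number): void {
      const src = ctx.createBufferSource();
      src.buffer = sample;
      // Smaller boings play back slower and therefore sound deeper.
      src.playbackRate.value = 0.5 + Math.min(magnitude, 1); // 0.5x-1.5x speed
      src.connect(ctx.destination);
      src.start();
    }

    // Usage (hypothetical sample path):
    // loadBoing("/sounds/boing.mp3").then((s) => playBoing(s, 0.3));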


The original sound is pretty low; it appears to be sped up. Check the network panel.


Building datacenters in the arctic also has the added benefit that sysadmins would have to take polar bear safety lessons, which would be pretty funny.


> I wonder if it is enforceable.

I can't imagine that it wouldn't be. If a company has explicit written permission from the copyright holder to use the work, then they can use it.

Also, it wouldn't be a special license. If you wanted to do a "For my friends everything, for my enemies the law" thing, you'd just set it as all rights reserved and add a special note encouraging people to ask for permission to use it.

Plus, copyright enforcement typically goes in the other direction: it's not about who you can sue, it's about who you can't. Licenses are just a way of specifying who you cannot sue. If you want everybody to use your project but don't want to bother with a license, you can leave it all rights reserved (the legal default) and simply not sue anybody. You could still sue them if you wanted to (which is exactly why nobody would ever use your code: the risk that you change your mind and sue), but nobody is forcing you to.


> My recommendation would be to encourage the tutor to ask the student how they use the LLM and to school them in effective use strategies

It's obviously not quite the same as programming, but my English professor assigned an essay a few weeks ago where we had to ask ChatGPT a question and then analyze its response, check its sources, and try to spot hallucinations. It was worth about 5% of our overall grade. I thought that it was a fascinating exercise in teaching responsible LLM use.


You can still argue that LLMs won't replace human programmers without downplaying their capabilities. Modern SOTA LLMs can often produce genuinely impressive code. Full stop. I don't personally believe that LLMs are good enough to replace human developers, but claiming that LLMs are only capable of writing bad code is ridiculous and easily falsifiable.


> most students will learn a lot less than say 5 years ago while the top 5% or so will learn a lot more

If we assume that AI will automate many/most programming jobs (which is highly debatable and I don't believe is true, but just for the sake of argument), isn't this a good outcome? If most parts of programming are automatable and only the really tricky parts need human programmers, wouldn't it be convenient if there are fewer human programmers but the ones that do exist are really skilled?


Well, as a college student planning to start a CS program, I can tell you that it actually sounds fine to me.

And I think that teachers can adapt. A few weeks ago, my English professor assigned us an essay where we had to ask ChatGPT a question and analyze its response and check its sources. I could imagine something similar in a programming course. "Ask ChatGPT to write code to this spec, then iterate on its output and fix its errors" would teach students some of the skills to use LLMs for coding.


This is probably useful and better than nothing, but the problem is that by the time you graduate, it's unlikely that reading LLM output will still be a useful skill.


Tons of devs (CS-grad devs, that is) have made their careers writing basic CRUD apps, iOS apps, or Python stuff that probably doesn't scratch the surface of all the CS coursework they did in their degree. It's just like everyone cramming for leetcode interviews but never using that stuff on the job. Being familiar with LLMs today will give you an advantage when they change tomorrow; you can adapt with the technology after college is over. Granted, there will likely be fewer devs needed, but the demand for highly skilled ones could move upward as demand for this new AI tech increases.


Fair point. Perhaps I'm just too pessimistic or narrow-minded, but I don't believe that LLMs will progress to that level of capability any time soon. If you think that they will, your view makes a great deal of sense. Agree to disagree.


Right, but if AI gets to the point where it can replace developers (which includes a lot of fuzzy requirement interpretation, etc.), then it will replace most other jobs as well, and it wouldn't have helped to become a lawyer or doctor.


> It's not good if you're a freshman currently starting a CS program

CS is the new MBA. A thoughtless path to a safe, secure job.

Cruelly, but necessarily, a society has to destroy those pathways. Otherwise, it becomes sclerotic.


It's not cruel, it's stupid. Why would we organize our society in such a way that people are drawn toward such paths in the first place, where your comfort and security are your first concerns and taking risks, doing something new, is not even on your mind?


> where your comfort and security are your first concerns and taking risks, doing something new, is not even on your mind?

Because individually, lots of people seek low-risk, high-return occupations. Systemically, that doesn't exist in the long run.

Societies do better when they take risks. Encouraging the population to embrace that risk-taking has been a running theme in successful societies, from the Romans and Chinese dynasties through American commerce and jugaad.


How about switching to English? There is a high demand for people who are very good at communication and writing nowadays.


The only task required of a dev is to think.

AI does not think.

Ergo, AI will not take "programming jobs".

It may, however, expose some frauds: people who fake the job so hard but are just clowns, produce nothing, are basically worthless, and are just there to grab money for as long as the fraud keeps working.

