We have a nearby state university with a so-so computer science program. Two or three graduates go to FAANG every year, a handful go to the local Fortune 1000 companies, and we (a <500-employee company) get to pick through the rest.
They all list C, C++, .NET, HTML, CSS, and Java on their resumes, but they haven't done anything except a simple group project in any of them, and half the time they didn't even write any code for the project. Which means they end up in a role in documentation, testing, etc.
They come with a chip on their shoulder, flaunt their degree, and demand a salary close to what they think they should be getting, according to their advisers or statistics or whatever.
To make it worse, they don't have any personal projects to show. I tell every single person that I interview to create something, even if it's a failure. At least you can come back and discuss your experience trying to create something, and you'll learn more about development working on that project than you did the whole semester you learned Java.
It's a failure of the education system. They want to teach the academic side but ignore the skill side. They're taught the language but none of the tooling. When I was in school, which granted was a while ago now, they didn't teach you Subversion, make, how to use an IDE, or how to install and use a compiler, linker, etc. I remember having to log on to a VAX machine so we could do some Fortran assignment. It was a cruel joke. There was little to no instruction on how to use the damn thing. You just had to make your way through it. I don't think much has changed. The students who are curious seek out and learn to use IDEs, Git, Maven, npm, Jenkins, whatever, but there are others who do only what's asked of them, and they're not asked to learn these things. As a result they're unable to even start on anything other than the trivial assignments they're given in class.
I think they seem entitled because they did what they were told and possibly received good grades as a result, and their attitude may be them saying, "I bought an education. I was graded and found to be exceptional. If I had needed it, it would have been a part of my education." Right or wrong, you should feel sorry for these kids. They paid a fortune and got ripped off. They just haven't realized it yet.
But the purpose of schools has always been to "teach the academic side." The whole system until recently has been that schools give you the foundational knowledge, and you learn the skills at work -- in olden days, corporations would actually train you on the job.
But then:
1) Employers decided that it's better to lay off people you don't need anymore, which meant that
2) Employees realized it's more lucrative to jump from employer to employer than maintain loyalty for a company, which meant that
3) Employers realized it's not productive to spend 6 months training a junior who's going to leave in 8.
Trade schools and community colleges focus more on skills than theory, but there's an image problem among well-off children... and the social environment is far different (boring by most 19-year-olds' standards).
> It's a failure of the education system. They want to teach the academic side but ignore the skill side.
Common misconception, but no, this is not a failure of the education system. This is exactly what universities are for. The thinking has always been that if you teach the fundamentals, the practical applications are a foregone conclusion. Someone who is well versed in the advanced theories of computer science and electrical engineering: Boolean logic, circuits, abstract computation machines and models of computing, data structures, algorithms, language design, should be able to pick up whatever FooBar FOTM programming language or paradigm or framework comes out.
If you think the situation today is difficult with all the languages and frameworks we have, just know that it was almost exactly the same back then, except you didn't have the internet, chat rooms, and robust documentation and Q&A sites to help you answer your questions. Imagine the, excuse my frank language here, whining that would be happening right now on this forum and many others if you were just slapped with an IBM manual and told to bootstrap things yourself. I can only imagine!
Trade schools are supposed to be the institutions that teach you practical skills "to get a job". Universities are for expanding your mind and learning. I don't know why we mixed the two and then act surprised when a multi-century institution fails at modern workforce demands.
To anyone reading this: Do NOT go to a university to get a job. If your goal is just to get a job in this field, you can do that in far easier and cheaper ways (start reading some programming docs and get busy building things). You go to a university to learn. The getting-a-job part is a natural consequence of the learning you have done and are now able to do.
I don't see the people in the chemistry department saying, "We don't teach students to stir shit in test tubes; we teach fundamentals!" I get it, they're not there to teach job skills, but that's also not an excuse to completely ignore them. If you want to do gas chromatography you're going to have to learn to use the machine. You don't just say, "Hey, get some trade school person to do it," and it isn't just a "foregone conclusion" from fundamentals.
This is not an apt counter-analogy for many reasons, and it can actually be made to argue against the point you are trying to make. I'll give a few examples here:
- In a computer science education, you do indeed still use a computer.
- There are still proprietary techniques, substrates, solutions, and materials that industry chemists use that you almost certainly don't have access to in your standard university classroom. These are akin to the variety of frameworks, tools, and libraries that exist in the programming world, many of which are open and documented, many of which are proprietary and closed source. Hopefully, however, your university education has taught you how to learn, so that you can pick these things up.
- Chemistry is not about test tubes and beakers; that's just what it looks like today. Likewise, programming by typing characters on a screen is just what it looks like today. To introduce another analogy: just as geometry is no longer about rulers and measuring pyramids, computer science is about how to formalize knowledge, and it just happens to look like semicolons, braces, and 0s and 1s today.
This x1000. Universities are not trade schools, but over the last couple of decades they have definitely been used as such by companies because, for certain fields, that was "all there was". Couple that with the "everyone must go to college" mantra of the past decade or so and you get the prevailing thought you have described.
Yeah I see people complain all the time that graduates with a "Computer Science" degree don't know "tools of the trade" like source control, debugging, etc.
Computer Science isn't about learning a trade, you're learning a science.
Yet another argument for better and more trade schools with multi-year programs (like an Associate's degree), rather than universities.
That's often been my experience as well. Some students choose the major because it is interesting to them and they're smart and motivated, others choose it because they think there's a six figure job on the other side of the door. Smart and motivated people will be good regardless of curriculum content, people doing just enough to get by will be bad regardless of curriculum content.
As a young lad working at an IT consulting company, I once got sent out to "fix the computer" at some cruise line. I went out with no idea what I'd encounter, and found myself faced with an IBM system I'd never even seen before. Fortunately, the service manual was with it, and with its aid I was able to effect a circuit-level repair and got them back up and running. ::shrugs::
The material you can dig up on the internet is worlds better than poor documentation. But a comparison with good documentation isn't always so clear.
I think it's a fundamental divide between people who see degrees as a way to expand your knowledge and people who see degrees as a fast track to getting a better job. Somewhere in the middle is the ultimate spot for an educational institution to be. For example, MIT has somewhat recently started a class on "hacker tools" (https://missing.csail.mit.edu/) with stuff like shell scripting, Vim, Git, debuggers, and general computer knowledge.
The trouble is that the tooling du jour goes out of date quickly. People who know how to use Eclipse, Java, and Subversion because that's what they used in school might not have the skills to pick up the tooling used at another company (say VS Code, JavaScript, and Git). The foundation of CS knowledge changes much more slowly.
School should encourage the curiosity to learn on one's own and provide resources to help. It should also provide a framework to be taught things that will last beyond the end of the degree. They need to do a better job of instilling curiosity, but they shouldn't go all the way to the other side and become vocational training.
> They're taught the language but none of the tooling
I want to expand on this a bit. I think what students are being taught is the syntax of languages but none of the "how do you continuously design a project as requirements change?", which is the really difficult part of software engineering. So they might know how to write the code, but they don't know how to come up with what to actually write.
It is a co-failing of education and employers. Employers should be hiring software developers/engineers instead of computer scientists. Educational institutions should have more and stronger software engineering programs, and software developer job-training programs.
Instead we get the odd equivalent of companies hiring math and physics majors and being surprised they aren't electrical engineers.
I've had students come in from other colleges (usually lower-ranked for-profit schools) that were great. They actually built useful projects and wrote non-trivial code.
Where I work we regularly get a bunch of kids from the neighborhood college who have projects but are looking for mentors, or are looking for both projects and mentors. It's not at all complicated to set up.
Don't wait around for stuff to happen. Call the local university CS department head and have a chat. It's much more effective than talking to kids or expecting them to do things by themselves.
As a professor at a state university, I try to tell my students all of this.
They’ll get hired at non-tech companies no problem with just a degree. But if they want the high salary tech company positions, they have to spend considerable time practicing their skills outside of class and building a portfolio.
Our curriculum is great for exposing them to a variety of CS topics, but it won’t make them stellar engineers by itself.
There needs to be a "portfolio" movement in software development degree programs. That, and a clearer distinction between a "cs" degree and "software engineering" degree.
In, say, an architecture degree, students and professors understand that you will apply to positions with a portfolio. I think students are able to use work from the degree program in their portfolio. Their professors have portfolios from when they first applied for positions out of school, so they can help or at least provide examples.
Currently, we just have a vague notion that you should have code or contributions on GitHub, and that is for some reason not part of most degrees. Or too many students get a CS degree and try to use it to apply for software development positions. In that case, yeah, you're going to need to self-study if you want to apply for positions you didn't actually get a degree in.
If you put serious effort into your class projects outside of what is required (https://thume.ca/ray-tracer-site/), you can certainly put that in your portfolio (that project is absolutely crazy). That said, I agree with you that more students should take "software engineering" degrees if they have trouble building things outside of class, plus that universities should make it clearer that a CS degree != a job offer from Google.
It's funny, because I have made sure to emphasize my personal projects on GitHub on my resume and in interviews, and I'm pretty sure they've never looked at anything.
When I was interviewing after I graduated college last December, all we ever talked about was my projects. We would skip over school (every other candidate they interviewed had a comp-sci degree, not a differentiator) very quickly.
I even had an interview where one of my interviewers pulled up my GitHub during the interview and asked me questions about design and tooling decisions on certain projects.
From each company where I received an interview, I was told it was because they were interested in my side projects.
Of course, it may depend on what your side projects are! Mine were of interest for the cloud-based roles I was looking for: RESTful APIs, Kubernetes, AWS, Linux sysadmin stuff, etc.
Then you applied at the wrong companies. For people with a real GitHub repo with a project or contributions, I always skip all programming tests and jump on the real code. It's a real pleasure to have an excited candidate explaining reasons for architecture or choice of tools etc. compared to fizz buzz. I can only encourage everyone to bring a small side project, it will make your interviews easier.
You can say this as much as you want, but it's an industry practice. Almost every company I interviewed with in the Bay Area never looked at any source code for projects I did. (Not even a project that had over 50,000 users; they just installed it and said, "Oh, yeah, that's sweet. Thanks! Now finish writing this sudoku checker and then do a spiral matrix traversal.")
They're using a standardized interview format and they don't care about anything else. I've interviewed with others where I did have projects up, they viewed the source code, and it went over well with them - but that was extremely rare (I can only recall 1 at the moment out of the 100+ companies I've interviewed at).
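For readers outside the interview loop: the "spiral matrix traversal" mentioned above is a standard whiteboard exercise, not anything specific to those companies. A minimal Python sketch of one common boundary-shrinking approach might look like this:

```python
def spiral_order(matrix):
    """Return the elements of a 2-D matrix in clockwise spiral order."""
    result = []
    if not matrix or not matrix[0]:
        return result
    top, bottom = 0, len(matrix) - 1
    left, right = 0, len(matrix[0]) - 1
    while top <= bottom and left <= right:
        # left-to-right along the top row
        for c in range(left, right + 1):
            result.append(matrix[top][c])
        top += 1
        # top-to-bottom along the right column
        for r in range(top, bottom + 1):
            result.append(matrix[r][right])
        right -= 1
        if top <= bottom:
            # right-to-left along the bottom row
            for c in range(right, left - 1, -1):
                result.append(matrix[bottom][c])
            bottom -= 1
        if left <= right:
            # bottom-to-top along the left column
            for r in range(bottom, top - 1, -1):
                result.append(matrix[r][left])
            left += 1
    return result

print(spiral_order([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))
# → [1, 2, 3, 6, 9, 8, 7, 4, 5]
```

The irony, of course, is that a candidate with a 50,000-user project has already demonstrated far more than this exercise can measure.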
How big is your sample size? What phase of the interview are you thinking of? Context changes the approach.
If I'm doing phone screens (i.e. top of the funnel, looking at large numbers of people) I glance over GitHub projects and links quickly, about a minute per resume to look over everything (this is why formatting, styling, and ease of reading matter on resumes). If I'm doing in-person interviews (i.e. bottom of the funnel, talking to small numbers of people) I look at almost every personal project, website, and link the candidate puts on their resume, with the intent of asking the candidate about them.
I've definitely had people look at my personal projects. As an example, I applied to a company doing computer vision and they sniffed out my computer vision projects. They definitely came up in the interview.
It's possible that they were missed because so many people list their GitHub where it contains absolutely zero work written by them (or even nothing at all).
As a 3rd year CS student, I see this a lot. I personally love CS and programming is a hobby for me, so naturally I have projects on my GitHub. I always list my three personal projects I'm most proud of on my resume. I only list a language if I have proof of competence on my GitHub or in previous employment.
Some of my peers will list everything they've written a "hello world" in, and from the employers I've spoken to, a good chunk of them read a long list of languages without any proof as the candidate overstating their competence. It doesn't mean there's no interview, but it hurts the chances of getting one. The interview ends up being the point where they can test how truthful the list is.
The "simple group projects" we made during my 5-year degree were the result of 3-5 people putting in ~200 hours each (over 4 months). They were of higher quality, better designed, tested and documented than most I have encountered in the industry now 5 years later.
I would not say that I am much better technically now, than I was 5 years ago. Much of what I have learnt in the industry is about coping with working in an office, having pointless meetings, doing design by committee, etc.
To my mind there are orthogonal issues here. The much bigger one, not the lack of personal projects, is the chip on their shoulder. I've observed a certain amount of hubris in the culture of CS departments, at least in the university/college level programs I've been exposed to, that is very off-putting. An ideal candidate would approach an area they are unfamiliar with, with curiosity, zeal, and humility. If I can find people like that, who cares what they've built or not? I'm confident they will come up to speed and become a top-level contributor.
It's likely that people who approach tech with the desirable attitude above will have built something personal they can show. But having built a personal project is neither necessary nor sufficient to demonstrate curiosity and zeal for craft.
EDIT: To be clear, I think people willing to get the degree and do the work should be paid according to market value. You come across a little bitter that the top performers went off chasing jobs at FAANGs... maybe if you paid market rate, more of them would stick around.
I agree that some, or even most, of the important skills in software development are learned through experience. But the solution to that is a lower salary and on-the-job training.
When anyone writes "C, C++, .NET, HTML, CSS and Java" on their resume, that obviously has to be considered alongside their experience. If you don't want to hire and train inexperienced programmers, I guess that's fine. But I find this idea that there is a minimum experience (and accompanying salary) threshold for the entire industry that can only be reached through personal projects of a certain size frustrating.
I agree. Honestly this thread makes me super-glad I graduated undergrad when I did: Decades ago, when you could actually get a junior position at a company without having to be Ken Thompson or Linus Torvalds. I graduated from a basic state school CSE program. We didn't have "Github profiles" and there was no expectation of having a portfolio for a junior position. You also weren't expected to crank out JIRA tickets starting day one. You could be hired on aptitude and potential only, and learn what you needed on the job.
The university education and training you get in undergrad are probably miles better today than back when I went to school. This does not seem like an education-system failure; it seems like inflated expectations from employers for entry-level jobs. If employers are looking for mid-to-senior-level employees who can be productive from day one, that's fine, but they should probably not rely heavily on university recruiting.
That's something I've benefited from. I taught myself how to program when I was 14-15 by reading books and playing with basic graphics processing stuff to make games. None of my games were ever any good or impressive, but it led to me learning how to do web and network programming for fun, and led to me learning how to build stuff.
This has been really good for me, since I don't have a degree, but I still can typically get through a whiteboard interview in most jobs that I apply for (though occasionally I'll get a problem that trips me up). School is certainly useful, but it should be used in addition to personal projects, not a substitute for them.
I am a developer who started as a trainee/junior with no formal education in the field. Last year I started a degree course alongside my job, thinking I would learn more and become a better developer. This has not been the case. I get a small amount of actual programming done in a variety of languages, but a lot of the assessments are based on writing a report about how or what I developed. I can see how a software engineering degree can be useless compared to actual experience coding.
> They come with a chip on their shoulder, flaunt their degree, and demand a salary close to what they think they should be getting, according to their advisers or statistics or whatever.
Commoners!
The arrogant peasants you're hiring expect to be paid? They're relying on data and expert advice to decide how much money to seek?!?
You would be wise to steer well clear of such fools.
You're right - it applies fully to me, and I'm aware of how hard that rule can be to follow, so correction is welcome.
As I read the GP comment, it accuses the OP not just of a straw man, but implies being an asshole (i.e. thinking that others are peasants who shouldn't expect to be paid). That's clearly not a fair reading of bluedino's post. Each move in such a direction is a step towards degraded discussion; hence the moderation reply.
Obviously these are matters of interpretation, though. People often read the same things differently.
> demand a salary close to what they think they should be getting, according to their advisers or statistics or whatever.
The OP's comment implies that the job applicants are entitled for making salary demands based on statistics. There is nothing entitled about asking for market rates.
The comment responding to it points out the problem with this thinking in a lighthearted way. To me it was the perfect response. I understand you may disagree, but it was in no way unreasonable.
My comment on the other hand could have come across better. I implied you were not following the rules you set which I don't think is fair. I may disagree with your decision, but I could have done it in a more respectful way.
I don't believe I was straw-manning the guy (never my intent), and I did very much believe his comment made him come across as - to use your word - an asshole. Rereading the portion I quoted, I find his sentiment to be just completely horrible. Zooming out to the rest of his comments here for context, the whole thing is just a sad reflection of what faces the graduate of a typical computer science program when entering the profession and interviewing with a sadly typical... well... you know.
What entry into the profession should be like for a new CS graduate, and how we fall short compared to the other professions, feels like a topic for a much longer comment. Although perhaps it's a little bit like the book "The No Asshole Rule," where one might read the title and kind of get the point, and just skip it. Or possibly one might read the title and feel vaguely threatened, and just skip it.
OTOH, I really do appreciate you pointing out that I might not have been elevating the discussion with attempted humor, and I appreciate your work in general.
You'd do better to post a comment giving your own perspective rather than flaming someone else for theirs, which you may not have evaluated accurately. (I think you misread the GP, and speaking generally, we're more likely to misread other people on topics we feel strongly about.) Since it sounds like you have experience with the topic, sharing your experience and stating the conclusions it's led you to would be a substantive contribution.
In addition, sharing experience is at lower risk of misunderstanding and less likely to descend into flamewar. There's no contradiction between experiences. A hiring manager in (let's say) one state has their experience, and a new college grad in (let's say) another state has theirs. Both are real because they happened. If we report from that level, we can't contradict each other.
It's when we turn our experience into abstractions (e.g. the abstraction "new CS grads") that trouble arises. Abstraction A appears to contradict abstraction B. Both are lossy conceptualizations of a more complicated reality, and it's the lossiness that makes it seem like we're at odds with each other.
> A hiring manager in (let's say) one state has their experience, and a new college grad in (let's say) another state has theirs.
That's interesting, and a fair point. The differences I'm admittedly fixated on are the differences between software development as a profession versus engineering, medicine, or law, for example - or even the creative arts or science, not just the licensed professions! And I really do think it's a bit much to unpack adequately in the comments here, though I have seen a few good ones that touch on some things I think are important.
> the abstraction "new CS grads"
No abstraction intended, you can read that as "members of the set of students who recently graduated from CS programs in the United States and will now look for jobs or continue further in academia" if you like.
I would encourage them, if any were nutty enough to ask me for advice, to be enthusiastic about their future contribution to what is (well... should be) an amazing field, to be proud of their education, and to negotiate as well as possible with their first employer out of college, since that will help set the trajectory of their future career.
And to avoid people who misunderstand them by saying things like:
> They all list C, C++, .NET, HTML, CSS, and Java on their resumes but haven't done anything except a simple group project in any of them, half the time not even writing any code for the project. Which means a role in documentation, testing, etc.
> They come with a chip on their shoulder, flaunt their degree, and demand a salary close to what they think they should be getting, according to their advisers or statistics or whatever.
> To make it worse, they don't have any personal projects to show. I tell every single person that I interview to create something, even if it's a failure. At least you can come back and discuss your experience trying to create something, and you'll learn more about development working on that project than you did the whole semester you learned Java.