Succeed personally: no. Some dev jobs obviously require a lot of foundational knowledge. I wouldn't want to have a colleague who didn't know linear algebra, simply because that's important in my field. That's not important for making Wordpress sites.
I think there are things "every good developer should know" such as knowledge about how cpus work, how OS'es work, how some important algorithms work, some basic set theory and complexity etc. All of this is doable without a degree, but I'd probably be more willing to bet that someone with a degree knows it, than someone without. Also of course - there are jobs that are doable without knowing foundations, but I'd call those "code monkey jobs" and whether such a job constitutes success I guess depends on who you ask. And what it pays.
Most importantly, when I recruit I don't see the diploma as evidence of what they know but as evidence that they could learn, quickly. If the diploma is from a prestigious school but says they failed math 101 four times over, that would make me doubt their capacity. Being able to pick something up is the capability I'm looking for. A curious mind, with a taste for the abstract. A diploma is just one clue.
One of the much-downplayed things that degrees are more likely to indicate than personal projects is foundational knowledge. When you work exclusively on Wordpress sites, this is irrelevant. When you work on distributed systems or cryptography, then foundational knowledge gets pretty important.
It is completely possible for any person to learn foundational knowledge outside of a formal instructional setting. I personally know excellent developers who have done so. It's unquestionably possible. I firmly believe that given sufficient time and determination and help, it's possible for any person to learn any subject to any level.
It is, however, possibly true that not all people learn discrete mathematics, relational algebra, and computer architecture in an ideal manner when entirely self-directed. Perhaps not even all developers do.
This is what struck me about my formal education. While I also believe you can learn anything with enough material and dedication on your own, having the structure to ensure that you cover a set of subjects that constitute "foundational knowledge" can be invaluable.
With all the resources available these days, including MOOCs, textbooks, syllabi, and recorded lectures from the best schools, there's a middle ground between "degree" and "entirely self-directed".
With everything that's available in documentation, on YouTube, in forums, and in boot camps and other trainings, doesn't it seem like there are much more efficient ways to get the foundational knowledge that don't require four years of expensive college courses?
Step one: Find that material organized into a complete lesson plan. Without curation, a library is just a pile of books.
Step two: Package that with a bunch of people (both experienced and inexperienced) who can help with coursework and answer questions when they come up, teams of people to work with, somewhere to live with dozens of others doing the same work, hundreds more learning unrelated things (but who might be useful to you in the future), and you've got something close to a replacement for college.
All the material exists in one form or another, but I've yet to see anyone put together a credible "DIY CS degree" online resource.
Some day most (certainly not all) aspects of university education will be supplanted by high-quality universally accessible materials. But it's not here yet.
It does seem inconceivable that getting copiously documented material into people's heads would be something only accomplishable through an outmoded and expensive educational process. Surely we've come up with something better by now, with all the effort being poured into it!
Yet, perhaps there's something about that process. After all, there are aspects to a formal collegiate experience that are not neatly captured by boot camps, fora, YouTube, and documentation. Perhaps some of those aspects, like an instructional environment that values theory, are of nontrivial significance.
Obviously, it's still possible that something better can be done. This may not be the same as a better option having been developed and being on offer, though.
> With everything that's available in documentation, on YouTube, in forums, and in boot camps and other trainings, doesn't it seem like there are much more efficient ways to get the foundational knowledge that don't require four years of expensive college courses?
Sure, but the risk of getting led onto an inefficient, time wasting path that doesn't give you a good grounding in the foundational knowledge is also high.
The skill to evaluate approaches to learning the information and the set of information you need to learn is not something you are likely to have without having studied the information, and there are lots of people with pet theories or financial interest in your actions trying to promote different approaches.
Maybe early in your career you're correct; however, after 5+ years of professional experience nobody cares which school you went to, or whether you even have a degree.
The exceptions are companies like Facebook and Google, whose hiring practices seem to really be focused on hiring people straight out of college as opposed to established programmers.
Perhaps it's too broad a generalization - I haven't worked at Google - but all the companies who ape the GAFA hiring policies are indeed only interested in developers straight out of college. And I have a hard time blaming them; college students are fairly inexpensive to hire, and there's a lot of them out there looking.
I knew linear algebra in high school. I taught myself matrix math because I wanted to make games, and because the most challenging math my lackluster rural public high school offered was basic calculus.
My top university's CS program had a linear algebra class that loaded me up with theory like Hilbert spaces, wavelet transforms, spectral theory, abstract algebra, and of course, principal component analysis (the latter, along with SVD, is becoming more valuable these days with data science-y type jobs).
It was more concerned with proofs and advanced concepts than applications, which was fine for me, because I had already learned the applications in high school out of boredom and self-motivation. The most valuable benefit of all this extra theory was learning how to solve problems, because doing proofs isn't some algorithm, like long division or evaluating a derivative, that you can just crunch as a cog in the wheel.
The question is whether all of that extra rigmarole is worth it in my day-to-day development work. The answer to that is "no." Occasionally, I'll borrow some code that does thing X that is tangentially related (like convolution for computer vision and deep learning), but the theory behind it rarely comes in handy because I'm not doing academia or R&D work.
Of course, your usual full stack developer job won't even come close to touching any of this, but if you're doing moonshot type work like self-driving cars, you're going to need all of this theory to really attack the types of unsolved problems you're facing. Those types of jobs are rare, sometimes highly-compensated, and inaccessible to your Hack Reactor grad.
Linear algebra is used in even the simplest of games, and almost anything related to cad, computer vision, engineering, ...
If the problem is related to web and databases and crap then there is rarely any linear algebra (heck there are rarely even real numbers) but we didn't go to school to work with databases and web forms did we ;)
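To make the games point concrete, here's a minimal sketch (illustrative names, using numpy) of the bread-and-butter linear algebra even a simple 2D game relies on: rotating a point with a rotation matrix.

```python
import numpy as np

def rotate(point, theta):
    """Rotate a 2D point about the origin by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s],
                  [s,  c]])   # standard 2D rotation matrix
    return R @ point

# Rotating (1, 0) by 90 degrees lands on (0, 1).
p = rotate(np.array([1.0, 0.0]), np.pi / 2)
print(np.round(p, 6))  # [0. 1.]
```

The same 2x2 (or 4x4 homogeneous) matrices drive sprite transforms, camera movement, and physics in most engines.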
Obviously Hilbert spaces might not be useful everywhere, but anecdotally I stumbled across one interesting problem in my first year out of university: a drawing program needed to scale a drawn ellipse in one direction, which wasn't necessarily along one of its axes. Q: is the result an ellipse? What are its axes?
A: (I had to ask a friend with better linear algebra skills, which still bugs me): the axes will be given by the eigenvectors of the matrix involved. The math was fairly hairy compared to most other things in a simple drawing program, but it was fun and really rewarding when the math worked.
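For the curious, the friend's answer can be checked numerically. This is a sketch with my own setup and names, not the original program's: represent the ellipse as the points p with p^T Q p = 1 for a symmetric positive-definite Q, apply a scale of factor s along an arbitrary unit direction u via S = I + (s-1)uu^T, and the image satisfies p^T (S^-1 Q S^-1) p = 1. The new matrix is still symmetric positive definite, so the image is an ellipse, and its axes are the eigenvectors of that matrix:

```python
import numpy as np

def scale_along(Q, u, s):
    """Quadratic form of the image of {p : p^T Q p = 1} under a
    scale of factor s along the unit direction u."""
    S = np.eye(2) + (s - 1.0) * np.outer(u, u)  # symmetric scaling matrix
    Sinv = np.linalg.inv(S)
    return Sinv @ Q @ Sinv                      # p'^T Q' p' = 1 for p' = S p

# An ellipse with semi-axes 2 and 1, rotated 30 degrees.
t = np.radians(30)
R = np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
Q = R @ np.diag([1 / 2**2, 1 / 1**2]) @ R.T

# Scale by 1.5 along a direction that is NOT one of the ellipse's axes.
u = np.array([1.0, 1.0]) / np.sqrt(2)
Q2 = scale_along(Q, u, 1.5)

vals, vecs = np.linalg.eigh(Q2)
# Positive eigenvalues => the image is still an ellipse; its axes are the
# eigenvectors of Q2, with semi-axis lengths 1/sqrt(eigenvalue).
print(np.all(vals > 0))    # True
print(1 / np.sqrt(vals))   # semi-axis lengths of the scaled ellipse
```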
> ... I don't see the diploma as evidence of what they know but as evidence that they could learn, quickly.
It's better evidence that they can follow steps to complete something, but it is not a good indicator of the one thing I value in SEs: being self-directed in tackling a problem.
Also, this varies a ton by sub-industry; over in games we don't care about degrees at all, just demonstrated work. Once you move on the sliding scale over to the larger corps, it starts becoming a checkbox for HR that won't get you through the filter (most of the time).
> I think there are things "every good developer should know" such as knowledge about how cpus work, how OS'es work
Considering the complexity of modern CPUs and OSes, does college go into enough depth to teach them well, or does it just let students think they know how they work?
Well, what you need to know as a developer of either is pretty limited. Understanding the basic existence of the following will get you a long way: cache hierarchy, branch prediction, instruction reordering, syscalls, interrupts, virtual memory and page mapping, the layers of the networking stack, etc. You might need to go in depth on a particular CPU/OS later in your working life, but most likely not; just know enough to have a semi-realistic picture of the layers below your own code. Those are all things that can be self-taught to some level, especially if you have the discipline to write some experimental code to test them on your system. For most developers, however, those are things that are easier to learn in a classroom than on your own time.
(Some of the theoretical parts are even more suitable for college and equally useful: complexity/computability theory, lambda calculus, graph theory, automata theory, ML, etc.)
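Since automata theory made that list: one reason it's classic classroom material is that the core idea fits in a dozen lines. A toy sketch (all names mine) of a DFA that accepts binary strings with an even number of 1s:

```python
def make_dfa(start, accept, delta):
    """Build a recognizer from a DFA's start state, accepting states,
    and transition table delta[(state, symbol)] -> state."""
    def accepts(string):
        state = start
        for symbol in string:
            state = delta[(state, symbol)]
        return state in accept
    return accepts

# DFA over {0, 1} accepting strings with an even number of 1s.
even_ones = make_dfa(
    start="even",
    accept={"even"},
    delta={
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    },
)

print(even_ones("1001"), even_ones("1011"))  # True False
```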
My classes taught me the existence of a lot of low-level constructs and showed us example implementations. To me, it's more like some handholds to use when I need to go spelunking into a particular implementation. Do I always know how something works? No. But I've got starting places to investigate when reality doesn't match my mental model.
"A curious mind, with a taste for the abstract. A diploma is just one clue."
This is me and my BA in math. I would like employers and coworkers and people to understand this point. They often don't get it. Also, I've come to realize that I am actually a slow learner. I need practice and patience to deeply understand something and confidently talk about it. Where do I belong then? Personalities who say they are quick, and who are able to confidently sell things, irk me and make me question whether they are truly giving the best they can. With prior knowledge and a quick wit, you could read a Wikipedia page about something, sound awesome, and convince most people of your expertise.
I'm someone who graduated with a non-CS engineering degree (and worked professionally as an engineer) who's now in an MSCS program after trying my hand in the job market. Not for lack of trying, but my experience has definitely been an uphill struggle just to get noticed. And before anyone says "you should do x|y|z", I've done my part pretty well: a few decent-sized personal projects, constant studying, meetups, networking, GitHub, etc.
I hear a lot of anecdotal evidence (mostly from professional devs who have degrees) that they know a friend or coworker who doesn't have a degree.
The truth is that the system for recruiting and hiring is really based around colleges, students, and degree holders. Starting from the application requirements, you see that most applications require a CS degree (or equivalent). Then the technical interview is aimed at fundamental CS questions, most of which students are accustomed to.
So for someone self taught, face to face interactions are a lot better. But even going to events/meetups, you notice a distinct change in demeanor of the recruiter when you mention you don't have a CS degree. It really is a tiring battle just to prove that you're competent.
Do I believe that you 100% need a degree to get a CS job? Definitely not. But without one, your options are very limited and on top of that you'll need a lot of luck and effort just to get past the first hurdle.
IIRC, the article says that a whopping 56% of developers don't have a degree. Given that the majority of developers don't have a degree, I'm not sure I'd agree with saying " without one, your options are very limited". Maybe "somewhat limited"?
My anecdotal evidence, looking at both my own career and a lot of developers I've worked with over the past 17 years, is that having a degree, or not, isn't a big deal.
What could be true with the last generation may not necessarily hold true for the next generation.
Many are working towards abstracting schooling out of being a developer. Great.
That also means we're moving towards making being a developer a blue collar job rather than a white collar job. We're also making private businesses now the trainers of labor.
The time to be a dropout rockstar developer was in the 2000s but we're moving towards 2020 now.
The real question is for the generation of 13-year-olds over the next 5 years. They can study software development on their own, but in 10 years' time, will they be better off spending half of that on an expensive college education, or nah?
Everyone in this thread is giving answers based on their own experience surviving the past 20 years, as if what happened over the last 20 years in the tech industry is where we're at right now, or where we'll be going for the next 20.
I wrote this on hardware and software made and improved by organizations of highly educated and highly specialized people. It strains credulity to think the future is only going to be built by Palmer Luckeys and 16-year-old hacker geniuses.
I think for an out of the gate job this is true. But after you've had your first job this stops being as much of an issue. I have an Aerospace Engineering degree. I was able to get a WebForms job without much experience and a minor in CS. I will never knock that job because they took a chance on me, paid me a semi-competitive salary, and let me learn a lot.
I haven't had a hard time finding something since then. I do hate the basic algorithms questions every interviewer asks, but a month or so of Coursera and I'm ready to answer all of the "Implement a QuickSort algorithm in 20 minutes" questions again.
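For anyone facing that same interview question: the expected answer is usually just the textbook sketch, not a production sort. A minimal (non-in-place) version in Python:

```python
def quicksort(xs):
    """Minimal quicksort: not in-place, O(n log n) average case."""
    if len(xs) <= 1:
        return xs
    pivot, rest = xs[0], xs[1:]
    left  = [x for x in rest if x < pivot]   # elements below the pivot
    right = [x for x in rest if x >= pivot]  # pivot duplicates go right
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([5, 2, 8, 5, 1]))  # [1, 2, 5, 5, 8]
```

Interviewers sometimes follow up by asking for the in-place partition variant (Lomuto or Hoare), so it's worth knowing that one too.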
You either need luck or you need to be willing to take a job that isn't as reputable to learn programming as a profession, once you have that many doors will open up to you.
Just the first hurdle though. It took me a year of serious effort to land my first dev job. The second one came to me: I was contacted by a recruiter and asked to interview and ended up accepting that offer about a year after starting in my first position.
I dropped out of high school ten years ago and did nothing of note between then and teaching myself to code.
Just stick with it until you land the first one and you'll be fine.
I graduated with a degree in operations management. I've probably done over 100 personal projects to gain experience, with maybe 15 being worth talking about. I've read quite a few books, actually a lot if you count the Packt ones. When I lived in Seattle I went to the meetups and found a couple groups there I enjoyed hanging out with. I've had a really hard time getting interviews except from a couple of small teams, and I mostly feel ignored. At this point I'm taking a computer systems class and discrete structures at a CC. The systems class is helping my debugging skills in ways that I wouldn't have put myself through on my own, but the price tag for this knowledge is high, and it's beyond just money. I still learn an order of magnitude more on my own than directly from the coursework.
Do I think a degree is necessary? No. I do think you'll have a hard time getting a job without one so you might as well start your own company.
Like you, I also didn't have a CS degree (I did take foundational CS coursework in undergrad, and also had an MS in something akin to HCI). I was hired for my first software engineering job in SF by a couple not-very-technical founders--both of whom were rather taken with a game I had designed/developed--after completing a week-long contract job for them. No CS fundamentals interview process.
After that first job, it was very easy to get interviews, although I had to prep a bit for those CS questions. I'm now interviewing for my third job, and feel like I've taken on too many interviews, many at big name startups and big corps. Among 18ish cold applications, I've only been turned down without so much as a recruiter call twice, likely because I was missing some keyword on my resume or something.
Where are you located? What kind of positions are you looking for?
I'm sure things are (potentially) quite different now compared to 17 years ago, when I first landed a job building web applications. (My very first job was an internship, sort of, while pursuing a CS degree. My next job came after I had dropped out, left that internship, and was delivering hoagies for a living.) So that was "luck" and being "in the CS education world" when I got started.
Still, I think if I have a point, I started out in silly, minor jobs for little local companies that needed someone that could code more than they needed a highly polished computer scientist or engineer. From there I, more so, "learned on the job" the rest of what I needed to advance (and I continually self-taught by working on my own projects).
We are such a weird profession. What other industry is like "Hey, you can do my job. It's easy, here are lots of extremely detailed courses, documentation and guides on how"
Actually, I'm always amazed at how many Real Estate Agents suggest that others should get into it because it is so lucrative. They seem to ignore the fact that they are welcoming more competition in a finite market.
In the US, there's a bit of a pyramid to real-estate commissions, and a lot of the recruitment is done by brokers. Even at the salesperson level, the commission structure of most companies pays better when an agent sells the listing of another agent in the company, so the more agents, the better the odds of a better commission.
But coming from the architecture profession, I don't discount encouraging others as a form of self-validation of one's own career choices.
I can tear apart any engine to understand how it works.
I can rip out the bathroom in my house and install a totally new one.
With closed source, I can't just rip out component X and replace it with component Y. I can't read the binary to understand how it works (Technically I could, but I mean that with modern languages, the assembly looks massively different from the original source).
It's free labor, and companies love free labor. They claim to value candidates who produce open source, but somehow those who toil night after night on their passion projects never seem to see any job offers materializing as a result. At best they get recruiter spam (for which many here argue they should be grateful) inviting them to submit themselves to a whiteboarding session carried out by an arrogant 20-something, often fresh out of college and eager to assert their dominance.
I used to do card tricks for fun, and I made an interesting observation about them. I could show someone how to perform the sleight of hand. I could even show them the trick and explain how misdirection works. Despite that, they couldn't do the same thing (even if it was a fairly simple trick). Free info simply wasn't enough to copy the trick. They had to have the aptitude and put in the hours if they wanted to do it themselves.
As coders, we do something similar. We show the code and explain how it is put together. We even talk about the theory behind the application. But for them to learn, they have to have the aptitude and the practice.
Coders have the aptitude, and they seem to insist anyone can do it too. Companies go along because it's in their best interest if this is true. The reality is that people have varying degrees of ability for handling abstract logic. Only a small portion of the population is adept enough to handle abstract logic at scale and we need that skill in more fields than just programming.
Reality seems to bear this out as we have increased the amount of recruiting every year at every level of business, school, and government, but with diminishing returns.
I believe that we offer to teach because we like finding people who think like we do (or maybe wish to believe we think the same way as the rest of the world), because we like to believe we are making other people's lives better, and on some level, because it's safe in that the returns are small enough to not threaten our own wellbeing, but maybe I'm just too cynical...
It's due to our poor social skills. Flight attendants successfully lobbied the media and public to refer to them that way and not as stewardesses/stewards to improve the esteem of the profession, but we're so clueless that we don't object at all when journalists refer to us as coders, a term also used for medical clerical workers. Many here will even argue with you that it's not derogatory.
Well, that's because companies can't afford to be as picky right now. Once everyone "learns to code" and there is a glut, you'd better believe degrees will become more valuable.
Some people are self-taught and self-motivated, and they are often more capable than someone whose primary exposure to development (or technology beyond consumer use) was through a CS program. That is not to suggest that CS degrees are unnecessary, but once you understand the entirety of someone's experience and knowledge, you will often find their degree, or lack thereof, is entirely irrelevant.
Of course, there are some employers who won't even acknowledge someone's resume exists without a degree, which I have always found silly.
> Of course, there are some employers who won't even acknowledge someone's resume exists without a degree, which I have always found silly.
It really is stupid. I would hire a software engineer who dropped out of high school over a PhD if he's better. I never understood why people care about degrees; the whole point of college is to learn, not to get a piece of paper. The only excuse I can think of is that it's a sorting mechanism for lazy businesses to filter applications.
Define succeed. Nice job, nice wife, etc... or FU money level of success?
I can only speak from my experience hiring and getting hired but all things being equal that piece of paper will likely get you hired over the other applicant(s). For inexperienced folks it also makes getting those critical first positions easier, which will allow you to quickly move up, or out, in the field.
I question the validity of their claim that, because over 60% of the job postings omitted a degree requirement, those jobs genuinely don't want a degree of some sort (see my previous point). Hiring staff is a real pain when you're not one of the big boys, so casting a wider net is somewhat necessary.
> I can only speak from my experience hiring and getting hired but all things being equal that piece of paper will likely get you hired over the other applicant(s).
It may take you 10% more applications to get a single job, but that's not much if you live in an area with a lot of jobs. I have read about struggles from both sides of the fence (degreed/not) to the point where I can't take common advice of degree advantages seriously any more.
We really need to find a way to assess skills and build a great interview process that many companies can adopt. That would solve the problem, because then the CompSci curricula would have to switch over if they want to keep incoming students and graduate hiring rates high. "Who cares about your degree? Take this test to see if you're actually worth employing at a given career level!"
A 10% better hiring speed just does not seem attractive considering you spend tens of thousands, plus the opportunity cost of working somewhere and contributing to your retirement (which for millennials is apparently very far off compared to previous generations). Student debt is only increasing. The number of graduates is only increasing. Degrees make you stand out significantly less, unless you are able to attend a prestigious school, and at that point the decision may not even have to do with the technical rigor involved. Hiring managers are playing the same risk-aversion game as everyone else; they just can sometimes afford to be picky.
The only time a degree will become super-important is if the middle and low level jobs suddenly start disappearing. I haven't heard any doom and gloom about that, but if it was actually happening, it'd be the biggest sleeping giant of a news story for our industry.
Even then, most of the hiring that gets done will shift to people with experience and/or a degree. It's just that there will be less hiring overall.
Plus, you can always move laterally to another portion that still uses Excel as its backing store and probably automate your job away and do 30 minutes of work per day -- I'm seriously thinking about doing this because it seems easier to reduce the amount you work than to increase your salary for the same 40h/week.
I responded to a recent 'Will quitting school hurt my career?' type thread. I banged my head against the wall of youthful perspectives about what experiencing a career will be like across several variations of 'you play the odds and take your lumps when you lose' until I ran out of steam. Later, having thought about it (by writing) I recognized a very distinct benefit that pretty much only a college degree will provide.
It comes up on 'Ask HN' from time to time that a person without a degree is looking for a way around a degree-based work visa requirement in order to immigrate for a software job. For a lot of developers in the US, this is probably less of a big deal, since they don't have to emigrate for a high-paying job at a sexy company. Probably not as big a deal within the EU either. But there are a lot of other places in the great big world where it can matter, and not having a degree does curtail access to jobs a developer might want and opportunities a developer might want to pursue for professional or other reasons.
Except in pretty rare circumstances where a person qualifies for a 'snowflake visa', a portfolio does not fill that gap. Likewise graduation from a coding bootcamp probably won't be an alternative visa qualification any time soon.
Despite their drawbacks, and for better or worse, credentials provide objective criteria for decision making, and in contexts with legal implications that is very useful -- not just visas, but complying with employment laws and any other CYA context.
Rhetorical question: Am I the only one who did get a degree to actually learn stuff?
I was self-taught, had programmed as a job for 4 years but wanted to make a leap in my skills. Dedicating 5 years to just focus on learning stuff, practicing, all in a structured way (courses) was really, in hindsight, pretty amazing in terms of accelerating my learning. I did get an MSc in CS, but the main motivation was to learn stuff, not to get a degree on paper.
Disclaimer: EU-citizen. Studied in Sweden. No tuition fee.
I'm in the same boat. My self-taught experience inspired me to pursue a CS degree. The degree did put me in a serendipitous position that led to a number of great industry opportunities but personal growth was my main reason for starting it.
FWIW, I'm studying in Canada and paying my way through school with internships.
No, but it helps. Autodidacts are rare (possibly not on this site, but in general), and few people who haven't been to college even know what they don't know. Unfortunately a large proportion of college grads are equally clueless. College is a place where you can get an education, but nobody's going to insist, and neither degree nor GPA is any indication one way or another. What's worse, people who learned next-to-nothing in college don't even know that they don't know anything, and they can interview quite well. As the numerous HN threads about hiring and interviews attest, it's murderously difficult to separate the wheat from the chaff.
I'd say that these days it's easier to gain all that "foundational knowledge" without attending a university.
I landed a role in a blue chip and got a great boss who taught me the foundational theory I was missing.
I didn't ask him to, per se, but I think he just sensed I needed it.
It made a huge difference to me as a developer. So I do agree, foundational knowledge is kind of a big deal.
Step two: Package that with a bunch of people (both experienced and inexperienced) who can help with coursework and answer questions when they come up, teams of people to work with, somewhere to live with dozens of others doing the same work, hundreds more learning unrelated things (but who might be useful to you in the future), and you've got something close to a replacement for college.
Some day most (certainly not all) aspects of university education will be supplanted by high-quality universally accessible materials. But it's not here yet.
Yet, perhaps there's something about that process. After all, there are aspects to a formal collegiate experience that are not neatly captured by boot camps, fora, YouTube, and documentation. Perhaps some of those aspects, like an instructional environment that values theory, are of nontrivial significance.
Obviously, it's still possible that something better can be done. This may not be the same as a better option having been developed and being on offer, though.
Sure, but the risk of being led onto an inefficient, time-wasting path that doesn't give you a good grounding in the foundational knowledge is also high.
The skill to evaluate both the approaches to learning the information and the set of information you need to learn is not something you are likely to have without having already studied the information, and there are lots of people with pet theories or a financial interest in your actions promoting different approaches.
I don't think diplomas nor resumes normally list failed classes ;)
The usual sign is someone, e.g., getting a master's degree 2000-2005 but only passing the introductory math classes in 2005.
The exceptions are companies like Facebook and Google, whose hiring practices seem to really be focused on hiring people straight out of college as opposed to established programmers.
Perhaps it's too broad a generalization - I haven't worked at Google - but all the companies who ape the GAFA hiring policies are indeed only interested in developers straight out of college. And I have a hard time blaming them; college students are fairly inexpensive to hire, and there's a lot of them out there looking.
My top university's CS program had a linear algebra class that loaded me up with theory like Hilbert spaces, wavelet transforms, spectral theory, abstract algebra, and of course principal component analysis (the latter, along with SVD, is becoming more valuable these days with data science-y type jobs).
It was more concerned with proofs and advanced concepts than applications, which was fine for me; I had already learned the applications in high school because I was bored and self-motivated. The most valuable benefit of all that extra theory was learning how to solve problems, because doing proofs isn't some algorithm like long division or evaluating a derivative that you can just crunch as a cog in the wheel.
The question is whether all of that extra rigmarole is worth it in my day-to-day development work. The answer to that is "no." Occasionally, I'll borrow some code that does thing X that is tangentially related (like convolution for computer vision and deep learning), but the theory behind it rarely comes in handy because I'm not doing academia or R&D work.
Of course, your usual full stack developer job won't even come close to touching any of this, but if you're doing moonshot type work like self-driving cars, you're going to need all of this theory to really attack the types of unsolved problems you're facing. Those types of jobs are rare, sometimes highly-compensated, and inaccessible to your Hack Reactor grad.
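For what it's worth, the PCA-via-SVD connection mentioned above fits in a few lines of numpy. This is a generic sketch (the `pca` function name and its shape assumptions are mine, not from any particular course): center the data, take the SVD, and keep the top-k right singular vectors as principal directions.

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components via SVD.
    Assumes X is an (n_samples, n_features) array."""
    Xc = X - X.mean(axis=0)          # center each feature
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]              # top-k principal directions (rows)
    return Xc @ components.T, components

# Example: 100 samples with 3 features
X = np.random.default_rng(0).normal(size=(100, 3))
proj, comps = pca(X, 2)              # proj has shape (100, 2)
```

The point of the theory is knowing *why* this works: the singular values are sorted, so the first projected column captures the most variance.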
If the problem is related to web and databases and crap, then there is rarely any linear algebra (heck, there are rarely even real numbers), but we didn't go to school to work with databases and web forms, did we ;)
Obviously Hilbert spaces might not be useful everywhere, but anecdotally I stumbled across one interesting problem in my first year out of university: a drawing program needed to scale a drawn ellipse in one direction, which wasn't necessarily along one of its axes. Q: is the result an ellipse? What are its axes?
A: (I had to ask a friend with better linear algebra skills, which still bugs me): the axes will be given by the eigenvectors of the matrix involved. The math was fairly hairy compared to most other things in a simple drawing program, but it was fun and really rewarding when the math worked.
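For the curious, that eigenvector answer can be sketched in a few lines of numpy, assuming the ellipse is represented as the image of the unit circle under a 2x2 matrix (the helper names here are mine): the axis directions are the left singular vectors of the combined matrix, i.e. the eigenvectors of A·Aᵀ, and the semi-axis lengths are the singular values.

```python
import numpy as np

def directional_scale(d, k):
    """Matrix that scales by factor k along unit direction d,
    leaving the perpendicular direction unchanged."""
    d = np.asarray(d, dtype=float)
    d = d / np.linalg.norm(d)
    return np.eye(2) + (k - 1.0) * np.outer(d, d)

def ellipse_axes(A):
    """Ellipse = image of the unit circle under A. Axis directions are
    the left singular vectors of A (eigenvectors of A @ A.T); the
    semi-axis lengths are the singular values."""
    U, s, _ = np.linalg.svd(A)
    return U, s  # columns of U = axis directions, s = lengths

# Ellipse with semi-axes 3 and 1 along x and y:
E = np.diag([3.0, 1.0])
# Scale it by 2 along the 45-degree direction (not one of its axes):
A = directional_scale([1.0, 1.0], 2.0) @ E
axes, lengths = ellipse_axes(A)
```

The result is still an ellipse because any invertible linear map sends an ellipse to an ellipse; the SVD just tells you which one.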
It's better evidence that they can follow steps to complete something, but it is not a good indicator of the one thing I value in SEs: being self-directed in tackling a problem.
Also this varies a ton by sub-industry; over in games we don't care about degrees at all, just demonstrated work. Once you move along the sliding scale toward the larger corps, it starts becoming a checkbox for HR, without which you won't get through the filter (most of the time).
Considering the complexity of modern CPUs and OSes, does college go into enough depth to teach them well, or does it just let students think they know how they work?
(Some of the theoretical parts are even more suitable for college and equally useful: complexity/computability theory, lambda calculus, graph theory, automata theory, ML, etc.)
I hear a lot of anecdotal evidence (mostly from professional devs who have degrees) that they know a friend or coworker who doesn't have a degree.
The truth is that the system for recruiting and hiring is really based around colleges, students, and degree holders. Starting from the application requirements, you see that most applications require a CS degree (or equivalent). Then the technical interview is aimed at fundamental CS questions, most of which students are accustomed to.
So for someone self taught, face to face interactions are a lot better. But even going to events/meetups, you notice a distinct change in demeanor of the recruiter when you mention you don't have a CS degree. It really is a tiring battle just to prove that you're competent.
Do I believe that you 100% need a degree to get a CS job? Definitely not. But without one, your options are very limited and on top of that you'll need a lot of luck and effort just to get past the first hurdle.
My anecdotal evidence, looking at both my own career and a lot of developers I've worked with over the past 17 years, is that having a degree, or not, isn't a big deal.
Many are working towards abstracting schooling out of a being a developer, great.
That also means we're moving towards making being a developer a blue collar job rather than a white collar job. We're also making private businesses now the trainers of labor.
The time to be a dropout rockstar developer was in the 2000s but we're moving towards 2020 now.
The real question is for the generation of 13-year-olds over the next 5 years: they can study software development on their own, but in 10 years' time, will they be better off spending half of that time on an expensive college education, or nah?
Everyone in this thread is giving out answers based on their own experience surviving the past 20 years, as if what happened in the tech industry over the last 20 years is where we are right now or where we'll be going for the next 20.
I wrote this on hardware and software made and improved by organizations of highly educated and highly specialized people. It's hard to believe the future is only going to be built by Palmer Luckeys and 16 yo hacker geniuses.
I haven't had a hard time finding something since then. I do hate the basic algorithms questions every interviewer asks, but a month or so of Coursera and I'm ready to answer all of the "Implement a QuickSort algorithm in 20 minutes" questions again.
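For what it's worth, the kind of answer those "implement QuickSort" questions expect can be as short as this sketch (the simple out-of-place version; an interviewer might push for the in-place partition variant instead):

```python
def quicksort(xs):
    """Return a sorted copy of xs. Average O(n log n); this simple
    version allocates new lists rather than partitioning in place."""
    if len(xs) <= 1:
        return list(xs)
    pivot = xs[len(xs) // 2]
    less    = [x for x in xs if x < pivot]
    equal   = [x for x in xs if x == pivot]
    greater = [x for x in xs if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```

A month of Coursera is mostly about re-internalizing why the pivot choice and the recursion depth matter, not about memorizing these ten lines.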
You either need luck, or you need to be willing to take a less reputable job to learn programming as a profession; once you have that, many doors will open up to you.
I dropped out of high school ten years ago and did nothing of note between then and teaching myself to code.
Just stick with it until you land the first one and you'll be fine.
Do I think a degree is necessary? No, but I do think you'll have a hard time getting a job without one, so you might as well start your own company.
These last few years have made me salty.
Like you, I also didn't have a CS degree (I did take foundational CS coursework in undergrad, and also had an MS in something akin to HCI). I was hired for my first software engineering job in SF by a couple not-very-technical founders--both of whom were rather taken with a game I had designed/developed--after completing a week-long contract job for them. No CS fundamentals interview process.
After that first job, it was very easy to get interviews, although I had to prep a bit for those CS questions. I'm now interviewing for my third job, and feel like I've taken on too many interviews, many at big name startups and big corps. Among 18ish cold applications, I've only been turned down without so much as a recruiter call twice, likely because I was missing some keyword on my resume or something.
I'm sure things are (potentially) quite different now compared to the 17 years ago when I first landed a job building web applications. (My very first job was an internship - sort of - while pursuing a CS degree. My next job was after I had dropped out, left that job and was busy delivering hoagies as my job.) So that was "luck" and being "in the CS education world" when I got started.
Still, if I have a point, it's that I started out in silly, minor jobs for little local companies that needed someone who could code more than they needed a highly polished computer scientist or engineer. From there I mostly "learned on the job" the rest of what I needed to advance (and I continually self-taught by working on my own projects).
But coming from the architecture profession, I don't discount encouraging others as a form of self-validation of one's own career choices.
I can tear apart any engine to understand how it works.
I can rip out the bathroom in my house and install a totally new one.
With closed source, I can't just rip out component X and replace it with component Y. I can't read the binary to understand how it works (Technically I could, but I mean that with modern languages, the assembly looks massively different from the original source).
I can go to planning offices and see the plans of any building. Almost no-one has "secret painting methods", or "secret plumbing methods".
As coders, we do something similar. We show the code and explain how it is put together. We even talk about the theory behind the application. But for them to learn, they have to have the aptitude and the practice.
Coders have the aptitude, and they seem to insist anyone can do it too. Companies go along because it's in their best interest if this is true. The reality is that people have varying degrees of ability for handling abstract logic. Only a small portion of the population is adept enough to handle abstract logic at scale and we need that skill in more fields than just programming.
Reality seems to bear this out as we have increased the amount of recruiting every year at every level of business, school, and government, but with diminishing returns.
I believe that we offer to teach because we like finding people who think like we do (or maybe wish to believe we think the same way as the rest of the world), because we like to believe we are making other people's lives better, and on some level, because it's safe in that the returns are small enough to not threaten our own wellbeing, but maybe I'm just too cynical...
Some people are self taught and self motivated, and they are often more capable than someone whose primary exposure to development (or technology beyond consumer use) was through a CS program. That is not to suggest that CS degrees are unnecessary, but once you understand the entirety of someone's experience and knowledge, you will often find their degree or lack thereof is entirely irrelevant.
Of course there are some employers who won't even acknowledge that someone's resume exists without a degree, which I have always found silly.
It really is stupid, I would hire a software engineer who dropped out of high school over a PhD if he's better. Never understood why people care about degrees, the whole point of college is to learn, not get a piece of paper. Only excuse I can think of is that it's a sorting mechanism for lazy businesses to filter applications.
I can only speak from my experience hiring and getting hired but all things being equal that piece of paper will likely get you hired over the other applicant(s). For inexperienced folks it also makes getting those critical first positions easier, which will allow you to quickly move up, or out, in the field.
I question the validity of their claim; over 60% of the job postings didn't require a degree of some sort (see my previous point). Hiring staff is a real pain when you're not one of the big boys, so casting a wider net is somewhat necessary.
It may take you 10% more applications to get a single job, but that's not much if you live in an area with a lot of jobs. I have read about struggles from both sides of the fence (degreed and not) to the point where I can't take the common advice about degree advantages seriously any more.
We really need to find a way to assess skills and make a great interview process that many companies can adopt, and that will solve the problem because then the CompSci curricula will have to switch over if they want to keep incoming students and graduate hiring rates high. "Who cares about your degree? Take this test to see if you're actually worth employing at a given career level!"
A 10% better hiring speed just does not seem attractive considering you spend tens of thousands plus the opportunity cost of working somewhere and contributing to your retirement (which for millennials is apparently much further off than it was for previous generations). Student debt is only increasing. The number of graduates is only increasing. Degrees make you stand out significantly less unless you are able to attend a prestigious school, and at that point the decision may not even have to do with the technical rigor involved. The hiring managers are playing the same risk-aversion game everyone else is; they just can sometimes afford to be picky.
The only time a degree will become super-important is if the middle and low level jobs suddenly start disappearing. I haven't heard any doom and gloom about that, but if it was actually happening, it'd be the biggest sleeping giant of a news story for our industry.
Even then, most of the hiring that gets done will shift to people with experience and/or a degree. It's just that there will be less hiring overall.
Plus, you can always move laterally to another portion that still uses Excel as its backing store and probably automate your job away and do 30 minutes of work per day -- I'm seriously thinking about doing this because it seems easier to reduce the amount you work than to increase your salary for the same 40h/week.
It comes up on 'Ask HN' from time to time that a person without a degree is looking for a way around a degree-based work visa requirement in order to immigrate for a software job. Now for a lot of developers in the US, this is probably less of a big deal, since they don't have to emigrate for a high-paying job at a sexy company. Probably not as big a deal within the EU either. But there are a lot of other places in the great big world where it can matter, and not having a degree does curtail access to jobs and opportunities that a developer might want to pursue for professional or other reasons.
Except in pretty rare circumstances where a person qualifies for a 'snowflake visa', a portfolio does not fill that gap. Likewise graduation from a coding bootcamp probably won't be an alternative visa qualification any time soon.
Despite their drawbacks, and for better or worse, credentials provide objective criteria for decision making, and in contexts with legal implications that is very useful -- not just visas but complying with employment laws and any other CYA context.
I was self-taught, had programmed as a job for 4 years but wanted to make a leap in my skills. Dedicating 5 years to just focus on learning stuff, practicing, all in a structured way (courses) was really, in hindsight, pretty amazing in terms of accelerating my learning. I did get an MSc in CS, but the main motivation was to learn stuff, not to get a degree on paper.
Disclaimer: EU-citizen. Studied in Sweden. No tuition fee.
FWIW, I'm studying in Canada and paying my way through school with internships.