The content taught here is the highest payout thing you can learn, in my opinion. Certainly more important than actually writing code or learning algos.
What this content covers should unlock iteration speed, which is the single greatest lever in learning and growing faster (on a computer). Thus it gives you more cycles to go back to improving your code, experimenting with algos, etc. Probably also highly correlated with upwards mobility in the software job market.
Great seeing this under a common umbrella I can hand to students and new grads.
I have this feeling that the kids who appear better at uni are the ones who happened to pick up certain not-quite-programming skills before they started. Basics of networking, how installation of programs happens, how to use the command line, that kind of thing.
I looked over her shoulder as my wife was doing a CS degree, and I realised there's a bunch of these little things that make life a lot easier if you know them.
There's definitely a thing with CS that, at least at more elite schools, there's an assumption that you more or less know how to program--at least a language like Python--and you also more or less know your way around a computer well enough to use it as a tool for programming. (Or you pick it up quickly on the side along with your full course load.)
This is more or less unique among college majors outside of some arts disciplines like music. Yes, there's a requirement for some secondary school algebra and some basic science but an electrical engineering major could basically have never assembled a circuit before attending college and probably wouldn't be at any particular disadvantage.
And, per the original post, MIT is certainly one of the institutions that does this. The 6.001 MOOC(s)--basically intro to algorithms--teaches a bit of Python on the side but clearly you're intended to mostly learn it on your own.
(By contrast, back in the day, I took a FORTRAN course as part of a non-CS engineering major. The assumption was that you had never touched a computer before.)
In my class 5 years ago, the differentiating factor was whether the kids grew up online or not. Students like myself or my roommate had spent their formative years in front of a computer, not because they had to, but because it seemed like the thing to do at the time. I never realized how much I was learning during the time I spent scripting RuneScape bots and writing toy "viruses"; I did it because it was more fun.
I mean, yes, this was me. I came in knowing most/all of this. I knew six (or more?) programming languages, had at least played with CVS/SVN, had installed Linux (read: fought with the Linux bootloader to get my AMD CPU to boot without crashing), had dipped my toe in a few open source communities.
But I was a tutor in college and I interacted with a bunch of people who didn't come in with any of this experience. Many of those people struggled, but I also know a bunch of people who came into CS knowing nothing, loved it, and went from zero-to-sixty faster than even the people who came in knowing a lot.
I'm still not sure I can identify what the ingredient was, but experience alone is not enough to explain it.
Yeah, I remember taking this machine architecture course with a friend who was getting a journalism degree and had never written any code or done anything more technical than playing StarCraft brood war before college.
One of the later assignments gave us this "bomb" executable, and we had to use gdb to pick it apart and modify the instructions or find ROP gadgets or something to make the code not "explode". He was my partner on the assignment, and I spent most of the time trying to teach him what I was doing in gdb, and trying to express that, sincerely, he wasn't stupid; GDB is just really hard, and he didn't have the background knowledge to make learning it easier.

Um yea, that's my job dude, and I fought for it, so other people should too; these people waltzing in not knowing shit should not get into CS.
I sat with a talented developer whilst we wrestled with a threading issue this past week. I wanted to inspect the value of a variable within a method during execution and asked him to set a breakpoint. He didn't know how to do that in the IDE, which he'd been using for over a year. Debugging is indeed a skill which needs learning.
I agree you should know how to use a debugger; however, I will also note that some companies mandate (or heavily encourage) the use of a specific IDE.
I have seen devs scoff at the thought of print debugging, but I recall that in systems programming there are many times you can't use a debugger or need to rely on other tools.
I’d rather schools teach the concept of step debugging vs runtime debugging. Teach students to try to understand the code, make hypotheses, and verify them.
I have seen some people use a debugger solely because they only know how to step debug. Meaning they start from main or another entry point and step through every line of code.
My point being that you can't judge a developer by whether they use print statements or a debugger. Judge them by the methodology of how they debug.
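As a rough, hypothetical illustration of that hypothesis-driven style (the program, function, and variable names here are invented), the same check can be made with a one-off print statement or with a scripted gdb run rather than stepping through every line from main:

    # Hypothesis: `count` is already wrong by the time process_batch() runs.
    # Option 1: a quick print statement in the code, e.g.
    #   fprintf(stderr, "process_batch: count=%d\n", count);
    # Option 2: verify the same hypothesis with a non-interactive gdb session,
    # instead of stepping line by line from main().
    gdb -q -batch \
        -ex 'break process_batch' \
        -ex 'run --input data.csv' \
        -ex 'print count' \
        -ex 'backtrace' \
        ./myprog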
It's interesting. I mainly use functional-first languages, and I rarely need breakpoints. Programming is much more compositional with certain languages.
Well, a big part is curiosity, and apparently he isn't curious or interested in what his tools do.
I used to read every help file on Windows / Visual C++ and the FreeBSD manual, and plowed through the file system and experimented with it, just because I was curious.
I went to Purdue over a decade ago and we had a 1 credit hour lab that taught this stuff. Unix command line, git, bash, and finally some python.
Like you said, it's been a complete game changer. I feel these skills continue to differentiate me from my peers in terms of how I can attack arbitrary problems bravely to this day.
They still teach this stuff at Purdue. I graduated a few years ago and by far the most important class was about how unix works, moving around the command line and finally introducing us to vim
Indeed. While I think learning algorithms and data structures is a non-negotiable thing, in 90% of the companies out there, in 90% of the situations, one will never have to write a binary search from scratch or implement a queue from scratch. On the other hand, profiling, debugging, gluing things together via bash, etc., that's what distinguishes you from the colleagues who only write passable code.
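For instance, a throwaway pipeline like the following (the log format and field positions are assumed purely for illustration) is often the difference between a minute of work and an afternoon of one-off scripting:

    # Hypothetical access.log: top 10 request paths that returned a 5xx status.
    # Assumes the status code is field 9 and the path is field 7 of each line.
    grep -E ' 5[0-9]{2} ' access.log \
        | cut -d' ' -f7 \
        | sort | uniq -c | sort -rn | head -10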
I dropped out after a year of college and have since weaseled my way into a dev position; your observation could not be more true in my experience. I've since strongly considered going back to school for a degree, but as interesting and probably useful as some of the material may be, I'm not convinced it's worth it as an investment in making me better at my job.
If only I could use 529 funds on some of these online course providers.

https://www.youtube.com/watch?v=ZQnyApKysg4

It's all about leveraging "simple" CLI tools and combining their powers to cut through work very swiftly, the kind of work that otherwise takes many people many days.
I have tried in vain to get this implemented at our uni. I can say a few things I find interesting:
- Students used to get this stuff but no longer do, for example all workstations used to be unix, so when you left, you "knew" "unix" (shell, vim, etc)
- Due to things like ABET, classes are crammed with need-to-know-for-accreditation info so, well, some items need to go by the wayside (many are in the MIT list)
- There is a huge push, even by ABET, for security and crypto to be somehow integrated into nearly every class.
- Professors seem aware that we need this "missing class", but it is hard for administrators to implement, because universities were pressured into lowering the number of credits required to graduate, so some courses were removed and there is no room left for another course.
I am not pushing one way or another, and I only have the vision of working at two universities, but I think unis need to take a real hard look at their courses from a holistic point of view. I recall stumbling across that MIT course at least 5 years ago. I do not know many others who implemented something like that.
> I do not know many others who implemented something like that.
the school I attended in France, ENSEIRB-MATMECA, started with three weeks where you only learn shell commands, emacs, LaTeX, etc. before doing anything else. Here are the slides (in French, sorry, although they all have useful reference cards at the end):

- intro: http://mfaverge.vvv.enseirb-matmeca.fr/wordpress/wp-content/...
- unix, shell: https://cours-mf.gitlabpages.inria.fr/if104/docs/01-unix.pdf
- emacs: https://cours-mf.gitlabpages.inria.fr/if104/docs/02-emacs.pd...
- latex: https://mfaverge.vvv.enseirb-matmeca.fr/wordpress/wp-content...
- "advanced" shell scripting: https://cours-mf.gitlabpages.inria.fr/if104/docs/05-scripts....
Note that this is not part of the CS curriculum at MIT. It's a series of classes during Independent Activities Period in January which is mostly unstructured time when students can do or not do any activities that catch their fancy. It works well for this sort of thing but students also do a ton of stuff that isn't especially academically-related.
Thank you for making the point about accreditation, it's sort of a pet peeve of mine. I taught "intro to C++" last year at the Harrisburg campus of PSU. The students were a mix of non-CS majors who didn't know what a file was, a handful of students who already knew how to program and a bunch in the middle.
Re: accreditation... the admin is very reluctant to change anything about the courses. Even specific textbooks had to be recommended (I was warned for suggesting in the syllabus that the textbook wasn't needed).
Seemed a little more strict than teaching mathematics, which I did in graduate school.
Re: kids these days... a significant portion didn't understand the concept of a file. I blame apps and the cloud (funny because I now work in cloud storage). I ended up writing my own precursor doc to the "missing semester". It was a challenge to get a student from not understanding the filesystem to having some sort of understanding of linear search and pointers. (If you're interested: https://www.dropbox.com/s/jar1r0l5vdgspcl/basics.pdf?dl=0)
I tried to stress, especially to the non-majors, that this "missing" stuff was perhaps the most important thing they could learn. That, and how to properly google/search for things. I would experiment and try to re-word homework questions so that interesting StackOverflow answers appeared in search results.

That's a really interesting pedagogical approach; I like it a lot.
C++ needs a “missing semester” around tooling. Most material I see focuses on the core language but leaves out setting up build systems, package management, clang-tidy, testing, etc.
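As a rough sketch of what such a tooling walkthrough might cover (the project layout and file names are invented; a CMakeLists.txt with registered tests is assumed to exist):

    # (a package-manager step, e.g. vcpkg or Conan, would typically come before this)
    # Configure and build out of source, exporting a compile database for tooling.
    cmake -S . -B build -DCMAKE_EXPORT_COMPILE_COMMANDS=ON
    cmake --build build

    # Run the tests registered with CTest.
    ctest --test-dir build --output-on-failure

    # Static analysis over the sources, reusing the compile database from above.
    clang-tidy src/*.cpp -p build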
This “MIT Course” was taught by grad students because they felt the actual course work left this out.
So when you say it’s a shame your uni doesn’t teach this, well, that’s what these grad students were saying as well. Perhaps the students could seize the initiative at your institution as well?
I might have to get a separate group to do it, like the ACM students. If I suggest it, even as an online sort of thing, it gets cut down. All of a sudden, it cannot be "something we support" that is not "part of the curriculum".
We're trying to implement exactly this. I assign the MIT missing semester materials in my sophomore systems programming course, and we do all of our assignments using a CLI, C, GCC, and Git. Prior to my course, the students know only Java and IDE programming.
One problem we have is that the prereq chain for our courses is very long, so adding another course as a pre-req to all others lengthens that chain.
The MIT course offers probably too much info for our purposes, or at least info students don't need preloaded. Just basic CLI and basic Git clone/fork/pull/push are enough for up to probably Junior year. The problem is that the intro courses are so sanitized that students aren't even getting basic CLI until Sophomore year, which means by the time they graduate, they're behind where they should be imo.

You start with 2 weeks of only Linux C + POSIX shell, and during this you're only allowed to use i3 + vim/emacs.

Unis are requiring more generals, especially humanities. The problem isn't credit requirements, it's accreditation, marketing, and politics.
This "class" was run by our CS library computer lab and was something your TA might push you to attend, but not part of the formal curriculum. That worked around some of the admin nonsense but still got help to motivated students.
It would seem this would be the perfect kind of class for online learning. Perhaps even as a wiki to crowd source helpful contributions. Most learning doesn't even have to be directed, but just point students in the right direction. This is a ripe area to find ways to better facilitate individual initiative, because a moderate level of effort yields tremendous reward. Plus, you can easily determine if you are doing it right.
We're teaching a course at ETH Zurich [1] where --besides the actual payload of solving partial differential equations (PDEs) on GPUs-- we put a lot of emphasis on "tools". Thus students learn how to use git and submit their homework by pushing to a repo of theirs on GitHub, and we teach testing and continuous integration, writing documentation, running code on a cluster, etc. In their final project, again submitted as a GitHub repo, they need to make use of all of these skills (and of course solve some PDEs).
Note that excellent work in this space is done by the Software Carpentry project, which has existed since 1998 [2].

[1] https://pde-on-gpu.vaw.ethz.ch/ [2] https://software-carpentry.org/
As an alumnus, thanks a lot for doing this. Looking back, all the things I learned in just the first few weeks in industry made writing code so much more productive; if only someone had shown some of it during an early semester, even just during a teaching-assistant hour, it would have saved so many hours.
I remember specifically when one of the exercises for some compiler lecture contained unit tests the code had to satisfy, and I was like, wow, why didn't I already know about this during the algorithm classes earlier, where I was fumbling around with some diff tools to check my output. Let alone proper version control; now that would have been a blessing.
In hindsight, it's a bit embarrassing that I didn't bother to, well, just google for it, but neither did my colleagues - I guess we were so busy with exercises and preparing for exams that we just didn't have the time to think further than that.
Thank you very much for the GPU course. Even though my college taught shell usage to some extent, when I asked about GPU programming it was considered a nerd topic back in 2009.
Lots of laudatory comments here about this being essential but missing teaching but I have a different take. The content looks good, nothing wrong with teaching these things. But these things can be and are learned on the job fairly quickly for anyone interested enough in the field and with enough aptitude. In fact, I would say these things can be learned on your own time as a side effect of being interested in computers.
So I would say it's good content, but not essential for a CS program.
I think it can be useful to distinguish between the "known and unknown unknowns" here. For example, everyone will quickly realise that they need to know git, and they will learn what they need to know (a known unknown). A university course would maybe save them time, but it would not really change what you know after 2 years in industry. Compare that to e.g. awk or shell scripting, which can be incredibly useful, but maybe not something people realise by themselves that they need (an unknown unknown). A university should make people at least aware of these latter tools.
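A small, hypothetical example of the kind of one-liner people rarely go looking for unless they already know awk exists (the CSV layout is made up):

    # sales.csv has "region,amount" rows; print the total amount per region.
    awk -F',' 'NR > 1 { total[$1] += $2 } END { for (r in total) print r, total[r] }' sales.csv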
Totally agree. Further, these skills actively support rapid iteration on learning other core university outcomes. In CS it’s very easy to waste lots of time doing things the wrong way that could otherwise be spent doing something useful.
Good point. Some things you will learn by necessity, but if you don't know about, say, regexes, you probably won't think of searching specifically for such a tool.
a) Universities are places of higher learning that do not need to cater to industry
b) There is an implicit social contract whereby universities should produce industry-ready graduates
c) Something in-between
… the skills taught in this course are as useful for the PhD candidate as for the junior software engineer.
Why leave these out or up to the student, then?
Furthermore, in many other academic disciplines, it’s very common to teach applied technique (eg, on writing essays or structuring research), is this any different?
I don't think you're wrong that you can learn this stuff on the job, but a "primer" kind of class like this, which surveys several useful tools, helps new engineers develop pattern-matching skills around how to continue self-teaching these kinds of things. Shell/scripty/vim-ey, linux-ey stuff can be really challenging to learn how to learn for some people.
> The class is being run during MIT’s “Independent Activities Period” in January 2020 — a one-month semester that features shorter student-run classes. While the lectures themselves are only available to MIT students, we will provide all lecture materials along with video recordings of lectures to the public.
It seems that the course is only 1 month long and runs alongside other student-run classes. So, from the get-go, this is not an essential course of a CS program.
There's quite a broad mix of offerings during IAP--a lot of which isn't part of a regular academic program and a lot of which is just for fun/intellectual interest. It's mostly all non-credit. And there are a fair number of sessions dealing with practical details that may not be covered in regular courses.
I think it’s dependent on the person learning the material. My college had a similar course and it was also highly regarded as useful by students. However, while I found the course fun I already knew most of the basics from my research job, and the advanced stuff hasn’t really come up again even in work (e.g. fancy git or gdb commands), so I’ve entirely forgotten it; my biggest takeaway is probably ctrl r for searching in shell. But I can see why a guided intro would be really helpful to someone who had no experience or some trouble getting started learning this kind of material (which is very different from programming or computer science).
There’s probably an aptitude threshold past which the course’s value diminishes - don’t mean any disrespect to anyone, just trying to expand on your point. The top students either have already figured it out or will do so easily, so they might be better off doing something else with the time they would’ve invested in this. But for a lot of students learning about these tools and concepts can be a real force multiplier, more so than a random upper level course.
This is what I thought too. The content is effectively a roadmap for how one could use the computer well to perform tasks (without the algorithmic/programming portion).
If your main usage environment is Windows, none of these are that helpful, but the ideas could mostly be translated (the Windows shell is similar enough that you can just as easily script things with batch files).
Yeah I pretty much agree with this comment. But I think my take on whether universities should be training for academia or for industry is "why not both?". I think these are just different valid tracks. There is a lot of education that overlaps between "computer science researcher academic" and "software engineering professional with strong fundamentals". So that should be the required set of credits. But then there are different tracks toward where you want to take your degree. Those sound like electives. Academics should learn more about research techniques and publishing and presenting at academic conferences, etc. And those on the professional track should learn more about tools and techniques used in industry.
There's no conflict or contradiction here, just different strokes for different folks.
It's amazing how many CS programs fail to teach you even the basic tools of being a software developer. Yes, yes, CS is not programming, but there is a non-trivial amount of CS that is indeed programming, and that is generally what people do with their CS degrees, so it'd make sense for a CS program to teach the basics. Maybe even more than the basics.
CS programs prepare you first and foremost for a PhD in CS. They're great for learning the theory and fundamentals, but teach practical Software Engineering skills as a side effect.
Ironically, as a CS PhD student I spend an enormous amount of my time on the subject matter listed. There are very few research areas in which PhD students can get away with ignorance of the command line, shell environments, build systems, version control, debugging, etc.
A common scenario is that I want to reproduce and extend some earlier research, and there's a GitHub repository for it that hasn't been touched for the last seven years, which was forked from an even older repository that some grad student hastily pieced together. And I need to run components of it on our high-performance computing cluster, where I don't have sudo privileges. So it's a whole lot of moving things around between VMs and Docker containers, figuring out what to do about ancient versions of packages and libraries that are dependencies (especially everything that's still written for Python 2.7); either refactoring things to update it all, because I want to take advantage of newer functionality, or setting up some isolated environment with older releases of everything built from source.
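As a minimal sketch of the isolated-environment route (the image tag, file names, and script name are illustrative, not from any particular repo; clusters without Docker typically substitute Apptainer/Singularity for the same idea):

    # Run the legacy code inside a throwaway Python 2.7 container instead of
    # touching the cluster's system packages; nothing here needs sudo beyond
    # having a working container runtime on the machine.
    docker run --rm -it \
        -v "$PWD":/work -w /work \
        python:2.7 \
        bash -c "pip install -r requirements.txt && python run_experiment.py"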
This very much varies by school. In California, undergrad as prep for grad school is the default for the UC system, but not so for the Cal State system.
I think this is mostly a US thing, or countries based on similar education systems.
In Portugal our computing degrees are in Informatics Engineering, to use a literal translation: a mix of CS stuff and engineering. They are validated by the Engineering Order as fulfilling certain requirements, for anyone who, at the end of the degree, also wants to take the admission exam for the professional title.
Those who only care about the theory part of CS take a mathematics degree with a major in computing.

But the name is the only thing we get right in CS education.
I was on the CS faculty at a Canadian university in the 1980s. I proposed a course with almost exactly this outline, only to be told it wasn't university-level material. MIT seems not to have got this message; good on them!
I personally don't like the undertone of the class (tho very grateful that this material exists!!!) - this idea that universities are failing their students by not teaching them necessary material. I think a better phrasing is that students are failing themselves by not learning the material. I've personally never considered it the responsibility of my university to educate me - some of the classes are certainly useful for learning, but the ultimate onus falls on me to gain the skills that will lead me to success. I find it kind of distasteful how classes encourage a sort of passive victim mentality when it comes to learning - as if students need to be bribed with credits and cudgeled with a gpa to be forced to learn genuinely useful things.
You don't consider paying tens of thousands of dollars as creating responsibility to educate?
Of course, students need to be active in the learning process. But in my experience, it is more likely that professors and departments are terrible at educating than it is for students to not be motivated to learn.
> You don't consider paying tens of thousands of dollars as creating responsibility to educate?
I get OP's point. It's like getting an English literature degree and never having read a book on your own.

My guess is most people needing the missing semester never coded outside of their assigned tasks. Which is fair enough, but it's surprising to me to meet PhD candidates who marvel over the missing semester (I've met 2).
There’s absolutely a responsibility to educate on the topics needed for the degree to be granted.
This class is adjacent material for an EE or CS candidate. Are universities also failing their students by not offering/requiring a touch-typing class? I don’t think so, in large part because computer science is not programmer occupational training.
A degree already takes 3 or 4 years. In order to incorporate this “missing semester”, universities would have to either a) remove existing material to make space for it, or b) extend the degree by one more semester.
I don’t think universities should remove existing material in general to incorporate “bash 101”. Mainly because learning bash is easy and one can learn it by oneself without a professor. Extending the degree one more semester doesn’t make much sense either.
I'm surprised anyone would object to this. Universities have a responsibility to prepare their students.
When I saw the title I figured it was another "computer science" class. But the curriculum was a significant portion of what I lacked when I graduated, which prevented me from finding work for a year.
Had someone at university told me before I graduated that I'd have no chance of finding work if I didn't know Git, Linux, REST, how to use the command line, how to use an IDE, how to use an editor on the command line, and bash, I would have prepared myself for those things.

Did your interviews ask specific questions about Git, Linux things not covered in a standard operating systems course, and command-line editing?
> the ultimate onus falls on me to gain the skills that will lead me to success
I see where you're coming from, but sometimes you don't even know what the necessary skills are. Even if you're very self-motivated and enthusiastic, you can still benefit by being pointed in the right direction. That's part of what a good school or teacher should do for you. (And while they're at it, they can provide materials that smooth out the path to get there.)
You should never expect them to cover 100% of that, but if they're aware of a way that they can get closer to 100% than they currently are, then it's a good thing for them to do it.
I think you’re conflating two different things: universities selecting and presenting a syllabus needed to earn a certain degree, and students actually learning the material.
The latter is solely the responsibility of each student, but I don’t understand why the former would be. Some of the content in this course strikes me as unknown unknowns for new programmers. Why would they be to blame if no one told them to learn a particular skill?
Honestly, because it's something that should be a prerequisite for starting the degree program in the same way basic algebra is a prerequisite. Likewise, not knowing you need to know this stuff is a sign that you are probably not at the point where you should even have been able to declare the major. The fact that colleges allow this at all is doing a disservice to students, many of whom will go on to permanently damage their academic records.
People are going to really dislike what you said but I agree to a certain extent, especially when it comes to the basics of working in the command line. If somebody can't read the manual on that and figure it out then they are going to be so hopeless for so many other things that I don't want anything to do with them.
If someone has literally never opened a terminal with a command line, you're probably being rather dismissive of how unintuitive it will be for a lot of people at first.
I started college in 1993 and my school had a mandatory "Introduction to Unix Computing Environment" class for all incoming engineering freshmen.
We learned the basics of the shell, file system, file editing, AFS ACLs for group projects, and more. It looks very similar to this MIT course, which makes sense as our school's computing environment was based on MIT's Project Athena and AFS (Andrew File System).

https://en.wikipedia.org/wiki/Project_Athena
I looked and the same course is still mandatory for incoming engineering students.
I'm in the semiconductor industry and everything runs on Unix / Linux. Back in 2000 we would get new grads that knew very little about Unix, command lines, or scripting. That kind of stuff is half my job. These days Linux is so popular that most of the new grads know this stuff.
Even though a lot of incoming students would have had PCs by that time, they'd mostly have been running Windows. As you suggest, as I understand it MIT really focused on Project Athena clusters for engineering work and people used PCs more for word processing, etc.
My university had Dijkstra's quote "Computer science is no more about computers than astronomy is about telescopes". They made us aware of tools and we were free to use them as little or as much as possible to do the science. I always assumed that software engineering degrees focused on tools more than computer science degrees (among other differences).
The following isn't aimed at you in particular, but in HN threads about the Missing Semester there will always be someone who earnestly repeats this stinking turd of a Dijkstra quote, so I'll put my rant here:
Dijkstra was full of it. He wanted CS to be just a branch of abstract mathematics but that's never been the case. That's a retconning of history by people with math envy. Before Alan Turing had ever heard of the Entscheidungsproblem, he had already built simple mechanical computers with his bare hands.
It's cousin to a stupid mindset you see in software engineering, that you can somehow be a good engineer while not knowing what your hardware is actually doing. That's how you get complicated architecture-astronaut systems with good theoretical big-O characteristics, that get crushed by a simple for loop written by the guy who ran a profiler and knows what a cache line is. We live in a world made of atoms, not lemmas.
Research fields go rotten when they don't come into contact with reality enough: quantum computing, string theory, etc.
And as for astronomy: knowing how telescopes are constructed, how they work, their optical characteristics, limitations, failure modes, all of that is essential to observational astronomy. And if you study astronomy, you sure as fuck are taught how to use a telescope!!!
Astronomy as we know it didn't exist until we had good telescopes. Cosmological theories have risen and fallen on the advances in optical theory and engineering. Astronomy is very much about telescopes.
What other field is so ashamed of its own tools? Like, art isn't about pencils, but art students are taught how to hold a pencil! Stop repeating this thought-terminating cliche.
I estimate that astronomers need to know about the tradeoffs in a telescope's settings for the data they are looking at. But I'm unconvinced that they necessarily need to know how to operate one (that would depend on the workplace), and I certainly disagree that knowing how they are constructed is absolutely necessary for all astronomers.
More knowledge is always good, so of course learn what you want. But it's not being "ashamed of tools" to say that a CS degree should "do one thing and do it well".
Additionally, we can simultaneously say that a university should encourage tool mastery while also saying that they don't need to teach entire courses on it.
Completely agree. Things like shell scripting, debugging tools, IDE usage can all be naturally picked up on the job given whatever tools that they recommend you use at their company.
You know what you're not going to be able to pick up at your first software engineering position? Discrete mathematics or linear algebra.
Not trying to dismiss the importance of knowing discrete math etc. in general, but I would posit that the vast majority of entry-level SWE positions require no knowledge of it.
However, knowing the tools of the trade is something that is invaluable. And yes, it can be picked up on the job, but deliberate learning and practice is more effective and less stressful.
This was my first thought too. The tools talked about in the link are useful but they aren’t really computer science. This was also hit home in my CS courses. I was being taught the science behind computers, not necessarily the practical application of them.