Screw all of that. Just make education modular, let anyone take any module at any time, and certify them based on the requirements for that particular module, regardless of their attendance and other crap. Then once they have the requisite modules, issue them the corresponding degree.
The whole idiocy of linear education makes me furious. I have only gotten motivated to study certain topics later in my life, where it's effectively impossible to return to college because I would have to study a whole bunch of unrelated crap and get the degree or go home instead of gaining a small portion of it and perhaps finishing the rest later.
And let me just take the exams when you have nothing to teach me! I mean, seriously, why would I have to enroll and attend just to get certified for Linux system administration when I make a living off it? Just give me the module. Same with software development, databases and so on.
Sure, I will take the OSPF / BGP advanced networking classes when I have time for them, but don't force me to study routing now (or you won't have enough credits for this semester and you'll have to go away).
This is precisely how Western Governors University (WGU) works, as I understand it. You do the modules and when you've completed them (at your own pace), you get the degree. I had a teacher in high school who did this to get an IT degree, and they have lots of other degrees as well.
My college had some dependencies between the courses, but IIRC it was possible to graduate after 2.5 years instead of the usual 4. Some people in my original group did just that and got out a year earlier.
My personal record was 14 courses in one semester (the standard being 5-6), after narrowly avoiding dropping out: I talked with the dean and got approval to make up for my failures from the previous semester. I failed three of them, but that was still enough to stay afloat. Can't imagine actually going through this every semester.
Just note: if you take a look at the actual materials covered and the single final assignment required for classes at WGU, you will find they are a joke compared to what you find at standard universities (e.g. University of Utah, Virginia Tech, …).
The UK has had modular degree schemes going back to the 1980s, when I wrote the software to administer one (NOMAD/2 on IBM VM/CMS, if anyone is interested). You could do the modules part-time, take a year off, etc.
I've done some MOOCs and I think that should at least be an option for most classes. The current system is still very medieval and hasn't adapted enough to modern reality.
Some schools are doing stackable degrees. You take a block of a few classes (e.g. 3-4 classes), get a graduate certificate for that area and once you get a few of them, you can convert them to a graduate degree. It might be trickier for undergrad as that's where the grit is being created.
Why don’t they just eliminate attendance as well and just have students pay $100k for the degree directly and the students can sign a pledge promising they learned all the material?
Pass/Fail already exists as an option if students want to use that.
Grades aren’t about proving learning, they are a demonstration of competence. I took many classes with prerequisites that required a C or even B in other classes before allowing me to take it. This was because of demand for classes as well as out of necessity of prior knowledge. It was possible to get a waiver from the professor, and some did. But many wouldn’t even meet because they were too busy to meet individually and assess a student’s knowledge.
How will this work without grades? It seems like this is just opting for short term happiness and pushing the actual reasons for grades down the road.
I think instruction and examination could and should be split. Independent assessors could accept anyone, much like you can take a foreign language proficiency test now, or a driver license test. Some common standards will be adopted, or emerge where absent.
Certain kinds of examination will take pretty long to be comparable with what a university can assess. A course project, one that demonstrates some breadth and depth of knowledge of the course's material, may take weeks or months to complete. Which, on the other hand, should be fine, both in a traditional university setting and outside it.
This is actually how independent study worked at the university I went to. You got a big stack of material to learn. There were designated study group times with a room provided, but that was if you wanted to show up and discuss with other students.
At the end of the semester some guy showed up and gave you an oral exam followed by a written exam. This was usually a professor at another nearby university that actually offered the course. If you passed both, you got a passing grade for the course.
Neither assessment nor feedback require letter grades.
Some schools without grades use a portfolio system where you have to complete projects in various areas and some sort of capstone project or thesis to graduate. Graduate programs largely work that way as well. They may have a qualifying process, but actual letter grades are largely irrelevant.
Students still get feedback on their projects, and they graduate with a CV and work portfolio that they can show to employers or graduate programs.
A degree from a university that has gotten rid of grades will not be worth much compared to one that hasn't.
> If a student already knew the material before taking the class and got that A, "they didn't learn anything," said Greene. And "if the student came in and struggled to get a C-plus, they may have learned a lot."
Sorry, Jody Greene, you're out to lunch here. The grade is assigned by the lecturer of a course, as a confirmation that the student knows a thing or two about certain fixed topics. If the student learned about those topics somewhere else before taking the course, they earned that A just as much as someone who learned it in the course.
It is extremely important to intellectual freedom that the time and place where some knowledge or skills were attained is considered immaterial. The idea of someone not getting accreditation in a course because they already knew the material beforehand and so didn't learn anything is completely wrongheaded and abhorrent. I dare say, an attack on Western civilization's intellectual legacy.
> And so if we were to shift our focus on to learning and away from grades, we would be able to tell whether we were graduating people with the skills that we say we're graduating them with."
To "tell whether you are graduating people with skills", you need a test of those skills. That's a grade.
A skill test could actually be a lot harder than earning grades in some disciplines.
For instance, if we expect that computer science graduates ought to be skilled software developers and start testing that, it will be a shitshow.
> If a student already knew the material before taking the class and got that A, "they didn't learn anything," said Greene
Also, the fuck they didn't. They learned patience (having to listen to and discuss other people's questions even when they seem stupid to you) and the ability to do tasks that are boring, tedious, or something you've already done before - two qualities that are excellent to have when working. If you're used to picking things up quickly, learning to handle boring and tedious environments is probably the best skill you could take out of your schooling. How many meetings have we all sat in where we just have to sit there until somebody important reaches the conclusion we reached in 10 minutes?
This seems funny to me as well as the point of learning isn’t the actual learning, it’s the knowledge gained. That’s why we learn.
Yes the journey is valuable but the purpose of classes isn’t to measure learningness as a process, but to measure the knowledge and skills gained.
If someone already knew material and learned nothing and got an A, it means they have knowledge of that topic and now there’s a little more signal of that knowledge. It doesn’t matter that they went from 90% knowledge to 95%.
If someone got a C+ and went from 0% to 78% knowledge, they learned much but still can't demonstrate much knowledge.
If we wanted a course on learning processes specifically, then create one. But thinking that the point of grades is measuring how much was learned seems so backwards.
It’s good to recognize effort, but the point of effort is to achieve an outcome, not the outcome itself. “Prepend is so great, ve works so hard. Totally sucks and is stupid but works so hard.” Is silly if used for ability to do a particular thing, but useful in measuring grit and discipline or whatever.
Greene's take on education is very odd. Should I be considered not to have a CS degree because I did not need to learn anything to pass the classes? I would often leave myself a gap of 1 hour between project deadlines and when I would start on them, because I knew I needed 30 minutes to 45 minutes to complete them.
I'd spend the remaining 15 minutes before class correcting the errors in the assignment and submitting those to the instructor as proposed revisions.
I had a high school professor that just gave us the university's Chemistry introduction series (200 level, if that matters) as our high school curriculum. I didn't learn much in those classes at university, mostly because I had already completed the work.
> A degree from a university that has gotten rid of grades will not be worth much compared to one that hasn't.
You may be surprised to learn that quite a few well-regarded universities adopted this sort of policy years ago. In particular, grading freshmen pass/fail is not uncommon.
>The idea of someone not getting accreditation in a course because they already knew the material beforehand and so didn't learn anything is completely wrongheaded and abhorrent.
Whoa, Nelly! No one is talking about refusing degrees to students who read too much. I don't know where you got that idea. It definitely wasn't from Jody Greene.
Right now, attending university is both an education and a credential. These two goals are in tension, and both are compromised as a result. Students should be able to pursue an education without fear that trying something challenging will permanently affect their career. Employers looking for proof of knowledge shouldn't care where or how that knowledge was learned.
>To "tell whether you are graduating people with skills", you need a test of those skills.
Right.
>That's a grade.
Wrong. Traditional letter grades are one way of measuring mastery, but the interviewees in this article describe a few alternatives.
Sadly they don't describe these alternatives in detail. I wish this article were a bit more nuts-and-bolts. It leaves more questions than it answers.
>The grade is assigned by the lecturer of a course, as a confirmation that the student knows a thing or two about certain fixed topics.
That's an idea of what grades are and what they're for. It's not the idea. I would call this an ideal, often fallen short of, and often misguided.
In practice, grades are used in all sorts of ways for all sort of reasons. Many times, they are used for no good reason.
The idea that tests are impartial accreditation is abstract. It rarely plays out in reality. There's a place for GEDs, bar exams, TOEFLs and whatnot... but it's limited.
>It is extremely important to intellectual freedom that the time and place where some knowledge or skills were attained is considered immaterial.
> The idea of someone not getting accreditation in a course because they already knew the material beforehand and so didn't learn anything is completely wrongheaded and abhorrent
It's abhorrent only within the accreditation framework of ideals. This just isn't reality. That objectivity ideal for education is always betrayed because it doesn't work.
People don't value exam results, in their own right, because they're not that valuable in their own right. A course's job irl is to educate, not accredit.
> That's an idea of what grades are and what they're for. It's not the idea.
Can't see how there can be any other idea for grades as long as they're being used as a measuring stick for completing a class; if they aren't, something else will just be used under another name. You need to know enough about a topic to pass a class, and there needs to be a way to measure that. Sure, the measuring can often be flawed, but unless the class itself is a "learning to learn" class, the goal can't be to maximize how much you learned about a topic instead of whether you know the topic presented.
> People don't value exam results, in their own right, because they're not that valuable in their own right. A course's job irl is to educate, not accredit.
I agree that the commoditization of higher education is horrible, turning something that should be about curiosity into a (now often needlessly mandatory) career stepping stone. But the "free" curiosity part of it should be open classes and such. Even in this kind of more relaxed regime, you still have limited resources such as labs that need to thin out their options for candidates, and accreditation will come into play sooner or later for those who wish to go into serious research, particularly in areas where you actually need access to tools that realistically only research labs have. This isn't a huge problem in maths or computer ""science"", but in most hard science fields you just can't do advanced research without specialized equipment.
But a course is part of a larger system which is built on accreditation. Universities no longer have a monopoly on information/education, so accreditation is really the main value they hold. Erode that and there's not much vocational reason to go to college (we could argue about the role of higher education, but it's been more vocationally focused than civically focused going back to the Morrill Act in the mid-1800s).
I would guess 80% of the career value is captured by satisfying admissions and graduating. I think the actual grading isn't looked at except to rank similar peers for admission to other competitive programs.
This is exactly why companies should drop degree requirements. If 80% of the degree requirements are met when you graduate from HS (and I would say it's more than 80%), then why make them wait 4 years and take on a pile of debt?
No, this is a sales tactic to improve customer retention.
By lifting restrictions, anxiety, undue pressure, the customer (freshman) is more likely to commit to the product (4-year degree). After the customer has committed for a year, they are less likely to change or drop the brand (the college). Then it's business as usual. Four full years of tuition, less customer churn.
They don't care about their students "adapting". They care about lots and lots of money.
Not really. This is needlessly cynical. The transition to college is a big one for many students, with a whole lot of "it depends". At a hard school, students are put in a situation where they might have to manage time and learn to study for the first time, and many learn some very hard lessons along that path.
MIT has had first years go pass/no record for decades for exactly this reason. And I see the exact same pattern in my undergrads at CMU.
(And yes, we really do care about our students adapting, at least at the faculty level. We're educators, for crying out loud. You don't think we take the industry pay cut because we feel guilty about earning money, do you?)
>No, this is a sales tactic to improve customer retention.
This is one of Jonathan Haidt’s points about what has led to many of the problems at universities. When the “students” are treated as “customers” it causes universities to work against their stated missions in order to placate their customer base.
Say what you like but... students ARE the customers. When the state/local government, taxpayers, etc. paid for education, they could reasonably have a say. But that's not the case anymore. And if I am being asked for $100k, I damn well expect a few parties, a nice gym and a low-stress experience.
Silly as this sounds, if it's just freshman year, that would mean that your graduating GPA is weighted more heavily towards non-intro classes, which I view as a plus overall.
My chops as a computer science graduate were about my mastery of algorithms, compilers, and automata when I was 22, not of discrete math when I was 18.
I went to UCSC, which at the time was known for not having grades. Instead, teachers wrote "narrative evaluations" which were slightly more personalized and detailed. In most science classes, the evals were generated programmatically based on your test scores.
When applying to grad school, I had to provide a GPA so I made an estimate (which I clearly stated was an estimate) and gave myself a 3.5.
I believe UCSC has moved towards having more conventional grades. But, for me, not having grades was really nice. I hated the whole race to get high scores. But I don't think narrative evals really make much sense in classes with 100+ students.
They had grades by the time I went there and I used them for grad school applications.
But, they still have the narrative evals (I hope that's still right!).
My evals were critical for my grad school applications. All the PIs that I talked with said that they reviewed mine, mostly because it was so unique to have a 50+ page transcript. Their reviewing of my evals was what got me into a few places.
Personally, I treasure those evals. Some were pretty bland and report-y, but enough are good descriptions of myself at a young age from otherwise unbiased observers. Those deeper and longer evals are fantastic and really helped me grow as a young person. Taking them seriously and chewing through them allowed me to choose the right major and the right path, for me, at the time.
Likely, they're gone by now, subsumed into the grind of academia. But I hope they aren't. They were really special and really helpful.
I don't think my grad program read my narrative evals. Their process is to basically make a stack of applicants and anybody with a GPA lower than 3.5 or so is rejected without any further inspection. An application without a GPA would also be rejected immediately. Fortunately, my conversion table ("Excellent = 4.0, Very Good = 3.5") made the cutoff!
> I went to UCSC, which at the time was known for not having grades. Instead, teachers wrote "narrative evaluations" which were slightly more personalized and detailed. In most science classes, the evals were generated programmatically based on your test scores.
Apparently UCSC didn't have grades for the first 35 years of its existence:
When I was in high school I could ace most of it without studying at all. I thought it was great at the time, but the shitty part - only apparent later - was that I failed to develop good study habits.
When I started at university in a technical major, it was a real kick in the teeth. The material was substantially more difficult, but the hardest part for me - and the biggest piece of learning - was the realization that I had to study consistently and be purposeful about managing my time in a way I never really had to before I got there.
My grades freshman year were abysmal. But they also helped me internalize that I needed to buckle down and put in the work. I'm glad that that happened in a relatively forgiving environment like school, instead of in the workplace.
MIT's had pass / no-record for freshmen for some time now.
WPI has had similar things to the point where getting Fs wiped off your record can result in what was called "snowflaking" or a blank transcript for a term...
RPI uses whole letter grades. A/B/C/D/F. Those are the only valid grades. A high GPA was far from guaranteed, because A- to B+ is a big shift.
Overall, I think Pass/No Record is good. I think it results in less stressed-out freshmen, especially at elite schools, where some of them will get their first B or C later on. They need to understand, as a friend so eloquently put it:
At MIT, you take all the kids who sit in the front of the class, and put them together. Guess what... at MIT there will be a kid at the front of THAT class.
(And likely... it isn't you.)
People need time to adjust their world view; otherwise the suicide rates etc. aren't pretty, and they aren't great as is. Many of us know someone who took their own life, or who just left after falling apart.
Depression and suicide (unsurprisingly the second most common cause of death after accidental injuries such as car crashes) are serious problems among college students. I expect that inflexible and punitive grading and/or academic policies don't help.
It's heartbreaking to hear of a student dying from suicide. These are young people with amazing potential. And they're our fellow humans.
HN story where someone got a CS degree in three months by speedrunning the tests: https://news.ycombinator.com/item?id=25467900
> My personal record was 14 courses [...] I failed three of them, but that was still enough to stay afloat.
That does not inspire confidence at all.
But also, you know what they call the dude who got C's at medical school? Doctor.
> They don't care about their students "adapting". They care about lots and lots of money.
What better lesson could the colleges teach them to prepare them for the business world?
My chops as a computer science graduate were about my mastery of algorithms, compilers, and automata when I was 22, not of discrete math when I was 18.
When applying to grad school, I had to provide a GPA so I made an estimate (which I clearly stated was an estimate) and gave myself a 3.5.
I believe UCSC has moved towards having more conventional grades. But, for me, not having grades was really nice. I hated the whole race to get high scores. But I don't think narrative evals really make much sense in classes with 100+ students.
They had grades by the time I went there and I used them for grad school applications.
But, they still have the narrative evals (Please be right!).
My evals were critical for my grad school applications. All the PIs that I talked with said that they reviewed mine, mostly because it was so unique to have a 50+ page transcript. Their reviewing of my evals was what got me into a few places.
Personally, I treasure those evals. Some were pretty bland and report-y, but enough are good descriptions of myself at a young age from otherwise unbiased observers. Those deeper and longer evals are fantastic and really helped me grow as a young person. Taking them seriously and chewing through them allowed me to choose the right major and the right path, for me, at the time.
Likely, they're gone by now, subsumed into the grind of academia. But I hope they aren't. They were really special and really helpful.
Apparently UCSC didn't have grades for the first 35 years of its existence:
https://www.sfgate.com/education/article/UC-Santa-Cruz-To-St...
I think it's good for UC to have different campuses with different academic environments.
When I started at university in a technical major, it was a real kick in the teeth. The material was substantially more difficult, but the hardest part for me - and the biggest piece of learning - was the realization that I had to study consistently and be purposeful about managing my time in a way I never really had to before I got there.
My grades freshman year were abysmal. But they also helped me internalize that I needed to buckle down and put in the work. I'm glad that that happened in a relatively forgiving environment like school, instead of in the workplace.
WPI has had similar policies, to the point where getting Fs wiped off your record could result in what was called "snowflaking": a blank transcript for a term...
RPI uses whole letter grades. A/B/C/D/F. Those are the only valid grades. A high GPA was far from guaranteed, because A- to B+ is a big shift.
Overall, I think Pass/No Record is good. I think it results in less-stressed freshmen, especially at elite schools, where some of them will get their first B or C later on. They need to understand, as a friend so eloquently put it:
At MIT, you take all the kids who sit in the front of the class, and put them together. Guess what... at MIT there will be a kid at the front of THAT class.
(And likely... it isn't you.)
People need time to adjust their world view, or the suicide rates aren't pretty, and they aren't great as is. Many of us know someone who took their own life, or who just left after falling apart.
It's heartbreaking to hear of a student dying from suicide. These are young people with amazing potential. And they're our fellow humans.