Readit News
light_hue_1 · 2 years ago
As an ML researcher I'm sad to say that this is painfully out of date. It's definitely a course from 5 years ago. Totally irrelevant as an intro to NLP today.
fantispug · 2 years ago
It covers a lot of the fundamentals in some detail (attention and transformers, decoding, transfer learning) that underpin current cutting-edge NLP; this is still a very strong foundation, likely to remain useful for several more years.

What might be missing is in-context learning, prompt engineering, novel forms of attention, RLHF, and LoRA (though it covers adapters), but all of this is still changing rapidly and the details may be irrelevant in another year. If you have a look at a recent course like Stanford CS224N 2023, there's a lot of overlap.

linooma_ · 2 years ago
Has much changed as far as parts of speech tagging in the last 5-10 years?
screye · 2 years ago
The sad truth is: all of classical NLP is Dead, with a capital D.

The bottleneck for accuracy was always data quality and human effort, not model architecture.

LLMs make the data and human-effort problems so much easier that the benefit of supporting different architectures just doesn't make sense anymore. With quantization, I'm not even sure classical models win out on cost, and they had already lost on (real-world) accuracy.

LLMs are the O365 subscription that you just can't fight against with a bespoke mini solution. An all-in-one solution is simply too appealing.

Also, if you have to learn pre-2020 NLP, I would just learn to use spaCy. It pretty much covers all of pre-2020 NLP out of the box in a well-documented package with strong GPU and CPU support.

nvtop · 2 years ago
A lot. POS taggers used to be linear classifiers + features. In 2018 they switched to BERT and similar encoder-only models. In 2023, POS tagging is largely irrelevant: it used to be one stage of a larger pipeline, but now you can do everything end-to-end with better accuracy by fine-tuning a sufficiently large pretrained model (an LLM, or an encoder-decoder like T5).
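For concreteness, the pre-BERT setup described above — a linear classifier over hand-crafted features — can be sketched in a few lines of plain Python. The feature templates, tag set, and weights here are made up for illustration, not learned from any real corpus:

```python
# Toy sketch of a pre-neural POS tagger: a linear model scoring each
# tag from hand-crafted features (suffix, capitalization, previous tag).
# Weights below are invented for illustration, not trained.

def features(word, prev_tag):
    """Extract a sparse feature dict for one token."""
    return {
        f"suffix3={word[-3:]}": 1.0,
        f"is_title={word.istitle()}": 1.0,
        f"prev_tag={prev_tag}": 1.0,
    }

# Hypothetical learned weights: feature -> {tag: weight}
WEIGHTS = {
    "suffix3=ing": {"VERB": 2.0, "NOUN": 0.5},
    "suffix3=the": {"DET": 3.0},
    "is_title=True": {"PROPN": 1.5, "NOUN": 0.3},
    "prev_tag=DET": {"NOUN": 1.0, "ADJ": 0.8},
}

TAGS = ["NOUN", "VERB", "DET", "ADJ", "PROPN"]

def tag(sentence):
    """Greedy left-to-right tagging: pick the highest-scoring tag per token."""
    prev, out = "<s>", []
    for word in sentence.split():
        scores = {t: 0.0 for t in TAGS}
        for feat, value in features(word, prev).items():
            for t, w in WEIGHTS.get(feat, {}).items():
                scores[t] += w * value
        best = max(TAGS, key=lambda t: scores[t])
        out.append((word, best))
        prev = best
    return out

print(tag("the running"))  # → [('the', 'DET'), ('running', 'VERB')]
```

A real system (e.g. the averaged perceptron taggers of that era) would learn the weights from annotated data and use many more feature templates, but the scoring loop is essentially this.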
thierrydamiba · 2 years ago
What resources would you recommend for someone learning about LLMs?
victorbjorklund · 2 years ago
Definatly not responsive design but syllabus looks promising
hackernewds · 2 years ago
Definatly not the best critique