(context: Jon Skeet is (was?) the highest-rated user on StackOverflow.)
That's starting to change now - this AI is getting good, powerful, and alarmingly convincing. I still don't feel like the AI apocalypse is inevitable, but it's starting to feel possible, and it makes me uneasy.
I'm happy about this feature, but they don't really elaborate on what these "limitations" are, except for a passing reference later on to "making the parsing engine slower".
Anyone know what the issue is exactly? Why can't they implement it the other way without making the parsing engine slower?
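If I understand the history right, the issue is that a nested selector starting with a bare element name is ambiguous with a property declaration until the parser has read quite far ahead, which is why the earlier drafts required nested rules to start with `&` or another non-identifier character. A rough sketch of the ambiguity (class names made up for illustration):

```css
.card {
  color: red;   /* a declaration: identifier, colon, value */

  & span {      /* nested rule: fine, "&" signals a selector immediately */
    color: blue;
  }

  /* span:hover { ... }
     A nested rule starting with a bare element name reads just like
     a declaration ("span" could be a property, ":" follows) until the
     parser hits the "{" — so allowing it naively means lookahead or
     re-parsing, i.e. a slower parsing engine. */
}
```

That's my understanding of the trade-off they were alluding to, not something the article spells out.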
Nesting is great to avoid selector repetition.
Custom properties (CSS variables) and CSS functions give us tools to avoid magic numbers and to encode layout relationships.
New sets of selectors and container queries let us decouple and re-use declarations more.
CSS Houdini gives us further power in extending CSS functionality.
Of course there are things SCSS provides that are unlikely to be in scope for future CSS proposals, such as data structures (maps), loops to generate classes, and mixins. But I can already imagine a world where those aren't necessary anymore in many cases.
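For anyone who hasn't tried these yet, here's a rough sketch of the features above in plain CSS (the class names are made up for illustration):

```css
.card {
  --gap: 1rem;                       /* custom property instead of a magic number */
  padding: calc(var(--gap) / 2);     /* CSS function encoding a layout relationship */

  /* native nesting: no need to repeat ".card" for every child rule */
  & .title {
    margin-block-end: var(--gap);
  }

  &:hover {
    outline: 1px solid currentColor;
  }
}

/* container query: style elements based on their container's size,
   not the viewport's, so the component stays decoupled from the page */
.sidebar {
  container-type: inline-size;
}

@container (max-width: 20rem) {
  .card .title {
    font-size: 0.9rem;
  }
}
```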
Reminds me of how CoffeeScript died out because its best ideas got integrated into JavaScript.
What I want to know is: how the hell did this take so long? Nesting is such an obviously useful feature to have in CSS, and once I'd experienced it in Sass I never wanted to be without it. Why did it take 10+ years for this to be introduced to CSS, especially when other browser-based technologies like JS have evolved enormously in that time?