This is an interesting move, and I really don't think we are getting the whole story on what caused such a drastic change. It's interesting because, for the most part, I think it is safe to assume that most Basecamp employees are liberal leaning, including the founders. My first thought was: why not just deal with these specific problem employees individually? Well, surely they must have tried. I can't imagine going public with such a drastic change if they didn't first at least try to correct the behavior of misbehaving employees. What counts as misbehaving? I would classify it as slinging personal insults/threats across the room. Would you fire these employees if they didn't stop this behavior? I'm not sure. It doesn't seem like something Basecamp would do. In fact, it might make things _worse_ for them from a PR perspective if they did, even if the employees were completely out of line for appropriate workplace conversation.
So, not sure. In this situation, I would give them the benefit of the doubt (other than on the poor communication). I've never worked at a place where politics was discussed heavily in either work chat or in person (outside of friends talking with friends, which IMO is perfectly acceptable), but I've seen some conversations (such as the leaked GitHub chat) that are just plain toxic, and I wouldn't feel comfortable rebutting extreme views without fear of retribution. I personally don't think people should have to deal with that in a work environment, but that's just my 2c. I guess the question is where you draw the line, and internally, they must have hit that point.
I also suspect things weren't necessarily toxic within Basecamp. Given their recent book title "It Doesn’t Have to Be Crazy at Work" and activity on Twitter, maybe they just see the writing on the wall and would prefer to nip this in the bud. "Political talk" (however we're defining it) has become all-consuming in the past year and perhaps they recognize it as a big distraction for anyone trying to get work done.
It is a privileged stance, but I'm sure there are many who appreciate the workplace being a venue where employees are encouraged to talk about other things.
Another solution I've seen was to play a simultaneous video feed of the slides only. Not particularly efficient. Would love to see something like this instead for instructional content!
Super-resolution is only guessing. It's ok for art, not for critical tasks.
Who knows how this evolves and what new applications people may devise? For today, I agree: it's just art.
1. I got the impression that he felt this "fauxtomation" was the final iteration of any service practicing it.
To me, "fauxtomation" follows a fairly established startup methodology of doing things manually before automating. It gives engineers enough time to observe the full scope of the problem before attempting to automate it. In addition, the manual process is presented to the end user as a sort of API, while true automation efficiencies are implemented incrementally by the in-house team.
The end goal of these services is almost certainly full automation. More broadly, I suspect this is the intended future of entire gig economies ("fauxtomation" at scale) created by the likes of Uber and DoorDash, who are pushing for further advancements in technology to replace the comparatively high cost of human labor.
2. Where was the "partial automation" alternative to "full automation" in this article?
It seems that most of the described technologies fall short because the scope is too big. They do not allow enough flexibility for fixing (and learning from) errors or accommodating unplanned use cases. If a product (like self-service checkout) typically requires training for an employee to use properly, it is not ready for the average consumer.
I have heard the following quote used frequently:
> What can be automated, should be automated.
What is not given enough airtime is the inverse:
> What cannot be automated, should not be automated.
The immediate future is more cyborg than robot. A modern website is an assortment of modular APIs glued together. Some have referred to smartphones as a human appendage, and I think this is the proper analogy to view technology as a whole over the next decade: upgrades rather than replacements to our biological limitations.
Instead of replacing an entire industry with an industry-sized vending machine, large numbers of untrained workers will slowly be replaced by a smaller number of trained practitioners who know how to use better tools. What this will mean for society as a whole, though, is anyone's guess.
Also, if we do see strong evidence of centralization in certain areas, does that mean the trend will continue in the future? I wonder if we are entering a new phase (#3):
1. 1970-2000 - decentralized, and fringe.
2. 2001-2020 - centralized, and mainstream. eBay/Airbnb as example platforms that normalized online transactions and acted as a single trusted intermediary.
3. 2021+ - decentralized, and mainstream. A substantial majority now accepts the internet. Where the intermediaries once provided a service, they may now be perceived as an obstacle.