No real surprise. A good number of people who have grown up with the promise of the tech are pretty well disgusted by what it's turned out to deliver, which is mostly "A handful of multibillionaires treating the rest of us as sets of eyeballs to be monetized at all possible costs."
I'm certainly there. I grew up with the promises of the internet, and I have to agree with Doctorow about "Enshittification." Most of the promised stuff has turned out to have some pretty nasty side effects and consequences. Turns out, humans don't scale to a global conversation very well, and especially not when your goal there turns into "ensuring they see as much of the platform as possible to view your ads."
That's before getting into the fact that we can't trust computers in the slightest, because they're too complex for even the people who make them to reason about, and our software is a hot mess - but, hey, we have tools to bring in all 2700 un-audited dependencies for the Electron app! Hey, where'd my crypto wallet contents go? Huh. Find someone who's done computer security for long, and they'll either be a weird off grid prepper or be planning for something of the sort, with nothing more complex than a microcontroller or two.
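If you want a sense of the scale, here's a minimal sketch (assuming an npm v7+ package-lock.json, lockfileVersion 2 or 3, where every installed package shows up under "packages") that counts what a typical Electron-style project actually pulls in:

    // Rough sketch: count the transitive npm dependencies an app pulls in.
    // Assumes an npm v7+ package-lock.json (lockfileVersion 2 or 3), where
    // every installed package appears as a key under "packages".
    import { readFileSync } from "node:fs";

    const lock = JSON.parse(readFileSync("package-lock.json", "utf8"));
    const packages: Record<string, unknown> = lock.packages ?? {};
    // The "" key is the root project itself, not a dependency.
    const count = Object.keys(packages).filter((p) => p !== "").length;
    console.log(`${count} packages installed - how many have you actually audited?`);

Run that against any non-trivial Electron project and the number lands in the hundreds or thousands, every one of them a trust decision you never consciously made.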
We've tried, for north of a decade, and with a solid couple years of effort, to build human interaction with various forms of consumer tech intermediating all the interactions, and it's been an unmitigated disaster in no shortage of ways (David Sax's book The Future is Analog is a good survey of the topic). I've been having good results lately returning to analog human interaction around a campfire on a regular basis.
That's before you get into the slave labor, near slave labor, and "I Can't Believe It's Not Slave Labor" that goes into pretty much all our modern devices - from the cobalt on up (Cobalt Red by Kara is a good read here, Dying for an iPhone is relevant, and there's no shortage of others). It's nasty, and behind every promise to do better seems to be some mechanism or another to further obfuscate the human labor going into the modern short lived electronics (because, of course, long lived devices are bad for profit).
So, yeah. Good for him. The tech thing has rotted. Let's try something different.
There is a pervasive hubris throughout tech that I’ve had trouble understanding: namely, that the mechanisms of human interaction between individuals and with the world are well understood, and are thus replicable.
Their reductive view of humans as manipulable minds in a mechanical sack of flesh has led us to nothing but isolation and disconnect.
But the truth is, we are integrated beings integrated in a world we do not and may never fully understand. Tech should thus help us deepen and enrich our sense of embodiment. Instead, it has done the exact opposite.
The hubris in tech comes in a variety of forms, but you've definitely hit on one of them. It meshes well with the view of humans you see from the self driving companies - "Humans process the world with a couple crappy cameras and some neural network stuff, we have HD cameras, how hard can it be?" Well, a decade or so later, "Really Hard" seems to be the answer.
There's also a consistent trend of "We know code, and are as like gods in the synthetic world of 99.95% reliable APIs and networks, therefore we can do anything we want in reality!" Reality, of course, disagrees, often entertainingly.
But these are the mentalities driving a lot of what runs our world, and it's quite frankly terrifying to watch in implementation.
This strikes so near & dear to my big feels. Tech sees itself as the reasoned expert intermediary. We have personas and product to guide us to the ideal solution.
But the truth is, tech is better when it's not so pretentious. We don't know what will come and planning for that is what we miss.
We should be optimizing for humility, for our limited capacity as techies to foresee & envision. Gibson's maxim for the highest wonder computing could deliver - for what soft wares should be associated with - is "The street finds its own uses for things."
Softness. Humble & open futures. Banking on human spirit. Instead, the tyranny of pre-designed interface, talking down to us forever.
> Their reductive view of humans as manipulable minds in a mechanical sack of flesh has led us to nothing but isolation and disconnect.
I couldn't agree more with this. It's the most stupid way of looking at living beings possible: as a separate, disconnected piece of "hardware" running "software".
Walk outside without shoes on, in the grass, breathe the air and take a swim in the ocean and you'll soon realize that you might be made of "meat", but you're also made of everything around you and you're part of it, just like everything else.
When people describe themselves or others in that way, they're describing their value in the context of this horrible economic machine we have built and they're trying their best to understand how they fit into this system.
We did know, and there are many experts who could do a good long exposition of "I told you so," but they were pushed aside by CEOs and business peeps. Or "bozos," as Guy Kawasaki would say: "The higher up you go, the less air there is."
Some technology has done that; other technology has done as you would hope. Intent is everything.
If it comes from a shareholder-driven biz floated on the stock exchange, expect profits over innovation and people. If it's FOSS and driven by problem-solving, passion and grants, you might be in for something nice for a time.
At this point, we seem to be making worse versions of everything with the goal of just having something new and different to show. Few technologies reach the maturity of what came before them, since they end up replaced before they get that far; as a result they don't appeal to power users, and most are just figuring out ways to milk the largest market demographic possible.
Sad state of affairs. It doesn't have to be this way.
Anti-monopoly legislation and strong unions would work. They worked the last time things got too centralized, the era of railroads and steel.
As with railroads and steel mills, what we now call "tech" has a scaling property which pushes towards monopoly. So there needs to be some force pushing back hard against monopolies.
It is a shame there isn't more creative discussion around ideas for regulation. Perhaps something as simple as "corporate tax rate should scale with revenue" might have prevented the walmarts of the world from eating the mom-and-pops.
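Purely as an illustration of the idea (the brackets and rates here are invented for the example, not an actual proposal), a revenue-scaled rate could be as simple as:

    // Illustrative sketch of "the corporate tax rate scales with revenue":
    // the rate applied to profit climbs with gross revenue, so a corner store
    // and a big-box chain don't pay the same rate. Numbers are made up.
    const RATE_BY_REVENUE: Array<[number, number]> = [
      [1_000_000, 0.10],          // [revenue ceiling in $, rate on profit]
      [100_000_000, 0.25],
      [10_000_000_000, 0.40],
      [Infinity, 0.55],
    ];

    function taxOwed(revenue: number, profit: number): number {
      for (const [ceiling, rate] of RATE_BY_REVENUE) {
        if (revenue <= ceiling) return profit * rate;
      }
      return profit * 0.55; // unreachable - the Infinity ceiling catches everything
    }

    console.log(taxOwed(500_000, 50_000));                  // small shop: 10% -> 5,000
    console.log(taxOwed(500_000_000_000, 50_000_000_000));  // megacorp: 55% -> 27.5 billion

The mechanism is trivial; the hard part, as ever, is the politics of getting anything like it passed.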
I would be concerned that globalization has changed the economics. Railroads and steel mills didn't have to seriously compete with other countries, so you could break them up and they'd still be viable.
Apple's value proposition is that fundamentally everything they make plays well with each other, and that investments in one product frequently pay off in other (often future) products. Break that into silos and they're individually less compelling.
Americans are too busy being preoccupied with what's woke, whether teachers should say gay to kids under a certain age, and whether the same kids should be allowed to see… drag shows? Also what books can be allowed in libraries, what textbooks should say, and whether people should be able to wear guns visibly in holsters.
I'm not sure any of this is exclusive to tech and so abandoning tech in hopes of something else isn't going to change much.
For instance, you bring up how much of it is about ads and how pervasive they are; this exists outside of tech as well. Billboards everywhere, posters on apartments and buses - this is what happens without strict regulations.
Another point you bring up is not trusting computers because of their complexity, but that's everything. We don't trust car engines because we understand them; we instead have regulations under which, when things do fuck up (and they inevitably do), creators are punished and learn not to do it again. You might say car engines are tech, in which case let's talk food! You trust eating some place not because the chef is some talent blessed by god or your sixth sense, but because the restaurant would have been shut down when health inspectors came around.
I don't have to talk about slave labor here, it's obvious.
Rushkoff doesn't even agree with the idea that tech is rotted and we should try something different; rather, he argues the primary focus of each innovation should be humans and not money. That isn't happening currently in a majority of fields; tech is just one of those where it's exacerbated by how little it's been regulated. Again, something Rushkoff brings up in his potential solutions.
Advertising is as old as humans, certainly. But there's also a general difference, as I see it, between "the same for everyone" sort of broad advertising (billboards and the like), vs the "customized for each person, to the benefit of the payer" sort of thing we see on the internet - and have seen regularly abused for all sorts of nasty ends (see any time politics and Twitter or Meta end up in the news).
As for engines... yeah. We're just going to have to disagree there. I understand engines pretty well, and am familiar with roughly the last 100 years or so of them in various forms. I work on them, can reason about them, and they're generally a fairly simple set of electrical and mechanical systems. I'm not sure what your point about regulations actually is - I don't believe we have reliable engines because of regulations, but because "the companies who figured out how to build reliable engines" rather outsold those who built 100k mile engines. And if you trust health inspectors and restaurants, well, you probably don't know many people in food service. I also don't eat out that often.
What's obvious about slave labor, though? That literally every bit of consumer tech has aspects of that somewhere in the supply chain? I agree! It probably means we shouldn't be using the stuff.
Dis-intermediating & making tech real again, a genuine experience, feels like the only conceivable path through. I think you're both super on key here, but the analog mentality feels like trying to gear back down when I strongly think the only way out is through. Give the luddites control of the machines, don't smash them!
> solid couple years of effort, to build human interaction with various forms of consumer tech intermediating all the interactions, and it's been an unmitigated disaster in no shortage of ways.
You've hit on so many issues, recapped well, but the huge blank this article leaves - and that Rushkoff's ask to abandon tech leaves - is: what could we hope for?
Right now we are beholden to software. We need to figure out how to make the rawer real experience tractable, interesting, & engaging. Something between coding & systems will emerge that lets us not just be ignorant consumers. We can start to create a hypermedium which enables literacy, not merely enabling spectacle.
A while back I saw the original unveiling of the NeXT Cube, when Steve Jobs called computers a bicycle for the mind - it was so apparent in what they demoed.
A machine that helped people create rather than just consume, even simple things like having an inbuilt dictionary. We still have that, but it is always running off to the internet to do what was done locally in 1987. And that leap to something external is not just a matter of convenience; it is a leap in how we use these things.
Like Rushkoff, for the last 15 years I have been trying to figure out ways to return a little of that isolated but functional style to computing. I no longer try. Every attempt has just been met with further disappointment. Unlike others here advocating for pushing through, I don't think we can. Pushing through merely provides more fuel for potential misuse.
I still use modern technology, but only that which is basically forced on me by the rest of society. Even then its impact is minimized - I think Hacker News is the last regular thing I really engage with in this space, and that is slowing down with time as well.
Maybe spectacle as the main driving force is part of what allowed these technologies to become so pervasive in our world? That is to say, if we had never gone down this path, computers would be treated more like we treated typewriters: a neat tool, but something few would get obsessed with.
I think it's an open question as to if we can turn our current tech into something human-centric and not violently human-toxic at some level or another.
And I'll argue that it's really not. And that, yes, we should "gear back down," and stop using nearly as much. I've been on a gradual trip that way over the past half decade or so, and it's pretty darn nice. Tech has been my entire career. I look forward to the day I can put a computer down for the last time.
You're arguing that "something like what a lot of people hoped computers and the internet would become" can still happen, and I think the time window on that has closed. Within a rounding error of "nobody" understands computers deeply anymore - and most who still can keep up with the low level stuff are in their 40s or older, grew up in the ISA era, and generally are looking to retire. An entire generation doesn't know how computers work now, for an awfully wide range of definitions.
But unfortunately, even if you could "give the luddites control of the machines," most of those neoluddites don't want control of the machines. They've tried them for a few decades, and have found an awful lot of it wanting. And if you do insist on asking them, you'll find out that most of them lurk on IRC, and are quite happy with the 80s level of interaction IRC allows. :)
> A good number of people who have grown up with the promise of the tech are pretty well disgusted by what it's turned out to deliver (...)
I grew up with the PC and matured with the Internet, staying optimistic along most of the journey even when I didn't view some of the outcomes favorably. When I read about the history of computing these days, even the accounts that glorify progress, the interpretation is dark. It's not that I view the technology as bad. I have seen the good it can bring and I see the potential for how much more it can improve society. That said, I now believe that the promises were lies. The machines of my childhood - the "computers for the masses, not the classes," or the "bicycles for the mind" - lost their luster as the great equalizers once I realized these slogans were enablers for large corporations that were attempting to consolidate control over the industry.
Yes, I realize that interpretation is horrendously unfair. A lot of people at a lot of levels within those corporations probably believed in the social value of what they were delivering. When you look at what they were replacing, such as relatively inexpensive microcomputers for the consumer market replacing expensive computers for the business market, there is some truth to those promises. Yet it is also difficult to see those promises as anything more than ideals exploited by opportunists for their own personal gain. They didn't really care about the outcomes, so the outcomes never ended up reflecting the promises.
Things started going downhill when the "establishment" took over - when computers became a surveillance tool, and stuff like the Intel Management Engine, AMD "Secure" Technology, Intel AMT, etc. started getting pushed down our throats with nary a whimper from any major company about the impact on security and privacy.
It became clear to me then that we are the slave class. The rulers will have us use technology the way they want us to - for their benefit, not ours.
Things have gone downhill when spyware and spam stopped being considered a crime and became a viable, socially-acceptable business model.
Nobody needs to use Intel ME to spy on or control people when people willingly install malware such as social media apps, give them all their data and their network effects force others to do the same.
Intel ME/etc is the least of my worries when it comes to computing freedom.
I cry for what could have, should have been. We went to the moon with 2kb of RAM. Far more engineering effort than went into Apollo now goes into manipulating humans to be insatiable ingesters of disposable physical and digital product. Can you imagine how rich and beautiful human culture could be if they had allotted even some of these resources towards unselfish genuine innocent curiosity?
It's very refreshing to see such a sensible thread on HN of all places. A lot of people in the industry need to take a good hard look in the mirror and evaluate what they've done with their careers.
The biggest puzzle we've got to solve right now is, how do we make people realize that Karl Marx was not the devil, but in fact, the most perceptive thinker of his time who put his finger right on the problem that bedevils us today?
People have this lazy idea of the word "capitalism," which they consider to be "our not-communist system".
Capitalism is the use and espousal of Capital.
Money is not Capital; money is a representation of labour done and value created.
Capital is when you take money and frankenstein it into acting like a magic vegetable/seed which you can plant in its current form and harvest in the same form and then plant some more of in that same form and then harvest more in the same form and it never changes, just magically grows through the mystical power of what Economists call "externalities".
Externalities are mysterious forces that we don't really understand, but which provide us with resources to use, and according to our model, no harm is ever done.
We need to start treating money like money. Capitalism needs to be snuffed out. Work, business, trade, all that can (indeed, must!) go on, but Capital must end.
Can you explain how you envision that would work? Does the state own all capital to prevent people from accumulating capital, profiting off that and accumulating more?
Postman is well, well worth a read. His observations on TV, before the internet was ever really more than a curiosity, can be just as easily applied to what the internet tried to do.
The problem is that we're not, collectively, any good at actually asking any questions before deciding some new bit of technological wizardry is worth using. We don't consider the opportunity cost (that's a lost concept these days), and the running question seems to be nothing beyond "Can I imagine some possible way in which I might find this useful?" If it turns out not to be, well, we're already in the system, and sunk cost fallacy applies hard.
I read Technopoly not too long ago, and by the end I found myself grieving that Postman still wasn't alive to write about the world we live in today. He predicted it in the early 90s, I can only imagine what he would have to say now, 30 years later. I can't imagine it'd be good.
The answer is "yes" to both. We're changing some kinds of suffering and changing who it affects without really solving the fundamental problems humanity is facing. "Playing Tetris with problems" is how my father described it to me.
Nice article. It engagingly tells the 30 year evolution of Rushkoff's Philosophy of Technology from enthralled technophile to jaded techno-revile... one Rushkoff book at a time.
At the same time, Malcolm Harris' able writing inevitably reminds the reader of the myriad changes to WiReD magazine over the same period -- from the premier forum for techno-evangelists of every stripe, to whatever it's become 30 years on: a mostly harmless mainstream media outlet that, every so often, shows a little spark of revisiting its former glory.
As a long time subscriber, thanks for the memories, Malcolm.
"""
So what answers does Rushkoff offer? His programmatic conclusions in Survival are surprisingly conventional: “Buy local, engage in mutual aid, and support cooperatives. Use monopoly law to break up anticompetitive behemoths, environmental regulation to limit waste, and organized labor to promote the rights of gig workers. Reverse tax policy so that those receiving passive capital gains on their wealth pay higher rates than those actively working for their income.”
Technology isn't entirely bad; many aspects of our modern digitized world are indeed good, and the vast majority of the world at least tacitly agrees with their existence (Doug Rushkoff certainly doesn't stop himself from fully using the "digital revolution" to promote himself).
But one can believe that and also viciously reject swathes of the dehumanizing, people-are-data-KPIs-and-eyeballs-to-be-monetized surveillance world of algorithmic controls, non-human interfaces, and other grotesque idiocies perpetuated by modern tech megacorps and the smaller techcorps that follow their lead. Not to even speak of governments taking on the same attitudes. This second aspect of the digital revolution is certainly grotesque and very dangerous.
Not really. It's too diffuse. The closest thing we have to a counterculture is the MAGA movement, and their leader is a billionaire. No mob is marching up to 3000 Sand Hill Road yelling "String them up!"
I've been following Douglas Rushkoff for about a decade now. He's written books about this for the last 10-15 years, and his podcast 'Team Human' interviews some very interesting people.
I don't know about this article, but Douglas is as real as they come. I'm not sure why people here are being cynical.
Probably because The DR is as real as they come, and "it's difficult to get a man to understand something when his salary depends on not understanding it". Who wants to be told they're the reason civil society broke? Unfortunately for them, Nuremberg established a precedent where "just following orders" is not an excuse.
Instead of empathy and understanding others' worldviews, they think everyone shares their view and should be happy.
Postman's 6 questions about technology:
1. “What is the problem to which this technology is the solution?”
2. “Whose problem is it?”
3. “Which people and what institutions might be most seriously harmed by a technological solution?”
4. “What new problems might be created because we have solved this problem?”
5. “What sort of people and institutions might acquire special economic and political power because of technological change?”
6. “What changes in language are being enforced by new technologies, and what is being gained and lost by such changes?”
Also:
Neil Postman on Technopoly (1992) and Collapse of Civilization
https://www.youtube.com/watch?v=sFj6-z8KeeU
But, yes. Read Postman. He's very insightful.
> The closest thing we have to a counterculture is the MAGA movement, and their leader is a billionaire.
That is tragic, if true.
It is not true here, thousands of miles and across an ocean away (-45.75, 170.57).
Counter-cultures pumping here