Readit News
munin · 7 years ago
This kind of change has also impacted me. It shows up when I'm trying to give students advice about starting their careers, and I realize that the first jobs I had (system administrator, SOC worker) have been replaced by robots. Especially in the SOC, I was a "Tier 1" analyst that would do monitoring (watching a bank of green lights waiting for one to turn red) and first level triage and analysis. This has been replaced by ML driven data processing systems.

So I think the apocalypse is double-bladed: while automation kicks a bunch of current workers out by making them immediately redundant, it also freezes out the next generation by removing entry level jobs and not really replacing them with anything equivalent. Meanwhile, universities and vocational ed programs won't get this memo for another ten years so they will continue to happily propel waves of students onto a set of closed and locked doors.

ukoki · 7 years ago
The pessimists' view is that automation will deprecate a heap of jobs in the tech industry that will never return. The optimists' view is that automation simply allows companies to do more stuff: things they couldn't afford to do before, and soon, things they have to do in order to stay competitive. For the optimists, the number employed in the tech industry stays the same or increases, but the proportion of different roles changes (i.e. no more green-light watchers).
0xffff2 · 7 years ago
New college grads today aren't prepared to do anything more complex than watch the green lights. If we automate out all of the entry-level work, that means we have to train workers to a higher level before they enter the work force, which is clearly untenable with today's college tuition (at least in the US).
dleslie · 7 years ago
What you're describing is _training_; and the act of training employees has long since gone out of vogue in western markets.
org3432 · 7 years ago
I've worked in IT-type jobs for over two decades. In some sense training would help, but to be frank, the code and systems IT folks have built are of such low quality that it's better for everyone if they're handed over to more competent teams that can cost-effectively maintain them long term.
Sevii · 7 years ago
If you increase competition for jobs you can push the cost of training onto employees.
mac01021 · 7 years ago
Why train employees when you can pay just a little bit more and get pretrained ones?
shdh · 7 years ago
There are internships
JKCalhoun · 7 years ago
Was watching David Bull talk about historical Japanese wood carving. He described how the introduction of the printing press to Japan killed the entry-level, apprenticeship positions in printing.
halbritt · 7 years ago
Hello fellow David Bull fan.
bitL · 7 years ago
I am worried about the latter aspect as well. In order to keep automation going, very advanced developers have to be involved; but if the whole "easy job" ecosystem disappears, there won't be any reasonable way for developers to keep progressing, with only the best of the competition filling the few remaining spots as "cognitive automators".
gervase · 7 years ago
In the absence of on-the-job experience/growth for less-experienced developers, it forces them back into more academic training programs. I personally think this is likely to lead to exacerbated "degree inflation", where MS degrees will be the new minimum expectation for these new "entry-level" (read, Tier 2+) jobs.
omgwtfbyobbq · 7 years ago
A significant number of companies who can benefit from automation likely won't need to continue to automate indefinitely. I work in software development/automation and in my industry it's more about finding and configuring a framework that enables business users to configure software systems than about automating everything possible.

Even relatively rote software development that involves embedding business logic into a software system is likely safe as long as the cost of continuing to develop that software is close to the cost of switching to some other framework. It's when the cost of development is much greater than the cost to try another framework out, or when a company wants to expand something and doing so on the development side would be cost prohibitive, that someone's job could be on the line.

Wohlf · 7 years ago
Maybe we'll see a more apprenticeship type approach, where junior personnel are instead assigned to and trained by seniors. This would probably be a net good, but who knows how things will shake out.
0xffff2 · 7 years ago
I've thought about doing this at work, but it's hard to figure out how to make it attractive to my company. If I ask to hire a junior dev for the primary purpose of training them, it seems likely that I could get them full time for 50% of my salary. They're going to be a little bit productive, but they're also going to take up a lot of my time. At best, I think you end up with 150% of the labor costs for the exact same amount of work being done. That's a really hard sell to the business side of the company.
KineticLensman · 7 years ago
In the UK we have the 5% club [0], whose aim is "to make at least 5% of its employees apprentices within a 5-year period". It does take commitment from the company, and it's perhaps not a surprise that it's more common in companies that already invest in graduate recruitment. 280 companies have signed up so far.

(Disclaimer: I work for a company that has been in the 5% club since 2013)

[0] https://en.wikipedia.org/wiki/The_5%25_Club

normal_man · 7 years ago
I mean, this is already an acknowledged good approach. Companies don't want to make that investment though - hiring both a senior and a junior who won't be immediately productive - and keep chasing the mythical senior who will work for a mid-level salary.
kdf83 · 7 years ago
Why does watching a bank of green lights waiting for one to turn red, and doing first-level triage and analysis, require ML? Isn't this just a bunch of rules?
munin · 7 years ago
I was over-simplifying to communicate the repetitive and dull nature of a job I'm very happy to no longer be doing. Really the game was more about "anomaly detection" and the sensor indicators/measurements were far more continuous than categorical, and their outputs had to be weighed against past experience and context of the monitored components.
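
That kind of anomaly check on continuous sensor readings can be sketched as a rolling z-score against a recent baseline. This is only a toy illustration - the window size, threshold, and sample data are invented, and real SOC pipelines weigh far more context:

```python
# Toy sketch of the anomaly detection that replaced Tier-1 light-watching:
# flag a sensor reading that sits too far outside its recent baseline.
# Window size and threshold are invented for illustration.
from statistics import mean, stdev

def is_anomalous(history, reading, window=50, threshold=3.0):
    """True if `reading` is more than `threshold` standard deviations
    away from the mean of the last `window` samples in `history`."""
    recent = history[-window:]
    if len(recent) < 2:
        return False          # not enough context to judge yet
    mu, sigma = mean(recent), stdev(recent)
    if sigma == 0:
        return reading != mu  # flat baseline: any change stands out
    return abs(reading - mu) / sigma > threshold

baseline = [20.0 + 0.1 * (i % 5) for i in range(100)]  # steady sensor
print(is_anomalous(baseline, 20.2))  # in line with baseline -> False
print(is_anomalous(baseline, 95.0))  # sudden spike -> True
```

The hard part that kept humans in the loop was exactly what this sketch leaves out: weighing many such signals against past experience and the context of the monitored components.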
Buttons840 · 7 years ago
Well, to start with, you have to use ML to train the computer vision system to recognize green and red lights. After that you update your resume with the ML experience you gained. :)
jpatokal · 7 years ago
The one entry level IT job that is not going away is tech support. Sure, some parts of it can be outsourced, but beyond a certain point you need a person on-site to figure out why the Internet is broken.

The Cloud companies are also hiring armies of support people, and it's a great way to kickstart your career in any of these companies while getting company provided training in the tech.

TeMPOraL · 7 years ago
Turn your products into services, hardware ownership into leasing, and suddenly you don't need to offer tech support. SLAs will establish the new "laws of physics" for users, where failure is a binary state: either it works, or it doesn't. When it doesn't, someone will come in a couple of hours/days, trash the broken black box, replace it with a working black box, and things will be back to normal.

Of course, the service provider may need some amount of people figuring the failures out, but that amount is smaller than if customers had to debug their own problems, and is more susceptible to centralization, and to "fixing by replacing".

seanmmasters · 7 years ago
Not with AWS Outposts, and you can bet Google and Microsoft won't be far behind.
Ruxbin1986 · 7 years ago
"So I think the apocalypse is double-bladed: while automation kicks a bunch of current workers out by making them immediately redundant, it also freezes out the next generation by removing entry level jobs and not really replacing them with anything equivalent."

Agreed entirely. I would be surprised if we still have two-year technical degrees in a decade.

jac_no_k · 7 years ago
I can't find the article any more, but I think Toyota has its engineers build a handful of cars by hand as part of its kaizen process. This would act as an entry point for the next generation of engineers.
whorleater · 7 years ago
How many solid universities still have IT programs? At UIUC as of 2016, the IT program was nonexistent.
munin · 7 years ago
Two points: first, even if "solid" universities are right to run away from this (and IMO they don't run away from it, they just move the IT program into the business school where you don't see it any more), there are still many other universities pumping out students into a dead career field, which should be concerning. Second, this is not just about IT programs. Computer science programs are impacted too. From the OP:

> But instead of five backend developers and three ops people and a DBA to keep the lights on for your line-of-business app, now you maybe need two people total.

All nine of those people would probably have been CS graduates, or at least many of the backend developers would be (and perhaps the DBA). Or they would be people that thought of themselves as "developers" and not "IT" for whatever that distinction is worth now.

arethuza · 7 years ago
What do you mean by an "IT program"?
Zelmor · 7 years ago
Europe does. ;)
hourislate · 7 years ago
I have not found one yet.

My son is currently enrolled in CS, and his first two years of school are filled with humanities, history, and a few more irrelevant courses, all to keep some profs employed. His next two years will be filled with more useless courses, and by the time it's all over it will have cost $50k+ (he lives at home and goes to a state school).

I feel like he could have taken a 6-week Java/Python/whatever course and gotten more out of it. Add a CCNA/CCNP for the networking knowledge, a Linux cert, a security cert from SANS, and some self-study, and he would know more than a 4-year degree gives him and be better prepared for the working world.

Universities in the US are all about making money, supporting football and athletics, and tenure for the profs. And finally, accreditation, for $50k+?

Meanwhile, tens of thousands of H1Bs are needed because our kids know nothing and are being taught shit.

Two areas that need major change and disruption, Education and Healthcare, everything else can wait.

dfischer · 7 years ago
Education innovation is definitely a massive problem to solve, especially because of the weird mix of control and influence universities have: dictating K-12 expectations, standing between expected job fulfillment and actual skill, and running a government-sanctioned student-debt vehicle that can't be escaped through bankruptcy.

We need a more decentralized education system top-down that isn't tightly coupled to the gov.

Ruxbin1986 · 7 years ago
Absolutely not. Look at colleges like DeVry or the University of Phoenix. Little to no regulation of curriculum, incredibly expensive, etc.
klodolph · 7 years ago
This has been happening since the 1970s. Or earlier. I don’t really think of it as an apocalypse. IT skills have never had a long shelf life. Any time you are a technology expert at your company, in IT, the technology landscape will shift under you. This is the Red Queen Hypothesis in action. People who ran mainframes in the 1980s became trusted experts and then most of the jobs evaporated. Same thing happened to people running critical VAX or Unix systems. Your skills are only valuable as long as the related technology is.

The same thing happens to programming positions.

But I think the good news is missing from this article—IT jobs are, overall, sticking around or increasing in number. (According to the Bureau of Labor Statistics, the jobs are growing “faster than average for all occupations”). You do have to keep updating your skill set, but it’s not like manufacturing, where efficiencies eliminate jobs altogether or move them to completely different sectors. And there is that ageism to worry about, and uncertainty.

I’m personally more worried about some of the other remaining white-collar office jobs, like the accountants, paralegals, HR, various banking positions, etc.

Rooster61 · 7 years ago
> The same thing happens to programming positions.

I can't stress how important this is. Folks going to things like boot camps or other educational outlets that focus on one language will utterly kill their career if they aren't aware of how fast things move. If you don't learn the underlying abstractions and paradigms that take various forms in different languages, you will get left in the dust in a matter of a few years.

The best programmers I've ever worked with got excited about programming patterns and paradigms, not frameworks and syntactic sugar. Those are also the ones I paid the most attention to.

Bottom line for both software and IT engineers: you learn to learn, not just to do.

bradfa · 7 years ago
Sure, there are places to learn hip new tech that might not be around for long, and there you'll have to continuously learn new languages/frameworks/etc. But there are also things like FORTRAN, COBOL, and C, which are all in demand in their own little niches, aren't going away any time soon, and where you can still make a quite good career knowing only one of them.
jimbokun · 7 years ago
The key skill is knowing how to continuously acquire new skills.
Ruxbin1986 · 7 years ago
There are hundreds of thousands of System Administrators who have made lifelong careers out of managing networks, server farms, and Windows and Linux systems since the early 90s. It's not as fancy as software development and doesn't drive business value in the same visible way, but it's work that needs to be done.

The Cloud greatly diminishes and in some cases completely eliminates that work. The only thing left is actual software development.

3pt14159 · 7 years ago
If you've tried to hire a good sys admin in North America you'd be shocked at how high their salaries are getting. Amazon is hiring them by the boatload. Shopify is moving to all cloud because they're unable to staff.

The problem is that junior sys admins aren't as useful as before to most startups. I still think they'll figure it out, but the industry is changing.

jimbokun · 7 years ago
"The Cloud greatly diminishes and in some cases completely eliminates that work."

It takes a hell of a lot of work to take a company's entire infrastructure and migrate it to Kubernetes and the Cloud, and then monitor and manage it. It's not trivial.

So the key is seeing your job as solving a business problem with computers, not "I administer Oracle version X.y.z running on Redhat Linux".

icedchai · 7 years ago
Not really. We just call systems administrators "devops engineers" now.
itgoon · 7 years ago
This. Check out my username, this is what I do, and I've been doing it for far longer than a decade.

The mid-range jobs have always been vanishing. Many times, I'm the guy automating them out of existence. It's always replaced by something else.

It depends on the circumstances, of course, but there is often more work after the automation than before. It's just different work. It requires reskilling.

The article mentions some new product AWS is coming out with. No matter how "simple" it makes things, someone is going to end up being an expert at using it, and will probably be paid well to do so.

Really, the toughest and most crucial part of this career has been keeping up. The work stays steady, though.

someone454 · 7 years ago
I gave myself the title “IT Janitor” because for years all I did was clean up other people’s $4!7. I automated everything I touched, and told people it was so I would have more time to surf the Internet. Now I’m over 50, in the Innovation group, and dreading ever having to find a new job. But I’m having fun: blockchain, robotics, mobile app dev, and now NLP.
sharemywin · 7 years ago
The problem is that most companies want someone with 3-5 years of experience in that new skill set. So, unless you jump on that particular bandwagon early, you're out of luck.
klodolph · 7 years ago
In my experience, companies say 3-5 years but this requirement is written by someone in HR translating what the hiring manager said. The hiring manager is more likely to hire someone who has less than 3-5 years of experience, or even hire someone with zero experience, as long as the candidate demonstrates aptitude.

The trick is to write your resume so it gets past HR's filter, but without putting bullshit on it. It's unfortunate but I consider this kind of thing a critical skill for anyone applying to technical jobs.

jimbokun · 7 years ago
This problem has existed at least since the 90s, probably longer than that. I remember the joke being job descriptions asking for 3-5 years Java experience in 1996.
scirocco · 7 years ago
Mainframes still rule many companies, especially banks and insurance.
raphar · 7 years ago
And they are not going away.

Last year I witnessed the sad story of a general manager who promised to migrate away from an AS/400 in six months.

This guy (and several others) doesn't fully understand that a system - that is, software and hardware and infrastructure - has its life determined by the returns it gives to its parent organization. If the system works, the organization won't pay, or dare, to replace it. Core systems are the hardest, and that's where all those mainframes and C and Fortran and, nowadays, legacy Java systems are still alive and kicking.

PS: I love those systems, btw; if you have one that needs love and care, I'd like to hear about it :D.

jimbokun · 7 years ago
Agreed.

"I’ve spent my career in tech, almost a decade at this point, running about a step-and-a-half ahead of the automation reaper."

I mean, yes, that is the entire job description. If you are in IT, your responsibility is to learn the best technologies and to continually re-evaluate what to keep of your organization's current stack and what to improve or replace.

That is why I wouldn't want to do anything else. I love learning new things and no profession offers more opportunities to learn new things than computer technology.

gaius · 7 years ago
IT skills have never had a long shelf life.

But this isn’t true, outside of webdev.

If your skill was “DB2” or “Oracle” or “Cisco” or “C++” you could have had a 30-40 year career in that, easily. There are plenty of others. Java has been around commercially since about 1995, there will definitely be plenty of Java jobs in 2025.

catdog · 7 years ago
True, but on the other hand those technologies have evolved more or less drastically as well. E.g. the Java or C++ skills from 1995 won't get you that far today, both in terms of the language itself and the framework/library ecosystem.
porpoisely · 7 years ago
Another worry is recession and the popping of the tech bubble. Sooner or later the bubble has to pop, and that's going to cause a lot of pain throughout the tech industry. It should be interesting to see how the industry rebounds, and in what form. Will VR be the new "hot thing", like smartphones were after the 2008 recession?
netsharc · 7 years ago
But smartphones are useful because now you have a computer in your pocket (though, just like the PC, it's mostly used for social media); if I look around my train in the morning, many people are staring into their phones. I doubt VR will ever reach that. But AR might: say, a heads-up display that shows you the next step of a recipe, or which screw to undo next while fixing your car.
joeax · 7 years ago
I'm no pollyanna but I doubt we'll ever see another tech bubble collapse in the likes of the post-2000 one ever again, unless society itself collapses. And believe me, 2002-03 were the most painful years of my working life.

Sure we'll see slowdowns and some dramatic shifts in skill sets. We just need to stay ahead of the curve.

KineticLensman · 7 years ago
> Will VR be the new "hot thing" like smartphones were after the 2008 recession

I don't think so. Mobile phones were already a significant thing before 2008 and the morph to smartphones was already in train (first iPhone was 2007).

VR still has to emerge from a relatively small set of niche use cases. Augmented / Mixed reality is more likely, given the ubiquity of good cameras on smartphones. Arguably AR/MR will let the phone manufacturers keep pushing device upgrades for longer, as the smartphone market saturates.

sarcasmic · 7 years ago
I agree with the points raised in the writing, but it mixes automation, abstraction, and industry consolidation as if they weren't separate processes. As such, the transformation being described isn't an impending cliff but an ever-present pressure of economic forces that affects all business all the time, and which one is wise to watch for.

Automation replaces repetitive work with tooling and work that's more complex. Abstraction allows one to delegate to another for details, which may include choosing from a palette of pre-made options. Consolidation will come about as fewer independent players can sustain themselves in the market. Some will be out-competed by economies of scale, some will be starved by restrictions on intellectual property and lack of access to expertise.

This process has already played out for "small business websites", yet there are still lots and lots of web developers and web designers employed or freelancing. The current wave of WYSIWYG website generators is actually very good, and they have add-ons and integrations that make sense for their target market. But plenty of clients don't want to mess around in them, so they'd rather hire someone. This could be the maker of the generator, or it could be an outside consultant. In either case, the person brings judgement, experience, and creativity to tailor the deliverable to the needs of the client. These are skills resistant to automation, but not immune to abstraction and consolidation.

In the end, the antidote is the same as it always was: be adaptable, be personable, be resilient, and be resourceful. These are especially important if one is in a comfortable job shielded from most competitive pressure, because such people will be the most surprised and unprepared if their current employment is made redundant.

kpennell · 7 years ago
saving this comment...
zelon88 · 7 years ago
I keep seeing these kinds of articles where the author has drunk the Cloud kool-aid themselves, forgotten how to function without it, and insists that it's impossible to function without it. This guy even drags manufacturing into the mix, and obviously has no idea about manufacturing in the United States.

Manufacturers in the USA don't make cheap coffee cups. We don't make underwear. We don't make car fenders. We make warheads. We make gyroscopes. We make electro-mechanical assemblies that China or Malaysia or Mexico would screw up. We specialize in quality over quantity, and we specialize in cutting edge tolerances and specifications. We make export controlled things for enterprise contracts and the government. Things that require certifications to produce, and government regulatory compliance, and tight tolerances. Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.

We don't make 100,000 of anything either. We make 100 gyroscopes for General Dynamics, or 5 jet engines for General Electric. We make US military-grade munitions and weapons for the government. The author obviously doesn't realize that the company making the wafers for Raytheon ISN'T ALLOWED TO USE THE CLOUD. All that great automation that helps AirBNB function with no infrastructure is meaningless when you have to protect your IP from nation-state actors. To probably >50% of American manufacturing the Cloud is useless. It's a consolidated attack vector that WILL be compromised in the future and lead to liability. Sure you can put a NIST 800-171 or DFARS compliant business in the Cloud, but it costs extra and it's not worth the risk. You hear about misconfigured buckets leaking data almost daily. Nobody doing government manufacturing work wants to deal with that headache. In fact, I've been in this industry for 10 years and I have NEVER seen a DFARS compliant supplier with outsourced IT infrastructure. I've visited hundreds of companies over the years. What you're describing doesn't interest American manufacturers one bit.

munin · 7 years ago
> The author obviously doesn't realize that the company making the wafers for Raytheon ISN'T ALLOWED TO USE THE CLOUD.

This is probably going to change. People like you said the same thing about health data, and student data. The savings were so tantalizing that the regulators and stakeholders figured out how to make it work. What do you think GovCloud is for? C2S and "Secret cloud"?

Our university had a 3-4 person dedicated Exchange team. When "Google Apps" came out, people wanted us to switch to that from our old mail server stuff. Go figure, why would you keep using pine and squirrelmail when you could use gmail? "It can't hold student data" the IT team said, "it isn't certified for FERPA or ITAR." Okay, true. Fast forward two years, now Google's "Apps for Education" can deal with both. The switch was sudden and brutal and the university no longer has a 3-4 person dedicated Exchange team or an Exchange deployment of any kind.

jimbokun · 7 years ago
And at some point, AWS or Azure might be considered more secure than servers configured and administered by an organization that doesn't have that as its core competency.
blub · 7 years ago
Fast forward two years: after Google promised, to get the contracts, that they would not spy on the students like they do on the general population, they did exactly that.

Lobbyists and fools can get past most logical objections and cloudify anything.

ProAm · 7 years ago
> Manufacturers in the USA don't make cheap coffee cups. We don't make underwear. We don't make car fenders. We make warheads. We make gyroscopes. We make electro-mechanical assemblies that China or Malaysia or Mexico would screw up. We specialize in quality over quantity, and we specialize in cutting edge tolerances and specifications. We make export controlled things for enterprise contracts and the government. Things that require certifications to produce, and government regulatory compliance, and tight tolerances. Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.

This is a blanket statement, and it's wrong. Most cheap manufacturing is done overseas, but the US still has a large manufacturing sector that makes all sorts of crap.

Frondo · 7 years ago
We sure do, and our stuff is generally a little more expensive than the Chinese-made import but a lot higher quality. My go-to example is brooms and mops. Libman makes their brooms and mops in the US, and I'll never buy another brand. They break like any sort of cleaning product does, but a lot less often than the imported ones (IMO).

Sterilite boxes are also made in the US -- cost a bit more than imported, but again, much higher quality.

It's a shame there isn't a "made in the US, and slightly more expensive but a lot higher quality" option for everything I buy, cause I'd do that in a heartbeat. I hate buying shit that breaks; waste of my time to even have to think about that stuff.

sandworm101 · 7 years ago
>> Nobody here is making the 100,000,000 wrenches you can buy at Wal-Mart.

There are still people making nails in the US. Fertilizer. Food gets exported. Then there is all the stuff that's too expensive to ship. Lumber, aluminum sheeting, cement ... lots of non-precision stuff is still made locally. Not every US factory makes munitions.

And some stuff is made locally not because of 'better' manufacturing ability but for speed. The fashion industry has to react quickly, quicker than overseas shipping can manage. I just ordered a small electronics assembly from a Canadian manufacturer not because they are the most skilled or precise but because they can chat with me on the phone and ship a small-run (5) faster than any Asian manufacturer. (It's a device for measuring laser energy at specific wavelengths but I have some specific needs re how the data is collected/displayed. It only took a 10-minute call to explain my issues and get a deal together.)

coldtea · 7 years ago
>And some stuff is made locally not because of 'better' manufacturing ability but for speed. The fashion industry has to react quickly, quicker than overseas shipping can manage.

Almost everything in the fashion industry is made in Asia.

And few consumer goods (if anything) are "too expensive to ship".

>I just ordered a small electronics assembly from a Canadian manufacturer not because they are the most skilled or precise but because they can chat with me on the phone and ship a small-run (5) faster than any Asian manufacturer

Yes, but for anything at scale, they won't be the most competitive option.

mr_overalls · 7 years ago
> ISN'T ALLOWED TO USE THE CLOUD

Do you find this a bit strange, given that the Pentagon is making a massive push toward (presumably private) cloud infrastructure?

https://www.reuters.com/article/us-usa-pentagon-cloud-idUSKB...

icedchai · 7 years ago
They probably are allowed to use the cloud, it just requires a lot of red tape and paperwork: filling out forms, waiting, and filling out more forms. Pretty sure AWS GovCloud exists for a reason, and that reason isn't because it has no customers.
throwaway98121 · 7 years ago
They may not be allowed to use the cloud today, but that will likely change in the near future. FWIW, I can imagine your post in my head as an argument in favor of horses over automobiles.

What you've seen in 10 years was the reality during that time. It says little about the future.

throwanem · 7 years ago
We don't need a lot of people to do all those things.
nunez · 7 years ago
Good points, but there are other American-dominant industries aside from defense manufacturing, and they are definitely taking a keen interest in the cloud. Also, defense is moving to the cloud too, albeit more slowly than AirBnB, say.
foozed · 7 years ago
The article is not talking about IT at Raytheon but "[...] anonymous Windows administrators and point-and-click DBAs and “senior application developers” who munge JSON in C#".
tmaly · 7 years ago
If anything, it will be a private cloud setup on an isolated company owned data center.
pjc50 · 7 years ago
> Repetition is a sure warning sign. If you’re building the same integrations, patching the same servers over and over again every day, congratulations – you’ve already become a robot. It’s only a matter of time before a small shell script makes it official.

Absolutely - if something is repetitive, it's a candidate for automation. This is true across all disciplines. Only as-yet-unautomatable human judgement, insight, and communication remain safely valuable.

On the other hand, "go away or I will replace you with a very small shell script" has been a BOFH joke since the 90s.
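
For concreteness, the proverbial "very small shell script" is often just a loop over a host inventory. A hedged sketch in Python - the hostnames and patch command are invented, and `dry_run` keeps it from touching real machines:

```python
# Sketch of the "very small shell script" that makes repetitive patching
# official. Hostnames and the patch command are invented; with
# dry_run=True it only plans the ssh invocations instead of running them.
import subprocess

HOSTS = ["web-01", "web-02", "db-01"]  # hypothetical inventory
PATCH_CMD = "sudo apt-get update && sudo apt-get -y upgrade"

def patch_all(hosts, cmd, dry_run=True):
    """Return the ssh invocations planned (and run them if not dry_run)."""
    invocations = []
    for host in hosts:
        argv = ["ssh", host, cmd]
        invocations.append(argv)
        if not dry_run:
            subprocess.run(argv, check=True)  # the real, repetitive work
    return invocations

for argv in patch_all(HOSTS, PATCH_CMD):
    print(" ".join(argv))
```

Deciding which hosts belong in the inventory, and what to do when the upgrade breaks something, is the judgement that doesn't fit in the script.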

ReptileMan · 7 years ago
I have actually replaced a person with a 40-line C# program
hobs · 7 years ago
I replaced a person with a one line code change (performance tuning) - they got her another job though.

Literally had a business call to see if they could run another instance of $LOB_SOFTWARE because they had a person clicking a button for 40 hours a week.

SketchySeaBeast · 7 years ago
His wife and kids don't even know.
pradn · 7 years ago
What did the program do?
redleggedfrog · 7 years ago
While I think this essay has some good points, it ignores the problem that I always see with this idea - people. I really wish the business people I deal with on a daily basis could have their business requirements met by such automation, because it's the least fun part of my job. But they don't, because no matter how awesome your cloud provider tools are, they always come to me with some weird requirement or platform and I'm back to "munging JSON in C#".

The problem isn't the technology, it's the complexity of the customer's business requirements, and their nearly complete inability to transfer those requirements into software without complex implementations they could never hope to build themselves. I would love to see more tooling to help with this. I have been waiting for 25 years. It gets better, but never to anything that could be described as an apocalypse.

yourapostasy · 7 years ago
> ...their nearly complete inability to transfer those requirements into software without complex implementations that they could never hope to implement themselves.

I wouldn't mind if they even had "those requirements". That would be a huge step up. Oftentimes the requirements are not written down but stuck in tribal knowledge. And woe betide you if the tribe is an outsourcer or offshore team. About half the time, they're unintentionally leaving knowledge in the heads of their meat-robots, and getting the information transferred out of those heads is painful and time-consuming, because "set of procedures for meat-robots" is effectively what they're hired for.

If the procedure is periodically performed, I often get pretty good results just asking for copies of the resultant emails reporting completion, and working backwards from there. Automation at this layer of staff work is considered exotic developer-realm, needs-a-budget-and-a-project-manager effort, even for what most HN readers would consider relatively trivial multi-hour or multi-day scripting work.

The abruptness and agony of automation sweeping through these layers in the upcoming years, as tooling matures to discover, capture, distill, and maintain these requirements in tight synchronization with the software teams maintaining the code behind the automation, are going to be politically challenging: a lot of these people have zero notion that what they do can be automated away, even as they consume the results of ML in their daily lives.

And I'm rather glum about teaching these people how to perform the automation themselves. The reception I've gotten to my offers to help them get on the track to learning programming and automation has been very underwhelming. Even if someone doesn't "get it" about coding, just the exposure to the thinking patterns would help me enormously cut down on unnecessary meeting times, as there are still way too many people whose conception of automation is closer to "can't they/you just...[magic/mind-read]?"

F_J_H · 7 years ago
A point that often gets missed with "low-code" tools is that it's not so much that they enable "non-coders" to build applications, but that they enable experienced developers to go so much faster.

I've been using a fullstack low code development tool for several years now, and when it comes to developing CRUD apps or data reporting apps (with charts, interactive, drill down reports, etc. etc.), it's astonishing how quickly you can stand-up a secure, fully responsive web-app, complete with authentication, authorization schemes, report subscriptions, etc., without writing any code at all.

And, when you bump up against the limits of the declarative/low-code aspect of the framework, you can toggle over to JavaScript, your own CSS, SQL, etc., so it's not like you paint yourself into a corner.

So, I agree, if Amazon creates something like this, and it is as good as some of the existing low-code tools out there, it's going to have a big impact over the long term.

edit: typo

wvenable · 7 years ago
I'm doing amazingly more involved things much faster now than 20 years ago. However, user demands and expectations are higher than ever.
dleslie · 7 years ago
Which tool?
F_J_H · 7 years ago
I hesitate to say it on HN as everyone seems to hate Oracle (and for good reason), but the tool is Oracle Application Express. (https://apex.oracle.com/en/)

In my earlier consulting role, and now as a CDO, it's my go-to tool for CRUD and data presentation apps.

(I don't work for Oracle.)

adamc · 7 years ago
As a senior employee in an organization that has migrated much functionality to SaaS and the cloud... I'm doubtful. In my experience so far, what we do changes, but the need for IT employees hasn't gone down. Most of what we did was figure out how to solve business problems with IT, and that continues, cloud or no. SaaS offerings are sophisticated, but hard to use out of the box when you have significant regulatory (and other) requirements.

If there are IT jobs developing for smaller organizations, maybe those will go away, but... I think a lot of that disappeared already.

I'm close to retiring (from this job, anyhow), so it's not a personal issue for me. I just haven't seen it happening as described.