Posted by u/smdz 3 years ago
Ask HN: Burnout because of ChatGPT?
TL;DR (summarised by ChatGPT) - I'm experiencing increased productivity and independence with ChatGPT but grappling with challenges such as lack of work-life boundaries and overwhelming information, leading to stress and burnout.

Long story...

I have been using ChatGPT for a while, and moved to the Plus subscription for their GPT-4 model, which I must say, is quite good.

1. ChatGPT makes us very productive. Personally, in my early 40s, I feel like my brain is back in its 20s.

2. I no longer feel the need to hire juniors. This is a short-term positive and maybe a long-term negative. [[EDIT: I may have implied the wrong meaning. To clarify - nobody's going yet because of ChatGPT. It is just raising the bar higher and higher. What took me years to learn, this thing can already do, and much more. And I cannot predict the financial future of OpenAI or the markets in general.]]

A lot of the stuff I used to delegate to fellow humans is now being delegated to ChatGPT. And I can get the results immediately, at any time I want. I agree that it cannot operate on its own. I still need to review and correct things. I have to do that even when working with other humans. The only difference is that I can trust a human to improve over time, but I cannot expect ChatGPT to do so. Not because it is incapable, but because it is restricted by OpenAI.

And I have gotten better at using it. Calling myself a prompt-engineer sounds weird.

With all the good, I am now experiencing the downsides - stress and burnout:

1. Humans work 9-5 (or on some schedule), but ChatGPT is always available and responds instantly. Now, when I have an idea I want to try out, I start working on it immediately with the help of AI. Earlier, I would just put a note in the to-do list and stash it for the next day.

2. The outputs from ChatGPT come so fast that my "review load" is too high. At times it feels like we are working for ChatGPT and not the other way around.

3. ChatGPT has a habit of throwing new knowledge back at you. Google does that too, but this feels like 10x Google. Sometimes it is overwhelming. The good thing is that we learn a lot; the bad thing is that it often slows down our decision making.

4. I tried to put myself on a schedule for using it - but when everybody has access to this tech, I have a genuine fear of missing out.

5. I have zero doubt that AI is setting the bar high, and it is going to take away a ton of average-joe desk jobs. GPT-4 itself is quite capable, and organisations have yet to embrace it.

And not least, it makes me worry about what lies ahead with future models. I am not a layman when it comes to AI/ML - I worked in the field up until a few years ago, in the pre-GPT era.

Has anybody experienced these issues? And how do you deal with them?

* I could not resist asking ChatGPT about the above - a couple of strategies it suggested were to "Seek Support from Others" and "Participate in discussions or groups focused on ethical AI". *

tivert · 3 years ago
> 2. I no longer feel the need to hire juniors. This is a short-term positive and maybe a long-term negative.

> A lot of stuff I used to delegate to fellow humans are now being delegated to ChatGPT. And I can get the results immediately and at any time I want. I agree that it cannot operate on its own. I still need to review and correct things. I have do that even when working with other humans. The only difference is that I can start trusting a human to improve, but I cannot expect ChatGPT to do so. Not that it is incapable, but because it is restricted by OpenAI.

I think this point bears repeating.

The threat of these models isn't that they'll go all Skynet and kill everyone, it's that they'll cause a lot of economic devastation to people who make a living through labor requiring skill and knowledge, especially future generations of skilled labor. Then there will be a decision point: either the senior-level people who thought they were safe get replaced by a more-advanced model, or they don't and there's a future society-level shortage because the pipeline to produce more senior-level people has been shut down (like the OP is doing).

The only people who will come out (relatively) unscathed are the ownership class, like always.

Of course, this is inevitable because it's impossible to question or change our society's ideological assumptions. They must be played out until they utterly destroy society.

dragonwriter · 3 years ago
> Then there will be a decision point: either the senior-level people who thought they were safe get replaced by a more-advanced model, or they don't and there's a future society-level shortage because the pipeline to produce more senior-level people has been shut down (like the OP is doing).

Or, for every junior that isn't hired by a business that can't expand its portfolio to exploit greater productivity or can’t figure out how to effectively use LLMs across the experience spectrum, two will be hired in shops that can do those things, and, as with previous software dev productivity increases, greater productivity in the field will mean a broader range of viable applications and more total jobs across all experience levels.

coldtea · 3 years ago
>Or, for every junior that isn't hired by a business that can't expand its portfolio to exploit greater productivity or can’t figure out how to effectively use LLMs across the experience spectrum, two will be hired in shops that can do those things

And everybody also gets a pony! Win-win-win situation!

Previous "software dev productivity increases" happened as computing saturation itself increased from a handful of mainframes to one in every office, then one at every desk, then a few in every home, and later one in every hand. Now it's at 100% or close.

It also still required computer operators. LLMs are not mere increased productivity for a human computer operator, but automation of the work itself, so that it can happen without an operator (or with far fewer).

Moreover, all this "increased productivity" still left wages stagnant for 40 years (while basic costs like housing, education, and healthcare skyrocketed). It's not like more of it, in the same old corporatist context, bodes better for the future...

pixl97 · 3 years ago
Typically that's not how these things work, at least not in a time frame in which you don't starve to death.

Leading up to 2008 you'd think the market would optimize for lenders that checked who they were giving loans to. But that's not what happened. The idiots kept giving out shit loans until the entire market burned down taking out good and bad lenders alike in the aftermath.

cheschire · 3 years ago
You can witness this happening in the trades right now. A whole generation of people were told to go to college and avoid the trades, and now here we are in possibly the most significant manpower drought the trades have ever experienced. And this has a ripple effect as the older generations retire out and take their hard-won experience with them, with nobody to pass their knowledge on to. You can't tell a carpenter to go type that shit into Confluence, let alone tell the kid to look in the knowledge base first.
StevePerkins · 3 years ago
And yet the trades still have uneven access (or none at all) to health coverage, retirement planning options, etc.

As an American parent of young children, I keep being told that college is a scam and I should steer my kids toward the trades. 90+% of the time, I am being told this by a white-collar worker who went to college themselves, and is just bloviating.

When we reach a real crisis point, severe enough to actually consider granting skilled tradespeople access to a fraction of the privilege enjoyed by white-collar workers, then I might consider nudging my kids toward electrician or plumbing work. But under the current social caste system, of course I am going to do everything possible to give my kids access to college and steer them that way.

I believe that virtually everyone, white-collar and blue-collar alike, quietly feels likewise. We make a pretense of giving contrary advice, but mostly just in hopes that other people will move in that direction for us. To take the bullet and help with this imbalance, and also to relieve the intense competition our own kids face.

UncleMeat · 3 years ago
My brother in law is an electrician.

He does not have paid vacation, good sick leave policies, or good health insurance through his employer. He has witnessed a bunch of on-the-job injuries and one near-fatality, largely caused by his employer pushing hard for the team to complete jobs as fast as possible. He is paid alright, but less than the norm for the people I know with college degrees even after we exclude everybody in software. His job is also physically demanding and may cause problems later in life.

Not exactly a "hey, pick this job and you'll have a great career" story.

sharts · 3 years ago
The only solution is to rapidly scale these up so that they can disrupt every aspect of everyone's life until we all decide to throw our hands up and decide it's time for a new social construct.

I'm surprised someone hasn't replaced politicians with an LLM. Imagine not having to pay their salaries when ChatGPT can send "thoughts and prayers" to Maui over Twitter 24/7.

slg · 3 years ago
>until we all decide to throw our hands up and decide it's time for a new social construct.

In my opinion this is the most optimistic of the realistic possible outcomes. In the past, when automation put a factory worker out of a job, they were just told to go back to school or "learn to code", which isn't actually a solution for most people. These LLMs disproportionately impact people further up the socioeconomic ladder than prior waves of automation. Maybe our uneven society means that this wave of disruption, hitting a more powerful group of people, will be more likely to cause an actual change in how we organize society.

tsunamifury · 3 years ago
I am absurdly and instinctually confident that LLM personas will replace at least the public face of politicians almost instantly. The problem is the back end work.
over_bridge · 3 years ago
Politicians are just a perfect role to disrupt. Problem is of course they are a monopoly so can't be competed with and are protected by law/constitution so can't be changed directly.

Their whole job is to 'represent' their constituents. An LLM can poll the sentiments of the people far more effectively than they can. I'm sure it could be programmed to accept bribes too, to weigh rich people's opinions higher. I'd love to see votes done by 100 different LLMs instead of Senators (a hyperbolic, non-literal statement, but interesting as a thought experiment, I hope).

Politicians should still propose new and altered legislation but actually voting, and being informed to vote, could be massively improved.

zarzavat · 3 years ago
It’s not a new social construct. It’s a very old one: a thin layer of rich people, and a large number of poor people fighting to survive because they don’t have anything to contribute beyond their own manual labour.

Head out of the developed world and you can see this type of society everywhere.

This meme of AI -> upheaval -> basic income utopia has got to die. It’s wishful thinking. It’s “clean coal” for programmers.

AtlasBarfed · 3 years ago
"The privileged will risk absolute destruction over the surrender of any advantage"
falloutx · 3 years ago
Is this satire? If you think politicians can be replaced, then you must be dreaming quite hard sir. They are running the show.
bpye · 3 years ago
This was a plot point in Avenue 5. The office of the other president.
JohnFen · 3 years ago
> The threat of these models isn't that they'll go all Skynet and kill everyone, it's that they'll cause a lot of economic devastation to people who make a living

Yes. This is pretty much my only concern about these models, and I'm powerfully concerned about this. It's hard to see how this will lead to a good place. It seems more likely that this will lead to increased poverty and multiple socioeconomic crises.

I am even more concerned that very few people are talking about this, and none of the power players in this space are, except for occasional mentions in passing of fantasies like UBI.

bloppe · 3 years ago
> I am even more concerned that very few people are talking about this.

People have been talking about the threat of automation since the very beginning of the industrial revolution. It just never plays out nearly that badly, and short-term disruptions are always outweighed by long-term efficiency gains within ~5 years or so; even those who experience the worst career disruption tend to end up better off within that time frame.

I certainly would not like for my career to be disrupted for ~5 years, but the alternative would be worse.

Animats · 3 years ago
I've been asking for years, if we have all these computers, why do we need so many people in offices? Now we seem to have passed "peak office", with much help from the pandemic.

If everything you do for money goes in and out over a wire, be very afraid.

obblekk · 3 years ago
I don't think so. Junior engineers will learn much much faster than the past (think about how much more effective GPT4 is as a learning tool than the "rubber ducky method" or manpages or even stackoverflow).

And a part of their role will morph into prompting GPT4 (much like this senior engineer has started doing).

If GPTx ends up in the narrow area where it's universally smarter than junior engs but definitely not capable of being a senior eng, then junior engs will just shift to the little remaining work for senior engs, shadow them for months to years like an apprenticeship.

Of course in that case the total number of eng needed will also decrease (already only a small percent ever get good enough to be considered truly senior), so there will be selection bias toward more intelligent engineers who are a step above GPTx. If none are left, then the profession will be gone and there will be no problem.

tivert · 3 years ago
> I don't think so. Junior engineers will learn much much faster than the past (think about how much more effective GPT4 is as a learning tool than the "rubber ducky method" or manpages or even stackoverflow).

That's bunk. The OP literally no longer "feel[s] the need to hire juniors" because he can ChatGPT that work. How are they going to learn much faster on a job they won't be given the opportunity to have?

> If GPTx ends up in the narrow area where it's universally smarter than junior engs but definitely not capable of being a senior eng, then junior engs will just shift to the little remaining work for senior engs, shadow them for months to years like an apprenticeship.

That doesn't make much sense. That kind of apprenticeship would be pure charity, so it's not going to happen. No one is going to learn to be a senior engineer in "months," and no one (except someone's rich parents) is going to pay for someone to sit around unproductively in an office for years while they learn. Even interns are required to produce output that adds value. They do that by successfully completing junior-level tasks that need to be done well.

CameronNemo · 3 years ago
You think GPT4 is more effective than learning how to read a manual? Or some of the best SO answers?
_lnwk · 3 years ago
Why would you take a junior then as an apprentice if they won't generate any value at all?
bawolff · 3 years ago
Computers have been putting people out of jobs since back when "computer" was a human job title and not a machine. It's always ended up creating more jobs than it eliminated in the end; I don't see why the future will be any different from the past.
jstx1 · 3 years ago
Seems like a logical fallacy to assume that the past extrapolates cleanly into the future. "It was okay last time" isn't a good enough argument.
Animats · 3 years ago
> I no longer feel the need to hire juniors.

I hear that from a friend in the legal business. Less need for paralegals. Unclear yet if the need for new lawyers will be reduced.

slt2021 · 3 years ago
maybe paralegals could be replaced by even cheaper clerks with ChatGPT?

why pay Law school graduate as paralegal, when you can hire associate degree grad with ChatGPT to do the same work?

sircastor · 3 years ago
>> it's that they'll cause a lot of economic devastation to people who make a living through labor requiring skill and knowledge, especially future generations of skilled labor.

Occasionally I would see clips from or read reactions to Idiocracy, and be left scratching my head, because somehow, somewhere, there have to be people who are thinking. The whole conceit of the film is that there are no smart, curious people because those traits are being bred out of the population. That never made sense to me, because you still have to have some smart, curious, creative people somewhere to keep things moving. Our society is quite dependent on the people who silently keep things running in the background.

I can however envision a world where early curiosity is discouraged, and supplanted by a technology that can fill the holes of the entry-level smart people. When everyone is discouraged from starting, and the existing participants age out, then maybe you can get a world where there are no new smart, curious people.

tivert · 3 years ago
> That never made sense to me because you still have to have some smart, curious, creative people somewhere to keep things moving. Our society is quite dependent on the people who silently keep things running in the background.

Regarding Idiocracy, one of the background conceits of the film is that those kinds of people set up automation to keep things going before they died out (for the reasons clearly explained at the start of the movie). If you pay attention, everything in that world is automated: a diagnostic machine with a Playskool interface (https://www.youtube.com/watch?v=hmUVo0xVAqE) is what's actually doing the doctor's job, a major company is run by a computer the CEO doesn't understand (https://www.youtube.com/watch?v=jBFREFtFEgs), etc.

FpUser · 3 years ago
I use ChatGPT very actively for programming among the other things and at no point feel threatened, rather empowered. No burnout either as I just work as usual. It just replaced Google Search and lots of typing.
guluarte · 3 years ago
> more senior-level people has been shut down (like the OP is doing).

I am already seeing this, companies are desperate for senior developers but at the same time they don't want to hire juniors.

Dalewyn · 3 years ago
>it's that they'll cause a lot of economic devastation to people who make a living through labor requiring skill and knowledge, especially future generations of skilled labor.

If a task can be completed satisfactorily by an automated computer program, was the task really "skilled labor"?

I ask this sincerely, because some of the occupations being replaced/evicted (eg: copywriting) were clearly given more skill value than they should have.

xnx · 3 years ago
You could've said this about calculators
strikelaserclaw · 3 years ago
It is endgame when AGI comes to fruition (which it will, it is just a matter of time, either in 10 years, 50 years, etc...), robots with AGI will be the ultimate life form in the universe and the final evolution of "life" on earth. It is laughable to think something so much more intelligent than people will somehow become a slave to us and do our bidding.
fleeno · 3 years ago
Can you elaborate on how you're actually using ChatGPT? I'm a developer and I haven't felt any need to use ChatGPT constantly.

What tasks are you delegating to ChatGPT that were previously done by humans? Most of my input from others is regarding current information specific to the task at hand. I don't see how ChatGPT would have any idea what I'm talking about.

Do you have some specific examples you could share?

simonw · 3 years ago
I have a bunch of examples myself. Here's a good recent one (prompts are linked about half way down the post): https://simonwillison.net/2023/Aug/6/annotated-presentations...

A few more:

- "Write a Python script with no extra dependencies which can take a list of URLs and use a HEAD request to find the size of each one and then add those all up" https://simonwillison.net/2023/Aug/3/weird-world-of-llms/#us...

- "Show me code examples of different web frameworks in Python and JavaScript and Go illustrating how HTTP routing works - in particular the problem of mapping an incoming HTTP request to some code based on both the URL path and the HTTP verb" https://til.simonwillison.net/gpt3/gpt4-api-design

- "JavaScript to prepend a <input type="checkbox"> to the first table cell in each row of a table" https://til.simonwillison.net/datasette/row-selection-protot...

- "Write applescript to loop through all of my Apple Notes and output their contents" https://til.simonwillison.net/gpt3/chatgpt-applescript
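The first of those prompts is a concrete enough spec that it's worth showing what the genre looks like; here's a minimal, stdlib-only sketch of the kind of script GPT-4 tends to produce for it (function names are mine, not taken from the linked post):

```python
import sys
import urllib.request


def total_size(lengths):
    """Sum Content-Length values, skipping servers that omit the header."""
    return sum(int(n) for n in lengths if n is not None)


def head_content_length(url):
    """Issue a HEAD request and return the Content-Length header (or None)."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.headers.get("Content-Length")


if __name__ == "__main__":
    # e.g. python sizes.py https://example.com/a.zip https://example.com/b.zip
    print(total_size(head_content_length(u) for u in sys.argv[1:]))
```

Note that some servers omit Content-Length on HEAD responses, which is exactly the kind of edge case you still have to review in generated code.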

flanked-evergl · 3 years ago
I dunno what you do for a job, but if those are your hard problems, or even your medium problems, then I think you lucked out big time.
slt2021 · 3 years ago
sorry, but these examples are not impressive at all and by no means representative of any serious programmer's workload.

Programmers are paid not to bang out code, but rather to figure out the mess and crap of the existing codebase and how to selectively add one or two lines that change the system's behavior while keeping it stable.

bawolff · 3 years ago
So basically the LLM is the new short shell script.

Fair enough, but I also don't really feel this is threatening anybody's job.

lhl · 3 years ago
I think it's pretty useful to share concrete examples like this. I basically always have a ChatGPT4 CI tab open when I'm developing these days - it's usually a much faster/better go-to than Google or SO for looking stuff up. ChatGPT is great for all the random stuff I don't care to remember the syntax for.

* I'm always using it to munge/generate tables/csv/markdown/json - you can basically throw in any copy and paste from a random PDF that's some weird gobbledygook of tabs, spaces, and newlines and get something cleanly formatted. On the one hand, it seems like a waste of computation, but on the other hand, it's way cheaper than my time and there are so many tasks that require using poorly formatted output. Even better, CI will of course write awk/sed for you if you need to do any automation.

* I'm always forgetting the syntax for named byobu sessions (it happily wrote a script to help with that), but I've also been staging some dev servers and it was able to generate the scripts to create new named sessions and windows, attaching/creating when necessary, handling whether the processes were running, and creating the systemd units for spinning these up.

* On this same project it wrote some python scripts for managing SSH tunnels and reverse tunnels, including filtering/logging of error messages, handling jump servers, etc. This is all stuff I've done years ago (and even written lots of docs for), but it was actually way faster for ChatGPT to generate these than digging those out.

* I've been running into issues w/ some HTML5 audio output and needed to swap to websocket streaming w/ webmedia output (which I wasn't familiar with at all). ChatGPT gave me the code to swap into my FastAPI server and the frontend code I had w/o having to do any further research, great.

* I hate Docker setups, and I had issues w/ Nvidia containers and GPUs not showing up w/ my docker config. I was able to pass it the various error messages and get my problems fixed without spelunking/hair-pulling. Same with figuring out some cross-container network hijinx.

* There's a bunch of one-offs that I might just not have bothered doing that I can now just ask it to do as well - eg, I've previously written code for Poisson distributions and the like, so I knew what to ask for, but it would have been a huge PITA to dig out exactly how to do it again; it took like no effort to just ask GPT4 CI to figure out a one-off I just wouldn't have done otherwise: https://chat.openai.com/share/80fa7bc0-e099-4577-bad9-d026e7...
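For a sense of scale, the table-munging task in the first bullet is only a few lines once you decide on a delimiter heuristic; a rough sketch of the kind of normaliser ChatGPT typically writes (the `paste_to_csv` name and the "2+ spaces or a tab is a column boundary" rule are my illustrative assumptions, not from the comment):

```python
import csv
import io
import re


def paste_to_csv(raw):
    """Normalise a messy PDF paste (runs of tabs/spaces as delimiters) into CSV."""
    out = io.StringIO()
    writer = csv.writer(out)
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue  # drop blank lines left over from page layout
        # Treat a tab, or any run of 2+ spaces, as a column boundary
        writer.writerow(re.split(r"\t+| {2,}", line))
    return out.getvalue()
```

The value of the LLM here isn't that this is hard code; it's that it writes the throwaway version faster than you can, including whatever weirdness your particular paste has.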

fleeno · 3 years ago
Those are good examples. I'm constantly doing weird one-off things in Ruby or Bash, and maybe I'll ask ChatGPT about more of them.

I really wanted to know what vague things OP's humans were doing, but they haven't responded to anything.

I've been reading your blog since the dark ages. Thanks for the great content over the years!

keithnz · 3 years ago
I've been coding 40+ years and I use it reasonably regularly, for things that are trivial time waster type stuff. Like I need some powershell command to automate something that I know will be automatable. Less so for coding, but I have a IDE AI code generator (Codeium) that is often good at predicting what you want to do next, especially for boilerplate type stuff. Then there's the times you are heading into unknown and just need a starting point, for instance I asked it to write a discord bot that did X Y and Z, and it pretty much gave me a good shell of a program. Didn't really have to refer to any other documentation. It's often good at finding ways to do obscure things. Quite often I find it most useful with TSQL stuff, not so much for basic queries, but there's lots of inbuilt toys I've just never come across nor care to spend the time researching. I can't see how it would replace a junior developer though. If anything, it makes it easier for junior developers to get up to speed.
skepticATX · 3 years ago
I think that we are the silent majority. ChatGPT can be mildly useful for me as a dev, but in general it actually slows me down. There have been a few times when it has really shined, but it’s not the norm.

If it was actually a life altering tool (and it might be one day) there wouldn’t need to be an entire industry of people trying to convince everyone that with just one small trick Google doesn’t want you to know, you can quadruple your productivity.

Exuma · 3 years ago
It's immensely helpful for learning. Orders of magnitude better than Google, which has ruined their search results with SEO bait.

At the very least, it's a much more powerful Google (don't nitpick my comparison, I realize it hallucinates). Getting the EXACT context of your question is something generalized search/articles online will NEVER give you, even if you read hundreds of pages of docs all day. Docs are good for certain things, but not when you want to know just a single setting or atomic piece of information. I want to get the smallest amount of accurate information, specific to my problem, as I'm programming many hours per day on my own companies as a one-man show.

My search history on chat gpt includes a few things as examples:

- specific ways SOLID principles could be applied to Go, which is a non-OOP language

- helping me quickly learn nuances of Lua for configuring neovim, specifically for weird syntax or things annoying to google (ie what does # mean) or what does a specific error mean within the context of the configuration

- more efficient top k algorithms than what I was building for learning purposes

- asking to break down big o complexity of certain types of sort functions and whether they differ from n log n

- helping me learn enough rust to do a bug fix Pr that was annoying me

- x vs s in neovim config for keymap modes

- figuring out why Ruby doesn’t implement descending ranges

Etc etc etc
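The top-k item in that list has a textbook answer, for what it's worth: keep a size-k min-heap so the cost is O(n log k) instead of the O(n log n) of sorting the whole input. A sketch of what an LLM would likely suggest (the function name is mine):

```python
import heapq


def top_k(items, k):
    """Return the k largest items in descending order.

    Keeps a size-k min-heap, so the cost is O(n log k) rather than the
    O(n log n) of sorting the whole input.
    """
    heap = []  # min-heap holding the k largest items seen so far
    for x in items:
        if len(heap) < k:
            heapq.heappush(heap, x)
        elif x > heap[0]:
            heapq.heapreplace(heap, x)  # evict the current smallest
    return sorted(heap, reverse=True)
```

(In practice `heapq.nlargest(k, items)` does the same thing in one call; spelling it out is the "for learning purposes" version.)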

claytongulick · 3 years ago
I'd love to understand this too - my experience has been that I can generally write what I want faster than figuring out what prompt will get something close to right, and then editing/revising it to make it right.

Add to this the limited usefulness for generating code that's contextual - making some method deep inside a component tree that needs to reference a service class, pick some DOM elements to mutate, etc... it requires knowledge and reasoning about the project and overall code structure.

I don't understand how folks are using it as a productivity booster, unless maybe as something like a better StackOverflow?

kderbyma · 3 years ago
yeah, I would like some examples that aren't just trivial. I have failed to use it successfully for anything I cannot simply state in a condensed singular statement or paragraph. And almost none of my work is easily condensed into a single paragraph. Coupled with the complete misunderstanding it constantly seems to have and its inability to understand nuance... I am struggling to make use of it and actually feel productive. Everything I use it for fails when I attempt to test it, and it won't do anything complex because the tokens needed to explain the idea alone are quite numerous. I guess you could have it refactor code...?
culopatin · 3 years ago
The chat part lets you start with a base and you build on top of it, you don’t have to fit it all in one sentence.
simple-thoughts · 3 years ago
For code I've found LLMs mostly useless, since if I don't understand something I need to read the docs anyway, and the generated code tends to be buggy even in React.

Where I have found LLM useful is in generating text. Where I used to use a thesaurus I now use LLM to find words to name things in themed UX. But it’s not great at function or variable names, it tends to pick names that look good but don’t precisely describe what something is. LLM is also great at generating text for role play.

GuB-42 · 3 years ago
I've had a lot more success writing some code and having ChatGPT document it than doing the opposite. The documentation tends to be much better written than what I would have produced by myself.

Indeed because ChatGPT is excellent at writing text. And because I know exactly what I want to see even if I have a hard time putting it into words myself, I can easily catch the mistakes and hallucinations.

I don't get why there is so much focus on code generating AIs and so little on code analysis. Have AIs do code reviews, write tests and analyze the results, etc... LLMs are awesome at reviewing code, they are able to tell you what's unexpected. And what is unexpected has a good chance of either being a bug or some key element of the code that needs attention. I think I have seen a single article about that, out of hundreds that are about code generation.

interstice · 3 years ago
After putting some thought into this I think it has to do with the kind of developer you are. In my case I'm usually across 10-20 ecommerce websites doing various semi-unique jobs with relatively simple code.

Largely I use CGPT for work that's boilerplate/LOC heavy but architecture light, things like writing first drafts of React hooks and the like. It's quite good with constraints like use typescript or use X function to do Y.

I usually give it about two goes if it goes in the wrong direction on the first try. If it seems to not conceptually understand what I'm asking I generally just write it directly rather than tinkering with prompts for 20 minutes.

I also have a couple of longer system prompts saved for converting Vue components to React using the house style and things like that using the playground.

nomel · 3 years ago
> but architecture light

It does fairly well for architecture, if you don't expect too many specifics. It, at least, works as a reasonable sanity check/brainstorm.

All of these LLMs become less expert the finer the resolution of the context you give them. Keep it high level, and you still have a relatively expert assistant.

selestify · 3 years ago
I would love to know this too. For me it’s involved too much manual copy-pasting of existing code for context, for it to feel like it’s doing much for me.
gandalfgreybeer · 3 years ago
For cases like that, copilot (with chat for context) might be more of what you’re looking for. Chatgpt specifically, I’ve been using for very light context / general tasks that I modify. I always consider the trade off between how much time I’m saving by having to write the prompt full of context.
petabytes · 3 years ago
Today I got ChatGPT to generate a basic TCP server template in C for an app I'm working on. If I didn't have AI, I probably would have searched for a GitHub gist, and that would probably have been a more accurate template.
stevage · 3 years ago
My main use is TypeScript, which I am using for the first time and struggling with a bit. I'm fine with straightforward type definitions, but I often hit complicated situations I don't know how to solve. Googling doesn't really help because I don't know the abstract terms for what I want to do.

Instead I paste the JavaScript and tell ChatGPT to add type definitions. Mostly it gets it right. If it doesn't, it gets me closer.

I don't use it for JS in general because I'm particular about how I write stuff. Though occasionally I'll lean on Copilot to fill out a utility function.
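
A typical instance of this (illustrative code, not stevage's actual project): generics and `keyof` are exactly the kind of "abstract terms" that are hard to google from a JavaScript starting point, but easy to verify once the model produces them:

```typescript
// Untyped JS original:
//   function pluck(items, key) { return items.map(i => i[key]); }
//
// A ChatGPT-style typed version (hypothetical output). The generic K is
// constrained to the keys of T, so the return type T[K][] is inferred
// per call site -- vocabulary ("generic constraint", "keyof", "indexed
// access type") that's hard to search for if you don't already know it.
function pluck<T, K extends keyof T>(items: T[], key: K): T[K][] {
  return items.map(item => item[key]);
}

const users = [
  { name: "Ada", age: 36 },
  { name: "Grace", age: 45 },
];

const names = pluck(users, "name"); // inferred as string[]
const ages = pluck(users, "age");   // inferred as number[]
console.log(names); // [ 'Ada', 'Grace' ]
```

Checking a suggestion like this is quick: if the compiler accepts it and call sites get the expected inferred types, it's right, which matches the "mostly it gets it right, otherwise it gets me closer" experience.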

allenu · 3 years ago
I do Mac/iOS development and am constantly asking ChatGPT about various APIs and frameworks. Apple's documentation is not great for explaining how to actually use APIs, unless you can find the one WWDC video that explains it or a sample project that they released years ago. I would normally google for sample code, blog posts, tutorials, or Stack Overflow posts. Something that might take an hour of searching and reading now takes a few seconds of just asking ChatGPT.

Even for things that I've done before, it's often much easier to ask ChatGPT how to do something than to look through my projects to find how I did it previously. It might sound lazy, but if it takes me several minutes to search through various projects to find that one time I did something, why bother when I can just ask ChatGPT and know in seconds?

I will say that yes, ChatGPT can hallucinate APIs that don't exist, and that can be annoying, but even if it does it 20% of the time, it's still incredibly valuable in the time savings the other 80% of the time it does hit.

lolsal · 3 years ago
I wonder how this is affecting what you consider knowledge going forward. This strikes me as students using google to answer homework questions and forgoing the actual “learning” part.
slt2021 · 3 years ago
If you use Elasticsearch and aren't familiar with its query syntax (I am not), you can use ChatGPT to write the queries for you.

Same for SQL, if you are not familiar with SQL.

The same probably goes for Splunk SPL, Kibana KQL, Prometheus PromQL, or any other DSL you're not familiar with.
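
For a sense of what that looks like, here is the sort of Elasticsearch query DSL a prompt like "find error logs from the last week mentioning 'timeout'" might produce, written as a TypeScript object. The field names (`message`, `level`, `@timestamp`) are assumptions about a hypothetical index, not a real schema:

```typescript
// Illustrative Elasticsearch query DSL: a bool query combining a full-text
// match (scored) with exact-value and date-range filters (unscored).
const query = {
  query: {
    bool: {
      must: [{ match: { message: "timeout" } }],
      filter: [
        { term: { level: "error" } },
        { range: { "@timestamp": { gte: "now-7d/d" } } },
      ],
    },
  },
};

// This object would be POSTed as the body of a _search request.
console.log(JSON.stringify(query).includes("now-7d/d")); // true
```

Even generated, the structure is checkable clause by clause, which is the literacy the reply below argues you still need.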

aldarisbm · 3 years ago
The problem is that you don't get to know whether the output is actually okay; you just know "it works". This is the scariest part to me, especially with a DSL or programming language I'm not familiar with.

I want to contribute while being fully aware of what I'm contributing. This doesn't lend itself to that.

Deleted Comment

mahathu · 3 years ago
type a function signature and an opening squiggly brace, wait for "copilot" to autocomplete, press tab, ????, profit
karmajunkie · 3 years ago
> 1. Humans work 9-5 (or some schedule), but ChatGPT is available always and works instantly. Now, when I have some idea I want to try out - I start working on it immediately with the help of AI. Earlier I just used to put a note in the todo-list and stash it for the next day.

This sounds like the root of your problem, and it rests entirely on your ability to enforce boundaries (which you may or may not have set for yourself). No judgment here; I think we've all struggled with this at one time or another. Or, you know, constantly...

> 4. I tried to put a schedule to use it - but when everybody has access to this tech, I have a genuine fear of missing out.

I definitely know that feeling. I think the likely outcome writ large is that this FOMO feeling will eventually subside. The economy for years has needed more developers than were available; ChatGPT and friends will result in individuals being able to do more and soak up demand that way instead of increasing supply. The long-term negative effect of this is more likely to be depressed wages instead of massive unemployment in the tech sector.

> 5. I have zero doubt that AI is setting the bar high, and it is going to take away a ton of average-joe desk jobs. GPT-4 itself is quite capable and organisations are yet to embrace it.

Another way of looking at it is that it's going to create a number of desk jobs, but those who can't adapt to the tools on the market will suffer in the same way that people who couldn't adapt to spreadsheets, word processors, etc., certainly had fewer job opportunities than those who did. Some people are going to get left behind, no doubt, which is why I'm in favor of a robust social safety net. But even with questionable public support for those people, I don't think anyone today would suggest we retreat to an economy without such basic tools as spreadsheets and word processors.

Paul-Craft · 3 years ago
Interesting observations. For context, it looks like you are a software engineer from your comment history, is that correct?

I'm wondering why you no longer feel the need to hire juniors because of GPT-4. Is it because GPT-4 has taken up the cognitive load capacity you need for mentoring juniors, or do you feel like GPT "obsoletes" less experienced people?

I think ChatGPT's advice is on the right track. It sounds to me like your experience of using it is kind of like my experience of pairing with someone else of equal-ish ability: productive, but draining, due to the need to constantly pay attention. If so, why not treat it similarly? Most people don't pair all day every day, probably because of the aforementioned cognitive load of doing so.

Last, but not least, while this may seem obvious, you should remember that you are human and not a machine. You need to separate yourself from this thing for at least some portion of your day. The constant stress (and, yes, that dopamine rush you feel when you use it is a kind of stress -- stress isn't always a purely negative thing) will take its toll on you eventually. That's the "burnout" you're perceiving, and the only way to prevent it is to just not let it happen.

Take care of yourself. Socialize and interact with humans, especially close friends and/or SO's as applicable. If you have a pet, spend some time with them. Take a walk.

But, most of all, remember that GPT-x, as smart as it may appear, can't actually learn anything from experience. It can only learn from an expensive and labor-intensive process, and once its training is done, it's frozen in time forever (modulo some fine-tuning, which is essentially an extension of said labor-intensive training process). And, at the end of the day, that just makes it a very versatile, very expensive, and very useful tool, but a tool nonetheless.

obblekk · 3 years ago
I've experienced a very similar feeling.

To me it feels exactly like finding wikipedia in 2005, or getting an iphone + wikipanion in 2008. The frontiers of my mind have been unleashed. A real bicycle for the mind.

Here are some tactics I use to "turn off gpt":

1. It'll be there tomorrow. The great thing about their threaded model is you can easily find the convo and continue it tomorrow. Remind yourself of that consciously (or tape it to your monitor!)

2. You're not behind, you're ahead. 80% of Americans haven't tried chatgpt. 95% of the world maybe.

3. Don't worry about juniors. They'll still be hired because now they'll ramp up faster and produce better code, using the same tool you're using. Same thing that happened when stackoverflow became popular and junior devs stopped "reading the source code" or "reading man pages."

For all the limitations of GPT4, it truly is great at coding. Exciting times.

purplecats · 3 years ago
> 2. You're not behind, you're ahead. 80% of Americans haven't tried chatgpt. 95% of the world maybe.

idk if anyone realistically compares themselves to the abstract, nebulous "everyone". It's likely more about their own socioeconomic band.

falloutx · 3 years ago
It seems like one of those things like VR and crypto: a technical solution looking for a problem to solve. After 2 years we have still not found a single good use for it, and yet it is supposed to be disruptive. If you think we have, give me an example of one app that has used it so well that it is now comfortably ahead of the competition.
DonsDiscountGas · 3 years ago
>Don't worry about juniors. They'll still be hired because now they'll ramp up faster and produce better code

So maybe the seniors should be worried: since we/they no longer have much of a barrier to entry, that means much more competition.

midasz · 3 years ago
If you think writing code is the most important, or even the largest part of being a (senior) software engineer - sure. In my experience it's not though. It's being able to communicate clearly, understanding and translating requirements (sometimes to code), knowing boundaries and saying no, deep understanding of systems and knowing how to debug them.

Transitioning from junior to medior (for example) is much more than writing x% better code. It's the process of falling and getting back up. Being stumped and learning when to ask for help (and not just technical, what if the spec is 'wrong'?).

I definitely worry that we are leaving future generations in the dust and that there'll be an experience gap. It's a disservice to take away something from them that we enjoyed ourselves.

No sane company should run on juniors, they're an investment.

zer8k · 3 years ago
Waiting for your direct to Amazon book about how AI will kill software OP. Always entertaining.

ChatGPT will likely be added to the list of dead things that were supposed to "kill" the software developer. I've noticed this pervasive attitude among, what I can only term as, people who actually enjoy LinkedIn. If you understand what I'm saying, you can probably already picture the annoying, over-the-top, buzzword-laden below-the-fold post that feels like it's only designed to steal braincells. ChatGPT might be able to kill the CRUD developer like WYSIWYG killed HTML programmers. There will be plenty of jobs no one wants ChatGPT to touch; finance, medicine, and the military are some I can imagine without much thinking. "No Code" is on its, what, 4th iteration and still hasn't killed programming. We are more likely to lose our jobs to overseas outsourcing than to a stupid rock we tricked into thinking.

I am actually annoyed reading this Ask HN. The level of smugness reminds me of wantrepreneur bros. Woe is me, I'm burned out from being so productive. Gag. I'm an actual professional developer. ChatGPT does not provide oodles of value to me. A lot of our juniors and mids use it, and I often find problems with the way they copy-and-paste garbage. Admittedly, the copy-and-paste is better. However, to me it reduces to the same StackOverflow problem. Maybe if they were better "prompt engineers" (lol) they might get better output. Or they could take the 30 hours needed to figure out prompts and simply get better at writing code instead.

falloutx · 3 years ago
Another thing I've noticed among HN readers recently: they say Google search has gotten incredibly bad, so they use ChatGPT more. But they fail to understand that one of the key reasons Google search has gone bad is every Tom and his mom pushing out garbage SEO spam with the help of ChatGPT over the last 2 years.
nomel · 3 years ago
> At times it feels like we are working for ChatGPT and not the other way around.

Welcome to the future, where AI subscriptions (self- or employer-provided) are required for employment, with the majority of your work being management and high-level input, where you guide and answer questions for the* AI.

*Probably "The" AI, since there will be one obvious choice for your problem space, which not using would put you at a severe disadvantage.

Seriously though, I've been feeling this somewhat too, lately. The "investment" part of ROI has shifted significantly on the "junior" side of things, where I can now do "boring" things I wouldn't normally. So I find myself doing more boring tasks, with a definite net positive outcome, but also everything negative that you described.

The problem is that this ROI shift only covers the "junior" end of the problem space, so I'm working on more junior problems than I was before.

I think we're somewhat proving that juniors are still needed, to take these tasks. They have been empowered the most, and will still learn and feel creative, working on these problems. More senior people won't. I understand I'm saying this from a point of extreme privilege, but I think most of us need to feel creative, and "enjoy" what we're doing. That means harder problems.

Maybe it's best to still let the juniors continue to do the junior things. There's someone out there that would love to spend all day doing what's burning you out.

al2o3cr · 3 years ago

    ChatGPT has the habit of throwing new knowledge back at you.
That's certainly ONE way to characterize its tendency to hallucinate APIs and operating modes out of thin air.

    I no longer feel the need to hire juniors.
You've just described how you're overworked and burning out from doing too much stuff yourself. Are you sure about that absence of need?