It’s not just the internet.
Professionals (using the term loosely) are using LLMs to draft emails and reports, and then other professionals (?) are using LLMs to summarise those emails and reports.
I genuinely believe that the general effectiveness of written communication has regressed.
Yep. My work has pushed AI shit massively. Something like 53% of staff are using it. They’re using it to write client reports for them, all sorts of things. It’s honestly mad.
Relevant comic
I’ve tried using an LLM for coding, specifically Copilot in VS Code. Only about 4 times out of 10 does it generate accurate code, which means I spend more time troubleshooting, correcting, and validating what it produces than actually writing code.
I use it to construct regexes which, for my use cases, can get quite complicated. It’s pretty good at doing that.
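For example, something along these lines (a made-up illustration, but it shows the sort of pattern I mean): getting an LLM to draft a regex with named capture groups for pulling fields out of log lines.

```powershell
# Hypothetical example of the kind of regex an LLM is handy for drafting:
# extract date, time, level, and message from a log line.
$pattern = '^(?<date>\d{4}-\d{2}-\d{2})[T ](?<time>\d{2}:\d{2}:\d{2})\s+(?<level>INFO|WARN|ERROR)\s+(?<msg>.*)$'

$line = '2024-05-01 14:32:07 ERROR Disk quota exceeded on /dev/sda1'
if ($line -match $pattern) {
    # $Matches holds the named capture groups after a successful -match
    $Matches['level']   # ERROR
    $Matches['msg']     # Disk quota exceeded on /dev/sda1
}
```

The named groups also make it easy to eyeball whether the generated pattern actually captures what you asked for.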
I like using GPT to generate PowerShell scripts; surprisingly, it’s pretty good at that. They’re small tasks, so it’s unlikely to go off into the deep end.
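Something like this, say (a hypothetical example of the scale I mean, not an actual script of mine): a one-screen report that’s trivial to sanity-check before running.

```powershell
# Hypothetical small task: list files over a size threshold, newest first.
param(
    [string]$Path = 'C:\Logs',
    [int]$MinSizeMB = 100
)

Get-ChildItem -Path $Path -Recurse -File |
    Where-Object { $_.Length -gt ($MinSizeMB * 1MB) } |
    Sort-Object LastWriteTime -Descending |
    Select-Object FullName,
                  @{Name = 'SizeMB'; Expression = { [math]::Round($_.Length / 1MB, 1) } },
                  LastWriteTime
```

Because the whole thing fits on one screen, it’s easy to verify before it touches anything.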