- cross-posted to:
- technology@lemmy.world
cross-posted from: https://sh.itjust.works/post/1062067
In a similar case, the US National Eating Disorder Association laid off its entire helpline staff. Soon after, the chatbot that replaced them was disabled for giving out harmful information.
It’s a little annoying how this article is written as a shitty “look who’s getting dunked on on Twitter today” article, even though it’s actually about a serious issue. I don’t care about Twitter drama, I care about the fact that people are losing their jobs to AI.
Because they can't, or aren't willing to, investigate what happened at this particular company or to its staff. The thrust of the story therefore becomes what's happening on Twitter ("getting absolutely roasted"), because people connect with action.
A better story could recount the events up to now. Maybe something like this?
Finding this information and weaving it into a story that makes people go "And then what happened?!" is difficult and takes time. It's hard to justify that effort when you can get clicks from shit like this article.