Perhaps nothing has defined higher education over the past two decades more than the rise of computer science and STEM. Since 2016, enrollment in undergraduate computer-science programs has increased nearly 49 percent. Meanwhile, humanities enrollments across the United States have withered at a remarkable clip—in some cases, shrinking entire departments out of existence.

But that was before the age of generative AI. ChatGPT and other chatbots can do more than compose full essays in an instant; they can also write lines of code in any number of programming languages. You can’t just type “make me a video game” into ChatGPT and get something that’s playable on the other end, but many programmers have now developed rudimentary smartphone apps coded by AI. In the ultimate irony, software engineers helped create AI, and now they are the American workers who think it will have the biggest impact on their livelihoods, according to a new survey from Pew Research Center. So much for learning to code.

Fiddling with the computer-science curriculum still might not be enough to maintain coding’s spot at the top of the higher-education hierarchy. “Prompt engineering,” which entails crafting the phrases fed to large language models to coax better, more useful responses from them, has already surfaced as a lucrative job option—and one perhaps better suited to English majors than computer-science grads.

The potential decline of “learn to code” doesn’t mean that the technologists are doomed to become the authors of their own obsolescence, nor that the English majors were right all along (I wish). Rather, the turmoil presented by AI could signal that exactly what students decide to major in is less important than an ability to think conceptually about the various problems that technology could help us solve.

  • colonial@lemmy.world · ↑30 · 1 year ago

    After all, the discipline has always been about more than just learning the ropes of Python and C++. Identifying patterns and piecing them together is its essence.

    Ironic, considering LLMs can’t fucking do that. All they do is hallucinate the statistically likely answer to your prompt, with some noise thrown in. That works… okay at small scales (but even then, I’ve seen it produce some hideously unsound C functions) and completely falls apart once you increase the scope.

    Short of true AGI, automatically generating huge chunks of your code will never end well. (See this video for a non-AI example. I give it two years tops before we see it happen with GPT.)

    Also… not hating on English majors, but the author has no idea what they’re talking about and is just regurgitating AI boosterism claims.

    • loobkoob@kbin.social · ↑7 · 1 year ago

      All they do is hallucinate

      I read an article a couple of months ago about AI usage in geolocation (link because it’s interesting, even though it’s not necessarily relevant). In it, they brought up a quote from a computer scientist / AI specialist who said he preferred the word “confabulate” to describe what happens with AI, rather than “hallucinate.”

      Confabulation: a type of memory error in which gaps in a person’s memory are unconsciously filled with fabricated, misinterpreted, or distorted information.

      I agree with the guy that it’s a slightly better term for it, but I also just think it’s such a fun word that it’s too good not to share!

    • varsock@programming.dev · ↑5 · 1 year ago

      I agree with you and share the same opinions.

      For discussion’s sake I will add that, using AI, I have become so fast at creating “units of code” or restructuring. I ask it to solve a narrow scope and introduce constraints (like a condition variable, which parameters to use, the initial conditions). And it does. I have the experience to validate it by reading and to piece together the units of code, but now my productivity has nearly tripled.

      I don’t write comments anymore. I write what I need, ask it to comment the function, and maybe I’ll add something that is project-specific.

      And getting started with new technologies is easier as long as, like you said, you keep the scope small.
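
      For example, here’s the sort of narrow, well-constrained “unit of code” I mean: a tiny bounded buffer guarded by a condition variable, with the parameters and initial conditions spelled out up front (an illustrative Python sketch of the pattern, not actual AI output):

          import threading
          from collections import deque

          class BoundedBuffer:
              """Blocking FIFO with a fixed capacity, guarded by one condition variable."""

              def __init__(self, capacity: int = 8):  # initial condition: empty, capacity 8
                  self._items = deque()
                  self._capacity = capacity
                  self._cond = threading.Condition()

              def put(self, item) -> None:
                  with self._cond:
                      # Block while full; wait() releases the lock until another thread notifies.
                      while len(self._items) >= self._capacity:
                          self._cond.wait()
                      self._items.append(item)
                      self._cond.notify_all()

              def get(self):
                  with self._cond:
                      # Block while empty, then hand back the oldest item.
                      while not self._items:
                          self._cond.wait()
                      item = self._items.popleft()
                      self._cond.notify_all()
                      return item

      Small enough to read and validate in one sitting, which is the whole point.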

      AI will not replace programmers. Programmers that use AI will replace programmers who don’t.

      • psivchaz@reddthat.com · ↑2 · 1 year ago

        I think this is generally true, probably for the rest of my career. I don’t think it is true forever. Asking “what happens when this stops being a career” or at least “what happens when there are fewer jobs to go around” is important, and something I would rather we all sort out long before I need the answer.

        • varsock@programming.dev · ↑1 · 1 year ago

          Valid point. Again, for the sake of discussion: technology evolves quickly, and new tools are made out of the shortcomings of others. If Docker evolves and a new tool - Kocker - is born, AI will need training data drawn from best practices, which has to be generated by people.

          This could unfold in many ways. For one, there could be a small group of people pushing technology forward. But people will need to be around to create requirements, which takes experience.

          More likely, the majority of engineers will just move up to a higher level of abstraction, letting new tools handle the lower layers. And any innovations in the lower levels of abstraction will be done by a small group of people with niche skills (take CPUs, for example). This is the trend we saw historically: assembly -> compilers -> low-level languages -> interpreted languages -> scaling bare-metal systems -> distributed systems -> virtual machines -> automation -> microservices, etc.

  • MajorHavoc@lemmy.world · ↑15 · 1 year ago

    As a veteran, I can tell you there’s always a new piece of technology about to make me obsolete.

    I’ve been made obsolete a dozen times now. It’s funny how many people still cold call me asking for my help on things.

    I’m sure they’ll all get the memo that I’m obsolete this time. /s

    • Montagge@kbin.social · ↑2 · 1 year ago

      Where I grew up, the loggers said the same thing. Sure, there are still loggers, but teams that had 30 workers now have 6.

      If you think LLMs and/or AI aren’t going to be used to put people out of work to keep the cash flowing to the rich, I don’t know what to tell you.

      • VoterFrog@kbin.social · ↑3 · 1 year ago

        The logger analogy is a misunderstanding of what people with a degree in CS do. Most become software engineers. They’re not loggers, they’re architects who occasionally have to cut their own logs.

        They’ve spent decades reducing the amount of time they have to spend logging, only to be continually outpaced by the growth of demand from businesses and the complexity of the end product. I don’t think we’ve reached a peak there yet; if anything, the capabilities of AI are opening up even more demand for even more software.

        But, ultimately, coding is only a fraction of the job and any halfway decent CS program teaches programming as a means to practice computer science and software engineering. Even when an AI gets to the point that it can produce solid code from the English language, it has a ways to go before replacing a software engineer.

        One thing that’s for sure: tons of business owners will get richer and pay fewer workers. I think we’re going to have to face a reckoning as we reach the limits of what capitalism can sustain. But it’s also unpredictable because AI opens up new opportunities for everyone else as well.

      • MajorHavoc@lemmy.world · ↑2 · edited · 1 year ago

        Oh, I think it will cause a reduction in jobs - at least compared to if it had not existed. So did interpreted languages, shell scripting, web frameworks, code-generation templates, the dot-com bust, and the 2008 recession.

        Some folks are convinced that reduction is happening this year. I’ve written my share of LLMs and ML code, and I assure you it will not be next year.

        The hallucination problem is being dramatically downplayed by the sales people, but reality tends to sort that stuff out.

        Over the longer term, there’s a feeling among developers that this job is going to both stop sucking so badly, and stop paying so well. I also believe that is coming. But not this year or next.

        If anything, we look forward to quite a bit of additional pain and money to clean up after all the all-in-on-day-one AI mistakes being made right now.

        After that mess is sorted, a few of us should be able to retire, and a bunch of the rest can switch to part time.

        Edit: A key thing folks don’t recognize about the in-progress AI revolution is that programming isn’t a slow job, before the AI.

        Programming is a minefield of preventable mistakes: botched deployments, subtle logic errors, outright security nightmares, and misunderstood user requests.

        The programmer shortage is not because we’re all tired of typing so many lines of code, or even of copy/pasting from Stack Overflow, and we just need more of us to do all that lovely typing and pasting.

        The programmer shortage is because this job is a minefield of stressful surprises, and we have a high attrition rate.

        AI can and will help. But anyone betting on AI as a full replacement in the short run is woefully ignorant of the steep challenges faced by the entire field of programming.

        • Montagge@kbin.social · ↑2 ↓1 · 1 year ago

          I never meant to imply, like the attached article does, that this is all happening soon. It doesn’t have to happen soon to upend decades-long careers.

          I’m sure AI will help at first. That’s the point with something like this. Get people using it so you can continue to improve it to the point where you don’t need the people.

          Same thing happened with the loggers. Oh this equipment can’t replace a person. It’s just here to make a hard job easier. No. No. No it’s not the equipment taking your job away. It’s that damn spotted owl taking your job.

          That kind of makes me wonder in the decades ahead what will be the spotted owl when programmers start getting replaced.

      • jflorez@sh.itjust.works · ↑1 · 1 year ago

        Code created by an LLM still needs to be interpreted and understood by a human so it can be made useful in a software-development context. So yeah, I think the article is exaggerating the impact of AI on coding; in my opinion it will become yet another tool at a developer’s disposal to speed up their work.

        • Montagge@kbin.social · ↑1 ↓1 · 1 year ago

          I personally refuse to use it, because the way I use it would be used to build a system to replace me, and you can’t change my mind on that.

  • bloopernova@programming.dev · ↑13 · 1 year ago

    You still need to manually verify the code, line by line. Sure, in the future AI language models might be able to reason to fix bugs or do other human tasks, but that’s not happening tomorrow or even in the next couple of years.

    The AI has no understanding of what it is spitting out, so it’s up to us humans to curate its output.

  • squarm@lemm.ee · ↑10 · 1 year ago

    I assume these are just ghostwritten by business people who are hoping that if they say something often enough, it’ll become true.

  • uniqueid198x@lemmy.dbzer0.com · ↑8 · 1 year ago

    Scare pieces like this are created by people who have no actual understanding of software.

    Software is the automation of conceptual tasks. Some of these, like taxes or text editing, were fairly procedural and automated early. Others, like identifying birds or deepfaking celebrities, are difficult and were done later.

    Creating software is another conceptual task, and it might be possible to automate it. But once we have automated creating software, automating any other task becomes trivial as well.

    If this ever comes to pass, there are no safe majors.

  • Beej Jorgensen@lemmy.sdf.org · ↑6 · 1 year ago

    I hypothesize that the failure of AI in this arena will come down to the fact that English is a shit programming language. It can take many times more English than code to state something precisely.
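
    To illustrate with a hypothetical example of my own: one line of Python pins down what takes a full paragraph of careful English.

        from dataclasses import dataclass

        @dataclass
        class Student:
            name: str
            gpa: float

        students = [Student("Ada", 3.9), Student("Bob", 3.9), Student("Cy", 3.2)]

        # The English version: “order the students from highest GPA to lowest,
        # breaking ties alphabetically by name, and keep only the top two.”
        top = sorted(students, key=lambda s: (-s.gpa, s.name))[:2]
        print([s.name for s in top])  # ['Ada', 'Bob']

    And that’s a trivial case; the ambiguity only compounds from there.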

  • foo@withachanceof.com · ↑6 · edited · 1 year ago

    Considering the amount of flat-out incorrect or wildly off-base code GPT has generated on surprisingly simple tasks over nearly a year now, no, I’m not too worried about my job. I find it handy from time to time as a replacement for things I previously used a search engine for, which makes it a productivity booster for me. But for anything novel or not straightforward (aka, anything outside of its training set, which is what I’d ideally want to use it for), it’s less than useful, or actively harmful in trying to lead me down the wrong path. Overall, it still requires a human with significant knowledge of the field to know how to use the information these tools generate and how to put the pieces together to do something useful. I don’t see how that could change until an actual reasoning artificial intelligence brain is developed, which is a BIG ask, if it’s even possible in our lifetimes, or ever.

    …or maybe we’ll all be out of a job in 10 years. Humans are quite bad at predicting the future and I am indeed human.

    And for what it’s worth, no I did not RTFA. I’ve spent enough time reading articles prophesying the doom of software engineering due to generative AI and don’t feel like wasting more time on the topic.

    • colonial@lemmy.world · ↑1 · 1 year ago

      And for what it’s worth, no I did not RTFA

      I thought you had up to this point. I guess that just goes to show how shallow and predictable this AI boosterism is.

  • RotaryKeyboard@lemmy.ninja · ↑5 · 1 year ago

    I’ve just spent a few weeks continually enhancing a script in a language I’m not all that familiar with, exclusively using ChatGPT 4. The experience leaves a LOT to be desired.

    The first few prompts are nothing short of amazing. You go from blank page to something that mostly works in a few seconds. Inevitably, though, something needs to change. That’s where things start to go awry.

    You’ll get a few changes in, and things will be going well. Then you’ll ask for another change, and the resulting code will eliminate one of your earlier changes. For example, I asked ChatGPT to write a quick Python script that does fuzzy matching. I wanted to feed it a list of filenames from a file and have it find the closest match on my hard drive. I asked for a progress bar, which it added. By the time I was done having it generate code, the progress bar had been removed a couple of times, and changed out for a different progress bar at least three times. (On the bright side, I now know of multiple progress bar solutions in Python!)
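
    For the curious, the script ended up roughly along these lines (reconstructed from memory with hypothetical paths, not ChatGPT’s exact output):

        import os
        import difflib
        from tqdm import tqdm  # just one of the several progress-bar libraries I got acquainted with

        def find_closest_matches(list_file, search_root):
            """For each filename listed in list_file, find the closest-named file under search_root."""
            with open(list_file) as f:
                wanted = [line.strip() for line in f if line.strip()]

            # Collect every filename on disk once, up front.
            candidates = [name for _, _, files in os.walk(search_root) for name in files]

            results = {}
            for name in tqdm(wanted, desc="Matching"):
                matches = difflib.get_close_matches(name, candidates, n=1, cutoff=0.6)
                results[name] = matches[0] if matches else None
            return results

        if __name__ == "__main__":
            for wanted, found in find_closest_matches("filenames.txt", "/home/me").items():
                print(f"{wanted} -> {found}")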

    If you continue on long enough, the “memory” of ChatGPT isn’t sufficient to remember everything you’ve been doing. You get to a point where you need to feed it your script very frequently to give it the context it needs to answer a question or implement a change.

    And on top of all that, it doesn’t often implement the best change. In one instance, I wanted it to write a function that would parse a CSV, count up duplicate values in a particular field, and add that count to each row of the CSV. I could tell right away that the first solution was not an efficient way to accomplish the task. I had to question ChatGPT in another prompt about whether it was efficient. (I was soundly impressed that it recognized the problem after I brought it up and gave me something that ended up being quite fast and efficient.)
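
    The efficient version it eventually landed on was essentially a two-pass approach: tally the field once with a Counter, then annotate each row, instead of rescanning the data for every row. (My paraphrase, with hypothetical file and field names:)

        import csv
        from collections import Counter

        def annotate_duplicate_counts(in_path, out_path, field):
            with open(in_path, newline="") as f:
                rows = list(csv.DictReader(f))
            if not rows:
                return

            # Pass 1: count how often each value of `field` appears.
            counts = Counter(row[field] for row in rows)

            # Pass 2: write every row back out with its duplicate count attached.
            with open(out_path, "w", newline="") as f:
                writer = csv.DictWriter(f, fieldnames=list(rows[0]) + ["duplicate_count"])
                writer.writeheader()
                for row in rows:
                    writer.writerow({**row, "duplicate_count": counts[row[field]]})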

    Moral of the story: you can’t do this effectively without an understanding of computer science.

  • A1kmm@lemmy.amxl.com · ↑3 · 1 year ago

    Programming is the most automated career in history. Punch cards, Assembler, Compilers, Linkers, Keyboards, Garbage Collection, Type Checkers, Subroutines and Functions, Classes, Macros, Libraries (of increasingly higher-level abstractions), Build Scripts, CI/CD - those are all automation concepts that do things that theoretically a programmer could have done manually. To build all the software we build now would theoretically be possible without any automation - but it would probably require far more programmers than there are people on earth. However, because better tech leads to people doing more with the same, in practice the number of programmers has grown with time as we’ve just built more complex software.

  • MNByChoice@midwest.social · ↑3 ↓8 · 1 year ago

    No career is “safe.” A person can always learn new skills and adapt.

    The article actually ends well:

    exactly what students decide to major in is less important than an ability to think conceptually about the various problems that technology could help us solve.