As the AI market continues to balloon, experts are warning that its VC-driven rise is eerily similar to that of the dot-com bubble.

  • gayhitler420@lemm.ee · 1 year ago

    You got two problems:

    First, ai can't be a tutor or teacher because it gets things wrong. Part of pedagogy is consistency and correctness, and ai isn't that. So it can't do what you're suggesting.

    Second, even if it could (it can't get to that point, the technology is incapable of it, but we're just spitballing here), that's not profitable. I mean, what are you gonna do, replace public school teachers? The people trying to do that aren't interested in replacing the public school system with a new gee-whiz technology that provides access to infinite knowledge; that doesn't create citizens. The goal of replacing the public school system is streamlining the birth-to-workplace pipeline. Rosie the robot nanny doesn't do that.

    The private school class isn't gonna go for it either: currently because they're ideologically opposed to subjecting their children to the pain tesseract, but more broadly because they're paying big bucks for the best educators available. They don't need a robot nanny; they already have plenty. You can't sell precision mass-produced automation to someone buying bespoke handcrafted goods.

    There's a secret third problem, which is that ai isn't worried about precision or communicating clearly; it's worried about doing what "feels" right in the situation. Is that the teacher you want? For any type of education?

      • gayhitler420@lemm.ee · 1 year ago

        We've invented a computer model that bullshits its way through tests and presentations and convinced ourselves it's a star student.

    • linearchaos@lemmy.world · 1 year ago

      First, ai can’t be a tutor or teacher because it gets things wrong.

      Since the iteration we have, which is designed for general-purpose language modeling and trained widely on every piece of data in existence, can't do exactly one use case, you can't conceive that it can ever be done with the technology? GTHO. It's not like we're going to tell ChatGPT to teach kids raw, knowing how an LLM works; it would be some more structured program that uses something like ChatGPT for communication. This is completely reasonable.
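
      To make it concrete, here's a bare-bones sketch of what I mean. Everything in it is made up for illustration: the lesson data is fake and call_llm is a stand-in for whatever chat API you'd actually wire up. The point is that the curriculum and the grading live in ordinary code, and the model only handles the wording.

      ```python
      # Hypothetical sketch: a structured tutor program that uses an LLM only
      # for communication. Lesson content and answer checking are plain code.

      LESSON = {
          "question": "What is 3/4 + 1/8?",
          "answer": "7/8",
          "hint": "Rewrite both fractions with a common denominator of 8.",
      }

      def call_llm(prompt: str) -> str:
          # Placeholder for a real chat-completion client.
          raise NotImplementedError("plug in your LLM client here")

      def check_answer(student_answer: str) -> bool:
          # Deterministic grading: the model never decides what is correct.
          return student_answer.strip() == LESSON["answer"]

      def tutor_reply(student_answer: str) -> str:
          if check_answer(student_answer):
              return "Correct!"
          # The LLM only rewords a hint the lesson plan already contains.
          prompt = (
              "Explain this hint to a 10-year-old in two sentences, "
              f"without giving away the answer: {LESSON['hint']}"
          )
          return call_llm(prompt)
      ```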

      that’s not profitable.

      A. It's my opinion, but I think you're dead wrong: it's easily profitable. Even if not to Ivy League standards, it would certainly put community college out of business.

      B. Screw profit. Philanthropic investment throws a couple billion into a nonprofit run by someone who wants to see it happen.

      The private school class isn’t gonna go for it either,

      You think an Ivy League school is above selling a light model of its courseware when it doesn't have to pay anyone to teach the classes or grade the work? Check out Harvard's courses on edX. It's not a stretch.

      There's a secret third problem which is that ai isn't worried about precision or communicating clearly

      Ohh, a secret third problem, that sounds fun. I'll let you in on another secret: AI isn't worried, because it's a very large, complicated math program. It doesn't worry about communicating clearly; the people who pile layer upon layer of LLM on top to produce output do that. It doesn't give a damn about anything, but the people who work on it do.

      You want clarity?

      Let's have GPT-4, here as it sits, clear up your complaint about my post:

      "Here is a revised version of your text that sounds more educated:

      There are two primary issues with the notion of using artificial intelligence as a tutor or teacher. Firstly, artificial intelligence is not infallible and can make mistakes. Consistency and correctness are integral components of pedagogy, and artificial intelligence is not yet capable of achieving these standards. As such, it is not suitable for the role you have suggested.

      Secondly, even if artificial intelligence were capable of fulfilling this role (which it is not, due to technological limitations), it would not be a profitable endeavor. Replacing public school teachers with artificial intelligence would not align with the goals of those seeking to reform the public school system. The aim is to streamline the process from birth to the workplace, and an artificial intelligence nanny would not facilitate this.

      Furthermore, the private school class would not be interested in such a proposal. They are currently opposed to subjecting their children to such technology and are willing to pay for the best educators available. Precision mass-produced automation cannot compete with bespoke handcrafted goods.

      There is also a third issue to consider: artificial intelligence is not concerned with precision or clear communication. Instead, it focuses on what “feels” right in a given situation. Is this the type of teacher we want for any form of education?"

      Or perhaps you'd like to have it communicate clearly to younger people:

      "Sure, here is a revised version of the text written at a 5th-grade level:

      There are two big problems with using computers to teach kids. The first problem is that computers can make mistakes. Teachers need to be right all the time, and computers aren’t always right. So, they can’t do the job you want them to do.

      The second problem is that even if computers could do the job (they can’t, but let’s pretend), it wouldn’t make money. People who want to change schools don’t want to use computers to teach kids everything. They want to make it easier for kids to get jobs when they grow up. A computer nanny can’t do that.

      Rich kids who go to private schools wouldn’t like it either. Their parents pay a lot of money for the best teachers. They don’t want a computer nanny. You can’t sell something cheap and easy to make to someone who wants something special and handmade.

      There’s also a secret third problem. Computers don’t care about being right or explaining things clearly. They just do what they think is best at the moment. Is that the kind of teacher you want? For any kind of learning?"
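
      And getting both of those versions is just a change of one string in the prompt. A rough, hypothetical sketch (call_llm is again a stand-in for the real chat API, and the audience strings are only examples):

      ```python
      # Hypothetical sketch: the same text rewritten for different audiences
      # by changing only the target-audience string in the prompt.

      def call_llm(prompt: str) -> str:
          # Placeholder for a real chat-completion client.
          raise NotImplementedError("plug in your LLM client here")

      def rewrite_for(text: str, audience: str) -> str:
          prompt = (
              f"Rewrite the following text so it reads clearly for {audience}. "
              "Keep every argument; change only the wording.\n\n" + text
          )
          return call_llm(prompt)

      # e.g.
      # rewrite_for(original_post, "an educated adult reader")
      # rewrite_for(original_post, "a 5th-grade student")
      ```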

      • gayhitler420@lemm.ee · 1 year ago

        Woof.

        I’m not gonna ape your style of argumentation or adopt a tone that’s not conversational, so if that doesn’t suit you don’t feel compelled to reply. We’re not machines here and can choose how or even if we respond to a prompt.

        I'm also not gonna stop anthropomorphizing the technology. We both know it's a glorified math problem that can fake it till it makes it (hopefully). If we've both accepted calling it intelligence, there's nothing keeping us from generalizing the inference "behavior" as "feeling." In lieu of intermediate jargon, it's damn near required.

        Okay:

        Outputting correct information isn't just one use case; failing to do it reliably is a deep and fundamental flaw in the technology. Teaching might be considered one use case, but it's predicated on not imagining or hallucinating the answer. Ai can't teach for this reason.

        If ai were profitable then why are there articles ringing the bubble alarm bell? Bubbles form when a bunch of money gets pumped in as investment but doesn’t come out as profit. Now it’s possible that there’s not a bubble and all this is for nothing, but read the room.

        But let's say you're right and there's not a bubble: why would you suggest community college as a place where ai could be profitable? Community colleges are run as public goods, not profit-generating businesses. Ai can't put them out of business because they aren't in it! Now, there are companies that make equipment used in education, but their margins aren't usually wide enough to pay back massive vc investment.

        It’s pretty silly to suggest that billionaire philanthropy is a functional or desirable way to make decisions.

        Edx isn't for the people that go to Harvard. It's a rent-seeking cash grab intended to buoy the cash raft that keeps the school in operation. Edx isn't an example of the private school classes using machine teaching on themselves, and certainly not on a broad scale. At best you could see private schools use something like Edx as supplementary coursework.

        I already touched on your last response up at the top, but clearly the people who work on ai don’t worry about precision or clarity because it can’t do those things reliably.

        Summarizing my post with gpt4 is a neat trick, but it doesn't actually prove what you seem to be going for, because both summaries were less clear and muddied the point.

        Now just a tiny word on tone: you’re not under any compulsion to talk to me or anyone else a certain way, but the way you wrote and set up your reply makes it seem like you feel under attack. What’s your background with the technology we call ai?

        • linearchaos@lemmy.world · 1 year ago

          What do you want me to do here? Go through each line item where you called out something on a guess that's inherently incorrect and try to find proper citations? Would you like me to take the things where you twisted what I said and point out why it's silly to do that?

          I could sit here for hours and disprove and anti-fallacy you, but in the end you don't really care; you'll just move the goalposts. Your worldview is that AI is a gimmick, and nothing that I present to you is going to change that. You'll just cherry-pick and contort what I say until it makes you feel good about AI. It's a fool's errand to try.

          Things are nowhere near as bad as you say they are. What I'm calling for is well within the possible realm of the tech with natural iteration. I'm not giving you any more of my time. Any further conversation will just go unread and blocked.

          • gayhitler420@lemm.ee · 1 year ago

            Hey I know you’re out, but I just wanna jump in and defend myself: I never put words in your mouth and never moved a goal post.

            Be safe out there.

      • Ragnell@kbin.social · 1 year ago

        This weekend my aunt got a room at a very expensive hotel, and was delighted by the fact that a robot delivered amenities to her room. And at breakfast we had an argument about whether or not it saved the hotel money to use the robot instead of a person.

        But the bottom line is that the robot was only in use at an extremely expensive hotel and is not commonly seen at cheap hotels. So the robot is a pretty expensive investment, even if it saves money in the long run.

        Public schools are NEVER going to make an investment as expensive as an AI teacher; it doesn't matter how advanced the things get. Besides, their teachers are union. I will give you that rich private schools might try it.

        • linearchaos@lemmy.world · 1 year ago

          Single robot, single hotel = bad investment.

          A single platform teaching an unlimited number of users anywhere in the world, at whatever price covers the R&D and upkeep. Greed would make it expensive if it could; it doesn't have to be.