• halcyoncmdr@lemmy.world
    14 days ago

    No shit. This was obvious from day one. This was never AGI, and was never going to be AGI.

Institutional investors saw an opportunity to make a shit ton of money and pumped it up as if it were world-changing. They’ll dump it like they always do, it will crash, and they’ll make billions in the process with absolutely no negative repercussions.

    • metaStatic@kbin.earth
      14 days ago

      Turns out AI isn’t real and has no fidelity.

      Machine learning could be the basis of AI but is anyone even working on that when all the money is in LLMs?

      • Joeffect@lemmy.world
        14 days ago

        I’m not an expert, but the fact that an LLM doesn’t actually understand words, just the likelihood of what word comes next, seems like it’s not going to help progress it to the next level… Like, to be an artificial general intelligence, shouldn’t it know what words are?

        I feel like this path is taking a brick and trying to fit it into a keyhole…
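        The “likelihood of what word comes next” idea can be sketched as a toy bigram model (my own illustration, nowhere near how a real LLM works at scale): it just counts which word followed which in its training text, and “predicts” from those counts alone, with no notion of what any word means.

```python
from collections import Counter, defaultdict

# Toy bigram model: predict the next word purely from how often each
# word followed the previous one in the training text.
corpus = "the cat sat on the mat the cat ate the fish".split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_likely_next(word):
    # Pick the word that most often followed `word`; the model has no
    # idea what any of these words mean, only co-occurrence counts.
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_next("the"))  # "cat" (followed "the" twice; "mat" and "fish" once each)
```

        Real LLMs condition on a long context window rather than one previous word, but the objective is the same flavor: score what token is likely to come next.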

        • metaStatic@kbin.earth
          13 days ago

          Learning is the basis of all known intelligence. LLMs have learned something very specific; AGI would need to be built by generalising the core function of learning itself, not as an outgrowth of fully formed LLMs.

          and yes the current approach is very much using a brick to open a lock and that’s why it’s … ahem … hit a brick wall.

          • Joeffect@lemmy.world
            13 days ago

            Yeah, 20-something years ago, when I was trying to learn PHP of all things, I really wanted to make a chatbot that could learn what words are… I barely got anywhere, but I was trying to program an understanding of sentence structure and feed it a dictionary of words… My goal was to have it output something on its own…

            I’d like to see these things become less resource-intensive, and hopefully running locally rather than on some random server…

            I found the files… It was closer to 15 years ago…

              • taladar@sh.itjust.works
                13 days ago

                Also a bit sadistic to be honest. Bringing a new form of life into the world only to subject it to PHP.

              • Joeffect@lemmy.world
                13 days ago

                I’m amazed I still have the files… But yeah, this was before all this shit was big… If I had a better drive I would have ended up more evil than Zuck… My plan was to collect data on everyone who used the thing and build profiles on them based on what information they gave the chat… And that’s all I can really remember… But it’s probably for the best…

        • Pennomi@lemmy.world
          14 days ago

          Right, so AIs don’t really know what words are. All they see are tokens. The tokens could be words and letters, but they could also be image/video features, audio waveforms, or anything else.
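          A minimal sketch of what “all they see are tokens” means (my own toy example; real tokenizers like BPE split text into sub-word pieces, not whole words): text goes in, integer IDs come out, and the model only ever operates on the IDs.

```python
# Toy whitespace tokenizer: the model never sees "words", only integer
# IDs assigned in order of first appearance.
vocab = {}

def tokenize(text):
    ids = []
    for piece in text.lower().split():
        if piece not in vocab:
            vocab[piece] = len(vocab)  # assign the next free ID
        ids.append(vocab[piece])
    return ids

print(tokenize("the cat sat on the mat"))  # [0, 1, 2, 3, 0, 4]
```

          The same ID scheme works for image patches or audio frames, which is why the token abstraction carries over to multimodal models.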

        • sugar_in_your_tea@sh.itjust.works
          12 days ago

          shouldn’t it know what words are?

          Not necessarily, but it should be smart enough to associate symbols with some form of meaning. It doesn’t do that; it just associates symbols with related symbols, so if there’s nothing similar that already exists, it’s not going to come back with anything sensible.

          I think being able to create new content with partial sample data is necessary to really be considered general AI. That’s what humans do, and we don’t necessarily need the words to describe it.