• meme_historian@lemmy.dbzer0.com (OP) · 7 points · 1 day ago

    Oh, no need to wait for LLMs. Apache Solr should be really good at it. We used it at a company I worked at to build the most kickass search into our platform, one that would actually find the stuff you were looking for…and that was back in 2018 :D
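
A minimal sketch of what querying Solr for this looks like, assuming a local Solr instance with a hypothetical core named `projects` and `title`/`description` fields (names invented for illustration). It just builds the select URL with the eDisMax parser, which is forgiving of free-form user input:

```python
from urllib.parse import urlencode

# Hypothetical Solr core and fields -- adjust to your own schema.
SOLR_SELECT = "http://localhost:8983/solr/projects/select"

def build_solr_query(user_input: str, rows: int = 10) -> str:
    """Build a Solr select URL using the eDisMax query parser,
    searching title and description with a boost on title."""
    params = {
        "q": user_input,
        "defType": "edismax",         # lenient parser for user-typed queries
        "qf": "title^2 description",  # fields to search; title weighted 2x
        "rows": rows,
        "wt": "json",
    }
    return SOLR_SELECT + "?" + urlencode(params)

url = build_solr_query("abandoned side project")
```

Fetching that URL against a running Solr would return ranked JSON results; the ranking quality comes from how you tune `qf` boosts and your field analyzers, not from any model.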

    • thickertoofan@lemm.ee · 1 point · 1 day ago

      Ayy, that’s nice. LLMs are truly overkill just for semantic search, though; I didn’t know there were other ways to achieve this. But we need intelligence too, right? (somewhat)

      • meme_historian@lemmy.dbzer0.com (OP) · 3 points · edited · 1 day ago

        Don’t get me wrong though… throwing an LLM at it would be a lot easier and faster. Just a mind-boggling use of resources for a task that could probably be done more efficiently :D

        Setting this up with Apache Solr and a suitable search frontend runs a high risk of becoming an abandoned side project itself^^

        • thickertoofan@lemm.ee · 3 points · 1 day ago

          Yeah, an LLM seems like the go-to solution, and the best one. And talking about resources, we can use barely-smart models that still generate coherent sentences: 0.5B–3B models offloaded to CPU-only inference.
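
For the ranking side, small models are usually used as embedders rather than generators. A toy sketch of embedding-based retrieval, where word-count vectors and cosine similarity stand in for a real sentence-embedding model (the `embed` function here is a placeholder, not an actual model call):

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy stand-in: bag-of-words counts. A real setup would call a
    # small embedding model here and get back a dense vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "abandoned search side project",
    "recipe for banana bread",
    "semantic search with small models",
]
query = "semantic search"
ranked = sorted(docs, key=lambda d: cosine(embed(query), embed(d)), reverse=True)
```

With real embeddings the same cosine ranking works unchanged; the model only has to produce vectors, not fluent text, which is why even sub-1B models can do the job on CPU.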

      • 0xD@infosec.pub · 1 point · 1 day ago

        Yes, your own intelligence that you integrate into the structure of your database and queries ;)