• OpenStars@discuss.online
    27 days ago

    A lot of that is doable now - like, only so many grocery stores are even nearby to someone, so a custom bit of code that checks the website of each one, looking for previously manually-identified items, could automate that.
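    Roughly something like this - just a sketch, where the store URL and item names are made-up placeholders and a naive text match stands in for real per-store parsing:

    ```python
    # Check each nearby store's page for previously identified items.
    # The URL and item names below are placeholders for illustration.
    import requests
    from bs4 import BeautifulSoup

    STORES = {
        "Example Grocery": "https://example-grocery.test/weekly-specials",
    }
    WATCHED_ITEMS = ["oat milk", "rolled oats", "peanut butter"]

    def find_items(url: str) -> list[str]:
        """Fetch a store page and return which watched items show up in its text."""
        html = requests.get(url, timeout=30).text
        soup = BeautifulSoup(html, "html.parser")
        text = soup.get_text(separator=" ", strip=True).lower()
        return [item for item in WATCHED_ITEMS if item in text]

    for name, url in STORES.items():
        print(f"{name}: {find_items(url)}")
    ```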

    One major downside is prioritization of large chain stores at the expense of smaller mom & pop ones that don’t maintain a constant inventory system accessible via the web. Someone could even volunteer their time to build them a database backend, but they’d still have to see the value in actually scanning the items every time, or else it would quickly fall behind.
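    For a sense of scale, that backend could be tiny - a sketch with an invented table layout, and it only stays accurate if the scans actually keep happening:

    ```python
    # Tiny SQLite inventory backend a volunteer might set up for a small store.
    # Schema and field names are invented here purely for illustration.
    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect("store_inventory.db")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS inventory (
               item TEXT PRIMARY KEY,
               price_cents INTEGER,
               in_stock INTEGER,      -- 0 or 1
               last_scanned TEXT      -- the data goes stale if this stops updating
           )"""
    )

    def record_scan(item: str, price_cents: int, in_stock: bool) -> None:
        """Upsert one item; only useful if the store keeps scanning every time."""
        conn.execute(
            "INSERT INTO inventory (item, price_cents, in_stock, last_scanned) "
            "VALUES (?, ?, ?, ?) "
            "ON CONFLICT(item) DO UPDATE SET price_cents=excluded.price_cents, "
            "in_stock=excluded.in_stock, last_scanned=excluded.last_scanned",
            (item, price_cents, int(in_stock), datetime.now(timezone.utc).isoformat()),
        )
        conn.commit()
    ```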

      • OpenStars@discuss.online
        27 days ago

        That’s precisely what I was thinking, but reflecting more on it, I don’t know how well it would handle the webpages, so maybe some other languages would get mixed in too (I’m out of date - maybe PHP?). If AI-written code worked, it would lower the barrier, but I’m not certain we’re there yet to the point of trusting anything it would create.

        • GBU_28@lemm.ee
          27 days ago

          Python web scraping is just fine. With LLMs you have the option of either extracting the HTML and having the LLM read over that, or having a vision AI OCR the page and make its own decision about what to extract.
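          For the first option, something like this works - a sketch assuming the OpenAI Python client, with the model name and prompt as placeholders; any LLM API with a chat endpoint would do:

          ```python
          # Option 1: pull the page's HTML, strip it to text, and let an LLM read it.
          # Assumes the OpenAI Python client; the model name is a placeholder.
          import requests
          from bs4 import BeautifulSoup
          from openai import OpenAI

          def page_text(url: str) -> str:
              """Fetch a page and reduce it to visible text to keep the prompt small."""
              soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
              for tag in soup(["script", "style"]):
                  tag.decompose()
              return soup.get_text(separator="\n", strip=True)

          def extract_items(url: str, items: list[str]) -> str:
              client = OpenAI()  # reads OPENAI_API_KEY from the environment
              prompt = (
                  "From this store page text, list price and availability for: "
                  + ", ".join(items) + "\n\n" + page_text(url)[:15000]  # crude context cap
              )
              resp = client.chat.completions.create(
                  model="gpt-4o-mini",
                  messages=[{"role": "user", "content": prompt}],
              )
              return resp.choices[0].message.content
          ```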