I’m currently getting a lot of timeout errors and delays when processing the analysis. What GPU can I add to this? Please advise.

  • MangoPenguin@lemmy.blahaj.zone · 6 days ago

    The i7-6700 has an Intel iGPU that will handle heavy transcoding just fine using Quick Sync.

    It will even do really fast object detection with OpenVINO, with minimal CPU usage. At least in Frigate, both of those things work extremely well.
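
    For reference, a minimal sketch of what that looks like in a Frigate config, assuming a recent Frigate release with the bundled OpenVINO model — the detector and hwaccel preset names follow Frigate's documented options, but the exact paths and values are examples to adapt, not a drop-in config:

    ```yaml
    # Frigate config sketch: OpenVINO object detection on the Intel iGPU
    # plus Quick Sync (VAAPI) hardware decoding. Values are examples.
    detectors:
      ov:
        type: openvino
        device: GPU            # run inference on the Intel iGPU

    model:
      width: 300
      height: 300
      input_tensor: nhwc
      input_pixel_format: bgr
      path: /openvino-model/ssdlite_mobilenet_v2.xml     # model bundled with the Frigate image
      labelmap_path: /openvino-model/coco_91cl_bkgr.txt

    ffmpeg:
      hwaccel_args: preset-vaapi   # Quick Sync decode via VAAPI on the i7-6700's iGPU
    ```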

    • nieceandtows@lemmy.worldOP · 5 days ago

      I bought that desktop exactly for that reason. The video recording itself seems to work fine, but the AI model seems to struggle sometimes, and even when it works, it takes about half a second or more to make a classification. That’s what I want to improve with the GPU. I’ve been reading up on OpenVINO, and it seems impressive, but only in Frigate. Do you have any experience with Frigate vs Blue Iris? What are your thoughts?

      • ikidd@lemmy.world · 3 days ago

        So I run a Debian server; it’s a shitty little 4th-gen i5 of some sort with 8GB of RAM. It runs BI in a Windows Docker container using Dockur, as well as Docker stacks for Mosquitto and DeepStack, and other containers for my solar array and inverters.

        I have BI’s AI pointing at the Debian host machine’s IP with the port I used for the DeepStack container. This seems to be pretty good at object detection without any GPU, on a shitty little i3 processor that’s about a decade or more old.

        I use BI because Frigate and the other FOSS options just don’t have anything approaching the usefulness and ease of setup of BI. I’d love it if there were an alternative, because I fucking loathe having a Windows install on my network, even if it is running as a Docker container. But that’s not the case, so I pay for BI and use some mitigations for having to deal with Windows.

        I can post the compose files if you think you might want to give this a try.
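
        In the meantime, here’s a rough sketch of the shape of that setup (leaving out the Mosquitto and solar containers). The image names are the public dockurr/windows and deepquestai/deepstack images, but the ports, Windows version, and other values are placeholders, not my actual files:

        ```yaml
        # docker-compose sketch: Blue Iris inside a Windows VM (Dockur) plus DeepStack.
        # BI's AI settings then point at the Debian host's IP on the DeepStack port.
        services:
          windows:
            image: dockurr/windows        # Windows-in-Docker (Dockur); Blue Iris runs inside the VM
            environment:
              VERSION: "11"               # Windows version to install (example value)
            devices:
              - /dev/kvm                  # KVM passthrough required for the VM
            cap_add:
              - NET_ADMIN
            ports:
              - "8006:8006"               # Dockur's web-based viewer
              - "3389:3389"               # RDP into the Windows guest
            stop_grace_period: 2m

          deepstack:
            image: deepquestai/deepstack
            environment:
              VISION-DETECTION: "True"    # enable the object detection endpoint
            ports:
              - "5000:5000"               # the port Blue Iris AI is pointed at
            restart: unless-stopped
        ```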

        • nieceandtows@lemmy.worldOP · 3 days ago

          Yeah, I used DeepStack before and it had much better detection times, but recently BI switched to CodeProject.AI as the supported AI, so I moved over to that. It’s not as performant as DeepStack. Maybe I should try going back to DeepStack, even if it’s not officially supported.

          • ikidd@lemmy.world · 3 days ago

            That’s what I noticed too, so I went back to DeepStack. It integrates with no issues at all; just specify the port and let it go.

      • MangoPenguin@lemmy.blahaj.zone · 5 days ago

        I’ve never used anything else, so I can’t really compare, but Frigate works well.

        Blue Iris is Windows-only and really resource-heavy, which is why I’ve never used it for more than a quick test.