At my day job, four such nice blue sticks are gathering dust on my shelf. I have the option of putting them online via a Raspberry Pi.

So, if someone has an interesting idea for how to put them to work for a good cause, talk to me.

  • poVoqMA
    1 year ago

Hmm, not sure how much can be done with these. Most ML software you can self-host requires CUDA or OpenCL, i.e. a GPU.

I am planning to set up a LibreTranslate instance on an old CUDA-enabled gaming laptop turned server soon:

    https://github.com/LibreTranslate/LibreTranslate

It would be cool to have an auto-translate button on Lemmy posts backed by the LibreTranslate API, like the one that exists for Discourse forums.
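
    For anyone curious what such a button would call, here is a minimal sketch of talking to a self-hosted LibreTranslate instance. The `POST /translate` endpoint and its `q`/`source`/`target`/`format` fields are part of the LibreTranslate API; the localhost URL and the helper names are assumptions for illustration.

    ```python
    # Sketch: querying a self-hosted LibreTranslate instance (stdlib only).
    # The host URL and helper names are assumptions, not part of LibreTranslate.
    import json
    import urllib.request

    def build_translate_request(text, source="auto", target="en",
                                host="http://localhost:5000"):
        """Build an urllib Request for LibreTranslate's POST /translate endpoint."""
        payload = {"q": text, "source": source, "target": target, "format": "text"}
        return urllib.request.Request(
            f"{host}/translate",
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )

    def translate(text, **kwargs):
        """Send the request and return the translated text (needs a running instance)."""
        req = build_translate_request(text, **kwargs)
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["translatedText"]

    # Only the request is built here, so no server is needed for this example:
    req = build_translate_request("Bonjour le monde", source="fr", target="en")
    print(req.full_url)  # http://localhost:5000/translate
    ```

    A Lemmy frontend button would do essentially the same call from the browser, sending the post body as `q` and swapping in the translated text from the response.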