Is it possible to self-host some of the generators (for security purposes)? I have a computer quite capable of running many of the models, and indeed have a number available already. I have Python available and can install any add-ons needed – I just don't know how to utilize the plugins etc. that are available here.

  • perchance@lemmy.worldM

    It’s definitely possible, since all the code for generators on Perchance is openly available and downloadable, but there’s unfortunately no “one-click” way to do this right now, and it still requires a bit of coding knowledge.

    I think I wrote a comment related to this a few months back - basically you’d need to use something like ollama or llama.cpp or tabbyAPI or Aphrodite or vLLM or TGI (…etc.) to run the AI text gen model (and ComfyUI or Forge WebUI for image gen). Unfortunately, even a top-of-the-line gaming GPU like a 4090 doesn’t have enough VRAM to run 70B text gen models fully, so it may be slow. Then you’d need to swap out some code in perchance.org/ai-text-plugin and perchance.org/text-to-image-plugin so that it references your localhost API instead of Perchance’s server. You’d just fork the plugins, make the changes, then swap out the imports of the AI plugin for your new copies in the generators you want to self-host.
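
    For a rough idea of what that swap could look like, here’s a minimal sketch, assuming your local backend (ollama, llama.cpp’s server, vLLM, etc.) exposes an OpenAI-compatible chat endpoint on localhost, and that you’ve found the spot in your forked ai-text-plugin that currently calls Perchance’s server. The `generateLocally` name, the endpoint URL, and the model name are placeholders for illustration, not part of the real plugin:

    ```typescript
    // Hypothetical replacement for the network call inside a forked ai-text-plugin.
    // Assumes a local backend (e.g. `ollama serve` or llama.cpp's server) exposing an
    // OpenAI-compatible chat endpoint -- adjust the URL and model name for your setup.
    const LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"; // ollama's default port
    const MODEL = "llama3.1:8b"; // whatever model you've pulled locally

    async function generateLocally(prompt: string): Promise<string> {
      const response = await fetch(LOCAL_ENDPOINT, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: MODEL,
          messages: [{ role: "user", content: prompt }],
          max_tokens: 512,
        }),
      });
      if (!response.ok) throw new Error(`Local backend error: ${response.status}`);
      const data = await response.json();
      return data.choices[0].message.content; // OpenAI-style response shape
    }
    ```

    One caveat: since the generator runs in the browser, your local backend has to allow cross-origin requests from the page’s origin (most of these backends have a setting for allowed origins), otherwise the request will be blocked.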

    Someone in the community with some coding experience could do the work to make this easier for non-coders, and hopefully they’ll share it in this forum if they do. I’ll likely get around to implementing something eventually, but probably won’t have time in the near future.

  • wthit56@lemmy.world

    You cannot run the Perchance AI locally, because it all runs on the server. You can download Perchance generators, but the AI aspects of those generators won’t work when run locally.

    So you’d need to make your own thing to interface with whatever AI models you’ve got on your machine.
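
    As a rough sketch of what that “own thing” could be (assuming an ollama instance running locally; the endpoint and response shape below are ollama’s native API, and the model name is just an example):

    ```typescript
    // A minimal standalone script: send a prompt to a locally running ollama
    // instance and print the completion. No Perchance involvement at all.
    // Assumes `ollama serve` is running and the model has already been pulled.
    const OLLAMA_URL = "http://localhost:11434/api/generate"; // ollama's native endpoint

    async function ask(prompt: string): Promise<string> {
      const res = await fetch(OLLAMA_URL, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ model: "llama3.1:8b", prompt, stream: false }),
      });
      const data = await res.json();
      return data.response; // with stream:false, ollama returns the full completion here
    }

    ask("Write a one-sentence fantasy character description.").then(console.log);
    ```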