F.A.P.S (Funscript - Audio Processing System) by CaptainHarlock

Aw shoot, just started downloading 0.7 and I hit the transfer quota… Any chance you could upload somewhere besides Mega, maybe pixeldrain?

Here is some information that I hope will make the process easier for you.
I'm not doing the port myself because I have no way of testing it.

What’s needed to port FAPS to AMD GPUs

Current status

AMD now officially supports PyTorch on Windows via ROCm (no WSL needed). Current version: ROCm 7.2 with PyTorch 2.9.1 and Python 3.12.

Supported GPUs

  • Radeon RX 9000 (RDNA 4)
  • Radeon RX 7000 (RDNA 3)
  • Ryzen AI 300 (Strix) and Ryzen AI MAX (Strix Halo)
  • NOT supported: RDNA 2 or older (RX 6000, RX 5000…)

Required driver

AMD driver 26.1.1 or newer.


  1. PyTorch ROCm (replaces CUDA)
  • Replace the torch, torchvision, torchaudio wheels (currently CUDA 12.8) with ROCm versions from AMD’s repo
  • Install command: pip install torch torchvision torchaudio --index-url https://repo.radeon.com/rocm/windows/rocm-rel-7.2/
  • No code changes needed: torch.cuda.is_available() returns True with ROCm too (the AMD GPU shows up as a “CUDA” device)
  2. Demucs / audio-separator
  • Works with ROCm without code changes — it only needs the underlying PyTorch to be ROCm-enabled
  • There are already Demucs GUI builds specifically for ROCm
  3. BS-Roformer / MDX23C (stem separation)
  • Pure PyTorch models → they work if PyTorch works with ROCm
  • No custom CUDA kernels, only standard torch operations
  4. BeatNet / All-in-One
  • Also standard PyTorch → should work out of the box
  5. Non-GPU libraries (no changes needed)
  • librosa, scipy, numpy, matplotlib, Pillow, customtkinter → don’t use the GPU, work the same
  6. What can’t be ported easily
  • If any model uses compiled CUDA extensions (.cu files, custom kernels), they’d need to be recompiled for ROCm with HIPify. FAPS doesn’t use any of these.
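As a quick sanity check after swapping the wheels, a small script like this (illustrative only, assuming a ROCm build of PyTorch is installed) can confirm the GPU is visible through the usual CUDA-named API:

```python
# Sanity check, assuming a ROCm build of PyTorch: with ROCm the AMD GPU is
# exposed through the same torch.cuda API, so no app code has to change.
def gpu_status():
    """Return (available, description) without hard-failing if torch is absent."""
    try:
        import torch
    except ImportError:
        return (False, "torch not installed")
    if torch.cuda.is_available():  # True on ROCm builds as well
        return (True, torch.cuda.get_device_name(0))
    return (False, "no GPU backend detected")

print(gpu_status())
```

On a working ROCm setup the second element should be the Radeon card's name, even though the API is called `torch.cuda`.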

Practical plan for an AMD edition

  1. Create a new wheels folder with ROCm packages instead of CUDA
  2. New install.bat that installs from those ROCm wheels
  3. Same Python code — no app changes needed
  4. Test on a machine with Radeon RX 7000/9000
  5. Important: only RDNA 3 and RDNA 4 are supported. Users with RX 6000 or older are out of luck
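Steps 1–2 above amount to pointing pip at a different wheel index. A tiny helper along these lines (hypothetical; the ROCm URL is the one quoted above, the CUDA one is PyTorch's standard cu128 index) could generate the right command for each edition's install.bat:

```python
# Hypothetical helper for generating install commands per backend.
# The ROCm URL is AMD's repo as quoted in this thread; the CUDA URL is
# PyTorch's standard index for CUDA 12.8 builds.
INDEX_URLS = {
    "cuda": "https://download.pytorch.org/whl/cu128",
    "rocm": "https://repo.radeon.com/rocm/windows/rocm-rel-7.2/",
}

def pip_install_command(backend):
    """Build the torch/torchvision/torchaudio install command for a backend."""
    if backend not in INDEX_URLS:
        raise ValueError(f"unknown backend: {backend!r}")
    return (
        "pip install torch torchvision torchaudio "
        f"--index-url {INDEX_URLS[backend]}"
    )

print(pip_install_command("rocm"))
```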

In short: the port is mainly about swapping the PyTorch wheels from CUDA to ROCm. The app code should work without modifications.


Yep, hold on, I’ll try to upload it to GoFile…


15-20 minutes to upload it to GoFile

Please remember to unzip the file using the latest version of WinRAR:

And of course, keep in mind that to use the application, you need an Nvidia GPU, which has been proven to work from the 10xx series to the 50xx series.

The issue is a lot more complex: NATTEN and ALLINONEFIX have no ROCm ports or equivalents.


Well, wait, it could be even trickier: you specifically need NATTEN 0.17.5, I think.
These dependencies were definitely the ones that gave me the most headaches, lol.
I should also mention that having no official builds is one thing, but having no builds at all is another. Check Hugging Face, as there are people who compile these dependencies themselves, and you might get lucky.
I used some unofficial builds, apart from having to compile others myself, which is something you could also try.

I might have found a workaround: making the API calls straight from PyTorch.
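For reference, the core operation NATTEN accelerates is neighborhood (sliding-window) attention. A naive fallback written against standard tensor ops would look roughly like this NumPy sketch (illustrative only, not NATTEN's API, and far slower than its fused kernels):

```python
# Rough pure-NumPy sketch of 1-D neighborhood attention, the operation NATTEN
# provides fused kernels for. A pure-PyTorch fallback would follow the same
# sliding-window logic with torch ops. All names here are illustrative.
import numpy as np

def neighborhood_attention_1d(q, k, v, radius=1):
    """q, k, v: arrays of shape (seq_len, dim). Each query attends only to
    keys within `radius` positions, instead of the full sequence."""
    seq_len, dim = q.shape
    out = np.empty_like(v)
    for i in range(seq_len):
        lo, hi = max(0, i - radius), min(seq_len, i + radius + 1)
        scores = q[i] @ k[lo:hi].T / np.sqrt(dim)  # scores over the window
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()                   # softmax over the window
        out[i] = weights @ v[lo:hi]                # weighted sum of values
    return out
```

Whether this is fast enough to replace NATTEN in practice is another question, but it shows the operation itself needs nothing CUDA-specific.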

Check this, it might help: Backends - NATTEN

I'm 90% sure I've made the app work with ROCm.


Here is the new version uploaded to GoFile for those who cannot download from MEGA…

F.A.P.S v0.8:


That sounds good!
To make sure, you can do this:

  1. Open a video
  2. Select a “basic” mood, such as “Normal,” “Relaxed,” “Slow,” “Simple,” “Crazy,” or “Crazy+Voice”
  3. Run “AI Analysis,” and here comes the important part: this will use the basic and crucial AI dependencies and models of the application: shazamio, all-in-one (with demucs and natten), etc.
  4. If step 3 finishes without errors, you should be able to generate funscripts.

Then you can try more advanced moods such as “SixthSense” and “Pulse,” which use BS Roformer (6 stems) + DrumSep (5 drum stems). Perhaps “AI Analysis” won’t work for you here; I remember having to make several adjustments and tests to get these models to work.

Hey, I found a dependency issue in the latest build. The app throws a matplotlib error because the embedded environment is missing Pillow (PIL).

λ run.bat
2026-03-07 19:15:41,741 [INFO] F.A.P.S (Funscript - Audio Processing System) by CaptainHarlock...
2026-03-07 19:15:42,241 [ERROR] Error: matplotlib
2026-03-07 19:15:42,241 [ERROR] pip install matplotlib
2026-03-07 19:15:42,242 [ERROR] Error crítico: Missing basic dependencies

A quick workaround for anyone else having this issue is to install it manually through the embedded Python:

.\python\python.exe -m pip install Pillow
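The error above comes from the app's startup dependency check. A minimal preflight in the same spirit (names illustrative, not the app's actual code) can list which modules an embedded Python is missing before launch:

```python
# Illustrative preflight check, mirroring the kind of startup dependency test
# that produced the "Missing basic dependencies" error above.
import importlib.util

REQUIRED = ["matplotlib", "PIL", "numpy", "scipy", "librosa"]

def missing_deps(names=REQUIRED):
    """Return the subset of module names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

print(missing_deps())
```

Note that Pillow installs as the `PIL` package, which is why a missing Pillow surfaces as a matplotlib error rather than under its own name.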

Thanks for reporting it!

Yep, another user mentioned it to me via DM, but I wasn’t sure if it was just a problem on their end.
I’ll now check the code to fix that error.
But I’ll wait to upload the new v0.8.1 in case any other errors appear. If you detect any problems when using the app, I’d appreciate it if you could report them. Thank you!

I'm going to take a couple of days to debug, but so far I have run SixthSense and FapMixer, and done multiple AI analyses, and it seems to be working.

Worked! If someone doesn't know how to do this: go to the FAPS folder, type cmd in the file browser's address bar, and paste this: .\python\python.exe -m pip install Pillow

But the AI analysis does this with Pulse:

SoonTM


Added a HotFix with the two folders that need to be copied to avoid this error in basic dependencies:

Great job! :star_struck: :sign_of_the_horns:
Just to let you know, the versions used by the application are:
allin1fix
natten 0.17.5
I mention this because if they are not exactly the same, there may be some problems.

D’oh!
The idea was to provide everything ready to go (embedded Python) and avoid the need for "install.bat" and, especially, for the \wheels folder, to reduce size… But from what I'm seeing, the safest approach is still to include the \wheels folder and keep "install.bat" just in case.
I’m going to upload a new version, v0.8.1. Give me a few minutes to get it ready. This one will have everything and shouldn’t fail.


sounds great
