To those of you who asked to be beta testers, I have good news and “bad” news, but more good than bad, lol
The good news:
- Thanks to the first beta testers, who helped me identify and fix installation issues (my biggest concern, since I hadn’t been able to test it on other computers), the new installation system now works better, is simpler, and is compatible with modern GPUs.
That means I think I could now upload a beta version for all of you, not privately but in this thread, so that together we can finish ironing out any problems it may still have.
The bad news:
- Due to the architecture change (from PyTorch 2.4 + CUDA 12.4 to PyTorch 2.7 + CUDA 12.8) that was needed to support modern GPUs such as the 40xx and 50xx series, it may now cause problems or not work at all on older GPUs.
Right now:
Nvidia 10xx series > Works (verified). (Initially reported as not compatible, but a beta tester has since confirmed it works.)
Nvidia 20xx series > Probably works but hasn’t been tested.
Nvidia 30xx series > Works (verified)
Nvidia 40xx series > Works (verified)
Nvidia 50xx series > Works (verified).
Also, note that to take full advantage of all features, it’s recommended to have a GPU with >8GB of VRAM.
The application has fallbacks that use librosa on the CPU, but right now I’ve focused on making sure everything works well on GPU, so I can’t guarantee the fallbacks work well at this time. In summary: the application is currently designed for modern Nvidia GPUs with 8GB of VRAM or more, using the most advanced AI models (SOTA, State Of The Art).
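For anyone curious how that GPU-or-CPU decision typically works, here’s a minimal sketch. This is my own illustration, not the app’s actual code: the function name and the 8GB VRAM check are assumptions based on the recommendation above.

```python
def pick_device():
    """Illustrative sketch: return 'cuda' if a usable NVIDIA GPU with
    enough VRAM is present, otherwise fall back to 'cpu'."""
    try:
        import torch  # if PyTorch isn't installed, only the CPU path remains
    except ImportError:
        return "cpu"
    if torch.cuda.is_available():
        # Hypothetical check mirroring the ">8GB of VRAM" recommendation
        props = torch.cuda.get_device_properties(0)
        if props.total_memory >= 8 * 1024**3:
            return "cuda"
    return "cpu"

print(pick_device())
```

On a machine without PyTorch or without a suitable GPU, this simply returns "cpu", which is the kind of fallback path the post describes as untested for now.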
I have a very old version that works entirely with librosa+CPU, but if I want to go further and make it better, using GPU+AI is essential.
Right now, I’m polishing up a few last things, but I’ll be able to publish it here soon.




