[IDEA] Realtime funscripts based on an on-screen beat meter

Heyo dear community :slightly_smiling_face:
I had a shower thought and I'm unsure whether what I'm imagining is possible, or even already exists. Hear me out:
I often use Intiface, MultiFunPlayer and RandomVideoPlayer together to play videos with funscripts on Lovense vibration toys or a stroker. This means every video in my library needs a funscript that someone lovely handcrafted.
Now, I also enjoy Cock Hero videos. If you're unfamiliar: they use a beat meter at the bottom of the screen to tell you how fast and in which pattern you should stroke.
My idea: use some kind of on-screen pixel-analysis tool, point it at the middle of the beat bar, detect each beat as it passes, and convert this into a "realtime funscript"/live instructions for Intiface.

I currently don’t have enough free time to start coding this myself, and I’m also not an expert in any of the skills required. Maybe someone else finds this interesting enough to invest some time themselves.

I once posted this in a Discord server and a user made a small diagram of how he would realise this idea:

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  Video Player   β”‚
β”‚  (with beat     β”‚
β”‚   meter bar)    β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜
         β”‚
         β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  Python Tool    β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚   mss     β”‚  β”‚ Screenshot capture using python mss library
β”‚  β””β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜  β”‚
β”‚        β”‚        β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚  Pillow   β”‚  β”‚ Pixel analysis using python pillow library
β”‚  β””β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜  β”‚
β”‚        β”‚        β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚   Beat    β”‚  β”‚ Beat detection
β”‚  β”‚ Detection β”‚  β”‚
β”‚  β””β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜  β”‚
β”‚        β”‚        β”‚
β”‚  β”Œβ”€β”€β”€β”€β”€β–Όβ”€β”€β”€β”€β”€β”  β”‚
β”‚  β”‚websockets β”‚  β”‚ WebSocket client message
β”‚  β””β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”˜  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”Όβ”€β”€β”€β”€β”€β”€β”€β”€β”˜
         β”‚
         β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚   Intiface      β”‚
β”‚  (WebSocket     β”‚
β”‚   Server)       β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜
         β”‚
         β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  Lovense/Other  β”‚
β”‚     Device      β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

So you have a Python process that captures the screen with mss, analyses the screenshot, and sends WebSocket messages to Intiface. It should all be fast enough to feel real-time.
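To make the "beat detection" box a bit more concrete, here's a rough sketch of the core logic, assuming the beat markers are noticeably brighter than the bar background at the sample point. The `BeatDetector` class, the threshold value, and the sample region are all made up for illustration; the mss/Pillow glue is only sketched in comments:

```python
class BeatDetector:
    """Rising-edge detector: fires once each time the sampled
    brightness crosses the threshold from below."""

    def __init__(self, threshold=180):
        self.threshold = threshold  # 0-255 luminance cutoff (a guess; tune per video)
        self.above = False          # were we above the threshold on the last sample?

    def feed(self, brightness):
        """Return True exactly once per beat (on the rising edge)."""
        is_above = brightness >= self.threshold
        beat = is_above and not self.above
        self.above = is_above
        return beat


# In the real tool you'd feed this from mss + Pillow, roughly like:
#   with mss.mss() as sct:
#       shot = sct.grab({"left": x, "top": y, "width": 8, "height": 8})
#       img = Image.frombytes("RGB", shot.size, shot.rgb).convert("L")
#       brightness = sum(img.getdata()) / (8 * 8)
#       if detector.feed(brightness):
#           ...  # send a beat message to Intiface

# Synthetic demo: a bright beat marker sweeping past the sample point twice
detector = BeatDetector(threshold=180)
samples = [20, 30, 200, 220, 210, 40, 25, 190, 230, 50]
beats = [i for i, b in enumerate(samples) if detector.feed(b)]
print(beats)  # -> [2, 7]: one event per beat, not one per bright frame
```

The rising-edge check matters because a marker stays bright for several frames; without it you'd fire dozens of "beats" per marker.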

I imagine using some kind of base funscript pattern and changing its amplitude for each beat (one up-down stroke = 0% - 100%). But I also have no clue how I would do that.
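One way that amplitude idea could look in code: measure the interval between detected beats and turn each beat into one full down-up stroke, scaling the amplitude down when beats come too fast for the device to physically keep up. The `beat_to_stroke` function, the speed cutoffs, and the (position, duration) output shape are all assumptions for the sketch, not anything Intiface actually prescribes:

```python
def beat_to_stroke(interval_s, min_amp=0.3, max_amp=1.0):
    """Map the time since the last beat to one stroke.

    Slow beats (>= 1.0 s apart) get a full 0-100% stroke;
    very fast beats (<= 0.25 s apart) get shortened strokes.
    Both cutoffs are arbitrary guesses to be tuned.
    Returns a list of (position, duration_ms) moves.
    """
    slow, fast = 1.0, 0.25
    t = max(fast, min(slow, interval_s))
    frac = (t - fast) / (slow - fast)        # 0.0 at fast .. 1.0 at slow
    amp = min_amp + frac * (max_amp - min_amp)
    # One beat = one up-down pair: move to `amp`, then back to 0,
    # each move taking half the beat interval.
    half_ms = int(interval_s * 1000 / 2)
    return [(amp, half_ms), (0.0, half_ms)]


print(beat_to_stroke(1.0))    # slow beat: full-amplitude stroke
print(beat_to_stroke(0.25))   # fast beat: shortened stroke, quicker moves
```

Each (position, duration) pair would then be translated into whatever linear/vibrate command the Intiface WebSocket API expects for the connected device.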

Any help appreciated :heart: