Hey everyone!
This guide breaks down what all the buttons and sliders in the PythonDancer interface actually do.
Just a heads-up: I used an AI (Gemini) to help figure this stuff out based on the program's GitHub notes, so don't take it as gospel. Think of it as a starter guide; I used it myself to learn the basics, so it should give you a solid idea of how everything works!
1. Manual Analysis Controls
These two controls define the baseline for the script generation. They allow for precise manual tuning but are adjusted automatically when the Automap feature is active.
Pitch Offset
Function: Sets the Tone Baseline. It defines the neutral or starting position for pitch-based movements, and sets the threshold for determining whether the audio tone is "high" or "low" relative to this baseline.
Energy Magnitude
Function: Sets the Intensity Threshold. This defines the level of audio energy (volume) that the music must reach to generate the maximum/strongest movement in the script.
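In rough pseudocode, the relationship between these two baselines and the generated motion might look like this. Note: the function name, parameters, and the scaling formula are my own assumptions for illustration, not PythonDancer's actual code:

```python
def frame_to_action(pitch, energy, pitch_offset, energy_magnitude):
    """Hypothetical sketch: classify one audio frame against the two baselines.

    pitch_offset     -- the neutral pitch; frames above it count as "high" tone
    energy_magnitude -- the energy level that maps to the strongest movement
    """
    tone = "high" if pitch > pitch_offset else "low"
    # Scale energy against the magnitude threshold, capping at full intensity.
    intensity = min(energy / energy_magnitude, 1.0)
    return tone, intensity

# A frame above the pitch baseline, at half the threshold energy:
print(frame_to_action(pitch=220.0, energy=0.3,
                      pitch_offset=200.0, energy_magnitude=0.6))
# ('high', 0.5)
```

The takeaway: lowering Energy Magnitude makes quieter passages hit full intensity sooner, while raising Pitch Offset makes fewer frames count as "high" tone.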
2. Automap Settings (The Easiest Way to Calibrate)
If the Automap checkbox is selected, these settings automatically calibrate the manual controls (Pitch Offset and Energy Magnitude) based on simple targets.
Target Pitch
Function: Reference Tone Threshold. This value automatically adjusts the Pitch Offset control. Set this to the average pitch line observed in the audio graph to establish the script's core tonal reference.
Target Speed
Function: The Maximum Speed Limit (Hardware Limit). This is the highest velocity value (e.g., 500) the program will write into the motion file (.funscript).
Purpose: Acts as a safety cap for your hardware. Set this to your deviceās maximum speed.
Target % (Active Movement Ratio)
Function: Controls the percentage of time you want your script to contain high action. This value automatically adjusts the Energy Magnitude control.
Developer Context: This feature was introduced in the "New automode" update to control the number of movements generated that are "above the set percentage."
Logic: When you set Target % (e.g., 75), the program finds the lowest energy level necessary to ensure that 75% of the song is considered "active." It uses the inverse percentile (the 25th percentile) to set the threshold, ensuring the resulting script meets the desired action ratio.
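The inverse-percentile trick is easy to demonstrate with NumPy. This is my own illustration of the logic described above, not the program's actual code:

```python
import numpy as np

def energy_threshold(energy, target_percent):
    """Find the energy level such that `target_percent` of frames sit above
    it, i.e. take the inverse percentile (100 - target_percent)."""
    return np.percentile(energy, 100 - target_percent)

rng = np.random.default_rng(0)
energy = rng.random(1000)                 # stand-in for per-frame audio energy
thr = energy_threshold(energy, 75)        # aim for 75% "active" time
print(round((energy > thr).mean(), 2))    # 0.75
```

So a Target % of 75 resolves to the 25th percentile of the energy curve: everything above that threshold counts as active movement.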
3. Out of Range Options
These options determine how the program handles calculated motion values that exceed the Target Speed maximum.
Crop
Effect: Capping. The excessive value is simply clamped to the Target Speed (e.g., 500). Use for a smooth and safe script.
Bounce
Effect: Reflection. The excessive value is treated as a reflection, causing the movement to "bounce" back into the acceptable range. Use for an intense, dynamic script.
Fold
Effect: Compression/Wrapping. The excessive value is mathematically "folded" (wrapped) back into the acceptable range. Use for compact, controlled action.
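To make the three modes concrete, here is a minimal sketch of how out-of-range handling could work. The exact math PythonDancer uses may differ, especially for fold; this is just an illustration of the three strategies:

```python
def limit_speed(value, limit, mode="crop"):
    """Hypothetical sketch of the three out-of-range modes.

    crop   -- clamp at the limit
    bounce -- reflect the excess back below the limit
    fold   -- wrap the excess back into the [0, limit] range
    """
    if value <= limit:
        return value
    excess = value - limit
    if mode == "crop":
        return limit
    if mode == "bounce":
        return limit - excess
    if mode == "fold":
        return excess % limit
    raise ValueError(f"unknown mode: {mode}")

# A calculated speed of 650 against a Target Speed of 500:
print([limit_speed(650, 500, m) for m in ("crop", "bounce", "fold")])
# [500, 350, 150]
```

In other words, crop flattens peaks, bounce mirrors them downward, and fold restarts them from the bottom of the range.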
4. Misc Options (Miscellaneous Tools)
Automap
Function: Automation Toggle. Activates and deactivates the Automap Settings section.
PLP estimation
Function: Rhythm Correction. Uses the PLP (Predominant Local Pulse) algorithm to stabilize beat detection.
Developer Context: This tool was added to prevent beat "drift" over long tracks. Highly recommended for longer audio/video files.
Heatmap
Function: Visual Output. When checked, this allows the user to export a visual intensity map of the generated script for review.
Hey, this program has been working well for me so far. I'm wondering if it's possible to port the code as a browser plugin so that it can generate a funscript for whatever video/music the website is currently playing, and possibly sync it up live. If anyone knows of an existing program that does this or has any idea of how to do so, I would love to hear your suggestions.
A real-time version would be a completely different approach unfortunately. Currently the program extracts all the audio and looks at the entire thing to generate a script.
feelme.com does video-based toy control in real time via a browser plugin. I know there's at least one other site doing the same, but I forget the name... the word "sam" comes to mind though.
Interesting, have you personally used the website you mentioned, and if so, how long does it take to generate a funscript for a standard 5-10 minute video? Also, how accurate is it for different videos, including ones where scenes constantly change, like PMVs or HMVs?
Also, yeah, it would be tricky to generate an audio-based script in real time as the video plays, but I was thinking more of a plugin with the ability to download the video, run the funscript generation, and then sync the generated script to the video, all in the browser. There are several programs available now that can do the last part of fetching and syncing funscripts to videos, so it's just a matter of automating the funscript generation.
To clarify - feelme.com offers a browser plugin that analyzes the video as it plays and moves the toy accordingly. It doesn't generate a funscript at all. I have used it, but I usually watch my porn in VR, so it hasn't been as useful as I'd hoped.