I think the dependency on video framerate should be overridable (ideally even at runtime). In general, the higher the framerate, the better the scripting accuracy becomes, and if the video is 25 fps this is very limiting.
Sure, I can get around this by converting the video to a different framerate (for scripting, the actual quality of the video doesn't matter, so any losses here are acceptable) and then using that video for scripting. But this is just an overkill workaround.
Even just basic integer ratios of the framerate would do wonders here. So if a video is 25 fps, it could for example allow 25, 50, 75, and 100 as scripting framerates, while for a 30 fps video this would be 30, 60, 90, and 120.
For slower videos the accuracy might not matter, but for PMV videos it will, as vibration patterns can use these FPS varieties a lot. Even then, I had 25 fps videos where a full stroke only took 6.5 frames and the speeds were reaching device limits. Being able to use that 6.5 frames (by going to 50 fps and using 13 frames instead) would have made limiting speeds a lot easier.
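The ratio idea above can be sketched like this (a hypothetical illustration, not OFS code; the function names are made up):

```python
import math

def allowed_scripting_framerates(video_fps, max_multiple=4):
    """Integer multiples of the video framerate, e.g. 25 -> [25, 50, 75, 100]."""
    return [video_fps * n for n in range(1, max_multiple + 1)]

def snap_to_frame(time_ms, scripting_fps):
    """Snap a keyframe time to the nearest frame boundary of the scripting grid."""
    frame_ms = 1000.0 / scripting_fps
    return math.floor(time_ms / frame_ms + 0.5) * frame_ms

print(allowed_scripting_framerates(25))  # [25, 50, 75, 100]

# The 6.5-frame stroke at 25 fps is 260 ms. On the 25 fps grid it gets
# moved to a neighboring frame, but on the 50 fps grid it lands exactly
# on frame 13:
print(snap_to_frame(260.0, 25))  # 280.0
print(snap_to_frame(260.0, 50))  # 260.0
```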
Indeed, I have come across this “thing” more than once: with low framerate videos it becomes hard to perfectly time the keyframes. What is weird is that the grid won’t let you move a keyframe exactly where you want, but if you script it “on the fly” you can insert the keyframe exactly at the right spot. However, on-the-fly scripting is usually pretty bad, so yeah, having a way to increase the scripting framerate would be great.
I have no File/Open Existing option in OFS 2.0. Was this feature removed? I see in the tutorial for 1.1.5 that it’s there. EDIT: I found it.
Guys, I can’t import a script created in OFS into another application.
If you look inside the file, everything seems fine, but it gives an error on import.
VRHush_From_The_Vault_Dani_Daniels.pitch.funscript (599.3 KB)
You can’t import it by itself. It requires a video.
In this case it would look for a video named
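As a rough sketch of that pairing logic (an assumption about how players match files, not code from any particular player): multi-axis funscripts are conventionally named `<video-stem>.<axis>.funscript`, so a player strips the axis suffix and looks for a video with the remaining stem.

```python
from pathlib import Path

# Hypothetical illustration: the axis-suffix set and matching rule are
# assumptions; check your player's documentation for its actual behavior.
AXIS_SUFFIXES = {"pitch", "roll", "twist", "sway", "surge", "suck"}

def matching_video_stem(funscript_name):
    """Return the video filename stem a player would look for."""
    stem = Path(funscript_name).stem          # drops ".funscript"
    base, dot, axis = stem.rpartition(".")
    return base if dot and axis in AXIS_SUFFIXES else stem

print(matching_video_stem("VRHush_From_The_Vault_Dani_Daniels.pitch.funscript"))
# VRHush_From_The_Vault_Dani_Daniels
```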
Thank you - it worked.
But how do you interpret pitch and roll, and where are the yaw and X/Y movements?
It’s all relative to the device that has to play back the script.
Multi-axis players translate funscripts into TCode and it is specified like this.
I think this is how the channels are mapped.
L3: suck, valve
TCode is also relative to the device which interprets it.
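To make the translation concrete, here is a minimal sketch of turning funscript actions into TCode-style commands. The channel assignment (R2 for pitch here) and the exact command dialect are assumptions; the real mapping depends on your player and device, as noted above.

```python
def actions_to_tcode(actions, channel="R2"):
    """Convert funscript actions into timed TCode-style commands.

    Each funscript action is {"at": <ms>, "pos": <0-100>}. The pos value is
    scaled to a 4-digit magnitude (0000-9999), and the I suffix is the move
    duration in milliseconds since the previous action.
    """
    commands, prev_at = [], 0
    for action in actions:
        magnitude = round(action["pos"] * 99.99)   # 0-100 -> 0-9999
        interval = action["at"] - prev_at
        commands.append(f"{channel}{magnitude:04d}I{interval}")
        prev_at = action["at"]
    return commands

# Two actions: full up at 500 ms, full down at 1000 ms.
script = [{"at": 500, "pos": 100}, {"at": 1000, "pos": 0}]
print(actions_to_tcode(script))  # ['R29999I500', 'R20000I500']
```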
Do I understand correctly that there is a multi-axis device, and your editor can create such scripts, but no one knows exactly how it works? Okay, we’ll figure it out. Thank you very much!