I’ve been working on the recording side of Ayva Remote (capturing motion data in order to export it as a Funscript). I decided to stroke along with some dubstep and synchronize the result with a YouTube video of the song.
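For context, a Funscript is just JSON with an `actions` array of `{ at, pos }` entries (milliseconds and a 0–100 position). Here’s a minimal sketch of what the export step might look like; the `RecordedSample` type and `toFunscript` helper are illustrative, not the actual Ayva Remote code:

```ts
// Hypothetical shape of one recorded motion sample.
interface RecordedSample {
  time: number;     // absolute timestamp in milliseconds, captured at the source
  position: number; // normalized stroke position, 0..1
}

// Convert recorded samples into Funscript JSON.
function toFunscript(samples: RecordedSample[]): string {
  const start = samples.length ? samples[0].time : 0;
  const actions = samples.map((s) => ({
    at: Math.round(s.time - start),    // milliseconds from the start of the recording
    pos: Math.round(s.position * 100), // Funscript positions are 0-100
  }));
  return JSON.stringify({ version: '1.0', inverted: false, range: 100, actions });
}
```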
This is somewhat for the lawlz, but it’s a good demo of the nuance available with a touch screen, and more importantly it helped me test out some changes I made to increase recording accuracy.
One challenge that was really fun to work through was getting timing to be accurate when recording movement data over a network. My first approach was to capture the timestamps of motion packets when they arrived at the destination. This turned out to be a no-go: because of the small amounts of variable latency that naturally occur when sending data over the internet, things get irreversibly out of sync real fast. So I changed the format of my packets to instead include the absolute timestamp of each and every movement at the source. That way, when a movement is saved at the destination, it carries the actual timing of the movement as it was originally made, effectively removing the network latency from the recording.
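To make the difference concrete, here’s a minimal sketch of the two approaches, assuming a simple JSON packet shape (the names and types here are hypothetical, not Ayva Remote’s actual wire format):

```ts
// Hypothetical motion packet: the source embeds its own absolute timestamp.
interface MotionPacket {
  axis: 'stroke' | 'twist' | 'pitch' | 'roll';
  position: number;  // normalized 0..1
  timestamp: number; // absolute time at the SOURCE, e.g. performance.timeOrigin + performance.now()
}

class Recorder {
  private samples: MotionPacket[] = [];

  // First approach (the no-go): timestamp on arrival at the destination.
  // Variable network latency gets baked into the script, so it drifts out of sync.
  recordWithArrivalTime(packet: Omit<MotionPacket, 'timestamp'>): void {
    this.samples.push({ ...packet, timestamp: Date.now() }); // jitter leaks in here
  }

  // Second approach: trust the timestamp the source embedded in the packet,
  // so the saved script reflects when the motion actually happened.
  recordWithSourceTime(packet: MotionPacket): void {
    this.samples.push(packet);
  }
}
```

The tradeoff is that the source timestamps require the sender to include a clock reading in every packet, but they make the recording immune to jitter: late-arriving packets still land at the right spot on the timeline.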
How well this playback stays in time with the song shows that it works: accurately recording a script in real time from motion generated remotely is more than feasible.
I go through all the modes (twist, pitch, and roll) too, and definitely max out the speed. So it’s a good demo of what someone can do with just their finger.
By the way, I’ve been getting better at this and coming up with techniques to stroke faster on a touchscreen without too much friction.
But I’ll save sharing an actual video of my hand movements for Part 2.
(Also, is there any interest in a small app for syncing Funscript with YouTube videos!?)