A Question or Two on Kinetics, Fractals and the Future

…excuse me for engaging in a conversation that may be rather spurious and have little benefit to the community, but i’m curious. i have virtually no scripting experience (brief foray, somewhat confusing, steep learning curve…) so apologies if you have to ELI5.

I’m wondering if there is any kind of kinetic input for scripting? i understand that you use real-time motion tracking, obviously, but i’m more curious whether you can hold a… thing… in your hand and manipulate it to produce the script. i’ve seen references to gamepads being used; is that for kinetics, or more for something else? Isn’t there a program for the SR6 that allows direct multi-axis input in real time? If so, i’m thinking we need a single-axis version.

In my overly fertile imagination, i wonder if it might be possible to have a form of dildo (for want of a better image…) that has a bunch of sensors running up it, with a mobile ring around it that slides up and down, which you use to direct the action (and which i can now take great pleasure in referring to as the ‘eShaft’). Very basic, i know, but i theorize it could be advantageous straight away with (rough sketch after this list):

  • swiftly inputting the raw data
  • inputting the data in a way that is more aesthetic, haz more ‘feels’
  • making scripts for stuff that really doesn’t want to be scripted easily cos it lacks obvious motion or beats
  • more fun!
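
To make this concrete, here’s a minimal sketch of what the software side could look like, assuming a hypothetical read_ring_position() that reports where the ring sits on the shaft as 0–100 (the actual sensor strip, wiring and serial port are all hand-waved). The .funscript format itself is just JSON with a list of {at: milliseconds, pos: 0–100} actions, so recording is mostly a matter of sampling and throwing away redundant points:

```python
import json
import time


def read_ring_position():
    """Hypothetical sensor read: where the ring sits on the shaft, 0-100.
    On real hardware this would poll the sensor strip over serial/GPIO."""
    raise NotImplementedError


def record_funscript(duration_s=60, sample_hz=50, min_delta=3):
    """Sample the ring at a fixed rate and keep only meaningful movements."""
    actions = []
    last_pos = None
    start = time.monotonic()
    while (now := time.monotonic()) - start < duration_s:
        pos = int(read_ring_position())
        # Only store a point when the ring has actually moved a noticeable
        # amount, so the script isn't flooded with redundant samples.
        if last_pos is None or abs(pos - last_pos) >= min_delta:
            actions.append({"at": int((now - start) * 1000),
                            "pos": max(0, min(100, pos))})
            last_pos = pos
        time.sleep(1.0 / sample_hz)
    return {"version": "1.0", "actions": actions}


if __name__ == "__main__":
    with open("recorded.funscript", "w") as f:
        json.dump(record_funscript(duration_s=30), f)
```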

Even a very basic device would be useful: maybe link it to a Handy so that the eShaft could move independently when the scripter is shifting backwards and forwards through the frames, so it’s always in the right place to grab and make adjustments on the fly. Or one could go the whole hog and bring a product to market, complete with actuators, real-time response, its own supported software, etc. i just noted that @megamasha has made Cock Heroine, a program that allows you to fill in the beats in Cock Hero in real time. That’s similar to what i’m imagining software-wise; is the scripting version around the corner?
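
i haven’t looked at how Cock Heroine actually works under the hood, but the core of “fill in the beats in real time” is small enough to sketch: tap a key on every beat while the video plays, record the timestamps, then turn them into strokes. The Enter-key capture and the plain low/high alternation below are just stand-ins for a proper hotkey listener synced to the player and a smarter pattern:

```python
import json
import time


def capture_beats():
    """Press Enter on every beat while the video plays elsewhere;
    type q then Enter to finish. (Crude stand-in for a real hotkey
    listener synced to the player clock.)"""
    beats = []
    start = time.monotonic()
    print("Tap Enter on every beat, q + Enter to finish.")
    while input() != "q":
        beats.append(int((time.monotonic() - start) * 1000))
    return beats


def beats_to_actions(beats_ms, low=5, high=95):
    """Naive conversion: alternate between a low and a high position per beat."""
    return [{"at": t, "pos": high if i % 2 else low}
            for i, t in enumerate(beats_ms)]


if __name__ == "__main__":
    with open("beats.funscript", "w") as f:
        json.dump({"version": "1.0",
                   "actions": beats_to_actions(capture_beats())}, f)
```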

So yea, am i talking out my arse? Is it already kinda what you do? Is someone working furiously on this already!? Is it utterly pointless? …sorry, brain needs to know, brain demands knowledge, am but slave to brain…

Also, i’m fascinated to see @ncdxncdx’s FunscriptDancer. For a while i’ve been looking at the pictures and GIFs of the pitch/graph/thing one uses to visualize the scripts, and been struck by how similar to a fractal output it is. It seems to me that if one could set the basic parameters of the pitch using motion tracking or an eShaft, and then let fractal magic make each and every stroke subtly different from the last, in an organic way, it would not only save a lot of time but possibly impart a new level of complexity to the script. Maybe FunScript, OFS, etc. have this already and i’m late to that party, in which case i’d be curious to know how it’s implemented.
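
i have no idea whether OFS or FunscriptDancer do anything like this internally, but here’s one way the “fractal magic” could look: layer a few octaves of smoothed random noise (fBm-style) over a flat base pattern and use it to nudge each stroke’s depth and timing, so no two strokes come out identical. A rough sketch with made-up parameter values:

```python
import json
import random


def value_noise(n, octaves=4, persistence=0.5, seed=1):
    """Simple 1-D fractal-ish noise: sum of smoothed random layers at
    doubling frequencies, so the variation feels organic rather than jittery."""
    rng = random.Random(seed)
    out = [0.0] * n
    amp, total = 1.0, 0.0
    for o in range(octaves):
        step = 2 ** (octaves - o)                    # coarse layers first, finer later
        knots = [rng.uniform(-1, 1) for _ in range(n // step + 2)]
        for i in range(n):
            k, frac = divmod(i, step)
            t = frac / step
            out[i] += amp * (knots[k] * (1 - t) + knots[k + 1] * t)  # linear interp
        total += amp
        amp *= persistence
    return [v / total for v in out]                  # roughly in [-1, 1]


def vary_strokes(beats_ms, base_low=10, base_high=90, depth_var=25, time_var_ms=60):
    """Turn beat times into alternating strokes whose depth and timing are
    modulated by the noise, so every stroke is subtly different."""
    n = len(beats_ms)
    depth_noise = value_noise(n, seed=1)
    time_noise = value_noise(n, seed=2)
    actions = []
    for i, t in enumerate(beats_ms):
        base = base_high if i % 2 else base_low
        pos = base + depth_noise[i] * depth_var      # vary the stroke depth
        at = t + int(time_noise[i] * time_var_ms)    # land a touch early or late
        actions.append({"at": max(0, at), "pos": int(max(0, min(100, pos)))})
    actions.sort(key=lambda a: a["at"])              # timing jitter must not reorder points
    return actions


if __name__ == "__main__":
    beats = list(range(0, 30000, 500))               # a perfectly flat 120 BPM base pattern
    with open("organic.funscript", "w") as f:
        json.dump({"version": "1.0", "actions": vary_strokes(beats)}, f)
```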

And just a quick one to say how blown away i am by this community. This entire field is a beautiful plastic unstoppable train, Goddess knows where we’ll be in two years, let alone ten, and this site may well turn out to be one of the more pivotal influences on the entire field of… VTR, Virtual Tactile Reality? This is where the Metaverse truly starts, many thanks for being even just a tiny point on the periphery, i hope i haven’t wasted too much of your precious time…


I know you can already record with HandyControl.

I tried in the past with the phone’s accelerometers through Bluetooth, but it wasn’t precise enough.
Maybe with a dedicated Arduino/RPi and an inertial unit that could work better.
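
For what it’s worth, the imprecision is mostly baked into the physics: getting position out of an accelerometer means integrating twice, so even a tiny constant bias plus a bit of noise snowballs into metres of phantom travel within seconds. A quick simulation (made-up bias/noise numbers, not measurements from my phone) shows the scale of the problem, which is why a ring with a direct position sensor (potentiometer, linear encoder, hall sensors) would sidestep it entirely:

```python
import random


def simulate_drift(seconds=10, hz=100, bias=0.05, noise_sd=0.2):
    """Double-integrate readings from a *stationary* accelerometer that has a
    small constant bias and some noise, to show how fast position drifts."""
    dt = 1.0 / hz
    vel = pos = 0.0
    for _ in range(int(seconds * hz)):
        accel = bias + random.gauss(0, noise_sd)   # m/s^2, device isn't actually moving
        vel += accel * dt
        pos += vel * dt
    return pos


if __name__ == "__main__":
    for s in (1, 5, 10):
        print(f"after {s:2d} s: ~{simulate_drift(seconds=s):.2f} m of phantom travel")
```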

@a_human_bot

Nice link! In some way i’m further hobbled in this discussion as i don’t use my PC whilst engaged in VirtualTactility; i just use the Handy with a Quest 2, so i have yet to muck about with HandyControl.

So, boom, there you go: HandyControl has a script sequencer, and the ability to listen to a Pyro sensor in real time. Combine the two, and after that it’s a race to see who gets on Dragons’ Den or Shark Tank first. Oh boi, i would sure love to see that pitch…

But why’s it important? Because if there’s a simple UI and a decent procedural generator, then even a massive clart like me can take all their fav videos and turn them into fun times. It opens scripting up to far more peeps, which means far more scripts for far more obscure things (House of Gord, anyone…?). I’d buy into early adoption of that tech, no worries.

Thought: one could use a servo from an OSR and attach the end of the arm to a pole it can slide up and down on. That could be an easy eShaft hack for which i believe there is already feedback software written, and of course the servo can position the arm in the correct spot during playback.
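
If the OSR firmware is speaking T-Code over USB serial (which i believe is the usual setup, though double-check the channel name and value format against your own firmware), making the arm track the timeline during playback could be as simple as streaming the script’s positions down the serial port. A rough sketch, assuming the up/down axis is L0, positions run 0–9999, and pyserial is installed:

```python
import time

import serial  # pip install pyserial

PORT = "/dev/ttyUSB0"  # or "COM3" on Windows; adjust to your setup


def follow_playback(osr, actions, start_ms=0):
    """Drive the arm along a funscript's actions from start_ms onwards, so it is
    already in the right spot wherever the scripter scrubs to or plays from."""
    t0 = time.monotonic() * 1000 - start_ms          # map wall clock to script time
    for a in sorted(actions, key=lambda a: a["at"]):
        if a["at"] < start_ms:
            continue
        delay = a["at"] - (time.monotonic() * 1000 - t0)
        if delay > 0:
            time.sleep(delay / 1000)
        value = int(a["pos"] / 100 * 9999)           # funscript 0-100 -> T-Code 0-9999
        osr.write(f"L0{value:04d}I100\n".encode())   # move axis L0 there over ~100 ms


if __name__ == "__main__":
    with serial.Serial(PORT, 115200, timeout=1) as osr:
        demo = [{"at": i * 400, "pos": 100 if i % 2 else 0} for i in range(20)]
        follow_playback(osr, demo)
```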

And so on, there must be loads of ways to doo eet…
