I'm wondering if it's because I tried to run the program over wireless Steam VR rather than a hardwired connection. I could get the controller to SORT OF track for a second or two, but the video stuttered like crazy and then crashed after a few seconds.
PC
Ryzen 7 3800X 8-core processor
GeForce RTX 3070
and 16GB of 3600MHz DDR4 RAM
Quest 2
Was running wireless because the USB 3 ports on my mobo are fried for some reason
If I can get the program running, I'd be happy to provide feedback.
I am super interested in the multi-axis scripting aspect. The only other person I saw doing that with a VR controller was using Blender and their own code/plugins. The process made zero sense to me.
Just sent you a message, thank you for letting me know!
I've made some updates and bug fixes for DirectX and background threading in v2.9.1 that might help (unfortunately I don't have any Ryzen devices to test on), so please do let me know if you see an improvement.
Will the "Samsung Gear VR" smartphone headset work with this app?
I don't have a VR controller, and I found one online for really cheap, so I may get it to use with the app.
(I don't know the model name as it isn't written on it, but in the photo of the box it says "powered by Oculus, compatible with Galaxy Note8".)
So, without knowing all the details of your particular setup, I can tell you the following, which will hopefully be of some help:
You don't actually need a VR controller to use the app; it's only needed if you want to record multi-axis scripts with it. (If no controller is detected, the app falls back to using a mouse, and you can still do everything else.)
If you do have a controller, the only requirement is that it's something that works with Steam VR (examples: Valve Index, Meta Quest, HTC Vive - someone earlier in this thread even mentioned that their old 2016 Pico Neo works).
I wouldn't recommend buying just a controller as, without an accompanying headset, there'll be no way to connect it to a PC/Steam VR.
This is the early prototype on a breadboard with my comically oversized solar external battery.
I'm waiting on a couple of additional sensors to help nail it down, but otherwise it's solid. With the custom PCB, the final size, all-in, would be roughly 40 x 25 x 15 mm.
I'm kind of thinking of doing an early run of like 5 PCBs and going from there, so if you or anyone else is interested, lmk.
The performer wears it around either their waist or thigh, and the movement is captured in real time on the device OR could be streamed directly to a server and multicast to connected devices.
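For anyone curious what "streamed to a server" could look like on the wire, here's a minimal sketch of one way to frame a motion sample for UDP streaming. The layout (millisecond timestamp plus four float32 values) and all field names are my own placeholder guesses, not the actual device protocol:

```python
import struct
import time

# Hypothetical packet layout: uint64 ms timestamp + 4x float32 (x, y, z, stroke),
# little-endian. 24 bytes per sample keeps high-rate streaming cheap.
PACKET_FMT = "<Qffff"

def pack_sample(ts_ms: int, x: float, y: float, z: float, stroke: float) -> bytes:
    """Serialize one motion sample into a fixed-size binary packet."""
    return struct.pack(PACKET_FMT, ts_ms, x, y, z, stroke)

def unpack_sample(data: bytes) -> tuple:
    """Deserialize a packet back into (ts_ms, x, y, z, stroke)."""
    return struct.unpack(PACKET_FMT, data)

pkt = pack_sample(int(time.time() * 1000), 0.1, -0.2, 0.05, 0.75)
print(len(pkt))  # fixed packet size in bytes
```

At ~100 Hz, that's under 3 KB/s per client, which is trivially multicastable.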
For live performances, the device could be flesh-colored or integrated into the performer's costume.
For regular recordings, the device could be green and then chroma-keyed out using well-established VFX tooling and video-editing pipelines.
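The chroma-key idea is standard green-screen masking; here's a rough NumPy-only sketch of the core step on a tiny synthetic frame (the dominance threshold is a placeholder, and a real pipeline would use proper keying in an NLE or OpenCV rather than this):

```python
import numpy as np

def green_mask(frame: np.ndarray, dominance: int = 40) -> np.ndarray:
    """Return a boolean mask of pixels where green clearly dominates
    both red and blue, given an RGB uint8 frame (H, W, 3)."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    return (g - r > dominance) & (g - b > dominance)

# Tiny 2x2 test frame: pure green, skin-ish, neutral gray, dim green
frame = np.array([[[0, 255, 0],    [200, 160, 140]],
                  [[120, 120, 120], [30, 200, 40]]], dtype=np.uint8)
print(green_mask(frame))
```

Masked pixels would then be filled from a clean background plate, exactly as existing VFX tools already do.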
Either way, my thought was that it could potentially solve the current AI and computer-vision hurdles by simply bypassing them entirely.
I'd imagine it's because nobody thought of it when training the models; it's the same reason I think FunGen is going to hit a wall soon (if it hasn't already): the models were trained for object recognition, which is great if you want your AI to recognize objects, but it's not what you want if your goal is to generate funscripts.
Sensors arrived! After spending way too long troubleshooting a voltage issue... It's Alive!!
Reminder: anyone interested in something like this, please let me know here to help me determine if I should continue with it: vFAPS Hardware - Get Notified