Before buying the HTC Vive, I experimented with PS Move + PS Eye. I almost got it working, but the hassle of getting it running and the mediocre accuracy kept me from going further than tests.
Blender also has a mode for VR headsets; you can write multi-axis scripts directly in the headset, but I haven’t tried it yet.
Thank you for your reply. I will test whether it works with the Quest 3 during the holiday. The Quest 3 can now be streamed directly to the computer with the Steam Link app, which is very convenient.
It will likely require reworking the matrix-conversion math for your controller and the button-press handling, but a quick search says all headsets now support OpenXR, so in general my script should work.
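If it helps, here is a minimal sketch of the kind of conversion I mean, assuming OpenXR’s right-handed, Y-up convention versus Blender’s Z-up world (the function name and example numbers are mine):

    import math
    from mathutils import Matrix, Vector

    # OpenXR space is right-handed with Y up; Blender's world is Z up.
    # Rotating +90 degrees about X maps (x, y, z)_xr to (x, -z, y)_blender.
    XR_TO_BLENDER = Matrix.Rotation(math.radians(90.0), 4, 'X')

    def xr_pose_to_blender(pose):
        """Re-express a 4x4 OpenXR controller pose in Blender's world axes.
        Append '@ XR_TO_BLENDER.inverted()' on the right if the controller's
        own local axes should be remapped as well."""
        return XR_TO_BLENDER @ pose

    # Example: a controller 1.2 m up and 0.5 m forward (-Z) in OpenXR space
    print(XR_TO_BLENDER @ Vector((0.0, 1.2, -0.5)))  # -> (0.0, 0.5, 1.2) in Blender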
I just tested with the Quest 3 and the script works, but the motion seems to be reversed and the Y axis isn’t recognized. Also, I can only operate in desktop mode while wearing the headset (I don’t know how Blender enters VR mode), because when I take the headset off it automatically goes to sleep.
Cool, thought it wouldn’t work at all
My SteamVR settings let it run without a headset at all; in your case that won’t work, because the tracking optics are inside the headset. So yes - most likely desktop mode, or trick the sensor that detects whether the headset is being worn.
I foresee problems with the transformation matrix, and you also need to understand the controller’s local coordinates.
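As a tiny sketch of what I mean by local coordinates (the pose values here are made up):

    from mathutils import Matrix, Vector

    # Stand-in for the controller's 4x4 world pose (hypothetical values).
    controller_matrix = Matrix.Translation(Vector((0.1, -0.3, 1.0)))

    world_point = Vector((0.0, 0.0, 1.0))
    # A world-space point expressed in the controller's local frame:
    local_point = controller_matrix.inverted() @ world_point
    # ...and back to world space:
    assert (controller_matrix @ local_point - world_point).length < 1e-6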
For comparison:
From what I can see, there should be no problem with the matrix; only the vector needs to be adjusted (they differ).
And what exactly is inconsistent about the Y movements/rotations?
Do you need the headset to set up the HTC Vive controller? Or can you just buy the controller and use it on its own for this? I have a Quest 3, but it’s a pain in the ass to use without wearing the headset.
You need one controller, one base station, and a receiver for the Vive tracker (I use a reprogrammed Steam Controller receiver).
Someone here on the forum was asking about using alternative devices for scripting. I’ve been using a keypad (Logitech G13) together with the HTC Vive Controller for a long time now - left hand on the keypad, right hand holding the controller. I also have a script for the keypad, which I use to adjust movement and rotation values when checking a script. I’m sharing it with the community.
import bpy, bpy_extras, mathutils
import time, random, math

obj = bpy.data.objects["fleshlight"]

class ModalTimerOperator(bpy.types.Operator):
    """Operator which runs itself from a timer"""
    bl_idname = "wm.modal_timer_operator"
    bl_label = "Modal Timer Operator"

    _timer = None

    def modal(self, context, event):
        # Alt + numpad: move in 5 mm steps and flip axis/rotation signs
        if event.alt and not event.shift:
            if event.type == 'NUMPAD_2':
                obj.location[1] -= 0.005
                obj.keyframe_insert(data_path="location", index=1)
            if event.type == 'NUMPAD_8':
                obj.location[1] += 0.005
                obj.keyframe_insert(data_path="location", index=1)
            if event.type == 'NUMPAD_4':
                obj.location[0] -= 0.005
                obj.keyframe_insert(data_path="location", index=0)
            if event.type == 'NUMPAD_6':
                obj.location[0] += 0.005
                obj.keyframe_insert(data_path="location", index=0)
            if event.type == 'NUMPAD_3':
                obj.location[0] = -obj.location[0]
                obj.keyframe_insert(data_path="location", index=0)
            if event.type == 'NUMPAD_5':
                obj.location[1] = -obj.location[1]
                obj.keyframe_insert(data_path="location", index=1)
            if event.type == 'NUMPAD_7':
                obj.rotation_euler[0] = -obj.rotation_euler[0]
                obj.keyframe_insert(data_path="rotation_euler", index=0)
            if event.type == 'NUMPAD_9':
                obj.rotation_euler[1] = -obj.rotation_euler[1]
                obj.keyframe_insert(data_path="rotation_euler", index=1)
            if event.type == 'NUMPAD_1':
                obj.rotation_euler[2] = -obj.rotation_euler[2]
                obj.keyframe_insert(data_path="rotation_euler", index=2)
        # Alt + Shift + numpad: rotate in 0.0873 rad (~5 degree) steps
        if event.alt and event.shift:
            if event.type == 'NUMPAD_2':
                obj.rotation_euler[1] -= 0.0873
                obj.keyframe_insert(data_path="rotation_euler", index=1)
            if event.type == 'NUMPAD_8':
                obj.rotation_euler[1] += 0.0873
                obj.keyframe_insert(data_path="rotation_euler", index=1)
            if event.type == 'NUMPAD_4':
                obj.rotation_euler[0] -= 0.0873
                obj.keyframe_insert(data_path="rotation_euler", index=0)
            if event.type == 'NUMPAD_6':
                obj.rotation_euler[0] += 0.0873
                obj.keyframe_insert(data_path="rotation_euler", index=0)
            if event.type == 'NUMPAD_7':
                obj.rotation_euler[2] -= 0.0873
                obj.keyframe_insert(data_path="rotation_euler", index=2)
            if event.type == 'NUMPAD_9':
                obj.rotation_euler[2] += 0.0873
                obj.keyframe_insert(data_path="rotation_euler", index=2)
            return {'PASS_THROUGH'}
        if event.type == 'RIGHTMOUSE':
            self.cancel(context)
            return {'CANCELLED'}
        return {'PASS_THROUGH'}

    def execute(self, context):
        wm = context.window_manager
        self._timer = wm.event_timer_add(1 / 50, window=context.window)
        wm.modal_handler_add(self)
        return {'RUNNING_MODAL'}

    def cancel(self, context):
        wm = context.window_manager
        wm.event_timer_remove(self._timer)

# Helper to build an operator override for a given area/region
# (not used by the timer operator itself)
def get_override(area_type, region_type):
    for area in bpy.context.screen.areas:
        if area.type == area_type:
            for region in area.regions:
                if region.type == region_type:
                    return {'area': area, 'region': region}
    # Error if the area or region wasn't found
    raise RuntimeError(f"Wasn't able to find {region_type} in area {area_type}.\n"
                       "Make sure it's open while executing the script.")

def menu_func(self, context):
    self.layout.operator(ModalTimerOperator.bl_idname, text=ModalTimerOperator.bl_label)

# Register and add to the "View" menu (this also makes the operator reachable
# via F3 search for "Modal Timer Operator").
def register():
    bpy.utils.register_class(ModalTimerOperator)
    bpy.types.VIEW3D_MT_view.append(menu_func)

def unregister():
    bpy.utils.unregister_class(ModalTimerOperator)
    bpy.types.VIEW3D_MT_view.remove(menu_func)

if __name__ == "__main__":
    register()
    # test call
    bpy.ops.wm.modal_timer_operator()
How the shortcuts are assigned:
- Movements (+/- 0.5 cm) and axis sign flips use Alt + numpad key.
- Rotations (+/- 5 degrees, i.e. 0.0873 rad) use Alt + Shift + numpad key.
Which key does what can be read from the script.
It’s still uncomfortable to sit at a desk with the HTC Vive Controller - my arm gets tired after about half an hour, so I have to take breaks more often.
I came up with the idea of using a second mouse and mapping its movements and buttons to create multi-axis scripts, but I couldn’t find any examples of how to do that inside Blender in Python.
I did find a program that can do it - reWASD. It looks promising: I’ve attached a trackball and will try writing scripts this way.
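Blender itself only ever sees one merged system pointer, which is part of why a remapper like reWASD is needed; but as a sketch of the Blender-side half, a modal operator can turn pointer deltas into keyframes on two axes (the object name and scale factor are assumptions):

    import bpy

    obj = bpy.data.objects["fleshlight"]  # same target object as in the keypad script

    class MouseAxesOperator(bpy.types.Operator):
        """Map pointer deltas onto two object axes (sketch)"""
        bl_idname = "wm.mouse_axes_operator"
        bl_label = "Mouse Axes Operator"

        def modal(self, context, event):
            if event.type == 'MOUSEMOVE':
                dx = event.mouse_x - event.mouse_prev_x
                dy = event.mouse_y - event.mouse_prev_y
                obj.location[0] += dx * 0.001  # 1 px -> 1 mm; tune to taste
                obj.location[1] += dy * 0.001
                obj.keyframe_insert(data_path="location", index=0)
                obj.keyframe_insert(data_path="location", index=1)
            elif event.type == 'RIGHTMOUSE':
                return {'CANCELLED'}
            return {'PASS_THROUGH'}

        def invoke(self, context, event):
            context.window_manager.modal_handler_add(self)
            return {'RUNNING_MODAL'}

    bpy.utils.register_class(MouseAxesOperator)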
I really need to figure out how to get this working! This definitely seems like the best way that I’ve seen to do multi-axis scripting.
I’m thinking of maybe getting a Logitech G Extreme 3D Pro flight stick and mapping the buttons.
With the twist joystick, I feel like I could knock out the twist, pitch, and roll axes in one pass.
Well, I thought so too - I even have a script somewhere based on another library for working with flight joysticks. But it turned out a bit inconvenient for me: the stick is quite stiff and my hand gets tired. I’ll try to get my Logitech Extreme 3D Pro off the shelf later and write up a short guide.
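I don’t remember which library that script used, but as an illustrative sketch, pygame’s joystick module can read the stick; note that axis numbering is device-dependent, so the 0/1/2 mapping below is an assumption:

    import pygame

    pygame.init()
    pygame.joystick.init()
    stick = pygame.joystick.Joystick(0)  # first connected stick
    stick.init()

    # Axis numbering varies per device; X/Y/twist on 0/1/2 is common.
    while True:
        pygame.event.pump()
        roll = stick.get_axis(0)
        pitch = stick.get_axis(1)
        twist = stick.get_axis(2)
        print(f"roll {roll:+.2f}  pitch {pitch:+.2f}  twist {twist:+.2f}")
        pygame.time.wait(20)  # ~50 Hz, same rate as the modal timer above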
I found a PlayStation Eye at a flea market for $2, and remembered that I have one such camera and a PS Move controller at home. Before I bought the HTC Vive Controller with base stations, I once tried connecting them to Blender for multi-axis scripts, but I wasn’t satisfied with the accuracy (compared to a gamepad) or the enormous setup complexity (several intermediate programs tied to DLLs, plus extra libraries added to Blender).
I decided to refresh my memory of how it works and to try not only rotations but also movements with the PS Move - using two cameras.
It turned out tolerably well compared to the HTC Vive Controller: a little sluggish because of the filters, with accuracy around +/- 2-5 degrees and +/- 1 centimeter.
I’m glad about the price of such a set, and many people may already have the necessary devices:
- PS Eye - 2 pcs.
- PS Move - 1 pc.
I’ve been eyeing Blender’s motion tracking feature for a long time, but I can’t think of a proper application for it yet. Maybe when I have free time I should just try it and the solution will come by itself.
So I wondered if I could, without thinking about whether I should
https://pixeldrain.com/u/NC1eHNrq
https://pixeldrain.com/u/vVbCQb6K
Tracking motion is pretty easy, and I can export it into a JSON file.
What to do with that JSON remains the question, because points in 2D space are not exactly a funscript.
The first example is a pixel animation that I was basically doing by hand.
That’s the thing - the tracking in Blender is essentially a matching tool. To use what it has recognized, you need to write a handler for each separate case.
The handler is pretty generic - I just define which direction is UP and that’s it.
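For what it’s worth, a minimal sketch of such a handler: project each tracked 2D point onto the chosen UP direction, rescale to 0-100, and emit funscript actions. The helper name and input layout are mine; the output fields follow the usual funscript JSON format:

    import json

    def points_to_funscript(points, up=(0.0, -1.0), fps=30.0):
        """points: list of (frame, x, y) tracker output; up: 2D UP direction
        (image Y usually grows downward, hence the -1). Projects each point
        onto `up`, rescales to 0-100, and returns a funscript-style dict."""
        ux, uy = up
        length = (ux * ux + uy * uy) ** 0.5
        ux, uy = ux / length, uy / length
        proj = [(f, x * ux + y * uy) for f, x, y in points]
        lo = min(v for _, v in proj)
        hi = max(v for _, v in proj)
        span = (hi - lo) or 1.0
        actions = [{"at": int(f / fps * 1000), "pos": round((v - lo) / span * 100)}
                   for f, v in proj]
        return {"version": "1.0", "inverted": False, "range": 100, "actions": actions}

    points = [(0, 10, 50), (15, 10, 20), (30, 10, 48)]  # hypothetical tracker output
    print(json.dumps(points_to_funscript(points), indent=2))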
If you haven’t seen Falafel’s “Scripting Timelapse - Me Speedrunning A RIM Animation” / “Scripting Timelapse - Dehya by Haruya”, check them out.
The OFS tracker works roughly the same way - you define which direction is UP and it tracks a point along it.
My version allows R1/R2 as well though
I think it can be simplified to saying “this tracker defines the default penis length and rotation”, and extracting as much from that as it can.
@Falafel, do you have a good scene for testing video tracking? A single scene that could produce as much tracking data as possible.
I don’t know what your goal is, but ShaggySusu’s stuff can generally be tracked and scripted in one go.