Track second Move controller
I want to create a string-puppet/stick-puppet concept with left- and right-Move tracking, such that I can control the body and head with the right Move and the arms with the left Move.
I can use a Hand Tracker and place the puppet inside that to track the right Move, but how could I then track the left Move and position/orient an object inside the right-hand grouping?
Alternatively, I can use a Controller tracker, whose Movement tracking follows the right hand. Can I make it follow the left?
Oh okay that's fine then XD
I made something similar--a simple cross handle thingy with string connectors to the hands and such. I'm by no means an expert though ;P
You could use the trigger zone method to get (fairly) easy X/Y/Z relative to another position and angle. So, stick a tag in the hand tracker and use the trigger zone outputs for whatever you need. I have a couple of tutorials showing how to use that to make a minimap: https://www.youtube.com/channel/UCQEo4yuE81lk-ma2leFHcNg/search?query=minimap
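For anyone curious what the trigger zone is effectively doing, "X/Y/Z relative to another position and angle" is just a change of basis. Here's a rough sketch of that math in Python (the function name and conventions are illustrative, not anything from Dreams itself; it assumes a y-up world and only a yaw rotation):

```python
import math

def relative_position(target, origin, yaw_deg):
    """Express a world-space `target` point in the local frame of an
    object sitting at `origin`, rotated `yaw_deg` about the y axis."""
    dx, dy, dz = (t - o for t, o in zip(target, origin))
    yaw = math.radians(yaw_deg)
    # Undo the object's yaw so +z means "in front of" the object.
    local_x = dx * math.cos(yaw) - dz * math.sin(yaw)
    local_z = dx * math.sin(yaw) + dz * math.cos(yaw)
    return (local_x, dy, local_z)

# A point one unit along world +x, seen from an object yawed 90 degrees,
# sits directly in front of it (local x ~ 0, local z ~ 1):
print(relative_position((1, 0, 0), (0, 0, 0), 90.0))
```

A full 3D version would use a complete rotation matrix or quaternion, but for a minimap-style readout a single yaw axis is usually all you need.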
As for the actual move controller outputs, I've made a chip with outputs for (most of) the move controller outputs. https://indreams.me/element/otJAhviAonF
You can also add a sculpt into the tracker and use any of the sensors to find the direction of different axes, get acceleration, rotation, etc.
Yeah, an unfortunate choice of MM calling their puppets 'Puppets' means talk of creating puppets gets confused. I'm not using the Bipedal Puppet Controller. ;) I'm trying to build my own rigs. Although I get the impression I'm pretty much alone in this field. ;)
So the literal requirement, without context, is really "how can I track a left-hand Move controller's input and map that to objects?" Seems Logic rigging is the way to go, with a radio-controlled setup.
Right. Any sort of 3D tracking of a move controller won't be relative to anything; it'll be the exact position of the move controller. I didn't know that's what you wanted.
Yeah, you need to scope in to wherever you can grab just the hand, put the left-hand tracker there, then grab the hand and scope it into the tracker.
Ideally, this wouldn't involve an actual "puppet" object--that won't be needed for something like this. So you can select the whole rig (sculpts and joints), scope it out of the puppet group, and delete that puppet group. Then you have one less layer to worry about.
If you've got another way you want to go with this, then that's fine. I'm just answering based on the question you asked.
Putting a Left Hand tracker inside a Right Hand tracker leaves the left hand moving independently (it's not in the Right Hand's local space), and I'm unable to connect geometry across the Left Hand and Right Hand scope levels.
I have had some prototype success using follow tags and remote control that I think I can get working. I can have some Roaming Platform apparatus follow the left hand, position gear in the Left Hand relative to this Roaming Platform, and then adjust the model in the Right Hand based on sensors in the Left Hand controlling it (via Keyframes).
Try putting a hand tracker inside the one with the puppet, tweaked to be for the secondary controller. Grab the hand and scope it into that hand tracker, and now the joint will be attached to that tracker.