Good morning,
Hope this is not a stupid question; I am very new to Blender. My setup is:
- 3D environment built from iPad photogrammetry
- some lasers (simple cylinders with an emission node)
- lasers controlled via QLC+ --> Art-Net --> BlenderDMX, with a Python expression that modulates each laser's emission color from a separate DMX channel
We would now love to store the DMX animation directly in Blender as keyframes, so we can export the animation and put it back on the iPad for AR simulation. Is there any way to record the driver data in real time?
That’s a pretty niche use case so I can’t be of specific help, but it sounds like a job for Python scripting. Assuming you have the same number of lasers all the time, you can set the animation keyframes on your cylinders to have the correct position/rotation/scale as your data comes in. Keyframe animation should export easily to USDZ for use in AR.
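To make that a bit more concrete, here is a minimal sketch of the keyframing side. I'm assuming a fixed list of laser objects and some callback in your script that fires when a new Art-Net frame arrives; the object names and the time-to-frame mapping are placeholders for whatever your setup actually uses:

```python
# Minimal sketch: keyframe laser transforms as data arrives.
# Everything except the bpy API calls is a placeholder assumption.
try:
    import bpy  # only available when running inside Blender
except ImportError:
    bpy = None

LASER_NAMES = ["Laser.000", "Laser.001"]  # hypothetical object names


def seconds_to_frame(t, fps=24):
    """Map a wall-clock timestamp (seconds) to the nearest scene frame."""
    return round(t * fps) + 1  # Blender scenes start at frame 1 by default


def record_transforms(t):
    """Keyframe loc/rot/scale for every laser at the frame matching time t."""
    frame = seconds_to_frame(t, bpy.context.scene.render.fps)
    for name in LASER_NAMES:
        obj = bpy.data.objects[name]
        obj.keyframe_insert(data_path="location", frame=frame)
        obj.keyframe_insert(data_path="rotation_euler", frame=frame)
        obj.keyframe_insert(data_path="scale", frame=frame)
```

You'd call `record_transforms()` from whatever handler your DMX receiver already runs, then export the resulting keyframe animation as usual.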
Again I don’t know much about your specific setup but this is the direction I’d look.
First of all, thank you for your answer! What I meant by animation is the light color modulation itself, so I would basically need to listen to the incoming driver data and record it, so it can be played back afterwards as an animation (and yes, we would export to USDZ in order to import into Reality Composer).
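Something like this rough sketch is what I'm after: while QLC+ plays the show, run the Blender timeline and let a frame-change handler bake each Emission node's color into keyframes, then remove the driver so the baked keyframes take over. The material and node names here are just guesses for illustration:

```python
# Rough sketch: bake the driver-driven emission color to keyframes
# while the timeline plays. Material/node names are placeholders.
try:
    import bpy  # only available when running inside Blender
except ImportError:
    bpy = None

LASER_MATERIALS = ["LaserMat.000", "LaserMat.001"]  # hypothetical names


def dmx_to_rgba(r, g, b):
    """Convert three 0-255 DMX channel values to Blender's 0-1 RGBA."""
    return (r / 255.0, g / 255.0, b / 255.0, 1.0)


def record_emission(scene, depsgraph=None):
    """frame_change_post handler: keyframe each Emission color socket."""
    for mat_name in LASER_MATERIALS:
        node = bpy.data.materials[mat_name].node_tree.nodes["Emission"]
        node.inputs["Color"].keyframe_insert(
            "default_value", frame=scene.frame_current
        )


def start_recording():
    bpy.app.handlers.frame_change_post.append(record_emission)


def stop_recording():
    bpy.app.handlers.frame_change_post.remove(record_emission)
    # Remove the drivers afterwards, otherwise they keep overriding
    # the freshly baked keyframes:
    for mat_name in LASER_MATERIALS:
        node = bpy.data.materials[mat_name].node_tree.nodes["Emission"]
        node.inputs["Color"].driver_remove("default_value")
```

No idea yet whether the baked color keyframes survive the USDZ export, which I guess is the real question.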
I do not believe that USDZ supports features that allow you to modify the color of the object or do any kind of driver-based logic. (It certainly did not two years ago, when I was in the AR industry). Blender can do it, but Reality Composer probably can’t.
I can think of only three paths that might be viable.
Thanks for the detailed explanation, not the answer I hoped for but definitely the answer I needed! Will look into all 3 options, thanks again!