Good morning,
Hope this is not a stupid question; I am very new to Blender. So, my setup is:
- a 3D environment built from iPad photogrammetry
- some lasers we inserted (each a simple cylinder with an emission node)
- the lasers are controlled via QLC+ --> Art-Net --> BlenderDMX, with a Python driver expression that modulates the emission color of every laser from a separate DMX channel (roughly as sketched below).
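For anyone curious, the wiring looks roughly like this. This is a minimal sketch, not BlenderDMX's actual internals: `dmx_channels` and `dmx_channel` are placeholder names standing in for the live Art-Net values that the addon maintains as frames arrive from QLC+.

```python
import bpy

# Placeholder buffer for the live Art-Net values; in the real setup
# BlenderDMX keeps this data updated as frames arrive from QLC+.
dmx_channels = [0] * 512

def dmx_channel(index):
    """Return one DMX channel as a 0.0-1.0 level (8-bit value / 255)."""
    return dmx_channels[index] / 255.0

# Expose the function to driver expressions, so each laser's Emission
# color component can use e.g. `dmx_channel(5)` as its expression.
bpy.app.driver_namespace["dmx_channel"] = dmx_channel
```

Each laser then gets a driver on its Emission node's Color socket (right-click the socket > Add Driver) with an expression like `dmx_channel(n)`, one channel per laser.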
We would now love to store the DMX animation directly in Blender as keyframes, so we can export the animation and put it back on the iPad for AR simulation. Is there any way to record the driver data in real time?
I do not believe that USDZ supports modifying the color of an object at runtime or any kind of driver-based logic. (It certainly did not two years ago, when I was working in the AR industry.) Blender can do it, but Reality Composer probably can't.
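The capture side inside Blender is feasible, though. Here is a minimal sketch of recording the live emission color to keyframes during playback. The names (`"Laser_01"` material, `"Emission"` node) are hypothetical stand-ins for your setup, and this assumes the DMX side writes the live value into the socket's `default_value` (a handler-based update); if a true driver owns the property, the driver has to be removed or muted afterwards for the baked keyframes to take effect.

```python
import bpy

def record_laser(scene, depsgraph=None):
    # Runs on every frame change during playback; keyframes the
    # current emission color of one (hypothetical) laser material.
    socket = (bpy.data.materials["Laser_01"]
              .node_tree.nodes["Emission"].inputs["Color"])
    socket.keyframe_insert("default_value", frame=scene.frame_current)

# Start recording: a keyframe is inserted on every frame change.
bpy.app.handlers.frame_change_post.append(record_laser)

# Stop recording when done:
# bpy.app.handlers.frame_change_post.remove(record_laser)
```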
I can think of only three paths that might be viable.
Thanks for the detailed explanation; not the answer I hoped for, but definitely the answer I needed! I will look into all three options, thanks again!