Hey, it would be great if you supported FACEGOOD (aka Avatary) in iClone as Motion Live gear for facial animation. It's definitely a game changer…
Hello Reallusion team, I sometimes have to work with Unreal. I've found that it takes a lot of time to get a render in Unreal to look really good, whereas iClone requires much less manoeuvring to get a high-quality render. However, when using Lumen in Unreal, at the moment I don't know of any…
To ease the process of using iClone in a broader pipeline with other suites, it would be great to have the option of exporting renders with burnt-in timecode, or, as a workaround, the option of importing a custom video with alpha and TC overlay…
Breaking clips won't allow you to change their length afterwards, as there is no animation data, only the last freeze frame. Steps to reproduce: 1. New project…
Headlamps on the camera can work magic and streamline the workflow. Try the feature out in Daz. Use it with an offset. It will save hours of setup for some scenes, or add just that slight punch you need on a scene that's just not right. For that matter, it would greatly help us intuitively set up spotlights…
I wish iClone could analyze and match expressions to a script, like NVIDIA does with Audio2Face (and Audio2Gesture). Since AccuLips can work with clips up to five minutes long, the speaker's sentiment can change during the clip. While the Face Puppet is good, a script that can automatically time…
When you drag an MI folder to the plugin, by default it renders the frames inside the MiResources folder. iClone sets it one directory higher, which is better and neater. I wish I could set a separate render output folder in iClone, as I can from the plugin. That way, I could export the…