Testing rendering of a Blender animation with Stable Diffusion and ControlNet

When I first saw what ControlNet can do, the idea of rendering 3D animations with it immediately popped into my head. This is my second attempt at doing so. I used the Blender OSM addon to bring a street map into Blender, then rendered a fly-through of the city in three different ways: as a segmentation map, a normal map, and a depth map. I then used the ControlNet movie2movie script to feed the frames of the animation into Stable Diffusion and waited for the result.
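As a rough illustration of what a depth-map render pass encodes, here is a minimal sketch of mapping scene distances to grayscale values. The function name, the `near`/`far` parameters, and the clamping scheme are my own assumptions for illustration, not Blender or ControlNet API; it just follows the common convention where nearer surfaces render brighter.

```python
def normalize_depth(depths, near, far):
    """Map scene distances to 0-255 grayscale depth values.

    Follows the common depth-map convention where nearer surfaces
    are brighter (255) and farther ones darker (0). `near` and `far`
    are clip distances chosen for the render; both the name and the
    convention are illustrative assumptions, not a real API.
    """
    out = []
    for d in depths:
        d = min(max(d, near), far)      # clamp distance to the clip range
        t = (far - d) / (far - near)    # 1.0 at near plane, 0.0 at far plane
        out.append(round(t * 255))
    return out
```

For example, with `near=1.0` and `far=10.0`, a point at distance 5.0 lands a little above mid-gray, while anything beyond the far plane clamps to black.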

This is far better than my first attempt, where I used only a normal map. What helped the most was assigning each object a random solid color, effectively creating a segmentation map and helping the AI see what goes where. My only mistake was turning the ControlNet strength up too high, which produced this very rigid, flat result. I'll keep trying, though.
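The random-solid-color trick above can be sketched in a few lines. One detail that matters for animation is that each object must keep the same color in every frame, so instead of calling a random generator it's safer to derive the color deterministically from the object's name. The function below is a hypothetical helper, not a Blender API:

```python
import hashlib

def object_color(name):
    """Derive a stable pseudo-random RGB color from an object name.

    Hashing the name (rather than using random()) keeps each object's
    color identical across every frame of the animation, which is what
    makes the flat-shaded render usable as a segmentation map.
    Illustrative helper only; in Blender this color would be assigned
    to an unlit emission material on the object before rendering.
    """
    digest = hashlib.md5(name.encode("utf-8")).digest()
    return tuple(digest[:3])  # first three hash bytes as (R, G, B), each 0-255
```

Two objects with different names get distinct colors with high probability, and re-running the script never reshuffles the palette.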