How to Make Isometric Game Assets with AI - ControlNet, Stable Diffusion + Blender Tutorial 2023

How to Install and Use Stable Diffusion (June 2023) - Basic Tutorial

ControlNet WebUI Extension:

ControlNet models:

Ayonimix Model (get V5 and the VAE for this tutorial!):

Prompt:
[your building description], cozy, playful, beautiful, art by artist, (pastel colors:0.7), vector art, (illustration:1.4), digital painting, 3d render, stylized, painting, gradients, simple design, (centered:1.5), ambient occlusion, (soft shading:0.7), view from above, angular, isometric, orthographic

Negative: cartoon, zombie, disfigured, deformed, b&w, black and white, duplicate, morbid, cropped, out of frame, clone, photoshop, tiling, cut off, patterns, borders, (frame:1.4), symmetry, intricate, signature, text, watermark, fisheye, harsh lighting, shadows

CHAPTERS:
0:00 - Intro
1:39 - Installation
4:00 - Blender
10:43 - Prompting
13:06 - Upscaling
14:36 - Cutting Out in Affinity
15:59 - Cutting Out in Photoshop
16:44 - Making Other Buildings
18:33 - Results

----------------------------------------------

Did you like this vid? Like & Subscribe to this Channel!

Comments

Great tutorial, great video, great extra text notes, easy to follow. Can't wait for your channel to blow up, you deserve it!!!

Hazzel

This is awesome, love the Blender to ControlNet workflow!

_amankishore

Thanks! BTW, I usually create two 3D viewports: split the main view from the upper corner, put the camera view on the left side, and do the modeling on the right. That way I avoid the hassle of changing views. To get the exact view for iso stuff, I usually parent the camera to an empty, move the camera along the y-axis, rotate the empty 45 degrees from the top view, then change the viewing height angle by rotating the empty along its local x-axis.

devnull_

One step further:
Back in Blender, go to the UV Editing tab and on the model do a UV Unwrap (Project from View).
Use the freshly generated image as an Image Texture.
This gives you half of a textured 3D model, and you can adjust the angle afterwards.
Generate another image from the opposite side, create a second UV map, and you'll have a simple fully textured 3D model 😉

Thanks for your brilliant and detailed explanations, by the way!

marioz

Games typically use 2:1 dimetric (30°), so your orthographic camera should be rotated (60°, 0°, 45°), which is not true isometric. This way the width of a tile is 2x its height (2:1). If you want true isometric, the X rotation is actually 54.7356°. This magic number comes from degrees(atan(1/sqrt(2))) = 35.264°, adjusted for Blender measuring the camera's X rotation down from the vertical (90 − 35.264).
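The angle arithmetic in the comment above can be sanity-checked with a few lines of plain Python (no Blender required); the resulting values are what you would type into the camera's Rotation X field in Blender:

```python
import math

# Elevation of a true-isometric view above the ground plane:
# looking along a cube's space diagonal gives atan(1 / sqrt(2)).
elevation = math.degrees(math.atan(1 / math.sqrt(2)))  # ~35.264

# Blender's camera looks straight down at rotation (0, 0, 0),
# so the Rotation X value is measured from the top: 90 - elevation.
true_iso_x = 90 - elevation  # ~54.7356

# The common 2:1 "game isometric" (dimetric) view instead uses a
# 30-degree elevation, i.e. a camera Rotation X of 60 degrees.
dimetric_x = 90 - 30  # 60

print(round(elevation, 3), round(true_iso_x, 4), dimetric_x)
```

Running this prints `35.264 54.7356 60`, matching the numbers quoted in the comment.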

jtmcdole

Would this technique work for other types of assets like trees, rocks, grass...? Amazing tutorial btw!

josemaenzo

Could you do a 2D or an isometric game character with ControlNet video? That would be super helpful and definitely interesting to watch.

savandriy

I was wondering if the depth map would allow for staging things in a 3D modeler. Really interesting tutorial, although I was thinking Unreal Engine (and maybe the Quixel library or marketplace assets) rather than Blender. I guess there's no easy fix for the varying sun direction on the shadows between buildings here?

Cadmeus

Very interesting, thank you. Did you ever try to generate pixelated 2D sprites?

izaacchirac

Hey there, thanks a lot for the video. Just one question: how do the ControlNet models you introduce in this video differ from those in your installation tutorial video? They're quite different in size, they're in a different repo, and they have different file extensions (.yaml vs .safetensors). You also install ControlNet differently here compared to the installation tutorial. Care to explain? Thank you

zatokar

One problem I see with this is that the shadows come from different directions. If you put those buildings next to each other, it would stick out.

jx

Let me ask: if I have 3 in-game item images (guns, armor, medical boxes...) with completely different styles, how can I make Stable Diffusion synchronize the styles? Thank you

hoangpham

Can they be consistent, in terms of both style and lighting?

utkua

So is it possible to get maps similar to Disco Elysium or Pillars of Eternity?

indexli

14:30 that is crazy haha, I just thought to myself while watching "I should subscribe to this channel, that's a lot of very good information", looked at the subscribe button, "oh, I'm already subscribed, nice" haha. Keep up the good work man.

jeckrucel

Why not use the new ControlNet plugin inside Blender?

MarkusAkaMasse

Help: this G-C-1 doesn't work in my Blender!

abltanarana

Thanks a lot for this tutorial! I'm just wondering whether all this is necessary. Wouldn't it be possible to get similar results using Midjourney with a specific prompt defining the angle/size etc.?


I have been trying to hire an artist for my project and it is a super complicated process. I've never used AI before, but I'm honestly thinking about it after seeing this.

peileed

How would I use this to make other art? Would I just draw the layout and write a description?

Musicu