The Technicals

Wherein I Tell you of Image File Formats

When it comes time to render, if you’re like me, you want a file format that’s more or less just JPEG, but with better dynamic range. 8 bits per channel just isn’t quite enough.

You want something that works on Mac, Linux, Android, and iPhone. And it has to work with GIMP and ImageMagick.

It looks like such a thing doesn’t exist, but the closest thing that does exist is JPEG2000. It works with everything except Android phones. Alas.

Since I need to store my “intermediate” renders in some format, I was hoping to use OpenEXR. It seems to compress better than JPEG2000 (debatable) and is quite a bit faster to generate files. The problem, though, is that OpenEXR is a 16-bit (half-float) linear representation of your scene, with no render transforms baked in, so when you pull it up in another program like GIMP, the exposure will be all wrong.

So in the end I settled on JPEG2000.

Then when I have a few hundred jp2 renders I want to see on an Android phone (or here, on this humble blog post), I use ImageMagick to convert all of them into regular JPEGs.
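That batch conversion can be sketched roughly like this. The directory names and the quality value here are illustrative, not my exact settings:

```shell
#!/bin/sh
# Convert every JPEG2000 render into a regular JPEG.
# "renders" and "jpeg" are illustrative directory names.
mkdir -p jpeg
for f in renders/*.jp2; do
  [ -e "$f" ] || continue                     # no matches: nothing to do
  base=$(basename "$f" .jp2)
  convert "$f" -quality 90 "jpeg/$base.jpg"   # ImageMagick does the work
done
```

ImageMagick picks the output codec from the `.jpg` extension, so there’s nothing else to configure.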

Meanwhile, if I see some problems in a particular render, I can pull it up in GIMP, tweak the colours and lights, and things will be much more smooth and buttery than a regular JPEG.

I finally settled on a 12-bit 75% compression setting for my renders. I cannot see any loss of detail or noise added at that setting, and files are about 6x-8x larger than a low-ish quality JPEG of the same dimensions.

And Render Farms

I’m sure you’ve noticed that while you’re rendering an Eevee image, your computer is completely unusable, whether you’re on Linux or Mac (or Windows?).

Since I am now generating actual renders, and not screenshotting my 3D window, I was noticing this rather painfully.

I needed a solution. Here’s my technical workflow.

I have Syncthing set up on my Mac and Linux machines, pointed at a directory that includes all my .blend files.

I create an animation, put her in a pose I like, with the camera where I want it, and then set the pose and camera orientation as a keyframe (and the lights, or whatever else is needed).

I do this for a bunch of various poses that I like. Then I save the file as, eg:

2020-01-05--3-ShruggingPoseAnim.blend

Then I wait for Syncthing to do its magic (not long). I go to my Linux laptop, which is way less powerful than my Mac workstation, but whatever. I open that file. I have the animation output directory as another directory that Syncthing will sync back to other devices.

I hit Render Animation and let it slowly chug through all the poses and lighting setups in the “animation”, and then Syncthing dutifully copies the resultant JPEG2000s back to the Mac workstation.
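Hitting Render Animation from a shell instead would look something like this. The output pattern is an illustrative guess, and it assumes blender is on the PATH:

```shell
#!/bin/sh
# Headless equivalent of hitting Render Animation:
# -b = run without the UI, -o = output pattern, -a = render all frames.
# The "//" prefix means the path is relative to the .blend file.
BLEND="2020-01-05--3-ShruggingPoseAnim.blend"
OUT="//renders/frame_####"
if command -v blender >/dev/null 2>&1; then
  blender -b "$BLEND" -o "$OUT" -a
fi
```

Note that Blender processes its arguments in order, so `-o` has to come before `-a`.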

Then I take a look and make sure it looks right. It usually does. Once Linux crashed halfway through a walk cycle. No clue why. I even got a corrupted frame (only a few passes on the soft shadow iterations). Whatever.

Run a script to generate regular JPEGs into another directory that Syncthing syncs out to all my various Android phones.

Run another script to generate the web-suitable images like what’s in this blog post.
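A sketch of that web step, assuming the full-size JPEGs live in a `jpeg/` directory. The max width and quality are illustrative guesses, not my exact settings:

```shell
#!/bin/sh
# Downscale the full-size JPEGs for the blog.
# "1200x>" means: shrink to 1200px wide, but only if the image is wider.
mkdir -p web
for f in jpeg/*.jpg; do
  [ -e "$f" ] || continue                     # no matches: nothing to do
  convert "$f" -resize "1200x>" -quality 82 "web/$(basename "$f")"
done
```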

BOOM.

Low-tech render farm.

And the Tale of a Walk Cycle

I did a whole 20-frame “walk cycle” - it isn’t an actual animation, which I’m not going for anyway. I just want a bunch of various poses that happen to line up into walking.

The image here is not part of the walk cycle. Unsure really if I want to bother trying to represent it in this blog post. I put it into some M4Vs and uploaded them to BlenderArtists.

I wonder if this blog software can handle M4Vs…

Walking Verachantesse Vertex-Colour Workspace

I guess not. Well, trust me. It’s kinda cool, but not that cool.
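Worth noting: an M4V is usually just H.264 video in an MP4-family container, so re-wrapping one into a plain .mp4 the web can handle costs nothing. A sketch, with an illustrative filename, assuming ffmpeg is installed:

```shell
#!/bin/sh
# Re-wrap an M4V as .mp4 without re-encoding:
# -c copy just copies the audio/video streams into the new container.
# "walkcycle.m4v" is an illustrative filename.
SRC="walkcycle.m4v"
if command -v ffmpeg >/dev/null 2>&1 && [ -e "$SRC" ]; then
  ffmpeg -i "$SRC" -c copy "${SRC%.m4v}.mp4"
fi
```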


Reminder, Rigify is pretty nice. The above face is 100% bone-driven Rigify. No shape keys at all. A tiny bit of weight painting on the eyelashes, but other than that, also no weight painting.

Actually… ah… lies? I think I did do some weight painting on the transition between her cheeks and upper lips. I think that was bad out-of-the-box. But it wasn’t much.

And I hate weight painting, so you should know you don’t need to do much.

On the other hand, elbows, knees, hips, shoulders? Yeah. Shape keys. And a lot of time invested into them. See previous blog post on that.

I am pondering trying to find someone who will modify the Rigify metarig to include shoulder, knee, and elbow armature generation. I feel like this is mostly solvable using some more sophisticated rig setups.

I want to do that because her shoulders are still insanely bad in some poses, like if she puts her hand on top of or behind her head (think: playing with hair).

Since this is a pose I’ll need to hit all the time… I need to solve it, and shape keys are a really annoying way to solve it, although it’s the only tool I have at the moment.

Anyone want some $$$ to fix up Rigify to include shoulder, knee, elbow bones that deform properly? Hit me up. Philo Vivero at Gmail.

A Tiny Diversion into Lens Settings

I played with some really large apertures, like f1.8 (or even larger) and bokeh distortion settings. It was a little bit fun.

Even though in real life, when doing portrait photography, I prefer wide open aperture and zoomed in for that nice soft-focus background, for some reason in 3D I don’t really care for the effect.

So other than a few renders here and there, the aperture is more like f2.8 or f3.4. I still leave the aperture a bit open, so there is a tiny bit of blurring in the fore- and background, but nothing as dramatic as eg the above render.

Do you see it? No? Good.

What next, then?

Hair? Animation? More clothing and props?

Unsure, honestly. I’ll let you know when I get there.

Every now and then I pick a POV that gives that very blurry look far away. I am always focusing on her eyes, so she gets blurrier and blurrier toward her feet.

Hold the Applause ‘til the End

Okay. You can go wild now.

~~~