I was very involved in the initial asset lighting and comp design for Voltron. I believe we did a good job matching the visual style of the show's 2D elements while allowing for smooth and dynamic CG fight scenes. I currently design and support all of the lighting and comp pipeline and toolsets for the show. A big shout-out to Andrew Hepp, who was the architect of the show's lighting before I was ever on the production. This clip includes some of the assets I've lit and supported over the course of the show.
Was watching an old "The Muppet Show" episode with my girls and remembered how funny Gonzo as Darth Vader was. Thought this would be a fun sketch to do once the kiddos went down. About 3 hours in.
As PySide becomes more and more the UI toolkit of the CG artist, I thought I'd share this little tip: how to add Qt widgets to a Nuke gizmo.
I use gizmos very often, but I felt limited by the UI tools available and wanted the ability to add some additional functionality to them. I've been making tools using PySide in Nuke since version 6; however, I had to spend a bit of time researching how to go about adding widgets to a group / gizmo, and as far as I can tell, I'm the first to post how to achieve this online. So here's to a first!
Step 1 - Subclass a widget.
I started by sub-classing QWidget, setting a layout, adding my buttons and other widgets, making connections, etc. Note that you'll need to pass in the Nuke node the widget is attached to as an argument.
Step 2 - Add the makeUI() and updateValue() methods.
The makeUI method is where the magic happens. It appears that Nuke is hard-coded to look for a method named makeUI. Just return self and you're good. The only reason I add the updateValue method is that Nuke will scream at you if your class doesn't have it. I'm still not sure what triggers that method; as far as I can tell, it's neither a node eval nor a PySide signal. Oh well, not a big deal.
Step 3 - Create the knob and add it to a group.
You'll want to create a PyCustom_Knob. Everything you give it has to be string-based, so you pass in your widget name and arguments as a string. I haven't found another way to do this, but passing in "MyWidget(nuke.thisNode())" works just fine. You could potentially pass in more info if needed, but you should be able to get any info from the group node itself.
Hope you guys find this useful, cheers!
All the dynamic FX work was directed, designed, programmed, simulated, lit, rendered, and/or composited by me. While working on this show, I worked on ~10 shots a week, producing over 400 FX shots and dozens of FX rigs. While I'm not entirely happy with the style of the show, I'm very proud of the work I and my fellow FX artists achieved on such a fast schedule and low budget.
Now that the beta for Renderman 19 is over and I have access again (Yay, non-commercial licenses!), I can share some renders I've done with RIS. I got onto the beta last fall and was able to play with it, and while RIS is Renderman's answer to Arnold and VRay, it's quite a different beast. Let me start out by saying this was my first dip into Renderman, and I am by no means an expert. I've always stayed on the raytracing side of the raytrace / Reyes debate, and now that Renderman has a production-level raytracer, I couldn't wait to dive in. With that said, I do use Arnold, VRay, and MR and am very comfortable in those renderers, and I feel like I can give the new RIS features a fair shake.
Like Arnold, Pixar is going for the lowest common denominator with RIS: the fewer settings to tweak, the better. There are two basic sampling settings, and that's it: min/max samples and a pixel variance (threshold) setting. There are no light or shadow samples to set. No reflection / refraction / SSS / GI samples to set (full brute-force GI, mind you); Renderman does it all for you. Noise is taken care of by the max samples. This can be a great feature, as it's easy to get rendering almost immediately, but it can also take a lot of extra samples to clear noise out of small areas where I feel just a little tweak on a light or shader could make for a faster render.
*** UPDATE ***
As I've been playing with the full, non-beta release of the software, I have found ways to increase sampling on a per-light, GI, and shader basis; however, out of the box it's set up so you don't need to touch these controls.
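For reference, the whole sampling story lives in just a couple of lines of the RIB. This is an illustrative fragment only; the values are examples, not recommendations:

```
# Illustrative RIS sampling setup in RIB (Renderman 19).
Hider "raytrace" "int minsamples" [32] "int maxsamples" [1024]
PixelVariance 0.01
Integrator "PxrPathTracer" "pathtracer"
```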
This render is a great example of what I'm talking about. (This setup is very similar to the Arnold render I did of this model several months back, though rendered at 850x850.) The overall skin shader was smooth by 1024 samples (about 5-10 minutes of rendering), but I let the renderer go to about 2048 samples (20 minutes), and you can still see the flickering noise in the ears. I'd imagine another 10-15 minutes and it would be production quality. While Arnold rendered a similar image in ~3 minutes at a higher resolution, it's still hard to fault Renderman here. VRay and MR would probably be a bit faster, but they also have a much higher barrier to entry. While the render was slower than Arnold's, I really like the way this render looks compared to my old Arnold render. Perhaps I should go back and revisit it.
I do have to say, I like the shaders. The above render uses the PxrSkin shader. It operates much like equivalent SSS shaders, with fewer settings. The PxrDisney shader is the do-all workhorse for dielectrics. While I'm used to the shading models the other renderers have been using, the Disney shaders are probably more intuitive to use. I felt like I was able to get the basic look of my materials faster than I would with my typical MIA / Arnold / VRay_Mtl shaders. That being said, Disney and Pixar call these shaders "physical-like," so some attributes behave... interestingly. There are separate metallic and specular attributes, and when metallic is set to 100%, specular doesn't do anything. I think this is odd behavior, but once you understand it, it doesn't cause any issues.
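As a concrete (and hypothetical) example, a metal set up with PxrDisney in RIB might look something like this; note that with metallic at 1.0 the specular value is effectively ignored, as described above:

```
# Hypothetical PxrDisney metal; parameter values are illustrative.
Bxdf "PxrDisney" "goldish"
    "color baseColor" [0.9 0.6 0.2]
    "float metallic" [1.0]
    "float specular" [0.5]  # has no effect while metallic is 1.0
    "float roughness" [0.25]
```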
One benefit of RIS is the bi-directional path tracer. This is something you can't get in Arnold, and while I've heard Vlado talk about implementing it in VRay, it's not something I've seen in VRay 3. This rendering "integrator" shoots primary rays not just from the camera but from the lights as well, then blends any rays that fall within a certain radius and angle. This allows for beautiful caustics without any tweaking, and it can resolve renders faster around lights tucked into corners. While it's a really cool integrator, I don't imagine it would generally be as useful as the uni-directional path tracer due to the rendering overhead.
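If I understand the RIB side correctly, trying the bidirectional integrator is just a matter of swapping the Integrator line, e.g.:

```
# Swap the unidirectional path tracer for the bidirectional VCM integrator.
Integrator "PxrVCM" "vcm"
```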
While technically Arnold wins the rendering-speed battle above, and Arnold isn't much more complicated, there is something to be said for the price difference. Renderman is free for non-commercial use and is a tad under $600 for a license. As of this writing, Arnold is $1200 and requires you to buy 5 licenses at a time. It's clear who Renderman is targeting, and I think it's worth a good hard look if you're freelance or working on a small project.
More to come as I can now render with my free non-commercial license.
Had some fun with Realflow 2013.
This test was a combined rigid body dynamics / fluid particle simulation at 240 fps. I'm pretty happy with the results. However, at this speed I really needed to re-work the meshing, as some of the subtle changes in position gave different meshes that refract and reflect very differently, resulting in a popping effect. (Low quality on Vimeo for some reason.)
I originally started rendering this with the VRay 3.0 beta using irradiance mapping, and I was getting times between 3 and 6 minutes; however, I found some bugs when rendering the water, so I switched to brute force just for fun. Render times shot up to between 15 minutes and 3 hours, and I didn't want to be rendering this for 3 months. I finally switched over to Arnold, and with the same lighting and 6 GI bounces, I was able to render these frames in 3-4 minutes. Sorry V-Ray, but Arnold really kicked your butt in this instance.
And lastly, a pretty simple Hybrido 2 sim. This was a fun scene to test, as it took about 1 TB of space to cache roughly 250 million splash and foam particles.
I really don't know a good way to render out the splash and foam, as I don't have Krakatoa. I'd like to get better at rendering and compositing oceans, but I haven't really figured out how to render and comp the splash and foam particles yet. Here's a simple mesh render.
Here's another render in Arnold. I never really addressed interior renders, so I will in this post. This is the area where Arnold starts to stumble. In my opinion, it's still a very viable renderer, even for interiors, but since there is no caching GI option, renders take a while and GI samples need to be quite high. This render wasn't too bad, at 20 minutes a frame, 720p, with atmospherics.
There's a specific static sampling pattern in the atmosphere that I haven't been able to get rid of yet, despite making sure that lock sampling pattern is OFF. I've seen this on a few renders where I've used atmosphere. I've asked around the Arnold community but haven't come up with an answer quite yet.
One thing I do like is the natural blooming of the light around the windows, again thanks to the atmospherics. Sadly, there is no AOV for the atmosphere, but it can be subtracted out of the beauty easily enough.
I really thought Arnold was edging ahead, but I've been playing with the VRay 3.0 beta and think it's a solid answer to Arnold. Don't get me wrong, Arnold is a BRILLIANT tool, probably one of the best new pieces of software to come along in a long time (at least new to non-beta users), but VRay is no pushover, and I'm really liking some of its new tools as well. I'll add a new post on VRay 3.0 soon.
Here's a fun proof of concept I came up with for a cross hatch shading workflow. I'd love to try implementing it in a short at some point.
I got tired of command-line renders for VRay, so here's a script / exe file with a simple UI for rendering .vrscene files. Make sure VRay standalone is installed in its standard location, or that it's installed in Maya's standard location.
Just browse to the file and click render. Standalone will popup and render the frame(s).
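Under the hood it doesn't take much. Here's a minimal sketch of the launch logic; the exe path is a placeholder (point it at your own standalone install), and I'm assuming standalone's -sceneFile flag:

```python
# Build and launch a V-Ray standalone render for one .vrscene file.
import subprocess

# Placeholder: set this to wherever your vray.exe actually lives.
VRAY_EXE = r'C:\path\to\vray.exe'


def build_vray_command(scene_path, exe=VRAY_EXE):
    """Return the argument list for rendering a single .vrscene file."""
    return [exe, '-sceneFile=%s' % scene_path]


def render(scene_path):
    # Standalone pops up its own frame buffer and renders the frame(s).
    subprocess.call(build_vray_command(scene_path))
```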
There are currently no overrides in the script (v2.0 perhaps?), so be sure to setup the file correctly from Maya. Note that it looks for the Maya 2014 version of standalone. If anyone wants this to work with Max exports, let me know where the default VRay for Max install location is. (I'd need the Max installer default vray.exe location, as well as the MAX installer's standalone only vray.exe location.)
Downloads below. To use the .py file, you need to have PySide installed.
Here's some work I did for the Dishonored ad campaign. This is my render of the mask in the neutral lighting we did for the campaign. I should note that this model came with textures; however, I am responsible for the shaders and lighting, based on a concept from Blur Studios. All renders in VRay.
These three shots were for possible box art. The models and textures were all video-game resolution and were upres'd where needed. I scaled up and touched up textures, added textures where needed, and fully shaded and lit these scenes. Photoshop paint-overs were done by another artist. These posters were abandoned while still in progress, but I still like how they turned out.