Archive for rendering
Softimage, Arnold, ICE and LKfabric; an intensive small-team project @ Studio Royale. I believe this video is meant to play in Nike stores, inside a sculpture/model of the loom. If anyone sees it in one of the store sculptures, send me a pic; it sounds cool. :)
A discussion about Mudbox- and ZBrush-style shading arose on the Softimage mailing list. Their signature look comes from “MatCap” shaders (originally known as lit-spheres). It’s a popular way to achieve a custom lighting solution from a texture, in realtime, which is particularly useful when modeling: you can get a nice clay or Sculpey “look” on geometry in realtime. It’s also useful for creating nonphotorealistic (NPR) looks in realtime, such as toon shading.
As mentioned in an earlier post, the grey-ball shader in mental ray can render litsphere textures, and a user suggested that in the high quality viewport you can get the desired result by plugging the metaSL node “Map_ball” into the environment channel. The problem is that the result (on my machine, at least) appears in world space, while a proper litsphere should be in view space.
But it brought something important to my attention: almost all of the metaSL nodes used in Mental Mill are now accessible in the render tree and can be used similarly, meaning that for most intents and purposes all Softimage users now have Mental Mill. Which is awesome.
But we still needed a solution for MatCap functionality in the high quality viewport. So I bit the bullet and wrote a metaSL shader which seems to do the trick. It can be used both for realtime performance in the high quality viewport and for full renders in mental ray (and any other platform supporting metaSL).
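For reference, the essential logic a litsphere shader performs (transform the normal into view space, then do a 2D texture lookup) can be sketched outside any shading language. This is a minimal NumPy illustration of the technique; the function and parameter names are my own, not anything from the metaSL shader itself:

```python
import numpy as np

def litsphere_sample(normal_world, view_matrix, matcap):
    """Sample a litsphere (MatCap) texture for a world-space normal.

    The key detail from the post: the normal must be rotated into
    *view* space before the lookup, otherwise the shading sticks to
    the world instead of following the camera.

    normal_world: (3,) unit normal in world space
    view_matrix:  (3, 3) world-to-view rotation (camera orientation)
    matcap:       (H, W, 3) litsphere image as a float array
    """
    # Rotate the normal into view space and renormalize.
    n = view_matrix @ np.asarray(normal_world, dtype=float)
    n /= np.linalg.norm(n)
    # Map view-space n.xy from [-1, 1] into [0, 1] texture coordinates.
    u = n[0] * 0.5 + 0.5
    v = n[1] * 0.5 + 0.5
    h, w, _ = matcap.shape
    x = min(int(u * (w - 1)), w - 1)
    y = min(int((1.0 - v) * (h - 1)), h - 1)  # image rows run top-down
    return matcap[y, x]
```

A normal pointing straight at the camera lands in the center of the map, which is why litsphere textures look like a photographed sphere.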
Update: Daniel Brassard kindly fixed some bugs, the new version is now available below. Thanks Daniel!
Here’s the shader (MetaSL, ~2 KB): litsphere_v1_1
More examples of the shader:
Been playing with Vray. I like it. This is an “out of the box” render of some strands, with no effort on my part to properly light it, which really is a requirement for fine strands/hair. Still, not bad; they have a nice hair shader. I’m less interested in character hair than in fine-detail fibers as VFX elements, so I’ll have to see what other looks I can coax out.
This number of strands doesn’t even come close to taxing Softimage; it’s still interactive on my box. But rendering it at 2K took about 20 minutes, without any optimization. I’d guess that with some effort adjusting settings I could get a better look at about 2x the speed, which isn’t bad, considering this is a GI render.
It’s a bit splotchy and the exposure is meh; chalk that up to me being a Vray noob. Looks like I was a bit too aggressive with the JPEG compression, too. The full render is much cleaner, and of course this is just a beauty pass, with no additional passes or color correction. A decent lighter could no doubt take this to another level. Hey, I’m a VFX guy; I hardly ever get to render stuff.
I’d like to get a feel for how much impact more strands will have on render time. I may try a “stress test” over the weekend to find the reasonable limit for a single machine: how many strands I can get into a 2K half-float render while still maintaining this look as a minimum quality.
In an earlier post I discussed and shared a litsphere (or MatCap) shader for Maya (which also works in Softimage). Back when I made one for a place I worked, I also made this artist’s guide for coworkers, which I just found while sorting through my hard drive.
The shader available here is not the one I made (that one is no longer mine to share) but one in the public domain; happily, this little guide still applies.
So here it is…
There are quite a few litsphere maps out there; do a little googling and you’ll find plenty. They are also easy to make in Photoshop. If you want to simulate lighting from a render, you can place your advanced/non-realtime material on a sphere, render it out, and use that as your litsphere. This is great for previs and layout: you can see in your viewport a realtime approximation of your shaders. And since it works out of the box, it’s a useful trick for studios which don’t have the resources for a more advanced realtime visualization system.
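The “render your material on a sphere” idea can also be done procedurally. As a toy stand-in for that baking step, here is a NumPy sketch that shades a head-on unit sphere with simple Lambert lighting and writes it into a square image; any shading model could be substituted, and the names here are mine, not from any particular tool:

```python
import numpy as np

def bake_litsphere(size=256, light_dir=(0.4, 0.6, 0.7)):
    """Bake a simple grayscale litsphere map: Lambert shading of a
    unit sphere seen head-on, one normal per pixel. Pixels outside
    the sphere are left black.
    """
    l = np.asarray(light_dir, dtype=float)
    l /= np.linalg.norm(l)
    ys, xs = np.mgrid[0:size, 0:size]
    # Map pixel coordinates to [-1, 1]; y flipped so +y is up.
    x = xs / (size - 1) * 2.0 - 1.0
    y = 1.0 - ys / (size - 1) * 2.0
    r2 = x * x + y * y
    inside = r2 <= 1.0
    # Reconstruct the view-space z of the sphere surface normal.
    z = np.sqrt(np.clip(1.0 - r2, 0.0, 1.0))
    # Lambert term: clamped dot of normal against the light direction.
    ndotl = np.clip(x * l[0] + y * l[1] + z * l[2], 0.0, 1.0)
    return np.where(inside, ndotl, 0.0)  # (size, size) float image
```

Save the result with your image library of choice and it plugs straight into any litsphere-capable shader.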
Your results will vary depending on your graphics card. Softimage users: I’m not much of a shader guy, but it shouldn’t be too difficult to set up a good solution for the high quality viewport (which, by the way, hasn’t gotten the attention it deserves; sure, it needs work, but it’s a huge step in the right direction). If I get a chance I’ll see what I can come up with; it would be nice to be able to model in a viewport with shading that looks similar to ZBrush. I’d also like a simple solution for litsphere shading in Arnold and Vray. Any shader gurus out there who are interested in the idea, contact me and I’ll share what I have (for what it’s worth).
This isn’t an earth-shattering, ground-breaking post, just a simple example of a workaround which came up on the SItoA mailing list…
In mental ray, rendering out a copy of a Softimage ICE particle’s shape per strand point is easy: just disable the “stranddeform” attribute (when using “generate strand trails,” it’s the “loft shape along strand” checkbox). Instead of rendering a continuous, lofted result, the strand is then depicted as a series of particle shapes or instances. Easy enough, and it’s described as a basic workflow in the Softimage documentation.
But when using Arnold as the renderer, particle shapes are lofted no matter what. So how do you get a result where shapes are repeated along a strand? This is the question that came up on the mailing list, and Stefano Jannuzzo and Mathieu Leclaire posted a simple, direct solution: clone (or add manually) a point for each strandPosition on a new cloud. When adding points instead of cloning, you also need to carry over relevant information such as orientation and velocity.
This is a pretty common technique for this sort of thing, and it’s worth putting on the record where people can search for it, so I’ve implemented their suggestion in a compound and example scene here. Enjoy.
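An ICE compound isn’t something you can paste into a blog post, but the data reshuffling the workaround performs (one new point per strand sample, with attributes carried along) can be sketched in plain NumPy. The function and parameter names below are mine, not ICE attribute names:

```python
import numpy as np

def strands_to_points(strand_positions, strand_velocities):
    """Flatten per-strand sample arrays into a single point cloud:
    one point per strand position, with velocity carried over and a
    per-point strand index kept for reference. The same pattern
    applies to orientation, size, or any other per-sample attribute.

    strand_positions:  list of (N_i, 3) arrays, one per strand
    strand_velocities: list of (N_i, 3) arrays with matching shapes
    """
    positions = np.concatenate(
        [np.asarray(p, dtype=float) for p in strand_positions])
    velocities = np.concatenate(
        [np.asarray(v, dtype=float) for v in strand_velocities])
    # Remember which strand each point came from.
    strand_ids = np.concatenate(
        [np.full(len(p), i) for i, p in enumerate(strand_positions)])
    assert positions.shape == velocities.shape
    return positions, velocities, strand_ids
```

Feeding the flattened positions to a new cloud and instancing a shape per point gives the repeated-shape look that Arnold’s forced lofting otherwise hides.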
Just loosening up… I had created a couple of forces in ICE, so I rendered out a galaxy using emRPC. I’d never had call to make any space environments as part of work, so I thought I’d spend 20 minutes and give it a go.
Ok, it’s not going to win awards. But it was fast and fun to make. :)
As long as we are talking about Exocortex, they just posted this exciting preview of the next version of their point rendering tool Fury.
For those of you wondering why this is important, it’s simple enough: Fury is fast. Really, really fast. And it was written by Ben Houston, the original author of Krakatoa, a tool of choice for rendering particles. Softimage, Max and Maya users alike can move their simulations to ICE (or create them in ICE directly) and partake in the Fury awesomeness.
LOOK at it: 1 million points, self-shadowing and cast shadows, at one second per frame.
“The major new features in Fury 2.0:
* GPU-accelerated particle self-shadowing
* Shadow maps
* Built-in compositing previewing
* Command line renderer support
* Synchronized Softimage and Maya support
In this example, 1 million points are lit and rendered in about 1 second per frame and the shadow map is also created at the same time. Motion blur and DOF do not slow down rendering time.
The simulation in this example is from an alpha version of SlipstreamVX 2.0 and thus the smoke motion isn’t quite perfect in this video.”
If you are already an experienced shader writer, Mental Mill doesn’t have much to offer; auto-generated code won’t appeal. But for TDs who lack the experience or time needed to get a shader going, it’s a huge boon. It’s also a way to get a feel for shader code, letting you experiment and see how the code changes as a concept changes.
What’s important for everyone to know about Mental Mill is that the same “tree” can be used to generate code for multiple rendering targets: with a few caveats, you can create a shader for mental ray, a matching realtime shader, and a RenderMan-compliant shader all in one go (ummm, Arnold? Oh well). For Softimage users, this means custom solutions you can see both in the viewport and at render time. Here’s a video which covers creation of a realtime shader, for instance…
StudioGPU’s realtime renderer Mach Studio Version 2 is now available as a free download, with exporters for Max, Maya and Softimage. Hopefully this reflects a change in marketing strategy and not a discontinuation of development, as the product was beginning to mature nicely; either way, it’s a powerful tool at a great price, well worth the download. My little tests on this blog have barely scratched the surface of this renderer. It’s not a replacement for all rendering, but if you need blazingly fast render times measured in seconds and minutes instead of hours and days, while maintaining a certain baseline quality (which with skill can rival mental ray and Arnold), it’s the only game in town.
I don’t know the folks at Meindbender, but I’d like to. What gorgeous work! They really show off what the Maxwell renderer is capable of, and the artwork is top-notch.