Just another test, rotating particles around the average neighbor point position.
Over time people have been asking me to share the DLA compound I wrote waaay back. So here it is.
The compound can be found in the downloads section.
Diffusion limited aggregation is a natural phenomenon in which particles in solution build up over time – as particles are deposited they limit the locations where subsequent particles can be deposited, resulting in pseudo-random growth of deposits with characteristic nodules and ‘fjords’. You can see examples of DLA systems all over nature, from veins of copper to urban growth patterns. In this simple ICE implementation particles are emitted from geometry and then pulled towards a “seed” point, namely the point cloud the compound is applied to. If there are no existing points in the cloud, the first will be deposited at the origin. This causes DLA structures to build towards the provided surface and then to spread across it or inside its volume. The compound has a turbulence node which adds a random tropism, as well as a vector input for user-directed tropisms. Please be aware this was never intended as more than an experiment: it works fine but is a pretty simple implementation.
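If you want to see the aggregation rule itself in isolation, here is a minimal on-lattice 2D sketch in plain Python. It is not the ICE compound – no emission geometry, no seed surface, no tropism inputs – just the core DLA idea: walkers wander randomly and stick the moment they touch an already-deposited point, starting from a single point at the origin. All sizes and counts here are arbitrary.

```python
import math
import random

def grow_dla(n_particles=100, seed=1):
    """Grow a DLA cluster of n_particles points around the origin.

    Returns a set of (x, y) integer lattice cells.
    """
    random.seed(seed)
    deposited = {(0, 0)}                 # first point at the origin
    max_r = 0.0                          # current cluster radius
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(n_particles):
        while True:
            # launch each walker on a circle just outside the cluster
            ang = random.uniform(0.0, 2.0 * math.pi)
            r = max_r + 2.0
            x, y = round(r * math.cos(ang)), round(r * math.sin(ang))
            while True:
                if any((x + dx, y + dy) in deposited for dx, dy in steps):
                    deposited.add((x, y))     # stick next to the cluster
                    max_r = max(max_r, math.hypot(x, y))
                    stuck = True
                    break
                dx, dy = random.choice(steps)  # otherwise keep wandering
                x, y = x + dx, y + dy
                if math.hypot(x, y) > 2.0 * r + 5.0:
                    stuck = False              # wandered off: relaunch
                    break
            if stuck:
                break
    return deposited
```

Because walkers deposit only when adjacent to existing points, early deposits shadow the gaps behind them, which is exactly what produces the nodules and fjords.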
To use: apply the compound to a point cloud (typically with a single point) in a simulated ICE tree…
Since I’ve been putting MSpro through its paces today, I decided to see how it did with a larger data set. So I fed it a reasonably large model of Manhattan. Mach Studio did great: export/import was fast, the machine was responsive, and even hitting it with realtime ambient occlusion, shadows and depth of field didn’t get the scene to a point where it chugged. Rendering this scene with similar settings in Mental Ray would have meant significant time per frame, certainly enough to preclude much in the way of immediate feedback. MSpro was fine onscreen, and rendering 5k resolution images with settings turned up relatively high still resulted in render times under half a second a frame.
Not high art, but we’re talking test images, people. I rather enjoy the SimCity look of the orthographic one… Hey, it looks like I nudged the ground mesh in one of those images. That would be a serious drag with a traditional render process; you’d have to submit it to the farm again. Realtime, it’s no big deal. Of course, there’s a lot of room for improvement in these images… the majority are not issues with the renderer but with the artist (errr, me), who for some reason didn’t want to spend all night obsessing over test images. You can speed up render times, but there’s still tweaking galore. At least now you can tweak interactively.
Here’s the result of half an hour (probably less) of fiddling around with Mach Studio Pro and Photoshop, to see how fast I could produce an acceptable concept image.
I exported the buildings from Maya, which took a few minutes, slapped on some materials, and threw in an environment light and a spot light. I rendered a frame (which took 0.13 seconds) and took it into Photoshop, where again I just threw some elements together – a sketch filter blended with the CG, some vignetting and color manipulation. It’s not going to win any awards, but that’s not the point… I was able to get this image out from start to finish incredibly fast, and that’s despite my being a novice with MSpro. Very promising…
Did some more work on a basic realtime toon shader for XSI (and Maya; it’s CGFX). Here’s a sample…
Functionality is pretty basic at the moment: ink threshold, two levels of paint, a hard spec highlight, overall color control, and a single point light source. I still need to introduce diffuse and spec mapping, possibly some reflections and bump/normal. Remember, if you see an agent, run.
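For the curious, the shading logic behind those features boils down to a few hard thresholds. Here is a schematic sketch in Python (not the CGFX source; parameter names are mine): diffuse is quantized into two flat paint levels, a hard highlight kicks in above a specular threshold, and ink is drawn where the surface turns away from the viewer.

```python
def toon_shade(n_dot_l, n_dot_v, n_dot_h,
               ink_threshold=0.2, paint_break=0.5, spec_threshold=0.95,
               dark=0.3, light=1.0):
    """Return one intensity value for a shading sample.

    n_dot_l: dot(normal, light direction)
    n_dot_v: dot(normal, view direction)  -- drives the ink outline
    n_dot_h: dot(normal, half vector)     -- drives the hard highlight
    """
    if n_dot_v < ink_threshold:      # silhouette edge -> ink
        return 0.0
    if n_dot_h > spec_threshold:     # hard specular highlight
        return 1.5
    # two flat paint levels instead of a smooth diffuse ramp
    return light if n_dot_l > paint_break else dark
```

In the actual shader the same comparisons run per-pixel and the result multiplies the overall color control.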
A lot of people have been using an ICE compound I made a while back, “Emit instance matched to SRTs.” I made it during a game cinematic project to allow me to animate pre-scored geometry for explosions.
Since I’ve received so much feedback (and not a few questions) I’ve uploaded some basic example scenes illustrating a couple of ways the compound is meant to be used.
I’ve been playing around with CGFX realtime shaders and mental mill. Here are some basic first shaders. Mental mill makes creating this kind of stuff pretty fast, and you can edit the generated code by hand once you’re in the ballpark.
I have been testing Mach Studio Pro for PLF for a while now, and while it’s a very new tool I am pretty pleased with its capabilities and potential. MSpro is an application which accepts scenes from most 3D packages (including Max, Maya, XSI, SketchUp, etc.) and enables the artist to shade/light/render in realtime. Quality is high, with renders competitive with (and sometimes mistaken for) mental ray/V-Ray output in many cases. Being a realtime application there are caveats and limitations, of course: raytracing is not (yet) supported, nor are true radiosity/GI effects. Fair enough. And you’ll still need to render out VFX passes like particles and volumetrics in another app.
But most CG isn’t about all that – it’s about the basics, and that’s where MSpro shines: the 90% of rendering work that you can now do in seconds rather than hours. It’s very liberating being able to light shots with immediate visual feedback, and MSpro was written with a fair eye towards being a production-friendly application, with Python scripting, linear lighting and an HDRI workflow, output to OpenEXR as pass breakdowns, etc.
This is clearly the direction the industry is moving and Mach Studio is not without competitors, but as a just-out-of-the-gate package they are off to a great start. And don’t get me wrong, MSpro isn’t just about the bare minimums… realtime microtesselated displacement maps aren’t basic, and realtime AO and SSS go a long way towards giving you the tools you need to create great imagery. In real time. No more waiting on farms. No more unpleasant surprises a day later.
Currently ICE has no provisions for creating or destroying faces or edges, which means we can’t build compounds directly in ICE that, for instance, shatter objects. So what do you do if you need a “blastcode”-type destruction effect in XSI? One solution is to approach the problem with XSI’s rigid bodies, but they have their own issues, and it’s frustrating not to be able to leverage some of ICE’s power for this kind of thing.
The good news is that if you can create fragments outside of ICE you can use ICE instancing to go from pre-fragmented instance masters to ICE particles. For instance, build a brick wall out of individual bricks, instance them in ICE with the same SRTs, and do a rigid body simulation in ICE. Or in the examples below, use ICE to animate geometry fragments or polygons.
To achieve these effects I’ve written a script which extracts selected polys from a model and groups them as needed, and a compound which seamlessly takes a group of instance masters and creates identically placed ICE instances ready for simulation.
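The core idea of the compound is simple enough to sketch outside of ICE. Here is a schematic pure-Python version (names are illustrative, not the Softimage API): for each pre-fragmented instance master we emit one particle that copies that master’s scale, rotation and translation, so the ICE instances initially appear exactly where the masters sit, ready to be simulated.

```python
from dataclasses import dataclass

@dataclass
class Particle:
    position: tuple       # T: where the instance is placed
    orientation: tuple    # R: Euler angles here, for simplicity
    scale: tuple          # S
    shape_index: int      # which instance master this particle shows

def emit_matched(masters):
    """masters: list of (scale, rotation, translation) triples,
    one per fragment. Returns one particle per master with an
    identical SRT, so nothing visibly moves at emission time."""
    return [Particle(position=t, orientation=r, scale=s, shape_index=i)
            for i, (s, r, t) in enumerate(masters)]

# e.g. two bricks in a wall, the second offset along X:
wall = [((1, 1, 1), (0, 0, 0), (0.0, 0, 0)),
        ((1, 1, 1), (0, 0, 0), (2.1, 0, 0))]
particles = emit_matched(wall)
```

From there, any ICE simulation logic (forces, rigid bodies, turbulence) can drive the particles, and the fragments follow.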