
Higgs Bosons and why you should care.

I’ll get back to some CGI related posts soon… But in the meantime, what’s all this about the Higgs Boson, and why should we care? I mean, the math seems impossibly deep, the experiments staggeringly expensive, and then when we ask physicists to explain how all this matters to us in our daily lives they mumble stuff about the beginning of the universe and dark matter.

I know, it’s annoying, right?

Instead of asking physicists, who understandably are up to their waders in some pretty deep stuff, we should ask some engineers why we should care. The answer becomes something like this:

“If you want to see a day where perhaps things like pod racers or gravity skateboards are a reality, this matters. This is an important step which will help us know if such seemingly fanciful devices are even possible.”

Suddenly, the staggering costs and daunting math make a lot more sense. All this is about advancing knowledge that ultimately lends humanity more capability to DO stuff, from feeding ourselves to making iPads.

This video does a pretty good job of putting the “discovery” into some context:

[youtube]http://www.youtube.com/watch?v=9Uh5mTxRQcg[/youtube]

Amazing low Earth orbit images from the frontier

These timelapses from the ISS are amazing. The “visual effects” real life has to offer are incredible; all we have to do is keep exploring…

[vimeo]http://vimeo.com/38409143[/vimeo]

Auroras, stars, space stations, city lights, atmosphere… it’s pretty jaw-dropping.

Mental Mill and Softimage

If you are already an experienced shader writer, Mental Mill doesn’t have much to offer; auto-generated code won’t appeal. But for TDs who don’t have the experience or time needed to get a shader going, it’s a huge boon. It’s also a way to get a feel for shader code, letting you experiment and watch how the generated code changes as the shader graph changes.

Here’s the Mental Mill blog, with some info for Softimage users.

What’s important for everyone to know about Mental Mill is that the same “tree” can be used to generate code for multiple rendering targets. With a few caveats, you can create a shader for Mental Ray, a matching realtime shader, and a RenderMan-compliant shader all in one go (ummm, Arnold? Oh well). For Softimage users, this can give you custom solutions you can see both in the viewport and at render time. Here’s a video which covers creation of a realtime shader, for instance…

[youtube]http://www.youtube.com/watch?v=E3biGBm2mkM&feature=youtube_gdata_player[/youtube]

Gorillaz and Miku

Who can resist a CG character performing onstage? Not these people. I dig the leek-shaped lightsticks. (For the impatient, Miku makes her stage appearance at about 5:25 in this video.)

The mocap performance is nice, I guess, but what really interests me about Hatsune Miku is that her voice is synthesized, not recorded. It would have been cooler if she were puppeteered in real time, à la Henson’s Creature Shop…

Update – no video available anymore, it was removed from youtube. Bah.