Working in VR

It’s been a while since my last post, mainly because I’ve been busy at Meta. Working in VR on a daily basis has been fascinating. Meetings in VR, working in a virtual office with a Quest Pro, and (now that the Pro can be seen in public) traveling with that same virtual office.

There are a lot of things I can’t share but one thing which strikes me is the sheer speed at which VR is becoming a transformative tool for all kinds of business. Interestingly enough, the “standard” business use cases are emerging more rapidly than many of the creative scenarios.

In hindsight this makes sense… You can already get a lot of work done with 2d applications on laptops and mobile devices, while creative work often requires high-end physical setups with resolution and color fidelity not yet available in VR.

What VR is bringing to the table is that you can now use your “standard” workplace applications in virtual spaces… which you can take with you.

By this I mean your workspace is not limited to your portable screen any more. You can take along your whole office (albeit a virtual one), with arrays of huge monitors around you to multitask as you prefer. And already you have increasing control of what the rest of the office around you looks like… a fireplace nearby, or a night sky and aurora above you… a music system, knickknacks on a shelf, photos of a loved one, a working clock on your desk, and so on.

The point is your virtual space can exist wherever you do. In a hotel room, airport terminal, or the cramped confines of airline seating (where the ability to be anywhere other than 9 inches away from the seat in front of you becomes a godsend.) Tired of the current layout or your view? You can change it around whenever you choose. Need an additional monitor? Just spawn another one.

With mixed reality, you can increasingly blend the real world around you with the virtual. You can see your real hands typing on a virtual keyboard. You can remain in VR but confidently move between your couch and your desk. Don’t worry, if your pet walks into your space you can see it, too.

And you can collaborate with your coworkers in your virtual office space, regardless of where you are. I do the majority of my work from my home and meet in VR a lot. I tend to prefer it to Zoom meetings.

Sure, you are in a somewhat cartoony place with somewhat cartoony coworkers and self. Surprisingly that really doesn’t matter – you get used to it and even better you don’t have bad hair days or need to worry about what your physical surroundings may look like.

The sense of presence makes a big difference. The nature of discussions is impacted I think, and in a positive way. For better or worse, you are in the meeting. And I find the additional engagement this sense of presence delivers causes me to participate and remember the meeting more.

This sense of additional engagement from VR presence has been observed in a number of studies, which have measured a considerable increase in participants’ retention of information in VR versus videoconferencing. And unlike physical real-world meetings, participants can be scattered across the planet.

The first time I went to a real-world meeting with my team at Meta I was immediately comfortable and productive with my teammates – despite never having actually been in the same room with them before. I knew them already, from VR. Their body language was familiar, their whole presence was known to me already, the fact that we were physically in the same room for the first time was unremarkable.

That’s not to say face-to-face communications are going by the wayside, not at all. Businesses are discovering that it’s critical to make sure closely partnered coworkers get some “real world” time. VR cannot replace face-to-face interactions entirely. But it can make remotely dispersed teams more comfortable and productive.

From any physical location you can all group up. You can read each other’s expressions and body language and feel their proximity. It doesn’t matter if you don’t look your best because your avatar looks fine. Meeting in VR gives you the immediacy of dropping into a videoconference with the familiarity (and retention) of a face-to-face meeting.

I can tap my headset and see the “real” world around me if I need to. My passthrough image on the Quest Pro is accurate and fully stereoscopic so I can comfortably reach out to grab a cup of hot coffee from a flight attendant, or pet my dog on the head in my home office… before jumping back “into” my VR space.

In a VR workspace I can mute myself but remain in the virtual room. I can see my laptop screen or project it onto the VR whiteboard so everyone can see it. I can draw on the whiteboard from my desk, point at it with a “laser pointer,” or get up in front of it to present to the others. Attendees who call in from “2d” (aka video participants using their cellphones or webcams) are there too.

And like the internet once did, this is going to change everything.

It’s transformative. This is why companies are purchasing headsets designed specifically for this kind of working, and why companies like Microsoft and Accenture are moving to bring their tools and workforces into the VR space.

Like the internet when it was new, you’re going to have to experience it for a while before the benefits of this new kind of medium really sink in. The tools and workflows are going to change and improve. Offices will get rid of the miles of cubicle space that exist today and instead emphasize conference spaces and comfortable temporary locations for a more mobile and dispersed workforce.

Home workspaces will become more important, while costs once applied towards physical office infrastructure will be reduced to a degree and applied towards equipping a more remote workforce, and supporting periodic meetups.

This isn’t going away: you are going to experience it (if you haven’t already.) And in time literally everyone will. It’s that transformative.

Like the early internet and the web, smartphones, DVDs, streaming, and online shopping, this is inevitable, and it’s coming soon.

The first years are going to be marked by frustrations and constant change, but that’s part of the fun. Fortunes will be made and lost, new markets will emerge, and surprises are going to happen every step of the way… Enjoy the ride!

Procedural Glyphs, another method

I thought I was done with the procedural glyphs project but it occurred to me there was another approach I could take, and I had to give it a try. The results are a bit closer to feeling handwritten. What would be really interesting would be to develop a continuous cursive-like script with some form of punctuation.

Update: Cursive writing GET.

Procedural Hatching

I’ve been using Shadershop as a tool to explore useful procedural functions. In this case I needed procedural hatching. A square wave was quick to define:
  In this case, all that we need to do is call a couple of floor functions. Frequency, phase and amplitude are easy enough to adjust after the fact just as you would any other periodic function. (A simpler method to create a square wave is to use a rounding function, but I wanted to see where Shadershop’s approach took me.)
  I then extrapolated this function into a vector2, skewed it with a 2×2 matrix, and voila. Shadershop makes figuring this kind of thing out much easier by using a visual workflow and graphing your function as you go. Very handy!
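  If it helps to see the same idea outside of a shader, here’s a quick Python sketch of a floor-based square wave with adjustable frequency, phase and amplitude. This is my own port of the general technique, not Shadershop’s exact output:

```python
import math

def square_wave(x, frequency=1.0, phase=0.0, amplitude=1.0):
    """Floor-based square wave with a 50% duty cycle: 0 for the first
    half of each period, 'amplitude' for the second half."""
    t = x * frequency + phase
    frac = t - math.floor(t)          # fractional part of t, in [0, 1)
    # floor(frac + 0.5) is 0 while frac < 0.5 and 1 afterwards
    return amplitude * math.floor(frac + 0.5)
```

  Frequency, phase and amplitude behave just as they would for a sine wave, so swapping one periodic function for another later is painless.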

  You can do something similar with a 1d random function fract(sin(x)*999) to achieve a periodic 2d noise pattern.
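  For reference, here’s that hash written out in plain Python, along with a 2d variant using the classic GLSL one-liner constants (12.9898, 78.233, 43758.5453) — those constants are folklore from shader land, not anything specific to this project:

```python
import math

def rand1d(x):
    """Shader-style hash: fract(sin(x) * 999.0). Deterministic for a
    given x, but jumps around enough to read as noise."""
    v = math.sin(x) * 999.0
    return v - math.floor(v)          # fract()

def rand2d(x, y):
    """2d extension: hash a dot product of the coordinates with two
    arbitrary constants, then fract() a large multiple of its sine."""
    v = math.sin(x * 12.9898 + y * 78.233) * 43758.5453
    return v - math.floor(v)
```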

  A simple crosshatch shader in Unity can be made by adding two of the above square wave functions. In this case I used min and max functions to change the “pulse length” of the square wave. You can get a similar look by using the more common approach of establishing a sine wave which you then clip/threshold. I’m not sure which method would be faster… Here’s the basic shader:

Shader "Unlit/Test"
{
    Properties
    {
        _Density ("Density", Range(1,400)) = 150
        _Rotation ("Rotation", Range(0,360)) = 45.0
        _Width ("Width", Range(0,1)) = 0.4
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            float _Density;
            float _Rotation;
            float _Width;

            v2f vert (float4 pos : POSITION, float2 uv : TEXCOORD0)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(pos);
                o.uv = uv * _Density;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // rotate the UVs to angle the hatching
                float sn, cs;
                sincos(radians(_Rotation), sn, cs);
                float2x2 mt = float2x2(cs, -sn, sn, cs);
                float2 c = i.uv;
                c.xy = mul(c.xy, mt);
                // one floor-based square wave per axis, combined via max/min
                float hatch = max(abs(floor((c.x/2 - 0.5)/-1)
                              + floor(c.x/2)+1),
                              abs(floor(((c.x-_Width)/2 - 0.5)/-1)
                              + floor((c.x-_Width)/2)+1))
                              - min(abs(floor((c.y/2 - 0.5)/-1)
                              + floor(c.y/2)+1),
                              abs(floor(((c.y-_Width)/2 - 0.5)/-1)
                              + floor((c.y-_Width)/2)+1));
                return hatch;
            }
            ENDCG
        }
    }
}

  The problem with this kind of approach is that it’s overly complex. All of that min/max/floor stuff can be replaced with a much shorter step function. Same look, much more direct.

  float sn, cs;
  sincos(radians(_Rotation), sn, cs);
  float2x2 mt = float2x2(cs, -sn, sn, cs); 
  float2 c = i.uv;
  c.xy = mul ( c.xy, mt );
  float hatch = abs(step(1-_Width,sin(c.x)))
              + abs(step(1-_Width,sin(c.y)));
  return hatch;

  I added some lighting to the shader and here’s the result. Procedural and written entirely in Notepad!

Contribution to PopcornFX 2.10 Now Released

The PopcornFX team honored me recently by including content I’ve created as part of the official 2.10 release. As they work towards a version in which user content can be uploaded and shared freely, the team approached me knowing I had a pack of templates and example projects I was intending to share out to other users. I was flattered and pleased to share this content out via an official release like this.

It’s worth noting that PopcornFX is free to download and use for personal learning, as is this content pack. The content I’ve contributed can be downloaded directly from the editor and is among the “official” learning content and examples.

By including this content in the 2.10 release they have a demonstration and first test of how new nodes and files can be created by a third party and shared among users.

The content is a collection of 31 “templates” or custom nodes I originally created for myself as I worked with PKFX. The nodes are documented and tested and vary widely in both their usage and usefulness. A few are so helpful I use them in practically everything I do now, while others are very specific in their use cases. 

In addition there is a small suite of “example” files showing various effects, each cleanly presented, commented and documented. 

I tend to clean and document files for myself as part of my creation process so it didn’t take much effort to prep these files for the public. Hopefully they will spark interest and prove useful as examples of one user’s workflow, in tasks both simple and advanced.

Read more for descriptions of each of the nodes included:

New nodes

Random Spherical Vector
Produces a pure random vector from a unit sphere.
Random Value In Cone
Generates a random vector in a conical arc with the center by default oriented in the positive Y axis.
Test If Points Intersect Plane
Tests point position to determine if it intersects an infinite plane.
Spring Force
Plug into acceleration. Updated position is the particle position on current frame. Rest position is the point the spring seeks to reach. Spring constant is a measure of the “tightness” of the spring.
Wander Around Position
Randomly moves offset around center point position. Two methods available, with their own strengths and weaknesses. Keep seed values low.
Normalized ID
Applies a unique ID ranging from 0 to 1 for all spawned particles. Only works when there is a total (fixed) number of particles spawned.
Time Based Rand
Instead of a unique value per-point this random function delivers a single random value at spawn or per frame.
Animated Noise
Noise function animated over the effect age.
Linear Probability
Produces a boolean and optional values based on a random probability between 0 and 100.
Spiral Trails
When used in a “trail” layer this template causes the resulting trail to be a spiral around the parent particle’s path.
Spin on Axis
Animates an incoming vector (position) to rotate around axis over time.
Point Visibility By Facing
Outputs a 0 or 1 based on whether a point is on the front, back, or edge of the sphere, as judged by the point normal.
Sparkle Color
Flickers colors over the particle’s lifespan.
Flicker Value
Flips between two values based on a random noise.
Position On Circle
Outputs a vector which lies on a circle defined by input values.
Oscillate between two values. User can choose between several waveforms via a dropdown on the node. Speed is in seconds.
4 Pole Switch
Choose between 4 inputs via a dropdown list on node.
3 Pole Switch
Choose between 3 inputs via a dropdown list on node.
2 Pole Switch
Choose between 2 inputs via a dropdown list on node.
Value by Distance To Camera
Sets a range of values based on each point’s distance to camera.
Kill by Camera Distance
Deletes particles closer or further than a set distance from the camera. 
Identify Single Point
“Tags” a single point or range of points and returns true when graph evaluates them. Use with billboard “enable” to render a single point.
Newtons Universal Gravitation Force
Simple implementation of universal gravitation. Plug into acceleration. Position should be discretized (updated per frame.) Values are arbitrary, not real-world.
Set Circular Orbital Velocity
Calculates velocity needed to place point into a circular orbit when using a universal gravitation force. Normal vector is typically the effect.up vector.
Position On View Axis
Outputs a position along the axis of the camera into the scene, with distance 0 being the camera position. Useful for defining focal planes etc. 
Color Turbulence
Defines a color based on the location of a point within a vector field.
Spatial Basis Rand
Returns random values based on position in a vector field. Must have a “turbulence” node added to sampler input.
Generate Ring
Distributes all spawned points in a ring or arc. Requires a static (total) number of points.
Generate Cyber Rings
Distributes spawned points into a series of circular arcs. Total spawned points must be static, though they can be animated in the layer.
Generate Line
Distributes spawned points on a line between two input locations. Unlike the debug “draw line” this line is made up of spawned points which can be further animated. Requires a static (total) number of points spawned. 
Random Static
Useful for “glitch” effects. Offsets positions by a random value at random intervals. Spatial mode requires a turbulence sample.
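For the curious, here’s roughly what a node like Spring Force boils down to. This is a minimal Python sketch of Hooke’s law with optional damping, plus a tiny integration loop — my own simplification for illustration, not the actual PopcornFX graph:

```python
def spring_acceleration(position, rest_position, spring_constant,
                        velocity=0.0, damping=0.0):
    """Hooke's law: acceleration pulls the point back toward the rest
    position, scaled by the spring constant ("tightness"); the optional
    damping term resists velocity so the spring settles."""
    return -spring_constant * (position - rest_position) - damping * velocity

def simulate(position, rest, k=8.0, damping=2.0, dt=1.0 / 60.0, steps=600):
    """Step a 1d point toward 'rest' with semi-implicit Euler, the same
    update-per-frame pattern the node expects (discretized position)."""
    velocity = 0.0
    for _ in range(steps):
        accel = spring_acceleration(position, rest, k, velocity, damping)
        velocity += accel * dt
        position += velocity * dt
    return position
```

With damping, the point overshoots, oscillates, and settles at the rest position, which is exactly the behavior you want when plugging this into a particle’s acceleration.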


Creating Bezier Curve Nodes in PopcornFX

While making a demo scene for PopcornFX I found a need to make individual particles move on arcs where I could choose a start and end position, height of the arc, and overall placement in 3d space. Depending on how you approach the problem you can easily find yourself slogging through some pretty involved math, at least for the level I’m at. An “arc” can be a parabola, or a section of a circle or so on. A few directions can take you into the woods pretty quickly. I wanted simple.

The simplest approach I’m aware of is to construct a quadratic bezier curve, since it is defined by a start point, an end point and a single control point, and can easily be plotted out using a few LERP functions. That control point shapes the curve in a very intuitive way, and if you restrict it to sit on the perpendicular through the midpoint of the line between your start and end points, you get a beautiful symmetric parabolic arc in a very intuitive fashion.
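The LERP construction is short enough to show in full. Here’s a Python sketch of the quadratic version (function names are my own, the PopcornFX template wires up the same three LERPs as nodes):

```python
def lerp(a, b, t):
    """Standard linear interpolation between a and b."""
    return a + (b - a) * t

def quadratic_bezier(p0, control, p2, t):
    """Two levels of LERPs: slide toward the control point from each
    end, then LERP between those two moving points."""
    a = lerp(p0, control, t)
    b = lerp(control, p2, t)
    return lerp(a, b, t)
```

Run per component (x, y, z) with t going from 0 to 1 and you get the arc.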

Once I had decided on a quadratic bezier, creating a template in PopcornFX was quick, and it wasn’t long before I had a very useful new node in my toolbox. To extend the idea a little further I decided to take a little extra time and make another node for a cubic bezier curve, which is a logical extension of the same approach.

A cubic bezier curve is defined by a start point and an end point like before, only this time each of those two points has its own separate control point, describing a 3d curve. The resulting path can now diverge from a single plane but is still simple enough for the results to be intuitive. In cases where a quick spline is needed in PKFX, this template should allow the artist to describe a simple motion without having to import a curve or animation. It’s made up of a series of LERP functions, and while the idea can be extended further to create fourth- or fifth-order curves, actually wiring up that many LERPs gets crazy. We’ll stop with our two handy new nodes, the quadratic and cubic beziers.
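The cubic version is the same trick with one more level of LERPs (this nested-LERP evaluation is the de Casteljau construction). Again a Python sketch with my own names, mirroring what the template wires up:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def cubic_bezier(p0, c0, c1, p1, t):
    """Three levels of LERPs over start, two control points, and end."""
    a = lerp(p0, c0, t)    # first level: slide along the control polygon
    b = lerp(c0, c1, t)
    c = lerp(c1, p1, t)
    ab = lerp(a, b, t)     # second level
    bc = lerp(b, c, t)
    return lerp(ab, bc, t) # third level: the point on the curve
```

Six LERPs for a cubic; a quartic would need ten, which is why wiring higher orders as nodes gets crazy fast.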

Here’s what the two look like in PopcornFX. The more complex cubic bezier is the top curve, and the bottom is the quadratic curve, with the midpoint restriction mentioned above:

So from there I had a nice tool to create motion paths of as many separate particles as I needed. To celebrate, I made the following simulation. Yes, we are playing a game of thermonuclear war:

There are a myriad of ways in which the above animation could be approached, but by abstracting out the “arc” motions into templates I am able to finish the project in a way that leaves me with tools for further work in the future. And of course these nodes can not only be re-used, they can be shared out with other artists.

A final benefit is a much simpler graph overall, and best of all we’ve avoided relying on simulation. Sure, you could fire each particle like a little mortar and let a physics node calculate the motion up to the point where the particle collides back to earth. But that’s computationally wasteful, and worse, it cedes control of the results to the parameters you give the simulation.

By abstracting out the arcs the way I did, I can easily set conditions so that every arc starts from a populated region and ends in a populated region. If you watch the animation you will note no arcs start or end in the ocean. (I admit however that I didn’t go so far as to prevent Canada from nuking the USA and vice versa…)

Similarly I have control over the heights of the arcs, their angle respective to the surface of the world, and the range or distance any point is allowed to travel. Simple procedural approaches can yield visual complexity while retaining control and easing the difficulty of projects!

By the way this image is entirely particles. 

The “land” and “city” areas are the result of taking some texture maps of the earth from NASA into Houdini and using them to create PKFX-friendly emitter geometry as .FBX polygonal meshes. This lets me quickly spawn particles exactly where I wanted them as well as define start and end positions for arcs in areas already predefined as valid. No dart-throwing approaches where particles are spawned and then accepted or rejected based on texture maps or the like.

You can’t always predefine information this way but in this case having re-usable meshes defining the landmasses and cities of Earth was another useful set of tools for future projects. Perhaps the next one won’t be quite so apocalyptic…

Misc Shots

I haven’t edited an “official” demo reel in a very long time. People know who I am, and hire me for projects based on peer testimonial and having seen past work directly. A reel just hasn’t been needed, since I’m fortunate enough to be a freelancer who gets approached regularly. So when I started teaching students who knew nothing about me I had the challenge of presenting myself to an entirely new audience who were eager to see some work I’ve done in films. I really haven’t had time to edit much, so instead I spent an evening grabbing various shots from films and the like. Stuff I’ve either worked on directly, supervised, or had some hand in.

I have only grabbed a portion of the films I’ve worked on, plus some TV and game stuff, pretty much in the order they were stacked on my desk. The choices are fairly miscellaneous – at the moment I just don’t have time to go through film after film and say “oh yeah, I did that,” then tighten it all down into an edit with music and polish. I know, this breaks all the rules where I’m supposed to present only the very very best, nothing less. But the policy that seems to work best for me is to just be open, good or bad just get it out there and get back to making the next thing. I will never be a charismatic showman or clever self-promoter. I’m a creative geek and frankly I like showing failures, too. It’s part of the process, and that’s what really motivates me.

So all of that said, here’s about 6 minutes of footage and, well, stuff. I basically play it in the background when I introduce myself to students, so it is what it is. Enjoy!

Article on 80 Level

I wrote an article for the game-and-vfx website 80 Level a while back, and it’s gone live. The topic is Hooke’s law as implemented in PopcornFX (and Unity’s VFX graph.) It’s brief but has been well received, and I appreciate the feedback I’ve gotten. A second article, centered around Newton’s universal gravitation as implemented in PopcornFX (and Houdini this time), is in the works.

Vorticle Fluids implemented in PopcornFX

This is a really crude first approximation of a homebrew realtime fluid simulation using vorticles. The general idea is that a sparse layer of particles is queried for position and orientation and used as centers of rotation to simulate turbulent fluids. Even this simple first experiment is very promising.

No noises, fields or turbulence forces are used, the motion is all directly a result of particles interacting with a layer of vorticles, which I reveal in the latter portion of this video.

Simulation is entirely on the CPU and takes 18ms per frame for 21,000 particles. One nice thing about vorticle simulations is that the motion does not change with the number of particles, meaning a simulation can be tuned at a low particle count while running very fast, then scaled up to millions of particles to be rendered in After Effects etc. I look forward to seeing how fast this will run when entirely on the GPU.
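To give a feel for the general idea, here’s a minimal Python sketch of the kind of kernel involved: each vorticle is a center of rotation with an axis, and a particle’s velocity is the sum of tangential contributions from nearby vorticles. This is my own guess at a minimal version (the cross-product form and linear falloff are assumptions), not the actual PopcornFX setup:

```python
import math

def vorticle_velocity(p, vorticles, radius=1.0):
    """Sum velocity contributions from vorticles within 'radius'.
    Each vorticle is (center, axis); the induced velocity is
    cross(axis, p - center), faded linearly toward the radius."""
    vx = vy = vz = 0.0
    for (cx, cy, cz), (ax, ay, az) in vorticles:
        dx, dy, dz = p[0] - cx, p[1] - cy, p[2] - cz
        d = math.sqrt(dx * dx + dy * dy + dz * dz)
        if d >= radius or d == 0.0:
            continue                      # outside influence, or at center
        falloff = 1.0 - d / radius
        # cross(axis, offset) is tangential, so particles swirl around axes
        vx += (ay * dz - az * dy) * falloff
        vy += (az * dx - ax * dz) * falloff
        vz += (ax * dy - ay * dx) * falloff
    return (vx, vy, vz)
```

Because the vorticle layer is sparse, this query cost depends on the vorticle count, not the rendered particle count, which is why the motion stays the same as you scale particles up.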

Apologies for the tiny video… this is just a screen capture.

Fibonacci emitter module in Unreal Niagara

I created a number of custom emitters for Niagara which I have yet to release to the public. (I wish there were a nice way to package up and share custom modules rather than having to share out entire project files.) These frames illustrate the use of a fibonacci (or phyllotactic) sphere emitter which is tailor-made for dandelions…
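The module itself isn’t released yet, but the underlying distribution is the well-known golden-angle construction, which is easy to sketch in Python. Points step around the vertical axis by the golden angle while sweeping top to bottom, giving a near-uniform spread on the sphere:

```python
import math

GOLDEN_ANGLE = math.pi * (3.0 - math.sqrt(5.0))  # ~2.39996 radians

def fibonacci_sphere(n):
    """Distribute n points near-uniformly on the unit sphere: sweep
    latitude from top to bottom while rotating by the golden angle."""
    points = []
    for i in range(n):
        y = 1.0 - 2.0 * (i + 0.5) / n           # latitude, +1 down to -1
        r = math.sqrt(max(0.0, 1.0 - y * y))    # ring radius at this latitude
        theta = GOLDEN_ANGLE * i                # spin by the golden angle
        points.append((r * math.cos(theta), y, r * math.sin(theta)))
    return points
```

Spawn one particle per point, give each a radial velocity, and you have the seed head of a dandelion.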