Intersection Paint

Overview

Intersection Paint is a custom Unity shader that uses depth to paint meshes where other geometry intersects them, and then simulates the paint in texture space.

Process

Since this shader has been designed for VRChat avatars, certain restrictions are imposed, such as not being able to use custom C# scripts.

Whilst in the middle of a more typical projection-painting project, I came to the sudden realization that I could potentially use the main camera as a substitute for a brush's projection camera to paint a mesh. However, since I can't use custom scripts to create additional render textures/buffers and Blit into them, I had to take advantage of Custom Render Textures for this project; I'll be referring to them as 'CRTs'.

I began by writing a simple shader that highlights the difference between the vertex depth and the depth in the _CameraDepthTexture buffer, like you normally would. But instead of transforming the mesh to clip space, I used its texture coordinates to overlay the mesh across the entire screen according to its UV layout. After comparing the depth difference against a tiny threshold, I could see precisely which parts of the mesh were being intersected along the main view, and by using a GrabPass to copy the results into a new buffer, I suddenly had a projected brush texture. The next step was to store this somewhere persistent, but I'm not able to create new buffers at runtime.
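A minimal sketch of that pass might look like the following. All names (_Threshold, the v2f fields) are illustrative assumptions, not the shipped shader, and the exact depth math may differ:

```hlsl
v2f vert (appdata v)
{
    v2f o;
    // Instead of the usual UnityObjectToClipPos(v.vertex), stretch the
    // mesh across the whole screen according to its UV layout.
    float2 uv = v.uv * 2.0 - 1.0;
    #if UNITY_UV_STARTS_AT_TOP
        uv.y = -uv.y;
    #endif
    o.pos = float4(uv, 0.0, 1.0);
    // Still carry the original clip-space position so the fragment shader
    // can look up the scene depth where this vertex actually sits on screen.
    o.sceneClip = UnityObjectToClipPos(v.vertex);
    return o;
}

float4 frag (v2f i) : SV_Target
{
    float4 screenPos = ComputeScreenPos(i.sceneClip);
    float sceneDepth = LinearEyeDepth(
        SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture,
                                  UNITY_PROJ_COORD(screenPos)));
    // For a perspective camera, clip-space w is the fragment's eye depth.
    float fragDepth = i.sceneClip.w;
    // Fragments whose depth nearly matches the depth buffer are being
    // intersected by something in the main view: that's the brush.
    float hit = abs(sceneDepth - fragDepth) < _Threshold ? 1.0 : 0.0;
    return hit;
}
```

A GrabPass placed after this pass then snapshots the screen, which at that point is exactly the mesh's UV layout with intersected texels marked.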

Then I was reminded of the existence of CRTs, which can double-buffer their contents, making them perfect for accumulating intersections across every region of the mesh that gets touched. An interesting thing you can do within CRTs is sample GrabPasses, since named GrabPass textures are declared as global uniforms. I'm not exactly sure how I first realized this, but it likely came from my earlier tinkering with sampling audio data in CRTs, since AudioLink (an audio-reactive shader system) used to rely on a GrabPass to declare its audio data buffer globally, and that had worked.
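The accumulation step could be sketched like this. I'm assuming the GrabPass was declared with a name, e.g. GrabPass { "_IntersectionGrab" }, which makes that texture a global uniform any shader, including a CRT, can sample; _SelfTexture2D and v2f_customrendertexture come from Unity's UnityCustomRenderTexture.cginc, with double buffering enabled on the CRT asset:

```hlsl
#include "UnityCustomRenderTexture.cginc"

sampler2D _IntersectionGrab; // global uniform filled by the named GrabPass

float4 frag (v2f_customrendertexture i) : SV_Target
{
    // _SelfTexture2D holds this CRT's previous frame (double buffering).
    float previous = tex2D(_SelfTexture2D, i.globalTexcoord.xy).r;
    float current  = tex2D(_IntersectionGrab, i.globalTexcoord.xy).r;
    // Accumulate: once a texel has been painted, it stays painted.
    return max(previous, current);
}
```

Erasing instead of painting is just the inverse operation on the same buffer, e.g. previous * (1 - current).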

After throwing together a quick demo where you could erase fragments by intersecting them, and receiving positive feedback in VRChat from the friends I tested it with, I improved and polished the shader some more so that it no longer felt like a tech demo. I realized it might be fun to use it to demo painting canvases and meshes without the use of additional cameras, so I stuck with the paint idea.

Since I now wanted this shader to act more like paint, I also wanted to simulate it somehow, but I had to account for mesh transformations and skinning, so baking a flow map beforehand was something I passed up on. Instead, I soon found myself taking advantage of the fact that the mesh is already laid out according to its UVs. By taking the dot product between the mesh's original world position and some arbitrary gravity vector, and then taking the screen-space derivatives of the resulting scalar, I suddenly had a flow map I could use to offset samples. This solution is not perfect, but it has definitely held up throughout all the hours I've spent testing this shader.
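The trick can be sketched as below. This assumes the world position is interpolated into a pass where the mesh is rasterized in UV space, so ddx/ddy here are effectively UV-space derivatives; _Gravity, _FlowStrength, and _TexelSize are hypothetical parameters:

```hlsl
// Scalar "height" of this texel along the gravity direction.
float height = dot(i.worldPos, _Gravity.xyz);

// Its per-texel rate of change points "uphill" in UV space;
// normalize defensively in case the gradient is near zero.
float2 grad = float2(ddx(height), ddy(height));
float2 flow = grad / max(length(grad), 1e-6);

// Advect paint by sampling the previous frame slightly "uphill",
// so paint effectively flows downhill over time.
float paint = tex2D(_SelfTexture2D,
                    i.globalTexcoord.xy + flow * _FlowStrength * _TexelSize).r;
```

Because the world position is read per frame, the flow automatically follows mesh transformations and skinning, which is exactly why baking a static flow map was unnecessary.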

I'm happy to say that a lot of people love how interactive this shader is, sticking their hands through my models and finger-painting them, and they also find a deep satisfaction in painting an entire model, like a scratchcard. 😋