Fake it 'til you make it: Faking extended draw distance in mobile games

Optimization is a cornerstone of mobile game development. With thousands of phone models out there, many of which use outdated chipsets, every game needs to aim for a reasonable lowest common denominator, and one of the most consistent ways to optimize performance in 3D games is to manage draw distance.

The draw distance should be kept as short as possible to maintain a stable frame rate. But what about open worlds, where players must be able to see the entire map from any point? This is the challenge we faced at Cubic Games during the development of Block City Wars, and below we will explore the solution we chose and the strengths of this particular approach.

The problem:

In a game like Block City Wars, each player must see the entire map from any position or be at a disadvantage, and simply pushing out the far clipping plane won't work. Increasing the draw distance increases the number of triangles that pass through every culling stage: more objects are subjected to bounding-box checks on the CPU, and more fragments are drawn on the GPU.

Using a second background camera with a different draw distance complicates camera management and adds unnecessary overhead. Finally, experiments with HLOD (Hierarchical Level of Detail) also proved unsuitable for this problem. While some of these solutions may be applicable to other games, they failed to meet our needs. When all else fails, shader magic saves the day.

The essence of the solution:

The solution we chose was to use a mix of shader tricks combined with our existing simple fog effect to provide useful but largely faked detail. Using a shader we can create the illusion that an object is far away when in reality it is close to the player. This allows us to choose which objects will always be visible, regardless of distance.

It only makes sense to use objects tall enough for players to orient themselves by, which lets us strip all remaining visual clutter from the final render. To ensure a seamless transition between the “fake” objects and the real ones, we render the silhouettes in the fog color. This also allows us to reduce their detail significantly. It will look like this:

Before

Image1.png

After

Image2.png

Deceptive CPU culling:

To achieve this effect, we can take advantage of the tools Unity provides. For a mesh to be submitted for rendering, its bounds must intersect the camera frustum. We can rig this easily, for example, with the following MonoBehaviour. We do it in Start() because Unity recalculates the bounds when the mesh is initialized. For our purposes, we set the bounds large enough that the player's camera is always inside them; the mesh is therefore always submitted to the GPU for rendering, easing the load on older CPU models.

public class BoundsOverride : MonoBehaviour
{
	[SerializeField] MeshFilter selectedMeshFilter;
	[SerializeField] Vector3 newCenter;
	[SerializeField] Vector3 newSize;
	void Start()
	{
		// Inflate the bounds so the camera is always inside them: the mesh
		// then passes CPU frustum culling and is always sent to the GPU.
		Mesh mesh = selectedMeshFilter.sharedMesh;
		Bounds bounds = mesh.bounds;
		bounds.center = newCenter;
		bounds.size = newSize;
		mesh.bounds = bounds;
	}
}

Tricky GPU culling:

Once the mesh is on the GPU, there is an additional frustum clipping stage between the vertex stage and the fragment stage. To get around it, we transform the vertex coordinates so that all vertices land inside the camera's view, while still preserving perspective.

v2f vert (appdata v)
{
	v2f o;
	// World-space position of the original vertex
	float3 worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
	// Move the vertex along its view ray to a fixed distance from the camera;
	// the direction is unchanged, so the on-screen position is preserved
	float3 directionToOriginal = normalize(worldPos - _WorldSpaceCameraPos);
	float3 scaledPos = _WorldSpaceCameraPos + directionToOriginal * _ScaleDownFactor;
	// Back to object space so the usual object-to-clip transform applies
	float3 objectPos = mul(unity_WorldToObject, float4(scaledPos, 1)).xyz;
	o.vertex = UnityObjectToClipPos(objectPos);
	return o;
}

_ScaleDownFactor is the distance from the camera at which all vertices are placed. It needs to be tuned against the fog distance so the transition stays hidden.

All we need to do in the fragment shader is output the fog color, which masks the cut in the geometry.

fixed4 frag (v2f i) : SV_Target
{
	return unity_FogColor;
}

Example with an island mesh:

Image3.png
Image4(1).png

This effect can be seen clearly in Blender. If you place the camera at the origin and point it at a cube, then duplicate the cube and scale it relative to the origin, from the camera's point of view there will be no difference between the two cubes. Obviously this is a trick that won't hold up in VR, but we're developing for mobile here, so stereo depth perception isn't something we need to worry about.

Image5.png

Image6.gif
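The cube demonstration can also be written down directly. A camera at position C sees a point P only through the direction of the ray toward it, and scaling P relative to C leaves that direction unchanged:

```latex
P' = C + s\,(P - C), \quad s > 0
\;\;\Longrightarrow\;\;
\frac{P' - C}{\lVert P' - C \rVert}
= \frac{s\,(P - C)}{s\,\lVert P - C \rVert}
= \frac{P - C}{\lVert P - C \rVert}
```

The vertex shader above is exactly this, with a per-vertex factor $s = \texttt{\_ScaleDownFactor} / \lVert P - C \rVert$, which places every vertex on a sphere of radius _ScaleDownFactor around the camera.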

In our case a further step is added: the mesh is “squashed” so that it sits precisely at the limit of the camera's draw distance. This is done to avoid z-buffer conflicts with real objects that should be closer to the player. With detailed “impostor” objects like these, a single small rendering glitch is enough to shatter the illusion and draw attention to background objects that should read as seamless.
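The article doesn't show how the squashing is done, but one way to sketch it (an assumption on our part, not necessarily the production code) is to override the clip-space depth after projection, pushing every silhouette vertex just inside the far plane:

```hlsl
o.vertex = UnityObjectToClipPos(objectPos);
// Force the depth to just inside the far clip plane so the silhouette
// never wins the depth test against real, closer geometry.
#if defined(UNITY_REVERSED_Z)
	// Reversed Z (D3D/Metal/Vulkan): the far plane is at depth 0
	o.vertex.z = 1.0e-4 * o.vertex.w;
#else
	// Conventional Z (OpenGL): the far plane is at z == w
	o.vertex.z = (1.0 - 1.0e-4) * o.vertex.w;
#endif
```

The small epsilon keeps the vertices fractionally inside the far plane so they aren't clipped outright; which value means “far” depends on the platform's depth convention, hence the UNITY_REVERSED_Z branch.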

We also need to keep in mind cases where the camera might end up inside the silhouette mesh. The vertices of a single triangle can then land on opposite sides of the camera, stretching the triangle across the entire screen. This should be taken into account when modeling the silhouettes: either make sure the camera can never enter them, or disable the meshes when the camera gets close.
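A simple safeguard for the second option (a sketch of ours with hypothetical names, not code from the game) is to disable the silhouette's renderer whenever the camera comes within a threshold distance:

```csharp
using UnityEngine;

// Hypothetical helper: hides a silhouette before the camera can enter it.
public class SilhouetteProximityToggle : MonoBehaviour
{
	[SerializeField] Renderer silhouetteRenderer;
	[SerializeField] Transform cameraTransform;
	[SerializeField] float disableDistance = 100f; // tune against _ScaleDownFactor

	void LateUpdate()
	{
		// Compare squared distances to avoid a square root every frame.
		float sqrDist = (cameraTransform.position - transform.position).sqrMagnitude;
		silhouetteRenderer.enabled = sqrDist > disableDistance * disableDistance;
	}
}
```

The threshold should be comfortably larger than the silhouette's own extent, so the renderer switches off well before any triangle can straddle the camera.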

Conclusion

While this approach won't be applicable to all games, it fits perfectly with Block City Wars and its existing fog effects. It lets you quickly extend the effective viewing distance with “fake” silhouette detail while staying within tight performance budgets, leaning on the existing fog to hide the smoke and mirrors. It's easy to reproduce in any pipeline and rendering engine and doesn't require modifying existing code.

Even with much of the detail faked and obscured behind fog, distant silhouettes still provide useful gameplay information to players at minimal performance cost. A clear win for players on all platforms, especially older hardware.
