In the wake of the BMW QUT Design Academy's M8 launch event, I wanted to dedicate some time to highlighting the different interactive experiences our team developed to showcase the technologies we work with. The M8 Coupé functions as a research tool, and as one of BMW's leading supercars, the product sports the very best of the company's design language. Our team capitalised on the opportunity to accentuate the vehicle's design features and selected the wheels, exterior surfacing and engine as key focus areas. This blog post focuses on the production work surrounding the M8 wheel AR installation: the modelling process behind the asset, the texturing workflow, the animation and, finally, the tools used to bring the wheel to life.
The BMW M8 Coupé's wheels sport a unique spoke design - a feature we intended to accentuate. When planning the animation sequence for this project, we storyboarded a concept that bridged the realism of the physical wheel to a floating exploded-part view of the product, much like the Iron Man blueprints from Marvel. Below is a summarised showcase of the other AR directions we considered.
Every good asset begins with a database of references; we started by scouring the internet for images of both the unique BMW M8 wheel design and the brake caliper. In addition to the photos, we used the Academy's M8 vehicle as a reference for scale and texture details. Spearheaded by my colleague Tim, the first stage involved using Blender as a poly-modelling platform to ensure the wheel distributed light along the required contours and the asset size remained as optimised as possible. When both the wheel and the brake caliper were finished, they were combined into a single assembly and moved to the next stage of delivery.
Once the asset was assembled and animated, we rebuilt the product for viewing on a mobile platform. We chose to work with Vuforia and Android due to the open-source nature of the Android platform and Unity's access to micro-level animation, material and lighting controls. Another important aspect of this phase was aligning the CGI model with the physical car - months before our Beta Lab space was furnished. We accomplished this by using mundane objects as references (such as a cylindrical pillow for a wheel) and setting markers around the room to represent the dimensions of the vehicle and their relationship to the various AR installations.
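As a rough illustration of that marker-based alignment (not our production code - the names and coordinates here are hypothetical), a uniform scale and offset can be fitted from two measured reference points, such as the front and rear wheel centres, and then used to project any model point into room space:

```python
# Hypothetical sketch: map a model-space segment (model_a -> model_b) onto a
# measured marker segment (marker_a -> marker_b) on the room floor, then use
# the fitted transform to decide where further floor markers should sit.

def fit_scale_and_offset(model_a, model_b, marker_a, marker_b):
    """Return a uniform scale and 2D offset mapping model space to room space."""
    model_len = ((model_b[0] - model_a[0]) ** 2 + (model_b[1] - model_a[1]) ** 2) ** 0.5
    room_len = ((marker_b[0] - marker_a[0]) ** 2 + (marker_b[1] - marker_a[1]) ** 2) ** 0.5
    scale = room_len / model_len
    # Choose the offset so that model_a lands exactly on marker_a after scaling.
    offset = (marker_a[0] - model_a[0] * scale, marker_a[1] - model_a[1] * scale)
    return scale, offset

def to_room(point, scale, offset):
    """Project a model-space point into room space."""
    return (point[0] * scale + offset[0], point[1] * scale + offset[1])

# Example: wheel centres at (0, 0) and (1, 0) in model units,
# measured at (3.0, 2.0) and (5.0, 2.0) metres in the room.
scale, offset = fit_scale_and_offset((0.0, 0.0), (1.0, 0.0), (3.0, 2.0), (5.0, 2.0))
midpoint = to_room((0.5, 0.0), scale, offset)   # -> (4.0, 2.0)
```

In production the placement lived inside Unity's transform hierarchy; this sketch only shows the arithmetic behind laying out markers by hand.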
Read more about the lighting, texturing and animation workflows below:
To explore a variety of AR modalities, our second installation focused on the process of 3D concept development, especially efficient texture detailing using Adobe's Substance Painter. The M8 engine has a utilitarian beauty to it, showcasing peak Bavarian design both as an art form and through its engineered performance. We aimed to isolate the heart of the M8 and present the engine as a model for viewers to appreciate its beauty. The workflows we used included blocking out the significant shapes, studying the environment lighting, fabricating textures using UDIM techniques to increase file efficiency, and detailing the mesh using Sub-D modelling workflows.
We simplified the engine's textures into five groups, each demanding its own level of resolution and texel density. Carbon fibre, glossy composite, matte plastic, metallic coloured accents and plastic tubing are all materials we would usually include on a 4K texture map with a 16-bit colour depth. However, our tests indicated that loading large texture sets lowered the stability of the AR experience and caused instances of lag.
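To put rough numbers on that, here's a back-of-the-envelope estimate of uncompressed texture memory. The figures are illustrative, not profiler measurements, and the six-tile comparison assumes the tiles are roughly 1K square:

```python
# Uncompressed in-memory size of a texture map (RGBA by default).
def texture_bytes(width, height, channels=4, bits_per_channel=8):
    return width * height * channels * bits_per_channel // 8

MB = 1024 * 1024

# A single 4K map at 16-bit colour depth:
single_4k = texture_bytes(4096, 4096, bits_per_channel=16) / MB   # 128.0 MB

# Six roughly-1K tiles at 8-bit colour depth:
six_tiles = 6 * texture_bytes(1024, 1024) / MB                    # 24.0 MB
```

Multiply the 4K figure by five material groups and the appeal of smaller per-material tiles on a mobile device becomes obvious.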
To work around this, we turned to UDIMs. UDIM (U-Dimension) is a method of ascribing a number to each tile in a linear block of UV space. This allowed us to distribute different materials, or UV islands with specific texel densities, across different UV tiles - texturing the model efficiently without needing a single large texture. For our use case, we used six UDIM tiles, each set to 1080p with an 8-bit colour depth, and used flat float values for roughness and metallic surfaces where appropriate. Our approach allowed us to increase the resolution of the faces that mattered while minimising the total texture budget.
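The tile numbering itself follows the standard UDIM convention rather than anything project-specific: tile 1001 covers UV (0-1, 0-1), numbers increase by one along U for ten columns, then by ten per row in V. A small helper makes the mapping concrete:

```python
import math

def udim_tile(u, v):
    """Return the UDIM tile number containing the UV coordinate (u, v)."""
    return 1001 + int(math.floor(u)) + 10 * int(math.floor(v))

udim_tile(0.5, 0.5)   # -> 1001 (the default 0-1 tile)
udim_tile(1.2, 0.0)   # -> 1002 (one tile to the right along U)
udim_tile(0.3, 1.7)   # -> 1011 (one row up in V)
```

Moving a UV island wholly into a tile is what lets each material carry its own texture resolution.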
The final step in implementing our AR prototype was to rebuild the scene at one-to-one scale with an addressable tracking image. My colleague James Dwyer used Unity and the Vuforia platform to deploy the experience to Android while rebuilding the material links and establishing the asset's location in physical space. Annotations, shadows and additional fabricated lighting fixtures were also added to match Beta Lab's lighting for the launch event.
For more information on the material development, read more here:
When faced with the rare opportunity to design an AR experience for one of BMW's luxurious motorcars, I knew we had to feature the vehicle's unique body. Our third installation aimed to highlight the vehicle's aerodynamic design and called for animation techniques distinct from those used in the first two installations. We prototyped moving splines, their motion suggestive of a moving road, and began by manipulating their displacement to follow the main contours of the vehicle. Subtle in nature, this implementation would be mapped over the physical vehicle as illustrated in the images below.
We achieved this effect by scaling meshes based on their origin points and constraining their displacement along the curves of the vehicle. We soon discovered, however, that transferring the displacement effect to an appropriate format was impossible using conventional FBX animation baking. As an alternative, we explored an Alembic animation export - a file format created by Sony Pictures Imageworks for visual effects. While our initial transfer tests between Blender, Unity and Unreal were successful, Vuforia and Android limited our implementation in AR - forcing us to explore a different avenue.
Building upon our earlier attempts, we outlined the limitations we had discovered and planned a new approach. Rather than showcasing the surface geometry of the vehicle through airflow lines, we proposed a transitional wireframe effect that would encompass the vehicle. An easier and perhaps more engaging effect, however, involved a series of alternating panels, invoking a wireframe effect within the negative space. Our approach applied the declared panels from a digital asset and offset the scale value of each instance in proximity to a defined start and end point.
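A minimal sketch of that scale offset (hypothetical parameters - the production version lived in Blender and Unity): each panel's scale is driven by where its origin sits between the start and end points, relative to an advancing sweep value, so panels collapse ahead of the front and reveal the wireframe in the negative space behind it.

```python
def transition_scale(origin, start, end, sweep, band=0.1):
    """Scale in [0, 1] for a panel whose origin lies between start and end
    along one axis, given a sweep value advancing from 0 to 1."""
    t = (origin - start) / (end - start)   # panel's normalised position
    x = (sweep - t) / band                 # signed distance behind the sweep front
    return max(0.0, min(1.0, x))           # full size behind, collapsed ahead

# Sweep halfway along a 10-unit car body:
transition_scale(0.0, 0.0, 10.0, 0.5)    # -> 1.0 (panel well behind the front)
transition_scale(10.0, 0.0, 10.0, 0.5)   # -> 0.0 (panel ahead of the front)
```

Animating `sweep` from 0 to 1 then plays the transition across the body of the car, with `band` controlling how abrupt the wipe feels.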
Establishing our 3D assets in real space required anchor points and unique identifiers for each model to ascribe to - we achieved this using JS Placement. JS Placement is a great resource for creating sci-fi height maps and adding detail to surfaces via micro-displacement. Its algorithm allowed us to create three distinct images while maintaining a coherent overarching visual style.
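For readers unfamiliar with the technique, micro-displacement simply pushes each vertex along its normal by the greyscale value sampled from the height map at the vertex's UV coordinate. The 4x4 map below is a stand-in for an exported JS Placement image (illustrative only):

```python
# Stand-in height map; a real JS Placement export is a greyscale image.
HEIGHTMAP = [
    [0.0, 0.2, 0.4, 0.1],
    [0.3, 0.9, 0.7, 0.2],
    [0.1, 0.6, 1.0, 0.4],
    [0.0, 0.2, 0.3, 0.1],
]

def sample(u, v):
    """Nearest-neighbour sample of the height map at UV (0..1)."""
    rows, cols = len(HEIGHTMAP), len(HEIGHTMAP[0])
    x = min(cols - 1, int(u * cols))
    y = min(rows - 1, int(v * rows))
    return HEIGHTMAP[y][x]

def displace(vertex, normal, u, v, strength=0.05):
    """Push a vertex along its normal by the sampled height times strength."""
    h = sample(u, v)
    return tuple(p + n * h * strength for p, n in zip(vertex, normal))
```

A render engine performs the same lookup per subdivided vertex at shading time, which is what makes the detail "micro" without bloating the base mesh.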
Our team explored the boundaries of what we could develop in AR with our given tools and set the stage for future developments by documenting our workflows and practices. We hope to further our interest in this field by developing new installations and exploring increasingly complex applications of this developing technology - both individually and with our commercial partners.
Until then, here's a detailed breakdown and screensaver of our surface transition installation: