ar chinese dragon
Bringing the
spirit to you.
Overview & brief:
Augmented Chinese Dragon is a custom Snapchat filter created for an experimental university brief. Featuring a hand-crafted 3D dragon built in Blender, the filter uses head binding, animated tweens, and custom scripting in Lens Studio to explore character-driven AR and creative 3D design.
services
Creative computing
JavaScript
Animation
3D model
Research
AR filter
timeline
jan '25 - feb '25
(4 weeks)
ar chinese dragon:
prototypes
developing a prototype with Blender:
Creating a 3D model.
developing a prototype:
Creative computing.
I created the 3D dragon using a combination of modelling, animation, and shading techniques in Blender. Starting with basic shapes, I used tools like array, mirror, curve and bevel modifiers to sculpt key features such as scales, fur, and facial details.
I applied vertex manipulation and extrusion for structure, and used decimate and wireframe views to develop a stylised, web-like form. This process refined my skills in procedural modelling and non-destructive workflows.
We used p5.js and JavaScript to create responsive visuals controlled by MIDI input. Key functions like beginShape() and vertex() let us draw polygons with dynamic point counts, while conditional logic (if, switch) mapped controller values to colour, speed, and visibility. We also used vector math (p5.Vector, normalize()) to manage smooth movement and ensure shapes responded fluidly to input.
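The MIDI-to-visual mapping described above can be sketched in plain JavaScript. The helper names and ranges here are my own illustration, not taken from the project code; in a real p5.js sketch the computed points would be emitted between beginShape() and endShape(CLOSE) via vertex().

```javascript
// Map a MIDI controller value (0-127) onto an integer range,
// e.g. a polygon's point count between 3 and 12.
function midiToRange(value, min, max) {
  return Math.round(min + (value / 127) * (max - min));
}

// Compute the vertices of a regular polygon with a dynamic point
// count; in p5.js each [x, y] pair would become a vertex() call.
function polygonPoints(cx, cy, radius, count) {
  const pts = [];
  for (let i = 0; i < count; i++) {
    const angle = (i / count) * 2 * Math.PI;
    pts.push([cx + radius * Math.cos(angle), cy + radius * Math.sin(angle)]);
  }
  return pts;
}

// Conditional routing like the if/switch logic described: send a
// controller number to a visual parameter (illustrative CC numbers).
function routeController(cc) {
  switch (cc) {
    case 1: return "colour";
    case 2: return "speed";
    default: return "visibility";
  }
}
```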


Visual storytelling through animation:
Camera paths and lighting.
Interaction and user engagement:
MIDI controls.
To enhance the presentation of my 3D dragon, I created a dynamic camera and lighting setup in Blender. Using a bezier curve and the follow path constraint, I animated a smooth camera motion to highlight the model’s form.
Lighting was controlled through an emissive mesh, generating a glowing trail that followed the dragon’s path to accentuate movement and flow. I used keyframe animation, curve editing and constraints to synchronise visual effects, creating a cinematic and immersive scene.
We used MIDI hardware to create an interactive visual experience, combining creative coding with physical input controls.
Using p5.js and webmidi.js, each MIDI input (knobs, sliders, and buttons) was mapped to visual parameters like colour, shape, and animation speed.
This setup made data interaction tactile and intuitive. Users could explore visual changes in real time, enhancing both usability and creative freedom.
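The knob-and-slider routing could look something like the sketch below. The CC numbers and parameter names are hypothetical placeholders, and the commented wiring assumes webmidi.js v3, where control-change values arrive normalised to 0..1.

```javascript
// Illustrative bindings from MIDI CC number to a visual parameter.
const bindings = {
  1: (state, v) => { state.hue = v * 360; },        // knob -> colour
  2: (state, v) => { state.speed = 0.5 + v * 4; },  // slider -> speed
  3: (state, v) => { state.visible = v > 0.5; },    // button -> visibility
};

// Apply one control-change message (value in 0..1) to shared state.
function applyCC(state, ccNumber, value) {
  const handler = bindings[ccNumber];
  if (handler) handler(state, value);
  return state;
}

// With webmidi.js this would be wired up roughly as:
// WebMidi.enable().then(() => {
//   WebMidi.inputs[0].addListener("controlchange", (e) =>
//     applyCC(visualState, e.controller.number, e.value));
// });
```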
How might we use scripting and 3D modelling to develop an interactive AR experience in Lens Studio?

Introducing Lens Studio:
AR experience
Throughout this project, I used creative computing techniques in Lens Studio to bring my 3D dragon to life. I faced challenges importing assets from Blender, especially with background transparency, so I tested various file types, compression tools, and rendering methods like PNG sequences to work around Lens Studio’s alpha limitations.
To improve visuals, I experimented with materials, shaders, and blend modes, and used marker tracking with animated textures to sync visual and audio elements. I also scripted marker-based audio triggers and used TweenColor to create dynamic material transitions.
This process enhanced my skills in scripting, asset integration, and real-time AR interaction.
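A TweenColor-style material transition boils down to linearly interpolating between two RGBA colours over time. This standalone sketch is my own approximation of that idea, not the Lens Studio Tween Manager API itself.

```javascript
// Linear interpolation between two scalars.
function lerp(a, b, t) {
  return a + (b - a) * t;
}

// start and end are [r, g, b, a] arrays; t runs from 0 (start) to 1 (end).
function tweenColor(start, end, t) {
  return start.map((c, i) => lerp(c, end[i], t));
}

// A marker-based trigger could then animate t from 0 to 1 each frame
// and write the interpolated colour into the material's base colour,
// with the same marker event also starting the synced audio clip.
```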



ar chinese dragon:
final overview
key takeaway:
Future strategies
This project gave me hands-on experience with AR design and highlighted the challenges of creating mobile AR content.
Lens Studio’s small file-size cap limited what I could publish, and outdated documentation made troubleshooting harder. Despite these issues, I improved my problem-solving skills and learned a lot about optimising AR experiences.
In the future, I’d love to explore platforms like Unity and Adobe Aero, which offer better support, modern tools, and more creative flexibility for building advanced AR projects.