Introduction
Aimyon is a newcomer who has brought a fresh wave of pop music to Japanese youth, and we had the opportunity to promote her latest single.
The concept is quite simple: the music video combines a real-time Twitter feed, filtered to posts that contain the word death, or shinu (「死ぬ」), and the result is broadcast through a YouTube live stream.
Depending on the number of posts, the design adapts itself.
If you look at the billboard, tree, and rain scenes, they each show a different number of characters at different times.
System Architecture
The system is quite simple.
Development
Twitter API and Filter
For this, we simply used the Twitter search API and extracted any post that contains the word death, matching the patterns below (a rough sketch of the call follows the list):
- “死ぬ” exact phrase.
- “#死ぬ” hashtag.
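As an illustration, the search call could look like the following. This is a minimal sketch, assuming Twitter's v1.1 standard search endpoint and an app-only bearer token; it is not the production code.

// Minimal sketch: poll the standard search endpoint for matching posts.
// The endpoint version, auth setup, and function name are assumptions.
const BEARER_TOKEN = process.env.TWITTER_BEARER_TOKEN;

// Match the exact phrase "死ぬ" or the hashtag #死ぬ.
const query = encodeURIComponent('"死ぬ" OR #死ぬ');

async function fetchDeathPosts() {
  const res = await fetch(
    `https://api.twitter.com/1.1/search/tweets.json?q=${query}&count=100`,
    { headers: { Authorization: `Bearer ${BEARER_TOKEN}` } }
  );
  const { statuses } = await res.json();
  return statuses.map((s) => s.text); // raw post texts for the visuals
}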
Encoder

Layer Stack
On the front-end side, the graphics are generated by combining WebGL (PIXI) and the HTML5 video tag, as shown below.
We needed this layering because certain design elements have to sit behind the video to create a depth effect.
For the video format, we used WebM because it supports an alpha channel.
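As a rough illustration, the stack can be assembled like this. This is a sketch assuming a PIXI v5-style setup loaded globally; the element names and CSS stacking are ours, not the production markup.

// Two transparent PIXI canvases sandwich the alpha-channel WebM video.
const back  = new PIXI.Application({ transparent: true }); // behind the video
const front = new PIXI.Application({ transparent: true }); // in front of the video

const video = document.createElement('video');
video.src = 'scene.webm'; // WebM clip with an alpha channel
video.muted = true;
video.autoplay = true;

// DOM order plus absolutely positioned CSS stacks the three layers:
// back canvas at the bottom, alpha video in the middle, front canvas on top.
document.body.append(back.view, video, front.view);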
Syncing
Some parts of the animation need to be timed exactly to the video, which was a challenge in itself. At first, we tested two approaches:
Painstakingly recreating the animation in code
Doing this has the benefit of giving us control over the entire movement, but keeping it in sync with the video would obviously require a tremendous number of hours.
Syncing it with After Effects' camera tracker and keyframe data
This sounded good at first, since the entire process can be automated with an AE script, but after several tests we quickly realized that we couldn't match the video's frame rate to the browser's.
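For reference, pulling keyframe data out of After Effects can be scripted roughly like this. It is only an ExtendScript sketch; the layer index and the JSON hand-off are assumptions.

// Dump a layer's Position keyframes so the browser side can replay them.
var comp = app.project.activeItem;
var pos = comp.layer(1).property("Position");
var frames = [];
for (var i = 1; i <= pos.numKeys; i++) {
  frames.push([pos.keyTime(i), pos.keyValue(i)]); // [seconds, [x, y]]
}
// Serialize `frames` and write it to disk for the front end to load.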
We tried combining a fixed time-step, requestAnimationFrame, and setInterval, but a discernible lag was still visible. Exploring more options meant wasting time, and anything hacky would eventually produce another bug later.
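For context, the fixed time-step pattern we experimented with looks roughly like this; the 30 fps step and the stub functions are assumptions standing in for our actual update and draw calls.

// Fixed time-step + requestAnimationFrame: advance the animation in
// constant increments regardless of how often the browser repaints.
const STEP = 1000 / 30; // one video frame at an assumed 30 fps

function stepAnimation(dt) { /* advance keyframed values by dt ms */ }
function render() { /* draw the PIXI stage */ }

let last = performance.now();
let acc = 0;

function loop(now) {
  acc += now - last;
  last = now;
  while (acc >= STEP) {
    stepAnimation(STEP);
    acc -= STEP;
  }
  render();
  requestAnimationFrame(loop);
}
requestAnimationFrame(loop);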
Final Solution
The safest solution was simply to combine the two: export the keyframe data between each movement stride and fill the gaps with code. With this, we were able to achieve near-perfect synchronization.
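Conceptually, the combination works like this: exported keyframes anchor each stride, and code interpolates between them. The stride data below is made up for illustration; the real values come from the AE export.

// Exported keyframes mark each movement stride; code fills the gaps.
const strides = [
  { time: 0.0, x: 0,   y: 0  }, // exported from After Effects
  { time: 2.5, x: 320, y: 80 },
  { time: 5.0, x: 640, y: 40 },
];

function poseAt(t) {
  for (let i = 0; i < strides.length - 1; i++) {
    const a = strides[i], b = strides[i + 1];
    if (t >= a.time && t <= b.time) {
      const k = (t - a.time) / (b.time - a.time); // position within the stride
      return { x: a.x + (b.x - a.x) * k, y: a.y + (b.y - a.y) * k };
    }
  }
  return strides[strides.length - 1];
}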
Optimisations
The biggest hurdle was avoiding UI locks: each design element represents thousands of text characters that are expensive to initialize and render.
The next problem was initialization. To solve this, we separated each scene into its own PIXI view, initialized the heaviest view at the beginning of the video, and initialized the subsequent views a few seconds before their intended scenes appear.
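A simplified sketch of that staging is below; the scene start times, the five-second lead, and the stubbed init functions are illustrative.

const video = document.querySelector('video');

// One PIXI view per scene; the heaviest (billboard) initializes up front.
const scenes = [
  { startsAt: 0,  ready: false, init: () => { /* build billboard view */ } },
  { startsAt: 42, ready: false, init: () => { /* build tree view */ } },
  { startsAt: 95, ready: false, init: () => { /* build rain view */ } },
];
const LEAD = 5; // seconds of head start before a scene appears

video.addEventListener('timeupdate', () => {
  for (const scene of scenes) {
    if (!scene.ready && video.currentTime >= scene.startsAt - LEAD) {
      scene.init(); // heavy work happens before the scene is visible
      scene.ready = true;
    }
  }
});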
Chrome Error
The only technical anomaly we encountered was a Chrome render error that happened at random times.
This was extremely hard to debug, since it only occurred every three or four days, so to work around it we created a Chrome extension that automatically reloads the page whenever Chrome crashes.
// Note: we assume this handler is registered on chrome.processes.onExited
// (a dev-channel API whose callback receives exitType and exitCode).
chrome.processes.onExited.addListener(function (processId, exitType, exitCode) {
  if (
    (exitType === 3 && exitCode === 11) || // chrome crash
    (exitType === 2 && exitCode === 15)    // chrome kill
  ) {
    chrome.tabs.query({ active: true, currentWindow: true }, function (tabs) {
      chrome.tabs.update(tabs[0].id, { url: "http://localhost:8888/" }); // reload the app
    });
  }
});
Another fail-safe: every time the application run ends, it sends a timestamp. A cron script compares the current and previous timestamps, and if the interval exceeds the maximum allocated limit, an alert email is sent.
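The watchdog side of that can be sketched as follows; the heartbeat file path, the interval limit, and the mailer stub are assumptions.

// Run by cron: compare the last heartbeat against the allowed interval.
const fs = require('fs');

const HEARTBEAT_FILE = '/tmp/heartbeat'; // the app writes Date.now() here
const MAX_INTERVAL_MS = 5 * 60 * 1000;   // assumed maximum allowed gap

function sendAlertEmail() { /* e.g. call a mail API */ }

const last = Number(fs.readFileSync(HEARTBEAT_FILE, 'utf8'));
if (Date.now() - last > MAX_INTERVAL_MS) {
  sendAlertEmail();
}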
And finally, we also utilized a remote desktop tool, which enabled us to access the host computer remotely and make any necessary changes manually.
Conclusion
More details have been omitted for brevity, but overall it was a fun project, and we received a very positive response from the client and their fans.