<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Lusion]]></title><description><![CDATA[A place where the team at Lusion shares their thoughts.]]></description><link>https://blog.lusion.co</link><image><url>https://cdn.hashnode.com/uploads/logos/69bd2e752ff723725f190c51/8bbc308c-b0df-437e-b239-b6f77ae2bcd9.png</url><title>Lusion</title><link>https://blog.lusion.co</link></image><generator>RSS for Node</generator><lastBuildDate>Thu, 07 May 2026 02:44:26 GMT</lastBuildDate><atom:link href="https://blog.lusion.co/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Oryzo BTS (Part 3 / 7) - Website UX/UI and Illustrations]]></title><description><![CDATA[If you have not seen Oryzo AI in action yet, I would recommend checking it out first: oryzo.ai. It is, quite honestly, five minutes of your life gloriously wasted for a nerdy laugh.
This is the third ]]></description><link>https://blog.lusion.co/oryzo-bts-part-3-7-website-ux-ui-and-illustrations</link><guid isPermaLink="true">https://blog.lusion.co/oryzo-bts-part-3-7-website-ux-ui-and-illustrations</guid><dc:creator><![CDATA[Edan Kwan]]></dc:creator><pubDate>Tue, 21 Apr 2026 13:35:37 GMT</pubDate><enclosure url="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/8ce5c62d-586f-4929-845f-fd5c6f7b77a2.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>If you have not seen Oryzo AI in action yet, I would recommend checking it out first: <a href="http://oryzo.ai">oryzo.ai</a>. It is, quite honestly, five minutes of your life gloriously wasted for a nerdy laugh.</p>
<p>This is the third post in the Oryzo behind the scenes series. Part 1 covered how the idea came together. <a href="https://blog.lusion.co/oryzo-bts-part-2-7-3d-design-and-motion-graphics">Part 2</a> covered how the visual world was built. This one is about the design system, the illustrations, and the way we approached both.</p>
<p>We will walk through the early visual exploration, the design principles that came out of it, the way we kept UI from getting in the way of the content, and where we allowed AI into the pipeline without letting it define the final look.</p>
<h2>Finding the Look</h2>
<p>Once the idea was locked in, as covered in <a href="https://blog.lusion.co/oryzo-bts-part-1-7-concept-and-creative-direction">Part 1</a>, we had to figure out what the site should actually look like. We all have to start somewhere:</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c3e56110e664c5da9f744b/bb893ee7-fa54-4cfe-a6a8-933db5504d2e.png" alt="Early concept" style="display:block;margin:0 auto" />

<blockquote>
<p><em>A completely different style, right?</em></p>
</blockquote>
<p>It looked like one of those typical Awwwards Site of the Day concepts from a design studio portfolio. Clean, stylish, and abstracted. But something was missing. It felt lifeless.</p>
<p>That direction treated the coaster more like an abstract product study. It looked good, but it stripped away the world around it. We wanted the opposite. We were happy to give up some minimalism if it meant placing the product inside a more believable environment.</p>
<p>That was the moment the desk came in.</p>
<p>The hero scene became a creative work desk, the kind of space that feels lived in by someone in a studio. That direction lined up naturally with two things Lusion already does well: storytelling, and realistic real time 3D in the browser.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c3e56110e664c5da9f744b/7481d1a5-e7d6-499b-b00b-ef181fc7cc23.webp" alt="" style="display:block;margin:0 auto" />

<p>At this stage, we were also using Midjourney to explore composition and mood. The hero scene started to find its identity, and once the direction was approved internally, the work moved into full non-AI 3D production, which we covered in Part 2.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c3e56110e664c5da9f744b/5e1e3c14-1e69-49fd-97af-6e339f6bd96c.webp" alt="" style="display:block;margin:0 auto" />

<p>At that point, the site structure itself was still fairly standard: hero shot, key benefit, usage, specs, reviews, comparisons, purchase. Nothing unusual. What this second concept really gave us was not a radically different layout, but a clearer set of principles to build against.</p>
<p>There were four of them, and they guided almost every design decision that followed:</p>
<ul>
<li><p><strong>A realistic image.</strong></p>
</li>
<li><p><strong>The product at the centre.</strong></p>
</li>
<li><p><strong>Seamless transitions.</strong></p>
</li>
<li><p><strong>Humour.</strong></p>
</li>
</ul>
<p>From there, the next question was how to translate those ideas into typography, colour, and UI.</p>
<h2>Design Without Ego</h2>
<p>Working at Lusion means working as part of a creative collective. Working on Oryzo meant working inside something closer to an orchestra, where each person plays a specific role.</p>
<p>This project already had a lot going on: 3D renders, humour, photography, dense product copy, little visual jokes, and a world full of props. In that kind of environment, design cannot constantly ask for attention. Its job is to support the content, not compete with it.</p>
<p>So we made two early decisions: use as few typefaces as possible, and use as few colours as possible.</p>
<p>Around 99 percent of the type on the site is set in a single family. The colour system is just four values: a cream, a near black, a muted olive, and an orange. Anything more expressive than that would have started pulling attention away from the renders and illustrations, which were already doing the heavier visual work.</p>
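<p>For the curious, the entire system fits in a handful of design tokens. Here is a minimal sketch in TypeScript; the hex values and the family name are illustrative stand-ins, not the production values:</p>
<pre><code>// A sketch of the Oryzo design constraints as tokens.
// Hex values and the family name are stand-ins for illustration.
const tokens = {
  color: {
    cream: "#f2ead9",   // background
    ink: "#151310",     // near black
    olive: "#7a7d5e",   // muted accent
    orange: "#e8661f",  // the single loud value
  },
  type: {
    family: "Primary Sans", // one family covers ~99% of the site
  },
} as const;

// Only four legal colour choices anywhere in the UI.
type PaletteColor = keyof typeof tokens.color;
</code></pre>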
<img alt="" style="display:block;margin:0 auto" />

<p>We also stripped back the type system itself. SUB 3, LINK FOOTER, BTN 2, BIG QUOTE, and INPUT TEXT all disappeared from the original set. The rule was simple: typography should hold the layout together, then step out of the way.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c3e56110e664c5da9f744b/e5741942-e61a-453a-ba95-fc868f7b430d.webp" alt="" style="display:block;margin:0 auto" />

<p>Does that make the design process easier? Not really. Constraints like these usually create more pressure, not less. But they also force better decisions.</p>
<p>On the hero section, for example, the text is doing more than delivering information. It is also helping to reinforce the feeling that this is a real desk with real objects on it. The typography behaves almost like another prop in the scene.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c3e56110e664c5da9f744b/b019c58e-bd85-4d22-a99e-8c7c1445ec11.png" alt="" style="display:block;margin:0 auto" />

<p>There were also a few moments where we deliberately broke our own rules. The transition between sections, for example, leans into the visual language of a vintage fashion magazine and uses a completely different set of typefaces. It is one of the few places on the site where the UI is allowed to perform, because the surrounding scene is asking for exactly that kind of flourish.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c3e56110e664c5da9f744b/d9313d0f-14c5-4253-90ed-f474371cadd8.png" alt="" style="display:block;margin:0 auto" />

<h2>Using AI With a Straight Face</h2>
<p>This part is a little delicate.</p>
<p>Oryzo pokes fun at AI and the way it gets used, while at the same time AI tools were part of our own pipeline. Earlier in this post, you already saw composition studies made with Midjourney. Beyond those early stages, we also used Google Flow in a few very specific ways.</p>
<p>The important distinction is this: AI was not generating the core idea or the final look. It was helping us quickly test combinations of things we already had.</p>
<p>In other words, the source material was ours. The result was a rough visual check, not a finished asset.</p>
<p>That distinction might sound small, but to us it matters. It is the difference between using AI to speed up a decision and using it to replace one.</p>
<p>An office mug combined with a photo from an Nvidia press kit:</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c3e56110e664c5da9f744b/2caab74e-61f4-4feb-93d1-4f5f9589af95.png" alt="" style="display:block;margin:0 auto" />

<p>A scene render paired with an illustration:</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c3e56110e664c5da9f744b/9fcfd124-83d0-4c7a-abb0-3a538867384a.png" alt="" style="display:block;margin:0 auto" />

<p>Some of these ended up as final assets on the site. They were fast visual tests, often done in about an hour, just to answer a simple question: does this read, or does it not?</p>
<p>If the answer was yes, the direction moved into real production. If the answer was no, we dropped it and moved on.</p>
<p>We wanted to use AI exactly as much as it deserved to be used in a project like this, and not a step more.</p>
<h2>A Human Layer</h2>
<p>To balance out all the premium tech language and digital polish, we wanted a layer that felt more obviously handmade.</p>
<p>That led us to illustration.</p>
<p>None of the illustrations on the site were AI assisted. Everything was drawn by hand in Procreate. We only wanted AI in places where it reinforced the joke or helped us test a direction. The illustration layer needed to do the opposite. It needed to bring taste, irregularity, and human intent back into the system.</p>
<p>We created a series of custom drawings inspired by technical sketches, with a few nods to Leonardo da Vinci style anatomical studies. It made the whole project feel slightly overcommitted in a way we really enjoyed.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c119b8545ab963126a7152/62bd7a42-dd17-42ad-91b0-69d84282c627.gif" alt="" style="display:block;margin:0 auto" />

<p>If you look closely across the site, you will notice small hand drawn elements tucked into different scenes. They act as little reminders that even in a very tech forward project, a lot of what we care about still comes back to human output, taste, and the process of making things carefully.</p>
<p>For the coffee mug on the desk, we drew a custom illustration inspired by Lusion’s website astronaut character. Instead of treating it like a separate graphic, we painted it directly onto the surface of the mug so it would hold up properly inside the 3D world.</p>
<p>A small detail, but an important one. It makes the object feel like it belongs to the scene, rather than sitting on top of it.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c119b8545ab963126a7152/b0e11f99-65dd-45e7-ac96-396ddaa12660.png" alt="" style="display:block;margin:0 auto" />
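<p>In Three.js terms, the idea is simply that the drawing lives in the UV space of the mug rather than being composited on top. A minimal sketch, with placeholder paths and material values:</p>
<pre><code>import * as THREE from "three";

// Assign the illustration as the base colour map of the mug, so
// lighting and reflections treat it as part of the ceramic surface.
// The texture path and material values here are placeholders.
const texture = new THREE.TextureLoader().load("/textures/mug-astronaut.png");
texture.colorSpace = THREE.SRGBColorSpace;
texture.flipY = false; // match the UV convention of the exported mesh

const mugMaterial = new THREE.MeshStandardMaterial({
  map: texture,
  roughness: 0.35, // glazed ceramic reads slightly glossy
});
</code></pre>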

<p>Further down the page, in the close up section where Oryzo is pushed to an absurd level of zoom, there is another small layer of detail. If you stay there long enough, tiny characters begin to appear.</p>
<p>Those are tardigrades.</p>
<p>They are not especially cute, which made them even more fun to include. Turning a technical scale reference into something slightly ridiculous landed exactly where we wanted it to, somewhere between overdesigned and completely justified.</p>
<p>Like most things in this project.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c119b8545ab963126a7152/74876f32-70c5-4f9c-bb19-c32f1bb3518e.png" alt="" style="display:block;margin:0 auto" />

<h2>Tools</h2>
<p>We try not to be too precious about tools, but for the sake of record, here is what went into this part of the project:</p>
<ul>
<li><p>Figma for layouts and the UI system.</p>
</li>
<li><p>Rive for interactive animations and motion prototypes.</p>
</li>
<li><p>Midjourney for early concept and composition exploration.</p>
</li>
<li><p>Google Flow for generating the AI joke images on top of our own 3D renders.</p>
</li>
<li><p>Procreate for illustrations.</p>
</li>
</ul>
<h2>Closing Thoughts</h2>
<p>Most of the UX work on Oryzo came down to deciding what not to do.</p>
<p>Fewer typefaces. Fewer colours. Fewer UI ideas competing for attention. Fewer fully generated assets.</p>
<p>The parts of the site that do get loud, the desk scene, the humour, the illustrations, only work because everything around them stays relatively quiet.</p>
<p>In Part 4, we will move from design into the WebGL and Three.js layer, and start breaking down some of the technical tricks that made all of this run in the browser.</p>
<h2>Oryzo Behind the Scenes Series</h2>
<p>We will be publishing the rest of the Oryzo behind the scenes series over the next few days. If you enjoyed this post, feel free to bookmark it or subscribe for the upcoming parts.</p>
<ul>
<li><p><a href="https://blog.lusion.co/oryzo-bts-part-1-7-concept-and-creative-direction">☑ Oryzo BTS (Part 1 / 7), Concept and Creative Direction</a></p>
</li>
<li><p><a href="https://blog.lusion.co/oryzo-bts-part-2-7-3d-design-and-motion-graphics">☑ Oryzo BTS (Part 2 / 7), 3D Design and Motion Graphics</a></p>
</li>
<li><p><strong>☑ Oryzo BTS (Part 3 / 7), Website UX/UI and Illustrations</strong></p>
</li>
<li><p><em>☐ Oryzo BTS (Part 4 / 7), WebGL/ThreeJS Tricks 1</em></p>
</li>
<li><p><em>☐ Oryzo BTS (Part 5 / 7), WebGL/ThreeJS Tricks 2</em></p>
</li>
<li><p><em>☐ Oryzo BTS (Part 6 / 7), WebGL/ThreeJS Tricks 3</em></p>
</li>
<li><p><em>☐ Oryzo BTS (Part 7 / 7), WebGL/ThreeJS Tricks 4</em></p>
</li>
</ul>
]]></content:encoded></item><item><title><![CDATA[Oryzo BTS (Part 2 / 7) - 3D Design and Motion Graphics]]></title><description><![CDATA[If you have not seen Oryzo AI in action yet, I would recommend checking it out first - oryzo.ai. It is, quite honestly, five minutes of your life gloriously wasted for a nerdy laugh.
If Part 1 was abo]]></description><link>https://blog.lusion.co/oryzo-bts-part-2-7-3d-design-and-motion-graphics</link><guid isPermaLink="true">https://blog.lusion.co/oryzo-bts-part-2-7-3d-design-and-motion-graphics</guid><dc:creator><![CDATA[Edan Kwan]]></dc:creator><pubDate>Thu, 02 Apr 2026 14:31:39 GMT</pubDate><enclosure url="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/a0b59013-56ec-4a83-b16a-cd0b055ef6a0.webp" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>If you have not seen Oryzo AI in action yet, I would recommend checking it out first - <a href="http://oryzo.ai"><strong>oryzo.ai</strong></a>. It is, quite honestly, five minutes of your life gloriously wasted for a nerdy laugh.</p>
<p>If <a href="https://blog.lusion.co/oryzo-bts-part-1-7-concept-and-creative-direction">Part 1</a> was about concept and creative direction, this is where the visual world started to take shape. In addition to the main website scenes, we also developed a full set of campaign assets, including a <a href="https://x.com/lusionltd/status/2033564065096753180">prelaunch video</a>, a <a href="https://x.com/lusionltd/status/2034723323637092533?s">launch film</a>, and a range of behind the scenes content. So this post is naturally a more visual one, full of 3D experiments, motion studies, and production tests.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/46100483-19dd-423c-83d3-4cb10270f4ea.webp" alt="" style="display:block;margin:0 auto" />

<p>At Lusion, we care more about the final experience than any particular tool. The pipeline follows the idea, not the other way around. That principle has led us to use a wide mix of software depending on what each project actually needs.</p>
<p>For Oryzo, which relied heavily on 3D imagery and motion, we primarily used <a href="https://www.sidefx.com/products/houdini/">SideFX Houdini</a> for scene building and <a href="https://www.maxon.net/en/redshift">Maxon Redshift</a> for rendering. Houdini, while traditionally associated with visual effects in Hollywood films, has become a core part of our workflow for building flexible systems that translate well into interactive web work. Redshift, being GPU accelerated, gave us the speed we needed to iterate quickly on lighting, textures, and final renders.</p>
<p>With that foundation in place, the next step was to define the main visual anchors of the campaign.</p>
<hr />
<h2>The Hero Scene</h2>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/d9eeebee-9260-4845-a04f-428fec2881c6.png" alt="" style="display:block;margin:0 auto" />

<p>We needed a setting that could hold the coaster naturally while also helping establish the tone of the project.</p>
<p>A desk felt like the obvious choice.</p>
<p>It became the hero scene of the entire campaign: warm, calm, and filled with objects that would feel familiar to designers and other visually minded people. We wanted to create an environment that felt curated and believable enough to support the absurd seriousness of the product presentation.</p>
<p>We explored several directions before landing on the final look. Here are a few early tests:</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/0bf0a49a-a9ee-43c5-8c5b-6a2306532e56.jpg" alt="" style="display:block;margin:0 auto" />

<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/0405cbc2-627b-4c04-a3a6-7a38decae5d8.jpg" alt="" style="display:block;margin:0 auto" />

<p>In the end, we settled on the work desk because it felt the most personal to us. It reflected the kind of space we know well, somewhere between digital tools and analogue mess, between design work and everyday objects.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/9995c4f8-a84d-4cbd-becd-c544a579406c.jpg" alt="" style="display:block;margin:0 auto" />

<p>One of the biggest technical challenges was preserving the fidelity of that scene once it moved into a web based environment. We tried image sequences and video, but they lacked the interactivity we wanted. We also tested real time PBR rendering, but it did not quite reach the visual quality we were aiming for.</p>
<p>So we explored Gaussian Splatting as a way to translate high quality rendered scenes into something that could still run in real time with WebGL. We tested two tools for this workflow: Jawset’s <a href="https://www.jawset.com/">Postshot</a> (paid) and the open source <a href="https://github.com/MrNeRF/LichtFeld-Studio">Lichtfeld Studio</a> (free).</p>
<p>In most of our production tests, Postshot gave us better quality and faster processing times. That said, Lichtfeld Studio has continued to improve and now includes features like LOD support, so it is still very much worth exploring.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/9ee54f34-79da-406b-a8e8-c6d925fd6a53.gif" alt="" style="display:block;margin:0 auto" />

<p>To generate the dataset, we rendered multiple camera perspectives in Houdini and processed them through Postshot to create the splats. We also split the scene into multiple splats for compositing inside the web experience, which we will talk about more in later parts.</p>
<blockquote>
<p>At first, we made the classic mistake of trying to be too clever.</p>
</blockquote>
<p>Because Oryzo.ai is a linear scrolling web experience, we assumed the best dataset would come from rendering the exact camera spline used in the final website. In theory, that sounded efficient. In practice, it reduced the quality of the result. Too many nearly identical frames during eased camera motion meant weaker coverage in the areas that actually needed more variation.</p>
<p>The better solution turned out to be the simpler one: standard hemispherical or spherical camera placement.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/e810b0f0-9a59-4ad3-9c3b-cb1a16a83090.png" alt="" style="display:block;margin:0 auto" />
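<p>If you want to build a similar capture rig, the placement logic is easy to sketch. The TypeScript below is our own illustration of the idea, a Fibonacci spiral over the upper hemisphere with every camera aimed at the scene centre, rather than the actual Houdini setup:</p>
<pre><code>// Spread N cameras evenly over the upper hemisphere with a
// Fibonacci spiral; aim each one at the scene centre when rendering.
type Vec3 = { x: number; y: number; z: number };

function hemisphereCameras(count: number, radius: number): Vec3[] {
  const golden = Math.PI * (3 - Math.sqrt(5)); // golden angle, ~2.4 rad
  const cams: Vec3[] = [];
  for (let i = 0; i &lt; count; i++) {
    const y = (i + 0.5) / count;        // height fraction, stays above the desk
    const ring = Math.sqrt(1 - y * y);  // ring radius at this height
    const theta = golden * i;
    cams.push({
      x: Math.cos(theta) * ring * radius,
      y: y * radius,
      z: Math.sin(theta) * ring * radius,
    });
  }
  return cams;
}
</code></pre>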

<p>Postshot requires both images and COLMAP data. With real world scans, Postshot can estimate this itself, or you can use tools like RealityScan to improve the tracking. But in our case, because the images were rendered in Houdini, the camera data already existed.</p>
<p>That meant we could export it directly.</p>
<p>To streamline that process, we used <a href="https://github.com/cgnomads/GSOPs">GSOPs</a>, a third party Houdini toolset for Gaussian splats, which made it much easier to export COLMAP data and move efficiently from Houdini into Postshot.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/a05e0df6-ed39-4f01-9b2f-984b25841cb7.gif" alt="" style="display:block;margin:0 auto" />

<p>The scene itself was built through a mix of sourced assets and custom work. Some areas, like the background trays, included many small objects that would have been tedious to place manually. For those, we used rigid body simulations to let the objects settle naturally into place.</p>
<p>That is often how these workflows unfold. Large forms come together quickly, then the smaller details demand much more specific, and sometimes slightly strange, solutions.</p>
<blockquote>
<p>One Trick Pony Does Not Work</p>
</blockquote>
<p>We tried rendering the entire scene as Gaussian splats:</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/fc4d05db-07aa-4460-9a29-d1b213f47a44.webp" alt="" style="display:block;margin:0 auto" />

<p>We had already pushed it to around 900,000 splats, but if you look closely at the desk and the cutting mat, you can still see plenty of flaws in the wooden details and the fine sketch lines.</p>
<p>That led us to a hybrid solution.</p>
<p>We used splats only for the props and the desk reflections. Once we narrowed the splat content to the elements that actually benefited from it, the result immediately looked better, even with far fewer points: around 78,233 splats on desktop and 44,683 on mobile.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/fc190f1c-423f-482f-888e-471f61637fc4.png" alt="" style="display:block;margin:0 auto" />

<p>For the rest of the scene, we fell back to simpler texture mapping. Even there, we used several tricks to preserve visual quality while keeping the overall file size under control.</p>
<p>For the desk and the mat, we simplified the geometry and merged them together. Redshift does not let you directly bake displacement or tessellation detail into a texture map in the way we wanted, so instead of baking, we rendered orthographic passes from the top and front without reflections, then stitched them back together in Photoshop.</p>
<p>Because the camera spends most of its time focused around the centre of the desk, we also added a shader pass to distort the texture coverage so that the centre 50 percent of the surface received roughly 90 percent of the available texture detail.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/291383b5-8b20-4c7f-abd5-5aeda30da617.webp" alt="" style="display:block;margin:0 auto" />

<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/b6ce2c1c-6b47-4892-9e69-f87a2fc82d76.webp" alt="" style="display:block;margin:0 auto" />
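<p>The remap behind that trick is easy to sketch. Here is an illustrative TypeScript version of the idea, not the production shader: a symmetric piecewise curve that spends 90 percent of the texture range on the middle half of the surface.</p>
<pre><code>// Map a surface coordinate (0..1) to a texture coordinate (0..1) so the
// centre 50% of the surface covers roughly 90% of the texture detail.
function centreWeightedUV(u: number): number {
  const s = Math.abs(u - 0.5) * 2;    // 0 at the centre, 1 at the edge
  const t = s &lt;= 0.5
    ? s * (0.9 / 0.5)                 // centre half spans 90% of the range
    : 0.9 + (s - 0.5) * (0.1 / 0.5);  // outer half squeezes into the last 10%
  return u &lt; 0.5 ? 0.5 - t * 0.5 : 0.5 + t * 0.5;
}
// centreWeightedUV(0.25) = 0.05 and centreWeightedUV(0.75) = 0.95:
// the middle half of the desk owns 90% of the texels.
</code></pre>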

<hr />
<h2>Human Interface</h2>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/00514007-2788-486c-8352-8bc1be0d9005.png" alt="" style="display:block;margin:0 auto" />

<p>One of the clearest visual references in Oryzo came from the way premium tech campaigns use hands.</p>
<p>That kind of imagery is familiar from product advertising, especially in Apple campaigns, where human interaction is used to make digital objects feel tactile, minimal, and desirable. We wanted to borrow some of that visual language and reinterpret it for our own purposes.</p>
<p>The result was the six finger hand scene.</p>
<p>We started from a high quality 3D scan of a real hand that we purchased from <a href="https://www.3dscanstore.com/hand-3d-model/all-model-hands-section/female-3d-hand-model-black-20">3D Scan Store</a>, then modified it to give it one extra finger.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/6076fa31-7ec5-4839-ba46-c3f698ba256e.webp" alt="" style="display:block;margin:0 auto" />

<p>We then built a rig in Houdini using <strong>KineFX</strong>. Even though rigging can become very complex, the motion we needed was fairly contained, so a relatively simple setup was enough to give us the control we wanted.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/a5e3035e-83ba-4dac-83ba-9795ae225607.png" alt="" style="display:block;margin:0 auto" />

<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/ce6e895b-6cd5-4b1e-9789-ef4dc4d2fc8c.gif" alt="" style="display:block;margin:0 auto" />

<p>What mattered most here was not technical complexity for its own sake, but the feeling of physical intent. The hand needed to move with just enough realism to sell the premium presentation, while still leaving room for the joke to land.</p>
<hr />
<h2>Function Reimagined</h2>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/089a7fe1-ceb0-4ecd-b0bf-720153aa0e5f.webp" alt="" style="display:block;margin:0 auto" />

<p>Once the core visual language was in place, we started looking for ways to stretch the product concept further.</p>
<p>One of those directions was the idea of making the coaster feel “wearable.” Premium products never just sell specifications. They sell lifestyle, identity, and status. Somehow, following that line of thinking led us to the condom wrapper packaging scene.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/dfa9facd-8b13-4eb8-b9fb-04f06444556b.webp" alt="" style="display:block;margin:0 auto" />

<p>For the packaging, we used Houdini’s Vellum system to simulate realistic stretching and material behaviour. You could approach this with sculpting or more traditional polygon modelling, but simulation made more sense for us because we already knew the geometry would need to tear later.</p>
<p>That made the process feel closer to a real material study rather than a purely static model.</p>
<p>We used two planes with different physical properties to represent two different materials. The front side was a flimsy transparent plastic with lower stiffness. The back side was a soft metallic layer that was stiffer and more resistant to bending. We then applied attraction forces that behaved like a vacuum seal. To add more realism and wrinkling, the attraction force on the upper layer was multiplied by a noise distorted radial wave pattern, as shown below.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/c860bc9a-b7a9-4f6e-8524-dc1421cadff8.webp" alt="" style="display:block;margin:0 auto" />
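<p>As a rough sketch of that force shaping, here is the multiplier written in TypeScript for readability rather than VEX. The noise function and the constants are stand-ins, not our production values:</p>
<pre><code>// Cheap deterministic noise stand-in (hash based), range -1..1.
function noise2(x: number, y: number): number {
  const n = Math.sin(x * 12.9898 + y * 78.233) * 43758.5453;
  return (n - Math.floor(n)) * 2 - 1;
}

// Attraction multiplier for the vacuum seal: a radial wave from the
// wrapper centre, its wavefront distorted by noise to create wrinkles.
function sealAttraction(px: number, py: number, time: number): number {
  const dist = Math.hypot(px, py);               // distance from centre
  const wobble = noise2(px * 3, py * 3) * 0.35;  // distort the wavefront
  const wave = Math.sin((dist + wobble) * 24 - time * 2);
  return 0.5 + 0.5 * wave; // remap -1..1 to 0..1: pull strength only
}
</code></pre>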

<p>The tearing animation was also driven by Vellum. We split the mesh into separate sections, stitched them back together with <strong>weld constraints</strong>, and let those constraints break dynamically once they reached their stress limits. By animating the initial separation, the solver handled the rest and produced a tearing motion that felt much more natural than a hand animated effect would have.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/47828e92-279c-4ae5-92e1-ba66726d9a95.webp" alt="" style="display:block;margin:0 auto" />
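<p>The logic of a breakable weld is simple to sketch outside the solver. The TypeScript below illustrates the concept with made up thresholds; it is not Vellum itself:</p>
<pre><code>type Pt = { x: number; y: number; z: number };
type Weld = { a: Pt; b: Pt; restLength: number; breakStrain: number };

// Run once per solver substep: a weld survives only while the strain
// between its points stays under its threshold. Broken welds never
// return, so the tear propagates outward from the animated separation.
function solveWelds(welds: Weld[]): Weld[] {
  return welds.filter((w) => {
    const len = Math.hypot(w.b.x - w.a.x, w.b.y - w.a.y, w.b.z - w.a.z);
    const strain = (len - w.restLength) / w.restLength;
    return strain &lt; w.breakStrain;
  });
}
</code></pre>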

<hr />
<h2>Inside the Material</h2>
<p>As part of the storytelling, we wanted to talk about the material qualities of cork at a microscopic level. That led us to create a close up render as the main background, paired with an interactive microscopic view box for a more detailed material reveal.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/a38b8479-8e3e-4420-828d-479caa91b5c4.webp" alt="" style="display:block;margin:0 auto" />

<p>For the macro render, we first developed an early look development pass that looked something like this:</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/796d908b-a193-4578-8511-dabb45f18913.webp" alt="" style="display:block;margin:0 auto" />

<p>The cork itself was built using a VDB modelling setup combined with procedural noise to create the right look and feel. It already felt visually convincing, but then we pushed it further by treating it as a microscopic world that also needed to loop seamlessly across the horizontal scroll.</p>
<p>That turned out to be particularly challenging. We had to pay close attention to seam handling and rely on periodic noise so that both the vertex positions and the surface normals would transition cleanly across the loop.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/684318cc-2a03-4e81-8988-0f8a01b77e9f.webp" alt="" style="display:block;margin:0 auto" />
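<p>The standard trick for that kind of seamless loop is to sample noise around a circle instead of along a line, so the start and the end of the loop are literally the same sample. A minimal TypeScript illustration, with a hash based stand-in for proper smooth noise:</p>
<pre><code>// Hash based stand-in for a proper smooth 2D noise, range 0..1.
function hashNoise2(x: number, y: number): number {
  const n = Math.sin(x * 127.1 + y * 311.7) * 43758.5453;
  return n - Math.floor(n);
}

// Sampling around a circle guarantees periodicNoise(0) === periodicNoise(1),
// so positions and normals displaced by it close cleanly across the seam.
function periodicNoise(t: number, radius = 1): number {
  const angle = t * Math.PI * 2; // t in 0..1 walks once around the circle
  return hashNoise2(Math.cos(angle) * radius, Math.sin(angle) * radius);
}
</code></pre>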

<p>We also added a small Easter egg in this interactive section. If you drag and shake the microscopic view rapidly, a water bear appears.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/36977060-e17f-4937-80e2-b2257ec2585a.webp" alt="" style="display:block;margin:0 auto" />

<p>To create that asset, we experimented with an AI assisted pipeline. We generated multiple reference views using Nano Banana Pro via <a href="https://labs.google/fx/tools/flow/">Google Flow</a>, processed them through <a href="https://3d.hunyuan.tencent.com/">Hunyuan’s 3D generator</a>, then refined the result in ZBrush and added procedural detailing before baking it back into textures.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/e7bd37d9-d906-4ef9-ab14-23e5581f6dcf.webp" alt="" style="display:block;margin:0 auto" />

<hr />
<h2>Grounded in the Real</h2>
<p>The more we worked on the project, the more important it felt to ground some of the visuals in something physically real.</p>
<p>When we looked more closely at the material, we were reminded that cork comes from the outer bark of cork oak trees. That gives it a strong tactile and ecological identity. Rather than relying entirely on pre made models or purely generated assets, we wanted to bring some of that material truth directly into the campaign.</p>
<p>So we bought a piece of cork bark from Amazon, mounted it on a tripod setup, and photographed it for photogrammetry using a digital camera in RAW.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69c11fef545ab96312724825/5750c39c-852c-4610-be0b-3ffbcc4d25c8.webp" alt="" style="display:block;margin:0 auto" />

<p>We captured around 180 high resolution images to give us enough coverage for accurate geometry and texture reconstruction. Those images were then processed in <a href="https://www.realityscan.com/">RealityScan</a> to generate the mesh and texture maps.</p>
<p>From there, we refined the result further with subtle procedural surface treatment to improve the richness of the close up renders.</p>
<p>That scanned bark ended up becoming an important visual ingredient not only in the website, but also in the film and promotional assets.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/e000a540-f52e-4a90-8159-2df0f802d842.webp" alt="" style="display:block;margin:0 auto" />

<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/554bc642-9cef-4860-8613-264a505bfa0e.webp" alt="" style="display:block;margin:0 auto" />

<hr />
<h2>The Film</h2>
<p>By the time we moved into the launch film, most of the visual ingredients were already there.</p>
<p>The goal of the film was to distill them into something simple, premium, and cinematic, taking cues from luxury technology product videos without overcomplicating the structure.</p>
<p>We opened with a sequence focused on how cork is traditionally produced, using the same scanned bark asset to anchor the story in something tangible and material.</p>
<p>From there, the film transitions into a more stylised world. The disintegration effect was created using VDB mesh booleans, particle simulations, and pyro smoke, allowing the material to shift from natural object into designed spectacle.</p>
<p>The following sequence uses a textured backdrop generated in Copernicus, Houdini’s GPU accelerated image processing system, inspired in part by Jose Molfino’s <a href="https://x.com/Jose_Molfino/status/1894125340122837073">experiments</a> translating TouchDesigner style techniques into COPs.</p>
<p>The remaining shots are relatively restrained. They rely less on technical novelty and more on timing, composition, and motion. That was intentional. For a product film like this, the challenge is often not adding more, but knowing when to stop.</p>
<p>And in a way, that idea runs through the whole project.</p>
<p>Even when the product is absurd, the craft still works best when it is controlled.</p>
<p><a class="embed-card" href="https://www.youtube.com/watch?v=0PZPwjqYViw">https://www.youtube.com/watch?v=0PZPwjqYViw</a></p>

<hr />
<h2>Closing Thoughts</h2>
<p>By this stage, Oryzo had already become much more than a single coaster render. It had grown into a full visual system spanning the website, campaign content, and launch film. What made that possible was not any one trick, but the way all of these pieces were shaped to support the same tone.</p>
<p>In Part 3, we will go deeper into the website flow, illustration, and UI design decisions that helped bring that tone into the interactive experience.</p>
<hr />
<h2>Oryzo Behind the Scenes Series</h2>
<p>We will be publishing the rest of the Oryzo behind the scenes series over the next few days. If you enjoyed this post, feel free to bookmark it or subscribe for the upcoming parts.</p>
<p><a href="https://blog.lusion.co/oryzo-bts-part-1-7-concept-and-creative-direction">☑ Oryzo BTS (Part 1 / 7) - Concept and Creative Direction</a></p>
<p><strong>☑ Oryzo BTS (Part 2 / 7) - 3D Design and Motion Graphics</strong></p>
<p><a href="https://blog.lusion.co/oryzo-bts-part-3-7-website-ux-ui-and-illustrations">☑ Oryzo BTS (Part 3 / 7) - Website UX/UI and Illustrations</a></p>
<p><em>☐ Oryzo BTS (Part 4 / 7) - WebGL/ThreeJS Tricks 1</em></p>
<p><em>☐ Oryzo BTS (Part 5 / 7) - WebGL/ThreeJS Tricks 2</em></p>
<p><em>☐ Oryzo BTS (Part 6 / 7) - WebGL/ThreeJS Tricks 3</em></p>
<p><em>☐ Oryzo BTS (Part 7 / 7) - WebGL/ThreeJS Tricks 4</em></p>
]]></content:encoded></item><item><title><![CDATA[Oryzo BTS (Part 1 / 7) - Concept and Creative Direction]]></title><description><![CDATA[If you have not seen Oryzo AI in action yet, I would recommend checking it out first - oryzo.ai. It is, quite honestly, five minutes of your life gloriously wasted for a nerdy laugh.


Oryzo is a year]]></description><link>https://blog.lusion.co/oryzo-bts-part-1-7-concept-and-creative-direction</link><guid isPermaLink="true">https://blog.lusion.co/oryzo-bts-part-1-7-concept-and-creative-direction</guid><category><![CDATA[BTS]]></category><category><![CDATA[creative coding]]></category><category><![CDATA[Web Development]]></category><category><![CDATA[Web Design]]></category><category><![CDATA[animation]]></category><category><![CDATA[Founder]]></category><category><![CDATA[startup]]></category><dc:creator><![CDATA[Edan Kwan]]></dc:creator><pubDate>Mon, 30 Mar 2026 14:10:39 GMT</pubDate><enclosure url="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/cb2cede1-06b8-480c-b2c7-ff19304912b6.png" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>If you have not seen Oryzo AI in action yet, I would recommend checking it out first - <a href="https://oryzo.ai/">oryzo.ai</a>. It is, quite honestly, five minutes of your life gloriously wasted for a nerdy laugh.</p>
<img alt="" style="display:block;margin:0 auto" />

<p>Oryzo is a year long internal project, and we wanted to share some of the thinking behind how it came together. This post is the first in a seven part behind the scenes series on the making of Oryzo.ai. Across the series, we will cover the concept, creative direction, design process, and technical execution behind the project.</p>
<p>We hope you enjoy it and maybe find a few useful ideas in it too.</p>
<hr />
<h2>When the Legend Was Born</h2>
<p>Back in early 2025, we had not done a proper Monthly Experiment at Lusion for quite a while.</p>
<p>At Lusion, we have a long standing tradition of setting aside time for internal visual experiments and sharing them online. Some of those ended up on our <a href="https://labs.lusion.co/">Lusion Labs</a> page. Those projects were always valuable for us. They gave us room to play, test ideas, and explore directions that client work would not always allow.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/a6e2abc4-24c6-4308-8aa1-bb5bfae2b531.webp" alt="" style="display:block;margin:0 auto" />

<p>After a particularly intense period of client projects, we decided it was time to commit to one bigger internal piece.</p>
<p>If you have followed our work before, you will know that we mostly operate in the advertising space, designing and developing 3D immersive digital experiences for brands and agencies. Over the past few years, though, the industry has been shifting, and we have been gradually trying to move further into brand design and storytelling.</p>
<p>That led us to a simple realization.</p>
<p>Most of our work had been for digital products. We had never created a website for a physical product.</p>
<p>When we looked at websites like Opal Camera and Daylight Computer, it became obvious that this was a space we wanted to explore more seriously. Product storytelling on the web has its own language, its own rhythm, and its own kind of restraint. We felt it could open up a very interesting direction for the studio.</p>
<p>But there was one obvious problem.</p>
<p>We did not actually have a product.</p>
<p>So the first step was to find one.</p>
<p>It could be almost anything, but we knew it could not be something heavily branded. A lot of design experiments rely on borrowing the equity of a famous logo to create instant appeal. You put a Nike logo on something, make it glossy enough, and suddenly it looks more legitimate than it really is. We never liked that approach. It feels more like borrowing attention than earning it.</p>
<p>The product had to be generic enough to prove a point:</p>
<blockquote>
<p>If we can sell this, we can help you sell anything.</p>
</blockquote>
<p>At some point, someone on the team picked up an IKEA cork coaster from his desk and said, "What about this?"</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/c186b6cb-b698-4b79-9cc5-cdb826f1bf61.webp" alt="" style="display:block;margin:0 auto" />

<p>That was it.</p>
<p>We ran with it, and Oryzo was born.</p>
<hr />
<h2>A Premium Coaster</h2>
<p>As you would expect, there are not many features you can realistically sell with a cork coaster.</p>
<p>In the beginning, we genuinely considered abandoning the idea and choosing something else. But after sitting with it for a while, we realised the limitation was actually the opportunity. The product was so mundane that treating it seriously was already funny. That tension became the creative hook.</p>
<p>So instead of moving away from the absurdity, we leaned into it.</p>
<p>We decided to position the coaster as if it were a premium tech product. Something presented with the kind of confidence, polish, and dramatic seriousness you would normally associate with a keynote launch or a high end product page. The gap between what the product actually is and how seriously it is being presented created most of the comedy we needed.</p>
<p>That contrast became the foundation of the project.</p>
<p>Internally, we studied a lot of product storytelling references, particularly the way brands like Apple frame value, intention, and design. Not because we wanted to make a parody of Apple, but because they are still one of the clearest examples of how to turn a product page into a narrative experience.</p>
<p>That influence shaped everything from pacing to framing to the way the product is revealed.</p>
<p>In one section, we took inspiration from Apple’s product storytelling language by using a hand to grab the coaster as part of the title reveal, then followed it with a rainbow like border activation reminiscent of the Action button interface.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/c0a9fcf5-3b25-427b-84cc-9bda2d5846b1.webp" alt="" style="display:block;margin:0 auto" />

<p>This project also forced us to work differently from the kind of immersive websites we usually make.</p>
<p>A premium product page is not built the same way as a spectacle driven experimental site. Normally, we love dramatic motion, big visual moments, and highly performative transitions. But for Oryzo, we had to dial all of that down. The goal was not to overwhelm the user. The goal was to make them look at the product, believe the framing, and feel the brand language around it.</p>
<p>That meant no unnecessary chaos. No huge transformation gimmicks. No random cinematic objects bursting through the screen.</p>
<p>The coaster had to stay at the centre of everything.</p>
<p>Even though the product itself is ridiculous, the experience still needed to feel emotionally convincing. We wanted the site to subtly suggest why this coaster might deserve premium attention, while also making it clear that we were fully aware of the joke.</p>
<p>At the same time, we did not want to make a straight Apple Store clone. Strong visual storytelling is still one of the core things that makes our work feel like Lusion, so the site needed to sit somewhere between premium restraint and immersive theatre.</p>
<p>That balance was surprisingly difficult to get right.</p>
<p>One of the things we learned while crafting our own <a href="http://lusion.co">lusion.co</a> is that visual consistency matters a lot more than individual flashy moments. A beautiful section on its own is not enough. What makes the experience feel premium is when every section feels like it belongs to the same world.</p>
<p>So for Oryzo, we focused heavily on continuity.</p>
<p>We wanted scene transitions to feel as seamless as possible, even when shifting between very different layouts and 3D compositions. That decision made the project significantly harder, especially when dealing with responsiveness and figuring out how to move between scenes without awkward cuts or obvious resets.</p>
<p>But in the end, that effort was worth it. The smoothness of those transitions is one of the reasons the whole thing feels more considered.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/c5cc65e7-3fd1-4720-82a0-d3b221b149b1.webp" alt="" style="display:block;margin:0 auto" />

<hr />
<h2>AI Satire</h2>
<p>We think it is worth stating that we are not against AI.</p>
<p>Like most people working in creative production today, we use AI tools regularly. They are useful for learning, ideation, iteration, and occasionally speeding up parts of the workflow. But at the same time, the internet is now full of low effort AI slop, overhyped product narratives, and a lot of visual nonsense pretending to be innovation.</p>
<p>That made it irresistible material.</p>
<p>If Oryzo was going to pretend to be a premium product for the AI era, then it also needed to borrow some of the visual language and cultural baggage that comes with that territory.</p>
<p>The jokes had to feel specific enough that digital artists, developers, and AI people would immediately recognise them, but still readable enough that a general audience could get the point.</p>
<p>The six finger hand was an obvious one.</p>
<p>It is one of the most iconic visual mistakes from the early Midjourney and Stable Diffusion era, so we took a normal 3D hand model, modified it into a six finger version, and custom rigged it for a playful interaction on the site.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/93269428-d905-429f-9faa-21ec8a75d07d.webp" alt="" style="display:block;margin:0 auto" />

<p>We also generated a yoga instructor character whose head spins all the way around.</p>
<p>Ironically, current image and video models are now much better at preserving human anatomy than they used to be, so getting something convincingly wrong actually took more trial and error than expected. One of the funnier discoveries was that prompting for a 360 degree head spin often does not break the model enough. Asking for 720 degrees gives you much more cursed results.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/6bc7012a-b4ca-457d-a2cc-493779a36ddf.webp" alt="" style="display:block;margin:0 auto" />

<p>We also took a shot at wearable AI culture. At some point, every awkward piece of speculative consumer hardware started being introduced as if it would fundamentally change humanity. That was too tempting not to poke at.</p>
<p>So yes, there is a little AI pin joke in there too.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/de637841-ff14-4ee4-90bb-95c262f9a941.webp" alt="" style="display:block;margin:0 auto" />

<p>And of course, we could not ignore the endless flood of questionable image generation content on X.</p>
<p>So that made its way into the project as well.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/2a2ec16e-d74c-4938-ab23-63bec197f309.webp" alt="" style="display:block;margin:0 auto" />

<p>For the AI crowd specifically, we wanted one joke that went a step further than visual gags. That is where Oryzo 1 came from.</p>
<p>If every AI product now needs an open weight model, then obviously our cork coaster needed one too.</p>
<p>So we created an academic style section for Oryzo 1, complete with OBJ model releases, a fake research framing, a model page, a <a href="https://github.com/lusionltd/ORYZO-1">GitHub link</a>, a paper reference, and even a BibTeX block. It is probably the most unnecessarily committed part of the whole project, which is exactly why we liked it.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/341fdc52-53b0-4d84-9f1a-c375135c4a0e.png" alt="" style="display:block;margin:0 auto" />

<hr />
<h2>VC Founder Video</h2>
<p>By this point, you can probably tell that we take our jokes far too seriously.</p>
<p>About a week before launch, we had another thought.</p>
<p>If this was going to feel like a real AI product launch, then it should not stop at the website. It also needed the founder video. You know the type. Slightly dramatic. Softly self important. Full of vague problem statements, carefully framed ambition, and just enough restrained confidence to imply that something world changing is happening.</p>
<p>So we made one.</p>
<p>We spent some time studying this genre of video, and the structure is almost always the same:</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/86d6ef9d-ed3e-4db1-8e74-048fc41241a0.webp" alt="" style="display:block;margin:0 auto" />

<p>Once we recognised the format, writing the script became much easier.</p>
<p>The trick was to make it sound legitimate at first, then gradually let the absurdity slip in. We wanted the viewer to feel, for a moment, that this might actually be a real founder video before lines like "removing AI from our AI product" or "zero million dollars in revenue and a zero billion valuation" quietly break the illusion.</p>
<p>To help sell the joke, we also added infographic style motion graphics throughout the video. That layer was important because founder videos often rely on graphics to make very ordinary statements feel meaningful. Used properly, they made the satire land much better.</p>
<img src="https://cdn.hashnode.com/uploads/covers/69bd2d832ff723725f185c76/2837a458-4682-4db6-b5d2-ed4349dea1be.webp" alt="" style="display:block;margin:0 auto" />

<p>The production pipeline itself was surprisingly straightforward:</p>
<ol>
<li><p>Our ECD, Edan Kwan, recorded the voiceover.</p>
</li>
<li><p>We photographed Edan from a range of static angles.</p>
</li>
<li><p>We used Nano Banana Pro to generate a darker studio style background, because our actual studio is bright and airy, with oak wood flooring that did not really fit the mood we wanted.</p>
</li>
<li><p>We used ElevenLabs Creatify Aurora to combine the audio and still images into a full one minute founder style video. We tested other platforms as well, including Kling, but this gave us the strongest result for what we needed.</p>
</li>
<li><p>After that, we finished everything using a more traditional video workflow: editing, colour grading, animated infographics, and stock audio.</p>
</li>
</ol>
<p>What is funny is that even though the final piece is clearly a joke, a lot of the effort behind it was completely real. That ended up becoming one of the themes of the whole project.</p>
<p>The idea is ridiculous.</p>
<p>The craft is not.</p>
<p><a class="embed-card" href="https://www.youtube.com/watch?v=uGJ9qh7DO-0">https://www.youtube.com/watch?v=uGJ9qh7DO-0</a></p>

<hr />
<h2>Closing Thoughts</h2>
<p>What started as a simple question, how do we make a physical product website without having a physical product, turned into one of the most enjoyable internal projects we have worked on in a long time.</p>
<p>The cork coaster gave us the perfect canvas because it was so ordinary. It forced us to focus on storytelling, design language, pacing, and tone rather than relying on the product itself to do the heavy lifting. And once we embraced the joke, it opened up even more room to explore the strange overlap between premium branding, immersive web design, and AI satire.</p>
<p>In Part 2, we will go behind the scenes of the 3D design and motion graphics work, and show how we built the premium visual tone that made this ridiculous little product feel strangely believable.</p>
<hr />
<h2>Oryzo Behind the Scenes Series</h2>
<p>We will be publishing the rest of the Oryzo behind the scenes series over the next few days. If you enjoyed this post, feel free to bookmark it or subscribe for the upcoming parts.</p>
<p><strong>☑ Oryzo BTS (Part 1 / 7) - Concept and Creative Direction</strong></p>
<p><a href="https://blog.lusion.co/oryzo-bts-part-2-7-3d-design-and-motion-graphics">☑ Oryzo BTS (Part 2 / 7) - 3D Design and Motion Graphics</a></p>
<p><a href="https://blog.lusion.co/oryzo-bts-part-3-7-website-ux-ui-and-illustrations">☑ Oryzo BTS (Part 3 / 7) - Website UX/UI and Illustrations</a></p>
<p><em>☐ Oryzo BTS (Part 4 / 7) - WebGL/ThreeJS Tricks 1</em></p>
<p><em>☐ Oryzo BTS (Part 5 / 7) - WebGL/ThreeJS Tricks 2</em></p>
<p><em>☐ Oryzo BTS (Part 6 / 7) - WebGL/ThreeJS Tricks 3</em></p>
<p><em>☐ Oryzo BTS (Part 7 / 7) - WebGL/ThreeJS Tricks 4</em></p>
]]></content:encoded></item></channel></rss>