The Sauce Code, Plug-in for After Effects


Building theSauceCode: A Year of Creative Problem-Solving

I've been busy building theSauceCode, an After Effects plug-in for ASCII motion design, from the ground up. It started as a hobby. I found making ASCII artwork surprisingly tedious, usually involving websites or external tools and lots of exporting and re-importing. I wanted something that worked directly inside After Effects, without breaking the flow, so I built my own workflow.

There are a few alternatives out there, but most felt like simple overlays, so I started building my own. What began as a side project slowly took over my life for about a year. I wanted a tool with a unique, hands-on toolset that genuinely becomes part of your workflow; once you get into the flow, it's a dream to use.

That idea turned into months of problem-solving, learning, breaking things, and rebuilding them properly: from early concepts and prototypes to native builds for PC and Mac in Visual Studio and Xcode.

And now, after certification and approvals, it's finally live on aescripts. theSauceCode is out in the world, and I'm excited to finally share what I've been working on.

The Problem: Why Nothing Else Worked

Before I started building theSauceCode, I tried every ASCII solution I could find. Web-based converters, command-line tools, simple After Effects overlays. They all had the same fundamental problem: they broke your creative flow, and the results looked exactly like what every other user would end up with.

The web tools were the worst. You'd export your footage, upload it to some site, wait for processing, download the result, and import it back into After Effects. Every time you wanted to adjust something - change the character set, tweak the density, modify the look - you'd repeat the entire process. It wasn't a workflow, it was a loop of frustration.

The After Effects plugins that existed were mostly just character overlays. They'd place ASCII characters over your footage, but they weren't really converting anything. No real control over the rendering, no procedural effects, no way to treat the ASCII art as an actual creative medium rather than just a filter slapped on top.

What I wanted was a tool that lived inside After Effects but gave you deep, hands-on control. Something that captured your footage, let you work with it in real-time with proper creative tools, then brought it back into your comp as a proper render you could composite with. That tool didn't exist, so I had to build it.

The Architecture Decision: Plugin + Standalone Editor

The first major decision was architectural: how do you build something that integrates with After Effects but also gives you the kind of real-time creative control you'd expect from a dedicated application?

I could have built everything directly into the plugin interface, but After Effects' plugin UI system is... limited. You're working within constraints that don't lend themselves to real-time preview, complex parameter controls, or the kind of interactive feedback loop that creative tools need. After Effects' native text tools are limited too - they can do some neat things I love, but not in the way I wanted.

So I split it: a plugin that lives in After Effects and handles frame capture and import, plus a standalone editor where the real creative work happens.

The Plugin Side:

Captures frames from your After Effects composition

Manages session data and file paths

Handles the import of rendered ASCII sequences back into your project

Integrates with aescripts' licensing system

The Editor Side:

Real-time viewport with instant parameter feedback

Full creative toolset for character mapping, effects, animation

Rendering engine that produces the final PNG sequences

Timeline system with procedural animation and evolution interpolation

The challenge was making these two components work together seamlessly. The plugin needed to pass session information to the editor, the editor needed to know where captured frames lived, and the entire system needed to maintain state across After Effects restarts and multiple effect instances.

Session management became the glue. Each effect instance gets its own unique session folder, stored locally on the user's machine. The plugin writes captured frames there, the editor reads from there, and exports go there. This isolation means you can run multiple ASCII treatments simultaneously without them interfering with each other - each one has its own workspace, its own state, its own output. Getting there took a few headaches and naps, but I'd usually find a solution eventually.

Learning the Adobe SDK: The Undocumented Reality

Building an After Effects plugin means working with Adobe's SDK. The documentation exists, but it's sparse. The online community? Even sparser. Most of what I needed to learn came from trial, error, and reading through example code trying to understand what the hell was actually happening.

The SDK itself is powerful: you can hook into After Effects' rendering pipeline, access composition data, handle parameters, and integrate with the UI. But figuring out how to do any of that when you're starting from scratch is its own education.

I spent weeks just understanding how the plugin lifecycle works: when your code gets called, what data is available at different stages, how to properly allocate and manage memory in Adobe's plugin environment. The render pipeline is particularly opaque if you're coming from general software development. It's not just "here's your frame, process it" - there are parameter dependencies, caching considerations, multi-threaded rendering, and a dozen other things that aren't immediately obvious.

The breakthrough moments usually came from persistence. Read the example code. Break it. Fix it. Try something slightly different. Break it again. Eventually, you start to understand not just what the code does, but why it's structured that way.

Building on top of the SDK meant accepting that some things would be harder than they should be, and that's just the cost of working within someone else's architecture. But once you understand the patterns, it becomes manageable.

The Pipeline: How Everything Connects

The pipeline is deceptively simple on the surface: capture frames, edit them, export them, import them back. But making that flow work smoothly required careful coordination between components.

Frame Capture: When you click "Frame Capture" in the plugin, it's not just grabbing pixels. The plugin needs to:

Convert the current frame to grayscale (because ASCII is luminance-based)

Write it to the session folder with proper sequential naming

Update the session metadata so the editor knows new frames exist

Handle the case where the user is capturing over multiple work sessions

The grayscale conversion happens immediately so you can see how your footage will map to characters before you even open the editor. It's a preview of the luminance information that will drive the ASCII conversion.
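To make that concrete, here's a minimal sketch of the kind of capture-time conversion involved, assuming 8-bit RGBA input and standard Rec. 709 luminance weights. The plugin's actual weighting and bit-depth handling may differ; this is just an illustration.

```cpp
// Minimal sketch of a capture-time grayscale pass: one luminance value per pixel,
// which later decides how each grid cell maps to a character.
#include <cstddef>
#include <cstdint>
#include <vector>

std::vector<uint8_t> toLuminance(const uint8_t* rgba, int width, int height)
{
    std::vector<uint8_t> luma(static_cast<std::size_t>(width) * height);
    for (std::size_t i = 0; i < luma.size(); ++i)
    {
        const uint8_t r = rgba[i * 4 + 0];
        const uint8_t g = rgba[i * 4 + 1];
        const uint8_t b = rgba[i * 4 + 2];
        // Weighted sum approximating perceived brightness (Rec. 709 weights).
        luma[i] = static_cast<uint8_t>(0.2126f * r + 0.7152f * g + 0.0722f * b);
    }
    return luma;
}
```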

The Editor Connection: When you click "Open Sauce," the plugin launches the standalone editor and passes the session ID. The editor:

Reads the session folder to find captured frames

Loads them into memory for real-time processing

Applies the current parameter settings to generate the viewport preview

Maintains its own state file so your settings persist across sessions

Everything is file-based. The plugin and editor don't communicate through network sockets or shared memory - they communicate through the filesystem. It's simple, it's reliable, and it works across different processes and different user sessions.
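As an illustration of that file-based handoff, something like the sketch below is all the "protocol" really needs. The session.json filename and field names here are assumptions for the example, not the plugin's real format.

```cpp
// The plugin writes session metadata next to the captured frames; the editor only
// has to re-read this file (and the frames folder) to pick up new work.
#include <filesystem>
#include <fstream>

namespace fs = std::filesystem;

void writeSessionMetadata(const fs::path& sessionDir, int frameCount, int width, int height)
{
    fs::create_directories(sessionDir / "frames");
    std::ofstream meta(sessionDir / "session.json", std::ios::trunc);
    meta << "{\n"
         << "  \"frameCount\": " << frameCount << ",\n"
         << "  \"width\": " << width << ",\n"
         << "  \"height\": " << height << "\n"
         << "}\n";
}
```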

The Export Loop: When you export, the editor walks through every frame in your sequence:

Applies the current parameters (or interpolated parameters if Evolution is active)

Renders the ASCII grid with full character mapping, fills, strokes, and effects

Saves it as a PNG with alpha transparency

Writes metadata JSON for reference

The 1:1 rendering system means whatever resolution went in comes out exactly. No scaling, no interpolation, no quality loss. If you captured 1920x1080, you export 1920x1080.

The Rendering Breakthrough: Character Remapping

This was one of the hardest technical problems to solve, and arguably our best feature.

The naive approach to rendering ASCII art is simple: load a font at a fixed size, draw characters into a grid. But that creates problems. If you're stuck at, say, 24-pixel font rendering, your creative options are limited. You can't scale individual characters dynamically, you can't apply effects that require size variation, and you're locked into a fixed aesthetic.

We needed a way to render characters at arbitrary sizes while maintaining quality and performance. The solution involved multiple attempts before we found something that worked.

Attempt One: Direct Font Scaling. Just scale the font up or down per character. Simple, right? Except font rendering engines don't love being asked to draw the same character at 47 different sizes in rapid succession. Performance tanked, and the quality was inconsistent: small sizes looked terrible, large sizes were fine but slow.

Attempt Two: Pre-render at Multiple Sizes. Pre-render each character at common sizes and cache them. Better performance, but still limited flexibility, and the cache memory requirements grew quickly. Plus, effects that needed arbitrary scaling still fell apart.

The Solution: Character Remapping. Instead of trying to scale fonts dynamically in real-time, we remap characters at the base rendering level. The system renders at a foundation size, but we developed a remapping technique that lets us effectively treat each character position as independently scalable without sacrificing quality.

The key insight was separating the character rendering from the grid positioning. Once those were decoupled, we could apply transformations, effects, and variations without being locked to a single font size.
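A highly simplified sketch of that decoupling, with illustrative names rather than the plugin's internals: glyphs are rasterized once at a base size, while placement is a per-cell transform that carries its own scale.

```cpp
// Each grid cell stores its own scale; placement just computes where (and how
// large) the base-size glyph lands, independently of how it was rasterized.
struct Cell
{
    int   col = 0;
    int   row = 0;
    float scale = 1.0f;   // per-cell scale, e.g. driven by Random Scale or Evolution
};

struct Rect { float x, y, w, h; };

Rect placeGlyph(const Cell& cell, float blockSize, float baseGlyphSize)
{
    const float drawSize = baseGlyphSize * cell.scale;
    const float cx = cell.col * blockSize + blockSize * 0.5f;  // cell centre
    const float cy = cell.row * blockSize + blockSize * 0.5f;
    return { cx - drawSize * 0.5f, cy - drawSize * 0.5f, drawSize, drawSize };
}
```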

This is why effects like Random Scale work smoothly. Each character position can vary independently, the system handles the remapping, and the output remains crisp. It's also why the Evolution system can interpolate font sizes across the timeline without introducing artifacts. You can see in some early posts that the pixels aren't as smooth as in later ones; once I fixed rendering at any scale, I was a very happy person and treated myself to a nap and some decompression time.

The Creative Toolset: Making Effects Work Together

Building individual effects is one thing. Making them work together harmoniously without creating chaos is another.

Character Mapping System: The foundation of everything is luminance-based character mapping. Your footage is converted to grayscale, and luminance values map to four character types:

Black characters

Grey characters

White characters

Alpha characters

Each character type can have its own character, color, opacity, fill, and stroke settings. The rendering happens in layers - fills first, then stroke borders, then characters on top. This layering means you can create depth and visual complexity without the effects fighting each other.
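For illustration, the luminance-to-type lookup can be thought of as a simple banding function. The cut points below, and the assumption that the Alpha type is driven by the source alpha channel, are mine for the example; the plugin exposes its own mapping controls.

```cpp
// Sketch: classify one grid cell into a character type from its luminance
// and alpha. Thresholds are illustrative, not the plugin's defaults.
#include <cstdint>

enum class CharType { Alpha, Black, Grey, White };

CharType charTypeForCell(uint8_t luma, uint8_t alpha)
{
    if (alpha < 16)  return CharType::Alpha;   // transparent source maps to the Alpha character
    if (luma  < 85)  return CharType::Black;   // dark cells
    if (luma  < 170) return CharType::Grey;    // midtones
    return CharType::White;                    // highlights
}
```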

Procedural Effects: The procedural effects (Sine Wave, Advanced Noise, Scroll) needed to be time-based but also controllable. The solution was the Animation Time system, a separate timeline that drives all procedural animation independently of your frame sequence.

This separation means you can:

Scrub through frames to preview your static look

Play the animation to see procedural effects in motion

Export with both frame changes and procedural animation working together

The Advanced Noise system supports multiple algorithms (Perlin, Simplex, Fractal, Turbulence) with controllable parameters. Getting these to run in real-time while maintaining viewport responsiveness required careful optimization. Each noise type is computed per-character position per-frame, which adds up fast when you're dealing with a 240x135 character grid running at 30fps. It's tricky developing on your own high-end machine without really knowing how it will perform for other users, but quality output was always my number one aim.
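To give a feel for the per-cell cost, here's a toy version of that evaluation: one value per (column, row, time) triplet, recomputed every frame. It's a simple hash-based value noise for illustration, not the plugin's actual Perlin/Simplex/Fractal/Turbulence implementations.

```cpp
#include <cmath>
#include <cstdint>

// Deterministic pseudo-random value in [0, 1) from integer lattice coordinates.
float hashNoise(int x, int y, int t)
{
    uint32_t h = static_cast<uint32_t>(x) * 374761393u
               + static_cast<uint32_t>(y) * 668265263u
               + static_cast<uint32_t>(t) * 2246822519u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return static_cast<float>(h ^ (h >> 16)) / 4294967296.0f;
}

// Smoothly interpolated along the time axis, so a cell drifts rather than
// flickers as Animation Time advances.
float cellNoise(int col, int row, float time)
{
    const int   t0 = static_cast<int>(std::floor(time));
    const float f  = time - static_cast<float>(t0);
    const float a  = hashNoise(col, row, t0);
    const float b  = hashNoise(col, row, t0 + 1);
    const float s  = f * f * (3.0f - 2.0f * f);   // smoothstep blend
    return a + (b - a) * s;
}
```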

The Evolution System: Evolution interpolates parameter values across your timeline. You set start keyframes, set end keyframes, and the system linearly interpolates everything in between.

The technical challenge was making sure all parameters could be interpolated safely. Some parameters (like Block Size) are discrete integers that affect grid structure. Others (like Random Scale) are continuous floats. The interpolation system needed to handle both, apply them correctly during both preview and export, and maintain coherence across frame boundaries.
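In spirit, the interpolation looks something like the sketch below, using a deliberately tiny parameter struct. Continuous values lerp directly; discrete, grid-affecting values like Block Size are rounded so every in-between frame still lands on a valid integer. This is an illustration, not the shipping code.

```cpp
// t is the normalized position between the start and end keyframes, 0..1.
struct AsciiParams
{
    float randomScale = 0.0f;   // continuous
    int   blockSize   = 8;      // discrete, changes the grid structure
};

AsciiParams evolve(const AsciiParams& start, const AsciiParams& end, float t)
{
    AsciiParams out;
    out.randomScale = start.randomScale + (end.randomScale - start.randomScale) * t;

    const float size = static_cast<float>(start.blockSize)
                     + (static_cast<float>(end.blockSize - start.blockSize)) * t;
    out.blockSize = static_cast<int>(size + 0.5f);   // nearest valid integer
    return out;
}
```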

The result is a system where you can start with one look and smoothly transition to something completely different over the course of your animation. All of it calculated per-frame during export, with the interpolated values baked into the metadata for reference.

Creative Problem-Solving: Specific Challenges

The Session Isolation Problem: Early versions used a single global output folder. This broke immediately when users tried to run multiple ASCII treatments in the same project. Frames from different instances would overwrite each other, sessions would get confused, and the whole system would fall apart.

The solution was unique session IDs generated per effect instance. Each session gets its own folder, each folder is named with a timestamp and random component, and the plugin stores the session ID in its parameters. Now each effect instance lives in its own isolated workspace, and multiple instances can coexist without conflict.
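A minimal sketch of that ID scheme, combining a timestamp with a random component (the exact format below is illustrative):

```cpp
#include <chrono>
#include <cstdint>
#include <random>
#include <sstream>
#include <string>

std::string makeSessionId()
{
    using namespace std::chrono;
    // Millisecond timestamp keeps folders sortable by creation time.
    const auto now = duration_cast<milliseconds>(
        system_clock::now().time_since_epoch()).count();

    // Random suffix avoids collisions when two instances are created in the same millisecond.
    static std::mt19937 rng{ std::random_device{}() };
    std::uniform_int_distribution<uint32_t> dist(0, 0xFFFFFF);

    std::ostringstream id;
    id << "session_" << now << "_" << std::hex << dist(rng);
    return id.str();
}
```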

The Real-Time Preview Challenge: Rendering ASCII grids is computationally expensive. For a 1920x1080 frame at 8-pixel block size, you're rendering 32,400 character positions. Do that for every parameter change in real-time, and your UI becomes unusable.

The solution involved multiple optimizations:

Render viewport at lower resolution when parameters are actively changing

Use GPU acceleration where possible for fill and stroke rendering

Implement dirty rectangle tracking to only re-render changed regions

Cache character glyphs aggressively

The viewport still slows down with extremely small block sizes (4-6 pixels), but it remains usable even at high character densities. And when you export, we render at full quality regardless of viewport performance.
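The last optimization in that list, aggressive glyph caching, is essentially memoised rasterization. A small sketch, assuming a caller-supplied rasterizer rather than any specific font engine:

```cpp
#include <cstdint>
#include <functional>
#include <unordered_map>
#include <vector>

struct GlyphBitmap { int width = 0; int height = 0; std::vector<uint8_t> alpha; };

class GlyphCache
{
public:
    explicit GlyphCache(std::function<GlyphBitmap(char32_t, int)> rasterize)
        : rasterize_(std::move(rasterize)) {}

    // Rasterizes only on the first request for a given (character, pixel size) pair.
    const GlyphBitmap& get(char32_t ch, int pixelSize)
    {
        const uint64_t key = (static_cast<uint64_t>(ch) << 32)
                           | static_cast<uint32_t>(pixelSize);
        auto it = cache_.find(key);
        if (it == cache_.end())
            it = cache_.emplace(key, rasterize_(ch, pixelSize)).first;
        return it->second;
    }

private:
    std::function<GlyphBitmap(char32_t, int)> rasterize_;
    std::unordered_map<uint64_t, GlyphBitmap> cache_;
};
```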

The Font Rendering Quality Issue: Fonts are vectors, but when you start applying noise displacement or extreme scaling, they become rasterized. This introduced quality problems - characters would look crisp in the viewport but become pixelated in exports when certain effects were active.

We addressed this by rendering the base font at a higher resolution internally, then applying effects to the already-rasterized version. It's not perfect (some extreme effects still show artifacts), but version 1.1 significantly improved quality for smaller scale fonts. Version 1.2 will continue refining this.

Platform-Specific Development

Building for both Windows and Mac meant maintaining two separate native builds - Visual Studio for Windows, Xcode for Mac.

The core rendering engine is cross-platform, but platform-specific code was necessary for:

File path handling (Windows backslashes vs Unix forward slashes; see the sketch after this list)

Font loading mechanisms

Process launching (plugin launching the editor)

Session folder permissions and access
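For the path handling in particular, std::filesystem removes most of the separator pain; a small sketch with illustrative folder names:

```cpp
#include <filesystem>
#include <string>

namespace fs = std::filesystem;

// operator/ inserts the correct separator on each platform, so the same code
// produces valid paths on both Windows and macOS.
fs::path sessionFramePath(const fs::path& sessionRoot, const std::string& sessionId, int frame)
{
    return sessionRoot / sessionId / "frames"
         / ("frame_" + std::to_string(frame) + ".png");
}
```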

Mac support is Apple Silicon only (M1/M2/M3/M4). Intel support would have required maintaining a separate architecture, and realistically, that's not where the platform is going. Focusing on Apple Silicon meant we could optimize specifically for ARM architecture without compromise.

Each platform needed its own testing environment, its own build configuration, its own debugging process. A bug that appeared on Windows might not appear on Mac, and vice versa. Cross-platform development is its own discipline - you're not just writing code, you're writing code that works identically on two fundamentally different operating systems.

What I Learned

Building theSauceCode taught me more about software engineering than any course or tutorial ever could:

Start with the architecture. The decision to split the plugin and editor early on saved me from countless headaches later. If I'd tried to cram everything into the After Effects plugin UI, I'd still be fighting with it.

Rendering is hard. Really hard. The character remapping solution took multiple failed attempts before we found something that worked. Sometimes the breakthrough comes from completely rethinking the problem rather than optimizing the current approach.

The SDK documentation will never be enough. You learn by breaking things and reading example code and trying again. That's just how it is when you're working with someone else's platform.

Performance optimization is never finished. There are still parts of the viewport that slow down under certain conditions. Version 1.1 improved things, version 1.2 will improve them more, and there will always be another optimization to make.

Cross-platform development doubles everything. Not just the builds, but the testing, the debugging, the platform-specific quirks you never anticipated. Budget time for this if you're planning to ship on multiple platforms.

theSauceCode is Live

After a year of development, problem-solving, and creative engineering, theSauceCode v1.1 is now available on aescripts.

It transforms After Effects footage into character-based ASCII art with real-time creative control, procedural animation effects, and a rendering system that maintains quality at any scale. It's the tool I wanted to use, built the way I wanted it to work.

The engineering journey isn't over - version 1.2 is already in progress. Font rendering quality improvements, fixing the frame capture order issues, smoothing out Evolution mode interpolation for longer renders. There's always more to build, more to optimize, more to solve.

But for now, the mountain has been climbed. The plugin is out there. After a year of work, people can finally use what I've been building.

theSauceCode is available now through aescripts + aeplugins. For support, bug reports, or feature requests, visit the product page.

Do you need help?

If you're looking at developing something similar and feel you need help, drop me a line - I'd love to talk.
