Designing Automation That Works Smarter, Not Harder
Timeline: September 2023 – August 2024
I led the end-to-end design of the Automated Pacing Tool, collaborating closely with Product Manager Peter Inoyue and Lead Engineer Danny Pierce. From early discovery through iterative prototyping and execution, I focused on building a tool that was intuitive, high-impact, and aligned with Kargo’s campaign delivery goals.
This case study highlights how we reimagined daily and monthly pacing—empowering Account Managers to shift from manual spreadsheet work to strategic decision-making.
Read time: 3-5 mins
Project Overview
-
Automated Pacing simplifies how ad dollars are managed—both where and when campaigns spend.
Instead of relying on spreadsheets and third-party platforms, the tool automates:
Data ingestion from active campaigns
Real-time budget rebalancing using AI recommendations
It’s built to do what Account Managers were doing manually, only smarter, faster, and at scale.
-
Manual pacing was inefficient, error-prone, and exhausting.
Account Managers were spending 20-60 minutes a day adjusting campaign budgets, toggling between multiple tools and spreadsheets. Across the year, that added up to 180+ hours—or 22 full workdays lost to manual pacing.
Worse, the work didn’t stop at 6pm. Budget reviews often crept into nights and weekends due to inconsistent tools and lack of automation.
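A rough reconstruction of that estimate (the underlying assumptions aren’t stated above, so treat the figures as illustrative): about 45 minutes a day across roughly 240 working days is 45 × 240 ≈ 10,800 minutes, or about 180 hours, which works out to roughly 22 eight-hour workdays.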
-
Transforming Budget Management with Actionable Automation
To eliminate this inefficiency, I led the design of Kargo’s Automated Pacing Tool, built to help both internal teams and self-service clients manage pacing more strategically.
Key Features:
Built-in demand calculator: Applies pacing formulas automatically, no Excel required (a sketch follows below)
AI-powered recommendations: Suggest when and where to spend, day by day
Seamless integration: Works within existing workflows, eliminating third-party dependency
Live syncing: Tracks campaign performance for real-time adjustments
The result? Fewer clicks. Smarter decisions. More time for strategic work.
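To make the pacing-formula idea concrete, here is a minimal sketch of the kind of even-pacing calculation the demand calculator automates. The function name, inputs, and logic are illustrative assumptions for this case study, not Kargo’s actual implementation.

    def recommended_daily_budget(total_budget, spend_to_date, days_remaining):
        """Spread the remaining budget evenly across the days left in the flight."""
        if days_remaining <= 0:
            return 0.0
        remaining_budget = max(total_budget - spend_to_date, 0.0)
        return remaining_budget / days_remaining

    # Example: a $50,000 campaign that has spent $18,000 with 20 days left
    # would pace at $1,600 per day.
    print(recommended_daily_budget(50_000, 18_000, 20))  # 1600.0

Running a calculation like this across dozens of campaigns and ad sets, every day, is exactly the spreadsheet work the calculator removes.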
-
Human-Centered Design
We grounded the experience in user feedback and testing, ensuring the tool was clean, focused, and easy to use—even under time pressure.
Efficiency + Accuracy
Automation increased campaign precision and eliminated common human errors—saving both time and spend.
Intelligence that Explains Itself
By offering explainable AI recommendations, we gave users confidence in the tool’s decisions and built long-term trust.
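To illustrate what “explains itself” means in practice, each recommendation can be thought of as a suggested change paired with the reasoning behind it. The structure below is a hypothetical sketch, not the production schema.

    # Hypothetical shape of an explainable pacing recommendation (illustrative only)
    recommendation = {
        "campaign": "Example Campaign",        # placeholder name
        "action": "increase_daily_budget",
        "current_daily_budget": 1200,
        "suggested_daily_budget": 1450,
        "reason": "Pacing 12% behind the monthly goal with 9 days left in the flight.",
    }

Surfacing the reason alongside the number is what lets an Account Manager approve or ignore a suggestion with confidence.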
Trusted by Top Clients
Research
We kicked off the project with a detailed product brief outlining the tool’s vision and goals. Through early research and interviews led by Product Manager Peter Inoyue, three core pain points emerged:
Manual budget updates: Account Managers were adjusting dozens of campaigns and ad sets daily—often in sprawling spreadsheets with 100+ columns. Bulk-editing helped, but there were no rules to align spend with pacing goals.
Fragmented tools: Teams relied on multiple spreadsheets—one for pacing groups, another for daily plans—requiring manual exports, formulas, and re-entry into Ads Manager or KC.
Off-hours workload: Some accounts needed pacing updates at midnight or early morning, pushing AMs to work beyond normal hours to maintain accuracy.
Mockup Development
For this case study, I focus on the Budget Insights & AI-Driven Recommendations phase, which delivered the biggest impact. It tackled the core inefficiencies of manual updates and fragmented tools by using AI to improve campaign pacing and performance.
This section highlights my design process:
→ how I translated data into intuitive UI
→ built high-fidelity mockups for testing
→ and solved real user pain points through thoughtful UX.
To structure our approach, Peter and I broke the project into four key phases:
Setup – Build the foundation for automated pacing
Demand Planning – Improve forecasting and budget allocation
Getting Started – Streamline onboarding and adoption
Budget Insights & AI-Driven Recommendations – Deliver intelligent, automated budget guidance
Wireframing + Usability Testing
V1 Pacing Dashboard
My focus was designing a dashboard that gave users a clear, intuitive view of their campaign budgets while seamlessly integrating AI-powered recommendations.
The goal was a usable, data-driven interface that let Account Managers track and adjust pacing efficiently.
V2 Pacing Dashboard
Based on usability tests, I refined the pacing dashboard by introducing a dual-view table so users could switch between two views of their pacing data.
Simplifying the approval process
We had designed the “Approve” column to include 4 options, but the complexity introduced delays and dev overhead.
To simplify the experience, users can either approve or ignore a recommendation. The result? A cleaner workflow, a faster build, and a launch-ready feature.
Outcomes
-
1
Spend Less Time Manually Pacing
“Great progress. Really excited to adopt this for everyday use. Pacing can definitely be improved and automated.”
— Account Manager, JD Sports
-
2
Adoption Speaks for Itself
“Positive feedback overall. This [platform] replaces the Excel demand plans and saves a lot of time.”
— Account Manager, Saks
-
3
Smarter Spending. More Trust.
“We used multiple [third party] platforms. This [tool] eliminates the need to reference the excel demand plans.”
— Account Manager, Costar
1.
Account Managers reduced manual pacing time by 80%, freeing up hours for strategic thinking—not spreadsheet wrangling.
Lesson Learned:
Automating budget calculations and pacing workflows significantly reduced the time Account Managers spent manually entering data into spreadsheets. By streamlining this process, we enabled them to focus on strategic planning rather than operational tasks, reinforcing the impact of automation on productivity.
Spend Less Time Pacing.
Deliver More Campaigns.
2.
85% of Account Managers and Self-Service clients are using the tool—thanks to thoughtful iteration, usability testing, and fast feedback loops.
Lesson Learned:
Early usability tests revealed that users struggled with the original layout. By pivoting to a cleaner, more intuitive display and validating changes with real users, we delivered a tool that people not only understood but actively chose to use. The 85% adoption rate post-launch proved that iterative design isn’t just a best practice; it’s essential.
Adoption That Speaks for Itself.
3.
Automated Pacing improved budget utilization accuracy by 20%, but transparency was the key to user trust and long-term adoption.
Lesson Learned:
AI-driven recommendations helped optimize pacing decisions and improved accuracy by 20%. Through early feedback, we discovered that explainability mattered just as much as performance. This insight reinforced a crucial principle: transparency builds confidence, especially in AI-powered tools.
Smarter Spending.
More Trust.