The Problem: Slow Websites Lose Customers

When visitors hit your website and see nothing for a few seconds, they leave. We had 10 landing pages at alorny.cloud, and 5 of them were failing our performance benchmarks. Page weights ranged from 4MB to over 9MB, and request counts hit 60 per page. Something had to change.

Enter AutoResearch

Andrej Karpathy released autoresearch — a methodology where AI agents iteratively optimize a target metric through experimentation. The concept is simple: measure, hypothesize, change, measure again, keep improvements, revert regressions.

We applied this to website performance. The target metric: the number of failing pages (lower is better). The measurement tool: Playwright-based performance benchmarks checking FCP, LCP, CLS, page weight, and request count.
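A pass/fail check against per-metric budgets can be sketched as below. The budget values here are illustrative assumptions, not the actual thresholds from our benchmark suite; typical Core Web Vitals targets (LCP ≤ 2.5s, CLS ≤ 0.1) stand in.

```python
from dataclasses import dataclass

@dataclass
class PageMetrics:
    fcp_ms: float        # First Contentful Paint, in milliseconds
    lcp_ms: float        # Largest Contentful Paint, in milliseconds
    cls: float           # Cumulative Layout Shift score
    weight_bytes: int    # total page weight
    requests: int        # total request count

# Hypothetical budgets -- the real suite's thresholds are not shown here.
BUDGETS = {
    "fcp_ms": 1800,
    "lcp_ms": 2500,
    "cls": 0.1,
    "weight_bytes": 2 * 1024 * 1024,
    "requests": 40,
}

def failing_checks(m: PageMetrics) -> list[str]:
    """Return the name of every budget the page exceeds."""
    return [name for name, limit in BUDGETS.items()
            if getattr(m, name) > limit]

def passes(m: PageMetrics) -> bool:
    """A page passes only when it is within every budget."""
    return not failing_checks(m)
```

A 9MB, 60-request page like our worst offender fails every one of these checks, which is exactly the signal the agent optimizes against.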

The Results: 11 Experiments, 100% Pass Rate

Over 11 experiments, the autonomous agent took the site from 5 failing pages to a 100% pass rate across all 10.

Key Metrics Before vs After

The improvements were dramatic.

How It Works

The autoresearch loop follows a strict protocol:

  1. Analyze — Read past experiment results, understand what worked and what failed
  2. Hypothesize — Choose one specific, focused change to try
  3. Implement — Make the code change and commit it to git
  4. Evaluate — Run the measurement suite and capture results
  5. Decide — Keep improvements, revert regressions

Each experiment is atomic — one change, one measurement, one decision. Git serves as the persistence layer, making every experiment reversible.

Why This Matters for Your Business

At Alorny, we build custom automation solutions including AI-powered optimization tools. The autoresearch methodology isn't limited to websites — it works for any process with a measurable metric: email campaigns, trading strategies, prompt engineering, and more.

If your website takes more than 3 seconds to load on mobile, you can lose over half your visitors: Google found that 53% of mobile visits are abandoned at that threshold. Autonomous optimization can fix this while you sleep.

Key Takeaways