
Is Your Test Strategy Just Creating Noise?

  • Writer: Brandon Homuth
  • Jul 7
  • 2 min read

Running tests has technically never been easier. With highly configurable back-end tech and sophisticated data analysis tools widely available, even early-stage lenders with small teams can run a serious testing program. That ease cuts both ways, though: sloppy testing makes it just as easy to drown your insights in noise. To make sure testing delivers meaningful results, follow these four principles.


1. You need a learning agenda. And a budget, too.


There might not be a dollar cost, but testing consumes resources: time, attention, and loan volume. Write out a learning agenda tied to your high-level business goals. Then budget your loan volume and people accordingly (a rough sketch of this follows the questions below). Ask yourself:


  • How much traffic or volume can we allocate to tests this quarter?

  • Which tests are worth that investment?

  • Where are we seeing the best return?
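
To make that budgeting exercise concrete, here is a minimal back-of-the-envelope sketch in Python. Every volume, test name, and dollar figure in it is hypothetical, and ranking tests by expected value per loan consumed is just one simple way to prioritize, not a prescribed method.

```python
# A rough sketch of budgeting a quarter's loan volume across candidate
# tests. All volumes, names, and dollar values below are hypothetical.

QUARTERLY_VOLUME = 12_000   # loans expected this quarter (assumption)
TEST_BUDGET_SHARE = 0.20    # cap on volume exposed to tests (assumption)
budget = QUARTERLY_VOLUME * TEST_BUDGET_SHARE

# Candidate tests: (name, loans required, expected annual value of a win)
candidates = [
    ("pricing tier tweak",  1_500, 400_000),
    ("new doc-upload flow", 3_000, 250_000),
    ("offer email subject",   800,  40_000),
]

# Fund the tests with the best expected value per loan consumed first.
candidates.sort(key=lambda t: t[2] / t[1], reverse=True)
remaining = budget
for name, loans_needed, value in candidates:
    if loans_needed <= remaining:
        remaining -= loans_needed
        print(f"FUND  {name}: {loans_needed} loans")
    else:
        print(f"DEFER {name}: needs {loans_needed}, only {remaining:.0f} left")
```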


2. Be a disciplined scientist.


The principles of good test design are simple. Resist the temptation to sideline them in favor of convenience or speed.


  • Use a real control group: Running version A in January and version B in March isn’t a real test—too many other factors change in between. Run true A/B tests whenever you can.

  • Don't overlap tests: When multiple changes are tested on the same population at once, it’s hard to disentangle their effects.

  • Design each test: Tailor test design and sample size to the metric being optimized and the business impact of potential findings, not habitual thresholds like “we always test 50,000” (see the sketch after this list).
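
As a concrete illustration of that last point, here is a minimal sketch that sizes a two-arm A/B test from the metric itself, using the standard normal-approximation sample-size formula for comparing two proportions. The baseline rates and lifts are assumptions chosen to show the contrast; swap in your own.

```python
# Sizing a two-arm test from the metric, not a habitual threshold.
# Standard normal-approximation sample size for comparing two proportions.
from scipy.stats import norm

def sample_size_per_arm(p_baseline, p_variant, alpha=0.05, power=0.80):
    """Loans needed in each arm to detect a shift from p_baseline to p_variant."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance threshold
    z_beta = norm.ppf(power)           # desired statistical power
    p_bar = (p_baseline + p_variant) / 2
    numerator = (
        z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
        + z_beta * (p_baseline * (1 - p_baseline)
                    + p_variant * (1 - p_variant)) ** 0.5
    ) ** 2
    return numerator / (p_baseline - p_variant) ** 2

# Hypothetical metrics: a 2% -> 2.5% lift needs roughly 14,000 loans per arm,
# while a 40% -> 45% lift needs about 1,500, so "we always test 50,000"
# fits neither case.
print(round(sample_size_per_arm(0.02, 0.025)))
print(round(sample_size_per_arm(0.40, 0.45)))
```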


3. Establish clear ownership.


Without someone accountable for setup, execution, and follow-up, testing is treated as a “side quest” and quickly lost in the shuffle. We saw one lender who had hundreds of tests going each month, but barely any were reviewed. Some ideas were tested multiple times because new teams didn’t know they’d already been tried.


4. Take the time to learn from all your tests, even "failures."


Even tests that don’t “win” can teach you something, but only if you track what happened and go back to review it. Too often, test results live in someone’s inbox or are buried in a slide deck. Leave yourself time to reflect properly after each test, too. Moving too quickly can actually waste time: engineering teams build things that never get evaluated, product priorities shift too fast, and analysts can’t keep up.
