How to Set Up an Automated Survey System That Actually Gets Answers

Today’s issue is coming to you live from my kitchen table, where I’m mentally preparing to go full karaoke mode with my wife at the Shakira concert tonight in Atlanta.
Like most English-speaking teens in the early 2000s, I first saw her on MTV and thought, “Cool, she can dance.”

As a Denver Broncos fan, those horses really tied this video together.
But a few years later, someone told me—“Her Spanish albums are way better.”
They weren’t wrong.
I downloaded Dónde Están los Ladrones? off LimeWire in college (shoutout to my blue iMac and ethernet dorm internet), and it quickly entered heavy rotation in my top albums.
I even ended up transcribing the drum and guitar parts from her early albums, front to back. Not a nerd or anything.
So yeah, for me, it was more than just the hips.
But before I swap retention strategy for reggaetón, I want to walk you through how I built an automated survey system for a client, one that gave us insights and ideas you just can’t get from a ChatGPT research prompt.
Let’s Build an Automated Survey System
To be honest, I’d never done this before.
In all my time working with brands at agencies, I’d never had the time to actually build a survey system.
I was always buried in day-to-day work.
When my client mentioned they wanted help getting more customer insights, I was stoked.
They had one old multiple-choice survey tucked away in a post-purchase email automation.
It had been running for years, but despite tens of thousands of people getting the email, hardly anyone ever filled it out.
So we rethought the whole thing.
Instead of trying to collect data, we wanted to start real conversations. Something low effort for the customer, high signal for us, and designed to keep learning as the list grew.
Setting up the Surveys
We were very intentional about how we set this up.
Instead of one generic survey filled with endless multiple choice questions, we created targeted surveys triggered at key moments in the customer journey.
Each survey had three to four short, open-ended questions.
The goal was simple. Ask something timely, relevant, and useful. Something that would help us understand how to serve them better.
Here’s what we asked at each stage:
Warm Prospects – 30 days on the list with no purchase
What would you like to see from us that would make you more likely to place an order?
New Customers – A few days after their first purchase
What motivated you to finally make this purchase?
Did you have any hesitations before purchasing? If so, what were they?
Is there a specific goal you’re hoping to achieve—like managing your diet, finding a satisfying alternative, or something else?
If you feel comfortable, tell us a little about yourself.
Churn Risk – 60 days after first purchase, no second order
What’s the main reason you haven’t placed another order yet?
What was your experience like with the product?
What would make you more likely to order again?
Loyal Customers – Sent after their third order
Are there any foods that you’d LOVE to see us make?
How does our product fit into your daily routine?
What do you love most about it?
What could we improve to make it even better?
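If your email platform can’t express these triggers directly, here’s a minimal sketch of the segmentation logic in Python. Treat it as an illustration, not our actual setup: the function and field names are hypothetical, and “a few days” after the first purchase is pinned to three just for the example.

```python
from datetime import datetime, timedelta

def survey_segment(subscribed_at, order_dates, now=None):
    """Return which survey (if any) a contact should get, based on the triggers above."""
    now = now or datetime.utcnow()
    orders = sorted(order_dates)

    if not orders:
        # Warm prospect: 30+ days on the list, no purchase yet
        return "warm_prospect" if now - subscribed_at >= timedelta(days=30) else None

    first_order = orders[0]

    if len(orders) >= 3:
        # Loyal customer: survey goes out after the third order
        return "loyal_customer"

    if len(orders) == 1:
        # Churn risk: 60+ days since the first order and still no second order
        if now - first_order >= timedelta(days=60):
            return "churn_risk"
        # New customer: a few days after the first purchase (3 here, as an assumption)
        if now - first_order >= timedelta(days=3):
            return "new_customer"

    return None
```

In practice you’d run something like this once a day against your list (or lean on your ESP’s built-in flow triggers) and keep track of who has already received each survey so nobody gets it twice.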
We set everything up in Typeform and connected it to a Google Sheet, so every response was logged automatically. Once it was live, answers rolled in 24/7 without us having to touch it.
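Typeform’s native Google Sheets integration handled the plumbing for us, but if you’d rather own that step yourself, a small script against the Typeform Responses API and the gspread library would look roughly like this. The token, form ID, and sheet key are placeholders, and the parsing below only grabs plain-text answers, so it’s a sketch rather than a drop-in.

```python
import requests
import gspread

TYPEFORM_TOKEN = "..."   # Typeform personal access token (placeholder)
FORM_ID = "..."          # the form you want to pull (placeholder)
SHEET_KEY = "..."        # key of the Google Sheet to write to (placeholder)

def fetch_responses():
    """Pull recent responses for one form from the Typeform Responses API."""
    resp = requests.get(
        f"https://api.typeform.com/forms/{FORM_ID}/responses",
        headers={"Authorization": f"Bearer {TYPEFORM_TOKEN}"},
        params={"page_size": 100},
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

def append_to_sheet(items):
    """Append one row per response: submission time plus each text answer."""
    gc = gspread.service_account()            # uses a service-account JSON credential
    ws = gc.open_by_key(SHEET_KEY).sheet1
    for item in items:
        answers = [a.get("text", "") for a in item.get("answers", [])]
        ws.append_row([item.get("submitted_at", "")] + answers)

if __name__ == "__main__":
    append_to_sheet(fetch_responses())
```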
After a few weeks, we dropped the data into ChatGPT to help summarize and cluster common themes.
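We literally did this by pasting the sheet into ChatGPT, but if you wanted to script the same step, a minimal version with the OpenAI Python SDK might look like this. The model name and prompt wording are just assumptions; the point is to hand the model a batch of raw answers and ask for named themes with rough counts and representative quotes.

```python
from openai import OpenAI  # assumes OPENAI_API_KEY is set in your environment

client = OpenAI()

def summarize_themes(question, answers):
    """Cluster a batch of open-ended survey answers into named themes."""
    sample = "\n".join(f"- {a}" for a in answers)
    prompt = (
        f"Survey question: {question}\n\n"
        f"Customer answers:\n{sample}\n\n"
        "Group these answers into 5-8 themes. For each theme, give a short name, "
        "a rough count of how many answers fall into it, and two or three "
        "representative quotes, verbatim."
    )
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any capable chat model will do
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content
```

Run it once per question (or per segment) so the themes from different surveys don’t blur together.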
And that’s when the insights started rolling in.
What We Learned
The first thing we saw? People actually wanted to share.
We got over 2,000 responses in the first month, and the quality was high. A mix of thoughtful answers, honest feedback, and the kind of details that are hard to get from reviews alone. It felt like having one-on-one conversations at scale.
Some responses confirmed what we already suspected from support tickets and product reviews. Others gave us totally new angles we hadn’t considered.
Customer objections got clearer
A lot of what we heard echoed concerns we’d seen in customer service. Things like taste, texture, or price. The difference was, now we had those objections phrased in the customer’s own words. That made it easier to address them in emails and also look for ways to improve the product.
Timing of repeat orders needed work
The majority of first-time buyers started with a product sample pack. But many were still working through it weeks later. That helped explain why some hadn’t placed a second order yet. It pushed us to rethink the timing of certain offers and flows.
Real input for content, offers, and future products
The responses were basically copywriting gold.
People handed us the exact language they use to describe their problems, goals, and why this product matters to them. When you feed this kind of raw voice-of-customer data into an AI copywriting workflow, the ideas multiply fast. It becomes a lot easier to crank out effective emails and ads.
We got future product ideas directly from the source
Our most loyal customers told us exactly what products they wanted us to make next. No guessing. We were able to rank potential product ideas and get a clear picture of what we could launch next—with a built-in audience already asking for it.
You can’t get these insights with vibe marketing.
Why this matters
Most people never do this. Honestly, I hadn’t either because it takes a little legwork.
It’s easier to just run a few AI prompts, scrape some reviews, and call it “research.”
But a survey system? This is the work that separates real operators from everyone else.
The brands that win are the ones willing to go a layer deeper.
To actually ask questions and listen.
Well, I hope you folks enjoyed yourselves. I’ll catch you later on down the trail.
Got a topic you want me to cook up a recipe for? Drop me an email or a DM.
- Ben
P.S.
Want help with your eCom email and retention marketing?