Every discovery call, I ask the same five questions. Not because I'm following a script. Because these questions determine if the project will succeed.
Good answers mean we'll probably work well together. Bad answers mean I should walk away. Even if they're willing to pay.
Here are the questions and what I'm actually looking for.
Question 1: "Walk me through exactly how you do this now."
Not asking about their problem. Asking about their current process. Step by step. Every detail.
What I'm listening for: Can they articulate the process? Do they actually understand what they're doing? Are there clear steps or is it chaotic?
If they can't explain their current process, automation won't help. You can't automate what you don't understand. We'd be automating confusion.
Good answer sounds like: "Every Monday, Sarah exports data from the CRM. She pastes it into this Excel template. Then she calculates totals using these formulas. Then emails the report to five people."
Bad answer sounds like: "We just kind of... figure it out as we go. Everyone does it differently. It's kind of a mess."
If the process is a mess, I tell them. Fix the process first. Standardize it. Document it. Then call me back. Automating a mess just makes a faster mess.
Question 2: "How much time does this take?"
Need specifics. Not "a lot" or "too much." Actual hours. Per week or per month.
What I'm listening for: Is the time investment worth automating? Is their estimate realistic?
Rule of thumb: If it's less than 5 hours monthly, probably not worth automating. Unless it's particularly error-prone or time-sensitive. But generally, the ROI isn't there for small time sinks.
Good answer: "About 12 hours weekly. Two people spend 6 hours each on data entry and reporting."
Bad answer: "I don't know exactly. A lot though. Feels like we're always doing it."
If they don't know how much time it takes, they won't know if automation actually helped. Can't measure success. Can't justify the investment.
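To sanity-check that rule of thumb, a back-of-envelope payback calculation is enough. The figures below (hourly cost, project cost, hours saved) are illustrative assumptions, not rates from any real engagement:

```python
# Rough payback-period check for an automation project.
# All figures are illustrative assumptions, not real rates or quotes.

hours_saved_per_month = 5      # the rule-of-thumb threshold
loaded_hourly_cost = 40        # assumed fully loaded cost of the person doing the work
project_cost = 6000            # assumed cost to build the automation

monthly_savings = hours_saved_per_month * loaded_hourly_cost   # $200/month
payback_months = project_cost / monthly_savings                # 30 months

print(f"Monthly savings: ${monthly_savings}")
print(f"Payback period: {payback_months:.0f} months")
```

At five hours a month, payback stretches past two years. Plug in the twelve hours a week from the good answer above and the same math pays the project back in roughly three months.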
Question 3: "What have you tried already?"
Looking for problem-solving attempts. Did they try fixing it themselves? Try different tools? Try different processes?
What I'm listening for: Are they proactive or passive? Do they own their problems or blame circumstances?
Best clients have already tried things. Experimented. Failed a few times. Learned what doesn't work. They're not looking for someone to solve everything. They're looking for expertise to solve what they couldn't.
Good answer: "We tried using Zapier but couldn't get the data formatting right. Tried hiring a VA but turnover was too high. Tried an off-the-shelf tool but it didn't integrate with our CRM."
Bad answer: "Nothing. Just thought we should automate it." Or worse: "We hired someone before but they didn't deliver."
If they've done nothing and expect me to magically fix everything, that's a red flag. If they've hired multiple people who "didn't deliver," maybe the problem isn't the consultants.
Question 4: "What does success look like?"
Want measurable outcomes. Not vague goals. Specific metrics.
What I'm listening for: Do they have clear expectations? Are those expectations realistic? Can we measure success?
Without clear success criteria, every project becomes subjective. Client might feel unsatisfied even if I deliver exactly what they asked for. Because they never defined what they actually wanted.
Good answer: "Save 10 hours weekly. Reduce errors to near zero. Generate reports in under 5 minutes instead of 2 hours."
Bad answer: "Make things more efficient." Or "Just make it better."
Vague goals lead to scope creep. Lead to misaligned expectations. Lead to unhappy clients and frustrated consultants.
If they can't define success, I help them. Ask follow-up questions. Narrow it down. Get specific. If we still can't land on clear criteria, probably not a good fit.
Question 5: "What's your timeline?"
Not asking when they want it done. Asking when they need it done. And why.
What I'm listening for: Is the timeline realistic? Is there actual urgency or artificial urgency? Are they flexible?
Red flag: "We need this done in two weeks." Unless the project is tiny, two weeks is unrealistic. Especially with discovery, planning, building, testing, and deployment.
Clients with unrealistic timelines either don't understand the complexity or are desperate. Either way, problematic.
Good answer: "We'd like to have this running by end of Q2. But we're flexible if needed. Just want to make sure it's done right."
Bad answer: "ASAP. Yesterday if possible. We're drowning."
If they're drowning, they need a life raft, not a custom boat. I'll suggest quick fixes. Band-aids. Temporary solutions to stop the bleeding. Then we can build the real solution once things stabilize.
Rushing automation leads to mistakes. Skipped testing. Poor planning. Technical debt. Better to take time and do it right.
These questions filter fast.
Usually within 20 minutes, I know if it's a good fit. Good answers across all five questions? Probably a great project. Bad answers on multiple questions? Probably not worth taking.
Sometimes the answers are mixed. Good on some, bad on others. Then it's a judgment call. Depends on which questions had bad answers. Depends on whether the client is open to feedback. Depends on whether I think we can align expectations.
But these five questions catch most red flags. Save me from bad projects. Save clients from wasting money on the wrong solution.
Everyone wins when you filter properly upfront.