Real, ready-to-use interview scorecard examples for the five roles founders hire most — with competencies, scoring criteria, and the questions to match.
What an Interview Scorecard Looks Like
Most startup founders interview by feel. They ask different questions to different candidates, compare notes over Slack, and make a decision based on whoever seemed most impressive. This reliably produces bad hires — not because founders are bad judges, but because unstructured interviews are only slightly better than chance at predicting on-the-job performance.
An interview scorecard solves this by forcing you to define what "good" looks like before you meet a single candidate. You score the same competencies for every person. You compare candidates on the same dimensions. And when you debrief with your team, you're comparing evidence — not impressions.
Below are five complete interview scorecard examples — one for each of the roles founders hire most. Each includes the competencies, what each score means, and the interview questions that surface the evidence.
Software Engineer — Interview Scorecard
| Competency | Interview Question | Score 1 (Weak) | Score 3 (Strong) | Score 4 (Exceptional) |
|---|---|---|---|---|
| Technical problem-solving | "Walk me through the hardest technical problem you've solved in the last year. What was your approach?" | Vague description, no clear process, couldn't explain trade-offs | Clear problem definition, logical approach, quantified outcome | Anticipates edge cases, explains why alternatives were rejected, ties to business impact |
| Code quality and standards | "How do you decide when code is 'good enough' to ship? Give me a real example." | Ships fast without considering maintainability or tests | Balances speed and quality with clear reasoning; references tests or review process | Sets standards for the team, has built review systems, can articulate the cost of shortcuts |
| Ownership and self-direction | "Tell me about a project you drove end-to-end without much direction. How did you scope it and what happened?" | Waited for requirements; needed significant management | Defined scope independently, shipped, and communicated clearly throughout | Identified an important problem the team wasn't thinking about, owned it fully, and delivered |
| Communication and collaboration | "Describe a time you disagreed with a technical decision made by someone senior. What did you do?" | Stayed quiet or escalated poorly; no productive resolution | Raised concern with evidence, heard out the other perspective, accepted outcome gracefully | Changed the team's direction with data; built trust by separating ego from the technical argument |
| Learning and growth | "What's the most significant technical skill you've built in the last 12 months, and how did you build it?" | No clear examples; learning is reactive rather than intentional | Has a consistent habit of learning; can describe deliberate steps taken | Ahead of the curve — learning things before they're required; applies new skills quickly to real work |
Minimum hire bar (Software Engineer)
Marketing Manager — Interview Scorecard
| Competency | Interview Question | Score 1 (Weak) | Score 3 (Strong) | Score 4 (Exceptional) |
|---|---|---|---|---|
| Entrepreneurial execution | "Tell me about a marketing initiative you ran from idea to results — no playbook, no team, just you." | Ran campaigns in large teams; can't point to personal contribution | Clear ownership, described what they built, can show measurable results | Built a channel or system that didn't exist before; still running and referenced by the company |
| Analytical thinking | "Walk me through how you measure whether a campaign worked. What's your framework?" | Focused on vanity metrics (impressions, likes) with no business tie | Tracks CAC, conversion, or pipeline contribution; knows the difference between signal and noise | Built a measurement system; can describe how marketing influenced revenue decisions at the exec level |
| Writing and positioning | "What's a piece of copy you're proud of? Walk me through the thinking." | Generic; can't explain positioning choices; no awareness of audience | Specific, benefit-led; can explain why certain words were chosen over others | Has tested copy systematically; can show before/after and what the data said |
| Channel expertise | "What's the one acquisition channel you know better than most? What would you do in the first 60 days to build it here?" | Spread thin; no depth on any single channel; answer is generic | Clear primary channel with real results; has a day-one plan that makes sense for the business | Has built the channel multiple times across companies; brings a playbook, not just a point of view |
| Stakeholder communication | "How do you keep founders or leadership updated on marketing without over-reporting? What does that look like?" | No clear rhythm; reactive updates; founders complained about visibility | Has a cadence (weekly/monthly); reports on metrics that matter to the business, not marketing | Designed the reporting system; trained the leadership team on what to look for; proactively flags risks |
Sales Development Rep — Interview Scorecard
| Competency | Interview Question | Score 1 (Weak) | Score 3 (Strong) | Score 4 (Exceptional) |
|---|---|---|---|---|
| Drive and hunger | "Tell me about the quota or target you're most proud of hitting. How far did you push yourself to get there?" | Hit targets but no story of going above; no intrinsic motivation visible | Clear personal ambition; took extra steps beyond what was required | Overachieved consistently; can describe what they changed to break through a ceiling |
| Resilience under rejection | "Walk me through a stretch where you were getting a lot of no's. How did you handle it mentally and practically?" | Got discouraged; needed motivation from manager; changed approach only when told | Self-regulated; had a clear mental reset routine; adjusted messaging without being told | Turned a slump into a learning phase; came out with a better process and shared it with the team |
| Prospecting and research | "Walk me through how you would research and reach out to a prospect before our first call — using our product as the example." | Generic outreach; hasn't researched the company; no personalisation | Researched HireLikeaPro before the interview; has a clear personalisation methodology | Has a system: trigger events, stack research, multi-channel sequencing — and can explain why it works |
| Coachability | "Tell me about a time a manager gave you feedback you disagreed with. What did you do?" | Defended themselves; didn't implement; blamed the manager | Heard it out, tried it, reported back on what happened — even if it confirmed their original view | Sought feedback proactively; applied it immediately; can show how it improved their numbers |
Product Manager — Interview Scorecard
| Competency | Interview Question | Score 1 (Weak) | Score 3 (Strong) | Score 4 (Exceptional) |
|---|---|---|---|---|
| Customer empathy | "Describe the last time you deeply understood a customer problem. How did you get to that understanding, and what did it change?" | Relied on analytics; never spoke to customers directly; assumptions were untested | Has a regular customer research habit; can describe specific insights that changed product direction | Built a customer feedback system; the whole team has access to customer insight; it visibly shapes roadmap |
| Prioritisation under constraints | "Tell me about a time you had to say no to a feature the CEO or a key customer really wanted. How did you handle it?" | Built it anyway; couldn't defend the tradeoff; made the CEO the customer | Said no with clear reasoning and data; found a smaller version to test; maintained the relationship | Has a prioritisation framework the team trusts; CEO defers to them on roadmap decisions |
| Cross-functional influence | "How do you get engineers to care about a problem they don't think is important? Walk me through a real example." | Uses authority or escalation; engineers see PM as overhead | Brings problem context rather than solutions; engineers feel ownership over how to solve it | Engineers proactively bring product ideas to this PM because they trust the framework |
| Analytical rigour | "Describe how you've used data to make a product decision you were initially uncertain about." | Data is post-hoc justification; qualitative bias; no tracking before shipping | Defined metrics upfront; ran experiments; let data change their recommendation | Built the analytics infrastructure; can describe a time they were wrong and the data told them so |
| Shipping rhythm | "What does your release process look like? Walk me through a feature from idea to production." | No clear process; launches are unpredictable; retrospectives don't happen | Clear intake-to-launch process; consistent cadence; launch checklists exist | Built a process others copy; shipping is boring (predictable, reliable) not heroic |
Operations / Chief of Staff — Interview Scorecard
| Competency | Interview Question | Score 1 (Weak) | Score 3 (Strong) | Score 4 (Exceptional) |
|---|---|---|---|---|
| Systems thinking | "Tell me about a process you built from scratch that other people now rely on. How did you design it?" | Executes tasks; doesn't design systems; relies on manual effort | Identified a bottleneck, designed a solution, documented it, and it scaled | Builds systems that outlast them; others copy the approach across the company |
| Judgment under ambiguity | "Describe a time you had to make an important decision with incomplete information. What did you do?" | Waited for clarity; escalated every decision; couldn't operate in grey areas | Made a call, documented their reasoning, communicated it clearly, and learned from the outcome | Has a framework for operating under uncertainty; the CEO trusts their judgment at high stakes |
| Communication and alignment | "How do you keep multiple teams aligned without needing to be in every meeting? Walk me through how you do it." | Relies on being in every meeting; creates bottlenecks; hoards information | Has async communication systems; uses written docs; teams feel informed | Designed the company's communication infrastructure; information flows without them in the room |
| Prioritisation and follow-through | "Tell me about a time you had 10 things on your plate and everything felt urgent. How did you triage it?" | Got overwhelmed; dropped balls; needed a manager to triage for them | Applied a clear framework; communicated what was being deprioritised and why; delivered on commitments | Prevented the situation from recurring by redesigning the intake system; now a resource on prioritisation for others |
Adapting a scorecard to your hire
These templates are starting points. For any role you're hiring, you should customise the competencies to your specific context — a Series A startup's expectations for an SDR are different from those of a bootstrapped SaaS at $500k ARR. Here's how to adapt them:
The scoring scale used in all examples above is 1–4:
1 — Weak
Evidence of a clear gap. Not just inexperience — a pattern that suggests this competency won't improve quickly.
2 — Mixed
Some positive evidence but inconsistent. Warrants a follow-up question or concern.
3 — Strong
Consistent evidence of the competency. This is a hire recommendation for this dimension.
4 — Exceptional
Notably above expectations. The kind of answer that makes you lean forward.

Don't score 3.5 or average two answers. Force a discrete score per competency. The discipline is the point.
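If your team tracks scores in a spreadsheet or a small script, the discipline above reduces to simple rules: one whole-number score per competency, then a check against a minimum hire bar. Here is a minimal Python sketch of that idea; the competency names and the bar value of 3 are illustrative assumptions, not prescriptions.

```python
# Hypothetical sketch of the scoring discipline: discrete 1-4 scores
# per competency, checked against a minimum hire bar.
VALID_SCORES = {1, 2, 3, 4}  # no 3.5s, no averaging


def below_bar(scores: dict[str, int], minimum_bar: int = 3) -> list[str]:
    """Return the competencies scored below the hire bar."""
    for competency, score in scores.items():
        if score not in VALID_SCORES:
            raise ValueError(
                f"{competency}: score must be a whole number from 1 to 4, got {score}"
            )
    return [c for c, s in scores.items() if s < minimum_bar]


# Example candidate (competency names taken from the engineer table above)
candidate = {
    "Technical problem-solving": 4,
    "Code quality and standards": 3,
    "Ownership and self-direction": 2,
}
print(below_bar(candidate))  # ['Ownership and self-direction']
```

Anything returned by `below_bar` is a flag to dig into at the debrief, not an automatic rejection; the scorecard structures the conversation, it doesn't replace it.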
Post-interview scoring process
Scores are only useful if you use them consistently. Here's the minimum process for a founding team: each interviewer scores every competency independently, in writing, before anyone discusses the candidate; the debrief compares scores and the evidence behind them, not overall impressions; and where scores diverge sharply, you dig into the evidence before deciding.
If you need a complete structured template to work from — including the scorecard layout, scoring guidance, and a debrief checklist — see the free interview scorecard template. You can also generate a custom one for any role in two minutes using HireLikeaPro.
HireLikeaPro builds a tailored interview scorecard — with competencies, scoring criteria, and questions — based on your role and company context. Free forever, no credit card.
Build My Scorecard Free →