Full Stack Recruiter Newsletter

Your Interview Process Is a Product. Right Now It Has a 1-Star Rating.

Most companies treat interviews as a filter, not a product. That's why top candidates drop off, ghost your offers, and leave 1-star reviews. Here's how to fix it with basic product discipline.

Jan Tegze
Feb 22, 2026

The most viral interview threads on Reddit aren’t about rejection. They’re about disrespect. A candidate sits through five rounds over seven weeks, then gets ghosted. A recruiter schedules a call, no-shows, reschedules, no-shows again.

A take-home assignment arrives with a 10-hour scope and a 48-hour deadline, followed by silence. A panel of four interviewers asks the same question three different ways because nobody read the notes from round two. These aren’t edge cases. The interview process is the first product most candidates interact with. For most rejected candidates, it’s the only product.

And right now, that product has a 1-star rating.

What viral threads actually measure

The Reddit threads that hit thousands of upvotes share a pattern. They’re not stories about tough questions or high bars. They’re stories about broken systems masquerading as deliberate processes. Six rounds of interviews that never converge on a decision.

Take-home projects that replicate the company’s actual roadmap work. Interviewers who show up unprepared, ask about skills not listed in the job description, or spend 30 minutes monologuing about company culture while the candidate sits muted.

These threads function as user bug reports. Except most companies never read them.

The internal narrative is often the same: we’re rigorous, we’re thorough, we’re protecting quality. The external reality, the one visible in Glassdoor reviews and Reddit posts, is often different.

That gap exists because most companies don’t think of their interview process as a product. They think of it as a filter. Filters don’t need user experience. Filters don’t need ownership. Filters don’t get iterated on when they produce bad outcomes; they just get blamed on “the talent pool.”

But a filter and a product solve different problems. A filter separates. A product converts. If you’re treating interviews as a pure filter, you’re measuring precision: did we let the wrong people through?

If you’re treating interviews as a product, you’re measuring conversion at every stage: did the right people make it to the end, and did they say yes when we asked?

The internal architecture problem

Interview processes don’t break down. They were never built. What exists in most companies is an accumulated system, a kind of sedimentation where each new hire adds a round, each bad hire adds a skill assessment, each executive adds a “culture conversation,” and nobody ever removes anything. Within 18 months, a process that started with three steps has metastasized into seven.

And nobody owns it.

Recruiting owns sourcing and scheduling. Hiring managers own the final decision. HR owns the offer paperwork. But the candidate journey, the actual end-to-end experience, belongs to no one. Occasionally a TA team claims it, but not consistently. It’s like a product where engineering owns the backend, design owns the frontend, and nobody owns whether the two systems talk to each other.


This creates predictable failure modes. Round three asks the same technical questions as round two because the interviewers don’t coordinate. The take-home arrives before anyone’s explained what the role actually does day-to-day. The hiring manager cancels twice because “things came up,” which they did, because the hiring manager has 12 other priorities and interviewing isn’t on the performance review.

I watched a Series B startup add a sixth interview round after a bad executive hire. The logic was sound: we need better signal on leadership capability. The result was a 40% drop in candidate progression from round four to round five. Not because the bar went up. Because the process became a test of patience, and the candidates with options stopped having patience.

The fix they implemented wasn’t better interviewers or clearer rubrics. It was a product manager for hiring. Someone whose actual job was to look at conversion rates by stage, identify where candidates dropped off, ask why, and ship improvements weekly. Within two months, they’d cut the process from six rounds to four, introduced structured debriefs that happened within 24 hours, and saw offer acceptance climb from 58% to 76%.

The work wasn’t sophisticated. It was basic product discipline applied to a system everyone assumed was fine.

When take-homes became spec work

Take-home assignments started as a reasonable alternative to whiteboard coding. The idea was: give candidates time to think, let them work in their own environment, evaluate them on something closer to actual job tasks. Somewhere in the last five years, this mutated.

Developers frequently complain about assignments scoped at six or more hours that end in zero feedback; Reddit threads are full of these stories, and an interviewing.io survey of 700 engineers flagged receiving “no feedback at all” as demoralizing.

The candidate invests a weekend. The company invests 20 minutes of review time, usually by someone who doesn’t read the whole submission.

This isn’t a signal problem. It’s a respect problem.

I know a designer who was asked to “reimagine our onboarding flow” as a take-home. She spent 12 hours on it. Delivered a Figma file with flows, research notes, and rationale. Got rejected via automated email. Three months later, she saw her onboarding concept live on their site. Not similar. Identical: same interaction patterns, same copy structure, same visual approach.

They'd specced real work as an interview exercise. The line between “show us how you think” and “do our work for free” has collapsed in enough places that candidates now assume the worst. When a company asks for a take-home, the immediate question is: are they evaluating me, or are they mining ideas?

The companies doing this well have hard rules. Take-homes are capped at 90 minutes. They’re artificial problems, not real roadmap work. Every submission gets written feedback within 48 hours, even rejections. And they track completion rates. If 60% of candidates who receive the assignment don’t return it, that’s not a candidate quality problem. That’s a product problem.

But most companies don’t track completion rates. They don’t measure time-to-feedback. They don’t ask whether the assignment correlates with job performance once someone’s hired. They just know they need “more signal,” so they add more stages, and the wheel spins.
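For teams that do want to track these numbers, the rules above reduce to simple arithmetic. A minimal sketch in Python (field names, thresholds, and data are all invented for illustration, not drawn from any real ATS):

```python
# Hypothetical sketch: score take-home health against the rules described
# above (feedback within 48 hours, completion rate above ~60%).
from datetime import timedelta

def takehome_health(assignments):
    """assignments: list of dicts with 'returned' (bool) and
    'feedback_delay' (timedelta, or None if no feedback was ever sent)."""
    sent = len(assignments)
    returned = sum(1 for a in assignments if a["returned"])
    # Count candidates who got written feedback within the 48-hour window.
    feedback_within_48h = sum(
        1 for a in assignments
        if a["feedback_delay"] is not None
        and a["feedback_delay"] <= timedelta(hours=48)
    )
    completion_rate = returned / sent if sent else 0.0
    return {
        "completion_rate": completion_rate,
        # Below ~60% completion, treat it as a product problem, not a
        # candidate quality problem.
        "product_problem": completion_rate < 0.6,
        "feedback_within_48h": feedback_within_48h,
    }

# Toy data: five assignments sent, two returned, one timely feedback email.
data = [
    {"returned": True, "feedback_delay": timedelta(hours=24)},
    {"returned": True, "feedback_delay": timedelta(hours=72)},
    {"returned": False, "feedback_delay": None},
    {"returned": False, "feedback_delay": None},
    {"returned": False, "feedback_delay": None},
]
report = takehome_health(data)
```

On this toy data the completion rate is 40%, which by the rule above flags the assignment itself, not the candidates.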

Product discipline applied to hiring

Treating the interview process as a product means applying the same rigor you’d apply to anything users interact with. You instrument it. You measure drop-off. You run experiments. You kill features that don’t work.

Start with an NPS question for candidates, regardless of outcome: “Would you recommend interviewing here to someone else in your field?” Track it by stage. If candidates who make it to round four rate the experience worse than candidates who stop at round two, round three is doing something destructive.

Then map the actual candidate journey. Not the idealized process in the handbook. The lived experience. How long between application and first contact? How many times does the average candidate get rescheduled? How often do interviewers show up unprepared? How many candidates ghost after receiving an offer, and at what stage did the process lose them?

Most companies discover they don’t have this data. They have time-to-hire and cost-per-hire, which are operations metrics. They don’t have conversion-by-stage or satisfaction-by-touchpoint, which are product metrics.

Once you have the data, you ship improvements the way you’d ship product improvements. Small, frequent, measurable.
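Conversion-by-stage is cheap to compute once candidate records exist. A rough sketch, assuming each record stores the furthest stage a candidate reached (the stage names and data here are invented for illustration):

```python
# Hypothetical sketch: conversion-by-stage funnel from candidate records.
from collections import Counter

STAGES = ["applied", "recruiter_screen", "hiring_manager",
          "onsite", "offer", "accepted"]

def funnel(candidates):
    """candidates: list of dicts with 'furthest_stage' (a name in STAGES).
    Returns per-stage counts and stage-to-stage conversion rates."""
    reached = Counter()
    for c in candidates:
        # A candidate who reached stage i also passed every earlier stage.
        idx = STAGES.index(c["furthest_stage"])
        for stage in STAGES[: idx + 1]:
            reached[stage] += 1
    rates = {}
    for prev, nxt in zip(STAGES, STAGES[1:]):
        rates[nxt] = reached[nxt] / reached[prev] if reached[prev] else 0.0
    return reached, rates

# Toy data: ten candidates at various furthest stages.
data = (
    [{"furthest_stage": "applied"}] * 4
    + [{"furthest_stage": "recruiter_screen"}] * 3
    + [{"furthest_stage": "hiring_manager"}] * 1
    + [{"furthest_stage": "offer"}] * 1
    + [{"furthest_stage": "accepted"}] * 1
)
counts, rates = funnel(data)
```

The output makes drop-off visible per transition rather than as one blended time-to-hire number, which is exactly the shape of data the 35% drop-off investigation below depended on.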

A mid-stage company I consulted with years ago had a 35% drop-off between the recruiter screen and the hiring manager call. Standard explanation: candidates weren’t serious. Actual cause, once they investigated: the average time between the two calls was 11 days, and the hiring manager call was scheduled as a “quick sync” with no prep materials. Candidates didn’t know what to prepare for, didn’t know what the role entailed, and had usually received another offer by the time the call happened.

Fix: send a structured “what to expect” doc within 24 hours of the recruiter screen, and move the hiring manager call to within 72 hours. Drop-off fell to 18% in one quarter.

That’s not a hiring innovation. It’s blocking and tackling. Set expectations. Reduce latency. Give users the information they need to make the next decision.

The reason most companies don’t do this is the same reason most products shipped before product management existed were inconsistent and confusing. Nobody owned the whole experience. Engineers owned features. Designers owned flows. But the thing the user experienced, the integrated whole, had no owner.

Hiring has the same problem. Until someone owns candidate experience end-to-end, with the authority to cut broken stages and the responsibility to measure outcomes, the process will optimize for internal convenience, not conversion.

Right now, your interview process is a 1-star product. Candidates are writing the reviews. You’re just not reading them.



The Five Fixes That Actually Move Offer Acceptance Rates

Most interview process improvements are theater. Companies add rubrics that nobody follows, run calibration sessions that don’t change scoring patterns, and institute “candidate experience initiatives” that produce zero measurable change in conversion rates or satisfaction.

There are five interventions that consistently move the numbers. I’ve seen these across 40+ companies, from Series A startups to public companies with 10,000+ employees. The pattern holds.
