
AI Job Description Enhancer

Responsibilities
  • Identified user need through Client Success feedback
  • Collaborated with product team on priorities and scope
  • Used AI tools to explore layouts and estimate effort
  • Designed and built a working prototype independently using Cursor
  • Tested prototype live with clients through Client Success
  • Iterated based on real client feedback before launch
Results
  • Shipped to clients as a live tool in the Mogul Recruiter
  • Prototype built independently without engineering support
  • Validated directly with clients through live testing on calls
  • Client feedback shaped the final version before launch
The problem

Bad job descriptions cost companies money, waste recruiter time, and turn away great candidates. Our Client Success team kept hearing the same thing from clients: their job descriptions and recruiting messages were inconsistent, unclear, and often unintentionally exclusionary. This was leading to poor applicant quality, low response rates, and slower hiring overall.

When something gets mentioned enough times by clients, Client Success brings it to the product team. This was one of those requests. We added it to our backlog and planned to revisit it when we had more time.

The difference between a bad job description and a good one is huge. A bad one uses vague language, buzzword soup, intimidating tone, and a wish list of requirements that scares people off. A good one is clear, specific, explains the actual work and why it matters, and uses focused requirements that don't filter out strong candidates unnecessarily. Most companies don't realize their job descriptions are working against them.

Bad vs. good
✕  Before
Marketing Rockstar Wanted
  • Passionate guru who crushes KPIs
  • Thrives in fast-paced environment
  • Self-starter with can-do attitude
Requirements
  • 10+ years of experience required
  • Must have exceptional writing skills
  • Bachelor's degree required, MBA preferred
✓  After
Marketing Manager, Growth
  • Develop and run campaigns across paid and organic channels
  • Report on performance against defined goals weekly
  • Partner with product and sales to align on messaging
Requirements
  • 3–5 years of marketing experience
  • Strong written communication skills
  • Experience with campaign analytics
The same role, written two different ways. The first version reads like a wish list. The second tells candidates what the job actually is.
The approach

I started by talking with our Client Success team to understand what clients were actually asking for and where their biggest pain points were. From there, I worked with the product team to align on priorities, identify gaps, and figure out how much time we wanted to invest.

Once we had a clear direction, I used Claude to explore different layout options for the tool and estimate development effort. This let me quickly work through questions about how the tool should be structured and what the flow should feel like, without needing to pull in the engineering team right away.

From those explorations, I sketched out breadboards that mapped the core user journey. The idea was simple: a recruiter pastes in their job description, the tool analyzes it and highlights areas that need improvement, and they work through the suggestions at their own pace.

User flow
Paste JD
recruiter pastes their text
Analyze
AI flags issues across 4 categories
Review
highlights appear inline in the text
Accept / Reject
each suggestion individually or all at once
Copy
improved text ready to use
The core flow mapped out before any design work started. Each step had to be obvious and fast. Recruiters aren't going to slow down for a tool that makes them think too hard.
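A flow like this boils down to a small pipeline: analyze the pasted text into suggestions, let the recruiter accept or reject each one, then apply the accepted changes. Here's a minimal TypeScript sketch of that shape. All names are illustrative and the hard-coded rules stand in for the AI analysis step; this is not the shipped Mogul Recruiter code.

```typescript
// Illustrative sketch only — not the shipped code.
type Category = "clarity" | "inclusive" | "requirements" | "tone";

interface Suggestion {
  id: number;
  category: Category;
  original: string;     // the flagged phrase
  replacement: string;  // the proposed rewrite
}

// Step 2: Analyze. A real tool would call an LLM here; two
// hard-coded rules stand in for the model in this sketch.
function analyze(jd: string): Suggestion[] {
  const rules: Array<[RegExp, Category, string]> = [
    [/a rockstar/gi, "clarity", "an experienced"],
    [/a ninja/gi, "inclusive", "a skilled collaborator"],
  ];
  const suggestions: Suggestion[] = [];
  let id = 0;
  for (const [pattern, category, replacement] of rules) {
    for (const match of jd.matchAll(pattern)) {
      suggestions.push({ id: id++, category, original: match[0], replacement });
    }
  }
  return suggestions;
}

// Steps 4–5: Accept / Reject, then Copy. Only accepted
// suggestions are applied to the final text.
function applyAccepted(
  jd: string,
  suggestions: Suggestion[],
  acceptedIds: Set<number>,
): string {
  let out = jd;
  for (const s of suggestions) {
    if (acceptedIds.has(s.id)) out = out.replace(s.original, s.replacement);
  }
  return out;
}

const jd = "Seeking a rockstar engineer who is a ninja.";
const found = analyze(jd);
const improved = applyAccepted(jd, found, new Set(found.map((s) => s.id)));
// improved: "Seeking an experienced engineer who is a skilled collaborator."
```

The key design point is that analysis and application are separate steps, which is what makes per-suggestion accept/reject possible instead of a one-shot rewrite.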
Building the prototype

This is where things got interesting. I wanted to test what I could accomplish using AI tools on my own, without leaning on the engineering team. I took my breadboard ideas and wrote a prompt in Cursor to help me build a working prototype.

It was a game changer. I was able to move really fast, test ideas, and iterate on the fly. Instead of waiting for engineering bandwidth or putting together a spec and handing it off, I had a functional prototype I could put in front of real users. This is the kind of thing that would have taken weeks in a traditional workflow, and I was able to do it independently.

How it works

The tool analyzes a pasted job description and highlights suggestions across four categories. Recruiters can review each one individually, accept or reject changes, and watch their job description improve in real time.

app.mogulrecruiter.com/tools/jd-enhancer
Category counters: Clarity · Inclusive language · Requirements · Tone
Sample input, with highlighted phrases in [brackets]:

Software Engineering Manager
We're looking for a [rockstar engineering manager] to join our growing team. You'll be a [ninja] who can [juggle multiple priorities] in a fast-paced, dynamic environment.
What you'll do
  • Lead and mentor a team of engineers to [crush their sprint goals]
  • [Interface with stakeholders and drive technical strategy]
  • Partner closely with product and design
Requirements
  • [10+ years of software engineering experience]
  • [Must have] excellent written and verbal communication
  • [Bachelor's degree required (Master's preferred)]
  • [Must be a team player] who is passionate about excellence

Suggestions:
  • "rockstar engineering manager" (Clarity): Vague and culture-specific. Try "experienced engineering manager" to describe the role clearly.
  • "ninja" (Inclusive language): "Ninja" and similar terms can feel exclusionary. Consider "skilled" or "experienced" instead.
  • "juggle multiple priorities" (Clarity): This phrase is vague. Be specific about what they'll actually be managing at once.
  • "crush their sprint goals" (Clarity): Unclear and informal. Try "meet sprint commitments" or "hit delivery milestones."
  • "Interface with stakeholders and drive technical strategy" (Tone): Consider "develop and communicate technical strategy" to sound more collaborative than directive.
  • "10+ years of software engineering experience" (Requirements): This may filter out strong candidates. Consider "5+ years" or focus on specific skills instead of years.
  • "Must have" (Tone): "Must have" reads as demanding. Try "strong" or "demonstrated" to set a more welcoming tone.
  • "Bachelor's degree required (Master's preferred)" (Requirements): Degree requirements can limit diversity. Consider "or equivalent practical experience."
  • "Must be a team player" (Inclusive language): This phrase is subjective and hard to define. Focus on specific collaborative behaviors instead.
Each highlighted phrase carries a suggestion, and each category flags a different type of issue. Recruiters can work through them one at a time or accept everything at once.

Purple highlights flag clarity issues: vague language, buzzwords, and sections that don't clearly communicate what the role actually involves.

Blue highlights flag inclusive language: wording that could unintentionally discourage certain candidates from applying, with more welcoming alternatives suggested.

Yellow highlights flag requirements: inflated or unnecessary criteria that might be filtering out strong candidates who could do the job.

Green highlights flag tone: adjustments that make the overall description more approachable and human.
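In code, the color-coded categories above might reduce to a simple mapping plus a per-category tally to drive the counters in the UI. A hypothetical TypeScript sketch, with names and colors as assumptions rather than the shipped implementation:

```typescript
// Illustrative only — category names and colors are assumptions.
type Category = "clarity" | "inclusive" | "requirements" | "tone";

// Each category gets its own highlight color so recruiters can
// tell at a glance what kind of issue each flag represents.
const highlightColor: Record<Category, string> = {
  clarity: "purple",
  inclusive: "blue",
  requirements: "yellow",
  tone: "green",
};

// Tally flagged categories to drive per-category counters.
function countByCategory(flags: Category[]): Record<Category, number> {
  const counts: Record<Category, number> = {
    clarity: 0,
    inclusive: 0,
    requirements: 0,
    tone: 0,
  };
  for (const c of flags) counts[c] += 1;
  return counts;
}

const flagged: Category[] = ["clarity", "clarity", "requirements", "tone"];
const counts = countByCategory(flagged);
// counts.clarity === 2, counts.inclusive === 0
```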

Client testing

Once I had a working prototype, I sent it to our Client Success team. Instead of running a traditional usability test, they did something better: they tested it live during client calls with the exact clients who had originally requested this kind of tool.

The reaction was really positive. Clients were excited to actually test out a feature and give direct feedback. It also gave us a clear picture of what needed to change before we built the final version.

What shipped

We implemented all the changes that came out of client testing. The final version was more polished than the prototype, but the prototype had already nailed the flow, the core functionality, and the features that mattered most. The client feedback mostly shaped the details and the experience around the edges.

The tool shipped as part of the Mogul Recruiter, living in the Tools section where recruiters could use it alongside their sourcing and outreach workflow.

app.mogulrecruiter.com/tools/jd-optimizer
Job Description Optimization tool showing color-coded suggestions in the Mogul Recruiter
The shipped version of the tool inside the Mogul Recruiter.
What I learned

This project changed how I think about prototyping. Being able to go from sketches to a working prototype on my own, using AI tools like Claude and Cursor, meant I didn't have to wait for engineering time to validate an idea. I could test a concept with real users and come to the engineering team with a validated direction and real feedback instead of just a spec. That's a fundamentally different conversation, and it leads to better products shipping faster.