The briefing comes in. "We need a senior profile, resourceful, with strong communication, used to working under pressure." You post the role, run searches, do outreach, and by day three you already notice the problem. Reasonable candidates come in, but few really fit. The hiring manager tweaks the requirements on the fly. The shortlist shifts. The process drags on.
That's usually not a sourcing problem. It's usually a problem of job analysis done badly.
In recruiting, the most expensive mistake isn't reaching out to the wrong candidate. It's starting with a vague definition of the role. When the role is poorly analyzed, everything else gets contaminated: the search, the filters, the prioritization, the interview, and even how you justify why you process certain data and not others.
I've seen processes fall apart over a single confusion: asking for experience with a tool when the real bottleneck was the relationship with the internal client, or looking for a "hunter" when the company actually needed someone capable of organizing pipeline, processes, and reporting. The market doesn't correct that ambiguity. It amplifies it.
Why a Good Job Analysis Is Your Secret Sourcing Weapon
Most teams still treat job analysis as a preliminary document you have to fill in so the process "looks right". That approach wastes time. In practice, the analysis isn't bureaucracy. It's the master filter of the process.

When a recruiter receives a vague request, they don't have a volume problem. They have a precision problem. If you don't know what tasks support the role, what results are expected, and what signals distinguish a good candidate from a merely acceptable one, the search becomes trial and error.
The real cost of doing it badly
Job analysis has very old roots. Frederick Taylor's studies between 1885 and 1915 laid the foundations of systematic work observation, and, adapted to recruitment, they reduced hiring risks by 30-50% in manufacturing sectors by aligning capabilities with specific tasks, according to the work compiled in this Comillas research on job analysis and valuation. That same framework is still useful today to prevent conflicts over duplicated functions, reducing them by 25%, and to support precise descriptions.
That matters far more than it seems in an agency or TA team. If the role definition is fuzzy, three very specific failures appear:
Inflated searches: you add more keywords to "cover yourself" and end up with noise.
Inconsistent interviews: each interviewer evaluates a different idea of the role.
Generic outreach: the message doesn't connect because it doesn't speak to the real challenge of the role.
A good recruiter doesn''t start by searching
They start by translating business into observable criteria.
A hiring manager might ask for "proactivity". That's not actionable for searching. What is actionable is grounding it. Should they detect issues before they escalate? Unblock other teams? Speak with complex clients without constant support? Quality sourcing is born at that level of detail.
Practical tip: if a requirement can't be turned into evidence visible on a CV, on LinkedIn, in an interview, or in a reference, it's still not well defined.
Analysis improves speed, it doesn''t slow you down
This is the point most often misunderstood. Many recruiters avoid digging in because they want to gain speed. In reality, when you don't do the analysis well at the start, you pay for that haste later with shortlist revisions, redone searches, and miscalibrated pipelines.
In complex processes, a good job analysis does something very simple: it gives you exclusion and prioritization criteria before you open any tools. That's the competitive edge. You don't compete by sending more messages. You compete by knowing whom not to write to and by spotting the profile that actually makes sense earlier.
And the sharper that analysis is, the more useful technology layers become. AI doesn't fix a bad role definition. It only accelerates its mistakes. But with a well-dissected role, it can help you filter patterns, prioritize profiles, and find candidates who don't describe themselves with the exact words the hiring manager uses.
The Foundations of Analysis: Gathering Key Information
Job analysis breaks down when it leans on a single source. If you only listen to the hiring manager, you get an aspirational version of the role. If you only look at the old job description, you inherit biases and recycled language. If you only talk to the person currently in the role, you get a partial view of day-to-day reality.
That's why mixed approaches work better. In Spain, combining interviews, observation, and questionnaires optimizes hiring costs by 20% to 30%, and questionnaires are already used in 60% of medium-sized companies. Direct observation, while more costly, can reduce errors in repetitive tasks by 25-35%, according to Miguel Hernández University's study on job analysis and description.
Who to actually interview
My rule is simple. For any relevant position, I talk to three levels:
Direct manager
Internal stakeholder who receives the work
Person who does or did the job
Each one brings a different piece.
The manager defines expectations, context, and scope of autonomy. The stakeholder reveals real friction. The person executing the role usually shares what no one puts in the JD: invisible tasks, makeshift tools, dependencies, and bottlenecks.
What to ask to get out of vagueness
Don't start with competencies. Start with facts.
Ask questions that force you down to the ground:
About results: What does this person have to have solved at 3 or 6 months?
About real priority: What tasks take up the most time even though they don't look "strategic"?
About autonomy: What decisions can they make without escalating?
About friction: Where did previous people fail?
About context: Which areas does this role clash with the most?
About signs of success: What would make you say "we got it right" a few months in?
If you want to organize those interviews and stay consistent across processes, it helps to work with a clear checklist. A good support is this guide on evaluation checklists, because it forces you to turn impressions into comparable criteria.
When to use each method
Not every role deserves the same effort or the same type of data collection. That's where many teams overcomplicate things.
Structured interviews
They're the foundation for almost any office position, middle manager, or specialized profile.
They work well when you need to understand influence, relationships with other areas, judgment, and decision level. The important thing isn't "having a conversation". The important thing is that all conversations follow the same architecture so you can compare.
Direct observation
Makes more sense in roles with repetitive tasks, clear sequences, or visible interaction with tools, clients, or processes. In operational back office, in-person service, logistics, or environments with high standardization, observing avoids relying on idealized descriptions.
In strategic or hybrid roles, observing without a prior framework can consume time and add little value.
Questionnaires
They serve to scale. They don't replace a good interview, but they organize information and let you compare similar roles with less friction. They also help when you're working with several managers and you need consistency in data collection.
Comparison of Information-Gathering Methods
| Method | Ideal for... | Advantages | Disadvantages |
|---|---|---|---|
| Structured interviews | Technical roles, middle managers, sales positions, profiles with high interaction | Surface context, nuances, priorities, and judgment | Depend on the quality of questions and participants' time |
| Direct observation | Repetitive tasks, operations, customer service, production, roles with visible sequence | Show real work and drift less toward the aspirational | Consume more time and don't always fit knowledge roles |
| Standardized questionnaires | High-volume processes, role families, comparisons across areas | Make consistency and objective comparison easier | If poorly designed, generate generic responses |
Useful tip: when a manager says "the role is very changeable", don't skip the analysis. Do the opposite. That usually means you need to separate stable tasks, incidents, and invisible work.
What doesn''t work
Some practices almost always degrade the quality of the analysis:
Copying an old JD: you carry over requirements that no longer explain the business.
Accepting soft labels without examples: "leadership", "proactivity", "flexibility".
Talking to a single person: you get one self-interested version of the role.
Confusing seniority with years: the real level usually lies in the complexity of the environment and autonomy, not just career length.
When you collect information well, the next step is no longer accumulating notes. It's turning them into a useful map for searching better.
From Information to Profile: Building the Competency Matrix
Once you have interviews, observations, and questionnaires, another risk appears: ending up with a pile of correct information that's useless for searching. The recruiter needs to turn that material into an actionable structure. That's where the competency matrix comes in.

The matrix forces you to separate the important from the accessory. And, above all, it forces you to stop mixing knowledge, skills, and behaviors as if they were the same thing.
The structure that works best
I work with three blocks:
Knowledge
It's what the person needs to know. It includes technical mastery, regulations, tools, processes, markets, or working languages.
For a Key Account Manager, this would include CRM, P&L reading, account structure, complex commercial negotiation, or channel knowledge.
Skills
It's what the person needs to know how to do. We're no longer talking about theory, but observable execution.
Continuing with the same example, a skill isn''t "communication". It would be presenting proposals to leadership, renegotiating terms, detecting churn risk in an account, or coordinating deliveries with operations.
Behaviors and attitudes
It's how the candidate moves in the real environment of the role. This isn't about abstract traits. It's about conduct.
"Customer orientation" is too vague. In the matrix, I prefer to write something like: maintains difficult conversations without damaging the commercial relationship, or prioritizes incidents with contractual impact over lower-value tasks.
The value of analyzing hidden behaviors
In remote work, the critical incidents method gives a lot of information that a formal description doesn't capture. In Spain, remote work has increased 150% post-pandemic, and this method showed 92% effectiveness in defining hidden responsibilities in a study of 200 Spanish SMEs, while reducing onboarding time by 30%, according to the analysis published by PeopleNext on practical methods for job analysis.
This fits perfectly with the matrix. Often the problem isn't in the official task of the role, but in what no one wrote down: managing interruptions, coordinating without visibility, writing clearly, working with autonomy, or sustaining pace without continuous supervision.
Operational key: if a competency doesn't change your shortlist, it doesn't deserve to be on the matrix. Adding for the sake of adding only injects noise.
A small example of conversion
Say in interviews this phrase comes up: "We want a profile with strong communication, commercial focus, and strategic mindset."
That's not yet usable.
A useful translation would look like:
Knowledge: experience with consultative sales in long cycles.
Skill: running meetings with business stakeholders and synthesizing complex needs.
Behavior: maintains commercial judgment even when the client pressures for unfeasible urgencies.
Now you can search, filter, and evaluate.
How to prioritize without overcomplicating things
The matrix doesn't have to be huge. It has to be clear. To make it useful, separate competencies into three levels:
Must-haves: without this, there's no fit.
Accelerators: not mandatory, but shorten the adaptation curve.
Contextual: depend on the team, sector, or business moment.
That order helps you avoid the classic mistake of asking for an impossible profile. And it gives you a solid base to translate the role into search filters, interview questions, and screening criteria.
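To make the tiering tangible, the matrix can be sketched as plain data: the must-haves act as a hard filter, while accelerators and contextual items only affect ranking. This is a minimal illustration in Python; the competency labels and the weighting are hypothetical assumptions, not a prescribed taxonomy.

```python
# Minimal sketch of a three-level competency matrix as data.
# The labels below are hypothetical examples for a Key Account Manager.
MATRIX = {
    "must_have": ["consultative sales in long cycles", "P&L reading"],
    "accelerator": ["CRM fluency", "channel knowledge"],
    "contextual": ["international exposure"],
}

def screen(evidence: set[str]) -> tuple[bool, int]:
    """Hard-filter on must-haves; score accelerators and context for ranking."""
    if not all(c in evidence for c in MATRIX["must_have"]):
        return False, 0  # no fit without the must-haves
    # Illustrative weighting: accelerators count double contextual items.
    score = 2 * sum(c in evidence for c in MATRIX["accelerator"])
    score += sum(c in evidence for c in MATRIX["contextual"])
    return True, score
```

The design choice mirrors the text: a candidate missing a must-have is out regardless of accelerators, and among those who pass, the score only orders the shortlist, it never rescues a failed fit.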
Drafting the Job Description and Specification
With the matrix defined, it's time to write two distinct documents. This step often gets blurred together, and that's where many problems start. The job description doesn't serve the same function as the job specification.

The description looks outward. The specification looks inward. When you try to use a single document for both, you publish flat texts and on top of that you have no serious guide for sourcing.
The job description sells context, not fluff
A good description isn't a list of tasks pasted one below the other. It has to explain why the role exists, what impact it will have, and what environment it will operate in.
To write it well, I prioritize this order:
Mission of the role
Expected results
Main responsibilities
Work environment and key relationships
Visible requirements for the market
The mission should be brief and concrete. If it sounds like a corporate slogan, it doesn't help. "Lead the relationship with key accounts to protect and expand recurring business" says more than "manage portfolio with results orientation".
Responsibilities also improve a lot when written by impact. Instead of "coordinate meetings with clients", it's more useful to write "coordinate periodic conversations with clients to anticipate risks, align expectations, and sustain renewals".
The specification is your technical blueprint
This document isn't designed to attract. It's designed to decide.
Here you dump the hard part of job analysis:
Real minimum requirements
Must-have competencies
Disqualifying signals
Team context
Valid transferable experience
Validation questions for screening
A structured functional analysis can achieve 85% accuracy in defining competencies and reduce turnover by 22% by aligning profiles better, according to data from Spanish SMEs collected by Cegid in their guide on how to do job analysis. That same material warns of something I see continuously: skipping cross-validation between job holder, manager, and HR can generate up to 40% inaccuracies in the specification.
That explains why some searches start well and then drift. The market didn't fail. The internal document failed.
How to write without creating absurd filters
The specification must distinguish between what's negotiable and what's not. If you don't, you end up screening out very valid profiles.
Three rules help a lot:
Remove tools that are quickly learned: if the value of the role lies in judgment, don't turn every piece of software into an absolute requirement.
Describe valid transfer: if you accept experience from adjacent sectors, write it down.
Add context criteria: company size, type of client, operating pace, international exposure.
Later, when it's time to publish, you'll want to adapt the description for visibility and clarity. If you're reviewing that part, this guide on free job posting platforms can serve as a tactical reference for the channel, although the quality of the underlying document remains the decisive factor.
Visual support helps align the manager
When I notice the hiring manager keeps mixing wishes with needs, I prefer to review the wording with a visual aid or an external resource that makes consensus easier.
Golden rule: if the job description works to attract, but the specification doesn't work to screen out, it's not finished yet.
Integrating Results with Sourcing and Compliance
This is where job analysis stops being theory and goes into production. If the document ends up filed away, you haven't gained anything. The value appears when you turn that analysis into search logic, prioritization, and outreach.
The right translation is fairly direct. From the specification you pull alternative titles, keywords, exclusions, equivalent industries, seniority level, languages, operating environment, and context signals. That gets transformed into Boolean searches and finer filters.
From criterion to filter
A weak recruiter searches for titles. A strong recruiter searches for evidence.
If the role requires enterprise account management in complex cycles, "Account Manager" isn't enough. You have to cross signals: company size, type of client, international exposure, relationship with procurement, work with multiple stakeholders. All of that comes from the analysis, not from intuition.
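As a concrete illustration, crossing signals like these can be compiled into a Boolean search string: each evidence group becomes an AND-ed block of OR-ed terms. This is a hedged sketch; the titles, evidence groups, and exclusion are illustrative assumptions, not the only valid translation, and exact operator syntax varies by platform.

```python
# Sketch: compile specification criteria into a Boolean search string.
# Each evidence group becomes one AND-ed block of OR-ed signals.
def boolean_query(titles, evidence_groups, exclusions):
    quote = lambda terms: "(" + " OR ".join(f'"{t}"' for t in terms) + ")"
    parts = [quote(titles)]                      # alternative job titles
    parts += [quote(group) for group in evidence_groups]  # context signals
    parts += [f'NOT "{x}"' for x in exclusions]  # disqualifying terms
    return " AND ".join(parts)

# Hypothetical criteria pulled from a specification:
query = boolean_query(
    titles=["Account Manager", "Key Account Manager"],
    evidence_groups=[["enterprise", "key accounts"], ["procurement", "RFP"]],
    exclusions=["intern"],
)
# query: ("Account Manager" OR "Key Account Manager") AND ("enterprise"
# OR "key accounts") AND ("procurement" OR "RFP") AND NOT "intern"
```

The point of the sketch is that every block traces back to a documented criterion: if a term can't be justified from the specification, it doesn't belong in the query.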
This is the layer where modern tools fit. For example, HeyTalent lets you extract updated profiles from Boolean searches, enrich them with contact data, and apply customizable AI variables to filter by signals from professional history, working as a complement to the ATS and to sourcing work. If you want to dig deeper into how this logic fits within a broader strategy, this piece on talent attraction adds operational context.
AI doesn't replace the recruiter's judgment
It accelerates it when the role is well defined.
One of the least-explored angles is the use of AI for predictive evaluations. In Spain, 75% of large companies report musculoskeletal disorders, and only 15% of Spanish SMEs use AI in HR, according to the technical note linked on ergonomic job evaluation and use of AI in predictive contexts. The same material proposes that AI tools can enrich profiles by analyzing histories to predict ergonomic aptitudes, especially under the framework of EU directive 2023/970.
For recruiters and agencies, the practical reading is clear. If the role has physical conditions, hybrid environment, autonomy demands, or work patterns that don't always appear explicitly, the prior analysis tells you what signals to look for. AI helps find them faster, but only after defining the criterion well.
Compliance and traceability
There's also a legal reason to take this work seriously. When you document why certain requirements are relevant for the role, you can better justify the data processing within your process. That doesn't make any practice valid, but it does provide traceability, coherence, and defensible judgment.
In simple terms, a good analysis does three things:
Limits arbitrariness: you reduce requirements added "because they're always asked".
Adjusts data to the purpose of processing: you look for information related to the role, not out of curiosity.
Organizes internal communication: manager, recruiter, and internal client share the same definition of the role.
Operational conclusion: the integration between analysis, sourcing, and AI isn't an add-on. It's the only way to turn a briefing into a replicable, fast, and defensible process.
Conclusion: Your Action Plan for a Flawless Process
Job analysis isn't the prologue to the process. It's the core. If this part fails, everything else becomes slower, more expensive, and less reliable.
The right sequence is fairly clear. First, you collect information from several sources. Next, you turn it into a useful competency matrix. Then you separate description and specification so you don't mix attraction with evaluation. Finally, you carry that criterion into sourcing, filtering, outreach, and internal documentation.
This shift in approach greatly improves the recruiter''s work. You stop chasing ambiguous briefs and start operating with concrete signals. You stop filtering by intuition and move to prioritizing by evidence. You stop relying on job titles and start spotting real capability.
If I had to summarize it in a short action plan, it would be this:
Don't accept vague briefings. Drill every requirement down to tasks, results, and observable conduct.
Use more than one source. Manager, stakeholder, and job holder.
Build a small but tough matrix. Must-haves, accelerators, and context.
Separate the publishable from the evaluable. One attracts. The other decides.
Turn the analysis into real filters. That''s where the process gains speed.
The recruiter who closes better usually isn't the one who sees the most candidates. It's usually the one who defines the role better before going to market.
Frequently Asked Questions on Job Analysis
What to do when the hiring manager and the team contradict each other
It's normal. The manager usually describes the role from expectation and responsibility. The team describes it from friction and execution. Don't pick a version by intuition.
What works better:
Map overlaps: the tasks and results that both sides repeat.
Separate desire from need: many discrepancies come from mixing "ideal" with "must-have".
Validate with recent examples: ask for concrete situations, not general opinions.
If the contradiction persists, document scenarios. Sometimes the problem isn't the candidate being sought, but that the role is poorly designed.
How to analyze a new role that no one has held yet
In a new role, you can't interview an incumbent, but you can analyze the business problem the role has to solve.
Start with four questions:
What result justifies the hire?
What tasks will they have to take on from day one?
Which areas will they interact with the most?
What mistakes would be especially costly in the first months?
Then look for similar internal references or equivalent profiles at other companies. The key isn''t to copy a title. It''s to isolate transferable capabilities.
How often should the analysis be updated
Don't wait for the next process if you already know the role has changed. In commercial, tech, operations, or growing teams, roles mutate fast.
Clear signals to review it:
Change in reporting
New product or market
Greater client complexity
Increased autonomy or responsibility
Emergence of tasks that didn''t exist before
If you don't update it, sourcing keeps searching for an old version of the role.
How to keep the analysis from becoming too long
Excess detail also hurts. A good analysis doesn't document everything. It documents what affects selection.
What to leave out:
Residual tasks that don't change the role's success
Ornamental requirements
Secondary tools
Personal traits impossible to verify
What absolutely must be clear:
Expected result
Core responsibilities
Must-have requirements
Role context
Disqualifying and prioritization criteria
Does it work the same for high-volume roles and executive roles?
Yes, but not with the same depth or the same method.
In volume hiring, you want to standardize more and reduce operational ambiguity. In executive hiring, what matters more is influence, decision-making, internal political context, and capacity to transform. The principle is the same. What changes is the level of analysis and the type of evidence you need to collect.
If you want to bring this approach into daily execution, HeyTalent can help you turn a well-done specification into Boolean searches, AI filters, enriched contact data, and automated outreach without replacing your ATS. Technology speeds up the process a lot when the role is already well defined.
