The Ultimate ATS Data Migration Checklist: A 10-Point Plan for a Compliant Transition

A 10-point ATS data migration checklist covering EEOC/OFCCP compliance, API rate limits, stage mapping, attachment handling, and how to avoid pipeline collapse during cutover.

Raaj Raaj · 14 min read

An ATS migration is a data-model translation problem, not a CSV upload. Your applicant tracking system stores deeply relational data — candidates linked to applications, applications linked to jobs, scorecards hanging off interviews, EEO responses tied to applicant records — and flattening that structure into spreadsheets destroys the relationships your recruiting team depends on.

This 10-point checklist covers the technical and compliance reality of moving candidate data between systems like Greenhouse, Lever, Workday Recruiting, iCIMS, Ashby, and BambooHR — without breaking active hiring pipelines or exposing your organization to discrimination claims. If you've already hit some of the common pitfalls, our coverage of 5 "Gotchas" in ATS Migration goes deeper on custom fields, integrations, and compliance traps.

Why ATS Data Isn't "Just a CSV Export"

Every ATS stores candidate data differently, and those differences aren't cosmetic — they dictate the entire database schema.

Greenhouse separates the person (Candidate) from the candidacy (Application), linking Applications to Jobs and hanging Scorecards off Applications with structured attribute ratings. Lever is opportunity-centric — a single Contact spawns multiple Opportunities. BambooHR flattens applicants as records attached directly to job openings. Workday Recruiting nests candidates inside requisitions with business-process-driven workflows. (developers.greenhouse.io)

When you export to CSV, you collapse these relationships into rows. A candidate who applied to three roles becomes three disconnected rows. Interview scorecards lose their attribute-level ratings. Stage history timestamps vanish. Source attribution breaks. The result: your recruiting analytics in the new system are useless, and your hiring managers can't see a candidate's full history.

Take a concrete example. If you export flat BambooHR data and push it into Greenhouse without transforming it, you'll create a duplicate Candidate profile for every additional role a person applied to, losing the unified history of that person's interactions with your company. Translating Lever's fluid Contact → Opportunity structure into Greenhouse's rigid Candidate → Application → Job hierarchy requires splitting records, deduplicating contacts, and re-associating historical interview data.

Warning

Never collapse relational ATS data into flat rows. Maintain immutable source keys for the person (e.g., candidate_id or contact_id), each candidacy (application_id or opportunity_id), and each job (requisition_id) throughout the entire migration pipeline. (developers.greenhouse.io)
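The warning above can be sketched as a key map that travels with every record through the pipeline. Field names here are illustrative, not any vendor's actual schema:

```python
# Carry the source system's immutable IDs on every migrated record so
# relationships can be rebuilt (and audited) in the target system.
# Field names are illustrative, not any vendor's actual schema.

def build_key_map(source_records):
    """Index each candidacy by its (person, candidacy, job) source keys."""
    key_map = {}
    for rec in source_records:
        composite = (rec["candidate_id"], rec["application_id"], rec["requisition_id"])
        key_map[composite] = rec
    return key_map

records = [
    {"candidate_id": "c-1", "application_id": "a-10", "requisition_id": "r-7"},
    {"candidate_id": "c-1", "application_id": "a-11", "requisition_id": "r-9"},
]

# Two candidacies, one person: the composite keys preserve the link
# that a flat CSV export destroys.
key_map = build_key_map(records)
```

A lookup keyed this way lets the load phase answer "have I already created this person in the target?" without guessing from names or emails alone.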

This is why the checklist below starts with compliance and schema mapping — not the migration itself. If you get the foundation wrong, the move is just organized data loss.

Step 1: Run the Compliance & Retention Audit (EEOC, OFCCP, GDPR)

Before you touch a single record, you need to know what you're legally required to keep, what you're legally required to delete, and what falls in the gray zone.

EEOC Retention Requirements

EEOC recordkeeping rules (29 CFR Part 1602) require covered employers to keep all personnel and employment records — including application forms, resumes, and hiring records — for one year from the date of the record or the personnel action, whichever is later. If an employee is involuntarily terminated, records must be kept for one year from the termination date. If an EEOC charge has been filed, all relevant records must be preserved until final disposition of the charge. (eeoc.gov)

OFCCP Requirements for Federal Contractors

If you're a federal contractor or subcontractor with 150 or more employees and a contract of at least $150,000, OFCCP requires you to retain personnel and employment records for a minimum of two years. This includes job postings, applications, resumes, interview notes, and selection process records. Smaller contractors (below those thresholds) fall back to the one-year minimum. Under Section 503 of the Rehabilitation Act and VEVRAA, certain records related to disability and veteran outreach require a three-year retention period.

OFCCP's applicant recordkeeping guidance reaches beyond the final hired/not-hired outcome. It calls for records such as resumes of qualified job seekers considered from external sites, copies of job listings, the substantive search criteria used, and the date of each search. If you purge that material because it feels old, you're not cleaning data — you're destroying evidence. (dol.gov)

If you collect veteran self-identification data under VEVRAA, OFCCP says an ATS or HRIS can serve as the required data analysis file only if that data is stored securely, kept apart from other personnel information, and restricted to people with a compliance need to know. (dol.gov)

Danger

Failure to preserve complete and accurate records during a migration constitutes noncompliance under 41 CFR § 60-1.12. OFCCP can apply an adverse inference — assuming missing records would have been unfavorable to you. This turns a migration shortcut into a discrimination liability.

GDPR and the Right to Erasure

For candidates in the EU (or EU citizens applying remotely), GDPR Article 17 grants the right to erasure. Candidates can request deletion of their data, and you must respond within one month. Before migrating, you must:

  • Process any pending erasure requests in the source system
  • Verify that consent records transfer correctly to the new system
  • Confirm your new ATS supports automated retention periods and deletion workflows
  • Ensure you are not migrating data for candidates who have withdrawn consent

The right to erasure is not absolute. ICO guidance confirms it does not override a separate legal obligation to retain data. (ico.org.uk) Your migration policy needs three outputs before export: what must be retained, what must be anonymized or suppressed, and what can actually be deleted.

Typical GDPR-compliant retention for unsuccessful candidates is 6–24 months, depending on jurisdiction and lawful basis. The French CNIL, for example, sets a default of 2 years.
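The three retention outputs can be sketched as a classification function run over every record before export. The thresholds and rules below are placeholders, not legal advice; substitute your counsel-approved policy:

```python
from datetime import date, timedelta

# Placeholder thresholds -- replace with your counsel-approved retention policy.
GDPR_RETENTION = timedelta(days=730)   # e.g. CNIL's 2-year default
OFCCP_RETENTION = timedelta(days=730)  # federal contractors: 2-year minimum

def classify_record(last_activity, today, erasure_requested=False,
                    under_legal_hold=False, federal_contractor=True):
    """Return 'retain', 'anonymize', or 'delete' for one candidate record."""
    if under_legal_hold:
        # Pending EEOC charge or litigation hold overrides everything,
        # including GDPR Article 17 erasure requests.
        return "retain"
    retention = OFCCP_RETENTION if federal_contractor else GDPR_RETENTION
    if today - last_activity < retention:
        # Still inside the retention window: a legal obligation to keep
        # the record overrides the right to erasure.
        return "retain"
    return "delete" if erasure_requested else "anonymize"
```

Running this over the full export gives you the three lists the policy demands: retain, anonymize, delete.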

For a deep dive on compliance during candidate data transfers, see our GDPR & CCPA compliance guide for ATS migrations.

Step 2: Audit and Classify Every Data Object

Before mapping anything, catalog every object type in your source ATS. Here's what a typical system holds:

| Data Object | Examples | Migration Risk |
| --- | --- | --- |
| Candidate profiles | Name, email, phone, location, source | Deduplication across multiple applications |
| Applications | Job linkage, stage history, timestamps | Stage names rarely match 1:1 across systems |
| Resumes/CVs | PDF, DOCX, binary attachments | URLs expire; files need re-association |
| Interview scorecards | Attribute ratings, interviewer comments | Structured ratings may not map to target schema |
| Offer letters | Terms, approval chains, signed documents | Often stored as attachments, not structured data |
| EEO/OFCCP data | Race, ethnicity, gender, veteran/disability status | Must remain separated from personnel files |
| Source attribution | Channel, referrer, campaign tags | Taxonomy differences break reporting |
| Talent pool tags | Custom lists, nurture campaigns | No standard taxonomy across vendors |
| Consent records | GDPR opt-in timestamps, privacy policy versions | Loss = compliance violation |
| Communication history | Emails, InMails, scheduling threads | Often not exposed via API |

Tag each object as must-migrate, nice-to-have, or do-not-migrate (e.g., records past retention, data subject to pending erasure requests). (developers.greenhouse.io)
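The tagging step might look like a simple manifest that the rest of the pipeline consults. Object names and decisions below are illustrative:

```python
# Tag each catalogued object type with a migration decision. The categories
# mirror the checklist: must-migrate, nice-to-have, do-not-migrate.
MIGRATION_MANIFEST = {
    "candidate_profiles": "must-migrate",
    "applications": "must-migrate",
    "resumes": "must-migrate",
    "scorecards": "must-migrate",
    "eeo_data": "must-migrate",           # but via the segregated path (Step 7)
    "talent_pool_tags": "nice-to-have",
    "expired_records": "do-not-migrate",  # past retention, per the Step 1 audit
}

def objects_to_migrate(manifest):
    """Return the object types the extraction phase must cover."""
    return sorted(k for k, v in manifest.items() if v == "must-migrate")
```

Keeping the manifest in code (or version-controlled config) makes the scope auditable: every extraction job can assert it only touches listed objects.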

Step 3: Map Candidate Stages Across Systems

Pipeline stage mapping is where most ATS migrations silently fail. No two systems use the same stage names, and the stage hierarchy often reflects different hiring philosophies.

A company moving from Greenhouse to Lever needs to translate Greenhouse's rigid, job-specific stages ("Application Review" → "Phone Screen" → "On-Site" → "Offer") into Lever's account-wide pipeline stages that apply across all Opportunities. A stage called "Recruiter Screen" in one system might be "Phone Interview" in another — or split into two stages entirely.

Build a stage-mapping table before writing any migration code:

| Source Stage (Greenhouse) | Target Stage (Lever) | Notes |
| --- | --- | --- |
| Application Review | New Lead | Default entry stage |
| Recruiter Screen | Recruiter Screen | Direct match |
| Phone Interview | Phone Interview | Direct match |
| On-site Interview | On-site | Rename only |
| Debrief | Debrief | May not exist in Lever by default — create it |
| Offer | Offer | Check approval workflow differences |
| Rejected | Archived – Rejected | Lever uses archive reasons, not a rejection stage |

Map stages by meaning, not label. Define how active, rejected, withdrawn, hired, and archived states land in the target, plus what happens to historical interviews and feedback.

```yaml
person_key: source_candidate_id
candidacy_key: source_application_id
job_key: source_requisition_id
source_stage_id: a42482ff
target_stage: recruiter_phone_screen
historical_rule: backdated_scorecard_or_note
```
Warning

Do not attempt to map stages dynamically during migration script execution. Build a static mapping table that your ETL pipeline references. This forces Talent Acquisition leaders to sign off on the exact stage translations before data moves.

You must also map custom fields. If your legacy ATS captured "Willing to Relocate" as a boolean and the new system uses a dropdown, your script must translate the data types. Failure to do so results in silent data truncation.
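The boolean-to-dropdown translation might be sketched like this. The dropdown labels are assumptions; pull the real options from the target system's field definition:

```python
# Translate a legacy boolean custom field into the target system's dropdown.
# Raising on unmapped values surfaces bad data instead of silently truncating it.
RELOCATE_MAP = {
    True: "Yes, willing to relocate",   # assumed dropdown labels --
    False: "Not willing to relocate",   # pull the real options from the target ATS
    None: "Not specified",
}

def translate_relocate(value):
    if value not in RELOCATE_MAP:
        raise ValueError(f"Unmapped 'Willing to Relocate' value: {value!r}")
    return RELOCATE_MAP[value]
```

The same pattern (static lookup table, hard failure on unknowns) applies to every custom field whose type changes between systems.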

Workable requires customers to complete a stage-mapping sheet as part of migration. (help.workable.com) Greenhouse's active candidate migration guide notes that bulk stage changes only work when candidates are on the same job. (support.greenhouse.io) Stage labels are tenant-specific workflow markers, not portable truth. Get sign-off from your Head of Talent Acquisition before writing migration code.

For specifics on Greenhouse ↔ Lever stage mapping, see our Greenhouse to Lever migration guide.

Step 4: Handle In-Flight Requisitions

This is the step most checklists skip. If you have active requisitions with candidates mid-process — in interview loops, pending offer approvals, or awaiting background checks — you have three options:

  1. Freeze and migrate. Pause all hiring for the cutover window. This is the simplest approach technically, but it costs you candidate velocity. For a company with 50+ open reqs, even a 48-hour freeze can push offers past competitive deadlines.

  2. Parallel-run. Keep the source ATS live for in-flight candidates while onboarding new applications into the target ATS. Run both systems for 2–4 weeks, then reconcile. This avoids pipeline disruption but doubles the operational burden on recruiters.

  3. Staged migration. Migrate historical/closed data first, then cut over active reqs in batches by department or geography. This limits blast radius if something goes wrong and is what we recommend for most mid-to-large organizations.

Tip

Whichever approach you choose, never migrate active candidates without verifying their current stage in the source system at the moment of cutover — not at the time of the initial data export. Stages change hourly during active hiring.

Greenhouse's own active-candidate migration guide recommends keeping recruiting moving while historical migration runs, then migrating active candidates through a separate operational track. (support.greenhouse.io) That same split applies when moving between other ATS platforms.
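The re-verification the tip calls for can be sketched as a diff between two stage snapshots, one captured at the initial export and one captured at the cutover moment (the snapshot shape is illustrative):

```python
def stale_stage_records(export_snapshot, live_snapshot):
    """Flag applications whose stage moved between the initial export and
    cutover, so they can be re-extracted before migrating. Both inputs map
    application ID -> stage name."""
    stale = []
    for app_id, exported_stage in export_snapshot.items():
        live_stage = live_snapshot.get(app_id)  # None if withdrawn/deleted
        if live_stage != exported_stage:
            stale.append((app_id, exported_stage, live_stage))
    return stale

exported = {"a-10": "Phone Interview", "a-11": "On-site"}
live = {"a-10": "Phone Interview", "a-11": "Offer"}
```

Anything this returns gets re-extracted in the delta pass; migrating the stale export copy would land a candidate in the wrong stage in the new system.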

Step 5: Understand Your API Rate Limits

If you're migrating via API (which you should be for any non-trivial migration), the rate limits of both the source and target systems will dictate your migration timeline.

Greenhouse Harvest API

Greenhouse enforces a rate limit returned in the X-RateLimit-Limit response header. Harvest v1/v2 will be deprecated and unavailable after August 31, 2026 — any migration scripts must target Harvest v3, which uses OAuth 2.0 and a 30-second fixed window. Exceeding the limit returns HTTP 429 Too Many Requests with Retry-After and X-RateLimit-Reset headers. List endpoints support per_page values up to 500 — use this to reduce total request volume. (support.greenhouse.io)

Lever Data API

Lever uses a token-bucket rate limit of 10 requests per second per API key, with bursts up to 20 req/sec. (hire.lever.co) The real bottleneck is write operations: Application POST requests are capped at 2 requests per second, and Lever warns this limit may change without notice. For a migration of 50,000 historical applications, that's a minimum of ~7 hours just for the application writes — assuming zero retries.

Workday EIB

Workday's native no-code import tool (Enterprise Interface Builder) uses XML Spreadsheet 2003 format and has an SFTP file-size limit commonly cited at 300 MB (30 MB for files attached at launch). EIB lacks native error handling and cannot natively represent deeply nested relational objects — candidate → requisition → interview feedback chains — in a single load. For complex migrations, you'll likely need Workday Studio or the SOAP/REST APIs. Confirm these limits against your tenant documentation before planning a high-volume attachment load. (workday.com)

For a deep technical analysis of Workday's migration constraints, see our Taleo to Workday migration guide.

The engineering lesson isn't just "slow down." It's centralize throttling. If four workers each believe they own 10 requests per second, your script will rate-limit itself into failure. Implement a single rate-limiter with exponential backoff and jitter that all workers share.
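A minimal sketch of such a shared limiter with jittered backoff, sized here to Lever's documented 2-writes-per-second cap. The class and helper are illustrative, not any vendor's SDK:

```python
import random
import threading
import time

class SharedRateLimiter:
    """One token bucket shared by all workers, so the process as a whole
    stays under the API's limit instead of each worker assuming it owns it."""

    def __init__(self, rate_per_sec, burst):
        self.rate = rate_per_sec
        self.capacity = burst
        self.tokens = float(burst)
        self.updated = time.monotonic()
        self.lock = threading.Lock()

    def acquire(self):
        """Block until a request token is available."""
        while True:
            with self.lock:
                now = time.monotonic()
                self.tokens = min(self.capacity,
                                  self.tokens + (now - self.updated) * self.rate)
                self.updated = now
                if self.tokens >= 1:
                    self.tokens -= 1
                    return
                wait = (1 - self.tokens) / self.rate
            time.sleep(wait)  # sleep outside the lock so other workers proceed

def backoff_delay(attempt, base=1.0, cap=60.0):
    """Exponential backoff with full jitter, for retrying 429 responses."""
    return random.uniform(0, min(cap, base * 2 ** attempt))

# All workers call write_limiter.acquire() before each application POST
# (sized to Lever's documented 2 req/sec write cap).
write_limiter = SharedRateLimiter(rate_per_sec=2, burst=2)
```

On a 429, honor the `Retry-After` header if the API sends one; fall back to `backoff_delay(attempt)` otherwise.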

Step 6: Solve the Attachment Problem

Resumes, cover letters, and offer documents are the most fragile data in any ATS migration. Here's why:

  • Most ATS APIs return signed URLs for attachments, and those URLs expire. Greenhouse documents attachment URLs as expiring in 7 days and warns they can be temporarily unavailable during AWS S3 outages. (developers.greenhouse.io) Lever's temporary upload URIs expire within 24 hours. (hire.lever.co) If your extraction script saves metadata but defers file downloads, you'll come back to dead links.

  • Greenhouse's Harvest API requires attachments as base64-encoded content on POST, not as file URLs. A 2 MB PDF becomes ~2.7 MB after encoding, and you're uploading it through a rate-limited API.

  • Some systems store parsed resume data separately from the original file. If you only migrate the parsed fields, you lose the original document — which is the legally required artifact for EEOC/OFCCP retention.

  • Greenhouse distinguishes attachment types: resume, cover letter, offer packet, offer letter, take-home test, and other. That typing matters when recruiters search or audit later.

Your extraction pipeline must:

  1. Query the candidate/application record.
  2. Extract the temporary attachment URL.
  3. Download the binary file before the URL expires.
  4. Store locally with the source record ID and attachment type as metadata.
  5. Base64 encode (or prepare multipart form data) for the target API.
  6. POST the payload to the new ATS as part of the candidate/application creation step.
  7. Verify the checksum to confirm the file isn't corrupted.
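Steps 4, 5, and 7 of that pipeline can be sketched as follows. The actual download and upload calls are omitted because endpoints and auth differ per vendor:

```python
import base64
import hashlib

def prepare_attachment(file_bytes, source_record_id, attachment_type, filename):
    """Package a downloaded attachment for the target API, keeping the
    source linkage and a checksum for the post-migration binary QA pass."""
    return {
        "source_record_id": source_record_id,  # immutable key back to the source ATS
        "type": attachment_type,               # e.g. "resume", "cover_letter"
        "filename": filename,
        "content": base64.b64encode(file_bytes).decode("ascii"),
        "sha256": hashlib.sha256(file_bytes).hexdigest(),
    }

def verify_attachment(payload, file_bytes):
    """Re-check the checksum before and after upload to catch corruption."""
    return hashlib.sha256(file_bytes).hexdigest() == payload["sha256"]

pdf_bytes = b"%PDF-1.4 fake resume bytes"  # stand-in for a downloaded file
payload = prepare_attachment(pdf_bytes, "cand-123", "resume", "resume.pdf")
```

Storing the sha256 alongside the source record ID is what makes the later binary QA pass (checksum comparison per file) possible at all.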
Tip

Run a binary QA pass that is separate from row-count QA: file count, checksum/hash, MIME type, open test, and candidate/application linkage. Missing resumes are often discovered only when a recruiter opens a record in production. (developers.greenhouse.io)

Step 7: Migrate EEO/Diversity Data Separately

EEO and demographic data (race, ethnicity, gender, veteran status, disability status) must be handled with extreme care:

  • OFCCP requires this data to be kept separate from personnel decision files. If your migration script dumps EEO responses into a candidate's general notes or a custom field visible to hiring managers, you've created a discrimination exposure.

  • Most ATS platforms store EEO data in a segregated, access-controlled area. Your migration must write to the equivalent segregated area in the target system — not to the candidate profile.

  • Verify that EEO category taxonomies match between source and target. OFCCP's categories don't perfectly align with EEOC's EEO-1 report categories, and your source system may use yet another taxonomy.

  • For VEVRAA data specifically, the target system must enforce the same access restrictions — only people with a compliance need-to-know should be able to view it.

Step 8: Preserve Scorecard and Interview Feedback

Interview scorecards contain structured evaluation data — attribute-level ratings (e.g., "Technical Skills: Strong Yes"), free-text comments, and interviewer identity. This data powers quality-of-hire analytics and is often required for compliance audits.

The problem: scorecard schemas vary wildly between platforms. Greenhouse uses attributes with four-level ratings (Definitely Not, No, Yes, Strong Yes). Lever uses freeform feedback on Opportunities. Ashby has its own structured feedback model. A scorecard is not a single text field — it's a nested object containing specific attributes, interviewer IDs, ratings, and timestamps. If you flatten this into a generic "Notes" field, you destroy your company's ability to run historical hiring analytics.

If the target system doesn't support the same structured format, you have two options:

  1. Serialize scorecards as structured notes attached to the application, preserving interviewer, date, attribute, and rating in a parseable format.
  2. Map to custom fields in the target system, accepting that the data won't drive native reporting.

Neither is perfect. Document the compromise with stakeholders before migration, not after. Greenhouse explicitly notes that active candidate migration cannot backdate interviews, although scorecards can be backdated. (support.greenhouse.io)
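Option 1 might look like the sketch below. The input shape loosely follows an attribute-style scorecard and is illustrative, not any vendor's actual schema:

```python
import json

def scorecard_to_note(scorecard):
    """Serialize a structured scorecard into a note body that preserves
    interviewer, date, and attribute-level ratings in machine-readable form."""
    lines = [
        f"[MIGRATED SCORECARD] interviewer={scorecard['interviewer']} "
        f"date={scorecard['submitted_at']} overall={scorecard['overall']}"
    ]
    for attr in scorecard["attributes"]:
        lines.append(f"  - {attr['name']}: {attr['rating']}")
    # Append the raw JSON so the data stays re-parseable, not just readable.
    lines.append("RAW: " + json.dumps(scorecard, sort_keys=True))
    return "\n".join(lines)

note = scorecard_to_note({
    "interviewer": "jane@example.com",
    "submitted_at": "2024-03-02",
    "overall": "Strong Yes",
    "attributes": [{"name": "Technical Skills", "rating": "Yes"}],
})
```

The embedded JSON line is the important part: if you later move to a system that does support structured feedback, the data can be machine-extracted instead of being stuck as prose.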

Step 9: Test with a Representative Sample

Don't validate your migration with 50 clean candidate records. Test with a sample that includes:

  • A candidate with multiple applications across different jobs
  • A candidate with attachments in multiple formats (PDF, DOCX, image)
  • A rejected candidate with full scorecard history
  • A candidate with EEO data and consent records
  • An active candidate currently in an interview loop
  • A candidate with non-Latin characters in their name or address
  • A record that was merged or deduplicated in the source system

Push at least a representative subset into the target system's sandbox environment. Validate in the target system's UI, not just via API responses. What the API returns as "201 Created" and what the recruiter sees on screen can differ — especially for rich text fields, attachments, and stage history.

Use vendor thresholds when they exist. Workable recommends a sample import for datasets exceeding 40,000 candidates. (help.workable.com) Greenhouse recommends importing no more than 8,000 candidates at a time for best performance during bulk loads. (support.greenhouse.io)

Step 10: Execute the Cutover and Validate

The cutover should be the least dramatic part of the process if steps 1–9 are done correctly. Here's the execution sequence:

  1. Freeze the source system for net-new applications (redirect your career page to the new ATS or enable a maintenance window).
  2. Run a delta extraction to capture any changes since your last full export — new applications, stage movements, updated candidate info.
  3. Execute the final migration run against the target system.
  4. Validate record counts — total candidates, total applications, total attachments — against source system counts.
  5. Spot-check 20–50 records in the target UI, covering the edge cases from Step 9.
  6. Verify integration connections — job board feeds, HRIS sync, background check providers, scheduling tools — are pointed at the new system and receiving data.
  7. Confirm EEO data is segregated and accessible only to authorized compliance users.
  8. Enable the career page on the new ATS.
  9. Monitor for 48–72 hours — watch for broken candidate self-service portals, failed webhook deliveries, and recruiter-reported data gaps.
  10. Decommission the source system only after the monitoring period passes and stakeholder sign-off is received. Archive a full export for retention compliance.
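The record-count validation in step 4 can be sketched as a reconciliation report. The counts shown are made-up examples:

```python
def reconcile_counts(source_counts, target_counts):
    """Compare per-object record counts between source and target;
    return the objects that don't match so they can be investigated."""
    mismatches = {}
    for obj, expected in source_counts.items():
        actual = target_counts.get(obj, 0)
        if actual != expected:
            mismatches[obj] = {"source": expected, "target": actual}
    return mismatches

# Illustrative counts -- pull real ones from both systems at cutover.
source = {"candidates": 41230, "applications": 58114, "attachments": 63902}
target = {"candidates": 41230, "applications": 58114, "attachments": 63877}
diff = reconcile_counts(source, target)
```

A non-empty result blocks sign-off: in this example, 25 missing attachments is exactly the kind of gap recruiters otherwise discover weeks later in production.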
Info

Do not delete the source ATS data immediately after cutover. Keep a read-only archive for at least as long as your longest retention obligation (2–3 years for OFCCP/VEVRAA) in case of audit or litigation.

When to Bring In a Migration Partner

DIY ATS migrations work for small teams with simple pipelines — under 5,000 candidates, fewer than 10 active reqs, no EEO obligations. Beyond that threshold, the combination of API rate-limit engineering, relational data mapping, binary attachment handling, and compliance constraints makes this a specialty job.

At ClonePartner, we've executed ATS migrations across Greenhouse, Lever, Workday, iCIMS, Ashby, Bullhorn, JazzHR, and Teamtailor — handling the API orchestration, stage mapping, attachment re-association, and EEO data segregation that this checklist describes. We guarantee zero downtime for active hiring pipelines and full preservation of nested relational data. Our Highsnobiety case study shows how we completed a Greenhouse-to-Teamtailor migration — originally quoted at three months — in days.

If you're planning an ATS migration and want to avoid the failure modes outlined above, talk to our engineering team.

Frequently Asked Questions

Can I use a CSV export to migrate ATS data?
CSV works only for very small, simple datasets. ATS data is relational — candidates link to applications, which link to jobs, scorecards, and EEO records. A CSV export flattens this structure, duplicates candidate records across multiple applications, drops scorecard attribute ratings, and loses stage-movement timestamps. For any migration with compliance requirements or more than a few thousand records, API-based migration is necessary.
How long do I need to keep applicant data when migrating ATS systems?
EEOC requires private employers to retain all personnel and employment records, including applications and resumes, for at least one year. Federal contractors with 150+ employees and $150k+ contracts must retain records for two years under OFCCP. Under VEVRAA/Section 503, certain disability and veteran outreach records require three-year retention. If an EEOC charge is filed, records must be kept until the case concludes. Never purge applicant data during a migration without verifying these thresholds.
What API rate limits affect ATS migrations?
Greenhouse Harvest v3 enforces a rate limit per 30-second fixed window (returned in the X-RateLimit-Limit header) and returns 429 with Retry-After when exceeded. Lever allows 10 requests/second steady state with bursts up to 20, but caps Application POST requests at 2/second. Workday's EIB tool has a commonly cited 300 MB SFTP file-size limit. These constraints directly determine migration duration and require centralized throttle-and-retry engineering.
How do you handle GDPR right-to-erasure during an ATS migration?
Process all pending GDPR erasure requests in the source system before migration. Do not migrate data for candidates who have withdrawn consent. Verify that consent timestamps and privacy policy version records transfer to the new system. Confirm the target ATS supports automated retention periods and self-service deletion workflows. The standard GDPR response deadline is one month from the date of the erasure request.
Why do resumes and scorecards disappear after an ATS migration?
Attachments use expiring signed URLs — Greenhouse URLs expire in 7 days, Lever upload URIs in 24 hours. If your script defers file downloads, you'll come back to dead links. Interview scorecards are nested structured objects, not flat text. If you export them via CSV they lose attribute-level ratings, interviewer identity, and timestamps. Both require separate extraction and re-linking in the target system.
