
Migrating Dynamics 365 On-Premise to Cloud: Escaping the SSIS Bottleneck with JSONata

If you are migrating Microsoft Dynamics 365 from on-premise to the cloud, standard tools like SSIS and KingswaySoft often cause project-stalling bottlenecks. This technical guide details how to replace slow, UI-bound SSIS packages with self-contained, JSONata-powered binaries. By leveraging declarative YAML mappings and automation, engineering teams can bypass workflow fatigue, execute complex data merges, and reduce debugging cycles from four hours to just twenty minutes.

Raaj Raaj · 7 min read

If you migrate Microsoft Dynamics 365 from on-premise to the cloud, conventional wisdom points you toward SQL Server Integration Services (SSIS) coupled with a connector like KingswaySoft. It is the industry standard.

It is also exactly why I recently watched a single bank migration drag on for six agonizing months.

The reality of these migrations hit hard when I spoke with Sudhir, a Product Manager on the client team. He was forced into the migration trenches because key people overseeing their Dynamics rollouts had left the company. They had exactly one person across their India/US team building SSIS packages. Due to a combination of workflow fatigue and the clunky SSIS developer experience, their migrations were stalling out.

To hit a tight weekend cutover for Bank A and rescue a hostile project for Bank B, we had to stop clicking through workflow interfaces.

Here is the exact technical and operational blueprint of how we replaced slow, UI-bound SSIS packages with self-contained, JSONata-powered binaries. I will walk you through the strict boundaries we set, how we solved the three fatal flaws of KingswaySoft, and the hybrid architecture we deployed when standard tools failed.

The Operational Boundaries and Technical Context

Before writing a single line of code, Sudhir and I established strict operational boundaries for the Bank A migration. When dealing with core banking data, security is absolute:

  1. Zero Access: My team would not have access to any client or bank systems.
  2. Self-Contained Execution: We would provide a standalone executable binary that the client could run locally on their secure servers.
  3. The Scope: The binary would ingest a series of flat CSV files and push them to the Cloud Dynamics 365 Web API.
  4. Data Sourcing: The CSV files would be generated entirely by the client team by exporting their on-premise Microsoft SQL Server tables (like contacts, account, person).

Sudhir shared a batch of test CSVs. We built the first version of the binary in a single day. It handled OAuth, parsed the CSVs, validated the schema, and executed the migration. We ran a few tests, recorded a Loom video to prove the data flow, and sent it to Sudhir. Once he was convinced, we moved to the QA environment.
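
As a rough illustration of the schema-validation step, here is a minimal sketch of how a parsed CSV row might be checked for required columns before being queued for the Web API. The field names are invented for the example, not the bank's actual schema:

```javascript
// Hypothetical required columns for an exported "account" CSV.
const REQUIRED_FIELDS = ["account_id", "name", "status_code"];

// Validate one parsed CSV row; returns a list of problems (empty = valid).
function validateRow(row, requiredFields = REQUIRED_FIELDS) {
  const errors = [];
  for (const field of requiredFields) {
    const value = row[field];
    if (value === undefined || value === null || String(value).trim() === "") {
      errors.push(`missing or empty field: ${field}`);
    }
  }
  return errors;
}
```

Rows that fail validation can be written to an error report instead of being sent to the API, which keeps a single malformed export line from poisoning a batch.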

The Bottleneck: The 3 Pitfalls of KingswaySoft

When you use KingswaySoft, engineers manually build packages for each Dynamics table. For a massive CRM, this introduces three fatal flaws that crush iteration speed:

  1. Workflow Fatigue: You sit and write packages for each table individually. When you repeat the same manual clicking process hundreds of times, it is only a matter of time before you start making mistakes.
  2. Copy-Pasta Errors: Engineers frequently copy and paste mapping nodes to save time. KingswaySoft provides a UI to select fields, but humans still forget to update the target attributes, leading to bad data mappings.
  3. Opaque Logic in the UI: Workflow UIs look fantastic in a sales demo. But for anything more complicated than a couple of if-else conditions, you cannot make sense of what is happening. The amount of effort and time required to trace the logic buried inside a package is mind-boggling.

To add insult to injury, SSIS and Azure Data Factory (ADF) have their own stability quirks. Client engineers frequently spent three to four hours debugging why their SSIS packages simply stopped running, only to realize they just had to restart the ADF to make it work again.

The Engineering Solution: How We Solved the 3 Pitfalls

To fix this, we worked directly with a new Business Analyst on their team. The BA talked to the bank to understand what SQL tables to migrate and how to transform them, documenting everything in a "Migration Checklist."

Here is how we bypassed the KingswaySoft bottlenecks entirely:

1. LLM Automation (Solving Fatigue)

We didn't click through UIs, and we didn't write our mapping files from scratch. We built a templatized script that utilized LLMs to read the BA's checklist and auto-generate the mapping structure. This got us 90% of the way there instantly, leaving only 10% for manual tweaking.
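
A minimal sketch of the generator's final stage, assuming the LLM has already extracted checklist rows into structured objects (the field names and passthrough default here are hypothetical, not the actual script):

```javascript
// Turn structured checklist rows into baseline mapping entries.
// Rows without an explicit transformation default to a passthrough,
// which engineers then hand-tune (the remaining ~10%).
function generateMappings(checklistRows) {
  return checklistRows.map((row) => ({
    source_field: row.sourceColumn,
    target_field: row.targetAttribute,
    transformation: row.transformation || row.sourceColumn,
  }));
}
```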

This automation was an absolute lifesaver for Bank A's marketing data. They had a massive number of cdi tables generated by an external email campaign tool. Manually mapping dozens of cdi tables in SSIS would have taken weeks; our script generated the baseline mappings in seconds.

2. Shifting Left (Solving Copy-Pasta)

We shifted the place where mapping mistakes happen right down to the source. That BA checklist became our literal source of truth: if the migration checklist was correct, the generated mapping was correct. We hit numerous cases where the names of table attributes were wrong in the checklist itself, and we caught and fixed them before any code was ever executed.

3. JSONata and YAML (Solving Opaque Logic)

We implemented the entire mapping logic as a single, declarative YAML file powered by JSONata (a lightweight query and transformation language).

It is much easier to find how you are transforming a specific attribute by just using Cmd+F in your text editor. Furthermore, these mappings dictated row-level ETL logic—instructing the binary exactly when to skip a malformed CSV row, when to run a strict insert, and when to upsert based on existing Alternate Keys.

```yaml
# Conceptual YAML/JSONata mapping generated from the BA Checklist
entity: "account"
source_csv: "BankA_Accounts_Export.csv"
mappings:
  - source_field: "status_code"
    target_field: "statecode"
    # JSONata logic to map legacy on-prem statuses to cloud integers
    transformation: "status_code = 'Active' ? 0 : 1"
```
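
To show how a mapping like this drives the binary at runtime, here is a hedged sketch of the row-processing step. In the real binary the transformation string is evaluated by a JSONata engine (e.g. `jsonata(expr).evaluate(row)` from the jsonata npm package); a plain function stands in here so the sketch is self-contained:

```javascript
// Apply one mapping entry to one CSV row, producing a Web API payload field.
function applyMapping(row, mapping, transform) {
  return { [mapping.target_field]: transform(row) };
}

const mapping = { source_field: "status_code", target_field: "statecode" };

// Stand-in for: jsonata("status_code = 'Active' ? 0 : 1").evaluate(row)
const statusTransform = (row) => (row.status_code === "Active" ? 0 : 1);

const payload = applyMapping({ status_code: "Active" }, mapping, statusTransform);
// payload is { statecode: 0 }
```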

Because of this setup, our iteration speed was a breath of fresh air. Columns went missing, CSV data was absent, or mapping names were wrong; debugging these issues took us 20 to 30 minutes, and the fixes were ready instantly. Compare that to KingswaySoft, where the debug-package-run ceremony took around four hours.

Dynamics 365 Web API Survival Guide

Running a custom binary means you interact directly with the Web API. We learned several critical optimizations to prevent the API from choking during Bank A's 4 to 5 million record weekend cutover:

  • Disabling Plugins: Dynamics plugins configured to run on every insert or update drastically increase API response times. Disable all non-essential plugins before migrating.
  • Date Formatting: On-premise SQL datetimes must be strictly parsed and formatted before hitting the Web API.
  • OData Foreign Keys: Specifying foreign key references in the Web API requires exact OData binding syntax (primarycontactid@odata.bind: "/contacts(uuid)").
  • Status Updates: Understanding exactly how statecodes and statuscodes need to be passed for various entities is critical to prevent API rejection.
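
A sketch of how the date-formatting and OData-binding rules combine when building a single record payload. The field names, custom date attribute, and contact id are illustrative assumptions, not the bank's actual schema:

```javascript
// Normalize an on-prem SQL datetime string ("YYYY-MM-DD hh:mm:ss",
// assumed UTC here) into the ISO 8601 form the Web API expects.
function toIsoUtc(sqlDatetime) {
  return new Date(sqlDatetime.replace(" ", "T") + "Z").toISOString();
}

// Build one account payload: foreign keys must use @odata.bind with a
// relative URI, never a bare GUID.
function buildAccountPayload(row) {
  return {
    name: row.name,
    statecode: row.statecode,
    "primarycontactid@odata.bind": `/contacts(${row.primary_contact_id})`,
    // hypothetical custom date field used for the example
    new_openedon: toIsoUtc(row.opened_on),
  };
}
```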

Bank B: The Hybrid Architecture Tipping Point

Then came Bank B. This migration had been running for six months, the bank was pissed off, and Sudhir explicitly warned me the stakeholders were "hostile."

Bank B was using Dynamics CRM as a custom helpdesk (similar to Salesforce Service Cloud). They had a custom setup where tickets were split across four entirely different on-premise SQL tables based on "Ticket Type." They needed all of them merged into a single Cloud table.

The tipping point for Bank B wasn't just the delay. The client engineer would spend all day working on the custom logic and feel confident the migration was done, only for the customer to shoot back the very next hour saying the migrated data was unusable due to broken relational links.

To fix this, we deployed a hybrid architecture. The SSIS packages handled half of the migration (the standard, flat entities), while our custom binaries took over the complex custom logic. We bypassed SSIS joins and used JSONata to effortlessly normalize their four distinct ticket tables into a single payload. We migrated the million records seamlessly, and not once did our scripts die randomly.
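
Conceptually, the normalization step looked like the following sketch: rows from the distinct on-prem ticket tables are flattened into one cloud shape, tagged by ticket type. Table and field names are invented for illustration:

```javascript
// Merge rows from several legacy ticket tables into a single cloud-ready
// list. Each source table stores the ticket title under a different column,
// so the normalizer coalesces the candidates.
function normalizeTickets(tablesByType) {
  const merged = [];
  for (const [ticketType, rows] of Object.entries(tablesByType)) {
    for (const row of rows) {
      merged.push({
        ticket_type: ticketType,
        title: row.subject ?? row.title ?? row.summary,
        legacy_id: row.id,
      });
    }
  }
  return merged;
}
```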

The Overlooked Requirement: Observability

One major pain point with their SSIS packages was the black box. The business side had no clue what the migration scripts were doing. Just to answer a simple question—"How many records errored out?"—required a whole one-hour ceremony for an engineer to pull the information.

We built observability directly into our tooling:

  1. Automated Error Reporting: A single command generated an error report automatically.
  2. MS Teams Webhooks: We added notifications that pinged the stakeholders with exact progress updates. They loved it. Observability is usually the most overlooked area of migrations, but it is crucial.
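
The notifications themselves can be plain JSON bodies POSTed to a Teams incoming-webhook URL. A minimal sketch of building one message (the entity name, counts, and wording are invented):

```javascript
// Build the JSON body for an MS Teams incoming webhook progress update.
// The real binary POSTs this object to the configured webhook URL.
function buildProgressMessage(entity, done, total, errors) {
  const pct = Math.round((done / total) * 100);
  return {
    text: `Migration progress for ${entity}: ${done}/${total} records (${pct}%), ${errors} errored out.`,
  };
}
```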

To be honest, until the Bank A production migration was actually complete, the client team was on edge and unsure; they had assumed SSIS was the recommended way to do these migrations. The initial QA run was an eye-opener for them.

Use this framework to decide your own migration path:

| Metric / Constraint | Stick with SSIS & KingswaySoft | Build Custom JSONata Binaries |
|---|---|---|
| Record Volume | < 1 Million | > 1 Million |
| Mapping Complexity | 1-to-1 table mapping | Normalizing multiple legacy tables into 1 cloud table |
| Data Transformation | Basic UI if/else logic | Complex JSONata, cross-referencing, or mathematical transforms |
| Team Skillset | Heavy C# / SQL / DBA skills | Node.js, Go, API architecture, JSON manipulation |
| Iteration Speed | 4-hour debug/deploy cycles | 20-minute debug/deploy cycles |

When you treat your data migration as a strict software engineering project rather than a UI workflow task, you stop crossing your fingers and hoping the package finishes. You build a predictable, fault-tolerant pipeline.

Frequently Asked Questions

Why does SSIS often become a bottleneck in Dynamics 365 cloud migrations?
Traditional SSIS and KingswaySoft workflows introduce severe workflow fatigue when dealing with massive CRMs. Engineers are forced to manually build packages for hundreds of tables, which inevitably leads to copy-paste mapping errors. Additionally, the opaque logic hidden inside visual UIs makes debugging complex data transformations incredibly difficult.
How do JSONata and custom binaries improve data mapping speed?
Custom binaries utilize JSONata to implement mapping logic as a single declarative YAML file. This allows engineers to easily search and modify transformation rules using a simple text editor. Because the logic is code-driven rather than UI-driven, teams can reduce debugging and deployment cycles from four hours down to just twenty minutes.
What are the biggest Web API risks when migrating millions of records to Dynamics 365?
Pushing massive data volumes directly to the Cloud Dynamics 365 Web API requires strict optimization to prevent the system from choking. The primary risks are mitigated by disabling non-essential plugins that trigger on inserts or updates. Furthermore, on-premise SQL datetimes must be strictly parsed, and foreign keys require exact OData binding syntax to avoid payload rejection.
Can custom binaries handle complex relational data better than SSIS?
Yes. When standard SSIS joins fail to maintain complex relational links, a hybrid architecture using custom binaries is highly effective. Binaries powered by JSONata can effortlessly bypass restrictive SSIS joins to normalize multiple distinct legacy tables into a single, clean cloud payload.
How do you ensure data security when running custom migration scripts for banking clients?
Security is maintained by establishing strict operational boundaries before writing any code. Migration teams can build self-contained executable binaries that the client runs locally on their own secure servers. This ensures the migration team never requires direct access to core banking systems or customer environments.
