
If you migrate Microsoft Dynamics 365 from on-premise to the cloud, conventional wisdom points you toward SQL Server Integration Services (SSIS) coupled with a connector like KingswaySoft. It is the industry standard.
It is also exactly why I recently watched a single bank migration drag on for six agonizing months.
The reality of these migrations hit hard when I spoke with Sudhir, a Product Manager on the client team. He was forced into the migration trenches because key people overseeing their Dynamics rollouts had left the company. They had exactly one person across their India/US team building SSIS packages. Due to a combination of workflow fatigue and the clunky SSIS developer experience, their migrations were stalling out.
To hit a tight weekend cutover for Bank A and rescue a hostile project for Bank B, we had to stop clicking through workflow interfaces.
Here is the exact technical and operational blueprint of how we replaced slow, UI-bound SSIS packages with self-contained, JSONata-powered binaries. I will walk you through the strict boundaries we set, how we solved the three fatal flaws of KingswaySoft, and the hybrid architecture we deployed when standard tools failed.
The Operational Boundaries and Technical Context
Before writing a single line of code, Sudhir and I established strict operational boundaries for the Bank A migration. When dealing with core banking data, security is absolute:
- Zero Access: My team would not have access to any client or bank systems.
- Self-Contained Execution: We would provide a standalone executable binary that the client could run locally on their secure servers.
- The Scope: The binary would ingest a series of flat CSV files and push them to the Cloud Dynamics 365 Web API.
- Data Sourcing: The CSV files would be generated entirely by the client team by exporting their on-premise Microsoft SQL Server tables (like contacts, account, person).
Sudhir shared a batch of test CSVs. We built the first version of the binary in a single day. It handled OAuth, parsed the CSVs, validated the schema, and executed the migration. We ran a few tests, recorded a Loom video to prove the data flow, and sent it to Sudhir. Once he was convinced, we moved to the QA environment.
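The binary itself never left the client's servers, but the authentication flow is standard. Here is a minimal sketch (Node.js 18+) of the OAuth client-credentials handshake against Azure AD for the Dynamics 365 Web API; the tenant, app, and org values are placeholders, not the client's real configuration:

```javascript
// Sketch: acquire an access token for the Dynamics 365 Web API via the
// Azure AD v2.0 client-credentials grant. All IDs below are placeholders.

// Builds the token request. Dynamics expects the org URL as the scope,
// suffixed with "/.default".
function buildTokenRequest(tenantId, clientId, clientSecret, orgUrl) {
  return {
    url: `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`,
    body: new URLSearchParams({
      grant_type: "client_credentials",
      client_id: clientId,
      client_secret: clientSecret,
      scope: `${orgUrl}/.default`,
    }),
  };
}

// Exchanges the request for a bearer token (requires Node 18+ global fetch).
async function getAccessToken(req) {
  const res = await fetch(req.url, { method: "POST", body: req.body });
  if (!res.ok) throw new Error(`Token request failed: ${res.status}`);
  return (await res.json()).access_token;
}
```

Every subsequent Web API call then carries `Authorization: Bearer <token>`, so the client team only had to register an app and hand the binary a config file.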
The Bottleneck: The 3 Pitfalls of KingswaySoft
When you use KingswaySoft, engineers manually build packages for each Dynamics table. For a massive CRM, this introduces three fatal flaws that crush iteration speed:
- Workflow Fatigue: You sit and write packages for each table individually. When you repeat the same manual clicking process hundreds of times, it is only a matter of time before you start making mistakes.
- Copy-Pasta Errors: Engineers frequently copy and paste mapping nodes to save time. KingswaySoft provides a UI to select fields, but humans still forget to update the target attributes, leading to bad data mappings.
- Opaque Logic in the UI: Workflow UIs look fantastic in a sales demo. But for anything more complicated than two if-else conditions, you cannot tell what a package is doing at a glance, and the effort required to trace the logic buried inside it is enormous.
To add insult to injury, SSIS and Azure Data Factory (ADF) have their own stability quirks. Client engineers frequently spent three to four hours debugging why their SSIS packages simply stopped running, only to realize they just had to restart the ADF to make it work again.
The Engineering Solution: How We Solved the 3 Pitfalls
To fix this, we worked directly with a new Business Analyst on their team. The BA talked to the bank to understand what SQL tables to migrate and how to transform them, documenting everything in a "Migration Checklist."
Here is how we bypassed the KingswaySoft bottlenecks entirely:
1. LLM Automation (Solving Fatigue)
We didn't click through UIs, and we didn't write our mapping files from scratch. We built a templatized script that utilized LLMs to read the BA's checklist and auto-generate the mapping structure. This got us 90% of the way there instantly, leaving only 10% for manual tweaking.
This automation was an absolute lifesaver for Bank A's marketing data. They had a massive amount of cdi tables generated by an external email campaign tool. Manually mapping dozens of cdi tables in SSIS would have taken weeks; our script generated the baseline mappings in seconds.
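The actual generation script is not reproduced here, but the deterministic half of it is simple to sketch. Rows parsed from the BA's checklist become baseline mapping entries, with a plain JSONata field reference as the default and any non-trivial transform left for the LLM draft or manual tweaking (all table and column names below are hypothetical):

```javascript
// Hypothetical sketch of the scaffolding around the LLM step: checklist rows
// become grouped mapping entries, one entry per target Dynamics table.
function checklistToMapping(rows) {
  const byTable = {};
  for (const row of rows) {
    const entry = (byTable[row.targetTable] ??= {
      entity: row.targetTable,
      attributes: {},
    });
    // Default to a bare JSONata field reference (a 1-to-1 copy); anything
    // more complex carries an explicit transform expression.
    entry.attributes[row.targetAttribute] = row.transform ?? row.sourceColumn;
  }
  return Object.values(byTable);
}
```

With dozens of near-identical cdi tables, this kind of generation beats hand-building an SSIS package per table by orders of magnitude.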
2. Shifting Left (Solving Copy-Pasta)
We moved error detection upstream to the source. That BA checklist became our literal source of truth: if the migration checklist was correct, the generated mapping was correct. We had numerous cases where a table attribute's name was wrong in the migration checklist itself, and we caught and fixed those before any code was ever executed.
3. JSONata and YAML (Solving Opaque Logic)
We implemented the entire mapping logic as a single, declarative YAML file powered by JSONata (a lightweight query and transformation language).
It is much easier to see how a specific attribute is transformed: just Cmd+F the attribute name in your text editor. These mappings also dictated row-level ETL logic, instructing the binary exactly when to skip a malformed CSV row, when to run a strict insert, and when to upsert based on existing Alternate Keys.
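The real mapping files are not reproduced here, but a minimal sketch of the shape, with hypothetical entity, key, and column names, looks roughly like this:

```yaml
# Illustrative mapping file; names are hypothetical, not the client's schema.
entity: contact
source: contacts.csv
mode: upsert                    # insert | upsert
alternateKey: bank_legacyid     # upsert matches on this Alternate Key
skipRow: EMAIL = ""             # JSONata predicate: skip malformed rows
attributes:
  firstname: FIRST_NAME                     # bare field reference
  lastname: LAST_NAME
  emailaddress1: $lowercase(EMAIL)          # JSONata function call
  fullname: FIRST_NAME & " " & LAST_NAME    # JSONata string concatenation
```

Everything a reviewer needs, including the skip predicate and the upsert key, sits in one greppable file instead of being spread across UI dialogs.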
Because of this setup, our iteration speed was a breath of fresh air. Columns would be missed, CSV data might be absent, or mapping names would be wrong. Debugging these issues took us 20 to 30 minutes, and the fixes were ready instantly. Compare that to KingswaySoft, where the debugging-packaging-running ceremony took around four hours.
Dynamics 365 Web API Survival Guide
Running a custom binary means you interact directly with the Web API. We learned several critical optimizations to prevent the API from choking during Bank A's 4 to 5 million record weekend cutover:
- Disabling Plugins: Dynamics plugins configured to run on every insert or update drastically increase API response times. Disable all non-essential plugins before migrating.
- Date Formatting: On-premise SQL datetimes must be strictly parsed and formatted before hitting the Web API.
- OData Foreign Keys: Specifying foreign key references in the Web API requires exact OData binding syntax ("primarycontactid@odata.bind": "/contacts(uuid)").
- Status Updates: Understanding exactly how statecodes and statuscodes need to be passed for various entities is critical to prevent API rejection.
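The date-formatting and OData-binding points above can be sketched together. This assumes the source datetimes are already UTC (a per-client assumption worth verifying); the contact attributes are illustrative, but the `@odata.bind` mechanics match what the Web API expects:

```javascript
// On-prem SQL datetimes ("2023-01-15 13:45:00") must become ISO 8601 before
// hitting the Web API. Assumes source values are UTC; adjust if they are not.
function toWebApiDate(sqlDatetime) {
  return sqlDatetime.trim().replace(" ", "Z").replace("Z", "T") + "Z";
}

// Shapes one CSV row into a Web API payload. Column names are illustrative.
function buildContactPayload(row, accountId) {
  return {
    firstname: row.FIRST_NAME,
    lastname: row.LAST_NAME,
    birthdate: toWebApiDate(row.DOB),
    // Foreign keys use OData bind syntax, never a raw GUID attribute.
    "parentcustomerid_account@odata.bind": `/accounts(${accountId})`,
  };
}
```

Note that `statecode`/`statuscode` often cannot ride along in the same create request for every entity; some records have to be created active and then deactivated in a follow-up update.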
Bank B: The Hybrid Architecture Tipping Point
Then came Bank B. This migration had been running for six months, the bank was pissed off, and Sudhir explicitly warned me the stakeholders were "hostile."
Bank B was using Dynamics CRM as a custom helpdesk (similar to Salesforce Service Cloud). They had a custom setup where tickets were split across four entirely different on-premise SQL tables based on "Ticket Type." They needed all of them merged into a single Cloud table.
The tipping point for Bank B wasn't just the delay. The Client engineer would spend all day working on the custom logic, feel confident the migration was done, and the customer would shoot back the very next hour saying the migrated data was shit due to broken relational links.
To fix this, we deployed a hybrid architecture. The SSIS packages handled half of the migration (the standard, flat entities), while our custom binaries took over the complex custom logic. We bypassed SSIS joins and used JSONata to effortlessly normalize their four distinct ticket tables into a single payload. We migrated the million records seamlessly, and not once did our scripts die randomly.
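The production mappings are not shown here, but the normalization idea is easy to illustrate: each of the four legacy ticket tables gets its own small JSONata expression that emits the same target shape, so the binary can merge the four streams into one cloud table. All field names below are hypothetical:

```jsonata
/* Illustrative JSONata for one of the four legacy ticket tables; the other
   three emit the same target shape from their own column names. */
{
  "title": Subject ? Subject : Description,
  "ticketnumber": LegacyTicketId,
  "prioritycode": Priority = "High" ? 1 : Priority = "Low" ? 3 : 2,
  "legacy_tickettype": TicketType
}
```

Because each mapping is a few declarative lines rather than an SSIS join graph, reviewing all four side by side takes minutes, and a broken relational link is a one-line fix.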
The Overlooked Requirement: Observability
One major pain point with their SSIS packages was the black box. The business side had no clue what the migration scripts were doing. Just to answer a simple question—"How many records errored out?"—required a whole one-hour ceremony for an engineer to pull the information.
We built observability directly into our tooling:
- Automated Error Reporting: A single command generated an error report automatically.
- MS Teams Webhooks: We added notifications that pinged the stakeholders with exact progress updates. They loved it. Observability is usually the most overlooked area of migrations, but it is crucial.
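The Teams notification itself is trivial, which is exactly the point. A sketch, assuming a standard Teams incoming-webhook URL (the URL and entity names are placeholders):

```javascript
// Sketch: progress pings to an MS Teams incoming webhook. Teams incoming
// webhooks accept a simple JSON payload with a "text" field.
function buildProgressMessage(entity, done, total, errors) {
  const pct = ((done / total) * 100).toFixed(1);
  return {
    text: `Migration update: **${entity}** ${done}/${total} rows (${pct}%), ${errors} errors`,
  };
}

// Posts the message (requires Node 18+ global fetch; webhookUrl is supplied
// by whoever configures the Teams channel).
async function notifyTeams(webhookUrl, message) {
  await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(message),
  });
}
```

A stakeholder watching the channel answers "how many records errored out?" themselves, with zero engineer ceremony.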
Overcoming "Microsoft Recommended" Skepticism
To be honest, until the actual Bank A production migration was complete, the Client team was on edge and unsure. SSIS was, after all, what they understood to be the Microsoft-recommended way to do migrations. The initial QA run was an eye-opener for them.
Use this framework to decide your own migration path:
| Metric / Constraint | Stick with SSIS & KingswaySoft | Build Custom JSONata Binaries |
| --- | --- | --- |
| Record Volume | < 1 Million | > 1 Million |
| Mapping Complexity | 1-to-1 table mapping | Normalizing multiple legacy tables into 1 cloud table |
| Data Transformation | Basic UI if/else logic | Complex JSONata, cross-referencing, or mathematical transforms |
| Team Skillset | Heavy C# / SQL / DBA skills | Node.js, Go, API architecture, JSON manipulation |
| Iteration Speed | 4-hour debug/deploy cycles | 20-minute debug/deploy cycles |
When you treat your data migration as a strict software engineering project rather than a UI workflow task, you stop crossing your fingers and hoping the package finishes. You build a predictable, fault-tolerant pipeline.