The previous eight posts in this series have made the strategic case for FSC Core from every relevant angle: architecture, data model, performance, AI readiness, compliance, and integration. If you have read this far, the question is likely no longer whether your institution should migrate, but how.
This post is the answer to that question. Not a step-by-step technical implementation guide — the specific configuration and migration mechanics are covered in depth by Salesforce’s own documentation and by experienced implementation partners. What this post provides is a strategic migration framework: the four phases every institution should work through, the key decisions that must be made in each phase, the specific watch-outs that derail migrations that start well, and the mindset reframe that separates institutions that come out of the migration stronger from those that merely complete it.
That mindset reframe is worth stating upfront: the migration from managed package to FSC Core is not just a technical migration. It is an opportunity. An opportunity to clean up data that has accumulated inaccuracies over years. An opportunity to rebuild relationship structures that were always compromises imposed by the managed package data model. An opportunity to retire custom code that was built to work around managed package limitations and that no longer needs to exist on the native platform. Institutions that approach the migration with this lens come out the other side with a better platform, better data, and a cleaner architecture than they had going in.
The goal of the migration is not to replicate what you had on the managed package in FSC Core. The goal is to build what you should have had all along.
The Four-Phase Framework at a Glance
The framework below provides a structured path from current-state assessment through full FSC Core operation. Duration estimates reflect typical engagements for mid-sized banking and lending institutions; complex implementations with large data volumes or heavily customised orgs will require more time at each phase.
| Phase | Name | Typical Duration | Key Deliverables |
| --- | --- | --- | --- |
| Phase 1 | Assess & Inventory | 4–8 weeks | Dependency map, customisation audit, integration register, data quality baseline, AppExchange review, executive alignment |
| Phase 2 | Design & Validate | 6–12 weeks | Target architecture, data migration mapping, parallel-run strategy, sandbox POC, DPE template configuration, PRG design, change management plan |
| Phase 3 | Migrate in Waves | 12–26 weeks | Wave-by-wave data migration starting with lowest-risk segments, integration cut-overs, user acceptance testing, compliance sign-off per wave |
| Phase 4 | Stabilise & Optimise | 4–8 weeks | Managed package decommission planning, DPE performance tuning, AI / Agentforce activation, reporting rebuild, full parallel-run retirement |
Phase 1: Assess and Inventory
No migration succeeds without a clear-eyed understanding of what you are migrating from. Phase 1 is about building that understanding systematically — not just cataloguing what exists, but understanding the dependencies, risks, and opportunities that the current managed package implementation represents.
The Dependency Map
The single most important deliverable of Phase 1 is a complete dependency map of your managed package implementation. This means identifying every custom field, custom object, custom Apex class, Flow, Process Builder automation, report, dashboard, and page layout that references FinServ__-prefixed managed package objects.
This is more extensive than most institutions initially expect. Managed package references accumulate silently over years of Salesforce development — in validation rules, in formula fields, in Email Templates, in Approval Processes, and in integration middleware configurations that were built long ago by team members who may no longer be with the institution. Tools like Salesforce’s Field Usage component, the Metadata API, and third-party dependency analysis tools (such as those available from AppExchange) can accelerate this discovery, but manual review of complex automations and integrations is typically still required.
The dependency map drives every subsequent decision in the migration: which customisations can be migrated directly to FSC Core equivalents, which need to be rebuilt to take advantage of the new data model, and which represent technical debt that the migration is an opportunity to retire.
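To make the discovery step concrete, the namespace scan can be sketched as a script that walks a local Metadata API export and indexes every `FinServ__` reference by the component file that contains it. This is a minimal illustration, not a replacement for dependency tooling; the file extensions and directory layout are assumptions about a typical retrieve.

```python
import re
from collections import defaultdict
from pathlib import Path

# Pattern for any identifier carrying the managed package namespace prefix.
NAMESPACE = re.compile(r"\bFinServ__\w+")

# Component types worth scanning in a typical Metadata API export
# (an assumption -- extend this set for your org's retrieved metadata types).
SCANNED_SUFFIXES = {".xml", ".cls", ".trigger", ".flow", ".report", ".page"}

def scan_metadata_export(root_dir):
    """Index every FinServ__ reference by the component file containing it."""
    root = Path(root_dir)
    references = defaultdict(set)
    for path in root.rglob("*"):
        if path.suffix not in SCANNED_SUFFIXES or not path.is_file():
            continue
        text = path.read_text(encoding="utf-8", errors="ignore")
        for token in NAMESPACE.findall(text):
            references[path.relative_to(root).as_posix()].add(token)
    return dict(references)
```

Running something like this against a full metadata retrieve produces the raw material for the dependency map; complex Flows, Approval Processes, and middleware configurations still need the manual review noted above.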
⚠ Migration Watch-Out: The most common Phase 1 failure is underestimating the depth of managed package dependencies in reporting and automation layers. Institutions that assess only their data objects and miss the hundreds of report fields, formula references, and Flow element references to namespace-prefixed fields discover this gap at the worst possible moment — during user acceptance testing in Phase 3. Recommendation: Run a full metadata extraction and do not begin Phase 2 until every managed package reference in every component type has been catalogued.
The Integration Register
Phase 1 must also produce a complete register of every external system that connects to your Salesforce org and reads from or writes to managed package objects. This includes core banking integrations, loan origination system connections, data warehouse feeds, credit bureau pipes, digital banking platform connectors, and any custom API clients built by internal teams.
For each integration, the register should capture: the systems involved, the specific managed package objects and fields accessed, the direction of data flow (read, write, or bidirectional), the frequency of updates, the team or vendor responsible for the integration, and the estimated effort to update the integration for FSC Core standard objects.
The integration register is the primary input to the migration sequencing decisions in Phase 2. Integrations that write to financial account objects are typically the most complex to migrate and must be sequenced carefully relative to data migration waves.
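Kept as structured data rather than a slide, the register also makes sequencing questions trivial to answer, such as which integrations write to financial account objects. The sketch below is illustrative; the field names are not a Salesforce-defined schema.

```python
from dataclasses import dataclass

@dataclass
class IntegrationEntry:
    """One row of the integration register (illustrative field names)."""
    name: str
    systems: list            # e.g. ["core banking", "Salesforce"]
    objects_accessed: list   # managed package objects/fields touched
    direction: str           # "read", "write", or "bidirectional"
    frequency: str           # e.g. "nightly batch", "real-time"
    owner: str               # responsible team or vendor
    effort_days: int         # estimated rework effort for FSC Core

def writes_to_financial_accounts(register):
    """Flag integrations that must be sequenced with the financial
    account migration waves (they write, not merely read)."""
    return [
        entry for entry in register
        if entry.direction in ("write", "bidirectional")
        and any("FinancialAccount" in obj for obj in entry.objects_accessed)
    ]
```

A filter like `writes_to_financial_accounts` is exactly the question Phase 2 sequencing needs answered on demand, rather than rediscovered from documents.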
The Data Quality Baseline
Phase 1 is also the right moment to establish an honest baseline of your current data quality — because the migration is an opportunity to improve it, but only if you know where the problems are before you start.
Specifically for banking and lending institutions, this means assessing: the completeness and accuracy of household structures (which will be migrated to Party Relationship Groups), the accuracy of financial account ownership records (which will become Financial Account Party records), the integrity of balance data (which will move from single overwrite fields to Financial Account Balance child records), and the consistency of relationship role data that may have been stored as free text or in custom fields.
Institutions that invest time in data quality assessment during Phase 1 are able to build data remediation into the migration plan itself — arriving at FSC Core with cleaner, more accurate relationship data than they had on the managed package. Those that skip this step migrate their data quality problems alongside everything else.
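A baseline can start as a simple completeness check over a record extract. The sketch below is a hypothetical example: the field names and the treatment of placeholder values such as "Unknown" are assumptions to adapt to your own extracts.

```python
def completeness(records, required_fields):
    """Per-field completeness percentage across a record extract:
    the share of records where the field is present and non-empty.
    Treating "Unknown" as missing is an assumption for this sketch."""
    total = len(records)
    report = {}
    for field_name in required_fields:
        filled = sum(
            1 for record in records
            if record.get(field_name) not in (None, "", "Unknown")
        )
        report[field_name] = round(100.0 * filled / total, 1) if total else 0.0
    return report
```

Run against extracts of household membership, account ownership, and relationship role data, a report like this turns "our data is probably fine" into a number that Phase 2 remediation planning can work from.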
Executive Alignment and Governance
The final critical output of Phase 1 is executive alignment on the migration scope, timeline, and success criteria. FSC Core migration is a significant programme that touches every team that uses Salesforce — relationship managers, lending officers, operations staff, compliance teams, IT, and integration teams. Without active executive sponsorship and clear governance structures, migrations stall when competing priorities arise.
The Phase 1 assessment provides the factual foundation for that executive conversation: this is the scope of what we are migrating, this is the complexity we have found, this is the timeline and resource requirement, and this is the strategic value we will gain on the other side.
Phase 2: Design and Validate
With a clear picture of the current state, Phase 2 is about designing the target architecture, validating the critical assumptions of that design, and building the confidence — through proof-of-concept work in a sandbox environment — that the migration plan will work before any production data moves.
Target Architecture Design
The target architecture design answers three interconnected questions: how will the FSC Core data model represent the relationships and financial data that currently live in the managed package, how will integrations be rebuilt to write to standard objects, and what new capabilities will be activated as part of the migration rather than deferred to post-migration optimisation.
The data model design is the most consequential decision in Phase 2. This is where the migration team designs the Party Relationship Group structure that will replace household record types, the Financial Account Party role taxonomy that will replace the two-field ownership model, the Financial Account Balance record strategy that will replace overwritten balance fields, and the DPE template configuration that will replace rollup-by-lookup.
Getting the data model design right requires domain expertise in both FSC Core’s standard object model and the institution’s specific relationship banking and lending requirements. This is typically where experienced FSC Core implementation partners add their greatest value — translating the institution’s business requirements into FSC Core configurations that avoid common design mistakes that are expensive to correct post-migration.
✨ Migration as Data Model Improvement Opportunity

The migration is the right moment to correct data model decisions that were always compromises on the managed package. Three specific opportunities that institutions should explicitly plan for in Phase 2:

1. Household restructuring: Review existing household configurations against the Party Relationship Group design and identify households that were simplified to work within record type constraints. The PRG migration is an opportunity to build the relationship structure that actually reflects the client relationship, not the one that fit the managed package model.
2. Ownership role enrichment: The move from two-field ownership to Financial Account Party records is an opportunity to collect and structure beneficial ownership, guarantor, and signatory data that was previously unstructured or stored in notes. Plan data collection campaigns alongside the technical migration for high-value accounts.
3. Custom code retirement: Every custom Apex class, trigger, or Flow automation built to work around managed package limitations is a candidate for retirement on FSC Core. Phase 2 should include a deliberate review of which custom code can be replaced by native FSC Core capabilities — reducing technical debt and maintenance overhead in the target architecture.
The Parallel-Run Strategy
For institutions that cannot afford any disruption to Salesforce availability during migration — which is most banking and lending institutions — the parallel-run strategy is a critical design decision. A parallel run means operating the managed package and FSC Core simultaneously for a defined period, with data flowing to both and users operating in both environments while the migration team validates the integrity of the target state before cutting over.
Salesforce supports a coexistence mode in which institutions can use FSC Core features alongside the managed package, which provides a foundation for parallel operation. The parallel-run strategy defines: which data domains run in parallel first, what validation criteria must be met before cut-over, how long parallel operation will run for each wave, and what the rollback procedure is if a wave fails validation.
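One concrete validation criterion for a parallel run is balance reconciliation between the two environments. The sketch below is a minimal, hypothetical example: it assumes per-account balance extracts from each side keyed by a shared account identifier, plus a tolerance for rounding differences.

```python
def reconcile_balances(legacy, core, tolerance=0.01):
    """Compare per-account balances between a managed package extract and
    an FSC Core extract; return the accounts that fail validation.
    Inputs are {account_id: balance} dicts (an assumption of this sketch)."""
    mismatches = {}
    for account_id, legacy_balance in legacy.items():
        core_balance = core.get(account_id)
        if core_balance is None:
            mismatches[account_id] = ("missing in Core", legacy_balance)
        elif abs(core_balance - legacy_balance) > tolerance:
            mismatches[account_id] = (legacy_balance, core_balance)
    for account_id in core.keys() - legacy.keys():
        mismatches[account_id] = ("missing in legacy", core[account_id])
    return mismatches
```

An empty result from checks like this, sustained across the agreed parallel-run window, is the kind of objective cut-over gate the strategy should define in advance.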
Sandbox Proof of Concept
Before any migration plan is finalised and resourced, the critical assumptions of the target architecture design must be validated in a sandbox environment. This means migrating a representative sample of the institution’s data to FSC Core standard objects, configuring the DPE templates, building the PRG structures for test households, and running the integration connections against test data.
The sandbox POC typically surfaces issues that were not visible in the dependency map and data quality assessments — edge cases in the data model mapping, DPE configuration requirements that were not anticipated, integration field mapping gaps, and user experience differences that require page layout and workflow adjustments. Discovering these in the sandbox costs a sprint. Discovering them in production costs far more.
Phase 3: Migrate in Waves
Phase 3 is where the migration actually happens — and the wave-based approach is what makes it manageable rather than overwhelming. Rather than attempting to migrate all data, all users, and all integrations simultaneously, a wave-based migration moves specific data domains, user populations, and integration connections in sequenced stages, with validation and stabilisation built into the transition between each wave.
Wave Sequencing Principles
The sequencing of migration waves should be driven by a combination of risk management and strategic value. Several principles guide effective wave sequencing:
- Start with lower-risk data domains. Begin with data that has simpler ownership structures, fewer integration dependencies, and lower compliance sensitivity. Consumer deposit accounts with straightforward ownership are typically a better first wave than complex commercial lending relationships with multiple guarantor structures.
- Migrate integrations in coordination with their data domains. An integration that writes financial account data to Salesforce should be cut over in the same wave as the financial account migration for the corresponding accounts, not independently. Misaligned integration cut-overs create data consistency problems that are difficult to diagnose and correct.
- Use early waves to refine the process. The first migration wave will surface procedural issues that are not visible in planning — how long data validation actually takes, which user experience differences require additional training, and which edge cases in the data model mapping need to be addressed. Build review-and-refine sessions between waves to incorporate these learnings.
- Prioritise high-value relationships for middle waves. After the process is refined through early waves, apply it to the institution’s most strategically important relationships — high-value commercial clients, large household groups, complex multi-entity relationships — when the team’s confidence and execution capability are at their highest.
- Keep compliance-sensitive data for coordinated cut-over. Some data domains — particularly those subject to examination or audit — benefit from a coordinated cut-over with compliance team sign-off rather than inclusion in an earlier wave.
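These principles can be made explicit with a simple composite risk score per data domain, used to order the early waves. The factors and weights below are illustrative assumptions, not a prescribed model; the value is in making the sequencing rationale auditable rather than intuitive.

```python
def wave_order(domains, weights=(0.5, 0.3, 0.2)):
    """Order data domains lowest composite risk first.

    Each domain is (name, ownership_complexity, integration_count,
    compliance_sensitivity), each factor scored 1 (low) to 5 (high).
    The default weights are an illustrative starting point, not a standard.
    """
    w_own, w_int, w_comp = weights

    def risk(domain):
        _, ownership, integrations, compliance = domain
        return w_own * ownership + w_int * integrations + w_comp * compliance

    return [name for name, *_ in sorted(domains, key=risk)]
```

Scoring consumer deposits against complex commercial lending with a model like this will typically reproduce the first principle above: simple ownership, few integrations, and low compliance sensitivity go first.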
⚠ Migration Watch-Out: The most common Phase 3 failure mode is insufficient post-wave validation before proceeding to the next wave. Migrated data that passes initial validation checks can still have integrity issues that only surface when users interact with it — relationship map views that do not render correctly, DPE rollups that produce unexpected aggregations, or Financial Account Party records that were created with incorrect role assignments. Recommendation: Build a minimum two-week stabilisation period after each wave, with active monitoring by relationship managers and the Salesforce team, before the next wave begins. The cost of this patience is far less than the cost of discovering data integrity issues in wave three that trace back to uncorrected problems in wave one.
Household-to-PRG Migration: The Most Consequential Wave
Within the wave-based migration, the transition from household Account records to Party Relationship Groups deserves particular attention because it touches the most visible user experience in the platform — the relationship view that bankers and relationship managers use every day.
The migration of household structures to PRGs is not a purely technical operation. It requires decisions about how existing households map to PRG structures, how to handle households that were simplified to fit the managed package’s record type constraints, and how to communicate to users the differences between the household view they are used to and the PRG-based view they will work with going forward.
User training and change management for the PRG transition is one of the highest-leverage investments in Phase 3. Relationship managers who understand why the PRG model is more powerful than the household record type — and who have been shown how to use it through hands-on training with their own client data — adopt it faster and with less resistance than those who encounter it as an unexplained change to their daily workflow.
Phase 4: Stabilise and Optimise
Phase 4 begins after all migration waves are complete and all users are operating in FSC Core. Its primary purpose is to ensure stability, retire the managed package, and activate the capabilities that the migration was designed to unlock.
Managed Package Decommission Planning
One of the most important — and most commonly deferred — activities of Phase 4 is planning the actual decommission of the managed package from the org. Many institutions complete the migration to FSC Core but leave the managed package installed ‘just in case,’ creating an environment where both architectures exist simultaneously and users can inadvertently continue to interact with managed package objects.
A clean decommission requires: confirming that all managed package field references have been removed from active automations, reports, and integrations; archiving or retiring any custom objects that were built as managed package companions; and communicating to users that the managed package objects are no longer in use. Salesforce provides guidance on managed package retirement procedures that should be followed as part of this workstream.
Activating the Capabilities You Migrated For
Phase 4 is also the moment to begin activating the capabilities that the migration was specifically designed to unlock and that were referenced throughout this series:
- DPE performance tuning: Review DPE template execution schedules and event-trigger configurations, optimise for the institution’s specific data volumes and refresh frequency requirements, and activate on-demand rollup capabilities as they become available.
- Agentforce activation: With FSC Core’s standard objects now fully in place, begin deploying Agentforce agents for the use cases identified in Post 6 — starting with the pre-meeting brief agent, which has the most immediate and visible impact on relationship manager productivity.
- Data Cloud connection: Establish the Data Cloud integration that unifies Salesforce FSC Core data with core banking transaction history and other external data sources, building the unified customer profile that powers more intelligent AI applications.
- Reporting modernisation: Rebuild key reports and dashboards to use FSC Core standard objects throughout, taking advantage of the cleaner field names, the Financial Account Balance history data, and the PRG-based relationship aggregations that were not available on the managed package.
- AppExchange expansion: With the namespace constraints removed, evaluate fintech AppExchange partners whose solutions were previously incompatible with or required significant custom work to connect to the managed package architecture.
✨ The Post-Migration Dividend

Institutions that reach Phase 4 having followed the four-phase framework typically report several unexpected benefits beyond the capabilities they planned for:

- Reduced Salesforce team maintenance burden: Custom code retired during Phase 2 does not need to be maintained, tested, or upgraded. Institutions with large managed package customisation backlogs often report significant reductions in the time their Salesforce teams spend on maintenance versus new capability development.
- Improved data confidence: The data quality work embedded in the migration process — particularly the household restructuring and ownership role enrichment — produces a relationship database that relationship managers and compliance teams trust more than the one they were using before.
- Faster onboarding of new Salesforce talent: Developers and administrators who join the team after the migration work in a standard Salesforce environment without managed package specialisation requirements. Recruiting and onboarding timelines shorten.
Addressing the Most Common Migration Objections
“We have too much custom code on the managed package.”
This is the most common objection, and it is worth engaging with directly. Heavy managed package customisation is a reason to plan the migration carefully and invest in Phase 1 discovery — but it is not a reason to defer. The managed package customisation you have today was built to work around limitations that FSC Core removes. The longer you maintain that custom code, the higher its cumulative maintenance cost. The migration is the opportunity to stop paying that tax.
“We are mid-cycle on a major integration project.”
If the integration project is building on managed package objects, the most efficient path is often to scope the integration for FSC Core from the start. Integration teams that build against FSC Core standard objects during an active migration avoid the double work of building for the managed package and then rebuilding for Core.
“We don’t have the internal capacity to run a migration.”
FSC Core migration is a programme that almost always benefits from experienced external implementation partner involvement, particularly in Phase 1 dependency analysis, Phase 2 target architecture design, and Phase 3 wave execution. The internal capacity required is executive sponsorship, business domain expertise, and change management leadership — the deep technical execution can be augmented with partner resources. The question is not whether to migrate; it is whether to engage the right partner to help you do it well.
“Salesforce hasn’t announced a deprecation date.”
This is true, and it is the wrong frame. The managed package will not be deprecated tomorrow. The question is whether your institution will be positioned to access AI capabilities, open banking integrations, and the performance architecture improvements that Salesforce is building exclusively for FSC Core — and whether you will build that position proactively or reactively. The institutions that wait for a deprecation announcement will be migrating under time pressure, competing for partner capacity with everyone else who waited, and migrating a more complex environment than the one they have today.
The best time to migrate was two years ago. The second-best time is now — before the environment becomes more complex, the innovation gap widens further, and the urgency of a deprecation announcement removes your ability to plan on your own terms.
PREVIOUS IN THE SERIES
Post 8: The Integration Advantage — Open Banking, Fintechs, and the Core Platform
NEXT IN THE SERIES
Post 10: The Institutions That Move First Will Win — A Vision for FSC Core’s Future
About This Series
“Building the Future of Financial Services on Salesforce’s Native Platform” is a 10-part thought leadership series exploring why FSC Core represents the strategic imperative for financial services and banking institutions on Salesforce. Posts publish weekly.
