Data transformation case studies

Navigating Compliance and Talent Management Through Effective HCM Data Governance
For a 30,000-employee industrials company growing rapidly through acquisitions, maintaining accurate HR data was a constant challenge. Each new acquisition brought an influx of disparate data, inconsistent structures, and compliance risks, particularly when meeting federal reporting requirements like Affirmative Action Plans (AAP), Equal Employment Opportunity (EEO), and Fair Labor Standards Act (FLSA) regulations. These gaps in HCM data fundamentals not only strained resources but also delayed strategic decision-making about talent across the organization.
When the company engaged us, their HR team faced a tangled web of challenges. Data preparation for newly acquired companies took up to six weeks, consuming hours of manual effort. Inconsistent job classifications, mismatched AAP codes, and errors in hierarchical reporting were just a few of the recurring issues. Over the course of our partnership, we implemented a series of targeted interventions to clean up their data, automate quality checks, and establish sustainable HCM data governance processes that transformed their HR function.
Diagnosing and Resolving Systemic HR Data Errors
The first step in addressing the company’s HR data issues was to thoroughly diagnose the root causes of their errors. Our diagnostics uncovered a range of problems, from blank or missing fields to misaligned reporting structures. For instance, many employee records were missing critical information, such as job or position data, making it nearly impossible to maintain compliance with federal reporting requirements. These gaps highlighted the need for better HCM data best practices around record-keeping and validation.
We also encountered complex issues related to hierarchical relationships within the data. In some cases, managers and subordinates were incorrectly assigned, creating circular reporting structures that led to confusion and undermined organizational clarity. Other inconsistencies included discrepancies between positions and their corresponding job records, as well as mismatches between organizational units assigned to employees and their roles.
To resolve these challenges, we implemented:
- Rigorous data validation processes that flagged errors and inconsistencies.
- Systematic cleanup of reporting structures, ensuring clarity and alignment.
- Enhanced reporting templates that captured the necessary data fields to meet compliance requirements.
These efforts not only reduced immediate errors but also laid the groundwork for more robust HCM data governance practices moving forward.
Establishing Standardized Validation and Configuration
After addressing the initial wave of data errors, we shifted our focus to preventing future inconsistencies. Standardizing HR data configurations across the parent company and its acquired entities was essential. Disparate systems and manual processes in the acquired companies frequently led to issues such as mismatches between Affirmative Action Plan (AAP) codes and Equal Employment Opportunity (EEO) categories. Similarly, pay scales and personnel subgroups were often misaligned, leading to significant compliance risks and reporting inaccuracies.
We developed standardized templates and automated validation rules that ensured alignment across all entities. For example, we predefined valid combinations of AAP and EEO codes, allowing the system to flag invalid entries before they propagated. We also implemented rules to align job classifications with pay scales, addressing common issues where managerial roles were not correctly tied to corresponding compensation structures.
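To make the idea concrete, here is a minimal sketch of what point-of-entry validation like this can look like. The field names, the valid AAP/EEO pairs, and the pay-scale rule are illustrative assumptions, not the client's actual configuration.

```python
# Illustrative sketch of point-of-entry HR record validation.
# Valid code pairs and field names are hypothetical examples.
VALID_AAP_EEO_PAIRS = {
    ("1A", "1.1"),  # Executive / Senior-Level Officials and Managers
    ("1B", "1.2"),  # First / Mid-Level Officials and Managers
    ("2", "2"),     # Professionals
}

def validate_employee_record(record: dict) -> list[str]:
    """Return a list of validation errors for one employee record."""
    errors = []
    aap, eeo = record.get("aap_code"), record.get("eeo_category")
    if not aap or not eeo:
        errors.append("Missing AAP code or EEO category")
    elif (aap, eeo) not in VALID_AAP_EEO_PAIRS:
        errors.append(f"Invalid AAP/EEO combination: ({aap}, {eeo})")
    # Managerial jobs must map to a managerial pay scale group.
    if record.get("is_manager") and record.get("pay_scale_group") != "MGMT":
        errors.append("Managerial job not tied to managerial pay scale")
    return errors

# Example: flag bad rows before they propagate into downstream reporting.
rows = [
    {"id": 101, "aap_code": "1A", "eeo_category": "1.1",
     "is_manager": True, "pay_scale_group": "MGMT"},
    {"id": 102, "aap_code": "1A", "eeo_category": "2",
     "is_manager": False, "pay_scale_group": "STD"},
]
for row in rows:
    for err in validate_employee_record(row):
        print(f"Employee {row['id']}: {err}")
```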
By creating and deploying these templates and rules, we gave the HR team tools to validate data at the point of entry. This significantly reduced the volume of errors and manual corrections, enabling the team to focus on more strategic tasks. These efforts reinforced the importance of HCM data best practices in ensuring consistent and reliable data.
Automating Data Audits and Reporting
Data quality is not a one-time fix; it requires continuous monitoring. Recognizing this, we partnered with the HR team to automate their data quality audits. Previously, audits were conducted sporadically and often relied on labor-intensive processes. We introduced automated tools created in Applaud to generate monthly diagnostics, a cornerstone of effective HCM data governance.
These automated reports provided a clear picture of data health across the organization. For example, they identified discrepancies where employees’ organizational units did not align with their assigned positions, or where Fair Labor Standards Act (FLSA) classifications were mismatched. By detecting these errors in near real time, the HR team could address them proactively.
We also tailored the reports to meet the needs of different stakeholders, offering division-level breakdowns as well as consolidated organizational summaries. This allowed the company to prioritize corrections efficiently and maintain high levels of data integrity while adhering to HCM data best practices.
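As an illustration of the kind of check these diagnostics performed, the sketch below flags the two mismatch classes described above and breaks them down by division. Column names and data are hypothetical; the actual engagement used reports built in Applaud rather than a script like this.

```python
# Illustrative monthly data-health diagnostic over a hypothetical HR extract.
import pandas as pd

hr = pd.DataFrame({
    "employee_id": [1, 2, 3],
    "division":    ["East", "East", "West"],
    "org_unit":    ["OU10", "OU20", "OU30"],
    "position_org_unit": ["OU10", "OU25", "OU30"],      # unit tied to the position
    "flsa_status": ["Exempt", "Nonexempt", "Exempt"],
    "job_flsa_default": ["Exempt", "Exempt", "Exempt"],  # expected from job record
})

# Flag the two error classes described above.
hr["org_unit_mismatch"] = hr["org_unit"] != hr["position_org_unit"]
hr["flsa_mismatch"] = hr["flsa_status"] != hr["job_flsa_default"]

# Division-level breakdown for stakeholders, plus a consolidated summary.
by_division = hr.groupby("division")[["org_unit_mismatch", "flsa_mismatch"]].sum()
print(by_division)
print("Total records with any issue:",
      (hr["org_unit_mismatch"] | hr["flsa_mismatch"]).sum())
```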
Building a Scalable Data Governance Framework
To ensure long-term success, we worked with the company to establish a robust HCM data governance framework. This involved defining clear ownership of HR data fields and creating processes for maintaining data accuracy. We documented standard operating procedures for data validation, error escalation, and corrective actions, ensuring that the HR team could sustain high data quality even as the organization continued to grow.
An integral part of this effort was updating training materials for HR administrators. These materials covered everything from field definitions and job classification standards to the proper configuration of manager assignments. By empowering the HR team with the knowledge and tools they needed, we helped build a culture of accountability and precision—essential for maintaining HCM data fundamentals across a rapidly expanding organization.
Optimizing Processes for Acquired Companies
One of the most significant outcomes of our engagement was the transformation of the data onboarding process for newly acquired companies. Previously, it could take up to six weeks to prepare and integrate HR data from a new acquisition—a process fraught with errors and inefficiencies. By adding field derivations, automated validity checks, and simplified handoffs, we reduced this timeline to just one week of part-time effort.
This optimization not only accelerated integrations but also ensured that acquired companies were aligned with the parent company’s standards from day one. Pre-configured templates and automated validation reduced the burden on both the parent company and the acquired entities, creating a seamless transition. These efforts showcased the practical application of HCM data best practices in real-world scenarios.
The Business Impact: From Chaos to Confidence
By the end of our engagement, the company’s HR function had undergone a dramatic transformation. Key outcomes included:
- Regulatory Compliance: The company achieved consistent compliance with federal AAP, EEO, and FLSA requirements, significantly reducing regulatory risks.
- Enhanced Data Quality: Automated diagnostics and continuous monitoring ensured that errors were caught and corrected before they could impact reporting or decision-making.
- Operational Efficiency: Manual data preparation efforts were reduced by 80%, freeing up valuable time for the HR team to focus on strategic initiatives.
- Scalable Processes: A robust HCM data governance framework positioned the company to manage HR data effectively across future acquisitions.
This transformation highlights the power of mastering HCM data fundamentals. Through strategic interventions in HCM data governance, quality, and migration, we helped this company turn its HR data challenges into strengths, providing a solid foundation for compliance and talent management. As a result, they are now better equipped to understand and leverage their workforce, ensuring their continued growth and success.
By emphasizing HCM data best practices, robust HCM data governance, and the essentials of HCM data fundamentals, organizations can build sustainable frameworks that not only meet compliance standards but also enable strategic workforce management. Let this story inspire your journey to smarter, more reliable HR data.

Equipping a Workday Implementation with Data Governance
Background
Our client, one of the largest US ports, was preparing for a vital multi-year ERP transformation journey with Workday at its core. Acknowledging the pivotal role of data and recognizing gaps in their existing data management capabilities, the CIO sought to ground this transformation initiative with a formal data governance function – imperative for aligning business objectives with data requirements and enabling the adoption of Workday through data fidelity and trust.
Choosing Definian
Definian stood out with a unique value proposition that demonstrated expertise not only in understanding the intricate data requirements of a Workday implementation but also in presenting a strategic vision for data enablement that would drive implementation efforts to success. This vision included the foundational data management disciplines of data governance and data quality, needed not only to support the ERP go-live but also to maintain data integrity well beyond it. Coupled with a proven track record of navigating organizations through complex data transformation initiatives, this instilled confidence in Port leadership, affirming our suitability as the ideal partner.
The Process
Definian delivered a multi-phased tactical plan to develop and execute a strategy for formalizing a data governance discipline at the Port. The approach tailored and launched a data governance framework aligned with the business operating model. Recognizing that data governance was a novel concept at the Port, an educational component was incorporated to facilitate change adoption and operationalize data governance concepts into an ongoing data management discipline. This was accomplished by working through use cases with the council centered on four targeted end-to-end business processes. Following the launch, Definian transitioned data governance facilitation to the Port’s appointed data governance leader and provided guidance to sustain the momentum required to operationalize the council.
The objectives of each phase were:
Phase 1: Strategic Assessment
- Obtain a current-state understanding through stakeholder engagement
- Envision a desired state for data governance tailored to the Port’s operating model
- Translate gaps into prioritized capability-building initiatives
- Define a strategic roadmap of initiatives to implement data governance
Phase 2: Data Governance Launch
- Charter the launch of an Interim Data Governance Council
- Educate the council on data governance and data quality framework concepts
- Develop data governance rigor by working through targeted use cases
Phase 3: Operationalize Data Governance
- Sustain data governance momentum from the initial launch
- Transition facilitation to the Port’s data governance leader
- Compile formal data governance standards and policies for critical information
Results
By delivering on this initiative in less than six months, Definian produced promising outcomes as the Port approached its Workday implementation.
- Provided a shared understanding of the Port’s current data management capabilities and desired vision for data governance
- Launched a centralized data governance operating model, comprising data owners and subject matter experts, that is business-driven and IT-supported
- Established a cross-functional Data Governance Council promoting collaboration in formalizing data requirements, standards, and policies across the Port’s applications, processes, and reports
- Put data governance and data quality concepts into practice through the execution of targeted use cases that generated data governance artifacts
- Achieved consensus on an initial inventory of critical data elements and their supporting definitions, standards, and stewardship accountabilities, providing clarity and eliminating redundant efforts from future Workday data migration and requirements tasks
In conclusion, through strategic implementation of data governance, our client not only embraced the intrinsic value of their data but also secured a resilient foundation for their Workday transformation, ensuring a more rapid and cohesive transition tailored to their long-term goals.

Developing a Data Strategy and Data Observability Capabilities to Address Data Risk Factors for a Leading Airline
Background
After resolving an unauthorized data incident that was reported on the nightly news, our Client, the CISO of an airline, needed to prevent future breaches by getting their arms around data risk across the enterprise.
To identify PII and confidential data distributed across many sources in a dynamic data landscape, the airline needed to understand where those critical and sensitive data assets reside and develop capabilities that identify and monitor data access gaps as they arise.
Four objectives needed to be met to complete this initiative.
- Catalog sensitive data across structured and unstructured data throughout the organization.
- Identify and resolve any gaps within the data access controls within Azure.
- Create a process that constantly monitors data risk and pushes out alerts when a critical gap is detected.
- Create an observability portal that shows an at-a-glance data risk score, the current gaps, the severity of each gap, and a mechanism for correcting each gap.
Why Definian
Our Client chose Definian as their partner for this sensitive work because of our expertise in data strategy for major financial institutions and our deep technical knowledge in building underlying connectors for leading data platforms.
The Work
To meet the first objective, BigID was deployed to scan, discover, and catalog the sensitive data. BigID is the leading product in automated data discovery and classification and was the standout choice for uncovering and monitoring unstructured and structured data throughout the organization.
With the sensitive data catalog in hand, we reviewed the access controls within Azure and immediately began to address any data sensitivity access gaps. As we resolved the gaps in the technical controls, we worked with our Client to address the data governance processes that led to the gaps.
With the current gaps resolved, we developed a process that would provide a proactive backstop against future issues. There are two components to this backstop. For the first component, Definian integrated Azure data access controls and BigID's ongoing scan results. This integration connects and analyzes the metadata from each solution and pushes out real-time alerts when a critical gap is detected. The second component utilizes the integration to power a risk observability portal.
The risk observability portal enabled an at-a-glance assessment of the current data risk levels across the organization. The observability portal has two main features. The first is a risk score calculated by analyzing the various types and numbers of gaps. This score instantly communicates current risk levels and trends. The second feature displays the details behind the gaps and a mechanism to correct them within that single screen.
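A minimal sketch of the scoring idea follows. The severity weights, severity levels, and normalization cap are assumptions for illustration, not the portal's actual formula.

```python
# Illustrative risk score: weight each open gap by severity and map the
# weighted sum onto a 0-100 scale. Weights and the cap are assumed values.
SEVERITY_WEIGHTS = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def risk_score(open_gaps: list[dict], cap: int = 200) -> int:
    """Return a 0-100 risk score from the weighted sum of open gaps."""
    weighted = sum(SEVERITY_WEIGHTS[g["severity"]] for g in open_gaps)
    return min(100, round(100 * weighted / cap))

gaps = [
    {"asset": "hr_share/payroll.xlsx", "severity": "critical"},
    {"asset": "crm.contacts", "severity": "medium"},
]
print(risk_score(gaps))  # -> 6; the trend matters more than the absolute number
```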
Results
With this additional observability capability, our Client is confident in always knowing their overall risk exposure. They now have a comprehensive process that tracks the unstructured and structured data across their dynamic landscape and a mechanism for quickly addressing gaps.

Data Governance and Data Quality Program for Retail Holding Company
Background
A rapidly growing Dubai-based retail-oriented holding company faced increased data regulations as it expanded its business interests in the Middle East region. It needed to establish a more robust and proactive Data Governance, Quality, and Privacy program to accelerate compliance with these increased data regulations.
The main objective of this initiative was to establish a Data program across 15 Operating companies with business interests in over 20 countries.
The Challenge
Beyond the organizational complexity of coordinating across the 15 Operating companies, our Client faced a rapidly evolving regulatory landscape around consumer data and privacy. Each operating company had a low data quality maturity that limited its ability to control and prevent known compliance issues. Lastly, the organization lacked the in-house capability to set up and operate a complex program and build support for it.
Why Definian
Our Client selected Definian as a partner on this initiative because of our experience building and facilitating complex data governance and quality programs. That expertise provided the confidence that Definian could build a flexible, democratized program to keep the organization ahead of its regulatory requirements.
Primary Deliverable
As we engaged with our Client, we assessed current maturity levels and cataloged the data governance, privacy, and security requirements. Using the assessment results, we built a hub-and-spoke Data Governance model supporting all business units. This model enabled each business unit to operate independently while meeting common standards established at each hub.
As part of the data governance program's setup, we guided our Client through a data governance and quality platform selection process. Together, we evaluated several alternatives to ensure that the platform aligned with the Client's evolving needs and future vision. Using the platform, Definian connected, scanned, and collected metadata for 109 systems. Through those scans and data governance operations, 600 Critical Data Elements were identified.
With the governance operations and platform operational, the project's next step was to address the significant data quality issues. With limited Client resources, our Client turned to Definian to set up and operate a Data Quality as a Service program. Combining automated tools and Definian's consulting team, we quickly started the data quality improvement process while collaborating on formal organization-specific requirements. Additionally, we rolled out DQ dashboards tailored for each operating company.
Next steps
While much work remains, the organization has significantly increased its data maturity. It's equipped to more readily do business in more countries and adapt to changing regulations. Our Client now also has the data confidence to take advantage of more advanced ML/AI capabilities, which will accelerate their organizational growth.

Workday Human Capital Management Data Migration for a $3.5B Healthcare System
A $3.5B healthcare provider moved their HCM and payroll operations from 15 disparate legacy applications to a single Workday tenant. Premier facilitated the data migration for this initiative.
Project Scope
- 15 Legacy Applications
- 50 Conversion Objects
- 20K Employees
- 18 Months
Key Data Objects
Core HCM | Compensation | Payroll | Absences | Benefits | Talent | Recruiting | Learning
Primary Data Sources
Infinium | Sage HRMS | Meditech | JobVite | Healthcare Position | Manager | MyCompass | Healthstream | Relias | Lippincott | Elsevier | ADP | Fidelity | Active Directory | Reliance Matrix
Expectations
- On-Time and Budget
- Drive Data Quality
- Minimize Client Resource Effort
- No Impact on the Open Enrollment Process
Risk Factors
- Client Resource Constraints: Client resourcing was limited, and often a single person was responsible for many tasks spread across multiple workstreams. Premier needed to enable our client to spend time focusing on items that only they could answer and to clear other data obstacles from their path.
- Duplication Across Legacy Systems: Each legacy system managed worker data with a worker-based approach, so an employee holding three positions appeared as three separate workers with different employee IDs. Additionally, the legacy infrastructure prevented the business from gaining insight into harmonization issues across the various data sources.
- Inconsistent Legacy Data Files: When the legacy data was managed by third-party vendors, or when the legacy infrastructure prevented direct access, data extracts were generated as part of the migration. Getting consistent, accurate extracts from these systems was difficult and required processes that helped ensure their accuracy.
- Last Minute Changes to Scope: Change is expected in large implementations. The client understood the risk of scope change and expected the data team to address any scope change in stride. The most impactful scope change was that benefit providers switched during the final build. This required new data sources to be incorporated and tested without impacting the timeline.
- Benefits Open Enrollment: During the last two months of the implementation, the organization went through open enrollment with a new benefits provider. Premier needed to enable the client to focus on facilitating open enrollment while incorporating all the necessary changes needed to the data migration without impacting timelines.
- Complex Cutover Transaction Maintenance: High volumes of transaction data needed to be migrated, monitored, and managed during the cutover window. To ensure the accuracy of migrations and to not disrupt business, validation needed to be efficient and trustworthy.
Premier's Workday Accelerators
- Shortened Tenant-Build Timelines: Hundreds of Workday pre-validations dynamically check converted data against tenant configuration before load, speeding up issue identification, issue resolution, and overall load time, and allowing more time for users to test.
- Real Data, Real Quick: With Premier, clients achieve data-load success percentages over 90% in the first (Foundation) tenant build, fueling more accurate Workday tenant configuration and test results during the subsequent project stages.
- Workday Delivery Approach: Step-by-step data conversion execution process from the Plan stage through the Deploy and Hypercare stages, understanding the dependencies and ensuring all execution follows the Workday delivery approach.
- Tenant Upgrade Support: Data conversion tool assessment with each semi-annual tenant version release to ensure conversion solution always meets the current requirements.
- Templated Workday Conversion Tools: Developed with years of Workday implementation experience, our tools expedite requirement gathering and conversion development while ensuring you get the best solution.
- Customized Data-Quality Strategy: Insight into the legacy data landscape, alongside the understanding of Workday best practices, allows Premier to deliver a customized data-quality strategy suited to your specific needs.
- Automated Cleansing: Automatically cleanse data objects such as Employees, Addresses, Students, or Suppliers, ensuring that your Workday solution goes live with data quality standards fit for the cloud.
- Your Data, Workday-Ready: Generate load-ready data in standard Workday Advanced Load, Enterprise Interface Builder (EIB), iLoad, and any other custom load workbooks.
- Data Validation Support: Accelerate the validation process through post-conversion validation planning and reports for functions such as payroll or finance.
Mitigating the Risk - Client Resource Constraints
- Gated activities within project management strategy to keep the team focused on most critical tasks
- Leveraged Applaud data profiling capabilities to accelerate data mapping activities
- Took accountability for data extraction, profiling, analysis, cleansing, and transformation activities, enabling client resources to focus on requirements, validation, and getting familiar with Workday
- Applaud's data cleansing and enrichment features enabled the business to focus on what correct looks like and let the actual data cleansing occur within Applaud's data repository
Mitigating the Risk - Duplication Across Legacy Data Sources
- Leveraged Applaud's data matching engine to automate the legacy data linkage and deduplication of employee and contingent workers
- Automated the process of selecting the correct primary position by applying a custom set of rules to Applaud's data-matching engine (a simplified sketch follows this list)
- Improved the business's visibility into overlapping data across the legacy data sources through Applaud's reporting capabilities
- Implemented a detailed audit approval process that provided a clear view of all the deduplication and consolidation that occurred during the migration
- Broke down the silos between multiple entities through the establishment of clear data ownership, stewardship, and governance processes to promote trust and increase the accuracy of all cleansings, enrichments, and transformations
- Managed a new position-based system through the implementation of complex eligibility checks and provided the analysis to give the business clarity and trust that the plans were assigned appropriately to the correct position
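The sketch below illustrates the primary-position selection rule referenced above. Applaud's matching engine is proprietary, so the rule order (active first, then full-time, then earliest hire date) and the record fields are assumptions for illustration only.

```python
# Illustrative primary-position selection over one matched worker group.
def select_primary(positions: list[dict]) -> dict:
    """Pick one primary position from a group of matched worker records."""
    ranked = sorted(
        positions,
        key=lambda p: (
            p["status"] != "active",  # active positions sort first
            p["fte"] < 1.0,           # then full-time over part-time
            p["hire_date"],           # then earliest hire date (ISO strings sort)
        ),
    )
    return ranked[0]

# Three legacy "workers" matched to one person (e.g., on SSN + name + DOB):
matched = [
    {"legacy_id": "A1", "status": "active", "fte": 0.5, "hire_date": "2015-03-01"},
    {"legacy_id": "B7", "status": "active", "fte": 1.0, "hire_date": "2018-06-15"},
    {"legacy_id": "C3", "status": "terminated", "fte": 1.0, "hire_date": "2012-01-10"},
]
primary = select_primary(matched)
print(primary["legacy_id"])  # -> B7; the other records become additional jobs
```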
Mitigating the Risk - Inconsistent Legacy Data Files
- Identified file structure changes in the third-party-provided data files using Applaud's import reporting process
- Established a file review process with third-party system administrators to confirm any unexpected changes before each tenant build
- Created detailed workflows that ensured alignment between the business, third-parties, and data conversion team
- Documented the complex data extraction process to show where, when, by whom, and how all data was extracted from legacy environments
Mitigating the Risk - Last Minute Changes to Scope
- Executed data track risk management process that pinpointed scope risks months ahead of the go-live period to allow adequate time to mitigate most changes
- Kept last-minute specification changes in line by enforcing Premier's specification freeze portion of our process that requires extra approvals when changes come in after agreed-upon dates
- Quickly identified the impact of requested specification changes through Applaud's "Where Used" and "Functional Snapshot" features
- Collaborated with the implementation partner to identify the most efficient way to incorporate critical scope changes that occurred mid-build into the load schedule without impacting timelines
- Implemented scope changes into the automated migration programs without impacting deadlines through Applaud's transformation components
Mitigating the Risk - Benefits Open Enrollment
- Proactively strategized with the benefits team to change the conversion approach to accommodate open enrollment administration and the incorporation of new benefits provider data into the implementation
- Leveraged Applaud's rapid application development capabilities to implement conversions necessary for the new benefits provider within the project timeline
- Facilitated additional test cycles to flush out issues ahead of the gold build window
Mitigating the Risk - Complex Cutover Transaction Maintenance
- Enabled the team to address missing configurations in the tenant ahead of the cutover window through Applaud's Workday pre-validations
- Instilled confidence in the transactional data through Applaud's Workday pre-validation reports that alert the team to data issues within the EIBs and DGWs
- Optimized the transaction conversion schedule by identifying and proving out which data files could be loaded before and after the cutover window
- Executed the migration through a repeatable, predictable, and highly automated process within Applaud
- Accelerated the validation and reconciliation process by providing validation reports that join the legacy data to the fully migrated data from Workday.
- Provided 24-hour support throughout the cutover period in case of any unexpected surprises (there weren't any)
Project Results
- 100% load success for Core HCM
- Migrated 19,794 employees and 3,579 contingent workers with 100% load success
- Identified and closed 1,179 of 1,192 data related defects
- Harmonized data for 1,173 duplicated employees
- 97% load success across all data objects for the Configure and Prototype build
- Data conversion completed on time and within budget
- Expectations for limiting internal resource time were exceeded

Reducing Integration and Analytics Costs for a Fortune 500 Cloud Solutions Provider
Background
A Fortune 500 CRM (Customer Relationship Management) and cloud solutions provider faced the complex challenge of exponential growth in integration, compute, and data warehousing costs caused by the rapid acceleration of data volumes across its data landscape. The organization needed to move its data and analytics from Snowflake to a custom-built Hive solution on the Amazon Web Services (AWS) platform to reduce costs and improve performance.
Why Definian
While the Client had significant in-house data analytics and cloud infrastructure skills, they needed data engineering expertise to complete this initiative in the desired timeframe. The Client had previously worked with Definian on data governance and integration projects and, through that experience, knew Definian had the necessary skills, accelerators, and methods to carry out this project efficiently and effectively. This was validated by Definian's three decades of experience building complex data engineering solutions.
The Project: A Joint Effort Across Four Milestones
Like many large transformative initiatives, this project simultaneously posed significant risk and value. To reduce project risk and maximize value along the way, the initiative was split into four distinct milestones. This modular approach enabled the Client to realize value throughout the initiative without disrupting current processes. It also enabled the Client and Definian to focus their energy on their respective strengths.
Being a pioneer in cloud applications and data modeling, the Client owned the design and development of the new analytics platform. The Client leveraged Definian’s data engineering solutions to minimize development time and maximize data pipeline throughput. While Definian upgraded the data pipelines that fed their analytics platforms, the Client focused on data models and cloud architectures.
Milestone 1: Migrate Jitterbit Integrations to AWS Glue
The project's first milestone focused on replacing approximately 300 Jitterbit integrations that connected the Client's operational data to their primary analytics data stores in Snowflake and Redshift. To help keep this milestone on track, Definian used its integration design frameworks and reference library to reverse engineer the poorly documented legacy Jitterbit integrations and replicate them in AWS Glue.
Milestone 2: Design and Build the Pipelines for the Future State Data and Analytics Platform
While the Client focused on designing and developing the Hive database in AWS infrastructure, Definian designed and built the future-state integration framework and process. Collaborating closely with the Client, Definian enhanced the integrations from Milestone 1 so they could easily be re-pointed to the new analytics warehouse during cut-over. Additionally, Definian and the Client found opportunities to rationalize and improve the performance of existing integrations. As part of the improvements, Definian increased pipeline efficiency by transitioning/mirroring the ETLs from AWS Glue to Apache Airflow.
Milestone 3: Migrate from Snowflake to Hive
With the new analytics platform operational, it was time to migrate the data and shut down Snowflake. Definian built a Snowflake to Hive pipeline to execute the migration using PySpark in Apache Airflow. This approach maximized throughput and minimized development time. To reduce downtime during the cut-over, Definian and the Client collaborated on a tight cut-over plan. The execution of the plan exceeded expectations, resulting in no downtime and an on-time go-live.
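The sketch below shows the shape of one such copy task, assuming the Snowflake Spark connector is available on the cluster. Connection options, database names, and table names are placeholders; the real pipeline parameterized this per table and scheduled it as an Apache Airflow task.

```python
# Minimal sketch of a Snowflake-to-Hive copy step in PySpark.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("snowflake_to_hive_migration")
         .enableHiveSupport()          # write targets the Hive metastore
         .getOrCreate())

SF_OPTIONS = {                          # placeholder connection options
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>", "sfPassword": "<password>",
    "sfDatabase": "ANALYTICS", "sfSchema": "PUBLIC",
    "sfWarehouse": "MIGRATION_WH",
}

def copy_table(table: str) -> None:
    """Read one table from Snowflake and write it to the Hive warehouse."""
    df = (spark.read.format("net.snowflake.spark.snowflake")
          .options(**SF_OPTIONS)
          .option("dbtable", table)
          .load())
    # Assumes a Hive database named 'analytics' already exists.
    df.write.mode("overwrite").saveAsTable(f"analytics.{table.lower()}")

for table in ["ORDERS", "CUSTOMERS"]:   # driven by a config list in practice
    copy_table(table)
```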
Milestone 4: Consolidate Data Silos
After the new analytics platform went live, the last step was consolidating additional data silos into the new analytics platform and decommissioning them. Definian designed the pipelines and processes for this last step to enable the Client to self-execute the plan when ready. When the Client was ready to migrate, Definian provided as-needed backup.
Impact: Improved Data Pipelines, Improved Data Analytics, Lower Costs
This complex initiative enabled long-term sustainable analytics capabilities for the Client. They have a pathway for more intelligent AI, sharper analytics, and data-driven decisions. The new data pipelines in Apache Airflow run at a lower cost and greater efficiency than the prior Jitterbit framework.

Building the Data Integration Capabilities for a Leading Data Privacy Platform
Introduction
A quickly growing data security, privacy, compliance, and governance software provider found itself facing a significant hurdle: their cutting-edge data discovery algorithm couldn't easily work with the systems their customers already used. To keep growing and attracting new clients, they needed a swift solution to improve how their software could integrate with a variety of other metadata management technologies.
Choosing Definian for the Solution
The reason they turned to Definian was clear: Definian had a strong track record of engineering seamless integrations for Fortune 500 organizations and other leading data platform solutions. With three decades of expertise in making legacy and modern technologies work together, Definian was the standout partner for this initiative.
Setting Project Goals
The goal for this initiative was twofold. The project's main aim was to create bi-directional integrations with Alation and Informatica EDC. These two initial integrations would enable the Client’s data discovery algorithm to create a unified data catalog that includes sensitive and PII information in common customer environments. The secondary aspect of the project would set the foundation for rapidly incorporating additional integrations into the core product, enabling the Client’s data engineering team to continue focusing on improving their core product's data discovery capabilities.
Outcome: A Game Changer
Since the initial engagement, the partnership has continued to expand the product’s integration capabilities to accelerate the implementation process and serve customers' increasingly complex data privacy, security, and governance requirements. The results helped propel the product's prominence in the marketplace:
- The Client earned a spot as a Gartner Magic Quadrant leader.
- Household names like American Airlines, Discover, and Dell came on board as clients, drawn by how well the software could connect with tools they already use.
- The foundation is set to enable an exponential increase in integration capabilities through a soon-to-be-released marketplace.

100% Workday Load Success for State Government
Massive Scope
The project was to perform data analysis and migration of Human Capital Management (HCM) and Payroll data from PeopleSoft, learning data from Taleo, and Benefits data from a custom benefits administration system into Workday for a state government with 34,000 active employees and 120 agencies.
“Thank you for putting in all this work [to extract data from Taleo]. In the end, you’ll be saving the state and the project a ton of money and headache,” Director of Workday Operations for the State Government
Important Goals
The state embarked on a transformation initiative to consolidate their processes and data into a single instance of Workday and to eliminate their dependence on outdated technology. This project launched amid the COVID-19 pandemic. These circumstances accelerated a variety of work-life changes and underscored the importance of the project goals. The goal of the Workday implementation was to modernize the client’s HCM system and keep individuals connected while shifting to a hybrid work environment. Workday HCM created a user-friendly hub of employee resources and workforce data that will allow the client to strengthen themselves as an employer. Looking ahead, employees will also enjoy an enhanced career experience with the new capabilities available to them.
The Risks
The client had several concerns regarding the migration of their data and understood the risk. Definian worked with the state government and Workday teams to address these concerns, ensuring a successful and on-time Workday implementation.
Every data migration and implementation project has inherent risk factors. This client was most exposed to the following risks:
- Limited knowledge of data dependencies across different legacy sources
- Lack of experience with Workday target system requirements
- Employee staffing and overall resource constraints
- Reviewing transformed data in a timely manner prior to Workday loading
- Mapping PeopleSoft payroll codes to Workday reference IDs
Mitigating Risks
Definian mitigated the client’s high-risk areas by employing the following practices:
- Created detailed reports to help the client better understand their legacy data
- Worked alongside the client to develop a strategy to address legacy data issues
- Implemented programmatic data cleansing processes such as address standardization, reformatting of employee names to proper casing, and reformatting of position IDs to be unique (simplified examples follow this list)
- Configured data sets shared with the client to draw a connection between an employee’s data across multiple legacy sources
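For illustration, simplified versions of two of the cleansing rules named above might look like the following. The production processes ran inside the conversion pipeline, and the data formats here are assumed for the example.

```python
# Illustrative cleansing rules: name casing and unique position IDs.
def proper_case_name(name: str) -> str:
    """SMITH, JOHN -> Smith, John (simplified; real rules handle McX, O'X, etc.)."""
    return ", ".join(part.strip().title() for part in name.split(","))

def make_position_ids_unique(position_ids: list[str]) -> list[str]:
    """Append a numeric suffix to repeated position IDs so each is unique."""
    seen: dict[str, int] = {}
    out = []
    for pid in position_ids:
        seen[pid] = seen.get(pid, 0) + 1
        out.append(pid if seen[pid] == 1 else f"{pid}-{seen[pid]}")
    return out

print(proper_case_name("SMITH, JOHN"))                     # Smith, John
print(make_position_ids_unique(["P100", "P100", "P200"]))  # ['P100', 'P100-2', 'P200']
```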
Definian executed the following actions to combat the risks associated with the client’s lack of knowledge of the Workday target system:
- Created validation reports tailored to Workday’s requirements for each individual data piece, allowing the client to address invalid data or missing Workday configuration prior to loading the data.
- Developed conversion error reports which allowed the client to identify legacy system issues, such as invalid, missing, or duplicate data. This enabled the client to address these issues early, a benefit as there is little ability to back data out of Workday once loaded.
- Built specific workbooks designated as client-facing validation files that were tailored for clear presentation and easy understanding of the data
“It has been great working with you all. This was definitely the best government implementation I have worked on,” Senior Associate for the System Integrator
Definian provided the following resources to help with staffing constraints:
- Took ownership of the data conversion process to allow client resources to focus on pre-load and post-load validation
- Led mapping specification sessions with the System Integrator and client to bridge the gap between the legacy and target data
- Facilitated discussions to identify resources and document Responsible, Accountable, Consulted, and Informed (RACI) parties
- Met with the client team to guide the pre-load validation process
To mitigate the risk of missed client data review deadlines, Definian:
- Thoroughly laid out build schedules and review expectations
- Identified client resources for each specific conversion item, including backup resources where needed
- Met with client resource individuals to aid them in their data review
Also, to address the delays caused by mapping legacy payroll codes to Workday reference IDs, Definian:
- Coordinated with the SI and the client team to determine pay code conversion logic
- Generated detailed reports highlighting any present payroll codes that did not have configuration in the Workday tenant
- Updated the build schedule to include additional time for new pay code mapping to be completed prior to each data load
Overview of Key Activities
- The team used the Applaud software to perform a deep analysis of the client data, leveraging integrated reporting tools to identify project risks.
- Transformation and migration processes were developed for all required data from multiple legacy sources.
- Throughout each test cycle, data quality issues and load errors were identified and proactively corrected to alleviate stress leading up to the go-live window.
- The team was able to adjust to updated timelines and changes to the project scope.
- Data defect logs and build cycle trackers were maintained throughout the project to provide clear visibility on requirements and deadlines.
- Built a payroll reconciliation tool to aid in the validation of the payroll results in the Workday tenant.
- Developed a system to run payroll reports to proactively configure new earning and deduction codes ahead of each build window.
Successful Workday Go-Live
Definian played a major role in this successful Workday implementation. Achieving 100% load success on critical-path files paved the way for a smooth transition as the client began HR business processes in Workday. In addition, the payroll reconciliation that provided a line-by-line comparison and analysis of every single paycheck led to a successful payroll data conversion, which allowed the client to confidently begin payroll processing in Workday immediately.
The Applaud data migration services allowed for a diligent data review process in the initial test cycles of the project, which provided key insight that was later pivotal to the on-time delivery of the Workday tenant. Detailed reporting and documentation maintained the integrity of the data as it was transformed and combined across multiple legacy sources.
Success Metrics
34k active employees converted
27k terminated employees converted
4.5k retired employees converted
Data extracts and conversion all completed on schedule to support cutover and catch-up transaction efforts
100% load success on critical-path files

Migrating from IFS to Oracle Cloud for a Major US Defense Contractor
Background
In April of 2021, one of the largest providers of technology services and hardware for the United States government embarked on a mission to modernize their Supply Chain Planning solution. The goal of this modernization effort was to reduce costs, gain efficiencies, and drive innovation so that the company could better serve its governmental, defense, and intelligence clients. To make this implementation possible, Procurement, Inventory, and Order Management data would first need to be migrated from their legacy IFS system to Oracle Cloud. One priority of the migration project was to enrich data as it was extracted from IFS so that it could drive enhanced business capabilities and intelligence in Oracle Cloud.
Connecting Different Sources
One challenge that was identified early in the process was that while the Supply Chain Planning was moving to Oracle Cloud, the accounts payable and other financial data remained in a separate Oracle EBS ERP system. There were several impacts of this system design on the data conversion process.
During the migration process, Definian needed to make sure that any purchase order that still had an open accounts payable amount would be migrated into Oracle Cloud. Using the Applaud tool, Definian was able to connect to Oracle EBS directly and confirm that all purchase orders with an open AP amount were converted.
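Conceptually, that check reduces to a set comparison like the sketch below, though in practice Applaud queried Oracle EBS directly; the PO numbers here are placeholders.

```python
# Conceptual completeness check: every PO with open AP in EBS must be staged.
open_ap_pos = {"PO-1001", "PO-1002", "PO-1003"}   # open-AP POs reported by EBS
converted_pos = {"PO-1001", "PO-1003"}            # POs staged for Oracle Cloud

missing = open_ap_pos - converted_pos
if missing:
    print(f"{len(missing)} open-AP POs missing from conversion: {sorted(missing)}")
else:
    print("All open-AP purchase orders are covered by the migration.")
```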
On top of this, the Supplier data needed to exist in both ERP systems after the Oracle Cloud system went live. To ensure that the data stayed in sync between the two sources, Definian leveraged their deduplication process to merge Suppliers that had already been merged in Oracle EBS during a prior migration. Definian also provided a detailed mapping showing how Supplier data had been mapped between IFS, Oracle Cloud, and Oracle EBS.
Validating Manual Data Collection
During the project design, it was identified that a significant, several-month effort would be needed to collect additional data that was not present in the legacy system but was required for business operations in Oracle Cloud. The project team would need to reach out to hundreds of individuals across dozens of teams asking for additional data points.
Definian was able to mitigate the risk that comes with data collection by building out a complex migration solution to apply these user-provided values to the correct Oracle Cloud attributes. Further, Definian leveraged our thorough data pre-validation process to ensure that the attributes required for each purchase order record were provided and that they fell within the list of values available for use. Definian went as far as validating the values provided for custom Oracle Cloud DFFs to ensure that they would be accepted by the system.
Creating Note Attachments
In Oracle Cloud, it was determined that there wasn’t going to be enough space on each purchase order to capture the note data that needed to be preserved from the legacy system. The solution was to attach the legacy note details to each purchase order record as a downloadable Excel file. The challenge was that those Excel files didn’t exist prior to the conversion.
Definian was able to rectify this by building out a complex process to scan the legacy database for note details and build the Excel attachment files. From there, Definian leveraged their SDO Attachment utility to link each Excel file to the correct purchase order record. In total, nearly 25,000 attachment files were uploaded to Oracle Cloud and attached to purchase orders at a 100% success rate.
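A simplified sketch of the attachment-generation step is shown below, assuming the legacy note rows have already been queried into memory. Column names and file naming are illustrative, and the actual linking into Oracle Cloud used the SDO Attachment utility rather than anything shown here.

```python
# Illustrative generation of one Excel notes workbook per purchase order.
# Requires the openpyxl package for .xlsx output via pandas.
import os
import pandas as pd

legacy_notes = [
    {"po_number": "PO-1001", "note_date": "2019-04-02", "note_text": "Expedite"},
    {"po_number": "PO-1001", "note_date": "2020-01-15", "note_text": "Qty changed"},
    {"po_number": "PO-2002", "note_date": "2021-08-30", "note_text": "Vendor called"},
]

os.makedirs("attachments", exist_ok=True)
notes = pd.DataFrame(legacy_notes)
for po_number, po_notes in notes.groupby("po_number"):
    # One downloadable workbook per purchase order, later attached to that PO.
    po_notes.to_excel(f"attachments/{po_number}_notes.xlsx", index=False)
```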
Set Up for Success
In the end, the client was able to load 100% of the converted data objects and ensure that their Supply Chain Planning solution was set up for success for years to come. While there were plenty of challenges along the way, the team was able to accomplish the original goal in about one year. The legacy data had also been standardized and enriched in a way that allows for more efficient operations and greater reporting capabilities in the future.

Improving Regulatory Compliance and the Customer Experience through Advisory and Technology Consulting Services
Client: Multi-National Bank with Regulatory Compliance and Data Quality Issues
Background
A growing multinational bank found itself confronted with rising regulatory compliance demands, data security concerns, and issues regarding data quality. The bank’s data management operations were reactive, fragmented, and struggled to cope with the escalating challenges stemming from the organization's growth and regulatory remediation mandates. Definian was engaged to restructure data operations to be in line with the business objectives and to resolve issues related to data availability, compliance, quality, aggregation, and overall operational inefficiencies.
Project Challenges
- The lack of a formalized data quality issue remediation process had drawn the concern of regulators and was negatively impacting financial and leadership reporting
- Limited resources prevented the backlog of data issues from being addressed
- The inability to accurately unify customer account and product data from multiple sources into a single view dramatically limited profit aggregation reporting, targeted marketing, and risk monitoring effectiveness
- Unstructured and competing data ownership, management, and governance led to inconsistent decision-making
Key Solutions Summary
- Definian’s Data Governance-in-a-Box approach met the client where they were to create a roadmap aligned with business objectives and cultural readiness
- Designed and implemented a comprehensive data governance program, organizational framework, and operating model
- Definian’s Data Companion solution clarified critical data element definitions, captured metadata and management requirements, cataloged data quality rules, and established a hub for data operations issue management
- Data stewardship-as-a-service resolved the backlog of data issues and operationalized the Data Governance program
- Designed and facilitated educational curriculum on data governance best practices to accelerate and improve adoption and support regulatory requirements
Outcomes
- Operationalized Enterprise data governance and data management organization
- Identified underlying technical and business logic issues leading to mistrust in customer hierarchy and account ownership
- Recommended steps to resolve and enhance customer and product views without significant investment in new software products
- Improved compliance, financial, and leadership reporting
- A catalog of critical data element definitions, metadata requirements, and governance processes
- A data issue resolution plan that methodically reduced the backlog with the available resources
- Increased cultural awareness of the impact of data on the organization

Easing Data Conversion Woes for an Early Education Client
Project Summary
This early education client wished to upgrade their data architecture from PeopleSoft to Oracle Cloud for enhanced functionality and reporting capabilities. As one of the largest privately-owned child care providers in the US, they wished to bring 10 distinct brands and hundreds of schools onto their existing Oracle Cloud platform. Definian was enlisted to perform data migration for this Oracle Cloud Financials and Procurement implementation in 2019. Our project team included the client’s functional and technical leads, system integrators from a partner consulting firm, and our own project managers and developers. After months of development and test cycles, the project was put on pause in the spring of 2020 due to the COVID-19 pandemic. Definian resumed project activities in September 2021 to iterate on existing data conversions and support the production loads for a February 2022 go-live, completing six conversion cycles in six months to ensure the utmost data quality in the new system.
Project Requirements
The client’s go-live comprised the following Oracle Cloud data objects: Suppliers, Supplier Banks, Purchase Orders, General Ledger, and Fixed Assets. Source data came from PeopleSoft and Excel reports from business users. To minimize the impact on school operations, the Cutover schedule was incredibly compact. The goal was to load historical General Ledger, Suppliers, Supplier Banks, and Purchase Orders before go-live and to load Fixed Assets and the most recent GL period after go-live. So as not to disrupt the pre-go-live processes in PeopleSoft, two separate data snapshots were necessary before go-live and a third was needed for the post-go-live conversions. All data objects were to be loaded into the client's already live Oracle Cloud HCM environment.
Key Activities at Cutover
We instilled confidence in our process by using our repeatedly enhanced data conversions for the final Cutover loads. At the time of Cutover, we provided 42 distinct files to support the in-scope data objects. We converted over 4 million records, performed pre-load validations using our Critical-to-Quality checks, aided in client validations, and provided load files to the technical leads from the system integrator. To adapt to this client’s high data volume, we generated and provided Excel validation extracts to the client leads as well as Excel extracts and load files to the system integrator instead of the typical Oracle FBDIs (File-Based Data Imports). All data provided at Cutover was successfully loaded into the client’s Oracle Cloud system.
Project Risks
Given the project restart, we faced unique and unanticipated challenges. However, we used our agile approach to adapt to the circumstances and support our client in this unprecedented time.
COVID-19-Induced Challenges
The client and System Integrator (SI) both had notable resource changes after the project pause. We worked with client functional leads who did not have insight into the original data mapping conversations nor the functionality of the target system.
We adapted to these challenges by iterating on previous versions of the Mapping Specification and discussing logic changes with the new client and SI leads. We modified preexisting conversion code and used our Oracle Cloud knowledge to provide solutions. We ensured that their organizational needs were met through repeated validations and transparent communication.
Client Technology Limitations
Large client data volumes caused server slowdown; conversion processes took substantially longer to run during the first cycle after the project pause.
Definian consultants tuned and optimized our data conversion programs to overcome the limitations of the client server. Definian also consulted with our internal software team to troubleshoot issues in real time. We achieved faster conversion processing times than ever before and delivered data extracts to the client leads with ease.
Data Transformation/Requirements Changes
The Supplier client leads wanted Supplier addresses standardized in a manner that did not align with the best-practice address standardization approach.
We worked with the client leads to discuss their desired end state and implemented additional transformation logic to guarantee that their needs would be met. We tested this logic and ran our Critical-to-Quality (CTQ) pre-load validation checks to ensure that all processed records would load successfully into the client’s Oracle environment.
Client leads requested substantial requirements changes on the Purchase Orders data conversion during Cutover.
We typically do not recommend logic changes at Cutover given increased risk with the load to Production. However, based on the client’s needs and understanding of the risk, a consensus was achieved to proceed with the change order. Definian consultants worked with client leads to understand all aspects of their requests and robustly tested the newly coded logic. Client leads thoroughly validated the data extracts with all requirements changes encompassed. The desired changes were reflected and the data loaded fully into the Production environment, saving the client from making substantial manual changes to post-load data.
We found extensive duplication of supplier data within the legacy system. Additionally, the supplier format and structure required for Oracle Cloud were substantially different from those of the client’s existing system.
Applaud’s powerful and flexible data matching engine was used to de-duplicate the client’s supplier data. Definian provided ample reporting tools to the client for them to determine which records should persist in their new Oracle solution. Across multiple test cycles, we gathered, implemented, and tested complex transformation logic that required cross-functional input and validations. Our custom error log reflected the coded changes and any cases that required further client investigation. We received some substantial structural Supplier changes during SIT2 and UAT, but adapted accordingly to meet the client’s data expectations.
Challenges Faced During Data Loads
Oracle Cross-Validation Rules (CVRs) were turned on for Purchase Orders (PO) during UAT without our knowledge. We loaded POs in the Production instance with CVRs on as well.
We dealt with significant PO fallout during UAT due to the CVRs being left on for the loads. We took a multi-step approach to mitigate the situation. First, we created a delta process to capture the POs that fell out in a prior load so we could try to re-load them again during the same mock cycle. The Oracle CVRs sometimes over-exclude records, so this process helped us to iteratively attempt to load previous fallout to eventually arrive at just the ‘problem’ records. Next, we conducted a second UAT test cycle to re-load POs with additional account transformation logic captured in the PO conversion. The objective of this approach was to mimic the back-end transformations that would be happening in Oracle so that validators could easily identify improper accounting combinations. These combined strategies, coupled with ample pre-load validation, helped us achieve 100% PO load success at Cutover.
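The delta process boils down to a retry loop over the fallout, sketched below. Here `mock_load` stands in for the real load-and-report step, and its CVR-style rejection rule is invented purely for the example.

```python
# Schematic delta re-load loop: retry only the fallout from each attempt.
def mock_load(po_numbers: set[str]) -> set[str]:
    """Pretend-load POs; here CVRs reject anything containing a '9'."""
    return {po for po in po_numbers if "9" not in po}

staged = {"PO-101", "PO-102", "PO-901", "PO-902"}
pending = set(staged)
while pending:
    loaded = mock_load(pending)
    fallout = pending - loaded
    if fallout == pending:
        # No progress: these are the true 'problem' records for manual review.
        print("Needs investigation:", sorted(fallout))
        break
    pending = fallout   # retry just the fallout in the next attempt
```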
The client’s Oracle HCM Cloud system was already live; all Procurement and Financial data was to be loaded/merged into this Production environment.
We needed to ensure that all newly converted data would be compatible with the already live HCM Production data. Our previous test cycles loaded data into a mock environment, so the Cutover loads were unique in this manner. We found inconsistencies with certain Location and Department configurations in Production that did not align with our repeatedly tested logic. Given our suite of Critical-to-Quality checks, we were able to quickly identify issues and subsequently notify the client team. We worked alongside the client team to guarantee that all affected Locations and Departments were updated accordingly. All issues were resolved prior to the formal data loads, paving the way for Cutover success.
Given the large data volumes of this client, the System Integrator requested that we provide data for loads using flat text files instead of the typical FBDIs (File-Based Data Imports).
While this may seem like a slight and nuanced change, it substantially impacted the data delivery process. Usually, the client validates the data in the FBDI, provides sign-off, and that same file is used by the SI load resource to perform the associated data load. In this case, we took a multi-step approach to ensure that all functional and technical parties would have their needs met. First, we provided an enhanced Excel extract to the client with additional legacy information included for easier validation. Once the client approved that data, we excluded any records flagged by our Critical-to-Quality checks and output the data in two formats for the System Integrator: 1) an Excel file that mimicked the format of the FBDI and 2) a CSV with the same data for load purposes. Several data conversions required that multiple sets of files be generated, and we managed the data delivery with ease. For the Fixed Assets conversion specifically, we provided data in chunks of 100,000 records or less for an effective and efficient load into Oracle.
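As a rough illustration of the chunked delivery, the sketch below splits an approved dataset into sequentially numbered CSV files of at most 100,000 records each. The use of pandas and all file names are illustrative assumptions:

```python
# Sketch of the chunked delivery idea for large objects such as Fixed Assets:
# split the approved dataset into files of at most 100,000 records each.
import pandas as pd

CHUNK_SIZE = 100_000

def write_load_files(df: pd.DataFrame, base_name: str) -> list[str]:
    """Write the dataset as sequentially numbered CSV chunks for loading."""
    paths = []
    for i in range(0, len(df), CHUNK_SIZE):
        chunk = df.iloc[i:i + CHUNK_SIZE]
        path = f"{base_name}_{i // CHUNK_SIZE + 1:03d}.csv"
        chunk.to_csv(path, index=False)
        paths.append(path)
    return paths

assets = pd.read_csv("fixed_assets_approved.csv")  # hypothetical input
files = write_load_files(assets, "fixed_assets_load")
print(f"Wrote {len(files)} files of <= {CHUNK_SIZE:,} records each")
```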
Project Timeline
After the project pause, the client schedule was extremely compact (6 test cycles, including Cutover, occurred in 6 months).
We understood the needs of the client and the pressure to meet their ambitious go-live date. We leveraged our many years of experience performing data migration to reduce risk along the way. Definian worked closely with the client and the System Integrator to create test cycle schedules and set expectations for data snapshots, data conversion turnaround time, and data validation. Definian also went above and beyond to work with individual client leads to instill confidence in the converted data.
We were faced with incredibly tight timing around Cutover, particularly for Purchase Orders.
We worked relentlessly during Cutover to ensure that all necessary steps were taken for our client’s success. We completed the entire Cutover PO data conversion cycle, from raw data to pre-load validation, in 24 hours, substantially faster than in other test cycles. To meet this timeline, we planned and conducted numerous test runs off an earlier data snapshot as practice for the final Cutover PO conversion. This robust validation against the prior snapshot allowed us to expedite the final Cutover PO sign-off. Definian worked around the clock with client vendors and leads to ensure that all of our bases were covered during this critical time.
The Results
Our client went live on time and with 100% load success for all data objects. Shortly after go-live, the client team rolled out the Oracle Procurement and Financials system across their entire organization. The client’s business users had confidence in the converted Oracle data, and few manual updates were needed. In addition, our expeditious turnaround at Cutover allowed operating schools to function as usual; school facilitators were able to order food and supplies with minimal interruption. Child care has become more important than ever given the COVID-19 pandemic, and we are glad to know that our client is able to operate effectively to serve working parents and teach the leaders of tomorrow.

Phased ERP Go-Live for a Multi-System Healthcare Provider
Background
One of the world’s largest medical centers made strategic plans to upgrade their entire data architecture. This transformation required moving their Enterprise Resource Planning (ERP) system from Lawson to Oracle Cloud and their Supplier Relationship Management (SRM) system to Ivalua. This organization faced serious data conversion issues with their initial implementation in London; consequently, they enlisted Definian to handle their second implementation: performing data migration for their main US campus. Definian addressed data complexities with a methodical and thoroughly validated approach, allowing the client to use the new system with full confidence in the underlying data. As a result of the second effective implementation, the client re-enlisted Definian for their third wave to handle the integration of recently acquired hospitals onto their new data platform. All of these newly acquired medical centers needed to have their data converted to match the specifications of the already live data in Oracle Cloud and Ivalua.
Due to the intricacies of the third wave implementation (large number of conversion objects, high data volume, dependent integrations), the project plan was engineered to include two separate go-live windows. The first go-live (aka the “Winter Release”) would reflect the master data objects in Oracle Cloud, the integration of loaded Oracle master data into Ivalua, and loading all additional source-to-pay data objects into Ivalua. The second go-live would reflect the transactional data in Oracle Cloud, building upon the already loaded master data. This case study will focus on the “Winter Release” go-live: the load of Oracle Cloud Supply Chain conversion objects, the subsequent integration of this data into the Ivalua procurement system, and the load of all Ivalua conversion objects for procurement purposes.
Project Requirements
The “Winter Release” go-live involved six Oracle Cloud data objects (Items, Manufacturer Part Numbers, Supplier Part Numbers, Inventory Org Items, Intraclass UOM, Suppliers) and three Ivalua data objects (Suppliers, Contracts, Pricing). With these data loads, the client wanted to ensure minimal disruption to business users; as such, it was a priority to load the data in an expeditious and creative fashion. The “Winter Release” project plan stated that between Sunday night and Friday morning the team would complete the following: obtain a legacy data refresh, perform conversion runs, preliminarily validate the converted data to be loaded into Oracle, load the data into Oracle, and validate the loaded data. From there, the Ivalua integration would run on Friday to include the recently loaded Oracle Cloud data. All other Ivalua data objects were to be loaded over the weekend so that end users could perform their day-to-day tasks the following Monday.
Key Activities
The data migration team fully leveraged Definian’s EPACTL framework and Applaud® data migration software by performing the following activities:
We began by extracting the legacy data to be used for the Oracle and Ivalua data conversions. Through multiple test runs, we reduced legacy data extraction time by updating table set-up and removing unneeded columns. At the beginning of the project, we profiled and analyzed the source data to identify legacy data quirks and understand how the data should exist in the end-state Oracle and Ivalua systems. We then performed data cleansing, which entailed correcting legacy data inaccuracies, deduplicating suppliers, and standardizing addresses. Next, we performed the data transformations specified by the client so they would be reflected in the Oracle Cloud and Ivalua production environments, automating our data conversions along the way to limit manual work. We also used our proprietary Oracle Cloud Critical-to-Quality (CTQ) pre-load validation checks to identify load errors prior to executing data loads; these checks led to targeted error resolution before it was too late, enabling efficient and highly effective data loads within a tight timeframe. Additionally, we updated the Oracle load settings to use multiple threads, a slight tweak that significantly reduced load times. Finally, we performed the necessary boundary system integrations to ensure that the client’s Oracle Cloud data would be reflected in Ivalua and completed all required Ivalua data loads.
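To make one of the cleansing steps above concrete, here is a minimal sketch of address standardization: casing, punctuation, and common street-suffix variants are normalized so records compare consistently. The suffix map is a simplified assumption, not our production rule set:

```python
# Illustrative address standardization: uppercase, strip punctuation, and
# abbreviate known suffixes. The SUFFIX_MAP below is an assumed, partial list.
SUFFIX_MAP = {"STREET": "ST", "AVENUE": "AVE", "BOULEVARD": "BLVD",
              "ROAD": "RD", "DRIVE": "DR", "SUITE": "STE"}

def standardize_address(raw: str) -> str:
    """Normalize one address line into a consistent, comparable form."""
    tokens = raw.upper().replace(".", "").replace(",", "").split()
    return " ".join(SUFFIX_MAP.get(tok, tok) for tok in tokens)

print(standardize_address("123 Main Street, Suite 4"))  # -> 123 MAIN ST STE 4
```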
Our EPACTL process is enhanced by our detailed project management strategies and communicative consultants. Our detailed project plan outlined all necessary steps from data extraction to post-load validation as well as the expected time to complete each step. Additionally, our team was in constant communication with client leads to execute changes and answer questions during the validation process; this allowed us to address issues in real time.
Client Risks and Mitigation Strategies
COVID-19-Induced Challenges
The “Winter Release” go-live was executed later than originally anticipated due to the COVID-19 pandemic. Eight months after the start of the project, project activities were paused while the client’s leadership team determined a new path forward amid COVID-related interruptions.
- Resource changes occurred as a result of the pandemic, and client SME availability was often scarce.
- Due to COVID restrictions, the “Winter Release” go-live was executed remotely.
- The production Oracle Cloud environment was already live with data from previous project waves; this data was regularly being accessed by users around the globe.
- The hospital system had a large volume of data including 4.5 million inventory organization item records.
Our solutions
- Definian was a constant across the project, providing support to the client both at the start of the project and after the project pause. As a result, Definian consultants had a deep understanding of the legacy data and the objectives of the implementation, which allowed the team to assist with and help drive data validation.
- Definian consultants set up Microsoft Teams channels to increase real-time communication and address any problems expeditiously.
- Definian remained extra vigilant to ensure duplicate records would not be loaded for already existing data.
- We maintained consistent cross-functional communication to guarantee that all necessary processes were turned on/off as needed for successful data loads.
- Definian’s Applaud® software is able to ingest, process, and output large data volumes. We worked with the client to ensure that the data would be exported in a validation-friendly manner.
Data Transformation and Requirements
Challenges
- Incomplete testing and data validation in prior mock cycles led to inconsistencies in financial reports that were not discovered until later in the project. These inconsistencies impacted the SCM conversions (Items and Suppliers), as the legacy selection logic needed to be updated to support the client’s financial requirements.
- The client made late changes to data conversion logic (the week before data was slated to be loaded into Production). These mapping updates added extra risk as the new conversion logic was not tested in a development environment.
- The conversion had to account for several different supplier types, including 1099, international, and "confidential" suppliers.
- Multiple legacy records were combined into one Oracle Cloud record for Suppliers and Inventory Organizations, adding an extra layer of complexity (see the sketch after this list).
- The Ivalua implementation and integration were not part of the initial project scope. The inclusion of Ivalua mid-project required the team to act quickly, executing mapping sessions and conversion builds with fewer mock cycles in which to test the conversion logic.
- The client had a tight timeline to execute the Oracle Cloud data loads due to the Ivalua integration. The Oracle Cloud loads had to be finished by Friday morning in order to carry out the Ivalua integration Saturday morning and subsequently load the Ivalua data objects over the weekend.
- The project timeline placed the “Winter Release” go-live in overlap with the UAT test cycle for the April transactional go-live, adding extra strain on client resources.
- The dual maintenance period was extended due to the phased go-live (the “Winter Release” and later the April transactional go-live). This two-month lag period required that the client carefully track changes in both Oracle/Ivalua as well as their legacy system.
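To illustrate the many-to-one consolidation mentioned above, the sketch below collapses legacy supplier records that share a merge key into a single target record while carrying site codes and legacy IDs along for auditing. The merge key and all field names are assumptions, not the client’s actual mapping:

```python
# Hypothetical many-to-one consolidation: legacy supplier records sharing a
# merge key collapse into one target record; sites and source IDs are kept.
import pandas as pd

legacy = pd.DataFrame({
    "merge_key": ["TAXID-111", "TAXID-111", "TAXID-222"],
    "legacy_id": ["L-01", "L-02", "L-03"],
    "name":      ["Acme East", "Acme West", "Brightside"],
    "site_code": ["EAST", "WEST", "MAIN"],
})

# One target supplier per merge key; keep the first name, collect all sites.
targets = (legacy.groupby("merge_key")
                 .agg(supplier_name=("name", "first"),
                      sites=("site_code", list),
                      source_ids=("legacy_id", list))
                 .reset_index())
print(targets)
# Each consolidated row retains its legacy IDs for audit and error logging.
```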
Our solutions
- Definian scheduled meetings to discuss modifications to selection logic and tested the impacted conversions to ensure that the associated code changes resulted in the correct output.
- Definian consultants validated the conversion changes and ensured that the output appeared as anticipated prior to releasing the new set of files to the client.
- Definian provided reports to assist with pre-load validation and answered specific client questions/concerns as they arose.
- Leveraging decades of data migration experience, the Definian team was prepared to execute additional data conversions in an efficient, effective manner.
- Our ethos (The Definian Way) involves rising to the opportunity to meet our clients’ needs. We understood the importance of executing the loads on schedule and ensured that the data would be ready for end users come Monday morning.
- Again, Definian rose to the opportunity to meet project deadlines and worked with client SMEs’ schedules to address concerns as they arose.
- Definian supported the client through their dual maintenance period and reported any unexpected data cases on conversion-specific error logs prior to loads.
Results
The large data volumes, late requirements changes, and shifting resources and timelines all added risk to the project. However, with a tight project plan, several mock cycles, automated processes, and Definian’s years of data migration experience, the entire go-live process went smoothly from start to finish. Using our Applaud® software, Definian consultants quickly adjusted to changing requirements and supported data validation efforts to propel this successful implementation. All Oracle Cloud data objects loaded successfully and were swiftly validated; these efforts allowed the Ivalua integration and data loads to take place on schedule. The “Winter Release” go-live was an important milestone: it instilled confidence in the project team and laid the foundation for the official project go-live thereafter. Our client’s data is more streamlined than ever, and they are able to operate efficiently across campuses and countries!