Data best practices for modern organizations

Definian explores practical methods, proven frameworks, and actionable guidance that help teams work smarter with data across the enterprise.

Formalizing Data Governance As Part of Your Workday Implementation

Workday
Best Practices
Definian's agile data governance framework accelerates Workday implementations and provides organizations the structure to maintain data clarity and trust long after go-live.

Modernizing to the Workday platform enables organizations to increase efficiency and gain insights that were not feasible with fragmented legacy systems. However, implementing the Workday platform is not a silver bullet for solving the data management challenges organizations face in today's data-driven climate. Adapting to the increasing speed, scale, and variety of data, along with growing data privacy regulations, presents obstacles that every organization is working to address. These challenges are further compounded when frustrated data consumers have to spend more time collecting and cleansing data than using it. While Workday's modern platform can help tackle these challenges, a data governance discipline is required to formalize the data standards and policies that not only guide the implementation of the platform but also provide the accountability structure to uphold them across the organization. When planning a Workday implementation, it is essential to evaluate your data governance strategy, or to establish one if none exists. Launching your data governance journey with a Workday implementation presents a golden opportunity to shift your data culture forward while working through the critical data and process requirements of the business.

Data Governance with a Workday Implementation

As an official Workday partner, Definian has deep experience supporting organizations with data migrations from complex legacy systems to the Workday platform. Our methodology includes data readiness activities that expose data quality issues requiring remediation before data can be migrated successfully to the target environment. While this exercise helps ensure clean data is migrated, it includes no mechanism to certify that the data will remain clean in the future. For this reason, we consider data governance a must for post-implementation success.

Starting data governance as part of a Workday implementation may cause concern if it is viewed as a constraint. We firmly believe data governance can work in parallel and should never block execution. Instead, outputs from the data migration and implementation activities complement data governance by serving as inputs and use cases. While data governance provides oversight, the data readiness, design, configuration, and decisioning aspects of a Workday implementation inform data governance with the accompanying data standards, processes, policies, and accountability structures that need to be formalized.

Figure 1: How a Workday implementation and data governance complement one another

Making the Case for Data Governance

Data governance is a fundamental data management discipline that provides immediate value in addressing the needs for data quality, data literacy, and data security and compliance. The simple goal of data governance is to establish clarity and trust in the data that drives the business. This is accomplished by formalizing working definitions, data standards, and policies that are enforced through an accountability and stewardship model. With a data governance capability established, organizations can effectively manage their data across the key functions of data management.

Figure 2: Data management functions supported by data governance

Our Data Governance Framework

The traditional application of data governance has been met with more failure than success. It is characterized as a centralized program that focuses on data protection, using command-and-control methods to dictate how data is accessed and used throughout the organization. This approach often failed because it resulted in bureaucratic processes that made data governance a bottleneck to producing any value. A key lesson learned is that data governance is not a one-size-fits-all approach and needs to be tailored to an organization's specific needs.

We take a modern approach to data governance by using our agile framework that is centered on the goals of promoting data usage by establishing trust while enforcing adherence to data security policies. With this framework we apply a use-case driven methodology to launch data governance and iteratively scale it to a desired state with an intent to provide immediate business value. The key pillars of the framework ensure that both culture and technology support the people, process, and methods to establish data governance.

Figure 3: Definian’s agile data governance framework

Using our framework, clients quickly realize that many aspects of data management are already being governed informally in operational silos or systems. Our framework recognizes the need to formalize those behaviors by establishing a data governance discipline. By raising awareness and fostering collaboration around the working standards, policies, and guidelines for using and managing data, data governance serves as a success-enablement factor for any business initiative that hinges on data.

Including Data Governance as Part of your ERP Implementation

Best Practices
Service Offering
After working on dozens of implementations, we find that organizations frequently overlook using the implementation as an opportunity to upgrade data governance to fuel long term data performance.

Implementing new business applications is a massive investment that requires spending significant sums defining business requirements, defining terms, configuring the solution, building RICEW objects, cleansing data, and transforming data. After working on dozens of implementations, we find that organizations frequently overlook using the implementation as an opportunity to upgrade data governance to fuel long-term data performance.

Upgrading data governance as part of the implementation doesn't need to be a big lift and can be accomplished in bite-sized chunks throughout the project timeline. If an organization doesn't consider data governance beyond the new application controls, the data will only ever be as good as the application's configuration. This myopic approach can lead to disaster. We're currently working with a Fortune 500 client who implemented a new HR solution without Definian's guidance. Their implementation lacked the governance to control the data within the new system. After go-live, it quickly became impossible to get a trustworthy report from the new system in a timely manner. After several months of untangling the data and working with the business to redefine what its requirements should be, they're finally closing in on trustworthy reporting.

The lack of governance applied to this HR solution hindered the company in two primary ways. The first is that they were unable to generate trustworthy reports for months. This challenge made it impossible for them to meet the basic HR reporting requirements for a large enterprise. Additionally, being focused on tactical data cleanup and definitions also prevented this department from being able to use the data to drive the organization forward.

The firm now has clean definitions for many of its attributes and is developing training materials to help ensure that divisions across the enterprise are using the attributes and the application consistently. The business requirements can be used to update the application and execute ongoing quality reporting. Most importantly, the governance they’re using will enable the focus to shift towards improving HR rather than trying to generate standard reports.

To avoid the pitfall this HR department encountered, the following activities are a good start to upgrading data governance during the implementation without impacting the timeline or licensing another piece of software. While the following sections may seem overwhelming at first blush, these activities can be spread out across the entire implementation and even after go-live. The important part is that they get scheduled while data is on everyone’s mind.

Kick off Data Governance - If you don't have an active data governance organization within your firm, set some time aside toward the beginning of the implementation to discuss data governance: what it is and how it can be used to improve everyone's work. If you do have active data governance, it's a great opportunity to talk through the project with the governance team and project teams together. Outline the program's goals, how governance can help with the implementation, and how it can continue to support the business after go-live. The important part of the discussion is to get people thinking about data governance.

Document the Gaps and Define the Best Practices - Identify current data governance strengths, weaknesses, gaps, and risks, and define future state data governance best practices. While working to identify the gaps and best practices, start to identify the key people who will be the driving force behind ongoing data governance. The exact roles vary from organization to organization, but some common roles are executive champion, data governance manager, data governance council, and data steward.

Define a Data Governance Charter - Similar to the program charter created as part of the ERP implementation, create a data governance charter to keep everyone on the same page about how data will be governed so that it enables the organization to meet its needs. The charter usually contains the following sections: problem statement, responsibilities, goals, benefits, scope, assumptions, dependencies, activities, deliverables, risks, and critical success factors.

Collect the Metadata - There will be many meetings discussing what an active customer is, what a line of business is, what an item type is, and how to classify an item for a given item type. Don't lose that knowledge in the opacity of the system. Make sure the decisions are documented, and create a library that contains the terms, requirements, and all the metadata you can. This will speed up future development and any future implementation of governance software.

Build out the Organizational Framework - Formalize the resourcing decisions made while documenting the current gaps to create the data governance organizational framework. Identify each of the important roles that will make up the data governance council and office and their responsibilities.

Assemble a Data Governance Handbook - If your organization doesn't already have a handbook that contains the organization's policies, procedures, roles, and responsibilities for data governance, define what should be included in your organization's data governance manual. This handbook is the go-to authority for all things data governance. It can be quite difficult to create, but it doesn't have to be done in one go. It is also a living document that is meant to evolve with the business over time. The handbook frequently contains the mission, guiding principles, organizational framework, roles, responsibilities, communication framework, metadata catalog, policies, standards, metrics, processes, tools, and resources. If something is data governance related, it or a reference to it should be found within the handbook.

Create a Communication Matrix - Effective communication is critical for a successful data governance program. Create a communication matrix that defines how data governance policies, procedures, issues, and updates are communicated. The one created by Robert Seiner as part of his Non-Invasive Data Governance approach is excellent. It is best to define the communication matrix during the initial set-up of the data governance organization, as it provides a clear communication structure and heads off the game of telephone before it starts.

Define the Data Governance Scorecard - Define the metrics and mechanisms that will let you know whether data governance is effective at the organization. These metrics align with Generally Accepted Information Principles (GAIP) and measure data infrastructure, security, quality, and financial goals. They can cover reducing the number of legacy databases, increasing employee skills, and making sure that costs are in line with the committed return on investment.

Build out the Roadmap - Chances are, at the time of go-live for the main implementation there will be many activities and sections that still need to be completed or updated within the governance handbook. Create a roadmap to keep improving and building data governance operations. This is a journey, not a project. Defining the next steps will continue to provide a vision for how the organization can better serve its constituents.

Establish a Data Governance Council Meeting Cadence - Getting the data governance council together is important to keep the data improvement momentum going. During the first meeting, show the group where the data has been, how far the organization has come, and where it is going. The format for the meeting depends on your organization's culture. The important pieces are that all data-related initiatives are discussed, and issues are identified, discussed, and solved.

While these deliverables may seem like a lot, if they are spread across an implementation or at least planned to be executed at the tail end of the implementation, they are manageable. Additionally, you will be able to get more out of your new system and recognize benefits over the long term. We see that organizations that have even a minimal data governance operation at the time of go-live have the mechanisms in place to measure what’s important, communicate issues, and quickly come up with resolutions.

If you have questions about data, data governance, or data migration, let’s spend 30 minutes together to go through your questions and approach on your project.

3 Keys to Drive Your Payroll Validation Success

Best Practices
Digital transformations are a major investment in your organization’s future. We’ve identified 3 keys needed to accomplish an enterprise payroll validation.

Digital transformations are a major investment in your organization’s future. However, the success of such an implementation can hinge on payroll validation. During this critical step, the project team has the important goal of ensuring that employee paychecks are being calculated correctly. By the time your new system is live, every single earning, deduction, and tax must be accurate.

Payroll testing is a time-consuming process, typically several weeks long. If more issues are encountered than expected and they are not addressed promptly, the entire project can be delayed.

Given the stakes, payroll validation must be focused and efficient. We've identified 3 keys needed to accomplish an enterprise payroll validation:

Understand that payroll issues cannot wait to be fixed.

Payroll validation is a critical-path item. Going live with incorrect payroll will lead to backlash and reduce confidence in leadership. This is exactly what happened to the City of Dallas in 2019, when they failed to capitalize on their payroll testing and went live with faulty configuration. As a result, first responders were not receiving their benefits, and they were not being paid properly. This culminated in the story making national news after the president of the Dallas Fire Fighters Association penned an open letter to city officials about the negative impact the issues had on crew morale. Fortunately, Definian was pulled in to resource the project, and we secured the benefits the first responders were entitled to. You can read more about this here.

This story should serve as a warning to all other organizations undergoing a similar transformation. Seldom do issues come up during payroll validation that can wait to be solved. Instead, while the project team is all hands on deck for payroll testing, they should ensure that issues are resolved as soon as possible after discovery to prevent incorrect paychecks from being sent out after go-live.

Establish and enforce expectations across your project team

When validating payroll, having the right people is key. The team tasked with this job should have intimate familiarity with how payroll is currently run, how it will be run in the new system, and the importance of the job they are doing. This knowledge empowers them to resolve issues more quickly so that testing can continue on schedule. Teammates across the project must also be highly organized so that, as defects are discovered, they are quickly logged in a central location that is reviewed at least daily, with clearly defined action items, owners, and deadlines. Accepted tolerances should be documented, helping the team stay focused on a clear goal.

This applies not only to the client side, but vendors as well. Functional consultants, data migration experts, and integration consultants all must be aware of the process so that issues can be addressed effectively, regardless of the root cause. Without involvement from appropriate subject matter experts, a single issue can easily derail testing and, by extension, the entire project timeline.

Use insightful metrics to drive project direction

Due to the high-stakes nature of payroll validation, project leaders must feel confident and secure in their decision to move to the next steps of the project. To do this, they must have access to insightful metrics.

For example, imagine a scenario where legacy and target system payroll deductions, summed across all employees, are equal to the exact cent. Lurking under the surface, however, is a healthcare deduction that is severely in excess and wage garnishments that are not being deducted at all. At a high level things may appear fine, but the reality is far more severe when examined at a detailed level. Alternatively, there may be known issues that introduce variances and skew the overall metrics, or the metrics for an individual division or agency. As a leader, it is important to ensure that the results are viewed holistically and through multiple lenses so that smart decisions can be made. Failing to do so can lead to false positives and unexpected issues after go-live.
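To make the aggregate-versus-detail distinction concrete, here is a minimal Python sketch of both views over the same deduction results. The records and figures are illustrative, not from a real payroll system: the totals reconcile to the cent while the detail comparison surfaces both problems.

    # Legacy and target deduction results keyed by (employee, deduction code).
    legacy = {("E001", "HEALTH"): 150.00, ("E001", "GARNISH"): 75.00}
    target = {("E001", "HEALTH"): 225.00, ("E001", "GARNISH"): 0.00}

    # Aggregate view: the totals match to the cent, so everything looks fine.
    print(sum(legacy.values()), sum(target.values()))  # 225.0 225.0

    # Detail view: per-employee, per-deduction variances expose the excess
    # healthcare deduction and the missing wage garnishment.
    for key in sorted(set(legacy) | set(target)):
        variance = target.get(key, 0.0) - legacy.get(key, 0.0)
        if abs(variance) > 0.005:  # tolerance of half a cent
            print(key, f"variance: {variance:+.2f}")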

Final thoughts

Keeping these points in mind will play a major part in the success of your digital transformation. While the process does come with risk, when done right, it helps ensure your organization has a seamless transition to your new system and maximizes the benefits reaped from it.

Remember that payroll is not merely data; it has a real impact on the day-to-day lives of your employees, and they rely on its accuracy. We all expect employees to be good stewards of organizational resources and time spent "on the clock." Likewise, project leadership needs to be the best possible steward of employee payroll.

Part of that stewardship means finding the right project partners. If you would like to learn more about best practices that we employ at Definian and see if our services fit your needs, contact us below.

Data Migration Services Explained

Best Practices
A comprehensive data migration approach covers three phases: preparing the target vision, executing the transformation, and driving long-term data performance.

A comprehensive data migration approach covers three phases: preparing the target vision, executing the transformation, and driving long-term data performance. These three phases enable a successful implementation through a combination of data governance, data management, and traditional data migration activities that align technical execution with business goals. Read the steps in the attached PDF to learn what's needed to maximize the effectiveness of your migration.

What's the Difference Between Data Validation and Testing?

Best Practices
Data validation and testing are both crucial when migrating data to any system. It’s important to understand the difference as they are frequently mistaken to mean the same thing.

Data validation and testing are both crucial when migrating data to any system.  It’s important to understand the difference as they are frequently mistaken to mean the same thing.  If validation and testing are not performed thoroughly or not performed in conjunction with one another, the ensuing issues can easily derail an implementation.

What is Data Validation?

Data validation is the process of checking the accuracy, quality, and integrity of the data that was migrated into the target system. When performing data validation, it is important to understand that not all of the data will look exactly as it did in the legacy system. The structure of the master data may have changed, and values may also differ between the legacy and target systems. Payment terms, for example, may be stored as 'Net 30' in the legacy system but change to 'N30' in the target system. Understanding what is changing, and how, is important when performing data validation, and the resources performing the validation should have that understanding before validation begins.
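As an illustration, the Python sketch below shows what crosswalk-aware validation can look like. The crosswalk and customer records are hypothetical; the point is that validation must compare against the expected target value rather than the raw legacy value.

    # Hypothetical legacy-to-target crosswalk for payment terms.
    PAY_TERM_CROSSWALK = {"Net 30": "N30", "Net 60": "N60", "Due on Receipt": "DOR"}

    legacy_terms = {"CUST-100": "Net 30", "CUST-200": "Net 60"}
    target_terms = {"CUST-100": "N30", "CUST-200": "N30"}  # CUST-200 migrated wrong

    for cust_id, legacy_value in legacy_terms.items():
        expected = PAY_TERM_CROSSWALK.get(legacy_value, legacy_value)
        actual = target_terms.get(cust_id)
        if actual != expected:
            # A naive legacy == target comparison would flag every record;
            # validating through the crosswalk flags only true mismatches.
            print(f"{cust_id}: expected {expected!r}, found {actual!r}")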

What is Testing?

Throughout the implementation of a new system there should be multiple opportunities to perform testing in the target system with the migrated data during planned test cycles. Preparing for an implementation is an iterative process. Issues will be identified in each test cycle, whether with the system configuration or the migrated data. The various teams then work together to resolve the issues identified in preparation for the next test cycle, with the goal of substantial improvement over the previous one. This does not mean, however, that there will not be any issues. Again, it is an iterative process, and resolving one issue can uncover additional issues that were hidden by the original. By the User Acceptance Testing (UAT) cycle, the major issues should have been identified and resolved, with the goal of minimal issues and changes post-UAT heading into go-live.

Testing is not data validation. Testing is much more involved than performing basic data validation. Since data can drive system functionality, it is important to perform data validation and testing in tandem. The data may look correct, but that does not mean the system is functioning as expected. Testing involves more advanced navigation of the target system along with executing typical business processes in the new system. The results of executing those processes should confirm whether the system and its functionality are working as expected and meet the requirements of the business. All areas of the business should execute end-to-end testing.

Validation and Testing for Success

After witnessing and being a part of many, many implementations, I can confidently say that a good indicator of a successful go-live is the amount of thorough data validation and testing performed throughout the project. Ensuring your project has a comprehensive and robust data validation and testing plan will put it on the path to success!

How a Go-Live Dress Rehearsal Ensures Cutover Success

Best Practices
The final test cycle before cutover deserves as much or even more attention than cutover itself. A successful dress rehearsal is an excellent indication that the entire project team can deliver a successful go-live.

Go-live cutover activities are the most important part of an Enterprise Resource Planning (ERP) implementation, right? Cutover is the final push of critical data into production, so surely it must matter the most? That was the opinion I held early in my career, and as a result, I felt that go-live week deserved more of my attention than any other week of the year. Today, however, I emphasize the final test cycle: the dress rehearsal for go-live.

The final test cycle before cutover deserves as much or even more attention than cutover itself.  Years of successful data migration projects and countless test cycles have influenced my opinion today.  As you continue your own implementation, I hope that you and your team elevate the importance of the final test cycle.  I invite you to reflect on my insights below as you lead your project team to the final phases of implementation.

Why do we treat it like a dress rehearsal?

Every successful implementation project has a schedule with numerous test cycles for configuration, data migration, validation, loading, and system & integration testing. The last test cycle, the one before the final flurry of go-live activities, is the dress rehearsal. Every action that needs to take place during go-live is replicated during the dress rehearsal. Timing and sequencing are key. Aligning resources and understanding expectations are critical to designing a go-live schedule that minimizes business disruption and maximizes the likelihood of smooth operations post-cutover. If a go-live cutover task is expected to happen at 3 AM on a Saturday, that same activity needs to have resources at the ready to execute the plan at 3 AM on dress rehearsal Saturday.

Most project teams put a lot of focus on the final push for go-live (and rightfully so!). From a project management and data migration perspective, I put just as much—if not more—emphasis on the dress rehearsal. As dress rehearsal approaches, the “big picture” finally starts to settle in. After months of test cycles and “more leeway” with timing (think of activities planned over 4 weeks instead of 2 weeks), all our work is locked into a very specific, high-visibility timeline. Dress rehearsal is when we finally learn if our setups, runtime documentation, data dependencies and workflows are truly synchronized.

What does this mean for Project Managers?

As a project manager, dress rehearsal is when my team truly understands the sense of urgency and pinpoint accuracy needed for go-live. The last test cycle is when teammates learn how to “flip a switch” and approach their tasks with focus and intentionality. As we prepare for specific data migration activities, timing is everything. We go over our checklists:  are configurations set up; have we received the necessary supplemental files; can we run ETL processes (and support pre-load validations in parallel!) in the allotted time?

My biggest worry is, “What did we collectively miss?” Dress rehearsal is the last time to correct the complex web of activities; my mindset during go-live cutover is that something unexpected will happen. A smooth and successful final test cycle builds confidence in the entire implementation process. When we iron out the wrinkles during dress rehearsal, we prepare ourselves for the inevitable curveball during go-live. Instead of fretting about the schedule and cutover plan, the team already knows that we have excelled under the exact same expectations; it allows us to focus on solutions instead of timelines.

How does this help us approach cutover with confidence?

Once a dress rehearsal is executed successfully, my mind is much more at ease entering go-live cutover. We are one united team, everyone with a clear understanding of expectations. We have the confidence knowing that if (when?) something unexpected happens during go-live, we will rise to the occasion and find solutions. A successful dress rehearsal is an excellent indication that the entire project team can deliver a successful go-live cutover when it matters most.

Data Governance Kick-Off Meeting

Best Practices
Kicking off your organization’s data governance implementation with an introductory meeting is an effective way to ensure stakeholder buy-in and set expectations for things to come.

Kicking off your organization’s data governance implementation with an introductory meeting is an effective way to ensure stakeholder buy-in and set expectations for things to come. With the proper focus, this meeting can quickly lay the groundwork for the journey ahead.  That said, avoid the temptation to cover everything during this meeting and instead limit it to the following sections:

  • Introduction
  • Primer on data governance  
  • Thoughts from the group
  • Establish meeting cadence
  • High level overview of plan + deliverables (time permitting)

Introduction:  

At the start of the meeting, take a few minutes to get each person's thoughts about what data governance is and the impact they anticipate it will have on the organization. People have different ideas about what data governance is. Some will think of it as a technical project, others a business process, others something that is contained within an application, and others will think it is only going to be a pain in the butt. The first step to getting people aligned is learning where everyone is starting out.

Primer on data governance:

Once we learn the team's initial perspective on data governance, it is time to put down a central definition and relate everyone's individual definitions to it. DAMA shares a concise arrow diagram that defines data governance and shows how it fits into the data management landscape.

This diagram is a fantastic tool for pulling everyone's definitions into something the entire group can coalesce around. From here, we can take the definition a half step deeper into data governance.

I have found it helpful to go through what data governance is and is not: its purpose is to enable business outcomes, not limit activities; it is a process, not a project; and it will require long-term commitment from the organization to be successful. It takes a bit of dialogue, but if a group begins to recognize that data governance is more about organizational change than just a thing to do, it will have been a successful meeting.

Depending on the group and discussion, this might be all there is time for, and it would be fine to begin closing by establishing next steps and a meeting cadence (more on this below) so that you can maintain momentum without getting stuck at the scheduling step later.

Thoughts from the group:

If there is time left and the group is in good spirits, take a few minutes to get additional thoughts from the group about what they are looking to get out of data governance at their organization. Capture these initial thoughts and tie them to the data governance implementation roadmap. Being able to tie back to these ideas later in the implementation will significantly boost engagement and help your team see the impact of their efforts.

Your team may be optimistic and share ideas around positive changes like better data quality and faster reporting. Call out and celebrate this! But your discussion will also include thoughts from those who are more resistant to change. Some folks might even share a negative outlook or express skepticism about the implementation having any impact on their day-to-day role. Even this "negative" idea is important. It will eventually be turned into a positive, as it sets up further discussions on how to measure the impact governance is having on the organization.

High level overview of plan + deliverables (time permitting):

As the meeting ends, take a few minutes to tie the group's thoughts to the next activity or two. Tying thoughts to activities helps build urgency around the positive results this work will generate for the organization and keeps people engaged to join the next discussion.

Establish a Meeting Time + Cadence:

Before the meeting is over, take 10 minutes to establish a suitable time and frequency for the group to meet. Schedule management can be the most difficult aspect of any project. Getting a commitment from group members to meet at a certain frequency while you have their attention is much more effective than doing the calendar hunt to find an open time later. Once the time is set, send the meeting invite immediately.

Share a meeting follow up:

Sending a meeting summary is always great practice, and sending one after the initial kickoff is of the utmost importance. The summary will reinforce the importance, impact, and learnings that came out of your collaborative meeting. The summary should include:

  • Materials that were covered
  • The team’s thoughts
  • Upcoming project schedule
  • Agreed upon meeting cadence
  • Issues resolved/outstanding
  • Next actions
  • Main topic for the next meeting - Identifying gaps/defining best practices

Data Holds Organizations Back

Best Practices
Service Offering
According to McKinsey, organizations spend 30% of their time on non-value-added tasks because of poor data management. Read on to explore why data impacts performance and how Definian can work with you on these issues.

Data Holds Organizations Back

Data is a critical asset for every organization, especially in the modern era. Despite its recognized criticality, many organizations fail to manage data properly as a value-driving asset. According to McKinsey, organizations spend 30% of their time on non-value-added tasks because of poor data management; compounding the issue, 70% of employees have access to data that they should not be able to see.

Definian focuses on the foundational aspects that strengthen data management at your organization. Our process is geared to render data compliant, clean, consistent, credible, and current. With these measures adequately addressed, you will be confident that your data is secure, increasing in value, and powering insights across the enterprise.

Why Data Impacts Performance

DEGRADATION

Depending on the entity, data degrades at a rate of 4-20% a year. Without active mitigation and monitoring, data will constantly decrease in usefulness.

QUALITY

An insufficient data quality strategy combined with growing data volumes means that inaccurate data is affecting your analytics and creating a manual data cleanup burden for your team.

INCOMPLETE METRICS

The principle that "what's not measured is not managed" applies to data. By not tracking important data-related metrics, organizations don't know the impact of ongoing data management gaps.

SECURITY

Without role-based security policies that meet regulatory, contractual, and ethical requirements, organizations put themselves at significant risk.

COMPLEXITY

As data velocity accelerates and unstructured data is continuously entered into the system, data lakes start to turn into data swamps.

LACK OF STANDARDS

Data entered and maintained by different people, departments, and applications can slow operations and pollute analytics.

SILOS

Data silos that persist between departments slow business operations, increase integrity issues and prevent the data access that is needed.

LINEAGE

When data moves across the landscape, its requirements change. The smallest error or update can have significant ripple effects.

CULTURE

An organization that is not prepared for technological change will be ill-equipped to manage its data effectively in the future.

Modernizing Data Landscapes Since 1985

Prepare Your Organization
  • Assess your current data landscape, processes, and governance policies
  • Develop a tailored data improvement roadmap
  • Help prepare for the organizational change to spur adoption of the new policies and processes
Execute Improvement Roadmap
  • Ensure the data vision aligns with business objectives
  • Facilitate organizational change
  • Implement technical components for data security, quality, interoperability, etc.
Secure Data Across the Organization
  • Instill confidence that the team has access to quality data when they need it
  • Assure that your data are compliant with regulatory, contractual, and ethical obligations
  • Promote a culture that uses data to improve operations, increase sales, and reach better decisions across the enterprise

Mastering the Item Master: A Tour de Force Wrangling Unconventional Data

Dynamics
Best Practices
As part of a global rollout of Dynamics 365, the Item Master for a particular cold finishing mill required a dramatic redesign.

A dramatic redesign of the Item Master proposed

Global supply chains have changed dramatically over the last few decades, and constant innovation is required to keep the American steel industry competitive. Strategic investments in Enterprise Resource Planning (ERP) software are one way that our client, a Fortune 500 steel manufacturer, is extracting more value from the same raw materials. As part of a global rollout of Dynamics 365, the Item Master for a particular cold finishing mill required a dramatic redesign.

One of the main objectives of this initiative was to build a true item master across the entire line of business to improve their reporting.  This was defined as a collection of 21 individual attributes including shape, size, chemical composition, length, and others.  Any distinct combination of values for these 21 attributes represented a unique item.

One piece of the puzzle

This cold finishing mill represents only one of the dozens of divisions implementing Dynamics 365. This steel company is vertically integrated: steel cast at one mill may ultimately find its way into this cold finishing mill, where it can be turned, ground, rolled, chamfered, or annealed. A goal for this global implementation is to integrate the Enterprise Resource Planning (ERP) systems across the divisions to streamline steel production and finishing. As a result, implementing Dynamics 365 for this cold finishing mill required meeting the requirements and matching the conventions set by divisions already live in Dynamics 365. This mill was just one piece of the puzzle.

The sites that had already successfully migrated to D365 were all 'made to stock' and migrated off a common legacy system, which made the process efficient and repeatable. This division, however, practiced a 'made to order' business strategy, meaning all material was made to specifications defined by the customer. The functional business requirements were therefore going to differ vastly from those of the already-live divisions and needed to be fluid enough to accommodate a wider range of customer demands.

Overcoming documentation gaps

Decades ago, the mill purchased mainframe servers and leveraged in-house knowledge to build their supply chain management system from scratch using COBOL.  The truly one-of-a-kind solution that ran this mill helped them maintain operational excellence and profitability through turbulent years for the American steel industry.  Today, however, the hardware is at risk of failing, the software cannot scale, and the talented resources that built the legacy system are nearing retirement.  Worse still, the documentation for this complex system is sometimes buried between lines of COBOL code, relegated to tribal knowledge, or is missing entirely.

The robust Dynamics operations solution requires more detailed data on the production of the finished products, namely formulas and routes. However, this data did not exist in a structured fashion in the legacy system, which required a much deeper dive by Definian and the technical team to understand how the data could be extracted and migrated to the new system.

Reverse engineering:  How we used transactional data to define items

The unique operating model of this steel mill meant that there was no true item master in the legacy system.  Deriving an item master and having it integrate seamlessly with the other divisions became a colossal task in data analysis, cleansing, and conversion.

The process began with a review of all required transactional data. Analysis of nearly one hundred thousand sales order lines allowed Definian to build the finished goods, hundreds of thousands of production scheduling lines to create the WIP parts, and years of purchase history to derive the raw materials.
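The heart of that derivation can be illustrated with a short Python sketch: collapse the transactional lines into distinct combinations of defining attributes, and each combination becomes an item. Only four of the 21 attributes are shown, and the values and item-numbering scheme are illustrative.

    # Each distinct combination of these attributes defines one item.
    ATTRIBUTES = ("shape", "size", "chemistry", "length")

    sales_order_lines = [
        {"shape": "round", "size": "1.25", "chemistry": "1045", "length": "12"},
        {"shape": "round", "size": "1.25", "chemistry": "1045", "length": "12"},
        {"shape": "hex", "size": "0.75", "chemistry": "12L14", "length": "10"},
    ]

    distinct_items = {tuple(line[a] for a in ATTRIBUTES) for line in sales_order_lines}

    # Assign an identifier to each derived item (numbering scheme illustrative).
    item_master = [
        dict(zip(ATTRIBUTES, combo), item_id=f"ITEM-{i:05d}")
        for i, combo in enumerate(sorted(distinct_items), start=1)
    ]
    for item in item_master:
        print(item)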

However, the redesign did not end with the Item Master. With the item master now complete, the formula and route data was also needed. The team faced similar challenges here, as true master data did not exist in the legacy system for formulas and routes. Read more about how we solved these challenges here. Adding to this complexity were customer specifications for highly regulated industries and critical applications, which we discuss further here.

Success Today and for Years to Come

Since go-live, the client is able to identify the specific products they buy and sell while still delivering the same experience customers have been accustomed to for years. In addition, the facility now has a true accounting of its production processes, something that was previously confined to the minds of a few planners. And finally, with such a drastic modernization of their technology, the client location is finally able to report at an enterprise level rather than in the silo it operated in for decades.

If you have questions about data management, data quality, or data migration, click the chat bubble below. I'd like to learn about them and work with you to figure out solutions.

7 Barriers to Successful Data Migration

Best Practices
Successful completion of a data transformation and ERP implementation will provide huge advantages to your organization. But before you can realize these benefits, your organization must rise to the challenges ahead.

Successful completion of a data transformation and ERP (Enterprise Resource Planning) implementation will provide huge advantages to your organization. These include optimized business processes, better reporting capabilities, robust security, and more. But before you can realize these benefits, your organization must rise to the challenges ahead.

Below, we share 7 noteworthy obstacles to successful data migration, culled from our decades of experience converting data to Workday, Oracle Cloud ERP, Oracle EBS, Infor, MS Dynamics, and other technologies.

1. Difficult Legacy Data Extraction

Logically, one of the first steps in data migration is determining which legacy data sources need to be brought into the new system. Once this scope is determined, it is crucial to decide how that information will be extracted. While extraction may be straightforward for some data sources, it can be onerous for others, including those hosted by third-party vendors or data stored in older mainframe servers.

To facilitate legacy data extraction, ask yourself and your team the following questions:

  • How well do we know our existing data system?
  • Which members of your organization have intimate knowledge of your system’s nuances and current usage?  

Definian utilizes our central data repository, Applaud®, to house data from each of your distinct legacy data sources. Additionally, we can connect directly to your systems via ODBC, seamlessly translate EBCDIC data, build utilities to pull data from many cloud data sources, and quickly incorporate new data sources as they emerge from your shadow IT (Information Technology) infrastructure.
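As a small illustration of one of these hurdles (and not a depiction of how Applaud® works internally), mainframe data often arrives EBCDIC-encoded. Python's standard cp037 codec handles the basic character translation; real extracts also involve fixed-width layouts and packed-decimal fields that need dedicated handling.

    # "HELLO" encoded in EBCDIC code page 037, as it might arrive from a mainframe.
    ebcdic_bytes = bytes([0xC8, 0xC5, 0xD3, 0xD3, 0xD6])
    print(ebcdic_bytes.decode("cp037"))  # -> HELLO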

2. Decision Making and Mapping Specification Challenges

To make your company's existing data compatible with your new ERP solution, hundreds of tables and fields will need to be mapped and transformed from your "legacy" system to your "target" system. It is important to consider what data you want to appear in your target system. Do you need to bring over certain historical data? Do you want terminated employees or closed purchase orders to exist on your new platform? These are examples of questions that you should ask (and answer) during requirements gathering. These decisions require input from both the technical team and subject matter experts (SMEs) to ensure overall accuracy in conversion and functionality. Note that these specifications can evolve throughout test cycles or as new requirements are understood and progressively elaborated.

Additionally, legacy-to-target crosswalks are necessary to map existing values into their accepted equivalents in the new system. This can span everything from simple translations for country codes and pay terms to an entire revamp of your company's Chart of Accounts. These crosswalks also change as the project progresses and the final solution begins to take form.

Definian helps facilitate decision-making with decades of data migration experience. We document field-by-field level transformations in our Mapping Specification documents; these mappings are archived, updated, and saved as requirements change throughout the project.

3. Complexity of Selection and Transformation Logic

Complex conversion logic is often necessary to get the desired outcome in the new data system. For example, selection logic between Suppliers, Purchase Orders, and Accounts Payable Invoices is closely interrelated and may need to address outliers like prospective suppliers, invoices on hold, and unreconciled checks. Failure to handle all data situations could lead to open transactions being orphaned in the legacy system and difficulties passing audits. Additionally, fields required in the target system might not exist in the legacy system and could require a complex derivation from several related fields. To document the mapping specification effectively, your organization's technical and functional teams must both provide insight into what is needed from each converted record.
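A minimal Python sketch of this kind of interrelated selection logic, using hypothetical statuses and field names: the inactive supplier must still be selected because open downstream transactions reference it.

    suppliers = [{"id": "S1", "status": "ACTIVE"}, {"id": "S2", "status": "INACTIVE"}]
    purchase_orders = [{"po": "P100", "supplier": "S2", "status": "OPEN"}]
    ap_invoices = [{"inv": "I900", "supplier": "S2", "status": "ON_HOLD"}]

    # Suppliers referenced by open POs or open/on-hold invoices must convert,
    # even if inactive; dropping S2 would orphan P100 and I900 in legacy.
    referenced = {po["supplier"] for po in purchase_orders if po["status"] == "OPEN"} | {
        inv["supplier"] for inv in ap_invoices if inv["status"] in ("OPEN", "ON_HOLD")
    }
    selected = [s for s in suppliers if s["status"] == "ACTIVE" or s["id"] in referenced]
    print([s["id"] for s in selected])  # -> ['S1', 'S2']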

Definian’s Applaud® software is tried and true for effective coding of data transformation logic. We systematically process data on a record-by-record basis using predetermined requirements and mappings. The output data is in a ready-to-load format compatible with your new data solution.  

4. Communication Gaps

Communication between your organization’s functional and technical resources, the data conversion team, and the System Integrator can be challenging. It is especially important for data migration projects to ensure that all team members are aligned for the highest level of data accuracy in the new system. Miscommunications across the project team can lead to data errors and project delays. It is crucial that all team members know the current status of each conversion within the data conversion domain. Cross-functional issues related to mapping decisions, acceptable values for fields, and functionality within the new data system arise all the time. Communication within cross-functional meetings is constant and is necessary to ensure that each component of your data conversion team is on the same page.

Definian stays on top of all data conversion tasks to ensure that each party knows its current status and pending decisions. We actively track each data conversion object to ensure that all necessary meetings are scheduled, and follow-up items are addressed. No matter the medium, we pursue issues until we achieve resolution and document pertinent decisions made in the process.

5. Difficulty Validating Converted Data

After following the conversion requirements to prepare data, there needs to be a way to validate that the data has been handled accurately (according to the requirements) and is fit for its purpose (fills business needs). This process can be obstructed by limited availability of resources across your organization as well as limited knowledge of what to look for and approve. It can also be difficult to compare legacy source data with corresponding converted data because selection logic and mapping logic can be difficult to fully replicate.

Definian produces error reports for each conversion detailing missing or unmapped fields, unacceptable values, and more. We also produce validation reports that use a 3-way match to compare each record between the legacy data, target table, and extract provided by the System Integrator. These files aid our clients in identifying what fields need to be cleaned up or modified to have a successful load.
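In simplified form, the 3-way match could look like the Python sketch below. The records and field are hypothetical, and the real reports compare every mapped field rather than a single one.

    legacy = {"CUST-100": {"name": "ACME CORP"}}
    target = {"CUST-100": {"name": "ACME CORP"}}
    si_extract = {"CUST-100": {"name": "Acme Corp."}}  # System Integrator extract

    # Flag any key where the three sources disagree on the compared field.
    for key in sorted(set(legacy) | set(target) | set(si_extract)):
        values = {
            "legacy": legacy.get(key, {}).get("name"),
            "target": target.get(key, {}).get("name"),
            "extract": si_extract.get(key, {}).get("name"),
        }
        if len(set(values.values())) > 1:
            print(key, values)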

6. Limited Knowledge of Target System

Clients usually do not have advance knowledge of, or experience working with, the data system they are trying to implement. Your organization may be unfamiliar with the data migration process as well as certain nuances of your new data solution. Each data system has its own requirements for a successful data load, and each has its own front-end functionality. Both factors need to be understood by the project team to map, test, and validate the converted data.

Definian utilizes our system-by-system validations built over years of successful project implementations. Our consultants possess years of experience with a given target system, guiding our clients toward understanding the nuances of their new data solution and helping them achieve their overall project objective.

7. Project Delays

Many components of your project can contribute to delays, including a lack of available resources, difficulties understanding data requirements, or an unreasonable, compressed timeline. Once delays occur, they are often costly for your organization and can disrupt the overall rollout of your new data solution. Some implementations never fully recover from delays and the associated additional costs, leaving organizations ill-prepared to move to their new data system (if they move at all).

Definian is a Client First organization. Our skilled consultants go above and beyond to make sure our data migration tasks are completed on time and with great accuracy.

Celebrating Success

If you can overcome these obstacles (and potentially others), you will be on your way to a successful data migration project. You and your team will expend effort, time, and resources throughout the implementation process, so it is important to reap the benefits of your new data system after go-live! Make sure you celebrate your success and remember the challenges you had to overcome to get here. Congratulations!

Behind the Scenes: 7 Things We Do Before a Successful Test Cycle

Case Study
Best Practices
Making a first data conversion test cycle a success requires good planning and proactive work. Here, we take you behind the scenes of 7 key steps that we do at Definian to help make your test cycle a success.

Making a first data conversion test cycle a success requires good planning and proactive work. Here, we take you behind the scenes of 7 key steps that we do at Definian to help make your test cycle a success.

1. Identify similarities

Serving as a strategic partner for clients with multi-year, multi-division ERP rollouts often means we encounter similar requirements across different parts of the business. To minimize cost and accelerate development, we review existing documents and conversion code and leverage as much prior art as possible when building something new for the client. It helps that our main competitive differentiator, Applaud®, is a rapid application development environment that enables our team to centralize development and quickly identify and reuse relevant code.

Business Challenge: Definian converted Customer Master data for multiple sites of a single client over several years. Each site's Customer Master came from the same legacy system, and most of the conversion requirements and programming were reusable from site to site. But a new set of sites came into scope that used a totally different Customer Master from the earlier sites. Drivers related to the load application were updated at the same time that these complex new sites came into scope, and it turned out that these updates broke the previous process of loading data to the target database. This failure required a new load application to be used. The client was faced with the challenge of maintaining their aggressive go-live schedule while addressing the complexities arising from the newly introduced legacy system and the technical changes to the load process. Definian's ability to test load processes ahead of time prevented unexpected issues from emerging during the first test cycle for the new sites.

2. Generate “pre-load” conversion error reports

Another competitive differentiator for Definian is our repository of "critical-to-quality" (CTQ) checks that identify and report data issues. Our CTQ tool validates that data is structurally valid and that it fits configuration. For a basic example, this validates that customer site-level information appropriately ties to the customer header and that the pay term for the customer is configured. These CTQ checks identify data issues before we even push data to the target system, and they are carried forward from project to project to decrease development time and improve results for our clients. By doing this, we can quickly identify numerous issues that need to be addressed. By contrast, standard load programs typically identify only one issue at a time and must be rerun repeatedly after each successive issue is solved.
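The flavor of these checks can be illustrated with a short Python sketch. The check logic and configuration values are illustrative, not Applaud®'s actual implementation; the point is that every issue is reported in a single pass instead of one load failure at a time.

    CONFIGURED_PAY_TERMS = {"N30", "N60"}  # pay terms configured in the target
    customer_headers = {"CUST-100"}
    customer_sites = [
        {"site": "SITE-1", "customer": "CUST-100", "pay_term": "N30"},
        {"site": "SITE-2", "customer": "CUST-999", "pay_term": "NET30"},
    ]

    issues = []
    for site in customer_sites:
        # Structural check: the site must tie to an existing customer header.
        if site["customer"] not in customer_headers:
            issues.append((site["site"], "orphan site: no matching customer header"))
        # Configuration check: the pay term must exist in the target configuration.
        if site["pay_term"] not in CONFIGURED_PAY_TERMS:
            issues.append((site["site"], f"unconfigured pay term {site['pay_term']!r}"))

    for site_id, message in issues:  # all issues surface in one pass
        print(site_id, "-", message)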

3. Spearhead data cleanup activities

To maximize the success of your test cycle, you first need to improve the quality of the data. The legacy system is rife with issues, and this is often one of the reasons to implement a new ERP solution. Definian's best-practices approach to data migration integrates data profiling, analysis, and cleansing with the data conversion itself. Many data cleanup activities can be fully automated; these are carried out in the conversion code and appear in the target system. Automated cleansing could include translating atypical code values to the desired target value or merging duplicate customer or supplier records. Other cleanup activities can be reported before and during the test cycle for manual cleanup in the legacy system, often driving business value. Examples of these manual cleanup activities include closing old orders or invoices that have been left in open status or filling in required fields.
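For a taste of what automated cleansing can look like, the Python sketch below merges duplicate supplier records by normalized name. The normalization rules are illustrative and would be tuned to each client's data.

    import re

    suppliers = [
        {"id": "S1", "name": "Acme Corp."},
        {"id": "S2", "name": "ACME CORP"},  # duplicate of S1 after normalization
    ]

    def normalize(name: str) -> str:
        # Upper-case, strip punctuation, and collapse whitespace so trivially
        # different spellings of the same supplier compare equal.
        return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", name)).strip().upper()

    survivors, merges = {}, []
    for s in suppliers:
        key = normalize(s["name"])
        if key in survivors:
            merges.append((s["id"], survivors[key]["id"]))  # merge into the survivor
        else:
            survivors[key] = s

    print([s["id"] for s in survivors.values()])  # -> ['S1']
    print(merges)                                 # -> [('S2', 'S1')]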

4. Mockup records for test loads before development is complete

While conversions are still being developed and legacy data is being cleaned up, another key activity is to mock up sample data to test the load process. This step reveals whether there are additional dependencies or configurations required that have not been previously identified. Dependencies are critical to an implementation because any delay to master data (or any object that has downstream dependencies) will lead to delays in later conversions. For instance, Sales Orders cannot be loaded until Customers are successfully loaded.

Business Challenge: Load requirements are not always clear. In one example, a client used alphanumeric identifiers for their Accounts Receivable (AR) transactions. The alpha characters were an important feature of their legacy identifier because they classified the transactions and drove business processes. The division considered the letter a required attribute. The target transaction identifier was also defined as a character field, so it seemed there would be no issue with retaining the original alphanumeric values. However, Definian's preemptive load tests discovered that the load program pushed the transaction identifier into a second field, a document number, which was defined as numeric. This meant that the document "numbers" with letters would fail to load. Testing the load program before going into the test cycle gave us the opportunity to address this issue and figure out a solution for the client. Had the load program not been tested until the integration test cycle, AR would not have been ready for the integration test.
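A minimal Python sketch of the kind of preemptive check that catches this class of issue, using hypothetical records: flag any transaction identifier that would fail a numeric-only field before the load is ever attempted.

    ar_transactions = [
        {"trx_id": "AB12345"},  # alpha prefix classifies the transaction
        {"trx_id": "0067890"},
    ]

    # The load program also writes trx_id into a numeric document-number field,
    # so any non-numeric identifier would fail at load time.
    will_fail = [t["trx_id"] for t in ar_transactions if not t["trx_id"].isdigit()]
    print(will_fail)  # -> ['AB12345'], caught before the test cycle rather than during it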

5. Report and escalate issues and new questions arising from test loads

Documenting findings, noting configuration gaps, and identifying necessary scope changes are all part of the progressive elaboration of a project. This part of the process will vary based on what tools are available, but no matter the method, it is important to track issues and decisions to help with successive test runs. Inevitably, questions will arise later in the project as to why something is being done the way it is, and a test management tool (e.g., Jira, HPQC, DevOps) will have the answer.

6.  Facilitate client review of data before it is loaded

When it comes to full test cycles, it is critical for the business team to review the data to make sure that what is being loaded matches expectations. Inadequate testing is one of the primary sources of project failure, which is why we serve as our client's partner throughout their testing and review. Definian speeds this process by providing a series of reports that help connect the dots between the legacy and the target system. This usually includes an extract of the data being loaded in Excel format with key legacy fields and descriptions for key coded fields, a record count breakdown, and error reporting. Ultimately, this approach demonstrates that converted data fits the requirements, reconciles back to the legacy data source, and satisfies auditors.

7. Optimize runtime performance

Even in the earliest phases of the project, our team is already thinking about cutover and go-live. To accommodate aggressive schedules and minimize business disruption, we optimize the conversion process to run as quickly and seamlessly as possible. Often, this requires that we parallelize high-volume portions of the data extraction from the legacy system and the data load to the target system. Sometimes system configuration impacts load throughput, or fine-tuning of batch sizes can save precious hours from the cutover schedule. Whatever it takes, we understand the importance of a timely go-live and strive to keep business shutdown as brief as we can safely accommodate.
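As a generic illustration (not a depiction of our actual tooling), the Python sketch below parallelizes a high-volume load by splitting records into batches and loading them concurrently. load_batch is a hypothetical stand-in for whatever load mechanism the target system provides, and the batch size is exactly the kind of knob worth tuning.

    from concurrent.futures import ThreadPoolExecutor

    def load_batch(batch):
        # Placeholder for the real load call (API, loader program, etc.).
        return len(batch)

    records = list(range(100_000))
    BATCH_SIZE = 5_000  # tuning this can save precious hours during cutover

    batches = [records[i:i + BATCH_SIZE] for i in range(0, len(records), BATCH_SIZE)]
    with ThreadPoolExecutor(max_workers=8) as pool:
        loaded = sum(pool.map(load_batch, batches))
    print(f"loaded {loaded} records")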

Enter your test cycle with confidence and focus

By carrying out the preparatory steps listed above, Definian minimizes data risk from your transformation project. This allows the business to focus on learning and optimizing the new ERP system instead of chasing data issues. Fewer distractions for the wider project team means more time can be spent on functionality, configuration, training, and the future of your enterprise.

How Inaccurate UOM Data Can Break Your Supply Chain

Best Practices
Units of Measure can make or break a supply chain solution. Read on to understand unit of measure, UOM conversions, and issues to avoid.

Unit of Measure (UOM) is an attribute that describes how an item is counted. When we consider sundry items like a gallon of milk, a pair of gloves, or a case of beer, the concept of UOM is deceptively simple. However, for manufacturers, hospitals, and retailers, Units of Measure can make or break the supply chain solution. Inconsistencies in UOM data across functional areas, or gaps in how units are converted, can create data exceptions in procurement, inventory, and other operations. This can lead to incorrect inventory counts, delayed orders, and storeroom confusion.

Why would a single item have multiple Units of Measure Anyway?

An individual item can often be represented using different units of measure based on the context. Medical exam gloves provide an intuitive example: they are procured in bulk by the case, stocked by the box, and used by the pair. A single item can have multiple units of measure because employees in diverse job functions, like procurement, storeroom, or nursing, interact with the same item in distinct ways.

Intraclass UOM Conversion explained

Converting across units of measure is a necessary part of supply chain management. Continuing the example with gloves, let us say the storeroom managers in the hospital calculate that they need 12 more boxes of gloves. Since gloves are only sold in cases, a conversion from boxes to cases is necessary: 1 case of gloves equals 10 boxes of gloves. Consequently, 2 cases of gloves are needed to fill the need for 12 boxes.

Converting between cases and boxes is an example of an intraclass UOM relationship. The intraclass moniker means that the units of measure being considered are of the same class; in this case, the class is quantity.

Interclass UOM Conversion explained

Additionally, some items have an interclass UOM relationship to convert between different classes of measure, like from quantity to volume. For example, consider vials of medicine. For certain types of medicine, the stocking unit of measure is 'vial', which makes perfect sense considering how vials of medicine are individually stored in trays and racks. For clinical purposes, however, the unit of measure is often the milliliter. For the system to work with both units, an interclass UOM relationship needs to be set up to determine how many milliliters equate to each vial. This ratio differs from item to item, so a unique interclass UOM relationship is set up for each item that requires it.
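To make both conversion types concrete, here is a minimal Python sketch built on the examples above. The conversion tables are illustrative; in a real system they are maintained as master data, with interclass ratios defined item by item.

    import math

    # Intraclass (quantity -> quantity): 1 case of gloves = 10 boxes.
    INTRACLASS = {("GLOVES", "CASE", "BOX"): 10}

    def boxes_to_cases(item, boxes):
        per_case = INTRACLASS[(item, "CASE", "BOX")]
        # Round up: 12 boxes needed -> 2 cases ordered, since gloves only
        # sell by the case.
        return math.ceil(boxes / per_case)

    print(boxes_to_cases("GLOVES", 12))  # -> 2

    # Interclass (quantity -> volume): the ratio is defined item by item.
    INTERCLASS_ML_PER_VIAL = {"MED-A": 5.0, "MED-B": 12.5}

    def vials_to_ml(item, vials):
        return vials * INTERCLASS_ML_PER_VIAL[item]

    print(vials_to_ml("MED-A", 3))  # -> 15.0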

Criticality of accurate UOM data

It is impossible to order, receive, or stock items in a unit of measure that has not been assigned to the item. For example, if a purchase order is for a 'pack' of masks, but the item is only set up in the system with 'each', then the purchase order will fail or go on hold. Inventory and procurement data is often fed to boundary systems (Epic, Ivalua, WaveMark, and others) that manage inventory or contract information. The unit of measure used needs to remain consistent across these interfaces or have clearly defined rules for how the units are converted.

Inaccurate UOM relationship data can cause supply chain snafus. Consider what might happen if the 'case' you order contains 10 'boxes' but you expected it to contain 100! Inaccuracies like this can throw off operations and require expensive rush deliveries to resolve. Whether your business is implementing a new supply chain software solution or a master data management strategy, having an approach to identify and solve UOM data issues is critical to your success.
