Feed a Reporting Data Warehouse using NetSuite Driven Cross Reference Maps


This article is relevant if you are migrating to the NetSuite ERP system and you have an existing reporting data warehouse that you seek to preserve.

Background

In my June 2022 article, Best Practices for Modeling the General Ledger in NetSuite, I outline the fundamentals for building a powerful general ledger structure. In that article, I describe the situation: before working with us, our client had built a complex data warehouse to produce financial and operational reporting based on its legacy (Sage-based) general ledger system.

The data warehouse acted as the information repository for a reporting tool (BI360 from Solver) used to produce the reports for the business community. The business reports effectively hid the complexity of the general ledger structures. I often see data warehouse projects conducted because the fundamental information modeling was not performed in the source transaction systems. This ineffective transaction modeling situation is understandable: immediate business concerns must be addressed without the time or resources to stand back and build a coherent information model from the start.

The article makes a case for powerful general ledger modeling to gain the promised value of a new NetSuite ERP implementation. I often argue that proper up-front information modeling can help us avoid (or postpone) the need for a data warehouse. With an existing data warehouse system, however, the business community already has a specific point of view. Even with all the fundamental upfront modeling, we still face the challenge of bringing along the many uninitiated organizational managers and their established ways of thinking about the business. If the business community is large, the effort to bring all the users along in the modeling exercise becomes prohibitively expensive in terms of time and energy.

Thus, when producing a new general ledger implementation, it is not uncommon to preserve the existing data warehouse and reporting to allow a post-ERP go-live transition period for greater business community education and adoption.

Preserving the Existing Reporting Data Warehouse

When we model a new NetSuite ERP using best practices, we change how we capture transaction information and use general ledger accounts and related dimensions in a new and powerful manner. Even though the information is modeled better, the change in structure means that all the existing data warehouse-driven reports will no longer work. If all the data warehouse reports are going to break, we have a concern: how will we rebuild all the reports to satisfy the overall business community's information requirements?

In theory, we can rebuild all the reports against the new information model. This may be ideal, but it imposes the cost of a) building and distributing the respective reports; and b) educating the business community on their interpretation.

Another approach is to preserve the existing data warehouse so that the existing reports do not break. This approach requires us to build a map from the new general ledger structure to the old general ledger structure. An effort analysis should be done to consider which approach leads to a better return on investment.

Mapping New Model to Old Model to Preserve the Data Warehouse

Assuming the option to preserve the data warehouse is better than abandoning it, an investment will be required to build an (E)xtraction, (T)ransformation and (L)oading (ETL) infrastructure. All data warehouse implementations address ETL concerns: to craft such a reporting and analytical database, we need to get information from source business systems into a coherent format so that systems analysts can drive the target data warehouse with the respective reports and views consumed by end-users.

Thus, the most natural place to produce the ETL effort is in the same tool currently fed with legacy transactions. As such, the existing data import team (if they are still available for such an endeavor) will be required to learn the new NetSuite General Ledger structure and then maintain a map and a set of algorithms to transform the data into a structure that (ideally) does not impact existing reports.

Yet, in our client’s situation, the only people who understood the old legacy account structure and went on the journey to model a new general ledger with dimensions were the accounting team. The modeling demanded of the accounting team was substantial, and it was decided, early in the effort, not to bring the client’s IT team along on the journey due to their limited attention capacity and limited understanding of the existing business operations.

Build a Mapping Tier in NetSuite

As we contemplated the situation with the client, it became clear that our team and the client’s accounting leadership were the only ones who could mentally hold the new general ledger structure. We thus collectively decided to build a mapping tier in NetSuite. Here is how we did it.

The old (Sage-based) account structure was based on a common (yet incoherent) segmented general ledger account format as follows:

XXXXX-YYY-ZZZ

where:

X = Account
Y = Division
Z = Profit Center

After a significant amount of effort, our new General Ledger had the following rich and coherent structure well beyond the three-segment legacy structure:

  1. Subsidiary
  2. Legal Entity
  3. Location
  4. GL Account
  5. Product & Service
  6. Division
  7. Department
  8. Sales Rep

As we analyzed the legacy account structure carefully, it became clear that only the accountants on the client team, the ones who coded all legacy transactions, understood the old incoherent world. We call these situations institutional knowledge: the understanding is undocumented and lives in the narratives of individuals who have worked in the organization over long periods of time.

Thus two crucial considerations came forth in the mapping exercise:

  1. Profit Center Map: the old division/profit center values could be determined by understanding how we code the new subsidiary, location, department and division segments in our new ERP system.
  2. General Ledger Map: the old general ledger account (the XXXXX-YYY portion) could be determined by looking up the new general ledger account along with the new sales rep and the calculated new Profit Center map (previous item).

Furthermore, different rules applied depending on whether the general ledger account targeted the balance sheet or the income statement.

This map is considerably complex. The goal was to build a set of cross-reference tables that allow us to calculate the old values from the new transaction values.
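To illustrate the concept (not the client's actual tables), the sketch below shows the two cross-reference lookups as simple keyed maps: the Profit Center map keyed by the new subsidiary, location, department and division combination, and the General Ledger map keyed by the new account, sales rep, and the previously derived profit center, with a separate branch for balance sheet versus income statement accounts. All names, values, and key layouts here are hypothetical.

```typescript
// Conceptual sketch only; key layouts, names, and values are hypothetical.

// 1. Profit Center Map: new dimensions -> legacy ZZZ value
type ProfitCenterKey = string; // `${subsidiary}|${location}|${department}|${division}`
const profitCenterMap = new Map<ProfitCenterKey, string>([
  ["US-Sub|Chicago|Operations|Retail", "110"],
  ["US-Sub|Chicago|Operations|Wholesale", "120"],
]);

// 2. General Ledger Map: new account + sales rep + derived profit center -> legacy XXXXX-YYY
type GlKey = string; // `${newAccount}|${salesRep}|${profitCenter}`
const incomeStatementGlMap = new Map<GlKey, string>([
  ["4010 Product Revenue|Smith|110", "40100-010"],
]);
const balanceSheetGlMap = new Map<GlKey, string>([
  ["1200 Accounts Receivable|*|110", "12000-000"], // e.g., balance sheet rules ignore sales rep
]);

function lookupLegacyProfitCenter(sub: string, loc: string, dept: string, div: string): string | null {
  return profitCenterMap.get([sub, loc, dept, div].join("|")) ?? null;
}

function lookupLegacyGlAccount(isBalanceSheet: boolean, account: string, salesRep: string, pc: string): string | null {
  const map = isBalanceSheet ? balanceSheetGlMap : incomeStatementGlMap;
  const rep = isBalanceSheet ? "*" : salesRep; // different rules per statement type (illustrative)
  return map.get([account, rep, pc].join("|")) ?? null;
}
```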

Cross Reference Lookup Values as Custom NetSuite Dimensions

Given NetSuite’s capacity for crafting new dimensions (called Custom Segments), we extended the new general ledger structure to include two additional segments. We called the legacy database system “Solvr” and thus created the two custom segments (dimensions) below:

  1. Solvr Profit Center: these are calculated first based on the values of subsidiary, location, department and division.
  2. Solvr General Ledger: these are subsequently calculated as they depend on the lookup values of the Solvr Profit Center and the new general ledger account and sales rep segment.

Because NetSuite Custom Segments use underlying custom tables, we extended the mapping segments to include more information where needed.
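As a rough sketch of what reading such extended attributes could look like in SuiteScript 2.1 (written here as TypeScript), the record type 'customrecord_cseg_solvr_pc' and field id 'custrecord_legacy_pc_code' below are hypothetical placeholders for the custom record that backs a mapping segment:

```typescript
// SuiteScript 2.1 sketch (TypeScript); record type and field ids are hypothetical.
import * as search from 'N/search';

/**
 * Reads an extended attribute from the custom record that backs the
 * Solvr Profit Center segment, given the segment value's internal id.
 */
function getLegacyProfitCenterCode(segmentValueId: string): string | null {
  const result = search.lookupFields({
    type: 'customrecord_cseg_solvr_pc',     // hypothetical segment-backing record type
    id: segmentValueId,
    columns: ['custrecord_legacy_pc_code'], // hypothetical extra column on that record
  });
  const code = result['custrecord_legacy_pc_code'] as string;
  return code || null;
}
```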

Sourcing New NetSuite Data Warehouse Mapping Segment Values

Retrieving the values to be automatically applied to transactions is called “sourcing.” NetSuite offers an array of methods to source values onto transactions. In our custom mapping segment case, given all the complexity, we created scripted User Event rules that push the values into the transaction lines as financial records are saved.

Every financial (posting) transaction required a value in our two custom mapping segments. Thus, we wrote our algorithms to fail gracefully if a transaction lookup did not find a value: we simply recorded nothing, so a null value meant there was a problem in the lookup table definitions.
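Below is a minimal sketch of what such a User Event script could look like in SuiteScript 2.1 (shown as TypeScript). The sublist id, the segment field ids (e.g., 'cseg_solvr_pc', 'cseg_division'), and the stubbed lookup helper are hypothetical placeholders, and the production script handled many more rules (including the second, dependent Solvr General Ledger segment). The key behavior is the graceful failure: when no mapping is found, nothing is written, which leaves the segment null for the control searches described below.

```typescript
/**
 * @NApiVersion 2.1
 * @NScriptType UserEventScript
 *
 * Minimal sketch of a User Event that sources the Solvr mapping segment
 * onto journal entry lines as the record is saved. Field ids are hypothetical.
 */
import * as record from 'N/record';

// Stand-in for the cross-reference lookup against the mapping tables;
// returns null when no mapping exists so the script can fail gracefully.
function lookupSolvrProfitCenter(sub: string, loc: string, dept: string, div: string): string | null {
  return null; // the real implementation consults the cross-reference records
}

export function beforeSubmit(context: { newRecord: record.Record }): void {
  const rec = context.newRecord;
  const sublistId = 'line'; // journal entries; other transactions use 'item'/'expense'
  const lineCount = rec.getLineCount({ sublistId });

  for (let line = 0; line < lineCount; line++) {
    const subsidiary = String(rec.getValue({ fieldId: 'subsidiary' }) ?? '');
    const location = String(rec.getSublistValue({ sublistId, fieldId: 'location', line }) ?? '');
    const department = String(rec.getSublistValue({ sublistId, fieldId: 'department', line }) ?? '');
    const division = String(rec.getSublistValue({ sublistId, fieldId: 'cseg_division', line }) ?? ''); // hypothetical id

    const profitCenter = lookupSolvrProfitCenter(subsidiary, location, department, division);

    // Fail gracefully: write nothing when unmapped so the null value
    // surfaces in the period-close control saved searches.
    if (profitCenter !== null) {
      rec.setSublistValue({ sublistId, fieldId: 'cseg_solvr_pc', line, value: profitCenter });
    }
  }
}
```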

Data Warehouse Transformation Control

Since null values in our Solvr Profit Center or Solvr General Ledger segments meant a mapping problem, we could use saved searches to help us detect if we were out of control. In our client’s model, there was no need to push information to the reporting data warehouse until the financial books were closed. Hence, as part of the accounting period close process, the accountants could review all the transactions where null values existed and resolve the issues. The cross-reference tables would be enhanced as needed, and the related problem transactions simply edited and saved (sometimes called touched) to re-source the mapping values. Once our control saved searches returned no rows, indicating we were in control, it was safe to close the books.
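As a rough illustration of such a control (the segment field ids and the assumption that empty segment values are found with an "anyof @NONE@" filter are mine, and in practice this could equally live as an ordinary saved search built in the UI), a scripted version might look like this:

```typescript
// SuiteScript 2.1 sketch (TypeScript); segment field ids are hypothetical.
import * as search from 'N/search';

/**
 * Finds posting transactions in a given accounting period where either Solvr
 * mapping segment is still empty, i.e., where the cross-reference lookup failed.
 */
function findUnmappedPostingLines(periodId: string): search.Search {
  return search.create({
    type: search.Type.TRANSACTION,
    filters: [
      ['posting', 'is', 'T'],
      'and', ['postingperiod', 'anyof', periodId],
      'and', [
        ['cseg_solvr_pc', 'anyof', '@NONE@'],   // hypothetical segment field id
        'or', ['cseg_solvr_gl', 'anyof', '@NONE@'],
      ],
    ],
    columns: ['tranid', 'type', 'line', 'account'],
  });
}

// Safe to close the books only when the control returns no rows.
const unmapped = findUnmappedPostingLines('123'); // internal id of the period being closed
const hasProblems = unmapped.runPaged({ pageSize: 5 }).count > 0;
```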

Moving Financial Information to the Data Warehouse

Under our prescribed approach, moving the information to the data warehouse was trivial. It was simple to provide an extract of all closed-period transactions with the segmented XXXXX-YYY-ZZZ information the warehouse expected because we could cross-reference the mapped control segment attributes. Furthermore, the application required minimal IT assistance because the accountants controlled the entire process. Consequently, the data warehouse was fed information it understood, and all the existing financial and managerial reports continued working without missing a step.
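To make the final step concrete, the sketch below shows the idea of composing the legacy XXXXX-YYY-ZZZ string from the mapped segment values during the extract. The row shape, field names, and CSV layout are hypothetical illustrations, not the client's actual extract format.

```typescript
// Conceptual sketch; field names and the row shape are hypothetical.

interface MappedTransactionLine {
  postingPeriod: string;
  amount: number;
  solvrGlAccount: string;    // mapped XXXXX-YYY value, e.g. "40100-010"
  solvrProfitCenter: string; // mapped ZZZ value, e.g. "110"
}

/** Builds the legacy-style account string the data warehouse expects. */
function toLegacyAccountString(line: MappedTransactionLine): string {
  return `${line.solvrGlAccount}-${line.solvrProfitCenter}`; // "40100-010-110"
}

/** Produces simple CSV rows for the closed-period extract. */
function toExtractRows(lines: MappedTransactionLine[]): string[] {
  return lines.map((l) => [l.postingPeriod, toLegacyAccountString(l), l.amount.toFixed(2)].join(','));
}
```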

NetSuite Financial Reporting and Implementation Leadership

The above discussion is an example of what we believe is a fundamental ingredient to powerfully and successfully implementing NetSuite ERP financials to gain the promised value from the investment. The challenge is conceptual in nature: NetSuite’s structures produce both the flexibility and the constraints to solve the modeling puzzle. The ability to synthesize, conceive, demonstrate, and craft the desired outcomes demands mature leadership to help less-initiated individuals through the journey.

We hold that there is no one-size-fits-all approach to NetSuite implementations. Instead, a carefully planned ERP project demands the commitment to invest time and energy to produce a shared solution understanding. Our firm recurrently produces successful outcomes because we are mature enough to say “no” to demands to move fast and haphazardly toward business objectives. I respect that this slower approach is not for everyone, and I have had to decline engagements due to misalignments. However, our existing clients often say they would never consider an alternative approach once they have realized their objectives under our leadership.

If you found this article relevant, feel free to sign up for notifications as I post new articles. If you are contemplating a NetSuite implementation and this article resonates with you, let’s have a conversation.


About Marty Zigman

Holding all three official certifications, Marty is regarded as the top NetSuite expert and leads a team of senior professionals at Prolecto Resources, Inc. He is a former Deloitte & Touche CPA and has held CTO roles. For over 30 years, Marty has produced leadership in ERP, CRM and eCommerce business systems. Contact Marty to set up a conversation.
