
Posts Tagged ‘Data Migration’

Managing enterprise data has never been more difficult.  This becomes all too apparent during that critical stage of business management software turnover when it comes time to decide what data to migrate over from the old system to the new, and by implication, what data not to migrate.  We offer a few simple tips here today.

System implementers generally consider your company data to be of two types, often called “Master” (or static) and “Transactional.”  The former consists of the key organizational data elements – things like employee names; inventory part items and descriptions; vendors; customers; general ledger accounts and their numbers.  In other words, the kinds of data that are referenced frequently but remain relatively stable over time.

The other type, transactional records, consists of the everyday transactions that occur in business – things like creating an invoice, or adding and removing items from inventory due to production, sales or other activity affecting an item.

Technically, a third category might be considered as well: starting balances, or the opening general ledger account balances as far as the new system is concerned.

Nowadays, it’s common to use spreadsheets or the tools provided by the vendor to assist in this process.  For example, customers moving off a legacy system into a Dynamics 365 (D/365, cloud-based) environment would use Microsoft’s “Data Management Framework” tool, first to extract data from the legacy system, then to transform that data into the structure the new system expects, and then to import the data into the D/365 application – thus: extract, transform, import.
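
To make that flow concrete, here’s a minimal sketch in Python of the extract-transform-import steps for a customer master file.  The file names and the field mapping are purely illustrative assumptions – in practice the mapping comes from your new system’s own import templates – but the three steps are the same.

```python
import csv

# Illustrative mapping of legacy columns to new-system fields; the real
# mapping comes from the target system's import template.
FIELD_MAP = {
    "CUSTNO": "CustomerAccount",
    "NAME": "OrganizationName",
    "PHONE": "PrimaryContactPhone",
}

def extract(path):
    """Extract: read the legacy export into a list of dictionaries."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: remap legacy columns onto the new system's field names."""
    return [{new: row.get(old, "").strip() for old, new in FIELD_MAP.items()}
            for row in rows]

def load(rows, path):
    """Import: write a file the new system's import tool can consume."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(FIELD_MAP.values()))
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    load(transform(extract("legacy_customers.csv")), "customers_import.csv")
```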

Here’s what we and others suggest, based on having tried many approaches over many years with many clients, some more successful than others.

Step 1: Use the available tools to migrate all the Master data you can by extracting it, transforming it (usually using a tool or an Excel spreadsheet) into the appropriate new-system fields, then importing the data into the new system.  You’ll save yourself both money and frustration if you carefully review all the data during that transformation process to clear out any old, inactive or inaccurate records.  Clean up your data here and now, one time, so you start fresh and clean in the new system.
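
As a small illustration of that clean-up-during-transformation idea, the routine below drops inactive records and obvious duplicates before anything is imported.  The “Status” and “CustomerAccount” column names are assumptions for the example, not fields from any particular system.

```python
def clean(rows):
    """Drop inactive records and duplicate account numbers before import."""
    seen, keep = set(), []
    for row in rows:
        if row.get("Status", "").strip().lower() == "inactive":
            continue                      # retire old, unused records here
        key = row.get("CustomerAccount", "").strip().upper()
        if not key or key in seen:
            continue                      # skip blanks and duplicates
        seen.add(key)
        keep.append(row)
    return keep
```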

Step 2: Don’t even think about porting over (migrating) all your various and complex transactional data.  Your old system and new will handle those transactions in radically different ways, and the odds of migrating your data cost-effectively or efficiently are slim at best.  Somebody will eat an inordinate number of hours trying to migrate your transactional data (trust us).  It’s not worth it, as it’s a whole lot simpler to just leave the legacy system running after you cut over to the new system, so it’s available merely for inquiry purposes, if you really need to look up that transaction later.  You might be surprised at how seldom you’ll need to do this.

Step 3: Then, finally, key your beginning balances into the new system for each G/L account.  It’s not that big a task, and the data entry practice will do your users good.

Approaching your migration along the simple lines we’ve defined here will give you the most bang for your buck and keep your migration moving forward at a reasonable pace, while limiting the frustration your employees and your partners would otherwise face if you assume you can simply transfer “everything” from the old system to the new.


In our prior post we discussed the two most common ways companies move their data from an old, legacy business management software system (i.e., ERP) to a new one.  We noted that most companies “port” their old master (sometimes called static) data – like customers, vendors and items – but only in some carefully considered cases do they move the more complex transactional items – things like invoice line items, quantities shipped, item lot numbers or discrete payables invoices – from their old databases to the new.  The complexity of doing so has cost implications that, as we noted in that post, aren’t always worth the investment.

Sometimes, we opined, it’s simply easier and more cost effective to keep access to the old system available to users for those times when the old data is needed, but not to bother moving all that old transaction data over to the new system.

But for those companies that have a well-considered need for moving both static and transactional data – and keeping it for an extended period of time – one possible data migration strategy to consider is called ETL.  ETL stands for Extract, Transform and Load, and the tools to employ it are becoming increasingly well known today.

Basically, ETL employs what’s called a “data warehouse,” into which you can bring transactional data for reporting purposes.  You first do a traditional mapping and porting of your “master” data into your new ERP system.  The data warehouse then exists outside the actual ERP system, but remains accessible.  It allows you to use legacy data in conjunction with the new ERP system, without the cost and difficulty of actually porting the old transaction data into it.
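
As a rough sketch of that arrangement, the snippet below loads transformed legacy invoice rows into a reporting table that lives outside the ERP.  SQLite stands in here for whatever warehouse platform you actually use, and the table and column names are illustrative.

```python
import sqlite3

def load_history(rows, db_path="legacy_warehouse.db"):
    """Load transformed legacy transactions into a warehouse table for reporting."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS legacy_invoices (
                       invoice_no TEXT, customer TEXT,
                       invoice_date TEXT, amount REAL)""")
    con.executemany(
        "INSERT INTO legacy_invoices VALUES (:invoice_no, :customer, :invoice_date, :amount)",
        rows)
    con.commit()
    con.close()

# Example: one historical invoice, already mapped to the warehouse's columns.
load_history([{"invoice_no": "INV-1001", "customer": "C00042",
               "invoice_date": "2015-03-31", "amount": 1250.00}])
```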

ETL tools are available to help with the warehousing task.  For example, the folks at Jet Reports – a powerful reporting tool that allows users of systems like Microsoft Dynamics to automatically extract data from their ERP system into Excel for subsequent manipulation and analysis – have engineered a solution that extracts the old data out of an existing system, transforms its structure to match the desired new data conventions, and then transforms the data itself to be, as they put it, “optimized for reporting and analysis.”  The final step is to load the data into an environment where it can later be easily retrieved – the data warehouse.

ETL vendors claim that their advantages include speed, simplicity, preservation of data, time savings, money savings, and “fewer heated conversations.”

The better tools work with today’s increasingly ubiquitous SQL database structures in an automated fashion.  Jet claims that “it can read the database schema automatically and then lay it out in a graphical format so that tables and fields can be chosen for inclusion.”  From there, the data can be cleaned and scrubbed, and users can create verification and business rules to ensure that history is accurate and valid.  Testing is still required, but an ETL tool’s automated approach is said to increase its chances for success.
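
To give a flavor of what such verification rules can look like – these are generic examples of ours, not Jet’s implementation – here’s a small check that flags historical rows with missing invoice numbers, unparseable dates or negative amounts before they’re loaded.

```python
from datetime import datetime

def validate(row):
    """Return a list of problems with one historical row (empty list = passes)."""
    problems = []
    if not row.get("invoice_no", "").strip():
        problems.append("missing invoice number")
    try:
        datetime.strptime(row.get("invoice_date", ""), "%Y-%m-%d")
    except ValueError:
        problems.append("unparseable invoice date")
    if float(row.get("amount") or 0) < 0:
        problems.append("negative amount")
    return problems

# Example rows: the first passes, the second trips all three rules.
rows = [{"invoice_no": "INV-1001", "invoice_date": "2015-03-31", "amount": "1250.00"},
        {"invoice_no": "",         "invoice_date": "31/03/2015", "amount": "-10"}]
for r in rows:
    issues = validate(r)
    if issues:
        print(r.get("invoice_no") or "<blank>", "->", "; ".join(issues))
```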

The end result can be access to many years of historical data safely ensconced within a data warehouse that does not corrupt the new system, but maintains historical accuracy for users of the new system when business need, or regulation, calls for it.  It’s just one more tool in today’s burgeoning Business Intelligence (BI) landscape.

