Feeding the Dataport

Please note that all DMS news and blog content has been moved to the main website at www.dynms.com for a categorized view.

All utilities and downloads are available at www.dmsiworks.com

In a previous post, we introduced the Universal Dataport  – a tool that can import a simple text file into any NAV table, complete with validations and change log.  The Universal Dataport (UDP) is a useful tool for importing data once NAV is in production, but is even more useful when doing the initial imports from legacy systems during new implementations.  The tricky part of these initial imports is actually obtaining and mapping the legacy data to NAV before feeding it into the UDP.  A number of years ago we developed a fairly simple spreadsheet-based tool to help define field mappings from legacy systems and to generate a file that easily feeds the UDP.

Often during implementations you won’t import data just once – normally we’ll do an initial import into a pilot or staging NAV database to highlight missing data elements, aid in discovery sessions, and perform initial training.  This initial import is never good enough to “go live” with, and is usually drastically out of date by the time go-live comes around.  So what ends up happening is that we do a number of imports, tweaking the data cleanup rules and the mappings as the requirements get more refined.  To be efficient, we’d rather not start from scratch with raw exports from the legacy system and a ton of manual editing and reformatting each time we need to do an import.

This is where the spreadsheet (DMSDataMap to be specific) comes in – it lets you create a mapping between NAV fields and the legacy data that you can save, tweak, and reuse multiple times.  (Being able to save the mapping rules is also great if there is a post-implementation audit by an external auditor).

The DMSDataMap uses the power of SQL to map NAV fields to a legacy field or expression.  This, of course, means that the legacy data needs to be in an ODBC-compliant database, with SQL Server being the preferred choice (other ODBC-compliant databases will work, but may not offer the full range of SQL functionality that SQL Server provides).  “But”, you say, “most of the time my data comes to me in Excel or text files”.  No problem – SQL Server has a handy SSIS wizard to automagically import Excel spreadsheets (and other files) into new tables.  See the MSDN article on the wizard (there are also lots of other step-by-step examples out there).
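(If you'd rather script the load than click through the wizard, it only takes a few lines.  Here's a minimal sketch using Python's standard `csv` and `sqlite3` modules – SQLite stands in for SQL Server purely for illustration, and the file and column names are made up:)

```python
import csv
import sqlite3

def load_csv_to_table(csv_path, conn, table):
    """Create a staging table from a delimited file and bulk-insert its rows.
    Every column is loaded as TEXT; typing and cleanup happen later in SQL."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # first row holds the column names
        cols = ", ".join(f'"{c}" TEXT' for c in header)
        conn.execute(f'CREATE TABLE "{table}" ({cols})')
        placeholders = ", ".join("?" for _ in header)
        conn.executemany(
            f'INSERT INTO "{table}" VALUES ({placeholders})', reader)
    conn.commit()
```

Loading everything as text first, then cleaning up with SQL, mirrors what the SSIS wizard gives you by default and avoids type-guessing surprises on dirty legacy data.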

So whether you started with the data in an existing ODBC-compliant database or you moved the data to SQL, we now have the data in a repository we can access with the DMSDataMap tools.  Let’s walk through an example of importing item data from a legacy table already in SQL.  First, we open the workbook and enable the macros (the macros are available in a text file with the download for you to examine before enabling them if you like).  The workbook has three sheets in it: a “Notes” sheet for notes on this mapping, a “Mapping” sheet for the actual field mapping, and a “Query” sheet for defining the source database and exporting data.

<<spreadsheet pic>>

The first thing we’ll do is bring in the NAV field names.  Open the Object Designer and design table 27 (Item).  In the table designer, show the “Field Type” column and filter on “Normal” to exclude any flow fields and flow filters.

<<filtered fields pic>>

Select and copy all the rows in the designer and paste into a blank sheet, then copy the list of field names into the DataMap “Mapping” sheet.  You should end up with something like the list below (which you can sort alphabetically to make it easier to find the fields).

<<nav fields pic>>

The next step is to map the available and/or required fields (or expressions) from the legacy table to the appropriate NAV fields, and indicate which ones need to be validated by the UDP as they’re imported.  Here’s a sample of some of the mapped fields (blank mappings have been filtered out).

<<mapping pic>>

Let’s take a closer look at these mappings:
  • No. : this is straightforward – the “PART_NO” field from the legacy table maps directly to the NAV “No.” field.
  • Description and Description 2: slightly trickier – here we’re converting from the single “DESCRIPTION” field in the legacy table to two NAV fields using the SQL substring function to split the text and limit the number of characters in each field.
  • Base Unit of Measure: another simple 1:1 mapping, but this time we’re validating the field.  Validating this field populates the Purchase and Sales unit of measure fields.  (Keep in mind that the UDP will automatically create the “Item Unit of Measure” record when we import this field if it doesn’t exist).
  • Replenishment System: Trickier yet.  The legacy system uses “M” to identify manufactured items, and “P” or other codes for purchased items.  The CASE statement allows us to do a simple conversion from the legacy field value to what NAV expects.
  • Vendor No.: Trickiest so far.  In this example, the default vendor for each item is stored in a separate table.  This mapping retrieves the vendor via a subquery against the PART_VEND table.  Note that you could use joins instead, but usually a simple subquery is much easier to keep track of.
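To see these mapping expressions in action, here's a small runnable sketch.  SQLite stands in for the legacy database (so `substr` replaces T-SQL's `SUBSTRING`); PART_NO, DESCRIPTION, and PART_VEND come from the examples above, while the PART table name, the SOURCE_CODE field, and the sample data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE PART (PART_NO TEXT, DESCRIPTION TEXT, SOURCE_CODE TEXT);
    CREATE TABLE PART_VEND (PART_NO TEXT, VEND_NO TEXT);
    INSERT INTO PART VALUES
        ('A-100', 'HEAVY DUTY WIDGET, ZINC PLATED, LEFT-HANDED THREAD', 'M'),
        ('A-200', 'GADGET', 'P');
    INSERT INTO PART_VEND VALUES ('A-200', 'V0042');
""")

# One mapping expression per NAV field, in the same spirit as the sheet:
query = """
SELECT
    PART_NO                                 AS "No.",
    substr(DESCRIPTION, 1, 30)              AS "Description",
    substr(DESCRIPTION, 31, 30)             AS "Description 2",
    CASE WHEN SOURCE_CODE = 'M'
         THEN 'Prod. Order'
         ELSE 'Purchase' END                AS "Replenishment System",
    (SELECT VEND_NO FROM PART_VEND pv
      WHERE pv.PART_NO = PART.PART_NO)      AS "Vendor No."
FROM PART
"""
for row in conn.execute(query):
    print(row)
```

The long description gets split at 30 characters across the two NAV description fields, the CASE statement turns 'M' into 'Prod. Order', and the subquery returns NULL (an empty mapping) for items with no PART_VEND row.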
Once the field mapping is complete, we can now do a little setup and see what kind of file gets generated.  On the “Query” sheet, the cells in light green need to be filled in:

<<query sheet pic>>
  • Connection String: this is where you specify the connection details for the legacy database (not the NAV database).  Click the ellipsis button for a wizard to step you through building the string.
  • “SELECT” modifier: Optional. Anything you enter here will appear immediately after the “SELECT” keyword in the generated query. Use this for modifiers like “TOP 10” to only retrieve 10 records for testing purposes.
  • Tables (“FROM” clause): Enter the table or tables from the legacy database that you’re querying.  You can specify any number of JOINs and the join conditions here if you have multiple tables.
  • “WHERE” clause: Optional. Use this to restrict which records you get from the legacy table.
  • “ORDER BY” clause: Optional.  Use this to specify the order of the returned data.
  • Export File: Enter the file name you will be exporting the data to.  Click the ellipsis button to open a file selection dialog.
  • Pre-Query Lines: Optional.  Enter a full query here that you want to run before the generated export query runs.  For example, you may want to generate some temporary tables for the main query to use.
  • Post-Query Lines: Optional.  Enter a full query here that you want to run after the generated export query runs.  For example, you may want to drop temporary tables here.
  • Other export queries: Optional.  The data retrieved from queries entered here will be appended to the export file.  Only useful if you’re using custom, multi-table dataports.
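Under the hood, “Create Query” essentially stitches these cells together into a single SELECT.  Here's a hypothetical sketch of that assembly (the parameter names mirror the cells above; the real macro's logic may differ in the details):

```python
def build_export_query(select_list, tables, select_modifier="",
                       where="", order_by=""):
    """Assemble the export SELECT statement from the Query-sheet cells.
    select_list holds the Mapping-sheet expressions (expression AS "NAV Field")."""
    parts = ["SELECT"]
    if select_modifier:           # e.g. "TOP 10" to limit rows for testing
        parts.append(select_modifier)
    parts.append(", ".join(select_list))
    parts.append("FROM " + tables)
    if where:                     # restrict which legacy records are exported
        parts.append("WHERE " + where)
    if order_by:                  # control the order of the exported rows
        parts.append("ORDER BY " + order_by)
    return " ".join(parts)
```

For example, `build_export_query(['PART_NO AS "No."'], "PART", select_modifier="TOP 10", where="ACTIVE = 1")` yields `SELECT TOP 10 PART_NO AS "No." FROM PART WHERE ACTIVE = 1` – which is also why the “SELECT” modifier cell lands immediately after the SELECT keyword.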

Here’s an example of the fields populated with the details for our legacy table:

<<query sheet – populated pic>>

Now that we have the settings filled in, the next step is to either export the data or generate the query to see what it looks like and possibly to test it in SQL Management Studio (SSMS).  To see what the query will look like, click “Create Query”.  This generates the query and asks if you’d like to copy it to the clipboard, which can be handy for pasting into SSMS for testing.

To export the data, click (you guessed it) “Export Data”.  This will generate the query, execute it against the database specified in the connection string, and create a tab-delimited export file that’s ready for the UDP.  After the file is generated, you will be prompted to view it – if you click “Yes”, Notepad is launched to show the file:
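The tab-delimited export step itself is straightforward – run the query, write one tab-separated line per record.  A sketch of the equivalent in Python (again with SQLite standing in for the legacy database; whether the UDP wants a header row depends on how your dataport is set up, so this sketch writes none):

```python
import csv

def export_tab_delimited(conn, query, out_path):
    """Run the export query and write the rows as a tab-delimited file."""
    cur = conn.execute(query)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        for row in cur:
            # Write NULLs as empty fields rather than the string "None"
            writer.writerow(["" if v is None else v for v in row])
```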

<<export file pic>>

At this point you can run the UDP in NAV, specify the item table to import, select the exported file, and you’re done.

The real value with this tool is that we now have a data mapping that can be saved, easily modified, and executed any time the data changes or we refine our understanding of the mappings to NAV.  We can also reuse the mappings any time we convert from the same legacy system in the future (for example, we can have a library of mappings from AccPac or Syteline to NAV).

Hopefully you find the tool as useful as we do. Let us know if you have any comments or suggestions.

A few extra notes:
  • If your legacy data is already in SQL, why not just populate the NAV tables directly with SQL?  In a word: validations.  The UDP runs the NAV validation logic and writes change log entries, both of which direct SQL inserts bypass.
  • We also have some generic dataports for importing item and general journals, including item tracking (lot and serial numbers) and dimensions.  We’ll publish them in an upcoming post, or give us a shout and we’ll fire ‘em off to you.
  • Stay tuned for an upcoming post on an XMLPort version of the UDP for NAV 2013.
