Data Archiving in Dynamics AX - Creation and Best Practices


Data archiving in AX is an easy-to-build solution to the problems of growing table sizes and slow data queries. It is typically performed by copying data from heavily used AX tables into newly created 'History' tables.

Archiving keeps the production tables small while older data remains readily available for reporting or research. If done improperly, however, the task can be overly cumbersome, tax system resources heavily, or cause data integrity issues.

Preventing Data Archiving Issues

The actual archiving is typically best achieved with either batch jobs or AOT jobs, since data archiving is usually performed infrequently (or even just once) and doesn't need to be tied into any other functionality in the system.

Batch Jobs

In most cases, batch jobs should be used. Data archiving is typically done on a schedule built around system downtime (for example, archiving data once every three months), and batch jobs let you set up that recurrence when the job is created and alter it later as needed.
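As a minimal sketch, a batch job can be built on the RunBaseBatch framework. The class name here is hypothetical, and in AX 2012 each method lives in its own AOT node; they are shown inline for brevity.

```xpp
// Hypothetical batch class for a recurring archive run.
class DataArchiveBatch extends RunBaseBatch
{
    public void run()
    {
        // Archiving logic: insert_recordset into the history table,
        // then delete_from the source table (see the sections below).
        info("Archiving complete.");
    }

    // RunBaseBatch requires pack/unpack so the batch framework can
    // persist the job's parameters; this sketch has none.
    public container pack()
    {
        return conNull();
    }

    public boolean unpack(container _packedClass)
    {
        return true;
    }

    public static void main(Args _args)
    {
        DataArchiveBatch batch = new DataArchiveBatch();

        if (batch.prompt()) // the prompt dialog includes the Batch tab
        {
            batch.run();
        }
    }
}
```

The Batch tab on the prompt dialog is where the recurrence (for example, every three months) is configured.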

AOT Jobs

AOT jobs should only be used when archiving requires a developer's intervention each time, or when the archiving occurs only once. This approach is much more manual than batch jobs, so make sure to consider both options when deciding how to archive data. To create one:
AOT > Jobs > Right Click > New Job
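A one-off job created this way is just a static method run manually from the AOT; the job name below is hypothetical.

```xpp
// Hypothetical one-time archive job (AOT > Jobs > Right Click > New Job).
static void archiveDataOnce(Args _args)
{
    // Archiving logic goes here: populate the history table,
    // then purge the source table (see the sections below).
    info("One-time archive run finished.");
}
```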

Creating History Tables

History tables are used to store historical data, and their fields need to match the original table exactly. Duplicating the table is the best way to achieve this. Here's how:

  1. In the development workspace, right-click the table and select 'Duplicate'. This creates a new table.
  2. Rename the new table, using a name that will help you remember its purpose.
    1. ex. $TableName$History

This process ensures that the history table and the original table are identical.

Populating History Tables

Populating the history tables requires copying records from one table into another, using either insert_recordset or a traditional insert.


Data archiving revolves around copying large numbers of records from one table to another. An insert_recordset is ideal because it makes far fewer calls to the database than a while select: it inserts an entire set of records with a single database operation rather than one call per record. (The related update_recordset statement similarly updates multiple records with a single command.)

Traditional Insert

A traditional insert, otherwise known as a "while select" insert, is a less efficient way to copy records to another table: it takes more time and more code. Use it only when the archiving requires per-record logic that a set-based insert_recordset cannot provide.
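For contrast with the set-based approach, a while select copy looks roughly like this. The table names match the example at the end of this article; the selection criteria are assumed.

```xpp
// Record-by-record copy into the history table.
WHSDimTracking        whsDimTracking;
WHSDimTrackingHistory whsDimTrackingHistory;

ttsBegin;
while select whsDimTracking
    // where ... (same criteria as the set-based version)
{
    // Copy the matching fields between the identical buffers.
    buf2Buf(whsDimTracking, whsDimTrackingHistory);
    whsDimTrackingHistory.insert(); // one database call per record
}
ttsCommit;
```

Each loop iteration costs a round trip to the database, which is why insert_recordset is preferred for large volumes.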

Purging data tables: delete_from complications

Purging the data tables should be performed after the history tables are populated. This ensures that if the population fails, the job errors out without any data having been deleted. In general, a delete_from will perform this operation better than a while select delete.


Delete_from is well suited to deleting large numbers of records from tables because, like insert_recordset, it makes fewer calls to the database.

However, if a delete method or trigger exists on the table you are purging, delete_from falls back to the speed of a while select, because each record must be evaluated individually. When archiving, we typically do not want those delete methods or triggers to run, since they usually cascade to other data tied to AX processes, and the purged records still exist in the history table.

To prevent this, call the skip methods on your table buffer before executing delete_from. This ensures the delete_from operates as a single set-based statement and that the archiving does not affect more data than it should in AX.
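Assuming the standard xRecord skip methods available in AX (method availability can vary by version, so verify against your environment), the calls before delete_from look like this:

```xpp
// Skip overridden delete() methods, delete actions, and database
// logging so delete_from stays a single set-based operation.
whsDimTracking.skipDataMethods(true);
whsDimTracking.skipDeleteActions(true);
whsDimTracking.skipDatabaseLog(true);
```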

Pseudo-code example with a commonly used WHS table; this typically goes in the run method of the batch job class:



WHSDimTracking        whsDimTracking;
WHSDimTrackingHistory whsDimTrackingHistory; // new history table

// Set-based copy of the selected records into the history table.
insert_recordset whsDimTrackingHistory (field-list)
    select (field-list) from whsDimTracking
        where ...
// exists joins are also possible here

// Skip the overridden delete logic so delete_from stays set-based.
whsDimTracking.skipDataMethods(true);
whsDimTracking.skipDeleteActions(true);
whsDimTracking.skipDatabaseLog(true);

delete_from whsDimTracking
    where ...
// exists joins are also possible here




As Vice President of Digital Transformation at Blue Horseshoe, Steve Shebuski guides businesses through their digital transformation journey. Steve has 20+ years of experience as a Program Manager/Design Lead/Project Manager implementing both Microsoft Dynamics AX / Dynamics 365 as well as tier I and tier II warehouse management and transportation software solutions. Steve's deep knowledge within the distribution industry and his innovative approach to solution architecture are the backbone of the solution implementation and deployment by Blue Horseshoe.
