Sage can be a pain to extract data from, and a while back I was given the mission of moving data from Sage into SQL Server, where we could more easily analyze it. I searched and searched for documentation to help get me rolling. The lack of helpful resources around this problem is what inspired this post.

Moving data from Sage's ProvideX database into SQL Server took a bit of trial and error, but it all worked tremendously in the end. Here's the plan:

  1. Create corresponding tables in the destination database
  2. Create an SSIS package for each table to move data from Sage to SQL
  3. Create SSIS package to execute every SSIS package created in step 2
  4. Execute package from step 3
  5. Enjoy working with SQL instead of the Sage ODBC


Create Corresponding Tables in the Destination Database

When I first started on this task, I thought this would be the easy part. Needless to say, I was very wrong. First, you'll need to find the .xsd file in the Sage file system. For me it was here: C:\Sage\Sage 100 Standard ERP\MAS90\SData\MasContract.xsd. Once we have that, we need to write some very simple C# to create all of the tables from the Sage system in our SQL database. Create a console application in Visual Studio, then add a COM reference for SQL XML Bulk Load. You'll need .NET 2.0 and SQLXML 4.0 installed. Once they are, add the reference to the application by clicking References –> Add Reference –> COM –> Microsoft XML Bulkload for SQL Server 4.0 Type Library.


Here’s the code I used to create the tables:

using System;
using SQLXMLBULKLOADLib; // the COM reference added above

namespace XSDtoTable
{
    class Program
    {
        static void Main(string[] args)
        {
            try
            {
                SQLXMLBulkLoad4Class objBL = new SQLXMLBulkLoad4Class();
                objBL.ConnectionString = "<ConnectionString>";
                objBL.ErrorLogFile = "error.xml";
                objBL.BulkLoad = false; // If this is set to true then data will be loaded into the tables as well. Since we don't have data yet, we only want to create schema.
                objBL.SchemaGen = true;
                objBL.KeepIdentity = false;
                objBL.Execute(@"C:\Sage\Sage 100 Standard ERP\MAS90\SData\MasContract.xsd", "");
            }
            catch (Exception e)
            {
                Console.WriteLine(e.Message);
            }
        }
    }
}

There will probably be errors the first time you execute this, depending on your xsd and what it contains. Sage uses phrases that SQL Server doesn't agree with when creating the tables. I wasn't able to find a good way of making C# wrap those phrases in [] so they wouldn't cause an issue; if anyone figures it out, I would love to hear how you did it. For now, we need to go through the xsd with a text editor and change all of the offending phrases. Make sure to keep a list of the phrases you change; you'll need it when we start working with Biml. Here are the phrases I had to change: Key, Table, View, Description, Column, External, Lookup, No, Order, Group, Print, Row, Collate, Error, Identity, Length, DateFormat. The console app should run through successfully once all of the phrases are changed.
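Since the renames are mechanical, a small script can do the editing instead of a text editor. This is only a hedged sketch, not the method from the post (the post edits the xsd by hand); the suffix-X renaming scheme here is just one choice, and whatever scheme you use needs to be recorded for the stored procedure later. Review the diff afterward, since a blunt whole-word replace can touch text you didn't intend.

```python
import re

# Phrases from the list above that SQL Server rejects as identifiers.
RESERVED = ["Key", "Table", "View", "Description", "Column", "External",
            "Lookup", "No", "Order", "Group", "Print", "Row", "Collate",
            "Error", "Identity", "Length", "DateFormat"]

def patch_xsd(text):
    """Append an X to every whole-word occurrence of a reserved phrase."""
    for word in RESERVED:
        # \b keeps OrderNo, GroupCode, etc. untouched; only exact words match
        text = re.sub(r'\b%s\b' % word, word + "X", text)
    return text

def patch_file(src, dst):
    with open(src) as f:
        text = f.read()
    with open(dst, "w") as f:
        f.write(patch_xsd(text))

# usage:
# patch_file(r"C:\Sage\Sage 100 Standard ERP\MAS90\SData\MasContract.xsd",
#            r"C:\Sage\Sage 100 Standard ERP\MAS90\SData\MasContract_patched.xsd")
```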

Create an SSIS package for each table to move data from Sage to SQL

We will use Biml to create a package for each of the tables we need to move data from. If you haven't done so already, download Biml for free; there are separate downloads for SQL Server 2016 and for older versions of SSDT/BIDS. Read up on the Biml documentation and familiarize yourself with the basic concepts.

To get started, let's create an SSIS package that will move data from one Sage table to SQL. We won't use Biml yet; we just need to create the connections and ensure the package transfers data as we expect. Create an OLE DB connection to the SQL database and an ODBC connection to Sage, then configure the package as needed and test it. Once the package is flowing through as expected, we can open the .dtsx file with a text editor and view the connections as XML. These will serve as a nice template for the connections we'll make in Biml. Below is what the connection managers look like in the .dtsx file.
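As a rough illustration only (the server, database, version, and token values are placeholders, and the exact attributes vary by SSIS version), the two connection managers look something like this inside the .dtsx XML:

<DTS:ConnectionManager DTS:ObjectName="DestDB" DTS:CreationName="OLEDB">
  <DTS:ObjectData>
    <DTS:ConnectionManager DTS:ConnectionString="Data Source=<server>;Initial Catalog=<database>;Provider=SQLNCLI11.1;Integrated Security=SSPI;" />
  </DTS:ObjectData>
</DTS:ConnectionManager>

<DTS:ConnectionManager DTS:ObjectName="SOTAMAS90"
    DTS:CreationName="ADO.NET:System.Data.Odbc.OdbcConnection, System.Data, Version=<version>, Culture=neutral, PublicKeyToken=<token>">
  <DTS:ObjectData>
    <DTS:ConnectionManager DTS:ConnectionString="uid=<your user id>;Dsn=SOTAMAS90" />
  </DTS:ObjectData>
</DTS:ConnectionManager>

The Version and PublicKeyToken values from the ADO.NET connection manager's CreationName are the ones you'll carry over into the Biml Provider attribute below.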


After we've identified the connections in the XML file, we can start writing some Biml. Create a new Biml file in your SSIS solution and add the script below. Replace the connection strings with the values from the .dtsx file. The script runs some C# to loop through the tables in the destination database and creates an SSIS package for each. At this point we'll want to use the list of changes we made to the xsd file to create a stored procedure that returns the correct table and column names. You'll also need to XML-encode the results of the sproc. Below the Biml script there is a very simplified version of my sproc.

(Any code between the <# and #> markers is run as C# when the Biml is expanded.)

<#@ template language="C#" hostspecific="true" #>
<#@ import namespace="System.Data" #>
<#@ import namespace="System.IO" #>
<#@ import namespace="System.Data.Odbc" #>

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <AdoNetConnection Name="SOTAMAS90" ConnectionString="uid=<your user id>;Dsn=SOTAMAS90" Provider="System.Data.Odbc.OdbcConnection, System.Data, Version=<from the .dtsx>, Culture=neutral, PublicKeyToken=<this will come from the xml file>" />
    <OleDbConnection Name="<name>" ConnectionString="<connection string from .dtsx>" />
  </Connections>
  <Packages>
    <# string conn = "<connection string for an OLE DB connection in C#>"; #>
    <# DataTable tbllist = ExternalDataAccess.GetDataTable(conn, "Exec Biml_PackageCreate"); #>
    <# foreach (DataRow table in tbllist.Rows) { #>
    <Package Name="<#=table[0]#>" ConstraintMode="Linear">
      <Tasks>
        <Dataflow Name="DataFlowTransfer">
          <Transformations>
            <AdoNetSource Name="SageTables" ConnectionName="<odbc name>">
              <DirectInput>Select * From <#=table[1]#></DirectInput>
            </AdoNetSource>
            <DataConversion Name="Converter">
              <Columns>
                <#=table[2]#> <!-- the column mappings returned by the sproc -->
              </Columns>
            </DataConversion>
            <OleDbDestination Name="Dest Table" ConnectionName="<name>">
              <ExternalTableOutput Table="<#=table[0]#>" />
            </OleDbDestination>
          </Transformations>
        </Dataflow>
      </Tasks>
    </Package>
    <# } #>
  </Packages>
</Biml>



The sproc returns a list of tables with the destination table name, the Sage name (SourceTableName), and the destination and Sage column names with data types for the data conversion. There is only one Replace clause below, but you will need to add one for each of the phrases you changed when modifying the xsd file earlier. This allows SSIS to create the correct column and table mappings. Data types need to be translated from SQL types to Biml types as well.

Create Procedure Biml_PackageCreate
as
Select a.Name as DestinationTableName,
       Replace(a.Name,'Tble','Table') as SourceTableName,
       Replace(Replace(Stuff((Select distinct char(10) +
              '<Column SourceColumn="' +
              Replace(c.Name,'Colmn','Column') + '" ' +
              CASE WHEN Right(t.Name,3) = 'int' THEN 'DataType="Decimal" Precision="25"'
                   WHEN t.Name = 'nvarchar' THEN 'DataType="String" Length="1000"'
                   WHEN t.Name = 'datetime' THEN 'DataType="DateTime"'
                   WHEN t.Name = 'decimal' THEN 'DataType="Double" Precision="10"'
              END + ' />'
        From sys.columns c
        Join sys.types t ON c.user_type_id = t.user_type_id
        Where c.object_id = a.object_id
        FOR XML PATH('')),1,1,''),'&lt;','<'),'&gt;','>') as ColumnMappings
From sys.tables a

You'll have some trial and error getting the Biml script to compile, but once it runs through successfully, you should have a bunch of packages that are ready to move your data!

Create SSIS package to execute every SSIS package created in step 2

To finally get all of the data from Sage to SQL, we just need to run all of the packages we created with Biml. Let's create another package, in a separate solution, that will execute the .dtsx files.

We need a Foreach Loop container with an Execute Package task inside it. Set the Foreach Loop to use the file enumerator and point it at the folder where the Sage packages are stored.


We'll then create a variable called FileName to hold the fully qualified name of the package currently executing, and map it to index 0 of the Foreach Loop.


Next, we need to pass the fully qualified name to the Execute Package task. To do that, create a File connection manager and feed it the FileName variable as its connection string.
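As an aside, if you ever want to rerun the generated packages without opening the master package, the same loop can be scripted against SSIS's dtexec command-line tool, which runs a package from a file with the /F option. A minimal sketch, assuming your generated .dtsx files live in a folder like C:\SagePackages:

```python
import glob
import subprocess

def build_commands(folder):
    """One dtexec invocation per generated .dtsx package in the folder."""
    return [["dtexec", "/F", path]
            for path in sorted(glob.glob(folder + "/*.dtsx"))]

def run_all(folder):
    for cmd in build_commands(folder):
        subprocess.run(cmd, check=True)  # stop on the first package that fails

# usage: run_all(r"C:\SagePackages")
```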



Execute package from step 3

Enjoy working with SQL instead of the Sage ODBC