Migration Steps

The step types are the following:

  • Global step: contains generic property values that are used when step-specific ones are not set.
  • Configuration step: represents a logical connection to the database.
  • Action step: can be one of the following:
    • Creation or deletion step: executes scripts from SQL files. The scripts create or delete database items.
    • Data loading step: loads data into the database.
    • Conversion step: converts the data to a specific format.

The Migrator Tool accepts .ini input files with the -configurationIni option. Each .ini file can contain one or more steps.
For better readability, the steps can be split across two input files:

  • The config.ini file, including one or more configuration steps.
  • The all.ini file, including one or more action steps. The global step can be included in this file.

A step corresponds to a section in the .ini file.

  • Each section has an id as its title. The id is a combination of a type and a name connected with a hyphen (e.g. [type - name]); an illustrative layout is shown after the list of section types below.

The section type must be one of the following:

  • Database: for a configuration step.
  • ExecuteSql: for a creation or deletion step.
  • CSV2DB: for a CSV data loading step.
  • EBCDIC2DB: for an EBCDIC data loading step.
  • ConvertQDDS: for a QDDS data conversion step.
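
For illustration, a possible layout of the two input files could be the following; the connection and step names are hypothetical, and the properties of each section are described in the chapters below:

config.ini

[Database - cnxPostgreSQL1]

all.ini

[Global]

[ExecuteSql - Create tables]

[CSV2DB - Loading1]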

Notes

  • A section should have a unique id. If several sections have the same id, only the content of the last one is taken into consideration; however, the position of the first one is kept.
  • The configuration steps and the global step do not accept duplicated section ids.
  • Since the global step is unique, its id is "Global".
  • The connection name should be specified in the action steps.
  • When generating a reverse project using Analyser, a template "all.ini" file is created under yourReverseProject\database\stepsIniFile

Global step

Since the global step is unique, its id is Global. The global step contains the following properties, which are used when they are not specified in the specific steps (an illustrative example is given after the notes below).

  • multithread: Number of threads to use for parallel steps. Values: Integer. Default: Runtime.getRuntime().availableProcessors() - 1, or 1 if only one processor is available.
  • encoding: Encoding to consider when reading files. Values: String. Default: UTF-8.
  • path.exe.oracle: Oracle's execution path. Values: String. Default: None.
  • path.exe.mssql: MSSQL's execution path. Values: String. Default: None.
  • sqlmodel.path: Path of the SQL Model JSON file; it can be an absolute or relative path. [DOCKER] To use this property, see path in Docker. Values: String. Default: database/SQLMODEL.json.

Notes

  • If a property is specified in a step, it must have a non-empty value.              
  • If the property is missing, its default value will be taken into consideration.              
  • An environment variable can be used as a property value.               
  • The global step is optional.               
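
Global step example

The property values below are illustrative and should be adapted to your project; any property that is omitted keeps its default value:

[Global]
multithread=4
encoding=UTF-8
sqlmodel.path=database/SQLMODEL.json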

Configuration step

The configuration step contains the following properties to create a connection to the database:

Database identification

  • databaseType: The target database type. Values: POSTGRE, ORACLE, MSSQL. Default: POSTGRE.
  • dataBaseHost: DNS name or IP address of the database. This property can be stored in AWS Secrets Manager; see the dataBaseAWSSecretName property. [DOCKER] To use this property, see DNS name or IP address in Docker. Values: String. Default: localhost.
  • dataBasePort: The listener port. This property can be stored in AWS Secrets Manager; see the dataBaseAWSSecretName property. Values: Integer. Default: 5432.
  • dataBaseAdditionalArgs: Enables adding connection properties to the URL. Example: adding "?ssl=true" gives jdbc:postgresql://localhost:5432/aeat?ssl=true. Values: String. Default: None.
  • adminDataBaseName: [POSTGRESQL] The name of the database to create. Values: String. Default: postgres.
  • adminDataBaseUser: [POSTGRESQL] The super user owner. Values: String. Default: None.
  • adminDataBasePassword: [POSTGRESQL] The password of the super user owner. Values: String. Default: None.
  • dataBaseName: The name of the database to create. This property can be stored in AWS Secrets Manager; see the dataBaseAWSSecretName property. Values: String. Default: mydb.
  • dataBaseServiceName: [ORACLE] The service name of the database to create. This property can be stored in AWS Secrets Manager; see the dataBaseAWSSecretName property. Values: String. Default: the dataBaseName property value.
  • dataBaseUser: The user used to connect to the database. This property can be stored in AWS Secrets Manager; see the dataBaseAWSSecretName property. Values: String. Default: None.
  • dataBasePassword: The password used to connect to the database. This property can be stored in AWS Secrets Manager; see the dataBaseAWSSecretName property. Values: String. Default: None.
  • server.host: The license server host for security. [DOCKER] To use this property, see DNS name or IP address in Docker. Values: String. Default: localhost.
  • server.port: The license server port for security. Values: Integer. Default: 8888.
  • dataBaseDriverPath: Path of the database driver folder; it can be an absolute or relative path. Only one driver needs to be in this location. [DOCKER] To use this property, see path in Docker. Values: String. Default: see use default value.
  • dataBaseAWSSecretName: AWS Secrets Manager stores the database connection details (host, port, user, password, etc.) in order to protect sensitive information. Data Migrator uses this property to refer to the secret name in AWS Secrets Manager, retrieve the database information, and establish the database connection (a minimal sketch follows this list). If this property is specified along with other properties (dataBaseHost, dataBasePort, dataBaseName, dataBaseUser, dataBasePassword, dataBaseServiceName), the secret's information replaces the values mentioned directly in the .ini or property file. Refer to the FAQ to learn how to add a customized field (e.g. Oracle Service Name) for a database and use it in the Data Migrator. Values: String. Default: None.
  • dataBaseAWSRegion: AWS region where the database information (host, port, user, password, etc.) is stored in the secret. Values: String. Default: eu-west-3.
  • dataBaseCustomConnectionUrl: This property can be used to provide the connection string for the database. If the dataBaseUser and dataBasePassword properties are defined in the configuration files along with dataBaseCustomConnectionUrl, the username and password values taken into account are those defined in the configuration file. If you wish to use only dataBaseCustomConnectionUrl for login, you must remove the dataBaseUser and dataBasePassword properties from the configuration files. [POSTGRES] When the property stepDatabaseKillDropCreate=true is used, both dataBaseName and dataBaseUser must be kept in the configuration files or AWS Secrets Manager. This property can be stored in AWS Secrets Manager with dbcustomconnectionurl as key; see the dataBaseAWSSecretName property. Values: String. Default: None.
  • adminDataBaseCustomConnectionUrl: [POSTGRES] This property can be used to provide the connection string for the database. This property can be stored in AWS Secrets Manager with admindbcustomconnectionurl as key; see the dataBaseAWSSecretName property. Values: String. Default: None.
  • customSqlldrUserId: [ORACLE] This optional property can be used to provide the USERID for SQLLDR. This property can be stored in AWS Secrets Manager with customSqlldrUserId as key; see the dataBaseAWSSecretName property. This property becomes mandatory if the dataBaseUser and dataBasePassword properties are not defined in the configuration files and dataBaseCustomConnectionUrl is used to connect to the database. Values: String. Default: None.
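
As an illustration, a Database section that delegates its connection details to AWS Secrets Manager could be reduced to the following sketch; the connection name and secret name are hypothetical, and the secret is assumed to contain the host, port, database name, user, and password:

[Database - cnxFromSecret]
databaseType=POSTGRE
dataBaseAWSSecretName=my-database-secret
dataBaseAWSRegion=eu-west-3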

Steps

  • stepDatabaseKillDropCreate: [POSTGRESQL] Drop and create the database. Values: true/false. Default: true.
  • stepDatabaseCreateScriptPath: [POSTGRESQL] This optional property helps to create a database with a custom script. Its value can be a relative or absolute path to the SQL file. More information is in this FAQ. Values: String. Default: None.

POSTGRE configuration step example

[Database - cnxPostgreSQL1]
adminDataBaseName=postgres
adminDataBaseUser=postgres
adminDataBasePassword=P@ssw0rd 
dataBaseHost=localhost 
dataBasePort=5432 
dataBaseName=mydb 
dataBaseUser=myuser 
dataBasePassword=mypwd
dataBaseAdditionalArgs="?currentSchema=APL002" 
stepDatabaseKillDropCreate=true
stepDatabaseCreateScriptPath=database/database/create-database.sql
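
ORACLE configuration step example

The example below is an illustrative sketch; the connection name, credentials, service name, and driver path are hypothetical and should be adapted to your environment:

[Database - cnxOracle1]
databaseType=ORACLE
dataBaseHost=localhost
dataBasePort=1521
dataBaseServiceName=ORCLPDB1
dataBaseUser=myuser
dataBasePassword=mypwd
dataBaseDriverPath=drivers/oracle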

Creation or Deletion step

The creation or deletion step contains the following properties, used to execute the targeted SQL scripts.

  • database: The connection name. Values: String. Default: None.
  • enable: Whether the step is enabled. Values: true/false. Default: true.
  • encoding: Encoding to consider when reading files. Values: String. Default: depends on the Global step.
  • multithread: Number of threads to use for parallel steps. Values: Integer. Default: depends on the Global step.
  • input.folder: The folder containing the SQL files. Values: String. Default: None.
  • plainExecution: Whether to execute the whole SQL file instead of the parts separated by ";" (an illustrative example follows the creation step example below). Values: true/false. Default: false.

Deletion step example

[ExecuteSql - Drop tables] 
input.folder=database/tables-drop 
database=cnxPostgreSQL1 
multithread=1

Creation step example

[ExecuteSql - Create tables] 
input.folder=database/tables 
database=cnxPostgreSQL1 
multithread=1
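
Creation step example with plainExecution

The following sketch is illustrative; the section name and input folder are hypothetical. Setting plainExecution=true makes each SQL file run as a whole instead of being split on ";", which can be useful when a file contains statements that themselves embed semicolons:

[ExecuteSql - Create procedures]
input.folder=database/procedures
database=cnxPostgreSQL1
plainExecution=true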

Data loading step

The data loading step contains the following properties to consider when loading data to the database.

DATA

  • database: The connection name. Values: String. Default: None.
  • enable: Whether the step is enabled. Values: true/false. Default: true.
  • encoding: Data encoding. Values: String. Default: depends on the Global step.
  • multithread: Number of threads to use for parallel steps (one thread per table to load). Values: Integer. Default: depends on the Global step.
  • input.folder: The folder containing the data:
    • Plain files must have the name of the table.
    • Folders must have the name of the table; all files directly inside will be used (no recursion).
    • Archive files must have the name of the table; all files inside matching the zipContentWildcard property will be used.
    Values: String (absolute or relative path of the folder). Default: None.
  • mssqlInsertMode: [MSSQL] This optional property can be used to choose the mode used to load data (an illustrative MSSQL example is shown after the EBCDIC data loading example below). By default, Data Migrator uses the BULK INSERT statement to load data. The INSERT INTO statement is used for POC projects, and the BCP utility is used to support more database types; to use this utility, see install BCP utility. Values: BULK_INSERT, BCP_UTILITY, INSERT_INTO. Default: BULK_INSERT.
  • zipContentWildcard: Comma-separated list of wildcards used to filter the content of data archive files. Values: comma-separated list. Default: *.
  • oracle.disableSqlldr: [ORACLE] Whether or not to disable the SQL Loader step, allowing CSV files to be generated only. Values: true/false. Default: false.
  • oracle.ctlOptionClauseSqlldr: [ORACLE] Specifies command-line parameters in the SQL*Loader control file; refer to the documentation for more information. Provide each parameter and its value in the key=value format; multiple parameters are separated by commas, as in key1=value1,key2=value2. Values: comma-separated key=value list. Default: None.
  • nbFileThreads: Number of files to be processed in parallel within a single table thread. For Oracle, this triggers the PARALLEL=true SQL Loader option, which is incompatible with constraints already existing on the target table. Values: Integer. Default: 1.
  • defaultFixedValues: See AS400 Specificity. Default: None.
  • convertGraphicDataToFullWidth: Converts half-width characters to full-width characters for DB columns of type Graphic. The default value is false, meaning the original data from the input file is migrated as is. If the value is true, all Graphic-type data is converted to full-width characters and any extra characters are trimmed. Values: true/false. Default: false.

CSV

  • csvSeparator: Separator between fields. Default: , (comma).
  • csvQuote: Quote character. Default: ' (single quote).
  • csvNull: Replacement character for an empty value. Default: None.
  • csvOneLiner: In databases such as Oracle or PostgreSQL, a column value can contain a line-feed (new line) character. In a CSV input file, such data spreads a single table row over several lines. If the value is set to false, Data Migrator reads the CSV line by line and pushes the data to the database; this is suitable for data without line-feed characters and improves performance. If the value is set to true, Data Migrator detects these cases with the help of csvQuote, merges the lines, and imports the data. Note: the csvQuote character must enclose the whole column value that contains the line-feed character. Example: to import Hello\nWorld as a single column, the whole value must be surrounded by csvQuote (double quote), as "Hello\nWorld". Additional feature: for NUMERIC columns, the comma is replaced by a dot to fit the POSTGRES decimal format. This property is only applicable to Postgres and Oracle databases. Values: true/false. Default: false.
  • csvWithHeader: Defines whether data files contain a header. Values: true/false. Default: false.

EBCDIC

  • ebcdicOid: Indicates the presence of an OID before each record. Values: true/false. Default: true.
  • ebcdicOidLength: Indicates the length of the OID before each record. Values: Integer. Default: 2.
  • ebcdicOidForLength: Works with ebcdicOid=true and ebcdicOidLength != 0; takes the OID bytes to compute the length of the current record instead of the length deduced from SQLModel.json. Values: true/false. Default: false.
  • ebcdicZonedMode: Indicates the strategy to apply when handling zoned-decimal fields. Values: EBCDIC_STRICT, EBCDIC_MODIFIED, AS400. Default: EBCDIC_STRICT.
  • ebcdicCodePointShift: Code point shift to apply. See CodeShiftPoint. Values: Integer. Default: 384.
  • ebcdicVarcharAsChar: Considers all VARCHAR columns as CHAR while migrating data. Values: true/false. Default: false.
  • ebcdicByteReplacement: Specifies a byte replacement list by setting the to-replace and replacing code points. Before applying the codeshiftpoint offset (if any), this enables converting each byte having the to-replace code point in the considered encoding into the replacing code point. Example: 30:94|XX:YY with CP037 replaces (IRS) by [;]. Default: None (empty).
  • ebcdicNullByteIndicatorPositionMode: Where to find the Null Byte Indicator (NBI) in the provided data files for nullable columns. Values: ALWAYS_LEFT, RIGHT_FOR_DATES, ALWAYS_RIGHT, IGNORED. Default: RIGHT_FOR_DATES.
  • ebcdicEmptyVarcharMode: Behavior to apply when dealing with VARCHAR columns with a size of 0. Values: LOW_VALUE, NULL_VALUE, SINGLE_SPACE, FILL_SPACE. Default: LOW_VALUE.
  • ebcdicVarcharKeepPadding: Reads all the VARCHAR definition data, if provided, instead of only the useful data indicated by the first 2 length bytes, so that padding characters can be caught. Cannot be used with ebcdicFilesWithVarcharInVB set to true. Values: true/false. Default: false.
  • ebcdicFilesWithVarcharInVB: For VARCHAR, the 2-byte length is used to read the bytes instead of the column size definition from SQLModel.json. Values: true/false. Default: false.
  • ignoreDeletedRecords: Set to true to pre-read each record and test whether its bytes are all blanks in the EBCDIC encoding. If blank, the tool ignores the record and continues to the next one. This option is useful if the REUSEDLT option in AS/400 is set to YES to reuse deleted records. Values: true/false. Default: false.
  • ebcdicCsvDumpFolder: Absolute or relative path of the folder used to store the CSV data extracted from the EBCDIC. More information is in the debug page. Values: String. Default: None.
  • onyExtractCsv: Instructs the Data Migrator to only extract the CSV from the EBCDIC. By default, the value is "false". If the value is "true", Data Migrator generates the CSV and does not migrate data. This property depends on the ebcdicCsvDumpFolder property for the path; if that property is empty or not set, the CSV and cutting log file are automatically generated under the "data/extract" folder. Values: true/false. Default: false.

Miscellaneous

  • temp.folder: Location of temporary files. If empty, Java handles it in the user folder. Values: absolute folder path.
  • keepDebugFiles: Keeps temporary files at the end of the migration. Not available for POSTGRESQL, since the pipe mechanism is used; use debug.folder instead. Values: true/false. Default: false.
  • date.format: [ORACLE] Date pattern to add for the Date column (DB2) definition. Values: String. Default: None.
  • time.format: [ORACLE] Timestamp pattern to add for the Time column (DB2) definition. Values: String. Default: None.
  • timestamp.format: [ORACLE] Timestamp pattern to add for the Timestamp column (DB2) definition. Values: String. Default: None.
  • error.maxNb: [ORACLE] Maximum number of tolerated errors before canceling the data migration. Values: Integer. Default: 50.
  • spannedMode: When this option is active, tables containing BLOBs/CLOBs are considered to have been unloaded with the SPANNED YES option. In this mode, VARCHAR, BLOB, and CLOB columns have a dynamic payload carried by the 2 (resp. 4) bytes preceding the data. Each BLOB is exported to a separate .dat file, referenced in the ctl command. CLOBs are exported in a single file, with the configured CSV delimiter, and referenced in the ctl command. For ORACLE only. Values: true/false. Default: false.
  • debug.folder: See Solve Loading Trouble (POSTGRESQL only). Values: String (absolute or relative path of the folder). Default: None.
  • mssql_server_temp.folder: [DOCKER] Path of temp.folder on the server where SQL Server is running. Values: String. Default: temp.folder.

CSV Data loading example

[CSV2DB - Loading1] 
csvQuote=' 
csvSeparator=, 
database=cnxOracle1 
encoding=UTF-8 
input.folder=datas 
keepDebugFiles=true 
multithread=5 
temp.folder=dataTemp

EBCDIC Data loading example

[EBCDIC2DB - Loading1] 
csvQuote=' 
csvSeparator=, 
database=cnxOracle1 
encoding=CP1145 
ebcdicCodePointShift=0 
ebcdicNullByteIndicatorPositionMode=ALWAYS_RIGHT 
ebcdicOid=false 
ebcdicOidLength=0 
ebcdicVarcharAsChar=false 
ebcdicZonedMode=EBCDIC_STRICT 
input.folder=data 
keepDebugFiles=true 
multithread=10 
temp.folder=dataTemp 
zipContentWildcard=P???
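
MSSQL CSV Data loading example

The example below is an illustrative sketch; the connection name cnxMssql1 and the values are hypothetical, and mssqlInsertMode is shown explicitly even though BULK_INSERT is already its default:

[CSV2DB - LoadingMssql]
database=cnxMssql1
mssqlInsertMode=BULK_INSERT
csvSeparator=,
csvQuote='
csvWithHeader=true
encoding=UTF-8
input.folder=data
multithread=4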

QDDS Conversion step

The QDDS conversion step contains the following properties to consider when converting the QDDS data.

  • database: The connection name. Values: String. Default: None.
  • enable: Whether the step is enabled. Values: true/false. Default: true.
  • encoding: Encoding to consider when reading files. Values: String. Default: depends on the Global step.
  • input.folder: The data conversion input folder. Values: String. Default: None.
  • output.folder: The data conversion output folder. Values: String. Default: None.
  • ignoreLogicallyDeletedRecords: Ignores records (rows) that are logically marked as deleted. Values: true/false. Default: false.
  • multithread: Number of threads to use for parallel steps. Values: Integer. Default: depends on the Global step.

QDDS Conversion step example

[ConvertQDDS - Conversion1] 
database=cnxPostgreSQL1 
encoding=CP037 
input.folder=inputs 
output.folder=data/DataFiles