In general, the test scenarios would be as below:

I) If the migration is to the same type of database:

  • Verify that the queries executed on the new database yield the same results as on the old one
  • Verify that the number of records in the old database and the new database is the same; use an appropriate automation tool here (see the comparison sketch after this list)
  • Verify that there are no redundancies and that the new database works exactly as the old one did
  • Verify that the schema, relationships, and table structures are unaltered, or are set back to match the old database image
  • Verify that changes made in the application update the new database with the correct values and data types
  • Verify that the new database connection is provided to all the components of the application: application, server, interfaces, firewall, network connectivity, etc.
  • Verify that the query performance (the time taken to execute complex queries) of the new database is no worse than the earlier performance
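
For instance, the record-count check can be automated by running the same query against both databases and comparing the results. Below is a minimal sketch using SQLAlchemy; the connection URLs and table names are hypothetical placeholders, not values from this article.

```python
# Minimal sketch: compare record counts between the old and new databases.
# Connection URLs and table names are hypothetical -- substitute your own.
import sqlalchemy as sa

OLD_DB_URL = "postgresql://user:pass@old-host/appdb"   # placeholder
NEW_DB_URL = "postgresql://user:pass@new-host/appdb"   # placeholder
TABLES = ["customers", "orders", "payments"]           # tables under test

old_engine = sa.create_engine(OLD_DB_URL)
new_engine = sa.create_engine(NEW_DB_URL)

def scalar(engine, query):
    """Run a single-value query and return its result."""
    with engine.connect() as conn:
        return conn.execute(sa.text(query)).scalar()

for table in TABLES:
    count_query = f"SELECT COUNT(*) FROM {table}"
    old_count = scalar(old_engine, count_query)
    new_count = scalar(new_engine, count_query)
    status = "OK" if old_count == new_count else "MISMATCH"
    print(f"{table}: old={old_count} new={new_count} -> {status}")
```

The same pattern extends to comparing the results of the complex business queries themselves, not just row counts.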

Challenges faced in this testing are mainly data-related. A few of them are listed below:

#1) Data Quality:

The data used in the legacy application may turn out to be of poor quality in the new/upgraded application. In such cases, the data quality has to be improved to meet business standards.

Factors such as incorrect assumptions, data conversions during migration, invalid data entered in the legacy application itself, and poor data analysis lead to poor data quality. This results in high operational costs, increased data-integration risks, and deviation from the purpose of the business.
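
Such business standards can be encoded as automated post-migration quality checks that scan the migrated rows for violations. A minimal sketch follows; the field names and the email-format rule are hypothetical examples, and real rules would come from the business requirements.

```python
# Minimal sketch of a post-migration data-quality check: scan migrated
# rows for values that violate simple business rules. Field names and
# rules are hypothetical examples.
import re

MANDATORY = ("customer_id", "email")
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def quality_issues(row: dict) -> list[str]:
    """Return a list of rule violations found in one migrated row."""
    issues = []
    for field in MANDATORY:
        if not row.get(field):
            issues.append(f"missing mandatory field: {field}")
    email = row.get("email")
    if email and not EMAIL_RE.match(email):
        issues.append(f"invalid email format: {email}")
    return issues

# Example: a row whose email was garbled during conversion.
print(quality_issues({"customer_id": 42, "email": "not-an-email"}))
```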

#2) Data Mismatch:

Data migrated from the legacy application may be found to mismatch in the new/upgraded one. This may be due to a change in the data type or the data storage format, or because the purpose for which the data is used has been redefined.

This results in a huge effort to make the necessary changes: either correct the mismatched data, or accept it and tweak the application to that purpose.
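
One practical way to catch data-type mismatches early is to compare the column definitions of the same table in both databases. The sketch below uses SQLAlchemy's schema inspector; the engines and table name are assumptions for illustration.

```python
# Minimal sketch: detect data-type mismatches by comparing the column
# definitions of the same table in the old and new databases.
import sqlalchemy as sa

def column_types(engine, table):
    """Map column name -> type string for one table."""
    inspector = sa.inspect(engine)
    return {c["name"]: str(c["type"]) for c in inspector.get_columns(table)}

def report_mismatches(old_engine, new_engine, table):
    old_cols = column_types(old_engine, table)
    new_cols = column_types(new_engine, table)
    for name, old_type in old_cols.items():
        new_type = new_cols.get(name)
        if new_type is None:
            print(f"{table}.{name}: column missing in new database")
        elif new_type != old_type:
            print(f"{table}.{name}: type changed {old_type} -> {new_type}")
```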

#3) Data Loss:

Data might be lost while migrating from the legacy application to the new/upgraded one. The lost data may belong to mandatory or non-mandatory fields. If the lost data belongs to non-mandatory fields, the record remains valid and the data can be updated again.

But if a mandatory field's data is lost, the record itself becomes void and cannot be retrieved. This results in major data loss, and the data has to be recovered either from the backup database or from audit logs, if they were captured correctly.
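
Both cases can be detected programmatically: lost records by comparing primary keys across the two databases, and voided records by looking for mandatory fields that arrived empty. A minimal sketch, assuming a hypothetical customers table whose email column is mandatory:

```python
# Minimal sketch: find records lost in migration (by primary key) and
# records whose mandatory field arrived empty. Table and column names
# are hypothetical.
import sqlalchemy as sa

def primary_keys(engine, table, pk_col):
    """Return the set of primary-key values in one table."""
    with engine.connect() as conn:
        rows = conn.execute(sa.text(f"SELECT {pk_col} FROM {table}"))
        return {row[0] for row in rows}

def check_data_loss(old_engine, new_engine, table="customers", pk_col="id"):
    # Whole records missing from the new database.
    lost = primary_keys(old_engine, table, pk_col) - primary_keys(new_engine, table, pk_col)
    if lost:
        print(f"{len(lost)} records lost from {table}: recover from backup/audit logs")
    # Mandatory-field loss: the record exists but a required value is NULL.
    with new_engine.connect() as conn:
        voided = conn.execute(
            sa.text(f"SELECT {pk_col} FROM {table} WHERE email IS NULL")
        ).fetchall()
    print(f"{len(voided)} records in {table} lost a mandatory field (email)")
```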

#4) Data Volume:

Huge volumes of data can require more time to migrate than the downtime window of the migration activity allows. E.g.: scratch cards in the telecom industry, users on an Intelligent Network platform, etc. Here the challenge is that by the time the legacy data is migrated, a large amount of new data will have been created, which needs to be migrated again. Automation is the solution for huge data migration.
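
A common pattern for the newly created data is a follow-up "delta" pass: after the bulk load, query the legacy system for records created since the migration cut-off and migrate only those. A minimal sketch, assuming a hypothetical created_at timestamp column:

```python
# Minimal sketch: find "delta" records created in the legacy system
# after the migration cut-off, for a follow-up migration pass.
# Table and column names are hypothetical.
import sqlalchemy as sa
from datetime import datetime

def delta_records(old_engine, table, cutoff: datetime):
    """Return IDs of legacy records created after the cut-off time."""
    query = sa.text(f"SELECT id FROM {table} WHERE created_at > :cutoff")
    with old_engine.connect() as conn:
        return [row[0] for row in conn.execute(query, {"cutoff": cutoff})]

# Records created after the snapshot taken at the start of the downtime window:
# ids = delta_records(old_engine, "scratch_cards", datetime(2024, 1, 15, 22, 0))
```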

#5) Simulation of a real-time environment (with the actual data):

Simulating the real-time environment in the testing lab is another real challenge: testers run into various issues with the real data and the real system that were not faced during testing.

So data sampling, replication of the real environment, and identification of the volume of data involved in the migration are quite important while carrying out data migration testing.

#6) Simulation of the volume of data:

Teams need to study the data in the live system very carefully and come up with a typical analysis and sampling of the data.

E.g.: users in the age group below 10 years, 10-30 years, etc. As far as possible, data from the live system should be obtained; if not, the data needs to be created in the testing environment. Automated tools need to be used to create a large volume of data. Extrapolation can be used, wherever applicable, if the volume cannot be simulated.
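
If live data cannot be obtained, the sampled distribution can drive a data generator. The sketch below builds synthetic users whose age-group mix matches proportions taken from such a sampling; the proportions and field names are illustrative assumptions, not figures from a real system.

```python
# Minimal sketch: generate a large volume of synthetic test data whose
# age-group distribution matches proportions sampled from the live
# system. Proportions and field names are hypothetical.
import random

AGE_GROUPS = {            # share of live users per group (from sampling)
    (0, 9): 0.10,
    (10, 30): 0.45,
    (31, 60): 0.35,
    (61, 99): 0.10,
}

def generate_users(n):
    """Yield n synthetic user records following the sampled distribution."""
    groups = list(AGE_GROUPS)
    weights = list(AGE_GROUPS.values())
    for i in range(n):
        low, high = random.choices(groups, weights=weights)[0]
        yield {"user_id": i, "age": random.randint(low, high)}

# Extrapolate: validate on 100k rows, then scale findings to the live volume.
users = list(generate_users(100_000))
print(len(users), users[0])
```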