BIZFILE – California Secretary of State Filing Services

California Business Connect

(CBC) is an IT project that aims to automate paper-based processes, allowing businesses to file documents and request copies of records online 24 hours a day, 7 days a week.

It will also provide access to Secretary of State business records, allowing the public and government agencies to perform their functions more efficiently, and will allow fee payments to be processed within one business day.

The project was approved in 2011 and is set to go public in mid-2019.

California BizFile was commissioned by the California Secretary of State's office, which processes close to 2 million business filings and other customer orders each year. Business functions such as registering business entities, trademarks, and service marks are handled by the Business Programs Division of the Secretary of State's office.

The Business Programs Division performs the "Filing Process," in which information submitted by businesses is reviewed for filing. The filing information is available upon request to California businesses, government agencies, and other customers; some specific information is made available publicly online.

Business filings deliver numerous benefits to individuals and to private and public agencies by providing information such as:

  • Evidence of the formation, registration, and modification of domestic and foreign business entities.
  • Evidence of the key persons or entities operating business entities.
  • Evidence of the registration and modification of trademarks and service marks.
  • Evidence for court cases and law enforcement investigations.
  • Information to government agencies for taxing, licensing, and regulatory purposes.
  • Proof of existence or good standing to open bank accounts, obtain financing, obtain licenses, enter into contracts, and conduct other official business in California.

For more information, visit

Project Overview:

The Business Programs Division performs a variety of activities in support of the core functions of the Secretary of State. The previous filing processes were varied, relying on older information technology systems and paper workflows that supported the different types of filings and orders.

The Business Programs Division's goal is to have all filing and order types supported by a single set of common processes running on systems with a shared architecture.

For example, the Trademark Registration pilot was developed and implemented based on the desired process model, using cloud-based services such as the "Government Online Forms & Workflow Automation" electronic workflow, along with in-house development of web services.

To support the desired business process and the electronic workflow, CBC provides a cloud-based DevOps solution based on the Microsoft Azure platform and .NET technologies to design, develop, and implement integrated web applications and web services that facilitate real-time, bi-directional transmission of data between public-facing systems and internal systems of record.

Launched on January 17, 2018, Trademark Registration Online allows you to submit trademark and service mark registrations online, including required supporting documentation. It enables faster processing of both electronic and paper submissions, and provides filing tips and samples to help you get your filing right the first time.

The highly successful ECI team was brought on to the project and achieved multiple milestones: 

  • Collaborated with cross-functional teams, stakeholders and sponsors to develop the business requirements
  • Incorporated an Agile mindset to help the team focus on delivering shippable products in incremental development iterations.
  • Facilitated and collaborated across teams involved in delivering integrated modules and led Intranet/Extranet applications, workflows, and web services.
  • Supported the assigned development team when it encountered problems beyond its expertise by performing the following tasks:
    • Apply best practices in software development using Microsoft IDE, Azure DevOps, and .NET technologies
    • Design and develop web services, dynamic libraries, and web applications, using the appropriate programming techniques and languages
    • Develop and maintain cloud-based SQL Server databases and advanced SQL scripts
    • Create logic, system, and program flows
    • Write and execute unit test plans
    • Track and resolve processing issues
    • Participate in the review of code/systems for proper design standards, content and functionality.
    • Participate in all aspects of the Systems Development Life Cycle.
    • Adhere to established source control versioning policies and procedures
  • Performed Agile planning and project management using Azure DevOps
  • Developed the data migration plan and led the development team in selecting, preparing, extracting, and transforming data and permanently transferring it from legacy systems to the newly designed database on a cloud-based platform.
  • Acted as servant leader in establishing a continuous plan of action to improve the team’s success toward commitments and goals.
  • Applied a Continuous Integration (CI) process to automate builds and testing
  • Used Azure BLOB Storage to organize and maintain the documents and attachments associated with the filing process.
  • Designed and developed a self-installing service to automate and support integration and database management back-end services

While this project is still ongoing, ECI's part has been successfully completed. We look forward to the tools we helped create being fully utilized by the general public.

E-Procurement Systems

A quick dive into the current state of procurement online and what we can expect.

Estrada Consulting Inc.


  • E-procurement has seen steady year-over-year growth.
  • Measures are being taken to further e-procurement across the United States.
  • This growth presents many opportunities for contractors to get more involved.

This month, GovWin and Deltek Technologies published reports on a survey of contractors and governments across the country, examining the procurement industry across different variables.

Their report pointed strongly to a series of measures being taken in the industry that are creating a new landscape for contractors and governments alike.

A big part of this is the emergence of e-procurement systems as the major way to procure new contracts across the United States.

What is E-Procurement, and why is it growing?

Traditionally, procurement was paper based, usually with procurement officers interacting with long-time partners or well-known suppliers and purchasing at fixed prices. In recent years, this has changed somewhat to become a strategic function: Procurement officers seek suppliers that fit with a company’s overall strategy. (Source)

E-procurement involves moving the procurement process online to cut out steps and save money. For example, traditional procurement involves getting quotes and then approval, probably from finance, as well as a purchase order, which could take more than a week. With e-procurement, this process is simplified and sped up considerably, thanks to real-time interaction with pre-approved suppliers and trading partners, who can be anywhere in the world. With online purchasing, the purchase can be approved online, and the order completed within minutes; the required item often arrives within days. (Source)

What does the survey say?

The survey found that 55% of new contracts are procured through electronic means, continuing the historical trend of year-over-year growth in e-procurement across the United States. It has been argued that e-procurement allows an agency to handle or scale up to more purchases without requiring much more staff time.

How do these numbers vary across the country?

Differences by type or size of agency were often noteworthy. Agencies most likely to use these systems included state governments, those serving populations of 500,000 or more, and those with staff working extra hours.

The data shows a strong correlation between population density and the implementation of e-procurement systems.

In conclusion

E-procurement systems are on the rise, with positive feedback coming from both sides of the aisle. We can therefore expect more and more local governments to enter the realm of e-procurement, which presents a big opportunity for the companies building these platforms.


Automated Testing For Data Migration

Data Migration, Part 4/4, Automation

Data migration testing revolves around verifying the integrity, content, and quality of data between the source and destination databases based on the given mapping rules.

Broadly, any data migration involves the following:

– Table level mapping

– Element level mapping (column level)

– New elements in the New DB

– Old elements from the legacy DB to ignore

– Data not to be converted from the legacy to the new system

– Count Check

Table-level mapping defines how the tables in the old DB map to tables in the new DB. Based on the mapping rules and how the new system is designed, the table-level mapping can be one-to-many, many-to-one, or one-to-one.

Element-level mapping defines how each column in the legacy system maps to the columns in the new system; this can also be one-to-many, many-to-one, or one-to-one.

New elements in the new DB are the columns that are specific to the new DB and do not exist in the legacy system.

Ignored elements from the legacy system are the columns that are exempted from the mapping rules and never brought into the new system.

Not-to-be-converted data are the records in the legacy DB that are exempted from conversion to the new DB, for whatever reason.

Count check is one of the most important validation checks, ensuring that the source record count is in sync with the target count based on the given mapping rules.

Essentially every automated/manual testing for Data migration projects focuses on the above validation points.
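As an illustration, the count check can be sketched in a few lines. The example below uses Python with in-memory SQLite databases standing in for the actual source and target systems; the table names and the one-to-one mapping are hypothetical.

```python
import sqlite3

# Hypothetical one-to-one table mapping; real mappings come from the project's
# mapping rules and may also be one-to-many or many-to-one.
TABLE_MAP = {"legacy_customers": "customers"}

def count_check(src, dst, table_map):
    """Compare row counts between each mapped source and target table."""
    results = {}
    for src_table, dst_table in table_map.items():
        src_count = src.execute(f"SELECT COUNT(*) FROM {src_table}").fetchone()[0]
        dst_count = dst.execute(f"SELECT COUNT(*) FROM {dst_table}").fetchone()[0]
        results[src_table] = (src_count, dst_count, src_count == dst_count)
    return results

# In-memory databases stand in for the legacy and new systems.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE legacy_customers (id INTEGER)")
dst.execute("CREATE TABLE customers (id INTEGER)")
src.executemany("INSERT INTO legacy_customers VALUES (?)", [(1,), (2,), (3,)])
dst.executemany("INSERT INTO customers VALUES (?)", [(1,), (2,), (3,)])

for table, (s, d, ok) in count_check(src, dst, TABLE_MAP).items():
    print(f"{table}: source={s} target={d} {'PASS' if ok else 'FAIL'}")
```

For one-to-many or many-to-one mappings, or where not-to-be-converted records exist, the expected target count would be derived from the mapping rule rather than compared directly.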

Because we deal with millions upon millions of records, manually testing a data migration project is next to impossible; the situation demands an automated process. Every time there is a new extraction of data, or any change to the code, we should be able to trigger the automated scripts to check the sanctity and integrity of the data.

This can be achieved with a set of SQL scripts that compare the source data to the target data, based on the mapping rules, for each pair of source and target tables.
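As a sketch of such a comparison (Python with SQLite in place of the real source and target databases; the filing tables are made up for illustration), the mapped projections of the two databases can be diffed in both directions, much like running SQL EXCEPT each way:

```python
import sqlite3

def diff_mapped_data(src, dst, src_sql, dst_sql):
    """Diff the mapped projections of two databases as sets of rows,
    a cross-connection equivalent of SQL EXCEPT in both directions."""
    src_rows = set(src.execute(src_sql).fetchall())
    dst_rows = set(dst.execute(dst_sql).fetchall())
    return src_rows - dst_rows, dst_rows - src_rows

# In-memory databases with hypothetical legacy/new schemas.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE legacy_filings (id INTEGER, status TEXT)")
dst.execute("CREATE TABLE filings (filing_id INTEGER, filing_status TEXT)")
src.executemany("INSERT INTO legacy_filings VALUES (?, ?)",
                [(1, "FILED"), (2, "PENDING")])
dst.executemany("INSERT INTO filings VALUES (?, ?)",
                [(1, "FILED"), (2, "FILED")])  # record 2 migrated wrongly

missing, unexpected = diff_mapped_data(
    src, dst,
    "SELECT id, status FROM legacy_filings",
    "SELECT filing_id, filing_status FROM filings")
print(missing)     # source value absent from target
print(unexpected)  # target value absent from source
```

An empty result in both directions means the element-level mapping held for every migrated row.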

A simple methodology, without using any expensive off-the-shelf testing tool, is to use Excel macros:

– Use macros in Excel to connect to the source and target DBs.

– Store the SQL scripts that validate each table- and element-level mapping in the macros.

– Write the result of each execution back to Excel, which confirms whether the mapping passed or failed.
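The same workflow can be mirrored outside Excel. The sketch below, in Python, is an illustrative analogue of the macro approach: stored SQL checks run against the database and a PASS/FAIL row is written per check (here to a CSV file instead of a worksheet). All table and check names are hypothetical.

```python
import csv
import sqlite3

# Each stored check must return zero rows to pass; names are hypothetical.
TEST_CASES = [
    ("orphaned_filings",
     "SELECT id FROM filings WHERE entity_id NOT IN (SELECT id FROM entities)"),
]

def run_suite(conn, test_cases, report_path):
    """Run each stored SQL check against the target DB and write one
    PASS/FAIL row per check, the way the macro writes results back."""
    with open(report_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["test_case", "failing_rows", "result"])
        for name, sql in test_cases:
            failing = len(conn.execute(sql).fetchall())
            writer.writerow([name, failing, "PASS" if failing == 0 else "FAIL"])

# Demo: filing 11 references a nonexistent entity, so the check fails.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE entities (id INTEGER)")
conn.execute("CREATE TABLE filings (id INTEGER, entity_id INTEGER)")
conn.execute("INSERT INTO entities VALUES (1)")
conn.executemany("INSERT INTO filings VALUES (?, ?)", [(10, 1), (11, 99)])
run_suite(conn, TEST_CASES, "migration_report.csv")
```

Because only the pass/fail summary is written out, the report stays small no matter how many rows the underlying queries scan.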

The Pros:

1. It is user-friendly and doesn't require anyone to learn the complexities of expensive off-the-shelf tools.

2. All the SQL scripts used in the test execution are stored in one place and are easy to reference.

3. Any mapping rule change means a change to a SQL script, and it's easy to locate and update the corresponding script.

4. The testing environment (source and destination DBs) is likely to change from execution to execution depending on availability, and this can be configured in Excel as a drop-down.

5. As the actual execution happens against the DB, validating millions of records is not a challenge, and the Excel row limit doesn't kick in because only the results of the execution are stored back in the workbook.

6. It's quite easy to filter the failed test cases, focus on the issues, and re-run those cases once the issues are fixed.

7. We can configure the macros to run for a specific set of tables, or only the failed test cases, rather than the entire test suite, based on the regression cycle needed for a specific run, thereby saving time and effort.

8. Because it's Excel, we can produce polished reports once the run is complete without depending on expensive tools and their reporting services.

9. We can even integrate this with the development team's unit test cases and run it as they complete coding for each table, thereby delivering more mature, less buggy code to QA.

10. On average, 2,000 test cases, with as many SQL queries, take about 30 minutes to execute and report, depending on the load on the DB at the time. This shows how much time is saved compared with executing that many test cases manually.

11. All of this comes without expensive data migration testing tools, saving the project a great deal of cost.


The major factor that should drive QA automation for any project, not just a migration project, is cost-benefit analysis. Expensive off-the-shelf automation tools make sense if and only if the returns justify the investment. If we can achieve even 80 percent of the efficiency with an inexpensive tool already in hand, we should always use it and get the best out of it.

As for data migration projects, the biggest challenge is always the sheer quantity of data to be tested while still producing quality output. Most of the time, the very databases where the data is loaded, combined with a smart set of Excel macros, can achieve the desired results.

Data Migration Part 3 of 4

Test Scenarios, in general, would be as below:

I) If the migration is to the same type of database, then:

  • Verify that queries executed in the new database yield the same results as in the old one
  • Verify that the number of records in the old and new databases is the same; an appropriate automation tool can be used here
  • Verify that there are no redundancies and that the new database works exactly as the old one did
  • Verify that the schema, relationships, and table structures are unaltered or set back to match the old database image
  • Verify that changes made in the application update the new database with correct values and types
  • Verify that the new database connection is provided to all components of the application: application, server, interfaces, firewall, network connectivity, etc.
  • Verify that the query performance (time taken to execute complex queries) of the new database is no worse than before
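For the schema check in particular, here is a minimal sketch using Python and SQLite (other engines expose the same column metadata through information_schema.columns); the table name is hypothetical:

```python
import sqlite3

def table_schema(conn, table):
    """Return (column name, declared type) pairs for a table, via SQLite's
    PRAGMA table_info (rows are cid, name, type, notnull, default, pk)."""
    return [(row[1], row[2]) for row in conn.execute(f"PRAGMA table_info({table})")]

# Old and new databases with the same declared structure.
old = sqlite3.connect(":memory:")
new = sqlite3.connect(":memory:")
old.execute("CREATE TABLE marks (id INTEGER, owner TEXT)")
new.execute("CREATE TABLE marks (id INTEGER, owner TEXT)")

print(table_schema(new, "marks") == table_schema(old, "marks"))  # True
```

Any added, dropped, or retyped column shows up as a difference between the two schema lists.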

Challenges faced in this testing are mainly with data. Below are a few of them:

#1) Data Quality:

We may find that the data used in the legacy application is of poor quality in the new/upgraded application. In such cases, data quality has to be improved to meet business standards.

Factors such as assumptions, data conversions after migration, invalid data entered in the legacy application itself, and poor data analysis lead to poor data quality. This results in high operational costs, increased data integration risks, and deviation from the purpose of the business.

#2) Data Mismatch:

Data migrated from the legacy to the new/upgraded application may turn out to mismatch in the new one. This may be due to a change in data type or data storage format, or because the purpose for which the data is used has been redefined.

This results in a huge effort to either correct the mismatched data or accept it and tweak it to the new purpose.

#3) Data Loss:

Data might be lost while migrating from the legacy to the new/upgraded application, in either mandatory or non-mandatory fields. If the lost data belongs to non-mandatory fields, the record will still be valid and can be updated again.

But if a mandatory field's data is lost, the record itself becomes void and cannot be recovered within the application. This results in significant data loss, and the data must be retrieved either from the backup database or from audit logs, if those were captured correctly.

#4) Data Volume:

Huge volumes of data require a lot of time to migrate within the downtime window of the migration activity, e.g., scratch cards in the telecom industry or users on an Intelligent Network platform. The challenge here is that by the time the legacy data is cleared, a huge amount of new data will have been created, which needs to be migrated again. Automation is the solution for huge data migrations.

#5) Simulation of a real-time environment (with the actual data):

Simulating a real-time environment in the testing lab is another real challenge: testers run into different kinds of issues with the real data and the real system that were not encountered during testing.

So data sampling, replication of the real environment, and identification of the volume of data involved in the migration are quite important while carrying out data migration testing.

#6) Simulation of the volume of data:

Teams need to study the data in the live system very carefully and should come up with the typical analysis and sampling of the data.

E.g., users in age groups below 10 years, 10-30 years, etc. As far as possible, data needs to be obtained from the live system; if not, data creation needs to be done in the testing environment. Automated tools need to be used to create a large volume of data. Extrapolation can be used, wherever applicable, if the volume cannot be simulated.
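A stratified generator is one simple way to create such data. The sketch below (Python; the age-band proportions are invented for illustration) produces synthetic user records whose age distribution mirrors proportions sampled from the live system:

```python
import random

# Hypothetical age-band proportions obtained by sampling the live system.
AGE_BANDS = [((0, 9), 0.10), ((10, 30), 0.50), ((31, 90), 0.40)]

def generate_users(n, bands, seed=42):
    """Generate n synthetic user records whose age distribution mirrors
    the sampled proportions, for volume testing without live data."""
    rng = random.Random(seed)  # fixed seed so runs are reproducible
    users = []
    for (lo, hi), share in bands:
        users += [{"age": rng.randint(lo, hi)} for _ in range(round(n * share))]
    return users

users = generate_users(1000, AGE_BANDS)
print(len(users))  # 1000
```

Scaling n up gives the extrapolated volumes mentioned above while keeping the same distribution shape.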


Data Migration Testing Part 2 of 4

Verification Requirements in the Migrated Environment: 

It is highly important to have a verification process built into the migration testing process.

The following tests are designed for a hypothetical test case. 

  • Check whether all the data in the legacy system is migrated to the new application within the planned downtime. To ensure this, compare the number of records between the legacy and new applications for each table and view in the database. Also report the time taken to move, say, 10,000 records.
  • Check whether all the schema changes (fields and tables added or removed) as per the new system are updated.
  • Data migrated from the legacy to the new application should retain its value and format unless specified otherwise. To ensure this, compare data values between the legacy and new applications' databases.
  • Test the migrated data against the new application, covering the maximum possible number of cases. To ensure 100% coverage of data migration verification, use an automated testing tool.
  • Check for database security.
  • Check for data integrity for all possible sample records.
  • Check and ensure that the earlier supported functionality in the legacy system works as expected in the new system.
  • Check the data flow within the application which covers most of the components.
  • The interface between the components should be extensively tested, as the data should not be modified, lost, and corrupted when it is going through components. Integration test cases can be used to verify this.
  • Check for legacy data redundancy; no legacy data should be duplicated during migration
  • Check for data mismatch cases such as data type changes, storage format changes, etc.
  • All the field level checks in the legacy application should be covered in the new application as well
  • Any data addition in the new application should not reflect back on the legacy
  • Updating legacy application’s data through the new application should be supported. Once updated in the new application, it should not reflect back on the legacy.
  • Deleting the legacy application’s data in the new application should be supported. Once deleted in the new application, it should not delete data in legacy as well.
  • Verify that the changes made to the legacy system support the new functionality delivered as a part of the new system.
  • Verify that users from the legacy system can continue to use both the old and new functionality, especially where changes are involved. Execute the test cases and compare against the test results stored during pre-migration testing.
  • Create new users on the system and carry out tests to ensure that functionality from the legacy as well as the new application, supports the newly created users and it works fine.
  • Carry out functionality related tests with a variety of data samples (different age group, users from different region etc.,)
  • It is also required to verify that 'Feature Flags' are enabled for the new features and that switching them on/off turns the features on and off.
  • Performance testing is important to ensure that migration to new system/software has not degraded the performance of the system.
  • It is also required to carry out Load and stress tests to ensure the system stability.
  • Verify that the software upgrade has not opened up any security vulnerabilities and hence carry out security testing, especially in the area where changes have been made to the system during migration.
  • Usability is another aspect to be verified: if the GUI layout/front-end or any functionality has changed, how easy does the end user find the system compared to the legacy one?
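One inexpensive way to check that data is not modified or corrupted as it passes between components is to compare per-row fingerprints on both sides of an interface. A minimal sketch (Python with SQLite standing in for the real databases; the table is hypothetical):

```python
import hashlib
import sqlite3

def row_fingerprints(conn, sql):
    """Hash every row of a query result so two large result sets can be
    compared for corruption without holding full copies of both."""
    return {hashlib.sha256(repr(row).encode()).hexdigest()
            for row in conn.execute(sql)}

# Two sides of an interface; one value gets corrupted in transit.
a = sqlite3.connect(":memory:")
b = sqlite3.connect(":memory:")
for conn in (a, b):
    conn.execute("CREATE TABLE agents (id INTEGER, name TEXT)")
a.executemany("INSERT INTO agents VALUES (?, ?)", [(1, "Ann"), (2, "Bob")])
b.executemany("INSERT INTO agents VALUES (?, ?)", [(1, "Ann"), (2, "B0b")])

sql = "SELECT id, name FROM agents ORDER BY id"
print(row_fingerprints(a, sql) == row_fingerprints(b, sql))  # False
```

Matching fingerprint sets give quick confidence that the data crossed the interface intact; a mismatch pinpoints exactly which rows to investigate.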

Since the scope of post-migration testing is large, it is ideal to segregate the important tests that need to be done first to qualify the migration as successful, and then carry out the remaining tests later.

It is also advisable to automate the end-to-end functional test cases and other possible test cases, so that testing time is reduced and results are available quickly.