SSRS Custom Authentication Part 2

Implementing Custom SSRS Authentication

Let's start working on the implementation.

Step 1: Creating the UserAccounts Database

Create the tables exactly as shown in the Database Model section explained earlier.

Step 2: Building the Sample

Microsoft provides a sample project that implements custom authentication. It can be downloaded from https://github.com/microsoft/Reporting-Services/tree/master/CustomSecuritySample2016

If you have not already created a strong name key file, generate one by following the instructions at https://docs.microsoft.com/en-us/dotnet/framework/app-domains/how-to-sign-an-assembly-with-a-strong-name

Before compiling the sample, let's make a small change to it.

  • Open the Logon.aspx file
  • Remove the Register button from the file

In our example, users will be registered based on the AD FS claims.

To compile the sample using Visual Studio

  • Open CustomSecuritySample.sln in Microsoft Visual Studio.
  • In Solution Explorer, select the CustomSecuritySample project.
  • Look at the CustomSecuritySample project’s references. If you do not see Microsoft.ReportingServices.Interfaces.dll, then complete the following steps:
    • On the Project menu, click Add Reference. The Add References dialog box opens.
    • Click the .NET tab.
    • Click Browse, and find Microsoft.ReportingServices.Interfaces on your local drive. By default, the assembly is in the <install>\ReportServer\bin directory. Click OK. The selected reference is added to your project.
  • On the Build menu, click Build Solution.

Debugging

To debug the extension, you might want to attach the debugger to both ReportingServicesService.exe and Microsoft.ReportingServices.Portal.WebHost.exe, and add breakpoints to the methods implementing the IAuthenticationExtension2 interface.
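For reference, the interface members where breakpoints are most useful look roughly like this (a sketch from memory of the sample, not a verbatim copy; verify the exact signatures against Microsoft.ReportingServices.Interfaces and the sample's AuthenticationExtension.cs):

// Sketch of the authentication interfaces implemented by the sample.
public interface IAuthenticationExtension : IExtension
{
    // Called when the Logon.aspx page (or the SOAP LogonUser API) submits credentials.
    bool LogonUser(string userName, string password, string authority);

    // Called by the report server to validate user names referenced in policies.
    bool IsValidPrincipalName(string principalName);

    // Returns the identity of the current request.
    void GetUserInfo(out IIdentity userIdentity, out IntPtr userId);
}

public interface IAuthenticationExtension2 : IAuthenticationExtension
{
    // Overload added for SSRS 2016+, giving access to the request context (cookies, headers).
    void GetUserInfo(IRSRequestContext requestContext, out IIdentity userIdentity, out IntPtr userId);
}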

Step 3: Deployment and Configuration

The basic configuration needed for a custom security extension is the same as in previous releases. The following changes are needed in web.config and rsreportserver.config, both present in the ReportServer folder. There is no longer a separate web.config for Report Manager; the portal inherits the same settings as the Report Server endpoint.

To deploy the sample

  • Copy the Logon.aspx page to the <install>\ReportServer directory.
  • Copy Microsoft.Samples.ReportingServices.CustomSecurity.dll and Microsoft.Samples.ReportingServices.CustomSecurity.pdb to the <install>\ReportServer\bin directory.
  • Copy Microsoft.Samples.ReportingServices.CustomSecurity.dll and Microsoft.Samples.ReportingServices.CustomSecurity.pdb to the <install>\RSWebApp\bin directory.

If a PDB file is not present, it was not created by the Build step provided above. Ensure that the Project Properties for Debug/Build is set to generate PDB files.

Modify files in the ReportServer Folder

To modify the RSReportServer.config file

  • Open the RSReportServer.config file with Visual Studio or a simple text editor such as Notepad. RSReportServer.config is located in the <install>\ReportServer directory.
  • Locate the <AuthenticationTypes> element and modify the settings as follows:

<Authentication>
   <AuthenticationTypes>
      <Custom/>
   </AuthenticationTypes>
   <RSWindowsExtendedProtectionLevel>Off</RSWindowsExtendedProtectionLevel>
   <RSWindowsExtendedProtectionScenario>Proxy</RSWindowsExtendedProtectionScenario>
   <EnableAuthPersistence>true</EnableAuthPersistence>
</Authentication>

  • Locate the <Security> and <Authentication> elements, within the <Extensions> element, and modify the settings as follows:

<Security>
   <Extension Name="Forms" Type="Microsoft.Samples.ReportingServices.CustomSecurity.Authorization, Microsoft.Samples.ReportingServices.CustomSecurity">
      <Configuration>
         <AdminConfiguration>
            <UserName>username</UserName>
         </AdminConfiguration>
      </Configuration>
   </Extension>
</Security>

<Authentication>
   <Extension Name="Forms" Type="Microsoft.Samples.ReportingServices.CustomSecurity.AuthenticationExtension, Microsoft.Samples.ReportingServices.CustomSecurity" />
</Authentication>

 

To modify the RSSrvPolicy.config file

 

  • Add a code group that grants FullTrust permission to your custom security extension. You do this by adding the code group to the RSSrvPolicy.config file.
  • Open the RSSrvPolicy.config file located in the <install>\ReportServer directory.
  • Add the following <CodeGroup> element after the existing code group that has a URL membership of $CodeGen. Make sure to change the path below to match your Report Server installation directory:

<CodeGroup
   class="UnionCodeGroup"
   version="1"
   Name="SecurityExtensionCodeGroup"
   Description="Code group for the sample security extension"
   PermissionSetName="FullTrust">
   <IMembershipCondition
      class="UrlMembershipCondition"
      version="1"
      Url="C:\Program Files\Microsoft SQL Server\MSRS13.MSSQLSERVER\Reporting Services\ReportServer\bin\Microsoft.Samples.ReportingServices.CustomSecurity.dll"/>
</CodeGroup>

To modify the Web.config file for Report Server

  • Open the Web.config file in a text editor. By default, the file is in the <install>\ReportServer directory.
  • Locate the <identity> element and set the impersonate attribute to false.

<identity impersonate="false" />

  • Locate the <authentication> element and change the mode attribute to Forms. Also, add the following <forms> element as a child of the <authentication> element and set the loginUrl, name, timeout, path, domain, and enableCrossAppRedirects attributes as follows:

<authentication mode="Forms">
   <forms loginUrl="logon.aspx" name="sqlAuthCookie" timeout="60" path="/" domain=".yourDomain" enableCrossAppRedirects="true"></forms>
</authentication>

  • Add the following <authorization> element directly after the <authentication> element.

<authorization>
   <deny users="?" />
</authorization>

This will deny unauthenticated users the right to access the report server. The loginUrl attribute of the <forms> element configured earlier will redirect unauthenticated requests to the Logon.aspx page.

 

Step 4: Other changes required in web.config and Microsoft.ReportingServices.Portal.WebHost.exe.config

Adding Machine Keys

  • In <RSPATH>\ReportServer\web.config, add the following under <system.web>:

<machineKey validationKey="[YOUR KEY]" decryptionKey="[YOUR KEY]" validation="SHA1" decryption="AES" compatibilityMode="Framework45"/>

  • Then, in <RSPATH>\RSWebApp\Microsoft.ReportingServices.Portal.WebHost.exe.config, add the following under <configuration>:

<system.web>
   <machineKey validationKey="[YOUR KEY]" decryptionKey="[YOUR KEY]" validation="SHA1" decryption="AES" compatibilityMode="Framework45" />
</system.web>

Our example is a little different from Microsoft's: we set validation to SHA1 and compatibilityMode to Framework45 in order to share cookies between our landing page (the MVC app) and SSRS. Both machineKey elements must use the same keys and algorithms for cookie sharing to work.
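One way to generate the [YOUR KEY] values is shown in the minimal C# sketch below (this is an assumption for illustration, not part of the original post; any cryptographically random hex string of the right length works):

// Generates random machine keys as upper-case hex strings.
using System;
using System.Security.Cryptography;

class MachineKeyGenerator
{
    static string NewKey(int byteLength)
    {
        var bytes = new byte[byteLength];
        using (var rng = RandomNumberGenerator.Create())
        {
            rng.GetBytes(bytes);
        }
        return BitConverter.ToString(bytes).Replace("-", string.Empty);
    }

    static void Main()
    {
        // 64 bytes for the validationKey, 32 bytes (AES-256) for the decryptionKey.
        Console.WriteLine("validationKey: " + NewKey(64));
        Console.WriteLine("decryptionKey: " + NewKey(32));
    }
}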

Step 5: Configure Passthrough cookies

The new portal and the report server communicate using internal SOAP APIs for some operations. When additional cookies need to be passed from the portal to the server, the PassThroughCookies property is still available. More details: https://msdn.microsoft.com/en-us/library/ms345241.aspx. In the rsreportserver.config file, add the following <CustomAuthenticationUI> section under <UI>:

<UI>
   <CustomAuthenticationUI>
      <PassThroughCookies>
         <PassThroughCookie>sqlAuthCookie</PassThroughCookie>
      </PassThroughCookies>
   </CustomAuthenticationUI>
</UI>

 

Integrating MVC with SSRS

Since the focus of this post is to show how to authenticate with SSRS, I will not go through how to communicate with AD FS.

Assuming that your AD FS integration is done and working, let's see below how to implement the User Registration Process.

Step 1: Creating the Authentication Class

  • Add the SSRS Service web reference. The Web Service url is <Your_SSRS_ReportServerURL>/ReportService2010.asmx
  • Add a new class to your MVC project named AuthenticationHelper
  • This class needs to implement the Singleton pattern to avoid losing policies when setting them through the SSRS web service. See the Singleton sketch below.
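A minimal sketch of the Singleton, assuming the class name AuthenticationHelper from the previous step (the original post's exact implementation may differ):

using System;

// Thread-safe Singleton: all policy changes go through the single Instance,
// so concurrent registrations do not overwrite each other's SSRS policies.
public sealed class AuthenticationHelper
{
    private static readonly Lazy<AuthenticationHelper> instance =
        new Lazy<AuthenticationHelper>(() => new AuthenticationHelper());

    // Private constructor so the only instance is the one exposed below.
    private AuthenticationHelper() { }

    public static AuthenticationHelper Instance
    {
        get { return instance.Value; }
    }
}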

The class must have the following methods:

  • private List<OrgGroupRole> GetOrgGroups(int orgId)
    • This method will query for the Organization Roles
    • Query: SELECT * FROM OrgGroupRole WHERE OrganizationId = @orgId
  • private void CreateUser(string userName, int orgId)
    • This method will create a new User
    • Query: INSERT INTO Users VALUES(@userName, @organizationId)
  • public Organization GetOrgId(string orgName)
    • This method will find the OrgId based on the Organization name
    • Query: SELECT * FROM Organization WHERE UPPER(OrganizationName) = UPPER(@orgName)
  • private bool IsUserRegistered(string username, int orgId)
    • This method will check if the user already exists
    • Query: SELECT 1 FROM Users WHERE UserName = @userName and OrganizationId = @orgId
  • private List<string> GetSSRSRoleNames(List<int> ids)
    • This method will get the SSRS roles that are mapped to the Organization roles
    • Query: $"SELECT DISTINCT SSRSRoleName FROM OrgGroupRoleSSRS WHERE OrgGroupRoleId IN ({string.Join(", ", ids)})"

 

For simplicity, the methods above are described only with their signatures and the queries you need to execute to return the data.
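As an illustration, here is a minimal sketch of how one of them, IsUserRegistered, could be implemented with ADO.NET (the "UserAccounts" connection string name is an assumption):

// Inside AuthenticationHelper; requires System.Configuration and System.Data.SqlClient.
private bool IsUserRegistered(string userName, int orgId)
{
    // Assumes a "UserAccounts" connection string defined in web.config.
    var connectionString = ConfigurationManager.ConnectionStrings["UserAccounts"].ConnectionString;

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "SELECT 1 FROM Users WHERE UserName = @userName AND OrganizationId = @orgId", connection))
    {
        command.Parameters.AddWithValue("@userName", userName);
        command.Parameters.AddWithValue("@orgId", orgId);
        connection.Open();
        return command.ExecuteScalar() != null;
    }
}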

The remaining methods are described below:

 

SetPolicies Method

This method calls the SSRS web service to set policies for the user being registered on the folder path passed as a parameter.
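A minimal sketch of what SetPolicies could look like, assuming a classic web reference proxy named ReportingService2010 (GetPolicies, SetPolicies, Policy and Role come from the ReportService2010 SOAP API; the helper's parameter names are assumptions):

// Inside AuthenticationHelper; requires System.Collections.Generic and the generated web reference namespace.
private void SetPolicies(string itemPath, string userName, List<string> ssrsRoleNames)
{
    var service = new ReportingService2010
    {
        Url = "<Your_SSRS_ReportServerURL>/ReportService2010.asmx",
        // Credential/cookie handling depends on how your report server endpoint is secured.
        Credentials = System.Net.CredentialCache.DefaultCredentials
    };

    // Keep the folder's existing policies and append one entry per role for this user.
    bool inheritParent;
    var policies = new List<Policy>(service.GetPolicies(itemPath, out inheritParent));

    foreach (var roleName in ssrsRoleNames)
    {
        policies.Add(new Policy
        {
            GroupUserName = userName,
            Roles = new[] { new Role { Name = roleName } }
        });
    }

    service.SetPolicies(itemPath, policies.ToArray());
}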

AddUserRoles Method

This method calls the SetPolicies method twice: first for the SSRS root folder with the Report Builder role (you can use whichever role works best for you on the root folder), and then again for the organization's folder path.
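A sketch of AddUserRoles under the same assumptions:

// Inside AuthenticationHelper.
private void AddUserRoles(string userName, string orgFolderPath, List<string> mappedSsrsRoles)
{
    // Grant a minimal role on the root folder so the user can browse down to their own folder.
    SetPolicies("/", userName, new List<string> { "Report Builder" });

    // Grant the organization-specific roles on the organization's own folder.
    SetPolicies(orgFolderPath, userName, mappedSsrsRoles);
}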

ExecuteUserBasedAuthenticationFlow Method

This method executes the authentication flow using the AD FS claims as input parameters.
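A sketch of how the flow could be stitched together from the methods above (the Organization and OrgGroupRole property names are assumptions based on the database model in Part 1):

// Inside AuthenticationHelper; requires System.Linq.
public bool ExecuteUserBasedAuthenticationFlow(string userName, string orgName, List<string> adfsRoles)
{
    // 1. Resolve the organization from the AD FS Organization claim.
    var organization = GetOrgId(orgName);
    if (organization == null)
    {
        return false;
    }

    // 2. Register the user on first access.
    if (!IsUserRegistered(userName, organization.OrganizationId))
    {
        CreateUser(userName, organization.OrganizationId);
    }

    // 3. Map the AD FS role claims to SSRS roles via OrgGroupRole / OrgGroupRoleSSRS.
    var orgGroupRoles = GetOrgGroups(organization.OrganizationId);
    var matchedIds = orgGroupRoles
        .Where(r => adfsRoles.Contains(r.RoleName))
        .Select(r => r.OrgGroupRoleId)
        .ToList();
    var ssrsRoles = GetSSRSRoleNames(matchedIds);

    // 4. Set the policies on the root folder and on the organization's folder.
    AddUserRoles(userName, organization.FolderPath, ssrsRoles);
    return true;
}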

Step 2: Creating a filter to register users

Once authentication happens in AD FS and the MVC application executes its actions, the user authentication flow will be executed.

To do this, create an Action Filter and override the OnActionExecuting method, as shown below.

The filter will get the necessary claims, run the authentication flow and, if the flow succeeds, create the Forms Authentication cookie that allows you to access SSRS reports.
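A minimal sketch of such a filter (not the post's original code; the appSettings keys and claim types are assumptions and must match your AD FS setup and the SSRS configuration above):

using System.Configuration;
using System.Linq;
using System.Security.Claims;
using System.Web;
using System.Web.Mvc;
using System.Web.Security;

public class RegisterSsrsUserAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        var identity = filterContext.HttpContext.User.Identity as ClaimsIdentity;
        if (identity == null || !identity.IsAuthenticated)
        {
            return;
        }

        // Claim types are read from web.config so they can match what AD FS returns.
        var userName = identity.FindFirst(ConfigurationManager.AppSettings["UserNameClaim"])?.Value;
        var orgName = identity.FindFirst(ConfigurationManager.AppSettings["OrganizationClaim"])?.Value;
        var roles = identity.FindAll(ConfigurationManager.AppSettings["RoleClaim"])
                            .Select(c => c.Value).ToList();

        // Run the registration flow and, on success, issue the cookie SSRS expects.
        if (AuthenticationHelper.Instance.ExecuteUserBasedAuthenticationFlow(userName, orgName, roles))
        {
            var cookieName = ConfigurationManager.AppSettings["SsrsCookieName"];   // e.g. sqlAuthCookie
            var ticket = new FormsAuthenticationTicket(userName, false, 60);
            var cookie = new HttpCookie(cookieName, FormsAuthentication.Encrypt(ticket))
            {
                Domain = ConfigurationManager.AppSettings["SsrsCookieDomain"]      // e.g. .yourDomain
            };
            filterContext.HttpContext.Response.Cookies.Add(cookie);
        }

        base.OnActionExecuting(filterContext);
    }
}

Register the filter globally (for example in FilterConfig.RegisterGlobalFilters) or on the controllers that serve reports.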

The SSRS cookie name, domain, and claim names are set in the web.config file (appSettings).

The domain and cookie name need to match the ones used in the SSRS setup.

The claim names need to match those returned by AD FS; change them if needed.

SSRS Custom Authentication Part 1

Authentication with the Report Server

SQL Server Reporting Services (SSRS) offers several configurable options for authenticating users and client applications against the report server. By default, the report server uses Windows Integrated authentication and assumes trusted relationships where client and network resources are in the same domain or in a trusted domain. Depending on your network topology and the needs of your organization, you can customize the authentication protocol that is used for Windows Integrated authentication, use Basic authentication, or use a custom forms-based authentication extension that you provide. Each of the authentication types can be turned on or off individually. You can enable more than one authentication type if you want the report server to accept requests of multiple types.

All users or applications who request access to report server content or operations must be authenticated before access is allowed.

When Windows Integrated Authentication does not meet the requirements

In one of our projects we had a scenario where Windows Authentication would not help us to meet the project requirements.

The project used AD FS to authenticate users from different organizations in a single MVC application. Once the users were authenticated, they needed to access SSRS based on their AD FS Username and Role claims.

By Microsoft's definition, SSRS does not support Single Sign-On technologies, so what could we do? To solve this problem, a custom authentication was implemented.

Custom Authentication Flow

Before we start going through the technical implementation, it is important to understand the authentication flow we used for this scenario.

User Registration Process

See below how the User Registration Process works:

  1. User authenticates in the AD FS
  2. AD FS returns Organization, Role and Username claims
  3. The App will look for the Organization in the Organization Table
  4. The app will create the user in the Users table linked to the Organization
  5. The app will get the AD FS roles and look for them in the OrgGroupRole table
  6. For each role found in the OrgGroupRole Table, the app will map them to SSRS Roles based on the OrgGroupRoleSSRS
  7. Once the mapping is done, the app will call the SSRS web services to set the policies for the user in the organization's folder (each organization will have its own folder in SSRS)

Database Model

Let’s look at the Database model we created to map AD FS Claims to SSRS.

This model is used to create users and map the organization roles to real SSRS roles. See the table’s details below:

  • Organization Table – As mentioned earlier, AD FS will integrate with different organizations. This table will store the organizations and the SSRS folder path for each organization
  • Users Table – This table will store the users of the organizations
  • OrgGroupRole Table – This table will store the organization roles provided by AD FS claims
  • OrgGroupRoleSSRS Table – This table will relate the organization roles to real SSRS roles

 

Continue Reading Part 2 

Automated Testing For Data Migration

Data Migration, Part 4/4, Automation

Data migration testing revolves around verifying the integrity, content and quality of data between the source and destination DB based on the given mapping rules.

Broadly, any data Migration has the following:

– Table level mapping

– Element level mapping (column level)

– New elements in the New DB

– Ignore a few old elements from the Legacy DB

– Not to be converted data from Legacy to the New system.

– Count Check

Table Level mapping defines how the tables from the old DB should be mapped to the new DB. Based on the mapping rules and how the new system is designed, the table level mapping could be one to many or many to one or one to one.

Element Level mapping defines how each column in the legacy system is mapped to the columns in the new system; this could also be one to many, many to one or one to one.

New elements in the new DB are the columns that are specific to the new DB and do not exist in the legacy system.

Ignored elements from the legacy system are the columns that are exempted from the mapping rules and never brought to the new system.

Not-to-be-converted data are the records from the legacy DB that are exempted from being converted to the new DB for whatever reason.

Count Check is one of the most important validations; it ensures that the source count is in sync with the target count based on the given mapping rule.

Essentially every automated/manual testing for Data migration projects focuses on the above validation points.

Because we deal with millions and millions of records, manual testing of a data migration project can be next to impossible; the situation demands an automated process, and every time there is a new extraction of data or any change to the code, we should be able to trigger the automated scripts to check the sanctity and integrity of the data.

This can be achieved with a set of SQL scripts that compare the source data to the target data, based on the mapping rules, for each pair of source and target tables.

A simple methodology, without using any expensive off-the-shelf testing tool, is to use Excel macros:

– Get the macros in Excel to connect to the source and target DB.

– Store in the macros the SQL scripts that validate each table- and element-level mapping.

– Store the result of each execution back in Excel, which confirms whether the mapping passed or failed (a minimal code sketch of the count-check idea follows this list).
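The same count-check idea, expressed here as a minimal C# sketch rather than the Excel/VBA macro the post uses (connection strings and table names are placeholders):

using System;
using System.Data.SqlClient;

class CountCheck
{
    // Runs a COUNT(*) comparison for one source/target table pair and reports pass/fail.
    static bool CompareCounts(string sourceConn, string targetConn,
                              string sourceTable, string targetTable)
    {
        long sourceCount = RunScalar(sourceConn, $"SELECT COUNT_BIG(*) FROM {sourceTable}");
        long targetCount = RunScalar(targetConn, $"SELECT COUNT_BIG(*) FROM {targetTable}");

        Console.WriteLine($"{sourceTable} -> {targetTable}: {sourceCount} vs {targetCount} " +
                          (sourceCount == targetCount ? "PASS" : "FAIL"));
        return sourceCount == targetCount;
    }

    static long RunScalar(string connectionString, string sql)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            return (long)command.ExecuteScalar();
        }
    }
}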

The Pros:

1. User-friendly, and doesn't need anyone to learn and understand the complexities of expensive off-the-shelf tools.

2. All the SQL scripts used as part of the test execution are stored in one place and are easy to reference.

3. Any mapping rule change means a change in a SQL script, and it's easy to go and make the change to the corresponding script.

4. It's very likely that the testing environment (source and destination DB) will change between executions depending on availability, and this can be configured in Excel as a drop-down.

5. As the actual execution happens against the DB, validating millions of records is not a challenge, and the Excel row limit doesn't kick in because only the results of the execution get stored back in Excel.

6. It's quite easy to filter the failed test cases, focus on the issues and re-run them once the issues are fixed.

7. We can even configure the macros to run for a specific set of tables or only the failed test cases, rather than the entire test suite, based on the regression cycle needed for a specific run, thereby saving time and effort.

8. As it is Excel, we can get polished reports once the run is complete without depending on expensive tools and their reporting services.

9. We can even integrate this with the development team's unit test cases and run it as they complete coding for each table, thereby delivering more mature, less buggy code to QA.

10. On average, 2,000 test cases with as many SQL queries take about 30 minutes to execute and report, depending on the load on the DB at the time, which shows how much time is saved compared with executing that many test cases manually.

11. All this without expensive data migration testing tools, thereby saving the project a lot of cost.

Conclusion

The major factor that should drive QA automation for any project, not just a migration project, is a cost-benefit analysis. It makes sense to go for expensive off-the-shelf automation tools if and only if the returns justify the investment. As long as we can achieve even 80 percent of the efficiency with an existing, inexpensive in-house tool, we should always shoot for that and get the best out of it.

As for a data migration project, the biggest challenge is always the sheer quantity of data to be tested while still getting a quality output. Most of the time, the very databases where the data gets loaded, combined with a smart use of Excel macros, can achieve the desired results.

Data Migration Part 3 of 4

Test Scenarios, in general, would be as below:

I) If the migration is to the same type of database, then:

  • Verify that queries executed in the new database yield the same results as in the old one
  • Verify that the number of records in the old database and the new database is the same; use an appropriate automation tool here
  • Verify that there are no redundancies and the new database works exactly as the old one did
  • Verify that the schema, relationships and table structures are unaltered or set back to match the old database image
  • Verify whether changes made in the application update the new database with the correct values and types
  • Verify that the new database connection is provided to all the components of the application: application, server, interfaces, firewall, network connectivity, etc.
  • Verify that the query performance (time taken to execute complex queries) of the new database is not worse than the earlier performance

Challenges faced in this testing are mainly with data. Below are a few:

#1) Data Quality:

We may find that the data used in the legacy application is of poor quality in the new/upgraded application. In such cases, data quality has to be improved to meet business standards.

Factors like assumptions, data conversions after migration, invalid data entered in the legacy application itself, poor data analysis, etc. lead to poor data quality. This results in high operational costs, increased data integration risks, and deviation from the purpose of the business.

#2) Data Mismatch:

Data migrated from the legacy to the new/upgraded application may be found to be mismatching in the new one. This may be due to a change in data type or in the format of data storage, or because the purpose for which the data is used has been redefined.

This results in a large effort to either correct the mismatched data or accept it and tweak the system to that purpose.

#3) Data Loss:

Data might be lost while migrating from the legacy to the new/upgraded application. This may be with mandatory fields or non-mandatory fields. If the data lost is for non-mandatory fields, then the record for it will still be valid and can be updated again.

But if a mandatory field's data is lost, then the record itself becomes void and it cannot be retracted. This results in huge data loss, and the data will have to be retrieved either from the backup database or from audit logs, if captured correctly.

#4) Data Volume:

Huge data requires a lot of time to migrate within the downtime window of the migration activity. E.g.: scratch cards in the telecom industry, users on an Intelligent Network platform, etc. Here the challenge is that by the time the legacy data is cleared, a huge amount of new data will have been created, which needs to be migrated again. Automation is the solution for huge data migration.

#5) Simulation of a real-time environment (with the actual data):

Simulating a real-time environment in the testing lab is another real challenge, where testers run into different kinds of issues with the real data and the real system that are not faced during testing.

So data sampling, replication of the real environment, and identification of the volume of data involved in the migration are quite important while carrying out data migration testing.

#6) Simulation of the volume of data:

Teams need to study the data in the live system very carefully and should come up with the typical analysis and sampling of the data.

E.g.: users in the age group below 10 years, 10-30 years, etc. As far as possible, data from the live system needs to be obtained; if not, data creation needs to be done in the testing environment. Automated tools need to be used to create a large volume of data. Extrapolation, wherever applicable, can be used if the volume cannot be simulated.

 

Data Migration Testing Part 2 of 4

Verification Requirements in the Migrated Environment: 

It is highly important to have a verification process built into a migration testing process.

The following tests are designed for a hypothetical test case. 

  • Check whether all the data in the legacy is migrated to the new application within the downtime that was planned. To ensure this, compare the number of records between legacy and the new application for each table and views in the database. Also, report the time taken to move say 10000 records.
  • Check whether all the schema changes (fields and tables added or removed) as per the new system are updated.
  • Data migrated from the legacy to the new application should retain its value and format unless specified otherwise. To ensure this, compare data values between the legacy and the new application's databases.
  • Test the migrated data against the new application. Here cover a maximum number of possible cases. To ensure 100% coverage with respect to data migration verification, use the automated testing tool.
  • Check for database security.
  • Check for data integrity for all possible sample records.
  • Check and ensure that the earlier supported functionality in the legacy system works as expected in the new system.
  • Check the data flow within the application which covers most of the components.
  • The interface between the components should be extensively tested, as the data should not be modified, lost, and corrupted when it is going through components. Integration test cases can be used to verify this.
  • Check for legacy data redundancy. No legacy data should be duplicated during migration
  • Check for data mismatch cases, like changed data types, changed storage formats, etc.
  • All the field level checks in the legacy application should be covered in the new application as well
  • Any data addition in the new application should not reflect back on the legacy
  • Updating legacy application’s data through the new application should be supported. Once updated in the new application, it should not reflect back on the legacy.
  • Deleting the legacy application’s data in the new application should be supported. Once deleted in the new application, it should not delete data in legacy as well.
  • Verify that the changes made to the legacy system support the new functionality delivered as a part of the new system.
  • Verify that users from the legacy system can continue to use both the old functionality and the new functionality, especially where changes are involved. Execute the test cases stored during pre-migration testing and compare against the stored results.
  • Create new users on the system and carry out tests to ensure that functionality from the legacy as well as the new application, supports the newly created users and it works fine.
  • Carry out functionality related tests with a variety of data samples (different age group, users from different region etc.,)
  • It is also required to verify that 'Feature Flags' are enabled for the new features, and that switching them on/off turns the features on and off.
  • Performance testing is important to ensure that migration to new system/software has not degraded the performance of the system.
  • It is also required to carry out Load and stress tests to ensure the system stability.
  • Verify that the software upgrade has not opened up any security vulnerabilities and hence carry out security testing, especially in the area where changes have been made to the system during migration.
  • Usability is another aspect to be verified: if the GUI layout/front-end or any functionality has changed, assess the ease of use the end user experiences compared to the legacy system.

Since the scope of Post-Migration testing becomes large, it is ideal to segregate the important tests that need to be done first to qualify that Migration is successful and then carry out the remaining later.

It is also advisable to automate the end-to-end functional test cases and other possible test cases so that the testing time can be reduced and the results are available quickly.