Testing of Software for User Behavior Compliance Management

Industry
Information Technology
Technologies
Selenium

Customer

Founded in 1986, Consul Risk Management is an authority in policy-based security audit and compliance. The Consul InSight™ Suite provides the unique ability to capture comprehensive log data, correlate the data through sophisticated log interpretation, and communicate results through a dashboard for full audit and compliance reporting. To reduce the threats posed by privileged insiders, Consul InSight monitors change management procedures, acceptable use policies and user authorization processes against company and regulatory policies.

More than 350 customers around the world rely on Consul to accelerate log management and user monitoring, including AEGON Canada, Blue Cross/Blue Shield, Fidelity Financial Services, Ford, Kroger, The New York Times, Office Depot, Philadelphia Stock Exchange, Wachovia and government agencies. Consul has offices in the United States and the Netherlands, and 25 partners worldwide, including BMC Software.

In January 2007 Consul was acquired by IBM and became part of IBM's Tivoli software unit. The acquisition strengthened IBM's Service Management initiative by adding key data governance and compliance monitoring, auditing and reporting capabilities across mainframe and distributed environments, unmatched by competitors. Accordingly, the product name was changed to Tivoli Compliance Insight Manager, or TCIM.

ScienceSoft has been a partner of Consul since August 2004. The ScienceSoft team is constantly growing and currently consists of 16 test engineers, 27 developers and 2 project managers.

Challenge

After the Consul product became part of the larger Tivoli environment, a complex integration task arose. To comply with the standards of the new environment, the product had to be moved from the Oracle database engine to the IBM DB2 database engine. User management, previously based on internal authentication, had to be moved to IBM Tivoli Directory Server authentication. All project documentation had to be created in the new format. The new release was named TCIM 8.5 (Tivoli Compliance Insight Manager Version 8.5).

An additional challenge for ScienceSoft was the move to the new IBM environment. That meant new standards of work, including new documentation templates, tools and approaches. The TCIM testing team started using more detailed specifications and based their work on the quality assurance plan as the main test document.

The dedicated team had several special assignments: system testing of the product in an environment as close to the Customer's as possible, extended performance testing to make sure that moving to the new database engine would not affect the product's performance, and internationalization testing, required because the product supported multiple languages. The development process was spread across different countries, which increased the complexity of communication. In addition, the Customer requested almost doubling the number of dedicated TCIM testing engineers. All testing was to be performed by ScienceSoft.

Solution

All test activities needed to be coordinated with international teams. This coordination was established by weekly progress meetings and e-mail communication.

Automated TCIM testing and unit testing

The main difference in the new release was the conversion to a different platform, which put the product's full functionality at risk: the migration to the new database engine affected every feature of the product.

Given this situation, ScienceSoft's dedicated team decided to invest as much effort as possible in test automation. Many of the tests were regression tests, which made them the best candidates for automation.

To ensure an effective TCIM testing process, we assigned the most experienced testers to create a test API that encapsulated the most complex functions. The rest of the team, whose experience was mostly in manual testing, focused on creating automated test cases on top of that API (see the sketch after the list below).

As a result, we achieved several goals:

  • Improved test automation skills of the team;
  • Increased test automation coverage;
  • Tested the product.
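
To make the approach concrete, here is a minimal sketch of what such a shared test API layer could look like. It is written in Java against the modern Selenium WebDriver API purely for illustration; the class name, page URL and element locators are assumptions, not artifacts of the actual project, which relied on the Selenium and IBM Rational tooling of that time.

// Illustrative sketch only: a shared test API that wraps complex UI flows so that
// testers with mostly manual-testing experience can build automated cases on top of it.
// All locators, URLs and names below are hypothetical.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

public class TcimTestApi {
    private final WebDriver driver;

    public TcimTestApi(WebDriver driver) {
        this.driver = driver;
    }

    // Encapsulates the login flow; test-case authors call one method instead of
    // dealing with individual page elements.
    public void login(String baseUrl, String user, String password) {
        driver.get(baseUrl + "/login");                           // assumed URL
        driver.findElement(By.name("username")).sendKeys(user);   // assumed locator
        driver.findElement(By.name("password")).sendKeys(password);
        driver.findElement(By.id("loginButton")).click();         // assumed locator
    }

    // A higher-level function composed of primitive UI steps.
    public void openComplianceDashboard() {
        driver.findElement(By.linkText("Compliance Dashboard")).click();
    }
}

With an API like this, a test case reads as a short sequence of calls (log in, open a dashboard, verify a report), which is what allowed the less automation-experienced testers to contribute automated coverage quickly.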

One of the most challenging tasks was testing the results of the database conversion. The database contained a set of complex views, stored procedures, tables, etc., all of which needed to be converted from Oracle to DB2. To test the migration, we decided to create unit tests in PL/SQL. The tests were created in the Oracle environment and sent to the DB conversion team, who used them to verify their code after the conversion. The benefit of this approach was time saved for the DB conversion team, because they did not have to study the functionality of the product and could simply run the unit tests. With the database covered by unit tests as well, the product was now protected from regression issues in the database layer.
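
The actual unit tests were written in PL/SQL, but the idea of verifying a converted database object against a baseline captured on Oracle can be sketched in Java with JUnit and JDBC. The connection string, view name and expected value below are hypothetical.

// Illustrative sketch only: verify that a view behaves the same after the Oracle-to-DB2
// conversion by comparing its result against a baseline recorded in the Oracle environment.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class ViewConversionTest {

    @Test
    public void convertedViewMatchesOracleBaseline() throws Exception {
        // Hypothetical DB2 connection; the same check would first be run against Oracle
        // to establish the expected value.
        try (Connection con = DriverManager.getConnection(
                "jdbc:db2://localhost:50000/TCIMDB", "tester", "secret");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(
                 "SELECT COUNT(*) FROM AUDIT_EVENT_SUMMARY")) {   // assumed view name
            rs.next();
            assertEquals(12345L, rs.getLong(1));                  // assumed Oracle baseline
        }
    }
}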

Performance TCIM testing

An important requirement for this release was the performance of the product: with the DB2 engine it had to be the same or better. That made performance testing especially important. The most complex part of this task was that performance had to be measured not for a single aspect of the system but for the system as a whole.

To cope with this complex task, ScienceSoft created a performance test council consisting of testing specialists, development specialists and performance analysts. As a result, we worked out a performance measurement model, which the dedicated team used to carry out the measurements and build all the needed tools.
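
The central idea of the model, timing complete end-to-end scenarios rather than isolated components, can be sketched roughly as follows; the harness, scenario name and baseline value are assumptions for illustration, not the project's actual tooling.

// Illustrative sketch only: time whole-system scenarios end to end and compare the
// result against an Oracle-based baseline (DB2 results had to be equal or better).
import java.util.LinkedHashMap;
import java.util.Map;

public class PerformanceHarness {

    static long timeMillis(Runnable scenario) {
        long start = System.nanoTime();
        scenario.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        Map<String, Runnable> scenarios = new LinkedHashMap<>();
        // A hypothetical whole-system scenario: collect logs, run correlation, build a report.
        scenarios.put("collect-correlate-report", () -> {
            // ...drive the system under test here...
        });

        long oracleBaselineMs = 60_000;   // assumed baseline measured on the Oracle build
        for (Map.Entry<String, Runnable> e : scenarios.entrySet()) {
            long elapsed = timeMillis(e.getValue());
            System.out.printf("%s: %d ms (baseline %d ms)%n",
                    e.getKey(), elapsed, oracleBaselineMs);
        }
    }
}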

Manual functional testing

Despite the good automated test coverage, ScienceSoft's testing team still decided to put significant effort into manual functional testing. It was very important that no functionality was broken by the migration from Oracle to DB2, so even small parts of the software had to be covered by tests. We introduced several approaches that helped us finish testing on time.

Approach 1: We reused test cases created during previous releases, so we did not have to create the test design from scratch. Reusing these test cases also gave us confidence that they were correct and that the test data was sufficient, as each had been executed at least once before.

Approach 2: We organized a meeting where test engineers working on different parts of the project discussed how to combine their tests so that they shared the same environment and test data. The output of some tests served as input for others, and some tests were optimized so they could cover more functionality with the same effort. This optimization saved about 25% of test time.

Approach 3: Our plan was on schedule, but the schedule was still very tight, and delays in delivery or miscoordination between teams could lead to failures. That is why we performed a risk analysis of our test plans and prioritized test scenarios by the impact and likelihood of the failures that would go undetected if a test was skipped, as sketched below. This model allowed us to postpone lower-priority tests if the plan slipped.
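
A minimal sketch of such impact-and-likelihood prioritization is shown below; the 1 to 5 scales, scenario names and scores are assumptions for illustration, not the project's actual risk model.

// Illustrative sketch only: score each test scenario by the impact and likelihood of the
// failure it guards against, and treat the lowest-scoring scenarios as candidates to postpone.
import java.util.Comparator;
import java.util.List;

public class RiskPrioritizer {

    // Hypothetical scenario with impact and likelihood on assumed 1..5 scales.
    record Scenario(String name, int impact, int likelihood) {
        int riskScore() { return impact * likelihood; }
    }

    public static void main(String[] args) {
        List<Scenario> plan = List.of(
                new Scenario("event data migration into DB2", 5, 4),
                new Scenario("localized UI labels", 2, 2),
                new Scenario("report export", 3, 2));

        // Run high-risk scenarios first; the tail of the list can be postponed if the plan slips.
        plan.stream()
            .sorted(Comparator.comparingInt(Scenario::riskScore).reversed())
            .forEach(s -> System.out.printf("%-35s risk=%d%n", s.name(), s.riskScore()));
    }
}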

Security testing

The new release had to be tested for compliance with IBM security standards. It was also important to make sure that TCIM's security was at the same level as the previous version or better. The risk of security issues was high because all user management had been re-created from scratch. In addition, the product itself is a security system, so the cost of any security issue is very high.

In cooperation with developers and architects, we created a security map and identified potentially vulnerable areas that required special attention. As the data stored in TCIM is usually strictly confidential, we took into account not only unauthorized access issues but also data integrity and protection.

Results

Throughout the full-cycle TCIM testing project, all test activities were performed on a regular basis. Test planning was tracked regularly and was fully transparent to IBM management. Intermediate deadlines were met on time.

Despite the teams working in different environments, communication with all development teams went smoothly. Code was tested quickly and without delays, right after development was finished.

Technologies and Tools

High level GUI tests:

  • IBM Rational Functional Tester
  • IBM Rational Test Manager
  • OpenQA: Selenium
  • AutoIT

Middle-level tests for business logic & performance tests:

Java, Perl, C/C++, Windows Script Host

Database Unit Tests

PL/SQL
