Replacement of the existing system, extending its functionality and integrating it with other applications used by the client.
What have we done?
The project covers analysis of the client's requirements, design and implementation of the software, manual and automated testing of the application, and maintenance of the implemented system. A major challenge in this project is integrating the application with the programs the client already uses.

Today about 50 people work on the project: analysts, architects, developers, DevOps engineers and testers. The whole team watches over quality assurance at each stage of software development:
- starting from the requirements definition, where the requirements are subject to reviews (refinements),
- through the implementation stage, in which the application code undergoes unit and component tests, static code analysis and a review by the architect,
- ending with the testing stage, which takes place in several phases.
To ensure the quality level required by the client, we operate in several areas.

We conduct manual tests of newly created functionalities, in close cooperation with analysts and developers. Progression tests are documented in JIRA using xRay and are available to the client at any time, which ensures transparency. Errors found are documented in JIRA in the same way. According to the established quality gates, these errors must be fixed before the next version of the software is sent to the client.
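The quality-gate idea described above can be sketched as a simple check over the open issues exported from the tracker. This is only an illustration: the field names and the severity levels that block a delivery are assumptions, not the project's actual JIRA configuration.

```python
# Illustrative quality-gate check. The dictionary fields ("key", "severity",
# "status") and the blocking severity set are hypothetical.
BLOCKING_SEVERITIES = {"blocker", "critical", "major"}

def quality_gate_passes(open_issues):
    """Return True if no unresolved issue is severe enough to block delivery."""
    blocking = [
        issue for issue in open_issues
        if issue["status"] != "Done"
        and issue["severity"].lower() in BLOCKING_SEVERITIES
    ]
    return len(blocking) == 0

issues = [
    {"key": "PRJ-101", "severity": "Major", "status": "Done"},   # already fixed
    {"key": "PRJ-102", "severity": "Minor", "status": "Open"},   # not blocking
]
print(quality_gate_passes(issues))  # True: nothing blocks this delivery
```

In practice such a rule would be evaluated automatically in the delivery pipeline, with the severity thresholds agreed with the client.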
Automated tests are run daily against the application with its newly added functionalities. The test results are analysed by the team on an ongoing basis, and when an error is found, appropriate actions are taken to eliminate it quickly. The automated tests are constantly being extended to cover newly added functionalities as well as possible. We also work on improving test reliability and shortening the wait for results. The results are published each time as xRay reports in JIRA and are available to anyone interested.
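Before a nightly run is published as a report, its results have to be summarised so the team can spot failures at a glance. A minimal sketch of such a summary, assuming results arrive as simple (name, outcome) pairs rather than the project's real report format:

```python
# Hypothetical summary of a nightly automated test run.
# Outcomes are assumed to be the strings "PASS" or "FAIL".
def summarise_run(results):
    """results: list of (test_name, outcome) pairs; returns a summary dict."""
    failed = [name for name, outcome in results if outcome == "FAIL"]
    total = len(results)
    pass_rate = (total - len(failed)) / total if total else 1.0
    return {"total": total, "failed": failed, "pass_rate": pass_rate}

run = [
    ("login_test", "PASS"),
    ("export_test", "FAIL"),
    ("search_test", "PASS"),
]
summary = summarise_run(run)
print(summary["failed"])  # ['export_test']
```

A summary like this is what would feed the daily analysis; the actual pipeline publishes the full results to JIRA via xRay.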
The next step is manual regression testing. During these tests we verify the basic functionalities the client uses on a daily basis. The tests are performed on an OpenShift environment in a configuration similar to the client's. Thanks to this, we can check whether the interfaces work as expected with the applications used by the client (or their simulations). The tests run on anonymised data derived from production data, and the testing team constantly adapts this data set to current needs. Test results, together with any errors found, are published in JIRA each time, and the decision whether the newest version can be delivered to the client is made on this basis.
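The anonymisation of production-based test data mentioned above can be illustrated with a short sketch. The record layout and the choice of hashing sensitive fields are assumptions for illustration only, not the project's actual anonymisation procedure.

```python
# Minimal anonymisation sketch: sensitive fields are replaced with a stable,
# irreversible token derived from a hash. Field names are hypothetical.
import hashlib

def anonymise(record, fields=("name", "email")):
    """Return a copy of the record with the given fields pseudonymised."""
    out = dict(record)
    for field in fields:
        if field in out:
            digest = hashlib.sha256(out[field].encode("utf-8")).hexdigest()[:12]
            out[field] = f"anon_{digest}"
    return out

row = {"id": 7, "name": "Jan Kowalski", "email": "jan@example.com"}
safe = anonymise(row)
```

Because the token is derived deterministically, relationships between records (for example, the same customer appearing twice) survive anonymisation, which keeps the data realistic for testing.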
After the client accepts the version, one last verification takes place before the application is deployed and released to users: installation and a smoke test are carried out on the acceptance environment.
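A smoke test of this kind typically boils down to a few fast checks that the freshly installed application responds at all. As a hedged sketch, assuming the application exposes an HTTP health endpoint (the status code and response body shown here are hypothetical):

```python
# Illustrative smoke check: a deployment passes if the (assumed) health
# endpoint answers HTTP 200 and reports an "UP" status in its JSON body.
def smoke_ok(status_code, body):
    """Decide whether a health-endpoint response indicates a healthy install."""
    return status_code == 200 and body.get("status") == "UP"

print(smoke_ok(200, {"status": "UP"}))    # True: healthy deployment
print(smoke_ok(503, {"status": "DOWN"}))  # False: installation failed
```

The point of keeping such checks small is speed: they gate the release within minutes, while the deeper regression suites have already run earlier in the cycle.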
The product is currently being rolled out and, to ensure efficient handling of incidents reported by users, the client has also delegated maintenance of the production environment to Capgemini. As part of this, daily shifts are carried out, during which developers and testers supervise the application's operation.