The Jenkins pipeline project for testing Folio performance is located in the folio-perf-test repository.
JMeter is used as the performance-testing tool. Each JMeter script corresponds to an individual API in a module. The basic scenario is to create new data with a POST request and DELETE it once the test completes.
Example workflow used to test the mod-inventory-storage item-storage API:
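In outline, the create/delete scenario each script follows can be sketched in Python. This is only an illustration of the request order: the field names and payload shape are assumptions rather than the full item-storage schema, and the `post`/`delete` transports are injected so the sketch stays self-contained (in the real pipeline this logic lives in a JMeter .jmx script).

```python
import json
import uuid


def build_item(barcode):
    """Build a minimal item record; field names are illustrative, not the full schema."""
    return {
        "id": str(uuid.uuid4()),
        "barcode": barcode,
        "status": {"name": "Available"},
    }


def run_scenario(post, delete):
    """POST a new item, then DELETE it once the test completes.

    `post` and `delete` stand in for an HTTP client pointed at Okapi.
    """
    item = build_item("perf-test-0001")
    post("/item-storage/items", json.dumps(item))
    delete("/item-storage/items/" + item["id"])
    return item["id"]
```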
Tests are scheduled in Jenkins: the job runs nightly and triggers about 16,000 HTTP requests. A new environment is created every time the tests run and is torn down once testing is complete.
The Jenkins pipeline runs the following stages:
- Check out.
- Create environment - use an AWS CloudFormation template to create three EC2 instances: one m5.large instance for Okapi and two m5.xlarge instances for the modules and the database.
- Check environment - wait for all instances to be fully ready.
- Bootstrap DB - pull the PostgreSQL Docker image and start it on the DB EC2 instance.
- Bootstrap Okapi - pull the Okapi Docker image used by the snapshot-stable site from Docker Hub and start it on the Okapi EC2 instance.
- Bootstrap Modules - pull and start the Docker images of all modules used by the snapshot-stable site.
- Populate data - pull EBSCO sample data from a public S3 bucket to populate the database. The sample data is a Harvard dataset that can be considered similar to production data.
- Run JMeter tests.
- Tear down environment - delete the AWS stack.
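The create/test/tear-down ordering of the stages above can be sketched as follows. The stage helpers are hypothetical stand-ins for the real Jenkins stages; the point is that the AWS stack is deleted even when a test stage fails.

```python
def run_pipeline(create_env, run_tests, teardown_env):
    """Run the pipeline stages, guaranteeing tear-down of the AWS stack.

    `create_env`, `run_tests`, and `teardown_env` stand in for the real
    Jenkins stages; try/finally mirrors the pipeline's clean-up guarantee.
    """
    env = create_env()
    try:
        return run_tests(env)
    finally:
        teardown_env(env)
```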
EC2 instance characteristics:
Reports are generated in Jenkins using the JMeter Performance plugin.
The following metrics are collected in the Performance Report:
- Average response times (ART) for each transaction
- Min and Max response times
- Failure rate and errors/warnings in the logs
Pass/fail criteria:
- The average response time (AVG RT) for each JMeter-captured transaction should not exceed 1000 milliseconds.
- CPU utilization on any module should not exceed 50%.
- The nightly Jenkins job fails if even a single test fails.
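A hedged sketch of how those criteria could be checked against collected results (the thresholds come from the list above; the dictionary shapes and function name are assumptions for illustration, not part of the pipeline):

```python
def evaluate(avg_rt_ms, cpu_by_module, failed_tests):
    """Apply the pass/fail criteria: AVG RT <= 1000 ms per transaction,
    CPU <= 50% per module, and zero failed tests."""
    problems = []
    for name, rt in avg_rt_ms.items():
        if rt > 1000:
            problems.append(f"{name}: avg response time {rt} ms exceeds 1000 ms")
    for module, cpu in cpu_by_module.items():
        if cpu > 50:
            problems.append(f"{module}: CPU {cpu}% exceeds 50%")
    if failed_tests:
        problems.append(f"{len(failed_tests)} test(s) failed")
    return problems
```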
Example performance report: