Application performance testing
Jama Software runs daily performance testing of Jama Connect in a self-hosted setup and continuously monitors performance in our cloud environment.
Self-hosted performance test results
These performance test results reflect a large self-hosted environment with 1,250 concurrent users.

All requests were serviced during the 90-minute testing period with no unintentional errors.
82% of requests were serviced in under 1 second.
Longer-running operations (such as bulk updates and copies) were serviced within tolerances.
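As an illustration only (not Jama Software's actual test harness), a load test of this shape could be expressed as a k6 script in TypeScript. The base URL and endpoint path below are hypothetical placeholders; the thresholds encode the targets reported above.

```typescript
import http from 'k6/http';
import { check, sleep } from 'k6';

// Test shape mirroring the published run: 1,250 concurrent virtual users
// held for 90 minutes, with pass/fail thresholds matching the reported targets.
export const options = {
  vus: 1250,
  duration: '90m',
  thresholds: {
    http_req_failed: ['rate==0'],      // no unintentional errors
    http_req_duration: ['p(82)<1000'], // 82% of requests under 1 second
  },
};

export default function () {
  // BASE_URL and the items endpoint are hypothetical placeholders.
  const res = http.get(`${__ENV.BASE_URL}/rest/v1/items`);
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(Math.random() * 5 + 1); // think time between simulated user actions
}
```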

Performance tests model the different user personas commonly found in our customer base; the most common and most active users are Staff Engineers and Testers (Creators).
In total, 1,250 unique concurrent users were spread across these personas.

This graph shows the distribution of serviced requests during the test with response times expressed in milliseconds (ms).
79% of requests were serviced in under 305 ms, and 99.5% were serviced within the consumer web target of under 3 seconds.
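For reference, distribution figures like these can be derived from raw latency samples. The sketch below uses synthetic data, not the actual test output; fractionUnder and percentile are illustrative helpers.

```typescript
// Fraction of samples under a latency threshold, in milliseconds.
function fractionUnder(samples: number[], thresholdMs: number): number {
  return samples.filter((ms) => ms < thresholdMs).length / samples.length;
}

// Nearest-rank percentile: value at ceil(p/100 * N) in the sorted samples.
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const idx = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.min(sorted.length - 1, Math.max(0, idx))];
}

// Synthetic latencies in milliseconds; a real run would feed recorded values.
const latencies = [120, 250, 304, 480, 990, 1500, 2900, 3100];
console.log(`under 305 ms: ${(fractionUnder(latencies, 305) * 100).toFixed(1)}%`);
console.log(`under 3 s:    ${(fractionUnder(latencies, 3000) * 100).toFixed(1)}%`);
console.log(`p95: ${percentile(latencies, 95)} ms`);
```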
Cloud performance test results
Jama Software continuously monitors application performance in the cloud. We use Real User Monitoring (RUM) technology to understand performance at the end-user level.
This monitoring captures network latency and the transmission of the entire user request across our global customer base.

The Jama Connect cloud environment serves an average of over 6 million pages per month with hundreds of millions of total requests.
The P75 total page load time (from request to full browser paint) averages 3.41 seconds. The cumulative average of all page loads is under 3 seconds (2.66 seconds).
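As a generic sketch of how a browser-side RUM agent can capture total page load time (not Jama's actual instrumentation; the /rum collector endpoint is hypothetical, and the load event is used here as an approximation of a fully painted page):

```typescript
// Capture total page load time via the standard Navigation Timing API:
// from the start of the navigation request to the end of the load event.
window.addEventListener('load', () => {
  // Defer one tick so loadEventEnd is populated after load handlers finish.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType(
      'navigation',
    ) as PerformanceNavigationTiming[];
    if (!nav) return;
    const totalLoadMs = nav.loadEventEnd - nav.startTime; // startTime is 0 for navigations
    // Beacon the measurement to a collector; '/rum' is a hypothetical endpoint.
    navigator.sendBeacon('/rum', JSON.stringify({ totalLoadMs }));
  }, 0);
});
```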

We gauge cloud performance against consumer web standards, which are more demanding than the benchmarks typical of enterprise B2B applications.
First Input Delay (FID), a critical component of a responsive web experience, averages 16.5 ms across the cloud environment.
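FID can be observed in the browser with the standard Event Timing API. A minimal sketch follows; as above, the /rum collector endpoint is a hypothetical placeholder.

```typescript
// Observe the page's first input and report First Input Delay (FID):
// the gap between the user's first interaction and the moment the
// browser could begin running event handlers for it.
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const e = entry as PerformanceEventTiming;
    const fidMs = e.processingStart - e.startTime;
    navigator.sendBeacon('/rum', JSON.stringify({ fidMs })); // hypothetical collector
    observer.disconnect(); // FID is defined by the first input only
  }
});
observer.observe({ type: 'first-input', buffered: true });
```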