Results API
The @cypress/extract-cloud-results module provides the getUICoverageResults utility to programmatically fetch UI Coverage results for a run in a CI environment. This allows you to determine if test coverage meets your requirements before merging code changes.
Supported CI Providers​
The utility supports the following CI providers. Refer to the linked guides for setup details:
- Azure (requires Cypress v13.13.1 or later)
- CircleCI
- GitHub Actions
- GitLab
- Jenkins
- AWS CodeBuild
- Drone
For other CI providers, contact Cypress Support to request that your provider be added.
Installation​
Install the @cypress/extract-cloud-results module in your install step in CI.
npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
Do not check this module in as a dependency. We recommend you install it separately outside of your normal module installation. Use --force to get the latest version.
If you check this in as a dependency, your installation will fail when we update the package.
Usage​
1. Get the UI Coverage Results​
Write a script to fetch UI Coverage results and assert test coverage criteria. This script will be executed in CI.
const { getUICoverageResults } = require('@cypress/extract-cloud-results')

getUICoverageResults({
  projectId: process.env.CYPRESS_PROJECT_ID, // Optional if set from env
  recordKey: process.env.CYPRESS_RECORD_KEY, // Optional if set from env
  runTags: [process.env.RUN_TAGS], // Required if recording multiple runs
}).then((results) => {
  const { runNumber, uiCoverageReportUrl, summary, views } = results

  console.log(
    `Received ${summary.isPartialReport ? 'partial ' : ''}results for run #${runNumber}.`
  )
  console.log(`See full report at ${uiCoverageReportUrl}.`)

  // Verify overall coverage
  if (summary.coverage < 80) {
    throw new Error(
      `Project coverage is ${summary.coverage}%, below the minimum threshold of 80%.`
    )
  }

  // Verify critical view coverage
  const criticalViews = [/login/, /checkout/]
  views.forEach((view) => {
    const { displayName, coverage, uiCoverageReportUrl } = view
    if (
      criticalViews.some((pattern) => pattern.test(displayName)) &&
      coverage < 95
    ) {
      throw new Error(
        `Critical view "${displayName}" coverage is ${coverage}%, below the required 95%. See: ${uiCoverageReportUrl}`
      )
    }
  })

  console.log('UI Coverage is above minimum thresholds.')
})
getUICoverageResults arguments​
getUICoverageResults accepts the following arguments:
getUICoverageResults({
  // The Cypress project ID.
  // Optional if the CYPRESS_PROJECT_ID env is set.
  projectId: string,
  // The project's record key.
  // Optional if the CYPRESS_RECORD_KEY env is set.
  recordKey: string,
  // The run tags associated with the run.
  // Required if you are recording multiple Cypress runs from a single CI build.
  // Pass the run tags you used when recording each run.
  runTags: string[],
})
Result Details​
The getUICoverageResults utility returns the following data:
{
  // The run number of the identified build.
  runNumber: number
  // The run URL for the identified build.
  runUrl: 'https://cloud.cypress.io/projects/:project_id/runs/:run_number'
  // The status of the identified build.
  runStatus: 'passed' | 'failed' | 'errored' | 'timedOut' | 'cancelled' | 'noTests'
  // The URL that links to the UI Coverage report for the identified build.
  uiCoverageReportUrl: 'https://cloud.cypress.io/[...]'
  summary: {
    // Indicates whether a complete UI Coverage report was generated.
    // For example, if a run was cancelled and the report expected to cover
    // 20 specs, but only 10 ran, this would result in a partial report.
    isPartialReport: boolean
    // The report coverage from 0-100 with 2 decimal precision (e.g. 92.45).
    coverage: number
    // The number of views tested and analyzed.
    viewCount: number
    // The number of interactive elements that were tested.
    testedElementsCount: number
    // The number of interactive elements that were not tested.
    untestedElementsCount: number
  }
  // The list of tested views and the coverage of each.
  views: [{
    // The sanitized URL pattern shown in the report.
    displayName: string
    // The view coverage from 0-100 with 2 decimal precision (e.g. 92.45).
    coverage: number
    // The number of interactive elements that were tested on this view.
    testedElementsCount: number
    // The number of interactive elements that were not tested on this view.
    untestedElementsCount: number
    // The URL that links to the report for this view.
    uiCoverageReportUrl: 'https://cloud.cypress.io/[...]'
  }]
}
2. Add to CI Workflow​
In your CI workflow that runs your Cypress tests:
- Update your install job to install the @cypress/extract-cloud-results module.
- Pass in the necessary arguments to getUICoverageResults.
- Add a new step to the job that runs your Cypress tests to verify the UI Coverage results.
If you record multiple runs in a single CI build, you must record these runs using the --tag parameter and then call getUICoverageResults with the runTags argument for each run.
This is necessary to uniquely identify each run and return its corresponding set of results.
Example
- Let's imagine that within a single CI build you call cypress run --record multiple times because you run one set of tests against a staging environment, followed by a production environment.
- In this scenario, you pass a different --tag to each cypress run:
  - cypress run --record --tag staging
  - cypress run --record --tag production
- When calling getUICoverageResults, you then pass these same tags to get the unique set of results for each run (see the sketch below):
  - getUICoverageResults({ runTags: ['staging'] })
  - getUICoverageResults({ runTags: ['production'] })
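In a GitHub Actions job, for instance, the step sequence might look like the following. This is a minimal sketch, assuming the verification script from step 1 is saved at ./scripts/verifyUICoverageResults.js and reads the RUN_TAGS environment variable (as in the example script above); adjust names and paths to your setup.

- name: Run staging tests
  run: npx cypress run --record --tag staging
- name: Run production tests
  run: npx cypress run --record --tag production
- name: Verify UI Coverage Results
  run: |
    npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
    # Each invocation reads RUN_TAGS to select the matching run's results
    RUN_TAGS=staging node ./scripts/verifyUICoverageResults.js
    RUN_TAGS=production node ./scripts/verifyUICoverageResults.js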
Example Job Workflow Update:
The examples below show the verification step added for each supported CI provider.
GitHub Actions:
name: My Workflow
on: push
env:
  CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
jobs:
  run-cypress:
    runs-on: ubuntu-24.04
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: install
        run: npm install
      - name: Run
        run: npx cypress run --record
+     - name: Verify UI Coverage Results
+       run: |
+         npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+         node ./scripts/verifyUICoverageResults.js
GitLab:
name: Run Cypress Tests
image: node:latest
stages:
  - test
run-cypress:
  stage: test
  secrets:
    CYPRESS_RECORD_KEY:
      vault: vault/cypressRecordKey
  script:
    - npm install
    - npx cypress run --record
+   - npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+   - node ./scripts/verifyUICoverageResults.js
Jenkins:
pipeline {
  agent {
    docker {
      image 'cypress/base:22.15.0'
    }
  }
  environment {
    CYPRESS_PROJECT_ID = 'xxxx'
    CYPRESS_RECORD_KEY = credentials('cypress-record-key')
  }
  stages {
    stage('build and test') {
      steps {
        sh 'npm ci'
        sh 'npx cypress run --record'
      }
    }
+   stage('Verify UI Coverage Results') {
+     steps {
+       sh 'npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz'
+       sh 'node ./scripts/verifyUICoverageResults.js'
+     }
+   }
  }
}
Azure:
jobs:
  - job: run_tests
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - task: NodeTool@0
        inputs:
          versionSpec: '20.x'
        displayName: 'Install Node.js'
      - script: npm i
        displayName: 'Install npm dependencies'
      - script: npx cypress run --record
        displayName: 'Run Cypress tests'
        env:
          # avoid warnings about terminal
          TERM: xterm
          CYPRESS_RECORD_KEY: $(CYPRESS_RECORD_KEY)
+     - script: |
+         npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+         node ./scripts/verifyUICoverageResults.js
+       displayName: 'Verify UI Coverage Results'
+       env:
+         CYPRESS_PROJECT_ID: $(CYPRESS_PROJECT_ID)
+         CYPRESS_RECORD_KEY: $(CYPRESS_RECORD_KEY)
CircleCI:
version: 2.1
jobs:
  linux-test:
    docker:
      - image: cypress/base:22.15.0
    working_directory: ~/repo
    steps:
      - checkout
      - run: npm install
      - run: npx cypress run --record
+     - run: npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+     - run: node ./scripts/verifyUICoverageResults.js
workflows:
  version: 2
  tests:
    jobs:
      - linux-test
AWS CodeBuild:
phases:
  install:
    runtime-versions:
      nodejs: latest
    commands:
      # Set COMMIT_INFO variables to send Git specifics to Cypress Cloud when recording
      # https://docs.cypress.io/app/continuous-integration/overview#Git-information
      - export COMMIT_INFO_BRANCH="$(git rev-parse HEAD | xargs git name-rev | cut -d' ' -f2 | sed 's/remotes\/origin\///g')"
      - export COMMIT_INFO_MESSAGE="$(git log -1 --pretty=%B)"
      - export COMMIT_INFO_EMAIL="$(git log -1 --pretty=%ae)"
      - export COMMIT_INFO_AUTHOR="$(git log -1 --pretty=%an)"
      - export COMMIT_INFO_SHA="$(git log -1 --pretty=%H)"
      - export COMMIT_INFO_REMOTE="$(git config --get remote.origin.url)"
      - npm ci
  pre_build:
    commands:
      - npm run cypress:verify
      - npm run cypress:info
  build:
    commands:
      - CYPRESS_INTERNAL_ENV=staging CYPRESS_PROJECT_ID=[slug] npx cypress run --record --key [KEY]
+ post_build:
+   commands:
+     - npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+     - CYPRESS_INTERNAL_ENV=staging CYPRESS_PROJECT_ID=[slug] CYPRESS_RECORD_KEY=[KEY] node ./scripts/verifyUICoverageResults.js
Drone:
kind: pipeline
name: default
environment:
  CYPRESS_PROJECT_ID: example_project_slug
  CYPRESS_RECORD_KEY:
    from_secret: example_record_key_secret
steps:
  - name: test
    image: node:latest
    commands:
      - npm install
      - npx cypress run --record
+ - name: validate
+   image: node:latest
+   commands:
+     - npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+     - node ./scripts/verifyUICoverageResults.js
Required CI environment variables​
The @cypress/extract-cloud-results helper cross-references some environment variables from where it is executed with ones that were present when a Cypress Cloud run was recorded. This allows for automatically detecting the correct Cloud run when the Results API is invoked from the same CI context as a given run (as is the case in the above examples).
For more complex setups, or for local iteration on your Results API handler code, it can be useful to know what variables Cypress is looking for so that you can make sure they are passed through where they are needed.
Likewise, if you want to use the Results API locally to pull the data for a specific run (within the last 7 days), you can set these variables locally to match what was present in CI.
Local development example​
If you executed a run in GitHub Actions and it was recorded to Cypress Cloud, you would set the following environment variables to replicate the context of that run locally and execute your local handler script. This is a great way to iterate on your script and verify everything is working as expected, without having to integrate anything in CI. It's also useful for debugging.
CYPRESS_PROJECT_ID=AAA \
CYPRESS_RECORD_KEY=BBB \
GITHUB_ACTIONS=true \
GITHUB_RUN_ID=111 \
GITHUB_RUN_ATTEMPT=1 \
node verifyUICoverageResults.js
The Results API will then look for the Cypress Cloud run that matches this run ID. If there is more than one Cypress Cloud run found for that GitHub Actions Run, you can pass run tags to narrow down to one run's report.
Supported CI Provider Overview​
Each CI provider has a unique combination of components, patterns, and environment variables that must be interpreted by this module.
GitHub Actions​
Reference: https://docs.github.com/en/actions/learn-github-actions/understanding-github-actions
Essential environment variables​
- GITHUB_ACTIONS - Presence identifies the environment as a GitHub Actions environment.
- GITHUB_RUN_ID - Value uniquely identifies a GitHub Actions workflow instance. Value does not change as jobs in the workflow are re-executed.
- GITHUB_RUN_ATTEMPT - Value identifies the workflow instance's attempt index. Value is incremented each time jobs are re-executed.
Full environment variable reference: https://docs.github.com/en/actions/learn-github-actions/variables#default-environment-variables
Prerequisites​
- The run to validate and this module's validation script are executed within the same workflow.
- The module script is always executed after the run to validate has been created. This can be achieved by either:
  - executing the module script in a separate job that depends on the job that records the run (using the needs: [job-name] option in the config; see the sketch below), or
  - executing the module script in serial with the Cypress recording in the same job.
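The dependent-job approach might look like the following. This is a minimal sketch, assuming the recording job is named run-cypress (as in the earlier example), the verification script lives at ./scripts/verifyUICoverageResults.js, and the project ID and record key are stored as repository secrets.

jobs:
  run-cypress:
    runs-on: ubuntu-24.04
    steps:
      - uses: actions/checkout@v4
      - run: npm install
      - run: npx cypress run --record
        env:
          CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
  verify-ui-coverage:
    # needs ensures this job runs after the recorded run exists;
    # GITHUB_RUN_ID and GITHUB_RUN_ATTEMPT are shared by all jobs in the workflow.
    needs: [run-cypress]
    runs-on: ubuntu-24.04
    steps:
      - uses: actions/checkout@v4
      - run: npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
      - run: node ./scripts/verifyUICoverageResults.js
        env:
          CYPRESS_PROJECT_ID: ${{ secrets.CYPRESS_PROJECT_ID }}
          CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}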
GitLab Pipelines​
Reference: https://docs.gitlab.com/ee/ci/pipelines/
Essential environment variables​
- GITLAB_CI - Presence identifies the environment as a GitLab CI environment.
- CI_PIPELINE_ID - Value uniquely identifies a GitLab pipeline workflow. This value does not change as jobs in the pipeline are retried.
- CI_JOB_NAME - Value uniquely identifies a single job name within a pipeline, e.g. run-e2e.
- CI_JOB_ID - Value uniquely identifies an execution instance of a job. This value changes each time a job is executed or re-executed.
Full environment variable reference: https://docs.gitlab.com/ee/ci/variables/predefined_variables.html
Prerequisites​
- The run to validate and this module's validation script are executed within the same pipeline.
- The module script is always executed after the run to validate has been created. This can be achieved by:
  - executing the module script in a separate job that depends on the job that records the run (using the needs: [job-name] option in the config; see the sketch below), or
  - executing the module script in a separate job in a later stage than the job that records the Cypress run, or
  - executing the module script in serial with the Cypress recording in the same job.
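The dependent-job approach might look like the following. This is a minimal sketch, assuming the recording job is named run-cypress (as in the earlier example) and the verification script lives at ./scripts/verifyUICoverageResults.js; provide CYPRESS_PROJECT_ID and CYPRESS_RECORD_KEY however you normally manage CI variables or secrets.

verify-ui-coverage:
  stage: test
  image: node:latest
  # needs makes this job wait for the job that records the run;
  # CI_PIPELINE_ID is shared by all jobs in the pipeline.
  needs: [run-cypress]
  script:
    - npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
    - node ./scripts/verifyUICoverageResults.js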
Jenkins​
Reference: https://www.jenkins.io/doc/
Jenkins is heavily customizable through plugins, which limits the assumptions we can make about available environment variables and overall behavior.
We have implemented Jenkins support within this module using the broadest set of available default values. For the purposes of this documentation, though, we will discuss terms related to Jenkins Pipeline support: https://www.jenkins.io/doc/book/pipeline/getting-started/
Essential environment variables​
- JENKINS_HOME - Presence identifies the environment as a Jenkins environment.
- BUILD_URL - Value uniquely identifies a Jenkins job execution, including name and id characteristics.
Full environment variable reference: https://www.jenkins.io/doc/book/pipeline/jenkinsfile/#using-environment-variables
Prerequisites​
- The run to validate and this module's validation script are being executed within the same job.
- The module script is always executed after the run to validate has been created. This can be achieved by executing the module script in serial with the Cypress recording in the same job.
Azure​
Note: Cypress v13.13.1 is the earliest Cypress release that records the environment variables necessary for this module to identify runs in an Azure environment. Previous Cypress versions are not supported in Azure pipelines.
Essential environment variables​
- TF_BUILD and AZURE_HTTP_USER_AGENT - Combined presence identifies the environment as an Azure pipeline environment.
- SYSTEM_PLANID - Value uniquely identifies a pipeline run. Value does not change as jobs within the pipeline are retried from failure.
- SYSTEM_JOBID - Value uniquely identifies a job execution. Value changes each time a job is retried from failure, in conjunction with SYSTEM_JOBATTEMPT being incremented.
- SYSTEM_JOBATTEMPT - Value identifies the pipeline's shared attempt index. Value is incremented when jobs are retried from failure.
Full environment variable reference: https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
Prerequisites​
- The run to validate and this module's validation script are executed within the same pipeline run (i.e. they share a SYSTEM_PLANID value).
- The module script is always executed after the run to validate has been created. This can be achieved by either:
  - executing the module script in a separate job that depends on the job that records the run (using the dependsOn: [job-name] option in the config; see the sketch below), or
  - executing the module script in serial with the Cypress recording in the same job.
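The dependent-job approach might look like the following. This is a minimal sketch that extends the jobs list from the earlier example, assuming the recording job is named run_tests and the verification script lives at ./scripts/verifyUICoverageResults.js.

- job: verify_ui_coverage
  # dependsOn ensures this job runs after the recorded run exists;
  # both jobs share the same SYSTEM_PLANID within the pipeline run.
  dependsOn: run_tests
  pool:
    vmImage: 'ubuntu-latest'
  steps:
    - script: |
        npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
        node ./scripts/verifyUICoverageResults.js
      displayName: 'Verify UI Coverage Results'
      env:
        CYPRESS_PROJECT_ID: $(CYPRESS_PROJECT_ID)
        CYPRESS_RECORD_KEY: $(CYPRESS_RECORD_KEY)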
CircleCI​
Reference: https://circleci.com/docs/about-circleci/
Note: Cypress v13.13.1 is the earliest Cypress release that records the environment variables necessary for this module to identify runs in a CircleCI environment. Previous Cypress versions are not supported in CircleCI pipelines.
Essential environment variables​
- CIRCLECI - Presence identifies the environment as a CircleCI environment.
- CIRCLE_PIPELINE_ID - Value uniquely identifies a CircleCI pipeline, created on push or manually triggered through the UI. This value does not change as workflows within the pipeline are re-executed.
- CIRCLE_WORKFLOW_ID - Value uniquely identifies an instance of a workflow's execution within a pipeline. This value is updated upon each workflow execution; in other words, retrying a workflow from failure in the CircleCI UI creates a new workflow with a new CIRCLE_WORKFLOW_ID value available to the jobs executed within it.
- CIRCLE_WORKFLOW_JOB_ID - Value uniquely identifies an execution instance of a named job within a workflow instance.
Full environment variable reference: https://circleci.com/docs/variables/
Prerequisites​
- The run to validate and this module's validation script are executed within the same pipeline and workflow.
- The module script is always executed after the run to validate has been created. This can be achieved by:
  - executing the module script in a separate job that depends on the job that records the run (using the requires: [job-name] option in the config; see the sketch below), or
  - executing the module script in serial with the Cypress recording in the same job.
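The dependent-job approach might look like the following. This is a minimal sketch that extends the config from the earlier example, assuming the recording job is named linux-test and the verification script lives at ./scripts/verifyUICoverageResults.js.

jobs:
  verify-ui-coverage:
    docker:
      - image: cypress/base:22.15.0
    steps:
      - checkout
      - run: npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
      - run: node ./scripts/verifyUICoverageResults.js
workflows:
  tests:
    jobs:
      - linux-test
      # requires ensures the verification job runs after the recorded run exists,
      # within the same pipeline and workflow.
      - verify-ui-coverage:
          requires:
            - linux-test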
AWS CodeBuild​
Reference: https://docs.aws.amazon.com/codebuild/
Essential environment variables​
- CODEBUILD_BUILD_ID - Presence identifies the environment as an AWS CodeBuild environment. Value uniquely identifies a build.
Full environment variable reference: https://docs.aws.amazon.com/codebuild/latest/userguide/build-env-ref-env-vars.html
Prerequisites
- The run to validate and this module's validation script are being executed within the same build.
- The module script is always executed after the run to validate has been created. This can be achieved by executing the module script in serial with the Cypress recording in the same build.
Drone​
Reference: https://docs.drone.io/pipeline/overview/
Essential environment variables​
- DRONE - Presence identifies the environment as a Drone environment.
- DRONE_BUILD_NUMBER - Value uniquely identifies a Drone build.
Full environment variable reference: https://docs.drone.io/pipeline/environment/reference/
Prerequisites​
- The run to validate and this module's validation script are being executed within the same build.
- The module script is always executed after the run to validate has been created. This can be achieved by executing the module script in serial with the Cypress recording in the same build.
In order to iterate on your verification script and see everything working without putting code into your CI environment, it can be useful to simulate the CI context for a specific Cypress run locally. This can save a lot of time when getting started.