How to test Browser Performance with PWSLab? [Web and Mobile simulation is supported]
Determine Browser Performance impact of Code changes
If your project offers a web interface and you are using PWSLab CI/CD, you can quickly find performance regressions introduced by incremental code changes. PWSLab can monitor the web performance of your Review/UAT/Staging environments using tools including the Coach, Browsertime, and PageXray. This can be extended to measure the Production environment as well.
We think of complete browser performance testing as having three key capabilities:
- It should test websites using real browsers, simulating real users' connectivity, and collect important user-centric metrics such as Speed Index and First Visual Render.
- It should analyze how your page is built and give feedback on how you can make it faster for the end-user.
- It should collect and keep data on how your pages are built so you can easily track changes.
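As a rough illustration of these capabilities, a comparable test can be run manually with sitespeed.io's official Docker image. This is a hedged sketch, not PWSLab's exact command; the URL, iteration count, and connectivity profile below are placeholders:

```shell
# Run a sitespeed.io test from the official Docker image.
# -n 3            tests the page three times with a real browser
# -c 3g           simulates a real user's (slow) connectivity
# --speedIndex    computes Speed Index from the recorded browser video
# --outputFolder  is where the HTML report (with Coach advice) is written
# The URL and option values are illustrative placeholders.
docker run --rm --shm-size=1g \
  -v "$(pwd)":/sitespeed.io \
  sitespeedio/sitespeed.io:latest \
  -n 3 -c 3g --speedIndex \
  --outputFolder sitespeed-results \
  https://example.com
```

The run writes an HTML report to ./sitespeed-results containing Coach feedback, Browsertime metrics, and a breakdown of how each page is built.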
PWSLab Browser Performance Use-cases
It is usually used in two different areas:
- To find web performance regressions early: on commits or when you move code to your test/review/staging environment.
- Monitoring your performance in production, alerting on regressions.
For instance, consider the following workflow:
- A member of the marketing team is attempting to track engagement by adding a new feature to an application.
- With browser performance metrics, they see how their changes are impacting the usability of the page for end-users.
- They ask a frontend developer for help, who sets the library to load asynchronously.
- The frontend developer approves the merge request and authorizes its deployment to production.
How does automated Browser Performance Testing work in PWSLab?
First, a few key concepts:
- PWSLab Browser Performance Testing is carefully architected using some leading Open Source tools like the Sitespeed.io suite.
- Browsertime is the tool that drives the browser and collects metrics.
- The Coach knows how to build fast websites; it analyzes your page and gives you feedback on what you should change.
- Visual Metrics are metrics collected from a video recording of the browser screen.
- All these tools communicate by passing messages in a queue.
When you, as the user, choose to test your environment, this is what happens at a high level:
- PWSLab Docker Runner starts and initializes all configured packages including Sitespeed.
- The URL is passed to the plugins through the queue.
- Browsertime gets the Environment URL and opens a headless browser.
- It starts to record a video of the browser screen.
- The browser accesses the URL.
- When the page has finished loading, Browsertime takes a screenshot of the page.
- Browsertime then stops the video and closes the browser.
- The video is analyzed to extract Visual Metrics such as First Visual Change and Speed Index.
- Browsertime passes all metrics and data to the queue so other plugins can use them.
- The HTML/Graphite/InfluxDB plugins collect the metrics from the queue.
- When all URLs have been tested, Sitespeed.io sends a message telling PWSLab to summarize the metrics and render the report.
- PWSLab picks up the metrics, generates an HTML report, and saves it to the PWSLab Job Artifacts.
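The screen-recording and analysis steps above correspond to sitespeed.io's video options. As a hedged manual approximation (the URL and output folder are placeholders, and this is not PWSLab's exact invocation):

```shell
# --video          records the browser screen during the page load
# --visualMetrics  analyzes that recording to produce First Visual Change
#                  and Speed Index, as in the flow described above.
# URL and output folder are illustrative placeholders.
docker run --rm --shm-size=1g \
  -v "$(pwd)":/sitespeed.io \
  sitespeedio/sitespeed.io:latest \
  --video --visualMetrics \
  --outputFolder sitespeed-results \
  https://staging.example.com
```

In a CI job, the resulting sitespeed-results folder is what would be saved as a Job Artifact.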
After the Browser Performance test runs, PWSLab saves the reports as Job Artifacts for further investigation, allowing project developers to act quickly and fix any issues.
For Browser Performance configuration in your PWSLab project's pipeline, please raise a DevSecOps Support Request.
Support Ticket Guidelines and Information Required
PWSLab Job Environment Variables:
- Set "true" or "false" to receive Browser Performance reports as email notifications.
- Set the email recipient for report notifications if PWS_EMAIL_STATUS is set to "true".
- Set the PWSLab Docker Hub username.
- Set the PWSLab Docker Hub password.
- Set the target URL for Browser Performance Testing.
- Set this variable to "true" to disable the Browser Performance job.
Please raise a DevSecOps Support Request in the PDS Department, providing the information below:
- Which environment(s) or URLs do you want the Browser Performance Test configured for?
- Do you want this job to run every time a code-commit is pushed?
- Do you want this job to run for a specific branch such as UAT after merge requests are accepted by maintainers, so that changes are tested before anything is merged to MASTER?
Also, let us know if the article is helpful!