How to accurately record a specific metric before/after? #14187
jeremygottfried asked this question in Q&A · Unanswered
I know Lighthouse gives you millisecond values for metrics like Total Blocking Time and First Contentful Paint, but is there an easy way to run these measurements many times and compute an average? If not with Lighthouse, then manually with headless Chrome?

With other performance testing, averages are helpful because they control for run-to-run variability in the results.

I've been struggling with Lighthouse because many of the opportunities affect the score by only a fraction of a point, so it's hard to gauge the impact of any single opportunity. If not by averaging, how do you determine whether an opportunity actually improved the metrics?

Replies: 1 comment

- There isn't a way to aggregate results in this repo. You might find these helpful: