What Does This GPU Benchmark Test?
This benchmark measures your graphics card's rendering performance using WebGL, the standard 3D graphics API available in all modern browsers. Three tests stress different GPU capabilities:
- Triangle throughput — How many triangles your GPU can render per frame while maintaining smooth performance
- Texture fill rate — How quickly your GPU can sample and render textured surfaces
- Shader complexity — How much mathematical computation your GPU's shader processors can handle (using Mandelbrot set calculations)
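The shader-complexity test's workload can be illustrated by the escape-time iteration a Mandelbrot fragment shader performs for every pixel, every frame. This JavaScript sketch shows the per-point math (the function name and iteration cap are illustrative, not the benchmark's actual code):

```javascript
// Escape-time iteration for one point c = (cx, cy) on the complex plane.
// A Mandelbrot fragment shader runs this loop once per pixel, so a higher
// iteration cap means more arithmetic per frame -- which is exactly what a
// shader-complexity test stresses. Names and the cap of 256 are illustrative.
function mandelbrotIterations(cx, cy, maxIter = 256) {
  let x = 0, y = 0;
  for (let i = 0; i < maxIter; i++) {
    // z = z^2 + c, with z = x + iy
    const nextX = x * x - y * y + cx;
    y = 2 * x * y + cy;
    x = nextX;
    if (x * x + y * y > 4) return i; // |z| > 2: point escaped the set
  }
  return maxIter; // never escaped: point is inside the set
}

console.log(mandelbrotIterations(0, 0)); // interior point, runs all 256 iterations
console.log(mandelbrotIterations(2, 2)); // escapes on the first iteration: 0
```

Interior points exhaust the full iteration budget while exterior points bail out early, so frame time depends on both the iteration cap and which region of the set is rendered.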
WebGL vs Native GPU Benchmarks
Browser-based GPU benchmarks test a subset of your GPU's full capabilities. WebGL adds overhead compared to native APIs like DirectX, Vulkan, or Metal. However, WebGL benchmarks are excellent for comparing relative GPU performance across devices and browsers without installing any software.
FAQ
Why does my dedicated GPU show low scores?
Many laptops default to integrated graphics for browser rendering to save power. Check your operating system's graphics settings (or the NVIDIA/AMD control panel) to force your browser onto the dedicated GPU. In Chrome, visit chrome://gpu to see which GPU is active.
What's a good GPU benchmark score?
Typical ranges:
- Integrated graphics (Intel UHD / Iris Xe, AMD integrated Radeon): 500-1500
- Dedicated laptop GPUs: 1500-4000
- Desktop GPUs (RTX 3060 class and above): 4000+
- Phones and tablets: 300-2000
Does browser choice affect GPU scores?
Yes, significantly. Chrome and Edge (both Chromium-based) generally deliver the best WebGL performance, with Firefox close behind. Safari's WebGL implementation uses a different backend and can produce noticeably different scores. Always compare scores from the same browser.
Why does the benchmark show "Unknown GPU"?
Some browsers hide GPU information for privacy. The WEBGL_debug_renderer_info extension may be disabled. This doesn't affect benchmark accuracy — only the displayed GPU name.
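A sketch of how a page can read the reported GPU name, using the real WEBGL_debug_renderer_info extension and its UNMASKED_RENDERER_WEBGL parameter (the helper name and the "Unknown GPU" fallback string are illustrative):

```javascript
// Try to read the unmasked GPU renderer string from a WebGL context.
// If the browser disables WEBGL_debug_renderer_info for privacy, the
// extension lookup returns null and we fall back to a placeholder --
// the benchmark itself still runs normally either way.
// Helper name and fallback string are illustrative.
function getGpuName(gl) {
  const ext = gl.getExtension("WEBGL_debug_renderer_info");
  if (ext) {
    return gl.getParameter(ext.UNMASKED_RENDERER_WEBGL);
  }
  return "Unknown GPU";
}

// In a browser:
// const gl = document.createElement("canvas").getContext("webgl");
// console.log(getGpuName(gl));
```

Note that recent Chromium versions may return a coarse, bucketed renderer string rather than the exact GPU model, which is another reason displayed names can look generic.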