Web Codegen Scorer is a specialized tool for assessing the quality of web code generated by Large Language Models (LLMs), helping developers make informed decisions about AI-generated code. It lets you configure evaluations across different models and frameworks, ships with built-in checks for a range of code quality metrics, and includes a report viewer for analyzing results. The goal is to make measurements of code generation performance consistent and repeatable, in contrast to ad-hoc trial-and-error testing.
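To illustrate the idea of repeatable, check-based scoring, here is a minimal sketch in Python. It is not Web Codegen Scorer's actual API; the check names and the scoring rule (fraction of checks passed) are assumptions made purely for illustration.

```python
"""Illustrative sketch of check-based scoring for generated web code.

The same generated snippet always yields the same score, which is what
makes automated evaluation repeatable compared to manual review.
"""
from dataclasses import dataclass
from typing import Callable


@dataclass
class Check:
    name: str
    passes: Callable[[str], bool]


# Hypothetical built-in checks; a real harness would run builds,
# linters, accessibility audits, and similar tooling instead.
CHECKS = [
    Check("has-doctype",
          lambda code: code.lstrip().lower().startswith("<!doctype html>")),
    Check("no-inline-event-handlers",
          lambda code: "onclick=" not in code.lower()),
    Check("has-lang-attribute",
          lambda code: "<html lang=" in code.lower()),
]


def score(generated_code: str) -> float:
    """Return the fraction of checks passed, in [0, 1]."""
    passed = sum(1 for c in CHECKS if c.passes(generated_code))
    return passed / len(CHECKS)


sample = ('<!DOCTYPE html><html lang="en"><body>'
          '<button onclick="go()">Go</button></body></html>')
print(round(score(sample), 2))  # 2 of 3 checks pass -> 0.67
```

Because every check is a deterministic function of the generated code, two runs over the same output produce identical scores, so differences between models or prompts reflect the code itself rather than reviewer variance.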