Code Naming Auditor Skill
This Skill guides an AI Agent to perform a comprehensive audit of a codebase to ensure it adheres to a clearly defined set of terminology and naming conventions. Its core goal is to enforce a "Ubiquitous Language" across the entire project, thereby improving code clarity, consistency, and maintainability.
When using this Skill, the Agent should not format its output as Markdown tables; use plain, unformatted text lists instead.
Workflow
The audit process follows a structured, step-by-step workflow.
Step 1: Establish the Glossary
A clear, unambiguous glossary is the foundation of all code audit work.
- Check for existing glossary: First, search the project for an existing terminology document, such as a glossary or terminology file in the project's documentation directory or at the repository root.
- Confirm with the user: If a document is found, present it to the user and confirm whether it is the "source of truth" for this audit work.
- Create a new glossary: If no glossary exists in the project, explain its importance to the user. Use the template at references/glossary_template.md to guide the user in creating one. The glossary should precisely define the semantics of the project's core nouns (the domain entities the code manipulates) and verbs (the operations performed on them). When defining terms, prefer precise, sufficiently specific names and avoid vague or ambiguous ones.
Do not proceed to the next step until a clear and agreed-upon glossary is in place.
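The agreed glossary can take whatever form the project prefers. As a minimal sketch, a Python mapping like the following could hold the terms for programmatic cross-referencing during the audit; every term shown here is a hypothetical example, not part of the Skill itself:

```python
# Hypothetical glossary loaded into memory for the audit.
# All entries below are illustrative placeholders.
GLOSSARY = {
    "nouns": {
        "document": "A complete file's content as a single string.",
        "line": "A single line of text within a document.",
    },
    "verbs": {
        "fetch": "Retrieve existing data from an external source.",
        "build": "Construct a new in-memory object from parts.",
    },
    # Names the glossary explicitly discourages as too vague.
    "banned": {"data", "info", "temp", "result"},
}
```

A structure like this makes the later analysis step mechanical: each identifier can be checked against the defined nouns, verbs, and banned terms.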
Step 2: Define the Audit Scope
- Ask the user: Prompt the user to specify the directories or files they wish to audit.
- Confirm the scope: Confirm the scope with the user before proceeding. For example: "I will audit all files in the directory you specified. Is this correct?"
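Once the scope is confirmed, enumerating the files in it can be sketched as follows. This is a minimal Python example; the root directory and the file extensions to include are assumptions supplied by the user:

```python
from pathlib import Path

def collect_audit_scope(root: str, extensions: tuple[str, ...] = (".py",)) -> list[Path]:
    """Enumerate the files to audit under a user-confirmed root directory,
    keeping only files whose extension is in scope."""
    return sorted(
        p for p in Path(root).rglob("*")
        if p.is_file() and p.suffix in extensions
    )
```

Presenting the resulting file list back to the user is a convenient way to confirm the scope before analysis begins.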
Step 3: Analyze and Report
This is the core execution phase. Systematically analyze each file within the defined scope.
- Read files: Read the content of one file at a time.
- Analyze naming: Carefully examine the identifiers in the file:
- Method / function names
- Parameter names
- Variable names
- Type names / aliases
- Cross-reference with the glossary: Compare each name against the rules and definitions in the glossary. Pay particular attention to:
- Incorrect verb usage: Does a function name use one verb where the glossary prescribes another (for example, a verb meaning "retrieve" on a function that actually creates something)?
- Inconsistent nouns: Is a value representing multiple lines of file content named with an ad-hoc term instead of the noun the glossary defines for it?
- Vagueness/ambiguity: Are there vague names such as data, info, temp, or result? In these cases, can more precise terms from the glossary be used?
- Summarize deviations: Create a checklist for the current file listing all identified naming deviations. For each deviation, record its location (e.g., method name, parameter name), a description of the issue, and a clear modification suggestion based on the glossary.
- Repeat: Repeat this process for all files within the scope.
- Submit the report: After analysis is complete, present the summarized list of deviations and suggestions to the user in a clear, structured format (e.g., a list grouped by file).
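For Python codebases, the per-file analysis described above could be sketched with the standard `ast` module. The vague-name set below is a hypothetical stand-in for the project's real glossary rules:

```python
import ast

# Hypothetical vague terms; in practice these come from the agreed glossary.
VAGUE_NAMES = {"data", "info", "temp", "result"}

def audit_source(filename: str, source: str) -> list[str]:
    """Return a list of naming deviations found in one Python file."""
    deviations = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # Check the function name and each parameter name.
            names = [(node.name, "function name")]
            names += [(a.arg, "parameter name") for a in node.args.args]
        elif isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
            # Check variable names at assignment sites only.
            names = [(node.id, "variable name")]
        else:
            continue
        for name, kind in names:
            if name in VAGUE_NAMES:
                deviations.append(
                    f"{filename}:{node.lineno} ({kind}) '{name}' is vague; "
                    "use a precise term from the glossary"
                )
    return deviations
```

Each deviation string already carries the location, the kind of identifier, and a suggestion, matching the report format the Skill asks for.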
Step 4: Implement Refactoring (Optional)
After submitting the report, the Agent can proactively offer to apply these modification suggestions.
- Propose refactoring: Ask the user: "Would you like me to apply these naming corrections to the codebase?"
- Apply modifications: If the user agrees, use the available editing tools to systematically apply each modification suggestion. To ensure accuracy, execute these changes one by one or in small logical batches.
- Verify: After refactoring is complete, run the project's test suite to ensure the changes did not introduce any regressions.
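The verification step can be sketched as a small wrapper that runs an arbitrary check command and reports success; the actual command depends on the project's tooling:

```python
import subprocess

def run_checks(command: list[str]) -> bool:
    """Run a verification command (e.g. the project's test suite)
    and report whether it exited successfully."""
    completed = subprocess.run(command)
    return completed.returncode == 0
```

For example, `run_checks([sys.executable, "-m", "pytest", "-q"])` would run the test suite, assuming the project uses pytest; a non-zero exit code signals that the renames introduced a regression.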