feat: add reminders page, BMad skills upgrade, MCP server refactor
- Add reminders page with navigation support
- Upgrade BMad builder module to skills-based architecture
- Refactor MCP server: extract tools and auth into separate modules
- Add connections cache, custom AI provider support
- Update prisma schema and generated client
- Various UI/UX improvements and i18n updates
- Add service worker for PWA support

Made-with: Cursor
.claude/skills/bmad-builder-setup/SKILL.md (Normal file, 76 lines)
@@ -0,0 +1,76 @@
---
name: bmad-builder-setup
description: Sets up BMad Builder module in a project. Use when the user requests to 'install bmb module', 'configure bmad builder', or 'setup bmad builder'.
---

# Module Setup

## Overview

Installs and configures a BMad module into a project. Module identity (name, code, version) comes from `./assets/module.yaml`. Collects user preferences and writes them to three files:

- **`{project-root}/_bmad/config.yaml`** — shared project config: core settings at root (e.g. `output_folder`, `document_output_language`) plus a section per module with metadata and module-specific values. User-only keys (`user_name`, `communication_language`) are **never** written here.
- **`{project-root}/_bmad/config.user.yaml`** — personal settings intended to be gitignored: `user_name`, `communication_language`, and any module variable marked `user_setting: true` in `./assets/module.yaml`. These values live exclusively here.
- **`{project-root}/_bmad/module-help.csv`** — registers module capabilities for the help system.
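
For illustration, after installing the bmb module the two YAML files might look like this (hypothetical values; the exact key layout follows the merge logic in `scripts/merge-config.py`):

```yaml
# {project-root}/_bmad/config.yaml — shared, committed
output_folder: "{project-root}/_bmad-output"
document_output_language: English
bmb:
  name: "BMad Builder"
  version: 1.0.0
  bmad_builder_output_folder: "{project-root}/skills"

# {project-root}/_bmad/config.user.yaml — personal, gitignored
user_name: BMad
communication_language: English
```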

Both config scripts use an anti-zombie pattern — existing entries for this module are removed before writing fresh ones, so stale values never persist.
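
A minimal sketch of the anti-zombie idea (simplified; the real logic lives in `scripts/merge-config.py` and also handles metadata and user-only keys):

```python
def merge_module_section(config: dict, module_code: str, new_values: dict) -> dict:
    """Return a copy of config with the module section fully replaced."""
    updated = dict(config)
    # Anti-zombie: drop the old section entirely before writing fresh values,
    # so keys removed from the module schema never linger as stale entries.
    updated.pop(module_code, None)
    updated[module_code] = dict(new_values)
    return updated

old = {"output_folder": "out", "bmb": {"stale_key": 1, "kept": "a"}}
new = merge_module_section(old, "bmb", {"kept": "b"})
```

Because the whole section is dropped first, `stale_key` does not survive the update.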

`{project-root}` is a **literal token** in config values — never substitute it with an actual path. It signals to the consuming LLM that the value is relative to the project root, not the skill root.

## On Activation

1. Read `./assets/module.yaml` for module metadata and variable definitions (the `code` field is the module identifier)
2. Check if `{project-root}/_bmad/config.yaml` exists — if a section matching the module's code is already present, inform the user this is an update
3. Check for per-module configuration at `{project-root}/_bmad/{module-code}/config.yaml` and `{project-root}/_bmad/core/config.yaml`. If either file exists:
   - If `{project-root}/_bmad/config.yaml` does **not** yet have a section for this module: this is a **fresh install**. Inform the user that installer config was detected and values will be consolidated into the new format.
   - If `{project-root}/_bmad/config.yaml` **already** has a section for this module: this is a **legacy migration**. Inform the user that legacy per-module config was found alongside existing config, and legacy values will be used as fallback defaults.
   - In both cases, per-module config files and directories will be cleaned up after setup.

If the user provides arguments (e.g. `accept all defaults`, `--headless`, or inline values like `user name is BMad, I speak Swahili`), map any provided values to config keys, use defaults for the rest, and skip interactive prompting. Still display the full confirmation summary at the end.

## Collect Configuration

Ask the user for values. Show defaults in brackets. Present all values together so the user can respond once with only the values they want to change (e.g. "change language to Swahili, rest are fine"). Never tell the user to "press enter" or "leave blank" — in a chat interface they must type something to respond.

**Default priority** (highest wins): existing new config values > legacy config values > `./assets/module.yaml` defaults. When legacy configs exist, read them and use matching values as defaults instead of `module.yaml` defaults. Only keys that match the current schema are carried forward — changed or removed keys are ignored.
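
The priority chain above maps naturally onto `collections.ChainMap`, where the first mapping that contains a key wins (values here are hypothetical examples):

```python
from collections import ChainMap

# Hypothetical values for illustration; the real values come from the files
# described above (existing config.yaml, legacy configs, module.yaml defaults).
existing_config = {"output_folder": "{project-root}/_bmad-output"}
legacy_config = {"output_folder": "old-out", "communication_language": "Swahili"}
module_defaults = {"output_folder": "out", "communication_language": "English"}

# Earlier mappings win, matching the documented priority:
# existing new config > legacy config > module.yaml defaults.
defaults = ChainMap(existing_config, legacy_config, module_defaults)
```

Here `output_folder` comes from the existing config, while `communication_language` falls back to the legacy value.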

**Core config** (only if no core keys exist yet): `user_name` (default: BMad), `communication_language` and `document_output_language` (default: English — ask as a single language question, both keys get the same answer), `output_folder` (default: `{project-root}/_bmad-output`). Of these, `user_name` and `communication_language` are written exclusively to `config.user.yaml`. The rest go to `config.yaml` at root and are shared across all modules.

**Module config**: Read each variable in `./assets/module.yaml` that has a `prompt` field. Ask using that prompt with its default value (or legacy value if available).

## Write Files

Write a temp JSON file with the collected answers structured as `{"core": {...}, "module": {...}}` (omit `core` if it already exists). Then run both scripts — they can run in parallel since they write to different files:

```bash
python3 ./scripts/merge-config.py --config-path "{project-root}/_bmad/config.yaml" --user-config-path "{project-root}/_bmad/config.user.yaml" --module-yaml ./assets/module.yaml --answers {temp-file} --legacy-dir "{project-root}/_bmad"
python3 ./scripts/merge-help-csv.py --target "{project-root}/_bmad/module-help.csv" --source ./assets/module-help.csv --legacy-dir "{project-root}/_bmad" --module-code {module-code}
```
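
Writing the temp answers file might look like this in Python (hypothetical values; the shape matches the `{"core": {...}, "module": {...}}` structure described above):

```python
import json
import tempfile

# Hypothetical collected answers for illustration.
answers = {
    "core": {"user_name": "BMad", "communication_language": "English"},
    "module": {"bmad_builder_output_folder": "{project-root}/skills"},
}

# Write to a temp file; its path is what gets passed via --answers.
with tempfile.NamedTemporaryFile(
    mode="w", suffix=".json", delete=False, encoding="utf-8"
) as f:
    json.dump(answers, f)
    answers_path = f.name
```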

Both scripts output JSON to stdout with results. If either exits non-zero, surface the error and stop. The scripts automatically read legacy config values as fallback defaults, then delete the legacy files after a successful merge. Check `legacy_configs_deleted` and `legacy_csvs_deleted` in the output to confirm cleanup.
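
Consuming that output might look like the following sketch (the stdout string is simulated here; `legacy_configs_deleted` is documented above, and `status`/`error` follow the result shape used by the scripts):

```python
import json

# Simulated stdout from merge-config.py; illustrative values only.
stdout = '{"status": "success", "legacy_configs_deleted": ["_bmad/core/config.yaml"]}'

result = json.loads(stdout)
if result["status"] != "success":
    # Surface the error and stop, per the instructions above.
    raise RuntimeError(result.get("error", "merge failed"))
deleted = result.get("legacy_configs_deleted", [])
```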

Run `./scripts/merge-config.py --help` or `./scripts/merge-help-csv.py --help` for full usage.

## Create Output Directories

After writing config, create any output directories that were configured. For filesystem operations only (such as creating directories), resolve the `{project-root}` token to the actual project root, then create each path-type value from `config.yaml` that does not yet exist — this includes `output_folder` and any module variable whose value starts with `{project-root}/`. The paths stored in the config files must keep the literal `{project-root}` token; only the directories on disk use resolved paths. Use `mkdir -p` or equivalent to create the full path.
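
A minimal sketch of that resolve-then-create step (the project root path here is a hypothetical example):

```python
from pathlib import Path

def resolve_token(value: str, project_root: Path) -> Path:
    # Resolve the literal {project-root} token for filesystem use only;
    # the config files themselves keep the token unresolved.
    return Path(value.replace("{project-root}", str(project_root)))

project_root = Path("/tmp/demo-project")  # hypothetical project root
configured = "{project-root}/_bmad-output"  # value as stored in config.yaml

target = resolve_token(configured, project_root)
target.mkdir(parents=True, exist_ok=True)  # mkdir -p equivalent
```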

## Cleanup Legacy Directories

After both merge scripts complete successfully, remove the installer's package directories. Skills and agents in these directories are already installed at `.claude/skills/` — the `_bmad/` directory should only contain config files.

```bash
python3 ./scripts/cleanup-legacy.py --bmad-dir "{project-root}/_bmad" --module-code {module-code} --also-remove _config --skills-dir "{project-root}/.claude/skills"
```

The script verifies that every skill in the legacy directories exists at `.claude/skills/` before removing anything. Directories without skills (like `_config/`) are removed directly. If the script exits non-zero, surface the error and stop. Missing directories (already cleaned by a prior run) are not errors — the script is idempotent.

Check `directories_removed` and `files_removed_count` in the JSON output for the confirmation step. Run `./scripts/cleanup-legacy.py --help` for full usage.

## Confirm

Use the script JSON output to display what was written: config values set (core values at the root of `config.yaml`, module values in the module section), user settings written to `config.user.yaml` (`user_keys` in the result), help entries added, and whether this was a fresh install or an update. If legacy files were deleted, mention the migration. If legacy directories were removed, report the count and list (e.g. "Cleaned up 106 installer package files from bmb/, core/, _config/ — skills are installed at .claude/skills/"). Then display the `module_greeting` from `./assets/module.yaml` to the user.

## Outcome

Once the user's `user_name` and `communication_language` are known (from collected input, arguments, or existing config), use them consistently for the remainder of the session: address the user by their configured name and communicate in their configured `communication_language`.

.claude/skills/bmad-builder-setup/assets/module-help.csv (Normal file, 6 lines)
@@ -0,0 +1,6 @@
module,skill,display-name,menu-code,description,action,args,phase,after,before,required,output-location,outputs
BMad Builder,bmad-builder-setup,Setup Builder Module,SB,"Install or update BMad Builder module config and help entries. Collects user preferences, writes config.yaml, and migrates legacy configs.",configure,,anytime,,,false,{project-root}/_bmad,config.yaml and config.user.yaml
BMad Builder,bmad-agent-builder,Build an Agent,BA,"Create, edit, convert, or fix an agent skill.",build-process,"[-H] [description | path]",anytime,,bmad-agent-builder:quality-optimizer,false,output_folder,agent skill
BMad Builder,bmad-agent-builder,Optimize an Agent,OA,Validate and optimize an existing agent skill. Produces a quality report.,quality-optimizer,[-H] [path],anytime,bmad-agent-builder:build-process,,false,bmad_builder_reports,quality report
BMad Builder,bmad-workflow-builder,Build a Workflow,BW,"Create, edit, convert, or fix a workflow or utility skill.",build-process,"[-H] [description | path]",anytime,,bmad-workflow-builder:quality-optimizer,false,output_folder,workflow skill
BMad Builder,bmad-workflow-builder,Optimize a Workflow,OW,Validate and optimize an existing workflow or utility skill. Produces a quality report.,quality-optimizer,[-H] [path],anytime,bmad-workflow-builder:build-process,,false,bmad_builder_reports,quality report
.claude/skills/bmad-builder-setup/assets/module.yaml (Normal file, 20 lines)
@@ -0,0 +1,20 @@
code: bmb
name: "BMad Builder"
description: "Standard Skill Compliant Factory for BMad Agents, Workflows and Modules"
module_version: 1.0.0
default_selected: false
module_greeting: >
  Enjoy making your dream creations with the BMad Builder Module!
  Run this again at any time if you want to reconfigure a setting or have updated the module (or optionally just update _bmad/config.yaml and config.user.yaml to change existing values).

  For questions, suggestions and support - check us on Discord at https://discord.gg/gk8jAdXWmj

bmad_builder_output_folder:
  prompt: "Where should your custom output (agent, workflow, module config) be saved?"
  default: "{project-root}/skills"
  result: "{project-root}/{value}"

bmad_builder_reports:
  prompt: "Output for Evals, Test, Quality and Planning Reports?"
  default: "{project-root}/skills/reports"
  result: "{project-root}/{value}"
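
The `result` templates are applied by `merge-config.py`'s `apply_result_templates`, which substitutes `{value}` but skips values that already contain `{project-root}` (as both defaults above do), preventing a double prefix. A minimal sketch of that rule:

```python
def apply_result(template: str, value: str) -> str:
    # Mirrors the skip rule in apply_result_templates: a value that already
    # carries the {project-root} token is left untouched.
    if "{project-root}" in value:
        return value
    return template.replace("{value}", value)

# A bare relative answer gets the template applied.
applied = apply_result("{project-root}/{value}", "custom/skills")
# A default that already carries the token is passed through unchanged.
kept = apply_result("{project-root}/{value}", "{project-root}/skills")
```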

.claude/skills/bmad-builder-setup/scripts/cleanup-legacy.py (Executable file, 259 lines)
@@ -0,0 +1,259 @@
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
"""Remove legacy module directories from _bmad/ after config migration.

After merge-config.py and merge-help-csv.py have migrated config data and
deleted individual legacy files, this script removes the now-redundant
directory trees. These directories contain skill files that are already
installed at .claude/skills/ (or equivalent) — only the config files at
_bmad/ root need to persist.

When --skills-dir is provided, the script verifies that every skill found
in the legacy directories exists at the installed location before removing
anything. Directories without skills (like _config/) are removed directly.

Exit codes: 0=success (including nothing to remove), 1=validation error, 2=runtime error
"""

import argparse
import json
import shutil
import sys
from pathlib import Path


def parse_args():
    parser = argparse.ArgumentParser(
        description="Remove legacy module directories from _bmad/ after config migration."
    )
    parser.add_argument(
        "--bmad-dir",
        required=True,
        help="Path to the _bmad/ directory",
    )
    parser.add_argument(
        "--module-code",
        required=True,
        help="Module code being cleaned up (e.g. 'bmb')",
    )
    parser.add_argument(
        "--also-remove",
        action="append",
        default=[],
        help="Additional directory names under _bmad/ to remove (repeatable)",
    )
    parser.add_argument(
        "--skills-dir",
        help="Path to .claude/skills/ — enables safety verification that skills "
        "are installed before removing legacy copies",
    )
    parser.add_argument(
        "--verbose",
        action="store_true",
        help="Print detailed progress to stderr",
    )
    return parser.parse_args()


def find_skill_dirs(base_path: str) -> list:
    """Find directories that contain a SKILL.md file.

    Walks the directory tree and returns the leaf directory name for each
    directory containing a SKILL.md. These are considered skill directories.

    Returns:
        List of skill directory names (e.g. ['bmad-agent-builder', 'bmad-builder-setup'])
    """
    skills = []
    root = Path(base_path)
    if not root.exists():
        return skills
    for skill_md in root.rglob("SKILL.md"):
        skills.append(skill_md.parent.name)
    return sorted(set(skills))


def verify_skills_installed(
    bmad_dir: str, dirs_to_check: list, skills_dir: str, verbose: bool = False
) -> list:
    """Verify that skills in legacy directories exist at the installed location.

    Scans each directory in dirs_to_check for skill folders (containing SKILL.md),
    then checks that a matching directory exists under skills_dir. Directories
    that contain no skills (like _config/) are silently skipped.

    Returns:
        List of verified skill names.

    Raises SystemExit(1) if any skills are missing from skills_dir.
    """
    all_verified = []
    missing = []

    for dirname in dirs_to_check:
        legacy_path = Path(bmad_dir) / dirname
        if not legacy_path.exists():
            continue

        skill_names = find_skill_dirs(str(legacy_path))
        if not skill_names:
            if verbose:
                print(
                    f"No skills found in {dirname}/ — skipping verification",
                    file=sys.stderr,
                )
            continue

        for skill_name in skill_names:
            installed_path = Path(skills_dir) / skill_name
            if installed_path.is_dir():
                all_verified.append(skill_name)
                if verbose:
                    print(
                        f"Verified: {skill_name} exists at {installed_path}",
                        file=sys.stderr,
                    )
            else:
                missing.append(skill_name)
                if verbose:
                    print(
                        f"MISSING: {skill_name} not found at {installed_path}",
                        file=sys.stderr,
                    )

    if missing:
        error_result = {
            "status": "error",
            "error": "Skills not found at installed location",
            "missing_skills": missing,
            "skills_dir": str(Path(skills_dir).resolve()),
        }
        print(json.dumps(error_result, indent=2))
        sys.exit(1)

    return sorted(set(all_verified))


def count_files(path: Path) -> int:
    """Count all files recursively in a directory."""
    count = 0
    for item in path.rglob("*"):
        if item.is_file():
            count += 1
    return count


def cleanup_directories(
    bmad_dir: str, dirs_to_remove: list, verbose: bool = False
) -> tuple:
    """Remove specified directories under bmad_dir.

    Returns:
        (removed, not_found, total_files_removed) tuple
    """
    removed = []
    not_found = []
    total_files = 0

    for dirname in dirs_to_remove:
        target = Path(bmad_dir) / dirname
        if not target.exists():
            not_found.append(dirname)
            if verbose:
                print(f"Not found (skipping): {target}", file=sys.stderr)
            continue

        if not target.is_dir():
            if verbose:
                print(f"Not a directory (skipping): {target}", file=sys.stderr)
            not_found.append(dirname)
            continue

        file_count = count_files(target)
        if verbose:
            print(
                f"Removing {target} ({file_count} files)",
                file=sys.stderr,
            )

        try:
            shutil.rmtree(target)
        except OSError as e:
            error_result = {
                "status": "error",
                "error": f"Failed to remove {target}: {e}",
                "directories_removed": removed,
                "directories_failed": dirname,
            }
            print(json.dumps(error_result, indent=2))
            sys.exit(2)

        removed.append(dirname)
        total_files += file_count

    return removed, not_found, total_files


def main():
    args = parse_args()

    bmad_dir = args.bmad_dir
    module_code = args.module_code

    # Build the list of directories to remove
    dirs_to_remove = [module_code, "core"] + args.also_remove
    # Deduplicate while preserving order
    seen = set()
    unique_dirs = []
    for d in dirs_to_remove:
        if d not in seen:
            seen.add(d)
            unique_dirs.append(d)
    dirs_to_remove = unique_dirs

    if args.verbose:
        print(f"Directories to remove: {dirs_to_remove}", file=sys.stderr)

    # Safety check: verify skills are installed before removing
    verified_skills = None
    if args.skills_dir:
        if args.verbose:
            print(
                f"Verifying skills installed at {args.skills_dir}",
                file=sys.stderr,
            )
        verified_skills = verify_skills_installed(
            bmad_dir, dirs_to_remove, args.skills_dir, args.verbose
        )

    # Remove directories
    removed, not_found, total_files = cleanup_directories(
        bmad_dir, dirs_to_remove, args.verbose
    )

    # Build result
    result = {
        "status": "success",
        "bmad_dir": str(Path(bmad_dir).resolve()),
        "directories_removed": removed,
        "directories_not_found": not_found,
        "files_removed_count": total_files,
    }

    if args.skills_dir:
        result["safety_checks"] = {
            "skills_verified": True,
            "skills_dir": str(Path(args.skills_dir).resolve()),
            "verified_skills": verified_skills,
        }
    else:
        result["safety_checks"] = None

    print(json.dumps(result, indent=2))


if __name__ == "__main__":
    main()
.claude/skills/bmad-builder-setup/scripts/merge-config.py (Executable file, 408 lines)
@@ -0,0 +1,408 @@
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.9"
# dependencies = ["pyyaml"]
# ///
"""Merge module configuration into shared _bmad/config.yaml and config.user.yaml.

Reads a module.yaml definition and a JSON answers file, then writes or updates
the shared config.yaml (core values at root + module section) and config.user.yaml
(user_name, communication_language, plus any module variable with user_setting: true).
Uses an anti-zombie pattern for the module section in config.yaml.

Legacy migration: when --legacy-dir is provided, reads old per-module config files
from {legacy-dir}/{module-code}/config.yaml and {legacy-dir}/core/config.yaml.
Matching values serve as fallback defaults (answers override them). After a
successful merge, the legacy config.yaml files are deleted. Only the current
module and core directories are touched — other module directories are left alone.

Exit codes: 0=success, 1=validation error, 2=runtime error
"""

import argparse
import json
import sys
from pathlib import Path

try:
    import yaml
except ImportError:
    print("Error: pyyaml is required (PEP 723 dependency)", file=sys.stderr)
    sys.exit(2)


def parse_args():
    parser = argparse.ArgumentParser(
        description="Merge module config into shared _bmad/config.yaml with anti-zombie pattern."
    )
    parser.add_argument(
        "--config-path",
        required=True,
        help="Path to the target _bmad/config.yaml file",
    )
    parser.add_argument(
        "--module-yaml",
        required=True,
        help="Path to the module.yaml definition file",
    )
    parser.add_argument(
        "--answers",
        required=True,
        help="Path to JSON file with collected answers",
    )
    parser.add_argument(
        "--user-config-path",
        required=True,
        help="Path to the target _bmad/config.user.yaml file",
    )
    parser.add_argument(
        "--legacy-dir",
        help="Path to _bmad/ directory to check for legacy per-module config files. "
        "Matching values are used as fallback defaults, then legacy files are deleted.",
    )
    parser.add_argument(
        "--verbose",
        action="store_true",
        help="Print detailed progress to stderr",
    )
    return parser.parse_args()


def load_yaml_file(path: str) -> dict:
    """Load a YAML file, returning empty dict if file doesn't exist."""
    file_path = Path(path)
    if not file_path.exists():
        return {}
    with open(file_path, "r", encoding="utf-8") as f:
        content = yaml.safe_load(f)
    return content if content else {}


def load_json_file(path: str) -> dict:
    """Load a JSON file."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)


# Keys that live at config root (shared across all modules)
_CORE_KEYS = frozenset(
    {"user_name", "communication_language", "document_output_language", "output_folder"}
)


def load_legacy_values(
    legacy_dir: str, module_code: str, module_yaml: dict, verbose: bool = False
) -> tuple[dict, dict, list]:
    """Read legacy per-module config files and return core/module value dicts.

    Reads {legacy_dir}/core/config.yaml and {legacy_dir}/{module_code}/config.yaml.
    Only returns values whose keys match the current schema (core keys or module.yaml
    variable definitions). Other modules' directories are not touched.

    Returns:
        (legacy_core, legacy_module, files_found) where files_found lists paths read.
    """
    legacy_core: dict = {}
    legacy_module: dict = {}
    files_found: list = []

    # Read core legacy config
    core_path = Path(legacy_dir) / "core" / "config.yaml"
    if core_path.exists():
        core_data = load_yaml_file(str(core_path))
        files_found.append(str(core_path))
        for k, v in core_data.items():
            if k in _CORE_KEYS:
                legacy_core[k] = v
        if verbose:
            print(f"Legacy core config: {list(legacy_core.keys())}", file=sys.stderr)

    # Read module legacy config
    mod_path = Path(legacy_dir) / module_code / "config.yaml"
    if mod_path.exists():
        mod_data = load_yaml_file(str(mod_path))
        files_found.append(str(mod_path))
        for k, v in mod_data.items():
            if k in _CORE_KEYS:
                # Core keys duplicated in module config — only use if not already set
                if k not in legacy_core:
                    legacy_core[k] = v
            elif k in module_yaml and isinstance(module_yaml[k], dict):
                # Module-specific key that matches a current variable definition
                legacy_module[k] = v
        if verbose:
            print(
                f"Legacy module config: {list(legacy_module.keys())}", file=sys.stderr
            )

    return legacy_core, legacy_module, files_found


def apply_legacy_defaults(answers: dict, legacy_core: dict, legacy_module: dict) -> dict:
    """Apply legacy values as fallback defaults under the answers.

    Legacy values fill in any key not already present in answers.
    Explicit answers always win.
    """
    merged = dict(answers)

    if legacy_core:
        core = merged.get("core", {})
        filled_core = dict(legacy_core)  # legacy as base
        filled_core.update(core)  # answers override
        merged["core"] = filled_core

    if legacy_module:
        mod = merged.get("module", {})
        filled_mod = dict(legacy_module)  # legacy as base
        filled_mod.update(mod)  # answers override
        merged["module"] = filled_mod

    return merged


def cleanup_legacy_configs(
    legacy_dir: str, module_code: str, verbose: bool = False
) -> list:
    """Delete legacy config.yaml files for this module and core only.

    Returns list of deleted file paths.
    """
    deleted = []
    for subdir in (module_code, "core"):
        legacy_path = Path(legacy_dir) / subdir / "config.yaml"
        if legacy_path.exists():
            if verbose:
                print(f"Deleting legacy config: {legacy_path}", file=sys.stderr)
            legacy_path.unlink()
            deleted.append(str(legacy_path))
    return deleted


def extract_module_metadata(module_yaml: dict) -> dict:
    """Extract non-variable metadata fields from module.yaml."""
    meta = {}
    for k in ("name", "description"):
        if k in module_yaml:
            meta[k] = module_yaml[k]
    meta["version"] = module_yaml.get("module_version")  # null if absent
    if "default_selected" in module_yaml:
        meta["default_selected"] = module_yaml["default_selected"]
    return meta


def apply_result_templates(
    module_yaml: dict, module_answers: dict, verbose: bool = False
) -> dict:
    """Apply result templates from module.yaml to transform raw answer values.

    For each answer, if the corresponding variable definition in module.yaml has
    a 'result' field, replaces {value} in that template with the answer. Skips
    the template if the answer already contains '{project-root}' to prevent
    double-prefixing.
    """
    transformed = {}
    for key, value in module_answers.items():
        var_def = module_yaml.get(key)
        if (
            isinstance(var_def, dict)
            and "result" in var_def
            and "{project-root}" not in str(value)
        ):
            template = var_def["result"]
            transformed[key] = template.replace("{value}", str(value))
            if verbose:
                print(
                    f"Applied result template for '{key}': {value} → {transformed[key]}",
                    file=sys.stderr,
                )
        else:
            transformed[key] = value
    return transformed


def merge_config(
    existing_config: dict,
    module_yaml: dict,
    answers: dict,
    verbose: bool = False,
) -> dict:
    """Merge answers into config, applying anti-zombie pattern.

    Args:
        existing_config: Current config.yaml contents (may be empty)
        module_yaml: The module definition
        answers: JSON with 'core' and/or 'module' keys
        verbose: Print progress to stderr

    Returns:
        Updated config dict ready to write
    """
    config = dict(existing_config)
    module_code = module_yaml.get("code")

    if not module_code:
        print("Error: module.yaml must have a 'code' field", file=sys.stderr)
        sys.exit(1)

    # Migrate legacy core: section to root
    if "core" in config and isinstance(config["core"], dict):
        if verbose:
            print("Migrating legacy 'core' section to root", file=sys.stderr)
        config.update(config.pop("core"))

    # Strip user-only keys from config — they belong exclusively in config.user.yaml
    for key in _CORE_USER_KEYS:
        if key in config:
            if verbose:
                print(
                    f"Removing user-only key '{key}' from config (belongs in config.user.yaml)",
                    file=sys.stderr,
                )
            del config[key]

    # Write core values at root (global properties, not nested under "core")
    # Exclude user-only keys — those belong exclusively in config.user.yaml
    core_answers = answers.get("core")
    if core_answers:
        shared_core = {k: v for k, v in core_answers.items() if k not in _CORE_USER_KEYS}
        if shared_core:
            if verbose:
                print(
                    f"Writing core config at root: {list(shared_core.keys())}",
                    file=sys.stderr,
                )
            config.update(shared_core)

    # Anti-zombie: remove existing module section
    if module_code in config:
        if verbose:
            print(
                f"Removing existing '{module_code}' section (anti-zombie)",
                file=sys.stderr,
            )
        del config[module_code]

    # Build module section: metadata + variable values
    module_section = extract_module_metadata(module_yaml)
    module_answers = apply_result_templates(
        module_yaml, answers.get("module", {}), verbose
    )
    module_section.update(module_answers)

    if verbose:
        print(
            f"Writing '{module_code}' section with keys: {list(module_section.keys())}",
            file=sys.stderr,
        )

    config[module_code] = module_section

    return config


# Core keys that are always written to config.user.yaml
_CORE_USER_KEYS = ("user_name", "communication_language")


def extract_user_settings(module_yaml: dict, answers: dict) -> dict:
    """Collect settings that belong in config.user.yaml.

    Includes user_name and communication_language from core answers, plus any
    module variable whose definition contains user_setting: true.
    """
    user_settings = {}

    core_answers = answers.get("core", {})
    for key in _CORE_USER_KEYS:
        if key in core_answers:
            user_settings[key] = core_answers[key]

    module_answers = answers.get("module", {})
    for var_name, var_def in module_yaml.items():
        if isinstance(var_def, dict) and var_def.get("user_setting") is True:
            if var_name in module_answers:
                user_settings[var_name] = module_answers[var_name]

    return user_settings


def write_config(config: dict, config_path: str, verbose: bool = False) -> None:
    """Write config dict to YAML file, creating parent dirs as needed."""
    path = Path(config_path)
    path.parent.mkdir(parents=True, exist_ok=True)

    if verbose:
        print(f"Writing config to {path}", file=sys.stderr)

    with open(path, "w", encoding="utf-8") as f:
        yaml.dump(
            config,
            f,
            default_flow_style=False,
            allow_unicode=True,
            sort_keys=False,
        )


def main():
    args = parse_args()

    # Load inputs
    module_yaml = load_yaml_file(args.module_yaml)
    if not module_yaml:
        print(
            f"Error: Could not load module.yaml from {args.module_yaml}",
            file=sys.stderr,
        )
        sys.exit(1)

    answers = load_json_file(args.answers)
    existing_config = load_yaml_file(args.config_path)

    if args.verbose:
        exists = Path(args.config_path).exists()
        print(f"Config file exists: {exists}", file=sys.stderr)
        if exists:
            print(f"Existing sections: {list(existing_config.keys())}", file=sys.stderr)

    # Legacy migration: read old per-module configs as fallback defaults
    legacy_files_found = []
    if args.legacy_dir:
        module_code = module_yaml.get("code", "")
        legacy_core, legacy_module, legacy_files_found = load_legacy_values(
            args.legacy_dir, module_code, module_yaml, args.verbose
        )
        if legacy_core or legacy_module:
            answers = apply_legacy_defaults(answers, legacy_core, legacy_module)
            if args.verbose:
                print("Applied legacy values as fallback defaults", file=sys.stderr)

    # Merge and write config.yaml
    updated_config = merge_config(existing_config, module_yaml, answers, args.verbose)
    write_config(updated_config, args.config_path, args.verbose)

    # Merge and write config.user.yaml
    user_settings = extract_user_settings(module_yaml, answers)
    existing_user_config = load_yaml_file(args.user_config_path)
    updated_user_config = dict(existing_user_config)
    updated_user_config.update(user_settings)
    if user_settings:
        write_config(updated_user_config, args.user_config_path, args.verbose)

    # Legacy cleanup: delete old per-module config files
    legacy_deleted = []
    if args.legacy_dir:
        legacy_deleted = cleanup_legacy_configs(
            args.legacy_dir, module_yaml["code"], args.verbose
        )

    # Output result summary as JSON
    module_code = module_yaml["code"]
    result = {
        "status": "success",
        "config_path": str(Path(args.config_path).resolve()),
        "user_config_path": str(Path(args.user_config_path).resolve()),
|
||||
"module_code": module_code,
|
||||
"core_updated": bool(answers.get("core")),
|
||||
"module_keys": list(updated_config.get(module_code, {}).keys()),
|
||||
"user_keys": list(user_settings.keys()),
|
||||
"legacy_configs_found": legacy_files_found,
|
||||
"legacy_configs_deleted": legacy_deleted,
|
||||
}
|
||||
print(json.dumps(result, indent=2))
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
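The merge step above gets its anti-zombie behavior from plain dict replacement: assigning `config[module_code] = module_section` rebuilds the module's section from scratch, so keys removed from the new answers never linger in `config.yaml`. A minimal self-contained sketch of that idea, using hypothetical data rather than a real `module.yaml`:

```python
# Sketch of the anti-zombie section replacement used by the merge step.
# The module section is assigned wholesale (not update()-ed), so stale
# keys from a previous install cannot survive. Data is hypothetical.

def replace_module_section(config: dict, module_code: str, fresh_section: dict) -> dict:
    """Return a copy of config with the module's section fully replaced."""
    updated = dict(config)
    updated[module_code] = dict(fresh_section)  # full replacement, not merge
    return updated

existing = {
    "output_folder": "_bmad-output",
    "bmb": {"name": "BMad Builder", "stale_key": "old value"},
}
fresh = {"name": "BMad Builder", "version": "1.0.0"}

result = replace_module_section(existing, "bmb", fresh)
print("stale_key" in result["bmb"])  # False — the stale entry is gone
print(result["output_folder"])       # core settings are untouched
```

Using `update()` instead of assignment would merge old and new keys and reintroduce exactly the zombie-value problem the scripts are designed to avoid.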
220
.claude/skills/bmad-builder-setup/scripts/merge-help-csv.py
Executable file
@@ -0,0 +1,220 @@
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
"""Merge module help entries into shared _bmad/module-help.csv.

Reads a source CSV with module help entries and merges them into a target CSV.
Uses an anti-zombie pattern: all existing rows matching the source module code
are removed before appending fresh rows.

Legacy cleanup: when --legacy-dir and --module-code are provided, deletes old
per-module module-help.csv files from {legacy-dir}/{module-code}/ and
{legacy-dir}/core/. Only the current module and core are touched.

Exit codes: 0=success, 1=validation error, 2=runtime error
"""

import argparse
import csv
import json
import sys
from io import StringIO
from pathlib import Path

# CSV header for module-help.csv
HEADER = [
    "module",
    "agent-name",
    "skill-name",
    "display-name",
    "menu-code",
    "capability",
    "args",
    "description",
    "phase",
    "after",
    "before",
    "required",
    "output-location",
    "outputs",
    "",  # trailing empty column from trailing comma
]


def parse_args():
    parser = argparse.ArgumentParser(
        description="Merge module help entries into shared _bmad/module-help.csv with anti-zombie pattern."
    )
    parser.add_argument(
        "--target",
        required=True,
        help="Path to the target _bmad/module-help.csv file",
    )
    parser.add_argument(
        "--source",
        required=True,
        help="Path to the source module-help.csv with entries to merge",
    )
    parser.add_argument(
        "--legacy-dir",
        help="Path to _bmad/ directory to check for legacy per-module CSV files.",
    )
    parser.add_argument(
        "--module-code",
        help="Module code (required with --legacy-dir for scoping cleanup).",
    )
    parser.add_argument(
        "--verbose",
        action="store_true",
        help="Print detailed progress to stderr",
    )
    return parser.parse_args()


def read_csv_rows(path: str) -> tuple[list[str], list[list[str]]]:
    """Read CSV file returning (header, data_rows).

    Returns empty header and rows if file doesn't exist.
    """
    file_path = Path(path)
    if not file_path.exists():
        return [], []

    with open(file_path, "r", encoding="utf-8", newline="") as f:
        content = f.read()

    reader = csv.reader(StringIO(content))
    rows = list(reader)

    if not rows:
        return [], []

    return rows[0], rows[1:]


def extract_module_codes(rows: list[list[str]]) -> set[str]:
    """Extract unique module codes from data rows."""
    codes = set()
    for row in rows:
        if row and row[0].strip():
            codes.add(row[0].strip())
    return codes


def filter_rows(rows: list[list[str]], module_code: str) -> list[list[str]]:
    """Remove all rows matching the given module code."""
    return [row for row in rows if not row or row[0].strip() != module_code]


def write_csv(path: str, header: list[str], rows: list[list[str]], verbose: bool = False) -> None:
    """Write header + rows to CSV file, creating parent dirs as needed."""
    file_path = Path(path)
    file_path.parent.mkdir(parents=True, exist_ok=True)

    if verbose:
        print(f"Writing {len(rows)} data rows to {path}", file=sys.stderr)

    with open(file_path, "w", encoding="utf-8", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        for row in rows:
            writer.writerow(row)


def cleanup_legacy_csvs(
    legacy_dir: str, module_code: str, verbose: bool = False
) -> list:
    """Delete legacy per-module module-help.csv files for this module and core only.

    Returns list of deleted file paths.
    """
    deleted = []
    for subdir in (module_code, "core"):
        legacy_path = Path(legacy_dir) / subdir / "module-help.csv"
        if legacy_path.exists():
            if verbose:
                print(f"Deleting legacy CSV: {legacy_path}", file=sys.stderr)
            legacy_path.unlink()
            deleted.append(str(legacy_path))
    return deleted


def main():
    args = parse_args()

    # Read source entries
    source_header, source_rows = read_csv_rows(args.source)
    if not source_rows:
        print(f"Error: No data rows found in source {args.source}", file=sys.stderr)
        sys.exit(1)

    # Determine module codes being merged
    source_codes = extract_module_codes(source_rows)
    if not source_codes:
        print("Error: Could not determine module code from source rows", file=sys.stderr)
        sys.exit(1)

    if args.verbose:
        print(f"Source module codes: {source_codes}", file=sys.stderr)
        print(f"Source rows: {len(source_rows)}", file=sys.stderr)

    # Read existing target (may not exist)
    target_header, target_rows = read_csv_rows(args.target)
    target_existed = Path(args.target).exists()

    if args.verbose:
        print(f"Target exists: {target_existed}", file=sys.stderr)
        if target_existed:
            print(f"Existing target rows: {len(target_rows)}", file=sys.stderr)

    # Use source header if target doesn't exist or has no header
    header = target_header if target_header else (source_header if source_header else HEADER)

    # Anti-zombie: remove all rows for each source module code
    filtered_rows = target_rows
    removed_count = 0
    for code in source_codes:
        before_count = len(filtered_rows)
        filtered_rows = filter_rows(filtered_rows, code)
        removed_count += before_count - len(filtered_rows)

    if args.verbose and removed_count > 0:
        print(f"Removed {removed_count} existing rows (anti-zombie)", file=sys.stderr)

    # Append source rows
    merged_rows = filtered_rows + source_rows

    # Write result
    write_csv(args.target, header, merged_rows, args.verbose)

    # Legacy cleanup: delete old per-module CSV files
    legacy_deleted = []
    if args.legacy_dir:
        if not args.module_code:
            print(
                "Error: --module-code is required when --legacy-dir is provided",
                file=sys.stderr,
            )
            sys.exit(1)
        legacy_deleted = cleanup_legacy_csvs(
            args.legacy_dir, args.module_code, args.verbose
        )

    # Output result summary as JSON
    result = {
        "status": "success",
        "target_path": str(Path(args.target).resolve()),
        "target_existed": target_existed,
        "module_codes": sorted(source_codes),
        "rows_removed": removed_count,
        "rows_added": len(source_rows),
        "total_rows": len(merged_rows),
        "legacy_csvs_deleted": legacy_deleted,
    }
    print(json.dumps(result, indent=2))


if __name__ == "__main__":
    main()
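The same anti-zombie idea drives the CSV merge above: every existing row whose module column matches an incoming module code is dropped before the fresh rows are appended, so re-running the merge can never duplicate or strand entries. A compressed sketch with hypothetical rows:

```python
# Sketch of the row-level anti-zombie merge: drop all target rows whose
# module column (column 0) matches an incoming module code, then append
# the source rows. The rows below are hypothetical examples.

def merge_rows(target: list, source: list) -> list:
    incoming = {row[0].strip() for row in source if row and row[0].strip()}
    kept = [row for row in target if not row or row[0].strip() not in incoming]
    return kept + source

target = [
    ["bmb", "old-skill", "stale entry"],
    ["bmm", "planner", "untouched"],
]
source = [["bmb", "bmad-builder-setup", "fresh entry"]]

merged = merge_rows(target, source)
print(len(merged))  # 2: the stale bmb row is replaced, the bmm row survives
```

Running `merge_rows` a second time with the same source is a no-op on the row count, which is what makes the real script safe to re-run.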
@@ -0,0 +1,429 @@
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
"""Unit tests for cleanup-legacy.py."""

import json
import os
import sys
import tempfile
import unittest
from pathlib import Path

# Add parent directory to path so we can import the module
sys.path.insert(0, str(Path(__file__).parent.parent))

from importlib.util import spec_from_file_location, module_from_spec

# Import cleanup_legacy module
_spec = spec_from_file_location(
    "cleanup_legacy",
    str(Path(__file__).parent.parent / "cleanup-legacy.py"),
)
cleanup_legacy_mod = module_from_spec(_spec)
_spec.loader.exec_module(cleanup_legacy_mod)

find_skill_dirs = cleanup_legacy_mod.find_skill_dirs
verify_skills_installed = cleanup_legacy_mod.verify_skills_installed
count_files = cleanup_legacy_mod.count_files
cleanup_directories = cleanup_legacy_mod.cleanup_directories


def _make_skill_dir(base, *path_parts):
    """Create a skill directory with a SKILL.md file."""
    skill_dir = os.path.join(base, *path_parts)
    os.makedirs(skill_dir, exist_ok=True)
    with open(os.path.join(skill_dir, "SKILL.md"), "w") as f:
        f.write("---\nname: test-skill\n---\n# Test\n")
    return skill_dir


def _make_file(base, *path_parts, content="placeholder"):
    """Create a file at the given path."""
    file_path = os.path.join(base, *path_parts)
    os.makedirs(os.path.dirname(file_path), exist_ok=True)
    with open(file_path, "w") as f:
        f.write(content)
    return file_path


class TestFindSkillDirs(unittest.TestCase):
    def test_finds_dirs_with_skill_md(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            _make_skill_dir(tmpdir, "skills", "bmad-agent-builder")
            _make_skill_dir(tmpdir, "skills", "bmad-workflow-builder")
            result = find_skill_dirs(tmpdir)
            self.assertEqual(result, ["bmad-agent-builder", "bmad-workflow-builder"])

    def test_ignores_dirs_without_skill_md(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            _make_skill_dir(tmpdir, "skills", "real-skill")
            os.makedirs(os.path.join(tmpdir, "skills", "not-a-skill"))
            _make_file(tmpdir, "skills", "not-a-skill", "README.md")
            result = find_skill_dirs(tmpdir)
            self.assertEqual(result, ["real-skill"])

    def test_empty_directory(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            result = find_skill_dirs(tmpdir)
            self.assertEqual(result, [])

    def test_nonexistent_directory(self):
        result = find_skill_dirs("/nonexistent/path")
        self.assertEqual(result, [])

    def test_finds_nested_skills_in_phase_subdirs(self):
        """Skills nested in phase directories like bmm/1-analysis/bmad-agent-analyst/."""
        with tempfile.TemporaryDirectory() as tmpdir:
            _make_skill_dir(tmpdir, "1-analysis", "bmad-agent-analyst")
            _make_skill_dir(tmpdir, "2-plan", "bmad-agent-pm")
            _make_skill_dir(tmpdir, "4-impl", "bmad-agent-dev")
            result = find_skill_dirs(tmpdir)
            self.assertEqual(
                result, ["bmad-agent-analyst", "bmad-agent-dev", "bmad-agent-pm"]
            )

    def test_deduplicates_skill_names(self):
        """If the same skill name appears in multiple locations, only listed once."""
        with tempfile.TemporaryDirectory() as tmpdir:
            _make_skill_dir(tmpdir, "a", "my-skill")
            _make_skill_dir(tmpdir, "b", "my-skill")
            result = find_skill_dirs(tmpdir)
            self.assertEqual(result, ["my-skill"])


class TestVerifySkillsInstalled(unittest.TestCase):
    def test_all_skills_present(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            skills_dir = os.path.join(tmpdir, "skills")

            # Legacy: bmb has two skills
            _make_skill_dir(bmad_dir, "bmb", "skills", "skill-a")
            _make_skill_dir(bmad_dir, "bmb", "skills", "skill-b")

            # Installed: both exist
            os.makedirs(os.path.join(skills_dir, "skill-a"))
            os.makedirs(os.path.join(skills_dir, "skill-b"))

            result = verify_skills_installed(bmad_dir, ["bmb"], skills_dir)
            self.assertEqual(result, ["skill-a", "skill-b"])

    def test_missing_skill_exits_1(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            skills_dir = os.path.join(tmpdir, "skills")

            _make_skill_dir(bmad_dir, "bmb", "skills", "skill-a")
            _make_skill_dir(bmad_dir, "bmb", "skills", "skill-missing")

            # Only skill-a installed
            os.makedirs(os.path.join(skills_dir, "skill-a"))

            with self.assertRaises(SystemExit) as ctx:
                verify_skills_installed(bmad_dir, ["bmb"], skills_dir)
            self.assertEqual(ctx.exception.code, 1)

    def test_empty_legacy_dir_passes(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            skills_dir = os.path.join(tmpdir, "skills")
            os.makedirs(bmad_dir)
            os.makedirs(skills_dir)

            result = verify_skills_installed(bmad_dir, ["bmb"], skills_dir)
            self.assertEqual(result, [])

    def test_nonexistent_legacy_dir_skipped(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            skills_dir = os.path.join(tmpdir, "skills")
            os.makedirs(skills_dir)
            # bmad_dir doesn't exist — should not error

            result = verify_skills_installed(bmad_dir, ["bmb"], skills_dir)
            self.assertEqual(result, [])

    def test_dir_without_skills_skipped(self):
        """Directories like _config/ that have no SKILL.md are not verified."""
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            skills_dir = os.path.join(tmpdir, "skills")

            # _config has files but no SKILL.md
            _make_file(bmad_dir, "_config", "manifest.yaml", content="version: 1")
            _make_file(bmad_dir, "_config", "help.csv", content="a,b,c")
            os.makedirs(skills_dir)

            result = verify_skills_installed(bmad_dir, ["_config"], skills_dir)
            self.assertEqual(result, [])

    def test_verifies_across_multiple_dirs(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            skills_dir = os.path.join(tmpdir, "skills")

            _make_skill_dir(bmad_dir, "bmb", "skills", "skill-a")
            _make_skill_dir(bmad_dir, "core", "skills", "skill-b")

            os.makedirs(os.path.join(skills_dir, "skill-a"))
            os.makedirs(os.path.join(skills_dir, "skill-b"))

            result = verify_skills_installed(
                bmad_dir, ["bmb", "core"], skills_dir
            )
            self.assertEqual(result, ["skill-a", "skill-b"])


class TestCountFiles(unittest.TestCase):
    def test_counts_files_recursively(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            _make_file(tmpdir, "a.txt")
            _make_file(tmpdir, "sub", "b.txt")
            _make_file(tmpdir, "sub", "deep", "c.txt")
            self.assertEqual(count_files(Path(tmpdir)), 3)

    def test_empty_dir_returns_zero(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            self.assertEqual(count_files(Path(tmpdir)), 0)


class TestCleanupDirectories(unittest.TestCase):
    def test_removes_single_module_dir(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            os.makedirs(os.path.join(bmad_dir, "bmb", "skills"))
            _make_file(bmad_dir, "bmb", "skills", "SKILL.md")

            removed, not_found, count = cleanup_directories(bmad_dir, ["bmb"])
            self.assertEqual(removed, ["bmb"])
            self.assertEqual(not_found, [])
            self.assertGreater(count, 0)
            self.assertFalse(os.path.exists(os.path.join(bmad_dir, "bmb")))

    def test_removes_module_core_and_config(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            for dirname in ("bmb", "core", "_config"):
                _make_file(bmad_dir, dirname, "some-file.txt")

            removed, not_found, count = cleanup_directories(
                bmad_dir, ["bmb", "core", "_config"]
            )
            self.assertEqual(sorted(removed), ["_config", "bmb", "core"])
            self.assertEqual(not_found, [])
            for dirname in ("bmb", "core", "_config"):
                self.assertFalse(os.path.exists(os.path.join(bmad_dir, dirname)))

    def test_nonexistent_dir_in_not_found(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            os.makedirs(bmad_dir)

            removed, not_found, count = cleanup_directories(bmad_dir, ["bmb"])
            self.assertEqual(removed, [])
            self.assertEqual(not_found, ["bmb"])
            self.assertEqual(count, 0)

    def test_preserves_other_module_dirs(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            for dirname in ("bmb", "bmm", "tea"):
                _make_file(bmad_dir, dirname, "file.txt")

            removed, not_found, count = cleanup_directories(bmad_dir, ["bmb"])
            self.assertEqual(removed, ["bmb"])
            self.assertTrue(os.path.isdir(os.path.join(bmad_dir, "bmm")))
            self.assertTrue(os.path.isdir(os.path.join(bmad_dir, "tea")))

    def test_preserves_root_config_files(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            _make_file(bmad_dir, "config.yaml", content="key: val")
            _make_file(bmad_dir, "config.user.yaml", content="user: test")
            _make_file(bmad_dir, "module-help.csv", content="a,b,c")
            _make_file(bmad_dir, "bmb", "stuff.txt")

            cleanup_directories(bmad_dir, ["bmb"])

            self.assertTrue(os.path.exists(os.path.join(bmad_dir, "config.yaml")))
            self.assertTrue(
                os.path.exists(os.path.join(bmad_dir, "config.user.yaml"))
            )
            self.assertTrue(
                os.path.exists(os.path.join(bmad_dir, "module-help.csv"))
            )

    def test_removes_hidden_files(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            _make_file(bmad_dir, "bmb", ".DS_Store")
            _make_file(bmad_dir, "bmb", "skills", ".hidden")

            removed, not_found, count = cleanup_directories(bmad_dir, ["bmb"])
            self.assertEqual(removed, ["bmb"])
            self.assertEqual(count, 2)
            self.assertFalse(os.path.exists(os.path.join(bmad_dir, "bmb")))

    def test_idempotent_rerun(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            _make_file(bmad_dir, "bmb", "file.txt")

            # First run
            removed1, not_found1, _ = cleanup_directories(bmad_dir, ["bmb"])
            self.assertEqual(removed1, ["bmb"])
            self.assertEqual(not_found1, [])

            # Second run — idempotent
            removed2, not_found2, count2 = cleanup_directories(bmad_dir, ["bmb"])
            self.assertEqual(removed2, [])
            self.assertEqual(not_found2, ["bmb"])
            self.assertEqual(count2, 0)


class TestSafetyCheck(unittest.TestCase):
    def test_no_skills_dir_skips_check(self):
        """When --skills-dir is not provided, no verification happens."""
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            _make_skill_dir(bmad_dir, "bmb", "skills", "some-skill")

            # No skills_dir — cleanup should proceed without verification
            removed, not_found, count = cleanup_directories(bmad_dir, ["bmb"])
            self.assertEqual(removed, ["bmb"])

    def test_missing_skill_blocks_removal(self):
        """When --skills-dir is provided and a skill is missing, exit 1."""
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            skills_dir = os.path.join(tmpdir, "skills")

            _make_skill_dir(bmad_dir, "bmb", "skills", "installed-skill")
            _make_skill_dir(bmad_dir, "bmb", "skills", "missing-skill")

            os.makedirs(os.path.join(skills_dir, "installed-skill"))
            # missing-skill not created in skills_dir

            with self.assertRaises(SystemExit) as ctx:
                verify_skills_installed(bmad_dir, ["bmb"], skills_dir)
            self.assertEqual(ctx.exception.code, 1)

            # Directory should NOT have been removed (verification failed before cleanup)
            self.assertTrue(os.path.isdir(os.path.join(bmad_dir, "bmb")))

    def test_dir_without_skills_not_checked(self):
        """Directories like _config that have no SKILL.md pass verification."""
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            skills_dir = os.path.join(tmpdir, "skills")

            _make_file(bmad_dir, "_config", "manifest.yaml")
            os.makedirs(skills_dir)

            # Should not raise — _config has no skills to verify
            result = verify_skills_installed(bmad_dir, ["_config"], skills_dir)
            self.assertEqual(result, [])


class TestEndToEnd(unittest.TestCase):
    def test_full_cleanup_with_verification(self):
        """Simulate complete cleanup flow with safety check."""
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")
            skills_dir = os.path.join(tmpdir, "skills")

            # Create legacy structure
            _make_skill_dir(bmad_dir, "bmb", "skills", "bmad-agent-builder")
            _make_skill_dir(bmad_dir, "bmb", "skills", "bmad-builder-setup")
            _make_file(bmad_dir, "bmb", "skills", "bmad-agent-builder", "assets", "template.md")
            _make_skill_dir(bmad_dir, "core", "skills", "bmad-brainstorming")
            _make_file(bmad_dir, "_config", "manifest.yaml")
            _make_file(bmad_dir, "_config", "bmad-help.csv")

            # Create root config files that must survive
            _make_file(bmad_dir, "config.yaml", content="document_output_language: English")
            _make_file(bmad_dir, "config.user.yaml", content="user_name: Test")
            _make_file(bmad_dir, "module-help.csv", content="module,name\nbmb,builder")

            # Create other module dirs that must survive
            _make_file(bmad_dir, "bmm", "config.yaml")
            _make_file(bmad_dir, "tea", "config.yaml")

            # Create installed skills
            os.makedirs(os.path.join(skills_dir, "bmad-agent-builder"))
            os.makedirs(os.path.join(skills_dir, "bmad-builder-setup"))
            os.makedirs(os.path.join(skills_dir, "bmad-brainstorming"))

            # Verify
            verified = verify_skills_installed(
                bmad_dir, ["bmb", "core", "_config"], skills_dir
            )
            self.assertIn("bmad-agent-builder", verified)
            self.assertIn("bmad-builder-setup", verified)
            self.assertIn("bmad-brainstorming", verified)

            # Cleanup
            removed, not_found, file_count = cleanup_directories(
                bmad_dir, ["bmb", "core", "_config"]
            )
            self.assertEqual(sorted(removed), ["_config", "bmb", "core"])
            self.assertEqual(not_found, [])
            self.assertGreater(file_count, 0)

            # Verify final state
            self.assertFalse(os.path.exists(os.path.join(bmad_dir, "bmb")))
            self.assertFalse(os.path.exists(os.path.join(bmad_dir, "core")))
            self.assertFalse(os.path.exists(os.path.join(bmad_dir, "_config")))

            # Root config files survived
            self.assertTrue(os.path.exists(os.path.join(bmad_dir, "config.yaml")))
            self.assertTrue(os.path.exists(os.path.join(bmad_dir, "config.user.yaml")))
            self.assertTrue(os.path.exists(os.path.join(bmad_dir, "module-help.csv")))

            # Other modules survived
            self.assertTrue(os.path.isdir(os.path.join(bmad_dir, "bmm")))
            self.assertTrue(os.path.isdir(os.path.join(bmad_dir, "tea")))

    def test_simulate_post_merge_scripts(self):
        """Simulate the full flow: merge scripts run first (delete config files),
        then cleanup removes directories."""
        with tempfile.TemporaryDirectory() as tmpdir:
            bmad_dir = os.path.join(tmpdir, "_bmad")

            # Legacy state: config files already deleted by merge scripts
            # but directories and skill content remain
            _make_skill_dir(bmad_dir, "bmb", "skills", "bmad-agent-builder")
            _make_file(bmad_dir, "bmb", "skills", "bmad-agent-builder", "refs", "doc.md")
            _make_file(bmad_dir, "bmb", ".DS_Store")
            # config.yaml already deleted by merge-config.py
            # module-help.csv already deleted by merge-help-csv.py

            _make_skill_dir(bmad_dir, "core", "skills", "bmad-help")
            # core/config.yaml already deleted
            # core/module-help.csv already deleted

            # Root files from merge scripts
            _make_file(bmad_dir, "config.yaml", content="bmb:\n name: BMad Builder")
            _make_file(bmad_dir, "config.user.yaml", content="user_name: Test")
            _make_file(bmad_dir, "module-help.csv", content="module,name")

            # Cleanup directories
            removed, not_found, file_count = cleanup_directories(
                bmad_dir, ["bmb", "core"]
            )
            self.assertEqual(sorted(removed), ["bmb", "core"])
            self.assertGreater(file_count, 0)

            # Final state: only root config files
            remaining = os.listdir(bmad_dir)
            self.assertEqual(
                sorted(remaining),
                ["config.user.yaml", "config.yaml", "module-help.csv"],
            )


if __name__ == "__main__":
    unittest.main()
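The tests above exercise `count_files` but its definition is outside this hunk. A plausible implementation consistent with the `TestCountFiles` expectations (recursive, hidden files included) would be a `rglob` sweep — note this is a hypothetical sketch, not the actual code from `cleanup-legacy.py`:

```python
from pathlib import Path
import tempfile

# Hypothetical sketch of a recursive file counter that satisfies the
# TestCountFiles cases above; the real definition in cleanup-legacy.py
# may differ in detail.
def count_files(root: Path) -> int:
    """Count regular files under root, recursively (hidden files included)."""
    return sum(1 for p in root.rglob("*") if p.is_file())

with tempfile.TemporaryDirectory() as tmp:
    base = Path(tmp)
    (base / "sub").mkdir()
    (base / "a.txt").write_text("x")
    (base / "sub" / "b.txt").write_text("y")
    n = count_files(base)
    print(n)  # 2
```

`rglob("*")` does not skip dotfiles, which matches `test_removes_hidden_files` counting `.DS_Store` and `.hidden` toward the total.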
@@ -0,0 +1,644 @@
|
||||
#!/usr/bin/env python3
|
||||
# /// script
|
||||
# requires-python = ">=3.9"
|
||||
# dependencies = ["pyyaml"]
|
||||
# ///
|
||||
"""Unit tests for merge-config.py."""
|
||||
|
||||
import json
|
||||
import os
|
||||
import sys
|
||||
import tempfile
|
||||
import unittest
|
||||
from pathlib import Path
|
||||
|
||||
# Add parent directory to path so we can import the module
|
||||
sys.path.insert(0, str(Path(__file__).parent.parent))
|
||||
|
||||
import yaml
|
||||
|
||||
from importlib.util import spec_from_file_location, module_from_spec
|
||||
|
||||
# Import merge_config module
|
||||
_spec = spec_from_file_location(
|
||||
"merge_config",
|
||||
str(Path(__file__).parent.parent / "merge-config.py"),
|
||||
)
|
||||
merge_config_mod = module_from_spec(_spec)
|
||||
_spec.loader.exec_module(merge_config_mod)
|
||||
|
||||
extract_module_metadata = merge_config_mod.extract_module_metadata
|
||||
extract_user_settings = merge_config_mod.extract_user_settings
|
||||
merge_config = merge_config_mod.merge_config
|
||||
load_legacy_values = merge_config_mod.load_legacy_values
|
||||
apply_legacy_defaults = merge_config_mod.apply_legacy_defaults
|
||||
cleanup_legacy_configs = merge_config_mod.cleanup_legacy_configs
|
||||
apply_result_templates = merge_config_mod.apply_result_templates
|
||||
|
||||
|
||||
SAMPLE_MODULE_YAML = {
|
||||
"code": "bmb",
|
||||
"name": "BMad Builder",
|
||||
"description": "Standard Skill Compliant Factory",
|
||||
"default_selected": False,
|
||||
"bmad_builder_output_folder": {
|
||||
"prompt": "Where should skills be saved?",
|
||||
"default": "_bmad-output/skills",
|
||||
"result": "{project-root}/{value}",
|
||||
},
|
||||
"bmad_builder_reports": {
|
||||
"prompt": "Output for reports?",
|
||||
"default": "_bmad-output/reports",
|
||||
"result": "{project-root}/{value}",
|
||||
},
|
||||
}
|
||||
|
||||
SAMPLE_MODULE_YAML_WITH_VERSION = {
|
||||
**SAMPLE_MODULE_YAML,
|
||||
"module_version": "1.0.0",
|
||||
}
|
||||
|
||||
SAMPLE_MODULE_YAML_WITH_USER_SETTING = {
|
||||
**SAMPLE_MODULE_YAML,
|
||||
"some_pref": {
|
||||
"prompt": "Your preference?",
|
||||
"default": "default_val",
|
||||
"user_setting": True,
|
||||
},
|
||||
}
|
||||
|
||||
|
||||
class TestExtractModuleMetadata(unittest.TestCase):
|
||||
def test_extracts_metadata_fields(self):
|
||||
result = extract_module_metadata(SAMPLE_MODULE_YAML)
|
||||
self.assertEqual(result["name"], "BMad Builder")
|
||||
self.assertEqual(result["description"], "Standard Skill Compliant Factory")
|
||||
self.assertFalse(result["default_selected"])
|
||||
|
||||
def test_excludes_variable_definitions(self):
|
||||
result = extract_module_metadata(SAMPLE_MODULE_YAML)
|
||||
self.assertNotIn("bmad_builder_output_folder", result)
|
||||
self.assertNotIn("bmad_builder_reports", result)
|
||||
self.assertNotIn("code", result)
|
||||
|
||||
def test_version_present(self):
|
||||
result = extract_module_metadata(SAMPLE_MODULE_YAML_WITH_VERSION)
|
||||
self.assertEqual(result["version"], "1.0.0")
|
||||
|
||||
def test_version_absent_is_none(self):
|
||||
result = extract_module_metadata(SAMPLE_MODULE_YAML)
|
||||
self.assertIn("version", result)
|
||||
self.assertIsNone(result["version"])
|
||||
|
||||
def test_field_order(self):
|
||||
result = extract_module_metadata(SAMPLE_MODULE_YAML_WITH_VERSION)
|
||||
keys = list(result.keys())
|
||||
self.assertEqual(keys, ["name", "description", "version", "default_selected"])
|
||||
|
||||
|
||||
class TestExtractUserSettings(unittest.TestCase):
    def test_core_user_keys(self):
        answers = {
            "core": {
                "user_name": "Brian",
                "communication_language": "English",
                "document_output_language": "English",
                "output_folder": "_bmad-output",
            },
        }
        result = extract_user_settings(SAMPLE_MODULE_YAML, answers)
        self.assertEqual(result["user_name"], "Brian")
        self.assertEqual(result["communication_language"], "English")
        self.assertNotIn("document_output_language", result)
        self.assertNotIn("output_folder", result)

    def test_module_user_setting_true(self):
        answers = {
            "core": {"user_name": "Brian"},
            "module": {"some_pref": "custom_val"},
        }
        result = extract_user_settings(SAMPLE_MODULE_YAML_WITH_USER_SETTING, answers)
        self.assertEqual(result["user_name"], "Brian")
        self.assertEqual(result["some_pref"], "custom_val")

    def test_no_core_answers(self):
        answers = {"module": {"some_pref": "val"}}
        result = extract_user_settings(SAMPLE_MODULE_YAML_WITH_USER_SETTING, answers)
        self.assertNotIn("user_name", result)
        self.assertEqual(result["some_pref"], "val")

    def test_no_user_settings_in_module(self):
        answers = {
            "core": {"user_name": "Brian"},
            "module": {"bmad_builder_output_folder": "path"},
        }
        result = extract_user_settings(SAMPLE_MODULE_YAML, answers)
        self.assertEqual(result, {"user_name": "Brian"})

    def test_empty_answers(self):
        result = extract_user_settings(SAMPLE_MODULE_YAML, {})
        self.assertEqual(result, {})


class TestApplyResultTemplates(unittest.TestCase):
    def test_applies_template(self):
        answers = {"bmad_builder_output_folder": "skills"}
        result = apply_result_templates(SAMPLE_MODULE_YAML, answers)
        self.assertEqual(result["bmad_builder_output_folder"], "{project-root}/skills")

    def test_applies_multiple_templates(self):
        answers = {
            "bmad_builder_output_folder": "skills",
            "bmad_builder_reports": "skills/reports",
        }
        result = apply_result_templates(SAMPLE_MODULE_YAML, answers)
        self.assertEqual(result["bmad_builder_output_folder"], "{project-root}/skills")
        self.assertEqual(result["bmad_builder_reports"], "{project-root}/skills/reports")

    def test_skips_when_no_template(self):
        """Variables without a result field are stored as-is."""
        yaml_no_result = {
            "code": "test",
            "my_var": {"prompt": "Enter value", "default": "foo"},
        }
        answers = {"my_var": "bar"}
        result = apply_result_templates(yaml_no_result, answers)
        self.assertEqual(result["my_var"], "bar")

    def test_skips_when_value_already_has_project_root(self):
        """Prevent double-prefixing if value already contains {project-root}."""
        answers = {"bmad_builder_output_folder": "{project-root}/skills"}
        result = apply_result_templates(SAMPLE_MODULE_YAML, answers)
        self.assertEqual(result["bmad_builder_output_folder"], "{project-root}/skills")

    def test_empty_answers(self):
        result = apply_result_templates(SAMPLE_MODULE_YAML, {})
        self.assertEqual(result, {})

    def test_unknown_key_passed_through(self):
        """Keys not in module.yaml are passed through unchanged."""
        answers = {"unknown_key": "some_value"}
        result = apply_result_templates(SAMPLE_MODULE_YAML, answers)
        self.assertEqual(result["unknown_key"], "some_value")


class TestMergeConfig(unittest.TestCase):
    def test_fresh_install_with_core_and_module(self):
        answers = {
            "core": {
                "user_name": "Brian",
                "communication_language": "English",
                "document_output_language": "English",
                "output_folder": "_bmad-output",
            },
            "module": {
                "bmad_builder_output_folder": "_bmad-output/skills",
            },
        }
        result = merge_config({}, SAMPLE_MODULE_YAML, answers)

        # User-only keys must NOT appear in config.yaml
        self.assertNotIn("user_name", result)
        self.assertNotIn("communication_language", result)
        # Shared core keys do appear
        self.assertEqual(result["document_output_language"], "English")
        self.assertEqual(result["output_folder"], "_bmad-output")
        self.assertEqual(result["bmb"]["name"], "BMad Builder")
        self.assertEqual(result["bmb"]["bmad_builder_output_folder"], "{project-root}/_bmad-output/skills")

    def test_update_strips_user_keys_preserves_shared(self):
        existing = {
            "user_name": "Brian",
            "communication_language": "English",
            "document_output_language": "English",
            "other_module": {"name": "Other"},
        }
        answers = {
            "module": {
                "bmad_builder_output_folder": "_bmad-output/skills",
            },
        }
        result = merge_config(existing, SAMPLE_MODULE_YAML, answers)

        # User-only keys stripped from config
        self.assertNotIn("user_name", result)
        self.assertNotIn("communication_language", result)
        # Shared core preserved at root
        self.assertEqual(result["document_output_language"], "English")
        # Other module preserved
        self.assertIn("other_module", result)
        # New module added
        self.assertIn("bmb", result)

    def test_anti_zombie_removes_existing_module(self):
        existing = {
            "user_name": "Brian",
            "bmb": {
                "name": "BMad Builder",
                "old_variable": "should_be_removed",
                "bmad_builder_output_folder": "old/path",
            },
        }
        answers = {
            "module": {
                "bmad_builder_output_folder": "new/path",
            },
        }
        result = merge_config(existing, SAMPLE_MODULE_YAML, answers)

        # Old variable is gone
        self.assertNotIn("old_variable", result["bmb"])
        # New value is present
        self.assertEqual(result["bmb"]["bmad_builder_output_folder"], "{project-root}/new/path")
        # Metadata is fresh from module.yaml
        self.assertEqual(result["bmb"]["name"], "BMad Builder")

    def test_user_keys_never_written_to_config(self):
        existing = {
            "user_name": "OldName",
            "communication_language": "Spanish",
            "document_output_language": "French",
        }
        answers = {
            "core": {"user_name": "NewName", "communication_language": "English"},
            "module": {},
        }
        result = merge_config(existing, SAMPLE_MODULE_YAML, answers)

        # User-only keys stripped even if they were in existing config
        self.assertNotIn("user_name", result)
        self.assertNotIn("communication_language", result)
        # Shared core preserved
        self.assertEqual(result["document_output_language"], "French")

    def test_no_core_answers_still_strips_user_keys(self):
        existing = {
            "user_name": "Brian",
            "output_folder": "/out",
        }
        answers = {
            "module": {"bmad_builder_output_folder": "path"},
        }
        result = merge_config(existing, SAMPLE_MODULE_YAML, answers)

        # User-only keys stripped even without core answers
        self.assertNotIn("user_name", result)
        # Shared core unchanged
        self.assertEqual(result["output_folder"], "/out")

    def test_module_metadata_always_from_yaml(self):
        """Module metadata comes from module.yaml, not answers."""
        answers = {
            "module": {"bmad_builder_output_folder": "path"},
        }
        result = merge_config({}, SAMPLE_MODULE_YAML, answers)

        self.assertEqual(result["bmb"]["name"], "BMad Builder")
        self.assertEqual(result["bmb"]["description"], "Standard Skill Compliant Factory")
        self.assertFalse(result["bmb"]["default_selected"])

    def test_legacy_core_section_migrated_user_keys_stripped(self):
        """Old config with core: nested section — user keys stripped after migration."""
        existing = {
            "core": {
                "user_name": "Brian",
                "communication_language": "English",
                "document_output_language": "English",
                "output_folder": "/out",
            },
            "bmb": {"name": "BMad Builder"},
        }
        answers = {
            "module": {"bmad_builder_output_folder": "path"},
        }
        result = merge_config(existing, SAMPLE_MODULE_YAML, answers)

        # User-only keys stripped after migration
        self.assertNotIn("user_name", result)
        self.assertNotIn("communication_language", result)
        # Shared core values hoisted to root
        self.assertEqual(result["document_output_language"], "English")
        self.assertEqual(result["output_folder"], "/out")
        # Legacy core key removed
        self.assertNotIn("core", result)
        # Module still works
        self.assertIn("bmb", result)

    def test_legacy_core_user_keys_stripped_after_migration(self):
        """Legacy core: values get migrated, user keys stripped, shared keys kept."""
        existing = {
            "core": {"user_name": "OldName", "output_folder": "/old"},
        }
        answers = {
            "core": {"user_name": "NewName", "output_folder": "/new"},
            "module": {},
        }
        result = merge_config(existing, SAMPLE_MODULE_YAML, answers)

        # User-only key not in config even after migration + override
        self.assertNotIn("user_name", result)
        self.assertNotIn("core", result)
        # Shared core key written
        self.assertEqual(result["output_folder"], "/new")


class TestEndToEnd(unittest.TestCase):
    def test_write_and_read_round_trip(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            config_path = os.path.join(tmpdir, "_bmad", "config.yaml")

            # Write answers
            answers = {
                "core": {
                    "user_name": "Brian",
                    "communication_language": "English",
                    "document_output_language": "English",
                    "output_folder": "_bmad-output",
                },
                "module": {"bmad_builder_output_folder": "_bmad-output/skills"},
            }

            # Run merge
            result = merge_config({}, SAMPLE_MODULE_YAML, answers)
            merge_config_mod.write_config(result, config_path)

            # Read back
            with open(config_path, "r") as f:
                written = yaml.safe_load(f)

            # User-only keys not written to config.yaml
            self.assertNotIn("user_name", written)
            self.assertNotIn("communication_language", written)
            # Shared core keys written
            self.assertEqual(written["document_output_language"], "English")
            self.assertEqual(written["output_folder"], "_bmad-output")
            self.assertEqual(written["bmb"]["bmad_builder_output_folder"], "{project-root}/_bmad-output/skills")

    def test_update_round_trip(self):
        """Simulate install, then re-install with different values."""
        with tempfile.TemporaryDirectory() as tmpdir:
            config_path = os.path.join(tmpdir, "config.yaml")

            # First install
            answers1 = {
                "core": {"output_folder": "/out"},
                "module": {"bmad_builder_output_folder": "old/path"},
            }
            result1 = merge_config({}, SAMPLE_MODULE_YAML, answers1)
            merge_config_mod.write_config(result1, config_path)

            # Second install (update)
            existing = merge_config_mod.load_yaml_file(config_path)
            answers2 = {
                "module": {"bmad_builder_output_folder": "new/path"},
            }
            result2 = merge_config(existing, SAMPLE_MODULE_YAML, answers2)
            merge_config_mod.write_config(result2, config_path)

            # Verify
            with open(config_path, "r") as f:
                final = yaml.safe_load(f)

            self.assertEqual(final["output_folder"], "/out")
            self.assertNotIn("user_name", final)
            self.assertEqual(final["bmb"]["bmad_builder_output_folder"], "{project-root}/new/path")


class TestLoadLegacyValues(unittest.TestCase):
    def _make_legacy_dir(self, tmpdir, core_data=None, module_code=None, module_data=None):
        """Create legacy directory structure for testing."""
        legacy_dir = os.path.join(tmpdir, "_bmad")
        if core_data is not None:
            core_dir = os.path.join(legacy_dir, "core")
            os.makedirs(core_dir, exist_ok=True)
            with open(os.path.join(core_dir, "config.yaml"), "w") as f:
                yaml.dump(core_data, f)
        if module_code and module_data is not None:
            mod_dir = os.path.join(legacy_dir, module_code)
            os.makedirs(mod_dir, exist_ok=True)
            with open(os.path.join(mod_dir, "config.yaml"), "w") as f:
                yaml.dump(module_data, f)
        return legacy_dir

    def test_reads_core_keys_from_core_config(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            legacy_dir = self._make_legacy_dir(tmpdir, core_data={
                "user_name": "Brian",
                "communication_language": "English",
                "document_output_language": "English",
                "output_folder": "/out",
            })
            core, mod, files = load_legacy_values(legacy_dir, "bmb", SAMPLE_MODULE_YAML)
            self.assertEqual(core["user_name"], "Brian")
            self.assertEqual(core["communication_language"], "English")
            self.assertEqual(len(files), 1)
            self.assertEqual(mod, {})

    def test_reads_module_keys_matching_yaml_variables(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            legacy_dir = self._make_legacy_dir(
                tmpdir,
                module_code="bmb",
                module_data={
                    "bmad_builder_output_folder": "custom/path",
                    "bmad_builder_reports": "custom/reports",
                    "user_name": "Brian",  # core key duplicated
                    "unknown_key": "ignored",  # not in module.yaml
                },
            )
            core, mod, files = load_legacy_values(legacy_dir, "bmb", SAMPLE_MODULE_YAML)
            self.assertEqual(mod["bmad_builder_output_folder"], "custom/path")
            self.assertEqual(mod["bmad_builder_reports"], "custom/reports")
            self.assertNotIn("unknown_key", mod)
            # Core key from module config used as fallback
            self.assertEqual(core["user_name"], "Brian")
            self.assertEqual(len(files), 1)

    def test_core_config_takes_priority_over_module_for_core_keys(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            legacy_dir = self._make_legacy_dir(
                tmpdir,
                core_data={"user_name": "FromCore"},
                module_code="bmb",
                module_data={"user_name": "FromModule"},
            )
            core, mod, files = load_legacy_values(legacy_dir, "bmb", SAMPLE_MODULE_YAML)
            self.assertEqual(core["user_name"], "FromCore")
            self.assertEqual(len(files), 2)

    def test_no_legacy_files_returns_empty(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            legacy_dir = os.path.join(tmpdir, "_bmad")
            os.makedirs(legacy_dir)
            core, mod, files = load_legacy_values(legacy_dir, "bmb", SAMPLE_MODULE_YAML)
            self.assertEqual(core, {})
            self.assertEqual(mod, {})
            self.assertEqual(files, [])

    def test_ignores_other_module_directories(self):
        """Only reads core and the specified module_code — not other modules."""
        with tempfile.TemporaryDirectory() as tmpdir:
            legacy_dir = self._make_legacy_dir(
                tmpdir,
                module_code="bmb",
                module_data={"bmad_builder_output_folder": "bmb/path"},
            )
            # Create another module directory that should be ignored
            other_dir = os.path.join(legacy_dir, "cis")
            os.makedirs(other_dir)
            with open(os.path.join(other_dir, "config.yaml"), "w") as f:
                yaml.dump({"visual_tools": "advanced"}, f)

            core, mod, files = load_legacy_values(legacy_dir, "bmb", SAMPLE_MODULE_YAML)
            self.assertNotIn("visual_tools", mod)
            self.assertEqual(len(files), 1)  # only bmb, not cis


class TestApplyLegacyDefaults(unittest.TestCase):
    def test_legacy_fills_missing_core(self):
        answers = {"module": {"bmad_builder_output_folder": "path"}}
        result = apply_legacy_defaults(
            answers,
            legacy_core={"user_name": "Brian", "communication_language": "English"},
            legacy_module={},
        )
        self.assertEqual(result["core"]["user_name"], "Brian")
        self.assertEqual(result["module"]["bmad_builder_output_folder"], "path")

    def test_answers_override_legacy(self):
        answers = {
            "core": {"user_name": "NewName"},
            "module": {"bmad_builder_output_folder": "new/path"},
        }
        result = apply_legacy_defaults(
            answers,
            legacy_core={"user_name": "OldName"},
            legacy_module={"bmad_builder_output_folder": "old/path"},
        )
        self.assertEqual(result["core"]["user_name"], "NewName")
        self.assertEqual(result["module"]["bmad_builder_output_folder"], "new/path")

    def test_legacy_fills_missing_module_keys(self):
        answers = {"module": {}}
        result = apply_legacy_defaults(
            answers,
            legacy_core={},
            legacy_module={"bmad_builder_output_folder": "legacy/path"},
        )
        self.assertEqual(result["module"]["bmad_builder_output_folder"], "legacy/path")

    def test_empty_legacy_is_noop(self):
        answers = {"core": {"user_name": "Brian"}, "module": {"key": "val"}}
        result = apply_legacy_defaults(answers, {}, {})
        self.assertEqual(result, answers)


class TestCleanupLegacyConfigs(unittest.TestCase):
    def test_deletes_module_and_core_configs(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            legacy_dir = os.path.join(tmpdir, "_bmad")
            for subdir in ("core", "bmb"):
                d = os.path.join(legacy_dir, subdir)
                os.makedirs(d)
                with open(os.path.join(d, "config.yaml"), "w") as f:
                    f.write("key: val\n")

            deleted = cleanup_legacy_configs(legacy_dir, "bmb")
            self.assertEqual(len(deleted), 2)
            self.assertFalse(os.path.exists(os.path.join(legacy_dir, "core", "config.yaml")))
            self.assertFalse(os.path.exists(os.path.join(legacy_dir, "bmb", "config.yaml")))
            # Directories still exist
            self.assertTrue(os.path.isdir(os.path.join(legacy_dir, "core")))
            self.assertTrue(os.path.isdir(os.path.join(legacy_dir, "bmb")))

    def test_leaves_other_module_configs_alone(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            legacy_dir = os.path.join(tmpdir, "_bmad")
            for subdir in ("bmb", "cis"):
                d = os.path.join(legacy_dir, subdir)
                os.makedirs(d)
                with open(os.path.join(d, "config.yaml"), "w") as f:
                    f.write("key: val\n")

            deleted = cleanup_legacy_configs(legacy_dir, "bmb")
            self.assertEqual(len(deleted), 1)  # only bmb, not cis
            self.assertTrue(os.path.exists(os.path.join(legacy_dir, "cis", "config.yaml")))

    def test_no_legacy_files_returns_empty(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            deleted = cleanup_legacy_configs(tmpdir, "bmb")
            self.assertEqual(deleted, [])


class TestLegacyEndToEnd(unittest.TestCase):
    def test_full_legacy_migration(self):
        """Simulate installing a module with legacy configs present."""
        with tempfile.TemporaryDirectory() as tmpdir:
            config_path = os.path.join(tmpdir, "_bmad", "config.yaml")
            legacy_dir = os.path.join(tmpdir, "_bmad")

            # Create legacy core config
            core_dir = os.path.join(legacy_dir, "core")
            os.makedirs(core_dir)
            with open(os.path.join(core_dir, "config.yaml"), "w") as f:
                yaml.dump({
                    "user_name": "LegacyUser",
                    "communication_language": "Spanish",
                    "document_output_language": "French",
                    "output_folder": "/legacy/out",
                }, f)

            # Create legacy module config
            mod_dir = os.path.join(legacy_dir, "bmb")
            os.makedirs(mod_dir)
            with open(os.path.join(mod_dir, "config.yaml"), "w") as f:
                yaml.dump({
                    "bmad_builder_output_folder": "legacy/skills",
                    "bmad_builder_reports": "legacy/reports",
                    "user_name": "LegacyUser",  # duplicated core key
                }, f)

            # Answers from the user (only partially filled — user accepted some defaults)
            answers = {
                "core": {"user_name": "NewUser"},
                "module": {"bmad_builder_output_folder": "new/skills"},
            }

            # Load and apply legacy
            legacy_core, legacy_module, _ = load_legacy_values(
                legacy_dir, "bmb", SAMPLE_MODULE_YAML
            )
            answers = apply_legacy_defaults(answers, legacy_core, legacy_module)

            # Core: NewUser overrides legacy, but legacy Spanish fills in communication_language
            self.assertEqual(answers["core"]["user_name"], "NewUser")
            self.assertEqual(answers["core"]["communication_language"], "Spanish")

            # Module: new/skills overrides, but legacy/reports fills in
            self.assertEqual(answers["module"]["bmad_builder_output_folder"], "new/skills")
            self.assertEqual(answers["module"]["bmad_builder_reports"], "legacy/reports")

            # Merge
            result = merge_config({}, SAMPLE_MODULE_YAML, answers)
            merge_config_mod.write_config(result, config_path)

            # Cleanup
            deleted = cleanup_legacy_configs(legacy_dir, "bmb")
            self.assertEqual(len(deleted), 2)
            self.assertFalse(os.path.exists(os.path.join(core_dir, "config.yaml")))
            self.assertFalse(os.path.exists(os.path.join(mod_dir, "config.yaml")))

            # Verify final config — user-only keys NOT in config.yaml
            with open(config_path, "r") as f:
                final = yaml.safe_load(f)
            self.assertNotIn("user_name", final)
            self.assertNotIn("communication_language", final)
            # Shared core keys present
            self.assertEqual(final["document_output_language"], "French")
            self.assertEqual(final["output_folder"], "/legacy/out")
            self.assertEqual(final["bmb"]["bmad_builder_output_folder"], "{project-root}/new/skills")
            self.assertEqual(final["bmb"]["bmad_builder_reports"], "{project-root}/legacy/reports")


if __name__ == "__main__":
    unittest.main()
@@ -0,0 +1,237 @@
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
"""Unit tests for merge-help-csv.py."""

import csv
import os
import sys
import tempfile
import unittest
from io import StringIO
from pathlib import Path

# Import merge_help_csv module
from importlib.util import spec_from_file_location, module_from_spec

_spec = spec_from_file_location(
    "merge_help_csv",
    str(Path(__file__).parent.parent / "merge-help-csv.py"),
)
merge_help_csv_mod = module_from_spec(_spec)
_spec.loader.exec_module(merge_help_csv_mod)

extract_module_codes = merge_help_csv_mod.extract_module_codes
filter_rows = merge_help_csv_mod.filter_rows
read_csv_rows = merge_help_csv_mod.read_csv_rows
write_csv = merge_help_csv_mod.write_csv
cleanup_legacy_csvs = merge_help_csv_mod.cleanup_legacy_csvs
HEADER = merge_help_csv_mod.HEADER


SAMPLE_ROWS = [
    ["bmb", "", "bmad-bmb-module-init", "Install Module", "IM", "install", "", "Install BMad Builder.", "anytime", "", "", "false", "", "config", ""],
    ["bmb", "", "bmad-agent-builder", "Build Agent", "BA", "build-process", "", "Create an agent.", "anytime", "", "", "false", "output_folder", "agent skill", ""],
]


class TestExtractModuleCodes(unittest.TestCase):
    def test_extracts_codes(self):
        codes = extract_module_codes(SAMPLE_ROWS)
        self.assertEqual(codes, {"bmb"})

    def test_multiple_codes(self):
        rows = SAMPLE_ROWS + [
            ["cis", "", "cis-skill", "CIS Skill", "CS", "run", "", "A skill.", "anytime", "", "", "false", "", "", ""],
        ]
        codes = extract_module_codes(rows)
        self.assertEqual(codes, {"bmb", "cis"})

    def test_empty_rows(self):
        codes = extract_module_codes([])
        self.assertEqual(codes, set())


class TestFilterRows(unittest.TestCase):
    def test_removes_matching_rows(self):
        result = filter_rows(SAMPLE_ROWS, "bmb")
        self.assertEqual(len(result), 0)

    def test_preserves_non_matching_rows(self):
        mixed_rows = SAMPLE_ROWS + [
            ["cis", "", "cis-skill", "CIS Skill", "CS", "run", "", "A skill.", "anytime", "", "", "false", "", "", ""],
        ]
        result = filter_rows(mixed_rows, "bmb")
        self.assertEqual(len(result), 1)
        self.assertEqual(result[0][0], "cis")

    def test_no_match_preserves_all(self):
        result = filter_rows(SAMPLE_ROWS, "xyz")
        self.assertEqual(len(result), 2)


class TestReadWriteCSV(unittest.TestCase):
    def test_nonexistent_file_returns_empty(self):
        header, rows = read_csv_rows("/nonexistent/path/file.csv")
        self.assertEqual(header, [])
        self.assertEqual(rows, [])

    def test_round_trip(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            path = os.path.join(tmpdir, "test.csv")
            write_csv(path, HEADER, SAMPLE_ROWS)

            header, rows = read_csv_rows(path)
            self.assertEqual(len(rows), 2)
            self.assertEqual(rows[0][0], "bmb")
            self.assertEqual(rows[0][2], "bmad-bmb-module-init")

    def test_creates_parent_dirs(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            path = os.path.join(tmpdir, "sub", "dir", "test.csv")
            write_csv(path, HEADER, SAMPLE_ROWS)
            self.assertTrue(os.path.exists(path))


class TestEndToEnd(unittest.TestCase):
    def _write_source(self, tmpdir, rows):
        path = os.path.join(tmpdir, "source.csv")
        write_csv(path, HEADER, rows)
        return path

    def _write_target(self, tmpdir, rows):
        path = os.path.join(tmpdir, "target.csv")
        write_csv(path, HEADER, rows)
        return path

    def test_fresh_install_no_existing_target(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            source_path = self._write_source(tmpdir, SAMPLE_ROWS)
            target_path = os.path.join(tmpdir, "target.csv")

            # Target doesn't exist
            self.assertFalse(os.path.exists(target_path))

            # Simulate merge
            _, source_rows = read_csv_rows(source_path)
            source_codes = extract_module_codes(source_rows)
            write_csv(target_path, HEADER, source_rows)

            _, result_rows = read_csv_rows(target_path)
            self.assertEqual(len(result_rows), 2)

    def test_merge_into_existing_with_other_module(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            other_rows = [
                ["cis", "", "cis-skill", "CIS Skill", "CS", "run", "", "A skill.", "anytime", "", "", "false", "", "", ""],
            ]
            target_path = self._write_target(tmpdir, other_rows)
            source_path = self._write_source(tmpdir, SAMPLE_ROWS)

            # Read both
            _, target_rows = read_csv_rows(target_path)
            _, source_rows = read_csv_rows(source_path)
            source_codes = extract_module_codes(source_rows)

            # Anti-zombie filter + append
            filtered = target_rows
            for code in source_codes:
                filtered = filter_rows(filtered, code)
            merged = filtered + source_rows

            write_csv(target_path, HEADER, merged)

            _, result_rows = read_csv_rows(target_path)
            self.assertEqual(len(result_rows), 3)  # 1 cis + 2 bmb

    def test_anti_zombie_replaces_stale_entries(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            # Existing target has old bmb entries + cis entry
            old_bmb_rows = [
                ["bmb", "", "old-skill", "Old Skill", "OS", "run", "", "Old.", "anytime", "", "", "false", "", "", ""],
                ["bmb", "", "another-old", "Another", "AO", "run", "", "Old too.", "anytime", "", "", "false", "", "", ""],
            ]
            cis_rows = [
                ["cis", "", "cis-skill", "CIS Skill", "CS", "run", "", "A skill.", "anytime", "", "", "false", "", "", ""],
            ]
            target_path = self._write_target(tmpdir, old_bmb_rows + cis_rows)
            source_path = self._write_source(tmpdir, SAMPLE_ROWS)

            # Read both
            _, target_rows = read_csv_rows(target_path)
            _, source_rows = read_csv_rows(source_path)
            source_codes = extract_module_codes(source_rows)

            # Anti-zombie filter + append
            filtered = target_rows
            for code in source_codes:
                filtered = filter_rows(filtered, code)
            merged = filtered + source_rows

            write_csv(target_path, HEADER, merged)

            _, result_rows = read_csv_rows(target_path)
            # Should have 1 cis + 2 new bmb = 3 (old bmb removed)
            self.assertEqual(len(result_rows), 3)
            module_codes = [r[0] for r in result_rows]
            self.assertEqual(module_codes.count("bmb"), 2)
            self.assertEqual(module_codes.count("cis"), 1)
            # Old skills should be gone
            skill_names = [r[2] for r in result_rows]
            self.assertNotIn("old-skill", skill_names)
            self.assertNotIn("another-old", skill_names)


class TestCleanupLegacyCsvs(unittest.TestCase):
    def test_deletes_module_and_core_csvs(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            legacy_dir = os.path.join(tmpdir, "_bmad")
            for subdir in ("core", "bmb"):
                d = os.path.join(legacy_dir, subdir)
                os.makedirs(d)
                with open(os.path.join(d, "module-help.csv"), "w") as f:
                    f.write("header\nrow\n")

            deleted = cleanup_legacy_csvs(legacy_dir, "bmb")
            self.assertEqual(len(deleted), 2)
            self.assertFalse(os.path.exists(os.path.join(legacy_dir, "core", "module-help.csv")))
            self.assertFalse(os.path.exists(os.path.join(legacy_dir, "bmb", "module-help.csv")))
            # Directories still exist
            self.assertTrue(os.path.isdir(os.path.join(legacy_dir, "core")))
            self.assertTrue(os.path.isdir(os.path.join(legacy_dir, "bmb")))

    def test_leaves_other_module_csvs_alone(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            legacy_dir = os.path.join(tmpdir, "_bmad")
            for subdir in ("bmb", "cis"):
                d = os.path.join(legacy_dir, subdir)
                os.makedirs(d)
                with open(os.path.join(d, "module-help.csv"), "w") as f:
                    f.write("header\nrow\n")

            deleted = cleanup_legacy_csvs(legacy_dir, "bmb")
            self.assertEqual(len(deleted), 1)  # only bmb, not cis
            self.assertTrue(os.path.exists(os.path.join(legacy_dir, "cis", "module-help.csv")))

    def test_no_legacy_files_returns_empty(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            deleted = cleanup_legacy_csvs(tmpdir, "bmb")
            self.assertEqual(deleted, [])

    def test_handles_only_core_no_module(self):
        with tempfile.TemporaryDirectory() as tmpdir:
            legacy_dir = os.path.join(tmpdir, "_bmad")
            core_dir = os.path.join(legacy_dir, "core")
            os.makedirs(core_dir)
            with open(os.path.join(core_dir, "module-help.csv"), "w") as f:
                f.write("header\nrow\n")

            deleted = cleanup_legacy_csvs(legacy_dir, "bmb")
            self.assertEqual(len(deleted), 1)
            self.assertFalse(os.path.exists(os.path.join(core_dir, "module-help.csv")))


if __name__ == "__main__":
    unittest.main()