This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
mxdev is a Python utility that enables managing Python projects with multiple interdependent packages where only a subset needs to be developed locally. It operates as a preprocessor that orchestrates VCS checkouts and generates combined requirements/constraints files for pip installation.
Key principle: mxdev does NOT run pip - it prepares requirements and constraints files for pip to use.
```bash
# Install development environment with all dependencies
make install
# This creates a virtual environment in .venv/ and installs:
# - mxdev in editable mode
# - All test dependencies
# - All development tools

# Activate the virtual environment
source .venv/bin/activate
```

IMPORTANT: Tests must run from the activated virtual environment.

```bash
source .venv/bin/activate

# Run all tests
pytest

# Run specific test file
pytest tests/test_git.py

# Run specific test function
pytest tests/test_git.py::test_function_name

# Run tests with verbose output
pytest -v

# Or use the Makefile (automatically uses venv)
make test
```

Coverage is automatically collected when running tests via tox or CI.
```bash
# Run tests with coverage
source .venv/bin/activate
coverage run -m pytest

# View terminal report with missing lines
coverage report --show-missing

# Generate and view HTML coverage report
coverage html
open htmlcov/index.html              # macOS
# or: xdg-open htmlcov/index.html    # Linux
# or: start htmlcov/index.html       # Windows

# Or use Makefile shortcuts (defined in include.mk)
make coverage       # Run tests + combine + show terminal report
make coverage-html  # Run tests + combine + open HTML report
```

Note: Coverage targets are defined in include.mk, which is included by the mxmake-generated Makefile and preserved during mxmake updates.
Coverage is automatically collected and combined from all matrix test runs in GitHub Actions:

Process:
- Each test job (Python 3.10-3.14, Ubuntu/Windows/macOS) uploads its `.coverage.*` file as an artifact
- A dedicated `coverage` job downloads all artifacts
- Coverage is combined using `coverage combine`
- Reports are generated:
  - Terminal report added to GitHub Actions summary
  - HTML report uploaded as downloadable artifact
- CI fails if combined coverage falls below 35% (baseline threshold)
To view coverage from CI:
- Go to the Actions tab in GitHub
- Click on the workflow run
- Scroll down to the Artifacts section
- Download the `html-coverage-report` artifact
- Unzip and open `htmlcov/index.html` in a browser
To adjust the coverage threshold:
Edit `.github/workflows/tests.yaml` and change the `--fail-under=35` value in the "Fail if coverage is below threshold" step.
Note: The threshold is currently set to 35% as a baseline and should be gradually increased as test coverage improves.
Coverage settings are in pyproject.toml under the `[tool.coverage.*]` sections:
- `[tool.coverage.run]`: Source paths, branch coverage, parallel mode, file patterns
- `[tool.coverage.paths]`: Path mapping for combining coverage across environments
- `[tool.coverage.report]`: Excluded lines, precision, display options
- `[tool.coverage.html]`: HTML output directory

Key settings:
- `parallel = true` - Allows multiple test runs without overwriting data
- `relative_files = true` - Required for combining coverage across different OSes
- `branch = true` - Measures branch coverage (not just line coverage)
- Excludes: tests/, _version.py, defensive code patterns
### Code Quality
```bash
# Run all pre-commit hooks (using uvx with tox-uv)
uvx --with tox-uv tox -e lint
# Run ruff linter (with auto-fix)
uvx ruff check --fix src/mxdev tests
# Run ruff formatter
uvx ruff format src/mxdev tests
# Sort imports with isort
uvx isort src/mxdev tests
# Run type checking
uvx mypy src/mxdev
# Run all pre-commit hooks manually
uvx pre-commit run --all-files
# Run tests on all supported Python versions (Python 3.10-3.14)
# This uses uvx to run tox with tox-uv plugin for much faster testing (10-100x speedup)
uvx --with tox-uv tox
# Run on specific Python version
uvx --with tox-uv tox -e py311
# Run multiple environments in parallel
uvx --with tox-uv tox -p auto
# Run with extra pytest arguments
uvx --with tox-uv tox -e py312 -- -v -k test_git
# Use a specific Python version with uvx (uv will auto-install if needed)
# Use a specific Python version with uvx (uv will auto-install if needed)
uvx --python 3.11 --with tox-uv tox -e py311
```

Note:
- This project uses `uvx tox` instead of a globally installed tox - no installation required!
- tox configuration is defined in pyproject.toml under `[tool.tox]`
- The `tox-uv` plugin provides a 10-100x speedup over traditional pip/virtualenv
- You can create a shell alias for convenience: `alias tox='uvx --with tox-uv tox'`
```bash
# Default: reads mx.ini, fetches sources, generates requirements/constraints
mxdev

# Custom config file
mxdev -c custom.ini

# Skip fetching (useful for offline work)
mxdev -n

# Fetch only (don't generate output files)
mxdev -f

# Offline mode (no VCS operations)
mxdev -o

# Control parallelism
mxdev -t 8  # Use 8 threads for fetching

# Verbose output
mxdev -v

# Silent mode
mxdev -s
```

The codebase follows a three-phase pipeline:
1. Read Phase (`processing.py:read`)
   - Parses `mx.ini` configuration
   - Recursively reads requirements and constraints files
   - Supports both local files and HTTP(S) URLs
   - Calls `read_hooks()` for extensibility
2. Fetch Phase (`processing.py:fetch`)
   - Checks out VCS sources into target directories
   - Uses multi-threaded queue-based workers for parallel operations
   - Supports Git, SVN, Mercurial, Bazaar, Darcs, and local filesystem
   - Controlled by the `threads` setting (default: 4)
3. Write Phase (`processing.py:write`)
   - Generates a modified requirements file (packages from source as `-e`)
   - Generates a modified constraints file (developed packages commented out)
   - Applies version overrides
   - Calls `write_hooks()` for extensibility
main.py - CLI entry point
- Argument parsing and validation
- Orchestrates read → fetch → write workflow
config.py - Configuration management
- `Configuration` class: parses INI files with `ExtendedInterpolation`
- Main sections: `[settings]`, package sections, hook sections
- Validates install-mode, version overrides, and package settings
state.py - Application state container
- Immutable dataclass holding `Configuration`, `requirements`, `constraints`
- Passed through the entire pipeline
processing.py - Core business logic
- `process_line()`: Handles individual requirement lines, comments out developed packages
- `resolve_dependencies()`: Recursively processes `-c` and `-r` references
- File/URL resolution with proper path handling
vcs/ - VCS abstraction layer
- `BaseWorkingCopy`: Abstract base class with `checkout()`, `update()`, `status()`, `matches()`
- `WorkingCopies`: Orchestrates multiple VCS operations with threading
- Entry points-based plugin system for VCS types
- Git (production stable): Full support including submodules, shallow clones, branch/tag checkout
- fs (stable): Local filesystem pseudo-VCS
- svn, hg, bzr, darcs (unstable): Legacy VCS support
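The shape of this abstraction can be sketched as a small ABC. This is an illustrative sketch only, not the actual vcs/common.py code: the method names follow the list above, but the constructor signature, return types, and the `DummyWorkingCopy` subclass are assumptions for demonstration.

```python
from abc import ABC, abstractmethod


class BaseWorkingCopy(ABC):
    """Illustrative sketch of the VCS base class (signatures are assumed)."""

    def __init__(self, source: dict):
        # A package section from mx.ini (url, branch, ...)
        self.source = source

    @abstractmethod
    def checkout(self) -> None:
        """Clone the repository into the target directory."""

    @abstractmethod
    def update(self) -> None:
        """Update an existing checkout."""

    @abstractmethod
    def status(self) -> str:
        """Report whether the working copy is clean or dirty."""

    @abstractmethod
    def matches(self) -> bool:
        """Check that the checkout matches the configured URL/branch."""


class DummyWorkingCopy(BaseWorkingCopy):
    """Minimal concrete implementation, used only for demonstration."""

    def checkout(self) -> None:
        pass

    def update(self) -> None:
        pass

    def status(self) -> str:
        return "clean"

    def matches(self) -> bool:
        return True


wc = DummyWorkingCopy({"url": "git+https://example.com/repo.git"})
print(wc.status())  # a real subclass would inspect the working tree
```

New VCS types plug in by subclassing the base and registering an entry point (see "Adding a New VCS Type" below).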
hooks.py - Extensibility system
- `Hook` base class with `namespace`, `read(state)`, `write(state)` methods
- Loaded via `[project.entry-points.mxdev]` in pyproject.toml
- Settings isolated by namespace to avoid conflicts
including.py - Recursive INI inclusion
- `read_with_included()`: Handles the `include` directive in `[settings]`
- Supports local files and HTTP(S) URLs
- Relative path resolution from the parent file/URL
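The relative-resolution behaviour can be demonstrated with the standard library alone. This sketch uses `urllib.parse.urljoin` and `pathlib`; the function name is hypothetical and the actual logic lives in src/mxdev/including.py.

```python
from pathlib import Path
from urllib.parse import urljoin, urlparse


def resolve_include(parent: str, include: str) -> str:
    """Resolve an include target relative to its parent file or URL.

    Illustrative only; the real resolution is implemented in including.py.
    """
    # Absolute URL includes pass through untouched
    if urlparse(include).scheme in ("http", "https"):
        return include
    # Parent is itself a URL: resolve relative to it
    if urlparse(parent).scheme in ("http", "https"):
        return urljoin(parent, include)
    # Both are local paths: resolve relative to the parent's directory
    return (Path(parent).parent / include).as_posix()


print(resolve_include("https://example.com/cfg/base.ini", "shared.ini"))
# → https://example.com/cfg/shared.ini
print(resolve_include("project/mx.ini", "common/shared.ini"))
# → project/common/shared.ini
```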
- Abstract Base Class: VCS abstraction with pluggable implementations
- Factory/Registry: Entry points for VCS types and hooks
- Producer-Consumer: Queue-based threading for parallel VCS operations
- Immutable State: The `State` dataclass prevents mutation bugs
- Dependency Injection: Configuration and hooks passed through state
- Minimal dependencies: Only `packaging` at runtime - no requests, no YAML parsers
- Standard library first: Uses `configparser`, `urllib`, `threading` instead of third-party libs
- No pip invocation: mxdev generates files; users run pip separately
- Backward compatibility: Supports Python 3.10+ with version detection for Git commands
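The producer-consumer pattern used for parallel checkouts can be sketched with `queue` and `threading` from the standard library. This is illustrative only; the real orchestration lives in the vcs/ package, and the "checked out" result strings are stand-ins for actual VCS operations.

```python
import queue
import threading

# Jobs to process (in mxdev these would be VCS checkouts)
jobs: queue.Queue[str] = queue.Queue()
results: list[str] = []
lock = threading.Lock()


def worker() -> None:
    """Consume package names from the queue until it is drained."""
    while True:
        try:
            name = jobs.get_nowait()
        except queue.Empty:
            return
        # A real worker would run `git clone` / `git pull` here
        with lock:
            results.append(f"checked out {name}")
        jobs.task_done()


for pkg in ["package1", "package2", "package3"]:
    jobs.put(pkg)

# mxdev's default worker count is 4 (the `threads` setting)
threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results))
```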
mxdev uses INI files with `configparser.ExtendedInterpolation` syntax.

```ini
[settings]
github = git+ssh://git@github.com/

[mypackage]
url = ${settings:github}org/mypackage.git
```

Develop multiple packages with version overrides:
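The interpolation behaviour can be verified with the standard library alone; this snippet is illustrative and does not invoke mxdev itself.

```python
from configparser import ConfigParser, ExtendedInterpolation

ini = """
[settings]
github = git+ssh://git@github.com/

[mypackage]
url = ${settings:github}org/mypackage.git
"""

# mxdev parses mx.ini with ExtendedInterpolation, so ${section:option}
# references are expanded across sections.
parser = ConfigParser(interpolation=ExtendedInterpolation())
parser.read_string(ini)
print(parser["mypackage"]["url"])
# → git+ssh://git@github.com/org/mypackage.git
```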
```ini
[settings]
requirements-in = requirements.txt
requirements-out = requirements-mxdev.txt
constraints-out = constraints-mxdev.txt
version-overrides =
    somepackage==3.0.0
ignores =
    main-package-name
main-package = -e .[test]

[package1]
url = git+https://github.com/org/package1.git
branch = feature-branch
extras = test
install-mode = editable

[package2]
url = git+https://github.com/org/package2.git
branch = main
install-mode = fixed

[package3]
url = git+https://github.com/org/package3.git
install-mode = skip
```

Install mode options:
- `editable` (default): Installs with the `-e` prefix for development
- `fixed`: Installs without the `-e` prefix for production/Docker deployments
- `skip`: Only clones, doesn't install
- `direct`: Deprecated alias for `editable` (logs a warning)
Using includes for shared configurations:

```ini
[settings]
include = https://example.com/shared.ini
# Settings here override included settings
```

Shallow clones (faster checkouts):

```ini
[package]
url = ...
depth = 1
```

Or set globally:

```bash
export GIT_CLONE_DEPTH=1
```

Submodule handling:
- `always` (default): Always checkout/update submodules
- `checkout`: Only fetch during the initial checkout
- `recursive`: Use `--recurse-submodules` for nested submodules
- Tests are colocated with source: tests/
- Fixtures in conftest.py
- Test utilities in utils.py

Fixtures:
- `tempdir`: Temporary working directory
- `mkgitrepo`: Factory for creating test Git repositories
- `develop`: Mock Develop instance for simulating development environments
- `httpretty`: HTTP mocking for URL-based tests

Key test files:
- test_git.py: Git VCS operations
- test_git_submodules.py: Comprehensive submodule scenarios
- test_including.py: INI file inclusion
- test_common.py: VCS abstraction utilities
When adding VCS functionality, follow existing patterns:
- Use the `mkgitrepo` fixture to create test repositories
- Create a `develop` instance with mock configuration
- Test both initial checkout and update scenarios
- Verify output in requirements/constraints files
To create an mxdev extension:

1. Create a `Hook` subclass:

```python
from mxdev import Hook, State


class MyExtension(Hook):
    namespace = "myext"  # Prefix for all settings

    def read(self, state: State) -> None:
        """Called after the read phase."""
        # Access config: state.configuration.settings
        # Access packages: state.configuration.packages
        # Access hook config: state.configuration.hooks

    def write(self, state: State) -> None:
        """Called after the write phase."""
        # Generate additional files, scripts, etc.
```

2. Register it as an entry point in pyproject.toml:

```toml
[project.entry-points.mxdev]
myext = "mypackage:MyExtension"
```

3. Add namespaced config to mx.ini:

```ini
[settings]
myext-global_setting = value

[myext-section]
specific_setting = value

[somepackage]
myext-package_setting = value
```

- Formatting: Ruff formatter (max line length: 120, target Python 3.10+)
  - Configured in pyproject.toml under `[tool.ruff]`
  - Rules: E, W, F, UP, D (with selective ignores for docstrings)
  - Automatically enforced via pre-commit hooks
- Import sorting: isort with the plone profile, `force_alphabetical_sort = true`, `force_single_line = true`
  - Configured in pyproject.toml under `[tool.isort]`
  - Runs after ruff in the pre-commit pipeline
- Type hints: Use throughout (Python 3.10+ syntax)
  - Use `X | Y` instead of `Union[X, Y]`
  - Use `list[T]`, `dict[K, V]` instead of `List[T]`, `Dict[K, V]`
- Path handling: Prefer `pathlib.Path` over `os.path` for path operations
  - Use `pathlib.Path().as_posix()` for cross-platform path comparison
  - Use the `/` operator for path joining: `Path("dir") / "file.txt"`
  - Only use `os.path.join()` in production code where needed for compatibility
- Logging: Use `logger = logging.getLogger("mxdev")` from logging.py
- Docstrings: Document public APIs and complex logic
The project uses GitHub Actions for continuous integration, configured in `.github/workflows/tests.yaml`.

Lint Job:
- Runs on: `ubuntu-latest`
- Uses: `uvx --with tox-uv tox -e lint`
- Executes pre-commit hooks

Test Job:
- Matrix testing across:
  - Python versions: 3.10, 3.11, 3.12, 3.13, 3.14
  - Operating systems: Ubuntu, Windows, macOS
  - Total: 15 combinations (5 Python × 3 OS)
- Uses: `uvx --with tox-uv tox -e py{version}`
- Leverages the `astral-sh/setup-uv@v7` action for uv installation
- No pip caching needed: uv handles caching automatically and efficiently
- Fast execution: uv's parallel installation and caching dramatically speeds up CI
- Python auto-installation: `uv python install` automatically downloads required Python versions
- Unified tooling: The same `uvx --with tox-uv tox` command is used locally and in CI
When adding new Python versions:
- Add to the `python-config` matrix in `.github/workflows/tests.yaml`
- Add the corresponding environment to `env_list` in the pyproject.toml `[tool.tox]` section
- Update `requires-python` and classifiers in pyproject.toml if needed
- Create a module in src/mxdev/vcs/ (e.g., `newvcs.py`)
- Subclass `BaseWorkingCopy` from vcs/common.py
- Implement: `checkout()`, `status()`, `matches()`, `update()`
- Register it in pyproject.toml under `[project.entry-points."mxdev.workingcopytypes"]`
- Add tests following the test_git.py pattern
The core logic is in processing.py:
- `process_line()`: Handles line-by-line processing
- Look for `# -> mxdev disabled` comments - this is how packages are marked as developed from source
- Version overrides are applied during constraint processing
- Ignore lists prevent certain packages from appearing in output
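The commenting-out behaviour can be sketched as follows. This is an illustrative simplification, not the actual `process_line()` implementation: the helper name, the name-extraction logic, and the exact placement of the marker comment are assumptions (only the `# -> mxdev disabled` marker text comes from the description above).

```python
def disable_developed(line: str, developed: set[str]) -> str:
    """Comment out a constraint line if its package is developed from source.

    Illustrative sketch of mxdev's write phase; the real implementation
    is process_line() in src/mxdev/processing.py.
    """
    # Extract the package name from a pin like "somepackage==1.2.3"
    name = line.split("==")[0].strip().lower()
    if name in developed:
        # Developed packages are installed editable from their checkout,
        # so the original pin is disabled rather than removed.
        return f"# {line}  # -> mxdev disabled"
    return line


developed = {"package1"}
print(disable_developed("package1==1.0.0", developed))
print(disable_developed("otherpackage==2.0.0", developed))
```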
- Add a default in `SETTINGS_DEFAULTS` in config.py
- Access it via the `configuration.settings` dictionary
- Document it in README.md under the appropriate section
- Add validation if needed in `Configuration.__post_init__()`
mxdev uses Hatchling as its build backend with the following plugins:

- Version source: Git tags
  - No manual version bumps needed in code
  - Version is automatically derived from git tags during build
  - Development versions get the format: `4.1.1.dev3+g1234abc`
  - Tags must follow the format: `vX.Y.Z` (e.g., `v4.2.0`)
- Concatenates multiple markdown files for the PyPI long description
  - Combines: README.md + CONTRIBUTING.md + CHANGES.md + LICENSE.md
  - Adds section headers and separators automatically
- Source layout: `src/mxdev/`
- Auto-generated version file: `src/mxdev/_version.py` (in .gitignore)
- Uses `.gitignore` for file inclusion (no MANIFEST.in needed)
- PEP 420 namespace packages: `src/mxdev/vcs/` has no `__init__.py` (implicit namespace package)
```bash
# Clean build artifacts
rm -rf dist/ build/

# Build (requires git tags to determine version)
uv tool run --from build pyproject-build

# Or with python -m build
python -m build

# Check package quality
uv tool run --from twine twine check dist/*
```

Important: The version comes from git tags. If building from a commit without a tag, you'll get a development version like `4.1.1.dev3+g1234abc`.
See RELEASE.md for complete release documentation.
Quick summary:
1. Update CHANGES.md with release notes
2. Commit and push to main
3. Create a GitHub Release with tag `vX.Y.Z`
4. GitHub Actions automatically builds and publishes to PyPI
Key points:
- ✅ Version automatically set from git tag (no manual edit needed)
- ✅ GitHub Actions handles building and publishing
- ✅ All tests must pass before publishing
- ✅ See RELEASE.md for detailed workflow
CRITICAL: Always follow these steps before pushing code.

IMPORTANT: All bug fixes MUST follow the TDD (Test-Driven Development) approach:

1. Analysis
   - Investigate and understand the root cause
   - Identify the exact location and nature of the bug
   - Document your findings
2. Comment on the issue with the analysis only
   ```bash
   gh issue comment <ISSUE_NUMBER> --body "Root cause analysis..."
   ```
   - Post a detailed root cause analysis to the GitHub issue
   - Do NOT include the solution or plan in the comment
   - Include code references, line numbers, and an explanation
3. Create a failing test
   - Create a branch: `git checkout -b fix/<issue-number>-description`
   - Write a test that reproduces the bug
   - Verify the test fails with the current code
   - Commit: `git commit -m "Add failing test for issue #XX"`
4. Push and create a draft PR
   ```bash
   git push -u origin fix/<issue-number>-description
   gh pr create --draft --title "..." --body "..."
   ```
   - The PR body should explain the bug, show the failing test, and describe next steps
5. Implement the fix
   - Write the minimal code needed to make the test pass
   - Verify the test now passes
   - Run all related tests to ensure no regressions
6. Commit and push the fix
   ```bash
   git add <files>
   git commit -m "Fix issue #XX: description"
   git push
   ```
   - Include the issue reference in the commit message
   - Update CHANGES.md in the same commit or separately
7. Verify CI is green
   ```bash
   gh pr checks <PR_NUMBER>
   ```
   - Wait for all CI checks to pass
   - Address any failures
8. Mark the PR ready for review
   ```bash
   gh pr ready <PR_NUMBER>
   ```
   - Only mark ready when all checks are green
   - Update the PR description if needed

Why TDD for bug fixes?
- Ensures the bug is actually fixed
- Prevents regressions in the future
- Documents the expected behavior
- Provides confidence in the solution
1. Always run linting before pushing
   ```bash
   uvx --with tox-uv tox -e lint
   ```
   - This runs ruff, mypy, and other code quality checks via pre-commit
   - Fix any issues before committing
   - Commit formatting changes separately if needed
2. Always update CHANGES.md
   - Add an entry under the "## X.X.X (unreleased)" section
   - Format: `- Fix #XX: Description. [author]`
   - Create the unreleased section if it doesn't exist
   - Include the issue number when applicable
3. Always check and update documentation
   - README.md: Update configuration tables, usage examples, or feature descriptions
   - EXTENDING.md: Update if the hooks API changed
   - RELEASE.md: Update if the release process changed
   - Check if new configuration options need documentation
   - Check if new features need usage examples
   - Update any affected sections (don't just append)
   - MANDATORY: After any code change that adds/modifies features or configuration, verify documentation is updated
4. Run relevant tests locally
   ```bash
   source .venv/bin/activate
   pytest tests/test_*.py -v
   ```
5. Check CI status before marking the PR ready
   ```bash
   gh pr checks <PR_NUMBER>
   ```
   - Wait for all checks to pass
   - Address any failures before requesting review
```bash
# 1. Make changes to code

# 2. Run linting
uvx --with tox-uv tox -e lint

# 3. Fix any linting issues and commit if changes were made
git add .
git commit -m "Apply formatting fixes"

# 4. Run tests
source .venv/bin/activate
pytest -v

# 5. Update CHANGES.md
# Edit CHANGES.md to add an entry

# 6. Commit everything
git add .
git commit -m "Fix issue description"

# 7. Push
git push

# 8. Check CI
gh pr checks <PR_NUMBER>

# 9. When green, mark the PR ready for review
```

Branching:
- Main branch: `main`
- Create feature branches from `main`
- Create pull requests for all changes
- Ensure tests pass before merging
- Always lint before pushing (see the Pre-Push Checklist above)
- Always update CHANGES.md for user-facing changes
IMPORTANT: Do NOT include Claude Code attribution in commit messages. Commit messages should be written as if by a human developer.
Bad (don't do this):

```
Fix #70: Implement HTTP caching

🤖 Generated with Claude Code

Co-Authored-By: Claude <noreply@anthropic.com>
```

Good (correct format):

```
Fix #70: Implement HTTP caching for offline mode

Previously, offline mode only skipped VCS operations but still
fetched HTTP-referenced requirements/constraints files.

Changes:
- Add HTTP content caching to .mxdev_cache/ directory
- Online mode: fetch from HTTP and cache for future use
- Offline mode: read from cache, error if not cached

All 190 tests pass, including 5 new HTTP caching tests.
```
Test fixture files in tests/data/requirements/ contain pinned package versions that can trigger Dependabot security alerts, even though they're not real dependencies.
Current Setup (Auto-Triage Rule):
A Dependabot auto-triage rule is configured via the GitHub UI to automatically dismiss alerts from test fixtures:
- Rule name: "Dismiss test fixture alerts"
- Manifest filter: Comma-separated list of test fixture files:
  `tests/data/requirements/constraints.txt`, `tests/data/requirements/basic_requirements.txt`, `tests/data/requirements/nested_requirements.txt`, `tests/data/requirements/other_requirements.txt`, `tests/data/requirements/requirements_with_constraints.txt`
- Action: Dismiss indefinitely
- Location: GitHub Settings → Code security → Dependabot rules
How It Works:
Three separate GitHub systems handle dependency management:

1. GitHub Linguist (`linguist-vendored` in `.gitattributes`)
   - Only affects language statistics
   - Does NOT affect the dependency graph or Dependabot
2. Dependency Graph (vendored directory detection)
   - Uses hardcoded regex patterns to identify vendored directories:
     `(3rd|[Tt]hird)[-_]?[Pp]arty/`, `(^|/)vendors?/`, `(^|/)[Ee]xtern(als?)?/`
   - `tests/data/` does NOT match these patterns
   - Files in vendored directories are excluded from the dependency graph
3. Dependabot Auto-Triage Rules
   - This is the ONLY way to suppress security alerts for specific directories
   - Can target by: manifest path, severity, package name, scope, ecosystem, CVE, CWE, GHSA, EPSS
   - Rules are configured via the GitHub UI (not version-controlled)
   - Supports comma-separated manifest paths (no wildcards)
Key Limitations:
- ❌ Wildcards are NOT supported in manifest paths (e.g., `tests/data/**` doesn't work)
- ❌ Must specify exact file paths
- ❌ Configuration lives in the GitHub UI, not in repository files
- ✅ Can combine multiple paths with commas
Adding New Test Fixtures:
If you add a new test fixture file with pinned dependencies (e.g., tests/data/requirements/new_fixture.txt):
- Go to GitHub Settings → Code security → Dependabot rules
- Edit the "Dismiss test fixture alerts" rule
- Add the new path to the comma-separated manifest list
- Save the rule
Alternative Approaches (Don't Use):
- ❌ `.gitattributes` with `linguist-vendored` → Only affects language stats, not Dependabot
- ❌ Renaming `tests/data/` to `tests/vendor/` → Breaking change, misleading name
- ❌ `exclude-paths` in `.github/dependabot.yml` → Only affects version update PRs, NOT security alerts
- ✅ Auto-triage rules → The correct solution for security alerts
The .mxdev_cache/ directory stores HTTP-referenced requirements/constraints files for offline use:
- Online mode: Content fetched from HTTP is automatically cached
- Offline mode (`-o`/`--offline`): Content is read from the cache; errors if not cached
- Cache key: SHA256 hash of the URL (first 16 hex chars)
- Location: `.mxdev_cache/` (in .gitignore)
Cache Files:

```
.mxdev_cache/
    a1b2c3d4e5f60718      # Cached content (first 16 hex chars of the URL's SHA256)
    a1b2c3d4e5f60718.url  # Original URL (for debugging)
```
Implementation Details:
- Cache functions: `_get_cache_key()`, `_cache_http_content()`, `_read_from_cache()`
- See `src/mxdev/processing.py` for the implementation
- Tests in `tests/test_processing.py` (5 comprehensive caching tests)
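Given the documented rule (cache key = first 16 hex characters of the URL's SHA256), a minimal sketch of the key derivation might look like this. The function name mirrors but does not reproduce the real `_get_cache_key()` in processing.py, and the UTF-8 encoding step is an assumption.

```python
import hashlib


def get_cache_key(url: str) -> str:
    """Derive a cache file name from a URL.

    Sketch of mxdev's documented rule: the key is the first 16 hex
    characters of the SHA256 hash of the URL.
    """
    return hashlib.sha256(url.encode("utf-8")).hexdigest()[:16]


key = get_cache_key("https://example.com/constraints.txt")
print(key, len(key))  # a stable 16-character hex string
```

Because the key is deterministic, repeated runs in online mode overwrite the same cache file rather than accumulating duplicates.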
- Python: 3.10+
- pip: 23+ (required for proper operation)
- Runtime dependencies: Only `packaging`
- VCS tools: Install git, svn, hg, bzr, and darcs as needed for VCS operations