CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

Project Overview

mxdev is a Python utility that enables managing Python projects with multiple interdependent packages where only a subset needs to be developed locally. It operates as a preprocessor that orchestrates VCS checkouts and generates combined requirements/constraints files for pip installation.

Key principle: mxdev does NOT run pip - it prepares requirements and constraints files for pip to use.

Development Commands

Installation

# Install development environment with all dependencies
make install

# This creates a virtual environment in .venv/ and installs:
# - mxdev in editable mode
# - All test dependencies
# - All development tools

# Activate the virtual environment
source .venv/bin/activate

Testing

# IMPORTANT: Tests must run from the activated virtual environment
source .venv/bin/activate

# Run all tests
pytest

# Run specific test file
pytest tests/test_git.py

# Run specific test function
pytest tests/test_git.py::test_function_name

# Run tests with verbose output
pytest -v

# Or use the Makefile (automatically uses venv)
make test

Coverage

Coverage is automatically collected when running tests via tox or CI.

Local Coverage Reporting

# Run tests with coverage
source .venv/bin/activate
coverage run -m pytest

# View terminal report with missing lines
coverage report --show-missing

# Generate and view HTML coverage report
coverage html
open htmlcov/index.html  # macOS
# or: xdg-open htmlcov/index.html  # Linux
# or: start htmlcov/index.html  # Windows

# Or use Makefile shortcuts (defined in include.mk)
make coverage          # Run tests + combine + show terminal report
make coverage-html     # Run tests + combine + open HTML report

Note: Coverage targets are defined in include.mk, which is included by the mxmake-generated Makefile and preserved during mxmake updates.

CI Coverage

Coverage is automatically collected and combined from all matrix test runs in GitHub Actions:

Process:

  1. Each test job (Python 3.10-3.14, Ubuntu/Windows/macOS) uploads its .coverage.* file as an artifact
  2. A dedicated coverage job downloads all artifacts
  3. Coverage is combined using coverage combine
  4. Reports are generated:
    • Terminal report added to GitHub Actions summary
    • HTML report uploaded as downloadable artifact
  5. CI fails if combined coverage falls below 35% (baseline threshold)

To view coverage from CI:

  1. Go to Actions tab in GitHub
  2. Click on the workflow run
  3. Scroll down to Artifacts section
  4. Download the html-coverage-report artifact
  5. Unzip and open htmlcov/index.html in a browser

To adjust coverage threshold: Edit .github/workflows/tests.yaml and change the --fail-under=35 value in the "Fail if coverage is below threshold" step.

Note: The threshold is currently set to 35% as a baseline. This should be gradually increased as test coverage improves.

Coverage Configuration

Coverage settings are in pyproject.toml under [tool.coverage.*] sections:

  • [tool.coverage.run]: Source paths, branch coverage, parallel mode, file patterns
  • [tool.coverage.paths]: Path mapping for combining coverage across environments
  • [tool.coverage.report]: Excluded lines, precision, display options
  • [tool.coverage.html]: HTML output directory

Key settings:

  • parallel = true - Allows multiple test runs without overwriting data
  • relative_files = true - Required for combining coverage across different OSes
  • branch = true - Measures branch coverage (not just line coverage)
  • Excludes: tests/, _version.py, defensive code patterns
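
A matching fragment of the [tool.coverage.run] table might look like the following. This is illustrative only: the option values mirror the key settings listed above, but the exact source paths and omit patterns are assumptions, so check pyproject.toml for the authoritative configuration.

```toml
# Illustrative only -- the real values live in pyproject.toml.
[tool.coverage.run]
source = ["src"]          # assumption: actual source paths may differ
branch = true             # measure branch coverage, not just lines
parallel = true           # write .coverage.* files per run, combine later
relative_files = true     # needed to combine data across OSes in CI
omit = ["tests/*", "*/_version.py"]  # matches the documented excludes
```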

Code Quality

# Run all pre-commit hooks (using uvx with tox-uv)
uvx --with tox-uv tox -e lint

# Run ruff linter (with auto-fix)
uvx ruff check --fix src/mxdev tests

# Run ruff formatter
uvx ruff format src/mxdev tests

# Sort imports with isort
uvx isort src/mxdev tests

# Run type checking
uvx mypy src/mxdev

# Run all pre-commit hooks manually
uvx pre-commit run --all-files

Testing Multiple Python Versions (using uvx tox with uv)

# Run tests on all supported Python versions (Python 3.10-3.14)
# This uses uvx to run tox with tox-uv plugin for much faster testing (10-100x speedup)
uvx --with tox-uv tox

# Run on specific Python version
uvx --with tox-uv tox -e py311

# Run multiple environments in parallel
uvx --with tox-uv tox -p auto

# Run with extra pytest arguments
uvx --with tox-uv tox -e py312 -- -v -k test_git

# Use a specific Python version with uvx (uv will auto-install if needed)
uvx --python 3.11 --with tox-uv tox -e py311

Note:

  • This project uses uvx tox instead of globally installed tox - no installation required!
  • tox configuration is defined in pyproject.toml under [tool.tox]
  • The tox-uv plugin provides 10-100x speedup over traditional pip/virtualenv
  • You can create a shell alias for convenience: alias tox='uvx --with tox-uv tox'

Running mxdev

# Default: reads mx.ini, fetches sources, generates requirements/constraints
mxdev

# Custom config file
mxdev -c custom.ini

# Skip fetching (useful for offline work)
mxdev -n

# Fetch only (don't generate output files)
mxdev -f

# Offline mode (no VCS operations)
mxdev -o

# Control parallelism
mxdev -t 8  # Use 8 threads for fetching

# Verbose output
mxdev -v

# Silent mode
mxdev -s

Architecture

Core Workflow (Read → Fetch → Write)

The codebase follows a three-phase pipeline:

  1. Read Phase (processing.py:read)

    • Parses mx.ini configuration
    • Recursively reads requirements and constraints files
    • Supports both local files and HTTP(S) URLs
    • Calls read_hooks() for extensibility
  2. Fetch Phase (processing.py:fetch)

    • Checks out VCS sources into target directories
    • Uses multi-threaded queue-based workers for parallel operations
    • Supports Git, SVN, Mercurial, Bazaar, Darcs, and local filesystem
    • Controlled by threads setting (default: 4)
  3. Write Phase (processing.py:write)

    • Generates modified requirements file (packages from source as -e)
    • Generates modified constraints file (developed packages commented out)
    • Applies version overrides
    • Calls write_hooks() for extensibility
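
The three phases can be sketched as a minimal runnable pipeline. Everything here is illustrative: the real functions live in src/mxdev/processing.py, operate on a State object, and do far more; the dictionary shapes and the "sources" target directory are assumptions made up for this sketch.

```python
# Minimal sketch of the read -> fetch -> write pipeline described above.

def read(path: str) -> dict:
    """Phase 1: parse config, collect requirements/constraints (stubbed here)."""
    return {"requirements": ["pkg==1.0"], "packages": {"pkg": {"target": "sources"}}}

def fetch(state: dict) -> None:
    """Phase 2: would run threaded VCS checkouts into target dirs (no-op here)."""

def write(state: dict) -> list[str]:
    """Phase 3: developed packages become editable (-e) install entries."""
    return [f"-e ./{cfg['target']}/{name}" for name, cfg in state["packages"].items()]

state = read("mx.ini")
fetch(state)
print(write(state))  # ['-e ./sources/pkg']
```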

Key Modules

main.py - CLI entry point

  • Argument parsing and validation
  • Orchestrates read → fetch → write workflow

config.py - Configuration management

  • Configuration class: parses INI files with ExtendedInterpolation
  • Main sections: [settings], package sections, hook sections
  • Validates install-mode, version overrides, and package settings

state.py - Application state container

  • Immutable dataclass holding Configuration, requirements, constraints
  • Passed through the entire pipeline

processing.py - Core business logic

  • process_line(): Handles individual requirement lines, comments out developed packages
  • resolve_dependencies(): Recursively processes -c and -r references
  • File/URL resolution with proper path handling

vcs/ - VCS abstraction layer

  • BaseWorkingCopy: Abstract base class with checkout(), update(), status(), matches()
  • WorkingCopies: Orchestrates multiple VCS operations with threading
  • Entry points-based plugin system for VCS types
  • Git (production stable): Full support including submodules, shallow clones, branch/tag checkout
  • fs (stable): Local filesystem pseudo-VCS
  • svn, hg, bzr, darcs (unstable): Legacy VCS support

hooks.py - Extensibility system

  • Hook base class with namespace, read(state), write(state) methods
  • Loaded via [project.entry-points.mxdev] in pyproject.toml
  • Settings isolated by namespace to avoid conflicts

including.py - Recursive INI inclusion

  • read_with_included(): Handles include directive in [settings]
  • Supports local files and HTTP(S) URLs
  • Relative path resolution from parent file/URL

Design Patterns

  • Abstract Base Class: VCS abstraction with pluggable implementations
  • Factory/Registry: Entry points for VCS types and hooks
  • Producer-Consumer: Queue-based threading for parallel VCS operations
  • Immutable State: State dataclass prevents mutation bugs
  • Dependency Injection: Configuration and hooks passed through state
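
The "Immutable State" pattern can be sketched with a frozen dataclass. This is an illustration of the pattern, not the real class: the actual State in src/mxdev/state.py also carries the Configuration and has different fields.

```python
import dataclasses
from dataclasses import dataclass, field

@dataclass(frozen=True)  # frozen=True forbids attribute reassignment
class State:
    """Sketch of an immutable state container (fields are illustrative)."""
    requirements: list[str] = field(default_factory=list)
    constraints: list[str] = field(default_factory=list)

state = State(requirements=["pkg==1.0"])
try:
    state.requirements = []  # reassignment raises FrozenInstanceError
except dataclasses.FrozenInstanceError as exc:
    print(type(exc).__name__)  # FrozenInstanceError
```

Note that frozen only prevents reassigning attributes; the lists themselves remain mutable, so the immutability is a convention enforced at the attribute level.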

Important Constraints

  1. Minimal dependencies: Only packaging at runtime - no requests, no YAML parsers
  2. Standard library first: Uses configparser, urllib, threading instead of third-party libs
  3. No pip invocation: mxdev generates files; users run pip separately
  4. Backward compatibility: Supports Python 3.10+ with version detection for Git commands

Configuration System

mxdev uses INI files with configparser.ExtendedInterpolation syntax.

Variable Expansion

[settings]
github = git+ssh://git@github.com/

[mypackage]
url = ${settings:github}org/mypackage.git
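
The expansion above can be verified with the standard library alone, since mxdev parses INI files with configparser's ExtendedInterpolation:

```python
from configparser import ConfigParser, ExtendedInterpolation

# Demonstrates how the ${settings:github} reference expands.
ini = """
[settings]
github = git+ssh://git@github.com/

[mypackage]
url = ${settings:github}org/mypackage.git
"""
parser = ConfigParser(interpolation=ExtendedInterpolation())
parser.read_string(ini)
print(parser["mypackage"]["url"])
# git+ssh://git@github.com/org/mypackage.git
```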

Common Patterns

Develop multiple packages with version overrides:

[settings]
requirements-in = requirements.txt
requirements-out = requirements-mxdev.txt
constraints-out = constraints-mxdev.txt
version-overrides =
    somepackage==3.0.0
ignores =
    main-package-name
main-package = -e .[test]

[package1]
url = git+https://github.com/org/package1.git
branch = feature-branch
extras = test
install-mode = editable

[package2]
url = git+https://github.com/org/package2.git
branch = main
install-mode = fixed

[package3]
url = git+https://github.com/org/package3.git
install-mode = skip

Install mode options:

  • editable (default): Installs with -e prefix for development
  • fixed: Installs without -e prefix for production/Docker deployments
  • skip: Only clones, doesn't install
  • direct: Deprecated alias for editable (logs warning)
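
Given the config above, the generated requirements-out file would contain entries roughly like the following. This is illustrative only: the checkout directory shown here assumes the default target setting, and the exact line format may differ.

```
# package1: install-mode = editable, extras = test
-e ./sources/package1[test]
# package2: install-mode = fixed
./sources/package2
# package3: install-mode = skip -> cloned, but no install entry is written
```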

Using includes for shared configurations:

[settings]
include = https://example.com/shared.ini

# Settings here override included settings

Git-Specific Features

Shallow clones (faster checkouts):

[package]
url = ...
depth = 1

Or set globally:

export GIT_CLONE_DEPTH=1

Submodule handling:

  • always (default): Always checkout/update submodules
  • checkout: Only fetch during initial checkout
  • recursive: Use --recurse-submodules for nested submodules

Testing Infrastructure

Test Organization

Key Fixtures

  • tempdir: Temporary working directory
  • mkgitrepo: Factory for creating test Git repositories
  • develop: MockDevelop instance for simulating development environments
  • httpretty: HTTP mocking for URL-based tests

Test Coverage Areas

Writing Tests

When adding VCS functionality, follow existing patterns:

  1. Use mkgitrepo fixture to create test repositories
  2. Create a develop instance with mock configuration
  3. Test both initial checkout and update scenarios
  4. Verify output in requirements/constraints files

Extensibility via Hooks

To create an mxdev extension:

  1. Create a Hook subclass:
from mxdev import Hook, State

class MyExtension(Hook):
    namespace = "myext"  # Prefix for all settings

    def read(self, state: State) -> None:
        """Called after read phase."""
        # Access config: state.configuration.settings
        # Access packages: state.configuration.packages
        # Access hook config: state.configuration.hooks

    def write(self, state: State) -> None:
        """Called after write phase."""
        # Generate additional files, scripts, etc.
  2. Register as an entry point in pyproject.toml:
[project.entry-points.mxdev]
myext = "mypackage:MyExtension"
  3. Add namespaced config to mx.ini:
[settings]
myext-global_setting = value

[myext-section]
specific_setting = value

[somepackage]
myext-package_setting = value

Code Style

  • Formatting: Ruff formatter (max line length: 120, target Python 3.10+)
    • Configured in pyproject.toml under [tool.ruff]
    • Rules: E, W, F, UP, D (with selective ignores for docstrings)
    • Automatically enforced via pre-commit hooks
  • Import sorting: isort with plone profile, force_alphabetical_sort = true, force_single_line = true
    • Configured in pyproject.toml under [tool.isort]
    • Runs after ruff in pre-commit pipeline
  • Type hints: Use throughout (Python 3.10+ syntax)
    • Use X | Y instead of Union[X, Y]
    • Use list[T], dict[K, V] instead of List[T], Dict[K, V]
  • Path handling: Prefer pathlib.Path over os.path for path operations
    • Use pathlib.Path().as_posix() for cross-platform path comparison
    • Use / operator for path joining: Path("dir") / "file.txt"
    • Only use os.path.join() in production code where needed for compatibility
  • Logging: Use logger = logging.getLogger("mxdev") from logging.py
  • Docstrings: Document public APIs and complex logic

CI/CD (GitHub Actions)

The project uses GitHub Actions for continuous integration, configured in .github/workflows/tests.yaml.

Workflow Overview

Lint Job:

  • Runs on: ubuntu-latest
  • Uses: uvx --with tox-uv tox -e lint
  • Executes pre-commit hooks

Test Job:

  • Matrix testing across:
    • Python versions: 3.10, 3.11, 3.12, 3.13, 3.14
    • Operating systems: Ubuntu, Windows, macOS
    • Total: 15 combinations (5 Python × 3 OS)
  • Uses: uvx --with tox-uv tox -e py{version}
  • Leverages astral-sh/setup-uv@v7 action for uv installation

Key Features

  • No pip caching needed: uv handles caching automatically and efficiently
  • Fast execution: uv's parallel installation and caching dramatically speeds up CI
  • Python auto-installation: uv python install automatically downloads required Python versions
  • Unified tooling: Same uvx --with tox-uv tox command used locally and in CI

Modifying CI Workflow

When adding new Python versions:

  1. Add to python-config matrix in .github/workflows/tests.yaml
  2. Add corresponding environment to env_list in pyproject.toml [tool.tox] section
  3. Update requires-python and classifiers in pyproject.toml if needed

Common Development Scenarios

Adding a new VCS type

  1. Create module in src/mxdev/vcs/ (e.g., newvcs.py)
  2. Subclass BaseWorkingCopy from vcs/common.py
  3. Implement: checkout(), status(), matches(), update()
  4. Register in pyproject.toml under [project.entry-points."mxdev.workingcopytypes"]
  5. Add tests following test_git.py pattern

Modifying requirements/constraints processing

The core logic is in processing.py:

  • process_line(): Handles line-by-line processing
  • Look for # -> mxdev disabled comments - this is how packages are marked as developed from source
  • Version overrides are applied during constraint processing
  • Ignore lists prevent certain packages from appearing in output
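
The constraint-disabling behavior can be sketched like this. It is a minimal illustration only: the real process_line() in src/mxdev/processing.py handles extras, markers, version overrides, and -r/-c references, and its exact output format may differ.

```python
import re

def process_line(line: str, developed: set[str]) -> str:
    """Comment out a pinned constraint for a package developed from source."""
    match = re.match(r"^\s*([A-Za-z0-9._-]+)\s*==", line)
    if match and match.group(1).lower() in developed:
        # Mark the pin as disabled so pip ignores it for the developed package
        return f"# {line.rstrip()}  # -> mxdev disabled\n"
    return line

print(process_line("somepackage==1.0\n", {"somepackage"}), end="")
print(process_line("other==2.0\n", {"somepackage"}), end="")
```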

Adding configuration options

  1. Add default in SETTINGS_DEFAULTS in config.py
  2. Access via configuration.settings dictionary
  3. Document in README.md under appropriate section
  4. Add validation if needed in Configuration.__post_init__()
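
The first two steps can be sketched as follows. This is hypothetical: the real SETTINGS_DEFAULTS dict lives in src/mxdev/config.py, its keys differ from this illustration, and "my-new-option" is a made-up example name.

```python
# Hypothetical sketch of steps 1-2 above; names and values are illustrative.
SETTINGS_DEFAULTS: dict[str, str] = {
    "threads": "4",           # documented default for parallel fetching
    "my-new-option": "off",   # step 1: the new option with its default (made up)
}

def read_setting(settings: dict[str, str], key: str) -> str:
    """Step 2: read from configuration.settings, falling back to the default."""
    return settings.get(key, SETTINGS_DEFAULTS[key])

print(read_setting({}, "my-new-option"))          # off
print(read_setting({"threads": "8"}, "threads"))  # 8
```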

Build System

mxdev uses Hatchling as its build backend with the following plugins:

hatch-vcs (Automatic Versioning)

  • Version source: Git tags
  • No manual version bumps needed in code
  • Version is automatically derived from git tags during build
  • Development versions get format: 4.1.1.dev3+g1234abc
  • Tags must follow format: vX.Y.Z (e.g., v4.2.0)

hatch-fancy-pypi-readme (Multi-file README)

  • Concatenates multiple markdown files for PyPI long description
  • Combines: README.md + CONTRIBUTING.md + CHANGES.md + LICENSE.md
  • Adds section headers and separators automatically

Package Discovery

  • Source layout: src/mxdev/
  • Auto-generated version file: src/mxdev/_version.py (in .gitignore)
  • Uses .gitignore for file inclusion (no MANIFEST.in needed)
  • PEP 420 namespace packages: src/mxdev/vcs/ has no __init__.py (implicit namespace package)

Building Locally

# Clean build artifacts
rm -rf dist/ build/

# Build (requires git tags to determine version)
uv tool run --from build pyproject-build

# Or with python -m build
python -m build

# Check package quality
uv tool run --from twine twine check dist/*

Important: The version comes from git tags. If building from a commit without a tag, you'll get a development version like 4.1.1.dev3+g1234abc.

Release Process

See RELEASE.md for complete release documentation.

Quick summary:

  1. Update CHANGES.md with release notes
  2. Commit and push to main
  3. Create GitHub Release with tag vX.Y.Z
  4. GitHub Actions automatically builds and publishes to PyPI

Key points:

  • ✅ Version automatically set from git tag (no manual edit needed)
  • ✅ GitHub Actions handles building and publishing
  • ✅ All tests must pass before publishing
  • ✅ See RELEASE.md for detailed workflow

Development Workflow Best Practices

CRITICAL: Always follow these steps before pushing code:

Bug Fix Workflow (Test-Driven Development - MANDATORY)

IMPORTANT: All bug fixes MUST follow the TDD (Test-Driven Development) approach:

  1. Analysis

    • Investigate and understand the root cause
    • Identify the exact location and nature of the bug
    • Document your findings
  2. Comment on Issue with Analysis Only

    gh issue comment <ISSUE_NUMBER> --body "Root cause analysis..."
    • Post detailed root cause analysis to the GitHub issue
    • Do NOT include the solution or plan in the comment
    • Include code references, line numbers, and explanation
  3. Create Failing Test

    • Create branch: git checkout -b fix/<issue-number>-description
    • Write a test that reproduces the bug
    • Verify the test fails with the current code
    • Commit: git commit -m "Add failing test for issue #XX"
  4. Push and Create Draft PR

    git push -u origin fix/<issue-number>-description
    gh pr create --draft --title "..." --body "..."
    • PR body should explain the bug, show the failing test, and describe next steps
  5. Implement the Fix

    • Write the minimal code needed to make the test pass
    • Verify the test now passes
    • Run all related tests to ensure no regressions
  6. Commit and Push Fix

    git add <files>
    git commit -m "Fix issue #XX: description"
    git push
    • Include issue reference in commit message
    • Update CHANGES.md in the same commit or separately
  7. Verify CI is Green

    gh pr checks <PR_NUMBER>
    • Wait for all CI checks to pass
    • Address any failures
  8. Mark PR Ready for Review

    gh pr ready <PR_NUMBER>
    • Only mark ready when all checks are green
    • Update PR description if needed

Why TDD for Bug Fixes?

  • Ensures the bug is actually fixed
  • Prevents regressions in the future
  • Documents the expected behavior
  • Provides confidence in the solution

Pre-Push Checklist

  1. Always run linting before push

    uvx --with tox-uv tox -e lint
    • This runs ruff, mypy, and the other configured code quality checks
    • Fix any issues before committing
    • Commit formatting changes separately if needed
  2. Always update CHANGES.md

    • Add entry under "## X.X.X (unreleased)" section
    • Format: - Fix #XX: Description. [author]
    • Create unreleased section if it doesn't exist
    • Include issue number when applicable
  3. Always check and update documentation

    • README.md: Update configuration tables, usage examples, or feature descriptions
    • EXTENDING.md: Update if hooks API changed
    • RELEASE.md: Update if release process changed
    • Check if new configuration options need documentation
    • Check if new features need usage examples
    • Update any affected sections (don't just append)
    • MANDATORY: After any code change that adds/modifies features or configuration, verify documentation is updated
  4. Run relevant tests locally

    source .venv/bin/activate
    pytest tests/test_*.py -v
  5. Check CI status before marking PR ready

    gh pr checks <PR_NUMBER>
    • Wait for all checks to pass
    • Address any failures before requesting review

Example Workflow

# 1. Make changes to code
# 2. Run linting
uvx --with tox-uv tox -e lint

# 3. Fix any linting issues and commit if changes were made
git add .
git commit -m "Apply ruff formatting"

# 4. Run tests
source .venv/bin/activate
pytest -v

# 5. Update CHANGES.md
# Edit CHANGES.md to add entry

# 6. Commit everything
git add .
git commit -m "Fix issue description"

# 7. Push
git push

# 8. Check CI
gh pr checks <PR_NUMBER>

# 9. When green, mark PR ready for review

Git Workflow

  • Main branch: main
  • Create feature branches from main
  • Create pull requests for all changes
  • Ensure tests pass before merging
  • Always lint before pushing (see Pre-Push Checklist above)
  • Always update CHANGES.md for user-facing changes

Commit Message Format

IMPORTANT: Do NOT include Claude Code attribution in commit messages. Commit messages should be written as if by a human developer.

Bad (don't do this):

Fix #70: Implement HTTP caching

🤖 Generated with Claude Code
Co-Authored-By: Claude <noreply@anthropic.com>

Good (correct format):

Fix #70: Implement HTTP caching for offline mode

Previously, offline mode only skipped VCS operations but still
fetched HTTP-referenced requirements/constraints files.

Changes:
- Add HTTP content caching to .mxdev_cache/ directory
- Online mode: fetch from HTTP and cache for future use
- Offline mode: read from cache, error if not cached

All 190 tests pass, including 5 new HTTP caching tests.

GitHub Dependabot Management

Test Fixtures and False Positive Alerts

Test fixture files in tests/data/requirements/ contain pinned package versions that can trigger Dependabot security alerts, even though they're not real dependencies.

Current Setup (Auto-Triage Rule):

A Dependabot auto-triage rule is configured via GitHub UI to automatically dismiss alerts from test fixtures:

  • Rule name: "Dismiss test fixture alerts"
  • Manifest filter: Comma-separated list of test fixture files:
    tests/data/requirements/constraints.txt,
    tests/data/requirements/basic_requirements.txt,
    tests/data/requirements/nested_requirements.txt,
    tests/data/requirements/other_requirements.txt,
    tests/data/requirements/requirements_with_constraints.txt
    
  • Action: Dismiss indefinitely
  • Location: GitHub Settings → Code security → Dependabot rules

How It Works:

Three separate GitHub systems handle dependency management:

  1. GitHub Linguist (linguist-vendored in .gitattributes)

    • Only affects language statistics
    • Does NOT affect dependency graph or Dependabot
  2. Dependency Graph (vendored directory detection)

    • Uses hardcoded regex patterns to identify vendored directories:
      • (3rd|[Tt]hird)[-_]?[Pp]arty/
      • (^|/)vendors?/
      • (^|/)[Ee]xtern(als?)?/
    • tests/data/ does NOT match these patterns
    • Files in vendored directories are excluded from dependency graph
  3. Dependabot Auto-Triage Rules

    • This is the ONLY way to suppress security alerts for specific directories
    • Can target by: manifest path, severity, package name, scope, ecosystem, CVE, CWE, GHSA, EPSS
    • Rules are configured via GitHub UI (not version-controlled)
    • Supports comma-separated manifest paths (no wildcards)

Key Limitations:

  • ❌ Wildcards NOT supported in manifest paths (e.g., tests/data/** doesn't work)
  • ❌ Must specify exact file paths
  • ❌ Configuration is in GitHub UI, not in repository files
  • ✅ Can combine multiple paths with commas

Adding New Test Fixtures:

If you add a new test fixture file with pinned dependencies (e.g., tests/data/requirements/new_fixture.txt):

  1. Go to GitHub Settings → Code security → Dependabot rules
  2. Edit the "Dismiss test fixture alerts" rule
  3. Add the new path to the comma-separated manifest list
  4. Save the rule

Alternative Approaches Considered:

  • ❌ .gitattributes with linguist-vendored → Only affects language stats, not Dependabot
  • ❌ Renaming tests/data/ to tests/vendor/ → Breaking change, misleading name
  • ❌ exclude-paths in .github/dependabot.yml → Only affects version update PRs, NOT security alerts
  • ✅ Auto-triage rules → This is the correct solution for security alerts

HTTP Caching for Offline Mode

The .mxdev_cache/ directory stores HTTP-referenced requirements/constraints files for offline use:

  • Online mode: Content fetched from HTTP is automatically cached
  • Offline mode (-o/--offline): Content read from cache, errors if not cached
  • Cache key: SHA256 hash (first 16 hex chars) of the URL
  • Location: .mxdev_cache/ (in .gitignore)

Cache Files:

.mxdev_cache/
  a1b2c3d4e5f60718          # Cached content (first 16 hex chars of the URL's SHA256)
  a1b2c3d4e5f60718.url      # Original URL (for debugging)

Implementation Details:

  • Cache functions: _get_cache_key(), _cache_http_content(), _read_from_cache()
  • See src/mxdev/processing.py for implementation
  • Tests in tests/test_processing.py (5 comprehensive caching tests)
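
The documented key scheme is easy to reproduce. This is a sketch of the idea behind _get_cache_key(); the real helper in src/mxdev/processing.py may differ in detail.

```python
import hashlib

def get_cache_key(url: str) -> str:
    """First 16 hex characters of the URL's SHA256, per the scheme above."""
    return hashlib.sha256(url.encode("utf-8")).hexdigest()[:16]

key = get_cache_key("https://example.com/requirements.txt")
print(key, len(key))  # a 16-character lowercase hex string
```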

Requirements

  • Python: 3.10+
  • pip: 23+ (required for proper operation)
  • Runtime dependencies: Only packaging
  • VCS tools: Install git, svn, hg, bzr, darcs as needed for VCS operations