We follow a Gitflow-inspired branching model to maintain a stable main branch and a dynamic develop branch.
- Branch Roles:
  - `main`: Reserved for stable, production-ready releases.
  - `develop`: The primary branch for ongoing development, feature integration, and bug fixes. This serves as the "staging" area for the next release.
- Issue Tracking: Every contribution (bug fix or feature) must first be reported as a GitHub Issue. Issues should clearly define goals and, preferably, include an implementation plan.
- Branch Naming: Create a dedicated working branch for each issue. Branches must be named using the format `NUM-short-description`, where `NUM` is the issue number (e.g., `113-fix-file-loading`).
- Pull Requests (PRs): Once work is complete, open a PR targeting the `develop` branch.
- Communication: High-level discussion and planning should occur in the issue thread. The PR conversation is strictly for code review and implementation-specific feedback.
- Releases:
  - When `develop` is ready for a new release, open a PR from `develop` to `main` using the "release" PR template.
  - After merging the release candidate into `main`, manually tag the commit with the version number. This tag triggers the automated CI/CD pipeline for publishing.
- Branch Protection: Both `main` and `develop` are protected branches. Direct pushes are disabled; all changes must be introduced via Pull Requests.
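For a hypothetical issue #113 (the issue number and description below are invented for illustration), the branch-naming step can be exercised in a throwaway repository, assuming git 2.28 or newer:

```shell
# Demonstrate the NUM-short-description convention in a throwaway repo
# (issue #113 and "fix-file-loading" are hypothetical)
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b develop
git -c user.email=ci@example.com -c user.name=ci commit -q --allow-empty -m "seed develop"
# One working branch per issue, prefixed with the issue number
git switch -q -c 113-fix-file-loading
git branch --show-current   # prints: 113-fix-file-loading
```

In the real repository you would instead branch from an up-to-date `develop` (`git switch develop && git pull`) before creating the working branch.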
The `mmif` command-line interface supports subcommands (e.g., `mmif source`, `mmif describe`). These are implemented as Python modules in `mmif/utils/cli/`.

To add a new CLI subcommand, create a Python module in `mmif/utils/cli/` with these three required functions:
- `prep_argparser(**kwargs)`: Define and return an `argparse.ArgumentParser` instance for your subcommand. When called during discovery, the main CLI will pass `add_help=False` to this function to avoid duplicate help flags.
- `describe_argparser()`: Return a tuple of two strings:
  - A one-line description (shown in `mmif --help`)
  - A more verbose description (shown in `mmif <subcommand> --help`)
- `main(args)`: Execute the subcommand logic with the parsed arguments.
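Putting the three functions together, a minimal subcommand module might look like the sketch below. The `hello` subcommand, its argument, and its logic are invented for illustration and do not exist in mmif-python:

```python
"""Print a greeting (hypothetical example subcommand)."""

import argparse


def prep_argparser(**kwargs):
    # **kwargs is forwarded so the discovery code can pass add_help=False
    parser = argparse.ArgumentParser(description="Print a greeting.", **kwargs)
    parser.add_argument('name', nargs='?', default='world', help='name to greet')
    return parser


def describe_argparser():
    # (one-liner for `mmif --help`, verbose text for `mmif <subcommand> --help`)
    return ("print a greeting",
            "Print a greeting to stdout. Only useful as a template for new subcommands.")


def main(args):
    print(f"hello, {args.name}")
```

Saved as, say, `mmif/utils/cli/hello.py`, a module following this shape would be picked up by discovery and exposed as `mmif hello`.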
To ensure a consistent user experience and avoid resource leaks, all CLI subcommands should adhere to the following I/O argument patterns using the `mmif.utils.cli.open_cli_io_arg` context manager (which replaces the deprecated `argparse.FileType`):
- Input: Use a positional argument (usually named `MMIF_FILE`) that supports both file paths and STDIN.
  - In `prep_argparser`, use `nargs='?'`, `type=str`, and `default=None`.
  - In `main`, use `with open_cli_io_arg(args.MMIF_FILE, 'r', default_stdin=True) as input_file:`.
- Output: Use the `-o`/`--output` flag for the output destination.
  - In `prep_argparser`, use `type=str` and `default=None`.
  - In `main`, use `with open_cli_io_arg(args.output, 'w', default_stdout=True) as output_file:`.
- Formatting: Use the `-p`/`--pretty` flag as a boolean switch (`action='store_true'`) to toggle between compact and pretty-printed JSON/MMIF output.
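The real `open_cli_io_arg` lives in `mmif.utils.cli`; the stand-in below is only a sketch of the behavior the patterns above imply (fall back to the standard streams when no path is given, and never close those borrowed streams), not the library's actual implementation:

```python
import contextlib
import sys


@contextlib.contextmanager
def open_cli_io_arg_sketch(path, mode, default_stdin=False, default_stdout=False):
    """Yield a file object for path, or fall back to stdin/stdout for None or '-'."""
    if path is None or path == '-':
        if 'r' in mode and default_stdin:
            yield sys.stdin       # borrowed stream: never closed here
        elif default_stdout:
            yield sys.stdout      # borrowed stream: never closed here
        else:
            raise ValueError("no path given and no standard-stream fallback enabled")
    else:
        f = open(path, mode)
        try:
            yield f
        finally:
            f.close()             # real files are always closed, even on error
```

A `main` body would use it exactly like the patterns above, e.g. `with open_cli_io_arg_sketch(args.output, 'w', default_stdout=True) as out: ...`.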
> [!NOTE]
> CLI modules should typically act as thin wrappers. It is recommended to implement the core utility logic in other packages (e.g., `mmif.utils`) and import it into the CLI module. See existing modules like `summarize.py` (which imports from `mmif.utils.summarizer`) or `describe.py` for examples.
The CLI system automatically discovers subcommands at runtime. The entry point is configured in the build script (currently `setup.py`) as follows:

```python
entry_points={
    'console_scripts': [
        'mmif = mmif.__init__:cli',
    ],
},
```

The `cli()` function in `mmif/__init__.py` handles discovery and delegation. It uses `pkgutil.walk_packages` to find all modules within the top level of the `mmif.utils.cli` package. For the discovery logic to work, a "cli module" must implement the requirements outlined above.
This means adding a properly structured module within the CLI package is all that's needed; the module name will automatically be registered as a subcommand. No modifications to `setup.py` or other configuration files are required.
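Conceptually, the discovery step can be sketched as below. This is a guess at the general shape, not the actual `cli()` code; it is demonstrated here against the stdlib `json` package, whose submodules naturally fail the three-function contract check:

```python
import importlib
import pkgutil


def discover_subcommands(package):
    """Map module names under *package* to modules satisfying the CLI contract."""
    subcommands = {}
    for modinfo in pkgutil.walk_packages(package.__path__, prefix=package.__name__ + '.'):
        module = importlib.import_module(modinfo.name)
        # A valid "cli module" implements the three required functions
        if all(hasattr(module, fn) for fn in ('prep_argparser', 'describe_argparser', 'main')):
            subcommands[modinfo.name.rsplit('.', 1)[-1]] = module
    return subcommands


import json
print(discover_subcommands(json))  # json's submodules are not cli modules
```

The dispatcher can then look up `sys.argv[1]` in the returned mapping and hand the remaining arguments to that module's `prep_argparser`/`main` pair.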
> [!NOTE]
> Any "client" code (i.e., code other than the shell CLI) that wants to use a module in the `cli` package should be able to import it directly with `from mmif.utils.cli import a_module`. However, for historical reasons, some CLI modules (e.g., `source.py`) are manually imported in `mmif/__init__.py` for backward compatibility with clients that predate the discovery system.
The documentation for `mmif-python` is built using Sphinx and published to the CLAMS documentation hub.
To build the documentation for the current checkout:
```shell
python3 build-tools/docs.py
```

The output will be in `docs-test`. For more options, run `python build-tools/docs.py --help`.
As of 2026 (since the version following 1.2.1), API documentation is automatically generated using `sphinx-apidoc`. When you run the documentation build:
- The `run_apidoc()` function in `documentation/conf.py` runs automatically.
- It scans packages listed in `apidoc_package_names` (currently `mmif` and `mmif_docloc_http`).
- RST files are generated in `documentation/autodoc/`.
- These files are not tracked in git; they are regenerated on each build.
When you add a new module or subpackage, it will be automatically documented on the next build. No manual updates required.
To add a new top-level package (like `mmif_docloc_http`), add it to `apidoc_package_names` in `documentation/conf.py`.

To exclude a subpackage from documentation (like `mmif.res` or `mmif.ver`), add it to `apidoc_exclude_paths`.
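In `documentation/conf.py`, the two settings might then look like the excerpt below; the list contents and the exact path format of the exclude entries are illustrative assumptions, not a copy of the real file:

```python
# documentation/conf.py (illustrative excerpt)

# Top-level packages to run sphinx-apidoc over
apidoc_package_names = [
    'mmif',
    'mmif_docloc_http',
]

# Subpackages to skip when generating RST files
apidoc_exclude_paths = [
    'mmif/res',
    'mmif/ver',
]
```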
Module docstrings in `__init__.py` files are used as package descriptions in the documentation. Keep them concise and informative.
To build documentation for a specific historical version (e.g., v1.0.0):
```shell
make doc-version
# OR
python3 build-tools/docs.py --build-ver v1.0.0
```

This runs the build in a sandboxed temporary directory. The output will be in `docs-test/<version>`.
Important: The build script (`build-tools/docs.py`) uses a "Modern Environment, Legacy Source" strategy. It checks out the old source code but installs modern build dependencies (Sphinx 7.x, Furo) to ensure the build works on current systems (including Python 3.13).
If an old version fails to build because a dependency is missing (e.g., it was removed from `requirements.txt` in later versions but the old `setup.py` needs it), do not try to fix the old `setup.py`.

Instead, manually add the missing dependency to the `run_pip` call in `build-tools/docs.py`:
```python
# In build-tools/docs.py
def build_versioned_docs(...):
    # ...
    # Add the missing dependency here
    env.run_pip("install", "jsonschema", "requests", "pyyaml", "deepdiff<7", "YOUR_MISSING_DEP", cwd=source_path)
```

This "overlay" strategy ensures we can build old docs without modifying historical git tags.