Commit Structure
Introduction
Commit messages are a fundamental part of software development. A good commit message clearly communicates what changes were made and why, facilitating code maintenance and team collaboration.
Commit Message Format
Basic Structure
<type>[optional scope]: <description>

[optional body]

[optional footer]
Format Rules
- Title length: Maximum 50 characters
- Body length: Maximum 72 characters per line
- Language: Use English for messages
- Verb tense: Use imperative present ("add" instead of "added" or "adding")
- Capitalization: First letter lowercase (except proper nouns)
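Applied together, the rules above look like this on the command line. This is a minimal sketch in a throwaway repository; the file name and message contents are made up for illustration:

```shell
# Create a throwaway repository so the example is self-contained
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"

touch token_validation.py
git add token_validation.py

# Two -m flags: the first is the title (imperative, lowercase, <= 50 chars,
# no trailing period); the second becomes the body, which git separates
# from the title with a blank line
git commit -q -m "feat(auth): add JWT token validation" \
    -m "Validate tokens against the signing key before handling requests."

git log -1 --pretty=%s
```

Inspecting the result with `git log` shows the title and body stored exactly as two `-m` flags provided them.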
Commit Types
Main Types
- feat: New functionality for the user
- fix: Bug fix
- docs: Documentation changes
- style: Format changes that don't affect code meaning (spaces, formatting, etc.)
- refactor: Code change that neither fixes a bug nor adds functionality
- test: Add missing tests or correct existing tests
- chore: Changes to build process or auxiliary tools
Additional Types (Optional)
- perf: Code change that improves performance
- ci: Changes to CI configuration files
- build: Changes affecting build system or external dependencies
- revert: Reverts a previous commit
Scope
The scope is optional and should indicate which part of the code is affected:
Scope Examples
- auth: Authentication system
- api: API endpoints
- db: Database
- email: Email system
- config: Configuration
- deps: Dependencies
Scope Format
feat(auth): add JWT token validation
fix(api): correct error in users endpoint
docs(readme): update installation instructions
Description
Description Rules
- Clarity: Describe what the change does, not how it does it
- Conciseness: Maximum 50 characters
- Imperative: Use imperative present
- No period: Don't end with a period
Good Description Examples
feat(auth): add JWT authentication middleware
fix(api): correct email validation in registration
docs(api): update endpoints documentation
refactor(db): simplify user queries
test(auth): add tests for token validation
Message Body (Optional)
The body should explain what and why, not how:
When to Use the Body
- Complex changes requiring explanation
- Important design decisions
- Additional context about the problem solved
Example with Body
feat(auth): add refresh token system
Implements a refresh token system to improve security
and user experience. Access tokens now have a shorter
duration (15 minutes) while refresh tokens last 7 days.
This allows maintaining active user sessions without
compromising security if a token is intercepted.
Footer (Optional)
Breaking Changes
For changes that break compatibility:
feat(api): change user response format
BREAKING CHANGE: The 'name' field is now split into 'firstName' and 'lastName'
Issue References
fix(auth): correct password validation error
Closes #123
Fixes #456
Resolves PROJ-789
Supported formats in Bitbucket:
- Closes #123 - Closes issue from same repository
- Fixes #456 - Fixes issue from same repository
- Resolves OPD-789 - Resolves Jira issue (using project-specific prefix)
- Implements #234 - Implements feature request
Length and Format
Character Limits
- Title: 50 characters maximum
- Body: 72 characters per line maximum
- Separation: Blank line between title and body
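The limits above are easy to check from the shell. A minimal sketch (the sample title is taken from the example below):

```shell
# Measure a commit title against the 50-character limit
title="feat(email): add customizable template system"
len=${#title}
echo "$len"

if [ "$len" -le 50 ]; then
  echo "title length OK"
else
  echo "title too long"
fi
```

The same check works on the most recent commit with `git log -1 --pretty=%s | awk '{ print length }'`.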
Complete Example
feat(email): add customizable template system
Allows administrators to create and edit custom email
templates through the web interface. Templates support
dynamic variables and real-time preview.
- Add WYSIWYG editor for templates
- Implement {{variable}} system
- Add real-time preview
- Include syntax validation
Closes #234
Practical Examples
✅ Good Examples
# New features
feat(auth): add Google OAuth authentication
feat(api): implement file upload endpoint
feat(email): add email notification system
# Bug fixes
fix(auth): correct login validation error
fix(api): resolve timeout in large queries
fix(db): correct user table migration
# Documentation
docs(readme): update installation instructions
docs(api): add endpoint usage examples
docs(deploy): document deployment process
# Refactoring
refactor(auth): simplify token validation logic
refactor(db): optimize search queries
refactor(api): extract validators to separate modules
# Tests
test(auth): add integration tests for login
test(api): improve test coverage for endpoints
test(email): add unit tests for templates
# Maintenance
chore(deps): update security dependencies
chore(config): update ESLint configuration
chore(build): optimize production build process
❌ Bad Examples
# Too vague
fix: fix bug
feat: add new thing
update: various changes
# Too long
feat: add complete JWT authentication system including login, registration, token validation, refresh tokens and route protection middleware
# Incorrect verb tense
fix: fixed login error
feat: adding new functionality
docs: updated readme
# No context
fix: typo
feat: new function
refactor: changes
# With period
feat(auth): add token validation.
fix(api): correct validation error.
Recommended Tool: Pre-commit
Pre-commit for Python
Pre-commit is the de facto standard framework for managing git hooks in Python projects.
Installation
# With pip
pip install pre-commit
# Add to requirements-dev.txt
echo "pre-commit" >> requirements-dev.txt
Requirements File Structure
To organize project dependencies:
# requirements.txt - Production dependencies
Django==4.2.7
psycopg2-binary==2.9.7
redis==5.0.1
# requirements-dev.txt - Development dependencies
-r requirements.txt
pre-commit==3.5.0
black==23.9.1
isort==5.12.0
flake8==6.1.0
pytest==7.4.3
pytest-cov==4.1.0
mypy==1.6.1
bandit==1.7.5
Basic Configuration
Create .pre-commit-config.yaml file:
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files
      - id: check-merge-conflict
      - id: debug-statements
  - repo: https://github.com/psf/black
    rev: 23.9.1
    hooks:
      - id: black
        args: [--line-length=88]
  - repo: https://github.com/pycqa/isort
    rev: 5.12.0
    hooks:
      - id: isort
        args: ["--profile", "black"]
  - repo: https://github.com/pycqa/flake8
    rev: 6.1.0
    hooks:
      - id: flake8
        args: [--max-line-length=88, --extend-ignore=E203]
Hook Installation
# Install hooks in the repository
pre-commit install
# Run hooks manually on all files
pre-commit run --all-files
Commit Structure Validation with Pre-commit
To validate that commit messages follow conventions, you can add a custom hook:
# Add to .pre-commit-config.yaml
- repo: local
  hooks:
    - id: conventional-commit
      name: Conventional Commit
      # The trailing "--" makes bash assign the commit message file path to $1;
      # only the first line (the title) is checked, and the end anchor ($)
      # enforces the 50-character description limit.
      entry: bash -c 'if ! head -n1 "$1" | grep -qE "^(feat|fix|docs|style|refactor|test|chore|perf|ci|build|revert)(\(.+\))?: .{1,50}$"; then echo "Commit message does not follow conventional format"; exit 1; fi' --
      language: system
      stages: [commit-msg]
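To see what the hook accepts and rejects, the same regular expression can be exercised directly in a shell (this uses an end anchor so the 50-character cap is enforced; the sample messages are made up):

```shell
# The conventional-commit pattern, anchored at both ends
pattern='^(feat|fix|docs|style|refactor|test|chore|perf|ci|build|revert)(\(.+\))?: .{1,50}$'

# Report whether a message title matches the pattern
check() {
  if echo "$1" | grep -qE "$pattern"; then
    echo "valid: $1"
  else
    echo "invalid: $1"
  fi
}

check "feat(auth): add JWT token validation"
check "fix: correct login validation error"
check "fixed login bug"
check "update stuff."
```

The first two messages pass; the last two fail because `fixed` is not a recognized type prefix and `update` is not in the type list.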
Daily Pre-commit Usage
# Hooks run automatically on each commit
git add .
git commit -m "feat(auth): add JWT token validation"
# If any hook fails, the commit is cancelled
# Fix the issues and try again
git add .
git commit -m "feat(auth): add JWT token validation"
# Run hooks manually before commit
pre-commit run
# Run a specific hook
pre-commit run black
# Skip hooks temporarily (not recommended)
git commit -m "message" --no-verify
Django-Specific Configuration
For Django projects, add additional hooks:
- repo: https://github.com/pycqa/isort
  rev: 5.12.0
  hooks:
    - id: isort
      args: ["--profile", "black", "--check-only", "--diff"]
- repo: https://github.com/adamchainz/django-upgrade
  rev: 1.15.0
  hooks:
    - id: django-upgrade
      args: [--target-version, "4.2"]
FastAPI-Specific Configuration
For FastAPI projects, add specific validations:
- repo: https://github.com/pycqa/bandit
  rev: 1.7.5
  hooks:
    - id: bandit
      args: ["-r", ".", "-f", "json"]
      exclude: tests/
- repo: https://github.com/pre-commit/mirrors-mypy
  rev: v1.6.1
  hooks:
    - id: mypy
      additional_dependencies: [types-requests]
Advanced Pre-commit Configuration
Complete Configuration for Django
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files
      - id: check-merge-conflict
      - id: debug-statements
  - repo: https://github.com/psf/black
    rev: 23.9.1
    hooks:
      - id: black
        args: [--line-length=88]
  - repo: https://github.com/pycqa/isort
    rev: 5.12.0
    hooks:
      - id: isort
        args: ["--profile", "black"]
  - repo: https://github.com/pycqa/flake8
    rev: 6.1.0
    hooks:
      - id: flake8
        args: [--max-line-length=88, --extend-ignore=E203]
  - repo: https://github.com/adamchainz/django-upgrade
    rev: 1.15.0
    hooks:
      - id: django-upgrade
        args: [--target-version, "4.2"]
  - repo: local
    hooks:
      - id: django-check
        name: Django Check
        entry: python manage.py check
        language: system
        pass_filenames: false
        files: \.py$
Complete Configuration for FastAPI
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files
      - id: check-merge-conflict
  - repo: https://github.com/psf/black
    rev: 23.9.1
    hooks:
      - id: black
  - repo: https://github.com/pycqa/isort
    rev: 5.12.0
    hooks:
      - id: isort
        args: ["--profile", "black"]
  - repo: https://github.com/pycqa/flake8
    rev: 6.1.0
    hooks:
      - id: flake8
  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: v1.6.1
    hooks:
      - id: mypy
        additional_dependencies: [types-requests, pydantic]
  - repo: https://github.com/pycqa/bandit
    rev: 1.7.5
    hooks:
      - id: bandit
        args: ["-r", ".", "-ll"]
        exclude: tests/
Custom Scripts
Hook to Run Tests
- repo: local
  hooks:
    - id: pytest
      name: pytest
      entry: pytest
      language: system
      pass_filenames: false
      files: \.py$
      args: [--maxfail=1, -q]
Hook to Check Migrations (Django)
- repo: local
  hooks:
    - id: check-migrations
      name: Check for missing migrations
      entry: python manage.py makemigrations --check --dry-run
      language: system
      pass_filenames: false
      files: models\.py$
Automation with Bitbucket Pipelines
Basic Pipeline Configuration
# bitbucket-pipelines.yml
image: python:3.11

definitions:
  services:
    postgres:
      image: postgres:15
      variables:
        POSTGRES_DB: test_db
        POSTGRES_USER: postgres
        POSTGRES_PASSWORD: postgres
  caches:
    pip: ~/.cache/pip

pipelines:
  default:
    - step:
        name: Code Quality and Tests
        caches:
          - pip
        script:
          - pip install -r requirements.txt
          - pip install -r requirements-dev.txt
          - pre-commit run --all-files
          - pytest --cov=. --cov-report=xml
        artifacts:
          - coverage.xml
  pull-requests:
    '**':
      - step:
          name: Code Quality and Tests
          caches:
            - pip
          script:
            - pip install -r requirements.txt
            - pip install -r requirements-dev.txt
            - pre-commit run --all-files
            - pytest
  branches:
    main:
      - step:
          name: Deploy to Production
          deployment: production
          caches:
            - pip
          script:
            - pip install -r requirements.txt
            - pip install -r requirements-dev.txt
            - pytest
            # Specific deployment commands
    develop:
      - step:
          name: Deploy to Staging
          deployment: staging
          caches:
            - pip
          script:
            - pip install -r requirements.txt
            - pip install -r requirements-dev.txt
            - pytest
            # Staging deployment commands
Django-Specific Pipeline
# bitbucket-pipelines.yml for Django
image: python:3.11

definitions:
  services:
    postgres:
      image: postgres:15
      variables:
        POSTGRES_DB: test_db
        POSTGRES_USER: postgres
        POSTGRES_PASSWORD: postgres
    redis:
      image: redis:7-alpine
  caches:
    pip: ~/.cache/pip

pipelines:
  default:
    - step:
        name: Django Tests and Quality
        services:
          - postgres
          - redis
        caches:
          - pip
        script:
          - pip install -r requirements.txt
          - pip install -r requirements-dev.txt
          # Configure environment variables for tests
          - export DATABASE_URL=postgres://postgres:postgres@localhost:5432/test_db
          - export REDIS_URL=redis://localhost:6379/0
          # Run Django validations
          - python manage.py check
          - python manage.py makemigrations --check --dry-run
          # Run tests
          - python manage.py test
          # Run pre-commit hooks
          - pre-commit run --all-files
  pull-requests:
    '**':
      - step:
          name: PR Validation
          services:
            - postgres
          caches:
            - pip
          script:
            - pip install -r requirements.txt
            - pip install -r requirements-dev.txt
            - export DATABASE_URL=postgres://postgres:postgres@localhost:5432/test_db
            # Django validations
            - python manage.py check
            - python manage.py makemigrations --check --dry-run
            - python manage.py test
            # Code quality
            - pre-commit run --all-files
FastAPI-Specific Pipeline
# bitbucket-pipelines.yml for FastAPI
image: python:3.11

definitions:
  services:
    postgres:
      image: postgres:15
      variables:
        POSTGRES_DB: test_db
        POSTGRES_USER: postgres
        POSTGRES_PASSWORD: postgres
  caches:
    pip: ~/.cache/pip

pipelines:
  default:
    - step:
        name: FastAPI Tests and Quality
        services:
          - postgres
        caches:
          - pip
        script:
          - pip install -r requirements.txt
          - pip install -r requirements-dev.txt
          # Configure environment variables
          - export DATABASE_URL=postgres://postgres:postgres@localhost:5432/test_db
          - export ENVIRONMENT=testing
          # Type checking
          - mypy .
          # Security checks
          - bandit -r . -f json
          # Tests with coverage
          - pytest --cov=. --cov-report=xml --cov-report=html
          # Code quality
          - pre-commit run --all-files
        artifacts:
          - coverage.xml
          # Directory artifacts use glob patterns in Bitbucket
          - htmlcov/**
  pull-requests:
    '**':
      - step:
          name: PR Validation
          services:
            - postgres
          caches:
            - pip
          script:
            - pip install -r requirements.txt
            - pip install -r requirements-dev.txt
            - export DATABASE_URL=postgres://postgres:postgres@localhost:5432/test_db
            # FastAPI validations
            - mypy .
            - bandit -r . -ll
            - pytest
            - pre-commit run --all-files
Advanced Configuration with Multiple Steps
# bitbucket-pipelines.yml advanced
image: python:3.11

definitions:
  services:
    postgres:
      image: postgres:15
      variables:
        POSTGRES_DB: test_db
        POSTGRES_USER: postgres
        POSTGRES_PASSWORD: postgres
  caches:
    pip: ~/.cache/pip
  steps:
    - step: &install-dependencies
        name: Install Dependencies
        caches:
          - pip
        script:
          - pip install -r requirements.txt
          - pip install -r requirements-dev.txt
    - step: &code-quality
        name: Code Quality Checks
        script:
          - black --check .
          - isort --check-only .
          - flake8 .
          - mypy .
    - step: &security-checks
        name: Security Checks
        script:
          - bandit -r . -f json
          - safety check
    - step: &run-tests
        name: Run Tests
        services:
          - postgres
        script:
          - export DATABASE_URL=postgres://postgres:postgres@localhost:5432/test_db
          - pytest --cov=. --cov-report=xml
        artifacts:
          - coverage.xml

pipelines:
  pull-requests:
    '**':
      - step:
          <<: *install-dependencies
      - parallel:
          - step:
              <<: *code-quality
          - step:
              <<: *security-checks
          - step:
              <<: *run-tests
      - step:
          name: Final Validation
          script:
            - echo "All checks passed successfully"
  branches:
    main:
      - step:
          <<: *install-dependencies
      - parallel:
          - step:
              <<: *code-quality
          - step:
              <<: *security-checks
          - step:
              <<: *run-tests
      - step:
          name: Deploy to Production
          deployment: production
          script:
            - echo "Deploying to production..."
            # Specific deployment commands
Environment Variables Configuration in Bitbucket
To configure sensitive environment variables in Bitbucket:
- Go to Repository Settings > Pipelines > Repository variables
- Add necessary variables:
# Common variables
DATABASE_URL=postgres://user:pass@host:5432/db
SECRET_KEY=your-secret-key
DEBUG=False
# Environment-specific variables
PRODUCTION_DATABASE_URL=postgres://prod-user:pass@prod-host:5432/prod-db
STAGING_DATABASE_URL=postgres://staging-user:pass@staging-host:5432/staging-db
# External APIs
SENDGRID_API_KEY=your-sendgrid-key
AWS_ACCESS_KEY_ID=your-aws-key
AWS_SECRET_ACCESS_KEY=your-aws-secret
Integration with Private Repositories
For projects with private dependencies in Bitbucket:
# bitbucket-pipelines.yml with private repositories
image: python:3.11

pipelines:
  default:
    - step:
        name: Install Private Dependencies
        script:
          # Configure access to private repositories
          - git config --global url."https://x-token-auth:${BITBUCKET_ACCESS_TOKEN}@bitbucket.org/".insteadOf "https://bitbucket.org/"
          # Install dependencies (including private ones)
          - pip install -r requirements.txt
          - pip install -r requirements-dev.txt
          - pytest
Configure BITBUCKET_ACCESS_TOKEN:
- Go to Personal settings > App passwords
- Create a token with Repositories: Read permissions
- Add as environment variable in the repository
Best Practices
1. Atomic Commits
- One commit = one logical change
- Avoid commits that mix multiple unrelated changes
2. Commit Frequency
- Make frequent, small commits
- Many small commits are better than a few large ones
3. Review Before Commit
- Review changes with git diff before committing
- Use git add -p to stage changes selectively
4. Descriptive Messages
- The message should explain the "what" and "why"
- Avoid messages like "fix", "update", "changes"
5. Team Consistency
- The entire team should follow the same conventions
- Document and share established rules
- Use tools to automate validation
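As one automation option, the same message validation shown earlier can be wired up as a plain git commit-msg hook, with no pre-commit dependency. This is a sketch in a throwaway repository; the file and commit contents are made up:

```shell
# Create a throwaway repository so the example is self-contained
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"

# Install the hook: reject commits whose first line is not conventional
cat > .git/hooks/commit-msg <<'EOF'
#!/bin/sh
head -n1 "$1" | grep -qE '^(feat|fix|docs|style|refactor|test|chore|perf|ci|build|revert)(\(.+\))?: .{1,50}$' || {
  echo "Commit message does not follow conventional format" >&2
  exit 1
}
EOF
chmod +x .git/hooks/commit-msg

echo "print('hi')" > example.py
git add example.py
git commit -q -m "feat(core): add example module" && echo "commit accepted"
```

A message such as "bad message" would be rejected by the hook and the commit cancelled, just as with the pre-commit setup.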