Prerequisites
- Basic understanding of programming concepts
- Python installation (3.8+)
- VS Code or another preferred IDE
What you'll learn
- Understand the fundamentals of advanced PyPI publishing
- Apply these techniques in real projects
- Debug common packaging issues
- Write clean, Pythonic code
Introduction
Welcome to this exciting tutorial on advanced PyPI package publishing! In this guide, we'll explore how to professionally publish and maintain Python packages on the Python Package Index.
You'll discover how to create high-quality packages that thousands of developers can use. Whether you're sharing a utility library, a framework, or a tool, understanding advanced PyPI publishing is essential for contributing to the Python ecosystem.
By the end of this tutorial, you'll feel confident publishing professional Python packages. Let's dive in!
Understanding Advanced PyPI Publishing
What is Advanced PyPI Publishing?
Advanced PyPI publishing is like running a professional software distribution center. Think of it as not just shipping your code, but creating a complete product with documentation, versioning, and quality assurance.
In Python terms, advanced publishing means you can:
- Create packages with multiple distribution formats
- Automate releases with CI/CD pipelines
- Sign packages for security verification
- Track download statistics and user analytics
- Manage complex dependencies and versions
Why Master Advanced Publishing?
Here's why developers need advanced PyPI skills:
- Professional Presence: Stand out with high-quality packages
- User Trust: Security and reliability build confidence
- Maintenance Efficiency: Streamline updates and releases
- Community Growth: Attract contributors and users
Real-world example: Imagine publishing a data analysis library. With advanced techniques, you can ensure compatibility across Python versions, provide wheels for fast installation, and automate security updates.
Basic Syntax and Usage
Advanced Project Structure
Let's start with a professional package structure:
# Professional package structure
my_awesome_package/
├── src/
│   └── my_package/
│       ├── __init__.py        # Package initialization
│       ├── core.py            # Core functionality
│       └── utils.py           # Utility functions
├── tests/
│   ├── test_core.py           # Core tests
│   └── test_utils.py          # Utility tests
├── docs/
│   ├── conf.py                # Sphinx configuration
│   └── index.rst              # Documentation index
├── .github/
│   └── workflows/
│       └── publish.yml        # CI/CD pipeline
├── setup.py                   # Package configuration
├── setup.cfg                  # Configuration file
├── pyproject.toml             # Modern build configuration
├── MANIFEST.in                # File inclusion rules
├── LICENSE                    # License file
├── README.md                  # Project documentation
└── .gitignore                 # Git ignore rules
Explanation: This structure separates source code, tests, and documentation while providing modern build configuration files.
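Before wiring up the build configuration, it helps to sketch what the package's entry point exposes. Below is a minimal, illustrative src/my_package/__init__.py for the layout above; the AwesomeClient and slugify_title names are placeholders, not part of any real API.
# src/my_package/__init__.py - a minimal sketch of the package's public surface
"""my_package - an awesome package that does amazing things."""

from my_package.core import AwesomeClient    # hypothetical class defined in core.py
from my_package.utils import slugify_title  # hypothetical helper defined in utils.py

__all__ = ["AwesomeClient", "slugify_title"]
__version__ = "2.0.0"  # keep in sync with pyproject.toml, or derive it dynamically (see later)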
Modern pyproject.toml Configuration
Here's a professional configuration:
# pyproject.toml - Modern Python packaging
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"
[project]
name = "awesome-package"
version = "2.0.0"
description = "An awesome package that does amazing things!"
readme = "README.md"
license = {text = "MIT"}
authors = [
{name = "Your Name", email = "[email protected]"},
]
maintainers = [
{name = "Maintainer Team", email = "[email protected]"},
]
classifiers = [
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Operating System :: OS Independent",
"Topic :: Software Development :: Libraries",
]
keywords = ["awesome", "package", "python"]
requires-python = ">=3.8"
dependencies = [
"requests>=2.28.0",
"click>=8.0.0",
"pydantic>=2.0.0",
]
[project.optional-dependencies]
dev = [
"pytest>=7.0.0",
"pytest-cov>=4.0.0",
"black>=23.0.0",
"mypy>=1.0.0",
"ruff>=0.0.270",
]
docs = [
"sphinx>=5.0.0",
"sphinx-rtd-theme>=1.0.0",
"sphinx-autodoc-typehints>=1.0.0",
]
[project.urls]
Homepage = "https://github.com/you/awesome-package"
Documentation = "https://awesome-package.readthedocs.io"
Repository = "https://github.com/you/awesome-package.git"
"Bug Tracker" = "https://github.com/you/awesome-package/issues"
Changelog = "https://github.com/you/awesome-package/blob/main/CHANGELOG.md"
[project.scripts]
awesome-cli = "my_package.cli:main"
[tool.setuptools.packages.find]
where = ["src"]
[tool.setuptools.package-data]
my_package = ["py.typed", "data/*.json"]
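After installing the package locally (for example with `pip install -e .`), you can confirm that this configuration is actually picked up. Here is a small sketch using the standard library's importlib.metadata; the distribution name awesome-package comes from the [project] table above.
# check_metadata.py - inspect the installed package's metadata
from importlib.metadata import metadata, requires, version

dist_name = "awesome-package"  # the [project].name declared above
print("Version:", version(dist_name))
print("Summary:", metadata(dist_name)["Summary"])
print("Dependencies:")
for requirement in requires(dist_name) or []:
    print(" -", requirement)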
Practical Examples
Example 1: Multi-Format Distribution
Let's create distributions for different platforms:
# build_distributions.py - build multiple distribution formats
import shutil
import subprocess
from pathlib import Path


class PackageBuilder:
    def __init__(self, package_name: str):
        self.package_name = package_name
        self.dist_dir = Path("dist")

    def clean_build(self):
        """Remove artifacts from previous builds."""
        print("Cleaning previous builds...")
        for pattern in ["build", "dist", "*.egg-info"]:
            for path in Path(".").glob(pattern):
                if path.is_dir():
                    shutil.rmtree(path)
                else:
                    path.unlink()
        print("Clean complete!")

    def build_sdist(self):
        """Build the source distribution."""
        print("Building source distribution...")
        subprocess.run(["python", "-m", "build", "--sdist"], check=True)
        print("Source distribution built!")

    def build_wheel(self):
        """Build the wheel distribution."""
        print("Building wheel distribution...")
        subprocess.run(["python", "-m", "build", "--wheel"], check=True)
        print("Wheel distribution built!")

    def build_platform_wheels(self):
        """Build platform-specific wheels with cibuildwheel."""
        print("Building platform wheels...")
        subprocess.run(
            ["cibuildwheel", "--output-dir", str(self.dist_dir)],
            check=True,
        )
        print("Platform wheels built!")

    def verify_distributions(self):
        """Verify the built distributions with twine."""
        print("Verifying distributions...")
        for dist_file in self.dist_dir.glob("*"):
            print(f"  {dist_file.name} ({dist_file.stat().st_size:,} bytes)")
        subprocess.run(["twine", "check", "dist/*"], check=True)
        print("All distributions verified!")


# Let's use it!
builder = PackageBuilder("awesome-package")
builder.clean_build()
builder.build_sdist()
builder.build_wheel()
builder.verify_distributions()
Try it yourself: Add support for building conda packages!
Example 2: Automated Release Pipeline
Let's create a GitHub Actions workflow:
# .github/workflows/publish.yml
name: Publish Python Package

on:
  release:
    types: [published]
  workflow_dispatch:
    inputs:
      test_pypi:
        description: 'Publish to Test PyPI first?'
        required: true
        default: 'true'

jobs:
  build-and-test:
    name: Build and Test
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.8', '3.9', '3.10', '3.11']
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
        with:
          fetch-depth: 0  # full history for versioning

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -e .[dev]
          pip install build twine

      - name: Run tests
        run: |
          pytest --cov=my_package --cov-report=xml
          echo "Tests passed on Python ${{ matrix.python-version }}!"

      - name: Upload coverage
        if: matrix.python-version == '3.11'
        uses: codecov/codecov-action@v3
        with:
          file: ./coverage.xml

  publish:
    name: Publish Package
    needs: build-and-test
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'

      - name: Build distributions
        run: |
          pip install build
          python -m build
          echo "Distributions built!"

      - name: Publish to Test PyPI
        if: github.event.inputs.test_pypi == 'true'
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.TEST_PYPI_API_TOKEN }}
        run: |
          twine upload --repository testpypi dist/*
          echo "Published to Test PyPI!"

      - name: Publish to PyPI
        if: github.event_name == 'release'
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}
        run: |
          twine upload dist/*
          echo "Published to PyPI!"
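One guard worth adding before the upload steps is a check that the release tag actually matches the version baked into the freshly built distributions; a mismatch is a common cause of confusing uploads. Here is a minimal sketch of such a check. The check_tag.py filename is a hypothetical helper you would call in the publish job after `python -m build`, passing GitHub's built-in GITHUB_REF_NAME variable.
# check_tag.py - fail the release if the git tag and the built version disagree
# Usage in CI (sketch): python check_tag.py "$GITHUB_REF_NAME"
import sys
from pathlib import Path

tag = sys.argv[1].lstrip("v")  # "v2.0.0" -> "2.0.0"
wheels = sorted(Path("dist").glob("*.whl"))
if not wheels:
    sys.exit("No wheels in dist/ - run `python -m build` first")

# Wheel filenames follow the pattern: name-version-pythontag-abitag-platformtag.whl
built_version = wheels[0].name.split("-")[1]
if built_version != tag:
    sys.exit(f"Tag {tag} does not match built version {built_version}")
print(f"Tag and built version agree: {built_version}")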
Example 3: Package Security and Signing
Let's implement package signing:
# secure_publish.py - secure package publishing
import hashlib
import subprocess
from pathlib import Path

import gnupg  # provided by the python-gnupg package


class SecurePublisher:
    def __init__(self, gpg_key_id: str):
        self.gpg = gnupg.GPG()
        self.key_id = gpg_key_id
        self.dist_dir = Path("dist")

    def generate_checksums(self):
        """Generate SHA-256 and SHA-512 checksums for the distributions."""
        print("Generating checksums...")
        checksums = {}
        for dist_file in self.dist_dir.glob("*"):
            if dist_file.name.endswith((".whl", ".tar.gz")):
                # Calculate multiple hash types
                sha256 = hashlib.sha256()
                sha512 = hashlib.sha512()
                with open(dist_file, "rb") as f:
                    while chunk := f.read(8192):
                        sha256.update(chunk)
                        sha512.update(chunk)
                checksums[dist_file.name] = {
                    "sha256": sha256.hexdigest(),
                    "sha512": sha512.hexdigest(),
                }
                print(f"  {dist_file.name}")
        # Write the checksums file alongside the distributions
        with open(self.dist_dir / "CHECKSUMS.txt", "w") as f:
            for filename, hashes in checksums.items():
                f.write(f"# {filename}\n")
                f.write(f"SHA256: {hashes['sha256']}\n")
                f.write(f"SHA512: {hashes['sha512']}\n\n")
        print("Checksums generated!")
        return checksums

    def sign_distributions(self):
        """Create a detached GPG signature for each distribution."""
        print("Signing distributions...")
        for dist_file in self.dist_dir.glob("*"):
            if dist_file.name.endswith((".whl", ".tar.gz")):
                with open(dist_file, "rb") as f:
                    signed = self.gpg.sign_file(
                        f,
                        keyid=self.key_id,
                        detach=True,
                        output=f"{dist_file}.asc",
                    )
                if signed:
                    print(f"  Signed: {dist_file.name}")
                else:
                    print(f"  Failed to sign: {dist_file.name}")

    def verify_signatures(self):
        """Verify every detached signature against its distribution."""
        print("Verifying signatures...")
        for sig_file in self.dist_dir.glob("*.asc"):
            dist_file = sig_file.with_suffix("")
            with open(sig_file, "rb") as f:
                verified = self.gpg.verify_file(f, str(dist_file))
            if verified:
                print(f"  Valid: {dist_file.name}")
            else:
                print(f"  Invalid: {dist_file.name}")

    def secure_upload(self):
        """Upload the distributions with signatures to PyPI."""
        print("Uploading to PyPI...")
        subprocess.run(
            ["twine", "upload", "--sign", "--identity", self.key_id, "dist/*"],
            check=True,
        )
        print("Secure upload complete!")


# Let's use it!
publisher = SecurePublisher("YOUR_GPG_KEY_ID")
publisher.generate_checksums()
publisher.sign_distributions()
publisher.verify_signatures()
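A caveat on the upload step: PyPI stopped accepting uploaded PGP signatures in 2023, and newer twine releases have removed or deprecated the --sign option, so today the signatures and CHECKSUMS.txt are more commonly attached to a GitHub Release or your own download page than sent to PyPI. Wherever they end up, consumers need a way to check them; here is a small, illustrative verifier for the CHECKSUMS.txt format written by generate_checksums() above.
# verify_download.py - check a downloaded artifact against CHECKSUMS.txt
import hashlib
from pathlib import Path


def verify_against_checksums(artifact: Path, checksums_file: Path) -> bool:
    """Return True when the artifact's SHA-256 matches the recorded value."""
    recorded = None
    current_name = None
    for line in checksums_file.read_text().splitlines():
        if line.startswith("# "):
            current_name = line[2:].strip()
        elif line.startswith("SHA256:") and current_name == artifact.name:
            recorded = line.split(":", 1)[1].strip()
    if recorded is None:
        raise ValueError(f"No SHA256 entry for {artifact.name}")
    actual = hashlib.sha256(artifact.read_bytes()).hexdigest()
    return actual == recorded


# Hypothetical usage once dist/ and CHECKSUMS.txt exist:
# verify_against_checksums(Path("dist/awesome_package-2.0.0-py3-none-any.whl"),
#                          Path("dist/CHECKSUMS.txt"))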
Advanced Concepts
Dynamic Versioning with Git Tags
When you're ready to level up, try automatic versioning:
# setup.py with dynamic versioning
import subprocess

from setuptools import setup, find_packages


def get_version_from_git():
    """Derive the package version from git tags."""
    try:
        # Get the latest tag
        tag = subprocess.check_output(
            ["git", "describe", "--tags", "--abbrev=0"],
            stderr=subprocess.DEVNULL,
        ).decode().strip()
        # Check whether we are on the tagged commit
        tag_commit = subprocess.check_output(
            ["git", "rev-list", "-n", "1", tag],
            stderr=subprocess.DEVNULL,
        ).decode().strip()
        current_commit = subprocess.check_output(
            ["git", "rev-parse", "HEAD"],
            stderr=subprocess.DEVNULL,
        ).decode().strip()
        if tag_commit == current_commit:
            # Clean tag version
            return tag.lstrip("v")
        else:
            # Development version
            commits_since = subprocess.check_output(
                ["git", "rev-list", f"{tag}..HEAD", "--count"],
                stderr=subprocess.DEVNULL,
            ).decode().strip()
            short_hash = current_commit[:7]
            base_version = tag.lstrip("v")
            return f"{base_version}.dev{commits_since}+g{short_hash}"
    except subprocess.CalledProcessError:
        # Fallback version when no tag is available
        return "0.0.1.dev0"


# Use in setup
setup(
    name="awesome-package",
    version=get_version_from_git(),
    # ... rest of configuration
)
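To make the scheme concrete, here is roughly what the helper returns (the tag and hash values are hypothetical):
# Illustrative behaviour of get_version_from_git():
#   on the tagged commit v2.1.0       -> "2.1.0"
#   5 commits after the tag            -> "2.1.0.dev5+g1a2b3c4"
#   no tags in the repository at all   -> "0.0.1.dev0"
print(get_version_from_git())
In practice, setuptools-scm implements this same tag-based versioning for you; the exercise solution later in this tutorial uses it via the dynamic = ["version"] setting.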
Advanced Dependency Management
For complex dependency scenarios:
# smart_dependencies.py
import platform
import sys
from typing import Dict, List


class DependencyManager:
    """Smart dependency resolution."""

    @staticmethod
    def get_platform_deps() -> List[str]:
        """Platform-specific dependencies."""
        deps = []
        # Windows-specific
        if platform.system() == "Windows":
            deps.extend([
                "pywin32>=300",
                "windows-curses>=2.0",
            ])
        # Linux-specific
        elif platform.system() == "Linux":
            deps.extend([
                "python-daemon>=2.0",
            ])
        # macOS-specific
        elif platform.system() == "Darwin":
            deps.extend([
                "pyobjc-core>=8.0",
            ])
        return deps

    @staticmethod
    def get_python_version_deps() -> List[str]:
        """Python version-specific dependencies."""
        deps = []
        # Python 3.8 needs backports
        if sys.version_info < (3, 9):
            deps.extend([
                "importlib-metadata>=4.0",
                "typing-extensions>=4.0",
            ])
        # Python 3.11+ optimizations
        if sys.version_info >= (3, 11):
            deps.extend([
                "speedups>=1.0",  # hypothetical C extensions
            ])
        return deps

    @staticmethod
    def get_optional_deps() -> Dict[str, List[str]]:
        """Optional feature dependencies."""
        return {
            "async": [
                "aiohttp>=3.8",
                "asyncpg>=0.27",
                "aiofiles>=22.0",
            ],
            "ml": [
                "numpy>=1.20",
                "scikit-learn>=1.0",
                "pandas>=1.3",
            ],
            "viz": [
                "matplotlib>=3.5",
                "seaborn>=0.12",
                "plotly>=5.0",
            ],
            "all": [
                # Everything!
                "aiohttp>=3.8",
                "numpy>=1.20",
                "matplotlib>=3.5",
            ],
        }


# Use when generating pyproject.toml content
manager = DependencyManager()
all_deps = (
    ["requests>=2.28", "click>=8.0"]
    + manager.get_platform_deps()
    + manager.get_python_version_deps()
)
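Note that runtime checks like these only run when the package is built or installed from source; for wheels, the standard way to ship the same logic is PEP 508 environment markers in the dependency list, which pip evaluates at install time. Here is a sketch of the equivalent markers for the packages above.
# The same conditions expressed as static PEP 508 environment markers.
# These strings can go directly in the `dependencies` list of pyproject.toml.
marker_deps = [
    'pywin32>=300; sys_platform == "win32"',
    'windows-curses>=2.0; sys_platform == "win32"',
    'python-daemon>=2.0; sys_platform == "linux"',
    'pyobjc-core>=8.0; sys_platform == "darwin"',
    'importlib-metadata>=4.0; python_version < "3.9"',
    'typing-extensions>=4.0; python_version < "3.9"',
]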
Common Pitfalls and Solutions
Pitfall 1: Broken Installation
# Wrong - missing package data
from setuptools import setup, find_packages

setup(
    name="my-package",
    packages=find_packages(),
    # Static files are not included!
)

# Correct - include all necessary files
setup(
    name="my-package",
    packages=find_packages(),
    package_data={
        "my_package": [
            "data/*.json",
            "templates/*.html",
            "static/**/*",
        ],
    },
    include_package_data=True,  # use MANIFEST.in
)
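A quick way to catch this pitfall before uploading is to open the built wheel (it is just a zip archive) and confirm the data files actually made it in. A minimal sketch, assuming `python -m build` has already produced a wheel in dist/:
# inspect_wheel.py - list a built wheel's contents to confirm package data is included
import zipfile
from pathlib import Path

wheel = next(Path("dist").glob("*.whl"))
with zipfile.ZipFile(wheel) as zf:
    names = zf.namelist()
    for name in names:
        print(name)
    if not any(name.endswith(".json") for name in names):
        print("Warning: no JSON data files found in the wheel!")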
Pitfall 2: Version Conflicts
# Dangerous - overly strict pinning
dependencies = [
    "requests==2.28.0",  # locks users to an exact version
    "numpy==1.21.0",     # causes conflicts with other packages
]

# Safe - flexible version ranges
dependencies = [
    "requests>=2.28.0,<3.0",  # compatible range
    "numpy>=1.20,<2.0",       # pinned to the current major version
]
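If you want to sanity-check a range before publishing, the packaging library (which pip and setuptools already depend on) evaluates PEP 440 specifiers for you; a small sketch:
# check_ranges.py - evaluate PEP 440 version specifiers with the `packaging` library
from packaging.specifiers import SpecifierSet
from packaging.version import Version

spec = SpecifierSet(">=2.28.0,<3.0")
for candidate in ["2.28.0", "2.31.0", "3.0.0"]:
    status = "accepted" if Version(candidate) in spec else "excluded"
    print(candidate, status)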
Pitfall 3: Security Vulnerabilities
# Insecure - no verification
import hashlib
import requests

def download_dependency(url: str):
    response = requests.get(url)
    with open("dep.tar.gz", "wb") as f:
        f.write(response.content)  # no verification!

# Secure - verify checksums before saving
def secure_download(url: str, expected_hash: str):
    response = requests.get(url)
    # Verify the checksum
    actual_hash = hashlib.sha256(response.content).hexdigest()
    if actual_hash != expected_hash:
        raise ValueError("Checksum mismatch!")
    # Safe to save
    with open("dep.tar.gz", "wb") as f:
        f.write(response.content)
    print("Download verified!")
Best Practices
- Use Modern Tools: Embrace pyproject.toml and the build package
- Automate Everything: CI/CD for testing and releases
- Version Semantically: Follow SemVer principles
- Security First: Sign packages and verify dependencies
- Document Thoroughly: README, CHANGELOG, and API docs
- Test Extensively: Multiple Python versions and platforms
- Support Widely: Consider different operating systems and architectures
Hands-On Exercise
Challenge: Create a Professional Package
Build a complete Python package with:
Requirements:
- Modern pyproject.toml configuration
- Automated GitHub Actions workflow
- Dynamic versioning from git tags
- Package signing with GPG
- Sphinx documentation
- 90%+ test coverage
- Pre-commit hooks for quality
Bonus Points:
- Add multi-platform wheel building
- Implement automatic changelog generation
- Create a documentation site with Read the Docs
Solution
Click to see the solution
# Complete professional package setup!
# Project structure
professional_package/
├── src/
│   └── pro_package/
│       ├── __init__.py
│       ├── __main__.py        # CLI entry point
│       ├── core.py
│       ├── utils.py
│       └── py.typed           # Type hints marker
├── tests/
│   ├── conftest.py            # Pytest configuration
│   ├── test_core.py
│   └── test_integration.py
├── docs/
│   ├── conf.py
│   ├── index.rst
│   └── api.rst
├── .github/
│   ├── workflows/
│   │   ├── test.yml           # Test workflow
│   │   ├── publish.yml        # Release workflow
│   │   └── docs.yml           # Documentation workflow
│   └── dependabot.yml         # Dependency updates
├── .pre-commit-config.yaml    # Code quality hooks
├── pyproject.toml
├── MANIFEST.in
├── LICENSE
├── README.md
├── CHANGELOG.md
└── tox.ini                    # Multi-environment testing
# pyproject.toml
[build-system]
requires = ["setuptools>=61.0", "setuptools-scm[toml]>=7.0"]
build-backend = "setuptools.build_meta"
[project]
name = "professional-package"
dynamic = ["version"]  # version derived from git tags
description = "A professional Python package example"
readme = "README.md"
license = {text = "MIT"}
authors = [{name = "Pro Dev", email = "[email protected]"}]
classifiers = [
"Development Status :: 4 - Beta",
"Intended Audience :: Developers",
"License :: OSI Approved :: MIT License",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Typing :: Typed",
]
requires-python = ">=3.8"
dependencies = [
"click>=8.0",
"rich>=12.0",
"pydantic>=2.0",
]
[project.optional-dependencies]
dev = [
"pytest>=7.0",
"pytest-cov>=4.0",
"pytest-asyncio>=0.20",
"black>=23.0",
"ruff>=0.0.270",
"mypy>=1.0",
"pre-commit>=3.0",
"tox>=4.0",
]
docs = [
"sphinx>=5.0",
"sphinx-rtd-theme>=1.0",
"sphinx-click>=4.0",
"myst-parser>=0.18",
]
[project.scripts]
pro-cli = "pro_package.__main__:cli"
[project.urls]
Homepage = "https://github.com/prodev/professional-package"
Documentation = "https://professional-package.readthedocs.io"
Changelog = "https://github.com/prodev/professional-package/blob/main/CHANGELOG.md"
[tool.setuptools_scm]
write_to = "src/pro_package/_version.py"
[tool.pytest.ini_options]
testpaths = ["tests"]
addopts = "--cov=pro_package --cov-report=term-missing"
[tool.mypy]
python_version = "3.8"
warn_return_any = true
warn_unused_configs = true
disallow_untyped_defs = true
[tool.ruff]
target-version = "py38"
select = ["E", "F", "UP", "B", "SIM", "I"]
# tox.ini for multi-environment testing
[tox]
envlist = py{38,39,310,311}, docs, lint
isolated_build = True
[testenv]
deps = .[dev]
commands = pytest {posargs}
[testenv:docs]
deps = .[docs]
commands = sphinx-build -W -b html docs docs/_build
[testenv:lint]
deps = .[dev]
commands =
black --check src tests
ruff check src tests
mypy src
# Complete implementation!
print("Professional package ready for PyPI!")
Key Takeaways
You've learned so much! Here's what you can now do:
- Create professional packages with modern tools
- Automate releases with CI/CD pipelines
- Sign and secure your distributions
- Manage complex dependencies like a pro
- Build platform-specific wheels for better performance
Remember: Publishing to PyPI is about sharing your work with the world. Make it professional, secure, and user-friendly!
Next Steps
Congratulations! You've mastered advanced PyPI publishing!
Here's what to do next:
- Create your first professional package
- Set up automated workflows for your projects
- Explore package documentation with Sphinx
- Contribute to open source Python packages!
Remember: Every popular Python package started with someone learning to publish. Your package could be the next big thing!
Happy packaging!