A quick note before you start: this isn't a typical post; it's a Python crash course delivered to your inbox. Because it's long, services like Gmail may clip it, so I strongly recommend opening it in your browser instead.
Also, make sure to read the previous chapters before this one!
7) Private Sources and Internal Packages: Shipping Your SD-WAN SDK
In most networks of consequence, you’ll consume private artifacts (vendor SDK wheels, internal helpers) and you’ll eventually ship your own (an SD-WAN controller client, inventory library, or a CLI your Ops team runs). Poetry makes both sides predictable, as long as you wire sources, credentials, and promotion flows deliberately.
The pattern to aim for:
Declare sources in your project with safe priorities.
Keep credentials out of Git; pass them via environment or Poetry’s keychain.
Consume vendor/internal packages explicitly from the right index.
Build signed artifacts for your code and publish them via CI under change control.
Promote versions through gates (dev → staging → prod) in time with change windows.
Retain artifacts and lockfiles for audits and disaster recovery.
A. Add private sources (indexes) with safe priorities
You’ll typically have:
PyPI for public packages.
One or more internal registries (Artifactory/Nexus/Cloudsmith) for vendor wheels and your org’s packages.
Sometimes, a separate staging vs prod registry.
Add sources with Poetry and mark internal ones as explicit so resolution never “accidentally” falls back to them.
# Add internal prod index and mark it explicit:
poetry source add --priority explicit corp-prod https://artifactory.example.com/api/pypi/pypi/simple
# Optional: a staging index for pre-release testing:
poetry source add --priority explicit corp-staging https://artifactory.example.com/api/pypi/staging/simple
This writes entries under [[tool.poetry.source]] in pyproject.toml. Example:
[[tool.poetry.source]]
name = "pypi"
url = "https://pypi.org/simple"
priority = "primary"
[[tool.poetry.source]]
name = "corp-prod"
url = "https://artifactory.example.com/api/pypi/pypi/simple"
priority = "explicit"
[[tool.poetry.source]]
name = "corp-staging"
url = "https://artifactory.example.com/api/pypi/staging/simple"
priority = "explicit"
Why explicit? Poetry will only use that source when you explicitly target it for a dependency. That prevents a misconfigured registry from shadowing PyPI.
B. Keep credentials out of Git (env or Poetry config)
Use Poetry’s HTTP Basic auth, supplied at runtime via environment or a secure CI secret. Two easy options:
1) Environment variables (preferred in CI):
export POETRY_HTTP_BASIC_CORP_PROD_USERNAME=$CI_USER
export POETRY_HTTP_BASIC_CORP_PROD_PASSWORD=$CI_TOKEN
The env var names follow POETRY_HTTP_BASIC_<SOURCE_NAME>_USERNAME|PASSWORD, where the source name is uppercased and hyphens become underscores (corp-prod → CORP_PROD).
2) Poetry config on a developer machine (not committed):
poetry config http-basic.corp-prod "$USER" "$TOKEN"
# For staging too, if needed:
poetry config http-basic.corp-staging "$USER" "$TOKEN"
Don’t put creds in pyproject.toml. Don’t commit .pypirc with tokens. Use short-lived, scoped service-account tokens (read for install jobs; publish for release jobs) and rotate them like device credentials.
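The naming rule for those variables is mechanical: uppercase the source name and turn hyphens into underscores. A tiny helper makes it easy to generate the right names in CI templates (illustrative, not a Poetry API):

```python
def poetry_basic_auth_vars(source_name: str) -> tuple[str, str]:
    """Return the env var pair Poetry checks for a source's HTTP Basic auth.

    The source name is uppercased and hyphens become underscores,
    e.g. corp-prod -> POETRY_HTTP_BASIC_CORP_PROD_USERNAME / _PASSWORD.
    """
    key = source_name.replace("-", "_").upper()
    return (
        f"POETRY_HTTP_BASIC_{key}_USERNAME",
        f"POETRY_HTTP_BASIC_{key}_PASSWORD",
    )

print(poetry_basic_auth_vars("corp-prod"))
```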
C. Consume vendor wheels and internal packages explicitly
Point particular dependencies at the right source. This keeps your graph predictable and auditable.
[tool.poetry.dependencies]
python = "^3.11"
# public packages
httpx = "^0.27"
pydantic = "^2.8"
# internal SD-WAN SDK (from corp-prod index)
company-sdwan-sdk = { version = "^1.8", source = "corp-prod" }
# vendor wheel delivered via corp-prod mirror
vendor-sdwan-api = { version = "~=3.2.0", source = "corp-prod" }
Now Poetry will fetch those two only from corp-prod. If you want to test a release candidate first, temporarily point at staging:
company-sdwan-sdk = { version = "1.9.0-rc.3", source = "corp-staging" }
Once validated, switch the version (and source) in a PR, re-lock, and roll during the approved change window.
D. Build your own artifacts (wheel + sdist)
When your automation grows beyond a one-off script, package it. Poetry bakes the wheel and source tarball:
poetry build
# Produces dist/yourpkg-1.4.0-py3-none-any.whl and dist/yourpkg-1.4.0.tar.gz
Adopt semantic versioning:
PATCH: bugfixes, no API changes (1.4.1)
MINOR: backward-compatible features (1.5.0)
MAJOR: breaking changes (2.0.0)
Use pre-release tags (alpha/beta/rc) for gated testing: 1.6.0-rc.2
Set the version with:
poetry version patch|minor|major|<explicit>
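The bump rules are simple enough to sketch, a toy model of what poetry version does for final releases (pre-release tags ignored):

```python
def bump(version: str, part: str) -> str:
    """Bump a MAJOR.MINOR.PATCH version string (finals only, no rc/alpha tags)."""
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part}")

print(bump("1.4.0", "minor"))  # → 1.5.0
```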
E. Publish to your registry (manual and CI)
Manual (developer machine with publish rights). One catch: publish targets are configured separately from install sources; poetry publish -r corp-prod uses the URL from poetry config repositories.corp-prod, which must point at the registry's upload endpoint, not the /simple index.
# Publish to prod registry
poetry publish -r corp-prod
# Or to staging
poetry publish -r corp-staging
CI (GitHub Actions example) on a release tag:
name: release
on:
  push:
    tags: ['v*.*.*', 'v*.*.*-rc.*']
jobs:
  build-publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version-file: .python-version
      - uses: snok/install-poetry@v1
        with:
          virtualenvs-create: true
          virtualenvs-in-project: true
          installer-parallel: true
      - run: poetry version $(echo "${GITHUB_REF_NAME}" | sed 's/^v//')
      - run: poetry build
      - name: Publish to staging for RCs, prod for finals
        env:
          POETRY_HTTP_BASIC_CORP_STAGING_USERNAME: ${{ secrets.STAGING_USER }}
          POETRY_HTTP_BASIC_CORP_STAGING_PASSWORD: ${{ secrets.STAGING_TOKEN }}
          POETRY_HTTP_BASIC_CORP_PROD_USERNAME: ${{ secrets.PROD_USER }}
          POETRY_HTTP_BASIC_CORP_PROD_PASSWORD: ${{ secrets.PROD_TOKEN }}
        run: |
          if [[ "$GITHUB_REF_NAME" == *"-rc."* ]]; then
            poetry publish -r corp-staging
          else
            poetry publish -r corp-prod
          fi
This implements version gates: RCs land in staging; finals land in prod. Promotion is a merge/tag action, tied to a change window.
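The gate itself reduces to one decision, sketched here as a Python function mirroring the shell branch above (tag names are examples):

```python
def publish_target(tag: str) -> str:
    """Mirror the CI gate: RC tags land in staging, final tags in prod."""
    return "corp-staging" if "-rc." in tag else "corp-prod"

print(publish_target("v1.9.0-rc.3"))  # → corp-staging
```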
F. Promotion flows that match change control
Treat package promotion like you treat route-policy promotion:
Develop in feature branches; publish -alpha/-beta artifacts to staging.
Integrate and test in CI (unit + integration + device emulators).
Validate in a lab or canary environment during a staging window.
Promote by cutting a final tag (e.g., v1.8.0) that publishes to the prod index.
Roll to production bastions using locked installs (see Section 8).
Retain artifacts + lockfiles for audit/rollback.
Your poetry.lock at each consuming repo records exactly what was used when the change succeeded. Roll back by reverting the lockfile and redeploying—no guesswork.
G. Exporting for pip-only hosts and air-gapped installs
Many control planes (AWX/Ansible Controller, some bastions) are still pip-native. Use Poetry to export the resolved graph and pre-download wheels (poetry export ships with Poetry 1.x; on Poetry 2.x, install the poetry-plugin-export plugin first):
# On a build box with access to both PyPI and corp-prod:
poetry export -f requirements.txt --with-hashes -o requirements.lock.txt
pip download -r requirements.lock.txt -d wheelhouse/ \
--extra-index-url https://artifactory.example.com/api/pypi/pypi/simple
# Move wheelhouse/ and requirements.lock.txt to the target host (offline OK)
python -m venv .venv && source .venv/bin/activate
pip install --no-index --find-links wheelhouse/ --require-hashes -r requirements.lock.txt
This enforces the same resolved versions you tested, even without Poetry present.
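As a belt-and-suspenders check before shipping, you can verify every requirement line in the export carries a hash. This is a rough sketch, not pip's full parser; pip itself enforces the pins via --require-hashes:

```python
def all_hash_pinned(requirements_text: str) -> bool:
    """True if every requirement line in a pip export carries a --hash pin.

    Joins backslash continuations, then skips blanks, comments, and
    option lines (--index-url etc.). A sketch, not pip's full parser.
    """
    logical = requirements_text.replace("\\\n", " ")
    for line in logical.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or line.startswith("--"):
            continue
        if "--hash=" not in line:
            return False
    return True

sample = "httpx==0.27.0 \\\n    --hash=sha256:deadbeef\n# comment\n"
print(all_hash_pinned(sample))  # → True
```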
H. Security and supply-chain hardening (practical minimums)
Explicit sources for internal deps; don’t let resolution wander.
Hash-pinned exports (--with-hashes) for pip-only installs.
Read vs publish tokens with least privilege; short TTLs; rotate routinely.
SBOM: keep poetry.lock and built artifacts alongside change records.
Quarantine new vendor wheels in staging; scan before promotion.
Repro caches: retain wheels for each approved build to survive upstream yanks.
I. Practical troubleshooting
“Poetry won’t use my internal index.”
Ensure the dependency specifies source = "corp-prod" and that the source name matches exactly. With priority = "explicit", Poetry will ignore the index unless you target it.
“Credentials work locally but fail in CI.”
CI env var names must be the uppercase source names: POETRY_HTTP_BASIC_CORP_PROD_USERNAME|PASSWORD. Confirm the job actually needs publish rights (most jobs should be read-only).
“Vendor wheel is Linux-only; devs on macOS fail.”
Provide a dev-only shim group (e.g., mock clients) or containerized dev workflows. For cross-platform libraries, ask vendors for universal wheels or source dists.
“Accidental dependency drift on bastions.”
Never run poetry update there. Install from a known lock (export + wheelhouse) created by CI.
J. The enterprise payoff
With declared sources, clean credentials, and a gated publish flow, your SD-WAN SDK and related automation become real, first-class artifacts. Teams consume them with a single line, CI tests them under the same constraints, and change windows become about intended updates, not surprises from the public internet. Poetry gives you the levers; your process turns them into repeatable, auditable releases.
8) Reproducibility and Drift Control: Lockfiles, Updates, and Policy
In networking, we separate intent from state: you write a routing policy (intent), the devices compile it into a FIB (state), and you continuously monitor signals from multiple sources to confirm the intended behavior. The same discipline turns Python environments from “hope” into infrastructure. In the Poetry world:
pyproject.toml is your intent (what you mean to depend on, with version ranges and groups).
poetry.lock is the compiled state (the exact package versions and hashes that were proven to work).
Treating the lockfile as a first-class artifact is how dev, CI, and prod converge on the same dependency graph; no “it worked yesterday” surprises.
The Lockfile Is the Contract
When you run poetry install, Poetry prefers the lockfile. If it’s present, it will install exactly those versions, including transitive dependencies and hashes. That means:
Your laptop, the shared bastion, and the CI runner get byte-for-byte identical wheels.
Rollbacks are trivial (revert the lockfile).
Audits are possible (the lockfile is your dependency inventory/SBOM).
Air-gapped builds can be pre-fetched from the lock (covered below).
The moment you stop honoring the lock, you reintroduce drift.
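Reviewing lock changes is easier when the diff is reduced to package-to-version changes. A minimal sketch, assuming you've already parsed each lockfile into a {package: version} map (real lockfiles are TOML):

```python
def lock_diff(old: dict[str, str], new: dict[str, str]) -> list[str]:
    """Summarize added (+), removed (-), and changed (~) pins between two locks."""
    lines = []
    for name in sorted(old.keys() | new.keys()):
        before, after = old.get(name), new.get(name)
        if before == after:
            continue
        if before is None:
            lines.append(f"+ {name} {after}")
        elif after is None:
            lines.append(f"- {name} {before}")
        else:
            lines.append(f"~ {name} {before} -> {after}")
    return lines

print(lock_diff({"httpx": "0.27.0", "ncclient": "0.6.15"},
                {"httpx": "0.27.2", "pydantic": "2.8.0"}))
# → ['~ httpx 0.27.0 -> 0.27.2', '- ncclient 0.6.15', '+ pydantic 2.8.0']
```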
poetry update vs poetry lock --no-update (and when to use each)
Think of these as two different change procedures:
poetry update
What it does: Resolves newer versions within your constraints and rewrites poetry.lock. By default, it updates everything; you can target specific packages: poetry update ncclient httpx.
When to use: During planned renovations; on a schedule, with CI and canary tests. It’s like widening a prefix list: safe when you can observe the blast radius.
poetry lock --no-update
What it does: Re-compiles the lockfile without bumping versions. It refreshes metadata and markers to match the current pyproject.toml and environment (e.g., after you changed Python constraints, added a source, or toggled groups), but it keeps the same resolved versions wherever possible. (Note: in Poetry 2.x, plain poetry lock no longer bumps versions by default, so the --no-update flag was removed.)
When to use: After non-functional changes to project config (Python marker tweaks, source priority changes) or to re-normalize the lock without pulling newer deps. It’s like re-rendering a config template with the same inputs.
Rules of thumb
Day-to-day: commit poetry.lock and do not run poetry update casually on bastions or during change windows.
Renovations: run poetry update in a branch, let CI and integration tests burn it down, and merge only with evidence.
Configuration edits (not dependency bumps): poetry lock --no-update to keep state stable.
A Renovation Strategy That Won’t Blow Up a Change Window
You already manage versions for firmware and route policies; treat dependencies the same way.
1) Cadence
Weekly or bi-weekly for minor bumps in dev branches (faster iteration environments).
Monthly or tied to change windows for production stacks (bastions, AWX/Controller images).
2) Scope
Prefer targeted updates: poetry update httpx pydantic rather than the whole world.
Run a full poetry update quarterly (or before major releases) with extra scrutiny.
3) Testing
CI runs unit + integration tests against device emulators or lab endpoints.
A vendor matrix (--with junos|cisco|arista) isolates failures to a stack.
Canary run on a lab or low-risk site before global rollout.
4) Risk controls
Major versions behind a feature flag or in a separate branch until validated.
Keep a wheelhouse of last-known-good artifacts so rollbacks are instant.
Lockfile diffs are reviewed like code (humans must see “what changed”).
5) Governance
One person (or bot) is the build farmer for dependency bumps; others review.
No direct commits to main that alter the lock; always via PR with green CI.
Exporting for Non-Poetry Hosts (AWX/Ansible Controller, Air-Gapped Bastions)
Some run targets are pip-native. You can still keep reproducibility by exporting from the lockfile and pinning hashes:
# On a build box (with network access):
poetry export -f requirements.txt --with-hashes \
--with junos \
-o requirements.lock.txt
# Pre-fetch the exact wheels (optional but ideal for air-gapped hosts):
pip download -r requirements.lock.txt -d wheelhouse/
# On the target (no Poetry required):
python -m venv .venv && source .venv/bin/activate
pip install --no-index --find-links wheelhouse/ --require-hashes -r requirements.lock.txt
Notes:
Use --with/--without/--only to export the exact groups needed (e.g., junos only for a Juniper bastion).
--require-hashes forces pip to match the exported hashes; no silent substitution.
Practical Playbooks (copy/paste)
Rebuild lockfile after changing Python marker, same versions:
# e.g., you changed python = "^3.11" → "^3.11, <3.13"
poetry lock --no-update
git add poetry.lock && git commit -m "re-lock without bumps"
Bump a single dependency (safe, targeted):
poetry update ncclient
poetry install
git add poetry.lock && git commit -m "update: ncclient to latest within constraints"
Full renovation in a branch (with matrix CI):
git switch -c deps/2025-09-rotation
poetry update
poetry install
# push → PR → CI runs vendor/test matrices
Roll forward/roll back:
# Roll forward by merging the PR that changes poetry.lock.
# Roll back by git-reverting that commit (lockfile reverts too) and re-deploying.
Policy: How We Keep Environments Boring
Adopt these as team standards (and put them in your CONTRIBUTING.md):
poetry.lock is mandatory and must be committed for every repo.
Never run poetry update on production hosts; deployments install from the existing lock or an exported lockfile.
Dependency changes land via PRs only, with lockfile diffs reviewed and CI green.
Renovations have a cadence; majors require feature flags or canaries.
Exports (poetry export --with-hashes) are used for pip-only systems; wheelhouses are retained per release.
The lockfile is the source of truth for audits and DR; retain it with change records and artifacts.
The Operator’s View: Intent vs. State Loop
You declare intent in pyproject.toml.
Poetry compiles that intent into poetry.lock.
Dev, CI, and prod install from the lock, so state matches intent.
Periodically, you renovate, updating intent and recompiling the lock in a controlled, test-rich loop.
If reality bites, you revert state (the lockfile) and instantly regain a known-good environment.
It’s the same loop you use to keep networks stable: explicit policy, compiled state, measured changes, and quick rollback. With that mindset, dependencies stop being an ambient risk and become just another piece of infrastructure you run, quietly, predictably, and on purpose.
9) CI/CD Integration That Actually Matches Your Laptop
Your pipelines should compile and run the same interpreter and the same dependency graph you used locally, otherwise CI isn’t validation, it’s roulette. The recipe is simple:
Read Python from your repo (prefer .python-version; fall back to pyproject.toml’s constraint).
Install Poetry in a repeatable way, with in-project virtualenvs.
Cache what’s safe to cache (Poetry cache and pip wheels), not what’s fragile (entire venvs).
Install from the lockfile, not the internet’s mood.
Produce artifacts for environments that can’t run Poetry (AWX/Controller, air-gapped bastions).
Below are sample templates for GitHub Actions, GitLab CI, and Jenkins. Each has two modes:
Baseline: run on the exact Python in .python-version (mirrors your laptop/jump host).
Compatibility matrix: optionally test on multiple Pythons (e.g., 3.9 and 3.11) to keep legacy playbooks honest while you develop on 3.11+.
Note: poetry install --no-root skips installing the current project package into the venv. Use it for script-only repos. If your tests import your package (library/CLI), omit --no-root so Poetry installs your package in editable mode.
GitHub Actions
A. Baseline job (match .python-version)
name: ci
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # 1) Interpreter: exactly what your repo pins
      - uses: actions/setup-python@v5
        with:
          python-version-file: .python-version  # falls back to default if absent
      # 2) Poetry (clean, isolated)
      - uses: snok/install-poetry@v1
        with:
          virtualenvs-create: true
          virtualenvs-in-project: true
          installer-parallel: true
      # 3) Caches keyed on the lockfile (safe to share)
      - name: Cache Poetry
        uses: actions/cache@v4
        with:
          path: ~/.cache/pypoetry
          key: poetry-${{ runner.os }}-${{ hashFiles('poetry.lock') }}
      - name: Cache pip wheels
        uses: actions/cache@v4
        with:
          path: ~/.cache/pip
          key: pip-${{ runner.os }}-${{ hashFiles('poetry.lock') }}
      # 4) Install from lock (no surprises)
      - run: poetry install --with test --no-interaction
      # 5) Run tests
      - run: poetry run pytest -q
B. Compatibility matrix (e.g., 3.9 and 3.11)
jobs:
  matrix-test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.9', '3.11']
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - uses: snok/install-poetry@v1
        with:
          virtualenvs-create: true
          virtualenvs-in-project: true
          installer-parallel: true
      - uses: actions/cache@v4
        with:
          path: ~/.cache/pypoetry
          key: poetry-${{ runner.os }}-${{ matrix.python-version }}-${{ hashFiles('poetry.lock') }}
      - uses: actions/cache@v4
        with:
          path: ~/.cache/pip
          key: pip-${{ runner.os }}-${{ matrix.python-version }}-${{ hashFiles('poetry.lock') }}
      - run: poetry install --with test --no-interaction
      - run: poetry run pytest -q
C. Export artifacts for AWX/Controller (pip-only targets)
export-artifacts:
  runs-on: ubuntu-latest
  needs: [test]  # only export if tests passed
  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-python@v5
      with:
        python-version-file: .python-version
    - uses: snok/install-poetry@v1
      with:
        virtualenvs-create: false
    - name: Export lock (runtime only, e.g., junos stack)
      run: |
        poetry export -f requirements.txt --with-hashes --with junos \
          -o requirements.lock.txt
    - name: Build wheelhouse (exact wheels for air-gapped)
      run: |
        python -m pip install --upgrade pip
        pip download -r requirements.lock.txt -d wheelhouse/
    - name: Upload artifacts
      uses: actions/upload-artifact@v4
      with:
        name: pip-artifacts
        path: |
          requirements.lock.txt
          wheelhouse/
You can later download these artifacts in a deployment job (or manually) and run:
python -m venv .venv && source .venv/bin/activate
pip install --no-index --require-hashes --find-links wheelhouse/ -r requirements.lock.txt
GitLab CI
.gitlab-ci.yml:
stages: [test, export]

variables:
  PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
  POETRY_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pypoetry"

cache:
  key:
    files:
      - poetry.lock
  paths:
    - .cache/pip
    - .cache/pypoetry

test:
  stage: test
  image: python:3.11-slim
  before_script:
    - python -m pip install --upgrade pip pipx
    - python -m pipx ensurepath
    - pipx install poetry
    - poetry config virtualenvs.in-project true --local
  script:
    - poetry install --with test --no-interaction
    - poetry run pytest -q
  artifacts:
    when: on_failure
    paths:
      - .venv
      - .pytest_cache
      - junit.xml
    expire_in: 1 week

matrix_compat:
  stage: test
  parallel:
    matrix:
      - PY: "3.9"
      - PY: "3.11"
  image: python:${PY}-slim
  before_script:
    - python -m pip install --upgrade pip pipx
    - pipx install poetry
    - poetry config virtualenvs.in-project true --local
  script:
    - poetry install --with test --no-interaction
    - poetry run pytest -q

export_awx:
  stage: export
  image: python:3.11-slim
  needs: ["test"]
  script:
    - pip install --upgrade pip pipx
    - pipx install poetry
    - poetry export -f requirements.txt --with-hashes --with junos -o requirements.lock.txt
    - pip download -r requirements.lock.txt -d wheelhouse/
  artifacts:
    name: "pip-artifacts-$CI_COMMIT_SHORT_SHA"
    paths:
      - requirements.lock.txt
      - wheelhouse/
    expire_in: 2 weeks
Jenkins (Declarative Pipeline)
If your agents don’t have Pyenv, the easiest way to align versions is to run in Python Docker images. If you do have Pyenv on agents, you can read .python-version and pyenv install -s before building.
A. Matrix with Docker agents (portable)
pipeline {
  agent none
  options { timestamps() }
  stages {
    stage('Matrix') {
      matrix {
        axes {
          axis {
            name 'PY'
            values '3.9', '3.11'
          }
        }
        agent { docker { image "python:${PY}-slim" } }
        stages {
          stage('Install Poetry') {
            steps {
              sh '''
                python -m pip install --upgrade pip pipx
                python -m pipx ensurepath
                pipx install poetry
                poetry config virtualenvs.in-project true --local
              '''
            }
          }
          stage('Install deps & Test') {
            steps {
              sh '''
                poetry install --with test --no-interaction
                poetry run pytest -q
              '''
            }
          }
        }
      }
    }
    stage('Export for AWX') {
      agent { docker { image 'python:3.11-slim' } }
      steps {
        sh '''
          python -m pip install --upgrade pip pipx
          pipx install poetry
          poetry export -f requirements.txt --with-hashes --with junos -o requirements.lock.txt
          pip download -r requirements.lock.txt -d wheelhouse/
        '''
      }
      post {
        success {
          archiveArtifacts artifacts: 'requirements.lock.txt, wheelhouse/**', fingerprint: true
        }
      }
    }
  }
}
B. Using Pyenv on Jenkins agents (if available)
pipeline {
  agent any
  stages {
    stage('Setup') {
      steps {
        sh '''
          PYVER=$(cat .python-version)
          pyenv install -s "$PYVER"
          pyenv local "$PYVER"
          python -m pip install --user pipx
          python -m pipx ensurepath || true
          ~/.local/bin/pipx install poetry
          ~/.local/bin/poetry config virtualenvs.in-project true --local
        '''
      }
    }
    stage('Test') {
      steps {
        sh '''
          ~/.local/bin/poetry install --with test --no-interaction
          ~/.local/bin/poetry run pytest -q
        '''
      }
    }
  }
}
Caching: what to cache (and what not to)
✅ Safe to cache
Poetry’s cache: ~/.cache/pypoetry
Pip’s cache: ~/.cache/pip
Downloaded wheelhouse used for deploys/air-gapped installs
❌ Risky to cache across commits
The entire .venv/; it’s brittle across lockfile changes and platform updates. Recreate from lock; it’s fast with good caches.
Cache keys should include a hash of poetry.lock and, for matrices, the Python version.
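That keying rule can be sketched as a small helper (the function and names are illustrative, not part of any CI API):

```python
import hashlib

def cache_key(os_name: str, python_version: str, lock_bytes: bytes) -> str:
    """Key caches on OS + Python + a digest of poetry.lock, so any
    lockfile change produces a fresh cache entry."""
    digest = hashlib.sha256(lock_bytes).hexdigest()[:12]
    return f"poetry-{os_name}-{python_version}-{digest}"

print(cache_key("Linux", "3.11", b'[[package]]\nname = "httpx"\n'))
```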
Ensuring pipelines mirror your laptop/jump host
Interpreter: prefer .python-version as the single source of truth for the exact version (e.g., 3.11.9). Use matrices only when you purposefully support multiple versions.
Dependencies: install from poetry.lock. Never poetry update in CI on mainline; run updates in branches with reviews.
Artifacts: always export a hashed requirements file and a wheelhouse when your deployment target lacks Poetry.
When you wire CI this way, you get the same property you chase in production networks: determinism. The code you tested is the code you deploy, on the interpreter you intended, with the dependencies you locked. No more “CI is green, but prod is weird.”
10) Security & Supply-Chain Hardening for NetOps
If BGP has taught us anything, it’s this: never assume the path is trustworthy just because the packet arrived. Treat Python packages the same way. Your automation runs inside a dependency graph you didn’t author and can’t fully audit on the fly. The antidote is policy and plumbing that make installs deterministic, inspectable, and local, so you’re not pulling random wheels from the internet during a change window.
Below is a pragmatic hardening playbook that maps cleanly to how network engineers already operate.
1) Pin like you mean it: lockfile + hashes (intent vs. state)
Commit the lockfile (poetry.lock). It is the contract between dev, CI, and prod.
Install from the lock, not from floating constraints: poetry install --no-interaction.
For pip-only targets (AWX/Controller, jump hosts), export with hashes and enforce them:
# On a build box
poetry export -f requirements.txt --with-hashes \
--with junos \
-o requirements.lock.txt
# (Optional but recommended) Pre-fetch wheels for offline/DR use
pip download -r requirements.lock.txt -d wheelhouse/
# On the target host (no Poetry required)
python -m venv .venv && source .venv/bin/activate
pip install --no-index --require-hashes \
--find-links wheelhouse/ \
-r requirements.lock.txt
Hashes make tampering and “silent swaps” fail closed.
Governance tip: Make pyproject.toml and poetry.lock code-owned files; PRs that change them must get explicit review, just like a route-policy change.
2) Prefer internal mirrors; use explicit sources only in regulated environments
Keep production hosts off the public internet during installs. Mirror PyPI and host vendor wheels internally (Artifactory/Nexus/Cloudsmith), then declare sources with safe priorities:
[[tool.poetry.source]]
name = "pypi"
url = "https://pypi.org/simple"
priority = "primary"
[[tool.poetry.source]]
name = "corp-prod"
url = "https://artifactory.example.com/api/pypi/pypi/simple"
priority = "explicit" # only used when a dependency says source="corp-prod"
Target internal packages explicitly:
company-sdwan-sdk = { version = "^1.8", source = "corp-prod" }
vendor-sdwan-api = { version = "~=3.2.0", source = "corp-prod" }
On locked-down hosts, go further:
Set priority = "explicit" for all non-PyPI sources.
In prod, set index-url to your mirror and block egress:
# ~/.config/pip/pip.conf (or /etc/pip.conf)
[global]
index-url = https://artifactory.example.com/api/pypi/pypi/simple
timeout = 30
…and avoid accidental fallbacks by installing from exports with hashes (above). For truly air-gapped flows, use only --no-index + --find-links wheelhouse/.
3) Air-gapped & DR flows you can rehearse
You need a repeatable way to restore automation after an outage or to build in sites without internet.
Build box (connected):
poetry build # your own wheel(s)
poetry export -f requirements.txt --with-hashes -o requirements.lock.txt
pip download -r requirements.lock.txt -d wheelhouse/
tar -czf artifacts.tgz requirements.lock.txt wheelhouse/ dist/
Target site (disconnected):
tar -xzf artifacts.tgz
python -m venv .venv && source .venv/bin/activate
pip install --no-index --require-hashes --find-links wheelhouse/ -r requirements.lock.txt
pip install --no-index --find-links dist/ yourpkg-*.whl
Keep a per-release wheelhouse and lockfile alongside your change record. That’s your “golden image” for the automation plane.
4) Least-privilege tokens & non-interactive CI publishing
Treat credentials like device keys:
Read vs. Publish tokens: most jobs only need read to install; a dedicated release job holds the publish secret.
Short-lived or OIDC/workload-identity tokens where possible; never commit API tokens, and never store them in pyproject.toml.
CI passes creds via environment, not files; Poetry reads them as HTTP Basic for a given source:
# CI environment variables (names must match source names uppercased)
export POETRY_HTTP_BASIC_CORP_PROD_USERNAME="$CI_USER"
export POETRY_HTTP_BASIC_CORP_PROD_PASSWORD="$CI_TOKEN"
poetry publish -r corp-prod
Release pipeline pattern:
Tag vX.Y.Z → CI builds wheels (poetry build).
CI publishes to staging for -rc.* tags; publishes to prod for final tags.
Promotion is a merge/tag action aligned with a change window.
Store published artifact digests + the exact poetry.lock used.
5) Quarantine & promotion: don’t inject unknown routes
New or updated dependencies get quarantined in a staging index, validated in CI and a lab/canary, then promoted:
Dev/RC: company-sdwan-sdk = { version = "1.9.0-rc.2", source = "corp-staging" }
After validation: bump to 1.9.0 and switch source = "corp-prod" in a PR.
Lock/exports are regenerated and shipped with the change.
This mirrors how you treat new route-policies: test in a small domain, then roll.
6) Visibility & scanning (defense-in-depth)
Keep the lockfile as your SBOM of record (store with the release).
Run a dependency vulnerability scan in CI (pick a tool compatible with Poetry/exports).
For containerized runners, pin base images (python:3.11.X-slim) to avoid surprise libc/openssl bumps.
Log who changed the lockfile, what changed (diff), when it was promoted, and which wheels were used (artifact digests).
7) Policy-as-code guardrails
CODEOWNERS for pyproject.toml and poetry.lock.
Require green CI (tests + export + wheelhouse) before merging dependency changes.
Pre-commit hooks to prevent committing secrets and to remind engineers to re-lock when pyproject.toml changes.
No poetry update on prod hosts; installs must originate from a reviewed lock or exported hash-pinned file.
8) Troubleshooting (fail closed, not open)
Install pulls from the internet in prod: Verify pip config and ensure you’re using --no-index and a local wheelhouse/. Check that the job doesn’t run poetry update.
Package not found on internal index: Confirm the dependency’s source = "corp-prod" and that the artifact exists in that repo (correct name/version).
Hash mismatch on target: Your artifacts and lockfile are out of sync. Re-export from the same lock, rebuild the wheelhouse, and redeploy both together.
The hardening mindset
You don’t trust a BGP feed without policy and monitoring. Don’t trust Python dependencies without pinning, locality, and promotion controls. Lockfiles + hashes stop drift. Internal mirrors and explicit sources control egress and provenance. Air-gapped wheelhouses make recovery boring. Least-privilege tokens and non-interactive publishing remove humans from the blast radius.
Do this, and your automation stops being a collection of helpful scripts and becomes a reliable system: secure by default, reproducible on demand, and operated with the same discipline you bring to the network itself.
While completing this article, part of the Poetry series, I came across some extra, genuinely helpful material for those diving deep into network automation and treating it as living systems rather than a pile of scripts and snippets, exactly the mindset I'm trying to teach you here. Rather than squeeze it in, I'm saving that material for the next edition.
See you then!
Leonardo Furtado

