- Remove "All Jobs" title
- Move Status column to front of table
- Add Cancel button for in-progress jobs
- Add cancel endpoint: POST /pypi/cache/cancel/{package_name}
- Add btn-danger CSS styling
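A minimal sketch of the cancel endpoint, assuming a FastAPI router; mark_cancelled() and the response shape are placeholders for the real queue helper:

```python
from fastapi import APIRouter, HTTPException

router = APIRouter()

@router.post("/pypi/cache/cancel/{package_name}")
def cancel_cache_task(package_name: str) -> dict:
    # mark_cancelled() stands in for the worker-queue helper that flips any
    # pending/in-progress rows for this package to 'cancelled'.
    cancelled = mark_cancelled(package_name)
    if cancelled == 0:
        raise HTTPException(status_code=404, detail="no active task for package")
    return {"package": package_name, "cancelled": cancelled}
```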
- Download packages in 64KB chunks to temp file instead of loading into memory
- Upload to S3 from temp file (streaming)
- Clean up temp file after processing
- Reduces memory footprint from 2x file size to 1x file size
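Roughly how the streaming path could look, assuming a boto3-style S3 client; function and parameter names are illustrative:

```python
import os
import tempfile

import httpx

CHUNK_SIZE = 64 * 1024  # 64KB

def download_and_store(url: str, s3_client, bucket: str, key: str) -> None:
    # Stream the upstream response into a temp file in 64KB chunks so the
    # whole package never has to sit in memory.
    fd, tmp_path = tempfile.mkstemp(suffix=".pkg")
    try:
        with os.fdopen(fd, "wb") as tmp, \
             httpx.stream("GET", url, follow_redirects=True) as resp:
            resp.raise_for_status()
            for chunk in resp.iter_bytes(CHUNK_SIZE):
                tmp.write(chunk)
        # Upload straight from disk; the client streams the open file handle.
        with open(tmp_path, "rb") as f:
            s3_client.upload_fileobj(f, bucket, key)
    finally:
        os.unlink(tmp_path)  # always clean up the temp file
```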
- Add overall progress bar showing completed/active/failed counts
- Unify all job types into single table with Type column
- Simplify status to Working/Pending/Failed badges
- Remove NPM "Coming Soon" section
- Add get_recent_activity() function for future activity feed
- Fix dark mode CSS using CSS variables
Backend:
- Add _recover_stale_tasks() to reset tasks stuck in 'in_progress'
from previous crashes (tasks >5 min old get reset to pending)
- Called automatically on startup
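A minimal sketch of the recovery step, assuming a SQLite-style connection; column names are illustrative:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

STALE_AFTER = timedelta(minutes=5)

def _recover_stale_tasks(conn: sqlite3.Connection) -> int:
    # Any task still 'in_progress' after 5 minutes is assumed to belong to a
    # crashed worker; push it back to 'pending' so the pool picks it up again.
    cutoff = datetime.now(timezone.utc) - STALE_AFTER
    cur = conn.execute(
        "UPDATE pypi_cache_tasks SET status = 'pending' "
        "WHERE status = 'in_progress' AND started_at < ?",
        (cutoff.isoformat(),),
    )
    conn.commit()
    return cur.rowcount
```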
Frontend:
- Fix dark mode colors using CSS variables instead of hardcoded values
- Add elapsed time column showing how long task has been running
- Add spinning indicator next to package name
- Add status badge (Running/Stale)
- Highlight stale tasks (>5 min) in amber
- Auto-updates every 5 seconds via the existing refresh cycle
Shows currently processing cache tasks in a dynamic table with:
- Package name and version constraint being cached
- Recursion depth and attempt number
- Start timestamp
- Pulsing indicator to show live activity
Backend changes:
- Add get_active_tasks() function to pypi_cache_worker.py
- Add GET /pypi/cache/active endpoint to pypi_proxy.py
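Sketch of the two backend pieces, assuming a FastAPI router; get_db() and the column names are stand-ins:

```python
from fastapi import APIRouter

router = APIRouter()

def get_active_tasks(conn) -> list[dict]:
    # Tasks currently claimed by a worker, oldest first.
    rows = conn.execute(
        "SELECT package, version_spec, depth, attempt, started_at "
        "FROM pypi_cache_tasks WHERE status = 'in_progress' ORDER BY started_at"
    ).fetchall()
    return [dict(row) for row in rows]

@router.get("/pypi/cache/active")
def pypi_cache_active() -> dict:
    # get_db() stands in for however the app hands out DB connections.
    return {"tasks": get_active_tasks(get_db())}
```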
Frontend changes:
- Add PyPICacheActiveTask type
- Add getPyPICacheActiveTasks() API function
- Add Active Workers section with animated table
- Auto-refreshes every 5 seconds alongside the existing data polling
When the cache worker downloaded a package through the proxy, dependencies
were always queued with depth=0 instead of depth+1. This meant depth limits
weren't properly enforced for nested dependencies.
Changes:
- Add cache-depth query parameter to pypi_download_file endpoint
- Worker now passes its current depth when fetching packages
- Dependencies are queued at cache_depth+1 instead of hardcoded 0
- Add tests for depth tracking behavior
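A sketch of how the depth gets threaded through; the route path and helper functions are placeholders:

```python
from fastapi import APIRouter, Query

router = APIRouter()

@router.get("/pypi/{project}/packages/{filename}")
def pypi_download_file(project: str, filename: str,
                       cache_depth: int = Query(0, alias="cache-depth")):
    artifact = fetch_and_cache(project, filename)  # stand-in for the real handler
    # Queue dependencies one level deeper than the request that pulled this
    # package in, instead of always restarting at depth 0.
    for dep in extract_dependencies(artifact):
        enqueue_cache_task(dep, depth=cache_depth + 1)
    return stream_artifact(artifact)
```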
Replace unbounded thread spawning with managed worker pool:
- New pypi_cache_tasks table tracks caching jobs
- Thread pool with 5 workers (configurable via ORCHARD_PYPI_CACHE_WORKERS)
- Automatic retries with exponential backoff (30s, 60s, then fail)
- Deduplication to prevent duplicate caching attempts
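A minimal sketch of the retry behaviour; do_cache and mark_failed are stand-ins for the real worker callbacks:

```python
import os
import time
from concurrent.futures import ThreadPoolExecutor

WORKER_COUNT = int(os.environ.get("ORCHARD_PYPI_CACHE_WORKERS", "5"))
RETRY_DELAYS = [30, 60]  # seconds; after the second retry the task is failed

pool = ThreadPoolExecutor(max_workers=WORKER_COUNT)

def run_task(task, do_cache, mark_failed) -> None:
    last_error = "unknown error"
    for delay in [0] + RETRY_DELAYS:
        if delay:
            time.sleep(delay)  # a real worker would reschedule instead of blocking
        try:
            do_cache(task)
            return
        except Exception as exc:
            last_error = str(exc)
    mark_failed(task, last_error)
```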
New API endpoints for visibility and control:
- GET /pypi/cache/status - queue health summary
- GET /pypi/cache/failed - list failed tasks with errors
- POST /pypi/cache/retry/{package} - retry single package
- POST /pypi/cache/retry-all - retry all failed packages
This fixes silent failures in background dependency caching where
packages would fail to cache without any tracking or retry mechanism.
When background threads fetch from our own proxy using the request's
base_url, it returns http:// but ingress requires https://. The 308
redirect was dropping trailing slashes, causing requests to hit the
frontend catch-all route instead of /pypi/simple/.
Force HTTPS explicitly in the background caching function to avoid
the redirect entirely.
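Roughly the shape of the fix; the helper name is illustrative and request is the incoming Starlette/FastAPI request:

```python
def _own_simple_url(request) -> str:
    # Pin the scheme to https so background fetches through our own proxy
    # never trigger the ingress http -> https 308 redirect.
    base = str(request.base_url)
    if base.startswith("http://"):
        base = "https://" + base[len("http://"):]
    return base.rstrip("/") + "/pypi/simple/"  # keep the trailing slash
```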
PEP 503 normalizes package names to use hyphens, but wheel filenames
may use underscores (e.g., typing_extensions-4.0.0-py3-none-any.whl).
Convert the search pattern to match either separator.
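A sketch of the pattern conversion (PEP 503 names only contain [a-z0-9-], so no extra escaping is needed):

```python
import re

def filename_pattern(normalized_name: str) -> re.Pattern:
    # Wheel filenames may use '_' wherever the normalized name has '-'
    # (typing-extensions -> typing_extensions-4.0.0-...). Accept either.
    flexible = normalized_name.replace("-", "[-_]")
    return re.compile(rf"^{flexible}-", re.IGNORECASE)
```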
The background dependency caching was getting 308 redirects because
request.base_url returns http:// but the ingress redirects to https://.
Enable follow_redirects=True in httpx client to handle this.
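For reference, the relevant httpx setting (URL is illustrative):

```python
import httpx

# follow_redirects=True lets the client chase the ingress http -> https 308
# instead of surfacing it as an unexpected response.
with httpx.Client(follow_redirects=True, timeout=30.0) as client:
    resp = client.get("http://orchard.internal/pypi/simple/requests/")
    resp.raise_for_status()
```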
When a PyPI package is cached, its dependencies are now automatically
fetched in background threads. This ensures the entire dependency tree
is cached even if pip already has some packages installed locally.
Features:
- Background threads fetch each dependency without blocking the response
- Uses our own proxy endpoint to cache, which recursively caches transitive deps
- Max depth of 10 to prevent infinite loops
- Daemon threads so they don't block process shutdown
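A rough sketch of the background fetch, with the proxy URL and function name as placeholders:

```python
import threading

import httpx

MAX_DEPTH = 10  # guard against circular dependency graphs

def cache_dependency_async(own_simple_url: str, package: str, depth: int) -> None:
    if depth >= MAX_DEPTH:
        return
    def _fetch() -> None:
        # Requesting the package through our own simple index makes the proxy
        # cache it, which in turn recursively caches its dependencies.
        with httpx.Client(follow_redirects=True, timeout=60.0) as client:
            client.get(f"{own_simple_url}{package}/")
    # Daemon thread: the caller's response isn't blocked, and process shutdown
    # isn't held up waiting for stragglers.
    threading.Thread(target=_fetch, daemon=True).start()
```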
- Deduplicate dependencies by package name before inserting
- Some packages (like anyio) list the same dep (trio) multiple times with
different version constraints for different extras
- The unique constraint on (artifact_id, project, package) rejected these
- Also removed debug logging from dependencies.py
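The dedup step, sketched with the packaging library (keeping the first occurrence per package name):

```python
from packaging.requirements import Requirement

def dedupe_dependencies(requires_dist: list[str]) -> list[Requirement]:
    # anyio-style metadata lists the same package several times with different
    # constraints per extra; the UNIQUE (artifact_id, project, package)
    # constraint only allows one row, so keep the first occurrence per name.
    seen: dict[str, Requirement] = {}
    for line in requires_dist:
        req = Requirement(line)
        seen.setdefault(req.name.lower(), req)
    return list(seen.values())
```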
- Parse version constraints like >=1.9, <2.0 using the packaging library
- Find the latest version that satisfies the constraint
- Support wildcard (*) to get latest version
- Fall back to exact version and tag matching
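Roughly how constraint resolution looks with the packaging library; the exact-version and tag fallback paths aren't shown:

```python
from packaging.specifiers import SpecifierSet
from packaging.version import InvalidVersion, Version

def pick_version(available: list[str], constraint: str) -> str | None:
    # '*' (or an empty constraint) means "latest available".
    spec = SpecifierSet("" if constraint in ("", "*") else constraint)
    candidates = []
    for raw in available:
        try:
            candidates.append(Version(raw))
        except InvalidVersion:
            continue  # these fall through to exact/tag matching elsewhere
    matching = sorted(v for v in candidates if v in spec)
    return str(matching[-1]) if matching else None
```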
- Add functions to parse Requires-Dist metadata from wheel and sdist files
- Store extracted dependencies in artifact_dependencies table
- Fix streaming response for cached artifacts (proper tuple unpacking)
- Fix version uniqueness check to use version string instead of artifact_id
- Skip creating versions for .metadata files
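The wheel side of that parsing, sketched with the standard library (sdists carry the same headers in PKG-INFO and are messier):

```python
import zipfile
from email.parser import Parser

def wheel_requires_dist(wheel_path: str) -> list[str]:
    # A wheel's METADATA file is an RFC 822-style document inside the
    # *.dist-info directory; Requires-Dist headers list the dependencies.
    with zipfile.ZipFile(wheel_path) as zf:
        meta_name = next(n for n in zf.namelist()
                         if n.endswith(".dist-info/METADATA"))
        metadata = Parser().parsestr(zf.read(meta_name).decode("utf-8"))
    return metadata.get_all("Requires-Dist") or []
```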
- Fix artifact_count and total_size calculation to use Tags instead of
Uploads, so PyPI cached packages show their stats correctly
- Fix PackagePage dropdown menu positioning (use fixed position with backdrop)
- Add system project detection for projects starting with "_"
- Show Version as primary column for system projects, hide Tag column
- Hide upload button for system projects (they're cache-only)
- Rename section header to "Versions" for system projects
- Fix test_projects_sort_by_name to exclude system projects from sort comparison
PyPI proxy improvements:
- Set package format to "pypi" instead of "generic"
- Extract version from filename and create PackageVersion record
- Support .whl, .tar.gz, and .zip filename formats
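A sketch of the filename parsing; wheel names put the version in the second dash-separated field, sdists are matched more loosely:

```python
import re

def version_from_filename(filename: str) -> str | None:
    # wheel: name-1.2.3-py3-none-any.whl ; sdist: name-1.2.3.tar.gz or .zip
    if filename.endswith(".whl"):
        parts = filename[:-len(".whl")].split("-")
        return parts[1] if len(parts) >= 2 else None
    m = re.match(r"^(.+)-([0-9][^-]*)\.(tar\.gz|zip)$", filename)
    return m.group(2) if m else None
```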
Package page UX overhaul:
- Move upload to header button with modal
- Simplify table: combine Tag/Version, remove Type and Artifact ID columns
- Add row action menu (⋯) with: Copy ID, Ensure File, Create Tag, Dependencies
- Remove cluttered "Download by Artifact ID" and "Create/Update Tag" sections
- Add modals for upload and create tag actions
- Cleaner, more scannable table layout
- Use storage.get_stream(s3_key) instead of non-existent get_artifact_stream()
- Make _pypi project public (is_public=True) so cached packages are visible
Use correct Artifact model fields:
- original_name instead of filename
- Add required created_by and s3_key fields
- Include checksum fields from storage result
The code was calling storage.store_artifact() which doesn't exist.
Changed to use storage.store() which handles content-addressable
storage with automatic deduplication.
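storage.store() itself isn't shown here; as a rough illustration of what content-addressable storage with deduplication means (boto3-style client assumed):

```python
import hashlib

def store_content_addressed(data: bytes, s3_client, bucket: str) -> str:
    # The object key is derived from the content hash, so identical files map
    # to the same key and are only uploaded once.
    sha256 = hashlib.sha256(data).hexdigest()
    key = f"artifacts/{sha256}"
    existing = s3_client.list_objects_v2(Bucket=bucket, Prefix=key)
    if existing.get("KeyCount", 0) == 0:  # skip the upload if it already exists
        s3_client.put_object(Bucket=bucket, Key=key, Body=data)
    return key
```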
Users can now configure the full path including /simple in their
upstream source URL (e.g., https://example.com/api/pypi/repo/simple)
instead of having the code append /simple/ automatically.
This matches pip's --index-url format, making configuration more
intuitive and copy/paste friendly.
When behind a reverse proxy that terminates SSL, the server sees HTTP
requests internally. Added _get_base_url() helper that respects the
X-Forwarded-Proto header to generate correct external HTTPS URLs.
This fixes links in the PyPI simple index showing http:// instead of
https:// when accessed via HTTPS through a load balancer.
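Roughly what the helper does (the real implementation may differ in details):

```python
from fastapi import Request

def _get_base_url(request: Request) -> str:
    # Behind an SSL-terminating proxy the app sees plain HTTP; trust
    # X-Forwarded-Proto so generated links carry the external scheme.
    proto = request.headers.get("x-forwarded-proto", request.url.scheme)
    host = request.headers.get("host", request.url.netloc)
    return f"{proto}://{host}"
```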
Artifactory and other registries may return relative URLs in their
Simple API responses (e.g., ../../packages/...). The proxy now resolves
these to absolute URLs using urljoin() before encoding them in the
upstream parameter.
This fixes package downloads failing when the upstream registry uses
relative URLs in its package index.
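For example (URLs are illustrative):

```python
from urllib.parse import urljoin

# Artifactory may emit hrefs like "../../packages/..." in its Simple API
# pages; resolve them against the page URL before encoding them into the
# upstream parameter.
page_url = "https://example.com/api/pypi/repo/simple/foo/"
href = "../../packages/pypi/foo/foo-1.0.0-py3-none-any.whl"
absolute = urljoin(page_url, href)
# -> https://example.com/api/pypi/repo/packages/pypi/foo/foo-1.0.0-py3-none-any.whl
```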
- Add create_default_admin() call to factory reset (admin user wasn't being
created after reset, only on server restart)
- Add retry logic to reset_stage CI job (3 attempts with 5s delay)
- Use proper context manager for httpx client
- Increase timeout to 120s for reset operation
- Add retry: 1 at job level for transient failures
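The retry loop, sketched in Python with an illustrative reset URL:

```python
import time

import httpx

RESET_URL = "https://orchard.example.com/api/admin/factory-reset"  # illustrative
ATTEMPTS, DELAY = 3, 5  # three tries, five seconds apart

for attempt in range(1, ATTEMPTS + 1):
    try:
        # Context manager guarantees the client is closed; the reset can be
        # slow, hence the generous 120s timeout.
        with httpx.Client(timeout=120.0) as client:
            client.post(RESET_URL).raise_for_status()
        break
    except httpx.HTTPError:
        if attempt == ATTEMPTS:
            raise
        time.sleep(DELAY)
```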